
WO2018021261A1 - Image processing device, playback device, image processing method, and playback method - Google Patents

Image processing device, playback device, image processing method, and playback method

Info

Publication number
WO2018021261A1
WO2018021261A1 (PCT/JP2017/026748, JP2017026748W)
Authority
WO
WIPO (PCT)
Prior art keywords
still image
image data
unit
hdr
data
Prior art date
Application number
PCT/JP2017/026748
Other languages
English (en)
Japanese (ja)
Inventor
上坂 靖
小塚 雅之
美裕 森
福島 俊之
和彦 甲野
柏木 吉一郎
茂生 阪上
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社 filed Critical パナソニックIpマネジメント株式会社
Priority to EP17834267.1A priority Critical patent/EP3493532B8/fr
Priority to US16/317,081 priority patent/US11184596B2/en
Priority to CN201780046042.0A priority patent/CN109479111B/zh
Priority to JP2018529885A priority patent/JPWO2018021261A1/ja
Publication of WO2018021261A1 publication Critical patent/WO2018021261A1/fr

Classifications

    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00: Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10: Digital recording or reproducing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/76: Television signal recording
    • H04N5/91: Television signal processing therefor
    • H04N5/92: Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N5/93: Regeneration of the television signal or of selected parts thereof
    • H04N7/00: Television systems
    • H04N7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level

Definitions

  • the present disclosure relates to an image processing device, a playback device, an image processing method, and a playback method.
  • Patent Document 1 discloses an imaging apparatus that records an HDR (High Dynamic Range) still image with a wide dynamic range by combining a plurality of images captured with different exposures.
  • the present disclosure provides an image processing device, a playback device, an image processing method, and a playback method that can obtain highly convenient still image data.
  • An image processing apparatus according to the present disclosure includes an acquisition unit that acquires still image data obtained by imaging, a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and an output unit that outputs the data unit generated by the generation unit.
  • A playback device according to the present disclosure includes an acquisition unit that acquires a logically single data unit including first still image data and second still image data that have different luminance dynamic ranges and can be played back independently of each other, and a playback unit that plays back one of the first still image data and the second still image data included in the data unit acquired by the acquisition unit.
  • The image processing apparatus according to the present disclosure can thus obtain highly convenient still image data.
  • FIG. 1 is a diagram for explaining the evolution of video technology.
  • FIG. 2 is a diagram for explaining the HDR display technique.
  • FIG. 3A is a diagram for explaining a PQ (Perceptual Quantization) method.
  • FIG. 3B is a diagram for explaining an HLG (Hybrid Log Gamma) system.
  • FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR.
  • FIG. 5 is a diagram for explaining imaging devices that support HDR or SDR, the file formats of image data obtained by those imaging devices, and the display devices that display or printing devices that print the image data.
  • FIG. 6 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
  • FIG. 7 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
  • FIG. 8 is a diagram for describing an HDR image captured for HDR display.
  • FIG. 9 is a diagram for explaining the difference between the color space of a moving image and the color space of a still image.
  • FIG. 10 is a diagram showing a comparison between Ultra HD Blu-ray (registered trademark, the same applies hereinafter) and Blu-ray.
  • FIG. 11 is a diagram for explaining an HDR imaging device that generates an HDR image with a wide luminance range.
  • FIG. 12 is a diagram for explaining the HDR still image file format.
  • FIG. 13 is a diagram for explaining the multi-picture format.
  • FIG. 14 is a diagram for explaining the JPEG XT system that handles JPEG data and difference data for HDR expansion in association with each other.
  • FIG. 15 is a block diagram schematically illustrating an example of the configuration of the image processing apparatus according to the first embodiment.
  • FIG. 16 is a diagram schematically illustrating an example in which one logical data unit is configured to include one file including two types of still image data.
  • FIG. 17 is a diagram schematically illustrating an example of information included in the management data.
  • FIG. 18 is a diagram illustrating an example of a relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram.
  • FIG. 19 is a diagram showing another example of the relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram.
  • FIG. 20 is a diagram schematically illustrating an example of a case where one logical data unit includes two files.
  • FIG. 21 is a diagram schematically illustrating another example in which one data unit is configured to include two files.
  • FIG. 22 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment.
  • FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment.
  • FIG. 24 is a flowchart illustrating an example of an operation related to image processing of the image processing apparatus according to the first embodiment.
  • FIG. 25 is a flowchart illustrating an example of a generation process by the generation unit according to the first embodiment.
  • FIG. 26 is a flowchart illustrating an example of generation processing by the generation unit according to the first embodiment.
  • FIG. 27 is a diagram for explaining Example 1 in the first embodiment.
  • FIG. 28 is a diagram for explaining an example 2 in the first embodiment.
  • FIG. 29 is a diagram for explaining an example 3 in the first embodiment.
  • FIG. 30 is a diagram for explaining an example 4 in the first embodiment.
  • FIG. 31 is a diagram for explaining a fifth example in the first embodiment.
  • FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment.
  • FIG. 33 is a flowchart illustrating an example of operations related to the reproduction processing of the reproduction device according to the first embodiment.
  • FIG. 34 is a diagram for describing a specific example of auxiliary information in the first embodiment.
  • The present disclosure is intended to provide new user value for HDR still images and a new photographic culture by combining two technologies: HDR (High Dynamic Range) display technology and HDR imaging technology.
  • This new user value is an improved sense of realism together with reduced whiteout (a state in which the gradation of bright regions is lost) and blackout (a state in which the gradation of dark regions is lost).
  • Specifically, the present disclosure provides an image processing apparatus and an image processing method that generate, from an HDR still image obtained by capturing with a camera that supports HDR still image capture, still image data that can be displayed on a display device that supports HDR display (hereinafter referred to as an "HDR display device").
  • An HDR still image is also called an HDR photograph.
  • The present disclosure also provides an image processing apparatus and an image processing method capable of generating still image data that can be displayed even on a display device that supports SDR display but not HDR display (hereinafter referred to as an "SDR display device") and printed even on a printing apparatus that does not support HDR still image printing (hereinafter referred to as an "SDR printing apparatus"). That is, the present disclosure provides an image processing apparatus and an image processing method capable of improving the convenience of HDR still image data by providing still image data from which an HDR still image can be reproduced, not only to devices that support HDR still image processing but also to devices that support SDR still image processing and do not support HDR still image processing.
  • reproduction of an HDR still image includes display of the HDR still image and printing by performing image processing on the HDR still image. That is, in the present disclosure, reproduction includes display and printing.
  • For example, it is conceivable to store JPEG (Joint Photographic Experts Group) data, which is SDR still image data, and new HDR still image data in one file, and to give that file the .JPG extension so that compatibility can be maintained.
  • A user of such a portable terminal can obtain HDR still image data through HDR synthesis performed on the terminal. It has therefore become easier than before for the user to generate and manage an SDR shooting file (a conventional JPEG file) and an HDR shooting file individually.
  • When HDR still image data is generated by an imaging device such as a camera, a conventional JPEG file may be generated at the same time, and the two data, the HDR still image data and the SDR still image data, may be stored in one file. It is also possible to provide the imaging device with an option that allows the two to be managed as individual files.
  • When a new HDR data format function is provided in a television set or in a camera with a moving image recording function, an HEVC encoder is already installed in these devices, so a still image data format based on HEVC compression can also be used.
  • In a device that handles the TIFF (Tagged Image File Format) format, HDR that is compatible with conventional devices, such as HLG (Hybrid Log Gamma), is one of the candidate data formats.
  • The present disclosure can thus provide an image processing device and a playback device that meet various requirements for an HDR still image data format, such as whether compatibility with conventional devices is maintained and how much implementation burden is placed on those conventional devices.
  • FIG. 1 is a diagram for explaining the evolution of video technology.
  • HDR (High Dynamic Range) is an extension of the luminance range of SDR (Standard Dynamic Range) and has been standardized by the ITU-R (International Telecommunication Union - Radiocommunication Sector).
  • FIG. 2 is a diagram for explaining the HDR display technique.
  • HDR is not just a method for realizing a very bright television set.
  • HDR is a system that extends the luminance range of an image from the 0.1 to 100 nit range of the BT.709 (Broadcasting Service (Television) 709) standard to 0 to 10,000 nits (in the case of ST 2084), making it possible to express bright sunlight, sky, and reflections of light that could not be expressed before, and to record bright parts and dark parts at the same time. Note that the luminance here refers to the optical luminance of light.
  • HDR includes the SMPTE ST 2084 (PQ) format, which is suitable for video (packaged video) that undergoes grading processing (processing that adjusts the color and tone of video) after imaging and for video distributed via IP (Internet Protocol), and the HLG (Hybrid Log Gamma) format, which is suitable for live broadcasting and the like.
  • HDR display technology includes the HLG method that can realize compatibility between SDR and HDR, and the PQ method that does not have simple display compatibility between SDR and HDR.
  • FIG. 3A is a diagram for explaining the PQ method.
  • FIG. 3B is a diagram for explaining the HLG method.
  • SMPTE ST2084 is a method in which SDR and HDR are not compatible.
  • SDR and HDR are individually graded and transmitted separately.
  • When HDR video data graded in this way is displayed on an SDR TV, SDR conversion for converting the HDR video data into SDR video data is required.
  • ITU-R BT.2100 Hybrid Log Gamma is a method having compatibility between SDR and HDR. In this method, HLG grading is performed, and only the HLG stream is transmitted. Because the HLG stream is compatible with SDR, SDR conversion for converting HDR video data into SDR video data is unnecessary when the HDR video data is displayed on an SDR TV.
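  • As a concrete illustration of the PQ method mentioned above, the following minimal Python sketch implements the ST 2084 transfer functions; the constants are those published in SMPTE ST 2084, while the 10-bit quantization at the end is only an illustrative assumption and is not part of this disclosure.

    # Minimal sketch of the SMPTE ST 2084 (PQ) transfer functions.
    # Constants come from the ST 2084 specification; the 10-bit
    # quantization shown at the end is an illustrative assumption.
    M1 = 2610 / 16384          # 0.1593017578125
    M2 = 2523 / 4096 * 128     # 78.84375
    C1 = 3424 / 4096           # 0.8359375
    C2 = 2413 / 4096 * 32      # 18.8515625
    C3 = 2392 / 4096 * 32      # 18.6875
    PEAK = 10000.0             # nits (cd/m^2)

    def pq_inverse_eotf(luminance_nits: float) -> float:
        """Linear light (0 to 10,000 nits) -> non-linear PQ signal (0 to 1)."""
        y = max(luminance_nits, 0.0) / PEAK
        yp = y ** M1
        return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

    def pq_eotf(signal: float) -> float:
        """Non-linear PQ signal (0 to 1) -> linear light in nits."""
        ep = signal ** (1.0 / M2)
        return PEAK * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1.0 / M1)

    def to_10bit(signal: float) -> int:
        """Quantize a 0-to-1 PQ signal to a 10-bit code value (illustrative)."""
        return max(0, min(1023, round(signal * 1023)))

    if __name__ == "__main__":
        for nits in (0.1, 100.0, 1000.0, 10000.0):
            print(f"{nits:>7} nits -> PQ 10-bit code {to_10bit(pq_inverse_eotf(nits))}")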
  • FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR.
  • FIG. 4 shows an HDR image and an SDR image of a single image having a relatively large difference in brightness, in which a relatively dark scene in the room and a relatively bright scene outside the window are mixed.
  • the HDR image is an image obtained by reproducing HDR still image data or HDR moving image data.
  • An SDR image is an image obtained by reproducing SDR still image data or SDR moving image data.
  • In the HDR image, the relatively bright scenery outside the window and the relatively dark scenery in the room are both expressed with appropriate brightness.
  • In the SDR image, the exposure is adjusted so that the relatively bright scenery outside the window is expressed properly; as a result, the relatively dark scenery in the room becomes too dark and is partly crushed, making it difficult to see. Conversely, if the exposure were adjusted so that the scenery in the room is properly represented, the scenery outside the window would become too bright and partly overexposed, making it difficult to see (not shown).
  • In this way, the HDR image can contain a combination of a relatively bright scene and a relatively dark scene that is difficult to realize in an SDR image, achieving an image with rich gradation in which both blackout and whiteout are reduced.
  • FIG. 5 is a diagram for explaining imaging devices that support HDR or SDR, the file formats of image data obtained by those imaging devices, and the display devices that display or printing devices that print the image data.
  • the HDR imaging device 10 shown in FIG. 5 supports HDR imaging.
  • the HDR imaging device 10 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, and a JPEG compression unit 13.
  • The HDR imaging device 10 is configured so that image data obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 can be displayed on the SDR display device 40 or printed by the SDR printing device 50. Specifically, in the HDR imaging device 10, the HDR still image data of an HDR image obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 is converted into SDR still image data by the conversion unit 12.
  • the SDR still image data obtained by the conversion in the conversion unit 12 is JPEG compressed in the JPEG compression unit 13, and the SDR still image data in the JPEG format obtained by the compression is output.
  • The SDR still image data of an SDR image obtained by imaging in the conventional imaging mode (SDR imaging mode) in the SDR imaging unit 14 is also JPEG compressed by the JPEG compression unit 13, and the JPEG-format SDR still image data obtained by the compression is output.
  • the SDR imaging device 20 includes an SDR imaging unit 21 and a JPEG compression unit 22.
  • In the SDR imaging device 20, the SDR still image data of the SDR image obtained by imaging in the SDR imaging unit 21 is JPEG compressed by the JPEG compression unit 22, in the same manner as when the HDR imaging device 10 performs imaging in the conventional imaging mode (SDR imaging mode), and the JPEG-format SDR still image data obtained by the compression is output.
  • The HDR display device 30, the SDR display device 40, and the SDR printing device 50 acquire either SDR still image data obtained by SDR conversion of HDR still image data obtained by HDR imaging, or SDR still image data obtained by SDR imaging, and reproduce (display or print) the SDR image based on that SDR still image data.
  • FIGS. 6 and 7 are diagrams for explaining the HDR shooting mode for obtaining an image with an expanded dynamic range by combining two images.
  • Some smartphones, digital cameras, and the like have an HDR shooting mode that can capture images with a wide luminance range.
  • In the HDR shooting mode, as shown in FIG. 6 and (a) of FIG. 7, double exposure (imaging the same subject a plurality of times in different exposure states) is performed in order to obtain HDR image data having a wide luminance range, and the two SDR images obtained in this way are combined so as to fit within the luminance range determined by SDR. Accordingly, as shown in FIG. 6 and (b) of FIG. 7, the resulting HDR image can be displayed on an SDR display device.
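  • The combining step described above can be pictured with the following simplified sketch; it is an assumed weighted blend of an under-exposed and an over-exposed SDR frame written with NumPy, not the combining algorithm of any particular camera.

    # Simplified sketch of combining two differently exposed SDR frames
    # into one image that still fits within the SDR luminance range.
    # The weighting rule below is an assumption made for illustration.
    import numpy as np

    def fuse_exposures(under: np.ndarray, over: np.ndarray) -> np.ndarray:
        """under/over: 8-bit frames of the same scene (same shape)."""
        under_f = under.astype(np.float32) / 255.0
        over_f = over.astype(np.float32) / 255.0
        # Weight toward the under-exposed frame where the over-exposed
        # frame is nearly saturated (bright window, sky, and so on).
        w_under = np.clip((over_f - 0.8) / 0.2, 0.0, 1.0)
        fused = w_under * under_f + (1.0 - w_under) * over_f
        return np.clip(fused * 255.0 + 0.5, 0, 255).astype(np.uint8)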
  • FIG. 8 is a diagram for explaining an HDR image captured for HDR display.
  • An HDR image for HDR display captures the brightness of the scene to be imaged over a wider luminance range than in the SDR imaging mode.
  • the image data obtained by this imaging is graded to generate an HDR image for HDR display, and the HDR image is transmitted to each device and reproduced. Since an HDR image has a wider luminance range than an SDR image, it cannot be displayed on an SDR display device as it is. In order to display the HDR image on the SDR display device, it is necessary to convert the HDR image into the SDR image.
  • In contrast, in the HDR shooting mode described with reference to FIGS. 6 and 7, the combined image is generated so as to fit within the luminance range determined by SDR, and can therefore be reproduced by the HDR display device 30, the SDR display device 40, or the SDR printing device 50.
  • FIG. 9 is a diagram for explaining the difference between the color space of a moving image and the color space of a still image.
  • BT.709, which is a standard related to the color space of moving images, and sRGB (standard RGB), which is a standard related to the color space of still images, define substantially the same color space.
  • Color spaces extended from the color space defined by BT.709 or sRGB are also defined.
  • BT.2020, standardized for Ultra HD, is a color space wider than DCI (Digital Cinema Initiatives) P3 or Adobe RGB; for this reason, BT.2020 can cover both the DCI P3 and Adobe RGB color spaces.
  • Note that the DCI P3 color space and the Adobe RGB color space have approximately the same area, but the regions they cover differ from each other.
  • FIG. 10 is a diagram showing a comparison between Ultra HD Blu-ray and Blu-ray.
  • Ultra HD Blu-ray exceeds Blu-ray in all items of resolution, color space, HDR (maximum luminance), compression technology, and transfer rate.
  • HDR display devices such as HDRTV have been proposed that can display HDR image data for displaying HDR images without performing SDR conversion.
  • Conventionally, HDR technology has been used in cameras having the HDR shooting mode (HDR imaging function) mainly for the purpose of backlight correction, and still images captured with such cameras have mainly been reproduced on SDR display devices. In other words, cameras having an HDR imaging function have generally not generated HDR image data with the wide luminance range (range) that makes full use of the display capability of an HDR TV.
  • FIG. 11 is a diagram for explaining an HDR imaging apparatus 10A that generates an HDR image with a wide luminance range.
  • In order to make use of the HDR display function of an HDR TV (for example, the HDR display device 30), HDR image data for HDR display can be generated and displayed on the HDR TV as it is, without being converted into SDR image data.
  • An HDR imaging device 10A illustrated in FIG. 11 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and an HDMI (registered trademark, the same applies hereinafter) (High-Definition Multimedia Interface) output unit 16.
  • the HDR image correction unit 15 performs HDR image correction in order to generate HDR image data.
  • The HDR image correction unit 15 converts the RAW data obtained by imaging with the HDR imaging unit 11 into a 10-bit image that can be displayed on an HDR TV corresponding to the HDR10 standard (for example, the HDR display device 30), using, for example, an HDR-EOTF (HDR Electro-Optical Transfer Function) such as the PQ curve.
  • The HDR imaging device 10A outputs the HDR image data obtained by the HDR image correction unit 15 from the HDMI output unit 16 to the HDR TV (for example, the HDR display device 30), and the HDR TV displays the HDR image corresponding to the HDR image data.
  • the HDR imaging device 10A and the HDR display device 30 need to be connected to each other with an HDMI cable corresponding to the HDMI 2.0 standard. That is, the HDR imaging device 10A cannot transmit the HDR image data as it is to a device that does not support the HDMI 2.0 standard.
  • an HDR still image file format is required so that HDR image data can be transmitted to a device that does not support the HDMI 2.0 standard.
  • data can be exchanged between the HDR imaging device 10A, the HDR display device 30, the SDR display device 40, and the SDR printing device 50 in the SDR still image file format. Therefore, an HDR still image file format for storing and exchanging data in the HDR format is required.
  • the HDR still image file format has the following problems.
  • FIG. 12 is a diagram for explaining the HDR still image file format.
  • An HDR imaging device 10B illustrated in FIG. 12 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG compression unit 13A.
  • For HDR still images, there is currently no widely used file format comparable to JPEG, which is one of the SDR still image file formats.
  • A file format for storing HDR image data needs to extend the color space to the BT.2020 region and to cover a wide luminance range. Further, in order to suppress banding and the like, which are conspicuous in still images, it is desirable that an HDR image can express a gradation of at least 10 bits, and preferably 12 bits or more.
  • In contrast, the JPEG file format is limited to SDR, to the color space defined by sRGB, and to a gradation of 8 bits.
  • FIG. 12 shows an example in which a JPEG-based file format is used as a file format for storing HDR still image data in the JPEG compression unit 13A of the HDR imaging apparatus 10B.
  • However, the JPEG-based file format suffers from image quality problems such as banding due to insufficient gradation capability. For this reason, using a JPEG-based file format as a file format for storing HDR still image data has not been put into practical use.
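  • The gradation requirement can be made concrete with a small calculation; the step counts below follow directly from the bit depths and are shown only as an illustration, not as figures taken from this disclosure.

    # Number of gradation steps available at each bit depth. With only
    # 256 steps spread over a 0 to 10,000 nit HDR luminance range,
    # neighbouring code values are far apart and banding becomes visible,
    # which is why at least 10 bits (preferably 12 bits) are desirable.
    for bits in (8, 10, 12):
        print(f"{bits:>2}-bit: {2 ** bits:>5} gradation levels")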
  • FIG. 13 is a diagram for explaining the multi-picture format.
  • Multi-picture format is a format that can store multiple photo data in one file.
  • In the multi-picture format, a plurality of still image data that would otherwise be individual files, such as a main image (HDR still image data), a still image for monitor display (hereinafter referred to as a monitor display image; SDR still image data), and multi-viewpoint (stereoscopic) images, can be recorded in one file in association with each other.
  • The multi-picture format includes the Baseline MP file shown in (a) of FIG. 13 and the Extended MP file shown in (b) of FIG. 13.
  • the Baseline MP file shown in FIG. 13A can record the main image (HDR still image data) and the monitor display image (SDR still image data) in one file in association with each other.
  • The extension of a Baseline MP file is ".JPG".
  • Accordingly, a conventional device or conventional software reproduces the monitor display image corresponding to the main image, while an HDR display device can display the main image (HDR still image data) as it is.
  • The advantage of the Baseline MP file is that an existing display device or printing device can play back the monitor display image (that is, the SDR still image data) stored for compatibility together with the main image.
  • On the other hand, image editing software may mistake a Baseline MP file for a normal JPEG file and erase the second image data (HDR still image data). This is because a Baseline MP file stores two data in one file but uses the extension ".JPG" of a JPEG file, which stores one data in one file. In the case of image editing software capable of editing the multi-picture format, however, this problem does not occur.
  • The Extended MP file shown in (b) of FIG. 13 can record, for example, two multi-viewpoint images (multi-viewpoint image 1 and multi-viewpoint image 2) used for stereoscopic viewing or the like in one file in association with each other.
  • An Extended MP file is defined as a file format with a new extension so that one image is not lost when played back or saved using a conventional device or conventional software.
  • The advantage of the Extended MP file is that, although two data are stored in one file as in a JPEG file, the extension ".JPG" of a JPEG file is not used. For this reason, the file cannot be edited by image editing software other than image editing software compatible with the multi-picture format.
  • FIG. 14 is a diagram for explaining the JPEG XT method that handles JPEG data and difference data for HDR expansion in association with each other.
  • An HDR imaging device 10C illustrated in FIG. 14 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG XT compression unit 13B.
  • JPEG XT (ISO/IEC 18477) is a standard that defines a method for handling JPEG data storing SDR still image data and difference data for HDR expansion in association with each other.
  • the JPEG XT compression unit 13B performs JPEG XT compression processing on the HDR image data that has been subjected to HDR image correction by the HDR image correction unit 15.
  • JPEG XT makes it possible to play back the SDR still image data with an apparatus that can play back existing JPEG data.
  • However, in order to reproduce an HDR still image, the display device or printing device must combine the SDR still image data with the difference data for generating the HDR still image.
  • special processing corresponding to the JPEG XT HDR still image file format different from the normal HDR display function is required in the display device or the printing device.
  • For this reason, an existing HDR TV (for example, the HDR display device 30 shown in FIG. 14), display device, or printing device can only play back the SDR still image data included in a JPEG XT file.
  • A JPEG XT HDR still image can be reproduced (displayed) only by a display device that supports playback of the JPEG XT HDR still image file format, that is, a display device capable of the special processing for the JPEG XT HDR still image file format that differs from the normal HDR display function, such as the HDR display device 60 shown in FIG. 14.
  • In addition, the imaging device (for example, the HDR imaging device 10C) must be equipped with a function for generating JPEG XT data (for example, the JPEG XT compression unit 13B).
  • JPEG XT has many problems to be solved, and is not often used in HDR imaging devices and HDR display devices.
  • The image processing apparatus described below is for generating a data format that meets such requirements.
  • FIG. 15 is a block diagram schematically illustrating an example of the configuration of the image processing apparatus 100 according to the first embodiment.
  • the image processing apparatus 100 includes an acquisition unit 110, a generation unit 120, and an output unit 130.
  • the image processing apparatus 100 may be incorporated in the imaging apparatus or may be realized as a single apparatus.
  • The acquisition unit 110 acquires still image data obtained by imaging with an imaging unit (not shown) such as an image sensor. At this time, imaging is performed so as to capture a wide range of brightness, from dark to bright, so that the still image data is generated as, for example, HDR image data including luminance from 0 to 10,000 nits.
  • the acquisition unit 110 may be realized by, for example, a processor that executes a predetermined program (a program created so as to execute the above-described processes) and a memory that stores the predetermined program.
  • the acquisition unit 110 may be realized by a dedicated circuit that executes each of the processes described above.
  • the generation unit 120 logically generates one data unit using the still image data acquired by the acquisition unit 110.
  • A logically single data unit is one piece of data configured to include first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • FIG. 16 is a diagram schematically showing an example in which one data unit D10 is logically configured as one file F10 including two types of still image data (first still image data D12 and second still image data D13).
  • FIG. 17 is a diagram schematically illustrating an example of information included in the management data D11.
  • FIG. 18 is a diagram showing an example of the relationship between the luminance value of each pixel constituting the still image of the first still image data D12 and the number of pixels of each luminance value as a histogram.
  • FIG. 19 is a diagram showing another example of the relationship between the luminance value of each pixel constituting the still image of the first still image data D12 and the number of pixels of each luminance value as a histogram.
  • In FIG. 18 and FIG. 19, the horizontal axis represents the luminance value, and the vertical axis represents the number of pixels.
  • FIG. 20 is a diagram schematically illustrating an example in which one data unit D20 is configured to include two files (a first still image file F21 and a second still image file F22).
  • the first still image file F21 includes first still image data D22
  • the second still image file F22 includes second still image data D24.
  • FIG. 21 is a diagram schematically illustrating another example in which one data unit D30 is configured to include two files (a first still image file F32 and a second still image file F33).
  • the first still image file F32 includes first still image data D32
  • the second still image file F33 includes second still image data D33.
  • the generation unit 120 logically generates one data unit.
  • the generation unit 120 may generate one file F10 including the first still image data D12 and the second still image data D13 as the data unit D10.
  • the data unit D10 generated by the generation unit 120 is composed of one file F10.
  • the file F10 includes management data D11, first still image data D12, and second still image data D13.
  • the first still image data D12 is, for example, HDR image data
  • the second still image data D13 is, for example, SDR image data.
  • the file name of the file F10 is “DSC0001.HDR”, for example.
  • the generation unit 120 may further add auxiliary information (see FIG. 17) to the data unit D10 shown in FIG.
  • the auxiliary information may include information indicating that a higher quality image can be reproduced by reproducing the first still image data D12 than when the second still image data D13 is reproduced.
  • Management data D11 is data for managing the first still image data D12 and the second still image data D13. As illustrated in FIG. 17, the management data D11 includes date information, size information, a first storage address, a second storage address, and auxiliary information.
  • the date information is information indicating the date when the still image that is the source of the first still image data D12 and the still image that is the source of the second still image data D13 are captured.
  • the size information is information indicating the size (resolution) of the still image based on the first still image data D12 and the size (resolution) of the still image based on the second still image data D13.
  • the first storage address is information indicating an address where the first still image data D12 is stored in the file F10.
  • the second storage address is information indicating an address where the second still image data D13 is stored in the file F10.
  • The auxiliary information may include luminance region information indicating whether the luminance of the still image based on the first still image data D12 gives priority to the high luminance region. For example, as illustrated in FIG. 18, in the case of an HDR still image in which many luminance values are distributed in the high luminance region, the generation unit 120 may generate, as part of the management data D11, luminance region information indicating that many luminance values are distributed in the high luminance region.
  • Similarly, the auxiliary information may include luminance region information indicating whether the luminance of the still image based on the first still image data D12 gives priority to the low luminance region. For example, as illustrated in FIG. 19, in the case of an HDR still image in which many luminance values are distributed in the low luminance region, the generation unit 120 may generate, as part of the management data D11, luminance region information indicating that many luminance values are distributed in the low luminance region.
  • the high luminance area is an area having higher luminance than the low luminance area.
  • the high luminance region and the low luminance region may be set so as not to overlap each other, or may be set so as to include regions overlapping each other.
  • the high luminance area may be set as an area having a luminance higher than the maximum luminance value of the SDR, for example.
  • the low luminance area may be set as an area having a luminance equal to or lower than the maximum luminance value of SDR, for example.
  • The generation unit 120 may analyze the first still image data D12 to identify the luminance region that, counted from the higher luminance values (or from the lower luminance values), is occupied by a number of pixels equal to or greater than a predetermined ratio of all the pixels constituting the still image based on the first still image data D12, and may generate management data D11 that includes luminance region information indicating that luminance region as auxiliary information. Alternatively, the luminance region information may be information set by the user.
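  • A minimal sketch of the analysis just described is shown below; the field names, the 100-nit boundary between the high and low luminance regions, and the 50% threshold are assumptions made only for illustration.

    # Sketch of generating luminance-region auxiliary information for
    # the management data D11. Field names, the 100-nit boundary and
    # the 50% threshold are illustrative assumptions.
    from dataclasses import dataclass
    import numpy as np

    SDR_MAX_NITS = 100.0   # assumed maximum SDR luminance

    @dataclass
    class ManagementData:
        date: str
        size: tuple                 # (width, height)
        first_storage_address: int
        second_storage_address: int
        luminance_region_info: str  # "high-luminance priority" or "low-luminance priority"

    def classify_luminance_region(luminance_nits: np.ndarray, ratio: float = 0.5) -> str:
        """Classify the still image by where its luminance values cluster."""
        high = np.count_nonzero(luminance_nits > SDR_MAX_NITS)
        if high >= ratio * luminance_nits.size:
            return "high-luminance priority"
        return "low-luminance priority"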
  • As shown in FIG. 20, the generation unit 120 may generate, as the data unit D20, an object composed of one first still image file F21 including the first still image data D22 and one second still image file F22 including the second still image data D24 whose file name body (the file name excluding the extension) is the same as that of the first still image file F21.
  • the data unit D20 generated by the generation unit 120 includes two files, a first still image file F21 and a second still image file F22.
  • the first still image file F21 includes first management data D21 and first still image data D22.
  • the file name of the first still image file F21 is, for example, “DSC0002.HDR”.
  • the second still image file F22 includes second management data D23 and second still image data D24.
  • the file name of the second still image file F22 is, for example, “DSC0002.JPG”.
  • The body of the file name of the first still image file F21 (the file name excluding the extension) and the body of the file name of the second still image file F22 are both "DSC0002" and are therefore the same.
  • the first still image data D22 is HDR image data, like the first still image data D12. Similar to the second still image data D13, the second still image data D24 is SDR image data.
  • The first management data D21 is composed of the information obtained by removing the second storage address from the management data D11 shown in FIG. 17.
  • The second management data D23 is composed of the information obtained by removing the first storage address from the management data D11 shown in FIG. 17.
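  • The following sketch shows how a playback device might choose between the paired files just described; the extensions ".HDR" and ".JPG" are taken from the example above, while the capability flag and function name are assumptions for illustration.

    # Sketch of selecting one of the paired files "DSC0002.HDR" and
    # "DSC0002.JPG" that share a file-name body. The capability flag
    # and function name are illustrative assumptions.
    from pathlib import Path

    def select_still_image(body: str, directory: Path,
                           display_supports_hdr: bool) -> Path:
        hdr = directory / f"{body}.HDR"   # first still image data (HDR)
        sdr = directory / f"{body}.JPG"   # second still image data (SDR)
        if display_supports_hdr and hdr.exists():
            return hdr
        return sdr

    # Example: select_still_image("DSC0002", Path("/media/card/DCIM"), True)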
  • As shown in FIG. 21, the generation unit 120 may generate, as the data unit D30, an object composed of one first still image file F32 including the first still image data D32, a second still image file F33 including the second still image data D33 whose file name body (the file name excluding the extension) is the same as that of the first still image file F32, and a management file F31 including the management data D31 whose file name body (the file name excluding the extension) is likewise the same as that of the first still image file F32.
  • the data unit D30 generated by the generation unit 120 includes three files, that is, a management file F31, a first still image file F32, and a second still image file F33.
  • Management file F31 includes management data D31.
  • the file name of the management file F31 is, for example, “DSC0003.INFO”.
  • the first still image file F32 includes first still image data D32.
  • the file name of the first still image file F32 is, for example, “DSC0003.HDR”.
  • the second still image file F33 includes second still image data D33.
  • the file name of the second still image file F33 is, for example, “DSC0003.JPG”.
  • The bodies of the file names of these three files (the file names excluding the extension) are all "DSC0003" and are therefore the same.
  • Management data D31 is substantially the same as the management data D11 shown in FIG. 17.
  • the first still image data D32 is HDR image data, like the first still image data D12. Similar to the second still image data D13, the second still image data D33 is SDR image data.
  • the output unit 130 shown in FIG. 15 outputs the data unit generated by the generation unit 120.
  • FIG. 22 is a block diagram schematically illustrating an example of the configuration of the generation unit 120 in the first embodiment.
  • the generation unit 120 includes an HDR image processing unit 121, a conversion unit 123, and a format unit 125.
  • the generation unit 120 may further include at least one of the HDR image compression unit 122 and the SDR image compression unit 124 as indicated by a broken line in FIG. In other words, the generation unit 120 may be configured not to include at least one of the HDR image compression unit 122 and the SDR image compression unit 124.
  • the HDR image processing unit 121 converts the still image data (for example, RAW data) acquired by the acquisition unit 110 into a 10-bit image (HDR image processing) using HDR-EOTF, thereby The image data is converted into HDR still image data having a dynamic range for HDR display.
  • the HDR image processing unit 121 outputs uncompressed HDR still image data.
  • the HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121, and generates compressed HDR still image data.
  • the HDR image compression unit 122 outputs the compressed HDR still image data to the format unit 125.
  • the conversion unit 123 performs SDR conversion on uncompressed HDR still image data, and generates uncompressed SDR still image data.
  • the SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123, and generates compressed SDR still image data.
  • the SDR image compression unit 124 outputs the compressed SDR still image data to the format unit 125.
  • the format unit 125 generates logically one data unit including the HDR still image data compressed by the HDR image compression unit 122 and the SDR still image data compressed by the SDR image compression unit 124, and the generated data The unit is output to the output unit 130.
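  • The flow through the generation unit 120 can be summarized by the schematic sketch below; the transfer curves are deliberately simplified (a crude PQ-like 10-bit HDR encoding and a gamma-2.2 8-bit SDR encoding clipped at 100 nits), and the optional compression steps 122 and 124 are omitted, so this is an assumed outline rather than the processing of the embodiment itself.

    # Schematic, simplified outline of the generation unit 120 (FIG. 22).
    # The curves are crude approximations and compression is omitted.
    import numpy as np

    def hdr_image_processing(raw_nits: np.ndarray) -> np.ndarray:
        """121: map linear-light RAW data (in nits) to a 10-bit HDR image."""
        signal = (raw_nits.clip(0, 10000) / 10000) ** 0.159   # crude PQ-like curve
        return np.round(signal * 1023).astype(np.uint16)

    def sdr_conversion(hdr_img: np.ndarray) -> np.ndarray:
        """123: SDR-convert the uncompressed HDR image (clip at 100 nits, gamma 2.2, 8 bits)."""
        nits = (hdr_img.astype(np.float64) / 1023) ** (1 / 0.159) * 10000
        signal = (nits.clip(0, 100) / 100) ** (1 / 2.2)
        return np.round(signal * 255).astype(np.uint8)

    def format_unit(hdr_img: np.ndarray, sdr_img: np.ndarray) -> dict:
        """125: bundle both images into one logical data unit."""
        return {"first_still_image": hdr_img, "second_still_image": sdr_img}

    def generate_data_unit(raw_nits: np.ndarray) -> dict:
        hdr_img = hdr_image_processing(raw_nits)
        return format_unit(hdr_img, sdr_conversion(hdr_img))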
  • the still image data may include information regarding imaging (hereinafter referred to as imaging information) when the image that is the source of the still image data is captured.
  • imaging information includes, for example, information indicating an aperture value, a shutter speed, ISO (International Organization for Standardization) sensitivity, picture control, and the like of a camera that is an imaging apparatus.
  • the generation unit 120 may perform each process by the functional blocks constituting the generation unit 120 using the imaging information.
  • the image processing apparatus 100 may include a generation unit 120A instead of the generation unit 120.
  • the generation unit 120A is different from the generation unit 120 in that the generation unit 120A includes an SDR image processing unit 123A instead of the conversion unit 123.
  • FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit 120A in the first embodiment.
  • the generation unit 120A differs from the generation unit 120 only in the configuration of the SDR image processing unit 123A. Therefore, only the SDR image processing unit 123A will be described below.
  • the SDR image processing unit 123A converts the still image data (eg, RAW data) acquired by the acquisition unit 110 into an 8-bit image (SDR image) using SDR-EOTF (SDR-Electro-Optical Transfer Function). By performing the processing, the still image data is converted into SDR still image data having a dynamic range for SDR display.
  • The SDR image processing unit 123A outputs uncompressed SDR still image data.
  • the SDR image compression unit 124 compresses the uncompressed SDR still image data output from the SDR image processing unit 123A, and generates compressed SDR still image data.
  • the generation unit 120A may also perform each process using the functional blocks constituting the generation unit 120A using the imaging information.
  • FIG. 24 is a flowchart illustrating an example of operations related to image processing of the image processing apparatus 100 according to the first embodiment.
  • the acquisition unit 110 of the image processing apparatus 100 acquires still image data (step S101).
  • the generation unit 120 logically generates one data unit using the still image data acquired by the acquisition unit 110 in step S101 (step S102).
  • one data unit includes HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the output unit 130 outputs one logical data unit generated by the generation unit 120 in step S102 (step S103).
  • The process of step S102 differs between the generation unit 120 illustrated in FIG. 22 and the generation unit 120A illustrated in FIG. 23. The difference is described below using flowcharts.
  • FIG. 25 is a flowchart illustrating an example of generation processing by the generation unit 120 according to the first embodiment.
  • the HDR image processing unit 121 converts the still image data into HDR still image data by performing predetermined image processing on the still image data acquired in step S101 (step S111).
  • the HDR image processing unit 121 outputs the HDR still image data (uncompressed HDR still image data).
  • the HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S112). Note that the process of step S112 may not be performed. Therefore, in FIG. 25, step S112 is indicated by a broken line.
  • the conversion unit 123 performs SDR conversion on the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S113).
  • the conversion unit 123 outputs uncompressed SDR still image data obtained by SDR conversion.
  • The SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123 in step S113 (step S114). Note that the process of step S114 need not be performed; therefore, in FIG. 25, step S114 is indicated by a broken line.
  • The format unit 125 generates a logically single data unit that includes the compressed HDR still image data generated by the HDR image compression unit 122 in step S112 (or the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111) and the compressed SDR still image data generated by the SDR image compression unit 124 in step S114 (or the uncompressed SDR still image data output from the conversion unit 123 in step S113) (step S115).
  • FIG. 26 is a flowchart illustrating an example of generation processing by the generation unit 120A according to the first embodiment.
  • In the generation process by the generation unit 120A, step S113A is performed instead of step S113. Therefore, only step S113A will be described below.
  • the SDR image processing unit 123A of the generating unit 120A converts the still image data into SDR still image data by performing predetermined image processing on the still image data acquired in step S101 (step S113A).
  • the SDR image processing unit 123A outputs the SDR still image data (uncompressed SDR still image data).
  • After step S113A, step S114 or step S115 is performed as in the flowchart shown in FIG. 25.
  • FIG. 27 is a diagram for explaining Example 1 in the first embodiment.
  • Example 1 shows a configuration example in which one file including HDR still image data and SDR still image data is generated using the multi-picture format method.
  • As shown in FIG. 27, the HDR imaging device 10D may generate one file including HDR still image data and SDR still image data using the multi-picture format method.
  • the HDR imaging device 10D includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, a multi-picture format generation unit 13C, an SDR imaging unit 14, and an HDR image correction unit 15.
  • Among these, the conversion unit 12, the JPEG compression unit 13, the multi-picture format generation unit 13C, and the HDR image correction unit 15 correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIG. 15 and FIG. 22.
  • the HDR imaging unit 11 generates an HDR image (HDR still image) by performing imaging in the HDR imaging mode.
  • the HDR imaging unit 11 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
  • the conversion unit 12 is a processing unit corresponding to the conversion unit 123 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the conversion unit 12 generates uncompressed SDR still image data by performing SDR conversion on the uncompressed HDR still image data output from the HDR imaging unit 11.
  • the conversion unit 12 outputs the generated uncompressed SDR still image data to the JPEG compression unit 13.
  • the conversion unit 12 is realized by, for example, a processor, a memory, and the like.
  • the JPEG compression unit 13 is a processing unit corresponding to the SDR image compression unit 124 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the JPEG compression unit 13 generates compressed SDR still image data by performing JPEG compression on the input non-compressed SDR still image data.
  • the JPEG compression unit 13 is realized by, for example, a processor, a memory, and the like.
  • the HDR image correction unit 15 is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the HDR image correction unit 15 generates uncompressed HDR still image data that can be displayed on the HDRTV such as the HDR display device 30 and the HDR display device 61 from the RAW data acquired from the HDR imaging unit 11.
  • the HDR image correction unit 15 is realized by, for example, a processor, a memory, and the like.
  • the multi-picture format generation unit 13C is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The multi-picture format generation unit 13C stores uncompressed HDR still image data and JPEG-compressed SDR still image data in one file using the multi-picture format method, thereby generating an HDR still image file (JPEG MPF) F100 in the HDR still image file format. The multi-picture format generation unit 13C then outputs the generated HDR still image file F100.
  • Here, the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output from the conversion unit 12 is used, but compressed SDR still image data generated by the JPEG compression unit 13 from SDR still image data obtained by imaging in the SDR imaging unit 14 may be used instead.
  • the configuration of the HDR still image file F100 generated by the multi-picture format generation unit 13C corresponds to, for example, the configuration of the file F10 of the data unit D10 shown in FIG.
  • the multi-picture format generation unit 13C is realized by a processor, a memory, and the like, for example.
  • the SDR imaging unit 14 generates an SDR image (SDR still image) by performing imaging in the conventional imaging mode (SDR imaging mode).
  • the SDR imaging unit 14 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • the HDR still image file F100 stores two data, SDR still image data (JPEG data for SDR compatibility) and HDR still image data.
  • the HDR still image data corresponds to the first still image data D12 shown in FIG. 16, and the SDR still image data corresponds to the second still image data D13 shown in FIG.
  • The HDR still image file F100 has the extension ".JPG". Therefore, the HDR still image file F100 can be reproduced (displayed or printed) not only by the HDR display device 61, which supports the multi-picture format, but also by the HDR display device 30, the SDR display device 40, and the SDR printing device 50, which do not support the multi-picture format.
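  • The key property of the file F100 can be pictured with the deliberately simplified sketch below. It is not the real CIPA multi-picture format (which records an MPF index inside an APP2 segment of the first JPEG stream); it only illustrates that the SDR JPEG stream comes first, so a legacy JPEG reader that stops at the end of the first stream still shows the SDR image.

    # Deliberately simplified illustration of storing SDR and HDR still
    # image data in a single ".JPG" file. NOT the real multi-picture
    # format; it ignores the MPF index and complications such as
    # embedded thumbnails that contain their own EOI marker.
    from pathlib import Path

    SOI, EOI = b"\xff\xd8", b"\xff\xd9"   # JPEG start/end-of-image markers

    def write_combined_jpg(sdr_jpeg: bytes, hdr_payload: bytes, out: Path) -> None:
        assert sdr_jpeg.startswith(SOI) and sdr_jpeg.endswith(EOI), \
            "expected a complete JPEG stream (SOI ... EOI)"
        out.write_bytes(sdr_jpeg + hdr_payload)   # HDR data appended after EOI

    def sdr_part(combined: bytes) -> bytes:
        """Roughly what a legacy reader sees: up to the first EOI marker."""
        return combined[:combined.find(EOI) + 2]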
  • An advantage of Example 1 is that the SDR JPEG data (the compressed SDR still image data generated by the JPEG compression unit 13) can be reproduced by devices capable of reproducing existing JPEG files (for example, the SDR display device 40 and the SDR printing device 50).
  • Another advantage of Example 1 is that a function for displaying the HDR still image file F100 can be implemented relatively easily in existing HDR TVs (for example, the HDR display device 30 and the HDR display device 61).
  • A further advantage of Example 1 is that HDR-dedicated processing is relatively easy to realize in the imaging device as well.
  • Note that a multi-picture format file having a new extension may be generated instead.
  • A file with the extension ".JPG" can be edited by image editing software that can edit JPEG files, so the HDR still image data it contains may be deleted by such software.
  • In contrast, a file with a new extension can be edited only by dedicated image editing software capable of handling files with that extension, for both the HDR still image data and the SDR still image data stored in the file, so the possibility that the HDR still image data is deleted is reduced.
  • A multi-picture format file with a new extension has these advantages; however, it is difficult to reproduce such a file with existing devices (for example, the SDR display device 40 and the SDR printing device 50).
  • FIG. 28 is a diagram for explaining an example 2 in the first embodiment.
  • Example 2 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the HEVC (High Efficiency Video Coding) moving image file format.
  • As shown in FIG. 28, the HDR imaging device 10E may generate one data unit including HDR still image data in the HEVC moving image file format and SDR still image data.
  • the HDR imaging device 10E includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, a HEVC compression unit 13D, an SDR imaging unit 14, an HDR image correction unit 15A, and a data unit generation unit 17.
  • Among these, the conversion unit 12, the JPEG compression unit 13, the HEVC compression unit 13D, the HDR image correction unit 15A, and the data unit generation unit 17 correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIG. 15 and FIG. 22.
  • The HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 illustrated in FIG. 28 have substantially the same configuration as the components with the same names illustrated in FIG. 27.
  • the HDR image correction unit 15A is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • The HDR image correction unit 15A generates uncompressed HDR image data by converting the RAW data acquired from the HDR imaging unit 11, using an HDR-EOTF, into HDR image data that can be displayed on an HDR TV such as the HDR display device 30.
  • the HDR image correction unit 15A is realized by, for example, a processor, a memory, and the like.
  • the HEVC compression unit 13D is a processing unit corresponding to the HDR image compression unit 122 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the HEVC compression unit 13D compresses uncompressed HDR image data as a moving image in the HEVC format.
  • the HEVC compression unit 13D is realized by, for example, a processor, a memory, and the like.
  • the data unit generation unit 17 is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The data unit generation unit 17 generates, as a data unit D200, an object composed of an HDR still image file F110 containing HDR still image data compressed in the HEVC moving image format and an SDR still image file F120 containing JPEG-compressed SDR still image data, the bodies of the two file names (the file names excluding the extension) being the same.
  • The data unit generation unit 17 outputs the generated data unit D200.
  • Here, the data unit generation unit 17 uses the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output from the conversion unit 12 to generate the data unit D200, but compressed SDR still image data generated by the JPEG compression unit 13 from SDR still image data obtained by imaging in the SDR imaging unit 14 may be used instead.
  • the configuration of the data unit D200 generated by the data unit generation unit 17 corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the data unit generation unit 17 is realized by, for example, a processor, a memory, and the like.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • An advantage of Example 2 is that, since the HDR still image data is stored as a HEVC moving image format file, it can be displayed on an existing device (for example, the HDR display device 30). Further, if the imaging apparatus has a recording function for the HEVC moving image format, the configuration shown in Example 2 can be realized relatively easily.
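  • As an illustration of how such a data unit D200 might be produced in practice, the following sketch (in Python) encodes an uncompressed HDR still image as a one-frame HEVC moving image file and saves the SDR still image as a JPEG file with the same file name body. It assumes that ffmpeg built with libx265 and the Pillow package are available; the file names DSC0002.MP4, DSC0002.JPG, hdr_frame_hlg.png, and sdr_frame.png are placeholders, not names taken from the embodiment.

      import subprocess
      from PIL import Image

      BODY = "DSC0002"  # common file name body shared by both files of the data unit

      # HDR still image data: one HLG-encoded frame compressed as a HEVC moving image file.
      subprocess.run(
          ["ffmpeg", "-y", "-i", "hdr_frame_hlg.png",
           "-frames:v", "1", "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
           f"{BODY}.MP4"],
          check=True)

      # SDR still image data: the SDR-converted picture stored as an ordinary JPEG file.
      Image.open("sdr_frame.png").convert("RGB").save(f"{BODY}.JPG", "JPEG", quality=95)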
  • FIG. 29 is a diagram for explaining Example 3 of the first embodiment.
  • Example 3 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using a TIFF (Tagged Image File Format) format.
  • the HDR imaging apparatus 10F may generate one data unit including uncompressed TIFF format HDR still image data and uncompressed TIFF format SDR still image data.
  • the HDR imaging device 10F includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a TIFF output unit 17A.
  • The conversion unit 12, the HDR image correction unit 15B, and the TIFF output unit 17A correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 illustrated in FIG. 29 have substantially the same configuration as the components having the same names illustrated in FIG.
  • the HDR image correction unit 15B is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • The HDR image correction unit 15B converts the RAW data acquired from the HDR imaging unit 11, using HDR-OETF (HLG), into a 16-bit or 12-bit image (HDR still image data) that can be displayed on an HDRTV (an HLG (Hybrid Log-Gamma) compatible display device) such as the HDR display device 62.
  • the HDR image correction unit 15B is realized by, for example, a processor, a memory, and the like.
  • the TIFF output unit 17A is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The TIFF output unit 17A generates, as one data unit D300 in the TIFF file format, a TIFF still image file F210 including the HDR still image data and a TIFF still image file F220 including the SDR still image data, and outputs the generated data unit D300.
  • When storing the two data items of HDR still image data and SDR still image data in TIFF files, the TIFF output unit 17A may use a file format having, as TIFF tags, an HDR tag (which identifies SDR, HDR (HLG (System Gamma 1.2)), or HDR (PQ)) and a color space tag (which identifies sRGB, Adobe RGB, or BT.2020).
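  • A rough sketch of writing such tags follows, assuming the Python tifffile package is used and assuming hypothetical private TIFF tag codes 65000 (HDR tag) and 65001 (color space tag); the actual tag codes, tag values, and file names are not specified by this embodiment.

      import numpy as np
      import tifffile

      HDR_TAG = 65000          # hypothetical private tag: 0 = SDR, 1 = HDR (HLG), 2 = HDR (PQ)
      COLOR_SPACE_TAG = 65001  # hypothetical private tag: 0 = sRGB, 1 = Adobe RGB, 2 = BT.2020

      hdr_pixels = np.zeros((2160, 3840, 3), dtype=np.uint16)  # placeholder HLG-encoded image
      sdr_pixels = np.zeros((2160, 3840, 3), dtype=np.uint8)   # placeholder sRGB image

      # HDR still image file (corresponding to F210): tagged as HDR (HLG) / BT.2020.
      tifffile.imwrite("F210_hdr.tif", hdr_pixels,
                       extratags=[(HDR_TAG, 3, 1, 1, True),          # dtype 3 = SHORT
                                  (COLOR_SPACE_TAG, 3, 1, 2, True)])
      # SDR still image file (corresponding to F220): tagged as SDR / sRGB.
      tifffile.imwrite("F220_sdr.tif", sdr_pixels,
                       extratags=[(HDR_TAG, 3, 1, 0, True),
                                  (COLOR_SPACE_TAG, 3, 1, 0, True)])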
  • The TIFF output unit 17A may use, for generation of the data unit D300, the SDR still image data output from the conversion unit 12, or the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D300 generated by the TIFF output unit 17A corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the TIFF output unit 17A is realized by, for example, a processor, a memory, and the like.
  • To allow reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30), the HDR imaging apparatus 10F may simultaneously generate (record), as a separate file, the JPEG-compressed SDR still image data that the JPEG compression unit 13 produces from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • An advantage of the TIFF format is that an HDR display device can handle the TIFF files simply by implementing a TIFF display function, which is easy to realize. Further, since the imaging apparatus only needs to add a function for generating the data unit D300 in the TIFF file format, the configuration shown in Example 3 can be realized relatively easily.
  • A device that supports such a new file format (for example, the HDR display device 62 or the SDR printing device 63) can reproduce either the TIFF still image file F210 or the TIFF still image file F220 in the data unit D300.
  • A TIFF-compatible HDRTV (for example, the HDR display device 62) can display either the HLG HDR still image data or the wide color gamut SDR still image data defined in BT.2020.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • FIG. 30 is a diagram for explaining Example 4 of the first embodiment.
  • Example 4 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the method of generating a HEVC I picture (I picture compression method).
  • the HDR imaging device 10G may generate one data unit including HDR still image data and SDR still image data by using a HEVC I-picture compression method.
  • the HDR imaging device 10G includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and an HEVC compression unit 17B.
  • The conversion unit 12, the HDR image correction unit 15B, and the HEVC compression unit 17B correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B illustrated in FIG. 30 have substantially the same configuration as the components having the same names illustrated in FIG. Therefore, explanation is omitted.
  • The HEVC compression unit 17B is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, the format unit 125 (see FIG. 22), and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • the HEVC compression unit 17B compresses the HDR still image data output from the HDR image correction unit 15B as an HEVC I picture.
  • the HEVC compression unit 17B compresses the SDR still image data output from the conversion unit 12 as an HEVC I picture.
  • The HEVC compression unit 17B generates, as a data unit D400, an object that includes a HEVC-I still image file F310 containing the HDR still image data as a HEVC I picture obtained by the compression and a HEVC-I still image file F320 containing the SDR still image data as a HEVC I picture, the two files having a common body of the file name (the file name excluding the extension). Then, the HEVC compression unit 17B outputs the generated data unit D400.
  • When storing the two data items of HDR still image data and SDR still image data in HEVC-I files, the HEVC compression unit 17B may use a file format having an HDR tag (which identifies SDR, HDR (HLG (System Gamma 1.2)), or HDR (PQ)) and a color space tag (which identifies sRGB, Adobe RGB, or BT.2020).
  • The HEVC compression unit 17B may use, to generate the data unit D400, the SDR still image data output from the conversion unit 12, or the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D400 generated by the HEVC compression unit 17B corresponds to, for example, the configuration of the data unit D20 illustrated in FIG.
  • the HEVC compression unit 17B is realized by, for example, a processor, a memory, and the like.
  • To allow reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30), the HDR imaging device 10G may simultaneously generate (record), as a separate file, the JPEG-compressed SDR still image data that the JPEG compression unit 13 produces from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • The HEVC compression unit 17B may generate an HDR still image file (JPEG MPF) in an HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file by the multi-picture format method. That is, the HEVC compression unit 17B may generate an HDR still image file having a configuration similar to that of the data unit D10 illustrated in FIG.
  • Alternatively, the HEVC compression unit 17B may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
  • the advantage of using the HEVC I picture compression method is that, since the existing HDR display device has a HEVC decoding function, display or playback of HEVC I pictures can be implemented relatively easily.
  • Since an imaging apparatus that supports 4K video imaging often has a HEVC compression function, it is relatively easy to implement a function that compresses HDR still image data and SDR still image data as HEVC I pictures.
  • A printing apparatus that supports decoding of HEVC I pictures (for example, the SDR printing apparatus 63) can print using either file in the data unit D400.
  • An HDRTV capable of displaying a HEVC I picture can display either the HLG HDR still image or the wide color gamut SDR still image defined in BT.2020.
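  • Since existing devices already include HEVC decoders, checking that a stored still image file really consists of a single I picture is straightforward; the following sketch does this with ffprobe (assumed to be installed), and the file name is a placeholder.

      import json
      import subprocess

      def is_single_hevc_i_picture(path):
          """Return True if the file contains exactly one video frame and it is an I picture."""
          result = subprocess.run(
              ["ffprobe", "-v", "error", "-select_streams", "v:0",
               "-show_frames", "-of", "json", path],
              capture_output=True, text=True, check=True)
          frames = json.loads(result.stdout).get("frames", [])
          return len(frames) == 1 and frames[0].get("pict_type") == "I"

      print(is_single_hevc_i_picture("DSC0004_hdr.mp4"))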
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • FIG. 31 is a diagram for explaining Example 5 of the first embodiment.
  • Example 5 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the JPEG2000 compression method.
  • the HDR imaging apparatus 10H may generate one data unit including HDR still image data and SDR still image data using a JPEG2000 compression method.
  • the HDR imaging device 10H includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a JPEG2000 compression unit 17C.
  • The conversion unit 12, the HDR image correction unit 15B, and the JPEG2000 compression unit 17C correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B illustrated in FIG. 31 have substantially the same configuration as the components having the same names illustrated in FIG. Therefore, explanation is omitted.
  • The JPEG2000 compression unit 17C is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, the format unit 125 (see FIG. 22), and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • the JPEG2000 compression unit 17C compresses the HDR still image data output from the HDR image correction unit 15B using the JPEG2000 method.
  • the JPEG 2000 compression unit 17C compresses the SDR still image data output from the conversion unit 12 using the JPEG 2000 method.
  • The JPEG2000 compression unit 17C generates, as a data unit D500, an object that includes a JPEG2000 still image file F410 containing the HDR still image data in JPEG2000 format obtained by the compression and a JPEG2000 still image file F420 containing the SDR still image data in JPEG2000 format, the two files having a common body of the file name (the file name excluding the extension). Then, the JPEG2000 compression unit 17C outputs the generated data unit D500.
  • When storing the two data items of HDR still image data and SDR still image data as JPEG2000 files, the JPEG2000 compression unit 17C may use a file format having an HDR tag (which identifies SDR, HDR (HLG (System Gamma 1.2)), or HDR (PQ)) and a color space tag (which identifies sRGB, Adobe RGB, or BT.2020).
  • The JPEG2000 compression unit 17C may use, to generate the data unit D500, the SDR still image data output from the conversion unit 12, or the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D500 generated by the JPEG2000 compression unit 17C corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the JPEG2000 compression unit 17C is realized by, for example, a processor, a memory, and the like.
  • To allow reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30), the HDR imaging apparatus 10H may simultaneously generate (record), as a separate file, the JPEG-compressed SDR still image data that the JPEG compression unit 13 produces from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • The JPEG2000 compression unit 17C may generate an HDR still image file (JPEG MPF) in an HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file by the multi-picture format method. That is, the JPEG2000 compression unit 17C may generate an HDR still image file having a configuration similar to that of the data unit D10 illustrated in FIG.
  • Alternatively, the JPEG2000 compression unit 17C may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
  • An advantage of using the JPEG2000 compression method is that a function corresponding to JPEG2000 can be implemented on an existing HDR display device relatively easily.
  • An HDRTV compatible with JPEG2000 (for example, the HDR display device 62) can play back either the HDR JPEG2000 still image file F410 or the SDR JPEG2000 still image file F420.
  • Such an HDRTV can display either the HLG HDR still image or the wide color gamut SDR still image defined in BT.2020.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment.
  • the playback device 200 includes an acquisition unit 210 and a playback unit 220.
  • The playback device 200 may be realized as a display device that further includes a display unit (not shown) and displays the playback result on the display unit.
  • Alternatively, the playback device 200 may be realized as a printing device that further includes a printing unit and prints the playback result on a print medium such as paper.
  • the acquisition unit 210 logically acquires one data unit.
  • one data unit includes HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the reproduction unit 220 reproduces one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210.
  • the playback unit 220 may select and play back the HDR still image data according to the auxiliary information added to the data unit.
  • The reproducing unit 220 may perform luminance adjustment of the HDR still image data so that all of the luminance regions indicated as prioritized by the luminance region information, among the luminance regions of the HDR still image data, are included, and may reproduce the image data on which the luminance adjustment has been performed.
  • For example, when reproducing HDR still image data (for example, the HDR still image data illustrated in FIG.) for which the high-luminance regions are indicated as prioritized, the reproducing unit 220 may convert the HDR still image data into image data adjusted so that all of the prioritized high-luminance regions are included, and reproduce the converted image data.
  • Likewise, when reproducing HDR still image data for which the low-luminance regions are indicated as prioritized, the reproducing unit 220 may convert the HDR still image data into image data adjusted so that all of the prioritized low-luminance regions are included, and reproduce the converted image data.
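  • One possible interpretation of such a luminance adjustment is sketched below: the prioritized region (bounded by a threshold carried in the luminance region information) is kept within the capability of the connected display. The function, its name, and the use of linear-light values in cd/m^2 are assumptions for illustration only, not the adjustment actually defined for the reproducing unit 220.

      import numpy as np

      def adjust_luminance(hdr_nits, display_peak_nits, priority, threshold_nits):
          """hdr_nits: per-pixel linear luminance of the HDR still image (cd/m^2).
          priority: "high" if the high-luminance region is prioritized, "low" otherwise."""
          if priority == "high":
              # Scale the whole image down so every prioritized highlight (above the
              # threshold, up to the scene maximum) still fits below the display peak.
              scene_max = max(float(hdr_nits.max()), threshold_nits)
              gain = min(1.0, display_peak_nits / scene_max)
              return hdr_nits * gain
          # Low-luminance priority: keep shadows and midtones at their original level
          # and simply clip whatever exceeds the display peak.
          return np.clip(hdr_nits, 0.0, display_peak_nits)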
  • FIG. 33 is a flowchart showing an example of an operation related to the reproduction process of the reproduction apparatus 200 according to the first embodiment.
  • the acquisition unit 210 of the playback device 200 acquires a data unit (step S201).
  • the playback unit 220 plays back one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210 in step S201 (step S202).
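  • A minimal sketch of this reproduction step follows, assuming the data unit has already been resolved into one HDR file and one SDR file sharing a file name body; the file names and the capability flag are placeholders.

      from pathlib import Path

      def choose_still_image(data_unit, display_supports_hdr):
          """Pick which of the two independently reproducible images to reproduce."""
          if display_supports_hdr and "hdr" in data_unit:
              return data_unit["hdr"]   # e.g. the HEVC / TIFF / JPEG2000 HDR still image file
          return data_unit["sdr"]       # fall back to the JPEG SDR still image file

      unit = {"hdr": Path("DSC0002.MP4"), "sdr": Path("DSC0002.JPG")}
      print(choose_still_image(unit, display_supports_hdr=True))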
  • The acquisition unit 210 and the reproduction unit 220 may be realized by, for example, a processor that executes a predetermined program (a program created so as to execute the above-described processes) and a memory that stores the predetermined program.
  • Instead of the processor and the memory, they may be realized by a dedicated circuit that executes each of the processes described above.
  • FIG. 34 is a diagram for describing a specific example of auxiliary information in the first embodiment.
  • the image processing apparatus 100 may add auxiliary information when generating data units. That is, the image processing apparatus 100 may generate a data unit to which auxiliary information is added and output the data unit to which auxiliary information is added.
  • the auxiliary information may include information indicating that a high-quality image can be reproduced.
  • When the playback device 200 is an SDR playback device that supports the new still image format, the auxiliary information may include information for use in determining whether the SDR image to be displayed is the JPEG-format SDR image as it is (route B shown in FIG. 34) or an SDR image obtained by SDR conversion of the HDR image (HLG) (route A shown in FIG. 34).
  • the auxiliary information may include information indicating that a high-definition image can be reproduced when the SDR reproduction device displays the HDR image (HLG) after SDR conversion.
  • The auxiliary information may include luminance region information indicating, for the luminance of the still picture based on the HDR still picture data, whether or not the high-luminance region is given priority, or luminance region information indicating whether or not the low-luminance region is given priority. That is, the auxiliary information may include a flag indicating whether the high-luminance region is prioritized when generating the HDR image, or a flag indicating whether the low-luminance region is prioritized when generating the HDR image. Further, the auxiliary information in this case may include a threshold value that determines the high-luminance region or the low-luminance region that is prioritized when the still image is generated.
  • The auxiliary information may also include a flag (a conversion prohibition flag) indicating the photographer's instruction that, when an SDR image is displayed on the SDR display device, the JPEG SDR image should be displayed as it is rather than an SDR image obtained by converting the HDR image.
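  • The fields described above could be collected into a structure such as the following sketch; all field names are illustrative, since the embodiment only specifies the semantics of the auxiliary information, not its encoding.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class AuxiliaryInfo:
          hdr_reproducible: bool           # a higher-quality (HDR) image can be reproduced
          sdr_converted_from_hdr: bool     # True: route A (SDR obtained by converting the HLG HDR image)
                                           # False: route B (the JPEG SDR image as captured)
          high_luminance_priority: bool    # high-luminance region prioritized when the HDR image was generated
          low_luminance_priority: bool     # low-luminance region prioritized
          luminance_threshold: Optional[float] = None  # threshold bounding the prioritized region
          conversion_prohibited: bool = False          # photographer's instruction: show the JPEG SDR image as it is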
  • HDR still image data, SDR still image data, and management information may be stored in one file.
  • Alternatively, a plurality of files whose file name bodies (the file name excluding the extension) are made common to each other, as a DCF (Design rule for Camera File System) object, may be generated as one logical data unit in which the files are associated with each other.
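  • As an illustration of this second variant, the following sketch groups the files in a directory into logical data units by their file name body, in the spirit of a DCF object; the directory layout and file names shown are assumptions.

      from collections import defaultdict
      from pathlib import Path

      def collect_data_units(directory):
          """Group files sharing a file name body (name excluding the extension) into one unit."""
          units = defaultdict(list)
          for path in Path(directory).iterdir():
              if path.is_file():
                  units[path.stem].append(path)   # path.stem is the file name without extension
          return dict(units)

      # e.g. {"DSC0002": [DSC0002.JPG, DSC0002.MP4], "DSC0003": [DSC0003.JPG, DSC0003.TIF]}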
  • As described above, the image processing apparatus includes: an acquisition unit that acquires still image data obtained by imaging; a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other; and an output unit that outputs the data unit generated by the generation unit.
  • The playback device includes: an acquisition unit that acquires one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be played back independently of each other; and a reproduction unit that reproduces one of the first still image data and the second still image data included in the data unit acquired by the acquisition unit.
  • The image processing method acquires still image data obtained by imaging, uses the acquired still image data to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and outputs the generated data unit.
  • The reproduction method acquires one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and reproduces one of the first still image data and the second still image data included in the acquired data unit.
  • the image processing apparatus 100 is an example of an image processing apparatus.
  • the acquisition unit 110 is an example of an acquisition unit included in the image processing apparatus.
  • Each of the generation unit 120 and the generation unit 120A is an example of a generation unit.
  • the output unit 130 is an example of an output unit.
  • the multi-picture format generation unit 13C, the data unit generation unit 17, the TIFF output unit 17A, the HEVC compression unit 17B, and the JPEG2000 compression unit 17C correspond to the output unit 130, respectively.
  • the HDR still image data is an example of first still image data.
  • the SDR still image data is an example of second still image data.
  • the playback device 200 is an example of a playback device.
  • the acquisition unit 210 is an example of an acquisition unit included in the playback device.
  • Each of the data unit D10, the data unit D20, the data unit D30, the data unit D200, the data unit D300, the data unit D400, and the data unit D500 is an example of a data unit generated by the generation unit of the image processing apparatus, and is also an example of the data unit acquired by the acquisition unit of the playback device.
  • the playback unit 220 is an example of a playback unit.
  • the image processing apparatus 100 includes an acquisition unit 110, a generation unit 120, and an output unit 130.
  • the acquisition unit 110 acquires still image data.
  • The generation unit 120 uses the still image data acquired by the acquisition unit 110 to logically generate one data unit including first still image data (HDR still image data) and second still image data (SDR still image data) that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the output unit 130 outputs the data unit (for example, the data unit D10) generated by the generation unit 120.
  • The playback device 200 includes: an acquisition unit 210 that logically acquires one data unit (for example, the data unit D10) including first still image data (HDR still image data) and second still image data (SDR still image data) that have different luminance dynamic ranges and can be played back independently of each other; and a reproduction unit 220 that reproduces one of the first still image data (HDR still image data) and the second still image data (SDR still image data) included in the data unit (for example, the data unit D10) acquired by the acquisition unit 210.
  • The image processing apparatus 100 configured in this manner can output logically one data unit including HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other. Further, the playback device 200 can acquire and play back the data unit. For this reason, the image processing apparatus 100 outputs the data unit, and either the HDR still image data or the SDR still image data included in the data unit can be reproduced by a reproduction apparatus (for example, the reproduction apparatus 200). Therefore, the image processing apparatus 100 can provide still image data that is highly convenient for the user.
  • the generation unit may generate one file including the first still image data and the second still image data as a data unit.
  • the first still image data D12 is an example of first still image data.
  • the second still image data D13 is an example of second still image data.
  • the file F10 is an example of one file including the first still image data and the second still image data.
  • the data unit D10 is an example of a data unit generated by the generation unit.
  • the HDR still image file F100 corresponds to the file F10.
  • The generation unit 120 generates, as a data unit D10, one file F10 including the first still image data D12 (HDR still image data) and the second still image data D13 (SDR still image data).
  • With the image processing apparatus 100 configured as described above, it is possible to prevent the HDR still image data and the SDR still image data from being managed separately.
  • The generation unit may generate, as a data unit, an object including a first still image file that includes the first still image data and a second still image file that includes the second still image data and has the same file name body as the first still image file.
  • Each of the first still image data D22 and the first still image data D32 is an example of first still image data.
  • Each of the second still image data D24 and the second still image data D33 is an example of second still image data.
  • Each of the first still image file F21 and the first still image file F32 is an example of a first still image file.
  • Each of the second still image file F22 and the second still image file F33 is an example of a second still image file.
  • Each of DSC0002 and DSC0003 is an example of a body of a file name (a file name excluding an extension).
  • Each of the data unit D20 and the data unit D30 is an example of a data unit generated by the generation unit.
  • the HDR still image file F110, the TIFF still image file F210, the HEVC-I still image file F310, and the JPEG2000 still image file F410 each correspond to the first still image file F21 (or the first still image file F32).
  • Each of the SDR still image file F120, the TIFF still image file F220, the HEVC-I still image file F320, and the JPEG2000 still image file F420 corresponds to the second still image file F22 (or the second still image file F33).
  • Data unit D200, data unit D300, data unit D400, and data unit D500 each correspond to data unit D20 (or data unit D30).
  • The generation unit 120 may generate, as a data unit (for example, the data unit D20), an object including an HDR still image file (for example, the first still image file F21) that includes the HDR still image data (for example, the first still image data D22) and an SDR still image file (for example, the second still image file F22) that includes the SDR still image data (for example, the second still image data D24) and has the same file name body (for example, DSC0002) as the HDR still image file (for example, the first still image file F21).
  • With this, if a playback apparatus (for example, the playback apparatus 200) supports playback of either HDR still image files or SDR still image files, it can reproduce the image using the file it supports.
  • The generation unit may further add, to the data unit, auxiliary information indicating that a higher quality image than the image obtained by reproducing the second still image data can be reproduced by reproducing the first still image data.
  • the auxiliary information included in the management data D11 is an example of auxiliary information.
  • The generation unit 120 adds, to the data unit (for example, the data unit D10), auxiliary information indicating that, by reproducing the HDR still image data (for example, the first still image data D12), a higher quality image than the image obtained by reproducing the SDR still image data (for example, the second still image data D13) can be reproduced.
  • With the image processing apparatus 100 configured as described above, a playback device (for example, the playback device 200) that has received the data unit (for example, the data unit D10) can play back the still image, according to the auxiliary information, with the best quality that makes maximum use of the playback capability of the playback device.
  • The generation unit may add, to the data unit, auxiliary information including luminance region information indicating, for the luminance of the still image based on the first still image data, whether the high-luminance region is prioritized or whether the low-luminance region is prioritized.
  • The generation unit 120 may add, to the data unit, auxiliary information including luminance region information indicating, for the luminance of the still image based on the HDR still image data (first still image data), whether or not the high-luminance region is prioritized, or whether or not the low-luminance region is prioritized.
  • With the image processing apparatus 100 configured as described above, a reproduction apparatus (for example, the reproduction apparatus 200) that has received the data unit can play back the still image, according to the auxiliary information, with a quality that effectively utilizes the reproduction capability of the reproduction apparatus.
  • The first still image data may be HDR image data, and the second still image data may be SDR image data.
  • In the embodiment described above, the first still image data is HDR image data, and the second still image data is SDR image data.
  • the first embodiment has been described as an example of the technique disclosed in the present application.
  • the technology in the present disclosure is not limited to this, and can be applied to embodiments in which changes, replacements, additions, omissions, and the like are made.
  • each component may be configured by dedicated hardware, or may be realized by a processor executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU (Central Processing Unit) or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the software that realizes the image processing method or the reproduction method according to the above embodiment is the following program.
  • That is, this program causes a computer to execute an image processing method of acquiring still image data obtained by imaging, using the acquired still image data to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and outputting the generated data unit.
  • Alternatively, this program causes a computer to execute a reproduction method of acquiring logically one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and reproducing one of the first still image data and the second still image data included in the acquired data unit.
  • the present disclosure can be applied to an image processing apparatus that can obtain highly convenient still image data, a reproduction apparatus that can reproduce the still image data, an image processing method, and a reproduction method.
  • the present disclosure is applicable to an imaging device such as a camera, a display device such as a television, or a printing device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image processing device capable of obtaining highly convenient still image data. The image processing device comprises: an acquisition unit that acquires still image data obtained by imaging; a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit including first still image data and second still image data, each of which has a different luminance dynamic range and can be reproduced independently; and an output unit that outputs the data unit generated by the generation unit.
PCT/JP2017/026748 2016-07-27 2017-07-25 Dispositif de traitement d'image, dispositif de reproduction, procédé de traitement d'image et procédé de reproduction WO2018021261A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP17834267.1A EP3493532B8 (fr) 2016-07-27 2017-07-25 Dispositif de traitement d'image et procédé de traitement d'image
US16/317,081 US11184596B2 (en) 2016-07-27 2017-07-25 Image processing device, reproduction device, image processing method, and reproduction method
CN201780046042.0A CN109479111B (zh) 2016-07-27 2017-07-25 图像处理装置、再生装置、图像处理方法以及再生方法
JP2018529885A JPWO2018021261A1 (ja) 2016-07-27 2017-07-25 画像処理装置、再生装置、画像処理方法、および、再生方法

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662367425P 2016-07-27 2016-07-27
US62/367425 2016-07-27
JP2017-123170 2017-06-23
JP2017123170 2017-06-23

Publications (1)

Publication Number Publication Date
WO2018021261A1 true WO2018021261A1 (fr) 2018-02-01

Family

ID=61016485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/026748 WO2018021261A1 (fr) 2016-07-27 2017-07-25 Dispositif de traitement d'image, dispositif de reproduction, procédé de traitement d'image et procédé de reproduction

Country Status (1)

Country Link
WO (1) WO2018021261A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019159620A1 (fr) * 2018-02-16 2019-08-22 キヤノン株式会社 Dispositif d'imagerie, dispositif d'enregistrement et dispositif de commande d'affichage
CN113491104A (zh) * 2019-02-28 2021-10-08 佳能株式会社 摄像设备和图像处理设备及其控制方法和程序

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007074124A (ja) * 2005-09-05 2007-03-22 Murata Mach Ltd ネットワークファクシミリ装置、画像出力システム、及び画像出力方法
JP2007534238A (ja) * 2004-04-23 2007-11-22 ブライトサイド テクノロジーズ インコーポレイテッド 高ダイナミックレンジ画像の符号化、復号化、及び表現
JP2008204266A (ja) * 2007-02-21 2008-09-04 Canon Inc ファイル管理装置およびその制御方法、プログラム
JP2014023062A (ja) * 2012-07-20 2014-02-03 Canon Inc 撮像装置およびその制御方法
JP2014204175A (ja) * 2013-04-01 2014-10-27 キヤノン株式会社 画像処理装置及びその制御方法
JP2015056807A (ja) 2013-09-12 2015-03-23 キヤノン株式会社 撮像装置及びその制御方法
WO2016039172A1 (fr) * 2014-09-12 2016-03-17 ソニー株式会社 Dispositif de lecture, procédé de lecture, dispositif de traitement de l'information, procédé de traitement de l'information, programme et support d'enregistrement
WO2016039025A1 (fr) * 2014-09-08 2016-03-17 ソニー株式会社 Dispositif de traitement d'informations, support d'enregistrement d'informations, procédé de traitement d'informations, et programme

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007534238A (ja) * 2004-04-23 2007-11-22 ブライトサイド テクノロジーズ インコーポレイテッド 高ダイナミックレンジ画像の符号化、復号化、及び表現
JP2007074124A (ja) * 2005-09-05 2007-03-22 Murata Mach Ltd ネットワークファクシミリ装置、画像出力システム、及び画像出力方法
JP2008204266A (ja) * 2007-02-21 2008-09-04 Canon Inc ファイル管理装置およびその制御方法、プログラム
JP2014023062A (ja) * 2012-07-20 2014-02-03 Canon Inc 撮像装置およびその制御方法
JP2014204175A (ja) * 2013-04-01 2014-10-27 キヤノン株式会社 画像処理装置及びその制御方法
JP2015056807A (ja) 2013-09-12 2015-03-23 キヤノン株式会社 撮像装置及びその制御方法
WO2016039025A1 (fr) * 2014-09-08 2016-03-17 ソニー株式会社 Dispositif de traitement d'informations, support d'enregistrement d'informations, procédé de traitement d'informations, et programme
WO2016039172A1 (fr) * 2014-09-12 2016-03-17 ソニー株式会社 Dispositif de lecture, procédé de lecture, dispositif de traitement de l'information, procédé de traitement de l'information, programme et support d'enregistrement

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019159620A1 (fr) * 2018-02-16 2019-08-22 キヤノン株式会社 Dispositif d'imagerie, dispositif d'enregistrement et dispositif de commande d'affichage
JP2019145917A (ja) * 2018-02-16 2019-08-29 キヤノン株式会社 撮像装置、記録装置及び表示制御装置
CN111727603A (zh) * 2018-02-16 2020-09-29 佳能株式会社 摄像装置、记录装置和显示控制装置
US11308991B2 (en) 2018-02-16 2022-04-19 Canon Kabushiki Kaisha Image capture device, recording device, and display control device
CN111727603B (zh) * 2018-02-16 2023-03-21 佳能株式会社 摄像装置、控制方法和记录介质
JP7246855B2 (ja) 2018-02-16 2023-03-28 キヤノン株式会社 撮像装置、記録装置及び表示制御装置
JP7551812B2 (ja) 2018-02-16 2024-09-17 キヤノン株式会社 撮像装置、撮像装置の制御方法、記録装置、及び、記録装置の制御方法
CN113491104A (zh) * 2019-02-28 2021-10-08 佳能株式会社 摄像设备和图像处理设备及其控制方法和程序
CN113491104B (zh) * 2019-02-28 2023-07-18 佳能株式会社 摄像设备和图像处理设备及其控制方法和存储介质
US11750934B2 (en) 2019-02-28 2023-09-05 Canon Kabushiki Kaisha Imaging apparatus, image processing apparatus, control method of these, and storage medium

Similar Documents

Publication Publication Date Title
CN109479111B (zh) 图像处理装置、再生装置、图像处理方法以及再生方法
JP5991502B2 (ja) 変換方法および変換装置
US10567727B2 (en) Reproduction method, creation method, reproduction device, creation device, and recording medium
JP2016225965A (ja) 表示方法および表示装置
JP2005252754A (ja) 画像ファイル生成装置および方法ならびに画像ファイル再生装置および方法
JP3926947B2 (ja) 画像データ形成装置および画像データ処理方法
US20090154551A1 (en) Apparatus for recording/reproducing moving picture, and recording medium thereof
US9756278B2 (en) Image processing system and image capturing apparatus
US20160037014A1 (en) Imaging apparatus and imaging apparatus control method
US8625002B2 (en) Image processing apparatus and control method thereof for use in multiplexing image data and additional information
WO2018021261A1 (fr) Dispositif de traitement d'image, dispositif de reproduction, procédé de traitement d'image et procédé de reproduction
US20090153704A1 (en) Recording and reproduction apparatus and methods, and a storage medium having recorded thereon computer program to perform the methods
US8379093B2 (en) Recording and reproduction apparatus and methods, and a recording medium storing a computer program for executing the methods
US10965925B2 (en) Image capturing apparatus, client apparatus, control method, and storage medium
JP2010021710A (ja) 撮像装置、画像処理装置およびプログラム
JP2007274661A (ja) 撮像装置、画像再生装置およびプログラム
EP3522526B1 (fr) Procédé d'édition, procédé de création, dispositif d'édition, dispositif de création et support d'enregistrement
JP2017011676A (ja) 画像処理装置及び画像処理方法
US20240348940A1 (en) Imaging apparatus configured to generate raw image, control method, and storage medium
JP2008312021A (ja) 画像表示システム、画像再生装置、撮影機器
JP2018019122A (ja) 画像データ処理装置および撮像装置
JP4809451B2 (ja) 画像ファイル生成装置および方法ならびに画像ファイル再生装置および方法
JP2010081510A (ja) 映像処理装置及び映像処理方法
JP2009164768A (ja) 画像ファイル作成装置、画像ファイル作成方法、画像ファイル修復装置
JP2009124224A (ja) カラーデータ処理装置および表示データ処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17834267

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018529885

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017834267

Country of ref document: EP

Effective date: 20190227