WO2018021261A1 - Image processing device, reproduction device, image processing method, and reproduction method - Google Patents

Image processing device, reproduction device, image processing method, and reproduction method

Info

Publication number
WO2018021261A1
Authority
WO
WIPO (PCT)
Prior art keywords
still image
image data
unit
hdr
data
Prior art date
Application number
PCT/JP2017/026748
Other languages
French (fr)
Japanese (ja)
Inventor
上坂 靖
小塚 雅之
美裕 森
福島 俊之
和彦 甲野
柏木 吉一郎
茂生 阪上
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to JP2018529885A (published as JPWO2018021261A1)
Priority to EP17834267.1A (published as EP3493532B8)
Priority to CN201780046042.0A (published as CN109479111B)
Priority to US16/317,081 (published as US11184596B2)
Publication of WO2018021261A1

Classifications

    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 20/00: Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B 20/10: Digital recording or reproducing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/91: Television signal processing therefor
    • H04N 5/92: Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/91: Television signal processing therefor
    • H04N 5/93: Regeneration of the television signal or of selected parts thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level

Definitions

  • the present disclosure relates to an image processing device, a playback device, an image processing method, and a playback method.
  • Patent Document 1 discloses an imaging apparatus that records an HDR (High Dynamic Range) still image with a wide dynamic range by combining a plurality of images with different exposures.
  • the present disclosure provides an image processing device, a playback device, an image processing method, and a playback method that can obtain highly convenient still image data.
  • An image processing apparatus according to the present disclosure includes an acquisition unit that acquires still image data obtained by imaging, a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and an output unit that outputs the data unit generated by the generation unit.
  • A playback device according to the present disclosure includes an acquisition unit that acquires logically one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be played back independently of each other, and a playback unit that plays back one of the first still image data and the second still image data included in the acquired data unit.
  • the image processing apparatus can obtain highly convenient still image data.
  • FIG. 1 is a diagram for explaining the evolution of video technology.
  • FIG. 2 is a diagram for explaining the HDR display technique.
  • FIG. 3A is a diagram for explaining a PQ (Perceptual Quantization) method.
  • FIG. 3B is a diagram for explaining an HLG (Hybrid Log Gamma) system.
  • FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR.
  • FIG. 5 is a diagram for explaining imaging devices that support HDR or SDR, the file formats of image data obtained by those imaging devices, and display devices that display the image data or printing devices that print the image data.
  • FIG. 6 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
  • FIG. 7 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
  • FIG. 8 is a diagram for describing an HDR image captured for HDR display.
  • FIG. 9 is a diagram for explaining the difference between the color space of a moving image and the color space of a still image.
  • FIG. 10 is a diagram showing a comparison between Ultra HD Blu-ray (registered trademark, the same applies hereinafter) and Blu-ray.
  • FIG. 11 is a diagram for explaining an HDR imaging device that generates an HDR image with a wide luminance range.
  • FIG. 12 is a diagram for explaining the HDR still image file format.
  • FIG. 13 is a diagram for explaining the multi-picture format.
  • FIG. 14 is a diagram for explaining the JPEG XT system that handles JPEG data and difference data for HDR expansion in association with each other.
  • FIG. 15 is a block diagram schematically illustrating an example of the configuration of the image processing apparatus according to the first embodiment.
  • FIG. 16 is a diagram schematically illustrating an example in which one logical data unit is configured to include one file including two types of still image data.
  • FIG. 17 is a diagram schematically illustrating an example of information included in the management data.
  • FIG. 18 is a diagram illustrating an example of a relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram.
  • FIG. 19 is a diagram showing another example of the relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram.
  • FIG. 20 is a diagram schematically illustrating an example of a case where one logical data unit includes two files.
  • FIG. 21 is a diagram schematically illustrating another example in which one data unit is configured to include two files.
  • FIG. 22 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment.
  • FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment.
  • FIG. 24 is a flowchart illustrating an example of an operation related to image processing of the image processing apparatus according to the first embodiment.
  • FIG. 25 is a flowchart illustrating an example of a generation process by the generation unit according to the first embodiment.
  • FIG. 26 is a flowchart illustrating an example of generation processing by the generation unit according to the first embodiment.
  • FIG. 27 is a diagram for explaining Example 1 in the first embodiment.
  • FIG. 28 is a diagram for explaining Example 2 in the first embodiment.
  • FIG. 29 is a diagram for explaining Example 3 in the first embodiment.
  • FIG. 30 is a diagram for explaining Example 4 in the first embodiment.
  • FIG. 31 is a diagram for explaining Example 5 in the first embodiment.
  • FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment.
  • FIG. 33 is a flowchart illustrating an example of operations related to the reproduction processing of the reproduction device according to the first embodiment.
  • FIG. 34 is a diagram for describing a specific example of auxiliary information in the first embodiment.
  • the present disclosure is intended to provide a new user value of HDR still image and a new photographic culture by using two technologies of HDR (High Dynamic Range) display technology and HDR imaging technology.
  • the above-mentioned new user value improves the sense of reality and reduces whiteout (a state in which the gradation of a bright region is impaired) and blackout (a state in which a gradation of a dark region is impaired).
  • Specifically, the present disclosure generates still image data that allows an HDR still image, obtained by capturing an image with a camera that supports HDR still image capture, to be displayed on a display device that supports HDR display of the HDR still image (hereinafter referred to as an “HDR display device”).
  • An HDR still image is also called an HDR photograph.
  • The present disclosure also provides an image processing apparatus and an image processing method capable of generating still image data that can be displayed on a display device that is compatible with SDR display but not with HDR display (hereinafter referred to as an “SDR display device”), and that can be printed by a printing apparatus that supports SDR still image printing but not HDR still image printing (hereinafter referred to as an “SDR printing apparatus”). That is, the present disclosure targets not only devices that support HDR still image processing but also devices that support SDR still image processing and do not support HDR still image processing.
  • the present invention provides an image processing apparatus and an image processing method capable of improving the convenience of HDR still image data by providing still image data capable of reproducing HDR still images.
  • reproduction of an HDR still image includes display of the HDR still image and printing by performing image processing on the HDR still image. That is, in the present disclosure, reproduction includes display and printing.
  • For example, it is conceivable to store JPEG (Joint Photographic Experts Group) data that is SDR still image data and new HDR still image data in one file, and to attach the JPG extension to the file so that compatibility can be realized.
  • a user using such a portable terminal can obtain HDR still image data by HDR synthesis realized on the portable terminal. Therefore, it is easier for the user himself to generate and manage the SDR shooting file (conventional JPEG file) and the HDR shooting file individually than before.
  • When HDR still image data is generated by an imaging device such as a camera, a conventional JPEG file may be generated at the same time, and the two data, the HDR still image data and the SDR still image data, may be stored in one file. It is also possible to provide the imaging apparatus with an option that allows the two data to be managed individually.
  • When a new HDR data format function is added to a television set or to a camera with a moving image recording function, a HEVC encoder is already installed in these devices, so a still image data format based on HEVC compression can be used.
  • When TIFF (Tagged Image File Format) is used, an HDR format that maintains compatibility with conventional devices, such as HLG (Hybrid Log Gamma), is one of the data format candidates for that device.
  • The present disclosure can therefore provide an image processing device and a playback device that can meet various requirements for an HDR still image data format, such as whether compatibility with conventional devices is needed and how large the implementation burden on those devices is.
  • FIG. 1 is a diagram for explaining the evolution of video technology.
  • FIG. 2 is a diagram for explaining the HDR display technique.
  • HDR is not just a method for realizing a very bright television set.
  • HDR is a system that extends the luminance range of an image from the 0.1 nit to 100 nit range of the BT.709 (Broadcasting Service (Television) 709) standard to, for example, 0 to 10,000 nit (in the case of ST 2084), enabling expression of bright sunlight, sky, and specular reflections that could not be expressed before, and enabling a bright part and a dark part to be recorded at the same time.
  • The “luminance” here refers to optical luminance (brightness of light).
  • HDR display technology includes the SMPTE ST 2084 (PQ) format, which is suitable for video that undergoes grading processing (processing that adjusts the color and tone of video) after imaging, such as packaged video, and the HLG (Hybrid Log Gamma) format, which is suitable for video distributed via IP (Internet Protocol) and for live broadcasting.
  • HDR display technology includes the HLG method that can realize compatibility between SDR and HDR, and the PQ method that does not have simple display compatibility between SDR and HDR.
  • FIG. 3A is a diagram for explaining the PQ method.
  • FIG. 3B is a diagram for explaining the HLG method.
  • SMPTE ST2084 is a method in which SDR and HDR are not compatible.
  • SDR and HDR are individually graded and transmitted separately.
  • Therefore, when HDR video data produced with the PQ method is displayed on an SDRTV, SDR conversion for converting the HDR video data into SDR video data is required.
  • HLG (ITU-R BT.2100 Hybrid Log Gamma) is a method having compatibility between SDR and HDR. In this method, HLG grading is performed, and only the HLG stream is transmitted. The HLG stream is compatible with SDR. For this reason, when HDR video data is displayed on an SDRTV, SDR conversion for converting the HDR video data into SDR video data is unnecessary.
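  • As a concrete illustration of the PQ method described above, the following is a minimal Python sketch of the SMPTE ST 2084 inverse EOTF (absolute luminance in nit to a normalized signal) and EOTF. The constants are the published ST 2084 values; the 10-bit quantization at the end is an illustrative assumption added here and is not part of the present disclosure.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) transfer functions.
# Constants are the published ST 2084 values; the 10-bit quantization
# in the demo at the bottom is an illustrative assumption.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK_NIT = 10000.0         # PQ signals cover 0 to 10,000 nit


def pq_inverse_eotf(luminance_nit: float) -> float:
    """Map absolute luminance (nit) to a normalized PQ signal in [0, 1]."""
    y = max(0.0, min(luminance_nit, PEAK_NIT)) / PEAK_NIT
    num = C1 + C2 * y ** M1
    den = 1.0 + C3 * y ** M1
    return (num / den) ** M2


def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] back to absolute luminance (nit)."""
    e = max(0.0, min(signal, 1.0)) ** (1.0 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return PEAK_NIT * (num / den) ** (1.0 / M1)


if __name__ == "__main__":
    for nit in (0.1, 100.0, 1000.0, 10000.0):
        code10 = round(pq_inverse_eotf(nit) * 1023)   # assumed 10-bit code value
        print(f"{nit:8.1f} nit -> code {code10:4d} -> {pq_eotf(code10 / 1023):8.1f} nit")
```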
  • FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR.
  • FIG. 4 shows an HDR image and an SDR image of a single image having a relatively large difference in brightness, in which a relatively dark scene in the room and a relatively bright scene outside the window are mixed.
  • the HDR image is an image obtained by reproducing HDR still image data or HDR moving image data.
  • An SDR image is an image obtained by reproducing SDR still image data or SDR moving image data.
  • a relatively bright scene outside the window and a relatively dark scene in the room are both expressed with appropriate brightness.
  • In the SDR image, the exposure is adjusted so that the relatively bright scenery outside the window is expressed properly. As a result, the relatively dark scenery in the room becomes too dark and is partly crushed (blacked out), making it difficult to see. Conversely, if the exposure were adjusted so that the scenery in the room is represented properly, the scenery outside the window would become too bright and partly overexposed, making it difficult to see (not shown).
  • The HDR image, in contrast, can represent a combination of a relatively bright scene and a relatively dark scene that is difficult to realize in an SDR image, that is, an image with rich gradation in which both blackout and whiteout are reduced.
  • FIG. 5 is a diagram for explaining imaging devices that support HDR or SDR, the file formats of image data obtained by those imaging devices, and display devices that display the image data or printing devices that print the image data.
  • the HDR imaging device 10 shown in FIG. 5 supports HDR imaging.
  • the HDR imaging device 10 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, and a JPEG compression unit 13.
  • The HDR imaging device 10 is configured so that image data obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 can be displayed on the SDR display device 40 or printed by the SDR printing device 50. Specifically, in the HDR imaging device 10, the HDR still image data of an HDR image obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 is converted into SDR still image data by the conversion unit 12.
  • the SDR still image data obtained by the conversion in the conversion unit 12 is JPEG compressed in the JPEG compression unit 13, and the SDR still image data in the JPEG format obtained by the compression is output.
  • SDR still image data of an SDR image obtained by imaging in the conventional imaging mode (SDR imaging mode) in the SDR imaging unit 14 is also JPEG compressed by the JPEG compression unit 13, and the JPEG-format SDR still image data obtained by the compression is output.
  • the SDR imaging device 20 includes an SDR imaging unit 21 and a JPEG compression unit 22.
  • In the SDR imaging device 20, as in the case where the HDR imaging device 10 performs imaging in the conventional imaging mode (SDR imaging mode), the SDR still image data of the SDR image obtained by imaging in the SDR imaging unit 21 is JPEG compressed by the JPEG compression unit 22, and the JPEG-format SDR still image data obtained by the compression is output.
  • The HDR display device 30, the SDR display device 40, and the SDR printing device 50 acquire either SDR still image data obtained by SDR conversion of HDR still image data captured in HDR, or SDR still image data obtained by SDR imaging, and reproduce (display or print) the SDR image based on that SDR still image data.
  • FIGS. 6 and 7 are diagrams for explaining the HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
  • Some smartphones, digital cameras, and the like have an HDR shooting mode that can capture images with a wide luminance range.
  • In the HDR shooting mode, as shown in FIG. 6 and (a) of FIG. 7, double exposure (imaging the same subject a plurality of times in different exposure states) is performed in order to obtain image data with a wide luminance range, and the two SDR images obtained in this way are combined so as to fit within the luminance range defined by SDR. Accordingly, as shown in FIG. 6 and (b) of FIG. 7, the combined image can be displayed on an SDR display device.
  • FIG. 8 is a diagram for explaining an HDR image captured for HDR display.
  • An HDR image for HDR display is captured over a luminance range of the scene that is wider than in the SDR imaging mode.
  • the image data obtained by this imaging is graded to generate an HDR image for HDR display, and the HDR image is transmitted to each device and reproduced. Since an HDR image has a wider luminance range than an SDR image, it cannot be displayed on an SDR display device as it is. In order to display the HDR image on the SDR display device, it is necessary to convert the HDR image into the SDR image.
  • In the HDR shooting mode described above, by contrast, the combined image is generated so as to fit within the luminance range defined by SDR, and can therefore be reproduced by the HDR display device 30, the SDR display device 40, or the SDR printing device 50.
  • FIG. 9 is a diagram for explaining the difference between the color space of a moving image and the color space of a still image.
  • BT.709 is a standard related to the color space of moving images, and sRGB (standard RGB) is a standard related to the color space of still images; the two define substantially the same color gamut.
  • Color spaces extended from the color space defined by BT.709 or sRGB are also defined.
  • BT.2020, standardized for Ultra HD, is a color space wider than DCI (Digital Cinema Initiatives) P3 or Adobe RGB. For this reason, BT.2020 can cover the color spaces of both DCI P3 and Adobe RGB.
  • The DCI P3 color space and the Adobe RGB color space have approximately the same area, but the regions they cover differ from each other.
  • FIG. 10 is a diagram showing a comparison between Ultra HD Blu-ray and Blu-ray.
  • Ultra HD Blu-ray exceeds Blu-ray in all items of resolution, color space, HDR (maximum luminance), compression technology, and transfer rate.
  • HDR display devices such as HDRTV have been proposed that can display HDR image data for displaying HDR images without performing SDR conversion.
  • In cameras having the HDR shooting mode (HDR imaging function), HDR technology has mainly been used for the purpose of backlight correction.
  • Still images captured with a camera using such HDR technology have mainly been reproduced on SDR display devices or SDR printing devices.
  • Therefore, although a camera having an HDR imaging function could in principle generate HDR image data with a wide luminance range that makes full use of the display capability of an HDRTV, such data has generally not been generated.
  • FIG. 11 is a diagram for explaining an HDR imaging apparatus 10A that generates an HDR image with a wide luminance range.
  • In order to make use of the HDR display function of an HDRTV (for example, the HDR display device 30), HDR image data for HDR display may be generated and displayed on the HDRTV as it is, without being converted into SDR image data.
  • An HDR imaging device 10A illustrated in FIG. 11 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and an HDMI (registered trademark, the same applies hereinafter) (High-Definition Multimedia Interface) output unit 16.
  • the HDR image correction unit 15 performs HDR image correction in order to generate HDR image data.
  • The HDR image correction unit 15 converts the RAW data obtained by imaging with the HDR imaging unit 11 into a 10-bit image that can be displayed on an HDRTV corresponding to the HDR10 standard (for example, the HDR display device 30), using, for example, an HDR-EOTF (HDR Electro-Optical Transfer Function) such as the PQ curve.
  • the HDR imaging device 10A outputs the HDR image data obtained by the HDR image correction unit 15 from the HDMI output unit 16 to the HDRTV (for example, the HDR display device 30).
  • The HDRTV (for example, the HDR display device 30) displays the HDR image corresponding to the HDR image data.
  • the HDR imaging device 10A and the HDR display device 30 need to be connected to each other with an HDMI cable corresponding to the HDMI 2.0 standard. That is, the HDR imaging device 10A cannot transmit the HDR image data as it is to a device that does not support the HDMI 2.0 standard.
  • an HDR still image file format is required so that HDR image data can be transmitted to a device that does not support the HDMI 2.0 standard.
  • data can be exchanged between the HDR imaging device 10A, the HDR display device 30, the SDR display device 40, and the SDR printing device 50 in the SDR still image file format. Therefore, an HDR still image file format for storing and exchanging data in the HDR format is required.
  • the HDR still image file format has the following problems.
  • FIG. 12 is a diagram for explaining the HDR still image file format.
  • The HDR imaging device 10B illustrated in FIG. 12 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG compression unit 13A.
  • the HDR still image file format does not have a widely used file format such as JPEG, which is one of the SDR still image file formats.
  • The file format for storing HDR image data needs to extend the color space to the BT.2020 region and cover a wide luminance range. Further, in order to suppress banding and the like, which are conspicuous in the case of a still image, it is desirable that an HDR image can be displayed with a gradation of at least 10 bits, preferably 12 bits or more.
  • the JPEG file format is limited to SDR and limited to a color space defined by sRGB, and the gradation is limited to 8 bits.
  • FIG. 12 shows an example in which a JPEG-based file format is used as a file format for storing HDR still image data in the JPEG compression unit 13A of the HDR imaging apparatus 10B.
  • the JPEG-based file format has problems such as banding due to insufficient gradation display capability with respect to image quality. Therefore, using a JPEG-based file format as a file format for storing HDR still image data has not been put into practical use.
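  • To make the gradation argument above concrete, the following sketch counts how many distinct code values an 8-bit, 10-bit, and 12-bit encoding can devote to a smooth gradient; the gradient itself and the bit depths compared are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch: why 8-bit gradation tends to show banding on a
# smooth gradient while 10-bit or 12-bit gradation does not. The gradient
# and the bit depths compared here are assumptions for illustration only.

def quantize(values, bits):
    """Quantize normalized values in [0, 1] to the given bit depth."""
    levels = (1 << bits) - 1
    return [round(v * levels) for v in values]

# A smooth horizontal gradient of 4096 samples from black to peak white.
gradient = [i / 4095 for i in range(4096)]

for bits in (8, 10, 12):
    codes = quantize(gradient, bits)
    distinct = len(set(codes))
    # Fewer distinct codes across the same gradient means wider visible
    # steps, i.e. more conspicuous banding in a still image.
    print(f"{bits:2d}-bit: {distinct:5d} distinct steps over the gradient")
```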
  • FIG. 13 is a diagram for explaining the multi-picture format.
  • Multi-picture format is a format that can store multiple photo data in one file.
  • In the multi-picture format, a plurality of still image data that would otherwise be individual files, such as a main image (HDR still image data), a still image for monitor display (hereinafter referred to as a monitor display image; SDR still image data), and multi-viewpoint (stereoscopic) images, can be recorded in one file in association with each other.
  • The multi-picture format includes the Baseline MP file shown in (a) of FIG. 13 and the Extended MP file shown in (b) of FIG. 13.
  • the Baseline MP file shown in FIG. 13A can record the main image (HDR still image data) and the monitor display image (SDR still image data) in one file in association with each other.
  • Baseline MP file extension is “.JPG”.
  • a monitor display image corresponding to the main image can be reproduced by a conventional device or conventional software, and the main image (HDR still image data) can be displayed as it is on the HDR display device.
  • The advantage of the Baseline MP file is that an existing display device or printing device can play back the monitor display image (that is, the SDR still image data) that is stored together with the main image for compatibility.
  • On the other hand, image editing software may mistake the Baseline MP file for a normal JPEG file and erase the second image data (the HDR still image data). This is because the Baseline MP file stores two data in one file but uses the extension “.JPG” of a JPEG file, which stores only one data in one file. However, this problem does not occur with image editing software capable of editing the multi-picture format.
  • The Extended MP file shown in (b) of FIG. 13 can record, for example, two multi-view images (multi-view image 1 and multi-view image 2) used for stereoscopic viewing or the like in one file in association with each other.
  • An Extended MP file is defined as a file format with a new extension so that one image is not lost when played back or saved using a conventional device or conventional software.
  • The advantage of the Extended MP file is that, although two data are stored in one file as in a JPEG file, the extension “.JPG” of a JPEG file is not used. For this reason, the file cannot be edited by image editing software other than software compatible with the multi-picture format.
  • FIG. 14 is a diagram for explaining the JPEG XT method that handles JPEG data and difference data for HDR expansion in association with each other.
  • The HDR imaging device 10C illustrated in FIG. 14 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG XT compression unit 13B.
  • JPEG XT is standardized as ISO/IEC 18477.
  • This standard defines a method for handling JPEG data storing SDR still image data and difference data for HDR expansion in association with each other.
  • the JPEG XT compression unit 13B performs JPEG XT compression processing on the HDR image data that has been subjected to HDR image correction by the HDR image correction unit 15.
  • JPEG XT makes it possible to play back SDR still image data with an apparatus that can play back existing JPEG data.
  • However, in order to reproduce an HDR still image, the display device or printing device must combine the SDR still image data with the difference data for HDR expansion. That is, special processing corresponding to the JPEG XT HDR still image file format, which differs from the normal HDR display function, is required in the display device or the printing device.
  • An existing HDRTV (for example, the HDR display device 30 shown in FIG. 14), display device, or printing device can therefore only play back the SDR still image data included in JPEG XT.
  • The JPEG XT HDR still image can be reproduced (displayed) only by a display device that supports playback of the JPEG XT HDR still image file format (that is, a display device capable of the special processing, different from the normal HDR display function, needed to play back that format), such as the HDR display device 60 shown in FIG. 14.
  • In addition, the imaging device (for example, the HDR imaging device 10C) must be equipped with a function (for example, the JPEG XT compression unit 13B) for generating JPEG XT data.
  • JPEG XT has many problems to be solved, and is not often used in HDR imaging devices and HDR display devices.
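  • The following sketch illustrates only the general idea of the layered base-plus-extension structure described above: an SDR base image is usable by any JPEG decoder, while an HDR-capable decoder additionally applies difference data. The additive reconstruction shown is a simplifying assumption for illustration; the actual JPEG XT profiles (ISO/IEC 18477) define their own combination methods.

```python
# Conceptual sketch of layered base + extension decoding, in the spirit of
# the JPEG XT structure described above. The additive reconstruction used
# here is a simplifying assumption; real JPEG XT profiles define their own
# (profile-specific) ways of combining base and residual layers.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LayeredStill:
    sdr_base: List[float]                 # decodable by any JPEG decoder
    hdr_residual: Optional[List[float]]   # extension layer, ignored by legacy decoders

def reconstruct(still: LayeredStill, hdr_capable: bool) -> List[float]:
    """Legacy devices use only the base; HDR-capable devices add the residual."""
    if not hdr_capable or still.hdr_residual is None:
        return list(still.sdr_base)
    return [b + r for b, r in zip(still.sdr_base, still.hdr_residual)]

# A legacy SDR display device simply ignores the residual layer:
still = LayeredStill(sdr_base=[0.1, 0.5, 0.9], hdr_residual=[0.0, 0.2, 0.8])
print(reconstruct(still, hdr_capable=False))  # SDR-only playback
print(reconstruct(still, hdr_capable=True))   # HDR reconstruction
```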
  • the image processing apparatus described below is for generating such a data format.
  • FIG. 15 is a block diagram schematically illustrating an example of the configuration of the image processing apparatus 100 according to the first embodiment.
  • the image processing apparatus 100 includes an acquisition unit 110, a generation unit 120, and an output unit 130.
  • the image processing apparatus 100 may be incorporated in the imaging apparatus or may be realized as a single apparatus.
  • The acquisition unit 110 acquires still image data obtained by imaging with an imaging unit (not shown) such as an image sensor. At this time, imaging is performed so as to include brightness over a wide range, from dark to bright, so that still image data that is, for example, HDR image data including luminance from 0 to 10,000 nits is generated.
  • the acquisition unit 110 may be realized by, for example, a processor that executes a predetermined program (a program created so as to execute the above-described processes) and a memory that stores the predetermined program.
  • the acquisition unit 110 may be realized by a dedicated circuit that executes each of the processes described above.
  • the generation unit 120 logically generates one data unit using the still image data acquired by the acquisition unit 110.
  • The logically single data unit is one piece of data configured to include first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • FIG. 16 is a diagram schematically showing an example in which one data unit D10 is logically configured as one file F10 including two types of still image data (the first still image data D12 and the second still image data D13).
  • FIG. 17 is a diagram schematically illustrating an example of information included in the management data D11.
  • FIG. 18 is a diagram showing an example of the relationship between the luminance value of each pixel constituting the still image of the first still image data D12 and the number of pixels of each luminance value as a histogram.
  • FIG. 19 is a diagram showing another example of the relationship between the luminance value of each pixel constituting the still image of the first still image data D12 and the number of pixels of each luminance value as a histogram.
  • the horizontal axis represents the luminance value
  • the vertical axis represents the number of pixels.
  • FIG. 20 is a diagram schematically illustrating an example in which one data unit D20 is configured to include two files (a first still image file F21 and a second still image file F22).
  • the first still image file F21 includes first still image data D22
  • the second still image file F22 includes second still image data D24.
  • FIG. 21 is a diagram schematically illustrating another example in which one data unit D30 is configured to include two files (a first still image file F32 and a second still image file F33).
  • the first still image file F32 includes first still image data D32
  • the second still image file F33 includes second still image data D33.
  • the generation unit 120 logically generates one data unit.
  • the generation unit 120 may generate one file F10 including the first still image data D12 and the second still image data D13 as the data unit D10.
  • the data unit D10 generated by the generation unit 120 is composed of one file F10.
  • the file F10 includes management data D11, first still image data D12, and second still image data D13.
  • the first still image data D12 is, for example, HDR image data
  • the second still image data D13 is, for example, SDR image data.
  • the file name of the file F10 is “DSC0001.HDR”, for example.
  • the generation unit 120 may further add auxiliary information (see FIG. 17) to the data unit D10 shown in FIG.
  • the auxiliary information may include information indicating that a higher quality image can be reproduced by reproducing the first still image data D12 than when the second still image data D13 is reproduced.
  • Management data D11 is data for managing the first still image data D12 and the second still image data D13. As illustrated in FIG. 17, the management data D11 includes date information, size information, a first storage address, a second storage address, and auxiliary information.
  • the date information is information indicating the date when the still image that is the source of the first still image data D12 and the still image that is the source of the second still image data D13 are captured.
  • the size information is information indicating the size (resolution) of the still image based on the first still image data D12 and the size (resolution) of the still image based on the second still image data D13.
  • the first storage address is information indicating an address where the first still image data D12 is stored in the file F10.
  • the second storage address is information indicating an address where the second still image data D13 is stored in the file F10.
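  • As one way to picture the management data D11 and the auxiliary information introduced above, the following sketch models its fields as a small Python structure. The field names and types are assumptions chosen for illustration; the disclosure specifies only what information the management data carries, not its layout.

```python
# Minimal sketch of the management data D11 described above. Field names
# and types are illustrative assumptions; only the kinds of information
# (date, sizes, storage addresses, auxiliary information) come from the text.

from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class AuxiliaryInfo:
    # e.g. "reproducing the first still image data yields a higher-quality image"
    hdr_preferred: bool = True
    # luminance region information: "high", "low", or None if unspecified
    priority_luminance_region: Optional[str] = None

@dataclass
class ManagementData:
    capture_date: str                      # date the source still images were captured
    first_image_size: Tuple[int, int]      # (width, height) of the first (HDR) still image
    second_image_size: Tuple[int, int]     # (width, height) of the second (SDR) still image
    first_storage_address: int             # where the first still image data is stored in the file
    second_storage_address: int            # where the second still image data is stored in the file
    auxiliary: AuxiliaryInfo = field(default_factory=AuxiliaryInfo)
```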
  • The auxiliary information may include luminance region information indicating whether the luminance of the still image based on the first still image data D12 gives priority to the high luminance region. For example, as illustrated in FIG. 18, in the case of an HDR still image in which many luminance values are distributed in the high luminance region, the generation unit 120 may generate, as part of the management data D11, luminance region information indicating that many luminance values are distributed in the high luminance region.
  • Similarly, the auxiliary information may include luminance region information indicating whether the luminance of the still image based on the first still image data D12 gives priority to the low luminance region. For example, as illustrated in FIG. 19, in the case of an HDR still image in which many luminance values are distributed in the low luminance region, the generation unit 120 may generate, as part of the management data D11, luminance region information indicating that many luminance values are distributed in the low luminance region.
  • the high luminance area is an area having higher luminance than the low luminance area.
  • the high luminance region and the low luminance region may be set so as not to overlap each other, or may be set so as to include regions overlapping each other.
  • the high luminance area may be set as an area having a luminance higher than the maximum luminance value of the SDR, for example.
  • the low luminance area may be set as an area having a luminance equal to or lower than the maximum luminance value of SDR, for example.
  • The generation unit 120 may analyze the first still image data D12 to identify, from the higher luminance values (or from the lower luminance values), the luminance region occupied by a number of pixels equal to or greater than a predetermined ratio of all the pixels constituting the still image, and may generate management data D11 including luminance region information indicating that luminance region as auxiliary information. The luminance region information may also be information set by the user.
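  • A minimal sketch of the luminance analysis described above might look like the following; the 100 nit boundary between the low and high luminance regions and the 50% ratio are assumptions for illustration, since the disclosure leaves the exact regions and the predetermined ratio open.

```python
# Illustrative sketch of deriving luminance region information from the
# first still image data, as described above. The 100 nit boundary
# (roughly the SDR maximum) and the 50% ratio are assumptions.

from typing import List, Optional

SDR_MAX_NIT = 100.0     # assumed boundary between low and high luminance regions
PRIORITY_RATIO = 0.5    # assumed "predetermined ratio" of pixels

def luminance_region_info(pixel_luminance_nit: List[float]) -> Optional[str]:
    """Return 'high', 'low', or None depending on where luminance values concentrate."""
    total = len(pixel_luminance_nit)
    if total == 0:
        return None
    high = sum(1 for nit in pixel_luminance_nit if nit > SDR_MAX_NIT)
    low = total - high
    if high / total >= PRIORITY_RATIO:
        return "high"    # e.g. the histogram of FIG. 18
    if low / total >= PRIORITY_RATIO:
        return "low"     # e.g. the histogram of FIG. 19
    return None

print(luminance_region_info([20.0, 50.0, 800.0, 1500.0, 4000.0]))  # -> "high"
```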
  • Alternatively, as shown in FIG. 20, the generation unit 120 may generate, as the data unit D20, an object composed of a first still image file F21 including the first still image data D22 and a second still image file F22 that includes the second still image data D24 and whose file name body (the file name excluding the extension) is the same as that of the first still image file F21.
  • the data unit D20 generated by the generation unit 120 includes two files, a first still image file F21 and a second still image file F22.
  • the first still image file F21 includes first management data D21 and first still image data D22.
  • the file name of the first still image file F21 is, for example, “DSC0002.HDR”.
  • the second still image file F22 includes second management data D23 and second still image data D24.
  • the file name of the second still image file F22 is, for example, “DSC0002.JPG”.
  • The body of the file name of the first still image file F21 (the file name excluding the extension) and the body of the file name of the second still image file F22 are both “DSC0002”, that is, the same as each other.
  • the first still image data D22 is HDR image data, like the first still image data D12. Similar to the second still image data D13, the second still image data D24 is SDR image data.
  • the first management data D21 is composed of information obtained by removing the second storage address from the management data D11 shown in FIG.
  • the second management data D23 is composed of information obtained by removing the first storage address from the management data D11 shown in FIG.
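  • The two-file data unit D20 shown in FIG. 20 is held together only by the common file-name body. A minimal sketch of generating such a pair of names, and of finding the HDR counterpart of a given SDR file, is shown below; the “.HDR” and “.JPG” extensions follow the example file names in the text, while the helper functions themselves are hypothetical.

```python
# Sketch of the two-file data unit D20: an HDR still image file and an SDR
# still image file linked only by a common file-name body. The ".HDR" and
# ".JPG" extensions follow the example names in the text ("DSC0002.HDR",
# "DSC0002.JPG"); the helper functions themselves are hypothetical.

from pathlib import Path
from typing import Optional, Tuple

def data_unit_names(body: str) -> Tuple[str, str]:
    """Build the paired file names of a data unit from a common body."""
    return f"{body}.HDR", f"{body}.JPG"

def find_hdr_counterpart(sdr_path: Path) -> Optional[Path]:
    """Given an SDR .JPG file, look for the HDR file with the same body."""
    candidate = sdr_path.with_suffix(".HDR")
    return candidate if candidate.exists() else None

print(data_unit_names("DSC0002"))   # ('DSC0002.HDR', 'DSC0002.JPG')
# find_hdr_counterpart(Path("DSC0002.JPG")) returns Path("DSC0002.HDR") if present
```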
  • Alternatively, as shown in FIG. 21, the generation unit 120 may generate, as the data unit D30, an object composed of a first still image file F32 including the first still image data D32, a second still image file F33 that includes the second still image data D33 and whose file name body (the file name excluding the extension) is the same as that of the first still image file F32, and a management file F31 that includes the management data D31 and whose file name body is also the same as that of the first still image file F32.
  • the data unit D30 generated by the generation unit 120 includes three files, that is, a management file F31, a first still image file F32, and a second still image file F33.
  • Management file F31 includes management data D31.
  • the file name of the management file F31 is, for example, “DSC0003.INFO”.
  • the first still image file F32 includes first still image data D32.
  • the file name of the first still image file F32 is, for example, “DSC0003.HDR”.
  • the second still image file F33 includes second still image data D33.
  • the file name of the second still image file F33 is, for example, “DSC0003.JPG”.
  • The bodies of the file names of these three files (the file names excluding the extensions) are all “DSC0003”, that is, the same as each other.
  • Management data D31 is substantially the same as management data D11 shown in FIG.
  • the first still image data D32 is HDR image data, like the first still image data D12. Similar to the second still image data D13, the second still image data D33 is SDR image data.
  • the output unit 130 shown in FIG. 15 outputs the data unit generated by the generation unit 120.
  • FIG. 22 is a block diagram schematically illustrating an example of the configuration of the generation unit 120 in the first embodiment.
  • the generation unit 120 includes an HDR image processing unit 121, a conversion unit 123, and a format unit 125.
  • the generation unit 120 may further include at least one of the HDR image compression unit 122 and the SDR image compression unit 124 as indicated by a broken line in FIG. In other words, the generation unit 120 may be configured not to include at least one of the HDR image compression unit 122 and the SDR image compression unit 124.
  • The HDR image processing unit 121 converts the still image data (for example, RAW data) acquired by the acquisition unit 110 into a 10-bit image using an HDR-EOTF (HDR image processing), thereby converting the still image data into HDR still image data having a dynamic range for HDR display.
  • the HDR image processing unit 121 outputs uncompressed HDR still image data.
  • the HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121, and generates compressed HDR still image data.
  • the HDR image compression unit 122 outputs the compressed HDR still image data to the format unit 125.
  • the conversion unit 123 performs SDR conversion on uncompressed HDR still image data, and generates uncompressed SDR still image data.
  • the SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123, and generates compressed SDR still image data.
  • the SDR image compression unit 124 outputs the compressed SDR still image data to the format unit 125.
  • the format unit 125 generates logically one data unit including the HDR still image data compressed by the HDR image compression unit 122 and the SDR still image data compressed by the SDR image compression unit 124, and the generated data The unit is output to the output unit 130.
  • the still image data may include information regarding imaging (hereinafter referred to as imaging information) when the image that is the source of the still image data is captured.
  • imaging information includes, for example, information indicating an aperture value, a shutter speed, ISO (International Organization for Standardization) sensitivity, picture control, and the like of a camera that is an imaging apparatus.
  • the generation unit 120 may perform each process by the functional blocks constituting the generation unit 120 using the imaging information.
  • the image processing apparatus 100 may include a generation unit 120A instead of the generation unit 120.
  • the generation unit 120A is different from the generation unit 120 in that the generation unit 120A includes an SDR image processing unit 123A instead of the conversion unit 123.
  • FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit 120A in the first embodiment.
  • the generation unit 120A differs from the generation unit 120 only in the configuration of the SDR image processing unit 123A. Therefore, only the SDR image processing unit 123A will be described below.
  • The SDR image processing unit 123A converts the still image data (for example, RAW data) acquired by the acquisition unit 110 into an 8-bit image using an SDR-EOTF (SDR Electro-Optical Transfer Function) (SDR image processing), thereby converting the still image data into SDR still image data having a dynamic range for SDR display.
  • The SDR image processing unit 123A outputs uncompressed SDR still image data.
  • the SDR image compression unit 124 compresses the uncompressed SDR still image data output from the SDR image processing unit 123A, and generates compressed SDR still image data.
  • the generation unit 120A may also perform each process using the functional blocks constituting the generation unit 120A using the imaging information.
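  • The following sketch summarizes the two variants of the generation unit described above: generation unit 120 derives the SDR still image data by SDR-converting the HDR still image data, while generation unit 120A derives it directly from the acquired RAW data. The function bodies are placeholders for illustration only; the actual EOTF application and compression are as described in the text.

```python
# Structural sketch of the two generation-unit variants described above.
# Generation unit 120 obtains SDR data by SDR-converting the HDR data;
# generation unit 120A obtains SDR data directly from the acquired RAW data.
# All processing functions here are placeholders for illustration only.

from typing import Any, Dict

def hdr_image_processing(raw: Any) -> Any:      # corresponds to HDR image processing unit 121
    return {"kind": "HDR", "source": raw}       # placeholder for HDR-EOTF based 10-bit conversion

def sdr_image_processing(raw: Any) -> Any:      # corresponds to SDR image processing unit 123A
    return {"kind": "SDR", "source": raw}       # placeholder for SDR-EOTF based 8-bit conversion

def sdr_convert(hdr_still: Any) -> Any:         # corresponds to conversion unit 123
    return {"kind": "SDR", "source": hdr_still}

def compress(still: Any) -> Any:                # image compression units 122 and 124 (optional)
    return {"compressed": still}

def format_data_unit(hdr_still: Any, sdr_still: Any) -> Dict[str, Any]:  # format unit 125
    return {"first_still_image_data": hdr_still, "second_still_image_data": sdr_still}

def generate_120(raw: Any) -> Dict[str, Any]:
    hdr = hdr_image_processing(raw)
    sdr = sdr_convert(hdr)                      # SDR derived from the HDR still image data
    return format_data_unit(compress(hdr), compress(sdr))

def generate_120A(raw: Any) -> Dict[str, Any]:
    hdr = hdr_image_processing(raw)
    sdr = sdr_image_processing(raw)             # SDR derived directly from the RAW data
    return format_data_unit(compress(hdr), compress(sdr))
```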
  • FIG. 24 is a flowchart illustrating an example of operations related to image processing of the image processing apparatus 100 according to the first embodiment.
  • the acquisition unit 110 of the image processing apparatus 100 acquires still image data (step S101).
  • the generation unit 120 logically generates one data unit using the still image data acquired by the acquisition unit 110 in step S101 (step S102).
  • one data unit includes HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the output unit 130 outputs one logical data unit generated by the generation unit 120 in step S102 (step S103).
  • The process of step S102 differs between the generation unit 120 illustrated in FIG. 22 and the generation unit 120A illustrated in FIG. 23. The difference is described below using flowcharts.
  • FIG. 25 is a flowchart illustrating an example of generation processing by the generation unit 120 according to the first embodiment.
  • the HDR image processing unit 121 converts the still image data into HDR still image data by performing predetermined image processing on the still image data acquired in step S101 (step S111).
  • the HDR image processing unit 121 outputs the HDR still image data (uncompressed HDR still image data).
  • the HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S112). Note that the process of step S112 may not be performed. Therefore, in FIG. 25, step S112 is indicated by a broken line.
  • the conversion unit 123 performs SDR conversion on the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S113).
  • the conversion unit 123 outputs uncompressed SDR still image data obtained by SDR conversion.
  • The SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123 in step S113 (step S114). Note that the process of step S114 need not be performed. Therefore, in FIG. 25, step S114 is indicated by a broken line.
  • The format unit 125 generates logically one data unit including the compressed HDR still image data generated by the HDR image compression unit 122 in step S112 (or the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111) and the compressed SDR still image data generated by the SDR image compression unit 124 in step S114 (or the uncompressed SDR still image data output from the conversion unit 123 in step S113) (step S115).
  • FIG. 26 is a flowchart illustrating an example of generation processing by the generation unit 120A according to the first embodiment.
  • step S113A is performed instead of step S113. Therefore, only step S113A will be described below.
  • the SDR image processing unit 123A of the generating unit 120A converts the still image data into SDR still image data by performing predetermined image processing on the still image data acquired in step S101 (step S113A).
  • the SDR image processing unit 123A outputs the SDR still image data (uncompressed SDR still image data).
  • After step S113A, step S114 or step S115 is performed as in the flowchart shown in FIG. 25.
  • FIG. 27 is a diagram for explaining Example 1 in the first embodiment.
  • Example 1 shows a configuration example in which one file including HDR still image data and SDR still image data is generated using the multi-picture format method.
  • the HDR imaging device 10D may generate one file including HDR still image data and SDR still image data using a multi-picture format method.
  • the HDR imaging device 10D includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, a multi-picture format generation unit 13C, an SDR imaging unit 14, and an HDR image correction unit 15.
  • The conversion unit 12, the JPEG compression unit 13, the multi-picture format generation unit 13C, and the HDR image correction unit 15 correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11 generates an HDR image (HDR still image) by performing imaging in the HDR imaging mode.
  • the HDR imaging unit 11 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
  • the conversion unit 12 is a processing unit corresponding to the conversion unit 123 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the conversion unit 12 generates uncompressed SDR still image data by performing SDR conversion on the uncompressed HDR still image data output from the HDR imaging unit 11.
  • the conversion unit 12 outputs the generated uncompressed SDR still image data to the JPEG compression unit 13.
  • the conversion unit 12 is realized by, for example, a processor, a memory, and the like.
  • the JPEG compression unit 13 is a processing unit corresponding to the SDR image compression unit 124 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the JPEG compression unit 13 generates compressed SDR still image data by performing JPEG compression on the input non-compressed SDR still image data.
  • the JPEG compression unit 13 is realized by, for example, a processor, a memory, and the like.
  • the HDR image correction unit 15 is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the HDR image correction unit 15 generates uncompressed HDR still image data that can be displayed on the HDRTV such as the HDR display device 30 and the HDR display device 61 from the RAW data acquired from the HDR imaging unit 11.
  • the HDR image correction unit 15 is realized by, for example, a processor, a memory, and the like.
  • the multi-picture format generation unit 13C is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The multi-picture format generation unit 13C stores the uncompressed HDR still image data and the JPEG-compressed SDR still image data in one file using the multi-picture format method, thereby generating an HDR still image file (JPEG MPF) F100 in the HDR still image file format. Then, the multi-picture format generation unit 13C outputs the generated HDR still image file F100.
  • For the SDR still image data stored in the HDR still image file F100, the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output from the conversion unit 12 is used; alternatively, compressed SDR still image data generated by the JPEG compression unit 13 from SDR still image data obtained by imaging in the SDR imaging unit 14 may be used.
  • the configuration of the HDR still image file F100 generated by the multi-picture format generation unit 13C corresponds to, for example, the configuration of the file F10 of the data unit D10 shown in FIG.
  • the multi-picture format generation unit 13C is realized by a processor, a memory, and the like, for example.
  • the SDR imaging unit 14 generates an SDR image (SDR still image) by performing imaging in the conventional imaging mode (SDR imaging mode).
  • the SDR imaging unit 14 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • the HDR still image file F100 stores two data, SDR still image data (JPEG data for SDR compatibility) and HDR still image data.
  • the HDR still image data corresponds to the first still image data D12 shown in FIG. 16, and the SDR still image data corresponds to the second still image data D13 shown in FIG.
  • the HDR still image file F100 has an extension “.JPG”. Therefore, the HDR still image file F100 is used not only in the HDR display device 61 that supports the multi-picture format, but also in the HDR display device 30, the SDR display device 40, and the SDR printing device 50 that do not support the multi-picture format. Can be played (displayed or printed).
  • An advantage of Example 1 is that the SDR JPEG (the compressed SDR still image data generated by the JPEG compression unit 13) can be reproduced by an apparatus capable of reproducing an existing JPEG file (for example, the SDR display device 40 and the SDR printing device 50).
  • Another advantage of Example 1 is that a function for displaying the HDR still image file F100 can be implemented relatively easily in an existing HDRTV (for example, the HDR display device 30 and the HDR display device 61).
  • A further advantage of Example 1 is that the HDR-dedicated processing is relatively easy to realize in the imaging apparatus as well.
  • a multi-picture format file having a new extension may be generated.
  • A file having the extension “.JPG” can be edited by image editing software that can edit JPEG files, and such software may delete the HDR still image data.
  • In contrast, the two data (HDR still image data and SDR still image data) stored in a file with a new extension can be edited only by dedicated image editing software that can handle that extension, so the possibility of the HDR still image data being deleted is reduced.
  • a multi-picture format file having a new extension has such advantages. However, it is difficult to reproduce a multi-picture format file having a new extension with an existing device (for example, the SDR display device 40 and the SDR printing device 50).
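  • The sketch below shows, in a very simplified form, the idea behind the single-file container of Example 1: a JPEG-compressed SDR image placed first so that legacy JPEG readers stop at its end-of-image marker, with the HDR still image data appended afterwards. This is not the actual multi-picture format, which additionally requires an MP Index IFD in an APP2 segment of the first image; the byte layout shown here is an illustrative assumption.

```python
# Greatly simplified sketch of the single-file idea of Example 1: the
# JPEG-compressed SDR image comes first (so legacy JPEG readers stop at its
# EOI marker) and the HDR still image data is appended after it. The real
# multi-picture format additionally records an MP Index IFD in an APP2
# segment of the first image; that part is omitted here as an assumption.

JPEG_EOI = b"\xff\xd9"   # JPEG end-of-image marker

def pack_hdr_still_file(sdr_jpeg: bytes, hdr_payload: bytes) -> bytes:
    """Concatenate SDR JPEG and HDR payload into one '.JPG'-named container."""
    if not sdr_jpeg.endswith(JPEG_EOI):
        raise ValueError("SDR image must be a complete JPEG stream")
    return sdr_jpeg + hdr_payload

def split_hdr_still_file(blob: bytes) -> tuple:
    """Recover the SDR JPEG (up to its EOI) and the appended HDR payload."""
    end = blob.find(JPEG_EOI) + len(JPEG_EOI)
    return blob[:end], blob[end:]

container = pack_hdr_still_file(b"\xff\xd8...sdr...\xff\xd9", b"<hdr still image data>")
sdr_part, hdr_part = split_hdr_still_file(container)
```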
  • FIG. 28 is a diagram for explaining an example 2 in the first embodiment.
  • Example 2 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the HEVC (High Efficiency Video Coding) moving image file format.
  • the HDR imaging device 10E may generate one data unit including HDR still image data in the HEVC moving image file format and SDR still image data.
  • the HDR imaging device 10E includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, a HEVC compression unit 13D, an SDR imaging unit 14, an HDR image correction unit 15A, and a data unit generation unit 17.
  • The conversion unit 12, the JPEG compression unit 13, the HEVC compression unit 13D, the HDR image correction unit 15A, and the data unit generation unit 17 correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 illustrated in FIG. 28 have substantially the same configuration as the components having the same names illustrated in FIG.
  • the HDR image correction unit 15A is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • The HDR image correction unit 15A converts the RAW data acquired from the HDR imaging unit 11, using an HDR-EOTF, into HDR image data that can be displayed on an HDRTV such as the HDR display device 30, thereby generating uncompressed HDR image data.
  • the HDR image correction unit 15A is realized by, for example, a processor, a memory, and the like.
  • the HEVC compression unit 13D is a processing unit corresponding to the HDR image compression unit 122 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • the HEVC compression unit 13D compresses uncompressed HDR image data as a moving image in the HEVC format.
  • the HEVC compression unit 13D is realized by, for example, a processor, a memory, and the like.
  • the data unit generation unit 17 is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The data unit generation unit 17 generates, as the data unit D200, an object composed of an HDR still image file F110 including the HDR still image data compressed in the HEVC moving image format and an SDR still image file F120 including the JPEG-compressed SDR still image data, the bodies of whose file names (the file names excluding the extensions) are common. The data unit generation unit 17 then outputs the generated data unit D200.
  • To generate the data unit D200, the data unit generation unit 17 uses the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output from the conversion unit 12; alternatively, compressed SDR still image data generated by the JPEG compression unit 13 from SDR still image data obtained by imaging in the SDR imaging unit 14 may be used.
  • the configuration of the data unit D200 generated by the data unit generation unit 17 corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the data unit generation unit 17 is realized by, for example, a processor, a memory, and the like.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
  • The advantage of Example 2 is that, since the HDR still image data is stored as a HEVC moving image format file, it can be displayed on an existing device (for example, the HDR display device 30). Further, if the imaging apparatus has a recording function in the HEVC moving image format, the configuration shown in Example 2 can be realized relatively easily.
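  • For illustration only, the following sketch shows one way a file pair sharing a common file-name body, like the data unit D200 described above, could be produced and later regrouped. The file names, extensions, copy-based packaging, and in-memory representation are assumptions for this example, not a format prescribed by this disclosure.

```python
# Illustrative sketch: groups separately generated HDR/SDR files into one logical
# "data unit" by giving them a common file-name body (like data unit D200).
from pathlib import Path
from typing import NamedTuple

class DataUnit(NamedTuple):
    body: str          # common file-name body, e.g. "DSC0002"
    hdr_file: Path     # e.g. HEVC-compressed HDR still image
    sdr_file: Path     # e.g. JPEG-compressed SDR still image

def make_data_unit(hdr_src: Path, sdr_src: Path, out_dir: Path, body: str) -> DataUnit:
    """Copy the two independently playable files so that only their
    extensions differ, which is what logically ties them into one unit."""
    out_dir.mkdir(parents=True, exist_ok=True)
    hdr_dst = out_dir / f"{body}{hdr_src.suffix.upper()}"   # e.g. DSC0002.MP4
    sdr_dst = out_dir / f"{body}{sdr_src.suffix.upper()}"   # e.g. DSC0002.JPG
    hdr_dst.write_bytes(hdr_src.read_bytes())
    sdr_dst.write_bytes(sdr_src.read_bytes())
    return DataUnit(body, hdr_dst, sdr_dst)

def find_data_units(folder: Path) -> dict[str, list[Path]]:
    """Re-discover data units later by grouping files on their common stem."""
    units: dict[str, list[Path]] = {}
    for f in folder.iterdir():
        if f.is_file():
            units.setdefault(f.stem, []).append(f)
    return units
```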
  • FIG. 29 is a diagram for explaining an example 3 in the first embodiment.
  • Example 3 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using a TIFF (Tagged Image File Format) format.
  • the HDR imaging apparatus 10F may generate one data unit including uncompressed TIFF format HDR still image data and uncompressed TIFF format SDR still image data.
  • the HDR imaging device 10F includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a TIFF output unit 17A.
  • The conversion unit 12, the HDR image correction unit 15B, and the TIFF output unit 17A correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 illustrated in FIG. 29 have substantially the same configuration as the components having the same names illustrated in FIG.
  • the HDR image correction unit 15B is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100.
  • The HDR image correction unit 15B converts the RAW data acquired from the HDR imaging unit 11, using an HDR-OETF (HLG), into a 16-bit or 12-bit image (HDR still image data) that can be displayed on an HDRTV (an HLG (Hybrid Log-Gamma) compatible display device) such as the HDR display device 62.
  • the HDR image correction unit 15B is realized by, for example, a processor, a memory, and the like.
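  • This disclosure does not spell out the HDR-OETF (HLG) in closed form; as a reference, the sketch below assumes the standard HLG OETF defined in ITU-R BT.2100, followed by an illustrative full-range quantization to a 12-bit code value.

```python
import math

# Reference HLG OETF from ITU-R BT.2100: scene-linear E in [0,1] -> non-linear E' in [0,1].
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

def to_n_bit(e_prime: float, bits: int = 12) -> int:
    """Quantize the non-linear signal to an n-bit code value (full range, for illustration)."""
    return round(e_prime * ((1 << bits) - 1))

# Example: a dark tone, a mid tone, and the peak scene value
for e in (0.01, 0.1, 0.5, 1.0):
    print(e, round(hlg_oetf(e), 4), to_n_bit(hlg_oetf(e)))
```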
  • the TIFF output unit 17A is a processing unit corresponding to the format unit 125 (see FIG. 22) and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • The TIFF output unit 17A generates, as one data unit D300 in the TIFF file format, a TIFF still image file F210 containing the HDR still image data and a TIFF still image file F220 containing the SDR still image data, and outputs the generated data unit D300.
  • When storing the two data, the HDR still image data and the SDR still image data, in respective TIFF files, the TIFF output unit 17A may use a file format that carries, as TIFF tags, an HDR tag (one that identifies SDR, HDR (HLG (System Gamma 1.2)), or HDR (PQ)) and a color space tag (one that identifies sRGB, Adobe RGB, or bt.2020).
  • The TIFF output unit 17A may use the SDR still image data output from the conversion unit 12 to generate the data unit D300, or may use the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D300 generated by the TIFF output unit 17A corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the TIFF output unit 17A is realized by, for example, a processor, a memory, and the like.
  • So that reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) remains possible, the HDR imaging apparatus 10F may also generate (record), simultaneously and as a separate file, the JPEG-compressed SDR still image data produced by the JPEG compression unit 13 from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • An advantage of using the TIFF format is that an HDR display device can handle TIFF files simply by implementing a TIFF display function, which is easy to realize. Further, since the imaging apparatus only needs to add a generation function for generating the data unit D300 in the TIFF file format, the configuration shown in Example 3 can be realized relatively easily.
  • A device that supports such a new file format (for example, the HDR display device 62 or the SDR printing device 63) can reproduce either the TIFF still image file F210 or the TIFF still image file F220 in the data unit D300.
  • A TIFF-compatible HDRTV (for example, the HDR display device 62) can display either the HLG HDR still image data or the wide color gamut SDR still image data defined in bt.2020.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
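  • As an illustration of the tag selection described above, the sketch below enumerates possible values for the HDR tag and the color space tag. The numeric tag IDs and code values are placeholders chosen for this example; the disclosure names the tags but does not fix their encoding.

```python
from enum import IntEnum

# Hypothetical private TIFF tag IDs and code values (placeholders, not defined here).
HDR_TAG_ID = 0xC001
COLOR_SPACE_TAG_ID = 0xC002

class HdrKind(IntEnum):
    SDR = 0
    HDR_HLG_SG12 = 1   # HLG, System Gamma 1.2
    HDR_PQ = 2

class ColorSpace(IntEnum):
    SRGB = 0
    ADOBE_RGB = 1
    BT2020 = 2

def tiff_tags_for(is_hdr_file: bool, uses_pq: bool, wide_gamut: bool) -> dict[int, int]:
    """Choose the tag values to embed in each TIFF file of a data unit such as D300."""
    if not is_hdr_file:
        hdr = HdrKind.SDR
    else:
        hdr = HdrKind.HDR_PQ if uses_pq else HdrKind.HDR_HLG_SG12
    cs = ColorSpace.BT2020 if wide_gamut else ColorSpace.SRGB
    return {HDR_TAG_ID: int(hdr), COLOR_SPACE_TAG_ID: int(cs)}

# e.g. tags for an HLG HDR TIFF (like F210) and a wide-gamut SDR TIFF (like F220)
print(tiff_tags_for(True, False, True))
print(tiff_tags_for(False, False, True))
```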
  • FIG. 30 is a diagram for explaining an example 4 in the first embodiment.
  • Example 4 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the method of generating an I picture of HEVC (the HEVC I-picture compression method).
  • the HDR imaging device 10G may generate one data unit including HDR still image data and SDR still image data by using a HEVC I-picture compression method.
  • the HDR imaging device 10G includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and an HEVC compression unit 17B.
  • The conversion unit 12, the HDR image correction unit 15B, and the HEVC compression unit 17B correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B illustrated in FIG. 30 have substantially the same configuration as the components having the same names illustrated in FIG. Therefore, explanation is omitted.
  • The HEVC compression unit 17B is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, the format unit 125 (see FIG. 22), and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • the HEVC compression unit 17B compresses the HDR still image data output from the HDR image correction unit 15B as an HEVC I picture.
  • the HEVC compression unit 17B compresses the SDR still image data output from the conversion unit 12 as an HEVC I picture.
  • The HEVC compression unit 17B generates, as a data unit D400, an object that includes an HEVC-I still image file F310 containing the HDR still image data compressed as an HEVC I picture and an HEVC-I still image file F320 containing the SDR still image data compressed as an HEVC I picture, and in which the body of each file name (the file name excluding the extension) is common to both files. The HEVC compression unit 17B then outputs the generated data unit D400.
  • When storing the two data, the HDR still image data and the SDR still image data, in respective HEVC-I files, the HEVC compression unit 17B may use a file format that carries an HDR tag (one that identifies SDR, HDR (HLG (System Gamma 1.2)), or HDR (PQ)) and a color space tag (one that identifies sRGB, Adobe RGB, or bt.2020).
  • The HEVC compression unit 17B may use the SDR still image data output from the conversion unit 12 to generate the data unit D400, or may use the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D400 generated by the HEVC compression unit 17B corresponds to, for example, the configuration of the data unit D20 illustrated in FIG.
  • the HEVC compression unit 17B is realized by, for example, a processor, a memory, and the like.
  • So that reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) remains possible, the HDR imaging device 10G may also generate (record), simultaneously and as a separate file, the JPEG-compressed SDR still image data produced by the JPEG compression unit 13 from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • The HEVC compression unit 17B may generate an HDR still image file (JPEG MPF) in the HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file by the multi-picture format method. That is, the HEVC compression unit 17B may generate an HDR still image file having a configuration similar to that of the data unit D10 illustrated in FIG.
  • Alternatively, the HEVC compression unit 17B may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
  • the advantage of using the HEVC I picture compression method is that, since the existing HDR display device has a HEVC decoding function, display or playback of HEVC I pictures can be implemented relatively easily.
  • Since an imaging apparatus that supports 4K video imaging often has an HEVC compression function, it is relatively easy to implement a function that compresses HDR still image data and SDR still image data as HEVC I pictures.
  • A printing apparatus (for example, the SDR printing apparatus 63) or an HDRTV capable of displaying HEVC I pictures can reproduce either the HLG HDR still image or the wide color gamut SDR still image defined in bt.2020.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
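  • As a sketch of how a single still frame could be compressed as an HEVC I picture in practice, the following assumes an environment where the ffmpeg command-line tool with libx265 is available; the file names, container, and pixel format are illustrative assumptions rather than requirements of this disclosure.

```python
import subprocess
from pathlib import Path

def compress_as_hevc_i_picture(still_in: Path, out: Path) -> None:
    """Encode a single still frame as one HEVC intra picture using ffmpeg/libx265.
    10-bit yuv420p10le is used so the HDR gradation is not truncated to 8 bits."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", str(still_in),        # e.g. a 16-bit PNG rendered from the HDR data
            "-frames:v", "1",           # exactly one picture -> an I picture
            "-c:v", "libx265",
            "-pix_fmt", "yuv420p10le",
            str(out),                   # container/extension here is an assumption
        ],
        check=True,
    )

# compress_as_hevc_i_picture(Path("DSC0004_hdr.png"), Path("DSC0004_HDR.MP4"))
# compress_as_hevc_i_picture(Path("DSC0004_sdr.png"), Path("DSC0004_SDR.MP4"))
```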
  • FIG. 31 is a diagram for explaining a fifth example in the first embodiment.
  • Example 5 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the JPEG2000 compression method.
  • the HDR imaging apparatus 10H may generate one data unit including HDR still image data and SDR still image data using a JPEG2000 compression method.
  • the HDR imaging device 10H includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a JPEG2000 compression unit 17C.
  • The conversion unit 12, the HDR image correction unit 15B, and the JPEG2000 compression unit 17C correspond to the generation unit 120 and the output unit 130 of the image processing device 100 described with reference to FIGS. 15 and 22.
  • the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B illustrated in FIG. 31 have substantially the same configuration as the components having the same names illustrated in FIG. Therefore, explanation is omitted.
  • The JPEG2000 compression unit 17C is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, the format unit 125 (see FIG. 22), and the output unit 130 (see FIG. 15) of the generation unit 120 of the image processing apparatus 100.
  • the JPEG2000 compression unit 17C compresses the HDR still image data output from the HDR image correction unit 15B using the JPEG2000 method.
  • The JPEG2000 compression unit 17C compresses the SDR still image data output from the conversion unit 12 using the JPEG2000 method.
  • The JPEG2000 compression unit 17C generates, as a data unit D500, an object that includes a JPEG2000 still image file F410 containing the HDR still image data in JPEG2000 format obtained by the compression and a JPEG2000 still image file F420 containing the SDR still image data in JPEG2000 format, and in which the body of each file name (the file name excluding the extension) is common to both files. The JPEG2000 compression unit 17C then outputs the generated data unit D500.
  • When storing the two data, the HDR still image data and the SDR still image data, as JPEG2000 files, the JPEG2000 compression unit 17C may use a file format that carries an HDR tag (one that identifies HDR (HLG (System Gamma 1.2)) or HDR (PQ)) and a color space tag (one that identifies sRGB, Adobe RGB, or bt.2020).
  • The JPEG2000 compression unit 17C may use the SDR still image data output from the conversion unit 12 to generate the data unit D500, or may use the SDR still image data obtained by imaging in the SDR imaging unit 14.
  • the configuration of the data unit D500 generated by the JPEG2000 compression unit 17C corresponds to, for example, the configuration of the data unit D20 shown in FIG.
  • the JPEG2000 compression unit 17C is realized by, for example, a processor, a memory, and the like.
  • So that reproduction on a conventional device (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) remains possible, the HDR imaging apparatus 10H may also generate (record), simultaneously and as a separate file, the JPEG-compressed SDR still image data produced by the JPEG compression unit 13 from the SDR still image data obtained by SDR conversion of the HDR still image data in the conversion unit 12.
  • The JPEG2000 compression unit 17C may generate an HDR still image file (JPEG MPF) in the HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file by the multi-picture format method. That is, the JPEG2000 compression unit 17C may generate an HDR still image file having a configuration similar to that of the data unit D10 illustrated in FIG.
  • Alternatively, the JPEG2000 compression unit 17C may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
  • An advantage of using the JPEG2000 compression method is that a function corresponding to JPEG2000 can be mounted on an existing HDR display device relatively easily.
  • an HDRTV compatible with JPEG2000 (for example, the HDR display device 62) can play back either an HDR JPEG2000 still image file F410 or an SDR JPEG2000 still image file F420.
  • An HDRTV compatible with JPEG2000 (for example, the HDR display device 62) can display either the HLG HDR still image or the wide color gamut SDR still image defined in bt.2020.
  • each processing unit described above may be realized by a dedicated circuit that executes each processing described above, instead of the processor and the memory.
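  • For illustration, the sketch below writes a pair of JPEG2000 files using the Pillow library (which requires an OpenJPEG-enabled build). The suffix-based naming and file names are assumptions chosen only so the two files of the pair can be told apart in this example; they do not represent an extension scheme prescribed by this disclosure.

```python
# Sketch assuming Pillow built with OpenJPEG support; file names are placeholders.
from PIL import Image

def save_pair_as_jpeg2000(hdr_rgb: Image.Image, sdr_rgb: Image.Image, body: str) -> None:
    """Write the two independently decodable JPEG2000 files of a data unit
    such as D500, here distinguished by a placeholder suffix."""
    hdr_rgb.save(f"{body}_HDR.jp2")   # e.g. DSC0005_HDR.jp2
    sdr_rgb.save(f"{body}_SDR.jp2")   # e.g. DSC0005_SDR.jp2

# save_pair_as_jpeg2000(Image.open("hdr_render.png"), Image.open("sdr_render.png"), "DSC0005")
```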
  • FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment.
  • the playback device 200 includes an acquisition unit 210 and a playback unit 220.
  • The playback device 200 may be realized as a display device that further includes a display unit (not shown) and displays the playback result on the display unit.
  • The playback device 200 may also be realized as a printing device that further includes a printing unit and prints the playback result on a print medium such as paper.
  • the acquisition unit 210 logically acquires one data unit.
  • one data unit includes HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the reproduction unit 220 reproduces one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210.
  • the playback unit 220 may select and play back the HDR still image data according to the auxiliary information added to the data unit.
  • When reproducing the HDR still image data, the reproduction unit 220 may perform luminance adjustment on the HDR still image data so that all the luminance regions indicated as prioritized by the luminance region information are included, and may reproduce the image data on which the luminance adjustment has been performed. For example, when reproducing HDR still image data for which the high-luminance regions are indicated as prioritized (for example, the HDR still image data illustrated in FIG.), the reproduction unit 220 may convert the HDR still image data into image data whose luminance has been adjusted so that all the prioritized high-luminance regions are included, and may reproduce the converted image data. Likewise, when the low-luminance regions are indicated as prioritized, the reproduction unit 220 may reproduce the HDR still image data after converting it into image data whose luminance has been adjusted so that all the prioritized low-luminance regions are included.
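  • The adjustment curve itself is not specified here; purely as an illustration of the idea, the sketch below assumes linear luminance values in nits and a simple clip-or-knee adjustment that keeps the prioritized region within the display's peak luminance.

```python
# Conceptual sketch only: the actual luminance adjustment performed by the
# reproduction unit is not limited to this simple scale/knee curve.
def adjust_luminance(pixels_nits, display_peak_nits, prioritize_high, threshold_nits):
    out = []
    for y in pixels_nits:
        if prioritize_high:
            # keep everything up to the prioritized high-luminance threshold,
            # then scale so the threshold maps to the display peak
            y = min(y, threshold_nits) * (display_peak_nits / threshold_nits)
        else:
            # keep the low-luminance (dark) region untouched and softly compress the rest
            if y > threshold_nits:
                y = threshold_nits + (y - threshold_nits) * 0.25
            y = min(y, display_peak_nits)
        out.append(y)
    return out

# e.g. fit a scene with highlights up to 4000 nit onto a 1000 nit display
print(adjust_luminance([50, 400, 2500, 4000], 1000, True, 4000))
```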
  • FIG. 33 is a flowchart showing an example of an operation related to the reproduction process of the reproduction apparatus 200 according to the first embodiment.
  • the acquisition unit 210 of the playback device 200 acquires a data unit (step S201).
  • the playback unit 220 plays back one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210 in step S201 (step S202).
  • the acquisition unit 210 and the reproduction unit 220 may be realized by, for example, a processor that executes a predetermined program (a program created so as to execute the above-described processes) and a memory that stores the predetermined program.
  • Instead of the processor and the memory, the acquisition unit 210 and the reproduction unit 220 may be realized by a dedicated circuit that executes each of the processes described above.
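  • The following sketch illustrates, under assumed field names, how a playback unit might choose between the two independently playable images of a data unit based on the device's capability and the auxiliary information; it is a conceptual example, not the selection logic defined by this disclosure.

```python
# Minimal selection-logic sketch for a playback unit; all names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AuxInfo:
    hdr_available: bool = True          # "a higher-quality image can be reproduced"
    conversion_prohibited: bool = False # photographer asks to use the JPEG SDR as-is

def choose_image(data_unit: dict, device_supports_hdr: bool, aux: Optional[AuxInfo]):
    """Return which of the two independently playable images to reproduce."""
    if device_supports_hdr and (aux is None or aux.hdr_available):
        return data_unit["hdr"]
    # SDR-only device: either use the stored JPEG SDR image directly, or
    # (if allowed) an SDR image derived from the HDR (HLG) image.
    if aux is not None and not aux.conversion_prohibited and "hdr" in data_unit:
        return ("convert_hdr_to_sdr", data_unit["hdr"])
    return data_unit["sdr"]

unit = {"hdr": "DSC0002.HDR", "sdr": "DSC0002.JPG"}
print(choose_image(unit, device_supports_hdr=False, aux=AuxInfo()))
```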
  • FIG. 34 is a diagram for describing a specific example of auxiliary information in the first embodiment.
  • the image processing apparatus 100 may add auxiliary information when generating data units. That is, the image processing apparatus 100 may generate a data unit to which auxiliary information is added and output the data unit to which auxiliary information is added.
  • the auxiliary information may include information indicating that a high-quality image can be reproduced.
  • When the playback device 200 is an SDR playback device that supports the new still image format, the auxiliary information may include information for use in determining whether to display the JPEG-format SDR image (route B shown in FIG. 34) or an SDR image obtained by SDR conversion of the HDR image (HLG) (route A shown in FIG. 34).
  • the auxiliary information may include information indicating that a high-definition image can be reproduced when the SDR reproduction device displays the HDR image (HLG) after SDR conversion.
  • The auxiliary information may include luminance region information indicating, for the luminance of the still image based on the HDR still image data, whether or not the high-luminance regions are prioritized, or whether or not the low-luminance regions are prioritized. That is, the auxiliary information may include a flag indicating whether the high-luminance regions are prioritized when generating the HDR image, or a flag indicating whether the low-luminance regions are prioritized when generating the HDR image. Further, the auxiliary information in this case may include a threshold value that determines the high-luminance region or the low-luminance region that is prioritized when the still image is generated.
  • The auxiliary information may include a flag (conversion prohibition flag) indicating a photographer's instruction to display the JPEG SDR image as it is, instead of converting the HDR image into an SDR image, when an SDR image is displayed on the SDR display device.
  • HDR still image data, SDR still image data, and management information may be stored in one file.
  • Alternatively, a plurality of files may be generated as one logical data unit in which the files are associated with each other as a DCF (Design rule for Camera File System) object by making the body of each file name (the file name excluding the extension) common to the files.
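  • To make the shape of the auxiliary (management) information described above concrete, the sketch below models it as a small data structure; the field names, value encodings, and JSON serialization are assumptions chosen only for this example.

```python
# Sketch of auxiliary/management information accompanying a data unit (assumed fields).
import json
from dataclasses import dataclass, asdict

@dataclass
class LuminanceRegionInfo:
    prioritize_high: bool        # high-luminance regions prioritized when generating the HDR image
    prioritize_low: bool         # low-luminance regions prioritized
    threshold_nits: float        # threshold that delimits the prioritized region

@dataclass
class AuxiliaryInfo:
    higher_quality_by_hdr: bool       # reproducing the HDR data yields a higher-quality image
    sdr_is_converted_from_hdr: bool   # route A (SDR derived from HLG) vs route B (camera JPEG)
    conversion_prohibition: bool      # photographer's "use the JPEG SDR as-is" flag
    luminance_region: LuminanceRegionInfo

aux = AuxiliaryInfo(True, False, False, LuminanceRegionInfo(True, False, 1000.0))
management_blob = json.dumps(asdict(aux))   # could accompany the data unit as management data
print(management_blob)
```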
  • As described above, the image processing apparatus includes: an acquisition unit that acquires still image data obtained by imaging; a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other; and an output unit that outputs the data unit generated by the generation unit.
  • The playback device includes: an acquisition unit that acquires one logical data unit including first still image data and second still image data that have different luminance dynamic ranges and can be played back independently of each other; and a playback unit that plays back one of the first still image data and the second still image data included in the data unit acquired by the acquisition unit.
  • The image processing method acquires still image data obtained by imaging, uses the acquired still image data to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and outputs the generated data unit.
  • The reproduction method acquires one logical data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and reproduces one of the first still image data and the second still image data included in the acquired data unit.
  • the image processing apparatus 100 is an example of an image processing apparatus.
  • the acquisition unit 110 is an example of an acquisition unit included in the image processing apparatus.
  • Each of the generation unit 120 and the generation unit 120A is an example of a generation unit.
  • the output unit 130 is an example of an output unit.
  • the multi-picture format generation unit 13C, the data unit generation unit 17, the TIFF output unit 17A, the HEVC compression unit 17B, and the JPEG2000 compression unit 17C correspond to the output unit 130, respectively.
  • the HDR still image data is an example of first still image data.
  • the SDR still image data is an example of second still image data.
  • the playback device 200 is an example of a playback device.
  • the acquisition unit 210 is an example of an acquisition unit included in the playback device.
  • Each of the data unit D10, the data unit D20, the data unit D30, the data unit D200, the data unit D300, the data unit D400, and the data unit D500 is an example of a data unit generated by the generation unit of the image processing apparatus, and is an example of a data unit acquired by the acquisition unit of the playback device.
  • the playback unit 220 is an example of a playback unit.
  • the image processing apparatus 100 includes an acquisition unit 110, a generation unit 120, and an output unit 130.
  • the acquisition unit 110 acquires still image data.
  • The generation unit 120 uses the still image data acquired by the acquisition unit 110 to logically generate one data unit including first still image data (HDR still image data) and second still image data (SDR still image data) that have different luminance dynamic ranges and can be reproduced independently of each other.
  • the output unit 130 outputs the data unit (for example, the data unit D10) generated by the generation unit 120.
  • The playback device 200 includes: an acquisition unit 210 that logically acquires one data unit (for example, the data unit D10) including first still image data (HDR still image data) and second still image data (SDR still image data) that have different luminance dynamic ranges and can be played back independently of each other; and a reproduction unit 220 that reproduces one of the first still image data (HDR still image data) and the second still image data (SDR still image data) included in the data unit (for example, the data unit D10) acquired by the acquisition unit 210.
  • The image processing apparatus 100 configured in this manner can output a logically single data unit including HDR still image data and SDR still image data that have different luminance dynamic ranges and can be reproduced independently of each other. Further, the playback device 200 can acquire and play back the data unit. Accordingly, the image processing apparatus 100 outputs the data unit, and a playback device (for example, the playback device 200) can reproduce either the HDR still image data or the SDR still image data included in the data unit. Therefore, the image processing apparatus 100 can provide still image data that is highly convenient for the user.
  • the generation unit may generate one file including the first still image data and the second still image data as a data unit.
  • the first still image data D12 is an example of first still image data.
  • the second still image data D13 is an example of second still image data.
  • the file F10 is an example of one file including the first still image data and the second still image data.
  • the data unit D10 is an example of a data unit generated by the generation unit.
  • the HDR still image file F100 corresponds to the file F10.
  • The generation unit 120 generates, as the data unit D10, one file F10 including the first still image data D12 (HDR still image data) and the second still image data D13 (SDR still image data).
  • With the image processing apparatus 100 configured as described above, it is possible to prevent the HDR still image data and the SDR still image data from being managed separately.
  • The generation unit may generate, as a data unit, an object including a first still image file that contains the first still image data, and a second still image file that contains the second still image data and has the same file name body as the first still image file.
  • Each of the first still image data D22 and the first still image data D32 is an example of first still image data.
  • Each of the second still image data D24 and the second still image data D33 is an example of second still image data.
  • Each of the first still image file F21 and the first still image file F32 is an example of a first still image file.
  • Each of the second still image file F22 and the second still image file F33 is an example of a second still image file.
  • Each of DSC0002 and DSC0003 is an example of a body of a file name (a file name excluding an extension).
  • Each of the data unit D20 and the data unit D30 is an example of a data unit generated by the generation unit.
  • the HDR still image file F110, the TIFF still image file F210, the HEVC-I still image file F310, and the JPEG2000 still image file F410 each correspond to the first still image file F21 (or the first still image file F32).
  • Each of the SDR still image file F120, the TIFF still image file F220, the HEVC-I still image file F320, and the JPEG2000 still image file F420 corresponds to the second still image file F22 (or the second still image file F33).
  • Data unit D200, data unit D300, data unit D400, and data unit D500 each correspond to data unit D20 (or data unit D30).
  • The generation unit 120 may generate, as a data unit (for example, the data unit D20), an object including an HDR still image file (for example, the first still image file F21) that contains the HDR still image data (for example, the first still image data D22), and an SDR still image file (for example, the second still image file F22) that contains the SDR still image data (for example, the second still image data D24) and whose file name body (for example, DSC0002) is the same as that of the HDR still image file (for example, the first still image file F21).
  • Whether a playback apparatus (for example, the playback apparatus 200) supports playback of HDR still image files or playback of SDR still image files, it can reproduce the image using the file that it supports.
  • The generation unit may further add, to the data unit, auxiliary information indicating that a higher quality image than the image obtained by reproducing the second still image data can be reproduced by reproducing the first still image data.
  • the auxiliary information included in the management data D11 is an example of auxiliary information.
  • The generation unit 120 adds, to the data unit (for example, the data unit D10), auxiliary information indicating that, by reproducing the HDR still image data (for example, the first still image data D12), a higher quality image than the image obtained by reproducing the SDR still image data (for example, the second still image data D13) can be reproduced.
  • With the image processing apparatus 100 configured as described above, a playback device (for example, the playback device 200) that has received the data unit (for example, the data unit D10) can, in accordance with the auxiliary information, play back the still image at the best quality that makes full use of the playback capability of the playback device.
  • The generation unit may add, to the data unit, auxiliary information including luminance region information indicating, for the luminance of the still image based on the first still image data, whether the high-luminance regions are prioritized or whether the low-luminance regions are prioritized.
  • The generation unit 120 may add, to the data unit, auxiliary information including luminance region information indicating, for the luminance of the still image based on the HDR still image data (first still image data), whether or not the high-luminance regions are prioritized, or whether or not the low-luminance regions are prioritized.
  • With the image processing apparatus 100 configured as described above, a reproduction apparatus (for example, the reproduction apparatus 200) that has received the data unit can, in accordance with the auxiliary information, play back the still image at a quality that makes effective use of the reproduction capability of the reproduction apparatus.
  • The first still image data may be HDR image data, and the second still image data may be SDR image data. In the example described above, the first still image data is HDR image data, and the second still image data is SDR image data.
  • the first embodiment has been described as an example of the technique disclosed in the present application.
  • the technology in the present disclosure is not limited to this, and can be applied to embodiments in which changes, replacements, additions, omissions, and the like are made.
  • each component may be configured by dedicated hardware, or may be realized by a processor executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU (Central Processing Unit) or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the software that realizes the image processing method or the reproduction method according to the above embodiment is the following program.
  • That is, this program causes a computer to execute an image processing method that acquires still image data obtained by imaging, uses the acquired still image data to logically generate one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and outputs the generated data unit.
  • Alternatively, this program causes a computer to execute a reproduction method that acquires one logical data unit including first still image data and second still image data that have different luminance dynamic ranges and can be reproduced independently of each other, and reproduces one of the first still image data and the second still image data included in the acquired data unit.
  • the present disclosure can be applied to an image processing apparatus that can obtain highly convenient still image data, a reproduction apparatus that can reproduce the still image data, an image processing method, and a reproduction method.
  • the present disclosure is applicable to an imaging device such as a camera, a display device such as a television, or a printing device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Television Signal Processing For Recording (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

Provided is an image processing device capable of obtaining highly convenient still image data. The image processing device comprises: an acquisition unit that acquires still image data obtained by imaging; a generation unit that uses the still image data acquired by the acquisition unit to logically generate one data unit which includes first still image data and second still image data, each of which have different dynamic ranges of brightness and can be independently reproduced; and an output unit that outputs the data unit generated by the generation unit.

Description

画像処理装置、再生装置、画像処理方法、および、再生方法Image processing device, playback device, image processing method, and playback method
 本開示は、画像処理装置、再生装置、画像処理方法、および、再生方法に関する。 The present disclosure relates to an image processing device, a playback device, an image processing method, and a playback method.
 特許文献1は、露出の異なる複数の画像を合成することにより、ダイナミックレンジの広いHDR(High Dynamic Range)静止画を記録する撮像装置を開示している。 Patent Document 1 discloses an imaging apparatus that records a HDR (High Dynamic Range) still image with a wide dynamic range by combining a plurality of images with different exposures.
 しかしながら、特許文献1に開示された技術では、利便性の高い静止画データを得ることが難しかった。 However, with the technique disclosed in Patent Document 1, it is difficult to obtain highly convenient still image data.
特開2015-056807号公報Japanese Patent Laying-Open No. 2015-056807
 本開示は、利便性の高い静止画データを得ることができる画像処理装置と、再生装置、画像処理方法、および再生方法を提供する。 The present disclosure provides an image processing device, a playback device, an image processing method, and a playback method that can obtain highly convenient still image data.
 本開示における画像処理装置は、撮像により得られた静止画データを取得する取得部と、前記取得部に取得された前記静止画データを用いて、輝度のダイナミックレンジが互いに異なり、かつ、互いに独立して再生することが可能な第1静止画データおよび第2静止画データを含む、論理的に1つのデータ単位を生成する生成部と、前記生成部によって生成された前記データ単位を出力する出力部と、を備える。 An image processing apparatus according to the present disclosure uses an acquisition unit that acquires still image data obtained by imaging, and the still image data acquired by the acquisition unit to have different luminance dynamic ranges and independent of each other. Generating unit that logically generates one data unit including first still image data and second still image data that can be reproduced, and an output that outputs the data unit generated by the generating unit A section.
 また、本開示における再生装置は、輝度のダイナミックレンジが互いに異なり、かつ、互いに独立して再生することが可能な第1静止画データおよび第2静止画データを含む、論理的に1つのデータ単位を取得する取得部と、前記取得部により取得された前記データ単位に含まれる前記第1静止画データおよび前記第2静止画データのうちの一方を再生する再生部と、を備える。 In addition, the playback device according to the present disclosure is logically one data unit including first still image data and second still image data that have different luminance dynamic ranges and can be played back independently of each other. And a reproducing unit that reproduces one of the first still image data and the second still image data included in the data unit acquired by the acquiring unit.
 なお、これらの全般的または具体的な態様は、システム、方法、集積回路、コンピュータプログラムまたはコンピュータ読み取り可能なCD-ROMなどの記録媒体で実現されてもよく、システム、方法、集積回路、コンピュータプログラムおよび記録媒体の任意な組み合わせで実現されてもよい。 These general or specific aspects may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM. The system, method, integrated circuit, computer program And any combination of recording media.
 本開示における画像処理装置は、利便性の高い静止画データを得ることができる。 The image processing apparatus according to the present disclosure can obtain highly convenient still image data.
図1は、映像技術の進化について説明するための図である。FIG. 1 is a diagram for explaining the evolution of video technology. 図2は、HDR表示技術について説明するための図である。FIG. 2 is a diagram for explaining the HDR display technique. 図3Aは、PQ(Perceptual Quantization)方式を説明するための図である。FIG. 3A is a diagram for explaining a PQ (Perceptual Quantization) method. 図3Bは、HLG(Hybrid Log Gamma)方式を説明するための図である。FIG. 3B is a diagram for explaining an HLG (Hybrid Log Gamma) system. 図4は、HDRに対応しているHDR画像の一例と、SDRに対応しているSDR画像の一例とを比較して示した図である。FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR. 図5は、HDRまたはSDRに対応した撮像装置と、撮像装置によって得られる画像データのファイルフォーマットと、画像データを表示する表示装置または画像データを印刷する印刷装置と、について説明するための図である。FIG. 5 is a diagram for explaining an imaging device that supports HDR or SDR, a file format of image data obtained by the imaging device, a display device that displays image data, or a printing device that prints image data. is there. 図6は、2つの画像を合成することによりダイナミックレンジを拡大した画像を得るHDR撮影モードについて説明するための図である。FIG. 6 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images. 図7は、2つの画像を合成することによりダイナミックレンジを拡大した画像を得るHDR撮影モードについて説明するための図である。FIG. 7 is a diagram for explaining an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images. 図8は、HDR表示用に撮像されたHDR画像について説明するための図である。FIG. 8 is a diagram for describing an HDR image captured for HDR display. 図9は、動画の色空間と静止画の色空間との違いについて説明するための図である。FIG. 9 is a diagram for explaining the difference between the color space of a moving image and the color space of a still image. 図10は、Ultra HD Blu-ray(登録商標、以下同様)とBlu-rayとを比較して示した図である。FIG. 10 is a diagram showing a comparison between Ultra HD Blu-ray (registered trademark, the same applies hereinafter) and Blu-ray. 図11は、輝度の範囲(レンジ)が広いHDR画像を生成するHDR撮像装置を説明するための図である。FIG. 11 is a diagram for explaining an HDR imaging device that generates an HDR image with a wide luminance range. 図12は、HDR静止画ファイルフォーマットについて説明するための図である。FIG. 12 is a diagram for explaining the HDR still image file format. 図13は、マルチピクチャーフォーマットについて説明するための図である。FIG. 13 is a diagram for explaining the multi-picture format. 図14は、JPEGデータとHDR拡張用の差分データとを関連付けて扱うJPEG XT方式を説明するための図である。FIG. 14 is a diagram for explaining the JPEG XT system that handles JPEG data and difference data for HDR expansion in association with each other. 図15は、実施の形態1における画像処理装置の構成の一例を模式的に示すブロック図である。FIG. 15 is a block diagram schematically illustrating an example of the configuration of the image processing apparatus according to the first embodiment. 図16は、論理的に1つのデータ単位が2種類の静止画データを含む1つのファイルを備えて構成される場合の一例を模式的に示す図である。FIG. 16 is a diagram schematically illustrating an example in which one logical data unit is configured to include one file including two types of still image data. 図17は、管理データに含まれる情報の一例を模式的に示す図である。FIG. 17 is a diagram schematically illustrating an example of information included in the management data. 図18は、第1静止画データの静止画を構成する各画素の輝度値と、各輝度値の画素の数との関係の一例をヒストグラムとして示す図である。FIG. 18 is a diagram illustrating an example of a relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram. 図19は、第1静止画データの静止画を構成する各画素の輝度値と、各輝度値の画素の数との関係の他の一例をヒストグラムとして示す図である。FIG. 19 is a diagram showing another example of the relationship between the luminance value of each pixel constituting the still image of the first still image data and the number of pixels of each luminance value as a histogram. 
図20は、論理的に1つのデータ単位が2つのファイルを備えて構成される場合の一例を模式的に示す図である。FIG. 20 is a diagram schematically illustrating an example of a case where one logical data unit includes two files. 図21は、論理的に1つのデータ単位が2つのファイルを備えて構成される場合の他の一例を模式的に示す図である。FIG. 21 is a diagram schematically illustrating another example in which one data unit is configured to include two files. 図22は、実施の形態1における生成部の構成の一例を模式的に示すブロック図である。FIG. 22 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment. 図23は、実施の形態1における生成部の構成の一例を模式的に示すブロック図である。FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit in the first embodiment. 図24は、実施の形態1における画像処理装置の画像処理に係る動作の一例を示すフローチャートである。FIG. 24 is a flowchart illustrating an example of an operation related to image processing of the image processing apparatus according to the first embodiment. 図25は、実施の形態1における生成部による生成処理の一例を示すフローチャートである。FIG. 25 is a flowchart illustrating an example of a generation process by the generation unit according to the first embodiment. 図26は、実施の形態1における生成部による生成処理の一例を示すフローチャートである。FIG. 26 is a flowchart illustrating an example of generation processing by the generation unit according to the first embodiment. 図27は、実施の形態1における実施例1を説明するための図である。FIG. 27 is a diagram for explaining Example 1 in the first embodiment. 図28は、実施の形態1における実施例2を説明するための図である。FIG. 28 is a diagram for explaining an example 2 in the first embodiment. 図29は、実施の形態1における実施例3を説明するための図である。FIG. 29 is a diagram for explaining an example 3 in the first embodiment. 図30は、実施の形態1における実施例4を説明するための図である。FIG. 30 is a diagram for explaining an example 4 in the first embodiment. 図31は、実施の形態1における実施例5を説明するための図である。FIG. 31 is a diagram for explaining a fifth example in the first embodiment. 図32は、実施の形態1における再生装置の構成の一例を模式的に示すブロック図である。FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment. 図33は、実施の形態1における再生装置の再生処理に係る動作の一例を示すフローチャートである。FIG. 33 is a flowchart illustrating an example of operations related to the reproduction processing of the reproduction device according to the first embodiment. 図34は、実施の形態1における補助情報の具体的な一例について説明するための図である。FIG. 34 is a diagram for describing a specific example of auxiliary information in the first embodiment.
 (本開示の目的)
 本開示は、HDR(High Dynamic Range)表示技術と、HDR撮像技術との2つの技術を使い、HDR静止画という新たなユーザ価値と新たな写真文化を提供するためのものである。上記の新たなユーザ価値とは、臨場感を向上させ、かつ、白飛び(明るい領域の階調が損なわれた状態)および黒潰れ(暗い領域の階調が損なわれた状態)等が低減された静止画データを生成することである。また、上記の新たな写真文化とは、HDR静止画の撮像に対応しているカメラで撮像することにより得られたHDR静止画を、HDR表示に対応する表示装置(以下、「HDR表示装置」という)に表示して鑑賞することである。なお、HDR静止画は、HDR写真ともいう。
(Purpose of this disclosure)
The present disclosure is intended to provide a new user value of HDR still image and a new photographic culture by using two technologies of HDR (High Dynamic Range) display technology and HDR imaging technology. The above-mentioned new user value improves the sense of reality and reduces whiteout (a state in which the gradation of a bright region is impaired) and blackout (a state in which a gradation of a dark region is impaired). Generating still image data. In addition, the above-mentioned new photographic culture refers to a display device (hereinafter referred to as “HDR display device”) that supports HDR display of an HDR still image obtained by capturing an image with a camera that supports HDR still image capturing. To display and appreciate. An HDR still image is also called an HDR photograph.
 本開示は、SDR表示には対応しているがHDR表示には対応していない表示装置(以下、「SDR表示装置」という)およびSDR静止画の印刷には対応しているがHDR静止画の印刷には対応していない印刷装置(以下、「SDR印刷装置」という)においても表示または印刷が可能な静止画データを生成することができる画像処理装置および画像処理方法を提供する。つまり、本開示は、HDR静止画の画像処理に対応した装置に対してだけでなく、SDR静止画の画像処理には対応しているがHDR静止画の画像処理には対応していない装置に対しても、HDR静止画の再生を行うことが可能な静止画データを提供することで、HDR静止画データの利便性を向上させることができる画像処理装置および画像処理方法を提供する。なお、本開示において、HDR静止画の再生は、HDR静止画の表示、および、HDR静止画を画像処理することによる印刷、を含む。つまり、本開示において、再生は、表示および印刷を含む。 The present disclosure is compatible with SDR display but not with HDR display (hereinafter referred to as “SDR display device”) and SDR still image printing. Provided are an image processing apparatus and an image processing method capable of generating still image data that can be displayed or printed even in a printing apparatus that does not support printing (hereinafter referred to as “SDR printing apparatus”). That is, the present disclosure is not limited to a device that supports HDR still image processing, but also a device that supports SDR still image processing but does not support HDR still image processing. In contrast, the present invention provides an image processing apparatus and an image processing method capable of improving the convenience of HDR still image data by providing still image data capable of reproducing HDR still images. In the present disclosure, reproduction of an HDR still image includes display of the HDR still image and printing by performing image processing on the HDR still image. That is, in the present disclosure, reproduction includes display and printing.
 このようなHDR静止画の静止画データのデータフォーマットを決める場合、SDR静止画に対応する従来のSDR表示装置またはSDR印刷装置との互換性を維持するか否かを定める必要がある。通常、新たなデータフォーマットを導入する際には、既存の機器との互換性が重視される。 When determining the data format of such still image data of an HDR still image, it is necessary to determine whether or not compatibility with a conventional SDR display device or SDR printing device corresponding to the SDR still image is maintained. Usually, when a new data format is introduced, compatibility with existing devices is emphasized.
 従来の装置との互換性を維持するために、例えば、新たなHDR静止画データを、その属性等を拡張して、従来の標準的データフォーマットであるJPEG(Joint Photographic Experts Group)に格納することが考えられる。しかし、JPEGには階調が8ビットという制約があるため、階調が10ビット以上のHDR静止画データをJPEGに格納することは困難である。 In order to maintain compatibility with conventional devices, for example, new HDR still image data should be stored in JPEG (Joint Photographic Experts Group), which is a conventional standard data format, with its attributes expanded. Can be considered. However, since JPEG has a restriction that gradation is 8 bits, it is difficult to store HDR still image data having gradation of 10 bits or more in JPEG.
 JPEGの制約に対応し、かつ従来の装置との互換性を維持するために、SDR静止画データであるJPEGデータと新たなHDR静止画データとの2つのデータを1つのファイルに格納し、そのファイルに、互換性を実現できるJPGの拡張子をつけることが考えられる。 In order to comply with JPEG restrictions and maintain compatibility with conventional devices, JPEG data that is SDR still image data and new HDR still image data are stored in one file. It is conceivable to attach a JPG extension that can realize compatibility to the file.
 しかし、そのような対応で従来の装置との互換性を図ろうとすると、編集ソフトや表示ソフトでそのJPEGファイルを表示または編集する際に、JPEGファイル形式における制約により、そのJPEGファイルに格納されたHDR写真データが消去される可能性がある。これらのことから、従来の装置との互換性を断念し、HDRに最適なフォーマットを採用することが望ましいと考えられる。一方、その場合、現行の印刷装置、表示装置、またはカメラ等において、HDRに対応するために必要となる修正・変更等(実装負担ともいう)は、できるだけ少ないことが望ましい。 However, when trying to achieve compatibility with conventional devices in such a manner, when the JPEG file is displayed or edited with editing software or display software, it is stored in the JPEG file due to restrictions on the JPEG file format. HDR photo data may be erased. For these reasons, it is desirable to abandon compatibility with conventional devices and adopt an optimum format for HDR. On the other hand, in such a case, it is desirable that the current printing apparatus, display apparatus, camera, or the like has as few corrections / changes (also referred to as mounting burden) necessary to support HDR.
 例えばスマートフォン等の高機能な携帯端末の普及により、そのような携帯端末を使用するユーザは、その携帯端末上で実現されるHDR合成によって、HDR静止画データを得ることが可能である。そのため、ユーザ自身が、SDR撮影ファイル(従来のJPEGファイル)とHDR撮影ファイルとをそれぞれ個別に生成して管理することが、従来よりも容易になった。つまり、カメラ等の撮像装置でHDR静止画データを生成するときに、従来のJPEGファイルを同時に生成し、HDR静止画データとSDR静止画データとの2つのデータを1つのファイルに格納するのではなく、それらを個別に管理できるようにするオプションをその撮像装置に設けることも可能となっている。 For example, with the spread of highly functional portable terminals such as smartphones, a user using such a portable terminal can obtain HDR still image data by HDR synthesis realized on the portable terminal. Therefore, it is easier for the user himself to generate and manage the SDR shooting file (conventional JPEG file) and the HDR shooting file individually than before. In other words, when HDR still image data is generated by an imaging device such as a camera, a conventional JPEG file is generated at the same time, and two data of HDR still image data and SDR still image data are stored in one file. It is also possible to provide the imaging apparatus with an option that allows them to be managed individually.
 HDR用の新しいデータフォーマット機能をテレビジョンセットや動画記録機能付きカメラに設ける場合は、例えば、それらの機器にHEVCのエンコーダが搭載されていれば、HEVC圧縮による静止画のデータフォーマットを用いることができる。しかし、HEVC圧縮による静止画のデータフォーマットを得るために、動画記録機能に対応していないカメラに、動画圧縮用のHEVCのエンコーダを搭載することは、現実的ではない。また、そのようなカメラでは、非圧縮のデータを格納できる、TIFF(Tagged Image File Format)ベースのHDR静止画データへの対応を検討することも必要となる。 When a new data format function for HDR is provided in a television set or a camera with a moving image recording function, for example, if a HEVC encoder is installed in these devices, the data format of a still image by HEVC compression may be used. it can. However, in order to obtain a still image data format by HEVC compression, it is not realistic to mount a HEVC encoder for moving image compression on a camera that does not support the moving image recording function. In addition, such a camera needs to consider the correspondence to TIFF (Tagged Image File Format) -based HDR still image data that can store uncompressed data.
 HDR静止画データをSDR印刷装置で印刷することを想定した場合、その装置において、データフォーマットの候補の1つとして、HLG(Hybrid Log Gamma)のような、従来の装置との互換性のあるHDR技術を使うことも想定される。 Assuming that HDR still image data is printed by an SDR printing device, HDR compatible with conventional devices such as HLG (Hybrid Log Gamma) is one of the data format candidates in that device. The use of technology is also envisaged.
 本開示によれば、従来の装置との互換性の有無、従来の装置における実装負担の有無、等、HDR静止画データフォーマットへの多様な要求に応えることができる画像処理装置および再生装置を実現することができる。 According to the present disclosure, it is possible to realize an image processing device and a playback device that can meet various requests for the HDR still image data format, such as whether or not compatibility with a conventional device is present and whether or not there is a mounting burden on the conventional device. can do.
 (HDR表示技術の背景)
 図1は、映像技術の進化について説明するための図である。
(Background of HDR display technology)
FIG. 1 is a diagram for explaining the evolution of video technology.
 これまで、映像の高画質化としては、表示画素数の拡大に主眼がおかれていた。そして、従来の720×480画素のStandard Definition (SD)映像に代えて、1920×1080画素のHigh Definition(HD)映像が普及している。 Until now, the main focus has been on increasing the number of display pixels in order to improve image quality. Then, instead of the conventional Standard Definition (SD) video of 720 × 480 pixels, a High Definition (HD) video of 1920 × 1080 pixels has become widespread.
 近年、更なる高画質化のために、3840×2160画素のUltra High Definition(UHD)映像、あるいは、さらに画素数が多い4096×2160画素の映像(いわゆる、4K映像)が提案されている。また、4K映像とともに、ダイナミックレンジの拡張、色域の拡大、フレームレートの追加、等も検討されている。 Recently, in order to further improve image quality, 3840 × 2160 pixel Ultra High Definition (UHD) video or 4096 × 2160 pixel video (so-called 4K video) having a larger number of pixels has been proposed. In addition to 4K video, expansion of dynamic range, expansion of color gamut, addition of frame rate, etc. are also being studied.
 ダイナミックレンジに関しては、暗部階調を維持しつつ、現行のテレビジョン信号での表現が困難な鏡面反射光等の明るい光を、より現実に近い明るさで表現するための方式として、HDR(High Dynamic Range)が提案されている。これまでのテレビジョン信号は、SDR(Standard Dynamic Range)と呼ばれ、最高輝度が100nitであった。一方、HDRでは、1000nit以上まで最高輝度を拡大することが想定されている。そして、SMPTE(Society of Motion Picture and Television Engineers)、ITU-R(International Telecommunication Union-Radiocommunication Sector)等において、マスタリングディスプレー用の規格の標準化も進行中である。HDRの具体的な適用先としては、HDやUHDと同様に、放送やパッケージメディア(Blu-ray Disc等)、インターネット配信、等がある。 As for the dynamic range, HDR (High) is used as a method for expressing bright light such as specular reflection light, which is difficult to express in the current television signal, with more realistic brightness while maintaining dark gradation. Dynamic Range) has been proposed. Conventional television signals are called SDR (Standard Dynamic Range) and have a maximum luminance of 100 nits. On the other hand, in HDR, it is assumed that the maximum luminance is expanded to 1000 nit or more. Standardization for mastering displays is also underway in SMPTE (Society of Motion Picture and Television Engineers), ITU-R (International Telecommunication Union-Radiocommunication Sector), etc. As specific application destinations of HDR, there are broadcasting, package media (Blu-ray Disc, etc.), Internet distribution, etc., as in HD and UHD.
 (HDR表示技術)
 図2は、HDR表示技術について説明するための図である。
(HDR display technology)
FIG. 2 is a diagram for explaining the HDR display technique.
 HDRは、単なる非常に明るいテレビジョンセットを実現するための方式ではない。HDRとは、映像の輝度の範囲(レンジ)を、SDRの一例であるbt.709(Broadcasting Service (Television)709)の規格で定められた0.1nit-100nitから、0-10,000nit(ST 2084の場合)に拡張して、従来は表現できなかった、明るい太陽や空や光線の反射等の表現を可能にしたり、明るい部分と暗い部分とを同時に記録することを可能にしたりする方式である。なお、ここでいう輝度とは、光学的な輝度のことであり、光源の明るさを表す物理量のことである。HDRには、撮像後にグレーディング処理(映像の色やトーンを調整する処理)を行う映像(パッケージされる映像)およびIP(Internet Protocol)配信される映像等に適したSMPTE ST2084方式と、ライブ放送の映像およびユーザに撮像された映像等に適したHybrid Log Gamma(HLG)との2つの方式がある。 HDR is not just a method for realizing a very bright television set. HDR refers to the luminance range (range) of an image as bt. 709 (Broadcasting Service (Television)) 709 standard 0.1 nit-100 nit extended from 0-10,000 nit (in case of ST 2084), the bright sun and sky that could not be expressed in the past This is a system that enables expression of light reflection or the like, and enables recording of a bright part and a dark part at the same time. In addition, the brightness | luminance here is an optical brightness | luminance, and is a physical quantity showing the brightness of a light source. HDR includes the SMPTE ST2084 format suitable for video (packaged video) that undergoes grading processing (processing that adjusts the color and tone of video) after imaging, video that is distributed via IP (Internet Protocol), and live broadcasting. There are two systems: Hybrid Log Gamma (HLG) suitable for video and video captured by the user.
 HDRの表示技術には、SDRとHDRとの互換性を実現できるHLG方式と、SDRとHDRとの単純な表示互換性がないPQ方式とがある。 HDR display technology includes the HLG method that can realize compatibility between SDR and HDR, and the PQ method that does not have simple display compatibility between SDR and HDR.
 図3Aは、PQ方式を説明するための図である。図3Bは、HLG方式を説明するための図である。 FIG. 3A is a diagram for explaining the PQ method. FIG. 3B is a diagram for explaining the HLG method.
 図3Aに示すように、SMPTE ST2084(PQ)は、SDRとHDRとの互換性がない方式である。この方式の場合、SDRとHDRとは、個別にグレーディングされて個別に伝送される。また、この方式の場合、Ultra HD Blu-rayで再生された映像を、SDRTV(SDRには対応しているが、HDRには対応していないテレビジョンセット)に表示させる場合は、HDRの映像データをSDRの映像データに変換するSDR変換が必要となる。 As shown in FIG. 3A, SMPTE ST2084 (PQ) is a method in which SDR and HDR are not compatible. In this method, SDR and HDR are individually graded and transmitted separately. In addition, in the case of this method, when displaying the video played back with Ultra HD Blu-ray on SDRTV (a television set that supports SDR but not HDR), the HDR video SDR conversion for converting data into SDR video data is required.
 図3Bに示すように、ITU-R 2100 Hybrid Log Gamma(HLG)は、SDRとHDRとの互換性を有する方式である。この方式の場合、HLG用のグレーディングが行われ、HLG用ストリームのみが伝送される。HLG用ストリームは、SDRに対する互換性がある。このため、HDRの映像データを、SDRTVに表示させる場合に、HDRの映像データをSDRの映像データに変換するSDR変換は不要である。 As shown in FIG. 3B, ITU-R 2100 Hybrid Log Gamma (HLG) is a method having compatibility between SDR and HDR. In this method, HLG grading is performed, and only the HLG stream is transmitted. The HLG stream is compatible with SDR. For this reason, when HDR video data is displayed on SDRTV, SDR conversion for converting HDR video data into SDR video data is unnecessary.
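For reference, the SMPTE ST 2084 (PQ) transfer characteristic mentioned above maps a non-linear code value to an absolute luminance in the 0-10,000 nit range. The sketch below implements the published PQ EOTF constants; it is provided only to make the luminance range concrete and is not part of the claimed processing.

```python
import math

# SMPTE ST 2084 (PQ) EOTF: non-linear signal E' in [0,1] -> absolute luminance in nits (cd/m^2).
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(e_prime: float) -> float:
    p = e_prime ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

# code value 1.0 maps to 10,000 nit; 0.5 maps to roughly 92 nit
for v in (0.0, 0.5, 0.75, 1.0):
    print(v, round(pq_eotf(v), 2))
```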
 図4は、HDRに対応しているHDR画像の一例と、SDRに対応しているSDR画像の一例とを比較して示した図である。図4には、室内の比較的暗い景色と窓の外の比較的明るい景色とが混在する、明暗差が相対的に大きい1枚の画像を、HDR画像とSDR画像とで示す。 FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR. FIG. 4 shows an HDR image and an SDR image of a single image having a relatively large difference in brightness, in which a relatively dark scene in the room and a relatively bright scene outside the window are mixed.
 HDR画像は、HDR静止画データまたはHDR動画データを再生することで得られる画像である。SDR画像は、SDR静止画データまたはSDR動画データを再生することで得られる画像である。図4に例示するように、HDR画像では、窓の外の比較的明るい景色と、室内の比較的暗い景色とが、ともに適切な明るさで表現されている。一方、SDR画像では、窓の外の比較的明るい景色が表現されるように露出が調整されているため、室内の比較的暗い景色は、暗くなりすぎて一部に黒潰れが生じ、見えにくくなっている。仮に、室内の景色が適切に表現されるように露出が調整された場合は、窓の外の景色は、明るくなりすぎて一部に白飛びが生じ、見えにくくなってしまう(図示せず)。このように、HDR画像は、SDR画像において実現が困難であった、比較的明るい景色と比較的暗い景色とが混在する、明暗差が相対的に大きい1枚の画像において白飛びと黒潰れとの両方を低減した階調性の高い画像を実現することができる。 The HDR image is an image obtained by reproducing HDR still image data or HDR moving image data. An SDR image is an image obtained by reproducing SDR still image data or SDR moving image data. As illustrated in FIG. 4, in the HDR image, a relatively bright scene outside the window and a relatively dark scene in the room are both expressed with appropriate brightness. On the other hand, in the SDR image, the exposure is adjusted so that a relatively bright scenery outside the window is expressed. Therefore, the relatively dark scenery in the room becomes too dark and part of the image is crushed, making it difficult to see. It has become. If the exposure is adjusted so that the scenery in the room is properly represented, the scenery outside the window will become too bright and partly overexposed, making it difficult to see (not shown) . As described above, the HDR image is difficult to realize in the SDR image, and includes a combination of a relatively bright scene and a relatively dark scene. Thus, it is possible to realize an image with high gradation that reduces both.
 FIG. 5 is a diagram for explaining imaging devices that support HDR or SDR, the file format of the image data obtained by those imaging devices, and display devices that display the image data or printing devices that print the image data.
 The HDR imaging device 10 shown in FIG. 5 supports HDR imaging. The HDR imaging device 10 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, and a JPEG compression unit 13. The HDR imaging device 10 is configured so that the image data obtained by imaging in the HDR shooting mode in the HDR imaging unit 11 can be displayed on the SDR display device 40 or printed by the SDR printing device 50. Specifically, in the HDR imaging device 10, the HDR still image data of an HDR image obtained by imaging in the HDR shooting mode in the HDR imaging unit 11 is converted into SDR still image data by the conversion unit 12. The SDR still image data obtained by the conversion in the conversion unit 12 is then JPEG-compressed in the JPEG compression unit 13, and the JPEG-format SDR still image data obtained by the compression is output. In the HDR imaging device 10, the SDR still image data of an SDR image obtained by imaging in the conventional shooting mode (SDR shooting mode) in the SDR imaging unit 14 is likewise JPEG-compressed by the JPEG compression unit 13, and the JPEG-format SDR still image data obtained by the compression is output.
 The SDR imaging device 20 includes an SDR imaging unit 21 and a JPEG compression unit 22. In the SDR imaging device 20, just as when the HDR imaging device 10 performs imaging in the conventional shooting mode (SDR shooting mode), the SDR still image data of the SDR image obtained by imaging in the SDR imaging unit 21 is JPEG-compressed by the JPEG compression unit 22, and the JPEG-format SDR still image data obtained by the compression is output.
 Accordingly, the HDR display device 30, the SDR display device 40, and the SDR printing device 50 have conventionally acquired either SDR still image data obtained by SDR conversion of HDR still image data captured with HDR imaging, or SDR still image data captured with SDR imaging, and reproduced (displayed or printed) the SDR image based on that SDR still image data.
 Next, the HDR shooting mode will be described with reference to FIG. 6 and FIG. 7.
 FIG. 6 and FIG. 7 are diagrams for explaining the HDR shooting mode, in which an image with an expanded dynamic range is obtained by combining two images.
 Some smartphones, digital cameras, and the like have an HDR shooting mode capable of capturing video with a wide luminance range. In the HDR shooting mode, as shown in (a) of FIG. 6 and FIG. 7, in order to obtain HDR image data with a wide luminance range, two SDR images obtained by double exposure (a technique in which the same subject is imaged a plurality of times under different exposure conditions) or the like are combined so as to fit within the luminance range defined for SDR. As a result, as shown in (b) of FIG. 6 and FIG. 7, the HDR image can be displayed on an SDR display device.
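 As a very rough illustration of the idea behind this shooting mode (not the method defined in this disclosure), the sketch below merges a short and a long exposure of the same scene into a wider-range estimate and then compresses it back into the SDR container. The clipping-based weights, the exposure-ratio scaling, and the simple gamma compression are all assumptions made for the example.

```python
import numpy as np

def merge_two_exposures(short_img: np.ndarray, long_img: np.ndarray,
                        exposure_ratio: float) -> np.ndarray:
    """Combine two linear-light SDR exposures (values in [0, 1]) of the same subject
    into one image that still fits the SDR range. `exposure_ratio` is how much less
    light the short exposure received relative to the long exposure."""
    # Weight each pixel by how far it is from clipping at 0 or 1 (a simple hat function).
    w_short = 1.0 - np.abs(short_img - 0.5) * 2.0
    w_long = 1.0 - np.abs(long_img - 0.5) * 2.0
    # Bring both exposures to a common radiance scale and blend them.
    radiance = (w_short * short_img * exposure_ratio + w_long * long_img) \
               / (w_short + w_long + 1e-6)
    # Compress the expanded range so the merged result fits the SDR container again.
    return (radiance / max(radiance.max(), 1e-6)) ** (1 / 2.2)
```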
 Next, an HDR image captured for HDR display will be described with reference to FIG. 8.
 FIG. 8 is a diagram for explaining an HDR image captured for HDR display.
 As shown in FIG. 8, an HDR image for HDR display is captured over a luminance range of scene brightness that is wider than in the SDR shooting mode. The image data obtained by this imaging is graded to generate an HDR image for HDR display, and the HDR image is transmitted to each device and reproduced. Since an HDR image has a wider luminance range than an SDR image, it cannot be displayed on an SDR display device as it is. To display an HDR image on an SDR display device, conversion from the HDR image to an SDR image is required.
 On the other hand, in the HDR shooting mode described with reference to FIG. 6 and FIG. 7, the combined image is generated so as to fit within the luminance range defined for SDR, and can therefore be reproduced both on the HDR display device 30 and on the SDR display device 40 (or the SDR printing device 50).
 Next, the difference between the color space for moving images and the color space for still images will be described with reference to FIG. 9.
 FIG. 9 is a diagram for explaining the difference between the color space for moving images and the color space for still images.
 BT.709, a standard concerning the color space of moving images, and sRGB (standard RGB), a standard concerning the color space of still images, have different names but refer to the same color space. Color spaces wider than the color space defined by BT.709 or sRGB have also been defined.
 BT.2020, standardized for Ultra HD, is a color space wider than DCI (Digital Cinema Initiatives) P3 or Adobe RGB. BT.2020 can therefore cover the DCI-P3 and Adobe RGB color spaces. Note that the DCI-P3 color space and the Adobe RGB color space are approximately equal in area but cover different regions.
 FIG. 10 is a diagram comparing Ultra HD Blu-ray and Blu-ray.
 As shown in FIG. 10, Ultra HD Blu-ray exceeds Blu-ray in every item: resolution, color space, HDR (maximum luminance), compression technology, and transfer rate.
 Returning to FIG. 5, the description continues. In recent years, HDR display devices such as HDRTVs have been proposed that can display HDR image data for displaying HDR images without SDR conversion.
 On the other hand, in cameras having an HDR shooting mode (HDR imaging function), unlike in HDRTVs, HDR technology is used mainly for purposes such as backlight compensation. Still images captured with such a camera using HDR technology are reproduced mainly on SDR display devices or SDR printing devices. Therefore, even though such a camera includes an image sensor capable of generating HDR image data for video and is capable of imaging using HDR technology, it SDR-converts the HDR image captured in the HDR shooting mode and outputs SDR still image data. In this way, although a camera having an HDR imaging function is capable of generating HDR image data with a wide luminance range that takes advantage of the display capability of an HDRTV, such HDR image data has generally not been generated.
 FIG. 11 is a diagram for explaining an HDR imaging device 10A that generates an HDR image with a wide luminance range.
 To take advantage of the HDR display function of an HDRTV (for example, the HDR display device 30), when HDR image data for HDR display has been generated, the HDR image data may be displayed on the HDRTV as it is, without converting the HDR image data into SDR image data.
 The HDR imaging device 10A shown in FIG. 11 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and an HDMI (registered trademark; the same applies hereinafter) (High-Definition Multimedia Interface) output unit 16. In order to generate HDR image data, the HDR imaging device 10A performs HDR image correction in the HDR image correction unit 15.
 The HDR image correction unit 15 converts, for example, the RAW data obtained by imaging in the HDR imaging unit 11 into a 10-bit image that can be displayed on an HDRTV (for example, the HDR display device 30 corresponding to the HDR10 standard), using an HDR-EOTF (HDR Electro-Optical Transfer Function) such as the PQ curve. The HDR imaging device 10A then outputs the HDR image data obtained by the HDR image correction unit 15 from the HDMI output unit 16 to the HDRTV (for example, the HDR display device 30). As a result, the HDRTV (for example, the HDR display device 30) that has received the HDR image data displays an HDR image corresponding to the HDR image data.
 In this case, however, the HDR imaging device 10A and the HDR display device 30 must be interconnected with an HDMI cable conforming to the HDMI 2.0 standard. In other words, the HDR imaging device 10A cannot transmit the HDR image data as it is to a device that does not support the HDMI 2.0 standard. To deal with this problem, an HDR still image file format is needed so that HDR image data can also be delivered to devices that do not support the HDMI 2.0 standard. In other words, what is needed is an HDR still image file format for storing and exchanging data in HDR form, such that data can be exchanged between the HDR imaging device 10A and the HDR display device 30, the SDR display device 40, and the SDR printing device 50 in the same way as with an SDR still image file format. The HDR still image file format, however, has the following problems.
 FIG. 12 is a diagram for explaining HDR still image file formats.
 The HDR imaging device 10B shown in FIG. 12 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG compression unit 13A.
 For HDR still images there is no widely used file format comparable to JPEG, which is one of the SDR still image file formats.
 A file format for storing HDR image data is required to extend the color space to the BT.2020 region and to cover a wide luminance range. In addition, to suppress banding and similar artifacts, which are conspicuous in still images, an HDR image should be able to represent a gradation of at least 10 bits, preferably 12 bits or more. The JPEG file format, on the other hand, is limited to SDR, limited to the color space defined by sRGB, and further limited to 8-bit gradation.
 If a JPEG-based file format is used as the file format for storing HDR still image data, the JPEG file format could be given the color space defined by BT.2020 and extended to HDR before compression. Storing HDR still image data in a JPEG-based file format is thus technically possible. FIG. 12 shows an example in which a JPEG-based file format is used as the file format for storing HDR still image data in the JPEG compression unit 13A of the HDR imaging device 10B. However, the JPEG-based file format has image-quality problems such as banding due to insufficient gradation capability. For this reason, using a JPEG-based file format as the file format for storing HDR still image data has not been put into practical use.
 As described above, using the JPEG file format to store HDR still image data has these problems, and at present there is no dominant HDR still image file format. For this reason, the HDR display device 30 does not support an HDR still image file format.
 Here, consider a file format that stores two types of data: SDR still image data and HDR still image data.
 FIG. 13 is a diagram for explaining the multi-picture format.
 The multi-picture format is a format capable of storing a plurality of photographic data items in one file. In the multi-picture format, for example, a main image (HDR still image data) and a still image of a size suitable for displaying the main image on a monitor such as a television set (SDR still image data; hereinafter also referred to as a monitor display image) can be recorded in association with each other. In the multi-picture format, a plurality of still image data items that would otherwise be individual files, such as multi-viewpoint (stereoscopic) images, can also be recorded in one file in association with each other.
 The multi-picture format has two variants: the Baseline MP file shown in (a) of FIG. 13 and the Extended MP file shown in (b) of FIG. 13.
 In the Baseline MP file shown in (a) of FIG. 13, a main image (HDR still image data) and a monitor display image (SDR still image data) can be recorded in one file in association with each other. The extension of a Baseline MP file is ".JPG". Using a Baseline MP file, conventional devices or conventional software can reproduce the monitor display image corresponding to the main image, and the main image (HDR still image data) can also be displayed as it is on an HDR display device. The advantage of the Baseline MP file is that existing display devices and printing devices can reproduce the monitor display image (that is, the SDR still image data) stored for compatibility and corresponding to the main image. On the other hand, image editing software may mistake a Baseline MP file for an ordinary JPEG file and erase the second image data (the HDR still image data). This is because a Baseline MP file stores two data items in one file but uses the extension ".JPG" of a JPEG file, which stores one data item in one file. This problem does not occur, however, with image editing software that can edit the multi-picture format.
 In the Extended MP file shown in (b) of FIG. 13, for example, two multi-view images (multi-view image 1 and multi-view image 2) used for stereoscopic viewing or the like can be recorded in one file in association with each other. An Extended MP file is defined as a file format with a new extension so that one of the images is not lost when the file is reproduced or saved using conventional devices or conventional software. The advantage of an Extended MP file is that although two data items are stored in one file conforming to the JPEG file structure, the JPEG extension ".JPG" is not used. Image editing software other than software that supports the multi-picture format therefore cannot edit such a file. Accordingly, unlike the Baseline MP file, there is little possibility that image editing software will mistake an Extended MP file for an ordinary JPEG file and erase the second image data. The problem with the Extended MP file is that, because its extension differs from JPEG, it cannot be reproduced on existing display devices and printing devices.
 FIG. 14 is a diagram for explaining the JPEG XT method, which handles JPEG data and difference data for HDR extension in association with each other.
 The HDR imaging device 10C shown in FIG. 14 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a JPEG XT compression unit 13B.
 JPEG XT (ISO 18477) exists as a standard for HDR still images. This standard defines a method for handling, in association with each other, JPEG data storing SDR still image data and difference data for HDR extension. In the HDR imaging device 10C conforming to this standard, the JPEG XT compression unit 13B performs JPEG XT compression processing on the HDR image data that has undergone HDR image correction in the HDR image correction unit 15. As a result, the HDR imaging device 10C obtains SDR still image data (JPEG) used for reproducing the SDR image and difference data for generating the HDR still image.
 The advantage of JPEG XT is that the SDR still image data can be used by devices capable of reproducing existing JPEG data. That is, JPEG XT makes it possible to reproduce the SDR still image data on devices that can reproduce existing JPEG data.
 On the other hand, to reproduce an HDR still image with JPEG XT, the display device or printing device must reproduce the difference data for HDR still image generation in combination with the SDR still image data. This requires, in the display device or printing device, special processing that supports the JPEG XT HDR still image file format and differs from an ordinary HDR display function. In other words, the problem with JPEG XT is that existing HDRTVs (for example, the HDR display device 30 shown in FIG. 14) cannot reproduce data that includes the JPEG XT difference data for HDR still image generation. That is, existing display devices and printing devices can reproduce only the SDR still image file contained in JPEG XT. A JPEG XT HDR still image can therefore be reproduced (displayed) only on a display device that supports reproduction of the JPEG XT HDR still image file format (that is, a display device capable of the special processing, different from the ordinary HDR display function, required to reproduce the JPEG XT HDR still image file format, such as the HDR display device 60 shown in FIG. 14). Furthermore, to generate JPEG XT, the imaging device (for example, the HDR imaging device 10C) also needs a function that performs special processing such as JPEG XT compression processing (for example, the JPEG XT compression unit 13B). In addition, if the SDR still image data (JPEG data) contained in JPEG XT, which is compatible with existing display devices and printing devices, is edited with image editing software or the like, the difference data for HDR still image generation may be lost. Moreover, because the difference data for HDR still image generation contained in JPEG XT is data in difference form, editing it as an HDR photograph is difficult. Further, if the SDR still image data contained in JPEG XT is edited with image editing software or the like, a display device that supports reproduction of the JPEG XT HDR still image file format (for example, the HDR display device 60 shown in FIG. 14) may no longer be able to reproduce the difference data for HDR still image generation in combination with the SDR still image data.
 As described above, JPEG XT has many problems to be solved and is not widely used in HDR imaging devices and HDR display devices.
 For these reasons, a data format is needed that makes it easy to handle two types of still image data: SDR still image data and HDR still image data.
 The image processing apparatus described below is for generating such a data format.
 Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, unnecessarily detailed description may be omitted. For example, detailed description of already well-known matters and redundant description of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art.
 Note that the accompanying drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter described in the claims.
 The drawings are not necessarily drawn precisely; they are schematic diagrams in which omissions and the like have been made as appropriate to show the present disclosure in an easily understandable manner. In the drawings, substantially identical components are given the same reference signs, and their description may be omitted or simplified.
 (1. Embodiment 1)
 FIG. 15 is a block diagram schematically showing an example of the configuration of the image processing apparatus 100 according to Embodiment 1.
 The image processing apparatus 100 includes an acquisition unit 110, a generation unit 120, and an output unit 130. The image processing apparatus 100 may be incorporated in an imaging device or may be realized as a standalone device.
 The acquisition unit 110 acquires still image data obtained by imaging with an imaging unit (not shown) such as an image sensor. At this time, the imaging is performed so as to capture brightness over a wide range from dark to bright luminance, so that the still image data is generated as data of an HDR image including luminance from, for example, 0 to 10,000 nits. The acquisition unit 110 may be realized by, for example, a processor that executes a predetermined program (a program created to execute the processes described above) and a memory storing the predetermined program. Alternatively, the acquisition unit 110 may be realized by a dedicated circuit that executes those processes.
 The generation unit 120 logically generates one data unit using the still image data acquired by the acquisition unit 110. In the present embodiment, "logically one data unit" means one piece of data comprising first still image data and second still image data that have mutually different luminance dynamic ranges and can be reproduced independently of each other.
 Here, specific examples of the logically single data unit will be described with reference to FIGS. 16 to 21.
 FIG. 16 is a diagram schematically showing an example in which the logically single data unit D10 comprises one file F10 containing two types of still image data (first still image data D12 and second still image data D13).
 FIG. 17 is a diagram schematically showing an example of information included in the management data D11.
 FIG. 18 is a diagram showing, as a histogram, an example of the relationship between the luminance values of the pixels constituting the still image of the first still image data D12 and the number of pixels at each luminance value.
 FIG. 19 is a diagram showing, as a histogram, another example of the relationship between the luminance values of the pixels constituting the still image of the first still image data D12 and the number of pixels at each luminance value.
 In FIGS. 18 and 19, the horizontal axis represents the luminance value and the vertical axis represents the number of pixels.
 FIG. 20 is a diagram schematically showing an example in which the logically single data unit D20 comprises two files (a first still image file F21 and a second still image file F22). In the example shown in FIG. 20, the first still image file F21 contains first still image data D22, and the second still image file F22 contains second still image data D24.
 FIG. 21 is a diagram schematically showing another example in which the logically single data unit D30 is configured with a first still image file F32 and a second still image file F33. In the example shown in FIG. 21, the first still image file F32 contains first still image data D32, and the second still image file F33 contains second still image data D33.
 The generation unit 120 logically generates one data unit. For example, as illustrated in FIG. 16, the generation unit 120 may generate, as the data unit D10, one file F10 containing the first still image data D12 and the second still image data D13. In this case, the data unit D10 generated by the generation unit 120 consists of a single file F10. The file F10 contains management data D11, the first still image data D12, and the second still image data D13. The first still image data D12 is, for example, HDR image data, and the second still image data D13 is, for example, SDR image data. The file name of the file F10 is, for example, "DSC0001.HDR". The generation unit 120 may further add auxiliary information (see FIG. 17) to the data unit D10 shown in FIG. 16. The auxiliary information may include information indicating that reproducing the first still image data D12 yields a higher-quality image than reproducing the second still image data D13.
 The management data D11 is data for managing the first still image data D12 and the second still image data D13. As illustrated in FIG. 17, the management data D11 includes date information, size information, a first storage address, a second storage address, and auxiliary information.
 The date information indicates the date on which the still image that is the source of the first still image data D12 and the still image that is the source of the second still image data D13 were captured. The size information indicates the size (resolution) of the still image of the first still image data D12 and the size (resolution) of the still image of the second still image data D13. The first storage address indicates the address in the file F10 at which the first still image data D12 is stored. The second storage address indicates the address in the file F10 at which the second still image data D13 is stored.
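 As a minimal sketch of how the single-file data unit D10 and its management data D11 might be modeled in software, the classes below mirror the fields listed above. The field names, Python types, and the idea of byte offsets as "storage addresses" are illustrative assumptions and not a definition of any binary layout in this disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ManagementData:              # corresponds to D11 (illustrative field names)
    date: str                      # date information: capture date of both still images
    hdr_size: tuple[int, int]      # size information: resolution of the first (HDR) still image
    sdr_size: tuple[int, int]      # size information: resolution of the second (SDR) still image
    hdr_offset: int                # first storage address: location of D12 within the file
    sdr_offset: int                # second storage address: location of D13 within the file
    auxiliary: dict = field(default_factory=dict)  # e.g. {"hdr_is_higher_quality": True}

@dataclass
class DataUnitD10:                 # one file such as "DSC0001.HDR"
    management: ManagementData     # D11
    hdr_still: bytes               # D12: independently reproducible HDR still image data
    sdr_still: bytes               # D13: independently reproducible SDR still image data
```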
 Here, the auxiliary information will be described with reference to FIGS. 18 and 19.
 The auxiliary information may include luminance region information indicating whether the luminance of the still image of the first still image data D12 gives priority to a high-luminance region. For example, as shown in FIG. 18, in the case of an HDR still image in which many luminance values are distributed in the high-luminance region, the generation unit 120 may generate management data D11 whose auxiliary information includes luminance region information indicating that many luminance values are distributed in the high-luminance region.
 The auxiliary information may also include luminance region information indicating whether the luminance of the still image of the first still image data D12 gives priority to a low-luminance region. For example, as shown in FIG. 19, in the case of an HDR still image in which many luminance values are distributed in the low-luminance region, the generation unit 120 may generate management data D11 whose auxiliary information includes luminance region information indicating that many luminance values are distributed in the low-luminance region.
 The high-luminance region is a region of higher luminance than the low-luminance region. The high-luminance region and the low-luminance region may be set so as not to overlap each other, or may be set so as to include an overlapping region. The high-luminance region may be set, for example, as a region of luminance higher than the maximum luminance value of SDR. The low-luminance region may be set, for example, as a region of luminance equal to or lower than the maximum luminance value of SDR.
 By analyzing the first still image data D12, the generation unit 120 may identify, for the still image of the first still image data D12, the luminance region occupied by at least a predetermined proportion of all the pixels constituting the still image, counted from the highest luminance values (or from the lowest luminance values), and may generate management data D11 whose auxiliary information includes luminance region information indicating that luminance region. The luminance region information may also be information set by the user.
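 One possible reading of this analysis is sketched below: the pixels of the HDR still image are compared against the SDR maximum luminance and the dominant side is recorded as the luminance region information. The 100-nit SDR peak and the majority rule are assumptions chosen for the example, not values prescribed by this disclosure.

```python
import numpy as np

def luminance_region_info(luminance_nits: np.ndarray, sdr_peak_nits: float = 100.0) -> str:
    """Decide whether the HDR still image distributes most of its pixels above the assumed
    SDR peak ("high") or at/below it ("low"); the generation unit could record the result
    as the luminance region information in the auxiliary information."""
    high_pixels = np.count_nonzero(luminance_nits > sdr_peak_nits)
    return "high" if high_pixels > luminance_nits.size - high_pixels else "low"

# Example: a sunlit window scene dominated by bright pixels would be tagged "high".
sample = np.array([[50.0, 800.0], [1200.0, 300.0]])
assert luminance_region_info(sample) == "high"
```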
 Alternatively, as illustrated in FIG. 20, the generation unit 120 may generate, as the data unit D20, an object consisting of one first still image file F21 containing the first still image data D22 and a second still image file F22 that contains the second still image data D24 and whose file name body (the file name excluding the extension) is the same as that of the first still image file F21. In this case, the data unit D20 generated by the generation unit 120 comprises two files: the first still image file F21 and the second still image file F22.
 The first still image file F21 contains first management data D21 and the first still image data D22. The file name of the first still image file F21 is, for example, "DSC0002.HDR". The second still image file F22 contains second management data D23 and the second still image data D24. The file name of the second still image file F22 is, for example, "DSC0002.JPG". Thus, the file name body (the file name excluding the extension) of the first still image file F21 and the file name body of the second still image file F22 are both "DSC0002", that is, the same.
 Like the first still image data D12, the first still image data D22 is HDR image data. Like the second still image data D13, the second still image data D24 is SDR image data.
 The first management data D21 consists of the information of the management data D11 shown in FIG. 17 excluding the second storage address. The second management data D23 consists of the information of the management data D11 shown in FIG. 17 excluding the first storage address.
 FIG. 20 shows a configuration example in which the first still image file F21 contains the first management data D21 and the second still image file F22 contains the second management data D23, but the present disclosure is in no way limited to this configuration example. For example, as illustrated in FIG. 21, the generation unit 120 may generate, as the data unit D30, an object consisting of one first still image file F32 containing the first still image data D32, a second still image file F33 that contains the second still image data D33 and whose file name body (the file name excluding the extension) is the same as that of the first still image file F32, and a management file F31 that contains management data D31 and whose file name body (the file name excluding the extension) is the same as that of the first still image file F32. In this case, the data unit D30 generated by the generation unit 120 comprises three files: the management file F31, the first still image file F32, and the second still image file F33.
 The management file F31 contains the management data D31. The file name of the management file F31 is, for example, "DSC0003.INFO". The first still image file F32 contains the first still image data D32. The file name of the first still image file F32 is, for example, "DSC0003.HDR". The second still image file F33 contains the second still image data D33. The file name of the second still image file F33 is, for example, "DSC0003.JPG". Thus, the file name bodies (the file names excluding the extensions) of the management file F31, the first still image file F32, and the second still image file F33 are all "DSC0003", that is, the same.
 The management data D31 is substantially the same as the management data D11 shown in FIG. 17. Like the first still image data D12, the first still image data D32 is HDR image data. Like the second still image data D13, the second still image data D33 is SDR image data.
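 Under the multi-file variants of FIG. 20 and FIG. 21, a reader can reassociate the members of one logical data unit purely from the shared file name body. The sketch below is one assumed way to do that; the ".HDR", ".JPG", and ".INFO" extensions are taken from the examples above, and the dictionary keys are illustrative.

```python
from pathlib import Path

def find_data_unit(directory: str, body: str) -> dict:
    """Collect the files of one logical data unit (e.g. body "DSC0002" or "DSC0003")
    by their shared file name body, following the naming examples in this embodiment."""
    d = Path(directory)
    candidates = {
        "hdr": d / f"{body}.HDR",          # first still image file (HDR still image data)
        "sdr": d / f"{body}.JPG",          # second still image file (SDR still image data)
        "management": d / f"{body}.INFO",  # optional management file (FIG. 21 variant)
    }
    return {key: path for key, path in candidates.items() if path.exists()}
```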
 The output unit 130 shown in FIG. 15 outputs the data unit generated by the generation unit 120.
 FIG. 22 is a block diagram schematically showing an example of the configuration of the generation unit 120 in Embodiment 1.
 The generation unit 120 includes an HDR image processing unit 121, a conversion unit 123, and a format unit 125. As indicated by the broken lines in FIG. 22, the generation unit 120 may further include at least one of an HDR image compression unit 122 and an SDR image compression unit 124. In other words, the generation unit 120 may be configured without at least one of the HDR image compression unit 122 and the SDR image compression unit 124.
 The HDR image processing unit 121 converts the still image data (for example, RAW data) acquired by the acquisition unit 110 into a 10-bit image using an HDR-EOTF (HDR image processing), thereby converting the still image data into HDR still image data having a dynamic range for HDR display. The HDR image processing unit 121 outputs uncompressed HDR still image data.
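 The paragraph above leaves the exact HDR-EOTF open; as one concrete illustration, the sketch below encodes linear luminance with the SMPTE ST 2084 (PQ) inverse EOTF mentioned earlier and quantizes the result to 10 bits. Full-range quantization and a luminance-in-nits input are simplifying assumptions, and real RAW development steps (demosaicing, white balance, color conversion) are omitted.

```python
import numpy as np

def encode_pq_10bit(luminance_nits: np.ndarray) -> np.ndarray:
    """Encode absolute luminance (cd/m^2, up to 10,000) with the SMPTE ST 2084 inverse EOTF
    and quantize to 10-bit code values (full range, for simplicity)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(luminance_nits / 10000.0, 0.0, 1.0) ** m1
    n = ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2
    return np.round(n * 1023.0).astype(np.uint16)

# 10,000 nits maps to the top code value 1023.
assert encode_pq_10bit(np.array([10000.0]))[0] == 1023
```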
 The HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121 and generates compressed HDR still image data. The HDR image compression unit 122 outputs the compressed HDR still image data to the format unit 125.
 The conversion unit 123 SDR-converts the uncompressed HDR still image data and generates uncompressed SDR still image data.
 The SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123 and generates compressed SDR still image data. The SDR image compression unit 124 outputs the compressed SDR still image data to the format unit 125.
 The format unit 125 generates logically one data unit containing the HDR still image data compressed by the HDR image compression unit 122 and the SDR still image data compressed by the SDR image compression unit 124, and outputs the generated data unit to the output unit 130.
 Note that the still image data may include information about the imaging conditions under which the image that is the source of the still image data was captured (hereinafter referred to as imaging information). The imaging information includes, for example, information indicating the aperture value, shutter speed, ISO (International Organization for Standardization) sensitivity, picture control, and so on of the camera that is the imaging device. The generation unit 120 may use the imaging information in the processes performed by the functional blocks constituting the generation unit 120.
 Although FIG. 22 shows a configuration example in which the generation unit 120 has the conversion unit 123, the present disclosure is not limited to this configuration example. For example, the image processing apparatus 100 may include a generation unit 120A instead of the generation unit 120. The generation unit 120A differs from the generation unit 120 in that it has an SDR image processing unit 123A instead of the conversion unit 123.
 FIG. 23 is a block diagram schematically showing an example of the configuration of the generation unit 120A in Embodiment 1.
 The generation unit 120A differs from the generation unit 120 only in the SDR image processing unit 123A. Therefore, only the SDR image processing unit 123A is described below.
 The SDR image processing unit 123A converts the still image data (for example, RAW data) acquired by the acquisition unit 110 into an 8-bit image using an SDR-EOTF (SDR Electro-Optical Transfer Function) (SDR image processing), thereby converting the still image data into SDR still image data having a dynamic range for SDR display. The SDR image processing unit 123A outputs uncompressed SDR still image data.
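 As with the HDR path, the SDR image processing step can be illustrated with one concrete transfer function. The text speaks of an SDR-EOTF; for encoding, its inverse (the OETF) is applied. The sketch below uses the BT.709 OETF and full-range 8-bit quantization as assumptions for the example.

```python
import numpy as np

def encode_sdr_8bit(linear_rgb: np.ndarray) -> np.ndarray:
    """Apply the BT.709 OETF to linear-light values in [0, 1] and quantize to 8-bit
    code values (full range, for simplicity)."""
    v = np.where(linear_rgb < 0.018,
                 4.5 * linear_rgb,
                 1.099 * np.power(np.clip(linear_rgb, 0.0, 1.0), 0.45) - 0.099)
    return np.round(np.clip(v, 0.0, 1.0) * 255.0).astype(np.uint8)

# Peak SDR light maps to code value 255.
assert encode_sdr_8bit(np.array([1.0]))[0] == 255
```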
 The SDR image compression unit 124 compresses the uncompressed SDR still image data output from the SDR image processing unit 123A and generates compressed SDR still image data.
 Like the generation unit 120, the generation unit 120A may also use the imaging information in the processes performed by the functional blocks constituting the generation unit 120A.
 FIG. 24 is a flowchart showing an example of the operation of the image processing apparatus 100 according to Embodiment 1 relating to image processing.
 The acquisition unit 110 of the image processing apparatus 100 acquires still image data (step S101).
 Using the still image data acquired by the acquisition unit 110 in step S101, the generation unit 120 logically generates one data unit (step S102). The logically single data unit contains HDR still image data and SDR still image data that have mutually different luminance dynamic ranges and can be reproduced independently of each other.
 The output unit 130 outputs the logically single data unit generated by the generation unit 120 in step S102 (step S103).
 The generation process of step S102 differs between the process performed by the generation unit 120 shown in FIG. 22 and the process performed by the generation unit 120A shown in FIG. 23. The difference is described below with reference to flowcharts.
 FIG. 25 is a flowchart showing an example of the generation process performed by the generation unit 120 in Embodiment 1.
 The HDR image processing unit 121 performs predetermined image processing on the still image data acquired in step S101, thereby converting the still image data into HDR still image data (step S111). The HDR image processing unit 121 outputs the HDR still image data (uncompressed HDR still image data).
 The HDR image compression unit 122 compresses the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S112). The process of step S112 need not be performed; step S112 is therefore shown with a broken line in FIG. 25.
 The conversion unit 123 SDR-converts the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111 (step S113). The conversion unit 123 outputs the uncompressed SDR still image data obtained by the SDR conversion.
 The SDR image compression unit 124 compresses the uncompressed SDR still image data output from the conversion unit 123 in step S113 (step S114). The process of step S114 need not be performed; step S114 is therefore shown with a broken line in FIG. 25.
 The format unit 125 generates logically one data unit containing the compressed HDR still image data generated by the HDR image compression unit 122 in step S112 (or the uncompressed HDR still image data output from the HDR image processing unit 121 in step S111) and the compressed SDR still image data generated by the SDR image compression unit 124 in step S114 (or the uncompressed SDR still image data output from the conversion unit 123 in step S113) (step S115).
 As described above, in the flowchart of FIG. 25 executed by the generation unit 120, at least one of the process of step S112 and the process of step S114 may be omitted.
 FIG. 26 is a flowchart showing an example of the generation process performed by the generation unit 120A in Embodiment 1.
 The generation process shown in the flowchart of FIG. 26 differs from the generation process shown in the flowchart of FIG. 25 only in that step S113A is performed instead of step S113. Therefore, only step S113A is described below.
 The SDR image processing unit 123A of the generation unit 120A performs predetermined image processing on the still image data acquired in step S101, thereby converting the still image data into SDR still image data (step S113A). The SDR image processing unit 123A outputs the SDR still image data (uncompressed SDR still image data).
 After step S113A, step S114 or step S115 is performed in the same way as in the flowchart shown in FIG. 25.
 Next, specific examples of the operation of an HDR imaging device including the image processing apparatus 100 will be described. In the following, components that have already been described are given the same reference signs, and their description is omitted.
 (1-1-1. Example 1)
 FIG. 27 is a diagram for explaining Example 1 of Embodiment 1. Example 1 shows a configuration example in which one file containing HDR still image data and SDR still image data is generated using the multi-picture format method.
 実施例1におけるHDR撮像装置10Dは、マルチピクチャーフォーマット方式を用いて、HDR静止画データとSDR静止画データとを含む1つのファイルを生成してもよい。図27に示すように、HDR撮像装置10Dは、HDR撮像部11、変換部12、JPEG圧縮部13、マルチピクチャーフォーマット生成部13C、SDR撮像部14、およびHDR画像補正部15を備える。HDR撮像装置10Dが備える各構成要素のうち、変換部12、JPEG圧縮部13、マルチピクチャーフォーマット生成部13CおよびHDR画像補正部15は、図15および図22を用いて説明した画像処理装置100の生成部120および出力部130に対応している。 The HDR imaging device 10D according to the first embodiment may generate one file including HDR still image data and SDR still image data using a multi-picture format method. As illustrated in FIG. 27, the HDR imaging device 10D includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, a multi-picture format generation unit 13C, an SDR imaging unit 14, and an HDR image correction unit 15. Among the components included in the HDR imaging apparatus 10D, the conversion unit 12, the JPEG compression unit 13, the multi-picture format generation unit 13C, and the HDR image correction unit 15 are included in the image processing apparatus 100 described with reference to FIGS. This corresponds to the generation unit 120 and the output unit 130.
 HDR撮像部11は、HDR撮影モードで撮像を行うことで、HDR画像(HDR静止画)を生成する。HDR撮像部11は、例えば、レンズ、イメージセンサ、プロセッサ、メモリ、等により実現される。 The HDR imaging unit 11 generates an HDR image (HDR still image) by performing imaging in the HDR imaging mode. The HDR imaging unit 11 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
 変換部12は、画像処理装置100の生成部120の変換部123(図22参照)に対応する処理部である。変換部12は、HDR撮像部11から出力された非圧縮のHDR静止画データをSDR変換することで、非圧縮のSDR静止画データを生成する。変換部12は、生成した非圧縮のSDR静止画データをJPEG圧縮部13に出力する。変換部12は、例えば、プロセッサ、メモリ、等により実現される。 The conversion unit 12 is a processing unit corresponding to the conversion unit 123 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The conversion unit 12 generates uncompressed SDR still image data by performing SDR conversion on the uncompressed HDR still image data output from the HDR imaging unit 11. The conversion unit 12 outputs the generated uncompressed SDR still image data to the JPEG compression unit 13. The conversion unit 12 is realized by, for example, a processor, a memory, and the like.
 JPEG圧縮部13は、画像処理装置100の生成部120のSDR画像圧縮部124(図22参照)に対応する処理部である。JPEG圧縮部13は、入力された非圧縮のSDR静止画データをJPEG圧縮することで、圧縮されたSDR静止画データを生成する。JPEG圧縮部13は、例えば、プロセッサ、メモリ、等により実現される。 The JPEG compression unit 13 is a processing unit corresponding to the SDR image compression unit 124 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The JPEG compression unit 13 generates compressed SDR still image data by performing JPEG compression on the input non-compressed SDR still image data. The JPEG compression unit 13 is realized by, for example, a processor, a memory, and the like.
 HDR画像補正部15は、画像処理装置100の生成部120のHDR画像処理部121(図22参照)に対応する処理部である。HDR画像補正部15は、HDR撮像部11から取得されるRAWデータから、HDR表示装置30およびHDR表示装置61等のHDRTVで表示可能な非圧縮のHDR静止画データを生成する。HDR画像補正部15は、例えば、プロセッサ、メモリ、等により実現される。 The HDR image correction unit 15 is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The HDR image correction unit 15 generates uncompressed HDR still image data that can be displayed on the HDRTV such as the HDR display device 30 and the HDR display device 61 from the RAW data acquired from the HDR imaging unit 11. The HDR image correction unit 15 is realized by, for example, a processor, a memory, and the like.
The multi-picture format generation unit 13C is a processing unit corresponding to the format unit 125 (see FIG. 22) of the generation unit 120 and to the output unit 130 (see FIG. 15) of the image processing apparatus 100. The multi-picture format generation unit 13C stores the uncompressed HDR still image data and the JPEG-compressed SDR still image data in one file using the multi-picture format method, thereby generating an HDR still image file (JPEG MPF) F100 in the HDR still image file format, and outputs the generated HDR still image file F100. When generating the HDR still image file F100, the multi-picture format generation unit 13C may use either the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output by the conversion unit 12, or the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data obtained by imaging in the SDR imaging unit 14.
The configuration of the HDR still image file F100 generated by the multi-picture format generation unit 13C corresponds, for example, to the configuration of the file F10 of the data unit D10 shown in FIG. 16. The multi-picture format generation unit 13C is realized by, for example, a processor, a memory, and the like.
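As a rough illustration of the idea of packing two independently decodable still image streams into one file, the following Python sketch concatenates a JPEG-compressed SDR stream and an HDR payload and appends a small index of byte offsets. This is only a minimal sketch of the packing concept under assumptions introduced here; it does not implement the actual multi-picture format (JPEG MPF) specification, and the index layout is hypothetical.

    import json
    import struct

    def pack_hdr_still_file(sdr_jpeg: bytes, hdr_payload: bytes, out_path: str) -> None:
        """Write one file holding an SDR JPEG stream followed by an HDR payload.

        A small JSON index (offset/length of each stream) and its 4-byte length
        are appended at the end so a reader can locate both images.
        This layout is illustrative only and is not the MPF specification.
        """
        index = {
            "sdr": {"offset": 0, "length": len(sdr_jpeg)},
            "hdr": {"offset": len(sdr_jpeg), "length": len(hdr_payload)},
        }
        index_bytes = json.dumps(index).encode("utf-8")
        with open(out_path, "wb") as f:
            f.write(sdr_jpeg)      # legacy JPEG decoders typically stop at the EOI marker
            f.write(hdr_payload)   # so the trailing HDR data is ignored by such decoders
            f.write(index_bytes)
            f.write(struct.pack("<I", len(index_bytes)))

    def read_sdr_part(path: str) -> bytes:
        """Recover only the SDR JPEG stream using the appended index."""
        with open(path, "rb") as f:
            data = f.read()
        index_len = struct.unpack("<I", data[-4:])[0]
        index = json.loads(data[-4 - index_len:-4].decode("utf-8"))
        sdr = index["sdr"]
        return data[sdr["offset"]:sdr["offset"] + sdr["length"]]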
The SDR imaging unit 14 generates an SDR image (an SDR still image) by capturing an image in the conventional imaging mode (SDR imaging mode). The SDR imaging unit 14 is realized by, for example, a lens, an image sensor, a processor, a memory, and the like.
Note that each of the processing units described above may be realized by a dedicated circuit that executes the corresponding processing, instead of by a processor and a memory.
The HDR still image file F100 stores two sets of data: SDR still image data (JPEG data for SDR compatibility) and HDR still image data. The HDR still image data corresponds to the first still image data D12 shown in FIG. 16, and the SDR still image data corresponds to the second still image data D13 shown in FIG. 16. The HDR still image file F100 has the extension ".JPG". Therefore, the HDR still image file F100 can be reproduced (displayed or printed) not only by the HDR display device 61, which supports the multi-picture format, but also by the HDR display device 30, the SDR display device 40, and the SDR printing device 50, which do not support the multi-picture format.
One advantage of Example 1 is that devices capable of reproducing existing JPEG files (for example, the SDR display device 40 and the SDR printing device 50) can reproduce the SDR-JPEG data (the compressed SDR still image data generated by the JPEG compression unit 13). Another advantage of Example 1 is that a function for displaying the HDR still image file F100 can be implemented relatively easily in existing HDRTVs (for example, the HDR display device 30 and the HDR display device 61). A further advantage of Example 1 is that the HDR-dedicated processing can be realized relatively easily in the imaging device as well.
Note that Example 1 has described the configuration example in which the HDR still image file F100 has the extension ".JPG", but the present disclosure is not limited to this configuration example. For example, a multi-picture format file having a new extension may be generated. As described above, when a file has the ".JPG" extension and can therefore be edited with image editing software capable of editing JPEG files, that software may delete the HDR still image data from the file. A file having a new extension, however, can be edited, with respect to the two sets of data stored in the file (the HDR still image data and the SDR still image data), only by dedicated image editing software that handles files with that extension, so the possibility of the HDR still image data being deleted is reduced. A multi-picture format file having a new extension has this advantage. However, it is difficult for existing devices (for example, the SDR display device 40 and the SDR printing device 50) to reproduce a multi-picture format file having a new extension.
(1-1-2. Example 2)
FIG. 28 is a diagram for explaining Example 2 in the first embodiment. Example 2 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using an HEVC (High Efficiency Video Coding) moving image file format.
The HDR imaging device 10E according to Example 2 may generate one data unit including HDR still image data in the HEVC moving image file format and SDR still image data. As shown in FIG. 28, the HDR imaging device 10E includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an HEVC compression unit 13D, an SDR imaging unit 14, an HDR image correction unit 15A, and a data unit generation unit 17. Among the components of the HDR imaging device 10E, the conversion unit 12, the JPEG compression unit 13, the HEVC compression unit 13D, the HDR image correction unit 15A, and the data unit generation unit 17 correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22. Note that the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 shown in FIG. 28 have substantially the same configurations as the components of the same names shown in FIG. 27, and therefore their descriptions are omitted.
The HDR image correction unit 15A is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The HDR image correction unit 15A generates uncompressed HDR image data by converting the RAW data acquired from the HDR imaging unit 11, using an HDR-EOTF, into HDR image data that can be displayed on an HDRTV such as the HDR display device 30. The HDR image correction unit 15A is realized by, for example, a processor, a memory, and the like.
The HEVC compression unit 13D is a processing unit corresponding to the HDR image compression unit 122 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The HEVC compression unit 13D compresses the uncompressed HDR image data as a moving image in the HEVC format. The HEVC compression unit 13D is realized by, for example, a processor, a memory, and the like.
The data unit generation unit 17 is a processing unit corresponding to the format unit 125 (see FIG. 22) of the generation unit 120 and to the output unit 130 (see FIG. 15) of the image processing apparatus 100. The data unit generation unit 17 generates, as a data unit D200, an object composed of an HDR still image file F110 including the HDR still image data compressed in the HEVC moving image format and an SDR still image file F120 including the JPEG-compressed SDR still image data, where the bodies of the file names (the file names excluding the extensions) of these files are common to each other. The data unit generation unit 17 then outputs the generated data unit D200.
Note that, to generate the data unit D200, the data unit generation unit 17 may use either the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data output by the conversion unit 12, or the compressed SDR still image data generated by the JPEG compression unit 13 from the SDR still image data obtained by imaging in the SDR imaging unit 14.
The configuration of the data unit D200 generated by the data unit generation unit 17 corresponds, for example, to the configuration of the data unit D20 shown in FIG. 20. The data unit generation unit 17 is realized by, for example, a processor, a memory, and the like.
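A minimal sketch of the data unit D200 idea, in which the HDR still image file and the SDR still image file are associated only by sharing the body of their file names, might look as follows. The directory layout ("DCIM/100_PANA") and the extension ".HVC" are assumptions chosen purely for illustration; the actual extensions and container details are not specified here.

    from pathlib import Path

    def write_data_unit(body: str, hdr_hevc_bytes: bytes, sdr_jpeg_bytes: bytes,
                        out_dir: str = "DCIM/100_PANA") -> tuple[Path, Path]:
        """Write an HDR still image file and an SDR still image file whose
        file names share the same body (e.g. DSC0001.HVC and DSC0001.JPG),
        so that the two files can be handled as one logical data unit."""
        directory = Path(out_dir)
        directory.mkdir(parents=True, exist_ok=True)
        hdr_path = directory / f"{body}.HVC"   # hypothetical extension for the HEVC still
        sdr_path = directory / f"{body}.JPG"   # standard JPEG extension for compatibility
        hdr_path.write_bytes(hdr_hevc_bytes)
        sdr_path.write_bytes(sdr_jpeg_bytes)
        return hdr_path, sdr_path

    def find_companions(path: str) -> list[Path]:
        """Locate the other files of the same logical data unit,
        i.e. files whose name body matches the given file's body."""
        p = Path(path)
        return sorted(q for q in p.parent.glob(p.stem + ".*") if q != p)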
Note that each of the processing units described above may be realized by a dedicated circuit that executes the corresponding processing, instead of by a processor and a memory.
An advantage of Example 2 is that, because the HDR still image data is a file in the HEVC moving image format, it can be displayed on existing devices (for example, the HDR display device 30). In addition, an imaging device that has a recording function for the HEVC moving image format can realize the configuration shown in Example 2 relatively easily.
(1-1-3. Example 3)
FIG. 29 is a diagram for explaining Example 3 in the first embodiment. Example 3 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using the TIFF (Tagged Image File Format) format.
The HDR imaging device 10F according to Example 3 may generate one data unit including uncompressed TIFF-format HDR still image data and uncompressed TIFF-format SDR still image data. As illustrated in FIG. 29, the HDR imaging device 10F includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a TIFF output unit 17A. Among the components of the HDR imaging device 10F, the conversion unit 12, the HDR image correction unit 15B, and the TIFF output unit 17A correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22. Note that the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, and the SDR imaging unit 14 shown in FIG. 29 have substantially the same configurations as the components of the same names shown in FIG. 27, and therefore their descriptions are omitted.
The HDR image correction unit 15B is a processing unit corresponding to the HDR image processing unit 121 (see FIG. 22) of the generation unit 120 of the image processing apparatus 100. The HDR image correction unit 15B converts the RAW data acquired from the HDR imaging unit 11 into a 16/12-bit HDR-OETF (HLG) image (HDR still image data) that can be displayed on an HDRTV compatible with HLG (Hybrid Log-Gamma), such as the HDR display device 62. Because HLG data is SDR compatible, a playback device capable of reproducing TIFF-format data (for example, the HDR display device 62) can reproduce the HDR still image data as it is. The HDR image correction unit 15B is realized by, for example, a processor, a memory, and the like.
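The conversion to an HDR-OETF (HLG) image mentioned above can be illustrated with the HLG opto-electrical transfer function defined in ITU-R BT.2100. The sketch below applies that curve to normalized scene-linear RGB values and quantizes the result to 16 bits as one possible bit depth; the normalization of the camera RAW data to scene-linear values is assumed to have been performed beforehand and is not shown.

    import numpy as np

    # HLG OETF constants from ITU-R BT.2100.
    HLG_A = 0.17883277
    HLG_B = 1.0 - 4.0 * HLG_A                   # 0.28466892
    HLG_C = 0.5 - HLG_A * np.log(4.0 * HLG_A)   # 0.55991073

    def hlg_oetf(linear: np.ndarray) -> np.ndarray:
        """Map normalized scene-linear values in [0, 1] to HLG signal values in [0, 1]."""
        linear = np.clip(linear, 0.0, 1.0)
        low = np.sqrt(3.0 * linear)                                              # E <= 1/12
        high = HLG_A * np.log(np.maximum(12.0 * linear - HLG_B, 1e-12)) + HLG_C  # E > 1/12
        return np.where(linear <= 1.0 / 12.0, low, high)

    def to_hlg_16bit(linear_rgb: np.ndarray) -> np.ndarray:
        """Quantize the HLG-encoded signal to a 16-bit still image array."""
        return np.round(hlg_oetf(linear_rgb) * 65535.0).astype(np.uint16)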
The TIFF output unit 17A is a processing unit corresponding to the format unit 125 (see FIG. 22) of the generation unit 120 and to the output unit 130 (see FIG. 15) of the image processing apparatus 100. The TIFF output unit 17A generates, in the TIFF file format, a TIFF still image file F210 including the HDR still image data and a TIFF still image file F220 including the SDR still image data as one data unit D300, and outputs the generated data unit D300.
Note that, when storing the HDR still image data and the SDR still image data in the respective TIFF files, the TIFF output unit 17A may use a file format in which an HDR tag (identifying SDR, HDR (HLG, System Gamma 1.2), or HDR (PQ)) and a color space tag (identifying sRGB, Adobe RGB, or bt.2020) are attached as TIFF tags. To generate the data unit D300, the TIFF output unit 17A may use either the SDR still image data output from the conversion unit 12 or the SDR still image data obtained by imaging in the SDR imaging unit 14.
The configuration of the data unit D300 generated by the TIFF output unit 17A corresponds, for example, to the configuration of the data unit D20 shown in FIG. 20. The TIFF output unit 17A is realized by, for example, a processor, a memory, and the like.
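The tag scheme described above, an HDR tag distinguishing SDR, HDR (HLG, System Gamma 1.2), and HDR (PQ) plus a color space tag distinguishing sRGB, Adobe RGB, and bt.2020, could be selected per file along the following lines. This is only a sketch of the selection logic; the string values and field names are assumptions, and the step of embedding them as actual TIFF tags with a TIFF library is omitted here.

    from dataclasses import dataclass

    # Illustrative string values for the two tags described above.
    HDR_TAG_VALUES = ("SDR", "HDR(HLG, System Gamma 1.2)", "HDR(PQ)")
    COLOR_SPACE_TAG_VALUES = ("sRGB", "Adobe RGB", "bt.2020")

    @dataclass(frozen=True)
    class StillImageTags:
        hdr_tag: str          # which of HDR_TAG_VALUES applies to this file
        color_space_tag: str  # which of COLOR_SPACE_TAG_VALUES applies to this file

    def tags_for_data_unit_d300(sdr_uses_wide_gamut: bool) -> dict:
        """Return the tag values that would be attached to each TIFF file of data unit D300."""
        return {
            "F210_hdr_tiff": StillImageTags("HDR(HLG, System Gamma 1.2)", "bt.2020"),
            "F220_sdr_tiff": StillImageTags(
                "SDR", "bt.2020" if sdr_uses_wide_gamut else "sRGB"),
        }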
Note that conventional devices (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) cannot handle data in the format shown in Example 3. For this reason, in the HDR imaging device 10F, the conversion unit 12 may perform SDR conversion on the HDR still image data, and the compressed SDR still image data generated by JPEG-compressing the resulting SDR still image data in the JPEG compression unit 13 may be generated (recorded) at the same time as a separate file.
An advantage of using the TIFF format is that TIFF files can be supported simply by implementing a TIFF display function on the HDR display device side, which makes the format easy to realize. Also, the imaging device only needs to add a generation function for producing the data unit D300 in the TIFF file format, so the configuration shown in Example 3 can be realized relatively easily. A device that supports this new file format (for example, the HDR display device 62 or the SDR printing device 63) can reproduce either the TIFF still image file F210 or the TIFF still image file F220 in the data unit D300. Furthermore, a TIFF-compatible HDRTV (for example, the HDR display device 62) can display both the HLG HDR still image data and the wide-color-gamut SDR still image data defined in bt.2020.
Note that each of the processing units described above may be realized by a dedicated circuit that executes the corresponding processing, instead of by a processor and a memory.
(1-1-4. Example 4)
FIG. 30 is a diagram for explaining Example 4 in the first embodiment. Example 4 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using a method of generating HEVC I pictures (an I-picture compression method).
The HDR imaging device 10G according to Example 4 may generate one data unit including HDR still image data and SDR still image data using the HEVC I-picture compression method. As illustrated in FIG. 30, the HDR imaging device 10G includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and an HEVC compression unit 17B. Among the components of the HDR imaging device 10G, the conversion unit 12, the HDR image correction unit 15B, and the HEVC compression unit 17B correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22. Note that the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B shown in FIG. 30 have substantially the same configurations as the components of the same names shown in FIG. 29, and therefore their descriptions are omitted.
The HEVC compression unit 17B is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, and the format unit 125 (see FIG. 22) of the generation unit 120, and to the output unit 130 (see FIG. 15), of the image processing apparatus 100. The HEVC compression unit 17B compresses the HDR still image data output from the HDR image correction unit 15B as an HEVC I picture, and compresses the SDR still image data output from the conversion unit 12 as an HEVC I picture. The HEVC compression unit 17B also generates, as a data unit D400, an object composed of an HEVC-I still image file F310 including the HDR still image data obtained by the compression as an HEVC I picture and an HEVC-I still image file F320 including the SDR still image data as an HEVC I picture, where the bodies of the file names (the file names excluding the extensions) of these files are common to each other. The HEVC compression unit 17B then outputs the generated data unit D400.
Note that, when storing the HDR still image data and the SDR still image data in the respective HEVC-I files, the HEVC compression unit 17B may use a file format in which an HDR tag (identifying SDR, HDR (HLG, System Gamma 1.2), or HDR (PQ)) and a color space tag (identifying sRGB, Adobe RGB, or bt.2020) are attached. To generate the data unit D400, the HEVC compression unit 17B may use either the SDR still image data output from the conversion unit 12 or the SDR still image data obtained by imaging in the SDR imaging unit 14.
The configuration of the data unit D400 generated by the HEVC compression unit 17B corresponds, for example, to the configuration of the data unit D20 shown in FIG. 20. The HEVC compression unit 17B is realized by, for example, a processor, a memory, and the like.
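One way to obtain an HEVC I-picture-only still from an uncompressed image, in the spirit of the HEVC-I still image files described above, is to invoke an external HEVC encoder. The sketch below shells out to ffmpeg with libx265 and encodes a single frame as a raw HEVC bitstream; the availability of an ffmpeg build with 10-bit libx265, the pixel format, and the ".HVC" output name are assumptions, and this is not presented as the encoder actually used by the HDR imaging device 10G.

    import subprocess
    from pathlib import Path

    def encode_hevc_i_picture(src_image: str, dst_body: str, ten_bit: bool = True) -> Path:
        """Encode one input picture as a single HEVC intra-coded frame (an I picture only),
        producing e.g. DSC0004.HVC alongside a JPEG companion of the same name body."""
        dst = Path(f"{dst_body}.HVC")          # hypothetical extension for the HEVC-I still
        pix_fmt = "yuv420p10le" if ten_bit else "yuv420p"  # 10-bit needs a main10 libx265 build
        cmd = [
            "ffmpeg", "-y",
            "-i", src_image,
            "-frames:v", "1",                  # a single frame, hence intra-coded only
            "-c:v", "libx265",
            "-pix_fmt", pix_fmt,
            "-f", "hevc",                      # raw HEVC bitstream muxer
            str(dst),
        ]
        subprocess.run(cmd, check=True)
        return dst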
Note that conventional devices (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) cannot handle data in the format shown in Example 4. For this reason, in the HDR imaging device 10G, the conversion unit 12 may perform SDR conversion on the HDR still image data, and the compressed SDR still image data generated by JPEG-compressing the resulting SDR still image data in the JPEG compression unit 13 may be generated (recorded) at the same time as a separate file.
The HEVC compression unit 17B may also generate an HDR still image file (JPEG MPF) in the HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file using the multi-picture format method. That is, the HEVC compression unit 17B may generate an HDR still image file having the same configuration as the data unit D10 shown in FIG. 16.
Alternatively, the HEVC compression unit 17B may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
An advantage of using the HEVC I-picture compression method is that, because existing HDR display devices already incorporate an HEVC decoding function, display or reproduction of HEVC I pictures can be implemented relatively easily. Also, because imaging devices that support 4K video capture usually have an HEVC compression function, a function for compressing HDR still image data and SDR still image data as HEVC I pictures can be implemented relatively easily. A printing device that can reproduce HEVC I pictures (for example, the SDR printing device 63) can print either the HDR HEVC-I still image file F310 or the SDR HEVC-I still image file F320. Furthermore, an HDRTV capable of displaying HEVC I pictures (for example, the HDR display device 62) can display both an HLG HDR still image and a wide-color-gamut SDR still image defined in bt.2020.
Note that each of the processing units described above may be realized by a dedicated circuit that executes the corresponding processing, instead of by a processor and a memory.
(1-1-5. Example 5)
FIG. 31 is a diagram for explaining Example 5 in the first embodiment. Example 5 shows a configuration example in which one data unit including HDR still image data and SDR still image data is generated using a JPEG2000 compression method.
The HDR imaging device 10H according to Example 5 may generate one data unit including HDR still image data and SDR still image data using the JPEG2000 compression method. As illustrated in FIG. 31, the HDR imaging device 10H includes an HDR imaging unit 11, a conversion unit 12, a JPEG compression unit 13, an SDR imaging unit 14, an HDR image correction unit 15B, and a JPEG2000 compression unit 17C. Among the components of the HDR imaging device 10H, the conversion unit 12, the HDR image correction unit 15B, and the JPEG2000 compression unit 17C correspond to the generation unit 120 and the output unit 130 of the image processing apparatus 100 described with reference to FIGS. 15 and 22. Note that the HDR imaging unit 11, the conversion unit 12, the JPEG compression unit 13, the SDR imaging unit 14, and the HDR image correction unit 15B shown in FIG. 31 have substantially the same configurations as the components of the same names shown in FIG. 29, and therefore their descriptions are omitted.
The JPEG2000 compression unit 17C is a processing unit corresponding to the HDR image compression unit 122, the SDR image compression unit 124, and the format unit 125 (see FIG. 22) of the generation unit 120, and to the output unit 130 (see FIG. 15), of the image processing apparatus 100. The JPEG2000 compression unit 17C compresses the HDR still image data output from the HDR image correction unit 15B using the JPEG2000 method, and compresses the SDR still image data output from the conversion unit 12 using the JPEG2000 method. The JPEG2000 compression unit 17C also generates, as a data unit D500, an object composed of a JPEG2000 still image file F410 including the JPEG2000-format HDR still image data obtained by the compression and a JPEG2000 still image file F420 including the JPEG2000-format SDR still image data, where the bodies of the file names (the file names excluding the extensions) of these files are common to each other. The JPEG2000 compression unit 17C then outputs the generated data unit D500.
Note that, when storing the HDR still image data and the SDR still image data as respective JPEG2000 files, the JPEG2000 compression unit 17C may use a file format in which an HDR tag (identifying SDR, HDR (HLG, System Gamma 1.2), or HDR (PQ)) and a color space tag (identifying sRGB, Adobe RGB, or bt.2020) are attached. To generate the data unit D500, the JPEG2000 compression unit 17C may use either the SDR still image data output from the conversion unit 12 or the SDR still image data obtained by imaging in the SDR imaging unit 14.
The configuration of the data unit D500 generated by the JPEG2000 compression unit 17C corresponds, for example, to the configuration of the data unit D20 shown in FIG. 20. The JPEG2000 compression unit 17C is realized by, for example, a processor, a memory, and the like.
Note that conventional devices (for example, the SDR display device 40, the SDR printing device 50, or the existing HDR display device 30) cannot handle data in the format shown in Example 5. For this reason, in the HDR imaging device 10H, the conversion unit 12 may perform SDR conversion on the HDR still image data, and the compressed SDR still image data generated by JPEG-compressing the resulting SDR still image data in the JPEG compression unit 13 may be generated (recorded) at the same time as a separate file.
The JPEG2000 compression unit 17C may also generate an HDR still image file (JPEG MPF) in the HDR still image file format in which the HDR still image data and the SDR still image data are stored in one file using the multi-picture format method. That is, the JPEG2000 compression unit 17C may generate an HDR still image file having the same configuration as the data unit D10 shown in FIG. 16.
Alternatively, the JPEG2000 compression unit 17C may generate an HDR still image file in an HDR still image file format, different from the multi-picture format method, that stores the HDR still image data and the SDR still image data in one file.
An advantage of using the JPEG2000 compression method is that a function supporting JPEG2000 can be implemented relatively easily in existing HDR display devices. A JPEG2000-compatible HDRTV (for example, the HDR display device 62) can reproduce either the HDR JPEG2000 still image file F410 or the SDR JPEG2000 still image file F420. A JPEG2000-compatible HDRTV can also display both an HLG HDR still image and a wide-color-gamut SDR still image defined in bt.2020.
Note that each of the processing units described above may be realized by a dedicated circuit that executes the corresponding processing, instead of by a processor and a memory.
(1-1-6. Example 6)
FIG. 32 is a block diagram schematically showing an example of the configuration of the playback device in the first embodiment.
As shown in FIG. 32, the playback device 200 includes an acquisition unit 210 and a playback unit 220. The playback device 200 may further include a display unit (not shown) and be realized as a display device that displays the playback result on the display unit. Alternatively, the playback device 200 may further include a printing unit and be realized as a printing device that prints the playback result on a print medium such as paper.
The acquisition unit 210 acquires a logically single data unit. The logically single data unit includes HDR still image data and SDR still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other.
The playback unit 220 plays back one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210. The playback unit 220 may select and play back the HDR still image data in accordance with the auxiliary information added to the data unit. When the auxiliary information includes luminance region information, the playback unit 220 may adjust the luminance of the HDR still image data so that the entire luminance region indicated by the luminance region information as being prioritized is included, and play back the luminance-adjusted image data. For example, when playing back HDR still image data whose luminance region information indicates that a high-luminance region is prioritized (for example, the HDR still image data shown in FIG. 18), the playback unit 220 may convert the HDR still image data into image data whose luminance has been adjusted so that the entire prioritized high-luminance region is included, and play back the converted image data. Similarly, when playing back HDR still image data whose luminance region information indicates that a low-luminance region is prioritized (for example, the HDR still image data shown in FIG. 19), the playback unit 220 may convert the HDR still image data into image data whose luminance has been adjusted so that the entire prioritized low-luminance region is included, and play back the converted image data.
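The selection behavior just described can be illustrated with the following sketch: given a data unit holding both streams and the auxiliary information, the player picks the HDR still image data when the display supports HDR and otherwise falls back to the SDR still image data, and a luminance adjustment hook is applied when luminance region information is present. The field names and the adjustment placeholder are assumptions introduced for illustration, not the playback device's actual interfaces.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AuxiliaryInfo:
        hdr_is_higher_quality: bool = True          # HDR data yields the higher-grade image
        prioritized_region: Optional[str] = None    # "high", "low", or None
        region_threshold: Optional[float] = None    # luminance threshold for that region

    @dataclass
    class DataUnit:
        hdr_still: bytes
        sdr_still: bytes
        aux: AuxiliaryInfo

    def select_and_prepare(unit: DataUnit, display_supports_hdr: bool) -> bytes:
        """Choose which of the two independently reproducible streams to play back."""
        if display_supports_hdr and unit.aux.hdr_is_higher_quality:
            data = unit.hdr_still
            if unit.aux.prioritized_region is not None:
                # Adjust luminance so the whole prioritized region stays visible.
                data = adjust_luminance(data, unit.aux.prioritized_region,
                                        unit.aux.region_threshold)
            return data
        return unit.sdr_still

    def adjust_luminance(hdr_data: bytes, region: str, threshold: Optional[float]) -> bytes:
        # Placeholder: a real player would tone-map so that the prioritized high- or
        # low-luminance region is fully contained in the display's range.
        return hdr_data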
Next, operations related to the playback processing of the playback device 200 will be described with reference to a flowchart.
FIG. 33 is a flowchart showing an example of operations related to the playback processing of the playback device 200 in the first embodiment.
The acquisition unit 210 of the playback device 200 acquires a data unit (step S201).
The playback unit 220 plays back one of the HDR still image data and the SDR still image data included in the data unit acquired by the acquisition unit 210 in step S201 (step S202).
Note that the acquisition unit 210 and the playback unit 220 may be realized by, for example, a processor that executes a predetermined program (a program created to execute each of the processes described above) and a memory storing the predetermined program, or may be realized by a dedicated circuit that executes each of the processes described above instead of a processor and a memory.
(1-1-7. Example 7)
Next, a specific example of the operation of the playback device 200 will be described.
FIG. 34 is a diagram for explaining a specific example of the auxiliary information in the first embodiment.
As described above, the image processing apparatus 100 (see FIG. 15) may add auxiliary information when generating a data unit. That is, the image processing apparatus 100 may generate a data unit to which the auxiliary information has been added and output the data unit with the auxiliary information.
In this configuration, the auxiliary information may include information indicating that a high-quality image can be reproduced. For example, when the playback device 200 is an SDR playback device that supports the new still image format, the auxiliary information may include information used by the SDR playback device, when it plays back the SDR still image data, to determine whether to display the JPEG-format SDR image (route B shown in FIG. 34) or to display an SDR image obtained by SDR conversion of the HDR image (HLG) (route A shown in FIG. 34). For example, the auxiliary information may include information indicating that a higher-quality image can be reproduced when the SDR playback device performs SDR conversion on the HDR image (HLG) and displays the result.
The auxiliary information may also include luminance region information indicating whether a high-luminance region is prioritized, or whether a low-luminance region is prioritized, in the luminance of the still image represented by the HDR still image data. That is, the auxiliary information may include a flag indicating whether the still image was generated with priority given to a high-luminance region when the HDR image was generated, or a flag indicating whether the still image was generated with priority given to a low-luminance region. In this case, the auxiliary information may also include a threshold defining the high-luminance region or the low-luminance region that was prioritized when the still image was generated.
The auxiliary information may also include an instruction flag (a conversion prohibition flag) indicating the photographer's instruction that, when an SDR image is displayed on an SDR display device, the JPEG-format SDR image should be displayed as it is rather than converting the HDR image into an SDR image.
The HDR still image data, the SDR still image data, and the management information may be stored in one file. Alternatively, the HDR still image data, the SDR still image data, and the management information may be generated as a logically single data unit in which the bodies of the file names (the file names excluding the extensions) are common to one another and the files are associated with one another as a plurality of files of a DCF (Design rule for Camera File System) object.
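For the SDR playback route choice shown in FIG. 34, a sketch of how the auxiliary information items listed above (the high-quality-reproduction indication, the luminance-region flag and threshold, and the photographer's conversion prohibition flag) could drive the decision between route A (SDR-convert the HLG HDR image) and route B (display the JPEG SDR image as-is) might look like this. All field names are assumptions; the actual layout of the management information is not defined here.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SdrRouteAuxInfo:
        sdr_from_hdr_is_higher_quality: bool = False  # HLG->SDR conversion yields a better image
        conversion_prohibited: bool = False           # photographer: show the JPEG SDR image as-is
        high_luminance_priority: Optional[bool] = None
        priority_threshold_nits: Optional[float] = None

    def choose_sdr_route(aux: SdrRouteAuxInfo) -> str:
        """Return "A" to SDR-convert the HDR (HLG) image, or "B" to show the JPEG SDR image."""
        if aux.conversion_prohibited:
            return "B"                                # the conversion prohibition flag wins
        if aux.sdr_from_hdr_is_higher_quality:
            return "A"
        return "B"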
(1-2. Effects, etc.)
As described above, in the present embodiment, the image processing apparatus includes an acquisition unit that acquires still image data obtained by imaging; a generation unit that uses the still image data acquired by the acquisition unit to generate a logically single data unit including first still image data and second still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other; and an output unit that outputs the data unit generated by the generation unit.
Also, in the present embodiment, the playback device includes an acquisition unit that acquires a logically single data unit including first still image data and second still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other, and a playback unit that plays back one of the first still image data and the second still image data included in the data unit acquired by the acquisition unit.
Also, in the present embodiment, the image processing method acquires still image data obtained by imaging, generates, using the acquired still image data, a logically single data unit including first still image data and second still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other, and outputs the generated data unit.
Also, in the present embodiment, the playback method acquires a logically single data unit including first still image data and second still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other, and plays back one of the first still image data and the second still image data included in the acquired data unit.
Note that the image processing apparatus 100 is an example of the image processing apparatus. The acquisition unit 110 is an example of the acquisition unit included in the image processing apparatus. The generation unit 120 and the generation unit 120A are each an example of the generation unit. The output unit 130 is an example of the output unit. The multi-picture format generation unit 13C, the data unit generation unit 17, the TIFF output unit 17A, the HEVC compression unit 17B, and the JPEG2000 compression unit 17C each correspond to the output unit 130. The HDR still image data is an example of the first still image data. The SDR still image data is an example of the second still image data. The playback device 200 is an example of the playback device. The acquisition unit 210 is an example of the acquisition unit included in the playback device. The data unit D10, the data unit D20, the data unit D30, the data unit D200, the data unit D300, the data unit D400, and the data unit D500 are each an example of the data unit generated by the generation unit of the image processing apparatus and an example of the data unit acquired by the acquisition unit of the playback device. The playback unit 220 is an example of the playback unit.
For example, in the example shown in the first embodiment, the image processing apparatus 100 includes the acquisition unit 110, the generation unit 120, and the output unit 130. The acquisition unit 110 acquires still image data. The generation unit 120 uses the still image data acquired by the acquisition unit 110 to generate a logically single data unit including first still image data (HDR still image data) and second still image data (SDR still image data) that have luminance dynamic ranges different from each other and that can be reproduced independently of each other. The output unit 130 outputs the data unit (for example, the data unit D10) generated by the generation unit 120.
Also, in the example shown in the first embodiment, the playback device 200 includes the acquisition unit 210, which acquires a logically single data unit (for example, the data unit D10) including first still image data (HDR still image data) and second still image data (SDR still image data) that have luminance dynamic ranges different from each other and that can be reproduced independently of each other, and the playback unit 220, which plays back one of the first still image data (HDR still image data) and the second still image data (SDR still image data) included in the data unit (for example, the data unit D10) acquired by the acquisition unit 210.
The image processing apparatus 100 configured in this manner can output a logically single data unit including HDR still image data and SDR still image data that have luminance dynamic ranges different from each other and that can be reproduced independently of each other, and the playback device 200 can acquire and play back that data unit. The image processing apparatus 100 therefore outputs the data unit, and either the HDR still image data or the SDR still image data included in the data unit can be played back by a playback device (for example, the playback device 200). Accordingly, the image processing apparatus 100 can provide still image data that is highly convenient for the user.
In the image processing apparatus, the generation unit may generate one file including the first still image data and the second still image data as the data unit.
Note that the first still image data D12 is an example of the first still image data, the second still image data D13 is an example of the second still image data, the file F10 is an example of one file including the first still image data and the second still image data, and the data unit D10 is an example of the data unit generated by the generation unit. The HDR still image file F100 corresponds to the file F10.
For example, in the example shown in the first embodiment, in the image processing apparatus 100, the generation unit 120 generates one file F10 including the first still image data D12 (HDR still image data) and the second still image data D13 (SDR still image data) as the data unit D10.
The image processing apparatus 100 configured in this manner prevents the HDR still image data and the SDR still image data from being managed separately.
In the image processing apparatus, the generation unit may generate, as the data unit, an object composed of one first still image file including the first still image data and a second still image file that includes the second still image data and whose file name body is the same as that of the first still image file.
Note that the first still image data D22 and the first still image data D32 are each an example of the first still image data. The second still image data D24 and the second still image data D33 are each an example of the second still image data. The first still image file F21 and the first still image file F32 are each an example of the first still image file. The second still image file F22 and the second still image file F33 are each an example of the second still image file. DSC0002 and DSC0003 are each an example of the body of a file name (the file name excluding the extension). The data unit D20 and the data unit D30 are each an example of the data unit generated by the generation unit. The HDR still image file F110, the TIFF still image file F210, the HEVC-I still image file F310, and the JPEG2000 still image file F410 each correspond to the first still image file F21 (or the first still image file F32). The SDR still image file F120, the TIFF still image file F220, the HEVC-I still image file F320, and the JPEG2000 still image file F420 each correspond to the second still image file F22 (or the second still image file F33). The data unit D200, the data unit D300, the data unit D400, and the data unit D500 each correspond to the data unit D20 (or the data unit D30).
For example, in the example shown in the first embodiment, in the image processing apparatus 100, the generation unit 120 may generate, as the data unit (for example, the data unit D20), an object composed of one HDR still image file (for example, the first still image file F21) including HDR still image data (for example, the first still image data D22) and an SDR still image file (for example, the second still image file F22) that includes SDR still image data (for example, the second still image data D24) and whose file name body (for example, DSC0002) is the same as the file name body (for example, DSC0002) of the HDR still image file (for example, the first still image file F21).
With the image processing apparatus 100 configured in this manner, a playback device (for example, the playback device 200) can reproduce an image using whichever of the HDR still image file and the SDR still image file it supports.
In the image processing apparatus, the generation unit may further add, to the data unit, auxiliary information indicating that reproducing the first still image data can reproduce a higher-quality image than the image obtained by reproducing the second still image data.
Note that the auxiliary information included in the management data D11 is an example of the auxiliary information.
For example, in the example shown in the first embodiment, in the image processing apparatus 100, the generation unit 120 adds, to the data unit (for example, the data unit D10), auxiliary information indicating that reproducing the HDR still image data (for example, the first still image data D12) can reproduce a higher-quality image than the image obtained by reproducing the SDR still image data (for example, the second still image data D13).
With the image processing apparatus 100 configured in this manner, a playback device (for example, the playback device 200) that receives the data unit (for example, the data unit D10) can play back the still image, in accordance with the auxiliary information, at a quality that makes full use of its playback capability.
In the image processing apparatus, the generation unit may add, to the data unit, auxiliary information including luminance region information indicating whether a high-luminance region is prioritized, or whether a low-luminance region is prioritized, in the luminance of the still image represented by the first still image data.
For example, in the example shown in the first embodiment, in the image processing apparatus 100, the generation unit 120 may add, to the data unit, auxiliary information including luminance region information indicating whether a high-luminance region is prioritized, or whether a low-luminance region is prioritized, in the luminance of the still image represented by the HDR still image data (first still image data).
 このように構成された画像処理装置100により、そのデータ単位を受け取った再生装置(例えば、再生装置200)は、補助情報に応じて、その再生装置の再生能力を有効に活用した品質で静止画を再生することができる。 The reproduction apparatus (for example, the reproduction apparatus 200) that has received the data unit by the image processing apparatus 100 configured as described above, according to the auxiliary information, takes a still image with a quality that effectively utilizes the reproduction capability of the reproduction apparatus. Can be played.
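 The luminance region information can be understood as a hint for how a playback apparatus tone-maps the HDR still image onto its own peak luminance. The following Python sketch, with assumed field values ("high" or "low") and a simple knee-point choice, is only one conceivable interpretation and is not an implementation defined by this disclosure.

    def tone_map_nits(pixel_nits: float, region_priority: str, display_peak_nits: float) -> float:
        """Map a scene luminance (in nits) onto the display, biasing the roll-off
        according to the luminance region information in the auxiliary information."""
        # Assumed convention: "high" starts rolling off earlier to leave headroom for
        # highlight detail; otherwise low and mid luminance is kept unchanged for longer.
        knee = 0.5 * display_peak_nits if region_priority == "high" else 0.8 * display_peak_nits
        if pixel_nits <= knee:
            return pixel_nits  # below the knee, luminance passes through unchanged
        # Compress everything above the knee into the remaining headroom.
        headroom = display_peak_nits - knee
        excess = pixel_nits - knee
        return knee + headroom * (excess / (excess + headroom))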
 In the image processing apparatus, the first still image data may be HDR image data, and the second still image data may be SDR image data.
 For example, in the example described in the first embodiment, in the image processing apparatus 100, the first still image data is HDR image data, and the second still image data is SDR image data.
 (Other Embodiments)
 As described above, the first embodiment has been described as an example of the technique disclosed in the present application. However, the technique of the present disclosure is not limited to this embodiment, and is also applicable to embodiments in which modifications, replacements, additions, omissions, and the like have been made. It is also possible to combine the components described in Examples 1 to 7 of the first embodiment to form a new embodiment.
 In the above embodiment, each component may be configured with dedicated hardware, or may be realized by having a processor execute a software program suitable for the component. Each component may also be realized by a program execution unit such as a CPU (Central Processing Unit) or other processor reading out and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory. Here, the software that realizes the image processing method or the playback method of the above embodiment is a program such as the following.
 That is, the program causes a computer to execute an image processing method of acquiring still image data obtained by imaging, generating, using the acquired still image data, logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other, and outputting the generated data unit.
 Alternatively, the program causes a computer to execute a playback method of acquiring logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other, and reproducing one of the first still image data and the second still image data included in the acquired data unit.
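 Purely as a hedged illustration of how such a program might be organized, the two methods could be sketched in Python as follows; the function names, the callback parameters, and the in-memory form of the data unit are assumptions made here for the sake of the example, not the recorded implementation.

    from typing import Callable, Dict

    def image_processing_method(capture: Callable[[], bytes],
                                derive_sdr: Callable[[bytes], bytes],
                                output: Callable[[Dict[str, bytes]], None]) -> None:
        """Acquire still image data, generate logically one data unit containing
        independently playable HDR and SDR still image data, and output it."""
        hdr_still = capture()              # still image data obtained by imaging
        sdr_still = derive_sdr(hdr_still)  # e.g. a converted, narrower-dynamic-range version
        data_unit = {"first_still_image": hdr_still, "second_still_image": sdr_still}
        output(data_unit)

    def playback_method(acquire: Callable[[], Dict[str, bytes]], supports_hdr: bool,
                        render: Callable[[bytes], None]) -> None:
        """Acquire the data unit and reproduce one of the two still image data."""
        data_unit = acquire()
        key = "first_still_image" if supports_hdr else "second_still_image"
        render(data_unit[key])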
 Note that the components described in the accompanying drawings and in the detailed description may include not only components that are essential to solving the problem but also components that are not essential to solving the problem and are included merely to illustrate the above technique. Accordingly, the fact that such non-essential components appear in the accompanying drawings or in the detailed description should not be taken to mean that they are essential.
 In addition, because the above embodiments are intended to illustrate the technique of the present disclosure, various modifications, replacements, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
 The present disclosure is applicable to an image processing apparatus capable of obtaining highly convenient still image data, a playback apparatus capable of reproducing such still image data, an image processing method, and a playback method. Specifically, the present disclosure is applicable to an imaging device such as a camera, a display device such as a television, a printing device, and the like.
10,10A,10B,10C,10D,10E,10F,10G,10H  HDR imaging device
11  HDR imaging unit
12  conversion unit
13,13A  JPEG compression unit
13B  JPEG XT compression unit
13C  multi-picture format generation unit
13D  HEVC compression unit
14  SDR imaging unit
15,15A,15B  HDR image correction unit
16  HDMI output unit
17  data unit generation unit
17A  TIFF output unit
17B  HEVC compression unit
17C  JPEG2000 compression unit
20  SDR imaging device
21  SDR imaging unit
22  JPEG compression unit
30,60,61,62  HDR display device
40  SDR display device
50,63  SDR printing device
100  image processing apparatus
110  acquisition unit
120,120A  generation unit
121  HDR image processing unit
122  HDR image compression unit
123  conversion unit
123A  SDR image processing unit
124  SDR image compression unit
125  format unit
130  output unit
200  playback apparatus
210  acquisition unit
220  playback unit
D10,D20,D30  data unit
D11,D31  management data
D12,D22,D32  first still image data
D13,D24,D33  second still image data
D21  first management data
D23  second management data
D200,D300,D400,D500  data unit
F10  file
F21,F32  first still image file
F22,F33  second still image file
F31  management file
F100,F110  HDR still image file
F120  SDR still image file
F210,F220  TIFF still image file
F310,F320  HEVC-I still image file
F410,F420  JPEG2000 still image file

Claims (9)

  1. An image processing apparatus comprising:
    an acquisition unit that acquires still image data obtained by imaging;
    a generation unit that, using the still image data acquired by the acquisition unit, generates logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other; and
    an output unit that outputs the data unit generated by the generation unit.
  2. The image processing apparatus according to claim 1, wherein
    the generation unit generates, as the data unit, one file including the first still image data and the second still image data.
  3. The image processing apparatus according to claim 1, wherein
    the generation unit generates, as the data unit, an object composed of one first still image file including the first still image data, and a second still image file that includes the second still image data and whose file name body is the same as that of the first still image file.
  4. The image processing apparatus according to any one of claims 1 to 3, wherein
    the generation unit further adds, to the data unit, auxiliary information indicating that reproducing the first still image data can yield a higher-quality image than the image obtained by reproducing the second still image data.
  5. The image processing apparatus according to claim 4, wherein
    the generation unit adds, to the data unit, the auxiliary information including luminance region information indicating whether the luminance of the still image represented by the first still image data gives priority to a high-luminance region, or luminance region information indicating whether it gives priority to a low-luminance region.
  6. The image processing apparatus according to any one of claims 1 to 5, wherein
    the first still image data is HDR (High Dynamic Range) image data, and
    the second still image data is SDR (Standard Dynamic Range) image data.
  7. A playback apparatus comprising:
    an acquisition unit that acquires logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other; and
    a playback unit that reproduces one of the first still image data and the second still image data included in the data unit acquired by the acquisition unit.
  8. An image processing method comprising:
    acquiring still image data obtained by imaging;
    generating, using the acquired still image data, logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other; and
    outputting the generated data unit.
  9. A playback method comprising:
    acquiring logically one data unit including first still image data and second still image data that differ from each other in luminance dynamic range and can be reproduced independently of each other; and
    reproducing one of the first still image data and the second still image data included in the acquired data unit.
PCT/JP2017/026748 2016-07-27 2017-07-25 Image processing device, reproduction device, image processing method, and reproduction method WO2018021261A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2018529885A JPWO2018021261A1 (en) 2016-07-27 2017-07-25 Image processing apparatus, reproduction apparatus, image processing method, and reproduction method
EP17834267.1A EP3493532B8 (en) 2016-07-27 2017-07-25 Image processing device and image processing method
CN201780046042.0A CN109479111B (en) 2016-07-27 2017-07-25 Image processing apparatus, reproduction apparatus, image processing method, and reproduction method
US16/317,081 US11184596B2 (en) 2016-07-27 2017-07-25 Image processing device, reproduction device, image processing method, and reproduction method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662367425P 2016-07-27 2016-07-27
US62/367425 2016-07-27
JP2017123170 2017-06-23
JP2017-123170 2017-06-23

Publications (1)

Publication Number Publication Date
WO2018021261A1 true WO2018021261A1 (en) 2018-02-01

Family

ID=61016485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/026748 WO2018021261A1 (en) 2016-07-27 2017-07-25 Image processing device, reproduction device, image processing method, and reproduction method

Country Status (1)

Country Link
WO (1) WO2018021261A1 (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007534238A (en) * 2004-04-23 2007-11-22 ブライトサイド テクノロジーズ インコーポレイテッド Encoding, decoding and representation of high dynamic range images
JP2007074124A (en) * 2005-09-05 2007-03-22 Murata Mach Ltd Network facsimile machine, image output system, and image output method
JP2008204266A (en) * 2007-02-21 2008-09-04 Canon Inc File management system, its control method and program
JP2014023062A (en) * 2012-07-20 2014-02-03 Canon Inc Image pickup device and control method thereof
JP2014204175A (en) * 2013-04-01 2014-10-27 キヤノン株式会社 Image processing apparatus and control method thereof
JP2015056807A (en) 2013-09-12 2015-03-23 キヤノン株式会社 Imaging apparatus and control method for the same
WO2016039025A1 (en) * 2014-09-08 2016-03-17 ソニー株式会社 Information processing device, information recording medium, information processing method, and program
WO2016039172A1 (en) * 2014-09-12 2016-03-17 ソニー株式会社 Playback device, playback method, information processing device, information processing method, program, and recording medium

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019159620A1 (en) * 2018-02-16 2019-08-22 キヤノン株式会社 Imaging device, recording device, and display control device
JP2019145917A (en) * 2018-02-16 2019-08-29 キヤノン株式会社 Imaging apparatus, recording device and display control device
CN111727603A (en) * 2018-02-16 2020-09-29 佳能株式会社 Image pickup apparatus, recording apparatus, and display control apparatus
US11308991B2 (en) 2018-02-16 2022-04-19 Canon Kabushiki Kaisha Image capture device, recording device, and display control device
CN111727603B (en) * 2018-02-16 2023-03-21 佳能株式会社 Image pickup apparatus, control method, and recording medium
JP7246855B2 (en) 2018-02-16 2023-03-28 キヤノン株式会社 Imaging device, recording device and display control device
CN113491104A (en) * 2019-02-28 2021-10-08 佳能株式会社 Image capturing apparatus, image processing apparatus, control method therefor, and program
CN113491104B (en) * 2019-02-28 2023-07-18 佳能株式会社 Image pickup apparatus, image processing apparatus, control method therefor, and storage medium
US11750934B2 (en) 2019-02-28 2023-09-05 Canon Kabushiki Kaisha Imaging apparatus, image processing apparatus, control method of these, and storage medium

Similar Documents

Publication Publication Date Title
CN109479111B (en) Image processing apparatus, reproduction apparatus, image processing method, and reproduction method
JP5991502B2 (en) Conversion method and conversion device
US10567727B2 (en) Reproduction method, creation method, reproduction device, creation device, and recording medium
JP2016225965A (en) Display method and display device
JP2005252754A (en) Apparatus and method for creating and reproducing image file
JP3926947B2 (en) Image data forming apparatus and image data processing method
US9756278B2 (en) Image processing system and image capturing apparatus
US20090154551A1 (en) Apparatus for recording/reproducing moving picture, and recording medium thereof
US20160037014A1 (en) Imaging apparatus and imaging apparatus control method
WO2018021261A1 (en) Image processing device, reproduction device, image processing method, and reproduction method
US8625002B2 (en) Image processing apparatus and control method thereof for use in multiplexing image data and additional information
US20090153704A1 (en) Recording and reproduction apparatus and methods, and a storage medium having recorded thereon computer program to perform the methods
US8379093B2 (en) Recording and reproduction apparatus and methods, and a recording medium storing a computer program for executing the methods
US10965925B2 (en) Image capturing apparatus, client apparatus, control method, and storage medium
JP2010021710A (en) Imaging device, image processor, and program
JP2007274661A (en) Imaging apparatus, image reproducing device and program
EP3522526B1 (en) Editing method, creation method, editing device, creation device, and recording medium
JP2008312021A (en) Image display system, image reproducing device, photographic equipment
JP2018019122A (en) Image data processor and imaging apparatus
JP2017011676A (en) Image-processing apparatus and image-processing method
JP4809451B2 (en) Image file generating apparatus and method, and image file reproducing apparatus and method
JP2010081510A (en) Video processing apparatus, and video processing method
JP2009164768A (en) Image file creation device, image file creation method, image file restoration device
JP2009124224A (en) Color data processing apparatus and display data processing program
JP2012253780A (en) Image display system, image reproducing device, and photographic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17834267

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018529885

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017834267

Country of ref document: EP

Effective date: 20190227