WO2018092711A1 - Image processing device, image processing method, and program - Google Patents

Image processing device, image processing method, and program

Info

Publication number
WO2018092711A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
still image
printing
luminance
hdr
Prior art date
Application number
PCT/JP2017/040732
Other languages
French (fr)
Japanese (ja)
Inventor
小塚 雅之
柏木 吉一郎
山本 雅哉
上坂 靖
美裕 森
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Priority to EP17871075.2A (EP3544280B1)
Priority to CN201780070657.7A (CN109983754B)
Priority to US16/348,026 (US10726315B2)
Priority to JP2018551611A (JP6719061B2)
Publication of WO2018092711A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/40 Picture signal circuits
    • H04N 1/407 Control or modification of tonal gradation or of extreme levels, e.g. background level
    • H04N 1/4072 Control or modification of tonal gradation or of extreme levels, e.g. background level dependent on the contents of the original
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present disclosure relates to an image processing apparatus and an image processing method.
  • Patent Document 1 discloses an imaging device that records a wide dynamic range HDR (High Dynamic Range) still image by combining a plurality of images with different exposures.
  • In the technique disclosed in Patent Document 1, it is difficult to obtain still image data for printing high-quality printed matter.
  • the present disclosure provides an image processing apparatus and an image processing method that can obtain still image data for printing high-quality printed matter.
  • An image processing apparatus according to the present disclosure includes: an acquisition unit that acquires first still image data, obtained by imaging and having a luminance range defined by a first dynamic range, and capability information indicating the printing capability of a printing apparatus; a conversion unit that converts the first still image data acquired by the acquisition unit into second still image data defined by a second dynamic range having a luminance range narrower than the first dynamic range, according to the printing capability indicated by the capability information acquired by the acquisition unit; and an output unit that outputs the second still image data converted by the conversion unit to the printing apparatus.
  • In an image processing method according to the present disclosure, first still image data, obtained by imaging and having a luminance range defined by a first dynamic range, is acquired, and capability information indicating the printing capability of a printing apparatus is acquired.
  • The acquired first still image data is converted into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range, according to the printing capability indicated by the acquired capability information.
  • The converted second still image data is output to the printing apparatus.
  • the image processing apparatus and the image processing method in the present disclosure can obtain still image data for printing high-quality printed matter.
  • FIG. 1 is a schematic view for explaining the evolution of video technology.
  • FIG. 2 is a schematic view for explaining the HDR display technology.
  • FIG. 3A is a schematic view for explaining a PQ (Perceptual Quantization) method.
  • FIG. 3B is a schematic view for explaining the HLG (Hybrid Log Gamma) method.
  • FIG. 4 is a diagram showing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR (Standard Dynamic Range) in comparison.
  • FIG. 5 is a diagram showing an example of a scale of luminance when capturing an image.
  • FIG. 6 is a diagram showing an example of the luminance of a captured image.
  • FIG. 7A is a view showing an example of the luminance as a result of mastering the original image shown in FIG. 6 into the SDR image.
  • FIG. 7B is a view schematically showing an example of the relationship between the original signal value and the SDR signal value for converting (mastering) the original signal value into the SDR signal value.
  • FIG. 8A is a view showing an example of the luminance as a result of mastering the original image shown in FIG. 6 into the HDR image.
  • FIG. 8B is a view schematically showing an example of the relationship between the original signal value and the HDR signal value for converting (mastering) the original signal value into the HDR signal value.
  • FIG. 9 is a schematic diagram for describing an imaging device compatible with HDR or SDR, a file format of image data obtained by the imaging device, and a display device for displaying image data or a printing device for printing image data.
  • FIG. 10 is a schematic diagram for describing an HDR imaging mode in which an image with an expanded dynamic range is obtained by combining two images.
  • FIG. 11 is a schematic diagram for explaining an HDR shooting mode for obtaining an image with an expanded dynamic range by combining two images.
  • FIG. 12 is a schematic view for explaining an HDR image captured for HDR display.
  • FIG. 13 is a schematic view for explaining the problem in the case of causing the SDR printing apparatus to print an image.
  • FIG. 14 is a block diagram schematically showing an example of the hardware configuration of the image processing apparatus according to the first embodiment.
  • FIG. 15 is a block diagram schematically showing a first example of a functional configuration of the image processing apparatus in the first embodiment.
  • FIG. 16 is a diagram showing an example of the result of printing the test pattern according to the first embodiment on a sheet.
  • FIG. 17 is a diagram schematically showing an example of the relationship between signal values of the first dynamic range and signal values of the second dynamic range used for converting from the first dynamic range to the second dynamic range in the conversion unit according to the first embodiment.
  • FIG. 18 is a flowchart showing an example of the operation of the image processing apparatus according to the first embodiment.
  • FIG. 19 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus in the first modification of the first embodiment.
  • FIG. 20 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus in the second modification of the first embodiment.
  • FIG. 21 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus according to the third modification of the first embodiment.
  • FIG. 22 is a schematic diagram for explaining an example of the image processing apparatus according to the first embodiment (or the first to fifth modifications).
  • FIG. 23 is a diagram schematically illustrating a first example of a communication protocol between the display device and the printing device in the embodiment.
  • FIG. 24 is a diagram schematically showing a second example of the communication protocol between the display device and the printing device in the embodiment.
  • FIG. 25 is a diagram schematically illustrating a third example of the communication protocol between the display device and the printing device in the embodiment.
  • FIG. 26 is a diagram schematically showing a fourth example of the communication protocol between the display device and the printing device in the embodiment.
  • The present disclosure provides a new user value of HDR still images and a new photographic culture using two technologies: HDR (High Dynamic Range) display technology and HDR imaging technology.
  • The new user value is to generate still image data that improves the sense of realism and reduces whiteout (in which the gradation of a bright area is lost; also referred to as white collapse) and black collapse (in which the gradation of a dark area is lost).
  • The new photographic culture refers to viewing an HDR still image, obtained by imaging with a camera compatible with HDR still image capture, on a display device compatible with HDR display (hereinafter referred to as an "HDR display device").
  • the HDR display device is, for example, an HDRTV (HDR television set), an HDR compatible tablet terminal, an HDR compatible smartphone, an HDR compatible PC (Personal Computer), an HDR compatible display, or the like.
  • HDR still images are also referred to as HDR photographs.
  • The present disclosure also provides an image processing apparatus and an image processing method capable of generating still image data that can be displayed even on a display device compatible with SDR display but not with HDR display (hereinafter referred to as an "SDR display device"), and printed even by a printing apparatus compatible with SDR still image printing but not with HDR still image printing (hereinafter referred to as an "SDR printing apparatus").
  • That is, the present disclosure is applicable not only to apparatuses compatible with HDR still image processing but also to apparatuses that are compatible with SDR still image processing but not with HDR still image processing.
  • the present invention also provides an image processing apparatus and an image processing method capable of improving the convenience of HDR still image data by providing still image data capable of reproducing HDR still images.
  • Reproduction of an HDR still image includes displaying the HDR still image and printing an image obtained by image processing the HDR still image. That is, in the present disclosure, reproduction includes display and printing.
  • FIG. 1 is a schematic view for explaining the evolution of video technology.
  • HDR: High Dynamic Range
  • SDR: Standard Dynamic Range
  • ITU-R: International Telecommunication Union - Radiocommunication Sector
  • Specific applications of HDR include broadcast, packaged media (Blu-ray (registered trademark) Disc etc.), Internet distribution, etc., as with HD and UHD.
  • FIG. 2 is a schematic view for explaining the HDR display technology.
  • HDR is not just a scheme to realize a very bright television set.
  • HDR is a method of expanding the luminance range (dynamic range) of a video from the range of 0.1 nit to 100 nit defined in the BT.709 (Broadcasting Service (Television) 709) standard to, for example, 0 to 10,000 nit as defined in SMPTE (Society of Motion Picture and Television Engineers) ST 2084, thereby enabling the representation of bright images, such as the bright sun, the sky, and reflections of light rays, that could not be expressed conventionally, or the simultaneous recording of light and dark areas.
  • Luminance here is optical luminance, a physical quantity that represents the brightness of a light source.
  • A packaged video is a video subjected to grading processing (processing to adjust the color and tone of the video) after imaging.
  • the HDR display technology includes the HLG method capable of achieving compatibility between SDR and HDR, and the PQ method not having simple display compatibility between SDR and HDR.
  • the PQ method is also referred to as the HDR 10 method.
  • FIG. 3A is a schematic view for explaining the PQ method.
  • FIG. 3B is a schematic view for explaining the HLG method.
  • the PQ scheme is a scheme in which there is no compatibility between SDR and HDR.
  • SDR and HDR are graded separately and transmitted separately.
  • Therefore, when HDR video data is displayed on an SDR display device, SDR conversion, which converts the HDR video data into SDR video data, is required.
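  • For reference, the PQ method is specified in SMPTE ST 2084 by an EOTF (Electro-Optical Transfer Function) that maps a normalized code value to an absolute luminance of up to 10,000 nit. The following minimal Python sketch of that published formula is provided for illustration only and is not part of the disclosure:

        def pq_eotf(code_value: float) -> float:
            """SMPTE ST 2084 (PQ) EOTF: normalized code value in [0, 1] -> luminance in nit."""
            m1 = 2610 / 16384       # 0.1593017578125
            m2 = 2523 / 4096 * 128  # 78.84375
            c1 = 3424 / 4096        # 0.8359375
            c2 = 2413 / 4096 * 32   # 18.8515625
            c3 = 2392 / 4096 * 32   # 18.6875
            e = code_value ** (1 / m2)
            y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
            return 10000.0 * y      # absolute luminance in nit (cd/m^2)

        # pq_eotf(0.0) returns 0.0 and pq_eotf(1.0) returns 10000.0.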
  • In the HLG method, grading for HLG is performed, and only an HLG stream is transmitted.
  • The HLG stream is compatible with SDR. For this reason, when HDR video data is displayed on an SDRTV, SDR conversion, which converts the HDR video data into SDR video data, is unnecessary.
  • FIG. 4 is a diagram comparing and showing an example of an HDR image corresponding to HDR and an example of an SDR image corresponding to SDR.
  • The HDR image and the SDR image of FIG. 4 show the same scene, which has a relatively large difference in brightness: a relatively dark interior of a room and a relatively bright view outside a window are mixed.
  • the HDR image is an image obtained by reproducing the HDR still image data or the HDR video data.
  • the SDR image is an image obtained by reproducing SDR still image data or SDR moving image data.
  • In the HDR image, both the relatively bright scenery outside the window and the relatively dark scenery in the room are expressed with appropriate brightness.
  • In the SDR image, as illustrated in the lower part of FIG. 4, when the exposure is adjusted so that the relatively bright scenery outside the window is properly expressed, the relatively dark scenery in the room becomes too dark, and some parts are blacked out and difficult to see.
  • FIG. 5 is a diagram showing an example of a scale of luminance when capturing an image.
  • A gray with a reflectance of 18% (18% gray) is used as the reference point of brightness; 18% gray is the standard reflectance that serves as a reference of brightness.
  • The number of Stops shown in FIG. 5 represents luminance relatively: the luminance of 18% gray is the reference point, and the number of Stops at the reference point is zero.
  • the Stop number is defined to increase by one each time the luminance is doubled and to decrease by one each time the luminance is halved.
  • The luminance obtained from the camera's image sensor (for example, a complementary metal-oxide semiconductor (CMOS) or charge coupled device (CCD) sensor) changes according to the exposure determined by the aperture, shutter speed, sensitivity setting, and so on. That is, the luminance obtained from the image sensor takes different values according to the exposure, even when a subject with the same brightness is imaged. For this reason, the value of the Stop number itself is not an absolute value but a relative value; that is, the Stop number alone cannot represent absolute luminance.
  • In a camera, the exposure is usually adjusted according to the brightness of the subject.
  • For example, when a dark area occupies a large part of the image, the exposure of the camera is adjusted so that the gradation of the dark area is expressed without black crush, even if a bright area occupying a relatively small part of the image is overexposed.
  • Conversely, when a bright area occupies most of the image, the exposure of the camera is adjusted, for example by setting a fast shutter speed or narrowing the aperture, to prevent overexposure in that bright area.
  • FIG. 6 is a diagram showing an example of the luminance of a captured image.
  • the captured image shown in FIG. 6 is hereinafter referred to as an original image 70.
  • In the original image 70, an area of pixels having a luminance of 18 nit, corresponding to 0 Stops (the reference of brightness), is shown as area A.
  • An area of pixels having a luminance of 90 nit, corresponding to 2.3 Stops, is shown as area B.
  • An area of pixels having a luminance of 2.3 nit, corresponding to -3 Stops (substantially black), is shown as area C.
  • An area of pixels having a luminance of 1150 nit, corresponding to 6 Stops, is shown as area D.
  • Area D includes pixels obtained by imaging the sun, and a very bright luminance (for example, the brightest luminance in the original image 70) is obtained there.
  • An area of pixels having a luminance of 290 nit, corresponding to 4 Stops, is shown as area E.
  • Area E includes pixels obtained by imaging a location causing specular reflection.
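  • As a minimal Python sketch (provided for illustration; the function name and constant are not part of the disclosure), the Stop-number definition above can be written as follows, and it reproduces the values given for areas A to E of the original image 70:

        import math

        REFERENCE_LUMINANCE_NIT = 18.0  # luminance of 18% gray, defined as 0 Stops

        def luminance_to_stops(luminance_nit: float) -> float:
            # +1 Stop for every doubling of luminance, -1 Stop for every halving.
            return math.log2(luminance_nit / REFERENCE_LUMINANCE_NIT)

        # luminance_to_stops(18)   ->  0.0   (area A)
        # luminance_to_stops(90)   ->  2.32  (area B, about 2.3 Stops)
        # luminance_to_stops(2.3)  -> -2.97  (area C, about -3 Stops)
        # luminance_to_stops(1150) ->  6.0   (area D)
        # luminance_to_stops(290)  ->  4.01  (area E, about 4 Stops)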
  • An image (original image) of content having high-luminance components of 100 nit or more, captured by a camera, is subjected to conversion processing to conform to a broadcast standard such as BT.709. This processing is realized by performing knee curve processing on the original image.
  • the knee curve process is a process of converting an input signal by a knee curve.
  • the knee curve is an input / output conversion curve which compresses a gain for an input signal having a predetermined value (knee point) or more and outputs the compressed signal.
  • In the knee curve processing, the gain is set to 1 for luminance below a certain value (the knee point) in the original image (that is, the input luminance is output as it is), and for luminance above the knee point, the gain is compressed so that the output falls within a predetermined luminance.
  • The predetermined luminance may be, for example, the maximum luminance that can be displayed by the display device that displays the processed image; when conforming to BT.709, it may be 100 nit. Therefore, for example, when performing the SDR grading process on the original image 70, the normal grading process holds the luminance of the original image 70 linearly as it is up to the knee point (for example, around 80 nit) and reduces each luminance above the knee point so that the maximum luminance of the original image 70 falls within 100 nit.
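  • A minimal Python sketch of such knee curve processing is shown below, assuming a purely linear compression above the knee point, which is only one possible knee curve; the names are illustrative and not part of the disclosure:

        def knee_curve(luminance_nit: float,
                       knee_point_nit: float,
                       source_max_nit: float,
                       target_max_nit: float) -> float:
            """Pass luminance through unchanged below the knee point and linearly
            compress it above the knee point so that source_max_nit maps to
            target_max_nit."""
            if luminance_nit <= knee_point_nit:
                return luminance_nit  # gain of 1 below the knee point
            ratio = (luminance_nit - knee_point_nit) / (source_max_nit - knee_point_nit)
            return knee_point_nit + ratio * (target_max_nit - knee_point_nit)

        # Example for SDR grading of the original image 70 with a knee point of 90 nit:
        # knee_curve(1150, 90, 1150, 100) -> 100.0 (the maximum is compressed to 100 nit)
        # knee_curve(18, 90, 1150, 100)   -> 18.0  (luminance below the knee is unchanged)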
  • FIG. 7A is a view showing an example of the luminance as a result of mastering the original image 70 shown in FIG. 6 into the SDR image 71.
  • FIG. 7B is a diagram schematically showing an example of the relationship between the original signal value and the SDR signal value for converting the original signal value into the SDR signal value (hereinafter, also referred to as “mastering”).
  • the original signal value is the luminance in the luminance range of 0 nit to the maximum luminance (for example, 1150 nit) of the original image (for example, the original image 70) (hereinafter referred to as "the luminance of the original image").
  • Although the maximum value of the original signal value is 10000 in FIG. 7B, the maximum value of the original signal value changes depending on the original image. For example, for the original image 70 shown in FIG. 6, the maximum value of the original signal value is 1150.
  • The pixels corresponding to 0 Stops are pixels having the reference luminance that serves as the reference of brightness. Therefore, even after the original image 70 is converted into the SDR image 71 by mastering, the luminance (18 nit) of the original image 70 corresponding to 0 Stops is used as the SDR luminance without change (refer to the area indicated by area A in the SDR image 71 of FIG. 7A).
  • In the mastering of the original image 70, the maximum value of the original signal value shown in FIG. 7B is not 10000 but 1150, and the knee point is 90.
  • For luminance up to the knee point, the luminance of the original image 70 is used as the SDR luminance without change. That is, pixels of 0 to 90 nit in the original image 70 become pixels of 0 to 90 nit in the SDR image 71 after mastering.
  • For luminance above the knee point, the luminance of the original image 70 is linearly converted into SDR luminance in the luminance range of 90 to 100 nit. That is, pixels of 90 to 1150 nit in the original image 70 become pixels of 90 to 100 nit in the SDR image 71 after mastering.
  • For the pixels corresponding to 90 nit in the original image 70, the luminance of the original image 70 is used as the SDR luminance without change (refer to the area indicated by area B in the SDR image 71 of FIG. 7A).
  • For the pixels corresponding to 2.3 nit in the original image 70, the luminance of the original image 70 is likewise used as the SDR luminance without change (refer to the area indicated by area C in the SDR image 71 of FIG. 7A).
  • For the pixels corresponding to 1150 nit in the original image 70, the luminance is converted to 100 nit, which is the maximum luminance of SDR (refer to the area indicated by area D in the SDR image 71 of FIG. 7A).
  • For the pixels corresponding to 290 nit in the original image 70, the luminance is converted to 95 nit (refer to the area indicated by area E in the SDR image 71 of FIG. 7A).
  • FIG. 8A is a view showing an example of the luminance as a result of mastering the original image 70 shown in FIG. 6 into the HDR image 72.
  • FIG. 8B is a view schematically showing an example of the relationship between the original signal value and the HDR signal value for converting (mastering) the original signal value into the HDR signal value.
  • the HDR signal value is the luminance in the luminance range of the HDR (hereinafter referred to as “the luminance of the HDR”).
  • In the mastering from the original image 70 to the HDR image 72, it is assumed that luminance up to 2000 nit is permitted as the HDR luminance.
  • the maximum luminance of the original image 70 is 1150 nit. Therefore, in the HDR image 72, the luminance of the original image 70 can be held as it is.
  • The pixels corresponding to 0 Stops are pixels having the reference luminance that serves as the reference of brightness. Therefore, for those pixels, even after the original image 70 is converted into the HDR image 72 by mastering, the luminance in the original image 70 is not changed but is used as the HDR luminance (refer to the area indicated by area A in the HDR image 72 of FIG. 8A).
  • Similarly, for the pixels corresponding to 90 nit, 2.3 nit, 1150 nit, and 290 nit in the original image 70, the luminance of the original image 70 is used as the HDR luminance without change (refer to the areas indicated by area B, area C, area D, and area E in the HDR image 72 of FIG. 8A).
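  • Because the maximum luminance of the original image 70 (1150 nit) is within the permitted HDR luminance of 2000 nit, this HDR mastering can be sketched minimally as follows (per-pixel processing on linear luminance values is assumed; the clipping branch is an assumption for images that would exceed the permitted maximum):

        HDR_PERMITTED_MAX_NIT = 2000.0  # permitted HDR luminance assumed for FIG. 8A/8B

        def master_to_hdr(original_luminance_nit: float) -> float:
            # Hold the original luminance as it is; clip only if it exceeded the
            # permitted HDR maximum (which does not occur for the original image 70).
            return min(original_luminance_nit, HDR_PERMITTED_MAX_NIT)

        # Areas A to E of the original image 70 (18, 90, 2.3, 1150, and 290 nit)
        # all keep their original luminance in the HDR image 72.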
  • FIG. 9 is a schematic diagram for describing an imaging device compatible with HDR or SDR, a file format of image data obtained by the imaging device, and a display device for displaying image data or a printing device for printing image data.
  • the HDR imaging device 10 illustrated in FIG. 9 corresponds to imaging in HDR.
  • the HDR imaging device 10 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, and a JPEG compression unit 13.
  • The HDR imaging device 10 is configured so that image data obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 can be displayed on the SDR display device 40 or printed by the SDR printing device 50. Specifically, in the HDR imaging device 10, HDR still image data of an HDR image obtained by imaging in the HDR imaging mode in the HDR imaging unit 11 is converted into SDR still image data in the conversion unit 12.
  • the SDR still image data obtained by the conversion in the conversion unit 12 is JPEG-compressed in the JPEG compression unit 13, and the SDR still image data in the JPEG format obtained by the compression is output.
  • the SDR still image data of the SDR image obtained by imaging in the conventional imaging mode (SDR imaging mode) in the SDR imaging unit 14 is also JPEG-compressed by the JPEG compression unit 13, SDR still image data in JPEG format obtained by compression is output.
  • the SDR imaging device 20 includes an SDR imaging unit 21 and a JPEG compression unit 22.
  • The SDR display device 40 and the SDR printing device 50 acquire SDR still image data obtained by SDR-converting HDR still image data captured by HDR imaging, or SDR still image data captured by SDR imaging, and then reproduce (display or print) an SDR image based on the SDR still image data.
  • FIG. 10 and FIG. 11 are schematic diagrams for describing the HDR imaging mode for obtaining an image in which the luminance range (dynamic range) is expanded by combining two images.
  • Some smartphones, digital cameras, and the like have an HDR shooting mode capable of capturing an image having a wide luminance range (dynamic range).
  • In the HDR shooting mode, as shown in (a) of FIG. 10 and FIG. 11, double exposure is performed (the same subject is photographed a plurality of times in different exposure states) to obtain HDR image data having a wide luminance range (dynamic range).
  • The two SDR images obtained by this method are synthesized so as to fall within the luminance range defined by the SDR.
  • As shown in (b) of FIG. 10 and FIG. 11, it thereby becomes possible to display the HDR image on the SDR display device.
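  • The disclosure does not specify the synthesis algorithm, but the following minimal Python sketch shows one simple way two exposures of the same scene could be combined and then compressed back into the SDR range (the function, its parameters, and the linear global scaling are illustrative assumptions):

        def fuse_exposures(long_exposure, short_exposure, exposure_ratio, sdr_max=255):
            """Combine a long (bright) and a short (dark) exposure of the same scene.
            Where the long exposure is clipped, recover detail from the short
            exposure, then compress the fused result back into the SDR code range."""
            fused = []
            for long_px, short_px in zip(long_exposure, short_exposure):
                if long_px >= sdr_max:                       # blown-out highlight
                    fused.append(short_px * exposure_ratio)  # recovered linear value
                else:
                    fused.append(float(long_px))
            peak = max(fused)
            scale = sdr_max / peak if peak > sdr_max else 1.0
            return [px * scale for px in fused]              # fits the SDR range again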
  • FIG. 12 is a schematic view for explaining an HDR image captured for HDR display.
  • The HDR image for HDR display is captured with a luminance range (dynamic range) of the scene that is wider than that used in the SDR shooting mode.
  • a grading process is performed on the image data obtained by this imaging to generate an HDR image for HDR display, and the HDR image is transmitted to each device and reproduced.
  • the HDR image can not be displayed on the SDR display device as it is because the luminance range (dynamic range) is wider than the SDR image.
  • conversion from the HDR image to the SDR image is required.
  • On the other hand, the combined image obtained in the HDR shooting mode is generated so as to fall within the luminance range defined by the SDR. Therefore, it can be reproduced by both the HDR display device 30 and the SDR display device 40 (or the SDR printing apparatus 50).
  • HDR display devices such as HDRTV that can display HDR image data for displaying an HDR image without performing SDR conversion have been proposed.
  • In the HDR imaging function of a camera, unlike in HDRTV, HDR technology is mainly used for purposes such as backlight correction. Still images captured by the camera using the HDR technology may be reproduced by an SDR display device or an SDR printing device. Therefore, even though the camera is provided with an imaging element capable of generating HDR image data for video and imaging using the HDR technology is possible, the HDR image captured in the HDR imaging mode may be SDR-converted and output as SDR still image data.
  • That is, there were cases in which HDR image data was not generated, although it was possible to generate HDR image data having a wide luminance range (dynamic range) that makes use of the display capability of HDRTV.
  • FIG. 13 is a schematic view for explaining the problem in the case of causing the SDR printing apparatus 50 to print an image.
  • In order to make use of the HDR display function of an HDRTV (for example, the HDR display device 30), HDR image data for HDR display is generated; in that case, the HDR image data is not converted into SDR image data and can be displayed on the HDRTV as it is.
  • The HDR imaging device 10A illustrated in FIG. 13 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and a high efficiency video coding (HEVC) compression unit 16.
  • the HDR imaging device 10A performs the HDR image correction in the HDR image correction unit 15 in order to generate the HDR image data.
  • The HDR image correction unit 15 converts the RAW data obtained by imaging in the HDR imaging unit 11 into HDR image data (for example, a 10-bit image compatible with the HDR10 standard that can be displayed by the HDR display device 30), using, for example, an HDR-EOTF (HDR Electro-Optical Transfer Function) such as a PQ curve. Then, the HDR imaging device 10A outputs the HDR image data obtained by the HDR image correction unit 15 from the HEVC compression unit 16 to the HDRTV (for example, the HDR display device 30) via, for example, HDMI (registered trademark, High-Definition Multimedia Interface). Thereby, the HDRTV (for example, the HDR display device 30) that has received the HDR image data displays an HDR image according to the HDR image data.
  • Alternatively, pseudo-HDR conversion is performed on the HDR image data, and a pseudo-HDR image in SDR format obtained by the pseudo-HDR conversion is output to the SDR display device 40 so that it can be displayed on the SDR display device 40.
  • the pseudo-HDR conversion is to convert an HDR image into an SDR-format image (pseudo-HDR image) having a luminance range (dynamic range) in accordance with the maximum luminance value that can be displayed by the SDR display device 40.
  • On the other hand, the SDR printing apparatus 50 prints SDR image data obtained by imaging in the conventional SDR imaging mode. That is, even if high-quality HDR image data is obtained, the SDR printing apparatus 50 cannot print a high-quality image based on that HDR image data.
  • Each drawing is not necessarily a strict illustration, but a schematic view in which elements are appropriately omitted in order to clearly show the present disclosure.
  • substantially the same components may be denoted by the same reference numerals and descriptions thereof may be omitted or simplified.
  • The following describes an image processing apparatus that generates still image data that enables the SDR printing apparatus 50 to print high-quality images.
  • FIG. 14 is a block diagram schematically showing an example of the hardware configuration of the image processing apparatus 100 according to the first embodiment.
  • The image processing apparatus 100 has a hardware configuration including a central processing unit (CPU) 101, a main memory 102, a storage 103, an input interface (IF) 104, and a communication IF 105.
  • the CPU 101 is a processor that executes a control program stored in the storage 103 or the like.
  • the main memory 102 is a volatile storage area used as a work area used when the CPU 101 executes a control program.
  • the main memory 102 can be configured by, for example, a semiconductor memory or the like.
  • the storage 103 is a non-volatile storage area that holds a control program, content, and the like.
  • the storage 103 can be configured by, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the input IF 104 is a keyboard, a mouse, a touch pad, a button, a touch panel, or the like.
  • the communication IF 105 is a communication interface that communicates with another device via a communication network.
  • the other devices are, for example, a printing device 200, an input device 300, and the like described later.
  • The communication IF 105 is, for example, a wireless LAN (Local Area Network) interface conforming to the IEEE 802.11a, b, g, or n standards, but it may instead be a wireless communication interface conforming to a communication standard used in a mobile communication system such as the third-generation mobile communication system (3G), the fourth-generation mobile communication system (4G), or LTE (registered trademark, Long Term Evolution), or a wireless communication interface conforming to the Bluetooth (registered trademark) standard.
  • the communication IF 105 may be a wired communication interface such as a wired LAN interface or a USB (Universal Serial Bus) interface.
  • FIG. 15 is a block diagram schematically showing a first example of a functional configuration of the image processing apparatus in the first embodiment.
  • the image processing apparatus 100 includes an acquisition unit 110, a conversion unit 120, and an output unit 130.
  • the image processing apparatus 100 may be realized as an apparatus incorporated in an imaging apparatus or may be realized as a single apparatus. Further, the printing apparatus 200 is connected to the image processing apparatus 100 by wired connection or wireless connection.
  • the printing apparatus 200 is an example of the SDR printing apparatus 50 shown in FIG.
  • the acquisition unit 110 acquires the first still image data D1 acquired by imaging and the capability information I1 indicating the printing capability of the printing apparatus 200.
  • the first still image data D1 is still image data whose luminance range is defined by the first dynamic range.
  • the first dynamic range is, for example, HDR (High Dynamic Range).
  • the acquisition unit 110 may acquire, for example, paper information indicating the type of paper used for printing by the printing apparatus 200 (or the paper set in the printing apparatus 200) as the capability information I1.
  • the paper information may be included, for example, in an instruction when the user causes the printing apparatus 200 to print an image (hereinafter, also referred to as a “printing instruction”).
  • the acquisition unit 110 may also acquire, as the capability information I1, a print result obtained by printing a specific pattern (hereinafter also referred to as a “test pattern”) on a sheet used for printing by the printing apparatus 200.
  • the acquisition unit 110 may acquire the print result by the user inputting the print result to the image processing apparatus 100 (or the printing apparatus 200). The print result will be described later.
  • the acquisition unit 110 may acquire the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100 by wire connection or wireless connection.
  • the acquisition unit 110 for acquiring the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the acquisition unit 110 may acquire, for example, a print instruction including paper information as the capability information I1 by receiving an input operation from the user.
  • the acquisition unit 110 for acquiring the capability information I1 may be realized by, for example, the input IF 104 (see FIG. 14).
  • the acquisition unit 110 may acquire the capability information I1 without acquiring the print instruction.
  • the conversion unit 120 converts the first still image data D1 acquired by the acquisition unit 110 into second still image data D2 according to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110.
  • the second still image data D2 is still image data defined in a second dynamic range whose luminance range is narrower than the first dynamic range.
  • When the capability information I1 is paper information, the conversion unit 120 refers to information indicating a predetermined relationship and identifies the reflection luminance corresponding to the type of paper indicated by the paper information acquired by the acquisition unit 110. The information indicating the predetermined relationship is information indicating the relationship between a plurality of types of paper and the reflection luminance corresponding to each of the plurality of types of paper. Then, the conversion unit 120 converts the first still image data D1 into the second still image data D2, using the luminance range (dynamic range) in which the reflection luminance identified from the paper information is the maximum luminance as the second dynamic range.
  • the reflection luminance is the luminance of the light reflected by the sheet when the sheet is irradiated with the light of the specified luminance.
  • the prescribed luminance may be defined, for example, as luminance representing general indoor brightness, or may be any other arbitrarily determined luminance.
  • The information indicating the predetermined relationship (the relationship between the types of paper and the reflection luminance) can be obtained in advance by irradiating the plurality of types of paper with light of the prescribed luminance and measuring the luminance of the light reflected by each sheet.
  • the information indicating the predetermined relationship may be represented, for example, by a table representing the relationship between the types of paper and the reflection luminance, or may be stored in the storage 103. Further, the information indicating the predetermined relationship may be acquired from an external information processing apparatus (not shown) via the communication network.
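  • A minimal Python sketch of this lookup is shown below; the paper types and reflection-luminance values in the table are hypothetical placeholders, since the disclosure only specifies that such a relationship is stored in advance (for example, in the storage 103):

        # Hypothetical table: paper type -> reflection luminance (nit) measured in
        # advance under the prescribed illumination.
        PAPER_REFLECTION_LUMINANCE_NIT = {
            "glossy_photo": 90.0,  # placeholder value
            "matte_photo": 80.0,   # placeholder value
            "plain": 70.0,         # placeholder value
        }

        def second_dynamic_range_max(paper_type: str) -> float:
            """Return the maximum luminance of the second dynamic range, i.e. the
            reflection luminance associated with the paper type indicated by the
            capability information I1."""
            return PAPER_REFLECTION_LUMINANCE_NIT[paper_type]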
  • When the capability information I1 is a print result, the conversion unit 120 sets the luminance range (dynamic range) determined according to the print result as the second dynamic range. Then, the conversion unit 120 converts the first still image data D1 into the second still image data D2 based on that second dynamic range.
  • the conversion unit 120 may be realized by, for example, the CPU 101, the main memory 102, the storage 103 (see FIG. 14), and the like.
  • the conversion unit 120 may be realized by a dedicated circuit configured of a semiconductor integrated circuit or the like.
  • the output unit 130 outputs the second still image data D2 converted from the first still image data D1 by the conversion unit 120 to the printing apparatus 200.
  • the output unit 130 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the printing apparatus 200 receives the second still image data D2 from the image processing apparatus 100 and a print instruction from the user. Then, the printing apparatus 200 prints the still image indicated by the received second still image data D2 on the sheet of the type specified in the received print instruction.
  • the image processing apparatus 100 may output a print instruction for printing a test pattern to the printing apparatus 200, and may cause the printing apparatus 200 to print the test pattern.
  • The image processing apparatus 100 may also include a display (not shown), or a display may be connected to the image processing apparatus 100. After outputting a print instruction to the printing apparatus 200 to execute printing of the test pattern, the image processing apparatus 100 may then display on the display an input instruction that allows the user to input the print result obtained by printing the test pattern on a sheet.
  • the input instruction may be, for example, a message, an image, or a UI (User Interface), which prompts the user to input a print result obtained from a sheet on which the test pattern is printed.
  • FIG. 16 is a diagram showing an example of the result of printing the test pattern 150 according to the first embodiment on a sheet.
  • a test pattern 150 shown in FIG. 16 is printed on a sheet P1 used for printing by the printing apparatus 200 as a specific pattern (test pattern).
  • the test pattern 150 shown as an example in FIG. 16 includes patterns 1 to 3 and patterns 4 to 6.
  • In patterns 1 to 3, as shown in FIG. 16, three blacks serving as the comparison reference are arranged in parallel in the vertical direction in the area on the upper left side of paper P1 (upper left side in FIG. 16). Gray is arranged in the area to the right of each black, and the density of each gray becomes lighter in order from the top. That is, in patterns 1 to 3, with the three blacks arranged on the left side (left side in FIG. 16) of paper P1 as the comparison reference, the grays arranged to the right of each black change in three steps from dark gray to light gray.
  • In patterns 4 to 6, three whites serving as the comparison reference are arranged in parallel in the vertical direction in the area on the lower left side of paper P1 (lower left side in FIG. 16). Gray is arranged in the area to the right of each white, and the density of each gray becomes lighter in order from the top. That is, in patterns 4 to 6, with the three whites arranged on the left side (left side in FIG. 16) of paper P1 as the comparison reference, the grays arranged to the right of each white change in density in three steps.
  • Numerals 1 to 6 for distinguishing patterns 1 to 6 from one another are printed alongside the respective patterns.
  • The arrangement positions of patterns 1 to 6 are not limited to the arrangement positions shown in FIG. 16. Each pattern may be arranged in the lateral direction. Also, patterns other than patterns 1 to 6 may be included in the test pattern 150. The number of patterns included in the test pattern 150 may be five or less, or seven or more. For example, in addition to patterns 1 to 6, one or more patterns in which the gray density changes more finely may be further included in the test pattern 150 shown in FIG. 16.
  • An input instruction prompting the user to input, to the image processing apparatus 100 (or the printing apparatus 200), the numbers of the patterns in which the gradation between the comparison-reference black or white and the gray can be discriminated (or cannot be discriminated) is shown to the user, for example, through a display or the like.
  • A table in which combinations (for example, nine combinations) of each of the numbers of patterns 1 to 3 and each of the numbers of patterns 4 to 6 are associated with a luminance range (dynamic range) is stored, for example, in the storage 103 or the like.
  • the image processing apparatus 100 acquires a combination of the numbers of patterns 1 to 6.
  • the combination of the numbers obtained in this manner is referred to as "print result".
  • the combination of the numbers is an example of the print result.
  • alphabets, symbols, other characters, or a combination thereof may be used instead of the numbers.
  • Based on the two numbers input by the user (that is, one of patterns 1 to 3 and one of patterns 4 to 6), one luminance range is selected, out of the luminance ranges previously associated with, for example, the nine combinations of the numbers of patterns 1 to 6, as the second dynamic range corresponding to the sheet.
  • a concrete example of those operations is shown.
  • An input instruction prompting the user to input the number of the smallest pattern among patterns 1 to 3 whose gradation can be discriminated and the number of the largest pattern among patterns 4 to 6 whose gradation can be discriminated is displayed to the user through a display or the like.
  • The image processing apparatus 100 selects the luminance range previously associated with the combination of the two pattern numbers input by the user according to the input instruction, and sets the selected luminance range as the second dynamic range. Thus, the second dynamic range is determined in the image processing apparatus 100.
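  • A minimal Python sketch of how the print result could select the second dynamic range is shown below; the nine luminance-range values are hypothetical placeholders, since the disclosure only states that a table associating the combinations with luminance ranges is stored (for example, in the storage 103):

        # Hypothetical table:
        #   (smallest distinguishable pattern among 1-3,
        #    largest distinguishable pattern among 4-6)
        #   -> maximum luminance (nit) of the second dynamic range.
        PRINT_RESULT_TO_MAX_NIT = {
            (1, 4): 60.0, (1, 5): 70.0, (1, 6): 80.0,   # placeholder values
            (2, 4): 70.0, (2, 5): 80.0, (2, 6): 90.0,   # placeholder values
            (3, 4): 80.0, (3, 5): 90.0, (3, 6): 100.0,  # placeholder values
        }

        def select_second_dynamic_range(dark_pattern: int, bright_pattern: int) -> float:
            """Return the maximum luminance of the second dynamic range associated
            in advance with the combination of pattern numbers entered by the user."""
            return PRINT_RESULT_TO_MAX_NIT[(dark_pattern, bright_pattern)]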
  • FIG. 17 is a diagram schematically showing an example of the relationship between signal values of the first dynamic range and signal values of the second dynamic range used for converting from the first dynamic range to the second dynamic range in the conversion unit 120 according to the first embodiment.
  • The signal value of the first dynamic range is, for example, the luminance in HDR, and the signal value of the second dynamic range is the luminance in a luminance range (dynamic range) whose maximum luminance is smaller than the maximum luminance of HDR.
  • In a display device, an image is displayed by adjusting the intensity of light emission of RGB (Red, Green, Blue), the three primary colors of light, for each pixel. Therefore, in the display device, the image is represented by absolute luminance.
  • In a printing apparatus, on the other hand, an image is represented by printing a plurality of inks including CMYK (Cyan, Magenta, Yellow, blacK) on a sheet.
  • The image printed on the sheet is expressed by the intensity of light reflected according to the ink applied to the sheet. Therefore, the maximum luminance in the printed image is the luminance of the white area (paper white) where no ink is applied.
  • the brightness of the light reflected on the sheet changes depending on the type of the sheet (reflectance of the sheet), the brightness of the light irradiated to the sheet, the angle of the light irradiated to the sheet, and the like.
  • Therefore, the conversion unit 120 converts the first dynamic range into a second dynamic range in which the upper limit of the expressive capability of the sheet (that is, the luminance represented by the white of the sheet) is the maximum luminance.
  • the maximum luminance represented by the white of the sheet can be determined, for example, from the reflectance of the sheet.
  • The conversion unit 120 may calculate the luminance of the reflected light (hereinafter referred to as "reflection luminance") obtained when light of a prescribed luminance is reflected by the white of the paper, using, for example, the reflectance of the paper, and may set the calculated reflection luminance as the upper limit of the expressive capability of the sheet.
  • The conversion unit 120 may then obtain the maximum luminance in the first still image data D1 and convert the first still image data D1 into the second still image data D2 by converting the luminance range (dynamic range) so that the maximum luminance in the first still image data D1 becomes the reflection luminance.
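  • A minimal Python sketch of this conversion is shown below, assuming that the reflection luminance is approximated as the paper reflectance multiplied by the prescribed luminance and that a simple linear scaling of per-pixel luminance is used; the disclosure leaves the exact conversion curve open:

        def reflection_luminance(prescribed_luminance_nit: float, paper_reflectance: float) -> float:
            # Approximate luminance of light reflected by the paper white under the
            # prescribed illumination (simple proportional model, assumed here).
            return prescribed_luminance_nit * paper_reflectance

        def convert_to_second_dynamic_range(pixels_nit, reflection_luminance_nit):
            """Scale the luminance range of the first still image data D1 so that its
            maximum luminance becomes the reflection luminance of the paper."""
            scale = reflection_luminance_nit / max(pixels_nit)
            return [value * scale for value in pixels_nit]

        # Example: if the maximum luminance in D1 is 1150 nit and the reflection
        # luminance of the paper is 90 nit, every pixel luminance is scaled by 90 / 1150.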
  • FIG. 18 is a flowchart showing an example of the operation of the image processing apparatus 100 according to the first embodiment.
  • the acquisition unit 110 acquires the first still image data D1 (step S101).
  • the acquisition unit 110 acquires the capability information I1 (step S102).
  • The conversion unit 120 converts the first still image data D1 acquired by the acquisition unit 110 into the second still image data D2, based on the second dynamic range corresponding to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110 (step S103).
  • the output unit 130 outputs the second still image data D2 converted from the first still image data D1 by the conversion unit 120 to the printing apparatus 200 (step S104).
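  • The flow of FIG. 18 (steps S101 to S104) can be summarized in the following minimal Python sketch; the class and method names are illustrative, and the acquisition, conversion, and output details are reduced to placeholders:

        class ImageProcessingApparatus:
            """Illustrative skeleton following steps S101 to S104 of FIG. 18."""

            def __init__(self, acquisition_unit, conversion_unit, output_unit):
                self.acquisition_unit = acquisition_unit
                self.conversion_unit = conversion_unit
                self.output_unit = output_unit

            def run(self):
                first_data = self.acquisition_unit.acquire_still_image()       # step S101
                capability = self.acquisition_unit.acquire_capability_info()   # step S102
                second_data = self.conversion_unit.convert(first_data, capability)  # step S103
                self.output_unit.output_to_printer(second_data)                # step S104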
  • As described above, the image processing apparatus according to the present embodiment includes: an acquisition unit that acquires first still image data, obtained by imaging and having a luminance range defined by a first dynamic range, and capability information indicating the printing capability of a printing apparatus; a conversion unit that converts the first still image data acquired by the acquisition unit into second still image data defined by a second dynamic range having a luminance range narrower than the first dynamic range, according to the printing capability indicated by the capability information acquired by the acquisition unit; and an output unit that outputs the second still image data converted by the conversion unit to the printing apparatus.
  • The image processing method according to the present embodiment acquires first still image data, obtained by imaging and having a luminance range defined by a first dynamic range, acquires capability information indicating the printing capability of a printing apparatus, converts the acquired first still image data into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range according to the printing capability indicated by the acquired capability information, and outputs the converted second still image data to the printing apparatus.
  • the image processing apparatus 100 is an example of an image processing apparatus.
  • the first still image data D1 is an example of first still image data.
  • the printing apparatus 200 is an example of a printing apparatus.
  • the capability information I1 is an example of the capability information.
  • the acquisition unit 110 is an example of an acquisition unit.
  • the second still image data D2 is an example of second still image data.
  • the conversion unit 120 is an example of a conversion unit.
  • the output unit 130 is an example of the output unit.
  • That is, the image processing apparatus 100 includes: the acquisition unit 110 that acquires the first still image data D1, obtained by imaging and having a luminance range defined by the first dynamic range, and the capability information I1 indicating the printing capability of the printing apparatus 200; the conversion unit 120 that converts the first still image data D1 acquired by the acquisition unit 110 into the second still image data D2 defined by the second dynamic range having a luminance range narrower than the first dynamic range, according to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110; and the output unit 130 that outputs the second still image data D2 converted by the conversion unit 120 to the printing apparatus 200.
  • The image processing method executed by the image processing apparatus 100 acquires the first still image data D1, obtained by imaging and having a luminance range defined by the first dynamic range (step S101), acquires the capability information I1 indicating the printing capability of the printing apparatus 200 (step S102), converts the acquired first still image data D1 into the second still image data D2 defined by the second dynamic range having a narrower luminance range than the first dynamic range, according to the printing capability indicated by the acquired capability information I1 (step S103), and outputs the converted second still image data D2 to the printing apparatus 200 (step S104).
  • The image processing apparatus 100 configured in this manner can convert the first still image data D1 into the second still image data D2 defined by the second dynamic range determined according to the printing capability of the printing apparatus 200, and can output the second still image data D2 to the printing apparatus 200. Accordingly, the image processing apparatus 100 can cause the printing apparatus 200 to print an image based on the first still image data D1 in a luminance range (dynamic range) corresponding to the printing capability of the printing apparatus 200, and can thus cause the printing apparatus 200 to print that image with high quality.
  • the acquisition unit may acquire paper information indicating the type of paper used for printing by the printing apparatus as the capability information.
  • the conversion unit indicates the sheet information acquired by the acquisition unit by referring to the relationship between the types of a plurality of sheets and the reflection luminance of the light reflected by the sheet when the sheets are irradiated with light.
  • the reflected luminance corresponding to the type of paper may be specified.
  • the conversion unit may convert the first still image data into the second still image data, with the luminance range in which the specified reflection luminance is the maximum luminance as the second dynamic range.
  • the acquisition unit 110 acquires paper information indicating the type of paper used for printing by the printing apparatus 200 as the capability information I1.
  • The conversion unit 120 identifies the reflection luminance corresponding to the type of paper indicated by the paper information acquired by the acquisition unit 110, by referring to the relationship between the types of a plurality of sheets and the reflection luminance of the light reflected by each sheet when the sheet is irradiated with light.
  • the converting unit 120 converts the first still image data D1 into the second still image data D2 with the luminance range in which the specified reflection luminance is the maximum luminance as the second dynamic range.
  • The image processing apparatus 100 configured in this manner can convert the first still image data D1 into the second still image data D2 defined by the second dynamic range based on the type of paper used for printing by the printing apparatus 200, and can output the second still image data D2 to the printing apparatus 200. Accordingly, the image processing apparatus 100 can cause the printing apparatus 200 to print an image based on the first still image data D1 in a luminance range (dynamic range) according to the expressive capability of the paper, and can thus cause the printing apparatus 200 to print that image with high quality.
  • the acquisition unit may acquire, as the capability information, a print result obtained by printing the specific pattern on the sheet used for printing by the printing apparatus.
  • the conversion unit may convert the first still image data into the second still image data, using the luminance range determined according to the print result acquired by the acquisition unit as the second dynamic range.
  • the test pattern 150 shown in FIG. 16 is an example of a specific pattern.
  • That is, the acquisition unit 110 acquires, as the capability information I1, a print result obtained by printing the specific pattern on the sheet used for printing by the printing apparatus 200.
  • The conversion unit 120 converts the first still image data D1 into the second still image data D2, using the luminance range determined according to the print result acquired by the acquisition unit 110 as the second dynamic range.
  • In this way, a specific pattern is printed on a sheet used for printing by the printing apparatus 200, the print result based on the sheet P1 on which the specific pattern is printed is acquired from the user, and the second dynamic range can be determined according to the acquired print result.
  • The image processing apparatus 100 can then convert the first still image data D1 into the second still image data D2 defined by that second dynamic range and output the second still image data D2 to the printing apparatus 200.
  • Accordingly, the image processing apparatus 100 can easily estimate the expressive capability of the sheet used for printing by the printing apparatus 200 (that is, the luminance range in which the luminance represented by the white of the sheet is the maximum luminance), and can easily determine the second dynamic range by setting a luminance range (dynamic range) suited to that expressive capability as the second dynamic range. Therefore, the image processing apparatus 100 can cause the printing apparatus 200 to print an image based on the first still image data D1 with high quality.
  • the first dynamic range may be HDR.
  • the first dynamic range is HDR.
  • Thereby, the image processing apparatus 100 can cause the printing apparatus 200 to print an image based on the first still image data D1 with high quality while losing as little as possible of the luminance range (dynamic range) of the first still image data D1 corresponding to HDR.
  • each of the image processing apparatuses 100A, 100B, and 100C is an example of the image processing apparatus.
  • Each of the acquisition units 110A, 110B, and 110C is an example of an acquisition unit.
  • the conversion unit 120C is an example of a conversion unit.
  • Each of the printing devices 200A and 200B is an example of a printing device.
  • FIG. 19 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus 100A in the first modification of the first embodiment.
  • the image processing apparatus 100A includes an acquisition unit 110A, a conversion unit 120, and an output unit 130.
  • the image processing apparatus 100A according to the first modification differs from the image processing apparatus 100 according to the first embodiment in that the capability information I1 is acquired from the input device 300.
  • the other configuration in the image processing apparatus 100A is substantially the same as that of the image processing apparatus 100 in the first embodiment, and thus the detailed description will be omitted.
  • the acquisition unit 110A acquires the first still image data D1 and the capability information I1.
  • the acquisition unit 110A acquires the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100A by wire or wirelessly.
  • the acquisition unit 110A that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the acquisition unit 110A may acquire, for example, the capability information I1 from the input device 300 wired or wirelessly connected to the image processing apparatus 100A.
  • the capability information I1 acquired by the acquisition unit 110A may be paper information or a print result.
  • the acquisition unit 110A that acquires the capability information I1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the input device 300 is, for example, an information terminal for the user to output a print instruction to the printing device 200, and is, for example, a smartphone, a tablet terminal, or a PC.
  • the input device 300 includes an input IF and a communication IF (not shown), and transmits a print instruction including paper information input at the input IF to the image processing apparatus 100A using the communication IF.
  • the input IF provided in the input device 300 may have, for example, the same configuration as the input IF 104 (see FIG. 14).
  • the communication IF provided in the input device 300 may be configured to be able to communicate with the image processing apparatus 100A, and may be, for example, the same configuration as the communication IF 105 (see FIG. 14).
  • the image processing apparatus 100A may obtain a print instruction including the capability information I1 via the external input device 300.
  • the image processing apparatus 100A may acquire, from the input device 300, the print result input to the input device 300 by the user.
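Neither the transport nor the message format between the input device 300 and the image processing apparatus 100A is fixed by this modification. Purely as an illustration, the print instruction carrying the paper information (capability information I1) could be a small JSON message, as in the hypothetical sketch below; the use of JSON and every field name are assumptions.

```python
# Hypothetical sketch: a print instruction from the input device 300 carrying
# paper information as capability information I1. The JSON format and field
# names are invented for illustration; the description does not fix a format.
import json

def parse_print_instruction(raw: bytes) -> dict:
    """Extract the paper information (capability information I1) from a print instruction."""
    msg = json.loads(raw.decode("utf-8"))
    return {
        "image_id": msg["image_id"],          # which first still image data D1 to print
        "paper_type": msg["paper"]["type"],   # e.g. "glossy_photo"
        "paper_size": msg["paper"].get("size", "A4"),
    }

# Example message as the input device 300 might send it.
raw = b'{"image_id": "IMG_0001", "paper": {"type": "glossy_photo", "size": "L"}}'
print(parse_print_instruction(raw))
```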
  • FIG. 20 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus 100B in the second modification of the first embodiment.
  • the image processing apparatus 100B includes an acquisition unit 110B, a conversion unit 120, and an output unit 130.
  • the image processing apparatus 100B according to the second modification differs from the image processing apparatus 100 according to the first embodiment in that the capability information I1 is acquired from the printing apparatus 200A.
  • the other configuration in the image processing apparatus 100B is substantially the same as that of the image processing apparatus 100 in the first embodiment, and thus the detailed description will be omitted.
  • the acquisition unit 110B acquires the first still image data D1 and the capability information I1.
  • the acquisition unit 110B acquires the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100B by wire or wirelessly.
  • the acquisition unit 110B that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the acquiring unit 110B may acquire, for example, the capability information I1 from the printing apparatus 200A wired or wirelessly connected to the image processing apparatus 100B.
  • the capability information I1 acquired by the acquisition unit 110B may be paper information or a print result.
  • the acquisition unit 110B that acquires the capability information I1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the printing apparatus 200A includes an input IF 201 for the user to input a print instruction to the printing apparatus 200A.
  • the printing apparatus 200A receives a print instruction to the input IF 201 from the user, the printing apparatus 200A transmits the print instruction to the image processing apparatus 100B via a communication IF (not shown) included in the printing apparatus 200A.
  • the input IF 201 is configured to include, for example, one or more of a touch panel, an input button, a display, and the like.
  • the communication IF included in the printing apparatus 200A may be configured to be able to communicate with the image processing apparatus 100B, and may be, for example, the same configuration as the communication IF 105 (see FIG. 14).
  • the image processing apparatus 100B may obtain a print instruction including the capability information I1 via the printing apparatus 200A.
  • the image processing apparatus 100B may acquire, from the printing apparatus 200A, the print result input to the printing apparatus 200A by the user.
  • FIG. 21 is a block diagram schematically showing an example of a functional configuration of the image processing apparatus 100C in the third modification of the first embodiment.
  • the image processing apparatus 100C includes an acquisition unit 110C, a conversion unit 120C, and an output unit 130.
  • the image processing apparatus 100C according to the third modification differs from the image processing apparatus 100B according to the second modification in that the capability information I2 different from the capability information I1 is acquired from the printing apparatus 200B. Further, the processing in the conversion unit 120C in the image processing apparatus 100C in the modification 3 is different from the processing in the conversion unit 120 in the modification 2.
  • the other configuration in the image processing apparatus 100C is substantially the same as that of the image processing apparatus 100B in the second modification, and thus the detailed description is omitted.
  • the acquisition unit 110C acquires the first still image data D1 and the capability information I2.
  • the acquisition unit 110C acquires the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100C by wire or wirelessly.
  • the acquisition unit 110C that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
  • the acquisition unit 110C acquires the capability information I2 from the printing apparatus 200B connected to the image processing apparatus 100C by wire or wirelessly. Specifically, the acquisition unit 110C acquires, as the capability information I2, a scan image obtained by scanning, with the scanner 202 of the printing apparatus 200B, the sheet used for printing by the printing apparatus 200B (for example, a blank sheet before printing). The acquisition unit 110C that acquires the capability information I2 is realized by, for example, the communication IF 105 (see FIG. 14).
  • the conversion unit 120C specifies the maximum luminance corresponding to the sheet scanned by the scanner 202, based on the luminance (reflection luminance) of the scan image acquired by the acquisition unit 110C. Then, the conversion unit 120C determines the second dynamic range based on the maximum luminance. That is, the conversion unit 120C converts the first still image data into the second still image data with the luminance range (dynamic range) in which the reflection luminance identified based on the capability information I2 is the maximum luminance as the second dynamic range.
  • the conversion unit 120C is realized by, for example, the CPU 101, the main memory 102, the storage 103, and the like (see FIG. 14).
  • the converter 120C may change the conversion process according to the type of the scanner 202.
  • the conversion unit 120C may correct the luminance of the obtained image according to at least one of the brightness of the light source of the scanner 202 and the sensitivity of the image sensor of the scanner 202.
  • the conversion unit 120C may correct the luminance of the image such that the luminance of the obtained image decreases as the brightness of the light source of the scanner 202 increases.
  • the conversion unit 120C may correct the luminance of the image such that the luminance of the obtained image decreases as the sensitivity of the image sensor of the scanner 202 increases.
  • the image processing apparatus 100C may store in advance, in the storage 103, correction information in which information representing the type of the scanner 202 is associated with information representing the correction processing for correcting the luminance of the image. The image processing apparatus 100C may then acquire the information representing the type of the scanner 202 from the printing apparatus 200B, identify from the correction information the correction processing corresponding to the acquired information, and perform the identified correction processing, thereby performing conversion processing according to the type of the scanner 202. In the present modification, the correction information need not be stored in the storage 103; the image processing apparatus 100C may instead acquire the correction information from an external information processing apparatus via the communication IF 105 (see FIG. 14).
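The correction is only characterized qualitatively above (a brighter light source or a more sensitive image sensor leads to a lower corrected luminance). A rough Python sketch under those assumptions, with an invented correction table per scanner type and an assumed 100 nit nominal white, might look as follows:

```python
# Hypothetical sketch: estimate the paper's reflection luminance from a scan of the
# blank sheet, with a per-scanner-type correction. The correction table, the 100 nit
# nominal white, and the percentile choice are illustrative assumptions; only the
# direction of the correction follows the description above.
import numpy as np

# Assumed correction information: scanner type -> (light_source_gain, sensor_gain).
SCANNER_CORRECTION = {
    "scanner_model_a": (1.0, 1.0),
    "scanner_model_b": (1.2, 1.1),   # brighter lamp and more sensitive sensor
}

NOMINAL_WHITE_NIT = 100.0  # assumed luminance represented by a scan code value of 1.0

def reflection_luminance_from_scan(scan: np.ndarray, scanner_type: str) -> float:
    """Estimate the maximum luminance (second dynamic range peak) from a blank-sheet scan.

    scan: grayscale scan image normalized to [0, 1].
    """
    light_gain, sensor_gain = SCANNER_CORRECTION[scanner_type]
    corrected = scan / (light_gain * sensor_gain)      # lower luminance for bright/sensitive scanners
    paper_white = float(np.percentile(corrected, 95))  # robust estimate of the paper white level
    return paper_white * NOMINAL_WHITE_NIT

# Example: a slightly noisy scan of a blank sheet captured by scanner_model_b.
scan = np.clip(np.random.default_rng(0).normal(0.92, 0.02, (64, 64)), 0.0, 1.0)
print(round(reflection_luminance_from_scan(scan, "scanner_model_b"), 1))
```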
  • the printing apparatus 200 ⁇ / b> B includes a scanner 202.
  • the printing apparatus 200B can obtain a scanned image by scanning the sheet used for printing with the printing apparatus 200B (for example, a sheet without printing before printing) with the scanner 202.
  • the printing apparatus 200B transmits the obtained scan image as the capability information I2 to the image processing apparatus 100C via a communication IF (not shown) included in the printing apparatus 200B.
  • the image processing apparatus 100C may include a display and may display a scan instruction on the display. The scan instruction may be a message, an image, a UI, or the like for prompting the user to scan, with the scanner 202 of the printing apparatus 200B connected to the image processing apparatus 100C, the sheet used for printing by the printing apparatus 200B. In this case, the image processing apparatus 100C may determine whether the printing apparatus 200B includes the scanner 202 by performing wired or wireless communication with the printing apparatus 200B connected to the image processing apparatus 100C.
  • the image processing apparatus 100C may display, on the display, the scan instruction for prompting the user to scan the sheet, and may control the printing apparatus 200B so that, when the printing apparatus 200B scans the sheet and a scan image of the sheet is obtained, the printing apparatus 200B transmits the obtained scan image to the image processing apparatus 100C.
  • the acquisition unit may acquire, as the capability information, a scanned image obtained by scanning a sheet used for printing by the printing apparatus.
  • the conversion unit may specify, based on the luminance of the scan image acquired by the acquisition unit, the reflection luminance of the light reflected by the paper when the paper is irradiated with light, and may convert the first still image data into the second still image data using, as the second dynamic range, the luminance range whose maximum luminance is the specified reflection luminance.
  • the capability information I2 is an example of the capability information.
  • the acquisition unit 110C acquires a scanned image obtained by scanning a sheet used for printing by the printing apparatus 200B as the capability information I2.
  • the conversion unit 120C specifies, based on the luminance of the scan image acquired by the acquisition unit 110C, the reflection luminance of the light reflected by the paper when the paper is irradiated with light, and converts the first still image data D1 into the second still image data D2 using, as the second dynamic range, the luminance range whose maximum luminance is the specified reflection luminance.
  • in this way, the scanner 202 scans the sheet used for printing by the printing apparatus 200B to obtain a scan image, and the second dynamic range can be determined according to the obtained scan image. The image processing apparatus 100C can then convert the first still image data D1 into the second still image data D2 defined by the second dynamic range and output the second still image data D2 to the printing apparatus 200B.
  • Accordingly, the image processing apparatus 100C can easily estimate the expressive ability of the sheet used for printing by the printing apparatus 200B (that is, the luminance range whose maximum luminance is the luminance represented by the white of the sheet), and can easily determine the second dynamic range by setting a luminance range (dynamic range) suited to that expressive ability as the second dynamic range. Therefore, the image processing apparatus 100C can cause the printing apparatus 200B to print an image based on the first still image data D1 with high quality.
  • the image processing apparatus 100C may acquire the print result input to the printing apparatus 200B by the user from the printing apparatus 200B.
  • the image processing apparatus 100 (100A, 100B or 100C) is an independent apparatus separate from the printing apparatus 200 (200A or 200B).
  • the printing apparatus 200 may be incorporated in the image processing apparatus 100 or the image processing apparatus 100A.
  • the printing apparatus 200A may be incorporated in the image processing apparatus 100B.
  • the printing apparatus 200B may be incorporated in the image processing apparatus 100C.
  • the communication IF 105 capable of communicating with the printing apparatus 200 (200A or 200B) may be omitted in the image processing apparatus 100 (100A, 100B or 100C).
  • in the above description, the configuration example in which the conversion unit 120 (120C) performs the process of converting the first still image data D1 into the second still image data D2 has been described, but the present disclosure is not limited to this configuration.
  • the conversion unit 120 (120C) may omit the conversion from the first still image data D1 to the second still image data D2.
  • in that case, the acquisition unit 110 acquires the first still image data D1 and further acquires third still image data D3 defined by a third dynamic range whose luminance range (dynamic range) is narrower than the first dynamic range (see FIG. 22).
  • the third dynamic range is, for example, SDR (Standard Dynamic Range).
  • as when acquiring the first still image data D1, the acquisition unit 110 may acquire the third still image data D3 from an imaging device, an information terminal, a storage device, or the like connected to the image processing apparatus 100 (100A, 100B, or 100C) by wire or wirelessly.
  • if the imaging device (for example, the HDR imaging device 10 illustrated in FIG. 9) generates the third still image data D3, the acquisition unit 110 can acquire the third still image data D3 from the imaging device.
  • the output unit 130 outputs the third still image data D3 acquired by the acquisition unit 110 (110A, 110B or 110C) to the printing apparatus 200 (200A or 200B).
  • the acquisition unit may acquire the first still image data and may further acquire third still image data defined by a third dynamic range whose luminance range is narrower than the first dynamic range.
  • the conversion unit may refrain from converting the first still image data into the second still image data, and the output unit may output the third still image data acquired by the acquisition unit to the printing apparatus.
  • the third still image data D3 is an example of third still image data.
  • in this configuration, the acquisition unit 110 acquires the first still image data D1 and further acquires the third still image data D3 defined by a third dynamic range whose luminance range is narrower than the first dynamic range.
  • the conversion unit 120 does not convert the first still image data D1 into the second still image data D2.
  • the output unit 130 outputs the third still image data D3 acquired by the acquisition unit 110 (110A, 110B, or 110C) to the printing apparatus 200 (200A or 200B).
  • the image processing apparatus 100 (100A, 100B, or 100C) configured in this way can output the SDR third still image data D3 to the printing apparatus 200 (200A or 200B) when the maximum luminance of the HDR first still image data D1 is smaller than the reflection luminance corresponding to the sheet, and can cause the printing apparatus 200 (200A or 200B) to print an image based on the third still image data D3.
  • when the maximum luminance of the first still image data D1 is smaller than the reflection luminance corresponding to the sheet used for printing by the printing apparatus 200 (200A or 200B), a higher-quality image may not be obtained even if the conversion processing from the first still image data D1 to the second still image data D2 is performed.
  • in such a case, the image processing apparatus 100 (100A, 100B, or 100C) outputs the SDR third still image data D3 to the printing apparatus 200 (200A or 200B) without performing the conversion processing from the first still image data D1 to the second still image data D2. Therefore, according to the configuration example shown in the fifth modification, the load associated with the conversion processing in the image processing apparatus 100 (100A, 100B, or 100C) can be reduced.
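A compact sketch of the branch described in this modification; the helper convert_to_second_range() is assumed to exist (for example, along the lines of the earlier paper-based sketch), and the data objects are placeholders:

```python
# Hypothetical sketch of modification 5: skip the HDR-to-second-range conversion
# when the HDR peak is already below the paper's reflection luminance, and output
# the SDR (third) still image data instead.

def select_output(hdr_peak_nit: float, paper_reflection_nit: float,
                  first_data, third_data, convert_to_second_range):
    """Return the still image data to hand to the printing apparatus."""
    if hdr_peak_nit < paper_reflection_nit:
        # The paper can already reproduce every luminance in the HDR data,
        # so the conversion brings no benefit; output the SDR data as-is.
        return third_data
    # Otherwise compress the HDR data into the paper's luminance range.
    return convert_to_second_range(first_data, paper_reflection_nit)

# Example: a dim HDR capture (peak 90 nit) on paper whose white is about 120 nit.
print(select_output(90.0, 120.0, "D1", "D3", lambda data, max_nit: "D2"))
```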
  • FIG. 22 is a schematic diagram for describing an example of the image processing apparatus 100 (100A, 100B, or 100C) in the first embodiment (or the first to fifth modifications).
  • the image processing apparatus 100 acquires the first still image data D1 that is an HDR image, and outputs the first still image data D1 to the HDR display apparatus 30 as it is. Since the HDR display device 30 supports HDR, the first still image data D1 can be displayed with high quality.
  • the image processing apparatus 100 (100A, 100B, or 100C) also acquires the first still image data D1 and the capability information I1 (I2) of the SDR printing apparatus 50. The image processing apparatus 100 (100A, 100B, or 100C) then converts the first still image data D1 into the second still image data D2 based on the acquired capability information I1 (I2) and outputs the second still image data D2 to the SDR printing apparatus 50, which cannot print the first still image data D1 as it is.
  • the SDR printing apparatus 50 supports the second still image data D2 converted from the first still image data D1 by the image processing apparatus 100 (100A, 100B, or 100C), and can therefore print an image based on the second still image data D2 on a sheet using the second still image data D2.
  • because the second still image data D2 is defined by a luminance range (dynamic range) wider than that of the third still image data D3, which is an SDR image, an image of higher quality than the SDR image based on the third still image data D3 can be printed on the sheet.
  • when the maximum luminance of the first still image data D1 is smaller than the reflection luminance corresponding to the sheet, the image processing apparatus 100 (100A, 100B, or 100C) outputs the third still image data D3 to the SDR printing apparatus 50 without performing the conversion from the first still image data D1 to the second still image data D2. Since the SDR printing apparatus 50 supports the third still image data D3, which is an SDR image, it can print an image based on the third still image data D3 as it is.
  • FIG. 23 is a view schematically showing a first example of a communication protocol between the display device 400 and the printing device 500 in the embodiment.
  • the display device 400 is a device having an image display function or an image reproduction function such as, for example, a television set, a video recorder, a video player, or a digital camera.
  • the printing apparatus 500 is an apparatus having a printing function such as a printer.
  • the communication protocol of the display device 400 is configured to include, in order from the lower level, a USB physical layer 440 as a physical layer, a PTP Transport layer 430 as a transport layer, a DPS layer 420 as a conversion layer, and a DPS application layer 410 as an application layer.
  • the communication protocol of the printing apparatus 500 is configured to include, in order from the lower level, a USB physical layer 540 as a physical layer, a PTP Transport layer 530 as a transport layer, a DPS layer 520 as a conversion layer, and a DPS application layer 510 as an application layer.
  • although the USB physical layers 440 and 540 are used as the physical layer and the display device 400 and the printing device 500 are connected by USB in this example, the display device 400 and the printing device 500 may instead be connected by wireless communication (for example, Wi-Fi).
  • FIG. 23 illustrates a configuration example in which the Picture Transfer Protocol (PTP) defined as ISO 15740 is used in the transport layer.
  • the conversion layer defines an I / F (InterFace) for the application layer, and converts input / output from the application layer into a PTP protocol.
  • DPS Discovery exists to negotiate whether the display device 400 and the printing device 500 have mutually corresponding functions.
  • the DPS Discovery 421 exists in the DPS Layer 420, and the DPS Discovery 521 exists in the DPS Layer 520.
  • the connection of the USB cable to the display device 400 and the printing device 500 is taken as an opportunity to establish the interconnection in PTP.
  • the display device 400 and the printing apparatus 500 mutually confirm the connection partner by means of the DPS Discovery 421 and 521.
  • the display device 400 that holds the still image to be printed provides the file as the storage server 412 to the printing device 500 that is the storage client 511.
  • the printing apparatus 500 receives a request from the display apparatus 400 to be the Print Client 411 as the Print Server 512.
  • the Print Client 411 of the display device 400 inquires the Print Server 512 about the capability of the printer, and appropriately displays the result of the inquiry on a UI (User Interface) of the display device 400.
  • a list of still images is displayed on the display device 400, and the user selects a still image to be printed from among the displayed images, and the user issues a print instruction to the display device 400.
  • the display device 400 requests the Print Server 512 to print the instructed still image.
  • the printing apparatus 500 requests the storage server 412 of the display apparatus 400 as a storage client 511 for a file corresponding to a still image for which printing is instructed.
  • the display device 400 notifies the printing device 500 of the held still image as a file.
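FIG. 23 names the DPS roles but not their concrete messages. The following Python sketch mimics the order of operations described above (discovery, capability inquiry, print request, file transfer); the classes and method names are invented for illustration and do not correspond to a real PTP/DPS API.

```python
# Hypothetical sketch of the message flow in FIG. 23. Names are invented.

class DisplayDevice:
    """Acts as Print Client 411 and Storage Server 412."""
    def __init__(self, images):
        self.images = images                      # still image files held by the device

    def negotiate(self, printer):
        return printer.dps_discovery("display")   # DPS Discovery: confirm the partner

    def request_print(self, printer, image_id):
        caps = printer.get_capability()           # Print Client asks the Print Server for its capability
        print("printer capability:", caps)
        printer.start_print_job(self, image_id)   # ask the Print Server to print the selected image

    def get_file(self, image_id):                 # Storage Server: hand the file to the Storage Client
        return self.images[image_id]


class PrintingDevice:
    """Acts as Print Server 512 and Storage Client 511."""
    def dps_discovery(self, partner_kind):
        return partner_kind == "display"

    def get_capability(self):
        return {"paper": ["L", "A4"], "quality": ["normal", "fine"]}

    def start_print_job(self, storage_server, image_id):
        data = storage_server.get_file(image_id)  # Storage Client pulls the file
        print(f"printing {image_id} ({len(data)} bytes)")


display = DisplayDevice({"IMG_0001": b"\xff\xd8...jpeg..."})
printer = PrintingDevice()
if display.negotiate(printer):
    display.request_print(printer, "IMG_0001")
```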
  • FIG. 24 is a diagram schematically showing a second example of the communication protocol between the display device 400A and the printing device 500A in the embodiment.
  • the display device 400A is a device having an image display function or an image reproduction function, such as a television set, a video recorder, a video player, or a digital camera.
  • the printing apparatus 500A is an apparatus having a printing function, such as a printer, as in the first example.
  • the communication protocol of the display device 400A is configured to include a TCP / IP layer 440A, a UPnP layer 430A, a Control Middle layer 420A, and an application layer 410A in order from the lower side.
  • the communication protocol of the printing apparatus 500A includes a TCP / IP layer 540A, a UPnP layer 530A, a Control Middle layer 520A, and an application layer 510A in order from the lower side.
  • the communication protocol of the display device 400A and the printing device 500A uses Wi-Fi as a physical layer (not shown), and employs TCP / IP layers 440A and 540A as a transport layer thereon.
  • the display device 400A and the printing device 500A use UPnP as a protocol for mutually discovering the connection partner on the TCP/IP layers 440A and 540A.
  • the actual print data is exchanged between the display device 400A and the printing device 500A with the help of the Control Middle layers 420A and 520A, which perform control of the print job and the like.
  • each printer model has its own print commands, and on a PC (Personal Computer), a driver can absorb the differences in print commands between printers.
  • unlike such a PC, a display device 400A such as a television set or a video recorder cannot easily adopt a mechanism of installing a driver for each printer. Therefore, the display device 400A may use a general-purpose print description language, for example XHTML-Print specified by the W3C.
  • the display device 400A creates a print instruction based on the XHTML-print content 411A, and sends the created print instruction to the printing device 500A.
  • the XHTML drawing processing unit 511A performs layout processing and rasterization processing of an image file or a text character string, and generates data to be actually printed.
  • the print processing unit 512A prints the data thus obtained.
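As a rough illustration of describing the page in a general-purpose markup language instead of printer-specific commands, a display device could assemble the XHTML-print content as a string before handing it to the printer. The element subset shown below is a simplified assumption, not the normative XHTML-Print profile.

```python
# Hypothetical sketch: building XHTML-print content 411A as a plain string.
# The exact DOCTYPE/profile details of XHTML-Print are intentionally omitted;
# this only shows the idea of a printer-independent page description.

def build_xhtml_print_content(image_uri: str, caption: str) -> str:
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Print job</title></head>
  <body>
    <img src="{image_uri}" alt="{caption}" />
    <p>{caption}</p>
  </body>
</html>"""

content = build_xhtml_print_content("http://display.local/IMG_0001.jpg", "HDR still image")
print(content)
# The printing device 500A would lay out and rasterize this content
# (XHTML drawing processing unit 511A) and then print it (print processing unit 512A).
```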
  • FIG. 25 is a diagram illustrating a third example of the communication protocol between the display device 400B and the printing device 500B in the embodiment.
  • the display device 400B is, for example, a device having an image display function or an image reproduction function such as a television set, a video recorder, a video player, or a digital camera, as in the first and second examples.
  • the printing apparatus 500B is an apparatus having a printing function, such as a printer, as in the first and second examples. Note that, with respect to printing the first still image data D1, the printing apparatus 500B is an apparatus that does not support HDR and supports only SDR.
  • the communication protocol of the display device 400B is configured to include a Wi-Fi layer 440B, a PTP Transport layer 430, a DPS layer 420, and a DPS application layer 410B in order from the lower side.
  • the communication protocol of the printing apparatus 500B is configured to include a Wi-Fi layer 540B, a PTP Transport layer 530, a DPS Layer 520, and a DPS application layer 510 in order from the lower side.
  • the communication protocol between the display apparatus 400B and the printing apparatus 500B shown in the third example is based on the communication protocol using PTP shown in the first example.
  • the Print Client 411 of the display device 400B recognizes the function of the printing device 500B, which is the connection partner, through communication with the Print Server 512 of the printing device 500B.
  • the display device 400B cannot pass the HDR still image data it holds to the printing device 500B as it is. In such a case, the display device 400B adjusts the HDR still image data with the grading module 413B in accordance with the function of the printing device 500B to generate an 8-bit JPG file. That is, the display device 400B creates an 8-bit JPG file from the HDR still image data in accordance with the function of the printing device 500B.
  • the display device 400B provides the 8-bit JPG file obtained from the HDR still image data to the printing device 500B as a response to the request from the Storage Client 511 of the printing device 500B. That is, the display device 400B shown in the third example is configured to include any of the image processing apparatuses 100 and 100A to 100C described in the first embodiment and the first to fifth modifications.
  • the display device 400B may create the 8-bit JPG file corresponding to the HDR still image file in advance.
  • it is desirable that the digital camera generate an image obtained by imaging in both formats, that is, as the first still image data D1 corresponding to HDR and as an 8-bit JPG file (for example, the third still image data D3).
  • the print quality may differ depending on the type of paper and ink used for printing, so in some cases it is better for the display device 400B to change the grading method used when creating the 8-bit JPG file from the HDR still image file in accordance with the settings and capabilities of the printing device 500B. The display device 400B may therefore create the 8-bit JPG file each time, according to the settings and capabilities of the printing device 500B.
  • the printing apparatus 500B may be the printing apparatus 200 (200A or 200B).
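The grading module 413B is described only as adjusting the HDR still image data to the printing device's function and producing an 8-bit JPG file. A minimal sketch, assuming PQ (SMPTE ST 2084)-coded grayscale input and using NumPy and Pillow, with a simple clip-and-gamma grading standing in for whatever grading is actually applied:

```python
# Hypothetical sketch of a grading module: decode PQ (ST 2084) code values to
# absolute luminance, compress them into an assumed 100 nit print range, and save
# an 8-bit JPG with Pillow. The clip-and-gamma grading is an illustrative stand-in.
import numpy as np
from PIL import Image

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: np.ndarray) -> np.ndarray:
    """ST 2084 EOTF: non-linear code values in [0, 1] -> absolute luminance in nit."""
    e = np.power(np.clip(code, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)

def grade_to_8bit_jpg(pq_gray: np.ndarray, path: str, target_peak_nit: float = 100.0):
    nits = pq_to_nits(pq_gray)
    sdr_linear = np.clip(nits / target_peak_nit, 0.0, 1.0)   # hard clip above the target peak
    sdr = np.power(sdr_linear, 1.0 / 2.2)                    # simple display gamma
    img8 = (sdr * 255.0 + 0.5).astype(np.uint8)
    Image.fromarray(img8).save(path, format="JPEG")

# Example: a synthetic grayscale PQ-coded gradient saved as an 8-bit JPG.
grade_to_8bit_jpg(np.tile(np.linspace(0.0, 0.75, 256), (64, 1)), "graded_8bit.jpg")
```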
  • FIG. 26 is a diagram schematically showing a fourth example of a communication protocol between the display device 400C and the printing device 500C in the embodiment.
  • the display device 400C is a device having an image display function or an image reproduction function, such as a television set, a video recorder, a video player, or a digital camera.
  • the printing apparatus 500C is an apparatus having a printing function, such as a printer, as in the third example. Note that, unlike the printing apparatus 500B, the printing apparatus 500C is not an apparatus that supports only SDR and not HDR when printing the first still image data D1.
  • the communication protocol of the display device 400C is configured to include a Wi-Fi layer 440B, a PTP Transport layer 430, a DPS Layer 420, and a DPS application layer 410 in order from the lower side.
  • the communication protocol of the printing apparatus 500C is configured to include a Wi-Fi layer 540B, a PTP Transport layer 530, a DPS Layer 520, and a DPS application layer 510C in order from the lower side.
  • the communication protocol between the display device 400C and the printing apparatus 500C shown in the fourth example differs from the configuration shown in the third example in that the processing performed by the grading module 413B of the display device 400B in the third example is instead performed by the grading module 513C of the printing apparatus 500C. That is, the printing apparatus 500C shown in the fourth example is configured to include any of the image processing apparatuses 100 and 100A to 100C described in the first embodiment and the first to fifth modifications.
  • the Storage Server 412 of the display device 400C provides the HDR still image data as it is to the printing device 500C as a response to the request from the Storage Client 511 of the printing device 500C.
  • the grading module 513C appropriately grades and prints the received HDR still image data according to the type of paper and ink used in the printing device 500C, the setting regarding the printing quality, and the like.
  • the printing apparatus 500C may be the printing apparatus 200 (200A or 200B).
  • Embodiment 1 and Modifications 1 to 5 have been described as examples of the technology disclosed in the present application.
  • the technology in the present disclosure is not limited to this, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like have been made.
  • in the above description, the absolute luminance value is used as the luminance information of the first still image data D1; however, instead of the absolute luminance value, a luminance calculated from the STOP number of each of the bright and dark parts of the captured picture may be used. That is, the conversion unit 120 (120C) may convert the first still image data D1 into the second still image data D2 using relative luminance, without using the absolute luminance of the first still image data D1.
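As a small illustration of relative luminance derived from STOP numbers (taking 18% gray, STOP 0, as relative luminance 1.0 is an assumption consistent with FIG. 5):

```python
# Hypothetical sketch: relative luminance from the STOP number of each region,
# with 18% gray (STOP 0) taken as relative luminance 1.0.
def relative_luminance(stop: float) -> float:
    return 2.0 ** stop   # +1 stop doubles the luminance, -1 stop halves it

# Example: a highlight 3 stops above 18% gray and a shadow 2 stops below it.
print(relative_luminance(3.0), relative_luminance(-2.0))   # 8.0 0.25
```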
  • in the above description, the printing capability has been described as the capability represented by the reflectance of the sheet alone, or the like.
  • however, the printing capability may be a combined characteristic to which the printing characteristics of the printing apparatus are added. That is, the printing capability may also take into consideration the characteristics of the ink of the printing apparatus and the characteristics of ink ejection. In that case, the first still image data D1 may be converted into the second still image data D2 in consideration of the first dynamic range of the first still image data D1, the type of paper, the type and brightness of the light source, and the luminance range (dynamic range) that can be expressed by the characteristics of the ink.
  • alternatively, the ink ejection amount of the printing apparatus may be controlled so as to obtain the same effect as the signal conversion processing. That is, instead of generating the second still image data D2, the ink ejection amount of the printing apparatus may be controlled such that the printing apparatus outputs a printed product equivalent to the one obtained when the second still image data D2 is printed.
  • each component may be configured by dedicated hardware (for example, an electronic circuit including a semiconductor integrated circuit), or may be realized by a processor executing a software program suitable for that component. Each component may be realized by a program execution unit such as a CPU (Central Processing Unit) or processor reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory.
  • a plurality of functional blocks may be realized as one functional block, or one functional block may be divided into a plurality, or some functions may be transferred to another functional block.
  • the functions of multiple functional blocks may be processed in parallel by a single piece of hardware or software, or may be processed in time division.
  • some functions of the plurality of functional blocks may be realized by hardware, and the remaining functions may be realized by software.
  • this program causes a computer to acquire first still image data obtained by imaging and having a luminance range defined by a first dynamic range, to acquire capability information indicating the printing capability of a printing apparatus, to convert the acquired first still image data, according to the printing capability indicated by the acquired capability information, into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range, and to output the converted second still image data to the printing apparatus.
  • such an image processing method, a computer program that causes a computer to execute the image processing method, and a computer-readable recording medium on which the program is recorded are also included in the scope of the present disclosure.
  • examples of the computer-readable recording medium include a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray (registered trademark) Disc), and a semiconductor memory.
  • the computer program is not limited to one recorded in the above recording medium, and may be transmitted via a telecommunication line, a wireless or wired communication line, a network such as the Internet, or the like.
  • a part or all of the components constituting each of the above-described devices may be configured as an IC card or a single module that can be attached to or detached from the device.
  • each processing unit may be realized by an LSI (Large Scale Integration) or an IC (Integrated Circuit).
  • each processing unit is not limited to an LSI or an IC, and may be realized by a dedicated circuit or a general-purpose processor. Alternatively, it may be realized by an FPGA (Field Programmable Gate Array), whose circuit configuration can be programmed, or by a reconfigurable processor in which the connections and settings of the circuit cells inside the LSI can be reconfigured.
  • the above program may be recorded on a recording medium and distributed or circulated. For example, by installing the distributed program in an apparatus and causing a processor of the apparatus to execute it, the apparatus can be caused to perform the various processes.
  • the computer program or the digital signal in the present disclosure may be transmitted via a telecommunication line, a wireless or wired communication line, a network such as the Internet, data broadcasting, and the like.
  • the present disclosure may also be implemented by another independent computer system by recording the program or the digital signal on a recording medium and transferring it, or by transferring the program or the digital signal via a network or the like.
  • each of the above processes may be realized by centralized processing by a single device (system), or may be realized by distributed processing by a plurality of devices.
  • the present disclosure is applicable to an image processing apparatus and an image processing method that can obtain still image data for printing high-quality printed matter.
  • Specifically, the present disclosure is applicable to a video display device such as a television set or a display, a video reproduction device such as a video recorder or a video player, an imaging device such as a digital camera or a video camera, a terminal device such as a smartphone or a tablet computer, a printing apparatus such as a printer, and a computer apparatus such as a PC or a server.
  • 10 HDR imaging device; 100, 100A, 100B, 100C image processing apparatus; 101 CPU; 102 main memory; 103 storage; 104 input IF; 105 communication IF; 110, 110A, 110B, 110C acquisition unit; 120, 120C conversion unit; 130 output unit; 150 test pattern; 200, 200A, 200B printing apparatus; 201 input IF; 202 scanner; 300 input device

Abstract

Provided is an image processing device which makes it possible to obtain still image data for printing high-quality printed matter. The image processing device is provided with: an acquisition unit which acquires first still image data which is obtained by capturing an image and of which the brightness range is defined by a first dynamic range, and capability information indicating the printing capability of a printing device; a conversion unit which, in accordance with the printing capability indicated by the capability information acquired by the acquisition unit, converts the first still image data acquired by the acquisition unit into second still image data of which the brightness range is defined by a second dynamic range narrower than the first dynamic range; and an output unit which outputs the second still image data obtained by the conversion by the conversion unit to the printing device.

Description

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
 本開示は、画像処理装置および画像処理方法に関する。 The present disclosure relates to an image processing apparatus and an image processing method.
 特許文献1は、露出の異なる複数の画像を合成することにより、ダイナミックレンジの広いHDR(High Dynamic Range)静止画を記録する撮像装置を開示している。 Patent Document 1 discloses an imaging device that records a wide dynamic range HDR (High Dynamic Range) still image by combining a plurality of images with different exposures.
JP 2015-056807 A
 特許文献1に開示された技術では、高品位な印刷物を印刷するための静止画データを得ることが難しい。 In the technique disclosed in Patent Document 1, it is difficult to obtain still image data for printing high-quality printed matter.
 本開示は、高品位な印刷物を印刷するための静止画データを得ることができる画像処理装置および画像処理方法を提供する。 The present disclosure provides an image processing apparatus and an image processing method that can obtain still image data for printing high-quality printed matter.
 本開示における画像処理装置は、撮像により得られ輝度範囲が第1のダイナミックレンジで定義された第1静止画データと印刷装置の印刷能力を示す能力情報とを取得する取得部と、前記取得部により取得された前記第1静止画データを、前記取得部により取得された前記能力情報が示す前記印刷能力に応じて、前記第1のダイナミックレンジよりも輝度範囲が狭い第2のダイナミックレンジで定義された第2静止画データに変換する変換部と、前記変換部により変換された前記第2静止画データを前記印刷装置に出力する出力部と、を備える。 An image processing apparatus according to an embodiment of the present disclosure includes an acquisition unit that acquires first still image data obtained by imaging and having a luminance range defined by a first dynamic range, and capability information indicating the printing capability of the printing apparatus; Defining the first still image data acquired by the second dynamic range having a luminance range narrower than the first dynamic range according to the printing capability indicated by the capability information acquired by the acquiring unit And an output unit for outputting the second still image data converted by the converting unit to the printing apparatus.
 An image processing method according to the present disclosure includes: acquiring first still image data obtained by imaging and having a luminance range defined by a first dynamic range; acquiring capability information indicating the printing capability of a printing apparatus; converting the acquired first still image data into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range, according to the printing capability indicated by the acquired capability information; and outputting the converted second still image data to the printing apparatus.
 Note that these general or specific aspects may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or by any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
 The image processing apparatus and the image processing method according to the present disclosure can obtain still image data for printing high-quality printed matter.
FIG. 1 is a schematic view for explaining the evolution of video technology.
FIG. 2 is a schematic view for explaining HDR display technology.
FIG. 3A is a schematic view for explaining the PQ (Perceptual Quantization) method.
FIG. 3B is a schematic view for explaining the HLG (Hybrid Log Gamma) method.
FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR with an example of an SDR image corresponding to SDR (Standard Dynamic Range).
FIG. 5 is a diagram showing an example of a scale of luminance when capturing an image.
FIG. 6 is a diagram showing an example of the luminance of a captured image.
FIG. 7A is a diagram showing an example of the luminance obtained by mastering the original image shown in FIG. 6 into an SDR image.
FIG. 7B is a diagram schematically showing an example of the relationship between original signal values and SDR signal values for converting (mastering) the original signal values into SDR signal values.
FIG. 8A is a diagram showing an example of the luminance obtained by mastering the original image shown in FIG. 6 into an HDR image.
FIG. 8B is a diagram schematically showing an example of the relationship between original signal values and HDR signal values for converting (mastering) the original signal values into HDR signal values.
FIG. 9 is a schematic diagram for describing an imaging device compatible with HDR or SDR, the file format of image data obtained by the imaging device, and a display device that displays the image data or a printing device that prints the image data.
FIG. 10 is a schematic diagram for describing an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
FIG. 11 is a schematic diagram for describing an HDR shooting mode in which an image with an expanded dynamic range is obtained by combining two images.
FIG. 12 is a schematic view for explaining an HDR image captured for HDR display.
FIG. 13 is a schematic view for explaining a problem in causing an SDR printing apparatus to print an image.
FIG. 14 is a block diagram schematically showing an example of the hardware configuration of the image processing apparatus according to the first embodiment.
FIG. 15 is a block diagram schematically showing a first example of the functional configuration of the image processing apparatus according to the first embodiment.
FIG. 16 is a diagram showing an example of the result of printing the test pattern according to the first embodiment on a sheet.
FIG. 17 is a diagram schematically showing an example of the relationship between signal values of the first dynamic range and signal values of the second dynamic range for converting the first dynamic range into the second dynamic range in the conversion unit according to the first embodiment.
FIG. 18 is a flowchart showing an example of the operation of the image processing apparatus according to the first embodiment.
FIG. 19 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus according to the first modification of the first embodiment.
FIG. 20 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus according to the second modification of the first embodiment.
FIG. 21 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus according to the third modification of the first embodiment.
FIG. 22 is a schematic diagram for describing an example of the image processing apparatus according to the first embodiment (or the first to fifth modifications).
FIG. 23 is a diagram schematically showing a first example of a communication protocol between the display device and the printing device in the embodiment.
FIG. 24 is a diagram schematically showing a second example of the communication protocol between the display device and the printing device in the embodiment.
FIG. 25 is a diagram schematically showing a third example of the communication protocol between the display device and the printing device in the embodiment.
FIG. 26 is a diagram schematically showing a fourth example of the communication protocol between the display device and the printing device in the embodiment.
(Purpose of this disclosure)
 The present disclosure uses two technologies, HDR (High Dynamic Range) display technology and HDR imaging technology, to provide a new user value called HDR still images and a new photographic culture. The new user value is to generate still image data that improves the sense of realism and reduces blown-out highlights (a state in which the gradation of a bright area is lost, also called white clipping) and crushed shadows (a state in which the gradation of a dark area is lost). The new photographic culture is to display and enjoy an HDR still image, obtained by shooting with a camera that supports HDR still image capture, on a display device that supports HDR display (hereinafter referred to as an "HDR display device"). Examples of the HDR display device include an HDRTV (HDR television set), an HDR-compatible tablet terminal, an HDR-compatible smartphone, an HDR-compatible PC (Personal Computer), and an HDR-compatible display. An HDR still image is also referred to as an HDR photograph.
 The present disclosure provides an image processing apparatus and an image processing method capable of generating still image data that can be displayed or printed even by a display device that supports SDR display but not HDR display (hereinafter referred to as an "SDR display device") and by a printing apparatus that supports printing of SDR still images but not printing of HDR still images (hereinafter referred to as an "SDR printing apparatus"). That is, the present disclosure provides an image processing apparatus and an image processing method that can improve the convenience of HDR still image data by providing still image data that allows HDR still images to be reproduced not only by devices that support image processing of HDR still images but also by devices that support image processing of SDR still images but not of HDR still images. In the present disclosure, reproduction of an HDR still image includes display of the HDR still image and printing based on image processing of the HDR still image. That is, in the present disclosure, reproduction includes display and printing.
(Background of HDR display technology)
 FIG. 1 is a schematic view for explaining the evolution of video technology.
 Until now, the main focus of improving video image quality has been on increasing the number of display pixels, and 1920 x 1080 pixel High Definition (HD) video has spread in place of the conventional 720 x 480 pixel Standard Definition (SD) video.
 In recent years, for even higher image quality, 3840 x 2160 pixel Ultra High Definition (UHD) video, or 4096 x 2160 pixel video with an even larger number of pixels (so-called 4K video), has been proposed. Along with 4K video, expansion of the luminance range (hereinafter also referred to as the "dynamic range"), expansion of the color gamut, an increase in the frame rate, and so on are also being studied.
 Regarding the dynamic range, HDR (High Dynamic Range) has been proposed as a method for expressing bright light, such as specular reflection light that is difficult to express with current television signals, with brightness closer to reality while maintaining dark-area gradation. Conventional television signals are called SDR (Standard Dynamic Range) and have a maximum luminance of 100 nit, whereas HDR is assumed to expand the maximum luminance to 1000 nit or more. Standardization of standards for mastering displays is also in progress in SMPTE (Society of Motion Picture and Television Engineers), ITU-R (International Telecommunication Union - Radiocommunication Sector), and the like. Specific applications of HDR include, as with HD and UHD, broadcasting, packaged media (Blu-ray (registered trademark) Disc, etc.), and Internet distribution.
(HDR display technology)
 FIG. 2 is a schematic view for explaining HDR display technology.
 HDR is not merely a method for realizing a very bright television set. HDR is a method that expands the luminance range (dynamic range) of video from the 0.1 nit to 100 nit defined in the BT.709 (Broadcasting Service (Television) 709) standard, which is an example of SDR, to, for example, 0 to 10,000 nit as defined in ST 2084 of SMPTE (Society of Motion Picture and Television Engineers), thereby making it possible to express high-luminance images such as bright sunlight, the sky, and reflections of light that could not be expressed conventionally, and to record bright parts and dark parts at the same time. The luminance referred to here is optical luminance, a physical quantity representing the brightness of a light source. HDR has two methods: ST 2084 (the PQ method), which is suitable for video that is graded after capture (packaged video) and video distributed over IP (Internet Protocol), and Hybrid Log Gamma (the HLG method), which is suitable for live broadcast video and video captured by users. Grading is processing for adjusting the color and tone of video.
 Thus, HDR display technology includes the HLG method, which can realize compatibility between SDR and HDR, and the PQ method, which has no simple display compatibility between SDR and HDR. The PQ method is also referred to as the HDR10 method.
 FIG. 3A is a schematic view for explaining the PQ method. FIG. 3B is a schematic view for explaining the HLG method.
 As shown in FIG. 3A, the PQ method is a method in which SDR and HDR are not compatible. In this method, SDR and HDR are graded separately and transmitted separately. In this method, when video reproduced from an Ultra HD Blu-ray (registered trademark) disc is displayed on an SDRTV (a television set that supports SDR but not HDR), SDR conversion, which converts HDR video data into SDR video data, is required.
 As shown in FIG. 3B, ITU-R (The ITU Radiocommunication Sector) 2100 Hybrid Log Gamma (HLG) is a method having compatibility between SDR and HDR. In this method, grading for HLG is performed and only an HLG stream is transmitted. The HLG stream is compatible with SDR. Therefore, when HDR video data is displayed on an SDRTV, SDR conversion for converting the HDR video data into SDR video data is unnecessary.
FIG. 4 is a diagram comparing an example of an HDR image corresponding to HDR with an example of an SDR image corresponding to SDR. FIG. 4 shows, as both an HDR image and an SDR image, a single picture with a relatively large difference between light and dark, in which a relatively dark indoor scene and a relatively bright scene outside a window coexist.
An HDR image is an image obtained by reproducing HDR still image data or HDR moving image data, and an SDR image is an image obtained by reproducing SDR still image data or SDR moving image data. As illustrated in the upper part of FIG. 4, in the HDR image both the relatively bright scene outside the window and the relatively dark scene inside the room are rendered at appropriate brightness. In the SDR image, on the other hand, as illustrated in the lower part of FIG. 4, when the exposure is adjusted so that the relatively bright scene outside the window is rendered, the relatively dark indoor scene becomes too dark, parts of it are crushed to black, and it becomes difficult to see. Conversely, if the exposure were adjusted so that the indoor scene is rendered appropriately, the scene outside the window would become too bright, parts of it would be blown out to white, and it would become difficult to see (not shown). In this way, an HDR image can achieve what is difficult with an SDR image: a single picture with a relatively large difference between light and dark, in which relatively bright and relatively dark scenes coexist, rendered with high gradation and with both blown-out highlights and crushed blacks reduced.
FIG. 5 is a diagram showing an example of a luminance scale used when capturing an image.
When a subject is captured with a camera, 18% gray, which has a reflectance of 18%, is generally used as the reference point for brightness, as shown in FIG. 5; 18% gray is the reference reflectance that serves as the brightness standard. The Stop numbers shown in FIG. 5 express luminance relatively: the luminance of 18% gray is the reference point, and the Stop number at the reference point is 0. The Stop number is defined to increase by one each time the luminance doubles and to decrease by one each time the luminance halves.
When a subject is captured with a camera, the luminance obtained from the camera's image sensor (for example, a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device)) varies with the exposure determined by the aperture, shutter speed, sensitivity setting, and so on. That is, even when subjects of the same brightness are captured, the luminance obtained from the image sensor takes different values depending on the exposure. For this reason, the Stop number itself is a relative value, not an absolute one; a Stop number alone cannot express absolute luminance.
Moreover, when a subject is captured with a camera, the exposure is usually adjusted according to the brightness of the subject. For example, when capturing a night scene, shown in (1) of FIG. 5 as an example of a dark subject, the camera is generally set with a slower shutter speed, a wider aperture, and so on, so that the dark areas occupying most of the picture are not crushed to black; the exposure is adjusted so that the gradation of the dark areas is preserved while the bright areas, which occupy a relatively small part of the picture, are allowed to blow out.
When capturing a daytime indoor scene, shown in (2) of FIG. 5 as an example of a subject of medium brightness, the exposure is generally adjusted so that the dark and bright parts are well balanced.
When capturing a daytime outdoor scene, shown in (3) of FIG. 5 as an example of a bright subject, the camera is generally set with a faster shutter speed, a smaller aperture, and so on, so that the bright areas occupying most of the picture are not blown out.
To convert the relative luminance obtained in this way into absolute luminance, the relationship to the exposure reference point must be calculated.
FIG. 6 is a diagram showing an example of the luminance of a captured image.
The captured image shown in FIG. 6 is hereinafter referred to as the original image 70. In the original image 70, region A denotes an area of pixels with a luminance of 18 nit, corresponding to 0 Stops, the brightness reference. Region B denotes an area of pixels with a luminance of 90 nit, corresponding to 2.3 Stops. Region C denotes an area of pixels with a luminance of 2.3 nit, corresponding to roughly black at -3 Stops. Region D denotes an area of pixels with a luminance of 1150 nit, corresponding to 6 Stops; region D contains pixels obtained by capturing the sun, and a very high luminance (for example, the brightest luminance in the original image 70) is obtained there. Region E denotes an area of pixels with a luminance of 290 nit, corresponding to 4 Stops; region E contains pixels obtained by capturing a spot exhibiting specular reflection.
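By way of illustration only, the Stop numbers quoted for regions A to E of the original image 70 follow directly from the definition given above (0 Stops at 18 nit, plus one Stop per doubling of luminance). The short Python sketch below assumes nothing beyond that relationship.

    import math

    REFERENCE_NIT = 18.0  # luminance of 18% gray, defined above as 0 Stops

    def nit_to_stops(luminance_nit: float) -> float:
        # +1 Stop per doubling of luminance, -1 per halving, relative to 18 nit
        return math.log2(luminance_nit / REFERENCE_NIT)

    def stops_to_nit(stops: float) -> float:
        return REFERENCE_NIT * (2.0 ** stops)

    # Regions of the original image 70 (FIG. 6)
    for region, nit in [("A", 18.0), ("B", 90.0), ("C", 2.3), ("D", 1150.0), ("E", 290.0)]:
        print(f"region {region}: {nit:6.1f} nit -> {nit_to_stops(nit):+.1f} Stops")
    # prints +0.0 Stops for region A through +4.0 Stops for region E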
Next, the SDR grading process (mastering process) for converting the original image 70 into an SDR image will be described.
The SDR grading process converts the video (original image) of content captured by a camera and containing high-luminance components of 100 nit or more so that it conforms to a broadcast standard such as BT.709, and it is realized by applying knee-curve processing to the original image. Knee-curve processing converts the input signal with a knee curve, an input-output conversion curve that compresses the gain of input signals above a certain value (the knee point) before outputting them. In the SDR grading process, luminance in the original image at or below the knee point is given a gain of 1 (that is, the input luminance is output as it is), and luminance above the knee point has its gain compressed so that it fits within a predetermined luminance. The predetermined luminance may be, for example, the maximum luminance displayable by the display device that will show the processed image, or 100 nit when the processed image is to conform to BT.709. Accordingly, when the SDR grading process is applied to the original image 70, a typical grading process keeps the luminance of the original image 70 linear, unchanged, up to the knee point (for example, around 80 nit), and reduces each luminance above the knee point so that the maximum luminance of the original image 70 fits within 100 nit.
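As an illustration only, the knee-curve processing described above can be sketched as follows in Python. The knee point, the output ceiling, and the linear shape of the compression segment above the knee point are assumptions chosen to match the description; an actual grading tool may use a smoother knee.

    def knee_curve(luminance_nit: float,
                   knee_point: float = 90.0,     # assumed knee point (nit)
                   input_max: float = 1150.0,    # maximum luminance of the original image
                   output_max: float = 100.0) -> float:
        """Pass luminance through unchanged up to the knee point, then compress
        the remaining range [knee_point, input_max] into [knee_point, output_max]."""
        if luminance_nit <= knee_point:
            return luminance_nit                 # gain of 1 at or below the knee point
        ratio = (luminance_nit - knee_point) / (input_max - knee_point)
        return knee_point + ratio * (output_max - knee_point)

    print(knee_curve(18.0))    # 18.0  -> kept as it is
    print(knee_curve(1150.0))  # 100.0 -> compressed to the SDR maximum of 100 nit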
FIG. 7A is a diagram showing an example of the luminance obtained as a result of mastering the original image 70 shown in FIG. 6 into the SDR image 71. FIG. 7B is a diagram schematically showing an example of the relationship between original signal values and SDR signal values used to convert (hereinafter also "master") original signal values into SDR signal values. An original signal value is a luminance in the luminance range of the original image (for example, the original image 70), from 0 nit to its maximum luminance (for example, 1150 nit) (hereinafter, "the luminance of the original image"), and an SDR signal value is a luminance in the SDR luminance range (hereinafter, "the SDR luminance"). Although the maximum original signal value in FIG. 7B is 10000, the maximum original signal value varies with the original image; for the original image 70 shown in FIG. 6, the maximum original signal value is 1150.
In the original image 70, the pixels corresponding to 0 Stops are pixels having the reference luminance that serves as the brightness standard. Therefore, even after the original image 70 is converted into the SDR image 71 by mastering, the luminance (18 nit) of the original image 70 corresponding to 0 Stops is used as the SDR luminance without change (see the area denoted by region A in the SDR image 71 of FIG. 7A).
Here, an example is described in which the original image 70 is mastered into the SDR image 71 using the knee curve shown in FIG. 7B, except that the maximum original signal value is 1150 rather than 10000 and the knee point is 90. In this mastering, in the luminance range at or below the luminance of the original image 70 corresponding to 90 nit (0 to 90 nit), the luminance of the original image 70 is used as the SDR luminance without change; that is, pixels of 0 to 90 nit in the original image 70 remain pixels of 0 to 90 nit in the mastered SDR image 71. In the luminance range of the original image 70 above 90 nit (90 to 1150 nit), the luminance of the original image 70 is linearly converted into SDR luminance in the range of 90 to 100 nit; that is, pixels of 90 to 1150 nit in the original image 70 become pixels of 90 to 100 nit in the mastered SDR image 71.
For example, for a pixel corresponding to 90 nit in the original image 70, the luminance of the original image 70 corresponding to 90 nit is used as the SDR luminance without change even after the original image 70 is converted into the SDR image 71 by mastering (see the area denoted by region B in the SDR image 71 of FIG. 7A).
Likewise, for a pixel corresponding to 2.3 nit in the original image 70, the luminance of the original image 70 corresponding to 2.3 nit is used as the SDR luminance without change after mastering (see the area denoted by region C in the SDR image 71 of FIG. 7A).
On the other hand, for a pixel corresponding to 1150 nit in the original image 70, the mastering from the original image 70 to the SDR image 71 converts the luminance of the original image 70 corresponding to 1150 nit into 100 nit, the maximum SDR luminance (see the area denoted by region D in the SDR image 71 of FIG. 7A).
Also, for a pixel corresponding to 290 nit in the original image 70, the mastering from the original image 70 to the SDR image 71 converts the luminance of the original image 70 corresponding to 290 nit into 95 nit (see the area denoted by region E in the SDR image 71 of FIG. 7A).
FIG. 8A is a diagram showing an example of the luminance obtained as a result of mastering the original image 70 shown in FIG. 6 into the HDR image 72. FIG. 8B is a diagram schematically showing an example of the relationship between original signal values and HDR signal values used to convert (master) original signal values into HDR signal values. An HDR signal value is a luminance in the HDR luminance range (hereinafter, "the HDR luminance"). In this example, it is assumed that luminance up to 2000 nit is permitted as HDR luminance in the mastering from the original image 70 to the HDR image 72. Since, as described above, the maximum luminance of the original image 70 is 1150 nit, the HDR image 72 can retain the luminance of the original image 70 as it is.
In the original image 70, the pixels corresponding to 0 Stops are pixels having the reference luminance that serves as the brightness standard. Therefore, for those pixels, the luminance in the original image 70 is used as the HDR luminance without change even after the original image 70 is converted into the HDR image 72 by mastering (see the area denoted by region A in the HDR image 72 of FIG. 8A).
Similarly, for the pixels corresponding to 90 nit, 2.3 nit, 1150 nit, and 290 nit in the original image 70, the luminance of the original image 70 is used as the HDR luminance without change after the original image 70 is converted into the HDR image 72 by mastering (see the areas denoted by regions B, C, D, and E in the HDR image 72 of FIG. 8A).
FIG. 9 is a schematic diagram for explaining imaging devices compatible with HDR or SDR, the file format of the image data obtained by those imaging devices, and the display devices that display the image data or the printing devices that print it.
The HDR imaging device 10 shown in FIG. 9 supports HDR imaging. The HDR imaging device 10 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, and a JPEG compression unit 13. The HDR imaging device 10 is configured so that image data obtained by imaging in the HDR shooting mode with the HDR imaging unit 11 can be displayed on the SDR display device 40 or printed by the SDR printing device 50. Specifically, in the HDR imaging device 10, HDR still image data of an HDR image obtained by imaging in the HDR shooting mode with the HDR imaging unit 11 is converted into SDR still image data by the conversion unit 12. The SDR still image data obtained by the conversion in the conversion unit 12 is then JPEG-compressed by the JPEG compression unit 13, and the resulting SDR still image data in JPEG format is output. In the HDR imaging device 10, SDR still image data of an SDR image obtained by imaging in the conventional shooting mode (SDR shooting mode) with the SDR imaging unit 14 is likewise JPEG-compressed by the JPEG compression unit 13, and the resulting SDR still image data in JPEG format is output.
The SDR imaging device 20 includes an SDR imaging unit 21 and a JPEG compression unit 22. In the SDR imaging device 20, just as when imaging is performed in the conventional shooting mode (SDR shooting mode) on the HDR imaging device 10, SDR still image data of an SDR image obtained by imaging with the SDR imaging unit 21 is JPEG-compressed by the JPEG compression unit 22, and the resulting SDR still image data in JPEG format is output.
Consequently, the HDR display device 30, the SDR display device 40, and the SDR printing device 50 acquire either SDR still image data obtained by SDR-converting HDR still image data captured in HDR, or SDR still image data captured in SDR, and reproduce (display or print) the SDR image represented by that SDR still image data.
Next, the HDR shooting mode will be described with reference to FIG. 10 and FIG. 11.
FIG. 10 and FIG. 11 are schematic diagrams for explaining the HDR shooting mode, in which an image with an expanded luminance range (dynamic range) is obtained by combining two images.
Some smartphones, digital cameras, and similar devices have an HDR shooting mode that can capture video with a wide luminance range (dynamic range). In the HDR shooting mode, as shown in (a) of FIG. 10 and FIG. 11, two SDR images obtained by double exposure (a technique of capturing the same subject multiple times under different exposure conditions) or the like are combined so that the result fits within the luminance range defined for SDR, in order to obtain HDR image data with a wide luminance range (dynamic range). As shown in (b) of FIG. 10 and FIG. 11, this makes it possible to display the HDR image on an SDR display device.
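As a minimal sketch, assuming the two exposures are already aligned and normalized to [0, 1], the combination performed in the HDR shooting mode could look like the following; the weighting function is an assumption for illustration, and actual cameras use their own fusion logic.

    import numpy as np

    def combine_exposures(under_exposed: np.ndarray, over_exposed: np.ndarray) -> np.ndarray:
        """Blend two SDR exposures of the same scene into one SDR image.
        Inputs are float arrays of shape (H, W, 3) in [0, 1]. Bright areas are taken
        mainly from the under-exposed frame and dark areas mainly from the over-exposed
        frame, so the result stays within the SDR range."""
        brightness = over_exposed.mean(axis=-1, keepdims=True)  # per-pixel brightness estimate
        weight = np.clip(brightness, 0.0, 1.0)                  # 0 = dark pixel, 1 = bright pixel
        fused = weight * under_exposed + (1.0 - weight) * over_exposed
        return np.clip(fused, 0.0, 1.0)                         # keep within the SDR range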
Next, an HDR image captured for HDR display will be described with reference to FIG. 12.
FIG. 12 is a schematic diagram for explaining an HDR image captured for HDR display.
As shown in FIG. 12, an HDR image for HDR display is captured so that the brightness of the scene being captured covers a wider luminance range (dynamic range) than in the SDR shooting mode. The image data obtained by this capture is graded to generate an HDR image for HDR display, and that HDR image is transmitted to each device and reproduced. Because an HDR image has a wider luminance range (dynamic range) than an SDR image, it cannot be displayed on an SDR display device as it is; displaying an HDR image on an SDR display device requires conversion from the HDR image to an SDR image.
In the HDR shooting mode described with reference to FIG. 10 and FIG. 11, on the other hand, the combined image is generated so as to fit within the luminance range defined for SDR, so it can be reproduced on both the HDR display device 30 and the SDR display device 40 (or the SDR printing device 50).
Returning to FIG. 9, the description continues. In recent years, HDR display devices such as HDRTVs have been proposed that can display HDR image data for displaying an HDR image without SDR conversion.
In a camera with an HDR shooting mode (HDR imaging function), on the other hand, HDR technology is used mainly for purposes such as backlight compensation, unlike in an HDRTV. Still images captured by such a camera using HDR technology may be reproduced on an SDR display device or an SDR printing device. For that reason, even though the camera has an image sensor capable of generating HDR image data for video and is capable of capturing with HDR technology, it may SDR-convert the HDR image captured in the HDR shooting mode and output SDR still image data. In other words, a camera with an HDR imaging function is capable of generating HDR image data with a wide luminance range (dynamic range) that would exploit the display capability of an HDRTV, yet in some cases no HDR image data has been generated.
FIG. 13 is a schematic diagram for explaining the problem that arises when the SDR printing device 50 is made to print an image.
To exploit the HDR display capability of an HDRTV (for example, the HDR display device 30), when HDR image data for HDR display has been generated, it suffices to display the HDR image data on the HDRTV as it is, without converting it into SDR image data.
The HDR imaging device 10A shown in FIG. 13 includes an HDR imaging unit 11, an SDR imaging unit 14, a conversion unit 12, a JPEG compression unit 13, an HDR image correction unit 15, and an HEVC (High Efficiency Video Coding) compression unit 16. The HDR imaging device 10A performs HDR image correction in the HDR image correction unit 15 in order to generate HDR image data.
The HDR image correction unit 15 converts, for example, the RAW data obtained by imaging with the HDR imaging unit 11 into a 10-bit image displayable on an HDRTV (for example, the HDR display device 30 compliant with the HDR10 standard) using an HDR-EOTF (HDR Electro-Optical Transfer Function) such as the PQ curve. The HDR imaging device 10A then outputs the HDR image data obtained by the HDR image correction unit 15, for example from the HEVC compression unit 16 via HDMI (registered trademark, High-Definition Multimedia Interface), to the HDRTV (for example, the HDR display device 30). The HDRTV (for example, the HDR display device 30) that receives this HDR image data then displays the HDR image corresponding to it.
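For reference only, the PQ curve mentioned above is defined in SMPTE ST 2084 as a mapping from absolute luminance to a code value; the sketch below quantizes that mapping to 10 bits. It is an illustration of the curve itself, not of the internal processing of the HDR image correction unit 15.

    def pq_encode(luminance_nit: float) -> int:
        """Map absolute luminance (0 to 10,000 nit) to a 10-bit code value using
        the SMPTE ST 2084 (PQ) inverse EOTF."""
        m1 = 2610 / 16384
        m2 = 2523 / 4096 * 128
        c1 = 3424 / 4096
        c2 = 2413 / 4096 * 32
        c3 = 2392 / 4096 * 32
        y = max(luminance_nit, 0.0) / 10000.0       # normalize to the 10,000 nit ceiling
        v = ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
        return round(v * 1023)                      # quantize to 10 bits

    print(pq_encode(100))    # SDR peak white lands at roughly half of the code range
    print(pq_encode(1000))   # a typical HDR highlight level, at roughly three quarters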
Further, by performing pseudo-HDR conversion on the HDR image data and outputting the resulting SDR-format pseudo-HDR image to the SDR display device 40, the SDR display device 40 can be made to display an image of higher quality than when it displays SDR image data. Pseudo-HDR conversion converts an HDR image into an SDR-format image (a pseudo-HDR image) whose luminance range (dynamic range) matches the maximum luminance value that the SDR display device 40 can display.
However, the SDR printing device 50 cannot print HDR image data with high quality. The SDR printing device 50 therefore prints SDR image data obtained by imaging in the conventional SDR shooting mode. In other words, even when high-quality HDR image data has been obtained, the SDR printing device 50 cannot print a high-quality image based on that high-quality HDR image data.
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, descriptions that are more detailed than necessary may be omitted; for example, detailed descriptions of well-known matters and redundant descriptions of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate the understanding of those skilled in the art.
The attached drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and they are not intended to limit the subject matter recited in the claims.
The drawings are not necessarily drawn strictly to scale; they are schematic diagrams in which omissions and the like have been made as appropriate to present the present disclosure clearly. In the drawings, substantially identical components are given the same reference numerals, and their description may be omitted or simplified.
(1. Embodiment 1)
The present embodiment discloses an image processing apparatus that generates still image data enabling the SDR printing device 50 to print a high-quality image.
(1-1. Configuration)
FIG. 14 is a block diagram schematically showing an example of the hardware configuration of the image processing apparatus 100 according to Embodiment 1.
As shown in FIG. 14, the image processing apparatus 100 includes, as its hardware configuration, a CPU (Central Processing Unit) 101, a main memory 102, a storage 103, an input IF (Interface) 104, and a communication IF (Interface) 105.
The CPU 101 is a processor that executes a control program stored in the storage 103 or the like.
The main memory 102 is a volatile storage area used as a work area when the CPU 101 executes the control program. The main memory 102 can be configured by, for example, a semiconductor memory.
The storage 103 is a nonvolatile storage area that holds the control program, content, and the like. The storage 103 can be configured by, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
The input IF 104 is, for example, a keyboard, a mouse, a touch pad, buttons, or a touch panel.
The communication IF 105 is a communication interface that communicates with other devices via a communication network. The other devices are, for example, the printing apparatus 200 and the input device 300 described later. The communication IF 105 is, for example, a wireless LAN (Local Area Network) interface conforming to the IEEE 802.11a, b, g, or n standards, but it may instead be a wireless communication interface conforming to a communication standard used in a mobile communication system such as the third-generation mobile communication system (3G), the fourth-generation mobile communication system (4G), or LTE (registered trademark, Long Term Evolution), or a wireless communication interface conforming to the Bluetooth (registered trademark) standard. The communication IF 105 may also be a wired communication interface such as a wired LAN interface or a USB (Universal Serial Bus) interface.
FIG. 15 is a block diagram schematically showing a first example of the functional configuration of the image processing apparatus according to Embodiment 1.
The image processing apparatus 100 includes an acquisition unit 110, a conversion unit 120, and an output unit 130. The image processing apparatus 100 may be realized as an apparatus incorporated into an imaging device or as a standalone apparatus. The printing apparatus 200 is connected to the image processing apparatus 100 by a wired or wireless connection. The printing apparatus 200 is an example of the SDR printing device 50 shown in FIG. 9.
The acquisition unit 110 acquires first still image data D1 obtained by imaging and capability information I1 indicating the printing capability of the printing apparatus 200. The first still image data D1 is still image data whose luminance range is defined by a first dynamic range. The first dynamic range is, for example, HDR (High Dynamic Range).
The acquisition unit 110 may acquire, as the capability information I1, paper information indicating, for example, the type of paper used for printing by the printing apparatus 200 (or the paper set in the printing apparatus 200). The paper information may be included, for example, in the instruction given when the user has the printing apparatus 200 print an image (hereinafter also referred to as a "print instruction").
The acquisition unit 110 may also acquire, as the capability information I1, a print result obtained by printing a specific pattern (hereinafter also referred to as a "test pattern") on the paper used for printing by the printing apparatus 200. The acquisition unit 110 may acquire the print result through the user entering it into the image processing apparatus 100 (or the printing apparatus 200). This print result is described later.
The acquisition unit 110 may acquire the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100 by a wired or wireless connection. The acquisition unit 110 that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
The acquisition unit 110 may acquire, for example, a print instruction including the paper information serving as the capability information I1 by accepting an input operation from the user. The acquisition unit 110 that acquires the capability information I1 may be realized by, for example, the input IF 104 (see FIG. 14). The acquisition unit 110 may also acquire the capability information I1 without acquiring a print instruction.
The conversion unit 120 converts the first still image data D1 acquired by the acquisition unit 110 into second still image data D2 according to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110. The second still image data D2 is still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range.
When the capability information I1 is paper information containing information indicating the paper type, the conversion unit 120 identifies the reflection luminance corresponding to the paper type indicated by the acquired paper information by referring to information indicating a predetermined relationship, namely the relationship between a plurality of paper types and the reflection luminance corresponding to each of those paper types. The conversion unit 120 then converts the first still image data D1 into the second still image data D2, using as the second dynamic range the luminance range (dynamic range) whose maximum luminance is the reflection luminance identified from the paper information. The reflection luminance is the luminance of the light reflected by the paper when the paper is illuminated with light of a prescribed luminance. The prescribed luminance may be defined, for example, as a luminance representing typical indoor brightness, or it may be any other arbitrarily determined luminance. The information indicating the predetermined relationship (the relationship between the paper types and the reflection luminances) can be obtained in advance by illuminating each of the paper types with light of the prescribed luminance and measuring the luminance of the light reflected from each paper. The information indicating the predetermined relationship may be represented, for example, by a table expressing the relationship between the paper types and the reflection luminances and may be stored in the storage 103; it may also be acquired from an external information processing apparatus (not shown) via the communication network.
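By way of illustration only, the paper-information path could be sketched as below; the paper types and reflection luminances are invented placeholders, not values given in this disclosure.

    # Hypothetical table relating paper types to reflection luminance (nit) measured
    # in advance under the prescribed illumination; the entries are placeholders.
    PAPER_REFLECTION_LUMINANCE_NIT = {
        "glossy_photo": 90.0,
        "matte_photo": 70.0,
        "plain": 50.0,
    }

    def second_dynamic_range_max_from_paper(paper_type: str) -> float:
        """Return the maximum luminance of the second dynamic range for a paper type."""
        return PAPER_REFLECTION_LUMINANCE_NIT[paper_type]

    print(second_dynamic_range_max_from_paper("glossy_photo"))  # 90.0 nit as the ceiling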
When the capability information I1 acquired by the acquisition unit 110 contains information indicating a print result, the conversion unit 120 uses the luminance range (dynamic range) determined according to that print result as the second dynamic range, and converts the first still image data D1 into the second still image data D2 based on that second dynamic range.
The conversion unit 120 may be realized by, for example, the CPU 101, the main memory 102, and the storage 103 (see FIG. 14), or by a dedicated circuit configured from a semiconductor integrated circuit or the like.
The output unit 130 outputs the second still image data D2, converted from the first still image data D1 by the conversion unit 120, to the printing apparatus 200. The output unit 130 may be realized by, for example, the communication IF 105 (see FIG. 14).
The printing apparatus 200 receives the second still image data D2 from the image processing apparatus 100 and the print instruction from the user, and prints the still image represented by the received second still image data D2 on paper of the type specified in the received print instruction.
The image processing apparatus 100 may output to the printing apparatus 200 a print instruction for printing a test pattern and cause the printing apparatus 200 to print that test pattern. The image processing apparatus 100 may also include a display (not shown), or a display may be connected to it. After outputting the print instruction that causes the test pattern to be printed, the image processing apparatus 100 may show on the display an input instruction asking the user to enter the print result obtained when the test pattern is printed on the paper. This input instruction may be, for example, a message, an image, or a UI (User Interface) prompting the user to enter the print result obtained from the paper on which the test pattern has been printed.
FIG. 16 is a diagram showing an example of the result of printing the test pattern 150 according to Embodiment 1 on paper.
In the present embodiment, as the specific pattern (test pattern), the test pattern 150 shown in FIG. 16, for example, is printed on the paper P1 used for printing by the printing apparatus 200. The test pattern 150 shown as an example in FIG. 16 contains patterns 1 to 3 and patterns 4 to 6. As shown in FIG. 16, patterns 1 to 3 are arranged in the upper-left area of the paper P1 (the upper left in FIG. 16): three black reference bars are arranged vertically in parallel, gray is placed to the right of each black bar, and the gray becomes lighter from top to bottom. That is, in patterns 1 to 3, with the three black bars on the left side of the paper P1 (the left side in FIG. 16) as the comparison reference, the gray placed to the right of each black bar changes from dark gray to light gray in three steps. As shown in FIG. 16, patterns 4 to 6 are arranged in the lower-left area of the paper P1 (the lower left in FIG. 16): three white reference bars are arranged vertically in parallel, gray is placed to the right of each white bar, and the gray becomes lighter from top to bottom. That is, in patterns 4 to 6, with the three white bars on the left side of the paper P1 (the left side in FIG. 16) as the comparison reference, the gray placed to the right of each white bar changes from light gray to an even lighter gray in three steps. As shown in FIG. 16, the numerals 1 to 6 are printed alongside patterns 1 to 6 to distinguish them from one another.
The arrangement of patterns 1 to 6 is in no way limited to that shown in FIG. 16. Each pattern may be arranged horizontally, and patterns other than patterns 1 to 6 may be included in the test pattern 150. The number of patterns included in the test pattern 150 may be five or fewer, or seven or more. For example, in addition to patterns 1 to 6, one or more patterns in which the gray density changes in finer steps may be further included in the test pattern 150 shown in FIG. 16.
In the image processing apparatus 100, an input instruction is shown to the user, for example via the display, prompting the user to look at the paper P1 on which the test pattern 150 has been printed and to enter into the image processing apparatus 100 (or the printing apparatus 200) the numbers of the patterns for which the gradation between the reference black or white and the gray can be distinguished (or cannot be distinguished). The image processing apparatus 100 also stores, for example in the storage 103, a table associating the combinations of the numbers of patterns 1 to 3 and the numbers of patterns 4 to 6 (for example, nine combinations) with luminance ranges (dynamic ranges). When the user performs the input operation in accordance with the input instruction, the image processing apparatus 100 obtains a combination of numbers from patterns 1 to 6. The combination of numbers obtained in this way is referred to as the "print result" in the present embodiment; this combination of numbers is one example of a print result. In the test pattern 150 shown in FIG. 16, letters, symbols, other characters, or combinations thereof may be used instead of the numbers.
The image processing apparatus 100 selects, from among the luminance ranges associated in advance with, for example, the nine combinations of the numbers of patterns 1 to 6, the one luminance range corresponding to the two numbers entered by the user (that is, one of the numbers of patterns 1 to 3 and one of the numbers of patterns 4 to 6) as the second dynamic range corresponding to that paper. As a concrete example of this operation, the image processing apparatus 100 shows the user, via the display or the like, an input instruction prompting the user to enter the smallest pattern number among patterns 1 to 3 whose gradation can be distinguished and the largest pattern number among patterns 4 to 6 whose gradation can be distinguished. Based on the two pattern numbers entered by the user in accordance with that instruction, the image processing apparatus 100 selects the luminance range associated in advance with that combination of pattern numbers and uses the selected luminance range as the second dynamic range. In this way, the second dynamic range is determined in the image processing apparatus 100.
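As an illustration of this selection, assuming a table like the following stored in the storage 103 (all pairs and luminance ranges are placeholders invented for the sketch), the print result entered by the user picks one luminance range as the second dynamic range.

    # Hypothetical mapping from the entered pattern-number pair (one of patterns 1-3,
    # one of patterns 4-6) to a luminance range (minimum nit, maximum nit).
    RANGE_TABLE = {
        (1, 4): (0.5, 60.0), (1, 5): (0.5, 75.0), (1, 6): (0.5, 90.0),
        (2, 4): (1.0, 60.0), (2, 5): (1.0, 75.0), (2, 6): (1.0, 90.0),
        (3, 4): (2.0, 60.0), (3, 5): (2.0, 75.0), (3, 6): (2.0, 90.0),
    }

    def second_dynamic_range_from_print_result(dark_pattern: int, bright_pattern: int):
        """Look up the second dynamic range associated with the entered print result."""
        return RANGE_TABLE[(dark_pattern, bright_pattern)]

    print(second_dynamic_range_from_print_result(2, 5))  # e.g. (1.0, 75.0)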
FIG. 17 is a diagram schematically showing an example of the relationship between signal values of the first dynamic range and signal values of the second dynamic range used by the conversion unit 120 of Embodiment 1 to convert from the first dynamic range to the second dynamic range. A signal value of the first dynamic range is, for example, a luminance in HDR, and a signal value of the second dynamic range is a luminance in a luminance range (dynamic range) whose maximum luminance is smaller than the maximum luminance of HDR.
A display device displays an image by adjusting, for each pixel, the emission intensity of RGB (Red, Green, Blue), the three primary colors of light; in a display device, therefore, an image is expressed in absolute luminance. A printing device, by contrast, produces an image by printing a plurality of inks including CMYK (Cyan, Magenta, Yellow, blacK) on paper. The image printed on the paper is expressed by the luminance of the light reflected according to the ink applied to the paper, so the maximum luminance in that image is the luminance of the white areas where no ink is applied (the white of the paper). In addition, the brightness of the light reflected by the paper varies with the paper type (the paper's reflectance), the luminance of the light illuminating the paper, the angle of that light, and so on.
Therefore, to print on paper a high-quality image with an expanded luminance range (dynamic range), it is effective to perform a conversion process that converts the signal level of the HDR image from the first dynamic range to the second dynamic range, taking into account the paper type and the type and brightness of the light source illuminating the environment in which the printed paper will be placed.
For example, as shown in FIG. 17, the conversion unit 120 converts the HDR image data serving as the first still image data D1 into the second still image data D2 by converting the first dynamic range into a second dynamic range whose maximum luminance is the upper limit of the paper's expressive capability (that is, the luminance represented by the white of the paper). The maximum luminance represented by the white of the paper can be obtained, for example, from the reflectance of the paper: the conversion unit 120 may use the paper's reflectance to calculate the luminance of the reflected light produced when light of the prescribed luminance is reflected by the white of the paper (hereinafter, the "reflection luminance"), and may use the calculated reflection luminance as the upper limit of the paper's expressive capability.
Alternatively, the conversion unit 120 may obtain the maximum luminance in the first still image data D1 and convert the luminance range (dynamic range) so that the maximum luminance in the first still image data D1 becomes the reflection luminance, thereby converting the first still image data D1 into the second still image data D2.
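A minimal sketch of this conversion, assuming a prescribed illumination level and a simple linear scaling (both assumptions for illustration; the disclosure leaves the exact mapping open), is shown below.

    import numpy as np

    def convert_to_second_dynamic_range(hdr_nit: np.ndarray,
                                        paper_reflectance: float,
                                        prescribed_light_nit: float = 200.0) -> np.ndarray:
        """Map per-pixel HDR luminance (nit) into a second dynamic range whose maximum
        is the reflection luminance of the paper white."""
        reflection_nit = paper_reflectance * prescribed_light_nit  # paper-white ceiling
        # map the brightest pixel of the image onto the paper-white ceiling
        scaled = hdr_nit * (reflection_nit / hdr_nit.max())
        return np.clip(scaled, 0.0, reflection_nit)

    hdr = np.array([[2.3, 18.0], [290.0, 1150.0]])                 # the regions of FIG. 6
    print(convert_to_second_dynamic_range(hdr, paper_reflectance=0.8))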
(1-2. Operation)
FIG. 18 is a flowchart showing an example of the operation of the image processing apparatus 100 according to Embodiment 1.
The acquisition unit 110 acquires the first still image data D1 (step S101).
The acquisition unit 110 acquires the capability information I1 (step S102).
The conversion unit 120 converts the first still image data D1 acquired by the acquisition unit 110 into the second still image data D2, based on the second dynamic range corresponding to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110 (step S103).
The output unit 130 outputs the second still image data D2, converted from the first still image data D1 by the conversion unit 120, to the printing apparatus 200 (step S104).
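Tying steps S101 to S104 together, a self-contained sketch might look like the following; the stub functions standing in for the acquisition and output paths are hypothetical and not part of the disclosed configuration.

    import numpy as np

    def read_hdr_still() -> np.ndarray:                  # S101: first still image data D1
        return np.array([[2.3, 18.0], [290.0, 1150.0]])  # luminance in nit (stub)

    def read_capability_info() -> float:                 # S102: capability information I1
        return 90.0                                      # assumed paper-white ceiling (nit)

    def send_to_printer(sdr_still: np.ndarray) -> None:  # S104: output D2 to the printer
        print(sdr_still)                                 # stub for the printing apparatus 200

    def process_and_print() -> None:
        hdr_still = read_hdr_still()
        ceiling_nit = read_capability_info()
        scale = ceiling_nit / hdr_still.max()            # S103: fit D1 into the second
        sdr_still = np.clip(hdr_still * scale, 0.0, ceiling_nit)  # dynamic range
        send_to_printer(sdr_still)

    process_and_print()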
(1-3. Effects, etc.)
As described above, in the present embodiment the image processing apparatus includes: an acquisition unit that acquires first still image data obtained by imaging and having a luminance range defined by a first dynamic range, and capability information indicating the printing capability of a printing apparatus; a conversion unit that converts the first still image data acquired by the acquisition unit into second still image data defined by a second dynamic range having a narrower luminance range than the first dynamic range, according to the printing capability indicated by the capability information acquired by the acquisition unit; and an output unit that outputs the second still image data converted by the conversion unit to the printing apparatus.
 また、本実施の形態において、画像処理方法は、撮像により得られ輝度範囲が第1のダイナミックレンジで定義された第1静止画データを取得し、印刷装置の印刷能力を示す能力情報を取得し、取得した第1静止画データを、取得した能力情報が示す印刷能力に応じて、第1のダイナミックレンジよりも輝度範囲が狭い第2のダイナミックレンジで定義された第2静止画データに変換し、変換した第2静止画データを印刷装置に出力する。 Further, in the present embodiment, the image processing method acquires first still image data obtained by imaging and having a luminance range defined by the first dynamic range, and acquires capability information indicating the printing capability of the printing apparatus. Converting the acquired first still image data into second still image data defined in a second dynamic range whose luminance range is narrower than the first dynamic range according to the printing capability indicated by the acquired capability information , Outputting the converted second still image data to the printing apparatus.
 Note that the image processing apparatus 100 is an example of the image processing apparatus, the first still image data D1 is an example of the first still image data, the printing apparatus 200 is an example of the printing apparatus, the capability information I1 is an example of the capability information, the acquisition unit 110 is an example of the acquisition unit, the second still image data D2 is an example of the second still image data, the conversion unit 120 is an example of the conversion unit, and the output unit 130 is an example of the output unit.
 For example, in the example described in Embodiment 1, the image processing apparatus 100 includes: the acquisition unit 110, which acquires the first still image data D1 obtained by imaging and having a luminance range defined by the first dynamic range, together with the capability information I1 indicating the printing capability of the printing apparatus 200; the conversion unit 120, which converts the first still image data D1 acquired by the acquisition unit 110 into the second still image data D2 defined by the second dynamic range whose luminance range is narrower than the first dynamic range, according to the printing capability indicated by the capability information I1 acquired by the acquisition unit 110; and the output unit 130, which outputs the second still image data D2 converted by the conversion unit 120 to the printing apparatus 200.
 Also, in the example described in Embodiment 1, the image processing method executed by the image processing apparatus 100 acquires the first still image data D1 obtained by imaging and having a luminance range defined by the first dynamic range (step S101), acquires the capability information I1 indicating the printing capability of the printing apparatus 200 (step S102), converts the acquired first still image data D1 into the second still image data D2 defined by the second dynamic range whose luminance range is narrower than the first dynamic range, according to the printing capability indicated by the acquired capability information I1 (step S103), and outputs the converted second still image data D2 to the printing apparatus 200 (step S104).
 The image processing apparatus 100 configured in this way can convert the first still image data D1 into the second still image data D2, defined by the second dynamic range determined according to the printing capability of the printing apparatus 200, and output the second still image data D2 to the printing apparatus 200. The image processing apparatus 100 can therefore cause the printing apparatus 200 to print an image based on the first still image data D1 in a luminance range (dynamic range) suited to the printing capability of the printing apparatus 200, and thus to print that image with high quality.
 In the image processing apparatus, the acquisition unit may acquire, as the capability information, paper information indicating the type of paper used for printing by the printing apparatus. The conversion unit may identify the reflection luminance corresponding to the paper type indicated by the paper information acquired by the acquisition unit, by referring to a relationship between a plurality of paper types and the reflection luminance of light reflected by each paper when that paper is illuminated. The conversion unit may then convert the first still image data into the second still image data, using as the second dynamic range the luminance range whose maximum luminance is the identified reflection luminance.
 For example, in the example described in Embodiment 1, in the image processing apparatus 100, the acquisition unit 110 acquires, as the capability information I1, paper information indicating the type of paper used for printing by the printing apparatus 200. The conversion unit 120 identifies the reflection luminance corresponding to the paper type indicated by the acquired paper information by referring to the relationship between a plurality of paper types and the reflection luminance of light reflected by each paper when that paper is illuminated. The conversion unit 120 then converts the first still image data D1 into the second still image data D2, using as the second dynamic range the luminance range whose maximum luminance is the identified reflection luminance.
 The image processing apparatus 100 configured in this way can convert the first still image data D1 into the second still image data D2, defined by the second dynamic range based on the type of paper used for printing by the printing apparatus 200, and output the second still image data D2 to the printing apparatus 200. The image processing apparatus 100 can therefore cause the printing apparatus 200 to print an image based on the first still image data D1 in a luminance range (dynamic range) suited to the expressive capability of that paper, and thus to print that image with high quality.
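 A minimal sketch of this paper-information branch follows, under the assumption that the relationship between paper types and reflection luminance is held as a simple table; the paper-type names and luminance values below are illustrative placeholders, not figures taken from this disclosure.

```python
# Hypothetical reflection-luminance table (cd/m^2) per paper type.
PAPER_REFLECTION_LUMINANCE = {
    "plain": 100.0,
    "matte_photo": 120.0,
    "glossy_photo": 140.0,
}

def second_dynamic_range_from_paper(paper_type: str, default: float = 100.0) -> float:
    """Return the maximum luminance of the second dynamic range for the paper
    type indicated by the paper information in the capability information I1."""
    return PAPER_REFLECTION_LUMINANCE.get(paper_type, default)
```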
 In the image processing apparatus, the acquisition unit may acquire, as the capability information, a print result obtained when the printing apparatus prints a specific pattern on the paper used for printing. The conversion unit may convert the first still image data into the second still image data, using as the second dynamic range a luminance range determined according to the print result acquired by the acquisition unit.
 The test pattern 150 shown in FIG. 16 is an example of the specific pattern. The combination of the numbers of patterns 1 to 6 that the image processing apparatus 100 obtains from a user who has viewed the paper P1 on which the test pattern 150 of FIG. 16 is printed is an example of the print result.
 For example, in the example described in Embodiment 1, in the image processing apparatus 100, the acquisition unit 110 acquires, as the capability information I1, the print result obtained when the printing apparatus 200 prints the specific pattern on the paper used for printing by the printing apparatus 200. The conversion unit 120 converts the first still image data D1 into the second still image data D2, using as the second dynamic range the luminance range determined according to the print result acquired by the acquisition unit 110.
 In the image processing apparatus 100 configured in this way, the specific pattern is printed on the paper used for printing by the printing apparatus 200, the print result based on the paper P1 on which the specific pattern has been printed is obtained from the user, and the second dynamic range can be determined according to the obtained print result. The image processing apparatus 100 can then convert the first still image data D1 into the second still image data D2 defined by that second dynamic range and output the second still image data D2 to the printing apparatus 200. In other words, the image processing apparatus 100 can easily estimate the expressive capability of the paper used for printing by the printing apparatus 200 (that is, the luminance range whose maximum luminance is the luminance represented by the white of the paper), and can easily determine the second dynamic range by adopting a luminance range (dynamic range) suited to that expressive capability as the second dynamic range. The image processing apparatus 100 can therefore cause the printing apparatus 200 to print an image based on the first still image data D1 with high quality.
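 The details of the test pattern 150 are given in FIG. 16 and are not reproduced here, but the idea can be sketched as follows: the user reports which of the printed patterns 1 to 6 are distinguishable on the paper P1, and the reported combination selects a maximum luminance for the second dynamic range. The mapping values and the rule of using the highest distinguishable pattern are assumptions made for illustration.

```python
# Hypothetical mapping from the highest distinguishable pattern number (1 to 6)
# on the printed test pattern 150 to the assumed paper-white luminance (cd/m^2).
PATTERN_TO_MAX_LUMINANCE = {1: 80.0, 2: 95.0, 3: 110.0, 4: 125.0, 5: 140.0, 6: 155.0}

def second_dynamic_range_from_print_result(visible_patterns: set[int]) -> float:
    """Estimate the maximum luminance of the second dynamic range from the
    combination of pattern numbers the user reports as distinguishable."""
    if not visible_patterns:
        return PATTERN_TO_MAX_LUMINANCE[1]
    return PATTERN_TO_MAX_LUMINANCE[max(visible_patterns)]
```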
 In the image processing apparatus, the first dynamic range may be HDR.
 For example, in the example described in Embodiment 1, in the image processing apparatus 100, the first dynamic range is HDR.
 This image processing apparatus 100 can therefore cause the printing apparatus 200 to print an image based on the first still image data D1 with high quality, while losing as little as possible of the luminance range (dynamic range) of the HDR-compliant first still image data D1.
 (1-4. Modifications of Embodiment 1)
 Next, Modifications 1 to 5 of Embodiment 1 will be described with reference to FIGS. 19 to 21. In the following description, components substantially identical to those described in Embodiment 1 are given the same reference numerals, and their description is omitted. Also, in the following description, each of the image processing apparatuses 100A, 100B, and 100C is an example of the image processing apparatus, each of the acquisition units 110A, 110B, and 110C is an example of the acquisition unit, the conversion unit 120C is an example of the conversion unit, and each of the printing apparatuses 200A and 200B is an example of the printing apparatus.
 (1-4-1. Modification 1)
 First, Modification 1 of Embodiment 1 will be described.
 FIG. 19 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus 100A according to Modification 1 of Embodiment 1.
 The image processing apparatus 100A includes an acquisition unit 110A, the conversion unit 120, and the output unit 130. The image processing apparatus 100A according to Modification 1 differs from the image processing apparatus 100 according to Embodiment 1 in that it acquires the capability information I1 from an input device 300. The rest of the configuration of the image processing apparatus 100A is substantially the same as that of the image processing apparatus 100 according to Embodiment 1, and its detailed description is therefore omitted.
 Like the acquisition unit 110 in the image processing apparatus 100 according to Embodiment 1, the acquisition unit 110A acquires the first still image data D1 and the capability information I1.
 Like the acquisition unit 110 in the image processing apparatus 100 according to Embodiment 1, the acquisition unit 110A may acquire the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100A by wire or wirelessly. The part of the acquisition unit 110A that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
 The acquisition unit 110A may acquire the capability information I1 from, for example, the input device 300 connected to the image processing apparatus 100A by wire or wirelessly. As in Embodiment 1, the capability information I1 acquired by the acquisition unit 110A may be paper information or a print result. The part of the acquisition unit 110A that acquires the capability information I1 may be realized by, for example, the communication IF 105 (see FIG. 14).
 The input device 300 is, for example, an information terminal, such as a smartphone, a tablet terminal, or a PC, with which the user issues a print instruction to the printing apparatus 200. The input device 300 includes an input IF and a communication IF (not shown), and transmits a print instruction including the paper information entered through the input IF to the image processing apparatus 100A via the communication IF. The input IF of the input device 300 may have, for example, the same configuration as the input IF 104 (see FIG. 14). The communication IF of the input device 300 need only be capable of communicating with the image processing apparatus 100A, and may have, for example, the same configuration as the communication IF 105 (see FIG. 14).
 In this way, the image processing apparatus 100A may obtain a print instruction including the capability information I1 via the external input device 300. The image processing apparatus 100A may also acquire, from the input device 300, a print result entered into the input device 300 by the user.
 (1-4-2. Modification 2)
 Next, Modification 2 of Embodiment 1 will be described.
 FIG. 20 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus 100B according to Modification 2 of Embodiment 1.
 The image processing apparatus 100B includes an acquisition unit 110B, the conversion unit 120, and the output unit 130. The image processing apparatus 100B according to Modification 2 differs from the image processing apparatus 100 according to Embodiment 1 in that it acquires the capability information I1 from a printing apparatus 200A. The rest of the configuration of the image processing apparatus 100B is substantially the same as that of the image processing apparatus 100 according to Embodiment 1, and its detailed description is therefore omitted.
 Like the acquisition unit 110 in Embodiment 1 and the acquisition unit 110A in Modification 1, the acquisition unit 110B acquires the first still image data D1 and the capability information I1.
 Like the acquisition unit 110 in the image processing apparatus 100 according to Embodiment 1, the acquisition unit 110B may acquire the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100B by wire or wirelessly. The part of the acquisition unit 110B that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
 The acquisition unit 110B may acquire the capability information I1 from, for example, the printing apparatus 200A connected to the image processing apparatus 100B by wire or wirelessly. As in Embodiment 1 and Modification 1, the capability information I1 acquired by the acquisition unit 110B may be paper information or a print result. The part of the acquisition unit 110B that acquires the capability information I1 may be realized by, for example, the communication IF 105 (see FIG. 14).
 The printing apparatus 200A includes an input IF 201 with which the user enters a print instruction into the printing apparatus 200A. When the printing apparatus 200A receives a print instruction from the user through the input IF 201, it transmits the print instruction to the image processing apparatus 100B via a communication IF (not shown) included in the printing apparatus 200A. The input IF 201 includes, for example, one or more of a touch panel, input buttons, a display, and the like. The communication IF of the printing apparatus 200A need only be capable of communicating with the image processing apparatus 100B, and may have, for example, the same configuration as the communication IF 105 (see FIG. 14).
 In this way, the image processing apparatus 100B may obtain a print instruction including the capability information I1 via the printing apparatus 200A. The image processing apparatus 100B may also acquire, from the printing apparatus 200A, a print result entered into the printing apparatus 200A by the user.
 (1-4-3. Modification 3)
 Next, Modification 3 of Embodiment 1 will be described.
 FIG. 21 is a block diagram schematically showing an example of the functional configuration of the image processing apparatus 100C according to Modification 3 of Embodiment 1.
 The image processing apparatus 100C includes an acquisition unit 110C, a conversion unit 120C, and the output unit 130. The image processing apparatus 100C according to Modification 3 differs from the image processing apparatus 100B according to Modification 2 in that it acquires capability information I2, which differs from the capability information I1, from a printing apparatus 200B. The processing of the conversion unit 120C in the image processing apparatus 100C according to Modification 3 also differs from the processing of the conversion unit 120 in Modification 2. The rest of the configuration of the image processing apparatus 100C is substantially the same as that of the image processing apparatus 100B according to Modification 2, and its detailed description is therefore omitted.
 The acquisition unit 110C acquires the first still image data D1 and the capability information I2.
 Like the acquisition unit 110 in the image processing apparatus 100 according to Embodiment 1, the acquisition unit 110C may acquire the first still image data D1 from, for example, an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100C by wire or wirelessly. The part of the acquisition unit 110C that acquires the first still image data D1 may be realized by, for example, the communication IF 105 (see FIG. 14).
 The acquisition unit 110C acquires the capability information I2 from the printing apparatus 200B connected to the image processing apparatus 100C by wire or wirelessly. Specifically, the acquisition unit 110C acquires, as the capability information I2, a scan image obtained when the paper used for printing by the printing apparatus 200B (for example, a blank sheet on which nothing has yet been printed) is scanned by the scanner 202 of the printing apparatus 200B. The acquisition unit 110C is realized by, for example, the communication IF 105 (see FIG. 14).
 The conversion unit 120C identifies the maximum luminance corresponding to the paper scanned by the scanner 202, based on the luminance (reflection luminance) of the scan image acquired by the acquisition unit 110C, and determines the second dynamic range based on that maximum luminance. That is, the conversion unit 120C converts the first still image data into the second still image data, using as the second dynamic range the luminance range (dynamic range) whose maximum luminance is the reflection luminance identified based on the capability information I2. The conversion unit 120C is realized by, for example, the CPU 101, the main memory 102, the storage 103, and the like (see FIG. 14).
 The conversion unit 120C may change the conversion processing according to the type of the scanner 202. For example, the conversion unit 120C may correct the luminance of the obtained image according to at least one of the brightness of the light source of the scanner 202 and the sensitivity of the image sensor of the scanner 202. For example, the conversion unit 120C may correct the luminance of the image so that the brighter the light source of the scanner 202, the lower the luminance of the obtained image, and so that the higher the sensitivity of the image sensor of the scanner 202, the lower the luminance of the obtained image. In this case, the image processing apparatus 100C may store in the storage 103 correction information that associates, in advance, information representing the type of the scanner 202 with information representing the correction processing for correcting the luminance of the image. The image processing apparatus 100C may then acquire information representing the type of the scanner 202 from the printing apparatus 200B, identify from the correction information the correction processing corresponding to the acquired scanner type, and perform the identified correction processing, thereby carrying out conversion processing appropriate to the type of the scanner 202. In this modification, the correction information need not be stored in the storage 103; the image processing apparatus 100C may acquire the correction information from an external information processing apparatus via the communication IF 105 (see FIG. 14).
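 The scan-based estimation and scanner-dependent correction described above can be sketched as follows; the correction table, the percentile used to find the paper-white level, and the conversion factor from scanner pixel levels to luminance are all assumptions for illustration, since an actual device would rely on its own calibration data.

```python
import numpy as np

# Hypothetical correction factors per scanner type: a brighter light source or a
# more sensitive image sensor yields higher pixel values, so the estimate is
# scaled down, as described above.
SCANNER_CORRECTION = {
    "scanner_type_a": 1.00,
    "scanner_type_b": 0.85,
}

def reflection_luminance_from_scan(scan_image: np.ndarray,
                                   scanner_type: str,
                                   luminance_per_level: float = 0.55) -> float:
    """Estimate the paper-white reflection luminance (cd/m^2) from a scan of the
    blank paper used for printing, corrected for the scanner type."""
    white_level = float(np.percentile(scan_image, 99))   # near-white pixel level
    correction = SCANNER_CORRECTION.get(scanner_type, 1.0)
    return white_level * luminance_per_level * correction
```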
 The printing apparatus 200B includes the scanner 202. The printing apparatus 200B can obtain a scan image by scanning, with the scanner 202, the paper used for printing by the printing apparatus 200B (for example, a blank sheet on which nothing has yet been printed). The printing apparatus 200B transmits the obtained scan image, as the capability information I2, to the image processing apparatus 100C via a communication IF (not shown) included in the printing apparatus 200B.
 For example, the image processing apparatus 100C may include a display and may show a scan instruction on that display. The scan instruction may be a message, an image, a UI, or the like that prompts the user to scan, with the scanner 202 of the printing apparatus 200B connected to the image processing apparatus 100C, the paper used for printing by the printing apparatus 200B. In this case, the image processing apparatus 100C may determine whether the printing apparatus 200B includes the scanner 202 by communicating with the connected printing apparatus 200B by wire or wirelessly. The image processing apparatus 100C may also display the scan instruction prompting the user to scan the paper and, when the paper is scanned in the printing apparatus 200B and a scan image of the paper is obtained, control the printing apparatus 200B so that it transmits the obtained scan image to the image processing apparatus 100C.
 As described above, in the image processing apparatus of this modification, the acquisition unit may acquire, as the capability information, a scan image obtained by scanning the paper used for printing by the printing apparatus. The conversion unit may identify, based on the luminance of the scan image acquired by the acquisition unit, the reflection luminance of light reflected by the paper when the paper is illuminated, and may convert the first still image data into the second still image data, using as the second dynamic range the luminance range whose maximum luminance is the identified reflection luminance.
 The capability information I2 is an example of the capability information.
 For example, in the example shown in Modification 3, in the image processing apparatus 100C, the acquisition unit 110C acquires, as the capability information I2, a scan image obtained by scanning the paper used for printing by the printing apparatus 200B. The conversion unit 120C identifies, based on the luminance of the scan image acquired by the acquisition unit 110C, the reflection luminance of light reflected by the paper when the paper is illuminated, and converts the first still image data D1 into the second still image data D2, using as the second dynamic range the luminance range whose maximum luminance is the identified reflection luminance.
 In the image processing apparatus 100C configured in this way, a scan image is obtained by scanning, with the scanner 202, the paper used for printing by the printing apparatus 200B, and the second dynamic range can be determined according to the obtained scan image. The image processing apparatus 100C can then convert the first still image data D1 into the second still image data D2 defined by that second dynamic range and output the second still image data D2 to the printing apparatus 200B. In other words, the image processing apparatus 100C can easily estimate the expressive capability of the paper used for printing by the printing apparatus 200B (that is, the luminance range whose maximum luminance is the luminance represented by the white of the paper), and can easily determine the second dynamic range by adopting a luminance range (dynamic range) suited to that expressive capability as the second dynamic range. The image processing apparatus 100C can therefore cause the printing apparatus 200B to print an image based on the first still image data D1 with high quality.
 Note that the image processing apparatus 100C may also acquire, from the printing apparatus 200B, a print result entered into the printing apparatus 200B by the user.
 (1-4-4. Modification 4)
 Next, Modification 4 of Embodiment 1 will be described.
 In Embodiment 1 (and Modifications 1 to 3 of Embodiment 1), a configuration example was described in which the image processing apparatus 100 (100A, 100B, or 100C) is an independent apparatus separate from the printing apparatus 200 (200A or 200B), but the present disclosure is in no way limited to this configuration. For example, the printing apparatus 200 may be incorporated into the image processing apparatus 100 or the image processing apparatus 100A, the printing apparatus 200A may be incorporated into the image processing apparatus 100B, and the printing apparatus 200B may be incorporated into the image processing apparatus 100C. In this case, the communication IF 105 for communicating with the printing apparatus 200 (200A or 200B) may be omitted from the image processing apparatus 100 (100A, 100B, or 100C).
 (1-4-5. Modification 5)
 Next, Modification 5 of Embodiment 1 will be described.
 In Embodiment 1 and Modifications 1 to 4, configuration examples were described in which the conversion unit 120 (120C) converts the first still image data D1 into the second still image data D2, but the present disclosure is in no way limited to this configuration.
 When the maximum luminance of the first still image data D1 is lower than the reflection luminance corresponding to the paper (the reflection luminance identified from the paper information, or the reflection luminance obtained from the scan image), the conversion unit 120 (120C) need not convert the first still image data D1 into the second still image data D2.
 In this case, the acquisition unit 110 (110A, 110B, or 110C) acquires the first still image data D1 and additionally acquires third still image data D3 defined by a third dynamic range whose luminance range (dynamic range) is narrower than the first dynamic range (see FIG. 22). The third dynamic range is, for example, SDR (Standard Dynamic Range).
 Note that the acquisition unit 110 (110A, 110B, or 110C) may acquire the third still image data D3 from an imaging device, an information terminal, or a storage device connected to the image processing apparatus 100 (100A, 100B, or 100C) by wire or wirelessly, in the same way as when acquiring the first still image data D1. For example, if the imaging device that captures a subject and generates the first still image data D1 (for example, the HDR imaging device 10 shown in FIG. 9) can also generate the third still image data D3 together with the first still image data D1 when capturing the subject, the acquisition unit 110 (110A, 110B, or 110C) can acquire the third still image data D3 from that imaging device.
 The output unit 130 then outputs the third still image data D3 acquired by the acquisition unit 110 (110A, 110B, or 110C) to the printing apparatus 200 (200A or 200B).
 That is, in the image processing apparatus according to Modification 5, the acquisition unit may acquire the first still image data and additionally acquire third still image data defined by a third dynamic range whose luminance range is narrower than the first dynamic range. When the maximum luminance of the first still image data is lower than the reflection luminance corresponding to the paper, the conversion unit may refrain from converting the first still image data into the second still image data, and the output unit may output the third still image data acquired by the acquisition unit to the printing apparatus.
 The third still image data D3 is an example of the third still image data.
 For example, in the example shown in Modification 5, in the image processing apparatus 100 (100A, 100B, or 100C), the acquisition unit 110 (110A, 110B, or 110C) acquires the first still image data D1 and additionally acquires the third still image data D3 defined by the third dynamic range whose luminance range is narrower than the first dynamic range. When the maximum luminance of the first still image data D1 is lower than the reflection luminance corresponding to the paper, the conversion unit 120 (120C) does not convert the first still image data D1 into the second still image data D2, and the output unit 130 outputs the third still image data D3 acquired by the acquisition unit 110 (110A, 110B, or 110C) to the printing apparatus 200 (200A or 200B).
 The image processing apparatus 100 (100A, 100B, or 100C) configured in this way can, when the maximum luminance of the HDR first still image data D1 is lower than the reflection luminance corresponding to the paper, output the SDR third still image data D3 to the printing apparatus 200 (200A or 200B) and have the printing apparatus 200 (200A or 200B) print an image based on the third still image data D3.
 For example, when the maximum luminance of the first still image data D1 is lower than the reflection luminance corresponding to the paper used for printing by the printing apparatus 200 (200A or 200B), the printing apparatus 200 (200A or 200B) may be unable to print a high-quality image even if the first still image data D1 is converted into the second still image data D2. In such a case, the image processing apparatus 100 (100A, 100B, or 100C) does not convert the first still image data D1 into the second still image data D2, but outputs the SDR third still image data D3 to the printing apparatus 200 (200A or 200B). According to the configuration example shown in Modification 5, the load of the conversion processing on the image processing apparatus 100 (100A, 100B, or 100C) can therefore be reduced.
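 Modification 5 amounts to the following decision, shown here as a sketch that reuses hypothetical helpers of the kind assumed in the earlier examples; it is not the literal implementation of this disclosure.

```python
def select_output(d1, d3, paper_white, max_luminance_of, convert, send_to_printer):
    """If D1 already fits within the paper's reflection luminance, skip the
    conversion and output the SDR data D3; otherwise convert D1 to D2."""
    if max_luminance_of(d1) < paper_white:
        send_to_printer(d3)                        # no conversion: output D3
    else:
        send_to_printer(convert(d1, paper_white))  # output second still image data D2
```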
 (1-5. Summary)
 FIG. 22 is a schematic diagram for describing an implementation example of the image processing apparatus 100 (100A, 100B, or 100C) according to Embodiment 1 (or Modifications 1 to 5).
 As shown in FIG. 22, the image processing apparatus 100 (100A, 100B, or 100C) acquires the first still image data D1, which is an HDR image, and outputs it as-is to the HDR display device 30. Since the HDR display device 30 supports HDR, it can display the first still image data D1 with high quality.
 The image processing apparatus 100 (100A, 100B, or 100C) also acquires the first still image data D1 and the capability information I1 (I2) of the SDR printing apparatus 50. The image processing apparatus 100 (100A, 100B, or 100C) then converts the first still image data D1 into the second still image data D2 based on the acquired capability information I1 (I2), and outputs the converted second still image data D2 to the SDR printing apparatus 50. The SDR printing apparatus 50 cannot print the first still image data D1 as it is. However, because the SDR printing apparatus 50 supports the second still image data D2 converted from the first still image data D1 by the image processing apparatus 100 (100A, 100B, or 100C), it can print an image based on the second still image data D2 on paper. Moreover, since the second still image data D2 is defined by a wider luminance range (dynamic range) than the third still image data D3, which is an SDR image, an image of higher quality than the SDR image based on the third still image data D3 can be printed on the paper.
 In addition, when the maximum luminance of the first still image data D1 is lower than the reflection luminance corresponding to the paper, the image processing apparatus 100 (100A, 100B, or 100C) does not convert the first still image data D1 into the second still image data D2, but outputs the third still image data D3 to the SDR printing apparatus 50. Since the SDR printing apparatus 50 supports the third still image data D3, which is an SDR image, it can print an image based on the third still image data D3 as it is.
 (1-6. Examples of communication protocols)
 Next, examples of communication protocols in the present embodiment (a first to a fourth example) will be described with reference to FIGS. 23 to 26. In the following description, components that are substantially identical to one another are given the same reference numerals, and redundant description is omitted.
 FIG. 23 is a diagram schematically showing a first example of a communication protocol, between a display device 400 and a printing device 500, in the embodiment.
 The display device 400 is a device having an image display function or an image playback function, such as a television set, a video recorder, a video player, or a digital camera. The printing device 500 is a device having a printing function, such as a printer.
 The communication protocol stack of the display device 400 comprises, from the bottom up, a USB physical layer 440 as the physical layer, a PTP Transport layer 430 as the transport layer, a DPS Layer 420 as the conversion layer, and a DPS application layer 410 as the application layer. The communication protocol stack of the printing device 500 comprises, from the bottom up, a USB physical layer 540 as the physical layer, a PTP Transport layer 530 as the transport layer, a DPS Layer 520 as the conversion layer, and a DPS application layer 510 as the application layer. Although FIG. 23 shows a configuration example in which the USB physical layers 440 and 540 are used as the physical layer and the display device 400 and the printing device 500 are connected via USB, the display device 400 and the printing device 500 may instead be connected by wireless communication (for example, Wi-Fi).
 FIG. 23 shows a configuration example in which the Picture Transfer Protocol (PTP) defined in ISO 15740 is used as the transport layer. The conversion layer defines an I/F (interface) for the application layer and converts input and output from the application layer into the PTP protocol. In the DPS layer there is a DPS Discovery function, which negotiates whether the display device 400 and the printing device 500 have mutually compatible functions: the DPS Layer 420 contains DPS Discovery 421, and the DPS Layer 520 contains DPS Discovery 521.
 When the display device 400 and the printing device 500 are connected via USB, a PTP interconnection is established upon connection of the USB cable to the display device 400 and the printing device 500, after which the display device 400 and the printing device 500 mutually confirm their connection partner via DPS Discovery 421 and 521. The display device 400, which holds the still image to be printed, then provides files, as Storage Server 412, to the printing device 500 acting as Storage Client 511. The printing device 500, as Print Server 512, accepts requests from the display device 400 acting as Print Client 411. The Print Client 411 of the display device 400 queries the Print Server 512 about the printer's capabilities and, as appropriate, displays the result of the query on the UI (User Interface) of the display device 400.
 When a list of still images is displayed on the display device 400 and the user selects from that list a still image to be printed, thereby issuing a print instruction to the display device 400, the display device 400 requests the Print Server 512 to print the specified still image. The printing device 500, as Storage Client 511, requests from the Storage Server 412 of the display device 400 the file corresponding to the still image for which printing was instructed. In response to that request, the display device 400 delivers the held still image to the printing device 500 as a file.
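 The exchange described above can be summarized, purely as an illustration of the message order and not as the actual PictBridge/DPS API, by the following sketch; every object and method name here is a hypothetical placeholder.

```python
def dps_print_session(display, printer, selected_image_id):
    """Illustrative message order once the PTP interconnection is established."""
    display.dps_discovery(printer)             # both sides confirm DPS support
    capability = display.print_client.query_capability(printer.print_server)
    display.show_on_ui(capability)             # reflect printer capability on the UI
    display.print_client.request_print(printer.print_server, selected_image_id)
    image_file = printer.storage_client.request_file(display.storage_server,
                                                     selected_image_id)
    printer.print_server.print_file(image_file)
```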
 FIG. 24 is a diagram schematically showing a second example of a communication protocol, between a display device 400A and a printing device 500A, in the embodiment.
 As in the first example, the display device 400A is a device having an image display function or an image playback function, such as a television set, a video recorder, a video player, or a digital camera, and the printing device 500A is a device having a printing function, such as a printer.
 The communication protocol stack of the display device 400A comprises, from the bottom up, a TCP/IP layer 440A, a UPnP layer 430A, a Control Middle layer 420A, and an application layer 410A. The communication protocol stack of the printing device 500A comprises, from the bottom up, a TCP/IP layer 540A, a UPnP layer 530A, a Control Middle layer 520A, and an application layer 510A.
 The communication protocols of the display device 400A and the printing device 500A use Wi-Fi as the physical layer (not shown) and adopt the TCP/IP layers 440A and 540A as the transport layer above it. On top of the TCP/IP layers 440A and 540A, the display device 400A and the printing device 500A use UPnP as the protocol for discovering each other as connection partners. After the display device 400A and the printing device 500A have recognized each other through the functions of the UPnP layers 430A and 530A, the actual print data is exchanged between the display device 400A and the printing device 500A with the help of the Control Middle layers 420A and 520A, which control print jobs and the like. Note that each printer model has its own print commands; on a PC (Personal Computer), a driver can absorb the differences in print commands between printers, but in a display device 400A such as a television set or a video recorder it is difficult to adopt such a driver-installation mechanism. The display device 400A may therefore use a general-purpose page description language for printing.
 An example of such a language is XHTML-print, specified by the W3C. In printing using XHTML-print, the display device 400A creates a print instruction based on the XHTML-print content 411A and sends the created print instruction to the printing device 500A. In the printing device 500A, based on the print instruction described in XHTML-print, the XHTML rendering processing unit 511A lays out and rasterizes the image files, text strings, and so on, and generates the data that will actually be printed. The print processing unit 512A prints the data thus obtained.
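 As a rough illustration of the idea of a device-independent page description, the display side could assemble a print instruction along the following lines and hand it to the printer, which then performs layout and rasterization. The snippet below is not a conformant XHTML-Print document; it only shows the general shape of such an instruction, and the function name is a placeholder.

```python
def build_print_instruction(image_file_name: str, caption: str) -> str:
    """Build a minimal XHTML-like print instruction; illustrative only."""
    return (
        "<html xmlns=\"http://www.w3.org/1999/xhtml\">"
        "<head><title>print job</title></head>"
        f"<body><img src=\"{image_file_name}\" alt=\"photo\"/><p>{caption}</p></body>"
        "</html>"
    )
```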
 FIG. 25 is a diagram showing a third example of a communication protocol, between a display device 400B and a printing device 500B, in the embodiment.
 As in the first and second examples, the display device 400B is a device having an image display function or an image playback function, such as a television set, a video recorder, a video player, or a digital camera, and the printing device 500B is a device having a printing function, such as a printer. Note that, when printing the first still image data D1, the printing device 500B does not support HDR and supports only SDR.
 The communication protocol stack of the display device 400B comprises, from the bottom up, a Wi-Fi layer 440B, the PTP Transport layer 430, the DPS Layer 420, and a DPS application layer 410B. The communication protocol stack of the printing device 500B comprises, from the bottom up, a Wi-Fi layer 540B, the PTP Transport layer 530, the DPS Layer 520, and the DPS application layer 510.
 The communication protocol between the display device 400B and the printing device 500B shown in the third example is based on the PTP-based communication protocol shown in the first example. The Print Client 411 of the display device 400B learns the functions of the printing device 500B, its connection partner, by communicating with the Print Server 512 of the printing device 500B.
 For example, if the printing device 500B is a device that can handle only SDR still image data such as 8-bit JPG, the display device 400B cannot pass the HDR still image data it holds to the printing device 500B as it is. In such a case, the display device 400B adjusts the HDR still image data it holds with the Grading Module 413B to match the functions of the printing device 500B, producing an 8-bit JPG file. That is, the display device 400B creates an 8-bit JPG file from the HDR still image data in accordance with the functions of the printing device 500B.
 As the response to a request from the Storage Client 511 of the printing device 500B, the display device 400B provides the printing device 500B with the 8-bit JPG file obtained from the HDR still image data. In other words, the display device 400B shown in the third example is configured to include one of the image processing apparatuses 100 and 100A to 100C described in Embodiment 1 and Modifications 1 to 5.
 In the example described above, the display device 400B creates the still image data of the 8-bit JPG file to match the functions of the printing device 500B, but the present disclosure is not limited to this configuration. The display device 400B may create in advance an 8-bit JPG file corresponding to the HDR still image file. For example, in order to support printing devices that accept only 8-bit JPG files, it is desirable for a digital camera to generate an image obtained by imaging in both formats: the HDR-compliant first still image data D1 and an 8-bit JPG file (for example, the third still image data D3).
 ただし、sRGBおよびBT.709等の標準的な表示環境が規定されている表示装置400Bとは異なり、印刷装置500Bでは、印刷に使用する用紙およびインクの種類等によって印刷の品質に差が生じる可能性がある。そのため、表示装置400Bは、印刷装置500Bの設定や能力に応じて、HDR静止画ファイルから8bits JPGファイルを作成する際のグレーディングの方法を変更した方がよい場合がある。したがって、表示装置400Bは、印刷装置500Bの設定や能力に応じて、都度、8bits JPGファイルを作成してもよい。 However, unlike the display device 400B, for which standard display environments such as sRGB and BT.709 are defined, the print quality of the printing device 500B may vary depending on the type of paper and ink used for printing. For this reason, it may be preferable for the display device 400B to change the grading method used when creating the 8-bit JPG file from the HDR still image file in accordance with the settings and capabilities of the printing device 500B. Accordingly, the display device 400B may create an 8-bit JPG file each time, according to the settings and capabilities of the printing device 500B.
 なお、印刷装置500Bは、印刷装置200(200A、または200B)であってもよい。 The printing apparatus 500B may be the printing apparatus 200 (200A or 200B).
 図26は、実施の形態における表示装置400Cと印刷装置500Cとの間の通信プロトコルの第4の例を模式的に示す図である。 FIG. 26 is a diagram schematically showing a fourth example of a communication protocol between the display device 400C and the printing device 500C in the embodiment.
 第4の例において、表示装置400Cは、第3の例と同様に、例えば、テレビジョンセット、ビデオレコーダ、ビデオプレーヤ、またはデジタルカメラ、等の画像表示機能または画像再生機能を有する装置である。また、印刷装置500Cは、第3の例と同様に、プリンタ等の印刷機能を有する装置であり、第1静止画データD1を印刷する際に、HDRには対応しておらずSDRにしか対応していない装置である。 In the fourth example, as in the third example, the display device 400C is a device having an image display function or an image reproduction function, such as a television set, a video recorder, a video player, or a digital camera. The printing device 500C, as in the third example, is a device having a printing function such as a printer, and, when printing the first still image data D1, it supports only SDR and does not support HDR.
 表示装置400Cの通信プロトコルは、下位から順に、Wi-Fi層440B、PTP Transport層430、DPS Layer420、および、DPS application層410を備えて構成される。印刷装置500Cの通信プロトコルは、下位から順に、Wi-Fi層540B、PTP Transport層530、DPS Layer520、および、DPS application層510Cを備えて構成される。 The communication protocol of the display device 400C is configured to include a Wi-Fi layer 440B, a PTP Transport layer 430, a DPS Layer 420, and a DPS application layer 410 in order from the lower side. The communication protocol of the printing apparatus 500C is configured to include a Wi-Fi layer 540B, a PTP Transport layer 530, a DPS Layer 520, and a DPS application layer 510C in order from the lower side.
 第4の例に示す表示装置400Cと印刷装置500Cとの間の通信プロトコルは、第3の例に示した構成と比較して、第3の例において表示装置400BのGrading Module413Bが行っていた処理を、印刷装置500CのGrading Module513Cが行う点が異なる。つまり、第4の例に示す印刷装置500Cは、実施の形態1および変形例1~5において説明した画像処理装置100、100A~100Cのいずれかを含む構成である。 The communication protocol between the display device 400C and the printing device 500C in the fourth example differs from the configuration of the third example in that the processing performed by the Grading Module 413B of the display device 400B in the third example is instead performed by the Grading Module 513C of the printing device 500C. That is, the printing device 500C in the fourth example includes any of the image processing devices 100 and 100A to 100C described in Embodiment 1 and Modifications 1 to 5.
 表示装置400CのStorage Server412は、印刷装置500CのStorage Client511からの要求に対する応答として、HDR静止画データを、そのまま印刷装置500Cに提供する。印刷装置500Cでは、受信したHDR静止画データを、印刷装置500Cで使用する用紙およびインクの種類、および印刷品質に関する設定、等に応じて、Grading Module513Cが適切にグレーディングして印刷する。 The Storage Server 412 of the display device 400C provides the HDR still image data as it is to the printing device 500C as a response to a request from the Storage Client 511 of the printing device 500C. In the printing device 500C, the Grading Module 513C appropriately grades the received HDR still image data according to the type of paper and ink used by the printing device 500C, the print quality settings, and the like, and then prints it.
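 As a sketch of how such a printer-side grading step might select its parameters from the printer's own paper, ink, and quality settings before applying a tone map (the per-paper peak-luminance values and parameter names below are assumptions, not data from this disclosure; grade_hdr_to_sdr8 is from the earlier sketch):

    # Illustrative only: the peak-luminance table and parameter names are assumptions.
    PAPER_PEAK_NITS = {          # approximate reflected white under typical lighting
        "glossy_photo": 120.0,
        "matte_photo": 90.0,
        "plain": 70.0,
    }

    def grade_for_printer(linear_nits, paper="plain", quality="standard"):
        sdr_peak = PAPER_PEAK_NITS.get(paper, 70.0)
        # A high-quality mode could afford a gentler highlight roll-off.
        knee = 0.9 if quality == "high" else 0.8
        return grade_hdr_to_sdr8(linear_nits, sdr_peak=sdr_peak, knee=knee)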
 なお、印刷装置500Cは、印刷装置200(200A、または200B)であってもよい。 The printing apparatus 500C may be the printing apparatus 200 (200A or 200B).
 (他の実施の形態)
 以上のように、本出願において開示する技術の例示として、実施の形態1および変形例1~5を説明した。しかしながら、本開示における技術は、これに限定されず、変更、置き換え、付加、省略等を行った実施の形態にも適用できる。また、上記実施の形態1および変形例1~5で説明した各構成要素を組み合わせて、新たな実施の形態とすることも可能である。
(Other embodiments)
 Embodiment 1 and Modifications 1 to 5 have been described above as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to these, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like have been made. It is also possible to combine the constituent elements described in Embodiment 1 and Modifications 1 to 5 into a new embodiment.
 そこで、以下、他の実施の形態を例示する。 Therefore, other embodiments will be exemplified below.
 第1静止画データD1で使用されている輝度情報に関しては、絶対的な輝度値が用いられているとしたが、第2静止画データD2が生成される際に、絶対的な輝度値に代えて、撮像された写真の明暗部各部のSTOP数から算出された輝度が用いられてもよい。つまり、変換部120(120C)は、第1静止画データD1の絶対輝度を用いずに、相対輝度を用いて、第1静止画データD1を第2静止画データD2に変換してもよい。 Regarding the luminance information used in the first still image data D1, it has been assumed that absolute luminance values are used; however, when the second still image data D2 is generated, luminance calculated from the number of STOPs of each of the bright and dark portions of the captured photograph may be used instead of the absolute luminance values. That is, the conversion unit 120 (120C) may convert the first still image data D1 into the second still image data D2 using relative luminance rather than the absolute luminance of the first still image data D1.
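 For example, under the assumption that the STOP number expresses exposure stops relative to a chosen reference (here, 0 stops is taken as reference white), relative luminance can be derived as a power of two of the stop count, as in the following sketch:

    # Sketch: relative luminance from exposure stops. Taking 0 stops as the
    # reference white is an assumption for illustration.
    def relative_luminance_from_stops(stops):
        # +1 stop doubles the luminance, -1 stop halves it.
        return 2.0 ** stops

    # Example: a highlight measured at +2 stops has relative luminance 4.0,
    # independent of the absolute value in cd/m^2.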
 実施の形態では、印刷能力を、用紙単体の反射率等で示される能力として説明した。しかし、印刷能力を、印刷装置の印刷特性を加えた加算特性としてもよい。つまり、印刷能力には、印刷装置のインクの特性、およびインクの吐出の特性、等も考慮されてもよい。そして、第1静止画データD1の第1のダイナミックレンジと、用紙の種類、光源の種類および明るさ、およびインクの特性等により表現可能な輝度範囲(ダイナミックレンジ)と、が考慮されて、第1静止画データD1が第2静止画データD2に変換されてもよい。 In the embodiment, the printing capability has been described as a capability represented by, for example, the reflectance of the paper alone. However, the printing capability may be a combined characteristic that also includes the printing characteristics of the printing apparatus. That is, the printing capability may also take into consideration the characteristics of the ink of the printing apparatus, the ink ejection characteristics, and the like. The first still image data D1 may then be converted into the second still image data D2 in consideration of the first dynamic range of the first still image data D1 and the luminance range (dynamic range) that can be reproduced given the type of paper, the type and brightness of the light source, the ink characteristics, and so on.
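 As a worked sketch of how such a combined capability could bound the second dynamic range, the peak luminance of a print can be estimated from the paper reflectance and the viewing illuminance, here using the Lambertian approximation L = E x rho / pi together with an assumed ink-dependent loss factor (the numbers are illustrative):

    import math

    # Sketch: estimate the peak luminance a print can reproduce. The Lambertian
    # approximation and the ink_factor are simplifying assumptions.
    def printable_peak_luminance(paper_reflectance=0.9, illuminance_lux=500.0,
                                 ink_factor=0.95):
        white = illuminance_lux * paper_reflectance / math.pi  # cd/m^2 of bare paper
        return white * ink_factor                              # small loss from ink/coating

    # Example: reflectance 0.9 under 500 lux gives about 143 cd/m^2 of paper white,
    # so the second dynamic range would top out near 136 cd/m^2 in this sketch.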
 変換部120によって信号の変換処理が行われる代わりに、信号の変換処理と同等の効果を得られるように、印刷装置のインク吐出量が制御されてもよい。つまり、第2静止画データD2を生成する代わりに、第2静止画データD2を印刷した場合と同等の印刷物が印刷装置により出力されるように、印刷装置のインク吐出量が制御されてもよい。 Instead of the signal conversion process being performed by the conversion unit 120, the ink ejection amount of the printing apparatus may be controlled so as to obtain an effect equivalent to the signal conversion process. That is, instead of generating the second still image data D2, the ink ejection amount of the printing apparatus may be controlled such that the printing apparatus outputs printed matter equivalent to that which would be obtained by printing the second still image data D2.
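 As a sketch of this alternative, a single-ink Murray-Davies reflectance model (an illustrative assumption; this disclosure does not specify how the ink amount is computed) can be inverted to choose the ink coverage per pixel directly from the target relative luminance:

    # Sketch: choose ink coverage so that printed reflectance matches the target
    # relative luminance, instead of converting the image signal first.
    # Single-ink Murray-Davies model; an assumption for illustration.
    def ink_coverage_for_target(target_rel_luminance, r_paper=0.9, r_ink=0.05):
        # Target reflectance, scaled so that 1.0 maps to bare paper white.
        r_target = max(min(target_rel_luminance, 1.0), 0.0) * r_paper
        # Murray-Davies: R = c * R_ink + (1 - c) * R_paper  ->  solve for c.
        c = (r_paper - r_target) / (r_paper - r_ink)
        return max(min(c, 1.0), 0.0)  # fraction of the pixel area covered by ink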
 実施の形態1および変形例1~5において、各構成要素は、専用のハードウェア(例えば、半導体集積回路を含む電子回路)で構成されてもよく、あるいは、各構成要素に適したソフトウェアプログラムをプロセッサが実行することによって実現されてもよい。各構成要素は、CPU(Central Processing Unit)またはプロセッサ等のプログラム実行部が、ハードディスクまたは半導体メモリ等の記録媒体に記録されたソフトウェアプログラムを読み出して実行することによって実現されてもよい。 In Embodiment 1 and Modifications 1 to 5, each component may be configured by dedicated hardware (for example, an electronic circuit including a semiconductor integrated circuit), or may be realized by a processor executing a software program suitable for the component. Each component may also be realized by a program execution unit such as a CPU (Central Processing Unit) or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
 また、各実施の形態で図面に示したブロック図における機能ブロックの分割は、単なる一例に過ぎない。例えば、複数の機能ブロックが1つの機能ブロックとして実現されたり、あるいは、1つの機能ブロックが複数に分割されたり、または、一部の機能が他の機能ブロックに移されたりしてもよい。また、複数の機能ブロックの機能を、単一のハードウェアまたはソフトウェアが、並列に処理してもよく、または時分割に処理してもよい。また、複数の機能ブロックの一部の機能がハードウェアで実現され、残りの機能がソフトウェアで実現されてもよい。 The division of functional blocks in the block diagrams shown in the drawings of the embodiments is merely an example. For example, a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality of blocks, or some functions may be transferred to another functional block. The functions of a plurality of functional blocks may be processed by a single piece of hardware or software in parallel or in a time-division manner. Some functions of the plurality of functional blocks may be realized by hardware, and the remaining functions may be realized by software.
 また、実施の形態で図面に示したフローチャートにおける各ステップが実行される順序は、単なる一例に過ぎず、実施の形態で説明した順序とは異なる順序で各ステップが実行されてもよい。また、上記ステップの一部が、他のステップと同時に(すなわち、並列に)実行されてもよい。 The order in which the steps in the flowcharts shown in the drawings of the embodiments are executed is merely an example, and the steps may be executed in an order different from the order described in the embodiments. Some of the above steps may also be executed simultaneously with (that is, in parallel with) other steps.
 ここで、上記各実施の形態の画像処理方法を実現するソフトウェアは、次のようなプログラムである。 Here, software for realizing the image processing method according to each of the above-described embodiments is the following program.
 すなわち、このプログラムは、コンピュータに、撮像により得られ輝度範囲が第1のダイナミックレンジで定義された第1静止画データを取得し、印刷装置の印刷能力を示す能力情報を取得し、取得した第1静止画データを、取得した能力情報が示す印刷能力に応じて、第1のダイナミックレンジよりも輝度範囲が狭い第2のダイナミックレンジで定義された第2静止画データに変換し、変換した第2静止画データを印刷装置に出力する画像処理方法を実行させる。 That is, this program causes a computer to execute an image processing method that acquires first still image data obtained by imaging and having a luminance range defined by a first dynamic range, acquires capability information indicating the printing capability of a printing apparatus, converts the acquired first still image data into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range in accordance with the printing capability indicated by the acquired capability information, and outputs the converted second still image data to the printing apparatus.
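 Put together, a minimal end-to-end sketch of this flow might look as follows; query_printer_capability and grade_for_printer are the hypothetical helpers from the earlier sketches, and print_server.submit() is likewise an assumed output call:

    # Minimal sketch of the flow: acquire D1 and the capability information,
    # convert to D2 according to that capability, then output D2 to the printer.
    def process_and_print(hdr_linear_nits, print_server):
        capability = query_printer_capability(print_server)   # capability information
        if capability["accepts_hdr"]:
            print_server.submit(hdr_linear_nits)               # no conversion needed
            return
        # Second still image data D2: narrower dynamic range matched to the printer
        # (first advertised paper type used here for simplicity).
        paper = capability["paper_types"][0]
        d2 = grade_for_printer(hdr_linear_nits, paper=paper)
        print_server.submit(d2)                                # hypothetical output call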
 また、上記画像処理方法と、当該画像処理方法をコンピュータに実行させるコンピュータプログラムおよびそのプログラムを記録したコンピュータ読み取り可能な記録媒体は、本開示の範囲に含まれる。ここで、コンピュータ読み取り可能な記録媒体としては、例えば、フレキシブルディスク、ハードディスク、CD-ROM、MO、DVD、DVD-ROM、DVD-RAM、BD(Blu-ray(登録商標) Disc)、半導体メモリ、等を挙げることができる。コンピュータプログラムは、上記記録媒体に記録されたものに限定されず、電気通信回線、無線または有線通信回線、インターネット等のネットワーク、等を経由して伝送されてもよい。 The above image processing method, a computer program that causes a computer to execute the image processing method, and a computer-readable recording medium on which the program is recorded are all included in the scope of the present disclosure. Examples of such a computer-readable recording medium include a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray (registered trademark) Disc), and a semiconductor memory. The computer program is not limited to one recorded on such a recording medium, and may be transmitted via a telecommunication line, a wireless or wired communication line, a network such as the Internet, or the like.
 また、上記の各装置を構成する構成要素の一部または全部は、各装置に脱着可能なICカードまたは単体のモジュールから構成されてもよい。 Further, some or all of the components constituting each of the above-described devices may be configured from an IC card or a single module that can be attached to or detached from each device.
 また、上記の各装置を構成する構成要素の一部または全部は、1つのLSI(Large Scale Integration:大規模集積回路)で構成されてもよい。 In addition, some or all of the components constituting each of the above-described devices may be configured by one LSI (Large Scale Integration).
 また、各処理部は、LSIまたはICに限定されるものではなく、専用回路または汎用プロセッサで実現されてもよい。あるいは、回路構成をプログラムすることが可能なFPGA(Field Programmable Gate Array)、または、LSI内部の回路セルの接続や設定を再構成可能なリコンフィギュラブル・プロセッサで実現されてもよい。 Each processing unit is not limited to an LSI or an IC, and may be realized by a dedicated circuit or a general-purpose processor. Alternatively, it may be realized by an FPGA (Field Programmable Gate Array) whose circuit configuration can be programmed, or by a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured.
 また、上記のプログラムを、記録媒体に記録して頒布または流通させてもよい。例えば、頒布されたプログラムを装置類にインストールして、装置類のプロセッサに実行させることで、装置類に各種処理を行わせることが可能となる。 The above program may be recorded on a recording medium and distributed or circulated. For example, by installing the distributed program in a device and causing the processor of the device to execute it, the device can be made to perform various kinds of processing.
 また、本開示におけるコンピュータプログラムまたはデジタル信号を、電気通信回線、無線または有線通信回線、インターネット等のネットワーク、データ放送、等を経由して伝送してもよい。 In addition, the computer program or the digital signal in the present disclosure may be transmitted via a telecommunication line, a wireless or wired communication line, a network such as the Internet, data broadcasting, and the like.
 また、本開示は、プログラムまたはデジタル信号を記録媒体に記録して移送することにより、またはプログラムまたはデジタル信号を、ネットワーク等を経由して移送することにより、独立した他のコンピュータシステムにより実施してもよい。 The present disclosure may also be implemented by another independent computer system by recording the program or digital signal on a recording medium and transferring it, or by transferring the program or digital signal via a network or the like.
 また、実施の形態において、各処理(各機能)は、単一の装置(システム)によって集中処理されることによって実現されてもよく、あるいは、複数の装置によって分散処理されることによって実現されてもよい。 In the embodiments, each process (each function) may be realized by centralized processing by a single device (system), or by distributed processing by a plurality of devices.
 以上のように、本開示における技術の例示として、実施の形態を説明した。そのために、添付図面および詳細な説明を提供した。 As described above, the embodiments have been presented as examples of the technology in the present disclosure, and the accompanying drawings and detailed description have been provided for that purpose.
 したがって、添付図面および詳細な説明に記載された構成要素の中には、課題解決のために必須な構成要素だけでなく、上記技術を例示するために、課題解決のためには必須でない構成要素も含まれ得る。そのため、それらの必須ではない構成要素が添付図面や詳細な説明に記載されていることをもって、直ちに、それらの必須ではない構成要素が必須であるとの認定をするべきではない。 Therefore, the components described in the accompanying drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential for solving the problem and are included merely to illustrate the above technology. Accordingly, the mere fact that such non-essential components are described in the accompanying drawings and the detailed description should not be taken to mean that they are essential.
 また、上述の実施の形態は、本開示における技術を例示するためのものであるから、特許請求の範囲またはその均等の範囲において種々の変更、置き換え、付加、省略などを行うことができる。 In addition, since the above-described embodiment is for illustrating the technique in the present disclosure, various modifications, replacements, additions, omissions, and the like can be made within the scope of the claims or the equivalents thereof.
 本開示は、高品位な印刷物を印刷するための静止画データを得ることができる画像処理装置および画像処理方法に適用可能である。具体的には、テレビジョンセットまたはディスプレイ等の映像表示装置、ビデオレコーダまたはビデオプレーヤ等の映像再生装置、デジタルカメラまたはビデオカメラ等の撮像装置、スマートフォンまたはタブレットコンピュータ等の端末装置、プリンタ等の印刷装置、または、PCまたはサーバ等のコンピュータ装置、等に、本開示は適用可能である。 The present disclosure is applicable to an image processing apparatus and an image processing method capable of obtaining still image data for printing high-quality printed matter. Specifically, the present disclosure is applicable to video display devices such as television sets and displays, video reproduction devices such as video recorders and video players, imaging devices such as digital cameras and video cameras, terminal devices such as smartphones and tablet computers, printing devices such as printers, and computer devices such as PCs and servers.
1, 2, 3, 4, 5, 6  Pattern
10, 10A  HDR imaging device
11  HDR imaging unit
12  Conversion unit
13  JPEG compression unit
14  SDR imaging unit
15  HDR image correction unit
16  HEVC compression unit
20  SDR imaging device
21  SDR imaging unit
22  JPEG compression unit
30  HDR display device
40  SDR display device
50  SDR printing device
100, 100A, 100B, 100C  Image processing device
101  CPU
102  Main memory
103  Storage
104  Input IF
105  Communication IF
110, 110A, 110B, 110C  Acquisition unit
120, 120C  Conversion unit
130  Output unit
150  Test pattern
200, 200A, 200B  Printing device
201  Input IF
202  Scanner
300  Input device
400, 400A, 400B, 400C  Display device
410, 410B, 510, 510C  DPS application layer
410A, 510A  Application layer
411  Print Client
411A  XHTML-print content
412  Storage Server
413B, 513C  Grading Module
420, 520  DPS Layer
420A, 520A  Control Middle layer
421, 521  DPS Discovery
430, 530  PTP Transport layer
430A, 530A  UPnP layer
431A, 531A  UPnP Discovery
440, 540  USB physical layer
440A, 540A  TCP/IP layer
440B, 540B  Wi-Fi layer
500, 500A, 500B, 500C  Printing device
511  Storage Client
512  Print Server
511A  XHTML rendering processing unit
512A  Print processing unit
D1  First still image data
D2  Second still image data
D3  Third still image data
I1, I2  Capability information
P1  Paper

Claims (9)

  1. 撮像により得られ、輝度範囲が第1のダイナミックレンジで定義された第1静止画データと、印刷装置の印刷能力を示す能力情報とを取得する取得部と、
    前記取得部により取得された前記第1静止画データを、前記取得部により取得された前記能力情報が示す前記印刷能力に応じて、前記第1のダイナミックレンジよりも輝度範囲が狭い第2のダイナミックレンジで定義された第2静止画データに変換する変換部と、
    前記変換部により変換された前記第2静止画データを前記印刷装置に出力する出力部と、を備える、
    画像処理装置。
    An acquisition unit configured to acquire first still image data obtained by imaging and having a luminance range defined by a first dynamic range, and capability information indicating the printing capability of the printing apparatus;
    A conversion unit configured to convert the first still image data acquired by the acquisition unit into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range, in accordance with the printing capability indicated by the capability information acquired by the acquisition unit; and
    An output unit that outputs the second still image data converted by the conversion unit to the printing apparatus;
    Image processing device.
  2. 前記取得部は、前記能力情報として、前記印刷装置で印刷に用いる用紙の種類を示す紙情報を取得し、
    前記変換部は、
    複数の用紙の種類と、当該用紙に光を照射したときに当該用紙で反射される前記光の反射輝度との関係を参照することで、前記取得部により取得された前記紙情報が示す用紙の種類に対応する前記反射輝度を特定し、
    特定した前記反射輝度を最大輝度とする輝度範囲を前記第2のダイナミックレンジとして、前記第1静止画データから前記第2静止画データへ変換する、
    請求項1に記載の画像処理装置。
    The acquisition unit acquires paper information indicating a type of paper used for printing by the printing apparatus as the capability information,
    The conversion unit
    identifies the reflected luminance corresponding to the paper type indicated by the paper information acquired by the acquisition unit, by referring to a relationship between a plurality of paper types and the reflected luminance of light reflected by the paper when the paper is irradiated with light, and
    converts the first still image data into the second still image data using, as the second dynamic range, a luminance range whose maximum luminance is the identified reflected luminance.
    The image processing apparatus according to claim 1.
  3. 前記取得部は、前記能力情報として、前記印刷装置で印刷に用いる用紙がスキャンされることにより得られたスキャン画像を取得し、
    前記変換部は、
    前記取得部により取得された前記スキャン画像の輝度に基づいて、前記用紙に光を照射したときに当該用紙で反射される前記光の反射輝度を特定し、
    特定した前記反射輝度を最大輝度とする輝度範囲を前記第2のダイナミックレンジとして、前記第1静止画データから前記第2静止画データへ変換する、
    請求項1に記載の画像処理装置。
    The acquisition unit acquires, as the capability information, a scanned image obtained by scanning a sheet used for printing by the printing apparatus.
    The conversion unit
    identifies the reflected luminance of light reflected by the paper when the paper is irradiated with light, based on the luminance of the scanned image acquired by the acquisition unit, and
    converts the first still image data into the second still image data using, as the second dynamic range, a luminance range whose maximum luminance is the identified reflected luminance.
    The image processing apparatus according to claim 1.
  4. 前記取得部は、前記能力情報として、前記印刷装置で印刷に用いる用紙に当該印刷装置が特定のパターンを印刷して得られた印刷結果を取得し、
    前記変換部は、前記取得部により取得された前記印刷結果に応じて決定した輝度範囲を前記第2のダイナミックレンジとして、前記第1静止画データから前記第2静止画データへ変換する、
    請求項1に記載の画像処理装置。
    The acquisition unit acquires, as the capability information, a print result obtained by the printing apparatus printing a specific pattern on a sheet used for printing by the printing apparatus.
    The conversion unit converts the first still image data into the second still image data using, as the second dynamic range, a luminance range determined according to the print result acquired by the acquisition unit.
    The image processing apparatus according to claim 1.
  5. 前記取得部は、前記第1静止画データを取得すると共に、さらに、前記第1のダイナミックレンジよりも輝度範囲が狭い第3のダイナミックレンジで定義された第3静止画データを取得し、
    前記第1静止画データの最大輝度が前記用紙に対応する前記反射輝度よりも小さい場合、前記変換部は、前記第1静止画データから前記第2静止画データへの変換を行わず、前記出力部は、前記取得部により取得された前記第3静止画データを前記印刷装置に出力する、
    請求項2から4のいずれか1項に記載の画像処理装置。
    The acquisition unit acquires the first still image data, and further acquires third still image data defined in a third dynamic range whose luminance range is narrower than the first dynamic range.
    When the maximum luminance of the first still image data is smaller than the reflected luminance corresponding to the paper, the conversion unit does not convert the first still image data into the second still image data, and the output unit outputs the third still image data acquired by the acquisition unit to the printing apparatus.
    The image processing apparatus according to any one of claims 2 to 4.
  6. 前記第1のダイナミックレンジは、HDR(High Dynamic Range)である、
    請求項1から5のいずれか1項に記載の画像処理装置。
    The first dynamic range is HDR (High Dynamic Range),
    The image processing apparatus according to any one of claims 1 to 5.
  7. 前記画像処理装置は、さらに、前記印刷装置を備える、
    請求項1から6のいずれか1項に記載の画像処理装置。
    The image processing apparatus further includes the printing apparatus.
    The image processing apparatus according to any one of claims 1 to 6.
  8. 撮像により得られ、輝度範囲が第1のダイナミックレンジで定義された第1静止画データを取得し、
    印刷装置の印刷能力を示す能力情報を取得し、
    取得した前記第1静止画データを、取得した前記能力情報が示す前記印刷能力に応じて、前記第1のダイナミックレンジよりも輝度範囲が狭い第2のダイナミックレンジで定義された第2静止画データに変換し、
    変換した前記第2静止画データを前記印刷装置に出力する、
    画像処理方法。
    Acquiring first still image data obtained by imaging and having a luminance range defined by a first dynamic range;
    Acquiring capability information indicating a printing capability of a printing apparatus;
    Converting the acquired first still image data into second still image data defined by a second dynamic range whose luminance range is narrower than the first dynamic range, in accordance with the printing capability indicated by the acquired capability information; and
    Outputting the converted second still image data to the printing apparatus;
    Image processing method.
  9. 請求項8に記載の画像処理方法をコンピュータに実行させるためのプログラム。 A program for causing a computer to execute the image processing method according to claim 8.
PCT/JP2017/040732 2016-11-17 2017-11-13 Image processing device, image processing method, and program WO2018092711A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP17871075.2A EP3544280B1 (en) 2016-11-17 2017-11-13 Image processing device, image processing method, and program
CN201780070657.7A CN109983754B (en) 2016-11-17 2017-11-13 Image processing apparatus, image processing method, and recording medium
US16/348,026 US10726315B2 (en) 2016-11-17 2017-11-13 Image processing device, image processing method, and program
JP2018551611A JP6719061B2 (en) 2016-11-17 2017-11-13 Image processing apparatus, image processing method, and program

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662423335P 2016-11-17 2016-11-17
US62/423335 2016-11-17
JP2017164937 2017-08-30
JP2017-164937 2017-08-30

Publications (1)

Publication Number Publication Date
WO2018092711A1 true WO2018092711A1 (en) 2018-05-24

Family

ID=62146487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/040732 WO2018092711A1 (en) 2016-11-17 2017-11-13 Image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2018092711A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004120217A (en) * 2002-08-30 2004-04-15 Canon Inc Image processing apparatus, image processing method, program, and recording medium
JP2006080834A (en) * 2004-09-09 2006-03-23 Fuji Xerox Co Ltd Image forming apparatus
JP2008072551A (en) * 2006-09-15 2008-03-27 Ricoh Co Ltd Image processing method, image processing apparatus, program and recording medium
JP2014053886A (en) * 2012-08-07 2014-03-20 Canon Inc Image processor, image processing method and program

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7073191B2 (en) 2018-05-25 2022-05-23 キヤノン株式会社 Image processing equipment, image processing methods and programs
JP2019204439A (en) * 2018-05-25 2019-11-28 キヤノン株式会社 Image processing device, image processing method, and program
JP2020004268A (en) * 2018-06-29 2020-01-09 キヤノン株式会社 Image processing device, image processing method, and program
JP7316768B2 (en) 2018-06-29 2023-07-28 キヤノン株式会社 Image processing device, image processing method, and program
JP2020141334A (en) * 2019-02-28 2020-09-03 キヤノン株式会社 Imaging device and control method thereof, and program
JP2020150412A (en) * 2019-03-13 2020-09-17 キヤノン株式会社 Image output device, image supply device and control method thereof, system, program as well as storage medium
JP7307560B2 (en) 2019-03-13 2023-07-12 キヤノン株式会社 Image display device, image supply device, control method and program
JP2020178292A (en) * 2019-04-19 2020-10-29 キヤノン株式会社 Imaging apparatus, control method thereof, and program
JP7365133B2 (en) 2019-04-19 2023-10-19 キヤノン株式会社 Communication device, its control method, and program
US11778336B2 (en) 2019-04-19 2023-10-03 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
JP2020182186A (en) * 2019-04-26 2020-11-05 キヤノン株式会社 Imaging apparatus, recording control method, and program
JP7313893B2 (en) 2019-04-26 2023-07-25 キヤノン株式会社 IMAGING DEVICE, RECORDING CONTROL METHOD AND PROGRAM
JP2021124767A (en) * 2020-01-31 2021-08-30 キヤノン株式会社 Image processing device, image processing method, and program
JP2021124766A (en) * 2020-01-31 2021-08-30 キヤノン株式会社 Image processing device, image processing method, and program
CN113276570A (en) * 2020-01-31 2021-08-20 佳能株式会社 Image processing apparatus, image processing method, and storage medium
US20210241056A1 (en) * 2020-01-31 2021-08-05 Canon Kabushiki Kaisha Image processing apparatus, image processing method, non-transitory computer-readable storage medium storing program
EP3860104A1 (en) 2020-01-31 2021-08-04 Canon Kabushiki Kaisha Image processing apparatus, image processing method, program, and non-transitory computer-readable storage medium storing program
EP3860105A1 (en) 2020-01-31 2021-08-04 Canon Kabushiki Kaisha Image processing apparatus, image processing method, program, and non-transitory computer-readable storage medium storing program
US11797806B2 (en) 2020-01-31 2023-10-24 Canon Kabushiki Kaisha Image processing apparatus, image processing method, non-transitory computer-readable storage medium storing program
JP7431596B2 (en) 2020-01-31 2024-02-15 キヤノン株式会社 Image processing device, image processing method and program
JP7431595B2 (en) 2020-01-31 2024-02-15 キヤノン株式会社 Image processing device, image processing method and program

Similar Documents

Publication Publication Date Title
WO2018092711A1 (en) Image processing device, image processing method, and program
US9294750B2 (en) Video conversion device, photography system of video system employing same, video conversion method, and recording medium of video conversion program
JP6719061B2 (en) Image processing apparatus, image processing method, and program
US11184596B2 (en) Image processing device, reproduction device, image processing method, and reproduction method
US20040196381A1 (en) Image processing method and apparatus
JP4131192B2 (en) Imaging apparatus, image processing apparatus, and image recording apparatus
JPWO2005079056A1 (en) Image processing apparatus, photographing apparatus, image processing system, image processing method and program
US20180003949A1 (en) Projector and projection system that correct optical characteristics, image processing apparatus, and storage medium
JP7316768B2 (en) Image processing device, image processing method, and program
US20050280842A1 (en) Wide gamut film system for motion image capture
US9443327B2 (en) Rendering and un-rendering using profile replacement
JP2004096500A (en) Image pickup apparatus, image processing apparatus, and image recording apparatus
JP2006203555A (en) Image processing method, image processor and image processing program
US20080158351A1 (en) Wide gamut film system for motion image capture
WO2018021261A1 (en) Image processing device, reproduction device, image processing method, and reproduction method
JP7431596B2 (en) Image processing device, image processing method and program
JP2004328530A (en) Imaging apparatus, image processing apparatus, and image recording apparatus
Brendel 55‐3: Invited Paper: Delivering Content for HDR
US20200412913A1 (en) Image processing apparatus, image processing method, and storage medium
JP2005202749A (en) Image processing method, image processing apparatus, and image recording apparatus
JP2004178428A (en) Image processing method
JP2015005835A (en) Image processing device, image forming device, and recording medium
JP5957813B2 (en) Imaging apparatus, program, and recording medium
Gaggioni et al. S-log: A new lut for digital production mastering and interchange application
Thorpe Canon-Log Transfer Characteristics

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17871075

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018551611

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017871075

Country of ref document: EP

Effective date: 20190617