WO2019111529A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2019111529A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing apparatus
distance
unit
black
Prior art date
Application number
PCT/JP2018/037948
Other languages
French (fr)
Japanese (ja)
Inventor
憲明 杉本
大輔 深川
真史 内田
朋宏 森
知市 藤澤
松原 義明
佐藤 光雄
Original Assignee
Sony Semiconductor Solutions Corporation
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation and Sony Corporation
Publication of WO2019111529A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B19/00Cameras
    • G03B19/02Still-picture cameras
    • G03B19/04Roll-film cameras
    • G03B19/07Roll-film cameras having more than one objective
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • the present disclosure relates to an image processing apparatus and an image processing method.
  • In Patent Document 1, a technology is disclosed in which an image generated by a camera that can be attached to and detached from an information processing terminal is supplied to the terminal by wireless communication.
  • In Patent Document 2, a plurality of imaging units are provided to simultaneously generate a plurality of images of different quality, for example, an image of a first angle of view and an image of a second angle of view narrower than the first.
  • However, composition of images captured from mutually different viewpoint positions cannot always be performed appropriately.
  • For example, a corresponding point cannot be obtained correctly because the pixel corresponding to the pixel of interest in one captured image exceeds the search range for parallax detection, and thus the image quality of the composite image may be degraded.
  • The present disclosure is made in view of the above problem. Its object is to provide a new and improved image processing apparatus and image processing method capable of more appropriately combining a color image acquired by a first imaging unit with a black-and-white image acquired by a second imaging unit that performs imaging from a viewpoint position different from that of the first imaging unit.
  • According to the present disclosure, there is provided an image processing apparatus comprising: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit configured to increase the composition ratio of the color image relative to the composition ratio of the black-and-white image when combining the color image and the black-and-white image.
  • According to the present disclosure, there is also provided a computer-implemented image processing method comprising increasing the composition ratio of the color image relative to the composition ratio of the black-and-white image.
  • Further, according to the present disclosure, there is provided an image processing apparatus comprising: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit configured to control the combination of the color image and the black-and-white image by processing using predetermined sensor information.
  • There is also provided a computer-implemented image processing method comprising controlling the combination of the color image and the black-and-white image by processing using predetermined sensor information.
  • As described above, according to the present disclosure, it becomes possible to more appropriately combine a color image acquired by the first imaging unit with a black-and-white image acquired by the second imaging unit, which performs imaging from a viewpoint position different from that of the first imaging unit.
  • FIG. 3 is a diagram illustrating the pixel arrays of the first imaging unit 110 and the second imaging unit 120, and subsequent figures illustrate the processing for determining the presence or absence of image quality deterioration based on the image plane phase difference sensor.
  • FIG. 5 is a block diagram showing an example of the functional configuration of the combining processing unit 150 and the image quality deterioration determination unit 180.
  • Further figures illustrate the parallax histogram, the parallax difference absolute value, the parallax gap histogram, the processing for determining the presence or absence of image quality deterioration based on the parallax gap feature amount and the search range exceeding feature amount, and the luminance-difference-small / color-difference-large area.
  • Additional block diagrams show examples of the functional configurations of the combining processing unit 150 and the image quality deterioration determination unit 180, of the Y/C dispersion ratio processing unit 156, and of the Y/C edge component ratio processing unit 157, together with figures explaining the determination of image quality deterioration based on the Y/C dispersion ratio and the Y/C edge component ratio.
  • A flowchart illustrates an example of the image quality deterioration determination processing and the combining processing.
  • <1. Embodiment> (1.1. Overview) First, an outline of the embodiment according to the present disclosure will be described.
  • The image processing apparatus 100 includes a first imaging unit 110 that acquires a color image by imaging a subject, and a second imaging unit 120 that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit 110.
  • For example, the image processing apparatus 100 is a smartphone, and as shown in FIG. 1B, the first imaging unit 110 and the second imaging unit 120 are provided at different positions on the back of the smartphone.
  • The image processing apparatus 100 generates a composite image by combining the color image and the black-and-white image. More specifically, since parallax occurs between the color image and the black-and-white image captured from different positions, the image processing apparatus 100 searches for corresponding points by matching the two images and generates a composite image by aligning them with respect to the detected corresponding points. In this way, the image processing apparatus 100 can increase the luminance according to the characteristics of the lens and sensor used in the second imaging unit 120, and can therefore generate a highly sensitive image even under low illumination.
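  • As an illustration of the corresponding-point search described above, the following Python sketch performs block matching along the horizontal direction using the sum of absolute differences (SAD). The function name, block size, and search range are illustrative assumptions, not details taken from the publication:

        import numpy as np

        def find_corresponding_point(ref, other, y, x, block=8, search_range=64):
            """Block matching along the horizontal direction.

            ref, other : 2D float arrays (luminance of the reference and
                         the other captured image).
            Returns the disparity (pixel offset) that minimizes the sum
            of absolute differences within the preset search range.
            """
            h = block // 2
            template = ref[y - h:y + h, x - h:x + h]
            best_d, best_cost = 0, np.inf
            for d in range(search_range + 1):   # candidate disparities
                if x - h - d < 0:               # stay inside the image
                    break
                candidate = other[y - h:y + h, x - h - d:x + h - d]
                cost = np.abs(template - candidate).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            return best_d

  • Note that when the true disparity of a very near subject exceeds search_range, the minimum-cost match is wrong, which is exactly the degradation discussed below.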
  • FIG. 2 is a diagram for explaining the image quality obtained by the combining processing of a conventional image processing apparatus. For example, for a pixel of a near view, as compared with a distant view, a corresponding point cannot be obtained correctly because the pixel corresponding to the pixel of interest in one captured image exceeds the search range for parallax detection, and thus the image quality of the composite image may be degraded.
  • FIG. 3 shows the occlusion that arises with respect to the black-and-white image acquired by the second imaging unit 120.
  • In addition, a luminance signal is used to calculate corresponding points, but in an area where the gradation of the luminance signal is low and the gradation of the color difference signal is high (hereinafter referred to as a "luminance-difference-small / color-difference-large area"), corresponding points cannot be obtained correctly, so the image quality of the composite image obtained by the combining processing may be degraded.
  • Therefore, the image processing apparatus 100 controls the combination of the color image and the black-and-white image by processing that uses various types of sensor information, such as a distance sensor, a focus sensor, and an image plane phase difference sensor, in addition to the analysis result of the captured images. More specifically, the image processing apparatus 100 determines whether the image quality of the composite image will be degraded by processing using the various types of sensor information, and when it determines that the image quality will be degraded, it sets the composition ratio of the black-and-white image to substantially zero (or zero). In this way, the image processing apparatus 100 can improve the image quality of the composite image.
  • Note that "setting the composition ratio of the black-and-white image to substantially zero (or zero)" aims to reduce the deterioration of the image quality of the composite image to such an extent that it is not noticed by the user.
  • In the following, setting the composition ratio of the black-and-white image to substantially zero (or zero) may be expressed as "not combining (or composition off)".
  • Likewise, combining the color image and the black-and-white image may be expressed as "combining (or composition on)".
  • When the image processing apparatus 100 determines that the image quality of the composite image will be degraded, it does not combine the color image and the black-and-white image (composition off); when it determines that the image quality will not be degraded, it combines the color image and the black-and-white image (composition on).
  • Note that, instead of not combining the color image and the black-and-white image at all, the image processing apparatus 100 may alleviate the image quality degradation by merely reducing the composition ratio of the black-and-white image.
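  • Both the on/off control and the softer ratio reduction can be viewed as choosing a blending weight. The following sketch is a minimal illustration rather than the publication's actual combining circuit; it blends the luminance of the two aligned images with a composition ratio alpha that is driven to (substantially) zero when degradation is predicted, and the nominal value is an assumption:

        def combine_luminance(color_y, mono_y, alpha):
            """Blend the color image's luminance with the black-and-white image.

            alpha : composition ratio of the black-and-white image in [0, 1];
                    ~0 corresponds to "composition off" (color image only).
            """
            return (1.0 - alpha) * color_y + alpha * mono_y

        def composition_ratio(degradation_predicted, nominal=0.5):
            # nominal ratio is an illustrative assumption
            return 0.0 if degradation_predicted else nominal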
  • The image processing apparatus 100 includes a first imaging unit 110, a second imaging unit 120, a first preprocessing unit 130, a second preprocessing unit 140, a combining processing unit 150, a focus control unit 160, a distance sensor 170, and an image quality deterioration determination unit 180.
  • (First imaging unit 110, second imaging unit 120) The first imaging unit 110 and the second imaging unit 120 are configured using an imaging element such as a complementary metal oxide semiconductor (CMOS) image sensor, and photoelectrically convert light captured by a lens (not shown) to generate captured image data. The first imaging unit 110 and the second imaging unit 120 also have characteristic differences.
  • FIG. 5 illustrates the pixel arrangement of the first imaging unit 110 and the second imaging unit 120.
  • 5A of FIG. 5 shows the pixel array of the first imaging unit 110.
  • the first imaging unit 110 is configured using, for example, a color filter in which a red (R) pixel, a blue (B) pixel, and a green (G) pixel are arranged in a Bayer pattern.
  • In the Bayer pattern, two pixels at diagonal positions in each 2 × 2 pixel unit are green (G) pixels, and the remaining pixels are a red (R) pixel and a blue (B) pixel.
  • That is, the first imaging unit 110 is composed of color pixels, each of which outputs an electric signal based on the amount of incident light of one of the red, blue, and green color components. Therefore, the first imaging unit 110 generates color image data in which each pixel indicates one of the three primary color (RGB) components.
  • 5B of FIG. 5 shows the pixel array of the second imaging unit 120.
  • In the second imaging unit 120, all the pixels are W (white) pixels that output electric signals based on the amount of incident light over the entire wavelength region of visible light. Therefore, the second imaging unit 120 generates black-and-white image data.
  • Here, the focus control unit 160 described below realizes autofocus by changing the positions of predetermined lenses provided in the first imaging unit 110 and the second imaging unit 120. The first imaging unit 110 and the second imaging unit 120 then provide the image quality deterioration determination unit 180 with information on the position of the lens in the in-focus state (hereinafter referred to as "focus position information").
  • Thus, the image quality deterioration determination unit 180 can calculate the distance between the image processing apparatus 100 and the subject (hereinafter referred to as the "subject distance" for convenience) by analyzing the focus position information.
  • The first preprocessing unit 130 applies, to the color image data acquired by the first imaging unit 110, correction processing such as lens distortion correction, defective pixel correction, gain control, white balance correction, and noise reduction, as well as demosaicing processing and scaling processing.
  • the first pre-processing unit 130 provides the color image data after the pre-processing to the combining processing unit 150.
  • The second preprocessing unit 140 applies, to the black-and-white image data acquired by the second imaging unit 120, correction processing such as lens distortion correction, defective pixel correction, gain control, and noise reduction, as well as scaling processing.
  • the second pre-processing unit 140 provides the composite processing unit 150 with the corrected black and white image data.
  • the focus control unit 160 is a functional configuration that realizes autofocus when the first imaging unit 110 and the second imaging unit 120 perform imaging processing. More specifically, the focus control unit 160 realizes autofocus based on the contrast of the image data and the information from the image plane phase difference sensor.
  • First, autofocus based on the contrast of the image data will be described. The focus control unit 160 acquires color image data and black-and-white image data from the first imaging unit 110 and the second imaging unit 120 and calculates a contrast value by analyzing these data. The focus control unit 160 then determines whether the image is in focus using the contrast value; if the image is not in focus, it determines the driving direction of the lens from the contrast value of the image data and adjusts the focus by driving the lens. In other words, when the contrast value has not reached its maximum, the focus control unit 160 brings the image into focus by driving the lens so that the contrast value reaches its maximum.
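  • A minimal sketch of this contrast-maximization loop, assuming a drive_lens(position) actuator, a capture_luminance() function, and variance as the contrast measure (none of which are specified in the publication):

        import numpy as np

        def contrast_value(y):
            # simple contrast measure: variance of the luminance image
            return float(np.var(y))

        def contrast_autofocus(drive_lens, capture_luminance, positions):
            """Try candidate lens positions and settle on the one that
            maximizes the contrast value."""
            best_pos, best_c = None, -np.inf
            for pos in positions:
                drive_lens(pos)
                c = contrast_value(capture_luminance())
                if c > best_c:
                    best_pos, best_c = pos, c
            drive_lens(best_pos)    # drive back to the in-focus position
            return best_pos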
  • Next, autofocus based on the information from the image plane phase difference sensor will be described.
  • In the image plane phase difference sensor, two types of imaging elements subjected to pupil division are arranged in a mixed manner on the chip.
  • The image plane phase difference sensor can calculate the subject distance based on the obtained parallax information, and the focus control unit 160 brings the image into focus by driving the lens to the position corresponding to the obtained subject distance.
  • As described above, the focus control unit 160 realizes autofocus using at least one of the contrast of the image data and the information from the image plane phase difference sensor. Note that the method of realizing autofocus is not limited to these.
  • the distance sensor 170 is a sensor capable of measuring the subject distance by a predetermined method.
  • For example, the distance sensor 170 includes a light source (for example, an LED or a laser diode) capable of emitting visible light or invisible light (for example, infrared light) and a light receiving element. After emitting light from the light source, the distance sensor 170 receives the light reflected by the subject with the light receiving element, evaluates and calculates the reflected light, converts the result into a distance, and outputs it.
  • The principle of measuring the subject distance may be a triangulation method that converts the imaging position on the light receiving element into a distance, or a time-of-flight method that measures the slight time from light emission to light reception and converts the time difference into a distance, but it is not limited thereto. Further, the subject to be measured is assumed to be, for example, a subject located near the center of the angle of view, but this is not limiting either.
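  • For reference, the time-of-flight conversion mentioned above is simply distance = (speed of light × time difference) / 2, since the light travels the path twice; a minimal sketch:

        SPEED_OF_LIGHT_M_S = 299_792_458.0

        def tof_distance_m(time_diff_s):
            # emission-to-reception time difference -> subject distance
            return SPEED_OF_LIGHT_M_S * time_diff_s / 2.0

        print(tof_distance_m(10e-9))   # a 10 ns round trip is about 1.5 m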
  • the distance sensor 170 provides subject distance data (also referred to as “distance sensor information”) to the image quality deterioration determination unit 180.
  • the image quality deterioration determination unit 180 is a functional configuration that determines the presence or absence of image quality deterioration by processing using various types of sensor information and functions as a combination control unit. The determination of the image quality deterioration can be realized by various methods, and the details will be described in “1.3. Example of determination of image quality deterioration”.
  • The combining processing unit 150 is a functional configuration that controls the combining processing of the color image and the black-and-white image and functions as a composition control unit. More specifically, when the image quality deterioration determination unit 180 determines that the image quality of the composite image will be degraded, the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero (or zero). In this way, the combining processing unit 150 can generate a composite image with high image quality. As described above, instead of setting the composition ratio of the black-and-white image to substantially zero (or zero), the combining processing unit 150 may merely reduce the composition ratio of the black-and-white image to alleviate the image quality deterioration.
  • Further, the combining processing unit 150 analyzes the color image data and the black-and-white image data acquired from the first preprocessing unit 130 and the second preprocessing unit 140 to calculate image feature amounts for determining the deterioration of image quality caused by parallax.
  • The combining processing unit 150 may set the entire captured image as the calculation target region of the image feature amounts, or may set the calculation target region excluding the regions at the upper, lower, left, and right ends of the captured image. If the calculation target region is set excluding the end regions, it becomes possible, for example, to prevent the parallax and the parallax gap distance described later from becoming incomputable because the target pixel lies at an end position, so the image feature amounts can be calculated accurately. In addition, calculation costs such as histogram generation can be reduced.
  • the composition processing unit 150 provides the extracted image feature amount to the image quality degradation determination unit 180, so that the image quality degradation determination unit 180 can determine the presence or absence of image quality degradation using the image feature amount.
  • Since the image feature amounts are output using captured image data generated by the image sensors provided in the first imaging unit 110 and the second imaging unit 120, determining the presence or absence of image quality deterioration using the image feature amounts can be said to be determining the presence or absence of image quality deterioration by processing using sensor information from the image sensors. The details of the processing will be described in "1.3. Examples of determination of image quality deterioration".
  • the functional configuration example of the image processing apparatus 100 has been described above.
  • The functional configuration described above using FIG. 4 is merely an example, and the functional configuration of the image processing apparatus 100 is not limited to this example.
  • the image processing apparatus 100 may not necessarily include all of the functional configurations shown in FIG.
  • The functional configuration of the image processing apparatus 100 can be flexibly modified according to the specification and the operation.
  • For example, it is assumed that the ISO sensitivity of the first imaging unit 110 when its gain is set to a substantially minimum value is ISO_min1, and that the ISO sensitivity of the second imaging unit 120 when its gain is set to a substantially minimum value is ISO_min2. At this time, when the target ISO sensitivity is included in the range of expression (1) (roughly, the interval between ISO_min1 and ISO_min2), the first imaging unit 110 and the second imaging unit 120 cannot set their ISO sensitivities to be substantially the same as each other.
  • Therefore, the image quality deterioration determination unit 180 acquires the ISO sensitivity information from each of the first imaging unit 110 and the second imaging unit 120, and when one of the ISO sensitivities is included in the range indicated by expression (1), the image quality deterioration determination unit 180 determines that the image quality of the composite image will be degraded (or is highly likely to be degraded), and the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • In addition, the image quality deterioration determination unit 180 may set a hysteresis d as represented by expression (2). As a result, the image quality deterioration determination unit 180 can prevent frequent switching between the composition-on state and the composition-off state.
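  • Expressions (1) and (2) are not reproduced in this text, so the following sketch assumes the problematic range is the interval between the two minimum ISO sensitivities, widened by the hysteresis margin d while composition is off; the function and parameter names are illustrative:

        def iso_forces_composition_off(target_iso, iso_min1, iso_min2,
                                       d=0.0, currently_off=False):
            lo, hi = sorted((iso_min1, iso_min2))
            if currently_off:
                # require the ISO to leave the range by the margin d
                # before switching composition back on
                return lo - d <= target_iso < hi + d
            return lo <= target_iso < hi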
  • The image quality deterioration determination unit 180 may determine whether the image quality of the composite image will be degraded by processing using the sensor information provided from the distance sensor 170. More specifically, the distance sensor 170 measures the subject distance by a predetermined method as described above. The image quality deterioration determination unit 180 then compares the subject distance data provided from the distance sensor 170 with a predetermined threshold to determine whether the subject distance is short enough to cause image quality deterioration. When it is determined that the subject distance is short enough to cause image quality deterioration, the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • the above process may be changed as appropriate.
  • For example, when the reliability of the subject distance data is also provided (or when the reliability can be calculated by predetermined processing), the image quality deterioration determination unit 180 may determine the image quality deterioration based on the subject distance data only when the reliability is higher than a predetermined value.
  • The image quality deterioration determination unit 180 may also determine the image quality of the composite image based on the focus position information (information on the position of the lens in the in-focus state) provided from each of the first imaging unit 110 and the second imaging unit 120. More specifically, the image quality deterioration determination unit 180 can convert the focus position information into a subject distance.
  • the method of converting the focus position information into the subject distance is not particularly limited, and a known method may be used.
  • Then, the image quality deterioration determination unit 180 determines whether the subject distance is short enough to cause image quality deterioration by comparing it with a predetermined threshold, and if it is determined that the subject distance is that short, the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • In addition, the image quality deterioration determination unit 180 may set a hysteresis d as represented by expression (3), where "lenspos" indicates the position of the lens in the in-focus state.
  • The image quality deterioration determination unit 180 may also determine whether the quality of the composite image will be degraded based on the information provided from the image plane phase difference sensor provided in the first imaging unit 110 or the second imaging unit 120. The image plane phase difference sensor can output a distance map indicating the subject distance in each area of the screen. Therefore, the image quality deterioration determination unit 180 can determine the image quality deterioration based on the distance map provided by the image plane phase difference sensor. For example, as shown in FIG. 6, the screen is divided into seven vertical and nine horizontal areas, and the image plane phase difference sensor can output the subject distance in units of areas.
  • In this example, the image quality deterioration determination unit 180 recognizes, based on the distance map provided from the image plane phase difference sensor, that a subject with a short subject distance appears in area 10 (the area at the lower left of the screen), and can thereby determine the image quality deterioration.
  • The distance map and the reliability map are also referred to as "image plane phase difference sensor information".
  • An example of the distance map provided by the image plane phase difference sensor is shown in 7A of FIG. 7, and an example of the reliability map provided together with it is shown in 7B.
  • Each map represents the subject distance in each area of one captured image and its reliability, and the areas of each map correspond to the areas shown in FIG. 6.
  • the image quality deterioration determining unit 180 determines the presence or absence of image quality deterioration by performing the following processing using the distance map and the reliability map.
  • First, the image quality deterioration determination unit 180 extracts, from the acquired reliability map, the areas having a reliability equal to or higher than a predetermined value R_min, and extracts the corresponding data in the distance map. The image quality deterioration determination unit 180 then searches the extracted distance-map data for the data with the lowest value (in other words, the data with the closest subject distance) and sets it as D_min. Next, the image quality deterioration determination unit 180 calculates the range of distances D assumed to include the subject located closest to the camera according to expression (4).
  • Subsequently, the image quality deterioration determination unit 180 extracts, from the distance-map data extracted above, the data included in the range of distances D given by expression (4), and sorts the extracted data in order of distance as shown in FIG. 8. Note that FIG. 8 also shows the reliability corresponding to each distance. Here, the i-th distance in the sorted data is denoted D_i, and the reliability corresponding to D_i is denoted R_i.
  • Then, with R_max denoting a reliability regarded as sufficiently trustworthy, the image quality deterioration determination unit 180 extracts from the sorted data the entries whose reliability is equal to or higher than R_max, and lets N be the number of the entry with the longest distance among them. If no entry in the sorted data has a reliability equal to or higher than R_max, N is the number of the entry with the longest distance in the sorted data. The image quality deterioration determination unit 180 then estimates the distance D_obj assumed to include the closest subject by performing the calculation of expression (5); in other words, it calculates a reliability-weighted average value as the distance D_obj.
  • Then, the image quality deterioration determination unit 180 compares the distance D_obj with a predetermined threshold to determine whether D_obj is short enough to cause image quality deterioration, and if it is determined that D_obj is that short, the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
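  • A sketch of the D_obj estimation described above. Expressions (4) and (5) are not reproduced in the text, so the range of expression (4) is assumed here to be [D_min, D_min + margin] and expression (5) is taken to be a reliability-weighted mean, as the text suggests; distance_map and reliability_map are flattened per-area lists:

        def estimate_nearest_subject_distance(distance_map, reliability_map,
                                              r_min, r_max, margin):
            # keep only areas whose reliability is at least R_min
            data = [(d, r) for d, r in zip(distance_map, reliability_map)
                    if r >= r_min]
            d_min = min(d for d, _ in data)      # closest reliable distance

            # data assumed to contain the nearest subject, sorted by distance
            near = sorted((d, r) for d, r in data if d <= d_min + margin)

            # N: last entry with reliability >= R_max (else the farthest entry)
            n = max((i for i, (_, r) in enumerate(near) if r >= r_max),
                    default=len(near) - 1)

            dists, rels = zip(*near[:n + 1])
            # reliability-weighted average as D_obj
            return sum(d * r for d, r in zip(dists, rels)) / sum(rels)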
  • Note that the image quality deterioration determination unit 180 may apply a temporal smoothing filter to the distance map provided by the image plane phase difference sensor. This improves the accuracy of the distance map.
  • However, in a frame in which the reliability of the image plane phase difference sensor is low, it may not be possible to determine the presence or absence of image quality deterioration based on the image plane phase difference sensor.
  • In that case, the image quality deterioration determination unit 180 may exclude the frame for which the determination of the image quality deterioration is not possible from the application target of the temporal smoothing filter. Alternatively, for such frames only, the image quality deterioration determination unit 180 may determine the image quality deterioration based on the focus position information. In other words, the image quality deterioration determination unit 180 may switch the information used to determine the presence or absence of image quality deterioration between the information from the image plane phase difference sensor and the focus position information according to the reliability of the image plane phase difference sensor.
  • The image quality deterioration determination unit 180 may determine whether the image quality of the composite image will be degraded based on the image feature amounts of a subject whose subject distance is equal to or less than a predetermined value (hereinafter referred to as a "short-distance subject"). More specifically, the combining processing unit 150 analyzes the color image and the black-and-white image to calculate the parallax distribution feature amount, the search range exceeding feature amount, or the parallax gap feature amount, and the image quality deterioration determination unit 180 may determine the presence or absence of image quality deterioration by determining whether these image feature amounts correspond to those of a short-distance subject.
  • The combining processing unit 150 includes a parallax histogram processing unit 151, a parallax distribution feature amount calculation unit 152, a search range exceeding feature amount calculation unit 153, and a parallax gap feature amount calculation unit 154.
  • The image quality deterioration determination unit 180 includes a short-distance feature amount determination unit 181.
  • The parallax histogram processing unit 151 performs parallax detection based on the black-and-white image data and the color image data supplied from the first preprocessing unit 130 and the second preprocessing unit 140, and generates parallax information indicating the detected parallax. Since the first imaging unit 110 and the second imaging unit 120 perform imaging from different viewpoint positions as shown in FIG. 1B, the captured images obtained by them have parallax. Therefore, the parallax histogram processing unit 151 generates parallax information indicating the parallax for each pixel based on the captured image data supplied from the first preprocessing unit 130 and the second preprocessing unit 140.
  • The parallax histogram processing unit 151 generates the parallax information by corresponding point detection processing such as block matching. That is, the parallax histogram processing unit 151 uses the captured image acquired by one of the first imaging unit 110 and the second imaging unit 120 as a reference captured image, and detects the block area on the other captured image that is most similar to a reference block area based on the target position on the reference captured image. The parallax histogram processing unit 151 calculates a parallax vector indicating the difference between the position of the detected block area and the position of the reference block area. It calculates the parallax with each pixel on the reference captured image as the target position, and outputs the parallax vector calculated for each pixel.
  • FIG. 10 illustrates parallax histograms.
  • FIG. 10(a) is a parallax histogram of a captured image in which the subjects are close to the same plane.
  • FIG. 10(b) illustrates a parallax histogram of a captured image in which the distances to the subjects differ from one another.
  • In FIG. 10(b), a peak occurs at a position away from parallax "0" in the negative direction due to the difference in distance.
  • FIG. 10(c) illustrates a parallax histogram of a captured image in which the distances to the subjects differ so that a plurality of parallaxes occur, and in which a close object generates a large parallax.
  • In FIG. 10(c), the subject is closer and generates a larger parallax than in FIG. 10(b); therefore, a peak occurs at a position further away in the negative direction than in FIG. 10(b).
  • FIG. 11 is a diagram for explaining the parallax difference absolute value used to generate the parallax gap histogram.
  • As shown in FIG. 11, the parallax histogram processing unit 151 calculates the parallax PV1 at a position horizontally separated by "-(PARALLAX_DIFF_DISTANCE / 2)" from the position of the pixel of interest in the calculation target area, and the parallax PV2 at a position horizontally separated by "(PARALLAX_DIFF_DISTANCE / 2)" from the pixel of interest, and calculates the parallax difference absolute value PVapd = |PV1 - PV2| shown in expression (6). Note that the parallax gap distance (PARALLAX_DIFF_DISTANCE) is set in advance.
  • When the subjects are close to the same plane, the difference between the parallax PV1 and the parallax PV2 is small, and the value of the parallax difference absolute value PVapd is small.
  • On the other hand, when the distances to the subjects differ and, for example, the pixel of interest lies on a boundary between subjects at different distances, the difference between the parallax PV1 and the parallax PV2 is large, and the value of the parallax difference absolute value PVapd becomes large.
  • The parallax histogram processing unit 151 generates a parallax gap histogram, which is a histogram of the parallax difference absolute values PVapd calculated with each pixel of the calculation target area as the pixel of interest.
  • FIG. 12 illustrates the parallax gap histogram.
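  • A sketch of the parallax gap histogram computation, taking expression (6) to be PVapd = |PV1 - PV2| as described above; the bin count is an assumption, and the parallax gap distance is assumed to be a positive even number of pixels:

        import numpy as np

        def parallax_gap_histogram(disparity, gap_distance, bins=64):
            """disparity: per-pixel parallax map of the calculation
            target area; PV1/PV2 are sampled +/- (gap_distance / 2)
            horizontally from each pixel of interest."""
            half = gap_distance // 2
            pv1 = disparity[:, :-2 * half]   # left of the pixel of interest
            pv2 = disparity[:, 2 * half:]    # right of the pixel of interest
            pvapd = np.abs(pv1 - pv2)
            hist, _ = np.histogram(pvapd, bins=bins)
            return hist, pvapd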
  • The parallax distribution feature amount calculation unit 152 calculates, as the parallax distribution feature amount, a statistic indicating the feature of the parallax distribution from the parallax histogram generated by the parallax histogram processing unit 151.
  • The parallax distribution feature amount calculation unit 152 calculates, for example, a standard deviation as the statistic indicating the characteristic of the parallax distribution, and uses the calculated standard deviation as the parallax distribution feature amount FVfsd.
  • Here, the parallax distribution feature amount calculated from the histogram of (a) of FIG. 10 is denoted "FVfsd-a", and the parallax distribution feature amounts calculated from the histograms of (b) and (c) of FIG. 10 are denoted "FVfsd-b" and "FVfsd-c", respectively.
  • Since the parallax distribution feature amount calculation unit 152 calculates the standard deviation of the parallax histogram as the parallax distribution feature amount FVfsd, it can be determined based on FVfsd whether the subjects are close to the same plane or there are a plurality of parallaxes.
  • The search range exceeding feature amount calculation unit 153 calculates, from the parallax histogram generated by the parallax histogram processing unit 151, a search range exceeding feature amount FVosr that indicates the ratio of the frequency (over_search_range_counter) of parallaxes exceeding the preset search range to the total frequency (counter).
  • The search range exceeding feature amount calculation unit 153 performs the calculation of expression (7) using the parallax histogram to calculate the search range exceeding feature amount FVosr.
  • Here, the search range exceeding feature amount calculated from the histogram of (a) of FIG. 10 is denoted "FVosr-a", that calculated from the histogram of (b) of FIG. 10 is "FVosr-b", and that calculated from the histogram of (c) of FIG. 10 is "FVosr-c". In this case, the search range exceeding feature amounts satisfy "FVosr-a, FVosr-b < FVosr-c".
  • Since the search range exceeding feature amount calculation unit 153 calculates the search range exceeding feature amount FVosr, it can be determined based on FVosr whether or not a subject causing a large parallax is imaged. That is, it is possible to detect a short-distance subject for which the matching accuracy is reduced (or which cannot be matched).
  • The parallax gap feature amount calculation unit 154 calculates the parallax gap feature amount FVpd from the parallax gap histogram generated by the parallax histogram processing unit 151.
  • More specifically, the parallax gap feature amount calculation unit 154 calculates, from the parallax gap histogram, the parallax gap feature amount FVpd indicating the ratio of the frequency (large_parallax_diff_counter) of parallax gaps equal to or larger than the preset maximum parallax gap distance to the total frequency (counter); that is, it performs the calculation of expression (8) using the parallax gap histogram.
  • The parallax gap feature amount FVpd calculated in this way indicates the proportion of pixels at which a gap equal to or larger than the maximum parallax gap distance occurs.
  • Subjects on the same plane produce a small parallax gap, while the parallax gap is large at image boundaries between subjects at different distances; therefore, the occurrence state of image boundaries between subjects whose distances differ greatly can be determined.
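  • The three image feature amounts can be summarized in one sketch. Since expressions (7) and (8) are not reproduced in this text, the ratios are implemented below as plain fractions of pixels, which is an assumption:

        import numpy as np

        def image_feature_amounts(disparity, pvapd, search_range, max_gap):
            """disparity: per-pixel parallax map; pvapd: per-pixel
            parallax difference absolute values from expression (6)."""
            d = np.asarray(disparity, dtype=np.float64)

            # FVfsd: standard deviation of the parallax distribution
            fv_fsd = float(np.std(d))

            # FVosr: fraction of parallaxes exceeding the preset search range
            fv_osr = float(np.mean(np.abs(d) > search_range))

            # FVpd: fraction of gaps at or above the maximum parallax gap distance
            fv_pd = float(np.mean(np.asarray(pvapd) >= max_gap))

            return fv_fsd, fv_osr, fv_pd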
  • The short-distance feature amount determination unit 181 determines whether the image quality will be degraded based on the image feature amounts calculated by the parallax distribution feature amount calculation unit 152, the search range exceeding feature amount calculation unit 153, and the parallax gap feature amount calculation unit 154.
  • FIG. 13 shows, with the parallax gap feature amount FVpd on the vertical axis and the search range exceeding feature amount FVosr on the horizontal axis, the image quality deterioration results obtained when imaging various scenes, together with a determination curve 20. More specifically, the user first uses the image processing apparatus 100 to capture various scenes while changing the position of the subject and so on, and the composite image, the parallax gap feature amount FVpd, and the search range exceeding feature amount FVosr are output.
  • Thereafter, the user visually confirms the image quality deterioration and determines whether or not to turn composition off for each scene, and the image processing apparatus 100 uses the set of determination results as teacher data for machine learning (so-called supervised learning) to output the determination curve, i.e., the curve that most appropriately separates the presence and absence of image quality deterioration.
  • the method of machine learning is not limited to this.
  • For example, the determination curve may be output using deep learning, various simulation techniques, or the like.
  • Based on FIG. 13, the image quality deterioration determination unit 180 compares the point indicated by the parallax gap feature amount FVpd and the search range exceeding feature amount FVosr extracted from the captured images with the determination curve. When the point indicates a value higher than the determination curve (when the point is located in the area of the arrow 22 in FIG. 13), the image quality deterioration determination unit 180 determines that the image quality of the composite image will be degraded (or is highly likely to be degraded), and the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • In addition, the image quality deterioration determination unit 180 may set the hysteresis represented by the curve 21 in FIG. 13. In that case, once in the composition-off state, the image quality deterioration determination unit 180 switches back to the composition-on state when the point indicated by the image feature amounts newly extracted from the captured images shows a value lower than the curve 21 (when the point is located in the area of the arrow 23 in FIG. 13). As a result, the image quality deterioration determination unit 180 can prevent frequent switching between the composition-on state and the composition-off state.
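  • A sketch of the determination-curve comparison with hysteresis. The curves are assumed here to be callables mapping FVosr to an FVpd threshold, with the hysteresis curve 21 lying below the determination curve 20; this reading of the switching rule is an interpretation of the text rather than a verbatim rule from it:

        def update_composition_state(fv_pd, fv_osr, curve20, curve21, comp_on):
            if comp_on:
                # degradation predicted above the determination curve -> off
                if fv_pd > curve20(fv_osr):
                    return False
            else:
                # switch back on only once safely below the hysteresis curve
                if fv_pd < curve21(fv_osr):
                    return True
            return comp_on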
  • In the above description, the parallax gap feature amount FVpd and the search range exceeding feature amount FVosr are used to determine the presence or absence of image quality deterioration, but the parallax distribution feature amount FVfsd may be used together with them. More specifically, the user performs imaging of various scenes in the same manner as described above using the image processing apparatus 100, and the composite image, the parallax gap feature amount FVpd, the search range exceeding feature amount FVosr, and the parallax distribution feature amount FVfsd are calculated.
  • Thereafter, the user visually confirms the image quality deterioration and determines whether or not to turn composition off for each scene, and a determination curved surface is obtained by machine learning that uses the set of determination results as teacher data.
  • The determination curved surface is a curved surface represented in three-dimensional coordinates in which the depth direction of FIG. 13 is the parallax distribution feature amount FVfsd.
  • The image quality deterioration determination unit 180 compares the point indicated by the parallax gap feature amount FVpd, the search range exceeding feature amount FVosr, and the parallax distribution feature amount FVfsd extracted from the captured images with the determination curved surface to determine the presence or absence of image quality deterioration.
  • In this way, the image quality deterioration determination unit 180 can improve the determination accuracy of the image quality deterioration by performing processing that combines a plurality of image feature amounts.
  • Of course, the image feature amounts may be combined freely for the processing, and only one image feature amount may be used.
  • In the above description, each image feature amount is calculated from the parallax histogram or the parallax gap histogram, but each image feature amount may instead be calculated based on a parallax map obtained from the color image and the black-and-white image.
  • The image quality deterioration determination unit 180 may also determine the presence or absence of image quality deterioration based on the feature amounts of a luminance-difference-small / color-difference-large area (an area where the gradation of the luminance signal is low and the gradation of the color difference signal is high). For example, as shown in FIG. 14, suppose that in the display screen a red-system area 30 and a blue-system area 31 are adjacent (in other words, the gradation of the color difference signal is high), and the luminance difference between the areas 30 and 31 is smaller than a predetermined value (in other words, the gradation of the luminance signal is low). In this case, the area 32 including the adjacent parts of the areas 30 and 31 can be said to be a luminance-difference-small / color-difference-large area.
  • In such an area, the parallax estimation accuracy is lowered. Even if an incorrect parallax is output, in a region where the gradation of the color difference signal is low, the changes in the Y, Cb, and Cr signals are not large, and the degree of image quality deterioration at the time of combining is small. On the other hand, in a region where an erroneous parallax is output and the gradation of the color difference signal is high, the degree of image quality deterioration at the time of combining becomes large. Therefore, when the image quality deterioration determination unit 180 detects a luminance-difference-small / color-difference-large area wider than a predetermined area, the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • The combining processing unit 150 includes a signal extraction unit 155, a Y/C dispersion ratio processing unit 156, and a Y/C edge component ratio processing unit 157, and the image quality deterioration determination unit 180 includes a luminance-difference-small / color-difference-large feature amount determination unit 182.
  • As the feature amounts of the luminance-difference-small / color-difference-large area, the ratio of the dispersion value of the C signal (where the C signal means the Cb signal or the Cr signal) to the dispersion value of the Y signal (hereinafter referred to as the "Y/C dispersion ratio") and the ratio of the edge component of the C signal to the edge component of the Y signal (hereinafter referred to as the "Y/C edge component ratio") are used.
  • The combining processing unit 150 extracts the Y, Cb, and Cr signals from the color image data using the signal extraction unit 155, and these signals are input to the Y/C dispersion ratio processing unit 156 and the Y/C edge component ratio processing unit 157, which calculate the feature amounts described above.
  • Then, the luminance-difference-small / color-difference-large feature amount determination unit 182 determines the presence or absence of image quality deterioration based on these feature amounts.
  • FIG. 16 is a diagram showing an example of the functional configuration of the Y/C dispersion ratio processing unit 156.
  • The Y/C dispersion ratio processing unit 156 includes a Y dispersion value calculation unit 156a, a Cb dispersion value calculation unit 156b, a Cr dispersion value calculation unit 156c, a comparison unit 156d, and a Y/C dispersion ratio calculation unit 156e.
  • The Y dispersion value calculation unit 156a, the Cb dispersion value calculation unit 156b, and the Cr dispersion value calculation unit 156c each divide the entire screen into areas of a fixed size and calculate the dispersion values of the Y, Cb, and Cr signals in each area.
  • Since the method of calculating a dispersion value is well known, its description is omitted.
  • The comparison unit 156d compares the dispersion value of the Cb signal with that of the Cr signal and provides the larger dispersion value to the Y/C dispersion ratio calculation unit 156e.
  • Then, the Y/C dispersion ratio calculation unit 156e calculates the ratio of the dispersion value of the C signal (the larger of the dispersion values of the Cb and Cr signals) to the dispersion value of the Y signal, and provides the ratio to the luminance-difference-small / color-difference-large feature amount determination unit 182.
  • The luminance-difference-small / color-difference-large feature amount determination unit 182 determines the image quality deterioration based on the Y/C dispersion ratio.
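  • A sketch of the per-area Y/C dispersion ratio described above (the area size and the epsilon guard are assumptions; the ratio is the C variance over the Y variance, per the definition given earlier):

        import numpy as np

        def yc_dispersion_ratio(y, cb, cr, block=16):
            h, w = y.shape
            ratios = np.zeros((h // block, w // block))
            for i in range(h // block):
                for j in range(w // block):
                    sl = (slice(i * block, (i + 1) * block),
                          slice(j * block, (j + 1) * block))
                    var_y = np.var(y[sl])
                    # comparison unit: larger of the Cb/Cr variances
                    var_c = max(np.var(cb[sl]), np.var(cr[sl]))
                    ratios[i, j] = var_c / (var_y + 1e-6)
            return ratios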
  • For example, the user uses the image processing apparatus 100 to capture various scenes while changing the position of the subject and so on, and outputs the composite image and the Y/C dispersion ratio. Thereafter, the user visually confirms the image quality deterioration and determines whether or not to turn composition off for each scene. Then, the image processing apparatus 100 outputs the characteristics of the Y/C dispersion ratio at which image quality deterioration easily occurs, using machine learning or the like that takes the set of determination results as teacher data; such a characteristic is represented, for example, as the area 40 in the corresponding figure.
  • Based on this, the luminance-difference-small / color-difference-large feature amount determination unit 182 determines whether or not the Y/C dispersion ratio of each area calculated from the captured image falls within the area 40.
  • If it does, the luminance-difference-small / color-difference-large feature amount determination unit 182 determines that the image quality of the composite image will be degraded (or is highly likely to be degraded), and the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • Note that the above processing is merely an example and may be changed as appropriate; for example, the characteristics of the Y/C dispersion ratio at which image quality deterioration easily occurs may be output by a method other than machine learning.
  • FIG. 18 is a diagram showing an example of a functional configuration of the Y / C edge component ratio processing unit 157.
  • The Y/C edge component ratio processing unit 157 includes a Y edge component detection unit 157a, a Cb edge component detection unit 157b, a Cr edge component detection unit 157c, a comparison unit 157d, and a Y/C edge component ratio calculation unit 157e.
  • The Y edge component detection unit 157a, the Cb edge component detection unit 157b, and the Cr edge component detection unit 157c detect the edge components of the Y, Cb, and Cr signals at each pixel (in other words, they detect where each signal changes sharply or discontinuously).
  • the detection method (detection algorithm etc.) of the edge component is not particularly limited, and known techniques may be used.
  • The comparison unit 157d compares the edge component of the Cb signal with that of the Cr signal and provides the larger edge component to the Y/C edge component ratio calculation unit 157e.
  • The Y/C edge component ratio calculation unit 157e then calculates the ratio of the edge component of the C signal (the larger of the edge components of the Cb and Cr signals) to the edge component of the Y signal, and provides the ratio to the luminance-difference-small / color-difference-large feature amount determination unit 182.
  • The luminance-difference-small / color-difference-large feature amount determination unit 182 determines the image quality deterioration based on the Y/C edge component ratio.
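  • A sketch of the per-pixel Y/C edge component ratio. The publication leaves the edge detection method open, so gradient magnitude is used here as an assumed stand-in:

        import numpy as np

        def yc_edge_component_ratio(y, cb, cr):
            def edge(signal):
                gy, gx = np.gradient(signal.astype(np.float64))
                return np.hypot(gx, gy)

            # comparison unit: larger of the Cb/Cr edge components
            edge_c = np.maximum(edge(cb), edge(cr))
            return edge_c / (edge(y) + 1e-6)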
  • For example, the user uses the image processing apparatus 100 to capture various scenes while changing the position of the subject and so on, and outputs the composite image and the Y/C edge component ratio. Thereafter, the user visually confirms the image quality deterioration and determines whether or not to turn composition off for each scene. Then, the image processing apparatus 100 outputs the characteristics of the Y/C edge component ratio at which image quality deterioration easily occurs, using machine learning or the like that takes the set of determination results as teacher data.
  • When the Y/C edge component ratio calculated from the captured image corresponds to such a characteristic, the luminance-difference-small / color-difference-large feature amount determination unit 182 determines that the image quality of the composite image will be degraded (or is highly likely to be degraded), and the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
  • Note that, as above, this processing is merely an example and may be changed as appropriate; for example, the characteristics of the Y/C edge component ratio at which image quality deterioration easily occurs may be output by a method other than machine learning.
  • In step S1000, the image quality deterioration determination unit 180 of the image processing apparatus 100 determines whether image quality deterioration will occur based on the ISO sensitivity. More specifically, the image quality deterioration determination unit 180 acquires the ISO sensitivity information from each of the first imaging unit 110 and the second imaging unit 120, and determines whether image quality deterioration will occur based on whether one of the ISO sensitivities is included in the range indicated by expression (1) above. If it is determined that image quality deterioration will occur (step S1000 / Yes), the combining processing unit 150 sets the composition ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the composition-off state or merely reduces the composition ratio of the black-and-white image).
  • When it is determined that image quality degradation does not occur (step S1000/No), in step S1008 the image quality deterioration determination unit 180 determines whether image quality degradation occurs by processing using the sensor information provided from the distance sensor 170. More specifically, the image quality deterioration determination unit 180 analyzes the subject distance data provided from the distance sensor 170 to determine whether the subject distance is close enough to cause image quality degradation. If it is determined that the subject distance is close enough to cause degradation (step S1008/Yes), the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero in step S1004 (in other words, it sets the composition OFF state or reduces the composition ratio of the black-and-white image).
  • When it is determined that image quality degradation does not occur (step S1008/No), in step S1012 the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on the focus position information provided by each of the first imaging unit 110 and the second imaging unit 120. More specifically, the image quality deterioration determination unit 180 converts the focus position information into a subject distance, and determines whether the subject distance is close enough to cause image quality degradation.
  • If it is determined that the subject distance is close enough to cause degradation (step S1012/Yes), the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero in step S1004 (in other words, it sets the composition OFF state or reduces the composition ratio of the black-and-white image).
  • When it is determined that image quality degradation does not occur (step S1012/No), in step S1016 the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on the information provided from the image plane phase difference sensor. More specifically, the image quality deterioration determination unit 180 calculates the distance Dobj by applying equation (5) to the distance map and the reliability map provided by the image plane phase difference sensor, and determines whether the distance Dobj is short enough to cause image quality degradation.
  • If it is determined that the distance Dobj is short enough to cause degradation (step S1016/Yes), the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero in step S1004 (in other words, it sets the composition OFF state or reduces the composition ratio of the black-and-white image).
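  • Equation (5) itself is not reproduced in this excerpt; as one plausible reading of "weighted averaging based on the reliability", the sketch below computes a reliability-weighted mean of the distance map. The function name and the threshold in the usage comment are hypothetical.

```python
import numpy as np

def weighted_subject_distance(distance_map: np.ndarray,
                              reliability_map: np.ndarray) -> float:
    # Distances with higher reliability contribute more to Dobj.
    w = reliability_map.astype(np.float64)
    d = distance_map.astype(np.float64)
    return float((w * d).sum() / max(w.sum(), 1e-12))

# Hypothetical usage (threshold value is illustrative, not from the patent):
# d_obj = weighted_subject_distance(dist_map, rel_map)
# composition_off = d_obj < NEAR_DISTANCE_THRESHOLD_M
```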
  • When it is determined that image quality degradation does not occur (step S1016/No), in step S1020 the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on the image feature amounts of a short-distance subject. More specifically, the composition processing unit 150 outputs the parallax gap feature amount FVpd, the search-range-exceeded feature amount FVosr, or the parallax distribution feature amount FVfsd using the black-and-white image data and the color image data, and the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on whether these image feature amounts correspond to those of a short-distance subject.
  • If they do (step S1020/Yes), the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero in step S1004 (in other words, it sets the composition OFF state or reduces the composition ratio of the black-and-white image).
  • When it is determined that image quality degradation does not occur (step S1020/No), in step S1024 the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on the image feature amounts of a luminance difference small color difference large area. More specifically, the composition processing unit 150 outputs image feature amounts such as the Y/C dispersion ratio or the Y/C edge component ratio using the captured images, and the image quality deterioration determination unit 180 determines whether image quality degradation occurs based on whether these feature amounts correspond to those of a luminance difference small color difference large area.
  • If they do (step S1024/Yes), the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero in step S1004 (in other words, it sets the composition OFF state or reduces the composition ratio of the black-and-white image).
  • When it is determined that image quality degradation does not occur in any of the above determinations (step S1024/No), the composition processing unit 150 combines the color image and the black-and-white image in step S1028 without making the composition ratio of the black-and-white image substantially zero (in other words, it sets the composition ON state).
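  • The decision cascade of steps S1000 through S1028 can be condensed as in the following schematic sketch. Each boolean flag stands in for one of the threshold checks described above; the flag and function names are invented for illustration, and the real checks compare sensor values and feature amounts against thresholds as explained in the flowchart.

```python
def decide_bw_composition_ratio(iso_ok: bool,
                                distance_sensor_ok: bool,
                                focus_distance_ok: bool,
                                phase_diff_ok: bool,
                                near_subject_features_ok: bool,
                                yc_features_ok: bool) -> float:
    # Each flag is True when the corresponding check (S1000, S1008, S1012,
    # S1016, S1020, S1024) found no risk of image quality degradation.
    checks = [iso_ok, distance_sensor_ok, focus_distance_ok,
              phase_diff_ok, near_subject_features_ok, yc_features_ok]
    if not all(checks):
        return 0.0  # step S1004: composition OFF / ratio substantially zero
    return 1.0      # step S1028: compose the color and black-and-white images
```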
  • As shown in FIG. 21, the image processing apparatus 100 according to this modification includes a 3D depth sensor 190 instead of the distance sensor 170, and image quality degradation may be determined by processing using sensor information from the 3D depth sensor 190. The rest of the configuration is the same as described above.
  • The 3D depth sensor 190 includes a light emitting unit 191 that emits infrared light and a light receiving unit 192.
  • The light emitting unit 191 irradiates the subject with infrared light, and the light receiving unit 192 receives the infrared light reflected by the subject. The 3D depth sensor 190 can then measure the slight time from when the infrared light is emitted to when it is received, and can create a distance map by a time-of-flight calculation that converts this time difference into a distance.
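  • In its simplest form the time-of-flight conversion is distance = c * t / 2, since the infrared light travels to the subject and back. A minimal sketch, assuming an ideal direct-ToF measurement with no calibration:

```python
import numpy as np

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_to_distance(round_trip_time_s):
    # Works element-wise on a 2-D array of measured times, yielding a
    # distance map; the factor 1/2 accounts for the out-and-back path.
    return SPEED_OF_LIGHT_M_S * np.asarray(round_trip_time_s) / 2.0
```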
  • FIG. 23 shows an example of the distance map created by the 3D depth sensor 190 when the subject is a soccer ball as shown in FIG. 22.
  • As shown in FIG. 23, the distance map indicates the subject distance by, for example, color shading; the darker the color, the closer the subject distance.
  • The image quality deterioration determination unit 180 obtains the distance map from the 3D depth sensor 190 and analyzes it to identify the closest subject distance (hereinafter referred to as the "nearest neighbor distance 60"; see FIG. 23).
  • When it is determined that the nearest neighbor distance 60 is short enough to cause image quality degradation, the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero. Since the distance sensor 170 basically outputs the subject distance at a single point in the screen, it is difficult for it to output the nearest neighbor distance 60. In this modification, by contrast, the nearest neighbor distance 60 is obtained from a distance map of the entire screen, so image quality degradation can be determined with higher accuracy.
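  • A minimal sketch of this nearest-neighbor-distance check follows. The threshold is a tuning parameter not specified in this excerpt, and the function names are illustrative.

```python
import numpy as np

def nearest_neighbor_distance(distance_map: np.ndarray) -> float:
    # The closest subject distance anywhere in the screen ("distance 60").
    return float(distance_map.min())

def composition_allowed(distance_map: np.ndarray,
                        near_threshold_m: float) -> bool:
    # False -> the composition processing unit sets the black-and-white
    # composition ratio to substantially zero (composition OFF).
    return nearest_neighbor_distance(distance_map) >= near_threshold_m
```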
  • the above process is merely an example, and may be changed as appropriate.
  • the type of light emitted by the 3D depth sensor 190, the method of creating the distance map, the content of the distance map, and the like are not particularly limited.
  • When the image quality deterioration determination unit 180 determines the presence or absence of image quality degradation using the distance map provided from the 3D depth sensor 190 as described above, it may correct the distance map so that distances in the map tend to become shorter the closer they are to a recognition position of a face or the like, an in-focus position, or a gaze position (for example, the center of the screen or a position specified by analyzing the line of sight).
  • 24A shows a distance map before correction (assumed to be the same as that shown in FIG. 23).
  • 24B shows the values along a certain straight line 71 drawn on the distance map of 24A.
  • The image quality deterioration determination unit 180 obtains the corrected distance map shown in 24D by multiplying each value of the pre-correction distance map shown in 24B by each value of the coefficient function shown in 24C. Thereafter, the image quality deterioration determination unit 180 identifies the nearest neighbor distance 60 by analyzing the corrected distance map, and when it determines that the nearest neighbor distance 60 is short enough to cause image quality degradation, the composition processing unit 150 makes the composition ratio of the black-and-white image substantially zero.
  • In this way, the image quality deterioration determination unit 180 can determine whether the user is likely to perceive the image quality degradation, taking into account how the user gazes at the subject.
  • the above process is merely an example, and may be changed as appropriate.
  • For example, the coefficient function may be changed as appropriate from the content shown in 24C.
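  • As one concrete possibility, the coefficient function could be a dip centered on the gaze or focus position, as in the sketch below (multiplying 24B-style values by 24C-style coefficients to obtain 24D-style values). The Gaussian shape and the 0.5 floor are illustrative assumptions only.

```python
import numpy as np

def corrected_distance_map(distance_map: np.ndarray,
                           center_yx: tuple,
                           sigma: float) -> np.ndarray:
    h, w = distance_map.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = center_yx
    dist2 = (yy - cy) ** 2 + (xx - cx) ** 2
    # Coefficient dips to 0.5 at the attended position and rises to 1.0
    # far away, so attended regions read as closer after correction.
    coeff = 1.0 - 0.5 * np.exp(-dist2 / (2.0 * sigma ** 2))
    return distance_map * coeff
```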
  • When the focus controlled by the focus control unit 160 is on a distant view, there is a high possibility that the subject the user is gazing at is also far away. In this case, even if a short-distance subject enters the angle of view, the user is not gazing at it, so the possibility that the user perceives the image quality degradation is low (in other words, the degradation can be tolerated).
  • Accordingly, the image quality deterioration determination unit 180 may change the flowchart shown in FIG. 20 to that shown in FIG. 25. FIG. 25 includes step S1108, which does not exist in the flowchart shown in FIG. 20. More specifically, in step S1108 the image quality deterioration determination unit 180 converts the focus position information into a subject distance; if it determines that the subject the user is gazing at is far away because the focus is on a distant view (step S1108/Yes), the composition processing unit 150 combines the color image and the black-and-white image in step S1132 without making the composition ratio of the black-and-white image substantially zero (in other words, it sets the composition ON state).
  • When the image quality deterioration determination unit 180 determines that the subject the user is gazing at is at a short distance (step S1108/No), the processing from step S1112 onward (the same as the processing from step S1008 onward in FIG. 20) is carried out. In this way, the image quality of the distant-view portion that the user is gazing at can be improved by composition.
  • The image quality deterioration determination unit 180 may also change the threshold used in the above determination of image quality degradation according to the magnification of the electronic zoom.
  • Because the angle of view after the composition processing differs from the angle of view after the electronic zoom, a region whose image quality has degraded is enlarged by the electronic zoom (see FIG. 26), making the degradation easier for the user to perceive; therefore, for example, the higher the magnification, the more readily the composition OFF state may be set.
  • In another modification, the image processing apparatus 100 limits the extraction region of the screen feature amounts to the region of the angle of view after the electronic zoom. More specifically, the composition processing unit 150 acquires information related to the electronic zoom (for example, information capable of specifying the region of the angle of view after the electronic zoom, such as the starting point of the electronic zoom and the magnification), and calculates the various screen feature amounts within that region based on this information. The image quality deterioration determination unit 180 then determines the presence or absence of image quality degradation based on these screen feature amounts. As a result, even if image quality degradation occurs, the image processing apparatus 100 can continue the composition processing as long as the position of the degradation is outside the region of the angle of view after the electronic zoom.
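  • A minimal sketch of limiting feature extraction to the post-zoom field of view, assuming the zoom is specified by a window origin and a magnification (both parameter names are illustrative):

```python
import numpy as np

def post_zoom_region(image: np.ndarray, origin_yx: tuple,
                     magnification: float) -> np.ndarray:
    # The post-zoom angle of view covers 1/magnification of each dimension,
    # starting at origin_yx; screen feature amounts would be computed on
    # the returned crop only. Bounds checking is omitted for brevity.
    h, w = image.shape[:2]
    y0, x0 = origin_yx
    ch, cw = int(h / magnification), int(w / magnification)
    return image[y0:y0 + ch, x0:x0 + cw]
```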
  • In the above description, the image processing apparatus 100 performs the determination of image quality degradation and the determination of composition availability on a frame basis. For example, even when the region where image quality degradation occurs is small (for example, its area is less than a predetermined value), composition may not be performed for the entire frame. In the image processing apparatus 100 according to this modification, by contrast, only the region where image quality degradation occurs (or a region in its vicinity) may be excluded from composition, while the other regions are composed.
  • In other words, the image processing apparatus 100 may perform the determination of image quality degradation and the determination of composition availability on a region basis.
  • Alternatively, the image processing apparatus 100 may perform the determination of image quality degradation and the determination of composition availability on a subject basis. As a result, the image processing apparatus 100 can prevent composition of the entire screen from being abandoned (or, conversely, forced) because of image quality degradation occurring in only part of the screen.
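  • One way to realize such region-based composition is a per-pixel mask, as in the following sketch. The uniform blend ratio is a simplifying assumption; in practice the composition ratio would be determined as described earlier in this document.

```python
import numpy as np

def compose_except_degraded(color: np.ndarray, mono: np.ndarray,
                            degraded_mask: np.ndarray,
                            ratio: float = 0.5) -> np.ndarray:
    # degraded_mask is True where degradation was detected (or nearby);
    # there the color image is used as-is, elsewhere the two are blended.
    colorf = color.astype(np.float64)
    mono3 = np.repeat(mono.astype(np.float64)[..., None], 3, axis=2)
    blended = (1.0 - ratio) * colorf + ratio * mono3
    out = np.where(degraded_mask[..., None], colorf, blended)
    return out.astype(color.dtype)
```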
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 27 is a block diagram showing a schematic configuration example of a vehicle control system 7000 that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • Vehicle control system 7000 comprises a plurality of electronic control units connected via communication network 7010.
  • The vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting the plurality of control units may be, for example, an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing in accordance with various programs, a storage unit that stores the programs executed by the microcomputer or parameters used in various arithmetic operations, and a drive circuit that drives the devices to be controlled. Each control unit also includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside or outside the vehicle by wired or wireless communication.
  • In FIG. 27, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • the other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • Drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • The drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as an ABS (Antilock Brake System) or an ESC (Electronic Stability Control).
  • Vehicle state detection unit 7110 is connected to drive system control unit 7100.
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational movement of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like.
  • Drive system control unit 7100 performs arithmetic processing using a signal input from vehicle state detection unit 7110 to control an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
  • Body system control unit 7200 controls the operation of various devices equipped on the vehicle body according to various programs.
  • For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as head lamps, back lamps, brake lamps, blinkers, or fog lamps.
  • In this case, the body system control unit 7200 may receive radio waves transmitted from a portable device that substitutes for a key, or signals of various switches.
  • Body system control unit 7200 receives the input of these radio waves or signals, and controls a door lock device, a power window device, a lamp and the like of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310 which is a power supply source of the drive motor according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device provided with the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs temperature adjustment control of the secondary battery 7310 or control of a cooling device or the like provided in the battery device.
  • Outside-vehicle information detection unit 7400 detects information outside the vehicle equipped with vehicle control system 7000.
  • At least one of an imaging unit 7410 and an outside-vehicle information detection unit 7420 is connected to the outside-vehicle information detection unit 7400.
  • The imaging unit 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The outside-vehicle information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting another vehicle, an obstacle, a pedestrian, or the like around the vehicle equipped with the vehicle control system 7000.
  • the environment sensor may be, for example, at least one of a raindrop sensor that detects wet weather, a fog sensor that detects fog, a sunshine sensor that detects sunshine intensity, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a light detection and ranging (LIDAR) device.
  • the imaging unit 7410 and the external information detection unit 7420 may be provided as independent sensors or devices, or may be provided as an integrated device of a plurality of sensors or devices.
  • FIG. 28 shows an example of installation positions of the imaging unit 7410 and the external information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, 7918 are provided at, for example, at least one of the front nose of the vehicle 7900, the side mirror, the rear bumper, the back door, and the upper portion of the windshield of the vehicle interior.
  • An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided in the upper part of the windshield in the vehicle cabin mainly acquire an image in front of the vehicle 7900.
  • the imaging units 7912 and 7914 provided in the side mirror mainly acquire an image of the side of the vehicle 7900.
  • An imaging unit 7916 provided in the rear bumper or back door mainly acquires an image behind the vehicle 7900.
  • the imaging unit 7918 provided on the upper part of the windshield in the passenger compartment is mainly used to detect a leading vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 28 illustrates an example of the imaging range of each of the imaging units 7910, 7912, 7914, and 7916.
  • The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door.
  • For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 as viewed from above can be obtained.
  • the external information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and above the windshield of the vehicle interior may be, for example, ultrasonic sensors or radar devices.
  • the external information detection units 7920, 7926, 7930 provided on the front nose of the vehicle 7900, the rear bumper, the back door, and the upper part of the windshield of the vehicle interior may be, for example, a LIDAR device.
  • These outside-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle or the like.
  • The outside-vehicle information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. It also receives detection information from the connected outside-vehicle information detection unit 7420. When the outside-vehicle information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detection unit 7400 causes it to transmit ultrasonic waves or electromagnetic waves and receives information on the received reflected waves.
  • the external information detection unit 7400 may perform object detection processing or distance detection processing of a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received information.
  • the external information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions and the like based on the received information.
  • the external information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the external information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
  • The outside-vehicle information detection unit 7400 may also perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image.
  • the external information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • An in-vehicle information detection unit 7500 detects information in the vehicle.
  • a driver state detection unit 7510 that detects a state of a driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera for imaging the driver, a biometric sensor for detecting the driver's biological information, a microphone for collecting sound in the vehicle interior, and the like.
  • the biological sensor is provided, for example, on a seat or a steering wheel, and detects biological information of an occupant sitting on a seat or a driver who grips the steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, or may determine whether the driver is dozing off. The in-vehicle information detection unit 7500 may also perform processing such as noise cancellation on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 in accordance with various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by, for example, a device such as a touch panel, a button, a microphone, a switch or a lever, which can be input operated by the passenger.
  • the integrated control unit 7600 may receive data obtained by speech recognition of speech input by the microphone.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) compatible with the operation of the vehicle control system 7000.
  • The input unit 7800 may be, for example, a camera, in which case a passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Furthermore, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by a passenger or the like using the above-described input unit 7800 and outputs the generated signal to the integrated control unit 7600. By operating the input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 and instructs processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like.
  • the storage unit 7690 may be realized by a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point.
  • The general-purpose communication I/F 7620 may also connect to a terminal near the vehicle (for example, a terminal of a driver, a pedestrian, or a shop, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol designed for use in a vehicle.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically performs V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, vehicle-to-infrastructure (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • The positioning unit 7640 receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and executes positioning, thereby generating position information including the latitude, longitude, and altitude of the vehicle.
  • The positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, a PHS, or a smartphone having a positioning function.
  • The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station or the like installed on a road, and acquires information such as the current position, traffic congestion, road closures, or required time.
  • the function of the beacon reception unit 7650 may be included in the above-described dedicated communication I / F 7630.
  • An in-vehicle apparatus I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle apparatuses 7760 existing in the vehicle.
  • The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable, if necessary) not shown.
  • the in-vehicle device 7760 includes, for example, at least one of a mobile device or wearable device that the passenger has, or an information device carried in or attached to the vehicle.
  • The in-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination.
  • The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I / F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680.
  • For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100.
  • For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or shock mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, and the like. The microcomputer 7610 may also perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures or persons based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information on the current position of the vehicle. The microcomputer 7610 may also predict a danger such as a vehicle collision or a pedestrian or the like approaching the road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or aurally notifying information to a passenger or the outside of a vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • the display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display portion 7720 may have an AR (Augmented Reality) display function.
  • the output device may be another device such as a headphone, a wearable device such as a glasses-type display worn by a passenger, a projector, or a lamp other than these devices.
  • The display device visually displays results obtained from the various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs.
  • The audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be configured by a plurality of control units.
  • the vehicle control system 7000 may comprise another control unit not shown.
  • part or all of the functions of any control unit may be provided to another control unit. That is, as long as transmission and reception of information are performed via the communication network 7010, predetermined arithmetic processing may be performed by any control unit.
  • Similarly, a sensor or device connected to one of the control units may instead be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • a computer program for realizing each function of the image processing apparatus 100 according to the present embodiment described with reference to FIG. 4 can be implemented in any control unit or the like.
  • a computer readable recording medium in which such a computer program is stored can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory or the like.
  • the above computer program may be distributed via, for example, a network without using a recording medium.
  • The image processing apparatus 100 according to the present embodiment described with reference to FIG. 4 can be applied to the integrated control unit 7600 of the application example shown in FIG. 27.
  • At least some of the components of the image processing apparatus 100 described with reference to FIG. 4 may be realized in a module (for example, an integrated circuit module configured on one die) for the integrated control unit 7600 shown in FIG. 27.
  • Alternatively, the image processing apparatus 100 described with reference to FIG. 4 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 27.
  • As described above, the image processing apparatus 100 according to the present embodiment controls the composition of the color image and the black-and-white image not only based on the analysis results of the captured images but also by processing using various kinds of sensor information, such as from a distance sensor, a focus sensor, or an image plane phase difference sensor. More specifically, the image processing apparatus 100 determines whether the image quality of the composite image is degraded by processing using the various kinds of sensor information, and when it is determined that the image quality is degraded, it sets the composition ratio of the black-and-white image to substantially zero (or zero). In this way, the image processing apparatus 100 can improve the image quality of the composite image.
  • The techniques of the present disclosure may also be utilized when switching cameras. More specifically, when the user switches between two cameras having different angles of view in order to change the angle of view of the captured image, the user may feel discomfort due to the movement of the viewpoint at the time of switching.
  • Therefore, smooth switching may be realized by combining the captured images of the two cameras at the time of switching. When it is determined that image quality degradation would occur by combining the captured images of the two cameras, the combining may be skipped (in other words, only the captured image of one of the angles of view (wide or narrow) may be output). In this way, switching without image quality degradation can be realized.
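  • A schematic sketch of such a switch follows, assuming the two views are already aligned (warping and parallax compensation are omitted, and all names are illustrative): a cross-fade is used when composition is judged safe, and a hard cut otherwise.

```python
import numpy as np

def switch_frame(wide: np.ndarray, narrow: np.ndarray, t: float,
                 degradation_expected: bool) -> np.ndarray:
    # t runs from 0.0 (wide camera) to 1.0 (narrow camera) during the switch.
    if degradation_expected:
        # Combining would degrade quality: output a single camera's frame.
        return narrow if t >= 0.5 else wide
    a = float(np.clip(t, 0.0, 1.0))
    return ((1.0 - a) * wide + a * narrow).astype(wide.dtype)
```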
  • (1) An image processing apparatus including: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit that, for the composition of the color image and the black-and-white image, makes the composition ratio of the color image higher than the composition ratio of the black-and-white image.
  • (2) The image processing apparatus according to (1), in which the composition control unit makes the composition ratio of the color image higher than the composition ratio of the black-and-white image when it is determined, based on processing using predetermined sensor information, that the image quality of the composite image generated by the composition is degraded.
  • (3) The composition control unit makes the composition ratio of the black-and-white image substantially zero when it is determined that the image quality is degraded.
  • (4) The composition control unit determines whether the image quality is degraded based on the distance to the subject calculated by processing using the sensor information.
  • (5) The distance to the subject is calculated by processing using distance sensor information.
  • (6) The distance to the subject is calculated based on focus position information obtained when focusing is performed using the sensor information. The image processing apparatus according to (4).
  • (7) The distance to the subject is calculated by processing using image plane phase difference sensor information.
  • (8) The image plane phase difference sensor information includes information on distance and information on reliability, and the distance to the subject is calculated by weighted averaging based on the reliability.
  • (9) The composition control unit determines whether the image quality is degraded based on a feature amount calculated by processing using the black-and-white image generated by an image sensor or the color image.
  • (10) The feature amount is calculated based on the parallax between the color image and the black-and-white image.
  • (11) The feature amount is at least one of: a statistic indicating the variation in the parallax for each pixel; the ratio of pixels whose parallax exceeds a predetermined range of parallax amounts; and the ratio of pixels whose absolute parallax difference exceeds a predetermined amount, where the absolute parallax difference is calculated for each pixel between the pixel separated from it by a predetermined distance in the parallax direction and the pixel separated by the predetermined distance in the opposite direction.
  • (12) The feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.
  • (13) The feature amount is at least one of the ratio of the variation of the color difference signal to the variation of the luminance signal, and the ratio of an edge component of the color difference signal to an edge component of the luminance signal. The image processing apparatus according to (12).
  • (14) An image processing method implemented by a computer, including: acquiring a color image by imaging a subject; acquiring a black-and-white image by imaging the subject from a different viewpoint position; and, for the composition of the color image and the black-and-white image, setting the composition ratio of the color image higher than the composition ratio of the black-and-white image.
  • (15) An image processing apparatus including: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit that controls the composition of the color image and the black-and-white image by processing using predetermined sensor information.
  • (16) The composition control unit changes the composition ratio of the color image and the black-and-white image by processing using the sensor information.
  • (17) The composition control unit changes the composition ratio when it is determined, based on processing using the sensor information, that the image quality of the composite image generated by the composition is degraded.
  • (18) The composition control unit makes the composition ratio of the color image higher than the composition ratio of the black-and-white image.
  • (21) The distance to the subject is calculated by processing using distance sensor information.
  • (22) The distance to the subject is calculated based on focus position information obtained when focusing is performed using the sensor information.
  • (23) The distance to the subject is calculated by processing using image plane phase difference sensor information.
  • (24) The image plane phase difference sensor information includes information on distance and information on reliability, and the distance to the subject is calculated by weighted averaging based on the reliability.
  • (25) The composition control unit controls the composition based on a feature amount calculated by processing using the black-and-white image generated by an image sensor or the color image.
  • (26) The feature amount is calculated based on the parallax between the color image and the black-and-white image.
  • (27) The feature amount is at least one of: a statistic indicating the variation in the parallax for each pixel; the ratio of pixels whose parallax exceeds a predetermined range of parallax amounts; and the ratio of pixels whose absolute parallax difference exceeds a predetermined amount, where the absolute parallax difference is calculated for each pixel between the pixel separated from it by a predetermined distance in the parallax direction and the pixel separated by the predetermined distance in the opposite direction.
  • (28) The feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.
  • (29) The feature amount is at least one of the ratio of the variation of the color difference signal to the variation of the luminance signal, and the ratio of an edge component of the color difference signal to an edge component of the luminance signal. The image processing apparatus according to (28).
  • (30) An image processing method implemented by a computer, including: acquiring a color image by imaging a subject; acquiring a black-and-white image by imaging the subject from a different viewpoint position; and controlling the composition of the color image and the black-and-white image by processing using predetermined sensor information.
  • 100 image processing apparatus; 110 first imaging unit; 120 second imaging unit; 130 first pre-processing unit; 140 second pre-processing unit; 150 composition processing unit; 151 parallax histogram processing unit; 152 parallax distribution feature amount calculation unit; 153 search-range-exceeded feature amount calculation unit; 154 parallax gap feature amount calculation unit; 155 signal extraction unit; 156 Y/C dispersion ratio processing unit; 156a Y dispersion value calculation unit; 156b Cb dispersion value calculation unit; 156c Cr dispersion value calculation unit; 156d comparison unit; 156e Y/C dispersion ratio calculation unit; 157 Y/C edge component ratio processing unit; 157a Y edge component detection unit; 157b Cb edge component detection unit; 157c Cr edge component detection unit; 157d comparison unit; 157e Y/C edge component ratio calculation unit; 160 focus control unit; 170 distance sensor; 180 image quality deterioration determination unit; 181 short distance feature amount determination unit; 182 luminance difference small color difference large feature amount determination unit; 190 3D depth sensor


Abstract

[Problem] To be able to more appropriately composite a color image obtained by a first image-capturing unit and a monochrome image obtained by a second image-capturing unit that captures an image from a different viewpoint position than the first image-capturing unit. [Solution] Provided is an image processing device comprising: a first image-capturing unit that obtains a color image by capturing an image of a subject; a second image-capturing unit that obtains a monochrome image by capturing an image of the subject from a different viewpoint position than the first image-capturing unit; and a composition control unit that, as regards the composition of the color image and the monochrome image, makes the composition ratio of the color image higher than the composition ratio of the monochrome image.

Description

Image processing apparatus and image processing method
The present disclosure relates to an image processing apparatus and an image processing method.
Conventionally, in portable electronic devices such as smartphones and other information processing terminals, the image quality of the imaging unit is lower than that of a single-lens reflex camera or the like because of downsizing and thinning. For this reason, for example, Patent Document 1 describes a technology in which an image generated by a camera attachable to and detachable from an information processing terminal is supplied to the terminal by wireless communication. Patent Document 2 discloses a technology in which a plurality of imaging units are provided to simultaneously generate a plurality of images of different image quality, for example, an image of a first angle of view and an image of a second angle of view narrower than the first.
Patent Document 1: JP 2015-088824 A
Patent Document 2: JP 2013-219525 A
However, the prior art, including the technologies disclosed in Patent Document 1 and Patent Document 2 above, cannot appropriately combine images captured from mutually different viewpoint positions. For example, for pixels of a near view, unlike a distant view, the pixel corresponding to a pixel of interest in one captured image may fall outside the search range for parallax detection, so the corresponding point cannot be found correctly and the image quality of the composite image may be degraded.
The present disclosure has been made in view of the above problem, and an object of the present disclosure is to provide a new and improved image processing apparatus and image processing method capable of more appropriately combining a color image acquired by a first imaging unit and a black-and-white image acquired by a second imaging unit that performs imaging from a viewpoint position different from that of the first imaging unit.
According to the present disclosure, there is provided an image processing apparatus including: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit that, for the composition of the color image and the black-and-white image, makes the composition ratio of the color image higher than the composition ratio of the black-and-white image.
According to the present disclosure, there is also provided a computer-implemented image processing method including: acquiring a color image by imaging a subject; acquiring a black-and-white image by imaging the subject from a different viewpoint position; and, for the composition of the color image and the black-and-white image, setting the composition ratio of the color image higher than the composition ratio of the black-and-white image.
According to the present disclosure, there is also provided an image processing apparatus including: a first imaging unit that acquires a color image by imaging a subject; a second imaging unit that acquires a black-and-white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and a composition control unit that controls the composition of the color image and the black-and-white image by processing using predetermined sensor information.
According to the present disclosure, there is also provided a computer-implemented image processing method including: acquiring a color image by imaging a subject; acquiring a black-and-white image by imaging the subject from a different viewpoint position; and controlling the composition of the color image and the black-and-white image by processing using predetermined sensor information.
As described above, according to the present disclosure, it becomes possible to more appropriately combine the color image acquired by the first imaging unit and the black-and-white image acquired by the second imaging unit that performs imaging from a viewpoint position different from that of the first imaging unit.
Note that the above effects are not necessarily limited; together with or instead of the above effects, any of the effects described in this specification, or other effects that can be grasped from this specification, may be achieved.
A diagram showing a specific example of the image processing apparatus 100 according to the present embodiment.
A diagram for explaining the image quality obtained by the composition processing.
A diagram showing occlusion when the black-and-white image is used as the reference.
A block diagram showing a functional configuration example of the image processing apparatus 100 according to the present embodiment.
A diagram illustrating the pixel arrays of the first imaging unit 110 and the second imaging unit 120.
A diagram for explaining the determination of image quality degradation based on the image plane phase difference sensor.
A diagram for explaining the determination of image quality degradation based on the image plane phase difference sensor.
A diagram for explaining the determination of image quality degradation based on the image plane phase difference sensor.
A block diagram showing a functional configuration example of the composition processing unit 150 and the image quality deterioration determination unit 180.
A diagram illustrating a parallax histogram.
A diagram for explaining the absolute parallax difference.
A diagram illustrating a parallax gap histogram.
A diagram for explaining the determination of image quality degradation based on the parallax gap feature amount and the search-range-exceeded feature amount.
A diagram for explaining a luminance difference small color difference large area.
A block diagram showing a functional configuration example of the composition processing unit 150 and the image quality deterioration determination unit 180.
A block diagram showing a functional configuration example of the Y/C dispersion ratio processing unit 156.
A diagram for explaining the determination of image quality degradation based on the Y/C dispersion ratio.
A block diagram showing a functional configuration example of the Y/C edge component ratio processing unit 157.
A diagram for explaining the determination of image quality degradation based on the Y/C edge component ratio.
A flowchart showing an example of the determination of image quality degradation and the composition processing.
A block diagram showing a functional configuration example of the image processing apparatus 100 according to a modification.
A diagram for explaining the determination of image quality degradation based on the 3D depth sensor.
A diagram for explaining the determination of image quality degradation based on the 3D depth sensor.
A diagram for explaining the determination of composition availability based on the user's gaze on the subject.
A flowchart showing an example of the determination of image quality degradation and the composition processing according to a modification.
A diagram for explaining that a region with degraded image quality is enlarged by the electronic zoom.
A block diagram showing an example of a schematic configuration of a vehicle control system.
An explanatory diagram showing an example of the installation positions of the outside-vehicle information detection units and the imaging units.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
 The description will proceed in the following order.
 1. Embodiment
  1.1. Overview
  1.2. Functional configuration example
  1.3. Examples of determining image quality degradation
  1.4. Example of the processing flow
 2. Modifications
  2.1. Determination based on a 3D depth sensor
  2.2. Determination based on the user's gaze on the subject
  2.3. Determination based on electronic zoom
  2.4. Partial composition control
 3. Application examples
 4. Summary
  <1. Embodiment>
 (1.1. Overview)
 First, an overview of the embodiment according to the present disclosure will be given.
 The image processing apparatus 100 according to the present embodiment includes a first imaging unit 110 that captures a subject to acquire a color image, and a second imaging unit 120 that captures the subject from a viewpoint position different from that of the first imaging unit 110 to acquire a black-and-white image. For example, as shown in 1A of FIG. 1, the image processing apparatus 100 is a smartphone, and as shown in 1B, the first imaging unit 110 and the second imaging unit 120 are provided at mutually different positions on the back of the smartphone.
 The image processing apparatus 100 then generates a composite image by combining the color image and the black-and-white image. More specifically, since parallax arises between the color image and the black-and-white image captured from different positions, the image processing apparatus 100 searches for mutually corresponding points (corresponding points) by matching the two images, and aligns the images with reference to those corresponding points to generate the composite image. Because the luminance can thereby be raised in accordance with the characteristics of the lens and sensor used in the second imaging unit 120, the image processing apparatus 100 can generate a highly sensitive image even under low illuminance.
 However, the prior art, including the techniques disclosed in Patent Document 1 and Patent Document 2 above, cannot appropriately combine images captured from mutually different viewpoint positions. FIG. 2 is a diagram for explaining the image quality obtained by the composition processing of a conventional image processing apparatus. For example, for pixels in a near view, unlike those in a distant view, the pixel corresponding to a pixel of interest in one captured image may fall outside the search range of the parallax detection, so the corresponding point cannot be obtained correctly and the image quality of the composite image may degrade.
 Occlusion also increases in a near view compared with a distant view. FIG. 3 shows the occlusion when the black-and-white image acquired by the second imaging unit 120 is used as the reference. When occlusion arises from the parallax, the color image acquired by the first imaging unit 110 has no image data corresponding to the occlusion region. As a result, corresponding points may not be obtained correctly, or color information may be missing in the occlusion region of the composite image generated by the composition processing.
 Furthermore, since a luminance signal is normally used to compute corresponding points, in a region where the gradation of the luminance signal is low and the gradation of the color-difference signal is high (hereinafter called a "small-luminance-difference, large-color-difference region"), corresponding points cannot be obtained correctly and the image quality of the composite image obtained by the composition processing may degrade.
 Techniques have been developed that refrain from combining in regions where corresponding points cannot be obtained correctly, or that refrain from combining when a near view is judged. However, these techniques often base the decision on whether to combine solely on the analysis result of the captured images, so their accuracy is sometimes insufficient.
 The present discloser arrived at the present invention in view of the above circumstances. The image processing apparatus 100 according to the present disclosure controls the composition of the color image and the black-and-white image not only by the analysis result of the captured images but also by processing that uses information from various sensors such as a distance sensor, a focus sensor, and an image plane phase difference sensor. More specifically, the image processing apparatus 100 determines, by processing using such sensor information, whether the image quality of the composite image will degrade, and when it determines that the image quality will degrade, it sets the composition ratio of the black-and-white image to substantially zero (or zero). The image processing apparatus 100 can thereby improve the image quality of the composite image.
 Note that "setting the composition ratio of the black-and-white image to substantially zero (or zero)" aims to reduce the degradation of the composite image to an extent not noticed by the user. Hereinafter, for convenience, setting the composition ratio of the black-and-white image to substantially zero (or zero) may be expressed as "not combining (or composition off)", and performing the composition of the color image and the black-and-white image when it is determined that the image quality of the composite image will not degrade may be expressed as "combining (or composition on)". That is, when the image processing apparatus 100 determines that the image quality of the composite image will degrade, it does not combine the color image and the black-and-white image (composition off), and when it determines that the image quality will not degrade, it combines them (composition on). When it determines that the image quality will degrade, the image processing apparatus 100 may also, rather than not combining at all, simply reduce the composition ratio of the black-and-white image to mitigate the degradation.
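 As a minimal sketch of this control, assuming a simple per-pixel luminance blend (the function names and the nominal ratio are illustrative, not the disclosed implementation):

import numpy as np

def fuse_luma(y_color, y_mono, ratio):
    # ratio = 0.0 leaves the color image's luminance unchanged
    # ("composition off"); larger ratios mix in the monochrome signal.
    return (1.0 - ratio) * np.asarray(y_color, dtype=float) \
           + ratio * np.asarray(y_mono, dtype=float)

def choose_ratio(degradation_judged, nominal_ratio=0.5):
    # When degradation is judged, drive the black-and-white composition
    # ratio to (substantially) zero instead of blending.
    return 0.0 if degradation_judged else nominal_ratio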
 (1.2. Functional configuration example)
 The overview of the embodiment according to the present disclosure has been described above. Next, a functional configuration example of the image processing apparatus 100 will be described with reference to FIG. 4.
 As shown in FIG. 4, the image processing apparatus 100 includes a first imaging unit 110, a second imaging unit 120, a first preprocessing unit 130, a second preprocessing unit 140, a composition processing unit 150, a focus control unit 160, a distance sensor 170, and an image quality degradation determination unit 180.
 (First imaging unit 110, second imaging unit 120)
 The first imaging unit 110 and the second imaging unit 120 are configured using an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, perform photoelectric conversion of the light taken in by a lens (not shown), and generate captured image data. The first imaging unit 110 and the second imaging unit 120 also differ in their characteristics.
 FIG. 5 illustrates the pixel arrays of the first imaging unit 110 and the second imaging unit 120. 5A shows the pixel array of the first imaging unit 110. The first imaging unit 110 is configured using, for example, a color filter in which red (R), blue (B), and green (G) pixels are arranged in a Bayer array. In the Bayer array, in each 2 × 2 pixel unit, the two pixels at diagonal positions are green (G) pixels and the remaining pixels are a red (R) pixel and a blue (B) pixel. That is, the first imaging unit 110 is configured of color pixels, each of which outputs an electric signal based on the amount of incident light of one of the red, blue, and green color components. The first imaging unit 110 therefore generates color image data in which each pixel indicates one of the three primary color (RGB) components.
 5B shows the pixel array of the second imaging unit 120. In the second imaging unit 120, all pixels are W (white) pixels that output an electric signal based on the amount of incident light over the entire wavelength region of visible light. The second imaging unit 120 therefore generates black-and-white image data.
 The focus control unit 160 described later realizes autofocus by changing the position of a predetermined lens provided in each of the first imaging unit 110 and the second imaging unit 120. The first imaging unit 110 and the second imaging unit 120 provide the image quality degradation determination unit 180 with information on the lens position in the in-focus state (hereinafter called "focus position information"). By analyzing the focus position information, the image quality degradation determination unit 180 can calculate the distance between the image processing apparatus 100 and the subject (hereinafter called the "subject distance" for convenience).
 (First preprocessing unit 130, second preprocessing unit 140)
 The first preprocessing unit 130 applies correction processing such as lens distortion correction, defective pixel correction, gain control, white balance correction, and noise reduction, as well as demosaicing and scaling processing, to the color image data acquired by the first imaging unit 110. The first preprocessing unit 130 provides the preprocessed color image data to the composition processing unit 150.
 The second preprocessing unit 140 applies correction processing such as lens distortion correction, defective pixel correction, gain control, and noise reduction, as well as scaling processing, to the black-and-white image data acquired by the second imaging unit 120. The second preprocessing unit 140 provides the corrected black-and-white image data to the composition processing unit 150.
 (Focus control unit 160)
 The focus control unit 160 is a functional configuration that realizes autofocus when imaging processing is performed by the first imaging unit 110 and the second imaging unit 120. More specifically, the focus control unit 160 realizes autofocus based on the contrast of the image data or on information from the image plane phase difference sensor.
 Regarding autofocus based on the contrast of the image data: the focus control unit 160 acquires the color image data and the black-and-white image data from the first imaging unit 110 and the second imaging unit 120 and analyzes them to calculate a contrast value. Using the contrast value, the focus control unit 160 determines whether the image is in focus; if not, it determines the focusing direction of the lens from the contrast value of the image data and drives the lens to bring the image into focus. In other words, when the contrast value is not substantially at its maximum, the focus control unit 160 drives the lens so that the contrast value becomes substantially maximal.
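 A hill-climbing loop of this kind might be sketched as follows; the lens actuator, the frame grabber, and the gradient-based contrast metric are assumptions for illustration only:

import numpy as np

def contrast_value(image):
    # One of many possible contrast metrics: mean squared gradient.
    gy, gx = np.gradient(np.asarray(image, dtype=float))
    return float(np.mean(gx * gx + gy * gy))

def contrast_autofocus(capture_frame, move_lens, candidate_positions):
    # Scan candidate lens positions and keep the one whose frame
    # maximizes the contrast value, i.e. the in-focus position.
    best_pos, best_c = None, -1.0
    for pos in candidate_positions:
        move_lens(pos)                       # hypothetical lens interface
        c = contrast_value(capture_frame())  # hypothetical frame grab
        if c > best_c:
            best_pos, best_c = pos, c
    return best_pos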
 Regarding autofocus based on information from the image plane phase difference sensor: in the image plane phase difference sensor, two types of pupil-divided imaging elements are arranged intermixed on the chip. The image plane phase difference sensor can thereby calculate the subject distance from the obtained parallax information, and the focus control unit 160 drives the lens to the position corresponding to the obtained subject distance to bring the image into focus.
 The focus control unit 160 realizes autofocus using at least one of the contrast of the image data and the information from the image plane phase difference sensor. The method of realizing autofocus is not limited to these.
 (Distance sensor 170)
 The distance sensor 170 is a sensor capable of measuring the subject distance by a predetermined method. For example, the distance sensor 170 includes a light source (for example, an LED or a laser diode) capable of emitting visible or invisible light (for example, infrared light) and a light receiving element. After emitting light from the light source, the distance sensor 170 receives the light reflected by the subject with the light receiving element, evaluates and computes the reflected light, converts the result into a distance, and outputs it.
 The principle of measuring the subject distance may be, for example, a triangulation method that converts the imaging position on the light receiving element into a distance, or a time-of-flight method that measures the slight time from the emission of light until its reception and converts that time difference into a distance, but is not limited to these. The subject to be measured is assumed to be, for example, a subject located near the center of the angle of view, but is not limited to this. The distance sensor 170 provides subject distance data (also called "distance sensor information") to the image quality degradation determination unit 180.
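 For the time-of-flight variant, the conversion from the measured delay to a distance is the textbook round-trip relation (a generic formula, not a detail specific to this disclosure):

C_LIGHT = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s):
    # Light travels to the subject and back, hence the division by two.
    return C_LIGHT * round_trip_time_s / 2.0

# Example: a 10 ns round trip corresponds to roughly 1.5 m.
print(tof_distance_m(10e-9))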
 (Image quality degradation determination unit 180)
 The image quality degradation determination unit 180 is a functional configuration that determines the presence or absence of image quality degradation by processing using various kinds of sensor information, and functions as a composition control unit. The determination can be realized in various ways; details are given in "1.3. Examples of determining image quality degradation".
 (Composition processing unit 150)
 The composition processing unit 150 is a functional configuration that controls the composition processing of the color image and the black-and-white image, and functions as a composition control unit. More specifically, when the image quality degradation determination unit 180 determines that the image quality of the composite image will degrade, the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero (or zero). The composition processing unit 150 can thereby generate a composite image of high image quality. As noted above, when degradation is determined, the composition processing unit 150 may, instead of setting the composition ratio of the black-and-white image to substantially zero (or zero), simply reduce the composition ratio of the black-and-white image to mitigate the degradation.
 The composition processing unit 150 also analyzes the color image data and the black-and-white image data acquired from the first preprocessing unit 130 and the second preprocessing unit 140 to calculate image feature quantities for determining image quality degradation caused by parallax. The composition processing unit 150 may set the entire captured image as the calculation target region of the image feature quantities, or may set the calculation target region so as to exclude the regions at the top, bottom, left, and right edges of the captured image. Setting the calculation target region to exclude the edge regions prevents, for example, the parallax or the parallax gap distance described later from becoming impossible to calculate because the pixel of interest lies at a side edge, so the image feature quantities can be calculated accurately. Computation costs such as histogram generation can also be reduced.
 The composition processing unit 150 then provides the extracted image feature quantities to the image quality degradation determination unit 180, which can use them to determine the presence or absence of image quality degradation. Since these image feature quantities are derived from captured image data generated by the image sensors provided in the first imaging unit 110 and the second imaging unit 120, determining image quality degradation using them can be regarded as determining image quality degradation by processing using sensor information from the image sensors. Details of the processing are given in "1.3. Examples of determining image quality degradation".
 The functional configuration example of the image processing apparatus 100 has been described above. The functional configuration described with FIG. 4 is merely an example, and the functional configuration of the image processing apparatus 100 is not limited to it. For example, the image processing apparatus 100 does not necessarily have to include all of the functional configurations shown in FIG. 4, and its functional configuration can be flexibly modified according to specifications and operation.
 (1.3. Examples of determining image quality degradation)
 The functional configuration example of the image processing apparatus 100 has been described above. Next, examples of the method by which the image quality degradation determination unit 180 of the image processing apparatus 100 determines the presence or absence of image quality degradation will be described. Of the various determination methods described below, any single one may be used, or they may be used in combination with one another.
 (Determination based on ISO sensitivity)
 When a color image and a black-and-white image are combined, as in the image processing apparatus 100 according to the present embodiment, it may be required to adjust the brightness of the color image and the black-and-white image to about the same level. If the difference in ISO (International Organization for Standardization) sensitivity between the first imaging unit 110 and the second imaging unit 120 then becomes larger than a predetermined value, the image quality of the composite image degrades (or composition becomes impossible).
 Let the shutter speeds of the first imaging unit 110 and the second imaging unit 120 be substantially the same, let ISOmin1 be the ISO sensitivity of the first imaging unit 110 when its gain is set to substantially the minimum value, and let ISOmin2 be the ISO sensitivity of the second imaging unit 120 when its gain is set to substantially the minimum value. When the target ISO sensitivity falls within the range of the following expression (1), the first imaging unit 110 and the second imaging unit 120 cannot be set to substantially the same ISO sensitivity.
[Math. 1]
 Accordingly, the image quality degradation determination unit 180 acquires the ISO sensitivity information from each of the first imaging unit 110 and the second imaging unit 120, and when either ISO sensitivity falls within the range of expression (1) above, the image quality degradation determination unit 180 determines that the image quality of the composite image will degrade (or is highly likely to degrade), and the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
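 A sketch of this check, assuming that expression (1) (not reproduced here) denotes the interval between the two minimum sensitivities:

def iso_in_unmatchable_range(iso_target, iso_min1, iso_min2):
    # Assumed reading of expression (1): a target ISO lying between the
    # two minimum sensitivities can be reached by only one of the units,
    # so the two units cannot be set to substantially the same ISO.
    lo, hi = sorted((iso_min1, iso_min2))
    return lo <= iso_target < hi

# Example (illustrative values): ISOmin1 = 100, ISOmin2 = 50;
# a target of ISO 80 is then judged to degrade the composite image.
print(iso_in_unmatchable_range(80, 100, 50))  # True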
 The image quality degradation determination unit 180 may also set a hysteresis d expressed by the following expression (2). This allows the image quality degradation determination unit 180 to prevent frequent switching between the composition-on state and the composition-off state.
[Math. 2]
 (Determination based on the distance sensor 170)
 The image quality degradation determination unit 180 may determine whether the image quality of the composite image will degrade by processing using the sensor information provided from the distance sensor 170. More specifically, the distance sensor 170 measures the subject distance by a predetermined method as described above. The image quality degradation determination unit 180 then compares the subject distance data provided from the distance sensor 170 with a predetermined threshold to determine whether the subject distance is short enough to cause image quality degradation. When the subject distance is determined to be that short, the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
 The above processing may be modified as appropriate. For example, when the reliability of the subject distance data is also provided (or when the reliability can be calculated by predetermined processing), the image quality degradation determination unit 180 may perform the determination based on the subject distance data only when that reliability is higher than a predetermined value.
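 A sketch of this gating; the distance threshold and the reliability cutoff are illustrative values, not values taken from the disclosure:

NEAR_THRESHOLD_M = 0.5   # illustrative: distance judged close enough to degrade
MIN_RELIABILITY = 0.8    # illustrative reliability gate

def degradation_judged_by_distance(distance_m, reliability=None):
    # Skip the judgment entirely when the sensor reading is not
    # reliable enough to use (fall back to other judgment methods).
    if reliability is not None and reliability < MIN_RELIABILITY:
        return False
    return distance_m < NEAR_THRESHOLD_M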
 (Determination based on focus position information)
 The image quality degradation determination unit 180 may determine whether the image quality of the composite image will degrade based on the focus position information (information on the lens position in the in-focus state) provided from each of the first imaging unit 110 and the second imaging unit 120. More specifically, the image quality degradation determination unit 180 can convert the focus position information into a subject distance. The method of converting the focus position information into the subject distance is not particularly limited, and a known method may be used.
 The image quality degradation determination unit 180 then compares the subject distance with a predetermined threshold to determine whether the subject distance is short enough to cause image quality degradation, and when it determines that it is, the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
 The image quality degradation determination unit 180 may also set a hysteresis d expressed by the following expression (3), where "lenspos" denotes the lens position in the in-focus state. This allows the image quality degradation determination unit 180 to prevent frequent switching between the composition-on state and the composition-off state.
[Math. 3]
 (Determination based on the image plane phase difference sensor)
 The image quality degradation determination unit 180 may determine whether the image quality of the composite image will degrade based on information provided from the image plane phase difference sensor provided in the first imaging unit 110 or the second imaging unit 120. The image plane phase difference sensor can output a distance map indicating the subject distance in each region within the screen, so the image quality degradation determination unit 180 can perform the determination based on that distance map. For example, suppose that, as shown in FIG. 6, the screen is divided into 7 vertical by 9 horizontal regions and the image plane phase difference sensor can output the subject distance per region. The image quality degradation determination unit 180 can then recognize, based on the distance map provided from the image plane phase difference sensor, that a subject at a short subject distance appears in region 10 (the region at the lower left of the screen), and can perform the determination of image quality degradation.
 However, since the accuracy of the distance map can vary greatly with the contrast of the subject, a reliability map indicating the reliability of the distance map is usually output together with it (the distance map and the reliability map are also called "image plane phase difference sensor information"). An example of the distance map provided by the image plane phase difference sensor is shown in 7A of FIG. 7, and an example of the accompanying reliability map in 7B. Each map represents the subject distance in each region of one captured image and its reliability, and the regions of each map correspond to the regions shown in FIG. 6. The image quality degradation determination unit 180 performs the following processing using the distance map and the reliability map to determine the presence or absence of image quality degradation.
 First, the image quality degradation determination unit 180 extracts, from the acquired reliability map, the regions having a reliability of at least a predetermined value Rmin, and extracts the corresponding data in the distance map. The image quality degradation determination unit 180 then searches the extracted distance map data for the lowest value (in other words, the data with the shortest subject distance) and sets it as Dmin. Next, the image quality degradation determination unit 180 calculates the range of the distance D in which the nearest subject is assumed to be contained, according to the following expression (4).
[Math. 4]
 The image quality degradation determination unit 180 then extracts, from the distance map data extracted above, the data contained in the range of the distance D shown in expression (4), and sorts the extracted data in order of distance, as shown in FIG. 8. FIG. 8 also shows the reliability corresponding to each distance. Let Di be the i-th distance in the sorted data, and Ri the reliability corresponding to Di.
 Next, letting Rmax be a reliability that can be fully trusted, the image quality degradation determination unit 180 extracts from the sorted data the data whose reliability is at least Rmax, and lets N be the index of the most distant of the extracted data. If no data with a reliability of at least Rmax exists in the sorted data, N is the index of the most distant data among the sorted data. The image quality degradation determination unit 180 then estimates the distance Dobj at which the nearest subject is assumed to be contained, by performing the computation of the following expression (5); in other words, it calculates Dobj as a weighted average based on the reliabilities.
 Dobj = (R1*D1 + R2*D2 + ... + RN*DN) / (R1 + R2 + ... + RN)    ...(5)
 The image quality degradation determination unit 180 then compares the distance Dobj with a predetermined threshold to determine whether Dobj is short enough to cause image quality degradation, and when it determines that it is, the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
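 Collecting the steps above into one sketch; since expression (4) is not reproduced here, the extent of the distance range around Dmin is left as a free parameter (an assumption):

import numpy as np

def estimate_nearest_distance(dist_map, rel_map, r_min, r_max, range_margin):
    # dist_map / rel_map: per-region distance and reliability (Fig. 7).
    mask = rel_map >= r_min              # keep regions of usable reliability
    if not np.any(mask):
        return None                      # judgment not possible from this sensor
    d, r = dist_map[mask], rel_map[mask]
    d_min = d.min()                      # shortest reliable distance
    keep = d <= d_min + range_margin     # stand-in for expression (4)
    d, r = d[keep], r[keep]
    order = np.argsort(d)                # sort by distance (Fig. 8)
    d, r = d[order], r[order]
    trusted = np.nonzero(r >= r_max)[0]  # fully trustworthy entries
    n = trusted[-1] + 1 if trusted.size else d.size
    # Expression (5): reliability-weighted average over entries 1..N.
    return float(np.sum(r[:n] * d[:n]) / np.sum(r[:n]))

def near_subject_judged(d_obj, threshold):
    return d_obj is not None and d_obj < threshold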
 The above processing may be modified as appropriate. For example, in a system that operates continuously, such as a preview display of moving images or still images, the image quality degradation determination unit 180 may apply a temporal smoothing filter to the distance map provided from the image plane phase difference sensor, which improves the accuracy of the distance map. At this time, if no data with a reliability of at least the predetermined value Rmin exists in the reliability map provided from the image plane phase difference sensor, the image quality degradation determination unit 180 cannot apply the temporal smoothing filter to the distance map and therefore cannot perform the determination based on the image plane phase difference sensor. In this case, the image quality degradation determination unit 180 may exclude the frames for which the determination was impossible from the frames to which the temporal smoothing filter is applied. Alternatively, only in this case, the image quality degradation determination unit 180 may perform the determination based on the focus position information. In other words, the image quality degradation determination unit 180 may switch the information used for the determination between the information from the image plane phase difference sensor and the focus position information, according to the reliability of the image plane phase difference sensor.
 (Determination based on image feature quantities of a short-distance subject)
 The image quality degradation determination unit 180 may determine whether the image quality of the composite image will degrade based on image feature quantities of a subject whose subject distance is at most a predetermined value (hereinafter called a "short-distance subject"). More specifically, the composition processing unit 150 may analyze the color image and the black-and-white image to calculate a parallax distribution feature quantity, a search-range-exceeding feature quantity, or a parallax gap feature quantity, and the image quality degradation determination unit 180 may determine the presence or absence of image quality degradation by determining whether these image feature quantities correspond to those of a short-distance subject.
 Here, a functional configuration example of the composition processing unit 150 and the image quality degradation determination unit 180 for this determination method will be described with reference to FIG. 9. As shown in FIG. 9, the composition processing unit 150 includes a parallax histogram processing unit 151, a parallax distribution feature quantity calculation unit 152, a search-range-exceeding feature quantity calculation unit 153, and a parallax gap feature quantity calculation unit 154, and the image quality degradation determination unit 180 includes a short-distance feature quantity determination unit 181.
 The parallax histogram processing unit 151 performs parallax detection based on the black-and-white image data and the color image data supplied from the first preprocessing unit 130 and the second preprocessing unit 140, and generates parallax information indicating the detected parallax. Since the first imaging unit 110 and the second imaging unit 120 perform imaging from different viewpoint positions, as shown in 1B of FIG. 1, the captured images acquired by the first imaging unit 110 and the second imaging unit 120 have parallax. The parallax histogram processing unit 151 therefore generates parallax information indicating the parallax of each pixel based on the captured image data supplied from the first preprocessing unit 130 and the second preprocessing unit 140.
 The parallax histogram processing unit 151 generates the parallax information by corresponding point detection processing such as block matching. For example, the parallax histogram processing unit 151 takes the captured image acquired by one of the first imaging unit 110 and the second imaging unit 120 as the reference captured image, and detects the block region on the other captured image that is most similar to a reference block region based on the position of interest on the reference captured image. It calculates a parallax vector indicating the difference between the positions of the detected block region and the reference block region. The parallax histogram processing unit 151 calculates the parallax with each pixel on the reference captured image as the position of interest, and outputs the parallax vector calculated for each pixel.
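 A minimal block-matching sketch of this search (sum-of-absolute-differences cost, horizontal search in one direction only; the block size, search range, and search direction are illustrative assumptions):

import numpy as np

def disparity_by_block_matching(base, other, block=8, search=64):
    # base: reference captured image (e.g. the black-and-white image);
    # other: the image searched for the most similar block region.
    h, w = base.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = base[y:y + block, x:x + block].astype(np.int32)
            best_d, best_cost = 0, None
            for d in range(0, min(search, x) + 1):    # search toward the left
                cand = other[y:y + block, x - d:x - d + block].astype(np.int32)
                cost = int(np.abs(ref - cand).sum())  # SAD similarity cost
                if best_cost is None or cost < best_cost:
                    best_d, best_cost = d, cost
            disp[y:y + block, x:x + block] = best_d   # parallax for the block
    return disp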
 The parallax histogram processing unit 151 then generates a histogram using the parallax vectors calculated for each pixel of the calculation target region. FIG. 10 illustrates parallax histograms: (a) of FIG. 10 is the parallax histogram of a captured image in which the subjects are nearly on the same plane, and (b) of FIG. 10 illustrates the parallax histogram of a captured image in which the distances to the subjects differ. In the latter histogram, a peak occurs at a position away from parallax "0" in the negative direction owing to the difference in distance. (c) of FIG. 10 illustrates the parallax histogram of a captured image in which the distances to the subjects differ, producing multiple parallaxes, and in which a large parallax arises because a subject is close. In this parallax histogram, since the subject is closer and produces a larger parallax than in (b) of FIG. 10, a peak occurs at a position even further in the negative direction than in (b) of FIG. 10.
 The parallax histogram processing unit 151 further generates a parallax gap histogram. FIG. 11 is a diagram for explaining the parallax difference absolute value used to generate the parallax gap histogram. As shown in FIG. 11, the parallax histogram processing unit 151 calculates the parallax PV1 at a position horizontally separated from the position of the pixel of interest in the calculation target region by "-(PARALLAX_DIFF_DISTANCE/2)" pixels. It also calculates the parallax PV2 at a position horizontally separated from the pixel-of-interest position by "(PARALLAX_DIFF_DISTANCE/2)" pixels, and calculates the parallax difference absolute value PVapd shown in expression (6). The parallax gap distance (PARALLAX_DIFF_DISTANCE) is set in advance.
 PVapd = |PV1 - PV2|    ...(6)
 For example, when the subjects are nearly on the same plane, the difference between the parallaxes PV1 and PV2 is small, so the parallax difference absolute value PVapd is small. When the distances to the subjects differ and the pixel of interest is at a boundary between subjects at different distances, the difference between PV1 and PV2 is large, so PVapd is large. The parallax histogram processing unit 151 generates the parallax gap histogram, which is a histogram of the parallax difference absolute values PVapd calculated with each pixel of the calculation target region as the pixel of interest. FIG. 12 illustrates a parallax gap histogram.
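 Expression (6) and the parallax gap histogram can be sketched directly over such a disparity map (the array slicing and bin count are illustrative):

import numpy as np

def parallax_gap_histogram(disp, gap_distance, bins=64):
    # PV1 / PV2: parallaxes at -(gap_distance/2) and +(gap_distance/2)
    # horizontally from each pixel of interest; edge pixels for which
    # the offsets fall outside the image are simply dropped here.
    half = gap_distance // 2
    pv1 = disp[:, : disp.shape[1] - 2 * half].astype(np.int64)
    pv2 = disp[:, 2 * half:].astype(np.int64)
    pvapd = np.abs(pv1 - pv2)                    # expression (6)
    hist, edges = np.histogram(pvapd, bins=bins)
    return pvapd, hist, edges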
 The parallax distribution feature quantity calculation unit 152 calculates, as the parallax distribution feature quantity, a statistic indicating the characteristics of the parallax distribution from the parallax histogram generated by the parallax histogram processing unit 151. It calculates, for example, the standard deviation as that statistic and uses it as the parallax distribution feature quantity FVfsd. For example, let "FVfsd-a" be the feature quantity calculated from the histogram of (a) of FIG. 10, "FVfsd-b" that from the histogram of (b) of FIG. 10, and "FVfsd-c" that from the histogram of (c) of FIG. 10. Then "FVfsd-a < FVfsd-b, FVfsd-c". By having the parallax distribution feature quantity calculation unit 152 calculate the standard deviation of the parallax histogram as the parallax distribution feature quantity FVfsd in this way, it can be determined from FVfsd whether the subjects are nearly on the same plane or there are multiple parallaxes.
 The search-range-exceeding feature quantity calculation unit 153 calculates, from the parallax histogram generated by the parallax histogram processing unit 151, the search-range-exceeding feature quantity FVosr, which indicates the ratio of the frequency of parallaxes at or beyond the preset search range (over_search_range_counter) to the total frequency (counter). The search-range-exceeding feature quantity calculation unit 153 uses the parallax histogram to perform the computation of expression (7) and calculates FVosr.
 FVosr = over_search_range_counter / counter    ...(7)
 For example, let "FVosr-a" be the search-range-exceeding feature quantity calculated from the histogram of (a) of FIG. 10, "FVosr-b" that from the histogram of (b) of FIG. 10, and "FVosr-c" that from the histogram of (c) of FIG. 10. Then "FVosr-a, FVosr-b < FVosr-c". By having the search-range-exceeding feature quantity calculation unit 153 calculate FVosr in this way, it can be determined from FVosr whether a subject producing a large parallax has been imaged. That is, it becomes possible to detect a short-distance subject for which the matching accuracy decreases (or matching is impossible).
 The parallax gap feature quantity calculation unit 154 calculates the parallax gap feature quantity FVpd from the parallax gap histogram generated by the parallax histogram processing unit 151. FVpd indicates the ratio of the frequency of parallax gaps at or beyond the preset maximum parallax gap distance (large_parallax_diff_counter) to the total frequency (counter). The parallax gap feature quantity calculation unit 154 uses the parallax gap histogram to perform the computation of expression (8) and calculates FVpd.
 FVpd = large_parallax_diff_counter / counter    ...(8)
 The parallax gap feature quantity FVpd calculated by the parallax gap feature quantity calculation unit 154 thus indicates the proportion of pixels producing at least the maximum parallax gap distance. Since subjects on the same plane have small parallax gaps while the parallax gap is large at image boundaries between subjects at different distances, it is possible to determine the occurrence of image boundaries between subjects whose distances differ greatly.
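 The three feature quantities then reduce to simple statistics over the disparity data, with expressions (7) and (8) read as plain ratios (whether the disclosure additionally scales them, e.g. to percent, is not reproduced here):

import numpy as np

def parallax_feature_quantities(disp, pvapd, search_range, max_gap_distance):
    fv_fsd = float(np.std(disp))                       # parallax distribution feature
    counter = disp.size                                # total frequency
    over = int(np.count_nonzero(np.abs(disp) >= search_range))
    fv_osr = over / counter                            # expression (7)
    large = int(np.count_nonzero(pvapd >= max_gap_distance))
    fv_pd = large / pvapd.size                         # expression (8)
    return fv_fsd, fv_osr, fv_pd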
 The short-distance feature quantity determination unit 181 then determines the presence or absence of image quality degradation based on the image feature quantities calculated by the parallax distribution feature quantity calculation unit 152, the search-range-exceeding feature quantity calculation unit 153, and the parallax gap feature quantity calculation unit 154.
 Here, an example of the determination method of the short-distance feature quantity determination unit 181 will be described with reference to FIG. 13. FIG. 13 shows, with the parallax gap feature quantity FVpd on the vertical axis and the search-range-exceeding feature quantity FVosr on the horizontal axis, the results regarding image quality degradation when imaging was performed in various scenes, together with a determination curve 20 and related curves. More specifically, the user first uses the image processing apparatus 100 to image various scenes while changing the position of the subject and other conditions, and outputs the composite image, the parallax gap feature quantity FVpd, and the search-range-exceeding feature quantity FVosr. The user then visually checks for image quality degradation and judges, for each scene, whether composition should be turned off. The image processing apparatus 100 then uses machine learning that uses the set of these judgment results as training data (so-called supervised learning) to output the determination curve, the curve that most appropriately separates the cases with and without image quality degradation. The machine learning method is not limited to this; the determination curve may also be output using deep learning, various simulation techniques, or the like.
 The image quality degradation determination unit 180 then compares the point indicated by the parallax gap feature quantity FVpd and the search-range-exceeding feature quantity FVosr extracted from the captured images with the determination curve. When the point lies above the determination curve (when the point lies in the region of the arrow 22 in FIG. 13), the image quality degradation determination unit 180 determines that the image quality of the composite image will degrade (or is highly likely to degrade), and the composition processing unit 150 sets the composition ratio of the black-and-white image to substantially zero.
 The image quality degradation determination unit 180 may also set the hysteresis represented by the curve 21 in FIG. 13: once composition has been turned off, it switches back to the composition-on state only when the point indicated by the image feature quantities newly extracted from the captured images falls below the curve 21 (when the point lies in the region of the arrow 23 in FIG. 13). This allows the image quality degradation determination unit 180 to prevent frequent switching between the composition-on state and the composition-off state.
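 With the learned determination curve 20 and hysteresis curve 21 available as callables f20 and f21 (an assumption; the disclosure obtains the curves offline by supervised learning), the on/off control might be sketched as a small state machine:

def update_composition_state(composition_on, fv_osr, fv_pd, f20, f21):
    # f20(fv_osr), f21(fv_osr): FVpd thresholds given by the curves of
    # Fig. 13, with curve 21 assumed to lie below curve 20.
    if composition_on:
        # Above the determination curve, degradation is judged: turn off.
        return fv_pd <= f20(fv_osr)
    # Once off, return to on only after dropping below the hysteresis
    # curve, which prevents frequent on/off switching.
    return fv_pd < f21(fv_osr)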
 In the description above, the parallax gap feature amount FVpd and the search-range-exceeding feature amount FVosr are used to determine whether image quality deterioration occurs, but the parallax distribution feature amount FVfsd may be used as well. More specifically, the user captures various scenes with the image processing apparatus 100 in the same manner as above, and the apparatus calculates the composite image, the parallax gap feature amount FVpd, the search-range-exceeding feature amount FVosr, and the parallax distribution feature amount FVfsd. The user then visually checks for deterioration and decides, scene by scene, whether combining should be turned off, and a determination surface is output by machine learning that uses this set of decisions as training data. Here, the determination surface is a surface represented in the three-dimensional coordinate system obtained by taking the depth direction of FIG. 13 as the parallax distribution feature amount FVfsd. The image quality deterioration determination unit 180 then determines whether deterioration occurs by comparing the determination surface with the point defined by the parallax gap feature amount FVpd, the search-range-exceeding feature amount FVosr, and the parallax distribution feature amount FVfsd extracted from the captured images.
 In this way, by combining a plurality of image feature amounts, the image quality deterioration determination unit 180 can improve the accuracy of the deterioration determination. Of course, the image feature amounts may be combined freely, and only one of them may be used. Although the above description assumes that each image feature amount is calculated from a parallax histogram or a parallax gap histogram, each image feature amount may instead be calculated on the basis of a parallax map obtained from the color image and the black-and-white image.
 (Determination based on feature amounts of regions with a small luminance difference and a large color difference)
 The image quality deterioration determination unit 180 may determine whether image quality deterioration occurs on the basis of feature amounts of a region with a small luminance difference and a large color difference (a region where the gradation of the luminance signal is low and the gradation of the color-difference signal is high). For example, as shown in FIG. 14, suppose that a reddish area 30 and a bluish area 31 are adjacent on the display screen (in other words, the gradation of the color-difference signal is high) and the luminance difference between area 30 and area 31 is smaller than a predetermined value (in other words, the gradation of the luminance signal is low). In this case, the area 32 containing the boundary between area 30 and area 31 can be said to be a small-luminance-difference, large-color-difference region.
 Since parallax estimation is normally performed using the luminance signal, the accuracy of parallax estimation drops in regions where the gradation of the luminance signal is low. Even if an erroneous parallax is output, in a region where the gradation of the color-difference signal is low the Y, Cb, and Cr signals each change little, so the resulting deterioration in the composite image is small. In a region where an erroneous parallax is output and the gradation of the color-difference signal is high, however, the deterioration at the time of combining becomes large. Therefore, when the image quality deterioration determination unit 180 detects a small-luminance-difference, large-color-difference region wider than a predetermined area, the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero.
 A functional configuration example of the combining processing unit 150 and the image quality deterioration determination unit 180 for this determination method will now be described with reference to FIG. 15. As shown in FIG. 15, the combining processing unit 150 includes a signal extraction unit 155, a Y/C variance ratio processing unit 156, and a Y/C edge component ratio processing unit 157, and the image quality deterioration determination unit 180 includes a small-luminance-difference, large-color-difference feature amount determination unit 182. In a small-luminance-difference, large-color-difference region, two feature amounts tend to become large: the ratio of the variance of the C signal (where the C signal refers to the Cb signal or the Cr signal) to the variance of the Y signal (hereinafter the "Y/C variance ratio"), and the ratio of the edge component of the C signal to the edge component of the Y signal (hereinafter the "Y/C edge component ratio"). The combining processing unit 150 therefore extracts the Y, Cb, and Cr signals from the color image data with the signal extraction unit 155 and feeds them to the Y/C variance ratio processing unit 156 and the Y/C edge component ratio processing unit 157 to calculate these feature amounts. The small-luminance-difference, large-color-difference feature amount determination unit 182 then determines, on the basis of each feature amount, whether image quality deterioration occurs.
 Processing based on the Y/C variance ratio will be described first. FIG. 16 shows a functional configuration example of the Y/C variance ratio processing unit 156. As shown in FIG. 16, the Y/C variance ratio processing unit 156 includes a Y variance calculation unit 156a, a Cb variance calculation unit 156b, a Cr variance calculation unit 156c, a comparison unit 156d, and a Y/C variance ratio calculation unit 156e.
 The Y variance calculation unit 156a, the Cb variance calculation unit 156b, and the Cr variance calculation unit 156c each divide the whole screen into regions of a fixed size and calculate the variance of the Y, Cb, and Cr signals, respectively, in each region. Since variances are calculated in the usual way, the details are omitted. The comparison unit 156d then compares the variance of the Cb signal with that of the Cr signal and provides the larger of the two to the Y/C variance ratio calculation unit 156e. The Y/C variance ratio calculation unit 156e calculates the ratio of the variance of the C signal (the larger of the Cb and Cr variances) to the variance of the Y signal and provides this ratio to the small-luminance-difference, large-color-difference feature amount determination unit 182, which determines on the basis of the Y/C variance ratio whether image quality deterioration occurs.
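 As an informal illustration of this block-wise computation (the block size, the use of NumPy, and the epsilon guard are assumptions, not details from the patent), the Y/C variance ratio of FIG. 16 could be computed as follows:

    # Hypothetical sketch of the Y/C variance ratio per fixed-size block.
    import numpy as np

    def yc_variance_pairs(y, cb, cr, block=32):
        # Returns (var(Y), var(C)) per block, where var(C) is the larger of
        # the Cb and Cr variances, as selected by comparison unit 156d.
        h, w = y.shape
        pairs = []
        for r in range(0, h - block + 1, block):
            for c in range(0, w - block + 1, block):
                sl = (slice(r, r + block), slice(c, c + block))
                pairs.append((y[sl].var(), max(cb[sl].var(), cr[sl].var())))
        return pairs

    def yc_variance_ratio(var_y, var_c, eps=1e-6):
        return var_c / (var_y + eps)   # ratio formed by calculation unit 156e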
 An example of how deterioration is determined from the Y/C variance ratio will now be described with reference to FIG. 17. First, the user captures various scenes with the image processing apparatus 100 while changing the position of the subject and so on, and the apparatus outputs the composite image and the Y/C variance ratio. The user then visually checks for deterioration and decides, scene by scene, whether combining should be turned off. Using machine learning or the like that takes this set of decisions as training data, the image processing apparatus 100 outputs the characteristics of Y/C variance ratios at which deterioration is likely to occur. For example, FIG. 17 plots, with the variance of the C signal on the vertical axis and the variance of the Y signal on the horizontal axis, the presence or absence of deterioration observed for the captured scenes, along with a region 40 corresponding to the Y/C variance ratios at which deterioration is likely (in other words, when a Y/C variance ratio falls within region 40, deterioration is likely to occur). The small-luminance-difference, large-color-difference feature amount determination unit 182 then determines whether the Y/C variance ratio calculated for each region of the captured image falls within region 40. When the number of regions whose ratios fall within region 40 is at least a predetermined number, the unit 182 determines that the image quality of the composite image deteriorates (or is highly likely to deteriorate), and the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero.
 The above processing is merely an example and may be modified as appropriate. For example, the characteristics of Y/C variance ratios at which deterioration is likely to occur may be obtained using techniques other than machine learning.
 Processing based on the Y/C edge component ratio will be described next. FIG. 18 shows a functional configuration example of the Y/C edge component ratio processing unit 157. As shown in FIG. 18, the Y/C edge component ratio processing unit 157 includes a Y edge component detection unit 157a, a Cb edge component detection unit 157b, a Cr edge component detection unit 157c, a comparison unit 157d, and a Y/C edge component ratio calculation unit 157e.
 The Y edge component detection unit 157a, the Cb edge component detection unit 157b, and the Cr edge component detection unit 157c detect the edge components of the Y, Cb, and Cr signals, respectively, at each pixel (in other words, they detect locations where each signal changes sharply or discontinuously). The edge detection method (detection algorithm and the like) is not particularly limited, and known techniques may be used. The comparison unit 157d then compares the edge component of the Cb signal with that of the Cr signal and provides the larger of the two to the Y/C edge component ratio calculation unit 157e. The Y/C edge component ratio calculation unit 157e calculates the ratio of the edge component of the C signal (the larger of the Cb and Cr edge components) to the edge component of the Y signal and provides this ratio to the small-luminance-difference, large-color-difference feature amount determination unit 182, which determines on the basis of the Y/C edge component ratio whether image quality deterioration occurs.
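 A minimal per-pixel version of this computation might look as follows, with simple image gradients standing in for the unspecified edge detector (the gradient choice and epsilon guard are assumptions, not part of the patent):

    # Hypothetical sketch of the Y/C edge component ratio of FIG. 18.
    import numpy as np

    def edge_magnitude(img):
        gy, gx = np.gradient(img.astype(np.float64))  # stand-in edge detector
        return np.hypot(gx, gy)

    def yc_edge_ratio(y, cb, cr, eps=1e-6):
        e_y = edge_magnitude(y)
        e_c = np.maximum(edge_magnitude(cb), edge_magnitude(cr))  # unit 157d
        return e_c / (e_y + eps)   # one ratio per pixel (unit 157e)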
 An example of how deterioration is determined from the Y/C edge component ratio will now be described with reference to FIG. 19. First, the user captures various scenes with the image processing apparatus 100 while changing the position of the subject and so on, and the apparatus outputs the composite image and the Y/C edge component ratio. The user then visually checks for deterioration and decides, scene by scene, whether combining should be turned off. Using machine learning or the like that takes this set of decisions as training data, the image processing apparatus 100 outputs the characteristics of Y/C edge component ratios at which deterioration is likely to occur. For example, FIG. 19 plots, with the edge component of the C signal on the vertical axis and the edge component of the Y signal on the horizontal axis, the presence or absence of deterioration observed for the captured scenes, along with a region 50 corresponding to the Y/C edge component ratios at which deterioration is likely (in other words, when a Y/C edge component ratio falls within region 50, deterioration is likely to occur). The small-luminance-difference, large-color-difference feature amount determination unit 182 then determines whether the Y/C edge component ratio calculated for each pixel of the captured image falls within region 50. When the number of pixels whose ratios fall within region 50 is at least a predetermined number, the unit 182 determines that the image quality of the composite image deteriorates (or is highly likely to deteriorate), and the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero.
 The above processing is merely an example and may be modified as appropriate. For example, the characteristics of Y/C edge component ratios at which deterioration is likely to occur may be obtained using techniques other than machine learning.
 (1.4. Example of processing flow)
 The above has described examples of how the image quality deterioration determination unit 180 of the image processing apparatus 100 determines whether image quality deterioration occurs. Next, an example of the processing flow performed by the functional components of the image processing apparatus 100 will be described with reference to FIG. 20. As noted above, the image processing apparatus 100 can control the combining process by combining the various determination methods described above; FIG. 20 shows the processing flow for the case in which all of the methods described above are combined.
 First, in step S1000, the image quality deterioration determination unit 180 of the image processing apparatus 100 determines, on the basis of the ISO sensitivity, whether image quality deterioration occurs. More specifically, the image quality deterioration determination unit 180 acquires the ISO sensitivity information from each of the first imaging unit 110 and the second imaging unit 120 and determines whether deterioration occurs according to whether either ISO sensitivity falls within the range given by equation (1) above. If it is determined that deterioration occurs (step S1000/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image).
 If it is determined that no deterioration occurs (step S1000/No), then in step S1008 the image quality deterioration determination unit 180 determines whether deterioration occurs by processing the sensor information provided by the distance sensor 170. More specifically, the image quality deterioration determination unit 180 analyzes the subject distance data provided by the distance sensor 170 and determines whether the subject distance is short enough to cause deterioration. If the subject distance is determined to be that short (step S1008/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image).
 If the subject distance is determined not to be short enough to cause deterioration (step S1008/No), then in step S1012 the image quality deterioration determination unit 180 determines whether deterioration occurs on the basis of the focus position information provided by each of the first imaging unit 110 and the second imaging unit 120. More specifically, the image quality deterioration determination unit 180 converts the focus position information into a subject distance and determines whether that distance is short enough to cause deterioration. If it is (step S1012/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image).
 If the subject distance is determined not to be short enough to cause deterioration (step S1012/No), then in step S1016 the image quality deterioration determination unit 180 determines whether deterioration occurs on the basis of information provided by the image-plane phase-difference sensor. More specifically, the image quality deterioration determination unit 180 calculates the distance Dobj by applying equation (5) to the distance map and reliability map provided by the image-plane phase-difference sensor, and determines whether the distance Dobj is short enough to cause deterioration. If it is (step S1016/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image).
 If the distance Dobj is determined not to be short enough to cause deterioration (step S1016/No), then in step S1020 the image quality deterioration determination unit 180 determines whether deterioration occurs on the basis of the image feature amounts of a short-distance subject. More specifically, the combining processing unit 150 outputs the parallax gap feature amount FVpd, the search-range-exceeding feature amount FVosr, or the parallax distribution feature amount FVfsd from the black-and-white image data and the color image data, and the image quality deterioration determination unit 180 determines whether deterioration occurs according to whether these image feature amounts correspond to those of a short-distance subject. If they do (step S1020/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image).
 If the image feature amounts are determined not to correspond to those of a short-distance subject (step S1020/No), then in step S1024 the image quality deterioration determination unit 180 determines whether deterioration occurs on the basis of the image feature amounts of small-luminance-difference, large-color-difference regions. More specifically, the combining processing unit 150 outputs the Y/C variance ratio or the Y/C edge component ratio from the captured images, and the image quality deterioration determination unit 180 determines whether deterioration occurs according to whether these feature amounts correspond to those of a small-luminance-difference, large-color-difference region. If they do (step S1024/Yes), the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero in step S1004 (in other words, it enters the combining-off state, or reduces the combining ratio of the black-and-white image). If they do not (step S1024/No), then in step S1028 the combining processing unit 150 combines the color image and the black-and-white image without setting the combining ratio of the black-and-white image to substantially zero (in other words, it enters the combining-on state), and the series of processes ends.
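 The cascade of FIG. 20 amounts to a sequence of veto checks; the following condensed sketch (function names invented for illustration) captures the control flow, with each callable returning True when its test predicts visible deterioration:

    # Hypothetical condensed form of FIG. 20: any single check that predicts
    # deterioration (S1000, S1008, S1012, S1016, S1020, S1024) forces the
    # black-and-white combining ratio to ~0 (S1004); otherwise combining
    # proceeds normally (S1028).
    def combining_enabled(checks):
        return not any(degrades() for degrades in checks)

    # Example usage with invented check names:
    # combining_enabled([iso_too_high, subject_too_close_sensor,
    #                    subject_too_close_focus, subject_too_close_phase,
    #                    near_subject_features, ydiff_small_cdiff_large])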
 It is assumed that the processing of FIG. 20 is repeated for every captured image, but the present technology is not limited to this. Note that the steps in the flowchart of FIG. 20 need not necessarily be processed chronologically in the order described; they may be processed in a different order or in parallel. As noted above, individual steps in the flowchart may also be omitted as appropriate.
 <2. Modifications>
 An embodiment of the present disclosure has been described above. Modifications of the present disclosure will now be described.
 (2.1. Determination based on a 3D depth sensor)
 The above described an example in which the presence or absence of image quality deterioration is determined by processing the sensor information from the distance sensor 170. As shown in FIG. 21, the image processing apparatus 100 according to this modification includes a 3D depth sensor 190 in place of the distance sensor 170, and may determine whether deterioration occurs by processing the sensor information from the 3D depth sensor 190. The rest of the configuration is the same as in FIG. 4.
 As shown in FIG. 22, the 3D depth sensor 190 includes, for example, an infrared light emitting unit 191 and a light receiving unit 192: the light emitting unit 191 irradiates the subject with infrared light, and the light receiving unit 192 receives the infrared light reflected by the subject. The 3D depth sensor 190 measures the short interval between emission and reception of the infrared light and can create a distance map by the time-of-flight method, which converts that time difference into a distance. FIG. 23 shows an example of the distance map created by the 3D depth sensor 190 when the subject is a soccer ball as in FIG. 22. As shown in FIG. 23, the distance map indicates the subject distance by, for example, shading: the darker the color, the closer the subject. The image quality deterioration determination unit 180 acquires the distance map from the 3D depth sensor 190 and analyzes it to identify the closest subject distance (hereinafter the "nearest-neighbor distance 60"; see FIG. 23).
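 As a numerical aside (the formula is the standard time-of-flight relation; the helper names are invented for illustration), the conversion from the measured time to a distance and the extraction of the nearest-neighbor distance could be sketched as:

    # Hypothetical sketch: light travels to the subject and back, so
    # distance = c * dt / 2; the per-pixel result forms the distance map.
    C_LIGHT = 299_792_458.0  # speed of light in m/s

    def tof_distance(round_trip_seconds):
        return C_LIGHT * round_trip_seconds / 2.0

    def nearest_neighbor_distance(distance_map):
        # Assumes the map stores metric distances (FIG. 23 renders them as
        # shading, darker = closer); the minimum plays the role of
        # nearest-neighbor distance 60.
        return distance_map.min()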
 When the image quality deterioration determination unit 180 determines that the nearest-neighbor distance 60 is short enough to cause image quality deterioration, the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero. Since the distance sensor 170 basically outputs the subject distance at a single point on the screen, it has difficulty outputting the nearest-neighbor distance 60. This modification, by contrast, outputs the nearest-neighbor distance 60 from a distance map of the whole screen and can therefore determine the presence or absence of deterioration with higher accuracy.
 The above processing is merely an example and may be modified as appropriate. For example, the type of light emitted by the 3D depth sensor 190, the method of creating the distance map, the contents of the distance map, and so on are not particularly limited.
 (2.2. Determination based on the user's gaze toward the subject)
 The more closely the user is gazing at a subject, the more readily the user perceives deterioration in the image quality of that subject. Therefore, when determining the presence or absence of deterioration from the distance map provided by the 3D depth sensor 190 as described above, the image quality deterioration determination unit 180 may correct the distance map so that distances in the map tend to become shorter the closer they are to a recognized position such as a face, the in-focus position, or the gaze position (for example, the center of the screen, a position identified by gaze analysis, or the like).
 A concrete example of this distance map correction when the center of the screen is taken as the gaze position will now be described with reference to FIG. 24. 24A shows the distance map before correction (assumed to be identical to the one shown in FIG. 23). 24B shows the values along a straight line 71 drawn on the distance map of 24A.
 Suppose a coefficient function such as the one shown in 24C is defined so that the closer a point is to the center of the screen, which is the gaze position, the higher the output value of the distance map (in other words, the shorter the distance in the map). The image quality deterioration determination unit 180 then multiplies each value of the pre-correction distance map shown in 24B by the corresponding value of the coefficient function shown in 24C to output the corrected distance map shown in 24D. The image quality deterioration determination unit 180 then analyzes the corrected distance map to identify the nearest-neighbor distance 60, and when it determines that the nearest-neighbor distance 60 is short enough to cause image quality deterioration, the combining processing unit 150 sets the combining ratio of the black-and-white image to substantially zero.
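 One way to realize such a correction (the Gaussian-shaped coefficient centered on the gaze position is an assumed stand-in for the unspecified coefficient function of 24C) is an element-wise multiply over a "nearness" map, in which larger values mean closer subjects:

    # Hypothetical sketch of the 24B x 24C -> 24D correction.
    import numpy as np

    def correct_nearness_map(nearness, gaze_rc, sigma):
        h, w = nearness.shape
        rows, cols = np.ogrid[:h, :w]
        d2 = (rows - gaze_rc[0]) ** 2 + (cols - gaze_rc[1]) ** 2
        coeff = 0.2 + 0.8 * np.exp(-d2 / (2.0 * sigma ** 2))  # peaks at gaze
        return nearness * coeff   # values near the gaze position dominate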
 In this way, the image quality deterioration determination unit 180 can take the user's gaze toward the subject into account when determining whether deterioration is likely to be perceived. The above processing is merely an example and may be modified as appropriate; for example, the coefficient function may be changed from the one shown in 24C.
 Incidentally, when the focus controlled by the focus control unit 160 is on a distant view, the subject the user is gazing at is also likely to be far away. In that case, even if a short-distance subject enters the angle of view, the user is not gazing at it, so the user is unlikely to perceive its deterioration (in other words, the deterioration can be tolerated).
 The image quality deterioration determination unit 180 may therefore modify the flowchart of FIG. 20 as shown in FIG. 25. FIG. 25 includes step S1108, which does not exist in the flowchart of FIG. 20. More specifically, in step S1108 the image quality deterioration determination unit 180 converts the focus position information into a subject distance; if it determines that the focus is on a distant view and the subject the user is gazing at is therefore far away (step S1108/Yes), then in step S1132 the combining processing unit 150 combines the color image and the black-and-white image without setting the combining ratio of the black-and-white image to substantially zero (in other words, it enters the combining-on state). If, on the other hand, the image quality deterioration determination unit 180 determines that the subject the user is gazing at is close (step S1108/No), the processing from step S1112 onward (identical to the processing from step S1008 onward in FIG. 20) is performed. In this way, the combining process can improve the image quality of the distant portion the user is gazing at.
 (2.3. Determination based on electronic zoom)
 Electronic zoom is usually performed after image processing that includes the combining process. If electronic zoom is performed after the combining process has degraded the image quality, the degraded area is enlarged, as shown in FIG. 26, making the deterioration more conspicuous. In FIG. 26, 26A shows a composite image with degraded image quality, and 26B shows an image in which the degraded area has been enlarged by electronic zoom after the combining process. The image quality deterioration determination unit 180 may therefore change the thresholds used in the deterioration determinations described above according to the magnification of the electronic zoom.
 Furthermore, when electronic zoom is performed after image processing that includes the combining process, the angle of view at which combining is performed differs from the angle of view after the zoom, so deterioration may occur only in areas that do not appear on the screen at the post-zoom angle of view. However, since the deterioration determinations described above are performed on the whole screen, the combining-off state is set even when the deterioration occurs in an area that the electronic zoom has removed from the screen.
 The image processing apparatus 100 according to this modification therefore extracts the screen feature amounts only within the region of the post-zoom angle of view. More specifically, the combining processing unit 150 acquires information on the electronic zoom (for example, information such as the zoom origin and magnification from which the region of the post-zoom angle of view can be identified) and, on the basis of that information, calculates the various screen feature amounts within the region of the post-zoom angle of view. The image quality deterioration determination unit 180 then determines the presence or absence of deterioration from these screen feature amounts. As a result, even when deterioration occurs, the image processing apparatus 100 can continue the combining process if the deterioration lies outside the region of the post-zoom angle of view.
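 A sketch of restricting the feature extraction to the post-zoom field of view follows; representing the zoom information as a top-left origin plus a magnification is an assumption made for illustration:

    # Hypothetical sketch: compute features only inside the post-zoom crop.
    def zoom_window(frame_shape, origin_rc, magnification):
        h, w = frame_shape
        zh = int(round(h / magnification))
        zw = int(round(w / magnification))
        r0, c0 = origin_rc
        return slice(r0, r0 + zh), slice(c0, c0 + zw)

    def features_in_zoom(frame, origin_rc, magnification, feature_fn):
        rs, cs = zoom_window(frame.shape[:2], origin_rc, magnification)
        return feature_fn(frame[rs, cs])  # degradation outside the crop is ignored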
 (2.4. Partial combining control)
 In the above, the image processing apparatus 100 determined the presence or absence of deterioration and the propriety of combining on a frame-by-frame basis, so combining might not be performed even when, for example, the area in which deterioration occurs is small (for example, when its area is at most a predetermined value). The image processing apparatus 100 according to this modification may instead exclude from combining only the area in which deterioration occurs (or its vicinity) and perform combining everywhere else.
 More specifically, when the screen is divided into a plurality of regions as in FIG. 6, the image processing apparatus 100 may determine the presence or absence of deterioration and the propriety of combining for each region. Furthermore, when the image processing apparatus 100 can recognize the contour of a subject in the captured image, it may make these determinations for each subject. In this way, the image processing apparatus 100 can prevent deterioration in part of the screen from suppressing combining over the whole screen (and vice versa). A region-wise version of this control is sketched below.
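 The following sketch illustrates the idea (the block size and the boolean mask representation are assumptions for illustration):

    # Hypothetical sketch: blocks flagged as degradation-prone fall back to
    # the color image alone; all other blocks keep the combined result.
    import numpy as np

    def combine_partial(color, combined, degraded_blocks, block=32):
        out = combined.copy()
        for bi, bj in np.argwhere(degraded_blocks):
            rs = slice(bi * block, (bi + 1) * block)
            cs = slice(bj * block, (bj + 1) * block)
            out[rs, cs] = color[rs, cs]  # black-and-white contribution ~ zero
        return out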
 <3. Application examples>
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).
 FIG. 27 is a block diagram showing a schematic configuration example of a vehicle control system 7000, an example of a mobile body control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in FIG. 27, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
 Each control unit includes a microcomputer that performs arithmetic processing in accordance with various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used in various computations, and a drive circuit that drives the devices under its control. Each control unit also includes a network I/F for communicating with the other control units via the communication network 7010, as well as a communication I/F for wired or wireless communication with devices and sensors inside and outside the vehicle. FIG. 27 shows, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690. The other control units likewise include a microcomputer, a communication I/F, a storage unit, and so on.
 The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle in accordance with various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
 A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the amount of accelerator pedal operation, the amount of brake pedal operation, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like. The drive system control unit 7100 performs arithmetic processing using the signals input from the vehicle state detection unit 7110 to control the internal combustion engine, the drive motor, the electric power steering device, the braking device, and so on.
 The body system control unit 7200 controls the operation of the various devices with which the vehicle body is equipped in accordance with various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 7200. The body system control unit 7200 accepts these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, and so on.
 The battery control unit 7300 controls the secondary battery 7310, which is the power supply source for the drive motor, in accordance with various programs. For example, information such as the battery temperature, the battery output voltage, and the remaining battery capacity is input to the battery control unit 7300 from the battery device that includes the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals to control the temperature regulation of the secondary battery 7310, a cooling device provided in the battery device, and so on.
 The outside-vehicle information detection unit 7400 detects information outside the vehicle in which the vehicle control system 7000 is mounted. For example, at least one of an imaging unit 7410 and an outside-vehicle information detection section 7420 is connected to the outside-vehicle information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The outside-vehicle information detection section 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle in which the vehicle control system 7000 is mounted.
 The environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the outside-vehicle information detection section 7420 may each be provided as an independent sensor or device, or as a device in which a plurality of sensors or devices are integrated.
 FIG. 28 shows an example of the installation positions of the imaging unit 7410 and the outside-vehicle information detection section 7420. Imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions on the vehicle 7900: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the cabin. The imaging unit 7910 on the front nose and the imaging unit 7918 at the upper part of the windshield inside the cabin mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 on the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 on the rear bumper or back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 at the upper part of the windshield inside the cabin is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 FIG. 28 also shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. Imaging range a indicates the imaging range of the imaging unit 7910 on the front nose, imaging ranges b and c indicate those of the imaging units 7912 and 7914 on the side mirrors, and imaging range d indicates that of the imaging unit 7916 on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view of the vehicle 7900 seen from above is obtained.
 The outside-vehicle information detection sections 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and at the upper part of the windshield inside the cabin may be, for example, ultrasonic sensors or radar devices. The outside-vehicle information detection sections 7920, 7926, and 7930 provided on the front nose, rear bumper, and back door of the vehicle 7900 and at the upper part of the windshield inside the cabin may be, for example, LIDAR devices. These outside-vehicle information detection sections 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
 Returning to FIG. 27, the description continues. The outside-vehicle information detection unit 7400 causes the imaging unit 7410 to capture images outside the vehicle and receives the captured image data. It also receives detection information from the connected outside-vehicle information detection section 7420. When the outside-vehicle information detection section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves. On the basis of the received information, the outside-vehicle information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like. It may also perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like, and may calculate the distance to objects outside the vehicle.
 On the basis of the received image data, the outside-vehicle information detection unit 7400 may also perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, and the like. The outside-vehicle information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data and may combine image data captured by different imaging units 7410 to generate a bird's-eye image or a panoramic image. It may also perform viewpoint conversion processing using image data captured by different imaging units 7410.
The in-vehicle information detection unit 7500 detects information about the vehicle interior. For example, a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500. The driver state detection unit 7510 may include a camera that images the driver, a biometric sensor that detects biological information of the driver, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided, for example, on the seat surface or the steering wheel, and detects biological information of an occupant sitting on a seat or of the driver gripping the steering wheel. Based on the detection information input from the driver state detection unit 7510, the in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off. The in-vehicle information detection unit 7500 may perform processing such as noise cancellation on the collected audio signal.
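One common way to turn per-frame eye-state detections into a dozing estimate is PERCLOS, the fraction of time the eyes are closed over a sliding window. The sketch below is an assumption about how such a determination could be made, not the method of this disclosure; the window length and threshold are illustrative.

    from collections import deque

    class DrowsinessEstimator:
        """PERCLOS-style dozing check over a sliding window of frames."""
        def __init__(self, window_frames=1800, threshold=0.15):  # ~60 s at 30 fps
            self.samples = deque(maxlen=window_frames)
            self.threshold = threshold

        def update(self, eyes_closed):
            """Feed one per-frame detection; returns True if dozing is suspected."""
            self.samples.append(1.0 if eyes_closed else 0.0)
            perclos = sum(self.samples) / len(self.samples)
            return perclos > self.threshold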
The integrated control unit 7600 controls the overall operation within the vehicle control system 7000 in accordance with various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device that can be operated by an occupant, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by speech recognition of voice input through the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) compatible with the operation of the vehicle control system 7000. The input unit 7800 may also be, for example, a camera, in which case an occupant can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by an occupant may be input. Furthermore, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by an occupant or the like using the above-described input unit 7800 and outputs it to the integrated control unit 7600. By operating the input unit 7800, an occupant or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.
The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, computation results, sensor values, and the like. The storage unit 7690 may also be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication I/F 7620 is a general-purpose communication interface that mediates communication with various devices existing in the external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also called Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication I/F 7620 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network). The general-purpose communication I/F 7620 may also connect, for example using P2P (Peer To Peer) technology, to a terminal existing near the vehicle (for example, a terminal of the driver, a pedestrian, or a shop, or an MTC (Machine Type Communication) terminal).
The dedicated communication I/F 7630 is a communication interface that supports communication protocols designed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which combines the lower-layer IEEE 802.11p with the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
The positioning unit 7640 receives, for example, GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. The positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal with a positioning function, such as a mobile phone, a PHS, or a smartphone.
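GNSS receivers commonly report fixes as NMEA 0183 sentences; a minimal parser for the GGA sentence, which carries latitude, longitude, and altitude, might look as follows. This is a generic sketch of NMEA parsing, not code from this disclosure.

    def parse_gga(sentence):
        """Parse a $GPGGA sentence into (lat_deg, lon_deg, alt_m)."""
        f = sentence.split(",")
        lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mmmm -> degrees
        if f[3] == "S":
            lat = -lat
        lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mmmm -> degrees
        if f[5] == "W":
            lon = -lon
        return lat, lon, float(f[9])                     # altitude in metres

    print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))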
The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from radio stations or the like installed along roads, and acquires information such as the current position, traffic congestion, road closures, or required travel time. The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable), not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device possessed by an occupant, or an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on acquired information about the inside and outside of the vehicle, and may output control commands to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, lane departure warning, and the like. The microcomputer 7610 may also perform cooperative control aimed at automated driving, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on acquired information about the surroundings of the vehicle.
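As one concrete example of computing a control target value for following travel, the sketch below keeps a time-headway-based gap to the preceding vehicle with a simple proportional law. The gains, headway, and acceleration limits are illustrative assumptions, not values from this disclosure.

    def follow_accel_cmd(ego_speed_mps, gap_m, lead_speed_mps,
                         headway_s=1.8, k_gap=0.3, k_speed=0.5):
        """Proportional controller for inter-vehicle-distance keeping."""
        desired_gap_m = max(2.0, headway_s * ego_speed_mps)
        accel = (k_gap * (gap_m - desired_gap_m)
                 + k_speed * (lead_speed_mps - ego_speed_mps))
        return max(-3.0, min(2.0, accel))   # clamp to comfort/brake limits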
The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and persons, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information on the surroundings of the current position of the vehicle. The microcomputer 7610 may also, based on the acquired information, predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road, and generate a warning signal. The warning signal may be, for example, a signal for producing a warning sound or lighting a warning lamp.
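Collision danger prediction of this kind is often based on time-to-collision (TTC): the current distance divided by the closing speed. A minimal sketch, with an illustrative warning threshold:

    def time_to_collision_s(distance_m, closing_speed_mps):
        """TTC; infinite when the object is not getting closer."""
        if closing_speed_mps <= 0.0:
            return float("inf")
        return distance_m / closing_speed_mps

    def should_warn(distance_m, closing_speed_mps, ttc_threshold_s=2.5):
        return time_to_collision_s(distance_m, closing_speed_mps) < ttc_threshold_s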
The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or to the outside of the vehicle. In the example of FIG. 27, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may be a device other than these, such as headphones, a wearable device such as a glasses-type display worn by an occupant, a projector, or a lamp. When the output device is a display device, the display device visually displays the results obtained by the various kinds of processing performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
In the example shown in FIG. 27, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, an individual control unit may be composed of a plurality of control units. Furthermore, the vehicle control system 7000 may include another control unit, not shown. In the above description, part or all of the functions of any control unit may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units. Similarly, a sensor or device connected to one control unit may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
A computer program for realizing the functions of the image processing apparatus 100 according to the present embodiment described with reference to FIG. 4 can be implemented in any of the control units or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example via a network, without using a recording medium.
In the vehicle control system 7000 described above, the image processing apparatus 100 according to the present embodiment described with reference to FIG. 4 can be applied to the integrated control unit 7600 of the application example shown in FIG. 27.
At least some of the components of the image processing apparatus 100 described with reference to FIG. 4 may be realized in a module for the integrated control unit 7600 shown in FIG. 27 (for example, an integrated circuit module configured on one die). Alternatively, the image processing apparatus 100 described with reference to FIG. 4 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 27.
 <4. Summary>
 As described above, the image processing apparatus 100 according to the present disclosure controls the combining of a color image and a black and white image not only by the analysis results of the captured images but also by processing that uses various kinds of sensor information, such as a distance sensor, a focus sensor, and an image-plane phase-difference sensor. More specifically, the image processing apparatus 100 determines, by processing using the various kinds of sensor information, whether the image quality of the combined image will be degraded, and when it determines that the image quality will be degraded, it sets the combining ratio of the black and white image to substantially zero (or zero). In this way, the image processing apparatus 100 can improve the image quality of the combined image.
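The control described in this summary can be sketched as follows: the black and white image is blended into the luminance channel of the color image at a ratio that is forced to (substantially) zero whenever the sensor-based judgment predicts degradation. This is a minimal illustrative sketch of luminance-domain fusion under that assumption, not the exact combining process of the embodiment.

    import cv2
    import numpy as np

    def combine(color_bgr, mono, bw_ratio):
        """Blend the black and white image into the luminance channel of
        the color image; chrominance always comes from the color image."""
        ycrcb = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
        y = ycrcb[..., 0]
        ycrcb[..., 0] = (1.0 - bw_ratio) * y + bw_ratio * mono.astype(np.float32)
        out = np.clip(ycrcb, 0, 255).astype(np.uint8)
        return cv2.cvtColor(out, cv2.COLOR_YCrCb2BGR)

    def select_bw_ratio(quality_will_degrade, nominal_ratio=0.5):
        """Force the black and white combining ratio to ~0 on predicted degradation."""
        return 0.0 if quality_will_degrade else nominal_ratio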
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
For example, the technology of the present disclosure can also be used when switching cameras. More specifically, when a user switches between two cameras with different angles of view in order to change the angle of view of the captured image, the viewpoint shift at the moment of switching may feel unnatural to the user. To avoid this, according to the present disclosure, a smooth switch may be realized by combining the captured images of the two cameras at the time of switching. Then, when it is determined that combining the captured images of the two cameras would cause image quality degradation, the combining need not be performed (in other words, only the captured image of one angle of view, wide or narrow, may be output). In this way, switching without image quality degradation can be realized.
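This switching idea can be sketched as a short cross-fade between the two cameras that is skipped when degradation is predicted. The alignment of the narrow-angle image to the wide-angle framing is assumed to have been done beforehand; the fade logic itself is illustrative.

    import cv2

    def switch_frame(wide, tele_aligned, t, fusion_would_degrade):
        """t runs 0 -> 1 over the switch; cross-fade only when safe."""
        if fusion_would_degrade:
            return wide if t < 0.5 else tele_aligned   # hard cut, no blending
        a = min(max(t, 0.0), 1.0)
        return cv2.addWeighted(wide, 1.0 - a, tele_aligned, a, 0.0)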
The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, together with or instead of the above effects.
The following configurations are also within the technical scope of the present disclosure.
(1)
A first imaging unit that acquires a color image by imaging a subject;
A second imaging unit that acquires a black and white image by imaging the subject from a viewpoint position different from that of the first imaging unit;
and a combining control unit configured to make, in combining the color image and the black and white image, the combining ratio of the color image higher than the combining ratio of the black and white image.
Image processing device.
(2)
The combination control unit makes the combining ratio of the color image higher than the combining ratio of the black and white image when it determines, based on processing using predetermined sensor information, that the image quality of the combined image generated by the combining will be degraded.
The image processing apparatus according to (1).
(3)
The combination control unit makes the combination ratio of the black and white image substantially zero when it determines that the image quality will be degraded.
The image processing apparatus according to (2).
(4)
The combination control unit determines whether the image quality is degraded based on the distance to the subject calculated by processing using the sensor information.
The image processing apparatus according to (2) or (3).
(5)
The distance to the subject is calculated by processing using distance sensor information.
The image processing apparatus according to (4).
(6)
The distance to the subject is calculated based on focus position information when focusing is performed using the sensor information.
The image processing apparatus according to (4).
(7)
The distance to the subject is calculated by processing using image plane phase difference sensor information.
The image processing apparatus according to (4).
(8)
The image plane phase difference sensor information includes information on distance and information on reliability,
The distance to the subject is calculated by weighted averaging based on the reliability.
The image processing apparatus according to (7).
(9)
The combination control unit determines whether the image quality will be degraded based on a feature amount calculated by processing using the black and white image or the color image generated by an image sensor.
The image processing apparatus according to (2) or (3).
(10)
The feature amount is calculated based on the parallax between the color image and the black and white image.
The image processing apparatus according to (9).
(11)
The feature amount is at least one of: a statistic indicating the variation of the per-pixel parallax; the proportion of pixels whose parallax exceeds a predetermined range of parallax amounts; or, when a parallax difference absolute value is calculated for each pixel between the pixel a predetermined distance away from that pixel in the parallax direction and the pixel the predetermined distance away in the opposite direction, the proportion of pixels for which the parallax difference absolute value exceeds a predetermined amount.
The image processing apparatus according to (10).
(12)
The feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.
The image processing apparatus according to (9).
(13)
The feature amount is at least one of: the ratio of the variation of the color difference signal to the variation of the luminance signal; or the ratio of the edge component of the color difference signal to the edge component of the luminance signal.
The image processing apparatus according to (12).
(14)
Acquiring a color image by imaging a subject;
Acquiring a black and white image by imaging the subject from different viewpoint positions;
Combining the color image and the black and white image with the combining ratio of the color image set higher than the combining ratio of the black and white image.
An image processing method implemented by a computer.
(15)
A first imaging unit that acquires a color image by imaging a subject;
A second imaging unit that acquires a black and white image by imaging the subject from a viewpoint position different from that of the first imaging unit;
and a combining control unit configured to control the combining of the color image and the black and white image by processing using predetermined sensor information.
Image processing device.
(16)
The combination control unit changes a combination ratio of the color image and the black and white image by processing using the sensor information.
The image processing apparatus according to (15).
(17)
The combination control unit changes the combination ratio when it determines, based on processing using the sensor information, that the image quality of the combined image generated by the combining will be degraded.
The image processing apparatus according to (16).
(18)
When it determines that the image quality will be degraded, the combining control unit makes the combining ratio of the color image higher than the combining ratio of the black and white image.
The image processing apparatus according to (17).
(19)
The combination control unit makes the combination ratio of the black and white image substantially zero when it determines that the image quality will be degraded.
The image processing apparatus according to (18).
(20)
The combination control unit controls the combination based on the distance to the subject calculated by processing using the sensor information.
The image processing apparatus according to any one of (15) to (19).
(21)
The distance to the subject is calculated by processing using distance sensor information.
The image processing apparatus according to (20).
(22)
The distance to the subject is calculated based on focus position information when focusing is performed using the sensor information.
The image processing apparatus according to (20).
(23)
The distance to the subject is calculated by processing using image plane phase difference sensor information.
The image processing apparatus according to (20).
(24)
The image plane phase difference sensor information includes information on distance and information on reliability,
The distance to the subject is calculated by weighted averaging based on the reliability.
The image processing apparatus according to (23).
(25)
The combination control unit controls the combining based on a feature amount calculated by processing using the black and white image or the color image generated by an image sensor.
The image processing apparatus according to any one of (15) to (19).
(26)
The feature amount is calculated based on the parallax between the color image and the black and white image.
The image processing apparatus according to (25).
(27)
The feature amount is at least one of: a statistic indicating the variation of the per-pixel parallax; the proportion of pixels whose parallax exceeds a predetermined range of parallax amounts; or, when a parallax difference absolute value is calculated for each pixel between the pixel a predetermined distance away from that pixel in the parallax direction and the pixel the predetermined distance away in the opposite direction, the proportion of pixels for which the parallax difference absolute value exceeds a predetermined amount.
The image processing apparatus according to (26).
(28)
The feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.
The image processing apparatus according to (25).
(29)
The feature amount is at least one of: the ratio of the variation of the color difference signal to the variation of the luminance signal; or the ratio of the edge component of the color difference signal to the edge component of the luminance signal.
The image processing apparatus according to (28).
(30)
Acquiring a color image by imaging a subject;
Acquiring a black and white image by imaging the subject from different viewpoint positions;
Controlling the combining of the color image and the black and white image by processing using predetermined sensor information.
An image processing method implemented by a computer.
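Configurations (5) through (8) above describe three routes to the subject distance. The sketch below illustrates two of them: recovering the subject distance from the focus (image) distance via the thin-lens equation, and reliability-weighted averaging of per-point image-plane phase-difference distances. The thin-lens relation is standard optics; treating the reported reliabilities directly as weights is an assumption made for illustration.

    import numpy as np

    def distance_from_focus_m(focal_length_m, image_distance_m):
        """Thin lens: 1/f = 1/do + 1/di  ->  do = f*di / (di - f)."""
        if image_distance_m <= focal_length_m:
            return float("inf")                 # focused at or beyond infinity
        return focal_length_m * image_distance_m / (image_distance_m - focal_length_m)

    def weighted_subject_distance_m(distances_m, reliabilities):
        """Reliability-weighted average of phase-difference distance samples."""
        d = np.asarray(distances_m, dtype=float)
        w = np.asarray(reliabilities, dtype=float)
        if w.sum() <= 0.0:
            return float("nan")                 # no trustworthy samples
        return float((d * w).sum() / w.sum())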
100  Image processing apparatus
110  First imaging unit
120  Second imaging unit
130  First preprocessing unit
140  Second preprocessing unit
150  Combining processing unit
151  Parallax histogram processing unit
152  Parallax distribution feature amount calculation unit
153  Search-range-exceeded feature amount calculation unit
154  Parallax gap feature amount calculation unit
155  Signal extraction unit
156  Y/C dispersion ratio processing unit
156a  Y dispersion value calculation unit
156b  Cb dispersion value calculation unit
156c  Cr dispersion value calculation unit
156d  Comparison unit
156e  Y/C dispersion ratio calculation unit
157  Y/C edge component ratio processing unit
157a  Y edge component detection unit
157b  Cb edge component detection unit
157c  Cr edge component detection unit
157d  Comparison unit
157e  Y/C edge component ratio calculation unit
160  Focus control unit
170  Distance sensor
180  Image quality degradation determination unit
181  Short-distance feature amount determination unit
182  Small-luminance-difference/large-color-difference feature amount determination unit
190  3D depth sensor
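As a rough illustration of the feature amounts computed by the parallax-side units (151 to 154) and the Y/C-side units (155 to 157) listed above, the following sketch derives the three parallax statistics of configuration (11) and the two Y/C ratios of configuration (13) from a disparity map and a color image. The search range, gap distance, and thresholds are illustrative parameters, not values from this disclosure.

    import cv2
    import numpy as np

    def parallax_features(disparity, search_max=64.0, gap_px=4, diff_thresh=8.0):
        """Variation statistic, over-search-range ratio, and parallax-gap ratio."""
        std = float(disparity.std())                           # variation statistic
        over = float((np.abs(disparity) > search_max).mean())  # beyond search range
        left = np.roll(disparity, gap_px, axis=1)    # pixel gap_px away, one side
        right = np.roll(disparity, -gap_px, axis=1)  # pixel gap_px away, other side
        gap = float((np.abs(left - right) > diff_thresh).mean())
        return std, over, gap

    def yc_features(color_bgr):
        """Chroma-to-luma dispersion ratio and edge-component ratio."""
        ycrcb = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
        y, cr, cb = ycrcb[..., 0], ycrcb[..., 1], ycrcb[..., 2]
        disp_ratio = max(cr.var(), cb.var()) / max(y.var(), 1e-6)
        edge = lambda ch: (np.abs(cv2.Sobel(ch, cv2.CV_32F, 1, 0)).mean()
                           + np.abs(cv2.Sobel(ch, cv2.CV_32F, 0, 1)).mean())
        edge_ratio = max(edge(cr), edge(cb)) / max(edge(y), 1e-6)
        return disp_ratio, edge_ratio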

Claims (30)

  1.  An image processing apparatus comprising:
      a first imaging unit that acquires a color image by imaging a subject;
      a second imaging unit that acquires a black and white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and
      a combining control unit configured to make, in combining the color image and the black and white image, a combining ratio of the color image higher than a combining ratio of the black and white image.

  2.  The image processing apparatus according to claim 1, wherein the combining control unit makes the combining ratio of the color image higher than the combining ratio of the black and white image when it determines, based on processing using predetermined sensor information, that the image quality of a combined image generated by the combining will be degraded.

  3.  The image processing apparatus according to claim 2, wherein the combining control unit makes the combining ratio of the black and white image substantially zero when it determines that the image quality will be degraded.

  4.  The image processing apparatus according to claim 2, wherein the combining control unit determines whether the image quality will be degraded based on a distance to the subject calculated by processing using the sensor information.

  5.  The image processing apparatus according to claim 4, wherein the distance to the subject is calculated by processing using distance sensor information.

  6.  The image processing apparatus according to claim 4, wherein the distance to the subject is calculated based on focus position information obtained when focusing is performed using the sensor information.

  7.  The image processing apparatus according to claim 4, wherein the distance to the subject is calculated by processing using image-plane phase-difference sensor information.

  8.  The image processing apparatus according to claim 7, wherein the image-plane phase-difference sensor information includes information on distance and information on reliability, and the distance to the subject is calculated by weighted averaging based on the reliability.

  9.  The image processing apparatus according to claim 2, wherein the combining control unit determines whether the image quality will be degraded based on a feature amount calculated by processing using the black and white image or the color image generated by an image sensor.

  10.  The image processing apparatus according to claim 9, wherein the feature amount is calculated based on the parallax between the color image and the black and white image.

  11.  The image processing apparatus according to claim 10, wherein the feature amount is at least one of: a statistic indicating the variation of the per-pixel parallax; the proportion of pixels whose parallax exceeds a predetermined range of parallax amounts; or, when a parallax difference absolute value is calculated for each pixel between the pixel a predetermined distance away from that pixel in the parallax direction and the pixel the predetermined distance away in the opposite direction, the proportion of pixels for which the parallax difference absolute value exceeds a predetermined amount.

  12.  The image processing apparatus according to claim 9, wherein the feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.

  13.  The image processing apparatus according to claim 12, wherein the feature amount is at least one of: the ratio of the variation of the color difference signal to the variation of the luminance signal; or the ratio of the edge component of the color difference signal to the edge component of the luminance signal.

  14.  An image processing method executed by a computer, the method comprising:
      acquiring a color image by imaging a subject;
      acquiring a black and white image by imaging the subject from a different viewpoint position; and
      combining the color image and the black and white image with a combining ratio of the color image set higher than a combining ratio of the black and white image.

  15.  An image processing apparatus comprising:
      a first imaging unit that acquires a color image by imaging a subject;
      a second imaging unit that acquires a black and white image by imaging the subject from a viewpoint position different from that of the first imaging unit; and
      a combining control unit configured to control the combining of the color image and the black and white image by processing using predetermined sensor information.

  16.  The image processing apparatus according to claim 15, wherein the combining control unit changes the combining ratio of the color image and the black and white image by processing using the sensor information.

  17.  The image processing apparatus according to claim 16, wherein the combining control unit changes the combining ratio when it determines, based on processing using the sensor information, that the image quality of a combined image generated by the combining will be degraded.

  18.  The image processing apparatus according to claim 17, wherein the combining control unit makes the combining ratio of the color image higher than the combining ratio of the black and white image when it determines that the image quality will be degraded.

  19.  The image processing apparatus according to claim 18, wherein the combining control unit makes the combining ratio of the black and white image substantially zero when it determines that the image quality will be degraded.

  20.  The image processing apparatus according to claim 15, wherein the combining control unit controls the combining based on a distance to the subject calculated by processing using the sensor information.

  21.  The image processing apparatus according to claim 20, wherein the distance to the subject is calculated by processing using distance sensor information.

  22.  The image processing apparatus according to claim 20, wherein the distance to the subject is calculated based on focus position information obtained when focusing is performed using the sensor information.

  23.  The image processing apparatus according to claim 20, wherein the distance to the subject is calculated by processing using image-plane phase-difference sensor information.

  24.  The image processing apparatus according to claim 23, wherein the image-plane phase-difference sensor information includes information on distance and information on reliability, and the distance to the subject is calculated by weighted averaging based on the reliability.

  25.  The image processing apparatus according to claim 15, wherein the combining control unit controls the combining based on a feature amount calculated by processing using the black and white image or the color image generated by an image sensor.

  26.  The image processing apparatus according to claim 25, wherein the feature amount is calculated based on the parallax between the color image and the black and white image.

  27.  The image processing apparatus according to claim 26, wherein the feature amount is at least one of: a statistic indicating the variation of the per-pixel parallax; the proportion of pixels whose parallax exceeds a predetermined range of parallax amounts; or, when a parallax difference absolute value is calculated for each pixel between the pixel a predetermined distance away from that pixel in the parallax direction and the pixel the predetermined distance away in the opposite direction, the proportion of pixels for which the parallax difference absolute value exceeds a predetermined amount.

  28.  The image processing apparatus according to claim 25, wherein the feature amount is calculated based on a luminance signal and a color difference signal extracted from the color image.

  29.  The image processing apparatus according to claim 28, wherein the feature amount is at least one of: the ratio of the variation of the color difference signal to the variation of the luminance signal; or the ratio of the edge component of the color difference signal to the edge component of the luminance signal.

  30.  An image processing method executed by a computer, the method comprising:
      acquiring a color image by imaging a subject;
      acquiring a black and white image by imaging the subject from a different viewpoint position; and
      controlling the combining of the color image and the black and white image by processing using predetermined sensor information.
PCT/JP2018/037948 2017-12-08 2018-10-11 Image processing device and image processing method WO2019111529A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017236175 2017-12-08
JP2017-236175 2017-12-08

Publications (1)

Publication Number Publication Date
WO2019111529A1 true WO2019111529A1 (en) 2019-06-13

Family

ID=66750149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/037948 WO2019111529A1 (en) 2017-12-08 2018-10-11 Image processing device and image processing method

Country Status (2)

Country Link
CN (1) CN110012215B (en)
WO (1) WO2019111529A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11301974B2 (en) * 2019-05-27 2022-04-12 Canon Kabushiki Kaisha Image processing apparatus, image processing method, image capturing apparatus, and storage medium
US11765309B2 (en) * 2019-12-13 2023-09-19 Sony Group Corporation Video capturing subject using IR light
CN113992868A (en) * 2021-11-30 2022-01-28 维沃移动通信有限公司 Image sensor, camera module and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013026672A (en) * 2011-07-15 2013-02-04 Toshiba Corp Solid-state imaging device and camera module
JP2013183353A (en) * 2012-03-02 2013-09-12 Toshiba Corp Image processor
JP2016156934A (en) * 2015-02-24 2016-09-01 キヤノン株式会社 Distance information generation device, imaging device, distance information generation method, and distance information generation program
WO2017154293A1 (en) * 2016-03-09 2017-09-14 ソニー株式会社 Image processing apparatus, imaging apparatus, image processing method, and program

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5157499A (en) * 1990-06-29 1992-10-20 Kabushiki Kaisha N A C High-speed video camera using solid-state image sensor
WO2001091098A1 (en) * 2000-05-24 2001-11-29 Hitachi, Ltd. Color/black-and-white switchable portable terminal and display device
CN101662694B (en) * 2008-08-29 2013-01-30 华为终端有限公司 Method and device for presenting, sending and receiving video and communication system
JP2013026844A (en) * 2011-07-21 2013-02-04 Nikon Corp Image generation method and device, program, recording medium, and electronic camera
JP5978737B2 (en) * 2012-04-25 2016-08-24 株式会社ニコン Image processing apparatus, imaging apparatus, and image processing program
TWI568259B (en) * 2015-05-14 2017-01-21 聚晶半導體股份有限公司 Image capturing device and hybrid image processing method thereof
CN106447641A (en) * 2016-08-29 2017-02-22 努比亚技术有限公司 Image generation device and method
CN106506950A (en) * 2016-10-27 2017-03-15 成都西纬科技有限公司 A kind of image processing method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013026672A (en) * 2011-07-15 2013-02-04 Toshiba Corp Solid-state imaging device and camera module
JP2013183353A (en) * 2012-03-02 2013-09-12 Toshiba Corp Image processor
JP2016156934A (en) * 2015-02-24 2016-09-01 キヤノン株式会社 Distance information generation device, imaging device, distance information generation method, and distance information generation program
WO2017154293A1 (en) * 2016-03-09 2017-09-14 ソニー株式会社 Image processing apparatus, imaging apparatus, image processing method, and program

Also Published As

Publication number Publication date
CN110012215B (en) 2022-08-16
CN110012215A (en) 2019-07-12

Similar Documents

Publication Publication Date Title
US10880498B2 (en) Image processing apparatus and image processing method to improve quality of a low-quality image
US10957029B2 (en) Image processing device and image processing method
JP7024782B2 (en) Image processing device and image processing method and image pickup device
US11815799B2 (en) Information processing apparatus and information processing method, imaging apparatus, mobile device, and computer program
US10704957B2 (en) Imaging device and imaging method
JP6977722B2 (en) Imaging equipment and image processing system
US11325520B2 (en) Information processing apparatus and information processing method, and control apparatus and image processing apparatus
US11030723B2 (en) Image processing apparatus, image processing method, and program
JP6816769B2 (en) Image processing equipment and image processing method
JP6816768B2 (en) Image processing equipment and image processing method
WO2019111529A1 (en) Image processing device and image processing method
WO2019142660A1 (en) Picture processing device, picture processing method, and program
US11375137B2 (en) Image processor, image processing method, and imaging device
JP6981416B2 (en) Image processing device and image processing method
JPWO2018008408A1 (en) Solid-state imaging device, correction method, and electronic device
US20230013424A1 (en) Information processing apparatus, information processing method, program, imaging apparatus, and imaging system
WO2019111651A1 (en) Imaging system, image processing device, and image processing method
WO2019155718A1 (en) Recognition device, recognition method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18886018

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18886018

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP