WO2020195515A1 - Imaging device and image processing method - Google Patents

Imaging device and image processing method

Info

Publication number
WO2020195515A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
wavelength range
light
processing unit
Prior art date
Application number
PCT/JP2020/008036
Other languages
French (fr)
Japanese (ja)
Inventor
純也 飯塚 (Junya Iizuka)
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to CN202080017627.1A (CN113574851B)
Publication of WO2020195515A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only

Definitions

  • The present invention relates to an image pickup apparatus and an image processing method, and more particularly to an image pickup apparatus and an image processing method for discriminating the hue of a captured image.
  • An in-vehicle camera mounted on a vehicle for driving support and automated driving is required to have a wide angle of view in order to improve the accuracy of detecting pedestrians and bicycles entering from the side of the vehicle.
  • In-vehicle cameras also need to accurately detect the distance to distant objects for functions such as ACC (Adaptive Cruise Control: constant-speed driving and inter-vehicle distance control), so high spatial resolution is required.
  • Widening the angle of view is generally realized by reducing the focal length of the camera's objective lens, which lowers the spatial resolution accordingly. It is therefore difficult to achieve both a wide angle of view and improved spatial resolution by improving the objective lens alone.
  • A technique for improving the spatial resolution by adopting a different type of image sensor is also known.
  • For driving assistance, a color image sensor with an RGGB (Red-Green-Green-Blue, or Bayer) pixel array is often adopted in order to recognize traffic lights, signs, and white and orange lines on the road.
  • A technique has therefore been proposed in which the RGGB image sensor is replaced with a color image sensor having an RCCC (Red-Clear-Clear-Clear) pixel arrangement. This improves the spatial resolution of the image sensor, making it possible to achieve both a wide angle of view and improved spatial resolution.
  • The RCCC pixel array is composed of unit cells of four pixels in two rows and two columns (2 × 2): one pixel that detects red light (R pixel) and three clear pixels (C pixels) that detect blue, green, and red light. The unit cells are arranged repeatedly.
  • The interval between C pixels is one pixel, unlike the RGGB arrangement, in which the interval between pixels of the same color is two pixels.
  • The detected values of the C pixels, which make up 3/4 of all pixels, can be used as-is for the grayscale image referred to in distance measurement, and the interpolation from surrounding pixels that lowers spatial resolution needs to be applied only at the positions of the R pixels, which make up 1/4 of all pixels. A higher spatial resolution is therefore expected compared with the RGGB pixel array.
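The tiling and pixel-count ratio described above can be sketched as follows. This is a minimal illustration, not part of the patent; the function name is an assumption.

```python
import numpy as np

def rccc_mask(rows, cols):
    # One 2x2 unit cell: a single R pixel plus three clear (C) pixels
    cell = np.array([["R", "C"],
                     ["C", "C"]])
    # The unit cells are arranged repeatedly over the sensor plane
    return np.tile(cell, (rows // 2, cols // 2))

mask = rccc_mask(4, 4)
c_fraction = float(np.mean(mask == "C"))  # fraction of clear pixels: 3/4
```

Note how C pixels of the mask are one pixel apart in every row and column, whereas same-color pixels in an RGGB mosaic would be two apart.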
  • Patent Document 1 describes applying an RCCC pixel array image sensor to an in-vehicle camera, but gives no specific description of the processing applied to the captured image.
  • Patent Document 2 also discloses the application of an image sensor having a pixel array consisting of two types, R and C, to an in-vehicle camera.
  • In Patent Document 2, the magnitude of the correlation between the image from the image sensor and a checkerboard pattern corresponding to the Red/Clear pixel types is used to discriminate the red tail lamps of automobiles from lights such as headlights and street lights.
  • Since correlation is a statistic, the hue is discriminated over an image region containing a certain number of pixels. When objects with different hues coexist in that region, an accurate discrimination result therefore cannot be obtained.
  • An object of the present invention is to provide an image pickup apparatus and an image processing method that can accurately discriminate a predetermined hue when an image sensor in which two types of pixels are arranged is used, while achieving both a wide angle of view and improved spatial resolution.
  • To achieve the above object, the image pickup apparatus includes: an image sensor configured by repeatedly arranging filter units each including a first pixel that detects light in a first wavelength range in the visible range and a second pixel that detects, in addition to light in the first wavelength range, visible light of wavelengths outside the first wavelength range; an interpolation processing unit capable of generating a first interpolated image, in which the positions of the second pixels are interpolated based on the detected light amounts of the first pixels, and a second interpolated image, in which the positions of the first pixels are interpolated based on the detected light amounts of the second pixels; and a color information generation processing unit that determines the hue at a position based on the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
  • In another aspect, the image pickup apparatus includes: an image sensor configured by repeatedly arranging filter units each including a first pixel that detects light in the first wavelength range in the visible range and a second pixel that also detects visible light outside the first wavelength range; an interpolation processing unit capable of generating the first interpolated image and the second interpolated image described above; and a color image generation processing unit that generates a color image based on the first interpolated image and the second interpolated image.
  • Based on the first interpolated image, the color image generation processing unit generates a first component image having components in the first wavelength range. It also generates a difference image, which is the difference between the first interpolated image and the second interpolated image, multiplies the difference image by a first distribution ratio to generate a second component image having components in a second wavelength range different from the first wavelength range, and multiplies the difference image by a second distribution ratio to generate a third component image having components in a third wavelength range different from the first and second wavelength ranges.
  • The image processing method includes a step of acquiring an image from an image sensor configured by repeatedly arranging filter units each including the first pixel and the second pixel, a step of generating the first interpolated image and the second interpolated image based on the detected light amounts of the first and second pixels, and a step of determining the hue at a position based on the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
  • According to the present invention, it is possible to provide an image pickup apparatus and an image processing method that can accurately discriminate a predetermined hue when an image sensor in which two types of pixels are arranged is used, while achieving both a wide angle of view and improved spatial resolution.
  • For the third embodiment, a flowchart shows the procedure by which the color image generation processing unit 110 generates a colorized image from an image taken by the image sensor 101. A schematic diagram explains the pixel array (CyCCC array) of the image sensor 101 of the fourth embodiment, and a graph shows the sensitivity characteristics of the Cy and C pixels of a CyCCC-array image sensor.
  • the image pickup device 1 includes an image sensor 101, an interpolation processing unit 102, a color information generation processing unit 103, a recognition processing unit 104, a color image generation processing unit 110, and an image recording unit 112.
  • The image sensor 101 is an image sensor such as a CMOS or CCD sensor for acquiring an optical image. These image sensors have photodiodes arranged in an array on a plane and detect the light amount distribution on that plane with the plurality of pixels provided by the photodiodes. The image to be captured is obtained by detecting, with the image sensor 101, light focused by a condensing member such as a lens or mirror (not shown).
  • the image pickup apparatus 1 of the first embodiment uses the image sensor of the RCCC pixel array shown in FIG. 2 as the image sensor 101.
  • The RCCC pixel array has two types of pixels: R (Red) pixels 201, which receive light through a color filter that transmits red light in the visible range and can thus detect red light, and C (Clear) pixels 202, which receive light through a transparent color filter and can detect the total amount of visible light.
  • a 2 ⁇ 2 pixel filter unit 203 composed of one R pixel 201 and three C pixels 202 is repeatedly arranged over a plurality of rows and a plurality of columns.
  • The R pixel 201 can receive light in the wavelength range of red light, while the C pixel 202 can receive light over the entire visible wavelength range, including blue, green, and red light.
  • the image output by the image sensor 101 is a grayscale image in which a pixel value based on the amount of red light detected by the R pixel 201 and a pixel value based on the total amount of visible light detected by the C pixel 202 are mixed.
  • Based on the grayscale image output by the image sensor 101, the interpolation processing unit 102 obtains the amount of the red light component at the position of each C pixel 202 by interpolating from the detected light amounts of the surrounding R pixels 201. Likewise, it obtains the total amount of visible light at the position of each R pixel 201 by interpolating from the detected light amounts of the surrounding C pixels 202.
  • The processing of the interpolation processing unit 102 can be realized by, for example, the calculation shown in FIG. 4, in which the processing unit is the 3 × 3 pixels centered on the pixel to be interpolated.
  • the interpolation process can be executed based on the four pixel patterns of patterns 1 to 4.
  • Pattern 1 is the case where the lower-right nine pixels of FIG. 2 are used as the processing unit, pattern 2 the lower-left nine pixels, pattern 3 the upper-right nine pixels, and pattern 4 the upper-left nine pixels.
  • the pixel in the center of the 3 ⁇ 3 pixel is targeted for interpolation processing.
  • the interpolation calculation of the R component in the central C pixel 202 is performed.
  • the interpolation calculation of the C component in the central R pixel 201 is performed.
  • the interpolation calculation of the pixel to be interpolated (the center of 3 ⁇ 3) is executed based on the pixel values of the pixels around the pixel to be interpolated.
  • This calculation is suited to the pixel array shown in FIG. 2; for different pixel arrays, different operations with the same meaning can be adopted. Although the above interpolation is computed as the average of the surrounding pixel values, a different calculation method (for example, a weighted average) can of course be adopted depending on the required performance of the imaging device, the state of the environment, and so on.
  • By this process, an image showing the spatial distribution of the amount of red light (R image) and an image showing the spatial distribution of the total amount of visible light (C image), each with the same number of pixels, can be generated. Following the process shown in FIG. 4, the R image and C image may further be subjected to interpolation processing that corrects image distortion.
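The interpolation step above can be sketched as follows. This is a hedged illustration assuming the RCCC layout of FIG. 2 (R pixels at even rows and columns) and plain averaging over the 3 × 3 window, as the text describes; the function name is not from the patent.

```python
import numpy as np

def interpolate_rccc(raw):
    # Build a full-resolution C image (C interpolated at R sites from
    # surrounding C pixels) and R image (R interpolated at C sites from
    # surrounding R pixels), averaging over a 3x3 window as in FIG. 4.
    h, w = raw.shape
    r_img = np.zeros((h, w), dtype=float)
    c_img = raw.astype(float).copy()
    is_r = np.zeros((h, w), dtype=bool)
    is_r[0::2, 0::2] = True  # one R pixel per 2x2 unit cell
    for y in range(h):
        for x in range(w):
            ys = [yy for yy in (y - 1, y, y + 1) if 0 <= yy < h]
            xs = [xx for xx in (x - 1, x, x + 1) if 0 <= xx < w]
            if is_r[y, x]:
                # C at an R site: mean of C pixels in the 3x3 window
                c_img[y, x] = np.mean(
                    [raw[yy, xx] for yy in ys for xx in xs if not is_r[yy, xx]])
                r_img[y, x] = raw[y, x]
            else:
                # R at a C site: mean of R pixels in the 3x3 window
                r_img[y, x] = np.mean(
                    [raw[yy, xx] for yy in ys for xx in xs if is_r[yy, xx]])
    return r_img, c_img
```

For a uniformly lit scene the two interpolated images simply reproduce the input, since every average is over equal values.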
  • the color information generation processing unit 103 outputs the brightness information and the hue information of the captured image to the recognition processing unit 104 based on the C image and the R image generated by the interpolation processing unit 102.
  • The recognition processing unit 104 recognizes traffic signals, road signs, the lights of vehicles ahead, white lines and orange lines on the road, and the like with reference to the lightness and hue components. Hues are specified for such objects: traffic signal lights are bluish green (cyan), orange, and red, and road signs are often red, blue, white, and orange. As for vehicle lights, tail lamps and brake lamps are red and turn signals are orange. The recognition processing unit 104 is therefore configured to discriminate between these colors.
  • the color information generation processing unit 103 can calculate the brightness information of the captured image based on, for example, the C image generated by the interpolation processing unit 102.
  • the color information generation processing unit 103 has an R / C value comparison processing unit 105 inside.
  • The R/C value comparison processing unit 105 receives the C image and R image from the interpolation processing unit 102 and, for the pixels at the same position in the two images, calculates the ratio of the detected light amount of the R component to that of the C component (the R/C ratio). It then discriminates the hue by comparing the R/C ratio with reference values for the colors to be discriminated, and outputs the discrimination result as hue information.
  • For this purpose, a red determination reference value storage unit 106, an orange determination reference value storage unit 107, an achromatic color determination reference value storage unit 108, and a blue/green determination reference value storage unit 109 are provided.
  • the four reference values stored in these storage units are compared with the R / C ratio described above.
  • Although the blue-green (cyan) hue of traffic signals and the blue hue of road signs differ, they can be distinguished by their brightness and position, so these hues are treated together as blue/green.
  • The color image generation processing unit 110 colorizes the image using the C image and R image generated by the interpolation processing unit 102, generating and outputting an R (Red) component image, a G (Green) component image, and a B (Blue) component image corresponding to the three primary colors.
  • The image recording unit 112 accumulates the R (Red), G (Green), and B (Blue) component images generated by the color image generation processing unit 110 and consists of flash memory, a hard disk, DRAM, or the like.
  • The color image generation processing unit 110 of the first embodiment outputs the input R image to the image recording unit 112 as the R component image, and contains a CR image generation processing unit 111. As the G component image and B component image, it outputs images based on the CR image, which the CR image generation processing unit 111 generates as the per-pixel difference between the C image and the R image.
  • FIG. 5 shows an example of measured characteristics of the C and R pixel values of an RCCC image obtained by photographing colors used in road signs. More specifically, it shows measured characteristics obtained by photographing a color chart (X-Rite ColorChecker SG (registered trademark)) under white light of five different brightnesses with a camera equipped with an RCCC pixel array image sensor.
  • the C image and the R image are obtained as a result of the above-mentioned interpolation processing.
  • Portions close to the standard safety colors specified in JIS Z9103 and used in road signs are extracted from the color chart (for example, red [row 3, column L] and orange [row 6, column L]).
  • Reference numerals 501 and 502 represent, respectively, the plot based on the pixels of the red portion and the C vs. R characteristic for red light obtained by the least squares method. Similarly, 503 and 504 represent the plot based on the orange portion and the characteristic for orange light, 505 and 506 the plot based on the yellow portion and the characteristic for yellow light, and 507 and 508 the plot based on the white (achromatic) portion and the characteristic for achromatic light.
  • 509 and 510 represent plots based on the blue part.
  • For each color, the ratio of the detected light amount of the R pixel to that of the C pixel (R/C) is constant regardless of brightness, and the ratios differ between the red, orange, yellow, white (achromatic), and blue colors referred to in recognizing road signs and the like for driving support by an in-vehicle camera. These hues can therefore be classified based on R/C.
  • FIG. 6 is an example of actual measurement characteristics of the C pixel value and the R pixel value of the RCCC image obtained by photographing the color used in the traffic signal.
  • FIG. 6 is also an example of actual measurement characteristics as a result of photographing the above-mentioned color chart with a camera equipped with an image sensor having an RCCC pixel arrangement under white light having five different brightnesses.
  • FIG. 6 plots the relationship between the detected light amounts of the C and R pixels for portions of the chart close to the colors used in traffic signals (red [row 3, column L], orange [row 6, column L], cyan [row 8, column B]).
  • Reference numerals 501 and 502 and reference numerals 503 and 504 are the same as those in FIG. 5, and thus redundant description will be omitted.
  • Reference numerals 601 and 602 represent, respectively, the plot based on the cyan portion and the C vs. R characteristic for cyan light obtained by the least squares method. From these data, the hues of the signal colors can be discriminated as in FIG. 5.
  • FIG. 7 is a flowchart illustrating the procedure for obtaining hue information from an image captured by the image sensor 101; the processing is executed by the image sensor 101, the interpolation processing unit 102, and the color information generation processing unit 103 of FIG. 1.
  • First, in step 701, an image is captured and acquired by the image sensor 101.
  • The acquired image is a grayscale image mixing pixel values based on the amount of red light detected by the R pixels 201 of the RCCC image sensor 101 shown in FIG. 2 and pixel values based on the total amount of visible light detected by the C pixels 202.
  • In step 702, the above interpolation processing is executed for each pixel position of the grayscale image obtained in step 701, generating an R image and a C image. In step 703, the ratio of the pixel value of the R image to the pixel value of the C image at the same pixel position (the R/C ratio) is obtained, and in step 704, the hue is determined based on the value of the R/C ratio.
  • FIG. 8 is a flowchart showing the details of the procedure for determining the hue based on the pixel value ratio (R / C ratio) in step 704.
  • In FIG. 8, Tr is the lower limit of the R/C ratio for discriminating red, To the lower limit for discriminating orange, Ty the lower limit for discriminating yellow, and Tg the lower limit for discriminating achromatic colors (white, gray, black).
  • In step 801, the R/C ratio is compared with the lower limit Tr. If R/C > Tr (Yes), the hue of the pixel is determined to be red; if R/C ≤ Tr (No), the process proceeds to step 802. In step 802, the R/C ratio is compared with To: if R/C > To (Yes), the hue is determined to be orange; otherwise the process proceeds to step 803. In step 803, the R/C ratio is compared with Ty: if R/C > Ty (Yes), the hue is determined to be yellow; otherwise the process proceeds to step 804. In step 804, the R/C ratio is compared with Tg: if R/C > Tg (Yes), the hue is determined to be achromatic (white, gray, black).
  • Otherwise (No in step 804), the hue of the pixel is determined to be blue or green.
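The threshold cascade of FIG. 8 can be written out as follows. The specific threshold values here are illustrative assumptions (in practice they would be calibrated from characteristics like those of FIG. 5); only the ordering Tr > To > Ty > Tg and the comparison order follow the patent.

```python
def classify_hue(rc_ratio, tr=0.8, to=0.55, ty=0.4, tg=0.25):
    # Compare the R/C ratio against descending lower limits, as in
    # steps 801-804 of FIG. 8. Threshold values are placeholders.
    if rc_ratio > tr:
        return "red"
    if rc_ratio > to:
        return "orange"
    if rc_ratio > ty:
        return "yellow"
    if rc_ratio > tg:
        return "achromatic"  # white, gray, black
    return "blue/green"
```

Because the comparisons run from the largest threshold down, each pixel falls into exactly one hue class.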
  • FIG. 9 is a flowchart showing a procedure for generating a colorized image by the color image generation processing unit 110 from a grayscale image taken by the image sensor 101.
  • In steps 701 and 702, after the grayscale image is captured and acquired by the image sensor 101, the above interpolation processing is executed for each pixel position of the grayscale image, generating an R image and a C image.
  • In step 901, the difference image (CR image) between the pixel values of the generated C image and R image is generated by the CR image generation processing unit 111. In step 902, the CR image generation processing unit 111 generates an image (α(CR)) by multiplying the pixel values of the CR image by the distribution ratio α (0 ≤ α ≤ 1); this becomes the G component image. In step 903, the CR image generation processing unit 111 generates an image ((1-α)(CR)) by multiplying the pixel values of the CR image by (1-α); this becomes the B component image.
  • the distribution ratio ⁇ is a value indicating the ratio of the G component images included in the difference image (CR).
  • Hereinafter, α may be referred to as the "first distribution ratio" and (1-α) as the "second distribution ratio".
  • Consequently, the sum of the pixel values α(CR) given to the G component image in step 902 and the pixel values (1-α)(CR) given to the B component image in step 903 equals CR, since the sum of the first distribution ratio α and the second distribution ratio (1-α) is 1.
  • The wavelength range in which the C pixel 202 is sensitive is wider than that of the R pixel 201 by the G and B components, so the detected light amount of the C pixel 202 is estimated to exceed that of the R pixel 201 by the sum of the G and B components; hence the above relationship holds. Processing that increases or decreases the brightness of each component (not shown) may also be applied to the resulting R, G, and B component images.
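Steps 901 to 903 above can be sketched as follows; the function name is illustrative, and the inputs are the full-resolution R and C images produced by the interpolation step.

```python
import numpy as np

def colorize(c_img, r_img, alpha=0.5):
    # Step 901: difference image C - R
    diff = c_img - r_img
    # Step 902: G component = alpha * (C - R)
    g = alpha * diff
    # Step 903: B component = (1 - alpha) * (C - R)
    b = (1.0 - alpha) * diff
    # The R component image is the R image itself
    return r_img, g, b
```

By construction G + B = C − R for every pixel, whatever the value of α, which is the relationship the text derives from α + (1 − α) = 1.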
  • FIG. 10 is a diagram (graph) for explaining the first example of the distribution ratio ⁇ between the G component image and the B component image.
  • In FIG. 10, the vertical axis represents the R/C ratio (from 0 to 1), and the horizontal axis represents the proportion of the G component image in the sum of the G and B component images (G + B), that is, the first distribution ratio α (from 0 to 1).
  • the approximate distribution of each hue in the space indicated by the vertical axis and the horizontal axis is shown by a dotted line.
  • When the imaging device of the first embodiment is applied to an in-vehicle camera, the colors of the clear sky and green signals (cyan), the road surface and white lines (achromatic), and red signals and vehicle tail lamps (red) are reproduced particularly well. Based on the image sensor 101 with the RCCC pixel array, a colorized image with little visual discomfort can thus be obtained.
  • FIG. 11 is a diagram (graph) for explaining a second example of the distribution ratio ⁇ between the G component image and the B component image.
  • The difference from FIG. 10 is that the distribution ratio α changes according to the R/C ratio, at least while the R/C ratio is within a predetermined range. This is because, when the image pickup device 1 is used as an in-vehicle camera, orange appears far more frequently than pink in the captured images.
  • The locus 1101 in FIG. 11 shows how the distribution ratio α changes with the R/C ratio.
  • While the R/C ratio is equal to or greater than a first value and equal to or less than a second value, the distribution ratio α increases as the R/C ratio increases.
  • The function describing the locus 1101 can be stored in a storage unit (not shown) in the color image generation processing unit 110. The locus 1101 in FIG. 11 is of course only an example, and its shape can be changed as appropriate.
  • the distribution ratio ⁇ increases as the R / C ratio increases.
  • the hue assigned to the pixel having the R / C ratio changes from cyan to achromatic to orange to red.
  • When the image pickup device 1 is used as an in-vehicle camera, the colors of the clear sky and green signals (cyan), the road surface and white lines (achromatic), and red signals and tail lamps (red) are reproduced particularly well. In addition, the reproducibility of colors such as orange lines on the road surface, yellow signals, and turn signals is improved compared with the example of FIG. 10.
  • The locus 1101 in FIG. 11 is set to be a continuous curve over the entire range in which the R/C ratio changes from 0 to 1. If the locus 1101 were discontinuous, the hue would change abruptly near the discontinuity, causing increased noise.
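A locus like 1101 can be sketched as a continuous piecewise-linear function of the R/C ratio: α is held at a low level for small R/C (cyan), ramped between the first and second values, and held at a high level above them. The breakpoints and levels here are illustrative assumptions, not values from the patent.

```python
def alpha_from_rc(rc, rc_lo=0.3, rc_hi=0.7, a_lo=0.2, a_hi=0.9):
    # Below the first value: constant low alpha (bluish rendering)
    if rc <= rc_lo:
        return a_lo
    # Above the second value: constant high alpha (reddish rendering)
    if rc >= rc_hi:
        return a_hi
    # Between the two values: continuous linear ramp, so alpha
    # increases monotonically with the R/C ratio, without jumps
    t = (rc - rc_lo) / (rc_hi - rc_lo)
    return a_lo + t * (a_hi - a_lo)
```

The function is continuous at both breakpoints, matching the requirement that the locus avoid discontinuities that would amplify hue noise.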
  • According to the image pickup apparatus 1 and the image processing method of the first embodiment, a predetermined hue can be accurately discriminated even when an image sensor with two types of pixels, such as an RCCC pixel array, is applied.
  • With the RCCC pixel array, both a wide angle of view and improved spatial resolution can be achieved compared with the RGGB pixel array, although the color information obtained is normally limited. Nevertheless, an image reflecting the colors of objects can be generated, similar to an image pickup device using an RGGB-array image sensor.
  • Although the first embodiment shows an example of the RCCC pixel array, as long as the pixel array consists of two types of pixels, one with a red color filter and one with a transparent color filter, the pixel-count ratio need not be 1:3. The embodiment pairs a red color filter with a transparent color filter in order to emphasize recognition accuracy for red objects (traffic signals, tail lamps, signs) in in-vehicle camera applications, but a pair of a pixel with a non-red color filter and a pixel with a transparent color filter may be used depending on the application.
  • FIG. 12 is a block diagram showing the overall configuration of the image pickup apparatus of the second embodiment.
  • The components common to the apparatus of the first embodiment are designated by the same reference numerals as in FIG. 1, so duplicate description is omitted below.
  • the second embodiment is different from the first embodiment in the configuration of the color information generation processing unit 103.
  • the color information generation processing unit 103 of the second embodiment includes an address generation unit 1201 and a color information generation table 1202.
  • the address generation unit 1201 is configured to input a C image and an R image and output the corresponding address signal. Specifically, the address generation unit 1201 generates an address signal corresponding to a set of pixel values at the same pixel position of the input C image and R image, and outputs the address signal to the color information generation table 1202.
  • the color information generation table 1202 stores an address signal and brightness information and hue information corresponding to the address signal as a table. Then, the color information generation table 1202 identifies and outputs the corresponding brightness information and hue information based on the address information input from the address generation unit 1201.
  • the data structure shown in FIG. 13 is applied to the color information generation table 1202.
  • The address signal supplied from the address generation unit 1201 is formed by concatenating the R pixel value and the C pixel value as upper and lower bits, like {R, C}, and as the corresponding data, hue information corresponding to the R/C ratio given by the R and C pixel values of that address is stored. With this configuration, hue information can be generated easily without requiring complicated calculations.
  • the second embodiment it is possible to easily generate color information including not only the hue classification based on the R / C ratio but also the brightness classification based on the level of the C pixel value. Further, it is possible to obtain an effect that it becomes possible to determine complicated color information, such as changing the threshold value of the division of the R / C ratio value and the number of divisions according to the level of the C pixel value.
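As a rough illustration of the lookup described above, the following sketch builds a color information table addressed by the concatenation {R, C} and indexes it per pixel. The 8-bit width, the thresholds, and the class labels are assumptions for illustration, not values from the patent; note how the R/C threshold can be varied with the C level, as the text mentions.

```python
# Hypothetical sketch of the second embodiment's address-based lookup.
# BITS, thresholds, and class labels are illustrative assumptions.

BITS = 8                      # assume 8-bit R and C pixel values
MASK = (1 << BITS) - 1

def classify(r, c):
    """Brightness and hue classes for one (R, C) pixel pair."""
    brightness = "bright" if c >= 128 else "dark"
    ratio = r / c if c else 0.0
    # The R/C threshold may itself depend on the C level, as the text notes.
    threshold = 0.5 if c >= 128 else 0.6
    hue = "reddish" if ratio >= threshold else "not-reddish"
    return brightness, hue

# Precompute once; per-pixel lookup then needs no arithmetic at all.
table = [classify(addr >> BITS, addr & MASK) for addr in range(1 << (2 * BITS))]

def lookup(r, c):
    addr = (r << BITS) | c    # {R, C}: R in the upper bits, C in the lower bits
    return table[addr]
```

Precomputing the table trades memory (here 2^16 entries) for the per-pixel arithmetic, which matches the stated advantage of avoiding complicated calculations at run time.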
  • FIG. 14 is a block diagram showing the overall configuration of the image pickup apparatus according to the third embodiment.
  • In FIG. 14, the constituent members common to the apparatus of the first embodiment are designated by the same reference numerals as those in FIG. 1, so duplicate description will be omitted below.
  • In the third embodiment, the configuration of the color image generation processing unit 110 differs from that of the first embodiment.
  • The color image generation processing unit 110 of the third embodiment is configured so that a color image with less visual discomfort can be obtained even when the detected light amount of a pixel exceeds its upper limit value.
  • In the first embodiment, an appropriate color image is obtained as long as the detected light amount of a pixel does not exceed the upper limit and is not saturated, but saturated portions take on a hue that is more reddish than in reality. This is because, at a saturated portion, the difference between the C pixel value and the R pixel value becomes smaller than the actual value, so the values assigned to the G component and the B component become relatively small. The third embodiment can effectively suppress this phenomenon.
  • In addition to a CR image generation processing unit 111 similar to that of the first embodiment, the color image generation processing unit 110 of the third embodiment includes a brightness saturation pixel determination processing unit 1401, a saturated pixel replacement processing unit 1402, an R component brightness correction unit 1403, a G component brightness correction unit 1404, and a B component brightness correction unit 1405.
  • The R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405 together constitute a brightness correction unit.
  • The function of the CR image generation processing unit 111 is the same as in the first embodiment (FIG. 1).
  • The brightness saturation pixel determination processing unit 1401 determines that a pixel whose brightness has reached the upper limit value is a saturated pixel. It is also possible to configure it so that a pixel whose brightness is near the upper limit value is determined to be a saturated pixel even though its detected light amount does not exceed the upper limit value.
  • Based on the determination result of the brightness saturation pixel determination processing unit 1401, the saturated pixel replacement processing unit 1402 switches the brightness of the R component, G component, and B component of a pixel determined to be saturated to the corresponding upper limit values and outputs them. That is, the saturated pixel replacement processing unit 1402 has the function of replacing each component value with its upper limit value so that a saturated pixel is treated as white.
  • The brightness of the pixels of the R image is corrected by the R component brightness correction unit 1403 before being input to the saturated pixel replacement processing unit 1402.
  • Similarly, the brightness of the pixels of the G image and the B image is corrected by the G component brightness correction unit 1404 and the B component brightness correction unit 1405, respectively, before being input to the saturated pixel replacement processing unit 1402.
  • In other words, the color image generation processing unit 110 of the third embodiment includes the R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405, and the brightness of each component is corrected upward before being supplied to the saturated pixel replacement processing unit 1402.
  • The correction amount β in the R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405 is a value of 1 or more, and β = 3 can preferably be set.
  • Alternatively, the saturated pixel replacement processing unit 1402 may adjust the brightness so that it reaches the upper limit value.
  • FIG. 15 is a flowchart showing the procedure by which the color image generation processing unit 110 generates a colorized image from the image captured by the image sensor 101. Steps 701, 702, 703, 901, 902, and 903 are the same as in FIG. 9, so duplicate description is omitted.
  • In step 1501, after step 903, the brightness saturation pixel determination processing unit 1401 determines the presence and positions of saturated pixels (pixels whose brightness is equal to or higher than the upper limit value) in the C image.
  • In step 1502, the pixel values (brightness) of the R component image, the G component image, and the B component image are multiplied by β (β ≥ 1) in the R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405, respectively. This reduces the difference in brightness that appears when the saturated pixels are replaced with white.
  • In step 1503, the pixel values of the R component image, the G component image, and the B component image at the positions of the saturated pixels are replaced with the respective upper limit values. By step 1503, the portions where the pixel value is saturated and the red component would otherwise appear spuriously strong are replaced with white.
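The three steps above can be sketched as follows. The 8-bit upper limit, the value of β, and the flat-list image representation are assumptions for illustration.

```python
# Hypothetical sketch of steps 1501-1503 (third embodiment).
# LIMIT and BETA are illustrative; images are flat lists of pixel values.

LIMIT = 255
BETA = 3   # correction amount of 1 or more

def handle_saturation(c_img, r_img, g_img, b_img):
    # Step 1501: find saturated pixels in the C image.
    saturated = [i for i, v in enumerate(c_img) if v >= LIMIT]
    # Step 1502: correct each component's brightness upward by BETA.
    def boost(img):
        return [min(v * BETA, LIMIT) for v in img]
    r, g, b = boost(r_img), boost(g_img), boost(b_img)
    # Step 1503: replace saturated positions with white (all upper limits).
    for i in saturated:
        r[i] = g[i] = b[i] = LIMIT
    return r, g, b
```

Boosting the non-saturated pixels first narrows the brightness gap between them and the pixels forced to white, which is the stated purpose of the correction units.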
  • The image pickup apparatus and the image processing method according to the fourth embodiment will be described with reference to FIGS. 16 and 17.
  • In the fourth embodiment, the image sensor 101 has a pixel array different from the RCCC pixel array.
  • The filter unit 203 has two types of pixels, a Cy pixel 1601 that detects cyan, the complementary color of red, and a C pixel 202; the pixel arrangement (CyCCC pixel arrangement) repeats a 2 × 2 filter unit 203 composed of one Cy pixel 1601 and three C pixels 202 over a plurality of rows and a plurality of columns.
  • The Cy pixel 1601 has sensitivity to blue light and green light.
  • In this arrangement, the difference image (C − Cy) between the C image and the Cy image serves as the R image.
  • Since the Cy pixel 1601 has sensitivity to blue light and green light, the sensitivity of the image sensor 101 to green and blue light is improved compared with the RCCC pixel array.
  • Further, the Cy pixel 1601 can be given a higher S/N ratio than the R pixel 201, and as a result, the accuracy of hue discrimination can be expected to improve compared with the first embodiment.
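A minimal sketch of deriving the R image from the CyCCC arrangement by per-pixel differencing, as described above (flat-list images and clamping at zero are assumptions):

```python
# Hypothetical sketch: R image as the per-pixel difference C - Cy.

def r_from_c_and_cy(c_img, cy_img):
    # Clamp at 0 in case noise makes Cy exceed C at a pixel.
    return [max(c - cy, 0) for c, cy in zip(c_img, cy_img)]
```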
  • The present invention is not limited to the above-described examples and includes various modifications.
  • The above embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to those having all the described configurations.
  • It is also possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and to add the configuration of another embodiment to the configuration of one embodiment.

Abstract

The purpose of the present invention is to make it possible, when using an image sensor in which two types of pixels are arrayed, to accurately discriminate a prescribed hue, and to achieve both a wide angle of view and high spatial resolution. This imaging device is provided with: an image sensor constructed by repeatedly arraying filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, visible light of wavelengths different from the first wavelength range; an interpolation processing unit configured to generate a first interpolated image obtained by interpolating the positions of the second pixels based on the detected light amounts of the first pixels, and a second interpolated image obtained by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and a color information generation processing unit that determines the hue at a position based on the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.

Description

Imaging device and image processing method
The present invention relates to an imaging device and an image processing method, and more particularly to an imaging device and an image processing method for discriminating the hue of a captured image.
An in-vehicle camera mounted on a vehicle for driving assistance or automated driving is required to have a wide angle of view in order to better detect pedestrians and bicycles darting out from the side of the vehicle. At the same time, for functions such as ACC (Adaptive Cruise Control: constant-speed driving and inter-vehicle distance control), an in-vehicle camera must accurately detect the distance to more distant objects, so high spatial resolution is also required. However, a wider angle of view is generally achieved by shortening the focal length of the camera's objective lens, which lowers the spatial resolution. It is therefore difficult to achieve both a wide angle of view and high spatial resolution by improving the objective lens alone.
On the other hand, techniques that improve spatial resolution by adopting a different type of image sensor are also known. In-vehicle cameras often use a color image sensor with an RGGB (Red-Green-Green-Blue) pixel array (Bayer array) to recognize traffic signals, signs, and the white and orange lines on the road for driving assistance. In recent years, techniques have been proposed that replace this RGGB image sensor with a color image sensor such as one with an RCCC (Red-Clear-Clear-Clear) pixel array. This improves the spatial resolution of the image sensor and makes it possible to achieve both a wide angle of view and high spatial resolution.
An RCCC pixel array is constructed, for example, by arranging one pixel that detects red light (R pixel) and three clear pixels (C pixels) that detect blue, green, and red light together, i.e. four pixels in total, in two rows and two columns (2 × 2) as one unit (unit cell), and repeating that unit cell. The interval between C pixels is one pixel, which differs from the RGGB array, in which the interval between pixels of the same color is two pixels. With the RCCC array, the detection values of the C pixels, which make up 3/4 of all pixels, can be used as-is for the grayscale image referenced in distance measurement, and the interpolation from surrounding pixels that degrades spatial resolution need be applied only at the R pixel positions, which make up 1/4 of all pixels. A higher spatial resolution than with an RGGB pixel array can therefore be expected.
However, in an RCCC pixel array consisting of only two types of pixels, R (Red) and C (Clear), the amount of color information is limited compared with an RGGB pixel array, which detects light amounts separately for the three primary colors red (Red), green (Green), and blue (Blue), so the color-related processing method must be revised.
The application of an RCCC pixel array image sensor to an in-vehicle camera is described in Patent Document 1. However, there is no specific description of how the captured images are processed.
Patent Document 2 also discloses the application of an image sensor with a pixel array consisting of the two types R and C to an in-vehicle camera. In Patent Document 2, the magnitude of the correlation between the image from the image sensor and a checkerboard pattern following the Red/Clear pixel types of the image sensor is used to discriminate the red tail lamps of automobiles from lights such as headlights and street lights. Since this method uses correlation, a statistic, hue is discriminated over an image region containing a certain number of pixels. Therefore, when objects with different hues coexist in such a region, an accurate discrimination result cannot be obtained. Moreover, setting a region so that patterns of different hues do not coexist within it is difficult unless the pattern has a predictable shape and a certain size. Thus, with the prior art, accurate hue discrimination with an RCCC pixel array is difficult. The same problem can arise in image sensors with pixel arrays other than RCCC that consist of two types of pixels.
Patent Document 1: Japanese Unexamined Patent Publication No. 2017-046051
Patent Document 2: U.S. Patent Application Publication No. 2007/0221822
An object of the present invention is to provide an image pickup apparatus and an image processing method that can accurately discriminate a predetermined hue when using an image sensor in which two types of pixels are arrayed, and that can achieve both a wide angle of view and high spatial resolution.
The image pickup apparatus according to the first aspect of the present invention comprises: an image sensor constructed by repeatedly arraying filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, visible light of wavelengths different from the first wavelength range; an interpolation processing unit configured to generate a first interpolated image obtained by interpolating the positions of the second pixels based on the detected light amounts of the first pixels, and a second interpolated image obtained by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and a color information generation processing unit that determines the hue at a position based on the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
The image pickup apparatus according to the second aspect of the present invention comprises: an image sensor constructed by repeatedly arraying filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, visible light of wavelengths different from the first wavelength range; an interpolation processing unit configured to generate a first interpolated image obtained by interpolating the positions of the second pixels based on the detected light amounts of the first pixels, and a second interpolated image obtained by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and a color image generation processing unit that generates a color image based on the first interpolated image and the second interpolated image. The color image generation processing unit is configured to: generate, based on the first interpolated image, a first component image having a first wavelength component in the first wavelength range; generate a difference image that is the difference between the first interpolated image and the second interpolated image; multiply the difference image by a first distribution ratio to generate a second component image having a component in a second wavelength range different from the first wavelength range; and multiply the difference image by a second distribution ratio to generate a third component image having a component in a third wavelength range different from the first wavelength range and the second wavelength range.
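The color image generation of the second aspect can be sketched as follows. Taking the first wavelength range as red, the R image becomes the first (R) component, and the difference image (C − R) is split into the G and B components by distribution ratios. A constant α = 0.5 and a second distribution ratio of 1 − α are assumptions made here for simplicity; the patent allows these ratios to vary (FIGS. 10 and 11).

```python
# Hypothetical sketch of the second aspect's color image generation.
# ALPHA is constant here; the second distribution ratio is assumed 1 - ALPHA.

ALPHA = 0.5

def generate_rgb(c_img, r_img):
    diff = [max(c - r, 0) for c, r in zip(c_img, r_img)]  # difference image
    g = [int(ALPHA * d) for d in diff]          # second component image (G)
    b = [int((1 - ALPHA) * d) for d in diff]    # third component image (B)
    return list(r_img), g, b                    # first component image is R
```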
The image processing method according to the present invention comprises the steps of: acquiring an image from an image sensor constructed by repeatedly arraying filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, visible light of wavelengths different from the first wavelength range; acquiring a first interpolated image by interpolating the positions of the second pixels based on the detected light amounts of the first pixels; acquiring a second interpolated image by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and determining the hue at a position based on the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
According to the present invention, it is possible to provide an image pickup apparatus and an image processing method that can accurately discriminate a predetermined hue when using an image sensor in which two types of pixels are arrayed, achieving both a wide angle of view and high spatial resolution.
FIG. 1 is an overall configuration diagram explaining an example of the basic configuration of the image pickup apparatus according to the first embodiment.
FIG. 2 is a schematic diagram showing the pixel arrangement of an image sensor with an RCCC pixel array.
FIG. 3 is a graph showing the sensitivity characteristics of the R pixels and C pixels of an image sensor with an RCCC pixel array.
FIG. 4 is a schematic diagram showing an example of the arithmetic processing in the interpolation processing unit 102.
FIG. 5 is a graph showing an example of measured C pixel values and R pixel values of an RCCC image of the colors used in road signs.
FIG. 6 is a graph showing an example of measured C pixel values and R pixel values of an RCCC image of the colors used in traffic signals.
FIG. 7 is a flowchart explaining the procedure for obtaining hue information from an image captured by the image sensor 101.
FIG. 8 is a flowchart showing details of the procedure in step 704 of FIG. 7 for discriminating hue based on the pixel value ratio (R/C ratio).
FIG. 9 is a flowchart showing the procedure, in the first embodiment, by which the color image generation processing unit 110 generates a colorized image from the grayscale image captured by the image sensor 101.
FIG. 10 is a graph explaining a first example of the distribution ratio α between the G component image and the B component image.
FIG. 11 is a graph explaining a second example of the distribution ratio α between the G component image and the B component image.
FIG. 12 is an overall configuration diagram explaining an example of the basic configuration of the image pickup apparatus according to the second embodiment.
FIG. 13 is a schematic diagram explaining a configuration example of the color information generation table 1202 of FIG. 12.
FIG. 14 is an overall configuration diagram explaining an example of the basic configuration of the image pickup apparatus according to the third embodiment.
FIG. 15 is a flowchart showing the procedure, in the third embodiment, by which the color image generation processing unit 110 generates a colorized image from an image captured by the image sensor 101.
FIG. 16 is a schematic diagram explaining the pixel array (CyCCC array) of the image sensor 101 of the fourth embodiment.
FIG. 17 is a graph showing the sensitivity characteristics of the Cy pixels and C pixels of an image sensor with a CyCCC pixel array.
Embodiments of the present invention will be described below with reference to the accompanying drawings. In the attached drawings, functionally identical elements may be indicated by the same numbers. The accompanying drawings show embodiments and implementation examples in accordance with the principles of the present disclosure, but they are intended to aid understanding of the present disclosure and are never to be used to interpret the present disclosure in a limiting way. The description herein is merely a typical illustration and does not limit the claims or applications of the present disclosure in any sense.
The description here is detailed enough for those skilled in the art to implement the present disclosure, but other implementations and embodiments are also possible, and changes of configuration and structure and replacement of various elements are possible without departing from the scope and spirit of the technical idea of the present disclosure. The following description must therefore not be interpreted as limited thereto.
[First Embodiment]
An example of the basic configuration of the image pickup apparatus according to the first embodiment will be described with reference to the overall configuration diagram of FIG. 1. The image pickup apparatus 1 includes an image sensor 101, an interpolation processing unit 102, a color information generation processing unit 103, a recognition processing unit 104, a color image generation processing unit 110, and an image recording unit 112.
The image sensor 101 is an image sensor such as a CMOS sensor or a CCD sensor for acquiring an optical image. These image sensors have photodiodes arranged in an array on a plane, and the multiple pixels provided by these photodiodes detect the light amount distribution on that plane. An image of the imaging target is obtained by detecting, with the image sensor 101, light focused by a condensing member such as a lens or mirror (not shown).
The image pickup apparatus 1 of the first embodiment uses, as the image sensor 101, an image sensor with the RCCC pixel array shown in FIG. 2. The RCCC pixel array has two types of pixels: R (Red) pixels 201, which receive light through a color filter that transmits the red portion of visible light and can thereby detect red light, and C (Clear) pixels 202, which receive light through a transparent filter and can detect the total amount of visible light. A 2 × 2 filter unit 203 composed of one R pixel 201 and three C pixels 202 is repeated over a plurality of rows and a plurality of columns.
As shown in the graph of FIG. 3, the R pixel 201 can receive light in the red wavelength range, while the C pixel 202 can receive light over the entire visible wavelength range, including blue, green, and red light. The image output by the image sensor 101 is a grayscale image in which pixel values based on the red light amounts detected by the R pixels 201 and pixel values based on the total visible light amounts detected by the C pixels 202 are mixed.
Based on the grayscale image output by the image sensor 101, the interpolation processing unit 102 obtains the red light component at each C pixel 202 position by interpolating from the detected light amounts of the surrounding R pixels 201. In addition, the interpolation processing unit 102 obtains the total visible light amount at each R pixel 201 position by interpolating from the detected light amounts of the surrounding C pixels 202.
The processing of the interpolation processing unit 102 can be realized, for example, by the operations shown in FIG. 4. In the arithmetic processing shown in FIG. 4, the processing unit is the 3 × 3 block of pixels centered on the pixel to be interpolated. When interpolation is performed on the image sensor 101 with the RCCC pixel array shown in FIG. 2, it can be performed based on four pixel patterns, pattern 1 to pattern 4. Pattern 1 is the case where the nine pixels at the lower right of FIG. 2 form the processing unit; pattern 2, the nine pixels at the lower left; pattern 3, the nine pixels at the upper right; and pattern 4, the nine pixels at the upper left.
In each of patterns 1 to 4, the pixel at the center of the 3 × 3 block is the target of interpolation. For example, in patterns 2 to 4, the central pixel is a C pixel 202, so the R component at the central C pixel 202 is interpolated. In pattern 1, the central pixel is an R pixel 201, so the C component at the central R pixel 201 is interpolated. The interpolation for the target pixel (the center of the 3 × 3 block) is computed from the pixel values of the surrounding pixels.
 In pattern 1, the R pixel 201 (R22) at the center of the 3 × 3 block is the interpolation target, and its C component (C22) is computed from the pixel values of the four C pixels 202 above, below, to the left, and to the right: C22 = (C12 + C32 + C21 + C23) / 4.
 In pattern 2, the C pixel 202 (C22) is at the center of the 3 × 3 block, and its R component (R22) is computed from the pixel values of the two R pixels 201 to its left and right: R22 = (R21 + R23) / 2.
 In pattern 3, the C pixel 202 (C22) is at the center of the 3 × 3 block, and its R component (R22) is computed from the pixel values of the two R pixels 201 above and below it: R22 = (R12 + R32) / 2.
 In pattern 4, the C pixel 202 (C22) is at the center of the 3 × 3 block, and its R component (R22) is computed from the pixel values of the four diagonally adjacent R pixels 201: R22 = (R11 + R13 + R31 + R33) / 4.
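The four interpolation patterns above can be applied mosaic-wide in a vectorized form. The sketch below is illustrative only: it assumes (hypothetically) that the R pixel sits at phase (0, 0) of every 2 × 2 block of the RCCC array, so that pattern 2 applies at phase (0, 1), pattern 3 at phase (1, 0), and pattern 4 at phase (1, 1); the actual phase depends on the sensor layout.

```python
import numpy as np

def interpolate_rccc(raw):
    """Split an RCCC mosaic into full-resolution R and C images.

    Assumed (hypothetical) layout: the R pixel is at phase (0, 0) of
    each 2x2 block; the remaining three pixels are C pixels.
    """
    raw = raw.astype(float)
    p = np.pad(raw, 1, mode="edge")              # replicate the border
    up,   down  = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    ul, ur, dl, dr = p[:-2, :-2], p[:-2, 2:], p[2:, :-2], p[2:, 2:]

    r_img = raw.copy()
    c_img = raw.copy()

    # Pattern 1: at an R pixel, C is the mean of the 4 orthogonal C neighbours.
    c_img[0::2, 0::2] = ((up + down + left + right) / 4.0)[0::2, 0::2]
    # Pattern 2: C pixel with R neighbours to its left and right.
    r_img[0::2, 1::2] = ((left + right) / 2.0)[0::2, 1::2]
    # Pattern 3: C pixel with R neighbours above and below.
    r_img[1::2, 0::2] = ((up + down) / 2.0)[1::2, 0::2]
    # Pattern 4: C pixel whose R neighbours are the 4 diagonal pixels.
    r_img[1::2, 1::2] = ((ul + ur + dl + dr) / 4.0)[1::2, 1::2]
    return r_img, c_img
```

Edge replication keeps the arithmetic simple at the image border; as the text notes, a weighted average or another kernel could replace the plain means.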
 This computation is adapted to the pixel array of FIG. 2; it goes without saying that, for a different pixel array, a different computation of the same character can be adopted. Also, although the above interpolation was performed as a simple average of the surrounding pixel values, a different computation method (for example, a weighted average) can of course be adopted according to the required performance of the imaging device, the state of the environment, and so on.
 By repeating this processing, an image representing the spatial distribution of the amount of red light (R image) and an image with the same number of pixels representing the spatial distribution of the amount of visible light as a whole (C image) can be generated. Following the processing shown in FIG. 4, an interpolation process for correcting image distortion may additionally be applied to the R image and the C image.
 The color information generation processing unit 103 outputs lightness information and hue information of the captured image to the recognition processing unit 104 on the basis of the C image and the R image generated by the interpolation processing unit 102. Referring to these lightness and hue components, the recognition processing unit 104 recognizes traffic signals, road signs, the lights of a preceding vehicle, white lines and orange lines on the road, and the like. The hues of traffic signals and the like are prescribed: for example, traffic signal lights come in three colors, blue-green (cyan), orange, and red, and road signs predominantly use red, blue, white, and orange. As for vehicle lights, tail lamps and brake lamps are red, and turn signals are orange. The recognition processing unit 104 is therefore configured to be able to discriminate these colors from one another.
 The color information generation processing unit 103 can compute the lightness information of the captured image on the basis of, for example, the C image generated by the interpolation processing unit 102.
 The color information generation processing unit 103 contains an R/C value comparison processing unit 105. The R/C value comparison processing unit 105 receives the C image and the R image from the interpolation processing unit 102 and, for pixels located at the same position in the C image and the R image, computes the ratio of the detected amount of R-component light to the detected amount of C-component light (the R/C ratio). The R/C value comparison processing unit 105 then compares this R/C ratio with the reference value of each color to be discriminated, determines the hue, and outputs the determination result as hue information.
 In the present embodiment, on the assumption that the above-mentioned traffic signals, road signs, preceding vehicles, and road-surface lines are to be recognized, four storage units are provided: a red determination reference value storage unit 106, an orange determination reference value storage unit 107, an achromatic determination reference value storage unit 108, and a blue/green determination reference value storage unit 109. The four reference values stored in these storage units are compared with the R/C ratio described above. Although the blue-green (cyan) of traffic signals and the blue of road signs differ in hue, they can be distinguished by brightness and position, so these hues are treated collectively as blue/green.
 The color image generation processing unit 110 colorizes the image using the C image and the R image generated by the interpolation processing unit 102, generating and outputting an R (red) component image, a G (green) component image, and a B (blue) component image corresponding to the three primary colors. The image recording unit 112 is a part that stores the R, G, and B component images generated by the color image generation processing unit 110, and is composed of flash memory, a hard disk, DRAM, or the like. The color image generation processing unit 110 of this first embodiment outputs the input R image to the image recording unit 112 as the R component image, and contains a C-R image generation processing unit 111. As the G component image and the B component image, it outputs images based on the C-R image that the C-R image generation processing unit 111 generates by taking the per-pixel difference between the C image and the R image.
 FIG. 5 shows an example of measured characteristics of the C pixel values and R pixel values of an RCCC image obtained by photographing the colors used in road signs. More specifically, it shows the result of photographing a color chart (X-Rite ColorChecker SG (registered trademark)) under white light of five different brightnesses with a camera equipped with an image sensor having the RCCC pixel array. The C image and the R image are those obtained as a result of the interpolation processing described above. The graph of FIG. 5 plots the relationship between the detected light amount of the C pixels 202 and that of the R pixels 201 for patches extracted from the color chart that are close to the standard safety colors prescribed in JIS Z9103 and used in road signs (for example, red [row 3, column L], orange [row 6, column L], yellow [row 4, column H], white (achromatic) [row 6, column J], and blue [row 3, column F]).
 In FIG. 5, reference numerals 501 and 502 denote, respectively, the plot based on the pixels of the red patch and the C-versus-R characteristic for red light obtained by the least squares method. Similarly, reference numerals 503 and 504 denote the plot based on the orange patch and the C-versus-R characteristic for orange light; 505 and 506 denote the plot based on the yellow patch and the C-versus-R characteristic for yellow light; and 507 and 508 denote the plot based on the white (achromatic) patch and the C-versus-R characteristic for achromatic light. Reference numerals 509 and 510 denote plots based on the blue patch. These data show that the ratio of the detected light amount of the R pixels to that of the C pixels (R/C) is constant for each color regardless of brightness, and that the ratio differs among red, orange, yellow, white (achromatic), and blue, the colors referred to in recognition targets such as road signs for driver assistance with an in-vehicle camera. It is therefore understood that these hues can be distinguished on the basis of R/C.
 FIG. 6 shows an example of measured characteristics of the C pixel values and R pixel values of an RCCC image obtained by photographing the colors used in traffic signals. FIG. 6 likewise shows the result of photographing the above-mentioned color chart under white light of five different brightnesses with a camera equipped with an image sensor having the RCCC pixel array. Patches close to the colors used in traffic signals (red [row 3, column L], orange [row 6, column L], cyan [row 8, column B]) were extracted, and the relationship between the detected light amount of the C pixels and that of the R pixels was plotted. Reference numerals 501, 502, 503, and 504 are the same as in FIG. 5, so redundant description is omitted. In FIG. 6, reference numerals 601 and 602 denote, respectively, the plot based on the cyan patch and the C-versus-R characteristic for cyan light obtained by the least squares method. These data show that the hues of signal colors can likewise be discriminated as in FIG. 5.
 FIG. 7 is a flowchart illustrating the procedure for obtaining hue information from an image captured by the image sensor 101; the processing is executed by the image sensor 101, the interpolation processing unit 102, and the color information generation processing unit 103 of FIG. 1.
 First, in step 701, an image is captured and acquired by the image sensor 101. The acquired image is a grayscale image in which pixel values based on the amount of red light detected by the R pixels 201 of the image sensor 101 with the RCCC pixel array shown in FIG. 2 are intermixed with pixel values based on the amount of visible light as a whole detected by the C pixels 202.
 In the subsequent step 702, the interpolation processing described above is executed for each pixel position of the grayscale image obtained in step 701, and an R image and a C image are generated. Then, in step 703, the ratio of the pixel value of the R image to that of the C image at the same pixel position (the R/C ratio) is obtained, and in step 704 the hue is determined on the basis of the value of the R/C ratio.
 FIG. 8 is a flowchart showing the details of the procedure for determining the hue on the basis of the pixel value ratio (R/C ratio) in step 704. In FIG. 8, Tr is the lower limit of the R/C ratio for determining red, To is the lower limit for determining orange, Ty is the lower limit for determining yellow, and Tg is the lower limit for determining achromatic colors (white, gray, black).
 In step 801, the value of the R/C ratio is compared with the lower limit Tr. If R/C > Tr (Yes), the hue of the pixel is determined to be red. If R/C ≤ Tr (No), the processing proceeds to step 802.
 In step 802, the value of the R/C ratio is compared with the lower limit To. If R/C > To (Yes), the hue of the pixel is determined to be orange. If R/C ≤ To (No), the processing proceeds to step 803.
 In step 803, the value of the R/C ratio is compared with the lower limit Ty. If R/C > Ty (Yes), the hue of the pixel is determined to be yellow. If R/C ≤ Ty (No), the processing proceeds to step 804.
 In step 804, the value of the R/C ratio is compared with the lower limit Tg. If R/C > Tg (Yes), the hue of the pixel is determined to be achromatic (white, gray, black).
 If the determination is "No" in all of steps 801 to 804, the hue of the pixel is determined to be blue or green. This completes the determination procedure of step 704.
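The cascade of FIG. 8 can be sketched directly as a chain of lower-limit comparisons. The threshold values below are illustrative placeholders only (the patent does not give concrete values for Tr, To, Ty, Tg); they are ordered tr > to > ty > tg as the flowchart requires.

```python
def classify_hue(r_val, c_val, tr=0.9, to=0.75, ty=0.6, tg=0.4):
    """Hue decision of FIG. 8: cascaded lower-limit comparisons on R/C.

    tr, to, ty, tg are hypothetical thresholds for red, orange,
    yellow, and achromatic, respectively (tr > to > ty > tg).
    """
    if c_val == 0:                 # guard against division by zero
        return "achromatic"
    rc = r_val / c_val
    if rc > tr:
        return "red"               # step 801: Yes
    if rc > to:
        return "orange"            # step 802: Yes
    if rc > ty:
        return "yellow"            # step 803: Yes
    if rc > tg:
        return "achromatic"        # step 804: Yes
    return "blue/green"            # all comparisons No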
 FIG. 9 is a flowchart showing the procedure by which the color image generation processing unit 110 generates a colorized image from the grayscale image captured by the image sensor 101.
 As in FIG. 7, in steps 701 and 702 a grayscale image is captured and acquired by the image sensor 101, after which the interpolation processing described above is executed for each pixel position of the grayscale image to generate an R image and a C image.
 In the subsequent step 901, the difference image between the pixel values of the generated C image and those of the R image (the C-R image) is generated in the C-R image generation processing unit 111. In step 902, the C-R image generation processing unit 111 generates the image obtained by multiplying the pixel values of the C-R image by a distribution ratio α (0 ≤ α ≤ 1), namely α(C-R), which is taken as the G component image.
 Further, in step 903, the C-R image generation processing unit 111 generates the image obtained by multiplying the pixel values of the C-R image by (1-α), namely (1-α)(C-R), which is taken as the B component image. The distribution ratio α is a value indicating the proportion of the G component image contained in the difference image (C-R). In the following, α may be referred to as the "first distribution ratio" and (1-α) as the "second distribution ratio".
 The sum of the pixel value α(C-R) given to the G component image in step 902 and the pixel value (1-α)(C-R) given to the B component image in step 903 is C-R. In other words, the sum of the first distribution ratio α and the second distribution ratio (1-α) is 1. This relationship holds because, in the wavelength-sensitivity characteristics of the R pixel 201 and the C pixel 202 shown in FIG. 3, the wavelength range over which the C pixel 202 is sensitive is wider than that of the R pixel 201 by the G and B components, so the detected light amount of the C pixel 202 is estimated to exceed that of the R pixel 201 by the sum of the G and B components. It is also possible to further apply processing (not shown) that increases or decreases the brightness of each component to the obtained R, G, and B component images.
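Steps 901 to 903 amount to one subtraction and two scalings per pixel. A minimal sketch, assuming NumPy arrays for the interpolated images; the clipping of negative differences to zero is a practical guard added here, not something the text specifies:

```python
import numpy as np

def colorize(r_img, c_img, alpha=0.5):
    """Steps 901-903: build G and B from the difference image C - R.

    alpha is the first distribution ratio: G = alpha*(C-R) and
    B = (1-alpha)*(C-R), so G + B = C - R wherever C >= R.
    """
    diff = np.clip(c_img - r_img, 0.0, None)   # step 901 (C-R image)
    g_img = alpha * diff                       # step 902
    b_img = (1.0 - alpha) * diff               # step 903
    return r_img, g_img, b_img
```

With alpha = 0.5 the difference is split evenly between G and B, matching the first example of FIG. 10.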
 FIG. 10 is a graph explaining a first example of the distribution ratio α between the G component image and the B component image. In the graph of FIG. 10, the vertical axis shows the value of the R/C ratio (from 0 to 1), and the horizontal axis shows the ratio of the G component image to the sum of the G and B component images (G + B), that is, the first distribution ratio α (from 0 to 1). The approximate distribution of each hue in the space spanned by the two axes is indicated by dotted lines.
 As shown in FIG. 10, when only the magnitude of the R/C ratio is used as a factor, it is possible to determine whether a pixel is red, or cyan or gray. However, the R/C ratio alone cannot distinguish hues that lie side by side horizontally in the graph of FIG. 10. For example, blue, cyan, and green cannot be distinguished from one another, and pink cannot be distinguished from yellow (or orange) by the magnitude of the R/C ratio alone.
 FIG. 10 illustrates hue determination when the value of α is fixed at 0.5. That is, in FIG. 10 the locus 1001 shows the change in the R/C ratio, with α fixed at 0.5 regardless of the R/C ratio. As the R/C ratio rises from 0 toward 1, the hue of the pixel changes in order through cyan, achromatic, an intermediate color between pink and orange, and red. When the imaging device of the first embodiment is applied to an in-vehicle camera, the color of a clear sky and of green (cyan) traffic lights, the color of the road surface and white lines (achromatic), and red lights and vehicle tail lamps (red) are reproduced particularly well. It is therefore possible to obtain a colorized image with little visual discomfort from the image sensor 101 with the RCCC pixel array.
 FIG. 11 is a graph explaining a second example of the distribution ratio α between the G component image and the B component image. The difference from FIG. 10 is that the distribution ratio α is varied so that, at least while the R/C ratio is within a predetermined range, α changes as the R/C ratio changes. This takes into account that, when the imaging device 1 is used as an in-vehicle camera, orange appears more frequently than pink in the captured images.
 The locus 1101 in FIG. 11 shows the change in the R/C ratio. In FIG. 11, when the R/C ratio is smaller than a first value (for example, R/C < 0.25), the distribution ratio α is fixed at a constant value (for example, α = 0.5). When the R/C ratio is between the first value and a second value, α is made larger as the R/C ratio becomes larger. When the R/C ratio is larger than the second value (for example, R/C > 0.75), α is fixed at a constant value (for example, α = 1). The function representing the locus 1101 can be stored in a storage unit (not shown) in the color image generation processing unit 110. It goes without saying that the locus 1101 in FIG. 11 is merely an example, and its shape can be changed as appropriate.
 When the distribution ratio α is varied with the R/C ratio in this way, as in the locus 1101, the hue of the pixel can be determined to be cyan or gray while the R/C ratio is small (α = 0.5). When the R/C ratio lies between the first value and the second value, α increases as the R/C ratio increases, so the hue assigned to the pixel changes through cyan, achromatic, orange, and red. When the imaging device 1 is used as an in-vehicle camera, the color of a clear sky and of green (cyan) traffic lights, the color of the road surface and white lines (achromatic), and red lights and tail lamps (red) are reproduced particularly well. In addition, compared with the example of FIG. 10, the reproduction of colors such as orange lines on the road surface, yellow signals, and turn signals can be improved.
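A locus of the kind shown in FIG. 11 can be realized as a continuous piecewise-linear function of the R/C ratio. The breakpoints and plateau values below follow the examples given in the text (0.25, 0.75, α = 0.5, α = 1); any other continuous shape would serve equally well.

```python
def alpha_from_rc(rc, rc1=0.25, rc2=0.75, a1=0.5, a2=1.0):
    """Locus 1101 of FIG. 11 as a continuous piecewise-linear function.

    Below rc1 alpha is fixed at a1; above rc2 it is fixed at a2;
    between the two it rises linearly, so the curve is continuous
    over the whole range 0 <= R/C <= 1.
    """
    if rc < rc1:
        return a1
    if rc > rc2:
        return a2
    return a1 + (a2 - a1) * (rc - rc1) / (rc2 - rc1)
```

Because the two linear pieces meet the plateaus exactly at rc1 and rc2, the function has no jumps, which avoids the hue noise near a discontinuity mentioned below.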
 Note that the locus 1101 in FIG. 11 is preferably set to be a continuous curve over the entire range in which the R/C ratio varies from 0 to 1. If the locus 1101 were a discontinuous curve, the hue would change sharply near the point of discontinuity, causing a problem of increased noise.
 As described above, according to the imaging device 1 and the image processing method of the first embodiment, predetermined hues can be discriminated accurately even when an image sensor having two types of pixels, such as the RCCC pixel array, is applied. Compared with an RGGB pixel array, the RCCC pixel array makes it possible to achieve both a wide angle of view and improved spatial resolution, but the color information that can be obtained is normally limited. In this first embodiment, however, an image reflecting the colors of objects can be generated, just as with an imaging device using an RGGB-array image sensor.
 Although the first embodiment has shown an example of the RCCC pixel array, the ratio of pixel counts need not be 1:3 as long as the array consists of two types of pixels: pixels with a red color filter and pixels with a transparent color filter. Also, although the embodiment uses the combination of pixels with a red color filter and pixels with a transparent color filter in order to emphasize recognition accuracy for red objects (traffic signals, tail lamps, signs) in in-vehicle camera applications, a combination of pixels with a non-red color filter and pixels with a transparent color filter may be used depending on the application.
[Second Embodiment]
 Next, the imaging device and image processing method according to the second embodiment will be described with reference to FIGS. 12 and 13. FIG. 12 is a block diagram showing the overall configuration of the imaging device of the second embodiment. In FIG. 12, components in common with the device of the first embodiment are given the same reference numerals as in FIG. 1, and redundant description is omitted below.
 The second embodiment differs from the first embodiment in the configuration of the color information generation processing unit 103. The color information generation processing unit 103 of the second embodiment includes an address generation unit 1201 and a color information generation table 1202.
 The address generation unit 1201 is configured to receive the C image and the R image and output a corresponding address signal. Specifically, the address generation unit 1201 generates an address signal corresponding to the pair of pixel values at the same pixel position in the input C image and R image, and outputs it to the color information generation table 1202.
 The color information generation table 1202 stores, as a table, address signals and the lightness information and hue information corresponding to each address signal. On the basis of the address information input from the address generation unit 1201, the color information generation table 1202 identifies and outputs the corresponding lightness information and hue information.
 The color information generation table 1202 can employ, for example, the data structure shown in FIG. 13. In the example of FIG. 13, the address signal supplied from the address generation unit 1201 is generated by concatenating the R pixel value and the C pixel value into the upper and lower bits, as in {R, C}, and the table stores, as the data for each address, the hue information corresponding to the R/C ratio of the R and C pixel values that the address represents. With such a configuration, hue information can be generated simply, without requiring complicated calculations.
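The table of FIG. 13 can be sketched as a precomputed array indexed by the concatenated address {R, C}. The hue thresholds below are illustrative placeholders (the patent gives no concrete values), and the bit width is a parameter; only the addressing scheme follows the text.

```python
def build_color_table(bits=8, thresholds=((0.9, "red"), (0.75, "orange"),
                                          (0.6, "yellow"), (0.4, "achromatic"))):
    """Sketch of the color information generation table 1202.

    Address = {R, C}: R pixel value in the upper bits, C pixel value
    in the lower bits. Each entry holds a hue label derived from the
    R/C ratio of that (R, C) pair. Thresholds are hypothetical.
    """
    n = 1 << bits
    table = [""] * (n * n)
    for r in range(n):
        for c in range(n):
            rc = r / c if c else 0.0
            hue = "blue/green"
            for limit, label in thresholds:
                if rc > limit:
                    hue = label
                    break
            table[(r << bits) | c] = hue    # concatenate R and C
    return table

def lookup(table, r, c, bits=8):
    """Runtime lookup: one address computation, no ratio arithmetic."""
    return table[(r << bits) | c]
```

Because the ratio comparison is done once at table-build time, the per-pixel cost at runtime is a single indexed read, which is what makes this variant simpler than the comparison cascade of the first embodiment.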
 Furthermore, in this second embodiment it is also easy to generate color information that adds a lightness classification based on the level of the C pixel value to the hue classification based on the R/C ratio. An additional effect is that more elaborate color determinations become possible, such as changing the thresholds of the R/C ratio classes, or the number of classes, according to the level of the C pixel value.
[Third Embodiment]
 Next, the imaging device and image processing method according to the third embodiment will be described with reference to FIGS. 14 and 15. FIG. 14 is a block diagram showing the overall configuration of the imaging device of the third embodiment. In FIG. 14, components in common with the device of the first embodiment are given the same reference numerals as in FIG. 1, and redundant description is omitted below.
 The imaging device of this third embodiment differs from the first embodiment in the configuration of the color image generation processing unit 110. The color image generation processing unit 110 of this third embodiment is configured so that a color image with little visual discomfort can be obtained even when the detected light amount of a pixel exceeds its upper limit. With the image generation processing of the preceding embodiments, an appropriate color image is obtained where the detected light amount of the pixels does not exceed the upper limit and is not saturated, but saturated areas take on a hue that is redder than reality. This is because, in saturated areas, the difference between the C pixel value and the R pixel value becomes smaller than it really is, so the values given to the G and B components become relatively small. The third embodiment can effectively suppress this phenomenon.
 As shown in FIG. 14, the color image generation processing unit 110 of the third embodiment includes, in addition to a C-R image generation processing unit 111 similar to that of the first embodiment, a lightness-saturated pixel determination processing unit 1401, a saturated pixel replacement processing unit 1402, an R component lightness correction unit 1403, a G component lightness correction unit 1404, and a B component lightness correction unit 1405. The R, G, and B component lightness correction units 1403 to 1405 together constitute a lightness correction unit.
 The function of the C-R image generation processing unit 111 is the same as in the first embodiment (FIG. 1). When the detected light amount of a pixel in the C image exceeds the upper limit value, the brightness-saturation pixel determination processing unit 1401 determines that the brightness of that pixel is saturated. The unit may also be configured to judge as saturated a pixel whose detected light amount does not exceed the upper limit value but whose brightness lies near it.
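The saturation test performed by the determination unit 1401 can be sketched as follows. The function name, the list-of-lists image representation, and the optional near-limit margin are illustrative assumptions for this sketch, not details taken from the patent.

```python
UPPER_LIMIT = 255  # assumed full-scale value of an 8-bit C image

def saturation_mask(c_image, margin=0):
    """Flag pixels whose detected value is at the upper limit (or, with
    margin > 0, within `margin` of it) as brightness-saturated."""
    threshold = UPPER_LIMIT - margin
    return [[value >= threshold for value in row] for row in c_image]
```

With `margin=8`, for example, `saturation_mask([[10, 255], [250, 128]], margin=8)` flags every pixel of value 247 or above, implementing the optional "near the upper limit" variant mentioned above.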
 Based on the determination result of the brightness-saturation pixel determination processing unit 1401, the saturated pixel replacement processing unit 1402 switches the brightness of the R, G, and B components of each pixel judged to be saturated to the corresponding upper limit value and outputs the result. In other words, the saturated pixel replacement processing unit 1402 replaces each component value with its upper limit so that saturated pixels are treated as white.
 The brightness of the pixels of the R image is corrected by the R component brightness correction unit 1403 before being input to the saturated pixel replacement processing unit 1402. Similarly, the brightness of the pixels of the G image and the B image is corrected by the G component brightness correction unit 1404 and the B component brightness correction unit 1405, respectively, before being input to the saturated pixel replacement processing unit 1402.
 When 8 bits are allocated to each of the R, G, and B components (a maximum value of 255 per component), the RGB sum representing brightness is 255 × 3 = 765 for white. The C image input to the C-R image generation processing unit 111, on the other hand, is likewise an 8-bit image with a maximum value of 255, so there is roughly a three-fold difference in full-scale brightness.
 Consequently, if the output of the C-R image generation processing unit 111 were fed directly to the saturated pixel replacement processing unit 1402, a brightness step of roughly a factor of three would arise between the maximum brightness of unsaturated pixels and the brightness of saturated pixels adjusted to the upper limit value, yielding a visually jarring image. The color image generation processing unit 110 of the third embodiment therefore includes the R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405, which correct the brightness of each component upward before supplying it to the saturated pixel replacement processing unit 1402. As one example, the correction factor β applied in the units 1403, 1404, and 1405 is a value of 1 or more, preferably β ≈ 3. If the brightness of an R, G, or B component image adjusted by the factor β exceeds its upper limit value, the saturated pixel replacement processing unit 1402 clamps it to that upper limit value.
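A minimal sketch of this β correction with clamping, assuming an 8-bit upper limit of 255 and the default β ≈ 3 named above (function name and rounding behavior are assumptions of this sketch):

```python
def brighten_component(value, beta=3.0, upper=255):
    """Scale one R, G, or B component value by beta (beta >= 1), then
    clamp to the upper limit, mirroring the brightness correction units
    1403-1405 feeding the replacement unit 1402."""
    return min(round(value * beta), upper)
```

For instance, a component value of 50 becomes 150, while a value of 100 would scale to 300 and is clamped to 255.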
 FIG. 15 is a flowchart showing the procedure by which the color image generation processing unit 110 generates a colorized image from the image captured by the image sensor 101. Steps 701, 702, 703, 901, 902, and 903 are the same as in FIG. 9, and duplicate description is omitted.
 In step 1501, which follows step 903, the brightness-saturation pixel determination processing unit 1401 determines the presence and positions of saturated pixels in the C image (pixels whose brightness is at or near the upper limit value).
 In the subsequent step 1502, the R component brightness correction unit 1403, the G component brightness correction unit 1404, and the B component brightness correction unit 1405 multiply the pixel values (brightness) of the R, G, and B component images by β (β ≥ 1). This reduces the brightness step that arises when saturated pixels are replaced with white.
 Then, in step 1503, the pixel values of the R, G, and B component images at the positions of the saturated pixels are replaced with the respective upper limit values. This step replaces with white the locations where the pixel value is saturated and a spuriously strong red component would otherwise be displayed. The device configuration and processing method described above provide, in addition to the effects of the first embodiment, the further effect of obtaining a color image with little visual discomfort even when the detected light amount of a pixel exceeds the upper limit value.
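Steps 1501 to 1503 can be combined into a single per-pixel pass. The sketch below assumes flat per-pixel component lists, an 8-bit range, and a precomputed saturation flag per pixel; none of these representational choices are specified in the patent.

```python
def apply_saturation_handling(r, g, b, saturated, beta=3.0, upper=255):
    """For each pixel: if it was judged saturated (step 1501), output
    white (step 1503); otherwise brighten its R, G, B values by beta
    and clamp to the upper limit (step 1502)."""
    out = []
    for rv, gv, bv, sat in zip(r, g, b, saturated):
        if sat:
            out.append((upper, upper, upper))  # saturated -> white
        else:
            out.append(tuple(min(round(v * beta), upper) for v in (rv, gv, bv)))
    return out
```

A saturated pixel thus comes out as (255, 255, 255), while an unsaturated pixel with components (40, 40, 40) is brightened to (120, 120, 120), avoiding the three-fold brightness step.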
[Fourth Embodiment]
 Next, an imaging device and an image processing method according to a fourth embodiment will be described with reference to FIGS. 16 and 17. As shown in FIG. 16, in the fourth embodiment the image sensor 101 has a pixel array different from the RCCC pixel array. In this image sensor 101, each filter unit 203 has two types of pixels: a Cy pixel 1601, which detects cyan, the complementary color of red, and C pixels 202. The pixel arrangement (a CyCCC pixel arrangement) repeats a 2 × 2 filter unit 203, consisting of one Cy pixel 1601 and three C pixels 202, over a plurality of rows and columns.
 As shown in FIG. 17, the Cy pixel 1601 is sensitive to blue light and green light. The difference image between the C image and the Cy image (C − Cy) is therefore an R image. By applying the same processing as in the first embodiment to the C image and R image obtained in this way, the same effects as in the first embodiment can be achieved. Because the Cy pixel 1601 is sensitive to blue and green light, the sensitivity of the image sensor 101 to green and blue light is improved compared with the RCCC pixel array. When white light is incident on the image sensor 101, a larger amount of light is detected than with the R pixel 201, so across an entire image containing subjects of various hues, the Cy pixel 1601 tends to provide a higher S/N ratio than the R pixel 201; as a result, the accuracy of hue discrimination can be expected to improve over the first embodiment.
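The relation C − Cy = R used by the fourth embodiment amounts to a per-pixel subtraction of the two interpolated images. The sketch below illustrates this; the clamp at zero for noise-induced negative differences is an added safeguard of this sketch, not something stated in the patent.

```python
def r_image_from_c_and_cy(c_image, cy_image):
    """Since a C pixel responds to red, green, and blue light while a
    Cy pixel responds to green and blue light, subtracting the Cy image
    from the C image pixel by pixel leaves the red component."""
    return [[max(c - cy, 0) for c, cy in zip(c_row, cy_row)]
            for c_row, cy_row in zip(c_image, cy_image)]
```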
 The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments have been described in detail for ease of understanding, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of one embodiment can be added to the configuration of another. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
101 ... image sensor, 102 ... interpolation processing unit, 103 ... color information generation processing unit, 104 ... recognition processing unit, 110 ... color image generation processing unit, 111 ... C-R image generation processing unit, 112 ... image recording unit, 201 ... R pixel, 202 ... C pixel, 203 ... filter unit, 1201 ... address generation unit, 1202 ... color information generation table, 1401 ... brightness-saturation pixel determination processing unit, 1402 ... saturated pixel replacement processing unit, 1403 ... R component brightness correction unit, 1404 ... G component brightness correction unit, 1405 ... B component brightness correction unit, 1601 ... Cy pixel.

Claims (15)

  1.  An imaging device comprising:
     an image sensor configured by repeatedly arranging filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, light of a visible wavelength different from the first wavelength range;
     an interpolation processing unit configured to generate a first interpolated image obtained by interpolating the positions of the second pixels based on the detected light amounts of the first pixels, and a second interpolated image obtained by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and
     a color information generation processing unit that determines a hue at a position based on the detected light amounts of the pair of pixels at that same position in the first interpolated image and the second interpolated image.
  2.  The imaging device according to claim 1, wherein the color information generation processing unit determines the hue at a position based on the ratio of the detected light amounts of the pair of pixels at that same position in the first interpolated image and the second interpolated image.
  3.  The imaging device according to claim 1 or 2, further comprising a color image generation processing unit that generates a color image based on the first interpolated image and the second interpolated image,
     wherein the color image generation processing unit is configured to:
     generate, based on the first interpolated image, a first component image having a first wavelength component in the first wavelength range;
     generate a difference image that is the difference between the first interpolated image and the second interpolated image;
     multiply the difference image by a first distribution ratio to generate a second component image having a component in a second wavelength range different from the first wavelength range; and
     multiply the difference image by a second distribution ratio to generate a third component image having a component in a third wavelength range different from the first wavelength range and the second wavelength range.
  4.  The imaging device according to claim 3, wherein the first distribution ratio or the second distribution ratio is held constant regardless of changes in the ratio of the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
  5.  The imaging device according to claim 3, wherein the first distribution ratio or the second distribution ratio changes according to changes in the ratio of the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
  6.  The imaging device according to any one of claims 1 to 3, wherein the first pixel is a pixel capable of detecting red light, and the second pixel is a pixel capable of detecting red light, green light, and blue light.
  7.  The imaging device according to any one of claims 1 to 3, wherein the first pixel is a pixel capable of detecting blue light and green light, and the second pixel is a pixel capable of detecting red light, green light, and blue light.
  8.  An imaging device comprising:
     an image sensor configured by repeatedly arranging filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, light of a visible wavelength different from the first wavelength range;
     an interpolation processing unit configured to generate a first interpolated image obtained by interpolating the positions of the second pixels based on the detected light amounts of the first pixels, and a second interpolated image obtained by interpolating the positions of the first pixels based on the detected light amounts of the second pixels; and
     a color image generation processing unit that generates a color image based on the first interpolated image and the second interpolated image,
     wherein the color image generation processing unit is configured to:
     generate, based on the first interpolated image, a first component image having a first wavelength component in the first wavelength range;
     generate a difference image that is the difference between the first interpolated image and the second interpolated image;
     multiply the difference image by a first distribution ratio to generate a second component image having a component in a second wavelength range different from the first wavelength range; and
     multiply the difference image by a second distribution ratio to generate a third component image having a component in a third wavelength range different from the first wavelength range and the second wavelength range.
  9.  The imaging device according to claim 8, wherein the color image generation processing unit further comprises:
     a brightness-saturation pixel determination processing unit that determines, in the second interpolated image, saturated pixels whose brightness is saturated; and
     a saturated pixel replacement processing unit that replaces the brightness of the saturated pixels with an upper limit value.
  10.  The imaging device according to claim 9, wherein the saturated pixel replacement processing unit further comprises a brightness correction unit that corrects the brightness of the first interpolated image, of the image obtained by multiplying the difference image by the first distribution ratio, and of the image obtained by multiplying the difference image by the second distribution ratio, and
     wherein the saturated pixel replacement processing unit replaces the brightness of the saturated pixels included in the images after brightness correction by the brightness correction unit with the upper limit value.
  11.  An image processing method comprising the steps of:
     acquiring an image from an image sensor configured by repeatedly arranging filter units each including a first pixel that detects light in a first wavelength range of the visible region and a second pixel that detects, in addition to light in the first wavelength range, light of a visible wavelength different from the first wavelength range;
     interpolating the positions of the second pixels based on the detected light amounts of the first pixels to acquire a first interpolated image;
     interpolating the positions of the first pixels based on the detected light amounts of the second pixels to acquire a second interpolated image; and
     determining a hue at a position based on the detected light amounts of the pair of pixels at that same position in the first interpolated image and the second interpolated image.
  12.  The image processing method according to claim 11, wherein the step of determining the hue determines the hue at a position based on the ratio of the detected light amounts of the pair of pixels at that same position in the first interpolated image and the second interpolated image.
  13.  The image processing method according to claim 11 or 12, further comprising a step of generating a color image based on the first interpolated image and the second interpolated image,
     wherein the step of generating the color image:
     generates, based on the first interpolated image, a first component image having a first wavelength component in the first wavelength range;
     generates a difference image that is the difference between the first interpolated image and the second interpolated image;
     multiplies the difference image by a first distribution ratio to generate a second component image having a component in a second wavelength range different from the first wavelength range; and
     multiplies the difference image by a second distribution ratio to generate a third component image having a component in a third wavelength range different from the first wavelength range and the second wavelength range.
  14.  The image processing method according to claim 13, wherein the first distribution ratio or the second distribution ratio is held constant regardless of changes in the ratio of the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
  15.  The image processing method according to claim 13, wherein the first distribution ratio or the second distribution ratio changes according to changes in the ratio of the detected light amounts of the pair of pixels at the same position in the first interpolated image and the second interpolated image.
PCT/JP2020/008036 2019-03-27 2020-02-27 Imaging device and image processing method WO2020195515A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080017627.1A CN113574851B (en) 2019-03-27 2020-02-27 Image pickup apparatus and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-060777 2019-03-27
JP2019060777A JP7116001B2 (en) 2019-03-27 2019-03-27 Imaging device and image processing method

Publications (1)

Publication Number Publication Date
WO2020195515A1 true WO2020195515A1 (en) 2020-10-01

Family

ID=72609045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/008036 WO2020195515A1 (en) 2019-03-27 2020-02-27 Imaging device and image processing method

Country Status (3)

Country Link
JP (1) JP7116001B2 (en)
CN (1) CN113574851B (en)
WO (1) WO2020195515A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05219511A (en) * 1984-07-31 1993-08-27 Rca Corp Camera device using color coding filter
JP2009290795A (en) * 2008-05-30 2009-12-10 Sharp Corp Image processor, image processing method, image processing program, recording medium, and electronic information device
JP2013197670A (en) * 2012-03-16 2013-09-30 Ricoh Co Ltd Imaging device, object detection device, vehicle travel support image processing system, and vehicle
JP2015008391A (en) * 2013-06-25 2015-01-15 キヤノン株式会社 Image processing apparatus, image processing method, and image processing program
JP2016111647A (en) * 2014-12-10 2016-06-20 株式会社日本自動車部品総合研究所 Image processing apparatus and lane borderline recognition system
JP2017046051A (en) * 2015-08-24 2017-03-02 株式会社デンソー On-vehicle camera device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1524625A2 (en) * 2003-10-17 2005-04-20 Matsushita Electric Industrial Co., Ltd. Enhancement of interpolated image
JP2006135564A (en) * 2004-11-05 2006-05-25 Casio Comput Co Ltd Device and method for pixel interpolation
JP4726065B2 (en) * 2006-01-11 2011-07-20 株式会社山武 Edge detection method and edge detection apparatus
JP5904213B2 (en) * 2012-01-24 2016-04-13 ソニー株式会社 Image processing apparatus, image processing method, and program
WO2016079831A1 (en) * 2014-11-19 2016-05-26 オリンパス株式会社 Image processing device, image processing method, image processing program and endoscopic device
JP2016132533A (en) * 2015-01-20 2016-07-25 株式会社栗本鐵工所 Conveying device
JP6732282B2 (en) * 2016-06-03 2020-07-29 株式会社永木精機 Coated wire stripping device and coated wire stripping method
US11172172B2 (en) * 2016-12-30 2021-11-09 Texas Instruments Incorporated Efficient and flexible color processor


Also Published As

Publication number Publication date
CN113574851B (en) 2023-02-07
JP7116001B2 (en) 2022-08-09
JP2020162034A (en) 2020-10-01
CN113574851A (en) 2021-10-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20778174

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20778174

Country of ref document: EP

Kind code of ref document: A1