WO2022059139A1 - Image display device and image display method - Google Patents


Info

Publication number
WO2022059139A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
flare
light source
region
source position
Prior art date
Application number
PCT/JP2020/035284
Other languages
French (fr)
Japanese (ja)
Inventor
Hayato Kikuta
Toshiaki Kubo
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2020/035284 priority Critical patent/WO2022059139A1/en
Priority to JP2022550267A priority patent/JP7355252B2/en
Publication of WO2022059139A1 publication Critical patent/WO2022059139A1/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to an image display device, such as an electronic mirror device, and an image display method.
  • In conventional devices, image correction that reduces brightness values is performed so that the user's visibility is not impaired by flare generated by a strong local light source such as the headlights of a rear vehicle. For example, in Patent Document 1, the brightness values of concentric regions around the center of the high-luminance region produced by the headlights of the rear vehicle in an image of the area behind the own vehicle are reduced, while the reduction rate at the central portion is suppressed, so that only the centers of the headlights of the rear vehicle remain bright and the driver can easily grasp the position of the rear vehicle.
  • The present disclosure has been made to solve the above-mentioned problems, and an object of the present disclosure is to provide an image display device that displays images with improved visibility.
  • The image display device of the present disclosure includes: an image pickup unit that captures an image of the outside of a vehicle; a light source position detection unit that detects, as a light source position, a pixel region in which pixels having at least a preset first brightness value in the captured input image are gathered adjacent to one another; a flare area determination unit that determines, as a flare region, a pixel region having at least a preset second brightness value among the pixels around the center of the light source position; a corrected image generation unit that generates a corrected image by reducing the brightness values of pixels of the input image, at least within the flare region of the captured input image, that are equal to or higher than a preset third brightness value; a composite image generation unit that, in at least a part of the flare region, generates a composite image by combining the corrected image and the input image so that the proportion of the corrected image increases closer to the center of the light source position and the proportion of the input image increases farther from it; and an image display unit that displays the composite image in the pixel region where it is generated, the corrected image or the input image in the remainder of the flare region, and the input image outside the flare region.
  • The image display method of the present disclosure includes a step of detecting, as a light source position, a pixel region in which pixels having at least a preset first brightness value in an image captured of the outside of a vehicle are gathered adjacent to one another.
  • It further includes a step of generating a display image in which the corrected image or the input image is shown in the flare region and the input image is shown outside the flare region.
  • According to the present disclosure, the light source position is detected from the brightness values of the captured input image, and in at least a part of the flare region determined from the center of the light source position, a composite image is displayed in which the proportion of the corrected image increases closer to the center of the light source position and the proportion of the input image increases farther from it. Discontinuities in the brightness values of the displayed image are thereby suppressed, and an image with high visibility can be displayed.
  • FIG. 1 is a schematic configuration diagram of the image display device of the first embodiment.
  • FIG. 2 is a schematic block diagram of the image display device of the first embodiment.
  • FIGS. 3A to 3C are explanatory views illustrating control of the image processing unit of the first embodiment.
  • FIGS. 4A and 4B are graphs showing the function F used in the composite image generation unit of the first embodiment.
  • FIGS. 5A to 5D are examples of processed images in the image display device of the first embodiment.
  • FIG. 6 is an example of captured input images at successive times for explaining the second embodiment.
  • FIG. 7 is a schematic block diagram of the image display device of the second embodiment.
  • FIG. 8 is a schematic block diagram of the image display device of the third embodiment.
  • the image display device 1 includes an image pickup unit 2 that captures an image of the outside of the own vehicle, an image processing unit 3 that processes the captured image, and an image display unit 4 that displays the processed image.
  • the image pickup unit 2 has one or more cameras that output an image pickup input image 110 that images the outside of the own vehicle.
  • an example in which the image pickup unit 2 takes an image of the situation behind the own vehicle will be described.
  • For the camera, for example, a two-dimensional image sensor using a CMOS (Complementary Metal-Oxide-Semiconductor) or CCD (Charge-Coupled Device) element may be used.
  • The image display unit 4 is mounted in the own vehicle at a position where the user can easily see it: for example, near the A-pillars on the left and right of the front of the vehicle, near the door trim at the front of the vehicle's sides, and so on.
  • For the image display unit 4, a display device having a liquid crystal filter such as an LCD (Liquid Crystal Display) panel and a backlight composed of, for example, LEDs (Light Emitting Diodes) and a diffuser is used. Alternatively, it may be a display device in which organic light-emitting diodes are arranged in pixel units.
  • the image display unit 4 displays a display image obtained by processing the image pickup input image 110 described later.
  • the image processing unit 3 is partially or wholly composed of a processing circuit.
  • the plurality of functions of the image processing unit 3 may be realized by separate processing circuits, or the functions of the plurality of parts may be collectively realized by one processing circuit.
  • the processing circuit may be configured by hardware or software, that is, a programmed computer.
  • a part may be realized by hardware and the other part may be realized by software.
  • the hardware for example, FPGA (Field-Programmable Gate Array), microprocessor, microcontroller, DSP (Digital Signal Processor) or the like may be used.
  • The software is, for example, a program executed on a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
  • FIG. 2 is a schematic block diagram of the image display device 1.
  • FIGS. 3A to 3C are explanatory views for explaining the processing control of the image processing unit; FIG. 3A shows a captured input image, FIG. 3B shows light source position information, and FIG. 3C shows an example of flare region information.
  • the image processing unit 3 includes a light source position detection unit 21, a flare area determination unit 22, a correction image generation unit 23, and a composite image generation unit 24.
  • The light source position detection unit 21 outputs the light source position information 31, indicating the light source position 41 detected from the captured input image 110, to the flare area determination unit 22 and the composite image generation unit 24, and the flare area determination unit 22 outputs the flare area information 32, indicating the determined flare region 42, to the composite image generation unit 24.
  • the corrected image generation unit 23 generates a corrected image 12 in which the brightness value of the input image 11 in at least the flare region 42 of the captured input image 110 is reduced, and outputs the corrected image 12 to the composite image generation unit 24.
  • the composite image generation unit 24 generates a composite image 13 in which the composite ratio of the input image 11 and the corrected image 12 is changed based on the distance D from the center of the light source position 41 in at least a part of the flare region 42.
  • The image display unit 4 displays the composite image 13 in the region where the composite image 13 is generated, the corrected image 12 or the input image 11 in the part of the flare region 42, including its end De, where the composite image 13 is not generated, and the input image 11 in the region outside the flare region 42.
  • the above-mentioned image processing is performed in each of the flare areas 42, and the input image 11 is displayed on the image display unit 4 in the areas other than the flare areas 42.
  • The light source position detection unit 21 receives the captured input image 110 shown in FIG. 3A output from the image pickup unit 2, detects as the light source position 41 a pixel region in which pixels having at least the preset first luminance value Th1 are gathered adjacent to one another, and outputs the result as the light source position information 31 shown in FIG. 3B.
  • the light source position information 31 is, for example, two-dimensional map information having the same size as the captured input image 110, and indicates the light source position 41 in which the white portion in FIG. 3B is detected.
  • The light source position detection unit 21 performs gradation processing on the pixels of the captured input image 110 to calculate their luminance values, and detects high-luminance pixels whose luminance value is equal to or higher than the first luminance value Th1 as the light source position 41.
  • For the luminance value, when the image pickup element outputs gradation values of three colors obtained from an RGB color filter, for example, they are converted into the YCbCr color space composed of a luminance component and color-difference components, and the Y component is used as the luminance value.
  • the luminance value becomes high when strong light is projected onto the imaging device, when imaging noise or circuit noise appears locally, and the like.
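As a sketch of the conversion above, the Y component can be computed from RGB gradation values with standard luma weights. The BT.601 coefficients below are an assumption for illustration; the disclosure only states that the RGB values are converted to YCbCr and the Y component is used as the luminance value.

```python
def luminance(r, g, b):
    # BT.601 luma weights (an illustrative assumption; any RGB-to-YCbCr
    # conversion whose Y component serves as the luminance value would do)
    return 0.299 * r + 0.587 * g + 0.114 * b
```

A pixel would then be treated as high-luminance when `luminance(r, g, b)` is at least the first luminance value Th1.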
  • The light source position detection unit 21 calculates the area of each region in which pixels having at least the first luminance value Th1 are adjacent to one another, and treats each region whose area is equal to or larger than a preset area as one light source position 41.
  • The object the user should visually recognize is the rear vehicle 51 behind the own vehicle, and the range in which the rear vehicle 51 can appear within the camera's angle of view is limited. Therefore, the pixels for which the light source position detection unit 21 calculates luminance values may be limited to the range in which the rear vehicle 51 can appear.
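The detection described above — threshold at Th1, gather adjacent pixels, and keep only regions of sufficient area — can be sketched in plain Python. The function name, the 4-connected flood fill, and the list-of-lists image representation are illustrative assumptions, not the implementation specified by the disclosure:

```python
def detect_light_sources(lum, th1, min_area):
    """Return pixel regions where adjacent pixels all have luminance >= th1.

    lum is a 2-D list of luminance values; each returned region is a list
    of (row, col) coordinates and corresponds to one light source position.
    """
    h, w = len(lum), len(lum[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if lum[y][x] >= th1 and not seen[y][x]:
                # flood-fill the 4-connected region of bright pixels
                stack, region = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx] and lum[ny][nx] >= th1):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(region) >= min_area:  # discard tiny regions (noise)
                    regions.append(region)
    return regions
```

The area filter corresponds to keeping only regions whose area is at least the preset value, so isolated noise pixels above Th1 are not mistaken for a light source.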
  • the flare area determination unit 22 determines the flare area 42 by using the light source position information 31 obtained from the light source position detection unit 21, and uses it as the flare area information 32.
  • the flare region information 32 is, for example, two-dimensional map information having the same size as the captured input image 110 as shown in FIG. 3C, and is information on a pixel region having high brightness around the center of the light source position 41. For example, the white portion in FIG. 3C is the determined flare region 42.
  • A method for determining the flare region 42 will be described. First, peripheral pixels are extracted in order from the center of the light source position 41 obtained from the light source position information 31, and their luminance values are calculated in the same manner as in the light source position detection unit 21.
  • the center of the light source position 41 is, for example, the center of gravity of the pixel region detected as the light source position 41 by the light source position detection unit 21.
  • The pixel region around the center of the light source position 41, detected by the light source position detection unit 21, whose luminance values are equal to or greater than the preset second luminance value Th2 is determined to be the flare region 42.
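A sketch of this step: take the center of gravity of a detected light-source region, then grow the flare region outward from that center over connected pixels whose luminance is at least Th2. The outward region-growing and the function names are assumptions consistent with the description, not the patent's prescribed implementation:

```python
def light_source_center(region):
    # center of gravity of the detected light-source pixel region
    ys = [p[0] for p in region]
    xs = [p[1] for p in region]
    return (sum(ys) / len(ys), sum(xs) / len(xs))

def flare_region(lum, center, th2):
    """Grow the flare region outward from the light-source center,
    collecting connected pixels whose luminance is >= th2."""
    h, w = len(lum), len(lum[0])
    cy, cx = int(round(center[0])), int(round(center[1]))
    if not (0 <= cy < h and 0 <= cx < w) or lum[cy][cx] < th2:
        return []
    seen = [[False] * w for _ in range(h)]
    seen[cy][cx] = True
    stack, region = [(cy, cx)], []
    while stack:
        y, x = stack.pop()
        region.append((y, x))
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w
                    and not seen[ny][nx] and lum[ny][nx] >= th2):
                seen[ny][nx] = True
                stack.append((ny, nx))
    return region
```

Growing from the center naturally yields a region continuous with the light source position, matching the later observation that the flare region is contiguous from the center.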
  • The corrected image generation unit 23 generates the corrected image 12 by reducing the brightness values of pixels of the input image 11, at least within the flare region 42 of the captured input image 110, that have at least the preset third brightness value Th3.
  • The corrected image 12 may be generated by reducing the luminance values only of pixels having at least the third luminance value Th3 to remove the high-luminance component, or by reducing the luminance values of pixels both above and below the third luminance value Th3. For example, since the pixels capturing the headlights 52 of the rear vehicle 51 have very high luminance values, the pixels around the headlights 52 also have large luminance values due to the influence of flare. Flare can therefore be suppressed by reducing the brightness values of pixels having at least the third luminance value Th3.
  • The minimum value of the third brightness value Th3 is affected by the sensitivity of the image sensor in the image pickup unit 2 and by the allocation to digital values in HDR (High Dynamic Range) conversion, so it is determined as a parameter based on the performance of the equipment used and the imaging environment.
  • The corrected image generation unit 23 performs, for example, a process of reducing the luminance values of pixels having at least the third luminance value Th3 described above.
  • One method is to prepare a table that determines an output value from the luminance value of the captured input image 110, that is, a LUT (Look-Up Table), and to generate the corrected image 12 with it.
  • The LUT is set so that, for luminance values smaller than the third luminance value Th3, the input pixel's luminance value is output unchanged, and for luminance values equal to or greater than the third luminance value Th3, a reduced luminance value is output.
  • The luminance increase caused by flare is larger closer to the center of the light source position 41, so a larger reduction is required there. Therefore, the LUT parameters may be chosen so that the reduction grows as the luminance rises above the third luminance value Th3, for example following a linear, Gaussian, or exponential curve.
  • the LUT parameter may be a one-dimensional input parameter using the luminance value of the pixel as an input value, or may be a multidimensional input parameter including the value of each RGB color and the chromaticity.
  • A LUT may be applied only within the flare region 42 determined by the flare region determination unit 22 to generate the corrected image 12, or the corrected image 12 may be generated beyond the flare region boundary, that is, the boundary between the flare region 42 and the rest of the image. In other words, the corrected image generation unit 23 may generate the corrected image 12 over the entire captured input image 110 or, if not over the entire image, at least in the pixel region where the composite image 13 described later is generated.
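A minimal sketch of such a LUT, assuming 8-bit luminance and a linear compression above Th3; the disclosure also allows Gaussian or exponential reduction curves, and the `strength` parameter is an illustrative assumption:

```python
def build_correction_lut(th3, strength=0.5, max_val=255):
    """8-bit LUT: pass values below th3 through unchanged and linearly
    compress values at or above th3 toward th3."""
    lut = []
    for v in range(max_val + 1):
        if v < th3:
            lut.append(v)  # identity below the third luminance value
        else:
            lut.append(round(th3 + (v - th3) * (1.0 - strength)))
    return lut

def apply_lut(lum, lut):
    # per-pixel table lookup producing the corrected image
    return [[lut[v] for v in row] for row in lum]
```

The same interface extends to the multidimensional case mentioned next, where the table is indexed by each RGB value and the chromaticity rather than by luminance alone.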
  • The composite image generation unit 24 uses the input image 11, which is at least a part of the captured input image 110, the light source position information 31, the flare area information 32, and the corrected image 12 to generate the composite image 13, combining the corrected image 12 and the input image 11 over the entire flare region 42 so that the proportion of the corrected image 12 increases closer to the center of the light source position 41 and the proportion of the input image 11 increases farther from it, and outputs the composite image 13 to the image display unit 4.
  • The method of generating the composite image 13 in the composite image generation unit 24 will be described. For example, if an image obtained by combining the input image 11 and the corrected image 12 at a constant ratio within the flare region 42 were displayed together with the input image 11 outside the flare region 42, a discontinuity of luminance values would appear at the boundary of the flare region and reduce visibility. Therefore, the ratio at which the input image 11 and the corrected image 12 are combined is changed based on the distance D from the center of the light source position 41. That is, the closer to the center of the light source position 41, the smaller the composite ratio Ba of the input image 11 and the larger the composite ratio Bb of the corrected image 12; the farther from the center, the larger the composite ratio Ba of the input image 11 and the smaller the composite ratio Bb of the corrected image 12.
  • FIGS. 4A and 4B are graphs of the function F(D, De, Tc) used in the composite image generation unit 24, showing the relationship between the composite ratio Ba of the input image 11 and the distance D from the center of the light source position 41.
  • The function F(D, De, Tc) takes as arguments the distance D from the center of the light source position 41, the distance De from the center of the light source position 41 to the end of the flare region 42, and the adjustment parameter Tc, and gives the composite ratio Ba of the input image 11.
  • the composite ratio Bb of the corrected image 12 is a value obtained by subtracting the composite ratio Ba of the input image 11 from 1.
  • The distance D from the center of the light source position 41 is the variable; it is, for example, the Euclidean length of the line segment connecting the pixel whose composition ratio is being calculated and the pixel at the center of the light source position 41.
  • the distance De is the distance between the pixel at the end of the flare region 42 and the pixel at the center of the light source position 41 when the line segment is extended toward the end of the flare region 42.
  • the flare region 42 is not concentric from the center of the light source position 41, and the distance De from the center of the light source position 41 to the end of the flare region 42 varies depending on the selected line segment.
  • At the end De of the flare region 42, the composition ratio Ba may be 1; as shown in FIG. 4B, the composition ratio Ba of the input image 11 may be 0 from the center of the light source position 41 up to an arbitrary distance Dn, and may be varied from Dn up to the distance De + ΔD obtained by adding ΔD beyond the end of the flare region 42.
  • the function F (D, De, Tc) may use an exponential function obtained by inversion of the Gaussian distribution, or may use a linear function.
  • the composite image generation unit 24 may generate the composite image 13 in the flare region 42, or may generate the composite image 13 in the region including the end of the flare region and its periphery.
  • The composition ratio may be varied for each line segment, and the region over which it is varied, such as the pixel region from Dn to De + ΔD, may cover the end of the flare region 42 and its periphery, including pixels beyond the end of the flare region 42.
  • the composition ratio Ba of the input image 11 may be 0 as shown in FIG. 4B, or the input image 11 and the correction image 12 may be added together at an arbitrary ratio.
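The ratio schedule above can be sketched with the clamped-linear case: Ba is 0 up to Dn, rises to 1 at the edge distance De, and each pixel is blended as Ba·input + Bb·corrected. The function names and the default Dn = 0 are assumptions; an inverted-Gaussian F(D, De, Tc) would fit the same interface:

```python
def composite_ratio(d, de, dn=0.0):
    """Ba of the input image: 0 up to dn, rising linearly to 1 at de."""
    if d <= dn:
        return 0.0
    if d >= de:
        return 1.0
    return (d - dn) / (de - dn)

def blend_pixel(input_px, corrected_px, d, de, dn=0.0):
    ba = composite_ratio(d, de, dn)  # input-image ratio Ba
    bb = 1.0 - ba                    # corrected-image ratio Bb = 1 - Ba
    return ba * input_px + bb * corrected_px
```

At the center the corrected image dominates, and at the flare-region end the blend equals the input image, so no luminance discontinuity appears at the boundary.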
  • FIG. 5A is an example of a captured input image 110 of the area behind and to the right of the own vehicle. Due to flare caused by the headlights 52 of the rear vehicle 51, the shape of the rear vehicle 51 in the vicinity of the headlights 52 and the road environment are difficult to see. Further, since one headlight 52 is arranged on each side, the two flares merge and are perceived as a single light source, making it difficult to grasp the position of the rear vehicle 51.
  • In FIG. 5B, the pixel region in which pixels having at least the first luminance value Th1 in the captured input image 110 are gathered adjacent to one another is detected as the light source position 41, and the pixel region around the center of the light source position 41 whose luminance values are at least the second luminance value Th2 is determined to be the flare region 42.
  • FIG. 5C is an example of the corrected image 12, obtained by reducing the luminance values of pixels having at least the third luminance value Th3 over the entire captured input image 110.
  • FIG. 5D shows the display image: the composite image 13 in the region where the composite image 13 is generated, the corrected image 12 or the input image 11 in the part of the flare region 42 where the composite image 13 is not generated, and the input image 11 in the region outside the flare region 42. Specifically, within the flare region 42, the composite image 13 of the input image 11 and the corrected image 12 is generated so that the composite ratio Bb of the corrected image 12 is larger closer to the light source position 41 and the composite ratio Ba of the input image 11 is larger farther from it, and the input image 11 is displayed in the region outside the flare region 42.
  • As described above, the image display device 1 determines as the flare region 42 the pixel region having at least the second brightness value Th2 around the center of the light source position 41 detected from the brightness values of the captured input image 110, and generates the corrected image 12 by reducing the brightness values of pixels of the input image 11, at least within the flare region 42 of the captured input image 110, that have at least the third brightness value Th3.
  • The composite image 13 is generated by combining the corrected image 12 and the input image 11 so that the proportion of the corrected image 12 increases closer to the center of the light source position 41 and the proportion of the input image 11 increases farther from it.
  • Since the composite image 13 is displayed in the region where it is generated, the corrected image 12 or the input image 11 in the part of the flare region 42 where the composite image 13 is not generated, and the input image 11 outside the flare region 42 on the image display unit 4, the discontinuity of brightness values at the boundary between the flare region 42 and its surroundings is suppressed, and a highly visible image can be displayed. Further, since the luminance values need not be reduced over the entire captured input image 110, notable information such as the shapes of surrounding buildings and of the rear vehicle 51 remains easy to see.
  • In this way, the influence of flare is suppressed, and visual information such as the shape of the rear vehicle 51 beyond the flare region boundary and the road environment becomes easier to see.
  • By setting the composition ratio Ba of the input image 11 and the composition ratio Bb of the corrected image 12 for each line segment, the discontinuity of brightness values at the flare region boundary is suppressed even when the flare shape is complicated, and highly visible images can be displayed.
  • The generation of the composite image 13 from the center of the light source position 41 to the end De of the flare region 42, or from Dn to De + ΔD described above, may be processed pixel by pixel, or the function F(D, De, Tc) may be patterned into a table based on the area of the pixel region of the flare region 42 or of the light source position 41; processing pixel regions collectively simplifies the computation.
  • In the flare area determination unit 22, an example was shown in which a pixel region having at least the second luminance value Th2 among the pixels around the center of the light source position 41 is determined to be the flare region 42, but the second luminance value Th2 may be changed based on the distance from the center of the light source position 41. The luminance value in the flare region 42 decreases as the distance from the center of the light source position 41 increases, so, for example, if the second luminance value Th2 is lowered progressively with distance from the center of the light source position 41, the flare region 42 can be determined more accurately.
  • the luminance value that determines the end portion of the flare region 42 is Th2i, which is smaller than the second luminance value Th20 set near the center of the light source position 41.
  • Since the second luminance value Th2 is set larger the shorter the distance from the center of the light source position 41, the flare region 42 can be determined appropriately even when the pixels around the light source position 41 are affected by flare and have elevated luminance values.
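A sketch of the distance-dependent threshold, interpolating from Th20 near the center down to the smaller Th2i toward the edge; the linear schedule is an assumption, since the disclosure only requires Th2 to decrease with distance from the center:

```python
def th2_at(d, de, th2_center, th2_edge):
    """Second luminance threshold at distance d from the light-source
    center: th2_center at d = 0, falling linearly to th2_edge at d = de."""
    t = min(max(d / de, 0.0), 1.0)  # clamp the normalized distance to [0, 1]
    return th2_center + t * (th2_edge - th2_center)
```

A pixel at distance d would then be assigned to the flare region when its luminance is at least `th2_at(d, de, th2_center, th2_edge)`.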
  • Since the flare region 42 determined by the flare region determination unit 22 is a region continuous from the center of the light source position 41, once a pixel region has been determined not to be the flare region 42 in the above determination, the region outside that pixel region can also be determined not to be the flare region 42. It is therefore preferable to start from the center of the light source position 41 and make the determination outward.
  • To reduce the processing load, the flare region 42 may be determined only along a limited number of directions.
  • For example, the end of the flare region 42 is determined in eight directions by advancing outward in order from the center of the light source position 41 along a total of eight axes: up, down, left, right, and the four 45-degree diagonals.
  • For pixel regions off these axes, the two neighboring axes are considered, and the pixels at the ends of the flare region 42 on those two axes are connected by a straight line or by an arc centered on the center of the light source position 41; the region on the light source position 41 side of the connecting line may then be determined as the flare region 42.
  • the determination of the flare region 42 can be simplified, and the generation of the composite image 13 in the composite image generation unit 24 can also be simplified.
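The eight-axis search can be sketched as follows. The stop condition, walking each axis until the next pixel's luminance drops below Th2, is an assumption consistent with the description; connecting the returned endpoints by lines or arcs would then approximate the region boundary:

```python
def flare_ends_8dir(lum, cy, cx, th2):
    """Walk outward from the light-source center along the four axial and
    four diagonal directions; return the last pixel with luminance >= th2
    on each axis as the end of the flare region in that direction."""
    h, w = len(lum), len(lum[0])
    dirs = [(-1, 0), (1, 0), (0, -1), (0, 1),
            (-1, -1), (-1, 1), (1, -1), (1, 1)]
    ends = []
    for dy, dx in dirs:
        y, x = cy, cx
        while (0 <= y + dy < h and 0 <= x + dx < w
               and lum[y + dy][x + dx] >= th2):
            y, x = y + dy, x + dx
        ends.append((y, x))
    return ends
```

Only eight walks are needed per light source, instead of a full region-growing pass over every pixel.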
  • The processing load can also be reduced by limiting the pixel region for determining the flare region 42 to within a preset distance from the center of the light source position 41. For example, a distance proportional to the area of the pixel region of the light source position 41 obtained from the light source position information 31 may be set, or a plurality of adjacent pixels may be averaged and handled as a group.
  • the image display device 1 includes an image pickup unit 2, an image processing unit 3, and an image display unit 4 in the same manner as the schematic configuration of the first embodiment shown in FIG.
  • The image processing unit 3 includes a previous frame information holding unit 25 that holds previous frame information 33, such as the light source position information 31 and the flare area information 32 obtained for the previous frame captured in an earlier time period, so that it can be referenced in the image processing of the current frame captured at the present time.
  • The flare area determination unit 22 determines a pixel region whose luminance values around the center of the light source position 41 are at least the second luminance value Th2 to be the flare region 42. Since the light source position 41 is detected from pixel regions having at least the first luminance value Th1 in the captured input image 110, when noise appears in the input image 11 there is a risk that the noise is erroneously detected as a light source position 41 and a flare region 42 is determined around it. Further, for example, when the road surface reflected light 53 of the headlights 52 of the rear vehicle 51 is captured, it is determined to be a flare region 42 when the pixel region having at least the second luminance value Th2 is large, but not when it is small.
  • The camera used in the image pickup unit 2 that acquires the captured input image 110 usually captures images at 30 to 60 frames per second, and noise or the road surface reflected light 53 is not recognized in every frame. The present embodiment focuses on this fact: by referring to the information of the previous frame during image processing of the current frame, noise can be removed, and even if a flare region 42 is erroneously determined, the visibility of the display image is not reduced.
  • FIG. 6 shows an example in which the road surface reflected light 53 appears in the captured input image 110.
  • FIG. 6 shows captured input images 110 taken in order from time T1 to T7; the road surface reflected light 53 appears suddenly at times T4 to T6, and its luminance value changes, reaching a maximum at time T5. If, because of such momentary changes in the luminance value, the road surface reflected light 53 is not determined to be a flare region 42 at times T4 and T6 but is at time T5, displaying the composite image 13 of the input image 11 and the corrected image 12 as-is would make the display image on the image display unit 4 flicker. Therefore, the light source position information 31 and the flare area information 32 of the previous frame are held in the previous frame information holding unit 25, and the composite image generation unit 24 uses this information when generating the composite image 13 of the current frame.
  • The captured input image 110 or input image 11 captured by the image pickup unit 2 before the current time is used as the previous frame (for example, if the current time is T5, the previous frame of the current frame at time T5 is one of the frames at times T1 to T4). A plurality of previous frames, including frames before the immediately preceding one, may also be referenced.
  • FIG. 7 is a schematic block diagram of the image display device 1 of the present embodiment.
  • the image processing unit 3 includes a front frame information holding unit 25 that holds, for the front frame, the light source position information 31 acquired by the light source position detection unit 21 and the flare area information 32 acquired by the flare area determination unit 22.
  • the processing performed by the light source position detection unit 21, the flare area determination unit 22, and the corrected image generation unit 23 is the same as in the first embodiment.
  • the front frame information holding unit 25 uses, for example, a ring buffer with a memory area secured so that the information from time T1 to time T7 can be held sequentially.
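A fixed-capacity ring buffer of the kind described above can be sketched as follows. This is a minimal illustration in Python, assuming the held per-frame items are the light source position information 31 and flare area information 32; the class and method names are illustrative, not part of the patent.

```python
from collections import deque

class FrontFrameBuffer:
    """Illustrative ring buffer for per-frame front frame information.

    Once capacity is reached, pushing a new frame discards the oldest one,
    mirroring how information from times T1..T7 is held sequentially.
    """

    def __init__(self, capacity=7):
        # deque with maxlen drops the oldest entry automatically
        self._buf = deque(maxlen=capacity)

    def push(self, light_source_info, flare_info):
        """Store one frame's light source position / flare area information."""
        self._buf.append((light_source_info, flare_info))

    def history(self):
        """Return held frames, oldest first, newest last."""
        return list(self._buf)
```

A buffer of capacity 7 would hold the frames for T1..T7; pushing the frame for T8 silently discards T1.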
  • the composite image generation unit 24 inputs the input image 11, the light source position information 31, the flare area information 32, the corrected image 12, and the front frame information 33 obtained from the front frame.
  • when the front frame information 33 indicates that the influence of flare is small, the composite image generation unit 24 generates the composite image 13 by increasing the composition ratio Ba of the input image 11 to, for example, 0.7 or more. In this way, when noise of the captured input image 110, the suddenly appearing road surface reflected light 53, or the like is reflected in the image, referring to the front frame information 33 and bringing the brightness closer to that of the input image 11 when the influence of flare is considered small suppresses the flicker of the displayed image.
  • the front frame information holding unit 25 may hold a plurality of pieces of front frame information. For example, if the pixel at the position determined to be the flare region 42 at time T6 was determined to be the flare region 42 only at T5 among the previous frames of times T1 to T5 held in the front frame information 33, the ratio of frames not determined to be the flare region 42 is 4/5.
  • the composition ratio Ba of the input image 11 may be determined using a function whose variable is the ratio of previous frames not determined to be the flare region 42, or a LUT whose input value is that number of frames.
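One such function can be sketched as below. The linear mapping and the `ba_min`/`ba_max` bounds are illustrative assumptions; the text only requires that Ba grows with the count of previous frames not determined to be the flare region 42 (e.g. reaching 0.7 or more when flare influence is small).

```python
def input_blend_ratio(non_flare_frames, total_frames, ba_min=0.3, ba_max=1.0):
    """Composition ratio Ba of the input image 11.

    non_flare_frames: how many held previous frames were NOT determined
                      to be the flare region 42 at this pixel position.
    total_frames:     number of previous frames held.

    Returns a value in [ba_min, ba_max]; a linear map is one simple choice
    of the function described in the text (a LUT keyed on the frame count
    would work equally well).
    """
    if total_frames <= 0:
        return ba_min
    ratio = non_flare_frames / total_frames  # e.g. 4/5 in the T1..T5 example
    return ba_min + (ba_max - ba_min) * ratio
```

With the 4/5 example from the text, this illustrative mapping yields Ba = 0.86, i.e. above the 0.7 mentioned for a weakly flare-affected pixel.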
  • that is, the larger the number of front frames captured by the image pickup unit 2 that are not determined to be the flare region 42, the larger the composite image generation unit 24 makes the ratio of the input image 11 and the smaller it makes the ratio of the corrected image 12.
  • the front frame information holding unit 25 may hold the information of the flare region 42 determined by the flare area determination unit 22, or the information from which it is obtained.
  • alternatively, the light source position 41 in the front frame is held by the front frame information holding unit 25, and when the light source position 41 is not detected in the front frame captured by the image pickup unit 2, the composite image generation unit 24 may increase the ratio of the input image 11 and decrease the ratio of the corrected image 12.
  • as described above, the light source position 41 is detected from the brightness value of the captured input image 110, and in at least a part of the flare region 42 determined from the center of the light source position 41, the composite image 13 is generated from the input image 11 and the corrected image 12 by increasing the proportion of the corrected image 12 closer to the center of the light source position 41 and increasing the proportion of the input image 11 farther away. Since the composite image 13 is displayed there and the input image 11 is displayed in the area other than the flare region 42, the discontinuity of the brightness value of the displayed image is suppressed and an image with high visibility can be displayed. Further, by referring to the front frame information 33 and bringing the brightness value closer to that of the input image 11 when the influence of flare is considered small, the flicker of the displayed image is suppressed and an image with still higher visibility can be displayed.
  • the image display device 1 includes an image pickup unit 2, an image processing unit 3, and an image display unit 4, in the same manner as the schematic configuration of the first embodiment shown in FIG. 1. As shown in FIG. 8, a front frame information holding unit 26 is provided, which holds the composite image 13 of the front frame output from the composite image generation unit 24.
  • the front frame information holding unit 26 stores the composite images 13 output from the composite image generation unit 24 for a plurality of front frames.
  • a memory area is secured so that information from time T1 to time T7 can be held sequentially, using, for example, a ring buffer.
  • the light source position detection unit 21 uses the captured input image 110 of the current frame and the composite images 13 of the plurality of front frames held by the front frame information holding unit 26, and estimates the light source position 41 of the current frame from the transition of the light source position 41 in the composite images 13 of the front frames. For example, in the case of a camera that captures 60 frames per second, the same light source is imaged repeatedly in the composite images 13 of a plurality of previous frames, so the transition of the light source can be grasped. Then, by comparing the brightness values of the composite images 13 of the plurality of front frames, high-luminance pixels that are caused not by a flare-producing light source but by noise, the road surface reflected light 53, or the like can be found and removed.
  • specifically, the composite images 13 of a plurality of front frames for a preset time are taken out from the front frame information holding unit 26, and the transition of the position of the center of gravity of the high-luminance region calculated for each frame is observed.
  • since the rear vehicle 51 travels along the road, the position of the center of gravity moves according to the traveling direction of the road, and the light source position 41 of the current frame can be estimated from this movement. If the movement of the position of the center of gravity of the high-luminance region cannot be estimated over a plurality of front frames, the region is determined to be caused by noise of the image pickup input image 110, road surface reflected light, or the like rather than by a flare-producing light source, and can be prevented from being detected as the light source position 41.
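The centroid-transition check above can be sketched as follows. This is an illustrative Python sketch, assuming binary high-luminance masks per frame and a simple "maximum jump between consecutive centroids" criterion; the function names and the `max_jump` threshold are assumptions, not part of the patent.

```python
import math

def centroid(mask):
    """Centre of gravity (row, col) of a binary high-luminance mask,
    or None if the mask is empty."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def motion_is_consistent(masks, max_jump=5.0):
    """True if the centroid of the high-luminance region moves smoothly
    across consecutive frames.

    A region whose centroid vanishes or jumps erratically between frames is
    treated as noise / road-surface reflection and rejected as a light
    source candidate. max_jump is an illustrative parameter.
    """
    cents = [centroid(m) for m in masks]
    if any(c is None for c in cents):
        return False
    for (r0, c0), (r1, c1) in zip(cents, cents[1:]):
        if math.hypot(r1 - r0, c1 - c0) > max_jump:
            return False
    return True
```

A headlight drifting one pixel per frame along the road direction passes the check; a bright blob that appears for a single frame or teleports across the image fails it.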
  • further, the flare region 42 in the input image 11 of the current frame may be determined from the fluctuation of the size of the flare region 42 caused by the movement of the headlight 52 of the rear vehicle 51, which depends on the directivity characteristic of the sensitivity of the camera.
  • if such a fluctuation cannot be estimated, the region is determined to be caused by noise of the image pickup input image 110, the road surface reflected light 53, or the like rather than by flare, and erroneous determination as the flare region 42 can be avoided.
  • as described above, the light source position 41 is detected from the brightness value of the captured input image 110, and in at least a part of the flare region 42 determined from the center of the light source position 41, the composite image 13 is generated from the input image 11 and the corrected image 12 by increasing the proportion of the corrected image 12 closer to the center of the light source position 41 and increasing the proportion of the input image 11 farther away. Since the composite image 13 is displayed there and the input image 11 is displayed in the area other than the flare region 42, the discontinuity of the brightness value of the displayed image is suppressed and an image with high visibility can be displayed. Further, by using the front frame information 34, which refers to the composite images 13 of the front frames, the light source position 41 and the position of the flare region 42 can be estimated and the processing can be simplified.
  • as the input image 11, the captured input image 110 may be used as it is, or an image whose brightness, hue, and the like have been adjusted may be used.
  • although the image display device 1 has been described, the image processing unit 3 constituting the image display device 1, the image display method implemented by the image display device 1, and the image processing method implemented by the image processing unit 3 also form part of the present invention. Further, a program that causes a computer to perform the processing of the image processing unit 3 or of the image processing method described above, and a computer-readable recording medium, such as a non-transitory recording medium, on which the program is recorded, also form part of the present invention.
  • 1 image display device, 2 image pickup unit, 3 image processing unit, 4 image display unit, 11 input image, 12 corrected image, 13 composite image, 21 light source position detection unit, 22 flare area determination unit, 23 corrected image generation unit, 24 composite image generation unit, 25, 26 front frame information holding unit, 31 light source position information, 32 flare area information, 33, 34 front frame information, 41 light source position, 42 flare area, 51 rear vehicle, 52 headlight, 53 road surface reflected light, 110 image pickup input image

Abstract

According to the present invention, in an image affected by flare, a light source position (41) is detected from the brightness value of an image pickup input image (110), and in at least a part of a flare region (42) determined from the center of the light source position (41), a composite image (13) of a corrected image (12) and an input image (11) is generated such that the proportion of the corrected image (12) becomes larger closer to the center of the light source position (41) and the proportion of the input image (11) becomes larger farther away from it. Furthermore, a display image is generated that displays the composite image (13) in the pixel region in which the composite image (13) is generated, the corrected image (12) or the input image (11) within the flare region (42) in pixel regions in which the composite image (13) is not generated, and the input image (11) outside the flare region (42). Accordingly, a discontinuity in the brightness value at the boundary between the flare region (42) and the region other than the flare region (42) can be suppressed, and an image having high visibility can be displayed.

Description

Image display device and image display method
 The present disclosure relates to an image display device such as an electronic mirror device and to an image display method.
 In conventional image display devices such as electronic mirror devices, image correction, for example correction that reduces the brightness value, has been devised so that the user's visibility is not impaired by flare generated by a strong local light source such as the headlight of a vehicle behind. For example, in Patent Document 1, the brightness value of concentric regions around the center of the high-luminance region produced by the headlight of the rear vehicle in an image of the area behind the own vehicle is reduced, while the reduction rate of the brightness value of the central portion is suppressed, so that only the central portion of the headlight of the rear vehicle is shown brightly and the driver can easily grasp the position of the rear vehicle.
JP-A-2015-198302
 However, when the brightness value of the region affected by flare is reduced, a discontinuous portion appears at the boundary with the region where the brightness value is not reduced, and visibility is lowered.
 The present disclosure has been made to solve the above problem, and an object of the present disclosure is to provide an image display device that displays an image with improved visibility.
 The image display device according to the present disclosure includes: an image pickup unit that images the outside of a vehicle; a light source position detection unit that detects, as a light source position, a pixel region in which pixels having at least a preset first brightness value in the image pickup input image captured by the image pickup unit are gathered adjacently; a flare area determination unit that determines, from the pixel region around the center of the light source position, a pixel region having at least a preset second brightness value as a flare region; a corrected image generation unit that generates a corrected image by reducing the brightness values of pixels of the input image, at least within the flare region of the captured input image, that have at least a preset third brightness value; a composite image generation unit that, in at least a part of the flare region, generates a composite image from the corrected image and the input image by increasing the proportion of the corrected image closer to the center of the light source position and increasing the proportion of the input image farther away; and an image display unit that displays the composite image in the pixel region where the composite image is generated, the corrected image or the input image within the flare region among the pixel regions where the composite image is not generated, and the input image outside the flare region.
 The image display method according to the present disclosure includes: a step of detecting, as a light source position, a pixel region in which pixels having at least a preset first brightness value in an input image captured of the outside of a vehicle are gathered adjacently; a step of determining, as a flare region, a pixel region in which the brightness values of the pixels around the center of the light source position are at least a preset second brightness value; a step of generating a corrected image by reducing the brightness values of pixels of the input image, at least within the flare region of the captured input image, that have at least a preset third brightness value; a step of generating a composite image, in at least a part of the flare region, from the corrected image and the input image by increasing the proportion of the corrected image closer to the center of the light source position and increasing the proportion of the input image farther away; and a step of generating a display image that displays the composite image in the pixel region where the composite image is generated, the corrected image or the input image within the flare region among the pixel regions where the composite image is not generated, and the input image outside the flare region.
 According to the present disclosure, the light source position is detected from the brightness value of the captured input image, and in at least a part of the flare region determined from the center of the light source position, a composite image is generated in which the proportion of the corrected image is larger closer to the center of the light source position and the proportion of the input image is larger farther away. Displaying this composite image suppresses discontinuity in the brightness value of the display image, so that an image with high visibility can be displayed.
FIG. 1 is a schematic configuration diagram of the image display device of the first embodiment. FIG. 2 is a schematic block diagram of the image display device of the first embodiment. FIGS. 3A to 3C are explanatory views illustrating control of the image processing unit of the first embodiment. FIGS. 4A and 4B are graphs showing the function F used in the composite image generation unit of the first embodiment. FIGS. 5A to 5D are examples of processed images in the image display device of the first embodiment. FIG. 6 is an example of time-by-time captured input images for explaining the second embodiment. FIG. 7 is a schematic block diagram of the image display device of the second embodiment. FIG. 8 is a schematic block diagram of the image display device of the third embodiment.
Embodiment 1.
The image display device 1 according to the first embodiment will be described with reference to FIG. 1. The image display device 1 includes an image pickup unit 2 that captures an image of the outside of the own vehicle, an image processing unit 3 that processes the captured image, and an image display unit 4 that displays the processed image.
 The image pickup unit 2 has one or more cameras that output an image pickup input image 110 obtained by imaging the outside of the own vehicle. In the present embodiment, an example in which the image pickup unit 2 images the situation behind the own vehicle will be described. As the camera, for example, a two-dimensional image pickup device using a CMOS (Complementary Metal-Oxide-Semiconductor) or CCD (Charge Coupled Device) element as an image sensor may be used.
 The image display unit 4 is mounted in the own vehicle and is arranged at a position suitable for the user to view it, for example near the A-pillars at the left and right of the front of the vehicle interior, or near the door trim at the front of the vehicle side. The image display unit 4 uses, for example, a display device having a liquid crystal filter such as an LCD (Liquid Crystal Display) panel and a backlight composed of LEDs (Light Emitting Diodes) and a diffuser. It may also be a display device in which organic light-emitting elements are arranged in pixel units. The image display unit 4 displays a display image obtained by processing the image pickup input image 110 described later.
 The image processing unit 3 is partially or wholly composed of processing circuits. For example, the plurality of functions of the image processing unit 3 may be realized by separate processing circuits, or the functions of the plurality of parts may be realized collectively by one processing circuit. The processing circuit may be configured in hardware or in software, that is, as a programmed computer. Among the functions of the parts of the image processing unit 3, some may be realized in hardware and others in software. As the hardware, for example, an FPGA (Field-Programmable Gate Array), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor) may be used. The software is, for example, a program executed on a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
 The image processing unit 3 will be described with reference to FIGS. 2 and 3A to 3C. FIG. 2 is a schematic block diagram of the image display device 1. FIGS. 3A to 3C are explanatory views for explaining the processing control of the image processing unit: FIG. 3A shows an image pickup input image, FIG. 3B shows light source position information, and FIG. 3C shows an example of flare region information.
 As shown in FIG. 2, the image processing unit 3 includes a light source position detection unit 21, a flare area determination unit 22, a corrected image generation unit 23, and a composite image generation unit 24. In the image processing unit 3, light source position information 31 indicating the light source position 41 detected by the light source position detection unit 21 from the image pickup input image 110 is output to the flare area determination unit 22 and the composite image generation unit 24, and flare area information 32 indicating the flare region 42 determined by the flare area determination unit 22 is output to the composite image generation unit 24.
 Further, the corrected image generation unit 23 generates a corrected image 12 in which the brightness value of the input image 11 in at least the flare region 42 of the captured input image 110 is reduced, and outputs it to the composite image generation unit 24. Then, the composite image generation unit 24 generates a composite image 13 in which the composition ratio of the input image 11 and the corrected image 12 is changed based on the distance D from the center of the light source position 41, in at least a part of the flare region 42. The composite image 13 is displayed in the region where the composite image 13 is generated; in the regions where it is not generated, the corrected image 12, or an image in which the corrected image 12 includes the input image 11, is displayed within the flare region 42 including its end De, and the input image 11 is displayed on the image display unit 4 in the region outside the flare region 42.
 When a plurality of flare regions 42 are determined, the above image processing is performed within each flare region 42, and the input image 11 is displayed on the image display unit 4 in the regions that are not flare regions 42.
 The light source position detection unit 21 receives the image pickup input image 110 shown in FIG. 3A output from the image pickup unit 2, detects as the light source position 41 a pixel region in which pixels having at least the preset first luminance value Th1 in the image pickup input image 110 are gathered adjacently, and records the light source position 41 as the light source position information 31 shown in FIG. 3B. The light source position information 31 is, for example, two-dimensional map information of the same size as the captured input image 110, and the white portion in FIG. 3B indicates the detected light source position 41.
 The light source position detection unit 21, for example, performs gradation processing on the pixels of the image pickup input image 110 to calculate a luminance value, and detects high-luminance pixels whose luminance value is at least the first luminance value Th1 as the light source position 41. For the calculation of the luminance value, for example, when the image pickup element outputs gradation values of three colors obtained from RGB color filters, they are converted into the YCbCr color space composed of a luminance component and color difference components, and the Y component is used as the luminance value. The luminance value becomes high when strong light is projected onto the imaging device, when imaging noise or circuit noise appears locally, and so on. In general, a high-intensity light source that causes flare is limited to one with a large pixel area. Therefore, the light source position detection unit 21 calculates the area of each region in which pixels having at least the first luminance value Th1 are gathered adjacently, and treats each region of at least a preset area as one light source position 41. When imaging the area behind the own vehicle, the object to be visually recognized by the user is the rear vehicle 51, and the range within the camera's angle of view in which the rear vehicle 51 can exist is limited. Therefore, the pixels for which the light source position detection unit 21 calculates the luminance value may be limited to the range in which the rear vehicle 51 can exist.
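The detection step above (luminance from RGB, threshold Th1, grouping of adjacent bright pixels, area filter) can be sketched as follows. This is a minimal Python illustration: the BT.601 luma weights, 4-connectivity flood fill, and list-of-tuples image representation are assumptions for the sketch, not the patent's implementation.

```python
def luminance(rgb):
    """Y component of an (R, G, B) pixel (ITU-R BT.601 weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def detect_light_sources(image, th1, min_area):
    """Return one pixel list per detected light source position 41.

    image: 2-D grid (list of rows) of (R, G, B) tuples. Pixels whose
    luminance is >= th1 are grouped by 4-connectivity; only groups covering
    at least min_area pixels are kept, since flare-causing light sources
    occupy a large pixel region.
    """
    h, w = len(image), len(image[0])
    bright = [[luminance(image[r][c]) >= th1 for c in range(w)] for r in range(h)]
    seen = [[False] * w for _ in range(h)]
    sources = []
    for r in range(h):
        for c in range(w):
            if bright[r][c] and not seen[r][c]:
                # flood-fill one connected high-luminance region
                stack, region = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and bright[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(region) >= min_area:
                    sources.append(region)
    return sources
```

An isolated bright pixel (e.g. sensor noise) fails the `min_area` test and is not reported as a light source, which matches the area filtering described above.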
 The flare area determination unit 22 determines the flare region 42 using the light source position information 31 obtained from the light source position detection unit 21, and records it as the flare area information 32. The flare area information 32 is, for example, two-dimensional map information of the same size as the captured input image 110 as shown in FIG. 3C, and is information on the high-luminance pixel region around the center of the light source position 41. For example, the white portion in FIG. 3C is the determined flare region 42.
 A method for determining the flare region 42 will be described. First, the surrounding pixels are extracted in order from the center of the light source position 41 obtained from the light source position information 31, and the luminance values of those pixels are calculated in the same manner as in the light source position detection unit 21. The center of the light source position 41 is, for example, the center of gravity of the pixel region detected as the light source position 41 by the light source position detection unit 21.
 Specifically, the pixel region in which the luminance values of the pixels around the center of the light source position 41 detected by the light source position detection unit 21 are at least the preset second luminance value Th2 is determined to be the flare region 42.
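The determination above can be sketched as a region that grows outward from the light source center over pixels whose luminance is at least Th2. Growing by 4-connectivity from the center is an illustrative reading of "extracting the surrounding pixels in order from the center"; the function name and mask representation are assumptions.

```python
def flare_region(lum, center, th2):
    """Pixel region around the light source centre whose luminance is >= Th2,
    determined as the flare region 42.

    lum:    2-D list of luminance values.
    center: (row, col) of the light source position 41's centre of gravity.
    Returns the set of (row, col) pixels reachable from the centre through
    pixels with luminance >= th2 (4-connected growth).
    """
    h, w = len(lum), len(lum[0])
    r0, c0 = center
    if lum[r0][c0] < th2:
        return set()
    region, stack = {(r0, c0)}, [(r0, c0)]
    while stack:
        r, c = stack.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region and lum[nr][nc] >= th2:
                region.add((nr, nc))
                stack.append((nr, nc))
    return region
```

A bright pixel that is not connected to the light source center (for example, a distant reflection) is not swept into the flare region, since the growth stops at pixels below Th2.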
 Next, the corrected image generation unit 23 refers to the input image 11 at least within the flare region 42 of the captured input image 110 and generates a corrected image 12 in which the brightness values of pixels having at least the preset third brightness value Th3 are reduced.
 The corrected image 12 may be one in which the luminance value is reduced only for the pixels having at least the third luminance value Th3, so as to remove the high-luminance component, or one in which the luminance values of pixels both above and below the third luminance value Th3 are reduced.
 For example, since the pixels obtained by imaging the headlight 52 of the rear vehicle 51 have very high luminance values, the pixels around the headlight 52 also have increased luminance values due to the influence of flare.
 Since the luminance value increased by the influence of flare becomes larger the closer the pixel is to the center of the light source position 41, flare can be suppressed by focusing on the pixels around the center of the light source position 41 and reducing the luminance values of pixels having at least the third luminance value Th3. The setting of the minimum value of the third luminance value Th3 is affected by the sensitivity of the image pickup element in the image pickup unit 2 and by the allocation to digital values by HDR (High Dynamic Range) conversion, and is therefore determined as a parameter based on the performance of the equipment used and the imaging environment.
 補正画像生成部23は、例えば上述の第三の輝度値Th3以上の画素の輝度値を低減させる処理を施せばよい。補正画像12の生成は、例えばLUT(Look Up Table)により、撮像入力画像110の輝度値に基づき、出力値を決定するテーブルを用意し、補正画像12を生成する方法が挙げられる。この場合、第三の輝度値Th3より小さい輝度値については入力される画素の輝度値を出力し、第三の輝度値Th3以上の輝度値である場合には入力される画素の輝度値より低減された値となるように出力を設定する。
 フレアの影響により増大した輝度値は、光源位置41の中心からの距離が近いほど高輝度であるため、大きな低減を必要とする。そのため、用意するLUTのパラメータは第三の輝度値Th3より高輝度になるにつれて、例えば線形、ガウシアン、指数関数パラメータを与え輝度値を低減させてもよい。LUTパラメータは画素の輝度値を入力値とした1次元の入力パラメータとしてもよく、RGB各色の値、色度を含めた多次元の入力パラメータとしてもよい。フレア領域判定部22において判定されたフレア領域42のみにLUTを構築し補正画像12の生成を行ってもよい。さらに、フレア領域42とフレア領域42ではない領域との境界であるフレア領域境界を越えて補正画像12を生成してもよい。つまり、補正画像生成部23における補正画像12の生成は撮像入力画像110全体でもよく、撮像入力画像110全体でなくてもよい。少なくとも後述の合成画像13を生成する画素領域に実施すればよい。
The corrected image generation unit 23 may perform, for example, a process of reducing the luminance values of pixels having the third luminance value Th3 or more described above. The corrected image 12 can be generated, for example, by preparing a LUT (Look Up Table) that determines an output value based on the luminance value of the captured input image 110. In this case, for a luminance value smaller than the third luminance value Th3, the luminance value of the input pixel is output as-is; for a luminance value equal to or greater than the third luminance value Th3, the output is set to a value reduced from the luminance value of the input pixel.
Since the luminance value increased by the influence of flare is higher the closer the pixel is to the center of the light source position 41, a larger reduction is required there. Therefore, the LUT parameters may be set so that the reduction grows as the luminance rises above the third luminance value Th3, following, for example, a linear, Gaussian, or exponential profile. The LUT may take a one-dimensional input (the luminance value of a pixel) or a multidimensional input including the RGB values and chromaticity of the pixel. The LUT may be applied only to the flare region 42 determined by the flare region determination unit 22 to generate the corrected image 12. Alternatively, the corrected image 12 may be generated beyond the flare region boundary, that is, the boundary between the flare region 42 and the region outside it. In other words, the corrected image generation unit 23 may generate the corrected image 12 for the entire captured input image 110 or for only part of it; it suffices to process at least the pixel region used to generate the composite image 13 described later.
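As a minimal sketch of the LUT-based correction described above (the threshold Th3, the 8-bit luminance range, and the power-curve roll-off are illustrative assumptions, not values taken from this disclosure), the one-dimensional case might look like:

```python
# Hypothetical sketch of the one-dimensional LUT described above.
# TH3 and the compression strength are arbitrary example values.
TH3 = 200        # third luminance value Th3 (example)
MAX_LUM = 255    # 8-bit luminance range (example)

def build_lut(th3=TH3, max_lum=MAX_LUM, strength=0.5):
    """Identity below th3; at or above th3, compress the excess luminance
    with a power curve so brighter pixels are reduced more strongly."""
    lut = []
    for v in range(max_lum + 1):
        if v < th3:
            lut.append(v)  # pass input luminance through unchanged
        else:
            excess = (v - th3) / (max_lum - th3)   # 0..1 above the threshold
            reduced = th3 + (max_lum - th3) * (excess ** (1.0 / strength)) * strength
            lut.append(int(round(reduced)))
    return lut

lut = build_lut()
corrected = [lut[v] for v in (120, 200, 255)]  # apply LUT per pixel
```

Applying the table is then a single lookup per pixel, which is why the LUT form is attractive for per-frame video correction.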
The composite image generation unit 24 uses the input image 11, which is at least a part of the captured input image 110, the light source position information 31, the flare area information 32, and the corrected image 12 to generate a composite image 13 combining the corrected image 12 and the input image 11, for example over the entire flare region 42, with the proportion of the corrected image 12 increased closer to the center of the light source position 41 and the proportion of the captured input image 110 increased farther from it, and outputs the composite image 13 to the image display unit 4.
The method of generating the composite image 13 in the composite image generation unit 24 will now be described. For example, if an image in which the input image 11 and the corrected image 12 are combined at a constant ratio inside the flare region 42 is displayed together with the input image 11 outside the flare region 42, a discontinuity in luminance appears at the flare region boundary and visibility deteriorates. Therefore, the ratio at which the input image 11 and the corrected image 12 are combined is varied based on the distance D from the center of the light source position 41. That is, the closer to the center of the light source position 41, the smaller the composite ratio Ba of the input image 11 and the larger the composite ratio Bb of the corrected image 12; the farther from the center of the light source position 41, the larger the composite ratio Ba of the input image 11 and the smaller the composite ratio Bb of the corrected image 12.
For example, FIGS. 4A and 4B are graphs of the function F(D, De, Tc) used in the composite image generation unit 24, showing the relationship between the composite ratio Ba of the input image 11 and the distance D from the center of the light source position 41.
The function F(D, De, Tc) is a function of the distance D from the center of the light source position 41, the distance De from the center of the light source position 41 to the end of the flare region 42, and the adjustment parameter Tc, and yields the composite ratio Ba of the input image 11. The composite ratio Bb of the corrected image 12 is the value obtained by subtracting the composite ratio Ba of the input image 11 from 1.
The distance D from the center of the light source position 41 is a variable, for example the Euclidean length of the line segment connecting the pixel for which the composite ratio is calculated and the pixel at the center of the light source position 41. The distance De is the distance between the pixel at the center of the light source position 41 and the pixel at the end of the flare region 42 reached when that line segment is extended toward the end of the flare region 42. In general, the flare region 42 is not concentric about the center of the light source position 41, so the distance De from the center of the light source position 41 to the end of the flare region 42 differs depending on the selected line segment. The adjustment parameter Tc is, for example, a constant that adjusts the composite ratios Ba and Bb at the end of the flare region 42; it adjusts the composite ratio Ba of the input image 11 at the position where D = De and thereby alleviates the discontinuity of the luminance value at the end of the flare region 42. That is, the function F(D, De, Tc) gives a composite ratio Ba of the input image 11 whose value at the end of the flare region 42 is adjusted by Tc and which approaches 0 toward the center of the light source position 41. As shown in FIG. 4A, the function F(D, De, Tc) may set the composite ratio Ba of the input image 11 to 0 at the center of the light source position 41 (D = 0) and to 1 at the end of the flare region 42 (D = De). Alternatively, as shown in FIG. 4B, the composite ratio Ba of the input image 11 may be held at 0 from the center of the light source position 41 up to an arbitrary distance Dn, and then varied from Dn up to the distance De + ΔD obtained by adding an arbitrary distance ΔD to the end De of the flare region 42.
Further, the function F(D, De, Tc) may use an exponential function obtained by inverting a Gaussian distribution, or may use a linear function.
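The blending just described can be sketched as follows, choosing a linear form for F(D, De, Tc) purely for illustration (the specification also allows, for example, an inverted-Gaussian exponential form; all numeric defaults here are assumptions):

```python
import math

def composite_ratio_ba(d, de, tc=1.0, dn=0.0):
    """Linear example of F(D, De, Tc): Ba = 0 for D <= Dn,
    rising linearly to Tc at D = De and beyond."""
    if d <= dn:
        return 0.0
    if d >= de:
        return tc
    return tc * (d - dn) / (de - dn)

def blend_pixel(input_lum, corrected_lum, d, de, tc=1.0):
    """Combine input image 11 and corrected image 12 at one pixel;
    Bb, the ratio of the corrected image, is always 1 - Ba."""
    ba = composite_ratio_ba(d, de, tc)
    bb = 1.0 - ba
    return ba * input_lum + bb * corrected_lum

def distance(px, py, cx, cy):
    """Euclidean distance D from the light-source center (cx, cy)."""
    return math.hypot(px - cx, py - cy)
```

Near the center the corrected image dominates, and at the flare region end the result returns smoothly to the input image, which is what removes the boundary discontinuity.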
As described above, the composite image generation unit 24 may generate the composite image 13 inside the flare region 42, or in a region including the end of the flare region and its periphery. The composite ratio may be varied for each line segment; the region over which the composite ratio is varied may be the end of the flare region and its periphery, such as the pixel region from Dn to De + ΔD, and may include a pixel region beyond the end of the flare region 42. In that case, in the region where the composite ratio is not varied, the composite ratio Ba of the input image 11 may be 0 as shown in FIG. 4B, or the input image 11 and the corrected image 12 may be added at an arbitrary ratio.
An example of image processing in the present embodiment will be described with reference to FIGS. 5A to 5D. FIG. 5A is an example of a captured input image 110 of the right rear of the own vehicle. Due to the flare caused by the headlights 52 of the rear vehicle 51, the shape of the rear vehicle 51 near the headlights 52 and the road environment are difficult to see. Further, since one headlight 52 is arranged on each side, the two flares merge and are perceived as a single light source, making it difficult to grasp the position of the rear vehicle 51.
FIG. 5B shows an example in which a pixel region where pixels having the first luminance value Th1 or more in the captured input image 110 are adjacently clustered is detected as the light source position 41, and a pixel region around the center of the light source position 41 whose luminance value is the second luminance value Th2 or more is determined as the flare region 42.
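The detection step in FIG. 5B can be illustrated with a small sketch (the breadth-first flood fill and 4-connectivity are assumptions made for illustration; the disclosure does not prescribe a particular clustering algorithm):

```python
from collections import deque

def detect_light_source(image, th1):
    """Return the largest 4-connected cluster of pixels with luminance >= th1.
    `image` is a list of rows of luminance values."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if seen[sy][sx] or image[sy][sx] < th1:
                continue
            # breadth-first flood fill over adjacent bright pixels
            cluster, queue = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:
                y, x = queue.popleft()
                cluster.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                            and image[ny][nx] >= th1:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(cluster) > len(best):
                best = cluster
    return best  # pixel coordinates of the detected light source position

def center_of(cluster):
    """Centroid of the cluster, usable as the center of light source position 41."""
    ys = sum(y for y, _ in cluster) / len(cluster)
    xs = sum(x for _, x in cluster) / len(cluster)
    return ys, xs
```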
FIG. 5C is an example of the corrected image 12, in which the luminance values of pixels of the captured input image 110 having the third luminance value Th3 or more are reduced throughout.
FIG. 5D is an example of a display image in which the composite image 13 is displayed in the region where the composite image 13 was generated, and, in the region where the composite image 13 was not generated, the corrected image 12 or the composite image 13 is displayed inside the flare region 42 and the input image 11 is displayed outside the flare region 42. Specifically, inside the flare region 42 the composite image 13 is generated by combining the input image 11 and the corrected image 12, with the composite ratio Bb of the corrected image 12 increased closer to the light source position 41 and the composite ratio Ba of the input image 11 increased farther from it, and the input image 11 is displayed in the region that is not the flare region 42.
As described above, for an image affected by flare, the image display device 1 according to the present embodiment determines, from the pixel region around the center of the light source position 41 detected from the luminance values of the captured input image 110, a pixel region having the second luminance value Th2 or more as the flare region 42, and generates the corrected image 12 by reducing the luminance values of pixels having the third luminance value Th3 or more of at least the input image 11 inside the flare region 42 of the captured input image 110. Then, in at least a part of the flare region 42, the corrected image 12 and the input image 11 are combined to generate the composite image 13, with the proportion of the corrected image 12 increased closer to the center of the light source position 41 and the proportion of the input image 11 increased farther from it. Further, the composite image 13 is displayed on the image display unit 4 in the region where the composite image 13 was generated; in the region where the composite image 13 was not generated, the corrected image 12 or an image in which the input image 11 is included in the corrected image 12 is displayed inside the flare region 42, and the input image 11 is displayed outside the flare region 42. With this configuration, the discontinuity of the luminance value at the boundary between the flare region 42 and the region outside it is suppressed, and an image with high visibility can be displayed.
Further, since the luminance value does not have to be reduced over the entire captured input image 110, notable information such as surrounding buildings and the shape of the rear vehicle 51 remains easy to see.
 さらに、補正画像12を入力画像11の輝度値が高いほど大きく低減させることで、フレアの影響は抑制されフレア領域境界の後方車両51の形状や道路環境などの視覚情報を視認しやすくなる。 Further, by significantly reducing the corrected image 12 as the brightness value of the input image 11 increases, the influence of flare is suppressed and visual information such as the shape of the vehicle 51 behind the flare region boundary and the road environment becomes easier to visually recognize.
 また、線分毎に入力画像11の合成割合Baと補正画像12の合成割合Bbを変化させることにより、複雑なフレア形状であっても、フレア領域境界の輝度値の不連続性を抑制して、視認性の高い画像を表示できる。 Further, by changing the composition ratio Ba of the input image 11 and the composition ratio Bb of the corrected image 12 for each line segment, the discontinuity of the brightness value at the boundary of the flare region is suppressed even if the flare shape is complicated. , Highly visible images can be displayed.
In the composite image generation unit 24, the generation of the composite image 13 from the center of the light source position 41 to the end De of the flare region 42, or from Dn to De + ΔD as described above, may be processed pixel by pixel, or the function F(D, De, Tc) may be patterned and processed using a table based on the area of the pixel region of the flare region 42 or the light source position 41. Processing pixel regions collectively simplifies the computation.
Further, although an example was shown in which the flare area determination unit 22 determines, from the pixel region around the center of the light source position 41, a pixel region having the second luminance value Th2 or more as the flare region 42, the second luminance value Th2 may be varied based on the distance from the center of the light source position 41. Since the luminance value in the flare region 42 is larger the closer the pixel is to the center of the light source position 41, the flare region 42 can be determined more accurately by, for example, lowering the second luminance value Th2 progressively outward from the center of the light source position 41. In this case, the luminance value Th2i that determines the end of the flare region 42 is smaller than the second luminance value Th20 set near the center of the light source position 41. By setting the second luminance value Th2 larger the closer to the center of the light source position 41 in this way, the flare region 42 can be determined appropriately even when the pixels around the light source position 41 have increased luminance values due to the influence of flare.
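A sketch of such a distance-dependent threshold (the linear decay and the concrete values Th20 = 220 and Th2i = 160 are assumptions for illustration):

```python
def th2_at_distance(d, d_max, th2_center=220, th2_edge=160):
    """Second luminance threshold: th2_center (Th20) at the light-source
    center, decreasing linearly to th2_edge (Th2i) at distance d_max
    and beyond."""
    if d >= d_max:
        return th2_edge
    frac = d / d_max
    return th2_center - (th2_center - th2_edge) * frac

def in_flare_region(lum, d, d_max):
    """A pixel belongs to flare region 42 if its luminance meets the
    threshold for its distance from the center of light source position 41."""
    return lum >= th2_at_distance(d, d_max)
```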
Further, since the flare region 42 determined by the flare area determination unit 22 is a region continuous from the center of the light source position 41, when a pixel region is determined not to be part of the flare region 42 in the above determination, the region outside that pixel region can also be determined not to be part of the flare region 42. Therefore, it is preferable to start the determination at the center of the light source position 41 and proceed outward.
Further, in order to simplify the determination of the flare region 42 based on luminance values, the flare region 42 may be determined along a limited set of directions. For example, by proceeding outward in order from the center of the light source position 41 along a total of eight axes (up, down, left, right, and the four 45-degree diagonals), the end of the flare region 42 is determined in eight directions. For pixel regions off these axes, the two nearest axes are considered, and the region located on the light source position 41 side of the line connecting the end pixels of the flare region 42 on those two axes, either a straight line or an arc centered on the center of the light source position 41, may be determined as the flare region 42. Such a method simplifies both the determination of the flare region 42 and the generation of the composite image 13 in the composite image generation unit 24.
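The eight-axis scan can be sketched as below (stepping one pixel at a time and using a fixed threshold are simplifying assumptions; interpolating the boundary between axes is omitted):

```python
# Unit steps for the eight axes: up, down, left, right, and the four diagonals.
DIRECTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1),
              (-1, -1), (-1, 1), (1, -1), (1, 1)]

def flare_ends_8dir(image, cy, cx, th2):
    """Walk outward from the light-source center along each axis and return
    the last pixel on that axis whose luminance is still >= th2."""
    h, w = len(image), len(image[0])
    ends = []
    for dy, dx in DIRECTIONS:
        y, x = cy, cx
        while 0 <= y + dy < h and 0 <= x + dx < w \
                and image[y + dy][x + dx] >= th2:
            y, x = y + dy, x + dx
        ends.append((y, x))  # end of flare region 42 along this axis
    return ends
```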
Further, the processing load can be reduced by limiting the pixel region in which the flare region 42 is determined to a preset range of distances from the center of the light source position 41. For example, a distance proportional to the area of the pixel region of the light source position 41 obtained from the light source position information 31 may be set, or a plurality of adjacent pixels may be averaged and treated as one block.
Embodiment 2.
The image display device 1 according to the present embodiment includes an image pickup unit 2, an image processing unit 3, and an image display unit 4, as in the schematic configuration of the first embodiment shown in FIG. 1. The image processing unit 3 includes a previous-frame information holding unit 25 that holds previous-frame information 33, such as the light source position information 31 and the flare area information 32 obtained for a previous frame captured in an earlier time period, which can be referred to during image processing of the current frame captured at the present time.
The flare area determination unit 22 determines, as the flare region 42, a pixel region around the center of the light source position 41 whose luminance value is the second luminance value Th2 or more. Since the light source position 41 is detected from pixel regions having the first luminance value Th1 or more in the captured input image 110, if noise appears in the input image 11, the noise may be erroneously detected as a light source position 41 and a flare region 42 may be determined around it. Further, when, for example, the road surface reflected light 53 of the headlights 52 of the rear vehicle 51 is captured, a large pixel region having the second luminance value Th2 or more is determined to be a flare region 42 while a small one is not, which may cause flicker in the display image. The image display device 1 of the present embodiment exploits the facts that the camera used in the image pickup unit 2 to acquire the captured input image 110 typically captures 30 to 60 frames per second and that noise and the road surface reflected light 53 do not appear in every frame; by referring to previous-frame information during image processing of the current frame, the visibility of the display image is not degraded even if noise is present or a flare region 42 is erroneously determined.
An example in which the road surface reflected light 53 appears in the captured input image 110 will be described with reference to FIG. 6. FIG. 6 shows captured input images 110 taken in the order of times T1 to T7, with the road surface reflected light 53 appearing suddenly at times T4 to T6. Furthermore, the luminance value of the road surface reflected light 53 changes and reaches its maximum at time T5. If, under such a momentary change in luminance, the road surface reflected light 53 is not determined to be a flare region 42 at times T4 and T6 but is determined to be one at time T5, displaying the composite image 13 of the input image 11 and the corrected image 12 as-is causes flicker in the display image on the image display unit 4. Therefore, the light source position information 31 and the flare area information 32 of previous frames are held in the previous-frame information holding unit 25, and the composite image generation unit 24 uses this information to generate the composite image 13 of the current frame.
In the present embodiment, a captured input image 110 or input image 11 captured by the image pickup unit 2 before the present time is called a previous frame (for example, if the present time is time T5, a previous frame of the current frame at time T5 is any of the frames at times T1 to T4). The term may also refer to multiple previous frames, including frames before the immediately preceding one.
FIG. 7 is a schematic block diagram of the image display device 1 of the present embodiment. The image processing unit 3 includes the previous-frame information holding unit 25, which holds the light source position information 31 acquired by the light source position detection unit 21 and the flare area information 32 acquired by the flare area determination unit 22 for previous frames. The processing performed by the light source position detection unit 21, the flare area determination unit 22, and the corrected image generation unit 23 is the same as in the first embodiment.
The previous-frame information holding unit 25 stores the previous-frame information 33 using, for example, a ring buffer with a memory area reserved so that the information from time T1 to time T7 can be held sequentially.
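A minimal ring-buffer sketch (the capacity of seven follows the T1 to T7 example; the per-frame payload is a hypothetical layout):

```python
from collections import deque

class PrevFrameInfoHolder:
    """Ring buffer holding per-frame (light_source_info, flare_region_info)
    tuples; the oldest entry is dropped automatically when full."""
    def __init__(self, capacity=7):  # e.g. frames T1..T7
        self.buffer = deque(maxlen=capacity)

    def push(self, light_source_info, flare_region_info):
        self.buffer.append((light_source_info, flare_region_info))

    def frames(self):
        """All held previous frames, oldest first."""
        return list(self.buffer)
```

A `deque` with `maxlen` gives exactly the ring-buffer behavior described: pushing a new frame silently discards the oldest one.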
The composite image generation unit 24 receives as inputs the input image 11, the light source position information 31, the flare area information 32, the corrected image 12, and the previous-frame information 33 obtained from previous frames.
For example, when the position of the flare region 42 determined in the current frame is judged, from the light source position information 31 in the previous-frame information holding unit 25, not to have been a flare region 42, the composite image generation unit 24 generates the composite image 13 with the composite ratio Ba of the input image 11 increased, for example, to 0.7 or more.
In this way, by referring to the previous-frame information 33, when an image contains noise in the captured input image 110 or suddenly appearing road surface reflected light 53 and the influence of flare is considered small, the output is brought closer to the luminance values of the input image 11, so that flicker in the display image can be suppressed.
Further, the previous-frame information holding unit 25 may hold information on a plurality of previous frames. For example, if a pixel at a position determined to be in the flare region 42 at time T6 was determined to be in a flare region 42 only at T5 among the previous frames T1 to T5 held in the previous-frame information 33, the fraction of frames in which it was not determined to be a flare region is 4/5. In this case, for example, the composite ratio Ba of the input image 11 at D = 0, the center of the light source position 41 in FIG. 4A, is set to 4/5. Alternatively, the composite ratio Ba of the input image 11 may be determined using a function whose variable is the fraction of previous frames not determined to be a flare region, or using a LUT whose input is the number of such frames.
In this way, the composite image generation unit 24 sets the proportion of the input image 11 larger and the proportion of the corrected image 12 smaller as the number of previous frames captured by the image pickup unit 2 in which the region was not determined to be a flare region 42 increases.
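This history-based weighting can be sketched as follows (representing each held previous frame by a boolean flare flag is an assumed data layout; the 4/5 value reproduces the T1 to T5 example above):

```python
def ba_from_history(flare_flags):
    """Composite ratio Ba of the input image 11 at the light-source center,
    given booleans saying whether each held previous frame judged the
    position to be a flare region. More 'not flare' frames -> larger Ba."""
    if not flare_flags:
        return 0.0  # no history: fall back to the pure distance-based ratio
    not_flare = sum(1 for f in flare_flags if not f)
    return not_flare / len(flare_flags)

# T1..T5: only T5 judged the position to be flare -> Ba = 4/5
ba = ba_from_history([False, False, False, False, True])
```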
Although an example was described in which the previous-frame information 33 held in the previous-frame information holding unit 25 is the light source position information 31 and the flare area information 32 of previous frames, the previous-frame information holding unit 25 may instead hold any information from which the light source position 41 detected by the light source position detection unit 21 and the flare region 42 determined by the flare area determination unit 22 can be obtained.
For example, the light source position 41 in previous frames may be held in the previous-frame information holding unit 25, and the composite image generation unit 24 may set the proportion of the input image 11 larger and the proportion of the corrected image 12 smaller the less often the light source position 41 was detected in the previous frames captured by the image pickup unit 2.
Also in such an image display device 1, the light source position 41 is detected from the luminance values of the captured input image 110, and in at least a part of the flare region 42 determined from the center of the light source position 41, the composite image 13 is generated and displayed with the proportion of the corrected image 12 increased closer to the center of the light source position 41 and the proportion of the input image 11 increased farther from it, while the input image 11 is displayed in the region that is not the flare region 42. The discontinuity of the luminance value of the display image is therefore suppressed, and an image with high visibility can be displayed.
Further, by referring to the previous-frame information 33 and bringing the output closer to the luminance values of the input image 11 when the influence of flare is considered small, flicker in the display image is suppressed and an image with even higher visibility can be displayed.
Embodiment 3.
The image display device 1 according to the present embodiment includes an image pickup unit 2, an image processing unit 3, and an image display unit 4, as in the schematic configuration of the first embodiment shown in FIG. 1. As shown in FIG. 8, the image processing unit 3 includes a previous-frame information holding unit 26 that holds the composite images 13 of previous frames output from the composite image generation unit 24.
The previous-frame information holding unit 26 stores the composite images 13 output from the composite image generation unit 24 for a plurality of previous frames. The information is stored using, for example, a ring buffer with a memory area reserved so that the information from time T1 to time T7 can be held sequentially.
The light source position detection unit 21 uses the captured input image 110 of the current frame and the composite images 13 of the plurality of previous frames held by the previous-frame information holding unit 26 to estimate the light source position 41 of the current frame from the transition of the light source position 41 in the previous-frame composite images 13. For example, with a camera that captures 60 frames per second, the same light source appears repeatedly in the composite images 13 of the plurality of previous frames, so the transition of the light source can be tracked. By comparing the luminance values of the composite images 13 of the plurality of previous frames, high-luminance pixels unrelated to flare, such as noise and the road surface reflected light 53, can be found and removed.
For example, a composite image 13 of a plurality of front frames for a preset time is taken out from the front frame information holding unit 26, and the transition of the position of the center of gravity of the high-luminance region calculated for each frame is observed. For example, in the case of the headlight 52 of the rear vehicle 51, the position of the center of gravity moves according to the traveling direction of the road. Can be estimated from.
If the movement of the position of the center of gravity in the high-luminance region cannot be estimated in a plurality of front frames, it is determined that the region is not affected by flare caused by noise of the image pickup input image 110, road surface reflected light, or the like, and the light source position 41. Can be prevented from being detected.
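The centroid-trend estimation described above can be sketched as follows. This is an illustrative assumption of roughly linear per-frame motion, not the disclosed implementation; the tolerance value and all names are hypothetical:

```python
def estimate_light_source_position(centroids, tol=5.0):
    """Predict the current-frame light source position from the centroid
    trajectory of the high-brightness region in previous composite frames.

    `centroids` is an oldest-first list of (x, y) centers of gravity, one
    per previous frame. If the per-frame displacement is not consistent,
    no trend can be estimated; return None so the region is treated as
    noise or road-surface reflection rather than a light source.
    """
    if len(centroids) < 2:
        return None
    # Per-frame displacement between consecutive centroids.
    steps = [(b[0] - a[0], b[1] - a[1])
             for a, b in zip(centroids, centroids[1:])]
    dx, dy = steps[-1]
    if any(abs(sx - dx) > tol or abs(sy - dy) > tol for sx, sy in steps):
        return None  # inconsistent motion: no reliable trend
    # Extrapolate one frame ahead of the last observed centroid.
    x, y = centroids[-1]
    return (x + dx, y + dy)
```

For a headlight moving steadily along the road, the extrapolated position approximates the light source position 41 in the current captured input image 110; for flickering noise, the inconsistent steps cause the candidate to be rejected.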
The flare region 42 in the input image 11 of the current frame may also be determined, frame by frame, from the variation across the composite images 13 of the plurality of previous frames in the size of the flare region 42 caused by the movement of the headlight 52 of the rear vehicle 51, which depends on the directional characteristic of the camera sensitivity.
 When no variation trend can be estimated from the composite images 13 of the plurality of previous frames, the region is judged to be one caused not by flare but by noise of the captured input image 110, road surface reflected light 53, or the like, and is not determined to be the flare region 42.
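One possible reading of this size-variation check, sketched under the assumption that the flare region is summarized as a single per-frame radius and grows or shrinks at a roughly constant rate; the tolerance and all names are hypothetical:

```python
def predict_flare_radius(radii, tol=0.25):
    """Predict the current-frame flare-region size from its per-frame sizes
    in previous composite frames.

    `radii` is an oldest-first list of flare-region sizes (e.g. radii in
    pixels) measured per previous frame. When consecutive frames show no
    consistent growth/shrink ratio, return None so the region is not
    determined to be a flare region.
    """
    if len(radii) < 2:
        return None
    # Frame-to-frame size ratios; skip degenerate zero-size entries.
    ratios = [b / a for a, b in zip(radii, radii[1:]) if a > 0]
    if not ratios:
        return None
    r = ratios[-1]
    if any(abs(q - r) > tol for q in ratios):
        return None  # no consistent variation trend
    return radii[-1] * r
```

A headlight approaching the camera yields a steady growth ratio and a usable prediction; a transient reflection yields erratic ratios and is rejected.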
In such an image display device 1 as well, the light source position 41 is detected from the brightness values of the captured input image 110, and in at least a part of the flare region 42 determined from the center of the light source position 41, a composite image 13 is generated and displayed in which the proportion of the corrected image 12 is larger the closer a pixel is to the center of the light source position 41 and the proportion of the input image 11 is larger the farther it is, while the input image 11 is displayed in regions that are not the flare region 42. Discontinuity in the brightness values of the displayed image is thereby suppressed, and an image with high visibility can be displayed.
 Furthermore, by using the previous frame information 34, which refers to the composite images 13 of previous frames, the light source position 41 and the position of the flare region 42 can be estimated, simplifying the processing.
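The distance-weighted composition rule summarized above can be sketched as follows. The linear falloff of the blend weight is an assumption for illustration (the claims only require the corrected-image proportion to increase toward the center); function and parameter names are hypothetical:

```python
import numpy as np


def blend_flare_region(input_img, corrected_img, center, radius):
    """Composite the corrected image and the input image inside a flare
    region: full weight to the corrected image at the light source center,
    tapering linearly to the input image at the region edge, and the
    input image unchanged outside the region.
    """
    h, w = input_img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # Distance of every pixel from the light source center (x, y).
    dist = np.hypot(xx - center[0], yy - center[1])
    # alpha = corrected-image weight: 1 at the center, 0 at/beyond the edge.
    alpha = np.clip(1.0 - dist / radius, 0.0, 1.0)
    if input_img.ndim == 3:
        alpha = alpha[..., None]  # broadcast over color channels
    return alpha * corrected_img + (1.0 - alpha) * input_img
```

Because the weight reaches zero exactly at the region boundary, the composite transitions into the untouched input image without a brightness-value discontinuity, which is the effect the embodiment aims for.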
In the above first to third embodiments, the input image 11 may be the captured input image 110 taken as-is, or an image whose brightness, hue, and the like have been adjusted.
In addition to the above, free combinations of the embodiments, modification of any component of each embodiment, and omission of any component of each embodiment are possible.
Although the image display device 1 has been described, the image processing unit 3 constituting the image display device 1, the image display method implemented by the image display device 1, and the image processing method implemented by the image processing unit 3 also form part of the present invention. Furthermore, a program that causes a computer to execute the processing of the image processing unit 3 or the image processing method described above, and a computer-readable recording medium on which the program is recorded, such as a non-transitory recording medium, also form part of the present invention.
1 image display device, 2 image pickup unit, 3 image processing unit, 4 image display unit, 11 input image, 12 corrected image, 13 composite image, 21 light source position detection unit, 22 flare region determination unit, 23 corrected image generation unit, 24 composite image generation unit, 25, 26 previous frame information holding unit, 31 light source position information, 32 flare region information, 33, 34 previous frame information, 41 light source position, 42 flare region, 51 rear vehicle, 52 headlight, 53 road surface reflected light, 110 captured input image

Claims (10)

  1.  An image display device comprising:
     an image pickup unit that captures an image of the outside of a vehicle;
     a light source position detection unit that detects, as a light source position, a pixel region in which adjacent pixels at or above a preset first brightness value are gathered in a captured input image captured by the image pickup unit;
     a flare region determination unit that determines, from the pixel region around the center of the light source position, a pixel region at or above a preset second brightness value to be a flare region;
     a corrected image generation unit that generates a corrected image by reducing the brightness values of pixels at or above a preset third brightness value in at least the input image within the flare region of the captured input image;
     a composite image generation unit that generates, in at least a part of the flare region, a composite image of the corrected image and the input image in which the proportion of the corrected image is larger the closer a pixel is to the center of the light source position and the proportion of the input image is larger the farther it is; and
     an image display unit that displays the composite image in the pixel region where the composite image is generated and, in the pixel regions where the composite image is not generated, displays the corrected image or the input image within the flare region and the input image outside the flare region.
  2.  The image display device according to claim 1, wherein the composite image generation unit generates the composite image within the flare region or in a region including the edge of the flare region and its periphery.
  3.  The image display device according to claim 1 or claim 2, wherein the flare region determination unit sets the second brightness value larger as the distance from the center of the light source position decreases when determining the flare region.
  4.  The image display device according to any one of claims 1 to 3, further comprising a previous frame information holding unit that holds previous frame information on previous frames captured in a time period before the current time.
  5.  The image display device according to claim 4, wherein the previous frame information is at least one of light source position information obtained by the light source position detection unit, flare region information obtained by the flare region determination unit, and the composite image obtained by the composite image generation unit.
  6.  The image display device according to claim 5, wherein the previous frame information is the light source position information, and in the composite image generation unit, the greater the number of previous frames in which the light source position is not detected according to the light source position information, the larger the proportion of the input image and the smaller the proportion of the corrected image.
  7.  The image display device according to claim 5, wherein the previous frame information is the flare region information, and in the composite image generation unit, the greater the number of previous frames not determined to be the flare region according to the flare region information, the larger the proportion of the input image and the smaller the proportion of the corrected image.
  8.  The image display device according to claim 5, wherein the previous frame information is the composite image, and the light source position detection unit estimates the light source position of the current frame from the transition of the light source position in the composite images of the previous frames held by the previous frame information holding unit.
  9.  The image display device according to claim 5, wherein the previous frame information is the composite image, and the flare region determination unit determines the flare region of the current frame from the transition of the flare region in the composite images of the previous frames held by the previous frame information holding unit.
  10.  An image display method comprising:
     a step of detecting, as a light source position, a pixel region in which adjacent pixels at or above a preset first brightness value are gathered in a captured input image of the outside of a vehicle;
     a step of determining a pixel region in which the brightness values of pixels around the center of the light source position are at or above a preset second brightness value to be a flare region;
     a step of generating a corrected image by reducing the brightness values of pixels at or above a preset third brightness value in at least the input image within the flare region of the captured input image;
     a step of generating, in at least a part of the flare region, a composite image of the corrected image and the input image in which the proportion of the corrected image is larger the closer a pixel is to the center of the light source position and the proportion of the input image is larger the farther it is; and
     a step of generating a display image that displays the composite image in the pixel region where the composite image is generated and, in the pixel regions where the composite image is not generated, displays the corrected image or the input image within the flare region and the input image outside the flare region.
PCT/JP2020/035284 2020-09-17 2020-09-17 Image display device and image display method WO2022059139A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/035284 WO2022059139A1 (en) 2020-09-17 2020-09-17 Image display device and image display method
JP2022550267A JP7355252B2 (en) 2020-09-17 2020-09-17 Image display device and image display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/035284 WO2022059139A1 (en) 2020-09-17 2020-09-17 Image display device and image display method

Publications (1)

Publication Number Publication Date
WO2022059139A1 true WO2022059139A1 (en) 2022-03-24

Family

ID=80776560

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/035284 WO2022059139A1 (en) 2020-09-17 2020-09-17 Image display device and image display method

Country Status (2)

Country Link
JP (1) JP7355252B2 (en)
WO (1) WO2022059139A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004048456A (en) * 2002-07-12 2004-02-12 Niles Co Ltd Image pickup system
JP2009141813A (en) * 2007-12-07 2009-06-25 Panasonic Corp Imaging apparatus, camera, vehicle and imaging method
WO2009081533A1 (en) * 2007-12-21 2009-07-02 Panasonic Corporation Flare correcting device
JP2009296224A (en) * 2008-06-04 2009-12-17 Nippon Soken Inc Imaging means and program
JP2010004450A (en) * 2008-06-23 2010-01-07 Nippon Soken Inc Image pickup apparatus and program
JP2013232880A (en) * 2012-04-27 2013-11-14 Lg Innotek Co Ltd Image processing device and image processing method
JP2018121122A (en) * 2017-01-23 2018-08-02 株式会社デンソー Driving support system and driving support method

Also Published As

Publication number Publication date
JPWO2022059139A1 (en) 2022-03-24
JP7355252B2 (en) 2023-10-03

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20954124; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022550267; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20954124; Country of ref document: EP; Kind code of ref document: A1)