WO2006098358A1 - Image processing apparatus and method, program, and recording medium - Google Patents
Image processing apparatus and method, program, and recording medium
- Publication number
- WO2006098358A1 PCT/JP2006/305115
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- gradation
- luminance
- data
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/40—Image enhancement or restoration using histogram techniques
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/20—Circuitry for controlling amplitude response
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
- G06T2207/20012—Locally adaptive
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
Definitions
- The present invention relates to an image processing apparatus and method, a program, and a recording medium, and in particular to an image processing apparatus and method, a program, and a recording medium that enable an image captured by an imaging element with a wide dynamic range to be displayed clearly.
- An HDRC (High Dynamic Range CMOS (Complementary Metal Oxide Semiconductor)) image sensor is one example of such an imaging element.
- Because HDRC has a wide dynamic range, a captured image can have up to 2^14 (16,384) gradation levels.
- In contrast, the display gradation of an ordinary display device is generally only 256 levels, for example. Because the number of displayable gradations is smaller than the number of gradations in the captured image, an HDRC captured image cannot be displayed as-is on an ordinary display. Therefore, to display an HDRC captured image on an ordinary display device, gradation conversion of the captured image is performed.
- FIG. 1 is a diagram for explaining an example of conventional gradation conversion.
- Suppose that images taken with a camera using HDRC (image 1 taken in clear weather, image 2 taken in cloudy weather, and image 3 taken at night) have their gradations converted and are shown as display images on a display device.
- In the figure, the maximum luminance value of the pixels included in image 1 is b11 and the minimum is b12; the maximum luminance value of the pixels included in image 2 is b21 and the minimum is b22; and the maximum luminance value of the pixels included in image 3 is b31 and the minimum is b32.
- Gradation conversion is performed on the luminance values of the pixels included in images 1 to 3 based on these maximum and minimum values. For example, in image 1, the luminance values between the maximum b11 and the minimum b12 are converted so as to be assigned to 256 gradations and displayed on the display device.
- Likewise, in image 2, the luminance values between the maximum b21 and the minimum b22 are converted so as to be assigned to 256 gradations.
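The conventional min-max gradation conversion described above can be sketched in Python (an illustrative sketch, not code from the patent; the function and variable names are invented):

```python
def minmax_gradation_conversion(pixels, out_levels=256):
    """Conventional conversion: linearly map the capture's own
    min..max luminance onto 0..out_levels-1 display gradations."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:  # flat image: nothing to stretch
        return [0] * len(pixels)
    scale = (out_levels - 1) / (hi - lo)
    return [round((p - lo) * scale) for p in pixels]

# A simulated 14-bit capture (luminance codes 0..16383)
capture = [1000, 8000, 16000]
display = minmax_gradation_conversion(capture)  # [0, 119, 255]
```

A single very bright outlier (for example, a headlight at night) stretches this mapping and darkens everything else, which is exactly the failure mode described for image 3.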
- Patent Literature 1: JP 7-506932 Gazette
- However, a display image produced by conventional gradation conversion is not always clear. Because the dynamic range of an image taken with a camera such as an HDRC camera is very wide compared to the dynamic range of the display device, if the gradation is simply compressed and displayed as-is, the result is a washed-out image with little overall contrast, and it becomes difficult for the human eye to recognize the displayed content.
- In conventional conversion, the maximum and minimum luminance values are obtained, and the gradation is compressed so that the maximum value becomes the maximum luminance of the display device and the minimum value becomes its minimum luminance.
- In this case, the display device has no information about which luminance region a person is actually looking at.
- For example, when a bright sky appears in the image, a bright luminance region different from the luminance region the person is looking at is displayed with the main gradations.
- Conversely, a dark luminance region different from the luminance region the person is viewing may be displayed with the main gradations.
- Image data taken with an HDRC camera also includes luminance information at values so high (or so low) that people are not interested in it, so the luminance region people are actually paying attention to cannot be displayed with appropriate density and color.
- Furthermore, because the span between the maximum and minimum luminance values can differ greatly from scene to scene, the gradation of the display image is stretched or shrunk as the luminance values of the captured image change, and the displayed image can look very unnatural.
- In image 3, the luminance values between the maximum b31 and the minimum b32 are converted so as to be assigned to 256 gradations and displayed on the display device.
- However, image 3, taken at night, is likely to consist of dark pixels (low luminance values) over almost the entire image, while it is also highly likely to contain light from objects that are extremely bright compared to the rest of the image, such as flashes and headlights.
- As a result, the center of the converted luminance range becomes higher than the luminance values of the pixels that should originally be displayed clearly (most of the pixels in the image), and the display image corresponding to image 3 becomes dim overall.
- In addition, the display brightness of the entire screen varies each time the weather or lighting changes, and the image becomes difficult for the user to see. For example, when sunlight suddenly enters the frame from between buildings, or when an oncoming vehicle with its headlights on suddenly appears around a curve at night, an image that had been displayed in an easy-to-see state suddenly becomes dark and hard to see. Conversely, when approaching a dark area such as a tunnel, an image that had been displayed in an easy-to-see state suddenly becomes brighter and difficult to see.
- the present invention has been made in view of such a situation, and makes it possible to clearly display an image photographed by an imaging element having a wide dynamic range.
- An image processing apparatus to which the present invention is applied includes: imaging means for capturing an image and outputting data of the captured image; conversion means for converting the gradation of the image based on data corresponding to an image of an object having a predetermined reflectance, the object appearing in a preset region included in the image corresponding to the data output from the imaging means; and display means for displaying an image corresponding to a signal of the image whose gradation has been converted by the conversion means.
- In this configuration, an image is captured and its data is output; the gradation of the image is converted based on data corresponding to an image of an object having a predetermined reflectance within a preset region of the image corresponding to the output data; and an image corresponding to the image signal with the converted gradation is displayed.
- the captured image can be clearly displayed regardless of the gradation that can be displayed by the display means.
- The imaging means can convert the charge generated in response to the captured light into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the number of charges, convert the analog electrical signal into digital data, and output the result as image data.
- the photographing means is composed of, for example, an HDRC camera.
- Alternatively, the imaging means can convert the current generated in response to the captured light into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the magnitude of the current, convert the analog electrical signal into digital data, and output the result as image data.
- The gradation width of the image that can be displayed by the display means may be smaller than the gradation width of the image captured by the imaging means.
- The image processing apparatus may further include receiving means for receiving input of information regarding the position of the region in the image.
- The conversion means may include: extraction means for extracting data corresponding to the preset region included in the image; specifying means for specifying, based on the data extracted by the extraction means, the luminance width in the image that corresponds to the gradation width that can be displayed; and output means for converting the gradation of the image into the gradation width that can be displayed by the display means, based on the luminance width specified by the specifying means, and outputting a signal corresponding to the converted image data.
- The extraction means may calculate the average luminance value of the pixels in the region based on the data corresponding to the region, and the specifying means may assign the average luminance value calculated by the extraction means to a preset gradation within the gradation width that can be displayed by the display means, and specify the luminance width by determining, based on the assigned gradation, the upper and lower luminance limits corresponding to the displayable gradation width.
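As an illustrative sketch of the scheme just described (function names and the fixed luminance width per gradation are assumptions, not the patent's exact procedure): the mean luminance of the reference region is pinned to a preset display gradation, the upper and lower luminance limits follow from it, and out-of-range luminance is clipped.

```python
def anchored_gradation_conversion(pixels, ref_pixels, ref_level=128,
                                  out_levels=256, step=1.0):
    """Pin the mean luminance of a fixed reference region (e.g. the
    road surface just ahead) to display gradation `ref_level`.
    `step` is an assumed luminance width per display gradation."""
    ref_mean = sum(ref_pixels) / len(ref_pixels)
    lower = ref_mean - ref_level * step       # luminance shown as gradation 0
    upper = lower + (out_levels - 1) * step   # luminance shown at the top gradation
    return [round((min(max(p, lower), upper) - lower) / step) for p in pixels]

# The road-surface region averages 120, so a pixel at 120 is shown at
# level 128 regardless of how bright or dark the rest of the scene is.
out = anchored_gradation_conversion([120, 0, 300], ref_pixels=[100, 120, 140])
```

Because the anchor comes from a region whose content (and reflectance) is stable, the mapping no longer swings with transient bright or dark objects elsewhere in the frame.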
- In this way, the correspondence between the luminance values of the pixels and the gradations is set appropriately, and the image can always be displayed with natural color.
- The conversion means may further include storage means for storing information on the position of the region in the image and information on the gradation, within the gradation width that can be displayed by the display means, that corresponds to the average luminance value of the pixels in the region.
- The conversion means can determine the gradation corresponding to the luminance values of the pixels in the region, within the gradation width that can be displayed by the display means, based on the user's selection regarding display of the image in the region.
- In this way, the gradation of the image is converted based on an object that is always included in the captured image, and an image with a tone that suits the user's preference can be displayed.
- The conversion means can also determine the gradation corresponding to the luminance values of the pixels in the region, within the displayable gradation width, based on the reflectance of the object imaged in the region.
- the gradation of the image can be converted based on a region in which an object with a known reflectance is always included in the captured image.
- Distance measuring means for measuring the distance to the object imaged in the region may further be provided, and the reflectance of the object can be calculated based on the distance measured by the distance measuring means and the luminance values of the pixels in the region.
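The patent does not give a formula for this calculation; as a loudly hypothetical sketch, if the region is lit mainly by a single known source (say, the headlights) and illuminance is assumed to fall off as the inverse square of distance, an apparent reflectance could be estimated like this:

```python
def estimate_reflectance(mean_luminance, distance_m, source_intensity):
    """HYPOTHETICAL: apparent reflectance under an assumed inverse-square
    illumination model, R ~ L * d^2 / I. The model and every parameter
    name here are illustrative assumptions, not the patent's method."""
    return mean_luminance * distance_m ** 2 / source_intensity
```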
- The imaging means may convert the charge or current generated in response to the captured light into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the number of charges or the magnitude of the current; the analog electrical signal can then be subjected to gradation conversion by the conversion means, converted into digital data, and output.
- An image processing method to which the present invention is applied includes: an acquisition step of acquiring data of an image captured by imaging means; a conversion step of converting the gradation of the image based on data corresponding to an image of an object having a predetermined reflectance, the object appearing in a preset region included in the image corresponding to the data acquired in the acquisition step; and a display step of displaying an image corresponding to a signal of the image whose gradation has been converted in the conversion step.
- In this method, data of an image captured by the imaging means is acquired; the gradation of the image is converted based on data corresponding to an image of an object having a predetermined reflectance within a preset region of the image corresponding to the acquired data; and an image corresponding to the signal of the image with the converted gradation is displayed.
- A program to which the present invention is applied causes an image processing apparatus to perform image processing, and causes a computer to execute: an acquisition control step of controlling acquisition of data of an image captured by imaging means; a conversion control step of controlling conversion of the gradation of the image based on data corresponding to an image of an object having a predetermined reflectance, the object appearing in a preset region included in the image corresponding to the data acquired in the acquisition control step; and a display control step of controlling display of an image corresponding to an image signal whose gradation has been converted in the conversion control step.
- A recording medium to which the present invention is applied records a program for causing an image processing apparatus to perform image processing, the program causing a computer to execute: an acquisition control step of controlling acquisition of data of an image captured by imaging means; a conversion control step of controlling conversion of the gradation of the image based on data corresponding to an image of an object having a predetermined reflectance, the object appearing in a preset region included in the image corresponding to the acquired data; and a display control step of controlling display of an image corresponding to a signal of the image whose gradation has been converted in the conversion control step.
- According to the present invention, an image can be clearly displayed.
- FIG. 1 is a diagram for explaining an example of conventional gradation conversion.
- FIG. 2 is a block diagram showing a configuration example of a monitoring apparatus to which the present invention is applied.
- FIG. 3 is a diagram showing an example of the attachment position of the imaging unit in FIG. 2.
- FIG. 4 is a diagram showing an example of the attachment position of the imaging unit in FIG. 2.
- FIG. 5 is a diagram illustrating a configuration example of the imaging unit in FIG. 2.
- FIG. 6 is a diagram illustrating sensitivity characteristics of the imaging unit.
- FIG. 7 is a block diagram illustrating a configuration example of the control unit in FIG. 2.
- FIG. 8 is a flowchart illustrating an example of display adjustment processing.
- FIG. 9 is a diagram illustrating an example of gradation conversion according to the present invention.
- FIG. 10 is a flowchart illustrating an example of reflectance measurement processing.
- FIG. 11 is a diagram illustrating an example of a correspondence relationship between a luminance value, a reflectance, and a display gradation.
- FIG. 12 is a flowchart illustrating an example of reference information storage processing.
- FIG. 13 is a diagram showing an example of designation of a reference area.
- FIG. 14 is a diagram showing an example of an image displayed by a conventional technique.
- FIG. 15 is a diagram showing an example of an image displayed by the technique of the present invention.
- FIG. 16 is a block diagram illustrating another configuration example of the imaging unit.
- FIG. 17 is a block diagram illustrating a configuration example of a gradation assignment determining unit.
- FIG. 18 is a block diagram showing a configuration example of a personal computer.
- FIG. 2 is a block diagram showing an external configuration example of the embodiment of the monitoring apparatus 100 to which the present invention is applied.
- the monitoring device 100 is a device that is mounted on, for example, an automobile and shoots the front outside the vehicle and presents a clear image to the user.
- The monitoring device 100 includes an imaging unit 101, a control unit 102, and a display unit 103.
- The imaging unit 101 is configured by, for example, a camera with a wide dynamic range; it captures an image (which may be a moving image or a still image) based on light input through the lens 101a and outputs the captured image data to the control unit 102.
- When the imaging unit 101 captures a moving image, the captured image data is output as digital data on a frame-by-frame basis.
- The control unit 102 performs processing such as gradation conversion on the captured image data supplied from the imaging unit 101 so that a clear image can be shown on the display unit 103, and outputs a signal corresponding to the processed image data to the display unit 103.
- Settings of data necessary for processing, input of commands, and the like are performed via the operation input unit 104.
- the operation input unit 104 may be configured by an external information device such as a personal computer.
- the display unit 103 is configured by, for example, an LCD (Liquid Crystal Display) or the like, and displays an image corresponding to a signal supplied from the control unit 102.
- The display unit 103 is configured by, for example, a commercially available liquid crystal display; whereas the imaging unit 101 captures images with a wide dynamic range, the display unit 103 has a display gradation with a smaller number of gradations (for example, 256) than the image captured by the imaging unit 101.
- FIG. 3 is a diagram illustrating an example of the attachment position of the imaging unit 101.
- The imaging unit 101 is attached in the vicinity of the rearview mirror of the automobile 111 and is configured so that light centered on the light ray 131 enters the lens 101a so as to include the road surface 112 immediately in front of the automobile, an object needed for the gradation conversion processing described later, and an image corresponding to that light is captured.
- The imaging unit 101 may be incorporated in the rearview mirror itself or attached to the base of the rearview mirror.
- The mounting position of the imaging unit 101 is preferably one from which the front can be photographed through the area of the windshield wiped clean by the wipers; any position is sufficient as long as the front can be photographed through the windshield without obstructing the driver's view. In this way, an image almost equivalent to what the user sees can be captured.
- the imaging unit 101 may be attached as shown in FIG.
- FIG. 5 is a block diagram illustrating a configuration example of the imaging unit 101. As shown in the figure, the imaging unit 101 is configured such that light from the lens 101a enters the imaging control unit 121.
- the imaging control unit 121 is, for example, a logarithmic conversion type imaging device such as HDRC (High Dynamic Range CMOS (Complementary Metal Oxide Semiconductor)), a light detection unit 141, a logarithmic conversion unit 142, an A / D conversion unit 143, and An imaging timing control unit 144 is included.
- the light of the subject incident through the lens 101a forms an image on a light detection surface (not shown) of the light detection unit 141 of the imaging control unit 121.
- The light detection unit 141 includes, for example, a plurality of light receiving elements such as photodiodes; it converts the subject light imaged by the lens 101a into charge corresponding to the intensity (light quantity) of the light and accumulates the converted charge.
- The light detection unit 141 supplies the accumulated charge to the logarithmic conversion unit 142 in synchronization with the control signal supplied from the imaging timing control unit 144. Note that the light detection unit 141 can also supply the converted charge to the logarithmic conversion unit 142 as it is, without accumulating it.
- the logarithmic conversion unit 142 is configured by, for example, a plurality of MOSFETs (Metal Oxide Semiconductor Field Effect Transistors).
- The logarithmic conversion unit 142 uses the sub-threshold characteristic of the MOSFETs to convert the charge (or current) supplied from the light detection unit 141 into, for each pixel, an analog electrical signal having a voltage value proportional to the logarithm of the number of charges (or the current intensity).
- the logarithmic conversion unit 142 supplies the converted analog electrical signal to the A / D conversion unit 143.
- The A/D conversion unit 143 converts the analog electrical signal into digital image data in synchronization with the control signal supplied from the imaging timing control unit 144, and supplies the converted image data to the image processing apparatus 112. Accordingly, the pixel value of each pixel of the image data output from the imaging control unit 121 is proportional to the logarithm of the amount of subject light incident on the light detection unit 141.
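The logarithmic response described above can be modeled numerically (an illustrative sketch; the roughly 1 millilux to 500 klux span and the 14-bit code are taken from figures quoted elsewhere in this document, while the mapping details are assumptions):

```python
import math

def logarithmic_pixel_value(light_lux, bits=14, min_lux=0.001, max_lux=500_000.0):
    """Illustrative model of a logarithmic-response pixel: the digital
    code is proportional to log(light), with an assumed ~1 millilux to
    ~500 klux operating range mapped onto a 14-bit output."""
    span = math.log10(max_lux) - math.log10(min_lux)
    frac = (math.log10(light_lux) - math.log10(min_lux)) / span
    frac = min(max(frac, 0.0), 1.0)  # clip to the modeled range
    return round(frac * (2 ** bits - 1))

# Doubling the light adds a (nearly) constant code offset instead of
# doubling the code, so bright regions do not saturate the output.
codes = [logarithmic_pixel_value(x) for x in (100, 200, 400)]
```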
- FIG. 6 is a graph showing sensitivity characteristics of the HDRC imaging control unit 121, a CCD (Charge Coupled Device) imaging device, a silver halide film, and the human eye.
- the horizontal axis indicates the logarithm of the illuminance (unit: lux) of the incident light
- the vertical axis indicates the sensitivity.
- The line 151 represents the sensitivity characteristic of the imaging control unit 121.
- the line 152 represents the sensitivity characteristic of the human eye
- the line 153 represents the sensitivity characteristic of the silver salt film
- the line 154 represents the sensitivity characteristic of the CCD image sensor.
- The sensitivity characteristic of a conventional CMOS image sensor is almost the same as the sensitivity characteristic of the CCD image sensor indicated by the line 154.
- By outputting image data whose pixel values are substantially proportional to the logarithm of the amount of incident subject light, the imaging control unit 121 achieves, without the capacitance of the photodiodes and MOSFETs constituting it saturating, a dynamic range of about 170 dB, from about 1 millilux to about 500 klux, which is brighter than sunlight; this range is wider than that of CCD image sensors, silver salt film, and the human eye.
- That is, because the logarithmic conversion unit 142 outputs data composed of luminance values (or pixel values) substantially proportional to the logarithm of the incident light amount, even when the incident light amount increases, the capacitance of elements such as the photodiodes and MOSFETs constituting the imaging control unit 121 does not saturate, and the current flowing through each element and the applied voltage do not exceed the range over which each element can produce output corresponding to its input. Accordingly, luminance values (or pixel values) that follow fluctuations in the amount of incident light can be obtained almost accurately within the range of luminance that can be imaged.
- The dynamic range of the logarithmic conversion type image sensor is not limited to the above-mentioned 170 dB; depending on the purpose of use, an image sensor with a dynamic range corresponding to the requirement, such as about 100 dB or 200 dB, may be used.
- The imaging device 101 using the imaging control unit 121 does not need to adjust the amount of incident light by adjusting the aperture or shutter speed: the pixel values corresponding to bright portions of the subject are not clipped at the maximum pixel value the image sensor can output, nor are the pixel values corresponding to dark portions clipped at the minimum pixel value.
- In other words, the imaging device 101 can faithfully capture fine changes in the luminance of the subject, without bright portions being washed out to white or dark portions being crushed to black.
- For example, when the imaging device 101 using the imaging control unit 121 captures the front of the vehicle from inside the vehicle during the daytime, even if the sun is within the angle of view, images that faithfully reproduce both the sun and the conditions of the road ahead can be taken without adjusting the amount of incident light.
- Likewise, when the imaging device 101 captures the front of the vehicle from inside the vehicle at night, even if the headlights of an oncoming vehicle shine from the front, images that are faithfully reproduced from the headlights down to the dark portions can be taken without adjusting the amount of incident light.
- The output value (luminance value) of each pixel can be output, for example, as 14-bit data, so if the luminance value of each pixel is converted into a gradation as it is, the result is data corresponding to an image with a wide gradation width of 2^14.
- because the imaging apparatus 101 using the imaging control unit 121 does not need to adjust the amount of incident light, in the image data output from the imaging apparatus 101, between the capture of two frames the pixel values corresponding to regions where the subject's luminance varies change, while most of the pixel values corresponding to regions where the luminance does not vary remain unchanged. Therefore, the pixel value of each pixel of the data obtained by taking the difference of the image data between frames (hereinafter referred to as difference data; its per-pixel values are also referred to as difference values) reflects the change in luminance of the subject almost faithfully.
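The inter-frame difference property described here can be sketched as follows; the frame values are invented for illustration.

```python
def difference_data(frame_a, frame_b):
    """Per-pixel difference of two frames; with a logarithmic sensor and
    no exposure adjustment between frames, non-zero entries track genuine
    luminance changes in the scene (illustrative sketch)."""
    return [a - b for a, b in zip(frame_a, frame_b)]

prev = [10, 20, 30, 40]
curr = [10, 25, 30, 40]   # only the second pixel's luminance changed
diff = difference_data(curr, prev)
```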
- an imaging apparatus using a CCD imaging device, whose dynamic range is narrower than that of the human eye, needs to adjust the amount of incident light in accordance with the brightness of the subject.
- consequently, pixel values corresponding to regions where the luminance does not fluctuate may also change, so the difference value of each pixel of the difference data may not faithfully reflect changes in the subject's luminance.
- since the pixel value of the image data output from the imaging device 101 is substantially proportional to the logarithm of the amount of light from the subject, regardless of the brightness (illuminance) of the illumination on the subject, the histogram showing the distribution of pixel values in image data obtained by photographing the subject has almost the same shape as the histogram showing the distribution of the subject's reflectance.
- the difference in the histogram width indicating the distribution of the pixel values of the first image data and the second image data is about 100 times.
- the pixel values of the image data obtained by imaging the subject fluctuate substantially uniformly.
- for example, when the illuminance of the light illuminating the subject changes almost uniformly and the brightness of the subject changes almost uniformly by +5%, the fluctuations of the pixel values corresponding to the two regions are almost the same value (log10 1.05).
- the fluctuation values of the pixel values corresponding to the two areas described above would otherwise differ by a factor of about 100.
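The uniform shift in the logarithmic domain can be checked numerically; the reflectances and illuminances below are illustrative assumptions.

```python
import math

def log_pixel(reflectance, illuminance):
    # Pixel value modeled as proportional to log10(reflectance * illuminance).
    return math.log10(reflectance * illuminance)

# Two regions whose reflectances differ by a factor of 100:
bright, dark = 1.0, 0.01
before = (log_pixel(bright, 1000.0), log_pixel(dark, 1000.0))
after = (log_pixel(bright, 1050.0), log_pixel(dark, 1050.0))  # +5% illuminance

shift_bright = after[0] - before[0]
shift_dark = after[1] - before[1]
# Both shifts equal log10(1.05), so the histogram shape is preserved.
```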
- on the other hand, the sensitivity of a CCD image sensor or silver-halide film is not proportional to the illuminance of the incident light, owing to factors such as gamma characteristics. Therefore, even if the distribution of incident light (illuminance) is uniform, the shape of the histogram showing the pixel-value distribution of image data captured with a CCD image sensor or silver-halide film changes depending on the amount of light (the magnitude of the illuminance).
- since the imaging unit 101 configured using the HDRC imaging control unit 121 as described above can capture images with a wide dynamic range, unlike the case of a CCD imaging device there is no loss of image data due to excess or insufficient illuminance, even in images taken against backlight or at night. It can therefore be used in a wide range of fields: for example, it can be installed in a car and used for a navigation system, or used for security monitoring systems such as crime prevention.
- FIG. 7 is a block diagram illustrating a configuration example of the control unit 102.
- a reference area extraction unit 171 extracts pixels of an image corresponding to the reference area from the captured image data supplied from the imaging unit 101.
- the reference area is an area corresponding to the image of an object with a known reflectance included in the photographed image; when the monitoring device 100 is mounted on an automobile, for example, the area corresponding to the image of the road surface 112 immediately in front of the automobile is the reference area.
- information relating to the position of the reference area in the captured image is registration information set in advance, and is supplied from the registration information supply unit 174 to the reference area extraction unit 171.
- the reference area extraction unit 171 calculates information on the luminance values of the pixels included in the reference area (for example, the average of the luminance values) and outputs it to the cutout processing unit 172.
- from the luminance values of the pixels included in the captured image, the cutout processing unit 172 cuts out the luminance values (the luminance width) corresponding to the pixels of the image to be displayed on the display unit 103.
- the clipping processing unit 172 assigns the average value of the luminance values of the pixels in the reference region to a predetermined gradation (for example, the nth gradation) among the gradations that can be displayed by the display unit 103.
- it then extracts from the captured image the upper limit value and the lower limit value of the pixel luminance values corresponding to the gradations that the display unit 103 can display, and outputs them to the gradation conversion unit 173.
- information indicating to which gradation the average value of the luminance values of the pixels in the reference region is to be assigned is supplied, as preset registration information or the like, from the registration information supply unit 174 to the cutout processing unit 172.
- the tone conversion unit 173 performs tone conversion of the captured image based on the upper and lower limits of the pixel luminance values supplied from the clipping processing unit 172, and outputs the tone-converted image signal to the display unit 103.
- gradation conversion may be performed by a process such as dithering or error diffusion, or by another method.
- the registration information supply unit 174 stores in advance registration information, such as information on the position of the reference region and information indicating the gradation to which the average luminance of the reference-region pixels is assigned, and supplies it to the reference area extraction unit 171 or the cutout processing unit 172.
- each unit configuring control unit 102 may be configured by hardware such as a semiconductor integrated circuit in which a logical operation unit and a storage unit for realizing the functions described above are incorporated.
- the control unit 102 may be configured by a computer, for example, and each unit described above may be configured as a function block realized by software processed by the computer.
- step S101 the control unit 102 acquires captured image data supplied from the imaging unit 101.
- in step S102, the reference region extraction unit 171 extracts the image data corresponding to the reference area in the image corresponding to the data acquired by the processing in step S101.
- step S103 the reference area extraction unit 171 calculates the average value of the luminance values of the pixels in the reference area based on the data extracted in the process of step S102.
- step S104 the clipping processing unit 172 detects the luminance value of the pixels of the entire captured image.
- in step S105, the cut-out processing unit 172 assigns the average value calculated by the processing in step S103 to a predetermined gradation (for example, the nth gradation) among the displayable gradations, and then cuts out, as the luminance width, the upper limit value and the lower limit value of the pixel luminance values corresponding to the displayable gradations.
- step S106 the gradation conversion unit 173 converts the gradation of the captured image data acquired in step S101 based on the luminance width cut out in step S105.
- step S107 the gradation conversion unit 173 outputs a signal corresponding to the image data whose gradation has been converted in the process of step S106 to the display unit 103.
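Steps S103 to S106 can be sketched as follows; the function name, the fixed luminance step per gradation, and all numeric values are assumptions of this sketch, not part of the described apparatus.

```python
def convert_gradations(image, ref_pixels, n=128, levels=256, step=0.05):
    """Hedged sketch of steps S103-S106: anchor the reference-region
    average luminance to gradation n, derive the luminance width that the
    display's `levels` gradations cover (assuming a fixed luminance step
    per gradation, an illustrative choice), then clip and map each pixel."""
    bn = sum(ref_pixels) / len(ref_pixels)      # step S103: reference average
    lower = bn - n * step                       # lower limit (value bq)
    upper = bn + (levels - 1 - n) * step        # upper limit (value bp)
    out = []
    for v in image:                             # step S106: gradation conversion
        v = min(max(v, lower), upper)           # clip to [bq, bp]
        out.append(round((v - lower) / step))
    return out

img = [0.0, 5.0, 6.4, 20.0]
ref = [6.3, 6.5]                                # reference area 201a pixels
tones = convert_gradations(img, ref)
```

Pixels below the lower limit collapse to gradation 0 and pixels above the upper limit to the maximum gradation, matching the clipping behavior described later.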
- the photographed image 201 in the figure is, for example, an image photographed in front of a car running on the road.
- in step S101, the image 201 is acquired.
- an area 201a displayed in the lower center of the image is an area corresponding to the image of the road surface 112 immediately before the automobile, and is the reference area described above.
- the reference area 201a is shown as a rectangular area, but the shape of the reference area 201a may be any shape. Further, when the monitoring device 100 is mounted on an automobile, for example, the position of the reference area 201a may be moved left and right in the image 201 in accordance with the steering angle of the steering wheel.
- since the road surface 112 is usually asphalt or the like, it is an object that should be displayed in gray of a predetermined brightness. That is, the road surface 112 is an object with a known reflectance (or known color to be displayed) included in the captured image, and it is always included in images captured by the imaging device 101 of a traveling automobile.
- therefore, the area 201a corresponding to the road surface 112 in the image is set as the reference area. Note that it is assumed that information for analyzing the images taken by the imaging unit 101 and specifying the position of the area 201a in the image 201 is stored in advance in the registration information supply unit 174.
- the reference area 201a is extracted by the reference area extraction unit 171.
- alternatively, an object whose color is known, such as a part of the vehicle body, may be arranged so that it is always captured in the photographed image, and the same processing may be performed using the image of that object as the reference region image.
- step S103 the average value bn of the luminance values of the pixels included in the reference area 201a is calculated.
- if the reference area 201a includes, for example, a bright road marking or road-surface paint such as a white line, the average luminance of the pixels in the reference area 201a may deviate from that of the road surface itself. Therefore, for example, the highest-luminance block and the lowest-luminance block among the pixel blocks in the reference area 201a may be identified in advance, and the average luminance of the reference area 201a may be obtained excluding the pixels of those blocks.
- alternatively, the reference area extraction unit 171 may be provided with a shape recognition function; based on the shape of the road surface 112 recognized by that function, portions corresponding to road-surface paint or the like are identified, and the average luminance of the pixels in the reference area 201a is calculated excluding the pixels of those portions.
- portions corresponding to road-surface paint or the like may also be identified by predetermined image processing. For example, the changes in the luminance values of the pixels in the reference area 201a are analyzed, and the edges of objects included in the image are detected by comparing the differential value at a predetermined position of the characteristic representing the luminance change with a threshold value; the pixels of portions corresponding to road-surface paint, or to objects placed on the road surface (for example, fallen leaves and dust), detected in this way may then be removed.
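The block-exclusion idea above can be sketched as follows; the block values are invented, and dropping exactly one extreme block at each end is an assumption of this sketch.

```python
def robust_reference_average(blocks):
    """Sketch of the suggestion above: drop the brightest and darkest
    pixel blocks of the reference area (e.g. white-line paint, shadows)
    before averaging, so road markings do not skew the reference luminance."""
    means = sorted(sum(b) / len(b) for b in blocks)
    kept = means[1:-1]          # exclude highest- and lowest-mean blocks
    return sum(kept) / len(kept)

# white line, road, road, shadow
blocks = [[9.0, 9.2], [5.0, 5.2], [5.1, 5.3], [1.0, 1.2]]
avg = robust_reference_average(blocks)
```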
- based on the registration information in the registration information supply unit 174, the cutout processing unit 172 assigns the average luminance of the pixels included in the reference area 201a to the gradation corresponding to, for example, gray of the predetermined brightness in which the road surface 112 should be displayed, within the gradation width W that the display unit 103 can display (for example, the gradation width from the 0th to the 255th gradation).
- since the color of a wet road surface differs from that of a dry road surface, it may be made to correspond to a gradation different from the one used for a dry road surface (the general road-surface color).
- for example, it may be configured so that the setting value of the gradation corresponding to the color (or reflectance) of the road surface in the registration information supply unit 174 is changed to a rainy-weather setting value.
- a high-luminance block and a low-luminance block in the photographed image are detected, and the average luminance of the pixels in the high-luminance block and the average luminance of the pixels in the low-luminance block may be used as the upper limit value and the lower limit value of the luminance values, respectively, to extract the luminance width.
- the gradation conversion unit 173 performs a process of converting the gradation of the image 201 based on the upper limit value bp and the lower limit value bq of the luminance value (step S106).
- the captured image data supplied from the imaging unit 101 is, for example, data corresponding to the image 201 having a wide range of 2^14 gradations, and the image 201 may include pixels whose luminance values are greater than the value bp or less than the value bq. Through gradation conversion, however, pixels with luminance values greater than bp are displayed with the same gradation as pixels whose luminance value is bp, and pixels with luminance values less than bq are displayed with the same gradation as pixels whose luminance value is bq.
- in this way, the image captured by the imaging unit 101 can be clearly displayed on the display unit 103, which has a smaller gradation width (for example, 256 gradations) than the captured image.
- moreover, since the luminance range corresponding to the gradation width displayable by the display unit 103 is cut out when an image captured by the imaging unit 101 is displayed, it is not necessary to display the entire range of luminance values of the wide-dynamic-range pixels. Even if an extremely bright (or dark) object appears in the image, the entire image is prevented from being darkened (or brightened), and a clear image can be displayed. Also, since the correspondence between pixel luminance values and gradations is determined based on the reflectance (color) of the road surface 112 of the reference area 201a, which is a known object, the image can always be displayed in natural colors.
- in step S151, illumination means (not shown) connected to and controlled by the control unit 102 irradiates the subject with light.
- for example, the road surface 112 immediately in front is irradiated by a headlight of known brightness (illuminance).
- the processing of step S151 may be performed based on an operation input by the user, or may be performed automatically in accordance with the execution of the reflectance measurement processing.
- step S152 the imaging unit 101 captures an image of the object illuminated in the process of step S151 and outputs the image to the control unit 102.
- step S153 the reference area extraction unit 171 extracts reference area data in the captured image.
- in step S154, the reference area extraction unit 171 calculates the average luminance of the pixels in the reference area and outputs it to the registration information supply unit 174. For example, by detecting at what output level the road surface 112 illuminated by a headlight of known brightness appears (is captured) in the wide-dynamic-range photographed image, the reflectance of the road surface 112 can be obtained.
- step S155 the registration information supply unit 174 calculates the reflectance of the reference area, and in step S156, determines the display gradation of the reference area.
- the registration information supply unit 174 stores a table as shown in FIG.
- in this table, the luminance value (pixel value) of the captured image, the reflectance, and the display gradation are stored in association with each other.
- the display gradation represents a predetermined gradation within the gradation width that the display unit 103 can display.
- the registration information supply unit 174 specifies the reflectance and the display gradation of the reference region based on this table, and stores them as information indicating to which gradation the average luminance of the reference-region pixels is to be assigned.
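A table of the kind described (luminance value, reflectance, display gradation) might be sketched as follows; every number in it is an invented placeholder, not a value from this description.

```python
# Hedged sketch of steps S154-S156: captured luminance levels are
# associated with reflectances and display gradations.
TABLE = [
    # (min_luminance, reflectance, display_gradation)
    (0.0, 0.05, 32),
    (4.0, 0.20, 96),
    (8.0, 0.60, 192),
]

def lookup(avg_luminance):
    """Return (reflectance, display gradation) for a reference-area
    average luminance by taking the last table row whose threshold
    does not exceed it."""
    row = TABLE[0]
    for entry in TABLE:
        if avg_luminance >= entry[0]:
            row = entry
    return row[1], row[2]

refl, tone = lookup(6.4)
```

Several such tables, one per known headlight illuminance, could be kept side by side, matching the multi-table suggestion below.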
- step S151 may be omitted and the subject may be photographed with natural light.
- however, under natural light, an object with a given reflectance is not necessarily captured as pixels with the same luminance value; since the distance between the headlight and the road surface immediately in front of the car does not vary greatly, the reflectance can be measured more accurately when the road surface is illuminated by the headlight.
- since the output level of the pixels in the photographed image differs depending on the brightness (illuminance) of the headlight or the like, a plurality of tables corresponding to a plurality of illuminances may be stored.
- as described above, the reflectance of the road surface 112 is obtained by detecting at what output level it appears (is captured) in the photographed image.
- however, depending on the mounting position of the imaging unit 101, the shape of the car, and so on, the distance between the lens 101a of the imaging unit 101 and the road surface 112 may differ, so even objects with the same reflectance may have different output levels in the captured image (an error may occur).
- a laser radar may be provided in the vicinity of the lens 101a to obtain the reflectance of the road surface 112 more accurately.
- the laser radar emits laser light toward a target, receives the reflected light with a light-receiving unit such as a photodiode, and detects the reflection of the laser light by comparing the received intensity against a preset threshold; the distance to the object is obtained from the time until the reflection is detected. Therefore, by using the laser radar, the amount of light reflected from the narrow area irradiated by the laser and the distance to that area can be measured simultaneously, so the reflectance can be obtained accurately using the captured brightness of the reflection and the distance.
- the distance may be measured by another distance measuring means such as a stereo camera.
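The time-of-flight principle used by the laser radar can be expressed in a few lines; the 200 ns round-trip time is an invented example.

```python
C = 299_792_458.0  # speed of light in m/s

def radar_distance(round_trip_seconds):
    """Time-of-flight ranging: the laser pulse travels out and back, so
    the range is half the round-trip time multiplied by the speed of light."""
    return C * round_trip_seconds / 2.0

# A reflection detected 200 ns after emission corresponds to roughly 30 m.
d = radar_distance(200e-9)
```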
- when the monitoring device 100 is used for building monitoring or the like and the object in the reference area is an object (target) whose distance is known in advance, such as an adjacent building, the distance to the object in the reference area can be set, for example, via the operation input unit 104. In this way, the reflectance is measured and the display gradation is specified.
- the case where the monitoring device 100 is mounted on an automobile or the like has been described so far.
- however, the monitoring device 100 may also be installed near a predetermined building, for example, to monitor the entrance of the building. In such a case, it is necessary to newly set a reference area and store it in the monitoring apparatus 100.
- This process is executed, for example, when the user operates the operation input unit 104 to instruct execution of the reference area storage process.
- step S181 the control unit 102 causes the display unit 103 to display an image captured by the imaging unit 101.
- the display unit 103 converts and displays the image so that pixels having luminance values between the maximum value and the minimum value of the pixel luminance values of the captured image, on which the processing described above with reference to FIG. has not yet been performed, are assigned to the displayable gradations (for example, 256 gradations).
- step S182 the control unit 102 accepts designation of a reference area in the captured image based on an operation input from the operation input unit 104.
- the user designates the reference area while viewing the image displayed on the display unit 103.
- a rectangular (which may be any shape) reference area 303 is specified in the image of the door 302 of the building 301.
- the coordinate value of the designated reference area is stored in the registration information supply unit 174 as registration information that is information related to the position of the reference area in the captured image.
- in step S183, the control unit 102 accepts display level adjustment based on the operation input from the operation input unit 104.
- the display level is a level for adjusting the color tone and the like of the image displayed on the display unit 103; an arbitrary display level is selected by the user's operation, and the image of the reference area specified in step S182 is displayed with its color tone (or density) changed based on the display level selected by the user.
- in step S184, the control unit 102 assigns the display gradation of the reference region based on the display level adjusted (selected) in step S183. For example, the control unit 102 detects with which gradation (which gradation number) the reference area is currently displayed as a result of the processing of step S183, and assigns the average luminance of the pixels in the reference area to that gradation. As a result, registration information indicating to which gradation the average value of the luminance values of the pixels in the reference region is to be assigned is generated and stored in the registration information supply unit 174.
- the reference area storage process is performed.
- in this way, the density or color in which the reference area should be displayed is determined (step S183); if the reference area is displayed so as to have that density or color, then even when the luminance of pixels outside the reference area varies greatly, an image that is easy for the user to see can always be provided. That is, by executing the processing described above with reference to FIG. 8 based on the registration information stored in the registration information supply unit 174 by this processing, the image captured by the imaging unit 101 can be clearly displayed on the display unit 103, which has a smaller gradation width (for example, 256 gradations) than the captured image.
- FIG. 14 is a diagram illustrating an example of an image that, as in conventional gradation conversion, is converted so that pixels with luminance values between the maximum and minimum pixel luminance values of a wide-dynamic-range captured image shot by the imaging unit using HDRC are assigned to the gradations displayable by the display unit, and is then displayed on a display unit whose number of display gradations is smaller than that of the captured image.
- in this image, oncoming cars and the trees of the roadside zone are displayed unclearly, and the faint color of the entire image gives the viewer an unnatural impression.
- FIG. 15 is a diagram showing an example of an image obtained by applying the gradation conversion of the present invention (for example, the processing described above with reference to FIG. 8) to a wide-dynamic-range image shot by the same HDRC imaging unit as in FIG. 14, displayed on a display unit having a smaller gradation width than the captured image.
- in this image, the oncoming car and, for example, the trees of the roadside zone are clearly displayed, and the color of the entire image gives the viewer a natural impression.
- the method described above improves the display of the intermediate region based on the reference density (or color). Since HDRC images may include high-luminance or low-luminance regions that differ significantly from the intermediate region, image processing by methods other than those described above may also be applied as a measure to improve image quality; in this case too, by using the present invention, the reference area can be displayed stably with the density, brightness, or color expected by the user.
- since a 14-bit A/D-converted wide-dynamic-range image contains a very wide luminance band, it is difficult to perform image processing, such as detecting predetermined objects included in the image or compositing images as necessary, in the same manner as conventional image processing. For example, when a Laplacian transform involving differentiation is applied to a wide-dynamic-range image in the same way as to a normal-dynamic-range image, only a noise-like result is obtained, or, because there are too many luminance candidates for the threshold of binarization processing, the amount of processing increases explosively.
- thus, because wide-dynamic-range images differ in nature from the images used in conventional image processing, achieving the same purposes as conventional image processing would require significant modifications to the processing steps. For this reason, when various kinds of image processing are performed using the image data output from the imaging unit 101, outputting the image data from the imaging unit 101 as normal-dynamic-range image data is advantageous in terms of controlling costs.
- in a CCD imaging device, a predetermined luminance range is uniformly A/D-converted by, for example, a 12-bit A/D converter, so that 4096-gradation image data is obtained.
- although the luminance range that can be photographed is very narrow compared with the logarithmic-conversion-type image sensor used in the present invention, and gradations outside the imageable luminance range cannot be obtained, because that luminance range is expressed in 4096 gradations, an image with no sense of incongruity to the human eye can be obtained.
- since the logarithmic-conversion-type image sensor used in the present invention can capture a luminance range from nighttime darkness to direct sunlight, if such a wide luminance range is expressed with 4096 gradations, the image becomes unnatural and uncomfortable to view.
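The arithmetic behind this contrast can be sketched as follows; the 60 dB figure for a CCD range is an illustrative assumption.

```python
def db_per_step(dynamic_range_db, steps=4096):
    """Luminance span (in dB) represented by one gradation step when a
    logarithmic range is divided uniformly into `steps` tones."""
    return dynamic_range_db / steps

# A narrow ~60 dB CCD range gives fine steps; spreading a ~170 dB
# log-sensor range over the same 4096 tones makes each step ~3x coarser,
# which is why the uniformly quantized image looks unnatural.
ccd_step = db_per_step(60.0)
hdrc_step = db_per_step(170.0)
```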
- the configuration of the imaging unit 101 shown in FIG. 5 may be configured as shown in FIG. 16, for example.
- FIG. 16 is a block diagram illustrating another configuration example of the imaging unit 101.
- the same parts as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted. That is, the configuration of FIG. 16 is basically the same as that of FIG. 5, except that an imaging control unit 401 is provided instead of the imaging control unit 121, a gradation assignment determining unit 411 is newly provided in the imaging control unit 401, and an A/D conversion unit 412 is provided instead of the A/D conversion unit 143.
- the light detection unit 141 converts the light from the subject imaged by the lens 101a into charge corresponding to the brightness (illuminance) of the incident light, and accumulates the converted charge.
- the light detection unit 141 supplies the accumulated charge to the logarithmic conversion unit 142 in synchronization with the control signal supplied from the imaging timing control unit 144. The converted charge may also be supplied to the logarithmic conversion unit 142 as it is, without being accumulated in the light detection unit 141.
- the logarithmic conversion unit 142 generates an analog electric signal obtained by converting the charge supplied from the light detection unit 141 into a voltage value approximately proportional to the logarithm of the number of charges (or current intensity) for each pixel.
- the logarithmic conversion unit 142 supplies the generated analog electrical signal to the A/D conversion unit 412 and also to the gradation assignment determining unit 411.
- the gradation assignment determining unit 411 analyzes the analog electrical signal supplied from the logarithmic conversion unit 142 and determines the gradation assignment for the A/D conversion executed by the A/D conversion unit 412.
- for example, the gradation assignment determining unit 411 detects the main luminance range (luminance region) in the luminance distribution of the input image, and assigns a number of gradation steps for A/D conversion sufficient for the image within that luminance range to be adequately recognized.
- one luminance range may be set, or more than one. It is also possible to assign no gradation steps to the regions between multiple ranges, or to assign them a coarser gradation than the predetermined luminance ranges.
- the luminance range to be set may be selected and set automatically from the captured image, or may be set by the user's operation input.
- FIG. 17 is a block diagram illustrating a configuration example of the gradation assignment determining unit 411 in FIG.
- the average luminance calculation unit 451 obtains the analog image signal supplied from the logarithmic conversion unit 142, calculates the average luminance, and supplies the result to the main region luminance range setting unit 452.
- the main area luminance range setting unit 452 sets the luminance range of the main region based on the average luminance of the image signal supplied from the average luminance calculation unit 451, and supplies the set luminance range of the main region to the gradation allocation calculation unit 455.
- the main area luminance range setting unit 452 sets, for example, a predetermined luminance range centered on the average luminance of the pixels of a preset portion of the image corresponding to the signal supplied from the logarithmic conversion unit 142 as the luminance range of the main region.
- alternatively, the main region luminance range setting unit 452 may select, as the luminance range of the main region, a certain number of pixels whose luminance values are close to the average luminance, starting from the pixel having the average luminance of the entire image corresponding to the signal supplied from the logarithmic conversion unit 142.
- the gradation allocation calculation unit 455 acquires the image signal supplied from the logarithmic conversion unit 142 and, based on the information on the luminance range of the main region supplied from the main region luminance range setting unit 452, determines how many gradation steps are assigned to which luminance range in the A/D conversion by the A/D conversion unit 412.
- the gradation allocation calculation unit 455 determines the gradation allocation so that a larger number of gradation steps is allocated to the luminance range set as the main region: pixels with luminance below the main region's luminance range are assigned the output level 0 (that is, black) regardless of input level, pixels with luminance above the main region's luminance range are assigned the maximum gradation value regardless of input level, and the available gradation steps are assigned within the main region's luminance range.
- in this way, the limited number of gradation steps possessed by the A/D conversion unit 412 is distributed effectively, and even if the captured image has a wide dynamic range, an image with a (natural) number of gradations can be obtained such that the luminance range corresponding to the main region and its surroundings can be recognized more easily by the user.
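The allocation rule described above (black below the main range, the maximum gradation above it, all steps spent inside it) can be sketched as follows; the linear ramp inside the main range and all numeric values are assumptions of this sketch.

```python
def allocate_gradations(luminance, main_lo, main_hi, levels=256):
    """Sketch of the gradation allocation: luminances below the main
    range map to 0 (black), above it to the maximum gradation, and the
    full step count is spent inside [main_lo, main_hi]. The linear ramp
    inside the main range is an illustrative assumption."""
    if luminance <= main_lo:
        return 0
    if luminance >= main_hi:
        return levels - 1
    frac = (luminance - main_lo) / (main_hi - main_lo)
    return int(frac * (levels - 1))

codes = [allocate_gradations(v, 10.0, 20.0) for v in (5.0, 15.0, 25.0)]
```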
- the A/D conversion unit 412 in FIG. 16 A/D-converts the analog electrical signal into digital image data in synchronization with the control signal supplied from the imaging timing control unit 144. At this time, the A/D conversion unit 412 performs the A/D conversion according to the gradation assignment determined by the gradation assignment determining unit 411, and outputs the converted digital image data.
- by configuring the imaging unit 101 as shown in FIG. 16, the imaging unit 101 can output digital image data that has been A/D-converted, based on the levels assigned by the gradation assignment determining unit 411, from a signal proportional to the logarithm of the brightness of the subject (the amount of incident light).
- that is, an analog signal corresponding to a captured wide-dynamic-range image can be converted into, and output as, digital data corresponding to a normal-dynamic-range image.
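Under the same illustrative assumptions, the combined effect of the logarithmic conversion and the gradation-assigned A/D conversion can be sketched by quantizing the logarithm of the incident light, so that a wide input dynamic range collapses into an ordinary 8-bit code range; the range bounds and bit depth below are examples only, not values taken from the patent:

```python
import math

def log_ad_convert(incident_light, main_min, main_max, steps=256):
    """Quantize incident light in the log domain, as a sensor whose
    output is proportional to log(incident light) would.

    `main_min`/`main_max` (hypothetical) bound the main region's
    luminance range; light outside that range clips to 0 or top code.
    """
    level = math.log10(incident_light)            # logarithmic conversion
    lo, hi = math.log10(main_min), math.log10(main_max)
    if level <= lo:                               # below main region: black
        return 0
    if level >= hi:                               # above main region: max code
        return steps - 1
    # Spread the codes over the main region in the log domain.
    return int((level - lo) / (hi - lo) * (steps - 1))

# A six-decade (120 dB) input range still yields ordinary 8-bit codes.
codes = [log_ad_convert(10 ** e, 10, 1000) for e in range(0, 7)]
```

Note how inputs spanning six orders of magnitude all land in the 0–255 code range, with the intermediate codes reserved for the main region's two decades.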
- a CPU (Central Processing Unit) 501 executes various kinds of processing according to a program stored in a ROM (Read Only Memory) 502 or a program loaded from a storage unit 508 into a RAM (Random Access Memory) 503.
- the RAM 503 also appropriately stores data necessary for the CPU 501 to execute various processes.
- the CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504.
- An input / output interface 505 is also connected to the bus 504.
- connected to the input/output interface 505 are an input unit 506 including a keyboard and a mouse; an output unit 507 including a display such as a CRT (Cathode Ray Tube) or LCD (Liquid Crystal Display) and a speaker; a storage unit 508 including a hard disk; and a communication unit 509 including a network interface card such as a modem or a LAN card.
- a communication unit 509 performs communication processing via a network including the Internet.
- a drive 510 is also connected to the input/output interface 505 as necessary; a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on it as appropriate, and a computer program read from the medium is installed in the storage unit 508 as necessary.
- when the series of processes is executed by software, a program constituting the software is installed from a network such as the Internet or from a recording medium such as the removable medium 511.
- as shown in FIG. 18, this recording medium may be a removable medium distributed separately from the apparatus main body in order to deliver the program to the user, such as a magnetic disk (including a floppy disk (registered trademark)), an optical disk (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini-Disc) (registered trademark)), or a semiconductor memory on which the program is recorded; or it may be the ROM 502 storing the program, or the hard disk included in the storage unit 508, delivered to the user pre-installed in the apparatus main body.
- the steps describing the series of processes described above include not only processes performed in time series in the described order but also processes that are not necessarily performed in time series, that is, processes executed in parallel or individually.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007508175A JP4985394B2 (ja) | 2005-03-15 | 2006-03-15 | 画像処理装置および方法、プログラム、並びに記録媒体 |
EP06729136A EP1868370A4 (en) | 2005-03-15 | 2006-03-15 | IMAGE PROCESSING APPARATUS AND METHOD, PROGRAM AND RECORDING MEDIUM |
US11/908,939 US20090015683A1 (en) | 2005-03-15 | 2006-03-15 | Image processing apparatus, method and program, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005073756 | 2005-03-15 | ||
JP2005-073756 | 2005-03-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006098358A1 true WO2006098358A1 (ja) | 2006-09-21 |
Family
ID=36991700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/305115 WO2006098358A1 (ja) | 2005-03-15 | 2006-03-15 | 画像処理装置および方法、プログラム、並びに記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20090015683A1 (ja) |
EP (1) | EP1868370A4 (ja) |
JP (1) | JP4985394B2 (ja) |
CN (1) | CN101142812A (ja) |
WO (1) | WO2006098358A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009246444A (ja) * | 2008-03-28 | 2009-10-22 | Nissan Motor Co Ltd | 映像表示システム及び映像表示方法 |
KR101309733B1 (ko) * | 2010-03-02 | 2013-09-17 | 아우토리브 디벨롭먼트 아베 | 자동차의 운전자 보조 시스템 및 운전자 보조 방법 |
JP2018078520A (ja) * | 2016-11-11 | 2018-05-17 | 株式会社デンソー | 画像処理装置 |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101539379B1 (ko) * | 2008-12-31 | 2015-07-29 | 주식회사 동부하이텍 | 실시간 영상 발생기 |
JP5229575B2 (ja) * | 2009-05-08 | 2013-07-03 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
US9153064B2 (en) * | 2009-12-23 | 2015-10-06 | Intel Corporation | Grouping pixels to be textured |
EP2656774B1 (en) * | 2011-01-31 | 2016-05-11 | Olympus Corporation | Fluorescence observation apparatus |
JP5260696B2 (ja) * | 2011-03-10 | 2013-08-14 | 株式会社デンソー | エッジ点抽出装置、車線検出装置、およびプログラム |
US8582873B2 (en) * | 2011-06-16 | 2013-11-12 | Tandent Vision Science, Inc. | Use of an object database in an image process |
KR101326991B1 (ko) * | 2011-12-01 | 2013-11-13 | 현대자동차주식회사 | 노면의 성질 감지 장치 및 그 방법 |
KR101361663B1 (ko) * | 2012-03-21 | 2014-02-12 | 주식회사 코아로직 | 차량용 영상 처리 장치 및 방법 |
JP6414386B2 (ja) * | 2014-03-20 | 2018-10-31 | 株式会社島津製作所 | 画像処理装置および画像処理プログラム |
JP6390512B2 (ja) * | 2015-05-21 | 2018-09-19 | 株式会社デンソー | 車載カメラ装置 |
US10257394B2 (en) | 2016-02-12 | 2019-04-09 | Contrast, Inc. | Combined HDR/LDR video streaming |
US10264196B2 (en) | 2016-02-12 | 2019-04-16 | Contrast, Inc. | Systems and methods for HDR video capture with a mobile device |
EP3497925B1 (en) * | 2016-08-09 | 2022-11-23 | Contrast, Inc. | Real-time hdr video for vehicle control |
CN109120859B (zh) * | 2017-06-26 | 2022-03-25 | 深圳光峰科技股份有限公司 | 一种影像数据处理装置及拍摄设备、显示系统 |
JP7003558B2 (ja) * | 2017-10-12 | 2022-01-20 | カシオ計算機株式会社 | 画像処理装置、画像処理方法、及びプログラム |
US10691957B2 (en) * | 2018-02-12 | 2020-06-23 | ITS Plus, Inc. | Method for increasing the accuracy of traffic cameras using optical masking technology |
US10951888B2 (en) | 2018-06-04 | 2021-03-16 | Contrast, Inc. | Compressed high dynamic range video |
CN112351260B (zh) * | 2020-08-04 | 2021-12-10 | 中煤科工集团沈阳研究院有限公司 | 掘进工作面无人值守自动可视化监控系统及监控方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005033709A (ja) * | 2003-07-11 | 2005-02-03 | Nissan Motor Co Ltd | 車両用周辺監視装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5418243A (en) * | 1977-07-12 | 1979-02-10 | Toshiba Corp | Picture display unit |
JPH05103336A (ja) * | 1991-10-11 | 1993-04-23 | Olympus Optical Co Ltd | カラー映像信号処理装置およびカラー映像信号の色補正方法 |
GB2262339B (en) * | 1991-12-13 | 1995-09-06 | Honda Motor Co Ltd | Method of inspecting the surface of a workpiece |
JP2856386B2 (ja) * | 1994-10-26 | 1999-02-10 | 松下電器産業株式会社 | 色彩調整装置及びその方法 |
JP3759280B2 (ja) * | 1997-04-15 | 2006-03-22 | 富士通株式会社 | 道路監視用事象検知装置 |
JP3724188B2 (ja) * | 1998-04-30 | 2005-12-07 | コニカミノルタホールディングス株式会社 | 固体撮像装置 |
JP4126784B2 (ja) * | 1998-10-30 | 2008-07-30 | 株式会社ニコン | 画像取り込み装置 |
JP2000307896A (ja) * | 1999-04-15 | 2000-11-02 | Toshiba Corp | 画像処理装置及び画像処理方法 |
JP4256028B2 (ja) * | 1999-07-02 | 2009-04-22 | 富士フイルム株式会社 | 圧縮符号化装置および方法 |
JP2001126075A (ja) * | 1999-08-17 | 2001-05-11 | Fuji Photo Film Co Ltd | 画像処理方法および装置並びに記録媒体 |
JP2001078090A (ja) * | 1999-09-02 | 2001-03-23 | Fuji Photo Film Co Ltd | 広ダイナミックレンジ記録画像再生装置 |
JP2001119683A (ja) * | 1999-10-20 | 2001-04-27 | Fujitsu General Ltd | 画像センサ装置 |
JP2001136492A (ja) * | 1999-11-09 | 2001-05-18 | Fuji Photo Film Co Ltd | 画像再生装置 |
JP4245782B2 (ja) * | 2000-07-03 | 2009-04-02 | オリンパス株式会社 | 撮像装置 |
JP2002044526A (ja) * | 2000-07-21 | 2002-02-08 | Minolta Co Ltd | 固体撮像装置 |
JP4281311B2 (ja) * | 2001-09-11 | 2009-06-17 | セイコーエプソン株式会社 | 被写体情報を用いた画像処理 |
JP2003101886A (ja) * | 2001-09-25 | 2003-04-04 | Olympus Optical Co Ltd | 撮像装置 |
JP4099011B2 (ja) * | 2002-07-10 | 2008-06-11 | 富士重工業株式会社 | 監視装置および監視方法 |
JP4407163B2 (ja) * | 2003-05-30 | 2010-02-03 | 株式会社ニコン | カメラシステム |
JP3994941B2 (ja) * | 2003-07-22 | 2007-10-24 | オムロン株式会社 | 車両用レーダ装置 |
JP2005057358A (ja) * | 2003-08-06 | 2005-03-03 | Nikon Corp | 電子カメラおよび画像処理プログラム |
2006
- 2006-03-15 JP JP2007508175A patent/JP4985394B2/ja not_active Expired - Fee Related
- 2006-03-15 WO PCT/JP2006/305115 patent/WO2006098358A1/ja active Application Filing
- 2006-03-15 CN CNA2006800086326A patent/CN101142812A/zh active Pending
- 2006-03-15 EP EP06729136A patent/EP1868370A4/en not_active Withdrawn
- 2006-03-15 US US11/908,939 patent/US20090015683A1/en not_active Abandoned
Non-Patent Citations (2)
Title |
---|
NAKAJIMA A.: "Wide Dynamic Range Camera [HNDC Series]", EIZO JOHO INDUSTRIAL, KABUSHIKI KAISHA SANGYO KAIHATSU KIKO, NIPPON, vol. 35, no. 1, 1 January 2003 (2003-01-01), pages 41 - 46, XP003005818 * |
See also references of EP1868370A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1868370A4 (en) | 2009-04-22 |
EP1868370A1 (en) | 2007-12-19 |
CN101142812A (zh) | 2008-03-12 |
JPWO2006098358A1 (ja) | 2008-08-28 |
US20090015683A1 (en) | 2009-01-15 |
JP4985394B2 (ja) | 2012-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4985394B2 (ja) | 画像処理装置および方法、プログラム、並びに記録媒体 | |
JP5182555B2 (ja) | 画像処理装置および画像処理方法、画像処理システム、プログラム、並びに、記録媒体 | |
JP4835593B2 (ja) | 画像処理装置および画像処理方法、プログラム、並びに、記録媒体 | |
US20090016636A1 (en) | Image processing apparatus, and method, program and recording medium | |
JP4678487B2 (ja) | 画像処理システム、画像処理装置および方法、記録媒体、並びにプログラム | |
US20060215882A1 (en) | Image processing apparatus and method, recording medium, and program | |
US9639764B2 (en) | Image recognition system for vehicle for traffic sign board recognition | |
JP4399174B2 (ja) | 自動車のディスプレイユニット、自動車の夜間視認装置および自動車の赤外線夜間視認装置を備えた車両 | |
JP6740866B2 (ja) | 画像出力装置 | |
JP6531542B2 (ja) | 画像処理システム、画像処理装置、撮像装置、画像処理方法、プログラムおよび記録媒体 | |
JP2016196233A (ja) | 車両用道路標識認識装置 | |
US10129458B2 (en) | Method and system for dynamically adjusting parameters of camera settings for image enhancement | |
US10525902B2 (en) | Convertible roof opening detection for mirror camera | |
US20060228024A1 (en) | Method for adjusting an image sensor | |
JP6593581B2 (ja) | 画質調整装置並びにカメラユニット | |
JP3385475B2 (ja) | 画像監視装置 | |
CN101142811A (zh) | 图像处理装置和图像处理方法、程序以及记录介质 | |
JP2005333248A (ja) | カメラ式車両感知器における画面の輝度調整方法及び装置 | |
JP2002268116A (ja) | 自動露出制御装置およびそのプログラムを組み込んだ外部記憶媒体 | |
EP4209990A2 (en) | Blended gray image enhancement | |
JP2007249568A (ja) | 画像処理装置および方法、記録媒体、並びに、プログラム | |
JP2022112195A (ja) | 画像表示装置 | |
JPH03276379A (ja) | 車番自動読取装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680008632.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11908939 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006729136 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: RU |
|
WWP | Wipo information: published in national office |
Ref document number: 2006729136 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2007508175 Country of ref document: JP Kind code of ref document: A |