WO2014002534A1 - Object Recognition Device - Google Patents
Object Recognition Device
- Publication number
- WO2014002534A1 (PCT/JP2013/057363)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- luminance
- area
- pixels
- value
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to an apparatus for recognizing an object to be monitored such as a living body using an image captured by an infrared camera.
- An apparatus is known in the prior art that acquires a captured image of the periphery of a vehicle with a vehicle-mounted infrared camera and recognizes a monitored object present in the camera's imaging area by extracting the image portion of the object from a binary image generated by binarizing the captured image.
- For example, in Patent Document 1, the image of a living body such as a person to be monitored is extracted from a high-luminance area (an area formed by pixels of high luminance values) in the binary image of an infrared camera's captured image.
- Under normal circumstances, when a living body such as a person to be monitored is present in the imaging region of an infrared camera mounted on a vehicle, the living body generally has a relatively higher temperature than the subjects around it in the background (the road surface, the wall surfaces of structures, and the like).
- Consequently, the image of the living body in the image captured by the infrared camera is relatively brighter than the image of the background, so under a normal environment it is possible to extract the image of a living body from a high-luminance region of the infrared camera's captured image.
- However, in some environments the temperature of the living body may be relatively lower than the temperature of the subjects in the background. In that case, the image of the living body in the image captured by the infrared camera has a relatively low luminance compared with the image of the background.
- In view of this, the inventor of the present application has studied extracting the image of a living body from a low-luminance area, constituted by pixels of relatively low luminance, in the image captured by the infrared camera.
- However, the captured image of a vehicle-mounted infrared camera usually includes an image area onto which the sky is projected. Since the sky is a low-temperature region regardless of the outside air temperature and the like, the sky image area is a low-luminance image area.
- The present invention has been made in view of the above background, and aims to provide an object recognition device that can appropriately recognize an object, such as a living body whose temperature is relatively lower than that of the background, based on an image captured by an infrared camera.
- The object recognition device of the present invention has a function of recognizing, based on an image captured by an infrared camera, an object that is present in the imaging area of the infrared camera and is relatively cooler than the background temperature.
- The device comprises: binarization processing means configured to classify each pixel remaining after excluding, from the captured image, pixels whose luminance is equal to or less than a predetermined value indicating a pixel onto which the sky is projected, into low-luminance pixels whose luminance value is equal to or lower than a binarization threshold and high-luminance pixels whose luminance is higher than the binarization threshold; and object image extracting means configured to extract the image portion of the object from an image region constituted by the low-luminance pixels in the captured image (first invention).
- Since the binarization threshold is set to a luminance value higher than the predetermined value, when an object relatively cooler than the background temperature is present in the imaging area of the infrared camera, the pixels of the object's image portion in the captured image (the image of the whole or a part of the object) can be made the low-luminance pixels of the two classes.
- The object image extracting means can therefore extract the image portion of the object from the image area constituted by the low-luminance pixels (hereinafter sometimes referred to as the low-luminance image area).
- According to the first invention, it is thus possible to appropriately recognize an object, such as a living body whose temperature is relatively lower than the background temperature, from the image captured by the infrared camera.
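As a minimal sketch of the pixel classification described above (the function name and the threshold values are illustrative assumptions, not values given in the patent; a grayscale image is held as a 2-D list of luminance values):

```python
def classify_pixels(image, y_ex, y_th):
    """Classify each pixel of an infrared image into three groups:
    'sky'  - luminance <= y_ex (the predetermined value; excluded from binarization),
    'low'  - luminance <= y_th (candidate pixels for an object cooler than its background),
    'high' - luminance >  y_th.
    y_th is the binarization threshold and must exceed y_ex."""
    assert y_th > y_ex, "binarization threshold must be higher than the sky value"
    return [["sky" if y <= y_ex else ("low" if y <= y_th else "high") for y in row]
            for row in image]

# A 1x4 toy image: one sky pixel, two cool-object pixels, one warm background pixel.
labels = classify_pixels([[20, 60, 110, 220]], y_ex=40, y_th=140)
# labels == [["sky", "low", "low", "high"]]
```

The object's image portion would then be sought only among the pixels labeled "low".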
- Here, the luminance of each pixel of the sky image area projected onto the captured image of the infrared camera is usually lower than the luminance of other image areas (image areas onto which objects are projected), but it varies somewhat depending on the condition of the sky, such as the presence or absence of clouds. In addition, when the area of the sky image area in the captured image is large, variation in the luminance of its pixels is more likely to occur than when the area is small.
- Therefore, in the first invention, it is preferable that the apparatus further comprise exclusion luminance value setting means configured to change the predetermined value in accordance with the area of the image area onto which the sky is projected in the captured image, or with the luminance representative value of that image area (second invention).
- Here, the above-mentioned luminance representative value means a representative value of the luminance of the image area onto which the sky is projected. As the representative value, for example, the average value or the maximum value of the luminance of the upper part of the captured image of the infrared camera (the part where the sky is estimated to be projected) can be used.
- According to the second invention, the predetermined value can be set appropriately by reflecting, in the predetermined value, the variation in the luminance of the pixels of the image area onto which the sky is projected.
- In this case, the exclusion luminance value setting means may be configured to set the predetermined value larger as the area of the image area onto which the sky is projected becomes larger, or as the luminance representative value of that image area becomes larger (third invention). As a result, pixels onto which the sky is projected can be excluded as much as possible from the pixels that remain after excluding those whose luminance is equal to or less than the predetermined value.
- This prevents, as much as possible, an image portion that is not the object from being extracted as the object's image portion by the processing of the object image extracting means, that is, the processing of extracting the object's image portion from the low-luminance image area. Consequently, the reliability of the processing of the object image extracting means can be enhanced.
- It is also preferable that the binarization processing means include binarization threshold setting means configured to set the binarization threshold based on a histogram showing the relationship between luminance value and pixel count in the image area constituted by the pixels remaining after excluding, from the captured image, pixels whose luminance is equal to or less than the predetermined value (fourth invention).
- According to the fourth invention, the binarization threshold can be set such that, when an object relatively cooler than the background temperature is present in the imaging area of the infrared camera, the pixels of the object's image portion in the captured image become, with high certainty, the low-luminance pixels of the two classes.
- FIG. 2 is a block diagram showing the configuration of the object recognition apparatus shown in FIG. 1;
- FIG. 7 is a view showing an example of a histogram used in the processing of STEPs 2, 6, 7 of FIG. 3;
- FIG. 5A is a view showing an example of a captured image, and
- FIG. 5B is a view showing an example of a binary image obtained by binarizing the captured image of FIG. 5A.
- the object recognition device 10 of the present embodiment is mounted on a vehicle 1.
- In addition to the object recognition device 10, the vehicle 1 is equipped with two cameras 11L and 11R constituting a stereo camera for imaging a predetermined monitoring area AR0 around the vehicle 1 (the field-angle area between straight lines L1 and L2 in FIG. 1), a display 12 for displaying information such as captured images so that the driver of the vehicle 1 can visually recognize it, and a speaker 13 for outputting audio information.
- The display 12 is composed of a liquid crystal display installed in the vehicle compartment in front of the driver's seat of the vehicle 1, a head-up display that projects an image onto the windshield of the vehicle 1 for display, or the like.
- the display 12 may be capable of appropriately displaying navigation information (map or the like), audio information and the like in addition to the captured image of the camera 11R.
- Each of the cameras 11L and 11R is an infrared camera having sensitivity in the wavelength band of the infrared region. Then, the cameras 11L and 11R output a video signal indicating the luminance of each pixel constituting each captured image.
- a monitoring area AR0 (hereinafter sometimes referred to as an imaging area AR0) captured by the cameras 11L and 11R is an area on the front side of the vehicle 1 in the present embodiment. The cameras 11L and 11R are mounted on the front of the vehicle 1 in order to capture an image of the monitoring area AR0.
- The cameras 11L and 11R are disposed side by side in the vehicle width direction (the X-axis direction in FIG. 1), in a substantially symmetrical positional relationship with respect to the central axis of the vehicle 1 (the Z axis in FIG. 1).
- the cameras 11L and 11R are attached to the front of the vehicle 1 so that the optical axes thereof are parallel to each other and the height from the road surface is equal.
- Each of the cameras 11L and 11R has the following characteristic regarding how the luminance of the captured image defined by its video signal relates to the temperature distribution over the entire subject captured in the imaging area AR0.
- This characteristic is that the luminance of the image of an arbitrary object in the imaging area AR0 (the luminance of the image in and around the area onto which the object is projected) is not a luminance according to the temperature of the object itself, but a luminance according to the relative temperature difference between the object and its background (the subjects, such as building walls or the road surface, that are present behind the object when viewed from the cameras 11L and 11R). This characteristic is hereinafter referred to as the AC characteristic.
- The image captured by each of the cameras 11L and 11R having such an AC characteristic is therefore an image in which luminance changes are emphasized at portions where a relatively significant (spatial) temperature change occurs in the overall subject of the imaging area AR0. Furthermore, in the captured image, portions of uniform temperature (portions where the temperature of each part is substantially the same) have substantially the same luminance regardless of the magnitude (absolute value) of that temperature.
- Each camera 11L, 11R of the present embodiment is a camera having such AC characteristics.
- Note that each of the cameras 11L and 11R need not itself have the AC characteristic. That is, each camera may instead have the characteristic that the luminance of each pixel defined by its video signal is a luminance according to the magnitude of the temperature of the subject projected onto that pixel (the higher the temperature, the higher the luminance). In that case, a captured image with the AC characteristic can be obtained by applying an image filtering process to the captured image defined by the video signal of each of the cameras 11L and 11R.
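The patent does not specify that filter; one plausible sketch is to subtract a local mean from each pixel, so that only relative luminance differences against the surrounding background remain (the kernel radius is an assumption):

```python
def ac_filter(image, radius=1):
    """Subtract the local mean (a box average over the (2*radius+1)-square
    neighborhood, clipped at the image borders) from every pixel. Regions
    of uniform luminance map to zero regardless of their absolute level,
    mimicking the AC characteristic described above."""
    h, w = len(image), len(image[0])
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            neigh = [image[y][x]
                     for y in range(max(0, i - radius), min(h, i + radius + 1))
                     for x in range(max(0, j - radius), min(w, j + radius + 1))]
            row.append(image[i][j] - sum(neigh) / len(neigh))
        out.append(row)
    return out

# A uniform-temperature patch yields zero everywhere, whatever its absolute level.
flat = ac_filter([[80, 80], [80, 80]])
# flat == [[0.0, 0.0], [0.0, 0.0]]
```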
- Note that one of the cameras 11L and 11R constituting the stereo camera, for example the right-side camera 11R, may hereinafter be referred to as the reference camera 11R.
- the object recognition device 10 is an electronic circuit unit configured by a CPU, a memory, an interface circuit, and the like (not shown).
- The object recognition device 10 performs predetermined control processing by having the CPU execute an installed program.
- the target object recognition device 10 recognizes a target object of a predetermined type of a monitoring target present in the imaging area AR0 based on the captured images of the cameras 11L and 11R.
- the object is a living body such as a pedestrian (person) or a wild animal.
- The object recognition device 10 detects, at each predetermined control processing cycle, the distance between the vehicle 1 and an object present in front of it, and tracks the position of the object (its position relative to the vehicle 1). Furthermore, when the object is a living body judged to possibly come into contact with the vehicle 1, the object recognition device 10 executes an alerting process to draw the attention of the driver of the vehicle 1 to the object (living body): an alert is displayed on the display 12, and an alert sound (or alert voice) is output from the speaker 13.
- the object recognition device 10 will be further described with reference to FIG.
- the image signals of the cameras 11L and 11R are input to the object recognition device 10, and detection signals of various sensors mounted on the vehicle 1 are input.
- Specifically, detection signals are input to the object recognition device 10 from a yaw rate sensor 21 for detecting the yaw rate of the vehicle 1, a vehicle speed sensor 22 for detecting the vehicle speed of the vehicle 1, a brake sensor 23 for detecting a brake operation (depression of the brake pedal) by the driver, an outside air temperature sensor 24 for detecting the outside air temperature, and a wiper sensor 25 for detecting the operating state of a wiper (not shown) of the windshield of the vehicle 1 (or the wiper's operation command signal).
- the display 12 and the speaker 13 are connected to the object recognition device 10.
- the display of the display 12 and the sound output of the speaker 13 are controlled by the object recognition device 10.
- The object recognition device 10 includes, as functions realized by executing an installed program on the CPU (functions realized by a software configuration) or by a hardware configuration (input/output circuits, arithmetic circuits, etc.): a captured image acquisition unit 31 that acquires the captured images of the cameras 11L and 11R (captured images with the AC characteristic); a binarization processing unit 32 that executes binarization processing on the captured images; an object image extraction unit 35 that uses a binary image obtained by the binarization processing to extract the image portion of an object that may be a living body (a candidate for a living body); and a contact avoidance processing unit 36 that determines whether the object whose image portion was extracted by the object image extraction unit 35 is a living body with a possibility of contact with the vehicle 1, and executes the alerting process when the determination result is affirmative.
- the binarization processing unit 32 and the object image extraction unit 35 correspond to the binarization processing unit and the object image extraction unit in the present invention, respectively.
- The binarization processing unit 32 includes the functions of an exclusion luminance value setting unit 33, which corresponds to the exclusion luminance value setting means in the present invention, and a binarization threshold setting unit 34, which corresponds to the binarization threshold setting means in the present invention.
- the object recognition device 10 recognizes an object present in the monitoring area (imaging area) AR0 in front of the vehicle 1 by executing the process shown in the flowchart of FIG. 3 every predetermined control cycle.
- the object recognition device 10 executes the process of STEP 1 by the captured image acquisition unit 31.
- the captured image acquisition unit 31 acquires captured images of the cameras 11L and 11R.
- Specifically, the captured image acquisition unit 31 causes each of the cameras 11L and 11R to capture an image of the imaging area AR0. Then, by A/D-converting the video signal output from each camera according to the imaging, it acquires, for each of the cameras 11L and 11R, a captured image (with the AC characteristic) represented by digitized luminance values of each pixel. The captured image acquisition unit 31 then stores and holds the acquired captured images of the cameras 11L and 11R in an image memory (not shown).
- In this case, the captured image acquisition unit 31 stores and holds in the image memory a plurality of captured images covering a period up to a predetermined time in the past, including the latest captured image.
- Note that when the cameras 11L and 11R themselves do not have the AC characteristic, it suffices for the captured image acquisition unit 31 to acquire captured images with the AC characteristic by applying an image filtering process to the captured image defined by each infrared camera's video signal.
- the object recognition device 10 executes the processing of STEP 2 to 4 by the binarization processing unit 32 and the object image extraction unit 35.
- STEPs 2 and 3 are processes of the binarization processing unit 32.
- the binarization processing unit 32 executes the processing of the binarization threshold setting unit 34.
- the binarization threshold value setting unit 34 sets a first binarization threshold value Yth1 for binarizing the captured image of the reference camera 11R.
- Specifically, the binarization threshold setting unit 34 sets the first binarization threshold Yth1 based on a histogram (hereinafter referred to as the first luminance histogram) indicating the relationship between the luminance value of each pixel of the image captured by the reference camera 11R and the number of pixels (frequency).
- The first binarization threshold Yth1 is a threshold set such that, when the temperature of a living body such as a person to be monitored present in the imaging area AR0 of the cameras 11L and 11R is higher than the temperature of the subjects in the living body's background, the luminance of the living body's image portion in the captured images of the cameras 11L and 11R is higher than Yth1.
- In the present embodiment, the first binarization threshold Yth1 is set by the so-called P-tile method based on the first luminance histogram. That is, in the first luminance histogram, Yth1 is set such that the total number of pixels whose luminance is equal to or greater than Yth1 equals a predetermined ratio of the total number of pixels of the captured image. For example, when the first luminance histogram is as illustrated in FIG. 4, Yth1 in the drawing is set as the first binarization threshold.
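A sketch of this P-tile selection from the high-luminance side (the toy histogram and the ratio are illustrative; the patent does not disclose the predetermined ratio):

```python
def p_tile_threshold_high(hist, ratio):
    """Return the smallest luminance t such that the pixels with
    luminance >= t amount to at least `ratio` of all pixels.
    hist[v] is the pixel count at luminance value v."""
    target = ratio * sum(hist)
    cumulative = 0
    for t in range(len(hist) - 1, -1, -1):
        cumulative += hist[t]
        if cumulative >= target:
            return t
    return 0

# Toy 8-level histogram: mostly dark background, a few bright (warm) pixels.
yth1 = p_tile_threshold_high([40, 30, 10, 8, 5, 4, 2, 1], ratio=0.1)
# yth1 == 4: the brightest ~10% of pixels lie at luminance 4 and above.
```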
- Next, the binarization processing unit 32 generates a first binary image by binarizing the captured image of the reference camera 11R with the first binarization threshold Yth1 set as described above. Specifically, the pixels of the image captured by the reference camera 11R are classified into two types: high-luminance pixels whose luminance value is Yth1 or more, and low-luminance pixels whose luminance value is smaller than Yth1. The first binary image is then generated by setting the high-luminance pixels as white pixels and the low-luminance pixels as black pixels.
- In this case, when a living body such as a person to be monitored is present in the imaging area AR0 of the cameras 11L and 11R and the temperature of the living body is higher than that of the background subjects (the normal case), the image portion of the living body becomes a local white area in the first binary image.
- the first binary image may be generated by setting a pixel having a high luminance value of Yth1 or more as a black pixel and setting a pixel having a luminance lower than Yth1 as a white pixel.
- The next STEP 4 is processing of the object image extraction unit 35. In STEP 4, the object image extraction unit 35 extracts the image portion of an object that is a candidate for a living body from the white area in the first binary image (the image area formed by white pixels, i.e., pixels with a high luminance value of Yth1 or more).
- Here, the image portion of the object to be extracted is an image portion whose features (for example, its vertical and horizontal widths, the ratio of those widths, its height from the road surface, its average luminance value, and its luminance variance) fall within predetermined ranges (ranges set on the assumption that the object is a living body such as a person or a wild animal).
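A hypothetical version of this feature check (the dictionary keys and every numeric range below are placeholders chosen for illustration, not values disclosed in the patent):

```python
def is_living_body_candidate(region):
    """Accept an extracted image portion only when its geometric and
    luminance features fall inside ranges plausible for a person or a
    wild animal. `region` carries estimated real-world sizes in metres
    and 8-bit luminance statistics."""
    checks = [
        0.3 <= region["width"] <= 1.2,                       # horizontal width
        0.8 <= region["height"] <= 2.2,                      # vertical width
        1.0 <= region["height"] / region["width"] <= 5.0,    # aspect ratio
        region["road_clearance"] <= 0.5,                     # height from road surface
        region["lum_variance"] >= 10.0,                      # some internal texture
    ]
    return all(checks)

pedestrian = {"width": 0.6, "height": 1.7, "road_clearance": 0.0, "lum_variance": 55.0}
# is_living_body_candidate(pedestrian) -> True
```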
- Note that when the first binary image is generated by setting pixels with a high luminance value of Yth1 or more as black pixels and pixels with luminance lower than Yth1 as white pixels, the image portion of an object as a living-body candidate may be extracted from the black region of the first binary image.
- the object recognition device 10 executes the determination process of STEP5.
- In this determination processing, the object recognition device 10 determines the current environmental condition. Specifically, it determines whether at least one of the following holds: the condition that the current outside air temperature is a high temperature equal to or above a predetermined temperature, and the condition that the current weather is rainfall.
- the object recognition device 10 determines whether the outside air temperature is a high temperature equal to or higher than a predetermined temperature based on the detected value of the outside air temperature by the outside air temperature sensor 24.
- the object recognition device 10 determines whether it is raining or not based on the operating condition of the wiper indicated by the output of the wiper sensor 25 (or the operation command signal of the wiper). Specifically, the object recognition device 10 determines that it is raining when the wiper is in operation, and determines that it is not raining when the wiper is not in operation.
- whether it is raining may be detected using a raindrop sensor.
- weather information may be received by communication to recognize whether it is raining or not.
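The STEP 5 decision described above can be sketched as a simple predicate (the temperature threshold is a placeholder; the patent only says "a predetermined temperature"):

```python
def needs_low_luminance_extraction(outside_temp_c, wiper_active,
                                   rain_detected=False,
                                   high_temp_threshold_c=30.0):
    """True when the low-luminance extraction path (STEPs 6 to 9) should
    run: the outside air temperature is at or above the predetermined
    high-temperature threshold, or rainfall is inferred from wiper
    operation (or reported by a raindrop sensor / weather information)."""
    raining = wiper_active or rain_detected
    return outside_temp_c >= high_temp_threshold_c or raining

# Hot day, wipers off -> True; mild day, wipers off -> False.
```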
- Here, under a normal environment (for example, an environment where the outside air temperature is not particularly high), the temperature of a living body such as a person (pedestrian) is higher than the temperature of the objects around it, such as the road surface. Therefore, when a living body appears in the captured images (AC-characteristic captured images) of the cameras 11L and 11R, which are infrared cameras, the luminance of the living body's image is usually higher than the luminance of the images of the background subjects (the road surface, building walls, and the like).
- However, in some environments the temperature of a living body such as a person (pedestrian) may be lower than the temperature of the objects around it. In that case, the luminance of the living body's image portion in the captured images of the cameras 11L and 11R is lower than the luminance of the images of the background subjects (the road surface, building walls, and the like). Consequently, the image portion of the living body becomes a black region (a region of low luminance value) in the first binary image, so it cannot be extracted in STEP 4.
- Therefore, when the determination result of STEP 5 is affirmative, the object recognition device 10 further executes the processing of STEPs 6 to 9 by the binarization processing unit 32 and the object image extraction unit 35 to extract the image portion of an object that is a candidate for a living body.
- STEPs 6 to 8 are processes of the binarization processing unit 32.
- the binarization processing unit 32 executes the processing of the exclusion luminance value setting unit 33.
- the exclusion luminance value setting unit 33 sets the exclusion luminance value Yex used to exclude the pixel on which the sky is projected from the image to be binarized from the captured image of the reference camera 11R.
- the exclusion luminance value Yex corresponds to the "predetermined value" in the present invention.
- a sky image is usually projected on the captured image of the cameras 11L and 11R.
- the temperature of the sky is generally lower than that of other objects (road surface, structure, living body, etc.).
- the luminance value of each pixel of the whole or most of the image area where the sky is projected in the captured image of the cameras 11L and 11R is lower than a certain luminance value.
- the exclusion luminance value Yex is basically a luminance value set such that the luminance value of a pixel on which sky is projected in the captured image of the cameras 11L and 11R is equal to or less than Yex.
- the luminance value of each pixel in the image area where the sky is projected varies to some extent due to the influence of the presence or absence of a cloud or the area of the sky image area. For example, in the case where a cloud image is projected on the sky image area, the luminance value is higher than in the case where no cloud image is projected.
- Therefore, in the present embodiment, the exclusion luminance value setting unit 33 sets the exclusion luminance value Yex variably. Specifically, the average value or the maximum value of the luminance in the portion near the upper edge of the captured image of the reference camera 11R (the image area where the sky is estimated to be projected) is calculated as the luminance representative value of that portion.
- In addition, the number of pixels having a luminance value equal to or less than a preset predetermined value is calculated as the sky-area pixel count, which schematically represents the sky area in the captured image. Alternatively, the boundary between the sky image area and the image areas of other objects may be detected by a method such as edge extraction, and the number of pixels in the area enclosed by that boundary may be calculated as the sky-area pixel count.
- The exclusion luminance value Yex is then set from the luminance representative value and the sky-area pixel count described above, based on a preset map or arithmetic expression. In this case, Yex is set so that it becomes a larger value as the luminance representative value becomes larger, and so that it becomes a larger value as the sky-area pixel count increases (that is, as the area onto which the sky is projected in the captured image becomes larger).
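A sketch of this variable setting, replacing the preset map with a simple monotone expression (the function names, the number of top rows, and all coefficients are assumptions for illustration, not patent values):

```python
def sky_luminance_representative(image, top_rows=8):
    """Average luminance of the rows near the upper edge of the image,
    where the sky is assumed to be projected (the patent also allows
    using the maximum instead of the average)."""
    values = [y for row in image[:top_rows] for y in row]
    return sum(values) / len(values)

def exclusion_luminance(sky_pixel_count, total_pixels, lum_rep,
                        base=20.0, area_gain=30.0, lum_gain=0.5):
    """Yex grows with the fraction of the image occupied by sky and with
    the sky's representative luminance, as the third invention requires."""
    return base + area_gain * (sky_pixel_count / total_pixels) + lum_gain * lum_rep
```

With these placeholder gains, a larger sky area or a brighter (e.g. cloudy) sky both raise Yex, so sky pixels keep being excluded from binarization.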
- the binarization processing unit 32 executes the processing of the binarization threshold setting unit 34.
- Next, the binarization threshold setting unit 34 sets a second binarization threshold Yth2 for binarizing the image obtained by excluding, from the captured image of the reference camera 11R, the pixels whose luminance value is equal to or less than the exclusion luminance value Yex (pixels that can be considered to have the sky projected onto them); this image is hereinafter referred to as the sky-area-excluded image. The sky-area-excluded image is, in other words, an image composed of the pixels of the reference camera 11R's captured image whose luminance value is higher than the exclusion luminance value Yex.
- the second binarization threshold Yth2 corresponds to the binarization threshold in the present invention.
- Specifically, the binarization threshold setting unit 34 sets the second binarization threshold Yth2 based on a histogram (hereinafter referred to as the second luminance histogram) representing the relationship between the luminance value of each pixel of the sky-area-excluded image and the number of pixels (frequency). Note that, as illustrated in FIG. 4, the second luminance histogram is the portion of the first luminance histogram that remains after excluding the portion where the luminance value is equal to or less than Yex.
- The second binarization threshold Yth2 is a threshold set such that, when the temperature of a living body such as a person to be monitored present in the imaging area AR0 of the cameras 11L and 11R is lower than the temperature of the subjects in the living body's background, the luminance of the living body's image portion in the captured images of the cameras 11L and 11R is lower than Yth2.
- Like the first binarization threshold Yth1, the second binarization threshold Yth2 is set by the P-tile method.
- However, Yth2 is set based on the second luminance histogram instead of the first. That is, in the second luminance histogram, Yth2 is set such that the total number of pixels whose luminance value is equal to or below Yth2 equals a predetermined ratio of the total number of pixels of the captured image. In this case, Yth2 is a luminance value higher than the exclusion luminance value Yex.
- For example, when the second luminance histogram is as exemplified in FIG. 4, the value Yth2 (> Yex) shown in the drawing is set as the second binarization threshold.
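The P-tile selection of Yth2 described above can be sketched as follows. This is a minimal pure-Python sketch: the function name, the 8-bit gradation range, and the fallback behavior are assumptions, not the device's implementation; the pixel ratio is taken against the whole captured image, as the text specifies.

```python
def p_tile_threshold(luminances, total_pixels, ratio, y_ex):
    """Pick the smallest threshold Yth2 > Yex such that the number of
    pixels with luminance <= Yth2 reaches `ratio` of `total_pixels`.

    `luminances` are the pixel values of the sky-area-excluded image
    (all > Yex); `total_pixels` is the pixel count of the full
    captured image."""
    target = ratio * total_pixels
    # Build the second luminance histogram (8-bit gradation assumed).
    hist = [0] * 256
    for y in luminances:
        hist[y] += 1
    cumulative = 0
    for level in range(y_ex + 1, 256):
        cumulative += hist[level]
        if cumulative >= target:
            return level
    return 255  # fallback: classify every remaining pixel as low-luminance
```

Because the cumulative count starts above Yex, the returned Yth2 is guaranteed to be higher than the exclusion luminance value, as the text requires.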
- Next, the binarization processing unit 32 generates a second binary image by binarizing the sky-area-excluded image using the second binarization threshold Yth2 set as described above.
- Specifically, the pixels of the sky-area-excluded image are classified into two types: low-luminance pixels whose luminance value is equal to or below Yth2, and high-luminance pixels whose luminance value is above Yth2. The sky-area-excluded image is thereby binarized.
- The second binary image is generated by setting the low-luminance pixels as white pixels and the high-luminance pixels as black pixels.
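This binarization of the sky-area-excluded image can be sketched as follows; the 255/0 encoding of white/black, the nested-list image representation, and the function name are illustrative assumptions. Pixels at or below Yex (i.e. outside the sky-area-excluded image) are forced to black here, matching the treatment the text describes for the second binary image.

```python
def binarize_second(image, yth2, y_ex, white=255, black=0):
    """Generate the second binary image: within the sky-area-excluded
    image (luminance > Yex), low-luminance pixels (<= Yth2) become
    white and high-luminance pixels (> Yth2) become black; sky pixels
    (<= Yex) are forced to black."""
    return [[white if y_ex < y <= yth2 else black for y in row]
            for row in image]
```

For example, with Yex = 30 and Yth2 = 100, a row of luminances [10, 60, 200] yields [black, white, black]: the sky pixel and the hot background pixel both end up black, leaving only the cool candidate region white.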
- Under the situation where the determination result in STEP 5 is affirmative, when a living body such as a person to be monitored is present in the imaging area AR0 of the cameras 11L and 11R and the temperature of the living body is lower than that of the subjects in its background, the image portion of the living body appears as a local white area in the second binary image.
- For example, suppose a captured image of the reference camera 11R (or the camera 11L) as illustrated in FIG. 5(a) is obtained.
- Because the temperature of the person (pedestrian) in the imaging area AR0 is lower than that of the subjects in the background, the image portion of the person appears dark, as shown in FIG. 5(a).
- The first luminance histogram and the second luminance histogram of this captured image are the histograms shown in FIG. 4.
- In this case, the sky-area-excluded image is binarized using the second binarization threshold Yth2 shown in FIG. 4, and a second binary image as shown in FIG. 5(b) is generated.
- In this second binary image, the image portion of the person (pedestrian) shown in FIG. 5(a) is obtained as a local white region.
- Note that in the second binary image, the pixels whose luminance value is equal to or below the exclusion luminance value Yex (pixels considered to show the projected sky), that is, the pixels outside the sky-area-excluded image, are forcibly set to black pixels.
- Alternatively, the second binary image may be generated with the low-luminance pixels (luminance value of Yth2 or less) as black pixels and the high-luminance pixels (luminance value above Yth2) as white pixels.
- It is also possible to generate an inverted image by inverting the luminance level of each pixel of the sky-area-excluded image before binarization, and then to classify each pixel of this inverted image, using a threshold obtained by inverting Yth2 (hereinafter referred to as the inversion threshold), into pixels whose luminance value is equal to or above the inversion threshold and pixels whose luminance value is below it, thereby generating a binary image in which one of the two pixel types is white and the other is black.
- The above-mentioned inverted image is obtained by subtracting the luminance value Y of each pixel of the sky-area-excluded image from the maximum luminance value (for example, 255 in the case of 8-bit gradation).
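The luminance inversion described above, and the corresponding inversion threshold, can be sketched as follows (assuming 8-bit gradation; the names are illustrative). Note that Y <= Yth2 holds exactly when 255 - Y >= 255 - Yth2, so classifying the inverted image against the inversion threshold yields the same two pixel classes as binarizing the original image against Yth2.

```python
MAX_Y = 255  # maximum luminance for 8-bit gradation, as in the text

def invert(image):
    """Inverted image: each luminance Y is replaced by MAX_Y - Y."""
    return [[MAX_Y - y for y in row] for y_row in [None] for row in image]

def inversion_threshold(yth2):
    """Threshold obtained by inverting Yth2."""
    return MAX_Y - yth2
```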
- Next, in STEP 9, the object recognition device 10 executes the processing of the object image extraction unit 35.
- The object image extraction unit 35 extracts the image portion of an object as a living-body candidate from the white regions (image regions formed by white pixels, that is, pixels with luminance values of Yth2 or less) in the second binary image.
- The extraction processing of STEP 9 is performed in the same manner as that of STEP 4. Since the image portion of a living body appears as a white region in the second binary image, the image portion of the object can be extracted by the same program processing as in STEP 4.
- When the second binary image is generated with low-luminance pixels (luminance value of Yth2 or less) as black pixels and high-luminance pixels (luminance value above Yth2) as white pixels, the image portion of an object as a living-body candidate may instead be extracted from the black regions of that binary image.
- Next, the object recognition device 10 executes the determination processing of STEP 10, in which it is determined whether an object as a living-body candidate has been extracted by the processing of STEPs 2 to 9.
- The object recognition device 10 then executes the processing of STEP 11 by means of the contact avoidance processing unit 36.
- In STEP 11, the contact avoidance processing unit 36 executes, for the object (living-body candidate) extracted by the object image extraction unit 35, processing similar to that described in Patent Document 1, for example: calculation of the real-space position of the object, identification of whether the object is a living body to be monitored, and determination of whether the object may come into contact with the vehicle 1.
- Briefly, the contact avoidance processing unit 36 estimates the distance between the object and the host vehicle 1 by stereo ranging based on the parallax of the image portions of the object in the captured images of the cameras 11L and 11R. Furthermore, it estimates the real-space position of the object (its relative position with respect to the host vehicle 1) from the estimated distance and the position of the image portion of the object in the captured image of the reference camera 11R.
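The stereo ranging mentioned above can be sketched with the standard pinhole-stereo relation Z = f * B / d; this is a minimal sketch assuming ideal rectified cameras, and the function name and example calibration values are assumptions, not the device's actual parameters.

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Distance Z from the standard rectified-stereo relation
    Z = f * B / d, where f is the focal length in pixels, B the
    baseline between cameras 11L and 11R in meters, and d the
    parallax (disparity) of the object's image portion in pixels."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: object at infinity or mismatch")
    return focal_px * baseline_m / disparity_px
```

For instance, with an assumed 700-pixel focal length and 0.3 m baseline, a 10-pixel parallax corresponds to roughly 21 m.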
- When the estimated real-space position of the object is within the contact determination area AR1 set in the imaging area AR0 as shown in FIG. 1, the contact avoidance processing unit 36 determines that there is a possibility of future contact between the object (for example, the object indicated by P1 in FIG. 1) and the vehicle 1.
- The contact avoidance processing unit 36 also determines that such contact is possible when the real-space position of the object is within the entry determination areas AR2 and AR3, which are set outside the left and right of the contact determination area AR1 in the imaging area AR0 as shown in FIG. 1 (for example, the objects indicated by P2 and P3 in FIG. 1), and the direction of the movement vector of the object points toward the contact determination area AR1.
- Here, the entry determination areas AR2 and AR3 are set as the areas obtained by removing the contact determination area AR1 from the portion of the imaging area AR0 whose distance from the host vehicle 1 is equal to or less than the distance value Z1.
- The direction of the movement vector of the object is identified, for example, from a time series of the estimated real-space positions of the object from a predetermined time earlier up to the present.
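The movement vector obtained from such a time series can be sketched as follows; this is a minimal sketch in hypothetical (x, z) road-plane coordinates (lateral offset and distance ahead), and the coordinate convention and function name are assumptions.

```python
def movement_vector(positions):
    """Movement vector of an object from a time series of estimated
    real-space positions (x, z), oldest first: the displacement from
    the position a predetermined time ago to the current position."""
    (x0, z0), (x1, z1) = positions[0], positions[-1]
    return (x1 - x0, z1 - z0)
```

A vector whose lateral component points toward the vehicle's path (and whose z component is negative, i.e. approaching) would indicate entry toward the contact determination area AR1.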
- The contact determination area AR1 and the entry determination areas AR2 and AR3 are areas bounded in the height direction of the host vehicle 1 (at or below a predetermined height greater than the height of the vehicle); an object present above this predetermined height is determined to have no possibility of future contact with the vehicle 1.
- Next, for each object determined to possibly come into contact with the vehicle 1 in the future, the contact avoidance processing unit 36 identifies (determines) whether the object is a living body such as a person.
- This identification is based on features such as the shape, size, and luminance distribution of the image portion of the object in the captured image of the reference camera 11R (more specifically, the image portion extracted by the object image extraction unit 35 and determined by the contact avoidance processing unit 36 to possibly contact the vehicle 1 in the future), for example by the method described in Patent Document 1.
- When the object is determined not to be a person, it may further be determined whether the object is a wild animal such as a quadruped.
- The contact avoidance processing unit 36 then executes the above-mentioned alerting processing for any object that may come into contact with the host vehicle 1 and is identified as a living body such as a person.
- Specifically, the contact avoidance processing unit 36 causes the display 12 to display the captured image of the reference camera 11R and controls the display 12 so as to highlight, in that image, the image of the object (the living body that may contact the vehicle 1).
- For example, the contact avoidance processing unit 36 highlights the image of the object by surrounding it with a frame of a predetermined color in the captured image displayed on the display 12, or by blinking that frame.
- The contact avoidance processing unit 36 also controls the speaker 13 so that it outputs an alarm sound (or voice) indicating that a living body that may contact the vehicle 1 is present in the imaging area (monitoring area) AR0.
- In this way, a visual alarm and an audible alarm regarding the living body that may contact the vehicle 1 are given to the driver, raising the driver's attention to that living body.
- The driver can then perform a driving operation (such as braking) that appropriately avoids contact between the living body and the host vehicle 1.
- whether or not to execute the alerting process may be selected according to, for example, the depression amount of the brake pedal or the deceleration degree of the vehicle 1.
- In this embodiment, the driver's attention is drawn by a visual notification via the display 12 and an auditory notification via the speaker 13.
- However, only one of these notifications may be given, or the driver's attention may be drawn by a tactile notification such as vibrating the driver's seat.
- Further, when the brake system of the vehicle 1 is configured so that its braking force can be adjusted, by hydraulic control or the like, beyond the braking force corresponding to the brake pedal operation, the braking force of the brake system may be automatically increased in addition to drawing the driver's attention.
- As described above, in the present embodiment, the processing of STEPs 6 to 9 extracts from the second binary image the image portion of an object as a living-body candidate (an object at a lower temperature than the background subjects).
- The binarization that generates the second binary image is executed, using the second binarization threshold Yth2 (a luminance value higher than Yex), on the image obtained by excluding from the captured image of the reference camera 11R the image area in which the sky is projected (pixels whose luminance value is equal to or below the exclusion luminance value Yex).
- The image of the object in the captured image of the reference camera 11R can therefore take a luminance value equal to or below the second binarization threshold Yth2.
- As a result, an object as a living-body candidate whose temperature is relatively lower than the background can be appropriately extracted from the white regions (regions formed by pixels with luminance values equal to or below Yth2) in the second binary image.
- Moreover, the exclusion luminance value Yex is set variably to reflect the fact that the luminance values of the sky image area vary with, for example, the presence or absence of clouds and with the area the sky occupies in the captured image. Yex can therefore be set so that the pixels of the actual sky image area are excluded from the sky-area-excluded image as far as possible (in other words, so that all or most pixels of the actual sky image area have a luminance value of Yex or less).
- Consequently, an object as a living-body candidate that is relatively cooler than the background subjects can be extracted with higher certainty from the white regions (regions formed by pixels with luminance values equal to or below Yth2) of the second binary image generated by binarizing the sky-area-excluded image with the second binarization threshold Yth2.
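The variable setting of the exclusion luminance value Yex (larger as the sky image area or its representative luminance grows, as in claims 2 and 3) could be sketched as a simple monotone rule; the linear form, base value, and gain constants below are purely illustrative assumptions, not the patent's formula.

```python
def exclusion_luminance(sky_area_ratio, sky_luminance_rep,
                        base=20, area_gain=30, luminance_gain=0.5):
    """Variable exclusion luminance value Yex: set larger as the sky
    image area (as a fraction of the captured image) is larger, or as
    the representative luminance of the sky image area is larger.
    Base value and gains are illustrative assumptions."""
    y_ex = base + area_gain * sky_area_ratio + luminance_gain * sky_luminance_rep
    return min(int(y_ex), 255)  # clamp to 8-bit range
```

Any concrete rule would do, provided it preserves the monotonicity the claims describe: more sky, or brighter sky, never decreases Yex.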
- Note that the processing of STEPs 6 to 9 may be executed before the processing of STEPs 2 to 4, or in parallel with it using multiple CPUs or time-division processing.
- In the embodiment described above, a system provided with the two cameras 11L and 11R constituting a stereo camera was illustrated; however, the system may include only a single camera (infrared camera).
- In that case, the distance between an object such as a living body in the camera's captured image and the vehicle 1 may be measured by a separate distance measuring device such as a radar device.
- Alternatively, the distance between the object and the vehicle 1 may be estimated from the temporal rate of change of the size of the object's image portion in the time series of the camera's captured images.
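The monocular alternative just mentioned relies on the image size of an object scaling inversely with its distance: under an assumed constant relative approach speed v over an interval dt, w_prev / w_now = Z_now / Z_prev gives Z_now = v * dt / (w_now / w_prev - 1). The sketch below implements this relation; all names and the constant-speed assumption are illustrative, not the device's method.

```python
def distance_from_scale_change(w_prev, w_now, rel_speed, dt):
    """Monocular distance estimate from the temporal change rate of
    the object's image size: image width w scales as 1/Z, so with a
    relative approach speed `rel_speed` over interval `dt`,
        Z_now = rel_speed * dt / (w_now / w_prev - 1)."""
    scale = w_now / w_prev
    if scale <= 1.0:
        raise ValueError("image not growing: object not approaching")
    return rel_speed * dt / (scale - 1.0)
```

For example, if the image width grows from 10 to 12 pixels over 0.5 s while closing at 10 m/s, the object is estimated to be about 25 m away.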
- In the above embodiment, the processing of STEPs 2 to 4 is also performed to extract the image portion of an object relatively hotter than the background subjects.
- However, the processing of STEPs 2 to 4 may be omitted, and only the processing of STEPs 6 to 9 performed to extract the image portion of a (relatively low-temperature) object.
- the object may be an object other than a living body.
- The exclusion luminance value Yex may be a constant value in situations where the luminance of the sky image area is known to remain substantially constant.
- Alternatively, the exclusion luminance value Yex may be changed as appropriate according to parameters other than the representative luminance value or the area of the sky image region.
- Furthermore, the image portion of an object such as a living body may be extracted from the region of the captured image of the camera 11L or 11R constituted by pixels having a luminance value equal to or above the first binarization threshold Yth1, or from the region constituted by pixels having a luminance value equal to or below the second binarization threshold Yth2.
- the system in which the object recognition device 10 and the cameras 11L and 11R are mounted on the vehicle 1 has been described as an example.
- the present invention can also be applied to the case where a camera (infrared camera) for acquiring a captured image is installed at a predetermined place such as a road or an entrance of a facility.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (4)
- An object recognition device having a function of recognizing, based on a captured image of an infrared camera, an object that is present in the imaging region of the infrared camera and has a temperature relatively lower than that of the background, the device comprising:
binarization processing means configured to classify each pixel of the captured image, excluding pixels of luminance equal to or below a predetermined value indicating pixels onto which the sky is projected, into low-luminance pixels having a luminance value equal to or below a binarization threshold set higher than the predetermined value and high-luminance pixels having a luminance value higher than the binarization threshold; and
object image extraction means configured to extract the image portion of the object from the image region of the captured image constituted by the low-luminance pixels.
- The object recognition device according to claim 1, further comprising exclusion luminance value setting means configured to set the predetermined value so that it varies according to the area of the image region of the captured image onto which the sky is projected or according to a representative luminance value of that image region.
- The object recognition device according to claim 2, wherein the exclusion luminance value setting means is configured to set the predetermined value larger as the area of the image region onto which the sky is projected is larger, or as the representative luminance value of that image region is larger.
- The object recognition device according to claim 1, wherein the binarization processing means includes binarization threshold setting means configured to set the binarization threshold based on a histogram representing the relationship between luminance value and number of pixels in the image region of the captured image constituted by the pixels other than those whose luminance is equal to or below the predetermined value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014522450A JP5809751B2 (ja) | 2012-06-26 | 2013-03-15 | 対象物認識装置 |
US14/408,437 US20150169980A1 (en) | 2012-06-26 | 2013-03-15 | Object recognition device |
CN201380026765.6A CN104335244A (zh) | 2012-06-26 | 2013-03-15 | 对象物识别装置 |
DE112013003276.7T DE112013003276T5 (de) | 2012-06-26 | 2013-03-15 | Objekterkennungseinrichtung |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012143562 | 2012-06-26 | ||
JP2012-143562 | 2012-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014002534A1 true WO2014002534A1 (ja) | 2014-01-03 |
Family
ID=49782714
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/057363 WO2014002534A1 (ja) | 2012-06-26 | 2013-03-15 | 対象物認識装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150169980A1 (ja) |
JP (1) | JP5809751B2 (ja) |
CN (1) | CN104335244A (ja) |
DE (1) | DE112013003276T5 (ja) |
WO (1) | WO2014002534A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103797528A (zh) * | 2011-09-28 | 2014-05-14 | 本田技研工业株式会社 | 生物体识别装置 |
CN105096333A (zh) * | 2015-09-06 | 2015-11-25 | 河海大学常州校区 | 一种森林火灾红外热成像图像分割方法 |
WO2017063128A1 (zh) * | 2015-10-12 | 2017-04-20 | 深圳市大疆创新科技有限公司 | 喷洒质量检测装置、系统、方法以及采样辅助装置 |
CN105678310B (zh) | 2016-02-03 | 2019-08-06 | 北京京东方多媒体科技有限公司 | 红外热图像轮廓提取方法及装置 |
KR102597435B1 (ko) * | 2016-04-20 | 2023-11-03 | 엘지이노텍 주식회사 | 영상 취득 장치 및 그 방법 |
JP2018036937A (ja) * | 2016-09-01 | 2018-03-08 | 住友電気工業株式会社 | 画像処理装置、画像処理システム、画像処理プログラムおよびラベル |
JP6765353B2 (ja) * | 2017-07-28 | 2020-10-07 | 公益財団法人鉄道総合技術研究所 | 構造物検査システム、構造物検査装置及び構造物検査方法 |
EP3620765B1 (en) | 2018-09-10 | 2020-11-04 | Axis AB | Method and system for filtering thermal image data |
WO2020183698A1 (ja) * | 2019-03-14 | 2020-09-17 | 株式会社Fuji | 対象体判定方法、対象体判定装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006101384A (ja) * | 2004-09-30 | 2006-04-13 | Nissan Motor Co Ltd | 人物検出装置及び方法 |
JP2006099611A (ja) * | 2004-09-30 | 2006-04-13 | Nissan Motor Co Ltd | 人物検出装置及び方法 |
JP2011170499A (ja) * | 2010-02-17 | 2011-09-01 | Honda Motor Co Ltd | 車両の周辺監視装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3987013B2 (ja) * | 2003-09-01 | 2007-10-03 | 本田技研工業株式会社 | 車両周辺監視装置 |
JP2007279076A (ja) * | 2006-04-03 | 2007-10-25 | Fuji Xerox Co Ltd | 画像形成装置、画像形成部構成部材製造方法および弾性部材 |
JP4410292B1 (ja) * | 2008-10-20 | 2010-02-03 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP4482599B2 (ja) * | 2008-10-24 | 2010-06-16 | 本田技研工業株式会社 | 車両の周辺監視装置 |
-
2013
- 2013-03-15 CN CN201380026765.6A patent/CN104335244A/zh active Pending
- 2013-03-15 WO PCT/JP2013/057363 patent/WO2014002534A1/ja active Application Filing
- 2013-03-15 JP JP2014522450A patent/JP5809751B2/ja not_active Expired - Fee Related
- 2013-03-15 US US14/408,437 patent/US20150169980A1/en not_active Abandoned
- 2013-03-15 DE DE112013003276.7T patent/DE112013003276T5/de not_active Withdrawn
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017224935A (ja) * | 2016-06-14 | 2017-12-21 | 株式会社日立ソリューションズ | 物体検知評価システム及び物体検知評価方法 |
JP2021033835A (ja) * | 2019-08-28 | 2021-03-01 | 株式会社Jvcケンウッド | 対象物認識装置、対象物認識方法及びプログラム |
JP7259644B2 (ja) | 2019-08-28 | 2023-04-18 | 株式会社Jvcケンウッド | 対象物認識装置、対象物認識方法及びプログラム |
WO2021234782A1 (ja) * | 2020-05-18 | 2021-11-25 | 日本電信電話株式会社 | 画像処理装置、方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20150169980A1 (en) | 2015-06-18 |
CN104335244A (zh) | 2015-02-04 |
JP5809751B2 (ja) | 2015-11-11 |
JPWO2014002534A1 (ja) | 2016-05-30 |
DE112013003276T5 (de) | 2015-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014002534A1 (ja) | 対象物認識装置 | |
US10501088B2 (en) | Road surface state determination apparatus, imaging apparatus, imaging system, and road surface state determination method | |
JP4173901B2 (ja) | 車両周辺監視装置 | |
JP4456086B2 (ja) | 車両周辺監視装置 | |
JP4482599B2 (ja) | 車両の周辺監視装置 | |
JP5760090B2 (ja) | 生体認識装置 | |
JP5687702B2 (ja) | 車両周辺監視装置 | |
JP4173902B2 (ja) | 車両周辺監視装置 | |
JP4528283B2 (ja) | 車両周辺監視装置 | |
WO2010007718A1 (ja) | 車両周辺監視装置 | |
JP2014056295A (ja) | 車両周辺監視装置 | |
KR101236234B1 (ko) | 레이저 센서와 카메라를 이용한 차선 인식 시스템 | |
JP4813304B2 (ja) | 車両周辺監視装置 | |
JP4887540B2 (ja) | 車両周辺監視装置、車両、車両周辺監視用プログラム、車両周辺監視方法 | |
US9064158B2 (en) | Vehicle surroundings monitoring device | |
JP5345992B2 (ja) | 車両周辺監視装置 | |
JP4765113B2 (ja) | 車両周辺監視装置、車両、車両周辺監視用プログラム、車両周辺監視方法 | |
JP5642785B2 (ja) | 車両の周辺監視装置 | |
JP2012053663A (ja) | 物体種別判定装置 | |
JP2008028478A (ja) | 障害物検出システム、及び障害物検出方法 | |
JP4627305B2 (ja) | 車両周辺監視装置、車両周辺監視方法、及び車両周辺監視用プログラム | |
JP5149918B2 (ja) | 車両の周辺監視装置 | |
JP2001204011A (ja) | 動物検知装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13810558 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014522450 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14408437 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112013003276 Country of ref document: DE Ref document number: 1120130032767 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13810558 Country of ref document: EP Kind code of ref document: A1 |