WO2014042168A1 - Image processing device, object detection method, and object detection program - Google Patents

Image processing device, object detection method, and object detection program

Info

Publication number
WO2014042168A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
raindrop
distance
affected
Prior art date
Application number
PCT/JP2013/074462
Other languages
English (en)
Japanese (ja)
Inventor
渡邉慎
梶谷浩一郎
田中信頼
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2014042168A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491 Details of non-pulse systems
    • G01S7/4912 Receivers
    • G01S7/4915 Time delay measurement, e.g. operational details for pixel components; Phase measurement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/95 Lidar systems specially adapted for specific applications for meteorological use
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491 Details of non-pulse systems
    • G01S7/493 Extracting wanted echo signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • The present invention relates to a technique for detecting an object located in a detection area by processing a distance image of the detection area, captured by irradiating the area with light and receiving the reflected light.
  • Known fall prevention fences include a type with a slide door that slides horizontally to open and close at a position facing a door (vehicle door) of a train stopped at a station platform (see Patent Document 1), and a type in which fixed columns are erected on both sides of an entrance/exit and a plurality of stop bars are stretched between movable columns that slide vertically with respect to the fixed columns (see Patent Document 2).
  • the fall prevention fence is temporarily opened after the train stops at the station platform in order to secure a passage for passengers.
  • the train departs after passengers get on and off and the fall prevention fence is closed.
  • the presence or absence of an object located between the train and the fall prevention fence is detected by a sensor or the like.
  • When an object is detected between the train and the fall prevention fence after passengers have finished boarding and alighting and the door of the train and the fall prevention fence have been closed, the train is not started until station staff or the like have confirmed the situation.
  • From the viewpoint of preventing blind spots where the presence or absence of an object cannot be detected, a configuration using an image sensor is more common than one using transmission type or reflection type photoelectric sensors.
  • pixels affected by raindrops may occur during rain.
  • The output of raindrop-affected pixels is noise. Therefore, as the number of raindrop-affected pixels increases, so does the possibility that an object is erroneously detected (that is, that an object is judged to be present even though none exists).
  • An object of the present invention is to provide a technique capable of sufficiently suppressing erroneous detection of an object due to generation of raindrop-affected pixels affected by raindrops during rain.
  • the image processing apparatus of the present invention is configured as follows in order to solve the above-described problems and achieve the object.
  • the image acquisition unit acquires, as a pair of captured images, a distance image of the detection area and a received light intensity image captured by the imaging device irradiating the detection area with light and receiving the reflected light.
  • The image acquisition unit need only be configured to acquire at least a distance image of the detection area.
  • The imaging device may be, for example, a known TOF (Time Of Flight) camera, or may be configured with a light source that emits laser light, a light receiving element that receives the reflected light, and a scanning unit that scans the laser light emitted from the light source over the detection area.
  • the imaging device may be configured so that the distance image of the detection area and the received light intensity image can be captured at the same timing.
  • A TOF camera can obtain a distance image with a single lens, and is attracting attention because it can be made smaller and at lower cost than stereo image processing methods.
  • the image acquisition unit may be configured to acquire at least a distance image of the detection area (not acquire a received light intensity image).
  • the imaging device may be configured to capture a distance image of the detection area.
  • The raindrop-affected pixel determination unit determines, for each pixel of the distance image and the received light intensity image acquired as a pair of captured images by the image acquisition unit, whether the pixel is a raindrop-affected pixel, using the distance and the received light intensity acquired at that pixel.
  • Raindrop-affected pixels include, for example, rain pixels that have detected reflected light from raindrops, and raindrop-attached pixels affected by raindrops adhering to the imaging lens (pixels that have received reflected light transmitted through a raindrop adhering to the imaging lens).
  • The pixel referred to in the present invention may be a single light receiving element (1 pixel) of the imaging element, or a block composed of a plurality of adjacent light receiving elements (for example, two light receiving elements each in the vertical and horizontal directions, i.e. 2 × 2 pixels).
  • The image processing apparatus according to the present invention processes the distance image and the received light intensity image in units of the pixels referred to here (not necessarily in units of individual light receiving elements).
  • Reflected light from raindrops detected during rainfall comes from a relatively close position.
  • Reflected light whose received intensity differs greatly from that of raindrops is unlikely to have been reflected by a raindrop and is likely to have been reflected by an object. Therefore, the determination accuracy can be improved by determining whether a pixel is a rain pixel, in which reflected light from a raindrop was detected, using the distance and the received light intensity acquired at that pixel.
  • The inventors of the present application experimentally confirmed the tendencies of distance and received light intensity for raindrop-attached pixels affected by raindrops adhering to the imaging lens. It was confirmed that the received light intensity of a raindrop-attached pixel decreases, and that for most raindrop-attached pixels the amount of decrease falls within a certain range. This is considered to be caused by attenuation of the reflected light as it passes through a raindrop adhering to the imaging lens. It was also confirmed that for most raindrop-attached pixels the change in distance falls within a certain range. This is considered to occur because the reflected light is refracted as it passes through a raindrop adhering to the imaging lens, so that light that would have been received by one pixel shifts to an adjacent or nearby pixel.
  • By using these tendencies as determination criteria, the determination accuracy can be improved.
  • The tendencies of the change in distance and the change in received light intensity for raindrop-attached pixels affected by raindrops adhering to the imaging lens are considered to vary with factors such as the characteristics of the image acquisition unit and the imaging environment, so it is preferable to set the criterion for determining whether a pixel is a raindrop-attached pixel by checking these tendencies when the apparatus is installed.
  • The image acquisition unit is preferably configured to acquire the distance image and the received light intensity image of the detection area as a pair of captured images (that is, the imaging device is preferably capable of capturing the distance image and the received light intensity image of the detection area at the same timing).
  • the object detection unit processes the detection distance image using the determination result of the raindrop-affected pixel determination unit, and detects an object imaged in the detection distance image.
  • By not treating raindrop-affected pixels as pixels that detected reflected light from an object, the object detection unit can detect an object located in the detection area without being affected by raindrop-affected pixels, which are noise. That is, erroneous detection of an object due to the occurrence of raindrop-affected pixels during rainfall can be sufficiently suppressed.
  • The reference image storage unit may be configured to store a reference distance image and a reference received light intensity image of the detection area.
  • The raindrop-affected pixel determination unit determines whether each pixel is a raindrop-affected pixel using the distance image and the received light intensity image captured by the imaging device and acquired by the image acquisition unit as a pair of detection images.
  • the object detection unit may be configured to generate a difference image between the reference distance image and the distance image of the detection image and detect the imaged object.
  • The raindrop-affected pixel determination unit may be configured to determine, for each pixel, whether the pixel is a rain pixel in which reflected light from a raindrop was detected, based on the difference in received light intensity calculated from the received light intensity of that pixel in the reference received light intensity image and in the received light intensity image of the detection image.
  • Alternatively, the raindrop-affected pixel determination unit may be configured to determine, for each pixel, whether the pixel is a rain pixel based on both the distance acquired at that pixel in the distance image of the detection image and the difference in received light intensity calculated from the received light intensity of that pixel in the reference received light intensity image and in the received light intensity image of the detection image.
  • The reflectance of a raindrop varies depending on the reflectance of the object (including the background) located behind it. Therefore, by using the difference in received light intensity calculated from the received light intensity of a pixel in the reference received light intensity image and in the received light intensity image of the detection image, the determination of whether the pixel is a rain pixel can take into account the reflectance of the object (including the background) located behind the raindrop. Accordingly, the accuracy of determining whether a pixel is a rain pixel in which reflected light from a raindrop was detected can be further improved, and as a result, the accuracy of detecting an object located in the detection area can be further improved.
  • A pixel for which the difference in received light intensity between the reference received light intensity image and the received light intensity image of the detection image falls within a set range (the range within which the amount of decrease in received light falls) may be determined to be a raindrop-attached pixel.
  • Whether a pixel is a raindrop-attached pixel may be determined using either the difference in distance or the difference in received light intensity, but it is preferable to use both.
  • The reference image generation unit may be configured to generate the reference distance image and the reference received light intensity image, as a pair of reference images, from one or more frames of distance images and received light intensity images captured by the imaging device at a predetermined reference image acquisition timing and acquired by the image acquisition unit.
  • A rain state determination unit may be provided that determines whether it is raining based on the number of pixels that the raindrop-affected pixel determination unit has determined to be raindrop-affected pixels.
  • The object detection unit may be configured so that, when the rain state determination unit determines that it is raining, the size of an object to be detected is larger than when it determines that it is not raining. In this way, erroneous detection of an object can be sufficiently suppressed even if raindrop-affected pixels are somewhat concentrated during rainfall.
  • The object detection unit may also set the size of an object to be detected according to the number of pixels that the raindrop-affected pixel determination unit has determined to be raindrop-affected pixels, that is, according to the amount of rainfall.
  • The number of raindrop-affected pixels increases as the rainfall increases. Therefore, it is possible not only to prevent raindrop-affected pixels, which are noise, from being erroneously detected as an object located in the detection area when rainfall is relatively heavy, but also to suppress the situation in which, when rainfall is relatively light, a plurality of adjacent pixels that detected reflected light from an object are erroneously determined to be raindrop-affected pixels and an object located in the detection area is consequently missed.
  • the object detection method according to the present invention is an invention that causes a computer to execute processes corresponding to the configurations of the above-described image acquisition unit, raindrop-affected pixel determination unit, and object detection unit.
  • The object detection program according to the present invention is an invention that, when installed in a computer, causes the computer to execute processes corresponding to the configurations of the above-described image acquisition unit, raindrop-affected pixel determination unit, and object detection unit.
  • According to the present invention, it is possible to sufficiently suppress erroneous detection of an object due to the occurrence of raindrop-affected pixels affected by raindrops during rainfall.
  • In the example described below, a fall prevention fence is installed along the side edge of a station platform to prevent passengers from falling from the platform onto the track.
  • The space between the fall prevention fence and the train is the detection area.
  • The image processing apparatus detects an object located in this detection area.
  • FIG. 1 is a schematic diagram showing a station platform where a fall prevention fence is installed.
  • FIG. 1A is an overhead view of the station platform
  • FIG. 1B is a plan view of the station platform viewed from the track side.
  • the fall prevention fence includes a housing 1 that functions as a door pocket, and a sliding door 2 that is slidably attached to the housing 1.
  • FIG. 1 shows a state in which the slide door 2 is closed.
  • The slide door 2 is provided at a position facing each door of a train that stops at the station platform. When the slide door 2 is opened, it is housed in the housing 1 (door pocket).
  • the slide door 2 slides in the left-right direction in FIG.
  • the detection area where the image processing apparatus according to this example detects an object is between the fall prevention fence and the track at the position where the slide door 2 is provided.
  • The image processing apparatus is provided for each slide door 2.
  • FIG. 2 is a block diagram showing the configuration of the main part of the image processing apparatus according to this example.
  • the image processing apparatus 10 includes a control unit 11, an image sensor unit 12, an image processing unit 13, and an output unit 14.
  • the image processing apparatus 10 can use an information processing apparatus such as a personal computer having the above-described configuration as hardware.
  • the information processing apparatus used as hardware executes the processing described later (processing according to the flowcharts shown in FIGS. 5, 6, 7, and 11) by installing the object detection program referred to in the present invention.
  • the control unit 11 controls the operation of each part of the image processing apparatus 10 main body.
  • the image sensor unit 12 has a TOF (Time Of Flight) camera that captures a distance image of the detection area and a received light intensity image by irradiating the detection area with infrared light and receiving the reflected light.
  • This TOF camera corresponds to the imaging device referred to in the present invention.
  • the TOF camera has a light source that irradiates infrared light to a detection area (imaging area) and an imaging element (n ⁇ m pixel imaging element) in which n ⁇ m light receiving elements are arranged in a matrix.
  • the TOF camera measures the time (flight time) from irradiating infrared light to a detection area until receiving reflected light for each pixel.
  • the TOF camera obtains the distance to the object (the reflection surface of the light-reflected target object) by measuring the phase difference between the light applied to the detection area and the received reflected light.
  • The light emitted from the light source is intensity-modulated.
  • the modulation phase is shifted according to the propagation distance.
  • the phase of the irradiated light is monitored by directly receiving a part of the light from the light source by a part of the light receiving element, and the deviation from the phase of the light received as the reflected light is obtained.
  • According to the known principle for obtaining the phase shift, received light signals (A0, A2) are sampled every T/2 with respect to the modulation period T of the irradiation light, and received light signals (A1, A3) are sampled at timings further shifted by T/4 (see FIG. 3).
  • The phase shift amount φ corresponding to the propagation distance is calculated based on these received light signals (A0, A1, A2, A3).
  • The distance D to the object can then be obtained from the phase shift φ obtained here.
  • For each light receiving element (pixel), the received light signals A0, A1, A2, and A3 described above are obtained from the charge accumulated in each T/4 period.
  • The “pixel” here is a block of light receiving elements (pixels) serving as the unit for the image processing that obtains the phase shift and distance and detects an object; it may be a single light receiving element (1 pixel) or a block composed of a plurality of adjacent light receiving elements (for example, two light receiving elements each in the vertical and horizontal directions, i.e. 2 × 2 pixels).
  • the image processing apparatus processes the distance image and the received light intensity image in units of pixels referred to herein (not necessarily in units of light receiving elements).
  • Since the amount of light received in one modulation period of the irradiation light is too small, the exposure time of the camera is set appropriately and the charge accumulated over that period is used to calculate the phase shift and obtain the distance to the object.
  • By collecting the distance information of every pixel (meaning a block serving as the unit of image processing, as described above), a distance image is obtained in which the distance to the reflecting surface is associated with each pixel.
  • Similarly, by collecting the charge accumulated over a predetermined period (a plurality of modulation cycles) for each pixel, the TOF camera obtains a received light intensity image in which the intensity of the reflected light (amount of reflected light) received at each pixel is associated with that pixel.
  • the TOF camera can obtain a distance image of the detection area and a received light intensity image captured at the same exposure timing (exposure period).
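  • For illustration only (this sketch is not part of the patent disclosure), the following Python/NumPy code shows one common convention for recovering a distance image and a received light intensity image from the four sampled signals A0 to A3 of such a TOF camera; the arctangent convention and the modulation frequency f_mod are assumptions, as the document does not specify them.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def tof_demodulate(a0, a1, a2, a3, f_mod=10.0e6):
    """Demodulate the four charge buckets A0..A3 (arrays, one value per
    pixel, sampled T/4 apart) into distance and intensity images.
    f_mod is an assumed modulation frequency (not given in the patent)."""
    # Phase shift of the reflected light relative to the irradiated light
    # (one common 4-bucket convention; FIG. 3's convention may differ).
    phi = np.mod(np.arctan2(a3 - a1, a0 - a2), 2.0 * np.pi)

    # Round trip takes t = phi / (2*pi*f_mod); one-way distance D = c*t/2.
    distance = C * phi / (4.0 * np.pi * f_mod)

    # Amplitude of the modulated component, used as received light intensity.
    intensity = 0.5 * np.hypot(a3 - a1, a0 - a2)
    return distance, intensity
```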
  • Instead of a TOF camera, an imaging device having a light source that emits laser light, a light receiving element that receives the reflected light, and a scanning unit that scans the laser light emitted from the light source over the detection area may be used.
  • the imaging device included in the image sensor unit 12 may be configured so that the distance image of the detection area and the received light intensity image can be captured at the same exposure timing (exposure period).
  • This TOF camera can capture, for example, about 5 to 10 pairs of captured images (a distance image and a received light intensity image) per second.
  • FIG. 4 is a diagram showing a mounting example of the TOF camera.
  • The TOF camera is attached near the top of the housing 1 so that the detection area shown in FIG. 1 falls within the imaging area, with its imaging direction directed obliquely downward.
  • The TOF camera is attached on the track side of the slide door 2.
  • As long as the detection area fits within the imaging area, the TOF camera need not be attached to the housing 1 of the fall prevention fence as shown in FIG. 4; it may instead be attached to a column or the like standing on the station platform.
  • The image processing unit 13 processes the distance image and the received light intensity image of the detection area captured at the same timing by the image sensor unit 12 either as a pair of detection images (a detection distance image and a detection received light intensity image) or as material for generating a pair of reference images (a reference distance image and a reference received light intensity image).
  • the image processing unit 13 includes a memory (not shown) that stores the generated pair of reference images. This memory has a configuration corresponding to the reference image storage unit referred to in the present invention.
  • the image processing unit 13 processes the pair of detection images using the pair of reference images stored in the memory, and detects an object located in the detection area. Details of the object detection process will be described later.
  • the output unit 14 outputs the object detection result in the image processing unit 13 to a connected fall prevention fence or alarm device.
  • The device that receives the detection result sounds a warning or turns on a warning light to notify station staff that an object has been detected.
  • FIG. 5 is a flowchart showing the operation of the image processing apparatus.
  • the fall prevention fence closes the slide door 2 when the train is not stopped at the station platform.
  • the image processing apparatus 10 executes the process shown in FIG. 5 every time the train stops at the station platform.
  • The image processing apparatus 10 executes a reference image acquisition process (s1), in which a reference distance image and a reference received light intensity image (a pair of reference images) are generated from a pair of captured images (a distance image and a received light intensity image) captured by the TOF camera of the image sensor unit 12 at this timing (corresponding to the first timing referred to in the present invention).
  • This reference image acquisition process uses a pair of captured images captured by the TOF camera in a state where the door of the train stopped at the station platform and the slide door 2 of the fall prevention fence are closed. That is, the pair of reference images can be used as a background image in which no object exists in the detection area. Further, the pair of reference images are images with a background of a train stopped at the station platform.
  • the door of the train stopped at the station platform and the slide door 2 of the fall prevention fence are opened, and passengers can get on and off the train.
  • the door of the train stopped at the station platform and the slide door 2 of the fall prevention fence are closed.
  • At the timing when the door of the train stopped at the station platform and the slide door 2 of the fall prevention fence are closed (corresponding to the second timing referred to in the present invention), the image processing apparatus 10 performs a detection image acquisition process for acquiring a pair of captured images (distance image and received light intensity image) captured by the TOF camera of the image sensor unit 12 as a pair of detection images (s2).
  • The image processing apparatus 10 performs an object detection process for detecting an object located in the detection area using the pair of reference images generated and acquired in s1 and the pair of detection images acquired in s2 (s3).
  • Step s3 detects whether an object that is not imaged in the pair of reference images acquired in s1 is imaged in the pair of detection images acquired in s2. Therefore, structures such as the train stopped at the station platform, support posts installed on the platform, and the fall prevention fence are not detected as objects in s3.
  • the image processing apparatus 10 outputs the detection result of the object detection process relating to s3 in the output unit 14 (s4).
  • the device to which the detection result is input performs a warning notification and notifies the station staff, the train driver, and the like to that effect. If the object located in the detection area is not detected in the object detection process in the image processing apparatus 10, the train driver starts the train from the station platform. On the other hand, if an object located within the detection area is detected, the station staff confirms and then starts a train from the station platform.
  • An operation management system or the like that manages train operation may be configured to instruct the image processing apparatus 10 on the start timing of the reference image acquisition process of s1 and the start timing of the detection image acquisition process of s2.
  • Alternatively, these start timing instructions may be input by the train driver or station staff.
  • FIG. 6 is a flowchart showing the reference image acquisition process.
  • the image sensor unit 12 captures a pair of captured images (distance image and received light intensity image) with a TOF camera for a preset number of frames (for example, 5 frames) (s11).
  • The image processing unit 13 generates a reference distance image using the distance images of the frames captured in s11 (s12). In s12, a distance image is generated in which, for each pixel, the average of the distances of that pixel over the frames is associated with the pixel, and this is acquired as the reference distance image.
  • The image processing unit 13 generates a reference received light intensity image using the received light intensity images of the frames captured in s11 (s13).
  • In s13, for each pixel, a received light intensity image is generated in which the average received light intensity of that pixel over the frames is associated with the pixel, and this is acquired as the reference received light intensity image.
  • The image processing unit 13 stores the reference distance image generated in s12 and the reference received light intensity image generated in s13 in an image memory (not shown) as a pair of reference images (s14).
  • Since the reference distance image is generated from the distance images of a plurality of frames, and the reference received light intensity image is generated from the received light intensity images of those frames, a reference distance image and a reference received light intensity image in which the influence of noise is suppressed can be generated and acquired.
  • Alternatively, the set number of frames may be one, in which case the image processing unit 13 stores a single distance image and received light intensity image captured by the image sensor unit 12 in the image memory as the reference distance image and the reference received light intensity image.
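  • A minimal sketch of the per-pixel averaging of s12 and s13 above (hypothetical Python/NumPy; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def generate_reference_images(captured_pairs, n_frames=5):
    """Per-pixel averaging over the captured frames (s12/s13).

    captured_pairs : list of (distance_image, intensity_image) pairs,
                     each a 2-D array captured at the reference timing.
    n_frames       : preset frame count (5 in the text's example).
    """
    pairs = captured_pairs[:n_frames]
    reference_distance = np.mean([d for d, _ in pairs], axis=0)   # s12
    reference_intensity = np.mean([i for _, i in pairs], axis=0)  # s13
    return reference_distance, reference_intensity  # stored as a pair (s14)
```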
  • The detection image acquisition process of s2 captures a pair of captured images (a distance image and a received light intensity image) with the TOF camera and acquires them as a pair of detection images (a detection distance image and a detection received light intensity image).
  • FIG. 7 is a flowchart showing the object detection process.
  • The image processing unit 13 performs, for each pixel of the detection image, a rain pixel determination process that determines whether the pixel is a pixel that received reflected light from raindrops (hereinafter referred to as a rain pixel) (s21).
  • Here, too, the “pixel” of a rain pixel means a block of light receiving elements (pixels) serving as the unit of image processing.
  • This block may be a single light receiving element (1 pixel) of the imaging element, or a block composed of a plurality of adjacent light receiving elements (for example, two light receiving elements each in the vertical and horizontal directions, i.e. 2 × 2 pixels).
  • the rain pixel determination process related to s21 is performed by any of the following methods (1) to (3).
  • In one method, a determination straight line for determining whether a pixel is a rain pixel is defined in advance as a linear function of distance and received light intensity.
  • Each pixel is plotted using the distance associated with it in the detection distance image (horizontal axis in FIG. 8(B)) and the received light intensity associated with it in the detection received light intensity image (vertical axis in FIG. 8(B)).
  • Pixels located in the hatched area of FIG. 8(B) are determined to be rain pixels, and pixels not located in the hatched area are determined not to be rain pixels.
  • The reflectance of a raindrop changes depending on the reflectance of the reflecting surface located behind it.
  • In another method, therefore, a determination straight line for determining whether a pixel is a rain pixel is defined in advance as a linear function of distance and received light intensity difference. Each pixel is then plotted using the distance associated with it in the detection distance image (horizontal axis in FIG. 8(C)) and the absolute value of the difference between its received light intensity in the detection received light intensity image and in the reference received light intensity image (vertical axis in FIG. 8(C)).
  • A pixel located in the area indicated by hatching in FIG. 8(C) is determined to be a rain pixel, and a pixel not located in that area is determined not to be a rain pixel.
  • The area to be determined as rain pixels depends on the imaging environment of the detection area and the imaging characteristics of the image sensor unit 12 (for example, the pixel density of the TOF camera and the focal length of the imaging lens), and is therefore adjusted when the image processing apparatus 10 is installed.
  • the light reception intensity difference is the absolute value of the difference in the light reception intensity of the corresponding pixels in the reference light reception intensity image and the detection light reception intensity image as described above.
  • The absolute value is used in order to cover the case where a pixel of the reference received light intensity image is a rain pixel that detected reflected light from a raindrop while the corresponding pixel of the detection received light intensity image did not detect reflected light from a raindrop. Note that a pixel that detected reflected light from raindrops in both the reference and detection received light intensity images is highly likely to be judged as background in the difference image generation process described later.
  • the accuracy of the rain pixel determination process is improved in the order of (1), (2), and (3).
  • the calculation amount increases in the order of (1), (2), and (3), so the processing time increases. Which of the above methods (1) to (3) is used for the rain pixel determination process may be determined in consideration of accuracy and processing time.
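  • For illustration only, a sketch of what appear to be methods (1) and (3) (hypothetical Python/NumPy; the line coefficients and the side of the determination line that counts as rain are assumptions, since the patent leaves them to be adjusted at installation):

```python
import numpy as np

def rain_pixels_by_line(dist, inten, a=-0.5, b=400.0):
    """Method (1)-style check: a pixel is judged a rain pixel when its
    (distance, received intensity) point falls in the hatched region of
    FIG. 8(B), modeled here as one side of the line inten = a*dist + b.
    The coefficients a, b are illustrative values only."""
    return inten < a * dist + b  # boolean mask of rain pixels

def rain_pixels_by_line_and_diff(dist, inten, ref_inten, a=-0.5, b=400.0):
    """Method (3)-style check: use the distance together with the absolute
    intensity difference from the reference image (FIG. 8(C)), which
    accounts for the reflectance of whatever lies behind the raindrop."""
    delta = np.abs(inten - ref_inten)
    return delta > a * dist + b  # hatched side of the determination line
```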
  • Next, the image processing unit 13 performs, for each pixel of the detection image, a raindrop-attached pixel determination process that determines whether the pixel is a pixel affected by raindrops attached to the imaging lens (hereinafter referred to as a raindrop-attached pixel) (s22).
  • The raindrop-attached pixel determination process of s22 is performed by any of the following methods (4) to (6).
  • As described above, the inventors of the present application experimentally confirmed the tendencies of the changes in distance and received light intensity for raindrop-attached pixels affected by raindrops attached to the imaging lens. It was confirmed that the received light intensity of a raindrop-attached pixel decreases, and that for most raindrop-attached pixels the amount of decrease falls within a certain range. This is considered to be caused by attenuation of the reflected light as it passes through the raindrop attached to the imaging lens.
  • FIG. 9(A) shows the measured amount of change in received light intensity for pixels affected by raindrops attached to the imaging lens.
  • the bar graph indicates the number of pixels with respect to the amount of change in received light intensity.
  • the line graph shows the cumulative ratio (percentage) of pixels with respect to the amount of change in received light intensity in raindrop-affected pixels.
  • the amount of change in the received light intensity referred to here depends on the environment in which the inventor conducted the experiment, and it goes without saying that the numerical value varies depending on the environment.
  • FIG. 9(B) shows the measured amount of change in distance for pixels affected by raindrops attached to the imaging lens.
  • the bar graph indicates the number of pixels with respect to the amount of change in distance.
  • the line graph indicates a cumulative ratio (percentage) of pixels with respect to the distance change amount in the raindrop-affected pixels.
  • As shown in FIG. 9(B), it was confirmed that about 73% of raindrop-attached pixels had a distance change of 350 mm or less. It was also confirmed that the percentage of raindrop-attached pixels whose distance change was 150 mm or less was about 3%.
  • the amount of change in distance referred to here depends on the environment in which the inventor conducted the experiment, and it goes without saying that the numerical value varies depending on the environment.
  • By using these tendencies as determination criteria, the determination accuracy can be improved.
  • In one method, a pixel whose change in received light intensity (or, alternatively, whose change in distance) falls within the corresponding set range is determined to be a raindrop-attached pixel, and the other pixels are determined not to be raindrop-attached pixels.
  • In another method, a pixel for which the absolute value of the difference in received light intensity is equal to or less than P1 shown in FIG. 10(C), and the absolute value of the difference in distance is between Dmin and Dmax shown in FIG. 10(C), is determined to be a raindrop-attached pixel, and the other pixels are determined not to be raindrop-attached pixels.
  • P1, Dmin, and Dmax, which define the area to be determined as raindrop-attached pixels, vary with the imaging environment of the detection area and the imaging characteristics of the image sensor unit 12 (for example, the pixel density of the TOF camera and the focal length of the imaging lens), and are therefore adjusted when the image processing apparatus 10 is installed.
  • The accuracy of this raindrop-attached pixel determination process improves in the order of (4), (5), and (6).
  • The reason for using the absolute value of the difference in received light intensity and the absolute value of the difference in distance of the corresponding pixels in the raindrop-attached pixel determination process is the same as in the rain pixel determination process described above: to cover the case where a pixel of the reference received light intensity image is a raindrop-attached pixel while the corresponding pixel of the detection received light intensity image is not.
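  • A sketch of a method (6)-style check (hypothetical Python/NumPy; P1 is a pure assumption, and the 150/350 mm bounds merely echo the experiment of FIG. 9(B) — all three thresholds must be adjusted at installation, as the text notes):

```python
import numpy as np

def raindrop_attached_pixels(dist, ref_dist, inten, ref_inten,
                             p1=120.0, d_min=150.0, d_max=350.0):
    """A pixel is judged a raindrop-attached pixel when the absolute
    intensity difference is at most P1 AND the absolute distance
    difference lies between Dmin and Dmax (cf. FIG. 10(C))."""
    d_inten = np.abs(inten - ref_inten)
    d_dist = np.abs(dist - ref_dist)
    return (d_inten <= p1) & (d_min <= d_dist) & (d_dist <= d_max)
```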
  • Next, the image processing unit 13 performs a rain state determination process for determining whether it is raining (s23). In s23, if the total number of pixels determined to be rain pixels in s21 and pixels determined to be raindrop-attached pixels in s22 is equal to or greater than a predetermined threshold pixel count S, it is determined that it is raining; if the total is less than S, it is determined that it is not raining.
  • the image processing unit 13 determines that it is not raining in s23, it sets the lower limit of the size of the object to be detected to A1 (s24). If the image processing unit 13 determines that it is raining in s23, it sets the lower limit of the size of the object to be detected to A2 (A1 ⁇ A2) larger than A1 (s25).
  • the unit of A1 and A2 is the number of pixels.
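  • A minimal sketch of s23 to s25 (hypothetical Python; S, A1, and A2 are illustrative values — the patent only requires A1 < A2 and leaves S to the installer):

```python
def object_size_lower_limit(n_rain, n_attached, s=500, a1=20, a2=40):
    """Decide the rain state from the total count of raindrop-affected
    pixels (s23) and pick the lower limit, in pixels, of the object
    size to detect (s24 when not raining, s25 when raining)."""
    raining = (n_rain + n_attached) >= s
    return (a2 if raining else a1), raining
```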
  • the image processing unit 13 generates a difference image between the reference distance image generated and acquired in s12 and the detection distance image acquired in s2 (s26).
  • In s26, a binary difference image is generated in which the pixel value is set to “0” for pixels whose distance is the same in both images, pixels determined to be rain pixels in s21, and pixels determined to be raindrop-attached pixels in s22, and the pixel values of the other pixels are set to “1”.
  • the image processing unit 13 detects the object captured in the difference image generated in s26 (s27).
  • In s27, grouping is performed on the difference image generated in s26, in which a set of neighbouring pixels having the pixel value “1” is collected into one group.
  • The image processing unit 13 determines that a group is an object if the number of pixels in the group exceeds the lower limit (A1 or A2) of the object size set in s24 or s25, and otherwise determines that the group is noise (not an object).
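  • A sketch of s26 and s27 together (hypothetical Python/NumPy/SciPy; dist_tol, which decides when a distance counts as “the same” as the reference, and the use of scipy.ndimage.label for grouping are assumptions):

```python
import numpy as np
from scipy.ndimage import label

def detect_objects(dist, ref_dist, rain_mask, attached_mask,
                   min_size, dist_tol=100.0):
    """Build the binary difference image (s26), group neighbouring '1'
    pixels (s27), and keep groups larger than the size lower limit."""
    diff = np.ones(dist.shape, dtype=np.uint8)
    same = np.abs(dist - ref_dist) <= dist_tol
    diff[same | rain_mask | attached_mask] = 0   # s26: '0' pixels

    groups, n_groups = label(diff)               # s27: grouping
    sizes = np.bincount(groups.ravel())          # pixel count per group
    object_ids = [g for g in range(1, n_groups + 1) if sizes[g] > min_size]
    return object_ids, groups
```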
  • In this way, the image processing apparatus 10 can sufficiently suppress erroneous detection of an object due to the occurrence of raindrop-affected pixels, such as rain pixels that detected reflected light from raindrops and raindrop-attached pixels affected by raindrops attached to the imaging lens. That is, erroneous detection of objects can be sufficiently suppressed even when it is raining.
  • the image processing apparatus 10 may output not only the object detection result in s4 but also the determination result of the rain state in s23.
  • Since the image processing apparatus 10 changes the lower limit (A1 or A2) of the size of objects detected in s27 depending on whether it is raining, erroneous detection of an object can be prevented even if, for some pixels, a misjudgment occurs in s21 that a rain pixel is not a rain pixel, or in s22 that a raindrop-attached pixel is not a raindrop-attached pixel.
  • Likewise, the situation in which neighbouring pixels that detected reflected light from an object are erroneously determined to be raindrop-affected pixels, causing an object located in the detection area to be missed, is also suppressed.
  • In the above description, the lower limit of the size of objects detected in s27 is set in two stages depending on whether it is raining.
  • However, the lower limit of the object size may instead be set according to the total number of pixels determined to be rain pixels in s21 and pixels determined to be raindrop-attached pixels in s22, that is, according to the amount of rainfall.
  • The number of raindrop-affected pixels increases as the rainfall increases. Therefore, this not only prevents a plurality of adjacent raindrop-affected pixels from being erroneously detected as an object located in the detection area when rainfall is relatively heavy, but also further suppresses the situation in which, when rainfall is relatively light, a plurality of adjacent pixels that detected reflected light from an object are erroneously determined to be raindrop-affected pixels and an object located in the detection area is consequently missed.
  • In the above description, the rain state is determined in conjunction with the process of detecting an object located in the detection area.
  • However, the determination may instead be performed during periods when no train is stopped at the platform.
  • the detection area may be continuously captured by the TOF camera, and the process shown in FIG. 11 may be performed.
  • the image processing apparatus 10 captures a pair of captured images (distance image and received light intensity image) with a TOF camera (s31).
  • The image processing unit 13 generates and acquires a pair of reference images based on the latest set number of frames of captured images, including the pair of captured images captured this time (s32).
  • s32 is the same processing as s1 (s11 to s14) described above.
  • the image processing unit 13 acquires a pair of captured images captured this time as detection images (s33).
  • The image processing unit 13 performs the raindrop-affected pixel determination process described above (s34) and determines whether it is raining (s35).
  • the processing related to s34 and s35 is the same processing as s21 to s23 described above.
  • In s35, it may be determined that it is raining when frames in which the total number of pixels determined to be raindrop-affected pixels is equal to or greater than the predetermined threshold pixel count S continue for a set number of frames or more, or simply when the total in the current frame is equal to or greater than S.
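  • A sketch of the consecutive-frame variant of s35 (hypothetical Python; S and the number of consecutive frames are illustrative values):

```python
def rain_state_stream(per_frame_counts, s=500, n_consecutive=10):
    """Yield the rain state per frame, declaring rain only after the
    raindrop-affected pixel count has stayed at or above S for a set
    number of consecutive frames."""
    run = 0
    for count in per_frame_counts:
        run = run + 1 if count >= s else 0
        yield run >= n_consecutive
```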
  • In the above description, the present invention has been explained taking as an example the image processing apparatus 10 whose detection area for detecting an object is between the fall prevention fence and the track at the position where the slide door 2 is provided.
  • However, the present invention is not limited to this; for example, it can be used for purposes such as detecting an intruder by setting the entrance of a factory or a condominium as the detection area.
  • In that case, it may be configured to notify a security guard staying in the security room or the management room to that effect.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Processing (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a technique capable of sufficiently suppressing the erroneous detection of a raindrop as an object during rainfall. An image processing device (10) generates a reference distance image and a reference received light intensity image based on a pair of captured images captured by an image sensor unit (12), and acquires the generated images as a pair of reference images (s1). The image processing device (10) acquires a pair of captured images captured by the image sensor unit (12) as a pair of detection images (s2). The image processing device (10) detects an object located in a detection area using the pair of reference images and the pair of detection images (s3).
PCT/JP2013/074462 2012-09-14 2013-09-11 Image processing device, object detection method, and object detection program WO2014042168A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2012202441 2012-09-14
JP2012-202441 2012-09-14
JP2013-051981 2013-03-14
JP2013051981A JP6123377B2 (ja) 2012-09-14 2013-03-14 Image processing device, object detection method, and object detection program

Publications (1)

Publication Number Publication Date
WO2014042168A1 true WO2014042168A1 (fr) 2014-03-20

Family

ID=50278278

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/074462 WO2014042168A1 (fr) 2012-09-14 2013-09-11 Image processing device, object detection method, and object detection program

Country Status (2)

Country Link
JP (1) JP6123377B2 (fr)
WO (1) WO2014042168A1 (fr)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6405765B2 (ja) * 2014-07-22 2018-10-17 サクサ株式会社 Imaging device and determination method
JP6405141B2 (ja) * 2014-07-22 2018-10-17 サクサ株式会社 Imaging device and determination method
JP6388951B2 (ja) * 2014-08-27 2018-09-12 株式会社ニコンビジョン Distance detection device, optical apparatus, and distance detection method
JP6813541B2 (ja) * 2018-07-26 2021-01-13 ファナック株式会社 Distance measuring device that detects an optical system abnormality
JP7115390B2 (ja) * 2019-03-28 2022-08-09 株式会社デンソー Distance measuring device


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0512591A (ja) * 1991-06-28 1993-01-22 Mitsubishi Electric Corp Monitoring device
JP4206926B2 (ja) * 2004-01-15 2009-01-14 株式会社デンソー Security system
JP4611911B2 (ja) * 2006-02-24 2011-01-12 セコム株式会社 Image sensor
JP5027645B2 (ja) * 2007-12-28 2012-09-19 セコム株式会社 Combined intrusion detection device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006046961A (ja) * 2004-07-30 2006-02-16 Matsushita Electric Works Ltd Human body detection sensor
JP2010121995A (ja) * 2008-11-18 2010-06-03 Calsonic Kansei Corp Distance image data generation device for vehicle
JP2011016421A (ja) * 2009-07-08 2011-01-27 Higashi Nippon Transportec Kk Obstacle detection device, platform door system including the same, and obstacle detection method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016170076A (ja) * 2015-03-13 2016-09-23 オムロン株式会社 Object detection device, object detection method, and object detection program
CN106770055A (zh) * 2017-02-21 2017-05-31 中国水利水电科学研究院 Regional rainfall uniformity measuring system and method based on the laser reflection principle
CN106770038A (zh) * 2017-02-21 2017-05-31 中国水利水电科学研究院 Regional rainfall uniformity measuring system and method based on the laser refraction principle
CN106770055B (zh) * 2017-02-21 2023-09-08 中国水利水电科学研究院 Regional rainfall uniformity measuring system and method based on the laser reflection principle
CN106770038B (zh) * 2017-02-21 2024-02-02 中国水利水电科学研究院 Regional rainfall uniformity measuring system and method based on the laser refraction principle
CN115096511A (zh) * 2022-06-13 2022-09-23 东莞市众志时代试验设备有限公司 Method for improving a rain test chamber based on the natural rainfall mechanism
CN115096511B (zh) * 2022-06-13 2023-04-18 东莞市众志时代试验设备有限公司 Method for improving a rain test chamber based on the natural rainfall mechanism

Also Published As

Publication number Publication date
JP6123377B2 (ja) 2017-05-10
JP2014074708A (ja) 2014-04-24

Similar Documents

Publication Publication Date Title
JP6123377B2 (ja) Image processing device, object detection method, and object detection program
JP5065744B2 (ja) Individual detector
US10407275B2 (en) Detection and control system for elevator operations
KR20110101177A (ko) Camera device for detecting the state of a vehicle window
JP2011016421A (ja) Obstacle detection device, platform door system including the same, and obstacle detection method
JP6031908B2 (ja) Image processing device, object detection method, and object detection program
JP5904069B2 (ja) Image processing device, object detection method, and object detection program
US20050207616A1 (en) Movable barrier operator with an obstacle detector
JP6070306B2 (ja) Image processing device, object detection method, and object detection program
JP2013066166A (ja) Imaging device, and image analysis device and mobile device using the same
JPWO2017073348A1 (ja) Infrared imaging device, control method therefor, and vehicle
JP6273682B2 (ja) Image processing device, object detection method, and object detection program
JP6465772B2 (ja) Laser radar device
EP2587462A1 (fr) Image monitoring system and method with false alarm rate reduction
DE602006007332D1 (de) Security system and alarm verification system
JP2011093514A (ja) Platform door safety device
JP7108115B2 (ja) Vehicle door opening/closing detection device
CN102568150A (zh) Method and system for monitoring the clearness of emergency exits
JP6015296B2 (ja) Image processing device, ambient environment estimation method, and ambient environment estimation program
JP4692437B2 (ja) Surveillance camera device
US10962645B2 (en) Reception apparatus, reception method, transmission apparatus, transmission method, and communication system
KR101339456B1 (ko) Tunnel incident detection system using a scanner
JP2018159685A (ja) Distance measuring device
JP2015145186A (ja) Mobile object management device
JP2006323652A (ja) Security sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13836545

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13836545

Country of ref document: EP

Kind code of ref document: A1