WO2017195753A1 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
WO2017195753A1
Authority
WO
WIPO (PCT)
Prior art keywords
value
light
distance
pixel
monitoring
Prior art date
Application number
PCT/JP2017/017464
Other languages
English (en)
Japanese (ja)
Inventor
潤一 藤田
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2018517019A priority Critical patent/JP6835079B2/ja
Publication of WO2017195753A1 publication Critical patent/WO2017195753A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 - Details
    • G01C3/06 - Use of electric means to obtain final indication
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 - Details of pulse systems
    • G01S7/486 - Receivers
    • G01S7/487 - Extracting wanted echo signals, e.g. pulse detection
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/181 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems

Definitions

  • The present invention relates to a monitoring system that monitors an object by scanning and projecting laser light or the like.
  • A monitoring device that uses a distance image has been proposed for detecting an intruder or the like in a monitoring space.
  • The distance image has distance information as its pixel values.
  • A monitoring device is known that sends a laser beam or the like into a monitoring space and measures the distance to an object in the monitoring space from the time between the sending of the beam and the reception of the reflected light.
  • By sequentially changing the sending direction of the measurement medium such as laser light and scanning the monitoring space two-dimensionally, distance information for a plurality of directions covering the monitoring space can be obtained, and a distance image can be formed.
  • A so-called background subtraction method is used: a distance image (background image) of the background with no moving object present is obtained in advance, the obtained background image is compared with the input distance image (current image), and pixels whose distance has changed by a predetermined value or more are extracted to obtain a change region. Accordingly, it is possible to determine whether or not a moving object is the intended detection target based on the size and shape of the change region and the distance information in the current image.
  • The distance image includes information such as the direction of the object as viewed from the transmitting/receiving unit of the laser beam or the like and the distance to the object, so the size and shape of the object can be known from the distance image. For example, in an intruder detection application, a relatively large person in the distance can be distinguished from small animals nearby (such as a frog or a cat), and the detection accuracy for intruders can be improved.
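  • As an illustration of the background subtraction method described above, a minimal sketch in Python/NumPy follows; it is not code from the patent, and the array names and the dead-zone threshold are assumptions for the example.

```python
import numpy as np

def extract_change_region(background, current, dead_zone=0.2):
    """Background subtraction on distance images.

    background, current: 2-D arrays of per-pixel distance values (e.g. in metres).
    A pixel belongs to the change region when its measured distance is closer
    to the sensor than the background by more than the dead zone.
    """
    diff = background - current        # positive where something sits in front of the background
    return diff > dead_zone            # boolean mask of changed pixels

# Hypothetical usage: a 4 x 4 distance image with one intruding object
bg = np.full((4, 4), 10.0)             # background wall at 10 m
cur = bg.copy()
cur[1:3, 1:3] = 6.5                    # object at 6.5 m in front of the wall
print(extract_change_region(bg, cur))  # True inside the 2 x 2 object region
```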
  • In such a method, the background distance data is stored as a reference image, and if the distance difference from the current image is equal to or greater than a predetermined value, the pixel is judged to belong to a foreground object located in front of the reference image, that is, to a detected object.
  • However, if the distance measurement reproducibility is low and the variation of the measured distance is large, the background itself may be erroneously detected as an object, because the measurement variation makes the distance to the background appear to change.
  • Patent Document 2 discloses a distance image processing system comprising a distance image sensor that scans a predetermined object to be measured and detects a distance value for each pixel in the scanning region, and an image data processing device including an image processing circuit that generates a distance image from the difference between the distance value of each pixel input from the distance image sensor and the distance value of a background image; the image processing circuit is configured to store, for each pixel, the maximum distance value acquired by the distance image sensor and to generate the background image based on the maximum distance value of each pixel.
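  • For comparison, the background generation of Patent Document 2 amounts to keeping, for each pixel, the largest distance value observed; the sketch below is a paraphrase of that idea (the function name and array layout are assumptions, not taken from the cited document).

```python
import numpy as np

def max_value_background(scans):
    """Background image built from the per-pixel maximum distance value.

    scans: array of shape (n_scans, height, width) holding the distance value
    of every pixel for each scan. Returns a (height, width) background image
    in which each pixel is the maximum distance value observed over all scans.
    """
    return np.max(np.asarray(scans, dtype=float), axis=0)
```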
  • The present invention has been made in view of the above circumstances, and its object is to provide a monitoring system that reduces erroneous detection in object detection by the background subtraction method by setting an appropriate background pixel value according to the distribution characteristics of the measurement distance variation of each pixel.
  • A monitoring system reflecting one aspect of the present invention comprises:
  • a light projecting/receiving unit comprising an emitting unit that emits a light beam, a scanning unit that scans the light beam in a monitoring space, and a light receiving unit that receives the light beam reflected from an object in the monitoring space; and
  • a processing unit for obtaining a distance value to the object based on the time difference between the emission time at which the light beam is emitted from the emitting unit and the light reception time at which the light beam reflected from the object is received by the light receiving unit.
  • As pre-monitoring processing, the processing unit performs n scans to obtain n distance values from the light projecting/receiving unit to the object and statistically processes them: when the standard deviation σ is less than the threshold σth, the median of the n distance values is used as the background pixel value, and when the standard deviation σ is equal to or greater than the threshold σth, a value lower than the median of the n distance values and equal to or greater than the minimum value is used as the background pixel value.
  • When monitoring an object, the processing unit obtains a distance value by measuring reflected light from the monitoring space, and when the obtained distance value is smaller than the background pixel value obtained for the same light projecting/receiving direction, it recognizes a monitoring object.
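  • To make the claimed processing concrete, here is a minimal sketch in Python/NumPy; it is not code from the patent, and the threshold sigma_th, the optional dead zone, and the use of the per-pixel minimum as the "value lower than the median and greater than or equal to the minimum value" are assumptions of the example.

```python
import numpy as np

def build_background(scans, sigma_th):
    """Pre-monitoring processing: choose a background pixel value per pixel.

    scans: array of shape (n, height, width) with n distance values per pixel.
    If the standard deviation of a pixel's n values is below sigma_th, its
    median is used as the background pixel value; otherwise a value not larger
    than the median and not smaller than the minimum (here simply the minimum).
    """
    scans = np.asarray(scans, dtype=float)
    sigma = scans.std(axis=0)
    median = np.median(scans, axis=0)
    minimum = scans.min(axis=0)
    return np.where(sigma < sigma_th, median, minimum)

def detect(current, background, dead_zone=0.0):
    """Monitoring: flag a pixel as a candidate monitoring object when the
    measured distance is smaller than the background pixel value for the same
    light projecting/receiving direction (optionally by more than a dead zone)."""
    return current < (background - dead_zone)
```

  • For a flat pixel the median suppresses occasional outliers, while for a pixel with large variation the minimum-based background keeps genuine background returns from being mistaken for foreground.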
  • According to the present invention, it is possible to provide a monitoring system that reduces erroneous detection in object detection by the background subtraction method by setting an appropriate background pixel value according to the distribution characteristics of the measurement distance variation of each pixel.
  • FIG. 1 is a cross-sectional view of the monitoring apparatus MD according to this embodiment. FIG. 2 is a diagram showing a state in which the inside of the monitoring space is scanned with the laser spot light SB (shown by hatching) emitted from the monitoring apparatus MD.
  • FIG. 1 is a cross-sectional view of a monitoring device MD as a monitoring system according to the present embodiment.
  • The shape and length of the components in the figure may differ from the actual ones.
  • The semiconductor laser LD and the collimating lens CL constitute an emission part LPS, the lens LS and the photodiode PD constitute a light receiving part RPS, and the mirror unit MU constitutes a scanning part; together these constitute the light projecting/receiving unit.
  • The optical axes of the emission part LPS and the light receiving part RPS are preferably orthogonal to the rotation axis RO of the mirror unit MU.
  • A box-shaped housing CS fixed to a rigid wall WL or the like has an upper wall CSa, a lower wall CSb facing the upper wall CSa, and a side wall CSc connecting the upper wall CSa and the lower wall CSb.
  • An opening CSd is formed in a part of the side wall CSc, and a transparent plate TR is attached to the opening CSd.
  • The mirror unit MU is connected to the shaft MTa of a motor MT fixed to the housing CS and is driven to rotate.
  • Here, the axis (rotation axis) of the shaft MTa extends in the Z direction, which is the vertical direction, and the XY plane formed by the X direction and the Y direction orthogonal to the Z direction is a horizontal plane; however, the axis of the shaft MTa may be inclined with respect to the vertical direction.
  • Part of the laser light of the scanning light flux reflected by the object passes through the transparent plate TR again and is incident on the second mirror surface M2 of the mirror unit MU in the housing CS, where it is reflected, further reflected by the first mirror surface M1, collected by the lens LS, and detected for each pixel on the light receiving surface of the photodiode PD.
  • The processing circuit PROC, which serves as the processing unit, obtains distance information according to the time difference between the emission timing of the semiconductor laser LD and the light reception timing of the photodiode PD.
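  • The distance computation implied here can be illustrated by the standard time-of-flight relation (a restatement for illustration, not an equation quoted from the patent), where $t_e$ is the emission timing of the semiconductor laser LD, $t_r$ is the light reception timing of the photodiode PD, and $c$ is the speed of light:

$$
d = \frac{c\,(t_r - t_e)}{2}
$$

  • The factor of 2 accounts for the round trip of the light to the object and back.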
  • In this way, objects are detected over the entire area of the monitoring space, and a frame FL (see FIG. 2), that is, a distance image having distance information for each pixel, can be obtained.
  • Such a distance image can be transmitted to a remote monitor via a network (not shown) or the like and displayed, or can be stored in a storage device.
  • The processing circuit PROC compares the reference image of FIG. 3A with the distance image of FIG. 3B, and can alert the user that some object has appeared if a difference occurs. Furthermore, the processing circuit PROC can track the intruder OBJ to obtain its moving direction and speed, that is, perform moving object detection.
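  • As a rough illustration of such moving object detection (not from the patent; the centroid-based tracking and the frame interval are assumptions), the moving direction and speed could be estimated from the displacement of the change-region centroid between successive frames:

```python
import numpy as np

def centroid(mask):
    """Centroid (row, col) of a boolean change-region mask, or None if empty."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return np.array([rows.mean(), cols.mean()])

def direction_and_speed(mask_prev, mask_curr, frame_interval_s=0.1):
    """Approximate moving direction (pixel displacement vector) and speed
    (pixels per second) from the centroids of two consecutive frames."""
    c0, c1 = centroid(mask_prev), centroid(mask_curr)
    if c0 is None or c1 is None:
        return None
    displacement = c1 - c0
    speed = float(np.linalg.norm(displacement)) / frame_interval_s
    return displacement, speed
```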
  • FIG. 4 shows examples of histograms for one pixel in which the horizontal axis indicates the distance value to the background target obtained by scanning n times and the vertical axis indicates the frequency.
  • When the background object is the white wall surface of the building BL in FIG. 2 (incident position P1 of the laser beam), the reflection condition is uniform because the surface is flat and has substantially the same reflectance, and the distance value data obtained for each scan, as shown in FIG. 4A, almost follows a normal distribution. Pixels whose distance values are substantially the same in this way are referred to here as flat pixels.
  • On the other hand, for a pixel whose histogram has a plurality of peaks, if a distance value corresponding to, for example, the peak PK2 (that is, a distance value that still belongs to the background object) is obtained in a certain frame and deviates from the maximum value MX by more than the predetermined dead zone, it may be erroneously detected as a monitoring object.
  • Similarly, when the maximum data value MX is set as the maximum distance value for a last pixel, and a distance value corresponding to, for example, the peak PK4 (again a distance value of the background object) that is nearer than the maximum value MX by more than the predetermined dead zone is obtained in a certain frame, there is a risk that it will be erroneously detected as a monitoring object.
  • Therefore, in the present embodiment, the maximum distance value is set after distinguishing flat pixels from other pixels by using the standard deviation σ.
  • Specifically, the distance value data obtained in n scans is statistically processed, and if the standard deviation σ is smaller than the threshold σth, it is determined that the variation in the data is relatively small and that the pixel is a flat pixel. For a flat pixel, the median value C of the data can be used as the maximum distance value, as described above.
  • If the standard deviation σ is equal to or greater than the threshold σth, it is determined that the data variation is relatively large, that is, that the pixel is not a flat pixel but a miscellaneous pixel, a scraped pixel, or a last pixel. In that case, the minimum value MN among the distance values obtained in the n scans is set as the maximum distance value of the pixel.
  • Note that instead of the exact median, a value within ±10% of the variation width with respect to the calculated average of the distance values may be used.
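  • The classification can be made concrete with a small worked example; the sample values, n = 10, and σth = 0.05 below are hypothetical numbers chosen only for illustration.

```python
import numpy as np

SIGMA_TH = 0.05  # hypothetical threshold for the standard deviation

def maximum_distance_value(samples, sigma_th=SIGMA_TH):
    """Classify one pixel from its n scan samples and pick its maximum
    distance value (background reference) accordingly."""
    samples = np.asarray(samples, dtype=float)
    if samples.std() < sigma_th:             # small variation: flat pixel
        return "flat", float(np.median(samples))
    return "non-flat", float(samples.min())  # miscellaneous / scraped / last pixel

flat_pixel = [10.00, 10.01, 9.99, 10.02, 10.00, 9.98, 10.01, 10.00, 9.99, 10.02]
spread_pixel = [10.00, 10.01, 14.95, 10.02, 14.98, 9.99, 15.00, 10.00, 14.97, 10.01]

print(maximum_distance_value(flat_pixel))    # ('flat', 10.0)      -> median used
print(maximum_distance_value(spread_pixel))  # ('non-flat', 9.99)  -> minimum used
```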
  • FIG. 6 shows histograms of the results of an experiment conducted by the present inventor, focusing on the data of specific pixels over a plurality of scans; the vertical axis represents the frequency and the horizontal axis represents the distance value. Note that the distance values are relative.
  • FIG. 6D shows the result of processing the distance values of a specific pixel obtained by irradiating a floor surface for which the intensity of the reflected light is near the measurable level and receiving the reflected light; this is an example of a last pixel.
  • In this case, the reflected light from the floor surface is at or above the measurable level in one scan and a distance value is obtained, but in another scan the reflected light from the floor surface falls below the measurable level and no valid distance value is obtained.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)
  • Burglar Alarm Systems (AREA)

Abstract

The invention relates to a monitoring system in which erroneous detections of an object by a background subtraction method are reduced by setting appropriate background pixel values according to the distribution characteristics of the measurement distance variations for each pixel. In this monitoring system, as pre-monitoring processing, a processing unit obtains n distances between a light projecting/receiving unit and a target by performing n scans and carries out statistical processing; if the standard deviation σ is less than a threshold σth, it uses the median of the n distance values as the background pixel value, and if the standard deviation σ is greater than or equal to the threshold σth, it uses the minimum of the n distance values as the background pixel value. While monitoring a target, the processing unit obtains a distance value by measuring the light reflected from the monitoring space, and recognizes a monitoring target if the obtained distance value is smaller than the background pixel value obtained by measuring the reflected light in the same light projection/reception direction.
PCT/JP2017/017464 2016-05-13 2017-05-09 Monitoring system WO2017195753A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018517019A JP6835079B2 (ja) 2016-05-13 2017-05-09 Monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016096994 2016-05-13
JP2016-096994 2016-05-13

Publications (1)

Publication Number Publication Date
WO2017195753A1 true WO2017195753A1 (fr) 2017-11-16

Family

ID=60267300

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/017464 WO2017195753A1 (fr) 2016-05-13 2017-05-09 Monitoring system

Country Status (2)

Country Link
JP (1) JP6835079B2 (fr)
WO (1) WO2017195753A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007122508A (ja) * 2005-10-28 2007-05-17 Secom Co Ltd Intrusion detection device
JP2012022573A (ja) * 2010-07-15 2012-02-02 Toshiba Corp Moving object detection device
JP2012255657A (ja) * 2011-06-07 2012-12-27 Mitsubishi Electric Corp Radar image processing device and radar image processing method
WO2016002776A1 (fr) * 2014-07-03 2016-01-07 三菱電機株式会社 Monitoring apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9875557B2 (en) * 2012-11-05 2018-01-23 The Chancellor Masters And Scholars Of The University Of Oxford Extrinsic calibration of imaging sensing devices and 2D LIDARs mounted on transportable apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2021229722A1 (fr) * 2020-05-13 2021-11-18
JP7186923B2 (ja) 2020-05-13 2022-12-09 三菱電機株式会社 Obstacle detection device, parking assistance device, collision avoidance device, and obstacle detection method

Also Published As

Publication number Publication date
JPWO2017195753A1 (ja) 2019-03-14
JP6835079B2 (ja) 2021-02-24

Similar Documents

Publication Publication Date Title
JP7465312B2 (ja) High-resolution lidar using high-frequency pulse firing
TWI583937B (zh) Technique for determining beam alignment in an active video smoke detection (AVSD) system
US6862097B2 (en) Three-dimensional shape measuring method, and three-dimensional shape measuring apparatus
JP5417723B2 (ja) Azimuth measurement method and azimuth measurement device
JP2007122508A (ja) Intrusion detection device
JP2009110069A (ja) Laser area sensor
JP2006209318A (ja) Device and method for detecting the number of people
JP2019101000A (ja) Ranging point cloud data measurement system and control program
JPWO2017135224A1 (ja) Optical scanning type object detection device
US11906629B2 (en) Method and device for distance measurement
JP2022017619A (ja) Image analysis device, image analysis system, and control program
WO2017195753A1 (fr) Monitoring system
JP2017215642A (ja) Monitoring system
JP6825623B2 (ja) Monitoring system setting method and monitoring system
JP6825624B2 (ja) Monitoring system
EP4174810A1 (fr) Operation of a scanning smoke detector
JP6988797B2 (ja) Monitoring system
JP2022168956A (ja) Laser measurement device and measurement method thereof
JP7020096B2 (ja) Object detection device, control method of object detection device, and control program of object detection device
WO2017195755A1 (fr) Monitoring system
JP6749191B2 (ja) Scanner device and surveying device
JPH0979967A (ja) Method and apparatus for measuring suspended particles in fluid
US10605917B2 (en) Optical-scanning-type object detection device having a mirror surface to be inclined in a direction crossing a rotation axis
JP2018163129A (ja) Object detection method and object detection device
JP2024042919A (ja) Optical sensing system, optical sensing device, and optical sensing method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018517019

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17796114

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17796114

Country of ref document: EP

Kind code of ref document: A1