WO2017195754A1 - Surveillance system - Google Patents

Surveillance system

Info

Publication number
WO2017195754A1
WO2017195754A1 (PCT/JP2017/017465)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
distance
light
fence
reflected light
Prior art date
Application number
PCT/JP2017/017465
Other languages
French (fr)
Japanese (ja)
Inventor
Junichi Fujita
Original Assignee
Konica Minolta, Inc.
Priority date
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2018517020A (JP6988797B2)
Publication of WO2017195754A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/487 Extracting wanted echo signals, e.g. pulse detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/181 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems
    • G08B13/187 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems by interference of a radiation field

Definitions

  • the present invention relates to a monitoring system that monitors an object by scanning and projecting laser light or the like, for example.
  • a monitoring device that uses a distance image has been proposed as a monitoring device for detecting an intruder or the like in a monitoring space.
  • the distance image has distance information as a pixel value.
  • a monitoring device is known in which a laser beam or the like is sent out into a monitoring space and the distance to an object in the monitoring space is measured from the time between the sending of the beam and the reception of its reflected light.
  • distance information for a plurality of directions within the monitoring space can be obtained by sequentially changing the emission direction of the measurement medium such as laser light and scanning the monitoring space two-dimensionally.
  • a distance image can be formed from this information.
  • a reference distance image containing no moving objects (the background image) is compared with a newly input distance image (the current image), and pixels at distances closer than the background image are extracted.
  • the change area is thereby obtained. Then, based on the size and shape of the change area and the distance information in the current image, it is determined whether or not the moving object is the intended detection target.
  • the distance image contains the direction of the object as seen from the transmitting/receiving unit of the laser beam or the like, and the distance to that object. The size and shape of the object can therefore be determined from the distance image. For example, in an intruder detection application, it is possible to distinguish a relatively large person in the distance from a small animal nearby (such as a rat or a cat), which improves the accuracy of intruder detection.
  • important facilities, for example those located deep in the mountains, are often surrounded by fences so that ordinary people cannot easily enter.
  • if human activity outside the fence can be detected quickly, an alarm can be issued early, which is desirable for unattended monitoring of such facilities.
  • since the monitoring device is generally installed inside the fence to avoid careless access by people and the like, depending on the installation location the laser light emitted from the monitoring device may pass through the fence from the inside toward the outside, and the light reflected by an object outside may return through the fence again.
  • only object distance images closer than the background distance image are extracted as a change image, that is, only objects at positions closer than the background object are treated as monitoring targets. Therefore, if an image of the fence is set as the background image, an object outside the fence cannot be detected even when it is visible through the fence.
  • the present invention has been made in view of the above circumstances, and its object is to provide a monitoring system that detects objects by background subtraction using the highest-order reflected light, for example, and that can detect objects beyond a fence or the like.
  • in order to achieve at least one of the above objects, a monitoring system reflecting one aspect of the present invention comprises:
  • a light projecting/receiving unit comprising an emitting unit that emits a light beam, a scanning unit that scans the light beam within a monitoring space, and a light receiving unit that receives the light beam reflected from an object in the monitoring space;
  • a processing unit that measures the distance to the object by processing the signal from the light projecting/receiving unit;
  • wherein the processing unit can set a reference background in the monitoring space and can measure, as a monitoring target, the distance of an object closer than the reference background,
  • and the processing unit can set a mask area in an arbitrary range of the monitoring space, and when reflected light from an object present within the mask area is detected, it sets the distance of the object from which the reflected light was detected to a predetermined distance.
  • this provides a monitoring system that detects objects by background subtraction using the highest-order reflected light and that can detect objects beyond a fence or the like.
  • FIG. 1 is a cross-sectional view of the monitoring device MD according to this embodiment.
  • FIG. 2 is a diagram showing the state in which the laser spot light SB (shown by hatching), emitted according to the rotation of the mirror unit MU, scans the monitoring space of the monitoring device MD.
  • FIGS. 3(a) and 3(c) show a monitoring space containing background objects viewed from the side, and FIGS. 3(b) and 3(d) show the corresponding processing results of the processing circuit PROC, with reflected light intensity on the vertical axis and distance on the horizontal axis.
  • FIG. 4 shows a monitoring space in which no background objects other than the fence FS exist.
  • FIGS. 5(a) and 5(c) show a monitoring space without background objects other than the fence FS viewed from the side, and FIGS. 5(b) and 5(d) show the corresponding processing results of the processing circuit PROC (vertical axis: reflected light intensity; horizontal axis: distance).
  • FIGS. 6(a) and 6(c) show a monitoring space in which a mask area has been set, viewed from the side, and FIGS. 6(b) and 6(d) show the corresponding processing results of the processing circuit PROC (vertical axis: reflected light intensity; horizontal axis: distance).
  • FIG. 7 is a flowchart showing the monitoring control performed by the processing circuit PROC of the monitoring device MD.
  • FIG. 8 is a diagram showing the relationship between the cross-sectional shape of the laser spot light beam and the mesh size of the fence.
  • FIG. 1 is a cross-sectional view of a monitoring device MD as a monitoring system according to the present embodiment.
  • the monitoring device MD is preferably installed in an important facility such as an unmanned observation station deep in the mountains.
  • the monitoring device MD includes, for example: a pulsed semiconductor laser LD that emits a laser beam; a collimating lens CL that converts the divergent light from the semiconductor laser LD into parallel light; a mirror unit MU that scans and projects the laser light collimated by the collimating lens CL toward the monitoring space with rotating mirror surfaces and also reflects the light returned from objects;
  • a lens LS that collects the reflected light from the object reflected by the mirror unit MU;
  • a photodiode PD that receives the light collected by the lens LS, a processing circuit (processing unit) PROC that obtains distance information from the time difference between the emission timing of the semiconductor laser LD and the light reception timing of the photodiode PD, a motor MT that rotationally drives the mirror unit MU, and a housing CS that houses these components.
  • the semiconductor laser LD and the collimating lens CL constitute an emission part LPS
  • the lens LS and the photodiode PD constitute a light receiving part RPS
  • the mirror unit MU constitutes a scanning part, and together these constitute the light projecting/receiving unit.
  • the optical axes of the emission part LPS and the light receiving part RPS are preferably orthogonal to the rotation axis RO of the mirror unit MU.
  • a box-shaped housing CS, fixed to a rigid wall WL or the like of the facility, has an upper wall CSa, a lower wall CSb facing the upper wall CSa, and a side wall CSc connecting the upper wall CSa and the lower wall CSb.
  • An opening CSd is formed in a part of the side wall CSc, and a transparent plate TR is attached to the opening CSd.
  • the mirror unit MU has a shape in which two quadrangular pyramids are joined in opposite directions and integrated, that is, it has four pairs of mirror surfaces M1 and M2 tilted so as to face each other (though the number of pairs is not limited to four).
  • the mirror surfaces M1 and M2 are preferably formed by depositing a reflective film on the surface of a resin material (for example, PC) in the shape of a mirror unit.
  • the mirror unit MU is connected to a shaft MTa of a motor MT fixed to the casing CS and is driven to rotate.
  • the axis line (rotation axis) of the shaft MTa extends in the Z direction, which is the vertical direction, and the XY plane formed by the X and Y directions orthogonal to the Z direction is a horizontal plane.
  • the axis of the shaft MTa may instead be inclined with respect to the vertical direction.
  • divergent light intermittently emitted in a pulse form from a semiconductor laser LD is converted into a parallel light beam by a collimating lens CL, enters a first mirror surface M1 of a rotating mirror unit MU, and is reflected there. Further, after being reflected by the second mirror surface M2, the light is scanned and projected as a laser spot light having, for example, a vertically long rectangular cross section through the transparent plate TR and toward the external monitoring space.
  • the direction in which the emitted laser spot light is reflected by the object and returned as reflected light is referred to as the light projecting / receiving direction.
  • Laser spot beams traveling in the same light projecting / receiving direction are detected by the same pixel.
  • FIG. 2 is a diagram illustrating the state in which the laser spot light SB (indicated by hatching), emitted in accordance with the rotation of the mirror unit MU, scans the monitoring space of the monitoring device MD, showing an example in which trees TT grow thickly behind the fence FS.
  • the crossing angle differs for each pair of the first mirror surface M1 and the second mirror surface M2 of the mirror unit MU.
  • the laser light is reflected in turn by the rotating first mirror surface M1 and second mirror surface M2.
  • the laser light reflected by the first pair of the first mirror surface M1 and the second mirror surface M2 scans the uppermost region Ln1 of the monitoring space horizontally from left to right in accordance with the rotation of the mirror unit MU.
  • the laser light reflected by the second pair of the first mirror surface M1 and the second mirror surface M2 scans the second region Ln2 from the top of the monitoring space horizontally from left to right in accordance with the rotation of the mirror unit MU.
  • the laser light reflected by the third pair of the first mirror surface M1 and the second mirror surface M2 scans the third region Ln3 from the top of the monitoring space horizontally from left to right in accordance with the rotation of the mirror unit MU.
  • the laser light reflected by the fourth pair of the first mirror surface M1 and the second mirror surface M2 scans the lowermost region Ln4 of the monitoring space horizontally from left to right in accordance with the rotation of the mirror unit MU.
  • a single frame FL is obtained by combining the images obtained by scanning the regions Ln1 to Ln4. When the first pair of the first mirror surface M1 and the second mirror surface M2 returns after one rotation of the mirror unit MU, scanning from the uppermost region Ln1 to the lowermost region Ln4 is repeated and the next frame FL is obtained.
  • part of the projected light beam that strikes an object (the trees TT or the fence FS) and is reflected passes again through the transparent plate TR into the housing CS and enters the second mirror surface M2 of the mirror unit MU.
  • the light is reflected there, is further reflected by the first mirror surface M1, is collected by the lens LS, and is detected on the light receiving surface of the photodiode PD.
  • the processing circuit PROC, which is the processing unit, obtains distance information from the time difference between the emission timing of the semiconductor laser LD and the light reception timing of the photodiode PD.
  • the object is detected in the entire area in the monitoring space, and a frame FL (see FIG. 2) as a distance image having distance information for each pixel can be obtained.
  • Such a distance image can be transmitted to a remote monitor via a network (not shown) or the like and displayed, or can be stored in a storage device.
  • FIG. 3 is a diagram illustrating a state in which the monitoring space is viewed from the side in association with the processing result of the processing circuit PROC.
  • a distance image (reference image) serving as a reference is obtained by the monitoring device MD in the absence of a moving object such as a human being or an animal.
  • a part of the laser spot light beam SB emitted from the monitoring device MD and traveling in the same light projecting / receiving direction is reflected by the fence FS installed on the ground G to become reflected light RB1.
  • the rest passes through the wire mesh of the fence FS, strikes background objects such as the trees TT, becomes reflected light RB2, and returns to the monitoring device MD.
  • the processing result from the processing circuit PROC contains a peak P1 (first echo) corresponding to the distance to the fence FS,
  • and a peak P2 (last echo: the highest-order reflected light) corresponding to the distance to the background object (taken as the reference background).
  • a background image is set by recognizing the background object using the highest-order reflected light, and by also recognizing detection targets using the highest-order reflected light during actual monitoring, the influence of rain, snow, fog, and the like can be avoided.
  • the highest-order reflected light means the reflected light from the farthest object within the maximum measurable distance of the monitoring device MD (beyond which a point becomes unmeasurable).
  • the processing circuit PROC compares the processing result of FIG. 3(b) with the processing result of FIG. 3(d), and when a predetermined distance difference arises between the peaks P2 and P3, it can raise an alert.
  • when the intruder OBJ is moving, the processing circuit PROC can track the intruder OBJ and obtain its moving direction and speed (moving-object detection). For example, when the position of the peak P3 comes closer than the peak P1, the processing circuit PROC can determine that the intruder OBJ has crossed the fence FS and can issue an alarm or the like via an alarm device (not shown).
  • FIG. 5 is a diagram showing a state in which such a monitoring space is viewed from the side in association with the processing result of the processing circuit PROC.
  • a distance image (reference image) serving as a reference is obtained by the monitoring device MD in a state where there is no moving object such as a human being or an animal.
  • a part of the laser spot light beam SB emitted from the monitoring device MD and traveling in the same light projecting/receiving direction is reflected by the fence FS installed on the ground G to become reflected light RB1.
  • since the remainder passes through the fence FS and travels toward infinity, only the reflected light beam RB1 returns to the monitoring device MD, as shown in FIG. 5(b). Therefore, when the above-described detection algorithm is used in such a monitoring space, the reflected light beam RB1 becomes the last echo and the background image is based on the fence FS.
  • as a result, the processing circuit PROC cannot detect an intruder OBJ on the far side of the fence FS. If the intruder OBJ climbs over the fence FS, the processing circuit PROC can detect it, but by then the detection may be too late.
  • FIG. 6 is a diagram illustrating a state in which a monitoring space similar to that in FIG. 4 is viewed from the side in association with the processing result of the processing circuit PROC.
  • a distance image (reference image) serving as a reference is obtained by the monitoring device MD in a state where there is no moving object such as a human being or an animal.
  • the laser light that has passed through the fence FS travels toward infinity, and thus the reflected light RB1 is generated only from the fence FS.
  • a mask process is therefore applied to the fence FS so that it is assigned a predetermined distance different from its actual distance.
  • by inputting data to the processing circuit PROC via an interface (not shown), the user can set a mask area MA (which can take an arbitrary three-dimensional shape) that completely surrounds the fence FS in the monitoring space.
  • the processing circuit PROC forcibly sets the distance of any object (including the fence FS) whose reflection originates within the mask area MA to a predetermined distance, that is, to infinity (or to an unmeasurable point).
  • the distance obtained from the reflected light of the fence FS within the mask area MA is replaced with the distance from the monitoring device MD to infinity (or to an unmeasurable point). The fence FS can thereby be ignored by the object detection algorithm and does not become an obstacle to object detection.
  • the “predetermined distance” is preferably infinitely far (or an unmeasurable point), but when a background object exists on the back side of the mask area, the distance of the background object may be used.
  • the “unmeasurable point” refers to a point where the reflected light of the laser beam on the object is below the detection limit value of the photodiode PD because it is too far from the monitoring device MD.
  • the distance of an object in the mask area MA can be replaced with, for example, 50 m or more. The peak P1 of the reflected light RB1 of the fence FS, shown by the dotted line in FIG. 6(b), is then shifted beyond the unmeasurable point and is excluded from the background image by the object detection algorithm.
  • the processing circuit PROC compares the processing result of FIG. 6(b) with the processing result of FIG. 6(d), determines that a new peak P3 has appeared, and can raise an alert. If, for example, the intruder OBJ were inside the mask area MA, there would be a risk of the detection algorithm excluding it from the monitoring targets; however, since the mask area MA can be set arbitrarily in three dimensions, an intruder OBJ trying to climb over the fence FS will protrude above the mask area MA, so the monitoring device MD can detect this and issue an alarm.
  • FIG. 7 is a flowchart showing the monitoring control performed by the processing circuit PROC of the monitoring device MD.
  • the processing circuit PROC determines whether or not to set a mask area MA. Specifically, when the user inputs the data (three-dimensional coordinates) of a mask area MA enclosing the fence or the like in the monitoring space, the processing circuit PROC determines that the mask area MA is to be set and, in step S102, designates the mask area MA based on the input data. If no mask area data is input, the flow bypasses step S102 and the processing circuit PROC does not set a mask area MA.
  • in step S103, the processing circuit PROC scans the laser beam from the light projecting/receiving unit of the monitoring device MD, receives the reflected light, and obtains N items of distance data (a distance image) for one frame of the monitoring space.
  • if the coordinates of the Nth measurement point lie within the mask area MA, the processing circuit PROC sets the distance of that measurement point to infinity in step S106.
  • if no mask area MA has been set, or if the coordinates of the measurement point do not lie within the mask area MA, the flow bypasses step S106 and the processing circuit PROC keeps the actually computed distance value for that measurement point.
  • in step S111 the processing circuit PROC compares the distance image obtained for one frame with the background image, and in step S112 it determines whether or not a suspicious object (moving object) has appeared. If a suspicious object is judged to be present, the processing circuit PROC outputs an alarm signal in step S113, displays a warning on a monitor (not shown), returns the flow to step S101, performs the next laser scan, and obtains a distance image for the next frame. If no suspicious object is judged to be present, the processing circuit PROC returns the flow to step S101 without outputting an alarm signal.
  • the present invention is not limited to the embodiments described in the specification; it is obvious to those skilled in the art from the embodiments and technical ideas described herein that other embodiments and modifications are also included.
  • the description and the embodiments are for illustrative purposes only, and the scope of the present invention is indicated by the following claims.
  • the mask area can also be set when a fence or the like is present in front of the background object within the measurable range.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)
  • Burglar Alarm Systems (AREA)

Abstract

Provided is a surveillance system that detects an object by background subtraction using reflected light of the highest order, for example, and that is capable of detecting an object over a fence, for example. In this surveillance system, a processing section employs, as a background target, a target that is the farthest from a light projection/reception unit in the same light projection/reception direction, and is capable of measuring the distance of a target closer than the background target as a surveillance target. The processing section is capable of setting a mask region within a discretionary range of a surveillance space, and when reflected light from a target that is present within the mask region has been detected, the processing section sets the distance of the target for which the reflected light has been detected to a predetermined distance.

Description

Monitoring system
The present invention relates to a monitoring system that monitors objects by scanning and projecting laser light or the like.
A monitoring device that uses a distance image has been proposed for detecting an intruder or the like in a monitoring space. Here, a distance image is an image whose pixel values carry distance information. Specifically, as shown in Patent Document 1, a monitoring device is known in which a laser beam or the like is sent out into a monitoring space and the distance to an object in the monitoring space is measured from the time between the sending of the beam and the reception of its reflected light. In such a monitoring device, distance information for a plurality of directions within the monitoring space can be obtained by sequentially changing the emission direction of the measurement medium such as laser light and scanning the monitoring space two-dimensionally, whereby a distance image can be formed.
In a monitoring device using distance images, a reference distance image containing no moving objects (the background image) is compared with a newly input distance image (the current image), and pixels corresponding to distances closer than the background image are extracted to obtain a change area. Then, based on the size and shape of the change area and the distance information in the current image, it is determined whether or not the moving object is the intended detection target.
A distance image contains the direction of an object as seen from the transmitting/receiving unit of the laser beam or the like, and the distance to that object. The size and shape of the object can therefore be determined from the distance image; for example, in an intruder detection application it is possible to distinguish a relatively large person in the distance from a small animal nearby (such as a rat or a cat), which improves the accuracy of intruder detection.
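For a rough sense of why the distance matters here, an object's physical width can be estimated from the number of pixels it spans and its measured distance. The following is a minimal illustrative sketch, not code from the patent; the angular pitch between pixels and the example numbers are assumed values.

```python
import math

ANGULAR_PITCH_DEG = 0.25   # assumed horizontal angle between neighbouring pixels

def physical_width_m(pixel_span, distance_m):
    """Approximate width of an object spanning `pixel_span` pixels at `distance_m` metres."""
    angle_rad = math.radians(pixel_span * ANGULAR_PITCH_DEG)
    return 2.0 * distance_m * math.tan(angle_rad / 2.0)

# With the same pixel span, the measured distance decides the physical size:
print(physical_width_m(10, 20.0))  # ~0.87 m at 20 m -> plausibly a person
print(physical_width_m(10, 2.0))   # ~0.09 m at 2 m  -> a small animal
```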
Patent Document 1: JP 2007-122507 A
Important facilities, such as those located deep in the mountains, are often surrounded by fences so that ordinary people cannot easily enter. When such a facility is monitored unattended by a monitoring device, it is desirable to detect human activity outside the fence as early as possible, because an alarm can then be issued early. On the other hand, the monitoring device is generally installed inside the fence to avoid careless access by people and the like, so depending on the installation location the laser light emitted from the monitoring device may pass through the fence from the inside toward the outside, and the light reflected by an object outside may return through the fence again. However, in the monitoring device of Patent Document 1, only object distance images closer than the background distance image are extracted as a change image; in other words, only objects closer than the background object are treated as monitoring targets. Therefore, if an image of the fence is set as the background image, an object outside the fence cannot be detected even when it is visible through the fence.
The present invention has been made in view of the above circumstances, and its object is to provide a monitoring system that, for example, detects objects by background subtraction using the highest-order reflected light and that can detect objects beyond a fence or the like.
In order to achieve at least one of the above objects, a monitoring system reflecting one aspect of the present invention comprises:
a light projecting/receiving unit comprising an emitting unit that emits a light beam, a scanning unit that scans the light beam within a monitoring space, and a light receiving unit that receives the light beam reflected from an object in the monitoring space; and
a processing unit that measures the distance to the object by processing the signal from the light projecting/receiving unit,
wherein the processing unit can set a reference background in the monitoring space and can measure, as a monitoring target, the distance of an object closer than the reference background, and
the processing unit can set a mask area in an arbitrary range of the monitoring space, and when reflected light from an object present within the mask area is detected, it sets the distance of the object from which the reflected light was detected to a predetermined distance.
According to the present invention, it is possible to provide a monitoring system that detects objects by background subtraction using the highest-order reflected light, for example, and that can detect objects beyond a fence or the like.
FIG. 1 is a cross-sectional view of the monitoring device MD according to this embodiment. FIG. 2 is a diagram showing the state in which the laser spot light SB (shown by hatching), emitted according to the rotation of the mirror unit MU, scans the monitoring space of the monitoring device MD. FIGS. 3(a) and 3(c) show a monitoring space containing background objects viewed from the side, and FIGS. 3(b) and 3(d) show the corresponding processing results of the processing circuit PROC, with reflected light intensity on the vertical axis and distance on the horizontal axis. FIG. 4 shows a monitoring space in which no background objects other than the fence FS exist. FIGS. 5(a) and 5(c) show such a monitoring space viewed from the side, and FIGS. 5(b) and 5(d) show the corresponding processing results of the processing circuit PROC (vertical axis: reflected light intensity; horizontal axis: distance). FIGS. 6(a) and 6(c) show a monitoring space in which a mask area has been set, viewed from the side, and FIGS. 6(b) and 6(d) show the corresponding processing results of the processing circuit PROC (vertical axis: reflected light intensity; horizontal axis: distance). FIG. 7 is a flowchart showing the monitoring control performed by the processing circuit PROC of the monitoring device MD. FIG. 8 is a diagram showing the relationship between the cross-sectional shape of the laser spot light beam and the mesh size of the fence.
Hereinafter, an embodiment of the present invention will be described with reference to the attached drawings. FIG. 1 is a cross-sectional view of a monitoring device MD as a monitoring system according to this embodiment; the shapes, lengths, and so on of the components may differ from the actual ones. The monitoring device MD is preferably installed in an important facility such as an unmanned observation station deep in the mountains.
The monitoring device MD includes, for example: a pulsed semiconductor laser LD that emits a laser beam; a collimating lens CL that converts the divergent light from the semiconductor laser LD into parallel light; a mirror unit MU that scans and projects the laser light collimated by the collimating lens CL toward the monitoring space with rotating mirror surfaces and also reflects the light returned from objects; a lens LS that collects the reflected light from the object via the mirror unit MU; a photodiode PD that receives the light collected by the lens LS; a processing circuit (processing unit) PROC that obtains distance information from the time difference between the emission timing of the semiconductor laser LD and the light reception timing of the photodiode PD; a motor MT that rotationally drives the mirror unit MU; and a housing CS that houses these components.
In this embodiment, the semiconductor laser LD and the collimating lens CL constitute an emitting part LPS, the lens LS and the photodiode PD constitute a light receiving part RPS, and the mirror unit MU constitutes a scanning part; together these constitute the light projecting/receiving unit. The optical axes of the emitting part LPS and the light receiving part RPS are preferably orthogonal to the rotation axis RO of the mirror unit MU.
The box-shaped housing CS, fixed to a rigid wall WL or the like of the facility, has an upper wall CSa, a lower wall CSb facing the upper wall CSa, and a side wall CSc connecting the upper wall CSa and the lower wall CSb. An opening CSd is formed in a part of the side wall CSc, and a transparent plate TR is attached to the opening CSd.
The mirror unit MU has a shape in which two quadrangular pyramids are joined in opposite directions and integrated; that is, it has four pairs of mirror surfaces M1 and M2 tilted so as to face each other (though the number of pairs is not limited to four). The mirror surfaces M1 and M2 are preferably formed by vapor-depositing a reflective film on the surface of a resin material (for example, PC) molded into the shape of the mirror unit.
The mirror unit MU is connected to the shaft MTa of a motor MT fixed to the housing CS and is driven to rotate. In this embodiment, the axis line (rotation axis) of the shaft MTa extends in the Z direction, which is the vertical direction, and the XY plane formed by the X and Y directions orthogonal to the Z direction is a horizontal plane; however, the axis of the shaft MTa may be inclined with respect to the vertical direction.
Next, the object detection principle of the monitoring device MD will be described. In FIG. 1, divergent light emitted intermittently in pulses from the semiconductor laser LD is converted into a parallel light beam by the collimating lens CL, enters the first mirror surface M1 of the rotating mirror unit MU, is reflected there, is further reflected by the second mirror surface M2, and is then scanned and projected through the transparent plate TR toward the external monitoring space as laser spot light having, for example, a vertically long rectangular cross section. The direction in which the emitted laser spot light is reflected by an object and returns as reflected light is called the light projecting/receiving direction. Laser spot light beams traveling in the same light projecting/receiving direction are detected by the same pixel.
FIG. 2 shows the state in which the laser spot light SB (shown by hatching), emitted according to the rotation of the mirror unit MU, scans the monitoring space of the monitoring device MD; an example is shown in which trees TT grow thickly behind the fence FS.
Here, the crossing angle differs for each combination of the first mirror surface M1 and the second mirror surface M2 of the mirror unit MU. The laser light is reflected in turn by the rotating first mirror surface M1 and second mirror surface M2. First, the laser light reflected by the first pair of the first mirror surface M1 and the second mirror surface M2 scans the uppermost region Ln1 of the monitoring space horizontally from left to right in accordance with the rotation of the mirror unit MU. Next, the laser light reflected by the second pair of mirror surfaces scans the second region Ln2 from the top of the monitoring space horizontally from left to right, the laser light reflected by the third pair scans the third region Ln3 from the top horizontally from left to right, and the laser light reflected by the fourth pair scans the lowermost region Ln4 horizontally from left to right, all in accordance with the rotation of the mirror unit MU. This completes one scan of the entire monitoring space that the monitoring device MD can monitor. A single frame FL is obtained by combining the images obtained by scanning the regions Ln1 to Ln4. When the first pair of the first mirror surface M1 and the second mirror surface M2 returns after one rotation of the mirror unit MU, scanning from the uppermost region Ln1 to the lowermost region Ln4 is repeated and the next frame FL is obtained.
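The scan pattern can be pictured as stacking the four horizontal sweeps into one frame, one row per mirror pair. The sketch below is only an illustration of that bookkeeping; the per-line pixel count is an assumed value, since the patent does not specify the resolution.

```python
PIXELS_PER_LINE = 400   # assumed horizontal resolution of one region (not given in the patent)
NUM_REGIONS = 4         # four mirror-surface pairs -> regions Ln1..Ln4

def assemble_frame(region_lines):
    """region_lines: the four lists of per-pixel distances measured during one
    rotation of the mirror unit MU, ordered Ln1 (top) to Ln4 (bottom).
    Returns the frame FL as a list of rows."""
    assert len(region_lines) == NUM_REGIONS
    assert all(len(line) == PIXELS_PER_LINE for line in region_lines)
    return [list(line) for line in region_lines]
```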
In FIG. 1, part of the projected light beam that strikes an object (the trees TT or the fence FS) and is reflected passes again through the transparent plate TR into the housing CS, enters the second mirror surface M2 of the mirror unit MU, is reflected there, is further reflected by the first mirror surface M1, is collected by the lens LS, and is detected on the light receiving surface of the photodiode PD. The processing circuit PROC, which is the processing unit, then obtains distance information from the time difference between the emission timing of the semiconductor laser LD and the light reception timing of the photodiode PD. Objects are thereby detected over the entire monitoring space, and a frame FL (see FIG. 2) is obtained as a distance image having distance information for each pixel. Such a distance image can be transmitted to and displayed on a remote monitor via a network (not shown) or the like, or stored in a storage device.
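The distance information mentioned above follows from the standard time-of-flight relation: the distance is half the round-trip time multiplied by the speed of light. A minimal sketch of that calculation, for illustration only:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_timing(emit_time_s, receive_time_s):
    """Distance to the reflecting object; the factor 1/2 accounts for the out-and-back path."""
    return SPEED_OF_LIGHT * (receive_time_s - emit_time_s) / 2.0

# Example: a reflection received 200 ns after emission corresponds to roughly 30 m.
print(distance_from_timing(0.0, 200e-9))  # ~29.98 m
```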
Next, the detection algorithm for monitoring targets of the monitoring device MD will be described. FIG. 3 shows the monitoring space viewed from the side, together with the corresponding processing results of the processing circuit PROC. As a preparation for monitoring, as shown in FIG. 3(a), a reference distance image (reference image) is obtained by the monitoring device MD in a state where no moving object such as a person or an animal is present. In the state shown in FIG. 3(a), part of the laser spot light beam SB emitted from the monitoring device MD and traveling in the same light projecting/receiving direction is reflected by the fence FS installed on the ground G to become reflected light RB1, while the rest passes through the wire mesh of the fence FS, strikes background objects such as the trees TT, becomes reflected light RB2, and returns to the monitoring device MD.
Since the laser spot light beam spreads as it travels away from the monitoring device MD, the relationship between the cross-sectional shape of the beam and the mesh size of the fence FS depends on the distance to the fence FS. For example, in FIG. 8, if the mesh of the fence FS has a side length of L = 40 mm and a wire thickness of t = 4 mm, the fence FS is illuminated by a spot of size SB1 when the distance between the monitoring device MD and the fence FS is 3 m, and by a spot of size SB2 when that distance is 10 m.
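Because the beam diverges as it propagates, its footprint covers a different fraction of a 40 mm mesh cell at 3 m than at 10 m. The sketch below computes the spot width at a given range; the exit-window spot size and the divergence angle are assumed values, as the patent does not state them.

```python
import math

W0_MM = 5.0            # assumed spot width at the exit window, in mm
DIVERGENCE_DEG = 0.2   # assumed full divergence angle of the beam
MESH_MM = 40.0         # mesh side length L from the example above

def spot_width_mm(range_m):
    """Approximate spot width after travelling `range_m` metres."""
    return W0_MM + 1000.0 * range_m * math.tan(math.radians(DIVERGENCE_DEG))

for r in (3.0, 10.0):
    print(f"at {r:>4} m: spot ~{spot_width_mm(r):.0f} mm across a {MESH_MM:.0f} mm mesh cell")
# With these assumed values the spot is ~15 mm wide at 3 m (SB1) and ~40 mm at 10 m (SB2),
# so the split between light hitting the wires and light passing through changes with range.
```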
In this case, as shown in FIG. 3(b), the processing result from the processing circuit PROC contains a peak P1 (first echo) corresponding to the distance to the fence FS and a peak P2 (last echo: the highest-order reflected light) corresponding to the distance to the background object (taken as the reference background). In the object monitoring algorithm based on background subtraction, it is assumed that no object exists beyond the peak P2, which is the highest-order reflected light. A background image is set by recognizing the background object using the highest-order reflected light, and by also recognizing detection targets using the highest-order reflected light during actual monitoring, the influence of rain, snow, fog, and the like can be avoided. Here, "the highest-order reflected light" means the reflected light from the farthest object within the maximum measurable distance of the monitoring device MD (beyond which a point becomes unmeasurable).
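In code terms, taking the peak P2 as the reference background amounts to keeping, for each pixel, the farthest echo that still lies within the measurable range. A minimal sketch of that last-echo selection follows; the maximum range and the data layout are assumptions, not taken from the patent.

```python
MAX_RANGE_M = 50.0   # assumed maximum measurable distance of the device

def last_echo_distance(echo_distances_m):
    """Return the highest-order (farthest) echo within the measurable range for one pixel,
    or None if nothing usable returned (an unmeasurable point).
    echo_distances_m: distances of all detected peaks for that pixel, e.g. [P1, P2]."""
    valid = [d for d in echo_distances_m if d <= MAX_RANGE_M]
    return max(valid) if valid else None

print(last_echo_distance([4.0, 22.5]))  # fence at 4 m, trees at 22.5 m -> 22.5 (peak P2)
print(last_echo_distance([]))           # no echo received -> None (unmeasurable point)
```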
Here, as shown in FIG. 3(c), when an intruder OBJ appears in front of the background object, a peak P3 corresponding to the reflected light RB3 from the intruder OBJ appears between the peaks P1 and P2 in the processing result from the processing circuit PROC, as shown in FIG. 3(d). At this measurement point, the peak P3 becomes the last echo instead of the peak P2. The processing circuit PROC compares the processing result of FIG. 3(b) with that of FIG. 3(d), and when a predetermined distance difference arises between the peaks P2 and P3, it can raise an alert. When the intruder OBJ is moving, the position of the peak P3 changes between frames obtained by repeated scanning, so the processing circuit PROC can track the intruder OBJ and obtain its moving direction and speed (moving-object detection). For example, when the position of the peak P3 comes closer than the peak P1, the processing circuit PROC can determine that the intruder OBJ has crossed the fence FS and can issue an alarm or the like via an alarm device (not shown).
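Per pixel, the comparison between FIG. 3(b) and FIG. 3(d) amounts to checking whether the current last-echo distance is closer than the background last-echo distance by more than some threshold. A minimal sketch with an assumed threshold:

```python
DISTANCE_DIFF_THRESHOLD_M = 0.5   # assumed minimum change treated as a new object

def pixel_changed(background_m, current_m):
    """True if this pixel's last echo has moved closer than the background by more
    than the threshold (e.g. the peak P3 appearing in front of the peak P2)."""
    if background_m is None or current_m is None:
        return False               # unmeasurable in either frame: no decision at this pixel
    return (background_m - current_m) > DISTANCE_DIFF_THRESHOLD_M

print(pixel_changed(22.5, 12.0))   # intruder OBJ at 12 m in front of trees at 22.5 m -> True
print(pixel_changed(22.5, 22.4))   # small fluctuation -> False
```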
On the other hand, as shown in FIG. 4, there may be no object other than the fence FS within the monitorable range of the monitoring device, for example when open ground lies beyond the fence FS. FIG. 5 shows such a monitoring space viewed from the side, together with the corresponding processing results of the processing circuit PROC.
In this example too, as a preparation for monitoring, a reference distance image (reference image) is obtained by the monitoring device MD in a state where no moving object such as a person or an animal is present, as shown in FIG. 5(a). In this state, part of the laser spot light beam SB emitted from the monitoring device MD and traveling in the same light projecting/receiving direction is reflected by the fence FS installed on the ground G to become reflected light RB1, while the remainder passes through the fence FS and travels toward infinity, so that only the reflected light beam RB1 returns to the monitoring device MD, as shown in FIG. 5(b). Therefore, when the above-described detection algorithm is used in such a monitoring space, the reflected light beam RB1 becomes the last echo and the background image is based on the fence FS.
In that case, as shown in FIG. 5(c), when an intruder OBJ appears on the far side of the fence FS, a peak P3 corresponding to the reflected light RB3 from the intruder OBJ appears beyond the peak P1 in the processing result from the processing circuit PROC, as shown in FIG. 5(d). The processing circuit PROC compares the processing result of FIG. 5(b) with that of FIG. 5(d); however, if the background image is set with the reflected light RB1 from the fence FS as the last echo, the background-subtraction detection algorithm does not assume that reflected light returns from a monitoring target beyond the last echo, and as a result the processing circuit PROC cannot detect the intruder OBJ on the far side of the fence FS. If the intruder OBJ climbs over the fence FS the processing circuit PROC can detect it, but by then the detection may be too late.
A detection algorithm of the monitoring device MD that can deal with this problem will now be described. FIG. 6 shows a monitoring space similar to that of FIG. 4 viewed from the side, together with the corresponding processing results of the processing circuit PROC.
Here too, as a preparation for monitoring, a reference distance image (reference image) is obtained by the monitoring device MD in a state where no moving object such as a person or an animal is present, as shown in FIG. 6(a). At this time, as in the example of FIG. 5(a), the laser light that has passed through the fence FS travels toward infinity, so the reflected light RB1 is generated only by the fence FS.
In this embodiment, therefore, a mask process is applied to the fence FS so that it is assigned a predetermined distance different from its actual distance. Specifically, since the position (distance and direction) of the fence FS relative to the monitoring device MD is known in advance, the user can, by inputting data to the processing circuit PROC via an interface (not shown), set a mask area MA (which can take an arbitrary three-dimensional shape) that completely surrounds the fence FS in the monitoring space. In that case, the processing circuit PROC forcibly sets the distance of any object (including the fence FS) whose reflection originates within the mask area MA to a predetermined distance, that is, to infinity (or to an unmeasurable point). In other words, the distance obtained from the reflected light of the fence FS within the mask area MA is replaced with the distance from the monitoring device MD to infinity (or to an unmeasurable point). The fence FS can thereby be ignored by the object detection algorithm and does not become an obstacle to object detection. The "predetermined distance" is preferably infinity (or an unmeasurable point), but when a background object exists behind the mask area, the distance of that background object may be used instead.
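The mask process can be pictured as a per-measurement-point substitution: if a point's three-dimensional coordinates fall inside the mask area MA, its distance is overwritten with the predetermined distance (infinity here). The sketch below uses an axis-aligned box for MA purely for brevity; the patent allows an arbitrary three-dimensional shape, and all coordinate values are assumed.

```python
import math

MASK_MIN = (3.5, -10.0, 0.0)   # assumed (x, y, z) lower corner of MA, in metres
MASK_MAX = (4.5,  10.0, 2.5)   # assumed upper corner, enclosing the fence FS

def in_mask(point_xyz):
    return all(lo <= p <= hi for p, lo, hi in zip(point_xyz, MASK_MIN, MASK_MAX))

def masked_distance(point_xyz, measured_distance_m):
    """Replace the distance of any reflection originating inside MA with infinity
    (alternatively, with the distance of a background object behind MA)."""
    return math.inf if in_mask(point_xyz) else measured_distance_m

print(masked_distance((4.0, 0.0, 1.0), 4.1))    # fence point inside MA -> inf (ignored)
print(masked_distance((12.0, 0.0, 1.0), 12.0))  # object beyond the fence -> 12.0 (kept)
```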
Here, an "unmeasurable point" is a point so far from the monitoring device MD that the laser light reflected by the object falls below the detection limit of the photodiode PD. For example, if the unmeasurable point lies 50 m away, the distance of an object in the mask area MA can be replaced with a value of 50 m or more. The peak P1 of the reflected light RB1 of the fence FS, shown by the dotted line in FIG. 6(b), is then shifted beyond the unmeasurable point and is excluded from the background image by the object detection algorithm.
With this setting, when an intruder OBJ appears in the monitoring space as shown in FIG. 6(c), a peak P3 corresponding to the reflected light RB3 from the intruder OBJ appears in the processing result from the processing circuit PROC as shown in FIG. 6(d), and the detection algorithm recognizes it as the last echo. The processing circuit PROC can therefore compare the processing result of FIG. 6(b) with that of FIG. 6(d), determine that a new peak P3 has appeared, and raise an alert. If, for example, the intruder OBJ were inside the mask area MA, there would be a risk of the detection algorithm excluding it from the monitoring targets; however, since the mask area MA can be set arbitrarily in three dimensions, an intruder OBJ trying to climb over the fence FS will protrude above the mask area MA, so the monitoring device MD can detect this and issue an alarm.
The monitoring control of the monitoring device MD using the above detection algorithm will now be described with reference to the drawings. Here, the processing circuit PROC sets the mask area arbitrarily in accordance with the user's designation via an interface (not shown).
FIG. 7 is a flowchart showing the monitoring control performed by the processing circuit PROC of the monitoring device MD. First, in step S101 of FIG. 7, the processing circuit PROC determines whether or not to set a mask area MA. Specifically, when the user inputs the data (three-dimensional coordinates) of a mask area MA enclosing the fence or the like in the monitoring space, the processing circuit PROC determines that the mask area MA is to be set and, in step S102, designates the mask area MA based on the input data. If no mask area data is input, the flow bypasses step S102 and the processing circuit PROC does not set a mask area MA.
 Next, in step S103, the processing circuit PROC scans the laser light from the light projecting/receiving unit of the monitoring device MD, receives the reflected light, and obtains N distance data points (a distance image) for one frame of the monitoring space. The processing circuit PROC then sets the index N = 1 in step S104 and, in step S105, determines from the three-dimensional coordinates of the N-th measurement point whether that point belongs to the mask area MA.
 If the coordinates of the N-th measurement point are determined to lie within the mask area MA, the processing circuit PROC sets the distance of that measurement point to infinity in step S106. If no mask area MA has been set, or if the coordinates of the measurement point do not lie within the mask area MA, the flow bypasses step S106 and the processing circuit PROC keeps the actually computed value as the distance of that measurement point.
 In step S107, the processing circuit PROC determines whether all measurement points in the frame have been checked against the mask area MA. If not, it sets N = N + 1 in step S108 and returns the flow to step S105. If all measurement points in the frame have been checked, the processing circuit PROC further determines, in step S109, whether the user has updated the background image via an interface (not shown). If the background image has been updated, the processing circuit PROC registers a new background image (the current frame) in step S110 and uses it to detect monitored objects. If the background image has not been updated, the flow bypasses step S110.
 After acquiring the distance data, the processing circuit PROC compares the obtained distance image for one frame with the background image in step S111 and determines, in step S112, whether a suspicious object (moving object) has appeared. If it determines that a suspicious object is present, the processing circuit PROC outputs an alarm signal in step S113, displays a warning on a monitor (not shown), returns the flow to step S101, and performs the next laser scan to obtain a distance image for the next frame. If it determines that no suspicious object is present, the processing circuit PROC returns the flow to step S101 without outputting an alarm signal.
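 The following sketch mirrors the control flow of steps S101 to S113 under stated assumptions; it is not the disclosed implementation. The callables scan_one_frame, point_in_mask, get_mask_input, background_update_requested, and raise_alarm are hypothetical stand-ins for the hardware, user-interface, and alarm functions that the flowchart references but does not specify, and the numeric constants are illustrative only.

```python
import numpy as np

def monitoring_loop(scan_one_frame, point_in_mask, raise_alarm,
                    get_mask_input, background_update_requested,
                    infinity_m=1e9, min_change_m=0.5):
    """Sketch of the control flow of FIG. 7 (steps S101-S113)."""
    mask_area = None
    background = None
    while True:
        # S101/S102: set the mask area MA if the user has supplied one
        mask_input = get_mask_input()
        if mask_input is not None:
            mask_area = mask_input

        # S103: scan one frame and obtain N distance data points
        points_xyz, distances_m = scan_one_frame()

        # S104-S108: force every point inside MA to "infinity" (unmeasurable)
        for n in range(len(distances_m)):
            if mask_area is not None and point_in_mask(points_xyz[n], mask_area):
                distances_m[n] = infinity_m

        # S109/S110: register the current frame as the new background image
        # (the None guard for the very first frame is a practical addition)
        if background is None or background_update_requested():
            background = distances_m.copy()

        # S111/S112: compare this frame's distance image with the background
        new_object = (background - distances_m) > min_change_m
        if np.any(new_object):
            raise_alarm()  # S113: output an alarm signal, show a warning
        # flow returns to S101 for the next frame
```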
 The present invention is not limited to the embodiments described in this specification; it will be apparent to those skilled in the art from the embodiments and technical ideas described herein that other embodiments and modifications are included. The description and the embodiments are for illustration only, and the scope of the present invention is indicated by the claims below. For example, not only a fence but also a glass plate or the like that transmits part of the laser light can be enclosed in a mask area. Further, by rotating the wall WL to which the monitoring device MD is attached about a vertical axis, the laser light emitted from the light projecting/receiving unit can be directed in all directions over 360°. A mask area can also be set when a fence or the like is present in front of a background object that lies within the measurable range.
CL       Collimating lens
CS       Housing
CSa      Upper wall
CSb      Lower wall
CSc      Side wall
CSd      Opening
FL       Frame
FS       Fence
G        Ground
LD       Semiconductor laser
LPS      Emitting unit
LS       Lens
M1, M2   Mirror surfaces
MA       Mask area
MD       Monitoring device
MT       Motor
MTa      Shaft
MU       Mirror unit
OBJ      Intruder
PD       Photodiode
PROC     Processing circuit
RB1, RB2, RB3  Reflected light
RO       Rotation axis
RPS      Light receiving unit
SB       Laser spot light
TR       Transparent plate
WL       Wall

Claims (6)

  1.  A monitoring system comprising:
     a light projecting/receiving unit including an emitting unit that emits a light beam, a scanning unit that scans the light beam within a monitoring space, and a light receiving unit that receives the light beam reflected from an object in the monitoring space; and
     a processing unit that measures a distance to the object by processing a signal from the light projecting/receiving unit,
     wherein the processing unit sets a reference background in the monitoring space and is capable of measuring, as a monitored object, the distance to an object that is closer to the light projecting/receiving unit than the reference background in the same projecting/receiving direction, and
     wherein the processing unit is capable of setting a mask area in an arbitrary range of the monitoring space and, when reflected light from an object existing within the mask area is detected, sets the distance of the object from which the reflected light was detected to a predetermined distance.
  2.  The monitoring system according to claim 1, wherein the predetermined distance is the distance to the reference background, infinity, or an unmeasurable point.
  3.  The monitoring system according to claim 1, wherein the processing unit recognizes the object that reflected the highest-order reflected light as the reference background or as the object to be detected.
  4.  The monitoring system according to any one of claims 1 to 3, wherein the object in the mask area is a fence.
  5.  The monitoring system according to any one of claims 1 to 4, wherein the processing unit treats the monitoring range as a three-dimensional space and is capable of setting the mask area three-dimensionally within the monitoring range.
  6.  The monitoring system according to any one of claims 1 to 5, wherein the processing unit detects, as a moving object, the monitored object whose position has changed between a temporally preceding scan of the light beam by the scanning unit and a temporally succeeding scan.
PCT/JP2017/017465 2016-05-13 2017-05-09 Surveillance system WO2017195754A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018517020A JP6988797B2 (en) 2016-05-13 2017-05-09 Monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016096995 2016-05-13
JP2016-096995 2016-05-13

Publications (1)

Publication Number Publication Date
WO2017195754A1 true WO2017195754A1 (en) 2017-11-16

Family

ID=60267765

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/017465 WO2017195754A1 (en) 2016-05-13 2017-05-09 Surveillance system

Country Status (2)

Country Link
JP (1) JP6988797B2 (en)
WO (1) WO2017195754A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05215963A (en) * 1992-01-31 1993-08-27 Canon Inc Range finder

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1063818A (en) * 1996-08-23 1998-03-06 Toyo Denki Kk Object detecting device
US20110043806A1 (en) * 2008-04-17 2011-02-24 Avishay Guetta Intrusion warning system
JP2011185762A (en) * 2010-03-09 2011-09-22 Denso Wave Inc Security system
WO2016002776A1 (en) * 2014-07-03 2016-01-07 三菱電機株式会社 Monitoring apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017207365A (en) * 2016-05-18 2017-11-24 株式会社デンソーアイティーラボラトリ Computation processing device, computation processing method, and program
JP2019105550A (en) * 2017-12-13 2019-06-27 コニカミノルタ株式会社 Object detection device, control method and control program for object detection device
JP7020096B2 (en) 2017-12-13 2022-02-16 コニカミノルタ株式会社 Object detection device, control method of object detection device, and control program of object detection device

Also Published As

Publication number Publication date
JP6988797B2 (en) 2022-01-05
JPWO2017195754A1 (en) 2019-03-22

Similar Documents

Publication Publication Date Title
US7164116B2 (en) Monitor for intrusion detection
CN102253392B (en) Time of flight camera unit and Optical Surveillance System
US8420998B2 (en) Target detecting and determining method for detecting and determining target based on height information and storage medium for storing program executing target detecting and determining method
JPH02504440A (en) Intrusion detection method
WO1999003080A1 (en) Intruder detector system
JP2019101000A (en) Distance measurement point group data measurement system and control program
NL1028743C1 (en) Motion detection method for building security systems, by comparing measured distance between moving object and optical device with reference value
WO2020105527A1 (en) Image analysis device, image analysis system, and control program
JP3011121B2 (en) Security system
WO2017195754A1 (en) Surveillance system
JP2017215642A (en) Monitoring system
WO2017199785A1 (en) Monitoring system setting method, and monitoring system
JP6863035B2 (en) Intrusion monitoring methods, intrusion monitoring programs, and intrusion monitoring devices
JP6825624B2 (en) Monitoring system
JP7020096B2 (en) Object detection device, control method of object detection device, and control program of object detection device
WO2020008685A1 (en) Information notification device, program for information notification device, and information notification system
EP4174810A1 (en) Operating a scanning smoke detector
WO2017195755A1 (en) Surveillance system
JP6835079B2 (en) Monitoring system
US10580144B2 (en) Method and system for tracking holographic object
JP2017125765A (en) Object detection device
US10605917B2 (en) Optical-scanning-type object detection device having a mirror surface to be inclined in a direction crossing a rotation axis
CN110839131A (en) Synchronization control method, synchronization control device, electronic equipment and computer readable medium
JP2000233029A (en) Fire evacuation guide system and fire smoke detector used in the same
US20230342952A1 (en) Method for coordinative measuring by terrestrial scanning with image-based interference detection of moving objects

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018517020

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17796115

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17796115

Country of ref document: EP

Kind code of ref document: A1