WO2019093372A1 - Object detecting system and object detecting program - Google Patents

Object detecting system and object detecting program

Info

Publication number
WO2019093372A1
WO2019093372A1 (PCT application PCT/JP2018/041335)
Authority
WO
WIPO (PCT)
Prior art keywords
display
unit
object detection
monitoring area
processing unit
Prior art date
Application number
PCT/JP2018/041335
Other languages
French (fr)
Japanese (ja)
Inventor
生田目晃志
Original Assignee
Konica Minolta, Inc.
Priority date
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2019552348A (patent JP7244802B2)
Publication of WO2019093372A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51 - Display arrangements
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10 - Scanning systems
    • G02B26/12 - Scanning systems using multifaceted mirrors
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/181 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems

Definitions

  • The present invention relates to an object detection system and an object detection program for detecting return light from an object while scanning a light beam.
  • In such object detection systems, a so-called three-dimensional lidar (3D-LiDAR), which detects distance and direction with high accuracy, is used.
  • When a three-dimensional lidar is used, three-dimensional information is obtained, so it is desirable that detection results and monitoring results also be displayed three-dimensionally.
  • It is also common to display detection results only for a required monitoring area, in which case it is desirable that the monitoring area can be easily confirmed, set, and corrected.
  • The present invention has been made in view of the above problems of the background art, and an object of the invention is to provide an object detection system that facilitates setting and grasping a monitoring area even when distance detection is performed in three dimensions, and an object detection program applied to such a system.
  • To achieve at least one of the above objects, an object detection system reflecting one aspect of the present invention includes: a distance measurement unit that detects reflected light while scanning a light beam and measures distance from the propagation time; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that causes the display to present three-dimensional and two-dimensional displays of the object detected by the object detection unit; and an area setting unit that sets, from the area measured by the distance measurement unit, the monitoring area to be monitored by the object detection unit, by accepting operation of the input/output unit using the display while either the two-dimensional display or the three-dimensional display corresponding to a predetermined gaze direction is shown on the display.
  • To achieve at least one of the above objects, an object detection program reflecting one aspect of the present invention runs on a control device that controls an object detection system including: a distance measurement unit that detects reflected light while scanning a light beam and measures distance from the propagation time; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that causes the display to present three-dimensional and two-dimensional displays of the detected object; and an area setting unit that sets, from the area measured by the distance measurement unit, the monitoring area to be monitored by the object detection unit.
  • FIG. 3A is a perspective view explaining the monitoring area set in the measured area, and FIG. 3B is a view showing an example of a three-dimensional display of the monitoring area.
  • FIGS. 4 and 5 are diagrams explaining the operation of the object detection system of FIG. 1.
  • FIG. 6 is a diagram explaining a method of setting the monitoring area.
  • FIGS. 7A to 7D are diagrams explaining display examples on the display.
  • An object detection system 100 shown in FIG. 1 includes a laser radar unit 21, a support unit 23, and a control device 80.
  • the laser radar unit 21 is a distance measurement unit that detects the presence of a detection target and the distance to the detection target, and measures the distance from the propagation time while scanning the light beam with the rotating scanning mirror 53a.
  • the laser radar unit (distance measurement unit) 21 includes a light projection system 51, a light reception system 52, a rotation reflection unit 53, a drive circuit 55, and an exterior component 56.
  • the light projecting system 51, the light receiving system 52, and the rotary reflecting portion 53 constitute a scanning optical system 59.
  • the light projection system 51 emits a laser beam L1 that is the source of the light beam or the light projection beam to a scanning mirror 53a of the rotation reflection unit 53 described later.
  • the light projection system 51 has a light source 51a that generates a laser beam L1 set in the infrared or other wavelength range.
  • the light receiving system 52 receives the return light L2, i.e., the reflected light or light beam from the detection object OB that enters through the optical window 56a of the exterior component 56 and is reflected by the scanning mirror 53a of the rotary reflection unit 53.
  • the light receiving system 52 has a light receiving element 52a having, for example, six pixels in the vertical sub-scanning direction in order to detect the return light L2.
  • when a detection target OB such as an object is in the detection area, the laser beam (projected beam) L1 emitted from the laser radar unit 21 is reflected by the detection target OB, and a part of the reflected light enters the light receiving system 52 via the scanning mirror 53a in the laser radar unit 21 as return light (reflected light) L2.
  • the rotation reflection unit 53 includes a scanning mirror 53a and a rotation drive unit 53b.
  • the scanning mirror 53a is a double reflection type polygon mirror, and has a first reflecting portion 53i and a second reflecting portion 53j for bending an optical path.
  • the first and second reflecting portions 53i and 53j are respectively disposed above and below along the rotation axis RX extending in parallel to the z direction.
  • the first and second reflecting portions 53i and 53j have a pyramidal shape.
  • the inclination angles of the reflecting surfaces of the first and second reflecting portions 53i and 53j gradually change with the rotational position of the scanning mirror 53a (in the illustrated example, positions facing four azimuths in units of 90°); for the specific shape of the first and second reflecting portions 53i and 53j, see WO 2014/168137.
  • the reflecting surface of the first reflecting portion 53i reflects the laser beam (projected beam) L1 incident from the +y direction (rightward on the drawing) into a substantially orthogonal direction (+z, upward on the drawing) and guides it to the mirror surface of the second reflecting portion 53j.
  • the mirror surface of the second reflecting portion 53j reflects the laser beam L1 incident from below on the drawing into a substantially orthogonal direction and guides it toward the detection target OB on the right of the drawing.
  • a part of return light (reflected light) L2 reflected by the detection target OB follows a path reverse to the path of the laser light L1, and is detected by the light receiving system 52.
  • that is, the scanning mirror 53a reflects the return light L2 from the detection target OB again at the mirror surface of the second reflecting portion 53j and guides it to the mirror surface of the first reflecting portion 53i; the return light L2 is then reflected once more at the mirror surface of the first reflecting portion 53i and guided to the light receiving system 52 side.
  • when the scanning mirror 53a rotates, the traveling direction of the laser beam L1 changes within the plane orthogonal to the vertical z-axis direction (that is, the xy plane); in other words, the laser beam L1 is scanned around the z axis as the scanning mirror 53a rotates.
  • the angular region scanned by the laser beam L1 is the detection area.
  • in the traveling direction of the projected laser beam L1, the opening angle with respect to the +z-axis direction is the light projection angle, and the angle formed in the xy plane between the traveling direction of the laser beam L1 at the scanning start point and that at the scanning end point is the irradiation angle.
  • the light projection angle and the irradiation angle form a projection field corresponding to the detection area; since the projection field shifts in four steps in the vertical direction according to the 90° rotational positions of the scanning mirror 53a, the overall projection field has four times the vertical spread of the field achieved by a single scan.
  • the drive circuit 55 controls the operation of the light source 51a of the light projection system 51, the light receiving element 52a of the light reception system 52, the rotation drive unit 53b of the rotation reflection unit 53, and the like. Further, the drive circuit 55 obtains object information of the detection object OB from the electric signal obtained by converting the return light L2 incident on the light receiving element 52a of the light receiving system 52. Specifically, when the output signal from the light receiving element 52a is equal to or higher than a predetermined threshold value, the drive circuit 55 determines that the light receiving element 52a receives the return light L2 from the detection target OB. In this case, the distance to the detection object OB is obtained from the difference between the light emission timing of the light source 51a and the light reception timing of the light receiving element 52a.
  • further, based on the light receiving position of the return light L2 on the light receiving element 52a in the sub-scanning direction and the rotation angle of the scanning mirror 53a corresponding to the main scanning direction, azimuth information on the detection object OB in the main and sub scanning directions can be obtained.
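  • As a concrete illustration of this time-of-flight principle, the following Python sketch derives the distance from the emission/reception timing difference and assembles a polar-coordinate measurement; the function names and the threshold value are illustrative assumptions, not taken from the patent.

```python
# Minimal time-of-flight sketch (names and threshold are assumptions).
C = 299_792_458.0  # speed of light [m/s]

def distance_from_timing(t_emit: float, t_receive: float) -> float:
    """Distance from the emission/reception timing difference.

    The beam travels to the object and back, hence the factor 1/2.
    """
    return C * (t_receive - t_emit) / 2.0

def polar_measurement(t_emit, t_receive, scan_angle_rad, elevation_rad,
                      signal_level, threshold=0.5):
    """Mirror the drive circuit's decision: report (r, theta, phi) only
    when the received signal is at or above the threshold."""
    if signal_level < threshold:
        return None  # judged as no return light from a target
    r = distance_from_timing(t_emit, t_receive)
    theta = elevation_rad  # polar angle, from the sub-scanning position
    phi = scan_angle_rad   # azimuth, from the mirror's rotation angle
    return (r, theta, phi)
```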
  • the exterior component 56 is for covering and protecting the internal components of the laser radar unit 21.
  • the support unit 23 not only supports the laser radar unit 21 but also has a function of adjusting the orientation or attitude of the laser radar unit 21 under the control of the control device 80.
  • the support unit 23 may also be configured so that, when the whole assembly including the support unit 23 tilts, it adjusts the attitude of the laser radar unit 21 under the control of the control device 80, maintaining the laser radar unit 21 in its pre-tilt state.
  • the control device 80 includes an input/output unit 81 serving as an interface with the operator, an arithmetic processing unit 82 that performs arithmetic processing on data and controls external devices based on a program, a storage unit 83 that stores external data, arithmetic processing results, and the like, and a communication unit 84 for communicating with external devices.
  • the input / output unit 81 includes an operation unit 81a that receives an instruction from the operator, and a display 81b that presents the processing result of the arithmetic processing unit 82 to the operator.
  • the operation unit 81a has a keyboard, a mouse and the like, and can make the progress state of the program executed by the control device 80 reflect the intention of the operator.
  • the operation unit 81a receives, for example, an operation by which the operator designates a monitoring area within the area being measured.
  • the operation unit 81a may be like a touch panel attached to the display 81b.
  • the display 81b is a display device such as an LCD that enables two-dimensional or three-dimensional display, but may be a head mounted display that enables three-dimensional or stereoscopic viewing.
  • the arithmetic processing unit 82 has an arithmetic unit such as a CPU (Central Processing Unit) and attached circuits such as interface circuits, and executes an object detection program comprising various processes such as distance measurement, display of detection points, setting and display of the monitoring area, clustering, object extraction, noise removal, and alarm processing.
  • the arithmetic processing unit 82, as an object detection unit, detects objects from the distance information obtained by the laser radar unit 21, which is the distance measurement unit. As a display processing unit, it also causes the display 81b to present a three-dimensional or two-dimensional display of the detected object or detection target OB; that is, the arithmetic processing unit 82 can display the detection points obtained by distance measurement three-dimensionally or two-dimensionally on the display 81b.
  • the arithmetic processing unit 82, as an area setting unit, sets the monitoring area to be monitored by the laser radar unit 21 within the measured area by receiving instructions from the operator via the operation unit 81a and the display 81b.
  • the arithmetic processing unit 82 receives an instruction from the operator via the operation unit 81a and the display 81b, thereby changing the outline of the monitoring area by expansion, reduction, or the like.
  • the arithmetic processing unit 82 can accept an arbitrary contour shape for the monitoring area.
  • the user can freely set a three-dimensional monitoring area via the arithmetic processing unit 82 according to the purpose or the environment.
  • the arithmetic processing unit 82 can adjust the arrangement and the number of monitoring areas.
  • the arithmetic processing unit 82 can combine the plurality of monitoring areas into one monitoring area.
  • the arithmetic processing unit 82, as a monitoring unit, records or reports the presence of a moving body when the arithmetic processing unit (object detection unit) 82 detects one within the monitoring area; a moving body that has entered the monitoring area can thereby be recorded or reported.
  • when the arithmetic processing unit 82 detects a moving body, the three-dimensional behavior of the moving body within the monitoring area can be grasped.
  • FIG. 3A is a diagram for explaining setting and display of a monitoring area.
  • the measurement area contains detection points (not shown) obtained by one measurement operation of the laser radar unit 21 (a scan of the entire area).
  • the measurement data giving the detection points are originally polar-coordinate data, but are converted into the XYZ orthogonal coordinate system.
  • in the illustrated example, two monitoring areas SA1 and SA2 are set in the measurement area represented in the orthogonal coordinate system; setting the monitoring areas SA1 and SA2 enables efficient monitoring focused on the required areas.
  • a frame-like projected image PI1, obtained by projecting the monitoring area SA1 onto the XY plane serving as a predetermined reference plane, corresponds to the outline of the monitoring area SA1 in plan view and to a first-pattern display image shown two-dimensionally on the display 81b.
  • a frame-like projected image PI2, obtained by projecting the monitoring area SA1 onto the XZ plane serving as a predetermined reference plane, corresponds to the outline of the monitoring area SA1 in front view and to a second-pattern display image shown two-dimensionally on the display 81b.
  • these projected images PI1 and PI2 are parallel projections of the monitoring area SA1 viewed from directions parallel to the Z axis and the Y axis, respectively.
  • FIG. 3B shows a perspective projection image of the monitoring area SA1 viewed from the viewpoint EO of FIG. 3A, which corresponds to a three-dimensional display displayed on the display 81b.
  • the monitoring area SA1 is displayed as a projected image PI3 which is reduced toward the distance.
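  • As a rough illustration of the projections PI1 and PI2, the sketch below models a monitoring area as an axis-aligned box (an assumption; the patent allows arbitrary contour shapes) and computes its frame-like outlines on the XY and XZ reference planes.

```python
# Parallel projections of a box-shaped monitoring area (illustrative sketch).
from dataclasses import dataclass

@dataclass
class BoxArea:
    x_min: float; x_max: float
    y_min: float; y_max: float
    z_min: float; z_max: float

    def outline_xy(self):
        """PI1: frame-like outline in plan view (projection along the Z axis)."""
        return [(self.x_min, self.y_min), (self.x_max, self.y_min),
                (self.x_max, self.y_max), (self.x_min, self.y_max)]

    def outline_xz(self):
        """PI2: frame-like outline in front view (projection along the Y axis)."""
        return [(self.x_min, self.z_min), (self.x_max, self.z_min),
                (self.x_max, self.z_max), (self.x_min, self.z_max)]

sa1 = BoxArea(0.0, 4.0, 1.0, 3.0, 0.0, 2.0)
print(sa1.outline_xy())  # the frame drawn two-dimensionally on the display
```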
  • the storage unit 83 stores an object detection program and various data necessary for its execution. Further, when the arithmetic processing unit 82 determines that the alarm target is present in the monitoring area, the storage unit 83 records various information such as the state of the alarm target and the time. Furthermore, the storage unit 83 sequentially records data on the object extracted by the object detection program, and enables the arithmetic processing unit 82 to monitor the movement state of the object.
  • the communication unit 84 enables communication between the arithmetic processing unit 82 and the laser radar unit 21 or the support unit 23, allowing the arithmetic processing unit 82 to take in data from the laser radar unit 21 and the like, and to transmit commands to the laser radar unit 21.
  • the arithmetic processing unit 82 of the control device 80 operates the laser radar unit 21 to start capturing measurement data including distance information and the like (step S11).
  • the laser radar unit 21 outputs polar-coordinate measurement data (r, θ, φ), or source data from which such measurement data are derived, and the arithmetic processing unit 82 converts the polar-coordinate measurement data (r, θ, φ) into orthogonal measurement data (X, Y, Z). Here, r is the distance to the detected object, θ is the polar angle, and φ is the azimuth angle.
  • in this conversion, a coordinate transformation that compensates for the tilt of the attitude of the laser radar unit 21 is also possible.
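  • A minimal sketch of this conversion is shown below; the axis convention (θ measured from the +Z axis, φ in the XY plane) and the roll/pitch form of the tilt compensation are assumptions, since the patent does not spell them out.

```python
import numpy as np

def polar_to_cartesian(r, theta, phi):
    """Convert polar measurement data (r, theta, phi) into (X, Y, Z).

    Assumes theta is the polar angle from the +Z axis and phi the
    azimuth in the XY plane (a common convention; an assumption here).
    """
    x = r * np.sin(theta) * np.cos(phi)
    y = r * np.sin(theta) * np.sin(phi)
    z = r * np.cos(theta)
    return np.array([x, y, z])

def compensate_tilt(p, roll, pitch):
    """Rotate a point to compensate for a tilted sensor attitude (sketch)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])  # roll about X
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])  # pitch about Y
    return ry @ rx @ p

p = polar_to_cartesian(10.0, np.deg2rad(80), np.deg2rad(30))
print(compensate_tilt(p, roll=np.deg2rad(2), pitch=np.deg2rad(-1)))
```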
  • the arithmetic processing unit 82 causes the display 81b to display the detection points at which objects were detected, based on the measurement data including distance information received from the laser radar unit 21 (step S12). At this time, the arithmetic processing unit 82 can display the group of detection points obtained by distance measurement three-dimensionally on the display 81b, and the detection points can be colored according to distance from the viewpoint or the like. The three-dimensionally displayed detection points may be subjected to processing such as clustering and noise removal, as described later.
  • the detection points shown on the display 81b are not limited to three-dimensional display and may be displayed two-dimensionally.
  • the three-dimensional display will now be described.
  • a central projection method is used for the three-dimensional display, and arithmetic processing projects the many three-dimensionally arranged detection points onto a two-dimensional plane.
  • let the viewpoint vector be V_E and the view-center vector be V_O, and, considering the placement of the viewpoint relative to the view center, let the azimuth angle be α and the elevation angle be β.
  • the coordinates PP(a, b, c) of a detection point as seen from the viewpoint are obtained by a translation of the origin and rotations of the coordinate axes about the Z and Y axes, and are given by the corresponding relational expressions. The coordinate values b and c form a parallel projection onto the plane orthogonal to the line of sight; scaling them down by the distance from the viewpoint gives DP1(b/(-a), c/(-a)), the coordinates of the central projection.
  • using the coordinates DP1, a central projection display of the detection points, i.e., a three-dimensional display, becomes possible.
  • the coordinates DP2(b, c), for which the reduction with distance is not applied, are parallel-projection coordinates; using these, a two-dimensional display corresponding to an arbitrary viewpoint can be performed.
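  • The view transformation and the two projections can be sketched as follows; the rotation matrices stand in for the patent's unreproduced relational expressions, so their sign conventions are assumptions.

```python
import numpy as np

def view_transform(p, v_e, alpha, beta):
    """World point p -> view coordinates PP(a, b, c).

    v_e is the viewpoint V_E; alpha and beta are the azimuth and
    elevation of the line of sight relative to the view center.
    """
    t = p - v_e  # translation of the origin to the viewpoint
    rz = np.array([[np.cos(-alpha), -np.sin(-alpha), 0],
                   [np.sin(-alpha),  np.cos(-alpha), 0],
                   [0, 0, 1]])  # rotation about the Z axis
    ry = np.array([[np.cos(-beta), 0, np.sin(-beta)],
                   [0, 1, 0],
                   [-np.sin(-beta), 0, np.cos(-beta)]])  # rotation about the Y axis
    return ry @ rz @ t  # PP = (a, b, c)

def central_projection(pp):
    """DP1 = (b/(-a), c/(-a)): perspective projection, reduced with distance.

    Assumes a < 0, i.e., the point lies in front of the viewpoint.
    """
    a, b, c = pp
    return (b / (-a), c / (-a))

def parallel_projection(pp):
    """DP2 = (b, c): parallel projection, without reduction by distance."""
    _, b, c = pp
    return (b, c)
```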
  • the arithmetic processing unit 82 checks whether a monitoring area has been set by referring to the storage unit 83 (step S13); if a monitoring area has been set (Y in step S13) and there is a processing request to change the monitoring area (Y in step S14), setting processing of the monitoring area is performed (step S15).
  • the arithmetic processing unit 82 asks the operator, via the input/output unit 81, whether to set a monitoring area and which setting method to use, and receives the operator's selection (step S51). At this time, the arithmetic processing unit 82 accepts input through a GUI (Graphical User Interface) using the operation unit 81a and the display 81b of the input/output unit 81.
  • the basic shapes prepared in advance include, for example, a rectangular parallelepiped, a cylinder, and a sphere; the posture and size of a basic figure can be changed, and a plurality of figures can be combined.
  • the outer edge of the monitoring area can also be defined by estimating the surface of an obstacle from static detection points and calculating an approximate surface extending a predetermined distance from that obstacle surface.
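  • The combinable basic shapes could be modeled as in the sketch below, each with a common containment test and a union that merges several figures into one monitoring area; the class names and interface are illustrative assumptions.

```python
# Hedged sketch: basic monitoring-area figures and their combination.
import math
from dataclasses import dataclass

class Shape:
    def contains(self, x, y, z) -> bool:
        raise NotImplementedError

@dataclass
class Box(Shape):  # rectangular parallelepiped
    x0: float; x1: float; y0: float; y1: float; z0: float; z1: float
    def contains(self, x, y, z):
        return (self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1
                and self.z0 <= z <= self.z1)

@dataclass
class Cylinder(Shape):  # vertical axis assumed
    cx: float; cy: float; radius: float; z0: float; z1: float
    def contains(self, x, y, z):
        return (self.z0 <= z <= self.z1
                and math.hypot(x - self.cx, y - self.cy) <= self.radius)

@dataclass
class Sphere(Shape):
    cx: float; cy: float; cz: float; radius: float
    def contains(self, x, y, z):
        return math.dist((x, y, z), (self.cx, self.cy, self.cz)) <= self.radius

class UnionArea(Shape):
    """A plurality of figures combined into a single monitoring area."""
    def __init__(self, *shapes: Shape):
        self.shapes = shapes
    def contains(self, x, y, z):
        return any(s.contains(x, y, z) for s in self.shapes)

area = UnionArea(Box(0, 4, 0, 2, 0, 2), Cylinder(5, 1, 1.0, 0, 2))
print(area.contains(4.8, 1.0, 1.0))  # True: inside the cylindrical part
```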
  • the arithmetic processing unit 82 requests the operator to select the display method of the monitoring area using the input / output unit 81, and receives the operator's selection (step S52).
  • as the display method of the monitoring area, three-dimensional or two-dimensional display can be performed as described above; in the case of three-dimensional display, camera images corresponding to the viewpoint can be superimposed or displayed side by side. Furthermore, the operator can shift or switch the viewpoint of the three-dimensional or two-dimensional display using the input/output unit 81.
  • when the monitoring area is set while a three-dimensional display is shown, the monitoring area is observed three-dimensionally, making it easy to grasp its spatial arrangement, and the intended monitoring area can be set quickly.
  • when the monitoring area is set while the display 81b shows a two-dimensional display, the monitoring area is projected onto each reference plane with little distortion, and the arrangement of the monitoring area becomes relatively accurate.
  • the arithmetic processing unit 82 receives the setting of the monitoring area intended by the operator using the input / output unit 81 (step S53), and stores the set monitoring area in the storage unit 83.
  • the operator can set the monitoring area in the measurement area by, for example, (1) selecting one or more basic shapes from a tool area provided outside the measurement area in which the display 81b shows the detection points, (2) selecting outer-edge candidates of the monitoring area additionally displayed in the measurement area of the display 81b, or (3) selecting a tool for drawing points or planes.
  • the monitoring area set by various methods as described above is displayed together with the detection point in the measurement area.
  • the arithmetic processing unit 82 allows the operator to move the position of the monitoring area or increase or decrease the size.
  • FIGS. 7A and 7B show specific examples of a three-dimensional display on the display 81b, in which a horizontally long, rectangular frame-like monitoring area is displayed together with the detection points in the measurement area.
  • FIG. 7A corresponds to a viewpoint observing the monitoring area from the front, and FIG. 7B to a viewpoint observing it from above.
  • the displays of FIGS. 7A and 7B are not only shown individually on the display 81b but can also be shown side by side on it.
  • FIGS. 7C and 7D show specific examples of a two-dimensional display on the display 81b, in which a horizontally long, rectangular monitoring area is displayed together with the detection points in the measurement area.
  • FIG. 7C corresponds to a viewpoint observing the monitoring area from the front, and FIG. 7D to a viewpoint observing it from above.
  • the displays of FIGS. 7C and 7D are not only shown individually on the display 81b but can also be shown side by side on it.
  • the monitoring area, once determined, is inherited even if the display on the display 81b is switched from the three-dimensional display to the two-dimensional display at the operator's request. That is, the arithmetic processing unit 82 maintains the setting of the monitoring area before and after the display switch and switches the three-dimensional display of the determined monitoring area to the corresponding two-dimensional display. Conversely, when the monitoring area is set in the two-dimensional display, the determined monitoring area is inherited even if the two-dimensional display is switched to the three-dimensional display at the operator's request.
  • in the above, the three-dimensional display and the two-dimensional display are switched, but the two can also be shown side by side on the display 81b.
  • the arithmetic processing unit 82 confirms whether the operator desires correction of the monitoring area using the input / output unit 81 (step S54).
  • the arithmetic processing unit 82 requests the operator to select the display method of the monitoring area using the input / output unit 81, and receives the selection of the operator (step S55).
  • the arithmetic processing unit 82 receives correction of the monitoring area desired by the operator using the input / output unit 81 (step S56), and stores the corrected monitoring area in the storage unit 83.
  • correction of the monitoring area includes partially deleting the monitoring area set in step S53, extending it by adding an additional area, deleting the entire monitoring area, and adding a new monitoring area at a different place.
  • the arithmetic processing unit 82 sets the display method of the set or corrected monitoring area (step S57). That is, for the monitoring area set in step S53 or corrected in step S56 and displayed on the display 81b, the operator can input and set the line thickness, line color, line type such as broken or solid line, and other attributes using the input/output unit 81, and the result is stored in the storage unit 83.
  • the arithmetic processing unit 82 operates the display 81b in a display mode according to the settings (step S16). That is, when the group of detection points is displayed three-dimensionally on the display 81b from, for example, a predetermined viewpoint in step S12, the group of detection points is again displayed three-dimensionally from that viewpoint, and the frame of the monitoring area set in step S15 is superimposed on it.
  • the arithmetic processing unit 82 performs clustering from the latest measurement data (step S17), and stores the result in the storage unit 83.
  • clustering is processing that groups detection points into subsets by connecting adjacent measurement points and the like, in order to obtain size and outline information of targets.
  • Clustering can be performed on measurement data (r, ⁇ , ⁇ ) in polar coordinates or measurement data (X, Y, Z) in an orthogonal coordinate system.
  • processing such as connection of a plurality of obtained clusters can be added.
  • adjacent clusters can be connected by expanding a region of one to several pixels or a distance corresponding thereto around the cluster and narrowing the periphery of the obtained cluster by the corresponding number of pixels or distance.
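  • As a rough sketch of such clustering, the code below groups (X, Y, Z) detection points into connected components using a single distance threshold; this single-link approach is an assumption, since the patent does not fix a particular algorithm.

```python
import numpy as np

def cluster_points(points: np.ndarray, max_gap: float) -> list:
    """Subset detection points by connecting adjacent points.

    points: (N, 3) array of (X, Y, Z) detection points.
    max_gap: points closer than this are treated as connected
             (single-link connected components; an illustrative choice).
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        stack, members = [seed], [seed]
        while stack:
            i = stack.pop()
            near = [j for j in unvisited
                    if np.linalg.norm(points[i] - points[j]) < max_gap]
            for j in near:
                unvisited.remove(j)
                stack.append(j)
                members.append(j)
        clusters.append(points[members])
    return clusters
```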
  • the arithmetic processing unit 82 performs various arithmetic processing on each cluster obtained by the clustering in step S17 to determine the position and size of each cluster (step S18).
  • the position of a cluster for example, the average position or the center of gravity of detection points or pixel points constituting the cluster can be used.
  • the size of a cluster for example, a volume in a region connecting the outer edges of detection points or pixel points constituting the cluster, an area projected on an XY plane, an area projected on an XZ plane, or the like can be used.
  • the arithmetic processing unit 82 performs noise determination on the clusters with the additional information obtained in step S18, removing those of small size in view of their dimensions, and selects objects worthy of attention (step S19). That is, the arithmetic processing unit 82 determines a cluster larger than the noise level to be a foreground object, labels the objects extracted in this way, and stores them in the storage unit 83.
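  • The per-cluster position, size, and noise determination of steps S18 and S19 might look like the following sketch; the axis-aligned bounding box is a simple stand-in for the outline-based size measures in the text, and the noise threshold is a placeholder.

```python
import numpy as np

def cluster_features(cluster: np.ndarray) -> dict:
    """Position and size of one cluster, an (M, 3) array of points."""
    mins, maxs = cluster.min(axis=0), cluster.max(axis=0)
    ext = maxs - mins
    return {
        "position": cluster.mean(axis=0),           # average of member points
        "volume": float(ext[0] * ext[1] * ext[2]),  # bounding-box volume
        "area_xy": float(ext[0] * ext[1]),          # projection onto the XY plane
        "area_xz": float(ext[0] * ext[2]),          # projection onto the XZ plane
    }

def remove_noise(clusters, min_volume=0.05):
    """Step S19: keep only clusters larger than the noise level
    (the 0.05 m^3 threshold is an invented placeholder)."""
    return [c for c in clusters if cluster_features(c)["volume"] >= min_volume]
```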
  • the arithmetic processing unit 82 extracts the clusters existing in the monitoring area set in step S15 from the clusters obtained in step S18 (step S21), and stores the result in the storage unit 83.
  • the arithmetic processing unit 82 performs tracking on the clusters extracted as being within the monitoring area (step S22). Specifically, the arithmetic processing unit 82 causes the display 81b to show each extracted cluster with a surrounding mark or a quadrangular-prism frame. The movement of an extracted cluster can also be captured as a trajectory, in which case the identity of the cluster is determined from its shape and size. In this tracking, not only the detection points constituting clusters extracted as within the monitoring area, but also the detection points constituting above-noise-level clusters obtained outside the monitoring area, can be displayed on the display 81b. Vehicles, buildings, and the like appear as clusters outside the monitoring area, and displaying the detection points constituting these clusters three-dimensionally on the display 81b makes it easier to identify clusters or objects inside and outside the monitoring area.
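  • Steps S21 and S22 could be sketched as below, reusing cluster_features and the UnionArea idea from the earlier sketches; judging membership by the cluster centroid and matching identity by bounding volume are both illustrative assumptions.

```python
def extract_in_area(clusters, area):
    """Step S21: keep clusters whose position lies inside the monitoring area.

    `area` is any object with a contains(x, y, z) test, such as the
    UnionArea sketched earlier; membership by centroid is an assumption.
    """
    kept = []
    for c in clusters:
        x, y, z = cluster_features(c)["position"]
        if area.contains(x, y, z):
            kept.append(c)
    return kept

def match_identity(prev_clusters, curr_clusters, tol=0.3):
    """Step S22 (sketch): pair clusters across frames whose bounding
    volumes agree within a relative tolerance, to follow a trajectory."""
    pairs = []
    for i, c in enumerate(curr_clusters):
        vc = cluster_features(c)["volume"]
        for j, p in enumerate(prev_clusters):
            vp = cluster_features(p)["volume"]
            if vp > 0 and abs(vc - vp) / vp < tol:
                pairs.append((j, i))  # (previous index, current index)
                break
    return pairs
```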
  • the arithmetic processing unit 82 determines whether the focused target tracked in step S22 is a warning target (step S23). For example, in the case of monitoring a person or a car, if the size of the cluster is different from these targets, it is determined that the target of interest is not a warning target. In addition, the trajectory of the cluster and the moving speed can also be used to determine whether or not an alarm is to be made.
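  • One possible form of the warning decision in step S23 is sketched below; the size range and speed limit are invented placeholders for whatever criteria an operator would configure for people or cars.

```python
def is_alarm_target(features: dict, speed: float,
                    vol_range=(0.02, 2.0), max_speed=10.0) -> bool:
    """Step S23 (sketch): warn only when the cluster size matches the
    monitored targets and the moving speed is plausible.

    vol_range [m^3] and max_speed [m/s] are illustrative placeholders.
    """
    size_ok = vol_range[0] <= features["volume"] <= vol_range[1]
    speed_ok = 0.0 < speed <= max_speed
    return size_ok and speed_ok
```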
  • when it is determined that an alarm target is present (Y in step S23), the arithmetic processing unit 82 performs alarm processing and recording processing (step S24). As alarm processing, the arithmetic processing unit 82 notifies the operator, using the display 81b and a speaker (not shown), that an alarm target has appeared in the monitoring area, and also notifies an external system via the communication unit 84. As recording processing, the arithmetic processing unit 82 stores in the storage unit 83 the time at which the cluster corresponding to the alarm target appeared in the monitoring area and features such as the cluster's size and shape.
  • if there is no instruction to end the processing (N in step S25), the arithmetic processing unit 82 returns to step S13 and repeats the processing regardless of whether an alarm target is present.
  • when no monitoring area has been set, the arithmetic processing unit 82 accepts a processing request from the operator to start setting (step S34); if there is no processing request for setting a monitoring area (N in step S34), the processing continues without a monitoring area as follows.
  • the arithmetic processing unit 82 operates the display 81b in the display mode according to the setting, as in the process at step S16 (step S36).
  • the arithmetic processing unit 82 performs clustering on the latest measurement data without reference to a monitoring area (step S37), determines the position and size of each obtained cluster (step S38), and performs noise determination to remove small clusters (step S39). In this case, even if a cluster is detected it is not treated as an alarm target, but the detection result is displayed two-dimensionally or three-dimensionally.
  • as described above, the arithmetic processing unit 82 as the area setting unit sets the monitoring area to be monitored by the arithmetic processing unit (object detection unit) 82 by accepting operation of the input/output unit 81 using the display 81b while the display 81b shows either the two-dimensional display or the three-dimensional display corresponding to a predetermined gaze direction. Because the user performs the monitoring-area setting operation on a two-dimensional display or on a three-dimensional display from a predetermined direction, the arrangement of the monitoring area becomes relatively accurate. In particular, the setting operation based on the two-dimensional display remains simple even for users unaccustomed to the operation.
  • although the present invention has been described above with reference to an embodiment, the present invention is not limited to the above embodiment.
  • the structure and the number of the laser radar units 21 are merely examples, and distance measurement units of various structures can be used.
  • the method of clustering is not limited to the above, and various methods can be adopted with regard to setting of the near range and the close range, the connection method of the pixel points, and the like.
  • in the above, the detection points within the measurement area are displayed on the display 81b when setting the monitoring area, but instead of real-time detection points, it is also possible to display detection points measured in the past, or a display based on CAD data or the like of the local space.

Abstract

Provided is an object detecting system that facilitates setting and recognition of a monitoring area even in three-dimensional distance detection. An object detecting system 100 comprises: a laser radar unit 21 as a distance measuring unit that detects reflected light while scanning a light beam and measures distance on the basis of the propagation time of the light beam; an arithmetic processing unit (object detecting unit) 82 that detects an object on the basis of the distance information obtained by the distance measuring unit; an input/output unit 81 including a display 81b; an arithmetic processing unit (display processing unit) 82 that controls the display 81b to show a three-dimensional image and a two-dimensional image of the object detected by the arithmetic processing unit (object detecting unit) 82; and an arithmetic processing unit (area setting unit) 82 that sets, in response to operation of the input/output unit 81 using the display 81b while the display 81b shows a two-dimensional image corresponding to a predetermined sight-line direction, the monitoring areas SA1 and SA2 to be monitored by the arithmetic processing unit (object detecting unit) 82, out of the area whose distances are measured by the distance measuring unit.

Description

Object detection system and object detection program
 The present invention relates to an object detection system and an object detection program for detecting return light from an object while scanning a light beam.
 In object detection systems, a so-called three-dimensional lidar (3D-LiDAR), which detects distance and direction with high accuracy, is used. When a three-dimensional lidar is used, three-dimensional information is obtained, so it is desirable that detection results and monitoring results also be displayed three-dimensionally. It is also common to display detection results only for a required monitoring area, in which case it is desirable that the monitoring area can be easily confirmed, set, and corrected.
 As an object detection system, one that uses a two-dimensional lidar and sets a detection area or monitoring area within the maximum detection distance is known (Patent Document 1). In that system, because the scan surface is a plane as a consequence of using a two-dimensional lidar, the detection area or monitoring area is set on the scan plane, which was sufficient.
 However, a three-dimensional lidar has a plurality of scan planes, so setting a monitoring area for each scan plane as in Patent Document 1 not only takes time but also makes it difficult to grasp the monitoring area spatially. Moreover, since the distance information obtained by a three-dimensional lidar is not planar, it is inappropriate to treat the monitoring area as a stack of planes.
Patent Document 1: JP 2009-93428 A
 The present invention has been made in view of the above problems of the background art, and an object of the invention is to provide an object detection system that facilitates setting and grasping a monitoring area even when distance detection is performed in three dimensions, and an object detection program applied to such a system.
 To achieve at least one of the above objects, an object detection system reflecting one aspect of the present invention includes: a distance measurement unit that detects reflected light while scanning a light beam and measures distance from the propagation time; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that causes the display to present three-dimensional and two-dimensional displays of the object detected by the object detection unit; and an area setting unit that sets, from the area measured by the distance measurement unit, the monitoring area to be monitored by the object detection unit, by accepting operation of the input/output unit using the display while either the two-dimensional display or the three-dimensional display corresponding to a predetermined gaze direction is shown on the display.
 To achieve at least one of the above objects, an object detection program reflecting one aspect of the present invention runs on a control device that controls an object detection system including: a distance measurement unit that detects reflected light while scanning a light beam and measures distance from the propagation time; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that causes the display to present three-dimensional and two-dimensional displays of the detected object; and an area setting unit that sets, from the area measured by the distance measurement unit, the monitoring area to be monitored by the object detection unit.
FIG. 1 is a diagram explaining an object detection system according to an embodiment of the present invention. FIG. 2 is a schematic diagram explaining the structure of the laser radar unit constituting the object detection system of FIG. 1. FIG. 3A is a perspective view explaining the monitoring area set in the measured area, and FIG. 3B is a view showing an example of a three-dimensional display of the monitoring area. FIGS. 4 and 5 are diagrams explaining the operation of the object detection system of FIG. 1. FIG. 6 is a diagram explaining a method of setting the monitoring area. FIGS. 7A to 7D are diagrams explaining display examples on the display.
 Hereinafter, an object detection system according to an embodiment of the present invention will be described with reference to FIG. 1 and the like.
 An object detection system 100 shown in FIG. 1 includes a laser radar unit 21, a support unit 23, and a control device 80.
 An example of the structure of the laser radar unit 21 will be described with reference to FIG. 2. The laser radar unit 21 is a distance measurement unit that detects the presence of a detection target and the distance to it, measuring the distance from the propagation time while scanning a light beam with the rotating scanning mirror 53a. The laser radar unit (distance measurement unit) 21 includes a light projection system 51, a light receiving system 52, a rotary reflection unit 53, a drive circuit 55, and an exterior component 56. Of these, the light projection system 51, the light receiving system 52, and the rotary reflection unit 53 constitute a scanning optical system 59.
 The light projection system 51 emits a laser beam L1, which is the source of the light beam or projected beam, toward the scanning mirror 53a of the rotary reflection unit 53 described later. The light projection system 51 has a light source 51a that generates the laser beam L1 set in the infrared or another wavelength range.
 The light receiving system 52 receives the return light L2, i.e., the reflected light or light beam from the detection target OB that enters through the optical window 56a of the exterior component 56 and is reflected by the scanning mirror 53a of the rotary reflection unit 53. To detect the return light L2, the light receiving system 52 has a light receiving element 52a with, for example, six pixels in the vertical sub-scanning direction. When a detection target OB such as an object is in the detection area, the laser beam (projected beam) L1 emitted from the laser radar unit 21 is reflected by the detection target OB, and a part of the reflected light enters the light receiving system 52 via the scanning mirror 53a in the laser radar unit 21 as return light (reflected light) L2.
 The rotary reflection unit 53 has a scanning mirror 53a and a rotation drive unit 53b. The scanning mirror 53a is a twice-reflecting polygon mirror having a first reflecting portion 53i and a second reflecting portion 53j for bending the optical path. The first and second reflecting portions 53i and 53j are disposed above and below each other along the rotation axis RX extending parallel to the z direction, and have pyramidal shapes. The inclination angles of their reflecting surfaces gradually change with the rotational position of the scanning mirror 53a (in the illustrated example, positions facing four azimuths in units of 90°); for the specific shape of the first and second reflecting portions 53i and 53j, see WO 2014/168137.
 The reflecting surface of the first reflecting portion 53i reflects the laser beam (projected beam) L1 incident from the +y direction (rightward on the drawing) into a substantially orthogonal direction (+z, upward on the drawing) and guides it to the mirror surface of the second reflecting portion 53j. The mirror surface of the second reflecting portion 53j reflects the laser beam L1 incident from below on the drawing into a substantially orthogonal direction and guides it toward the detection target OB on the right of the drawing. A part of the return light (reflected light) L2 reflected by the detection target OB follows the reverse of the path of the laser beam L1 and is detected by the light receiving system 52. That is, the scanning mirror 53a reflects the return light L2 from the detection target OB again at the mirror surface of the second reflecting portion 53j and guides it to the mirror surface of the first reflecting portion 53i, where the return light L2 is reflected once more and guided to the light receiving system 52 side.
 When the scanning mirror 53a rotates, the traveling direction of the laser beam L1 changes within the plane orthogonal to the vertical z-axis direction (that is, the xy plane); in other words, the laser beam L1 is scanned around the z axis as the scanning mirror 53a rotates. The angular region scanned by the laser beam L1 is the detection area. In the traveling direction of the projected laser beam L1, the opening angle with respect to the +z-axis direction is the light projection angle, and the angle formed in the xy plane between the traveling direction of the laser beam L1 at the scanning start point and that at the scanning end point is the irradiation angle. The light projection angle and the irradiation angle form a projection field corresponding to the detection area. Since the projection field shifts in four steps in the vertical direction according to the 90° rotational positions of the scanning mirror 53a, the overall projection field has four times the vertical spread of the field achieved by a single scan.
 The drive circuit 55 controls the operation of the light source 51a of the light projection system 51, the light receiving element 52a of the light receiving system 52, the rotation drive unit 53b of the rotary reflection unit 53, and the like. The drive circuit 55 also obtains object information on the detection target OB from the electric signal produced by converting the return light L2 incident on the light receiving element 52a. Specifically, when the output signal of the light receiving element 52a is equal to or above a predetermined threshold, the drive circuit 55 judges that the light receiving element 52a has received return light L2 from the detection target OB. In this case, the distance to the detection target OB is obtained from the difference between the light emission timing of the light source 51a and the light reception timing of the light receiving element 52a. Further, based on the light receiving position of the return light L2 in the sub-scanning direction on the light receiving element 52a and the rotation angle of the scanning mirror 53a corresponding to the main scanning direction, azimuth information on the detection target OB in the main and sub scanning directions can be obtained.
 外装部品56は、レーザーレーダーユニット21の内蔵部品を覆い、保護するためのものである。 The exterior component 56 is for covering and protecting the internal components of the laser radar unit 21.
 図1に戻って、支持部23は、レーザーレーダーユニット21を支持するだけでなく、制御装置80の制御下でレーザーレーダーユニット21の向き又は姿勢を調整する機能を有する。なお、支持部23は、支持部23等を含む全体が傾斜した場合に、制御装置80の制御下でレーザーレーダーユニット21の姿勢を調整し、レーザーレーダーユニット21を傾斜前の状態に維持するようなものであってもよい。 Returning to FIG. 1, the support unit 23 not only supports the laser radar unit 21 but also has a function of adjusting the orientation or attitude of the laser radar unit 21 under the control of the control device 80. The supporting unit 23 adjusts the posture of the laser radar unit 21 under the control of the control device 80 when the whole including the supporting unit 23 is inclined, and maintains the laser radar unit 21 in the state before the inclination. It may be
 制御装置80は、オペレーターとのインターフェースである入出力部81と、プログラムに基づいてデータ等に対する演算処理、外部装置の制御等を行う演算処理部82と、外部からのデータ、演算処理結果等を保管する記憶部83と、外部装置と通信するための通信部84とを備える。 The control device 80 has an input / output unit 81 which is an interface with an operator, an arithmetic processing unit 82 which performs arithmetic processing on data etc. based on a program, controls an external device, etc., external data, arithmetic processing results etc. A storage unit 83 for storing data and a communication unit 84 for communicating with an external device are provided.
 入出力部81は、オペレーターからの指示を取り込む操作部81aと、演算処理部82による処理結果をオペレーターに提示するディスプレイ81bとを有する。操作部81aは、キーボード、マウス等を有し、制御装置80で実行されるプログラムの進行状態をオペレーターの意思を反映したものにすることができる。操作部81aは、例えばオペレーターが計測中のエリア又は領域から監視領域を指定する操作を受け付ける。操作部81aは、ディスプレイ81bに付随するタッチパネルのようなものであってもよい。ディスプレイ81bは、2次元的表示又は3次元的表示を可能にするLCD等の表示デバイスであるが、3次元視又は立体視を可能にするヘッドマウントディスプレイであってもよい。 The input / output unit 81 includes an operation unit 81a that receives an instruction from the operator, and a display 81b that presents the processing result of the arithmetic processing unit 82 to the operator. The operation unit 81a has a keyboard, a mouse and the like, and can make the progress state of the program executed by the control device 80 reflect the intention of the operator. The operation unit 81a receives, for example, an operation of designating a monitoring area from an area or area being measured by the operator. The operation unit 81a may be like a touch panel attached to the display 81b. The display 81 b is a display device such as an LCD that enables two-dimensional display or three-dimensional display, but may be a head mounted display that enables three-dimensional viewing or stereoscopic viewing.
 演算処理部82は、CPU(Central Processing Unit)等の演算部、インターフェース回路等の付属回路を有しており、距離計測、検出点の表示、監視領域の設定・表示、クラスタリング、対象抽出、ノイズ除去処理、警報処理等の各種工程を含む物体検出プログラムを実行する。 The arithmetic processing unit 82 has an arithmetic unit such as a central processing unit (CPU) and an attached circuit such as an interface circuit, and performs distance measurement, display of detection points, setting / display of a monitoring area, clustering, object extraction, noise An object detection program including various processes such as removal processing and alarm processing is executed.
 具体的には、演算処理部82は、物体検知部として、距離計測部であるレーザーレーダーユニット21によって得た距離情報から物体を検知する。また、演算処理部82は、表示処理部として、ディスプレイ81bに検知した物体又は検出対象OBの3次元的な表示又は2次元的な表示を行わせる。つまり、演算処理部82は、距離計測によって得られた検出点をディスプレイ81bにおいて3次元的に表示させたり2次元的に表示させたりすることができる。演算処理部82は、領域設定部として、操作部81a及びディスプレイ81bを介してオペレーターからの指示を受け付けることによって、計測エリア又は計測領域のうち、レーザーレーダーユニット21による監視処理の対象となる監視領域を設定する。また、演算処理部82は、領域設定部として、操作部81a及びディスプレイ81bを介してオペレーターからの指示を受け付けることによって、監視領域の輪郭を拡張、縮小等によって変更する。演算処理部82は、監視領域の輪郭形状を任意形状として受け付けることができる。これにより、ユーザーは、演算処理部82を介して3次元的な監視領域を目的や環境に応じて自在に設定することができる。また、演算処理部82は、監視領域の配置及び個数を調整することができる。また、演算処理部82は、監視領域が複数設定されている場合に、当該複数の監視領域を組み合わせて1つの監視領域とできる。また、演算処理部82は、監視部として、演算処理部(物体検知部)82が監視領域内で動体を検知した場合に動体の存在を記録又は発報する。これにより、監視領域に進入した動体を記録し或いは通報することができる。演算処理部82が動体を検知することにより、監視領域において動体の3次元的な挙動を把握することができる。 Specifically, the arithmetic processing unit 82 detects an object from the distance information obtained by the laser radar unit 21 which is a distance measurement unit as an object detection unit. Further, the arithmetic processing unit 82 causes the display 81 b to perform three-dimensional display or two-dimensional display of the detected object or the detection target OB as a display processing unit. That is, the processing unit 82 can display the detection points obtained by the distance measurement three-dimensionally or two-dimensionally on the display 81 b. The arithmetic processing unit 82 serves as an area setting unit, and by receiving an instruction from the operator via the operation unit 81a and the display 81b, a monitoring area to be monitored by the laser radar unit 21 in the measurement area or the measurement area. Set Further, the arithmetic processing unit 82, as an area setting unit, receives an instruction from the operator via the operation unit 81a and the display 81b, thereby changing the outline of the monitoring area by expansion, reduction, or the like. The arithmetic processing unit 82 can receive the contour shape of the monitoring area as an arbitrary shape. Thus, the user can freely set a three-dimensional monitoring area via the arithmetic processing unit 82 according to the purpose or the environment. In addition, the arithmetic processing unit 82 can adjust the arrangement and the number of monitoring areas. In addition, when a plurality of monitoring areas are set, the arithmetic processing unit 82 can combine the plurality of monitoring areas into one monitoring area. In addition, the arithmetic processing unit 82, as a monitoring unit, records or issues the presence of a moving body when the arithmetic processing unit (object detecting unit) 82 detects a moving body in the monitoring area. Thereby, it is possible to record or report the moving body which has entered the monitoring area. When the arithmetic processing unit 82 detects a moving body, the three-dimensional behavior of the moving body can be grasped in the monitoring area.
 FIG. 3A is a diagram explaining the setting and display of monitoring areas. The measurement area contains the detection points (not shown) obtained by one measurement operation of the laser radar unit 21 covering the entire region. The measurement data giving the detection points are originally polar-coordinate data but are converted into an XYZ orthogonal coordinate system. In the illustrated example, two monitoring areas SA1 and SA2 are set within the measurement area expressed in the orthogonal coordinate system; setting the monitoring areas SA1 and SA2 enables efficient monitoring focused on the regions that matter. A frame-like projected image PI1, obtained by projecting the monitoring area SA1 onto the XY plane serving as a predetermined reference plane, corresponds to the contour of the monitoring area SA1 in plan view and constitutes a first pattern displayed two-dimensionally on the display 81b. Likewise, a frame-like projected image PI2, obtained by projecting the monitoring area SA1 onto the XZ plane serving as a predetermined reference plane, corresponds to the contour of the monitoring area SA1 in front view and constitutes a second pattern displayed two-dimensionally on the display 81b. These projected images PI1 and PI2 are parallel projections of the monitoring area SA1 viewed from directions parallel to the Z axis and the Y axis, respectively.
 FIG. 3B shows a perspective projection image of the monitoring area SA1 viewed from the viewpoint EO in FIG. 3A, corresponding to the three-dimensional display presented on the display 81b. In this case, even if the display itself is a flat panel, the shape and arrangement of the detected objects and the monitoring area are easy to grasp three-dimensionally. As is clear from the figure, the monitoring area SA1 is displayed as a projected image PI3 that shrinks with distance.
 Returning to FIG. 1, the storage unit 83 stores the object detection program and the various data necessary for its execution. When the arithmetic processing unit 82 determines that an alarm target exists within the monitoring area, the storage unit 83 also records information such as the state of the alarm target and the time. Furthermore, the storage unit 83 sequentially records data on the targets extracted by the object detection program, enabling the arithmetic processing unit 82 to monitor the movement of those targets.
 The communication unit 84 enables communication between the arithmetic processing unit 82 and the laser radar unit 21 or the support unit 23, allowing the arithmetic processing unit 82 to take in data from the laser radar unit 21 and the like and to transmit commands to the laser radar unit 21.
 Hereinafter, execution of an object detection method or object detection program using the object detection system 100 shown in FIG. 1 will be described with reference to FIGS. 4 and 5.
 First, the arithmetic processing unit 82 of the control device 80 operates the laser radar unit 21 and starts acquiring measurement data including distance information (step S11). The laser radar unit 21 outputs polar-coordinate measurement data (r, θ, φ), or the raw data from which such measurement data is derived, and the arithmetic processing unit 82 converts the polar-coordinate measurement data (r, θ, φ) into measurement data (X, Y, Z) in an orthogonal coordinate system and stores the result in the storage unit 83. Specifically, the conversion from polar coordinates to the orthogonal coordinate system is performed using the well-known relations

X = r · sinθ · cosφ
Y = r · sinθ · sinφ
Z = r · cosθ

where r is the distance to the detected target, θ is the polar angle, and φ is the azimuth angle. A coordinate transformation that also compensates for the tilt of the attitude of the laser radar unit 21 is possible at this stage.
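As a minimal sketch of this polar-to-Cartesian conversion (not part of the original disclosure; the function name, the use of NumPy, and the optional tilt-compensation matrix are illustrative assumptions):

```python
import numpy as np

def polar_to_cartesian(r, theta, phi, tilt=None):
    """Convert polar measurement data (r, theta, phi) to (X, Y, Z).

    theta is the polar angle and phi the azimuth angle, in radians.
    tilt is an optional 3x3 rotation matrix compensating the sensor attitude.
    """
    r, theta, phi = map(np.asarray, (r, theta, phi))
    xyz = np.stack([r * np.sin(theta) * np.cos(phi),   # X = r·sinθ·cosφ
                    r * np.sin(theta) * np.sin(phi),   # Y = r·sinθ·sinφ
                    r * np.cos(theta)],                # Z = r·cosθ
                   axis=-1)
    if tilt is not None:                               # attitude compensation
        xyz = xyz @ np.asarray(tilt).T
    return xyz

# Example: one point 10 m away, 60 degrees from zenith, 30 degrees azimuth
print(polar_to_cartesian([10.0], [np.pi / 3], [np.pi / 6]))
```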
 Next, the arithmetic processing unit 82 causes the display 81b to show the detection points at which objects were detected, based on the measurement data, including distance information, received from the laser radar unit 21 (step S12). Here the arithmetic processing unit 82 can display the group of detection points obtained by distance measurement three-dimensionally on the display 81b, and the detection points can be colored according to their distance from the viewpoint or the like. The three-dimensionally displayed detection points may already have undergone processing such as clustering and noise removal, described later. The detection points shown on the display 81b are not limited to a three-dimensional display; a two-dimensional display may be used instead.
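As a toy illustration of the distance-based coloring mentioned above (not from the publication; the red-to-blue ramp and the viewpoint parameter are arbitrary choices for this sketch):

```python
import numpy as np

def distance_colors(points, viewpoint=(0.0, 0.0, 0.0)):
    """Assign each 3-D detection point an RGB color by its distance from
    `viewpoint`: near points red, far points blue (linear blend)."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts - np.asarray(viewpoint), axis=1)
    t = (d - d.min()) / max(d.max() - d.min(), 1e-9)   # normalize to [0, 1]
    near, far = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
    return (1.0 - t)[:, None] * near + t[:, None] * far  # shape (N, 3)
```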
 Here, the three-dimensional display will be described. In this embodiment, a central projection technique is used for the three-dimensional display, and arithmetic processing is performed to project the many three-dimensionally arranged detection points onto a two-dimensional plane. In the orthogonal coordinate system, let V_E be the viewpoint vector and V_O the view-center vector, and consider the placement of the viewpoint relative to the view center with azimuth angle α and elevation angle β. The viewpoint-based coordinates PP(a, b, c) of a point can then be regarded as the result of translating the origin and rotating the coordinate axes about the Z axis and the Y axis, and are given by a relational expression provided in the publication only as an equation image (not reproducible here). Of the viewpoint-based coordinates PP(a, b, c), the coordinates DP1(b/(-a), c/(-a)), obtained by taking the coordinate values b and c projected in parallel onto the plane orthogonal to the line of sight and scaling them down with distance, are the central-projection coordinates. By applying image processing such as reduction or enlargement and quantization that fits the central-projection coordinates DP1(b/(-a), c/(-a)) to the screen of the display 81b, a central-projection display of the detection points, that is, a three-dimensional display, becomes possible. The coordinates DP2(b, c), for which no reduction with distance is computed, are parallel-projection coordinates, and using them enables a two-dimensional display corresponding to an arbitrary viewpoint.
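Since the matrix relation survives only as an image, the sketch below reconstructs one standard convention consistent with the prose (translate the origin to the viewpoint V_E, then rotate about Z by the azimuth α and about Y by the elevation β); the exact signs, the function names, and the use of NumPy are assumptions:

```python
import numpy as np

def view_transform(points, v_e, azimuth, elevation):
    """Viewpoint-based coordinates PP(a, b, c): translate the origin to the
    viewpoint v_e, then rotate the axes about Z (azimuth) and Y (elevation).
    The sign conventions here are assumptions."""
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    cb, sb = np.cos(elevation), np.sin(elevation)
    rz = np.array([[ca, sa, 0.0], [-sa, ca, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cb, 0.0, -sb], [0.0, 1.0, 0.0], [sb, 0.0, cb]])
    return (np.asarray(points) - np.asarray(v_e)) @ (ry @ rz).T

def central_projection(pp):
    """DP1 = (b/(-a), c/(-a)): perspective scaling by the distance along
    the line of sight, giving the three-dimensional (central) display."""
    a, b, c = pp[:, 0], pp[:, 1], pp[:, 2]
    return np.stack([b / (-a), c / (-a)], axis=-1)

def parallel_projection(pp):
    """DP2 = (b, c): no reduction with distance (two-dimensional display)."""
    return pp[:, 1:3]
```

Fitting DP1 to the screen then corresponds to the scaling and quantization steps described above, while DP2 yields the two-dimensional display for the same viewpoint.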
 Next, the arithmetic processing unit 82 checks, by referring to the storage unit 83, whether a monitoring area has been set (step S13). If a monitoring area has been set (Y in step S13) and there is a processing request to change it (Y in step S14), the monitoring-area setting process is performed (step S15).
 The monitoring-area setting process will be described with reference to FIG. 6. First, the arithmetic processing unit 82 uses the input/output unit 81 to ask the operator whether a monitoring area should be set and which setting method to use, and accepts the operator's selection (step S51). Here, the arithmetic processing unit 82 accepts input through a GUI (Graphical User Interface) using the operation unit 81a and the display 81b that constitute the input/output unit 81. Possible methods of setting a monitoring area include, for example: (1) applying basic shapes prepared in advance; (2) deriving boundary surfaces from the surfaces of obstacles such as buildings using actual measurement results; and (3) having the operator enter boundary surfaces directly by inputting reference points or reference planes. The basic shapes prepared in advance include, for example, rectangular parallelepipeds, cylinders, and spheres; the pose and size of a basic figure can be changed, and multiple figures can be combined. When actual measurement results are used, the outer edge of the monitoring area can be defined by assuming an obstacle surface from the static detection points and computing an approximate surface extending to a position at a predetermined distance from that surface. When the operator inputs reference points or reference planes, coordinate points can be entered directly, but the input can also be assisted by displaying static detection points or previously prepared reference points on the display 81b. Methods (1) to (3) above need not be used in isolation; they can also be combined, as in the sketch below.
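A minimal sketch of method (1), applying and combining prepared basic shapes; the class names, the axis-aligned box simplification, and the containment API are assumptions for illustration, not the publication's design:

```python
import numpy as np

class Box:
    """Axis-aligned rectangular-parallelepiped monitoring region."""
    def __init__(self, center, size):
        self.lo = np.asarray(center) - np.asarray(size) / 2.0
        self.hi = np.asarray(center) + np.asarray(size) / 2.0
    def contains(self, p):
        p = np.asarray(p)
        return np.all((self.lo <= p) & (p <= self.hi), axis=-1)

class Sphere:
    """Spherical monitoring region."""
    def __init__(self, center, radius):
        self.center, self.radius = np.asarray(center), radius
    def contains(self, p):
        return np.linalg.norm(np.asarray(p) - self.center, axis=-1) <= self.radius

class Union:
    """Combination of basic shapes, e.g. several regions merged into one."""
    def __init__(self, *shapes):
        self.shapes = shapes
    def contains(self, p):
        return np.any([s.contains(p) for s in self.shapes], axis=0)

# A monitoring region built from two combined basic shapes
region = Union(Box(center=(0, 5, 1), size=(4, 6, 2)), Sphere((0, 10, 1), 2.0))
print(region.contains([(0.0, 5.0, 1.0), (9.0, 9.0, 9.0)]))  # [ True False]
```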
 Next, the arithmetic processing unit 82 uses the input/output unit 81 to ask the operator to select a display method for the monitoring area and accepts the selection (step S52). As already described, the monitoring area can be displayed three-dimensionally or two-dimensionally; in the case of three-dimensional display, a camera image corresponding to the viewpoint can also be superimposed or shown side by side. Furthermore, the operator can shift or switch the viewpoint of the three-dimensional or two-dimensional display using the input/output unit 81. When the monitoring area is set while the display 81b shows a three-dimensional view, the monitoring area is observed three-dimensionally and its spatial arrangement is easy to grasp, so the setting operation becomes easier and the intended monitoring area can be set quickly. When the monitoring area is set while the display 81b shows a two-dimensional view, the monitoring area is projected onto each reference plane with little distortion, so its arrangement can be grasped relatively accurately.
 Next, the arithmetic processing unit 82 accepts, via the input/output unit 81, the monitoring area intended by the operator (step S53) and stores the set monitoring area in the storage unit 83. The operator can set a monitoring area within the measurement area by, for example, (1) selecting one or more basic shapes from a tool region provided outside the measurement area in which the display 81b shows the detection points, (2) selecting outer-edge candidates for the monitoring area additionally displayed within the measurement area of the display 81b, or (3) selecting a tool for drawing points and planes. A monitoring area set by any of these methods is displayed together with the detection points within the measurement area. At this point, the arithmetic processing unit 82 allows the operator to move the monitoring area or to increase or decrease its size.
 FIGS. 7A and 7B show specific examples of three-dimensional display on the display 81b, in which a horizontally elongated, rectangular-parallelepiped frame-shaped monitoring area is shown together with the detection points in the measurement area. FIG. 7A uses a viewpoint observing the monitoring area from the front, and FIG. 7B uses a viewpoint observing it from above. The displays of FIGS. 7A and 7B may be shown individually on the display 81b or side by side.
 FIGS. 7C and 7D show specific examples of two-dimensional display on the display 81b, in which a horizontally elongated rectangular monitoring area is shown together with the detection points in the measurement area. FIG. 7C uses a viewpoint observing the monitoring area from the front, and FIG. 7D uses a viewpoint observing it from above. The displays of FIGS. 7C and 7D may be shown individually on the display 81b or side by side.
 In the above, once a monitoring area has been set in the three-dimensional display, the established monitoring area is carried over even if the display 81b is switched from three-dimensional to two-dimensional display at the operator's request. That is, the arithmetic processing unit 82 maintains the monitoring-area setting before and after the switch and converts the three-dimensional display of the established monitoring area into the corresponding two-dimensional display. Conversely, once a monitoring area has been set in the two-dimensional display, the established monitoring area is likewise carried over when switching from two-dimensional to three-dimensional display at the operator's request.
 Although the description above assumes switching between three-dimensional and two-dimensional display, the two can also be shown side by side on the display 81b.
 Next, the arithmetic processing unit 82 uses the input/output unit 81 to ask whether the operator wishes to correct the monitoring area (step S54).
 If the operator wishes to correct the monitoring area, the arithmetic processing unit 82 uses the input/output unit 81 to ask the operator to select a display method for the monitoring area and accepts the selection (step S55).
 Next, the arithmetic processing unit 82 accepts, via the input/output unit 81, the correction of the monitoring area desired by the operator (step S56) and stores the corrected monitoring area in the storage unit 83. Correcting the monitoring area includes partially deleting the monitoring area set in step S53, extending it by adding an additional region, and so on, as well as deleting the entire monitoring area or adding a new monitoring area elsewhere.
 Thereafter, the arithmetic processing unit 82 sets how the set or corrected monitoring area is to be displayed (step S57). That is, the arithmetic processing unit 82 lets the operator use the input/output unit 81 to enter and set information such as line thickness, line color, and line type (broken or solid) for displaying on the display 81b the monitoring area set in step S53 or corrected in step S56, and stores the result in the storage unit 83.
 Returning to FIG. 4, after the monitoring area has been set, the arithmetic processing unit 82 operates the display 81b in the display mode corresponding to the settings (step S16). That is, if the group of detection points was displayed three-dimensionally from a given viewpoint on the display 81b in step S12, the group of detection points is likewise displayed three-dimensionally from that viewpoint, with the frame of the monitoring area set in step S15 superimposed on it.
 Next, the arithmetic processing unit 82 performs clustering on the latest measurement data (step S17) and stores the result in the storage unit 83. Clustering is a process of grouping the detection points into subsets, for example by connecting adjacent measurement points, in order to obtain size and contour information about a target. Clustering can be performed on the polar-coordinate measurement data (r, θ, φ) or on the orthogonal-coordinate measurement data (X, Y, Z). Processing such as merging the resulting clusters can be added to this clustering: for example, neighboring clusters can be joined by dilating each cluster by one to several pixels (or a corresponding distance) and then shrinking the result by the same number of pixels or the same distance.
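One simple way to realize the "connect adjacent points" idea is single-linkage clustering with a fixed neighborhood radius. This sketch is an assumption for illustration (the publication does not fix the criterion); it relies on SciPy's cKDTree for the neighborhood queries, and the radius value is arbitrary:

```python
import numpy as np
from scipy.spatial import cKDTree

def cluster_points(points, radius=0.3):
    """Group detection points into clusters by connecting all pairs of
    points closer than `radius` (single-linkage, breadth-first search)."""
    pts = np.asarray(points)
    tree = cKDTree(pts)
    labels = np.full(len(pts), -1, dtype=int)
    cluster_id = 0
    for seed in range(len(pts)):
        if labels[seed] != -1:          # already assigned to a cluster
            continue
        stack = [seed]
        labels[seed] = cluster_id
        while stack:
            i = stack.pop()
            for j in tree.query_ball_point(pts[i], radius):
                if labels[j] == -1:
                    labels[j] = cluster_id
                    stack.append(j)
        cluster_id += 1
    return labels                       # labels[i] = cluster index of point i
```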
 Next, the arithmetic processing unit 82 performs various computations on each cluster obtained by the clustering in step S17 to determine the position and size of each cluster (step S18). The position of a cluster can be determined, for example, from the average position or centroid of the detection points or pixel points constituting it. The size of a cluster can be determined, for example, from the volume enclosed by the outer edges of its constituent detection points or pixel points, the area of its projection onto the XY plane, the area of its projection onto the XZ plane, and so on.
 Thereafter, the arithmetic processing unit 82 takes the clusters carrying the additional information obtained in step S18 and, considering their sizes, performs a noise judgment that removes the small ones, thereby selecting targets worthy of attention (step S19). That is, the arithmetic processing unit 82 judges a cluster larger than the noise level to be an object ahead, labels the targets extracted in this way, and stores them in the storage unit 83.
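A compact sketch of steps S18 and S19 together (again an illustration, not the publication's implementation; the bounding-rectangle area approximation and the threshold values are assumptions):

```python
import numpy as np

def cluster_stats(points, labels):
    """Per-cluster centroid and footprint areas projected onto the XY and
    XZ planes (bounding-rectangle approximation of the projections)."""
    pts = np.asarray(points)
    stats = {}
    for cid in np.unique(labels):
        c = pts[labels == cid]
        ext = c.max(axis=0) - c.min(axis=0)          # extents along X, Y, Z
        stats[cid] = {
            "centroid": c.mean(axis=0),              # average position
            "area_xy": ext[0] * ext[1],              # projection on XY plane
            "area_xz": ext[0] * ext[2],              # projection on XZ plane
            "n_points": len(c),
        }
    return stats

def drop_noise(stats, min_area_xy=0.05, min_points=5):
    """Discard clusters too small to be a target (thresholds are assumed)."""
    return {cid: s for cid, s in stats.items()
            if s["area_xy"] >= min_area_xy and s["n_points"] >= min_points}
```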
 Next, the arithmetic processing unit 82 extracts, from the clusters obtained in step S18, those that lie within the monitoring area set in step S15 (step S21) and stores the result in the storage unit 83.
 Next, the arithmetic processing unit 82 tracks the clusters extracted as lying within the monitoring area (step S22). Specifically, the arithmetic processing unit 82 displays each extracted cluster on the display 81b with a mark around it or with a rectangular-prism frame. The movement of an extracted cluster can also be captured as a trajectory; in that case, the identity of the extracted cluster is judged from its shape and size. In this tracking, the display 81b can show not only the detection points constituting the clusters extracted within the monitoring area but also the detection points constituting clusters above the noise level obtained outside it. Vehicles, buildings, and the like appear as clusters outside the monitoring area, and if the detection points constituting these clusters are displayed three-dimensionally on the display 81b, clusters or targets inside and outside the monitoring area become easier to distinguish.
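The extraction of step S21 can reuse a `contains` test like the one in the earlier region sketch; the frame-to-frame identity judgment can then be sketched as a greedy nearest-centroid association (a deliberately simple stand-in for the shape-and-size matching described above; the gating distance and the dictionary-based track store are assumptions):

```python
import numpy as np

def associate(prev_tracks, centroids, max_jump=0.5):
    """Greedily match this frame's cluster centroids to existing tracks by
    nearest distance; `max_jump` is an assumed gating threshold for deciding
    that two detections are the same object."""
    tracks, used = dict(prev_tracks), set()
    for tid, last in prev_tracks.items():
        d = [np.linalg.norm(np.asarray(c) - last) if i not in used else np.inf
             for i, c in enumerate(centroids)]
        if d and min(d) <= max_jump:
            i = int(np.argmin(d))
            used.add(i)
            tracks[tid] = np.asarray(centroids[i])   # extend this trajectory
    next_id = max(tracks, default=-1) + 1
    for i, c in enumerate(centroids):
        if i not in used:                            # unmatched: start a track
            tracks[next_id] = np.asarray(c)
            next_id += 1
    return tracks

# Two frames of a single target drifting slightly: it keeps track id 0
tracks = {}
for frame_centroids in ([(0.0, 5.0, 1.0)], [(0.1, 5.2, 1.0)]):
    tracks = associate(tracks, frame_centroids)
print(tracks)
```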
 Next, the arithmetic processing unit 82 judges whether the target of interest tracked in step S22 is an alarm target (step S23). For example, when monitoring for persons or automobiles, a target of interest whose cluster size differs from that of such objects is judged not to be an alarm target. The trajectory and movement speed of a cluster can also be used in judging whether it is an alarm target.
 When the arithmetic processing unit 82 determines that an alarm target exists (Y in step S23), it performs alarm processing and recording processing (step S24). As the alarm processing, the arithmetic processing unit 82 uses the display 81b and a speaker (not shown) to notify the operator that an alarm target has appeared in the monitoring area, and also notifies an external system via the communication unit 84. As the recording processing, the arithmetic processing unit 82 stores in the storage unit 83 features such as the time at which the cluster corresponding to the alarm target appeared in the monitoring area and the cluster's size and shape.
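A minimal sketch of the recording side of step S24 (the record fields and the print-based notification are placeholders; a real system would drive the display, a speaker, and the communication unit 84):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple

@dataclass
class AlarmRecord:
    """Features stored when an alarm target appears in the monitoring area."""
    time: datetime
    region_id: int
    centroid: Tuple[float, float, float]
    size_xy: float                      # projected footprint of the cluster

log: List[AlarmRecord] = []

def raise_alarm(region_id, centroid, size_xy, notify=print):
    """Notify the operator and keep a record; `print` stands in for the
    display, speaker, and external notification channels."""
    rec = AlarmRecord(datetime.now(), region_id, tuple(centroid), size_xy)
    log.append(rec)
    notify(f"ALARM: target in monitoring area {rec.region_id} at "
           f"{rec.time:%H:%M:%S}, centroid={rec.centroid}, "
           f"size_xy={rec.size_xy:.2f}")
```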
 Regardless of whether an alarm target exists, the arithmetic processing unit 82 returns to step S13 and repeats the processing unless an instruction to end the processing has been given (N in step S25).
 If no monitoring area has been set (N in step S13), the arithmetic processing unit 82 accepts a processing request from the operator to start the setting (step S34); if there is no request to start setting a monitoring area (N in step S34), the arithmetic processing unit 82 operates the display 81b in the display mode corresponding to the settings, as in step S16 (step S36).
 Thereafter, still without a monitoring area and in the same manner as steps S17 to S19, the arithmetic processing unit 82 performs clustering on the latest measurement data (step S37), determines the position and size of each resulting cluster (step S38), and performs the noise judgment that removes small clusters (step S39). In this case, a detected cluster is not treated as an alarm target; the detection result is simply displayed two-dimensionally or three-dimensionally.
 According to the object detection system 100 of the embodiment described above, the arithmetic processing unit 82 serving as the area setting unit sets the monitoring area to be subjected to the monitoring processing of the arithmetic processing unit (object detection unit) 82 by accepting operations of the input/output unit 81 using the display 81b while the display 81b presents either a two-dimensional or a three-dimensional display corresponding to a predetermined viewing direction. The user therefore performs the monitoring-area setting operation on the basis of a two-dimensional or three-dimensional display from a fixed direction, so the arrangement of the monitoring area can be grasped relatively accurately. In particular, setting a monitoring area on the basis of the two-dimensional display is simple even for users unaccustomed to the operation, while setting it on the basis of the three-dimensional display allows the height, width, and depth to be grasped intuitively on a single screen.
 Although the present invention has been described with reference to the embodiment, the present invention is not limited to the embodiment and the like described above. For example, the structure and number of laser radar units 21 are mere examples, and distance measurement units of various structures can be used.
 The clustering method is not limited to the one described above; various techniques can be adopted regarding the setting of neighborhood and proximity ranges, the way pixel points are connected, and so on.
 In the example described above, the detection points within the measurement area are displayed on the display 81b when the monitoring area is set, but instead of real-time detection points, detection points measured in the past or data based on CAD data of the actual site can also be displayed.

Claims (8)

  1.  An object detection system comprising:
     a distance measurement unit that measures distance from propagation time by detecting reflected light while scanning a light beam;
     an object detection unit that detects an object from the distance information obtained by the distance measurement unit;
     an input/output unit including a display;
     a display processing unit that causes the display to perform a three-dimensional display and a two-dimensional display of the object detected by the object detection unit; and
     an area setting unit that sets, out of the area measured by the distance measurement unit, a monitoring area to be subjected to monitoring processing by the object detection unit, by accepting an operation of the input/output unit using the display while the display is made to perform either a two-dimensional display or a three-dimensional display corresponding to a predetermined viewing direction.
  2.  The object detection system according to claim 1, wherein the area setting unit accepts the contour shape of the monitoring area as an arbitrary shape.
  3.  The object detection system according to claim 1 or 2, wherein the area setting unit makes the arrangement and the number of the monitoring areas adjustable.
  4.  The object detection system according to any one of claims 1 to 3, wherein the three-dimensional display presented on the display is a perspective projection image, and the two-dimensional display presented on the display is a parallel projection image projected onto a predetermined reference plane.
  5.  The object detection system according to any one of claims 1 to 4, wherein the object detection unit detects a moving body.
  6.  The object detection system according to claim 5, further comprising a monitoring unit that records or reports the presence of a moving body when the object detection unit detects the moving body within the monitoring area.
  7.  The object detection system according to any one of claims 1 to 6, wherein, when switching between the three-dimensional display and the two-dimensional display, the display processing unit performs a display that carries over the monitoring area set by the area setting unit.
  8.  An object detection program that runs on a control device controlling an object detection system, the system comprising: a distance measurement unit that measures distance from propagation time by detecting reflected light while scanning a light beam; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that causes the display to perform a three-dimensional display and a two-dimensional display of the object detected by the object detection unit; and an area setting unit that sets, out of the area measured by the distance measurement unit, a monitoring area to be subjected to monitoring processing by the object detection unit, by accepting an operation of the input/output unit using the display while the display is made to perform either a two-dimensional display or a three-dimensional display corresponding to a predetermined viewing direction.
PCT/JP2018/041335 2017-11-09 2018-11-07 Object detecting system and object detecting program WO2019093372A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019552348A JP7244802B2 (en) 2017-11-09 2018-11-07 Object detection system and object detection program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-216130 2017-11-09
JP2017216130 2017-11-09

Publications (1)

Publication Number Publication Date
WO2019093372A1 true WO2019093372A1 (en) 2019-05-16

Family

ID=66438805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041335 WO2019093372A1 (en) 2017-11-09 2018-11-07 Object detecting system and object detecting program

Country Status (2)

Country Link
JP (1) JP7244802B2 (en)
WO (1) WO2019093372A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007053812A1 (en) * 2007-11-12 2009-05-14 Robert Bosch Gmbh Video surveillance system configuration module, configuration module monitoring system, video surveillance system configuration process, and computer program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003269915A (en) * 2002-03-13 2003-09-25 Omron Corp Monitor for three-dimensional space
JP2007013814A (en) * 2005-07-01 2007-01-18 Secom Co Ltd Setting apparatus for detection region
JP2007249722A (en) * 2006-03-17 2007-09-27 Hitachi Ltd Object detector

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110927731A (en) * 2019-11-15 2020-03-27 深圳市镭神智能系统有限公司 Three-dimensional protection method, three-dimensional detection device and computer readable storage medium
CN110927731B (en) * 2019-11-15 2021-12-17 深圳市镭神智能系统有限公司 Three-dimensional protection method, three-dimensional detection device and computer readable storage medium

Also Published As

Publication number Publication date
JPWO2019093372A1 (en) 2020-11-19
JP7244802B2 (en) 2023-03-23

Similar Documents

Publication Publication Date Title
US10132611B2 (en) Laser scanner
JP5465128B2 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and point cloud position data processing program
EP2788717B1 (en) Position and orientation determination in 6-dof
US9342890B2 (en) Registering of a scene disintegrating into clusters with visualized clusters
EP1903304B1 (en) Position measuring system, position measuring method and position measuring program
CN110178156A (en) Range sensor including adjustable focal length imaging sensor
JP2012057960A (en) Point group position data processor, point group position data processing method, point group position data processing system, and point group position data processing program
JP7064163B2 (en) 3D information acquisition system
JPH09187038A (en) Three-dimensional shape extract device
JP6955203B2 (en) Object detection system and object detection program
JP7194015B2 (en) Sensor system and distance measurement method
EP3351899A1 (en) Method and device for inpainting of colourised three-dimensional point clouds
JP2010117211A (en) Laser radar installation position verification apparatus, laser radar installation position verification method, and program for laser radar installation position verification apparatus
CN110132129A (en) The system based on augmented reality with circumference attributive function
JP2019144210A (en) Object detection system
WO2019093372A1 (en) Object detecting system and object detecting program
JP2009175012A (en) Measurement device and measurement method
JPWO2017199785A1 (en) Monitoring system setting method and monitoring system
WO2019093371A1 (en) Object detecting system and object detecting program
US9245346B2 (en) Registering of a scene disintegrating into clusters with pairs of scans
JP6895074B2 (en) Object detection system and object detection program
JP7392826B2 (en) Data processing device, data processing system, and data processing method
US20210389430A1 (en) Scanning surveying system
US20230260223A1 (en) Augmented reality alignment and visualization of a point cloud
EP4246184A1 (en) Software camera view lock allowing editing of drawing without any shift in the view

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18875883; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019552348; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 18875883; Country of ref document: EP; Kind code of ref document: A1)