WO2019093372A1 - Object detection method and object detection program - Google Patents

Object detection method and object detection program Download PDF

Info

Publication number
WO2019093372A1
WO2019093372A1 (PCT/JP2018/041335)
Authority
WO
WIPO (PCT)
Prior art keywords
display
unit
object detection
monitoring area
processing unit
Prior art date
Application number
PCT/JP2018/041335
Other languages
English (en)
Japanese (ja)
Inventor
生田目晃志
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority to JP2019552348A, granted as JP7244802B2 (ja)
Publication of WO2019093372A1 (fr)

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/51 Display arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B 26/10 Scanning systems
    • G02B 26/12 Scanning systems using multifaceted mirrors
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/181 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems

Definitions

  • the present invention relates to an object detection system and an object detection program for detecting return light from an object while scanning a light beam.
  • a so-called three-dimensional lidar (3D LiDAR) that detects distance and direction with high accuracy is used.
  • since three-dimensional information is obtained, it is desirable that the detection results and the monitoring results be displayed three-dimensionally.
  • the detection result may also be displayed only for a required monitoring area, in which case it is desirable that the monitoring area can be easily confirmed, set, and corrected.
  • the present invention has been made in view of the above-mentioned problems of the background art, and its object is to provide an object detection system and an object detection program that facilitate the setting and grasping of a monitoring area even when distance detection is performed in three dimensions.
  • an object detection system reflecting one aspect of the present invention includes: a distance measurement unit that detects reflected light while scanning a light beam and measures distance from the propagation time; an object detection unit that detects an object from the distance information obtained by the distance measurement unit; an input/output unit including a display; a display processing unit that performs three-dimensional display and two-dimensional display of the object detected by the object detection unit on the display; and an area setting unit that sets a monitoring area to be monitored by the object detection unit, within the area measured by the distance measurement unit, by accepting an operation of the input/output unit using the display while either the two-dimensional display or the three-dimensional display corresponding to a predetermined gaze direction is shown on the display.
  • an object detection program reflecting one aspect of the present invention operates a controller that functions as: an object detection unit that detects an object from distance information obtained by a distance measurement unit, the distance measurement unit detecting reflected light while scanning a light beam and measuring distance from the propagation time; a display processing unit that performs three-dimensional display and two-dimensional display of the detected object on the display of an input/output unit; and an area setting unit that sets a monitoring area to be monitored by the object detection unit within the area measured by the distance measurement unit.
  • FIG. 3A is a perspective view for explaining the monitoring area set in the area to be measured
  • FIG. 3B is a view showing an example of a three-dimensional display of the monitoring area.
  • other figures explain the operation of the object detection system of FIG. 1.
  • FIGS. 7A to 7D are diagrams explaining display examples on the display.
  • An object detection system 100 shown in FIG. 1 includes a laser radar unit 21, a support unit 23, and a control device 80.
  • the laser radar unit 21 is a distance measurement unit that detects the presence of a detection target and the distance to the detection target, and measures the distance from the propagation time while scanning the light beam with the rotating scanning mirror 53a.
  • the laser radar unit (distance measurement unit) 21 includes a light projection system 51, a light reception system 52, a rotation reflection unit 53, a drive circuit 55, and an exterior component 56.
  • the light projecting system 51, the light receiving system 52, and the rotary reflecting portion 53 constitute a scanning optical system 59.
  • the light projection system 51 emits a laser beam L1 that is the source of the light beam or the light projection beam to a scanning mirror 53a of the rotation reflection unit 53 described later.
  • the light projection system 51 has a light source 51a that generates a laser beam L1 set in the infrared or other wavelength range.
  • the light receiving system 52 receives the return light L2, i.e., the reflected light from the detection object OB that enters through the optical window 56a of the exterior component 56 and is reflected by the scanning mirror 53a of the rotary reflection unit 53.
  • the light receiving system 52 has a light receiving element 52a having, for example, six pixels in the vertical sub-scanning direction in order to detect the return light L2.
  • the laser beam (projected beam) L1 emitted from the laser radar unit 21 is reflected by the detection target OB, and part of the reflected light enters the light receiving system 52 via the scanning mirror 53a in the laser radar unit 21 as return light (reflected light) L2.
  • the rotation reflection unit 53 includes a scanning mirror 53a and a rotation drive unit 53b.
  • the scanning mirror 53a is a double reflection type polygon mirror, and has a first reflecting portion 53i and a second reflecting portion 53j for bending an optical path.
  • the first and second reflecting portions 53i and 53j are respectively disposed above and below along the rotation axis RX extending in parallel to the z direction.
  • the first and second reflecting portions 53i and 53j have a pyramidal shape.
  • the inclination angles of the reflecting surfaces of the first and second reflecting portions 53i and 53j change gradually with the rotational position of the scanning mirror 53a (in the illustrated example, at positions facing four azimuths in 90° steps; for the specific shape of the first and second reflecting portions 53i and 53j, see WO 2014/168137).
  • the reflecting surface of the first reflecting portion 53i reflects the laser beam (projected beam) L1, incident from the +y direction (the right side of the drawing), in a substantially orthogonal direction and guides it to the mirror surface of the second reflecting portion 53j.
  • the mirror surface of the second reflecting portion 53j reflects the laser light L1, incident from below in the drawing, in a substantially orthogonal direction and guides it toward the detection target OB on the right side of the drawing.
  • a part of return light (reflected light) L2 reflected by the detection target OB follows a path reverse to the path of the laser light L1, and is detected by the light receiving system 52.
  • the scanning mirror 53a reflects the return light L2 from the detection target OB at the mirror surface of the second reflecting portion 53j and guides it to the mirror surface of the first reflecting portion 53i; the return light L2 is then reflected again by the mirror surface of the first reflecting portion 53i and guided to the light receiving system 52.
  • the traveling direction of the laser beam L1 changes in a plane (that is, the xy plane) orthogonal to the vertical z-axis direction. That is, the laser beam L1 is scanned around the z axis as the scanning mirror 53a rotates.
  • the angular area scanned by the laser beam L1 is a detection area.
  • the opening angle with respect to the +z-axis direction is the light projection angle, and the angle formed in the xy plane between the traveling direction of the laser light L1 at the scanning start point and that at the scanning end point is the irradiation angle.
  • a light projection field corresponding to the detection area is formed by the light projection angle and the irradiation angle. Since the light projection field shifts in four steps in the vertical direction as the scanning mirror 53a rotates in 90° increments, the overall light projection field has four times the vertical spread of the field achieved by a single scan.
  • the drive circuit 55 controls the operation of the light source 51a of the light projection system 51, the light receiving element 52a of the light reception system 52, the rotation drive unit 53b of the rotation reflection unit 53, and the like. Further, the drive circuit 55 obtains object information of the detection object OB from the electric signal obtained by converting the return light L2 incident on the light receiving element 52a of the light receiving system 52. Specifically, when the output signal from the light receiving element 52a is equal to or higher than a predetermined threshold value, the drive circuit 55 determines that the light receiving element 52a receives the return light L2 from the detection target OB. In this case, the distance to the detection object OB is obtained from the difference between the light emission timing of the light source 51a and the light reception timing of the light receiving element 52a.
  • azimuth information in the main scanning direction and the sub-scanning direction of the detection object OB can also be obtained.
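  • as an illustrative sketch only (function names, the threshold value, and the timing interface are assumptions, not from the specification), the time-of-flight distance calculation described above can be written as:

        # Hedged sketch of the distance calculation from emission/reception timing.
        C = 299_792_458.0  # speed of light [m/s]

        def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
            """One-way distance from the round-trip propagation time."""
            return C * (t_receive_s - t_emit_s) / 2.0

        def is_detection(signal_level: float, threshold: float = 0.5) -> bool:
            """Accept return light only at or above a predetermined threshold."""
            return signal_level >= threshold

        # Example: a pulse returning after 200 ns corresponds to about 30 m.
        print(tof_distance(0.0, 200e-9))  # ~29.98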
  • the exterior component 56 is for covering and protecting the internal components of the laser radar unit 21.
  • the support unit 23 not only supports the laser radar unit 21 but also has a function of adjusting the orientation or attitude of the laser radar unit 21 under the control of the control device 80.
  • when the whole assembly including the support unit 23 is tilted, the support unit 23 may adjust the attitude of the laser radar unit 21 under the control of the control device 80 so as to maintain the laser radar unit 21 in its pre-tilt state.
  • the control device 80 includes an input/output unit 81 serving as an interface with the operator, an arithmetic processing unit 82 that performs arithmetic processing on data based on a program and controls external devices, a storage unit 83 that stores external data, arithmetic processing results, and the like, and a communication unit 84 for communicating with external devices.
  • the input / output unit 81 includes an operation unit 81a that receives an instruction from the operator, and a display 81b that presents the processing result of the arithmetic processing unit 82 to the operator.
  • the operation unit 81a has a keyboard, a mouse, and the like, and allows the operator's intentions to be reflected in the progress of the program executed by the control device 80.
  • the operation unit 81a receives, for example, an operation by which the operator designates a monitoring area within the area being measured.
  • the operation unit 81a may be like a touch panel attached to the display 81b.
  • the display 81b is a display device such as an LCD that enables two-dimensional display or three-dimensional display, but may also be a head-mounted display that enables stereoscopic viewing.
  • the arithmetic processing unit 82 has an arithmetic unit such as a central processing unit (CPU) and attached circuits such as interface circuits, and executes an object detection program comprising processes such as distance measurement, display of detection points, setting and display of a monitoring area, clustering, object extraction, noise removal, and alarm processing.
  • the arithmetic processing unit 82, as an object detection unit, detects an object from the distance information obtained by the laser radar unit 21, which is a distance measurement unit. Further, the arithmetic processing unit 82, as a display processing unit, causes the display 81b to perform three-dimensional display or two-dimensional display of the detected object or the detection target OB. That is, the arithmetic processing unit 82 can display the detection points obtained by distance measurement three-dimensionally or two-dimensionally on the display 81b.
  • the arithmetic processing unit 82 also serves as an area setting unit: by receiving instructions from the operator via the operation unit 81a and the display 81b, it sets a monitoring area to be monitored within the area measured by the laser radar unit 21.
  • the arithmetic processing unit 82 receives an instruction from the operator via the operation unit 81a and the display 81b, thereby changing the outline of the monitoring area by expansion, reduction, or the like.
  • the arithmetic processing unit 82 can receive the contour shape of the monitoring area as an arbitrary shape.
  • the user can freely set a three-dimensional monitoring area via the arithmetic processing unit 82 according to the purpose or the environment.
  • the arithmetic processing unit 82 can adjust the arrangement and the number of monitoring areas.
  • the arithmetic processing unit 82 can combine a plurality of monitoring areas into one monitoring area.
  • the arithmetic processing unit 82, as a monitoring unit, records or reports the presence of a moving body when the arithmetic processing unit (object detection unit) 82 detects one in the monitoring area. A moving body that has entered the monitoring area can thereby be recorded or reported.
  • when the arithmetic processing unit 82 detects a moving body, the three-dimensional behavior of the moving body within the monitoring area can be grasped.
  • FIG. 3A is a diagram for explaining setting and display of a monitoring area.
  • the measurement area contains the detection points (not shown) obtained by one measurement operation (a scan of the entire area) by the laser radar unit 21.
  • the measurement data that yield the detection points are originally in polar coordinates, but they are converted into an XYZ rectangular coordinate system.
  • two monitoring areas SA1 and SA2 are set in the measurement area represented by the orthogonal coordinate system. Setting the monitoring areas SA1 and SA2 enables efficient monitoring focusing on the required area.
  • a frame-like projected image PI1, obtained by projecting the monitoring area SA1 onto the XY plane (a predetermined reference plane), corresponds to the outline of the monitoring area SA1 in plan view and to a first pattern displayed two-dimensionally on the display 81b.
  • a frame-like projected image PI2, obtained by projecting the monitoring area SA1 onto the XZ plane (another predetermined reference plane), corresponds to the outline of the monitoring area SA1 viewed from the front and to a second pattern displayed two-dimensionally on the display 81b.
  • these projected images PI1 and PI2 are parallel projections of the monitoring area SA1 viewed from directions parallel to the Z axis and the Y axis, respectively.
  • FIG. 3B shows a perspective projection image of the monitoring area SA1 viewed from the viewpoint EO of FIG. 3A, which corresponds to a three-dimensional display displayed on the display 81b.
  • the monitoring area SA1 is displayed as a projected image PI3 that shrinks toward the distance.
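  • as an illustration of these projections (a hedged sketch; the box dimensions, the viewpoint, and all names are assumptions for illustration), the corner points of a box-shaped monitoring area can be projected onto the reference planes and toward a viewpoint as follows:

        # Parallel projections like PI1/PI2 and a simple perspective image like PI3.
        import numpy as np

        def box_corners(xmin, xmax, ymin, ymax, zmin, zmax):
            """The 8 corner points of an axis-aligned box (monitoring area)."""
            return np.array([[x, y, z]
                             for x in (xmin, xmax)
                             for y in (ymin, ymax)
                             for z in (zmin, zmax)])

        corners = box_corners(2.0, 6.0, -1.0, 1.0, 0.0, 2.0)
        pi1 = corners[:, [0, 1]]   # drop Z: plan-view outline on the XY plane
        pi2 = corners[:, [0, 2]]   # drop Y: front-view outline on the XZ plane

        # Perspective: divide lateral coordinates by the depth from a viewpoint
        # on the -Y axis, so the image shrinks toward the distance, like PI3.
        eye_y = -10.0
        depth = corners[:, 1] - eye_y
        pi3 = corners[:, [0, 2]] / depth[:, None]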
  • the storage unit 83 stores an object detection program and various data necessary for its execution. Further, when the arithmetic processing unit 82 determines that the alarm target is present in the monitoring area, the storage unit 83 records various information such as the state of the alarm target and the time. Furthermore, the storage unit 83 sequentially records data on the object extracted by the object detection program, and enables the arithmetic processing unit 82 to monitor the movement state of the object.
  • the communication unit 84 enables communication between the arithmetic processing unit 82 and the laser radar unit 21 or the support unit 23, allowing the arithmetic processing unit 82 to take in data from the laser radar unit 21 and the like, and to transmit commands to the laser radar unit 21.
  • the arithmetic processing unit 82 of the control device 80 operates the laser radar unit 21 to start capturing measurement data including distance information and the like (step S11).
  • the laser radar unit 21 outputs polar-coordinate measurement data (r, θ, φ), or the raw data from which such measurement data are derived, and the arithmetic processing unit 82 converts the polar-coordinate measurement data (r, θ, φ) into rectangular-coordinate measurement data (X, Y, Z), where r is the distance to the detected object, θ is the polar angle, and φ is the azimuth angle.
  • coordinate conversion that compensates for the inclination of the attitude of the laser radar unit 21 is also possible.
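  • a minimal sketch of this conversion, assuming the conventional spherical convention with the polar angle θ measured from the +Z axis (the patent gives no explicit formulas, so the convention and names are assumptions):

        import numpy as np

        def polar_to_xyz(r, theta, phi):
            """(r, θ, φ) -> (X, Y, Z); angles in radians."""
            x = r * np.sin(theta) * np.cos(phi)
            y = r * np.sin(theta) * np.sin(phi)
            z = r * np.cos(theta)
            return np.stack([x, y, z], axis=-1)

        # Optional tilt compensation (cf. the attitude correction mentioned
        # above): rotate the converted points about the X axis by the inverse
        # of the unit's tilt angle.
        def compensate_tilt_x(points, tilt_rad):
            c, s = np.cos(-tilt_rad), np.sin(-tilt_rad)
            rot = np.array([[1.0, 0.0, 0.0],
                            [0.0, c, -s],
                            [0.0, s, c]])
            return points @ rot.T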
  • the arithmetic processing unit 82 causes the display 81b to display the detection points at which objects were detected, based on the measurement data including distance information received from the laser radar unit 21 (step S12). The group of detection points obtained by distance measurement can be displayed three-dimensionally on the display 81b, and each detection point can be colored according to its distance from the viewpoint or the like. The three-dimensionally displayed detection points may be subjected to processing such as clustering and noise removal, as described later.
  • the detection points shown on the display 81b are not limited to three-dimensional display; two-dimensional display may also be used.
  • the three-dimensional display will be described.
  • a central projection method is used for the three-dimensional display, and arithmetic processing projects the many three-dimensionally arranged detection points onto a two-dimensional plane.
  • let V_E be the viewpoint vector and V_O the visual-center vector; considering the placement of the viewpoint relative to the visual center, let ψ be the azimuth angle and ξ the elevation angle.
  • the view coordinates PP(a, b, c) of a detection point can be regarded as resulting from a translation of the origin and rotations of the coordinate axes about the Z and Y axes, and are given by the corresponding relational expressions. The coordinates DP1(b/(-a), c/(-a)), obtained by scaling the coordinate values b and c (a parallel projection onto the plane orthogonal to the line of sight) by the distance from the viewpoint, are the coordinates of the central projection.
  • central projection display of the detection points, i.e., three-dimensional display, thus becomes possible.
  • the coordinates DP2(b, c), without the distance-based scaling, are the parallel-projection coordinates, and two-dimensional display corresponding to an arbitrary viewpoint can be performed using them.
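  • a hedged sketch of this projection (the exact sign and axis conventions are assumptions; the patent states the expressions only in outline):

        # Translate to the viewpoint, rotate about Z by the azimuth ψ and about
        # Y by the elevation ξ to get view coordinates PP = (a, b, c), then
        # divide by the depth -a for the central projection DP1.
        import numpy as np

        def view_coords(points, eye, psi, xi):
            rz = np.array([[np.cos(-psi), -np.sin(-psi), 0.0],
                           [np.sin(-psi),  np.cos(-psi), 0.0],
                           [0.0, 0.0, 1.0]])
            ry = np.array([[ np.cos(-xi), 0.0, np.sin(-xi)],
                           [0.0, 1.0, 0.0],
                           [-np.sin(-xi), 0.0, np.cos(-xi)]])
            return (points - eye) @ (ry @ rz).T  # rows are PP = (a, b, c)

        def central_projection(pp):
            a, b, c = pp[:, 0], pp[:, 1], pp[:, 2]
            return np.stack([b / (-a), c / (-a)], axis=-1)  # DP1

        def parallel_projection(pp):
            return pp[:, 1:3]  # DP2 = (b, c), no distance-based reduction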
  • the arithmetic processing unit 82 checks, with reference to the storage unit 83, whether a monitoring area has been set (step S13); if a monitoring area has been set (Y in step S13) and there is a processing request to change the monitoring area (Y in step S14), monitoring-area setting processing is performed (step S15).
  • the arithmetic processing unit 82 asks the operator, via the input/output unit 81, whether to set a monitoring area and which setting method to use, and receives the operator's selection (step S51). At this time, the arithmetic processing unit 82 accepts input through a GUI (Graphical User Interface) using the operation unit 81a and the display 81b of the input/output unit 81.
  • the basic shapes prepared in advance include, for example, a rectangular parallelepiped, a cylinder, and a sphere; the posture and size of a basic figure can be changed, and a plurality of figures can be combined, as in the sketch below.
  • alternatively, the outer edge of the monitoring area can be defined by estimating an obstacle surface from static detection points and computing an approximate surface offset a predetermined distance from that obstacle surface.
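  • purely as an illustration of such combinable basic figures (this is not the patent's implementation; class and field names are hypothetical), a monitoring area built from basic shapes can answer point-containment queries:

        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class Box:
            lo: np.ndarray  # (xmin, ymin, zmin)
            hi: np.ndarray  # (xmax, ymax, zmax)
            def contains(self, p):
                return np.all((p >= self.lo) & (p <= self.hi), axis=-1)

        @dataclass
        class Sphere:
            center: np.ndarray
            radius: float
            def contains(self, p):
                return np.linalg.norm(p - self.center, axis=-1) <= self.radius

        @dataclass
        class MonitoringArea:
            shapes: list  # union of basic figures
            def contains(self, p):
                return np.any([s.contains(p) for s in self.shapes], axis=0)

        area = MonitoringArea([Box(np.zeros(3), np.ones(3) * 2.0),
                               Sphere(np.array([3.0, 0.0, 1.0]), 1.0)])
        print(area.contains(np.array([[1.0, 1.0, 1.0], [9.0, 9.0, 9.0]])))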
  • the arithmetic processing unit 82 requests the operator to select the display method of the monitoring area using the input/output unit 81, and receives the operator's selection (step S52).
  • as the display method of the monitoring area, three-dimensional or two-dimensional display can be used as described above; in the case of three-dimensional display, camera images corresponding to the viewpoint can be superimposed or displayed side by side. Furthermore, the operator can shift or switch the viewpoint of the three-dimensional or two-dimensional display using the input/output unit 81.
  • in the three-dimensional display, the monitoring area is observed stereoscopically, and its spatial arrangement is easy to grasp.
  • the intended monitoring area can therefore be set quickly.
  • when the monitoring area is set while the display 81b shows a two-dimensional view, the monitoring area is projected onto each reference plane with little distortion, so its placement becomes relatively accurate.
  • the arithmetic processing unit 82 receives the setting of the monitoring area intended by the operator using the input/output unit 81 (step S53), and stores the set monitoring area in the storage unit 83.
  • the operator can set a monitoring area in the measurement area by, for example, (1) selecting one or more basic shapes from a tool area provided outside the detection-point display region of the display 81b, (2) selecting an outer-edge candidate of the monitoring area additionally displayed within the measurement area of the display 81b, or (3) selecting a tool for drawing points or planes.
  • the monitoring area set by various methods as described above is displayed together with the detection point in the measurement area.
  • the arithmetic processing unit 82 allows the operator to move the monitoring area and to enlarge or reduce its size.
  • FIGS. 7A and 7B show a specific example in which a three-dimensional display is made on the display 81b, and it can be seen that a horizontally long rectangular frame-like monitoring area is displayed together with detection points in the measurement area.
  • FIG. 7A corresponds to a viewpoint observing the monitoring area from the front, and FIG. 7B to a viewpoint observing it from above.
  • the displays of FIGS. 7A and 7B may be shown on the display 81b individually or side by side.
  • FIGS. 7C and 7D show a specific example in which a two-dimensional display is made on the display 81b, and it can be seen that a horizontally long rectangular monitoring area is displayed together with the detection point in the measurement area.
  • FIG. 7C corresponds to a viewpoint observing the monitoring area from the front, and FIG. 7D to a viewpoint observing it from above.
  • the displays of FIGS. 7C and 7D may be shown on the display 81b individually or side by side.
  • a monitoring area, once set, is inherited even if the display on the display 81b is switched from three-dimensional to two-dimensional at the operator's request. That is, the arithmetic processing unit 82 maintains the monitoring-area setting across the display switch and converts the three-dimensional display of the set monitoring area into the corresponding two-dimensional display. Conversely, when a monitoring area has been set in the two-dimensional display, it is inherited even if the display is switched to three-dimensional at the operator's request.
  • in the above, the display is switched between three-dimensional and two-dimensional, but the three-dimensional display and the two-dimensional display can also be shown side by side on the display 81b.
  • the arithmetic processing unit 82 confirms whether the operator desires correction of the monitoring area using the input/output unit 81 (step S54).
  • the arithmetic processing unit 82 requests the operator to select the display method of the monitoring area using the input/output unit 81, and receives the operator's selection (step S55).
  • the arithmetic processing unit 82 receives the correction of the monitoring area desired by the operator using the input/output unit 81 (step S56), and stores the corrected monitoring area in the storage unit 83.
  • correction of the monitoring area includes partially deleting the monitoring area set in step S53, extending it by adding an additional area, deleting the entire monitoring area, and adding a new monitoring area in a different place.
  • the arithmetic processing unit 82 sets the display method of the set or corrected monitoring area (step S57). That is, for the monitoring area set in step S53 or corrected in step S56 and displayed on the display 81b, the operator can input settings such as line thickness, line color, and line type (broken or solid) using the input/output unit 81, and the result is stored in the storage unit 83.
  • the arithmetic processing unit 82 operates the display 81b in a display mode according to the settings (step S16). That is, when the group of detection points was displayed three-dimensionally from, for example, a predetermined viewpoint in step S12, the group of detection points is again displayed three-dimensionally from that viewpoint on the display 81b, with the frame of the monitoring area set in step S15 superimposed.
  • the arithmetic processing unit 82 performs clustering from the latest measurement data (step S17), and stores the result in the storage unit 83.
  • Clustering is processing that groups detection points into subsets by connecting adjacent measurement points and the like, in order to obtain size and outline information of targets.
  • Clustering can be performed on measurement data (r, ⁇ , ⁇ ) in polar coordinates or measurement data (X, Y, Z) in an orthogonal coordinate system.
  • processing such as connection of a plurality of obtained clusters can be added.
  • adjacent clusters can be connected by dilating the region around each cluster by one to several pixels (or a corresponding distance) and then eroding the periphery of the resulting cluster by the same number of pixels or distance.
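  • a minimal connected-component sketch of the point-linking clustering in step S17 (the O(n²) neighbor search and the link distance are illustrative simplifications; the dilation/erosion merging described above is not shown):

        import numpy as np

        def cluster(points: np.ndarray, link_dist: float = 0.3) -> np.ndarray:
            """Label detection points; points within link_dist join one cluster."""
            n = len(points)
            labels = -np.ones(n, dtype=int)
            current = 0
            for i in range(n):
                if labels[i] != -1:
                    continue
                labels[i] = current
                stack = [i]
                while stack:
                    j = stack.pop()
                    d = np.linalg.norm(points - points[j], axis=1)
                    for k in np.nonzero((d <= link_dist) & (labels == -1))[0]:
                        labels[k] = current
                        stack.append(k)
                current += 1
            return labels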
  • the arithmetic processing unit 82 performs various arithmetic processing on each cluster obtained by the clustering in step S17 to determine the position and size of each cluster (step S18).
  • as the position of a cluster, for example, the average position or the center of gravity of the detection points or pixel points constituting the cluster can be used.
  • as the size of a cluster, for example, the volume of the region connecting the outer edges of the detection points or pixel points constituting the cluster, the area projected onto the XY plane, or the area projected onto the XZ plane can be used.
  • the arithmetic processing unit 82 performs noise determination on the clusters with the additional information obtained in step S18, removing small-sized ones in consideration of size, and selects objects worthy of attention (step S19). That is, the arithmetic processing unit 82 determines a cluster larger than the noise level to be a real object, labels the objects extracted in this way, and stores them in the storage unit 83.
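  • a hedged sketch of steps S18 and S19 (the bounding-box size measure and the threshold values are illustrative assumptions):

        import numpy as np

        def summarize_clusters(points, labels, min_points=5, min_volume=0.01):
            """Centroid and size per cluster; drop clusters below noise level."""
            objects = []
            for lab in np.unique(labels):
                pts = points[labels == lab]
                extent = pts.max(axis=0) - pts.min(axis=0)
                volume = float(np.prod(extent))
                if len(pts) < min_points or volume < min_volume:
                    continue  # noise determination: discard small clusters
                objects.append({"label": int(lab),
                                "centroid": pts.mean(axis=0),  # cluster position
                                "extent": extent,              # size along X, Y, Z
                                "volume": volume})
            return objects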
  • the arithmetic processing unit 82 extracts the clusters existing in the monitoring area set in step S15 from the clusters obtained in step S18 (step S21), and stores the result in the storage unit 83.
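  • the extraction in step S21 amounts to a containment test against the set monitoring area; a minimal sketch for a box-shaped area (the box form and names are assumptions for illustration):

        import numpy as np

        def in_monitoring_area(centroids, lo, hi):
            """Boolean mask: cluster centroids inside the box [lo, hi]."""
            c = np.asarray(centroids)
            return np.all((c >= np.asarray(lo)) & (c <= np.asarray(hi)), axis=-1)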
  • the arithmetic processing unit 82 performs tracking of the clusters extracted as being within the monitoring area (step S22). Specifically, the arithmetic processing unit 82 causes the display 81b to display each extracted cluster with a surrounding mark or a rectangular-prism frame. The movement of an extracted cluster can also be captured as a trajectory, in which case the identity of the cluster is determined from its shape and size. In this tracking, not only the detection points constituting clusters extracted as within the monitoring area, but also the detection points constituting clusters larger than the noise level obtained outside the monitoring area, can be displayed on the display 81b. Vehicles, buildings, and the like appear as clusters outside the monitoring area, and displaying the detection points of these clusters three-dimensionally on the display 81b makes it easier to distinguish clusters or objects inside the monitoring area from those outside it.
  • the arithmetic processing unit 82 determines whether the tracked object of interest is a warning target (step S23). For example, when monitoring for persons or cars, if the size of the cluster differs from that of such targets, the object of interest is determined not to be a warning target. The trajectory and moving speed of the cluster can also be used in deciding whether to raise an alarm.
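  • a sketch of such a size-and-speed alarm decision (all bounds are illustrative assumptions, not values from the patent):

        def is_alarm_target(extent_xyz, speed_mps,
                            min_extent=(0.3, 0.3, 1.0),   # roughly person-sized
                            max_extent=(2.5, 6.0, 3.0),   # up to roughly car-sized
                            max_speed=20.0):
            """True if the cluster's size and speed match a monitored target."""
            size_ok = all(lo <= e <= hi for e, lo, hi
                          in zip(extent_xyz, min_extent, max_extent))
            return size_ok and 0.0 < speed_mps <= max_speed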
  • when it is determined that an alarm target is present (Y in step S23), the arithmetic processing unit 82 performs alarm processing and recording processing (step S24). As alarm processing, the arithmetic processing unit 82 notifies the operator, using the display 81b and a speaker (not shown), that an alarm target has appeared in the monitoring area, and also notifies an external system via the communication unit 84. As recording processing, the arithmetic processing unit 82 stores in the storage unit 83 the time and features such as the size and shape of the cluster when the cluster corresponding to the alarm target appeared in the monitoring area.
  • if there is no instruction to end the process (N in step S25), the arithmetic processing unit 82 returns to step S13 and repeats the processing, regardless of whether an alarm target is present.
  • the arithmetic processing unit 82 receives from the operator a processing request to start setting (step S34); the following applies when there is no processing request to set a monitoring area (N in step S34).
  • the arithmetic processing unit 82 operates the display 81b in the display mode according to the setting, as in the process at step S16 (step S36).
  • the arithmetic processing unit 82 performs clustering on the latest measurement data without evaluating a monitoring area (step S37), determines the position and size of each obtained cluster (step S38), and performs noise determination to remove small clusters (step S39). In this case, even if a cluster is detected, it is not treated as an alarm target, but the detection result is displayed two-dimensionally or three-dimensionally.
  • as described above, the arithmetic processing unit 82, as the area setting unit, sets the monitoring area to be monitored by the arithmetic processing unit (object detection unit) 82 by accepting operations of the input/output unit 81 using the display 81b while either the two-dimensional display or the three-dimensional display corresponding to a predetermined gaze direction is shown on the display 81b. Because the user performs the monitoring-area setting operation based on a two-dimensional display, or on a three-dimensional display from a predetermined direction, the placement of the monitoring area becomes relatively accurate. In particular, the operation of setting the monitoring area based on the two-dimensional display is simple even for users unaccustomed to the operation. Moreover, when setting a monitoring area ...
  • although the present invention has been described based on an embodiment, the present invention is not limited to the above-mentioned embodiment and the like.
  • the structure and the number of the laser radar units 21 are merely examples, and distance measurement units of various structures can be used.
  • the clustering method is not limited to the above; various methods can be adopted with regard to the setting of the near range and the close range, the method of connecting pixel points, and the like.
  • in the above, the real-time detection points within the measurement area are displayed on the display 81b when setting the monitoring area, but detection points measured in the past, or a representation of the local space based on CAD data or the like, may be displayed instead.

Abstract

The invention concerns an object detection system that facilitates the setting and recognition of a monitoring area even in three-dimensional distance detection. An object detection system (100) comprises: a laser radar unit (21) used as a distance measurement unit for detecting reflected light while scanning a light beam and measuring a distance based on the propagation time of the light beam; an arithmetic processing unit (object detection unit) (82) for detecting an object based on distance information obtained by the distance measurement unit; an input/output unit (81) comprising a display (81b); an arithmetic processing unit (display processing unit) (82) for controlling the display (81b) so that it shows a three-dimensional image and a two-dimensional image of the object detected by the arithmetic processing unit (object detection unit) (82); and an arithmetic processing unit (area setting unit) (82) for setting monitoring areas (SA1, SA2) in response to an operation performed on the input/output unit (81) using the display (81b) while the display (81b) is controlled to show a two-dimensional image corresponding to a predetermined gaze direction, said monitoring areas (SA1, SA2) being the areas to be monitored by the arithmetic processing unit (object detection unit) (82) among the areas whose distances are measured by the distance measurement unit.
PCT/JP2018/041335 2017-11-09 2018-11-07 Object detection method and object detection program WO2019093372A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019552348A JP7244802B2 (ja) 2017-11-09 2018-11-07 Object detection system and object detection program (物体検出システム及び物体検出プログラム)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-216130 2017-11-09
JP2017216130 2017-11-09

Publications (1)

Publication Number Publication Date
WO2019093372A1 (fr) 2019-05-16

Family

ID=66438805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041335 WO2019093372A1 (fr) Object detection method and object detection program

Country Status (2)

Country Link
JP (1) JP7244802B2 (fr)
WO (1) WO2019093372A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003269915A * 2002-03-13 2003-09-25 Omron Corp Three-dimensional monitoring device
JP2007013814A * 2005-07-01 2007-01-18 Secom Co Ltd Detection area setting device
JP2007249722A * 2006-03-17 2007-09-27 Hitachi Ltd Object detection device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007053812A1 * 2007-11-12 2009-05-14 Robert Bosch Gmbh Configuration module for a video surveillance system, surveillance system comprising the configuration module, method for configuring a video surveillance system, and computer program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110927731A (zh) * 2019-11-15 2020-03-27 深圳市镭神智能系统有限公司 Three-dimensional protection method, three-dimensional detection device, and computer-readable storage medium
CN110927731B (zh) * 2019-11-15 2021-12-17 深圳市镭神智能系统有限公司 Three-dimensional protection method, three-dimensional detection device, and computer-readable storage medium

Also Published As

Publication number Publication date
JPWO2019093372A1 (ja) 2020-11-19
JP7244802B2 (ja) 2023-03-23

Similar Documents

Publication Publication Date Title
US10132611B2 (en) Laser scanner
JP5465128B2 (ja) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and point cloud position data processing program
JP5620200B2 (ja) Point cloud position data processing device, point cloud position data processing method, point cloud position data processing system, and point cloud position data processing program
EP2788717B1 (fr) Determination of position and orientation in 6-DOF
EP1903304B1 (fr) Method, system, and program for measuring position
CN110178156A (zh) Distance sensor including an adjustable-focus imaging sensor
JP7064163B2 (ja) Three-dimensional information acquisition system
JP6955203B2 (ja) Object detection system and object detection program
JP7194015B2 (ja) Sensor system and distance measurement method
JP2010117211A (ja) Laser radar installation position verification device, laser radar installation position verification method, and program for laser radar installation position verification device
CN110132129A (zh) Augmented-reality-based system with perimeter definition function
JP2019144210A (ja) Object detection system
WO2019093372A1 (fr) Object detection method and object detection program
JP2009175012A (ja) Measurement device and measurement method
JPWO2017199785A1 (ja) Monitoring system setting method and monitoring system
WO2019093371A1 (fr) Object detection system and object detection program
US9245346B2 (en) Registering of a scene disintegrating into clusters with pairs of scans
JP6895074B2 (ja) Object detection system and object detection program
Sheh et al. On building 3d maps using a range camera: Applications to rescue robotics
JP7392826B2 (ja) Data processing device, data processing system, and data processing method
US20210389430A1 (en) Scanning surveying system
US20230260223A1 (en) Augmented reality alignment and visualization of a point cloud
US20240161435A1 (en) Alignment of location-dependent visualization data in augmented reality
EP4246184A1 (fr) Serrure de vue de caméra logicielle permettant l'édition de dessin sans décalage dans la vue
WO2022244296A1 (fr) Information processing device, information processing method, program, and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18875883

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019552348

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18875883

Country of ref document: EP

Kind code of ref document: A1