WO2024018337A1 - Object detection device, object detection method, and computer program - Google Patents

Object detection device, object detection method, and computer program

Info

Publication number
WO2024018337A1
WO2024018337A1 (PCT/IB2023/057175)
Authority
WO
WIPO (PCT)
Prior art keywords
measurement point
data
point group
measurement
moving object
Prior art date
Application number
PCT/IB2023/057175
Other languages
French (fr)
Japanese (ja)
Inventor
明宇 李
Original Assignee
Robert Bosch Gesellschaft mit beschränkter Haftung
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gesellschaft mit beschränkter Haftung
Publication of WO2024018337A1 publication Critical patent/WO2024018337A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects

Definitions

  • the present invention relates to an object detection device, an object detection method, and a computer program that are applied to a moving object such as a vehicle.
  • Moving objects such as vehicles are equipped with object detection devices that use distance-measuring sensors to detect the surrounding environment of the moving object.
  • Based on such detection, driver-assistance functions such as ACC (Adaptive Cruise Control) and collision safety functions are implemented to avoid collisions with obstacles such as a preceding vehicle, or to reduce the impact of a collision.
  • LiDAR, which uses light waves such as laser light and infrared rays, is known as one type of distance-measuring sensor used in such object detection devices.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2021-163213
  • Although LiDAR can measure the distance and direction to a target object, it cannot measure the target's speed the way a radar sensor, another well-known ranging sensor, can. For this reason, there is a risk that road-surface irregularities or installed objects such as guardrails may be mistakenly detected as moving objects, or that tracking of moving objects may be interrupted.
  • An object of the present invention is to provide an object detection device, an object detection method, and a computer program that can improve the accuracy of detecting a moving object using LiDAR.
  • According to one aspect, an object detection device is provided that detects objects based on measurement point data acquired by a LiDAR (11). The device executes: a filtering process that extracts, from the primary measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, treats any extracted group whose length in some direction is equal to or greater than a predetermined length value as a non-moving-object measurement point group, and removes that group from the primary measurement point data; a dynamic measurement point extraction process that extracts, from the secondary measurement point data obtained by removing the non-moving-object measurement point group from the primary measurement point data, the measurement points remaining after excluding static measurement points (measurement points that overlap with the primary measurement point data used in the previous processing cycle) as dynamic measurement points; a clustering process that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving-object measurement point group consisting of dynamic measurement points and a second moving-object measurement point group consisting of dynamic and static measurement points; and a matching process that associates the identified first and second moving-object measurement point groups with moving-object data detected up to the previous processing cycle.
  • According to another aspect, an object detection method is provided in which a computer detects objects based on measurement point data acquired by a LiDAR (11). The method includes: extracting, from primary measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and removing from the primary measurement point data, as a non-moving-object measurement point group, any extracted group whose length in some direction is equal to or greater than a predetermined length value (step S11); extracting, from the secondary measurement point data obtained by removing the non-moving-object measurement point group, the measurement points remaining after excluding static measurement points that overlap with the primary measurement point data used in the previous processing cycle, as dynamic measurement points (step S15); a clustering process (step S17) that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving-object measurement point group consisting of the dynamic measurement points and a second moving-object measurement point group consisting of the dynamic and static measurement points; and a matching process (steps S19 to S29) that associates the identified measurement point groups with moving-object data detected up to the previous processing cycle.
  • According to yet another aspect, a computer program is provided that causes a computer to detect objects based on measurement point data acquired by a LiDAR (11). The program causes the computer to execute processing that includes: extracting, from primary measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and excluding from the primary measurement point data, as a non-moving-object measurement point group, any group whose length in some direction is equal to or greater than a predetermined length value (step S11); extracting, from the secondary measurement point data obtained by excluding the non-moving-object measurement point group from the primary measurement point data, the measurement points remaining after excluding static measurement points that overlap with the primary measurement point data used in the previous processing cycle, as dynamic measurement points (step S15); a clustering process (step S17) that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving-object measurement point group consisting of the dynamic measurement points and a second moving-object measurement point group consisting of the dynamic and static measurement points; and a matching process (steps S19 to S29) that associates the identified measurement point groups with moving-object data detected up to the previous processing cycle.
  • FIG. 1 is a schematic diagram showing a configuration example of a vehicle as an example of a moving body according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing a configuration example of an object detection device according to the same embodiment.
  • FIG. 3 is a flowchart showing processing operations by the object detection device according to the same embodiment.
  • FIG. 4 is a flowchart showing a moving object detection processing operation by the object detection device according to the same embodiment.
  • FIG. 5 is an explanatory diagram showing filtering processing by the object detection device according to the embodiment.
  • FIG. 6 is an explanatory diagram showing filtering processing by the object detection device according to the same embodiment.
  • FIG. 7 is an explanatory diagram showing dynamic measurement point extraction processing and clustering processing by the object detection device according to the embodiment.
  • FIG. 8 is an explanatory diagram showing dynamic measurement point extraction processing and clustering processing by the object detection device according to the same embodiment.
  • FIG. 9 is an explanatory diagram showing matching processing by the object detection device according to the embodiment.
  • the object detection device can be applied to various moving bodies such as ships, aircraft, and robots, as well as vehicles such as four-wheeled vehicles and motorcycles.
  • an example in which an object detection device is applied to a four-wheeled vehicle as a moving object will be described.
  • FIG. 1 is a schematic diagram showing a configuration example of a vehicle 1 as an example of a moving object.
  • The vehicle 1 shown in FIG. 1 is configured as a two-wheel-drive vehicle that transmits drive torque output from a driving power source 20, such as an internal combustion engine or a drive motor, to the left and right front wheels.
  • the vehicle 1 may be a four-wheel drive vehicle that transmits driving force to front, rear, left and right wheels.
  • When the vehicle 1 is an electric vehicle or a hybrid electric vehicle, the vehicle 1 is equipped with a secondary battery that stores the power supplied to the drive motor, a generator that generates power for charging the battery, an inverter device that controls the driving of the drive motor, and the like.
  • the vehicle 1 includes a driving force source 20, an electric steering device 25, and a brake fluid pressure control unit 30 as devices for controlling the running of the vehicle 1.
  • the driving power source 20 outputs driving torque that is transmitted to the wheels via a transmission or differential mechanism (not shown).
  • The drive power source 20 and the transmission are controlled by the vehicle control device 40.
  • The electric steering device 25 includes an electric motor and a gear mechanism (not shown), and is controlled by the vehicle control device 40 to adjust the steering angle of the left and right front wheels.
  • The vehicle control device 40 controls the electric steering device 25 based on the steering angle of the steering wheel operated by the driver.
  • the vehicle control device 40 controls the electric steering device 25 based on the target steering angle.
  • the brake fluid pressure control unit 30 adjusts the hydraulic pressure supplied to the brake caliper provided on each wheel to generate braking force.
  • The brake fluid pressure control unit 30 is controlled by the vehicle control device 40.
  • the brake fluid pressure control unit 30 is used in combination with regenerative braking by the drive motor.
  • the vehicle control device 40 includes one or more electronic control devices that control the driving of the driving power source 20, the electric steering device 25, and the brake fluid pressure control unit 30.
  • the vehicle control device 40 is configured to be able to acquire a signal transmitted from the object detection device 50, and is configured to be able to execute automatic driving control of the vehicle 1.
  • Automatic driving control here includes emergency brake control and ACC (Adaptive Cruise Control).
  • The vehicle control device 40 acquires information on the driver's driving operation amounts, and controls the driving of the driving force source 20, the electric steering device 25, and the brake fluid pressure control unit 30.
  • The object detection processing unit 63 acquires the data (primary measurement point data) mpts of the measurement points detected by the LiDAR 11, and executes a filtering process that excludes non-moving-object measurement points from the primary measurement point data (step S11). Specifically, the object detection processing unit 63 extracts, from the primary measurement point data mpts, one or more measurement point groups in which the distance between adjacent measurement points is within a predetermined distance. For example, the object detection processing unit 63 calculates the distance between measurement points converted to coordinates of the fixed coordinate system, and extracts measurement point groups that combine multiple measurement points whose adjacent-point distances are within the predetermined distance. Typically, the Euclidean distance may be used, but other distance measures may also be used.
  • Next, the object detection processing unit 63 excludes from the primary measurement point data, as a non-moving-object measurement point group, any of the extracted measurement point groups whose length in some direction is equal to or greater than a predetermined length value. For example, the object detection processing unit 63 calculates, for each of the extracted measurement point groups, the maximum length along an appropriate direction; the appropriate direction may be any direction on the fixed coordinate system. The object detection processing unit 63 then treats as non-moving-object measurement points any group whose calculated maximum length is equal to or greater than the predetermined length value, which is set to a value exceeding the maximum length assumed for a moving object.
  • The predetermined length value may be a value that exceeds the maximum length expected for a vehicle to be detected, such as a bus or a truck, and may be set arbitrarily depending on the size of the moving objects to be detected.
  • For example, the predetermined length value may be set to a value within a range of 10 to 12 m.
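The filtering step above can be sketched in Python. This is a minimal two-dimensional illustration, not the patent's implementation; the function names, the breadth-first grouping, and the use of the largest pairwise distance as the "length in any direction" are assumptions made for this example.

```python
from collections import deque

def group_points(points, max_gap):
    """Group 2-D measurement points whose neighbor-to-neighbor Euclidean
    distance is within max_gap (breadth-first flood fill)."""
    unvisited = set(range(len(points)))
    groups = []
    while unvisited:
        seed = unvisited.pop()
        queue, group = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            near = [j for j in unvisited
                    if (points[i][0] - points[j][0]) ** 2
                    + (points[i][1] - points[j][1]) ** 2 <= max_gap ** 2]
            for j in near:
                unvisited.remove(j)
                queue.append(j)
                group.append(j)
        groups.append([points[k] for k in group])
    return groups

def filter_non_moving(points, max_gap, max_length):
    """Split points into (kept, removed): a group is removed as a
    non-moving-object group when its extent reaches max_length, here
    approximated by the largest pairwise distance within the group."""
    kept, removed = [], []
    for g in group_points(points, max_gap):
        extent = max((((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                      for ax, ay in g for bx, by in g), default=0.0)
        (removed if extent >= max_length else kept).extend(g)
    return kept, removed
```

With a 12 m threshold, a 15 m chain of points (a guardrail, say) is dropped as a non-moving-object group, while a car-sized cluster elsewhere survives.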
  • FIGS. 5 and 6 are explanatory diagrams showing the filtering process in step S11.
  • FIG. 5 shows a two-dimensional image of primary measurement point data detected when a plurality of LiDARs are provided on the front, rear, left, and right sides of the vehicle 1.
  • The object detection processing unit 63 extracts, from the primary measurement point data, measurement point groups that combine multiple measurement points in which the distance between adjacent measurement points is within the predetermined distance, and excludes from the primary measurement point data, as non-moving-object measurement point groups, those groups whose length in some direction is equal to or greater than the predetermined length value.
  • FIG. 6 shows the measurement point data (secondary measurement point data) after the non-moving-object measurement point group has been excluded from the primary measurement point data.
  • The non-moving-object measurement points excluded in step S11 are considered to be road-surface irregularities such as curbs, or road-installed objects such as guardrails, and are not used as measurement points for detecting moving objects. Therefore, the object detection processing unit 63 labels and records the non-moving-object measurement points excluded in step S11 as road-surface irregularities or road-installed objects (step S13).
  • Next, the object detection processing unit 63 executes a dynamic measurement point extraction process on the secondary measurement point data from which the non-moving-object measurement point group was excluded in step S11 (step S15). For example, from the secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-object measurement points from the primary measurement point data, the object detection processing unit 63 excludes static measurement points that overlap with the primary measurement point data acquired from the LiDAR 11 in the previous processing cycle, and extracts the remaining measurement points as dynamic measurement points. More specifically, the object detection processing unit 63 uses the secondary measurement point data and the primary measurement point data acquired in the previous processing cycle to identify and exclude static measurement points whose positions have not changed from the previous processing cycle to the current processing cycle, and extracts from the secondary measurement point data, as dynamic measurement points, the measurement points whose positions have changed from the previous processing cycle to the current processing cycle.
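Step S15 can be sketched under the same illustrative assumptions (2-D points, plus a hypothetical overlap tolerance `tol`, since the text only says points "overlap" with the previous cycle's data):

```python
def extract_dynamic_points(secondary, previous_primary, tol=0.2):
    """Split secondary measurement points into dynamic points (position
    changed since the previous cycle) and static points (a point from the
    previous cycle's primary data lies within tol of them)."""
    def overlaps(p):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol ** 2
                   for q in previous_primary)
    dynamic = [p for p in secondary if not overlaps(p)]
    static = [p for p in secondary if overlaps(p)]
    return dynamic, static
```

A point that moved only a few centimeters between cycles is classified as static; a point with no previous-cycle counterpart nearby is dynamic.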
  • Next, the object detection processing unit 63 performs a clustering process (step S17) that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and identifies a first moving-object measurement point group consisting of dynamic measurement points and a second moving-object measurement point group consisting of dynamic and static measurement points.
  • Specifically, the object detection processing unit 63 extracts, from the secondary measurement point data, one or more measurement point groups in which the distance between adjacent measurement points is within the predetermined distance.
  • The predetermined distance may be the same as the distance used in the filtering process of step S11.
  • Among the extracted measurement point groups, the object detection processing unit 63 identifies a group consisting only of the dynamic measurement points extracted in step S15 as a first moving-object measurement point group, and a group composed of both dynamic and static measurement points as a second moving-object measurement point group.
  • In other words, through the dynamic measurement point extraction process (S15) and the clustering process (S17), the object detection processing unit 63 classifies the measurement point groups extracted from the secondary measurement point data into first moving-object measurement point groups, whose positions have changed significantly since the previous processing cycle, and second moving-object measurement point groups, whose positions have changed less.
  • FIGS. 7 and 8 are explanatory diagrams showing the dynamic measurement point extraction process (S15) and the clustering process (S17).
  • Each black circle indicates a measurement point extracted in the current processing cycle, and each white circle indicates primary measurement point data acquired in the previous processing cycle.
  • The measurement point group extracted from the secondary measurement point data includes no static measurement points that overlap with the primary measurement point data of the previous processing cycle; the group is therefore classified as a first moving-object measurement point group, composed of dynamic measurement points.
  • The measurement point group extracted from the secondary measurement point data includes static measurement points that overlap with the primary measurement point data of the previous processing cycle (the measurement points within the dashed line H1), while the remaining measurement points are dynamic measurement points (the measurement points indicated by the dashed line H2). The group is therefore classified as a second moving-object measurement point group, composed of both dynamic and static measurement points.
  • Next, the object detection processing unit 63 executes a matching process (steps S19 to S29) that associates the first and second moving-object measurement point groups identified by the clustering process with the moving objects detected up to the previous processing cycle.
  • The matching process includes a first matching process (steps S19 to S23), which is a simple process, and a second matching process (steps S25 to S29) for associating the first and second moving-object measurement point groups that could not be associated by the first matching process.
  • Here, "associating the first and second moving-object measurement point groups with the moving objects detected up to the previous processing cycle" means identifying and recording, from among the first and second moving-object measurement point groups, the measurement point group in which the same moving object as a recorded detected-moving-object datum was detected. For example, when data of detected moving objects labeled with identification information for identifying individual moving objects has been recorded up to the previous processing cycle, together with estimates of their moving speed and moving direction, the measurement point group identified in the current processing cycle that detected the same moving object as a detected-moving-object datum is identified, labeled with the same identification information, and recorded together with its moving speed and moving direction information.
  • The first matching process is a matching process that can be executed more simply (in a shorter time) than the second matching process described later, and is executed in order to complete the association for those first and second moving-object measurement point groups that can be associated easily and reliably without performing the second matching process.
  • the first matching process may be performed using existing matching technology.
  • The object detection processing unit 63 compares the first and second moving-object measurement point groups with the moving-object data that was labeled as moving objects up to the previous processing cycle (detected-moving-object data), and searches for the first and second moving-object measurement point groups that can be associated with the detected-moving-object data.
  • The distance D obtained for each pairing, the velocity components Vx_cl(n) and Vy_cl(n) of each measurement point group, and the velocity components Vx_mo(m) and Vy_mo(m) of each detected-moving-object datum are input to the machine learning model 70 to obtain the measurement point groups and detected-moving-object data that can be associated with each other.
  • The machine learning model 70 is a model that has learned the movement patterns of moving objects using measurement point cloud data for learning, collected or generated in advance for various types of moving objects moving at various speeds, turning rates, and directions. Typically, a support vector machine may be used as the machine learning model, but other machine learning models such as a random forest or a neural network may also be used.
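The feature vector described above can be assembled as follows. The threshold-based `plausible_match` is only a stand-in for the learned model (the text names a support vector machine); the function names and thresholds are assumptions of this sketch.

```python
def match_features(cluster_centroid, cluster_velocity,
                   track_position, track_velocity):
    """Build the feature vector fed to the matching model: the distance D
    between the measurement point group and the detected moving object,
    plus the velocity components (Vx_cl, Vy_cl) and (Vx_mo, Vy_mo)."""
    dx = cluster_centroid[0] - track_position[0]
    dy = cluster_centroid[1] - track_position[1]
    d = (dx * dx + dy * dy) ** 0.5
    return [d, cluster_velocity[0], cluster_velocity[1],
            track_velocity[0], track_velocity[1]]

def plausible_match(features, max_distance=5.0, max_velocity_gap=3.0):
    """Toy stand-in for the trained model: accept when the group is close
    to the track and their velocities roughly agree."""
    d, vx_c, vy_c, vx_m, vy_m = features
    v_gap = ((vx_c - vx_m) ** 2 + (vy_c - vy_m) ** 2) ** 0.5
    return d <= max_distance and v_gap <= max_velocity_gap
```

In a real system the decision function would be replaced by the trained model's prediction over the same five features.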
  • The object detection processing unit 63 associates the first and second moving-object measurement point groups for which associated detected-moving-object data exists with that detected-moving-object data, and records them together with information on speed, moving direction, and orientation (step S27).
  • On the other hand, the object detection processing unit 63 treats the first and second moving-object measurement point groups for which no associated detected-moving-object data exists as new moving objects, and generates and records detected-moving-object data for them (step S29). When newly generating detected-moving-object data, the object detection processing unit 63 may use the first and second moving-object measurement point groups that could not be associated with detected-moving-object data in each processing cycle. Specifically, the object detection processing unit 63 calculates the distance moved by the first and second moving-object measurement point groups between processing cycles (between frames), the moving speed in each processing cycle, and the like.
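Estimating the speed of a newly detected moving object from the inter-frame displacement of an unmatched group might look like this. This is a sketch; the centroid-based estimate and the dictionary layout are assumptions of the example.

```python
def centroid(cluster):
    """Mean position of a group of 2-D measurement points."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def new_track_from_frames(cluster_prev, cluster_now, dt):
    """Create detected-moving-object data for a group that matched no
    existing object: velocity and speed are estimated from the movement
    of the group's centroid between two processing cycles dt apart."""
    (x0, y0), (x1, y1) = centroid(cluster_prev), centroid(cluster_now)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return {"position": (x1, y1),
            "velocity": (vx, vy),
            "speed": (vx * vx + vy * vy) ** 0.5}
```

A group whose centroid advances 1 m in a 0.1 s cycle is recorded with a speed of 10 m/s.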
  • The object detection processing unit 63 repeats the processing from step S11 to step S29 in each processing cycle, and associates the measurement point groups extracted from the measurement point data detected by the LiDAR 11 with the detected-moving-object data.
  • As described above, the object detection processing unit 63 excludes, from the measurement point data of the LiDAR 11 (primary measurement point data), measurement point groups whose size clearly exceeds that of a moving object as non-moving-object measurement point groups, and extracts dynamic measurement points by excluding static measurement points whose positions have not changed since the previous processing cycle.
  • The object detection processing unit 63 then clusters the secondary measurement point data, obtained by excluding the non-moving-object measurement point group from the primary measurement point data, classifies the resulting measurement point groups into first moving-object measurement point groups consisting of dynamic measurement points and second moving-object measurement point groups consisting of dynamic and static measurement points, and performs a matching process against the detected-moving-object data that has already been recorded.
  • the first moving object measurement point group is estimated to have a relatively fast moving speed
  • the second moving object measurement point group is estimated to have a relatively slow moving speed.
  • Thereby, even though the measurement point data of the LiDAR 11 carries no velocity information, the object detection device can associate and detect the same moving object across the primary measurement point data detected in a series of processing cycles.
  • In addition, in the first matching process, the object detection processing unit 63 determines whether each detected-moving-object datum whose speed is equal to or greater than a predetermined speed value can be associated with one of the first moving-object measurement point groups, and whether each detected-moving-object datum whose speed is less than the predetermined speed value can be associated with either a first or a second moving-object measurement point group. This narrows down the matching targets according to the speed of the detected-moving-object data, thereby reducing the load of the matching process.
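The speed-based narrowing of the first matching process reduces the candidate set before any pairwise comparison. A sketch (the function name and threshold parameter are illustrative, not from the source):

```python
def candidate_groups(track_speed, first_groups, second_groups,
                     speed_threshold):
    """Narrow matching candidates by track speed: a detected object at or
    above the threshold is compared only against first (all-dynamic)
    groups, while a slower one may match a first or a second group."""
    if track_speed >= speed_threshold:
        return list(first_groups)
    return list(first_groups) + list(second_groups)
```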
  • Furthermore, in the second matching process, the object detection processing unit 63 inputs the data of the first and second moving-object measurement point groups, together with the moving-object data detected up to the previous processing cycle, into the machine learning model, and determines whether the data of the first and second moving-object measurement point groups can be associated with the moving-object data detected up to the previous processing cycle. As a result, based on the learned movement patterns of various moving objects, the data of the first and second moving-object measurement point groups can be matched with the moving-object data detected up to the previous processing cycle with high accuracy.
  • In addition, the object detection processing unit 63 performs the second matching process using the machine learning model only after removing the first and second moving-object measurement point groups and the detected-moving-object data that could be associated by the simple matching process (the first matching process). Therefore, the processing time and processing load required for the moving-object detection processing can be reduced, and the matching process with the detected-moving-object data can be executed with high accuracy.
  • the predetermined length value in the filtering process is set to the assumed maximum length of a moving body, and the object detection processing unit 63 labels and records measurement point groups whose length in any direction is equal to or greater than the predetermined length value as road surface irregularities or road-installed objects. Measurement point groups larger than the expected size of a moving body can thus easily be identified as road surface irregularities or roadside objects.
  • the above effects achieved by the object detection device can likewise be achieved by the object detection method and by a computer program that executes the object detection processing.
  • the first matching process may be omitted and only the second matching process executed; it is understood that an object detection device that executes this processing method also belongs to the technical scope of the present disclosure.
  • the LiDAR 11 calculates the position and velocity of each measurement point and transmits the calculated position and velocity information to the object detection device 50.
  • the LiDAR 11 may instead calculate only the position of each measurement point and transmit the calculated position information to the object detection device 50, and the object detection device 50 may calculate the velocity information of the measurement points.
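The two-stage matching summarized in the bullets above can be sketched roughly as follows. This is a minimal illustration, not the patent's implementation: the data structures, the speed threshold, and the centroid-distance gating are all hypothetical assumptions, and the second, machine-learning-based matching stage is not shown.

```python
# Illustrative sketch of the first (simple) matching stage described above.
# All structures and thresholds here are hypothetical assumptions.

SPEED_THRESHOLD = 5.0  # assumed speed value [m/s]; the text leaves it unspecified

def nearest_group(obj, groups, max_dist=3.0):
    """Return the group whose centroid is closest to the tracked object,
    or None if no centroid lies within max_dist (an assumed gate)."""
    best, best_d = None, max_dist
    for g in groups:
        cx = sum(x for x, _ in g) / len(g)
        cy = sum(y for _, y in g) / len(g)
        d = ((cx - obj["x"]) ** 2 + (cy - obj["y"]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = g, d
    return best

def first_matching(tracked_objects, first_groups, second_groups):
    """Fast tracked objects are compared only against the first (purely
    dynamic) measurement point groups; slow ones against both kinds."""
    matched, unmatched = [], []
    for obj in tracked_objects:
        if obj["speed"] >= SPEED_THRESHOLD:
            candidates = first_groups
        else:
            candidates = first_groups + second_groups
        group = nearest_group(obj, candidates)
        if group is not None:
            matched.append((obj, group))
        else:
            unmatched.append(obj)  # left for the second matching stage
    return matched, unmatched
```

Objects and groups left unmatched here would then be passed to the second, machine-learning-based matching stage.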

Abstract

Provided are an object detection device, an object detection method, and a computer program that can improve the accuracy of moving-body detection using a LiDAR. An object detection device (100) detects an object on the basis of measurement point data acquired by a LiDAR (11). The object detection device: extracts, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a prescribed distance; excludes from the primary measurement point data the non-moving-body measurement point groups, that is, those extracted measurement point groups whose length in any direction is at least a prescribed length threshold; extracts dynamic measurement points, the measurement points remaining after excluding static measurement points (measurement points that overlap with the primary measurement point data used in the previous processing cycle), from secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-body measurement point groups from the primary measurement point data; through clustering processing, extracts from the secondary measurement point data measurement point groups in which the distance between adjacent measurement points is within the prescribed distance, and identifies first moving-body measurement point groups composed of dynamic measurement points and second moving-body measurement point groups composed of dynamic and static measurement points; and, through matching processing, associates the identified first and second moving-body measurement point groups with moving-body data for which detection was completed in or prior to the previous processing cycle.

Description

[Document name] Description
[Title of the invention] Object detection device, object detection method, and computer program
【技術分野】 【Technical field】
[0001] The present invention relates to an object detection device, an object detection method, and a computer program applied to a moving body such as a vehicle.
[Background art]
[0002] In recent years, moving bodies such as vehicles have been equipped with object detection devices that use ranging sensors to detect the surrounding environment of the moving body. In a vehicle, for example, adaptive cruise control (ACC: Adaptive Cruise Control), which automatically drives the host vehicle while maintaining the inter-vehicle distance to a preceding vehicle at a target distance, and collision safety functions, which avoid collisions with obstacles including the preceding vehicle or mitigate the impact of a collision, are executed on the basis of the detected information about the surrounding environment. LiDAR (Light Detection And Ranging), which uses optical waves such as laser or infrared light, is known as one of the ranging sensors used in such object detection devices.
[Prior art documents]
[Patent documents]
[0003]
[Patent Document 1] Japanese Patent Application Laid-Open No. 2021-163213
[Summary of the invention]
[Problems to be solved by the invention]
[0004] However, while a LiDAR can measure the distance and direction to a detection target, it cannot measure the target's velocity as a radar sensor, which is likewise known as a ranging sensor, can. For this reason, road surface irregularities or installed objects such as guardrails may be erroneously detected as moving bodies, or the tracking of a moving body may be interrupted.
[0005] An object of the present invention is to provide an object detection device, an object detection method, and a computer program capable of improving the accuracy of moving-body detection using a LiDAR.
[Means for solving the problems]
[0006] To solve the above problems, according to one aspect of the present invention,
there is provided an object detection device (100) that detects an object on the basis of measurement point data acquired by a LiDAR (11), the object detection device executing: a filtering process of extracting, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and excluding from the primary measurement point data, as non-moving-body measurement point groups, those extracted measurement point groups whose length in any direction is equal to or greater than a predetermined length threshold; a dynamic measurement point extraction process of extracting, as dynamic measurement points, from secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-body measurement point groups from the primary measurement point data, the measurement points remaining after excluding static measurement points, which are measurement points overlapping with the primary measurement point data used in the previous processing cycle; a clustering process of extracting, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifying first moving-body measurement point groups composed of dynamic measurement points and second moving-body measurement point groups composed of dynamic measurement points and static measurement points; and a matching process of associating the first moving-body measurement point groups and the second moving-body measurement point groups identified by the clustering process with moving-body data detected up to the previous processing cycle.
[0007] Further, to solve the above problems, according to another aspect of the present invention,
there is provided an object detection method for detecting an object on the basis of measurement point data acquired by a LiDAR (11), the method including: extracting, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and excluding from the primary measurement point data, as non-moving-body measurement point groups, those extracted measurement point groups whose length in any direction is equal to or greater than a predetermined length threshold (step S11); extracting, as dynamic measurement points, from secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-body measurement point groups from the primary measurement point data, the measurement points remaining after excluding static measurement points, which are measurement points overlapping with the primary measurement point data used in the previous processing cycle (step S15); a clustering process of extracting, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifying first moving-body measurement point groups composed of dynamic measurement points and second moving-body measurement point groups composed of dynamic measurement points and static measurement points (step S17); and a matching process of associating the first moving-body measurement point groups and the second moving-body measurement point groups identified by the clustering process with moving-body data detected up to the previous processing cycle (steps S19 to S29).
[0008] Further, to solve the above problems, according to another aspect of the present invention, there is provided a computer program that causes a computer to execute processing for detecting an object on the basis of measurement point data acquired by a LiDAR (11), the processing including: extracting, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, and excluding from the primary measurement point data, as non-moving-body measurement point groups, those extracted measurement point groups whose length in any direction is equal to or greater than a predetermined length threshold (step S11); extracting, as dynamic measurement points, from secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-body measurement point groups from the primary measurement point data, the measurement points remaining after excluding static measurement points, which are measurement points overlapping with the primary measurement point data used in the previous processing cycle (step S15); a clustering process of extracting, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifying first moving-body measurement point groups composed of dynamic measurement points and second moving-body measurement point groups composed of dynamic measurement points and static measurement points (step S17); and a matching process of associating the first moving-body measurement point groups and the second moving-body measurement point groups identified by the clustering process with moving-body data detected up to the previous processing cycle (steps S19 to S29).
[Effects of the invention]
[0009] As described above, according to the present invention, the accuracy of moving-body detection using a LiDAR can be improved.
[Brief description of the drawings]
[0010] [FIG. 1] A schematic diagram showing a configuration example of a vehicle as an example of a moving body according to an embodiment of the present invention.
[FIG. 2] A block diagram showing a configuration example of the object detection device according to the embodiment.
[FIG. 3] A flowchart showing a processing operation of the object detection device according to the embodiment.
[FIG. 4] A flowchart showing a moving-body detection processing operation of the object detection device according to the embodiment.
[FIG. 5] An explanatory diagram showing the filtering process performed by the object detection device according to the embodiment.
[FIG. 6] An explanatory diagram showing the filtering process performed by the object detection device according to the embodiment.
[FIG. 7] An explanatory diagram showing the dynamic measurement point extraction process and the clustering process performed by the object detection device according to the embodiment.
[FIG. 8] An explanatory diagram showing the dynamic measurement point extraction process and the clustering process performed by the object detection device according to the embodiment.
[FIG. 9] An explanatory diagram showing the matching process performed by the object detection device according to the embodiment.
[Modes for carrying out the invention]
[0011] Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
[0012]
<1. Configuration example of the moving body> First, a configuration example of a moving body to which the object detection device according to the present embodiment can be applied will be described.
[0013] The object detection device according to the present embodiment can be applied not only to vehicles such as four-wheeled vehicles and motorcycles but also to various other moving bodies such as ships, aircraft, and robots. In the present embodiment, an example in which the object detection device is applied to a four-wheeled vehicle as the moving body will be described.
[0014] FIG. 1 is a schematic diagram showing a configuration example of a vehicle 1 as an example of a moving body. The vehicle 1 shown in FIG. 1 is configured as a two-wheel-drive vehicle that transmits drive torque output from a driving force source 20, such as an internal combustion engine or a drive motor, to the left and right front wheels. The vehicle 1 may instead be a four-wheel-drive vehicle that transmits driving force to the front, rear, left, and right wheels. When the vehicle 1 is an electric vehicle or a hybrid electric vehicle, the vehicle 1 is equipped with a secondary battery that stores electric power supplied to the drive motor, a generator that generates the electric power supplied to the drive motor and the electric power with which the battery is charged, an inverter device that controls the driving of the drive motor, and the like.
[0015] The vehicle 1 includes, as devices for controlling its travel, the driving force source 20, an electric steering device 25, and a brake fluid pressure control unit 30. The driving force source 20 outputs drive torque that is transmitted to the wheels via a transmission and a differential mechanism (not shown). The driving of the driving force source 20 and the transmission is controlled by a vehicle control device 40.
[0016] The electric steering device 25 includes an electric motor and a gear mechanism (not shown) and adjusts the steering angle of the left and right front wheels under the control of the vehicle control device 40. During manual driving, the vehicle control device 40 controls the electric steering device 25 on the basis of the steering angle of the steering wheel operated by the driver. During automatic driving, the vehicle control device 40 controls the electric steering device 25 on the basis of a target steering angle.
[0017] The brake fluid pressure control unit 30 adjusts the hydraulic pressure supplied to the brake calipers provided on the respective wheels to generate braking force. The driving of the brake fluid pressure control unit 30 is controlled by the vehicle control device 40. When the vehicle 1 is an electric vehicle or a hybrid electric vehicle, the brake fluid pressure control unit 30 is used in combination with regenerative braking by the drive motor.
[0018] The vehicle control device 40 includes one or more electronic control units that control the driving of the driving force source 20, the electric steering device 25, and the brake fluid pressure control unit 30. The vehicle control device 40 is configured to be able to acquire signals transmitted from the object detection device 50 and to execute automatic driving control of the vehicle 1. The automatic driving control includes emergency brake control and ACC (Adaptive Cruise Control). During manual driving of the vehicle 1, the vehicle control device 40 acquires information on the driver's driving operation amounts and controls the driving of the driving force source 20, the electric steering device 25, and the brake fluid pressure control unit 30.
The object detection processing unit 63 acquires the measurement point data (primary measurement point data) mpts detected by the LiDAR 11 and executes a filtering process that excludes non-moving-body measurement point groups from the primary measurement point data (step S11). Specifically, the object detection processing unit 63 extracts, from the primary measurement point data mpts, one or more measurement point groups in which the distance between adjacent measurement points is within a predetermined distance. For example, the object detection processing unit 63 calculates the distances between the measurement points converted to the coordinates of a fixed coordinate system and extracts measurement point groups in which a plurality of measurement points whose adjacent distances are within the preset predetermined distance are joined. The Euclidean distance is typically used as the distance, but another distance metric may be used.
[0040] Further, the object detection processing unit 63 excludes from the primary measurement point data, as non-moving-body measurement point groups, those extracted measurement point groups whose length in any direction is equal to or greater than a predetermined length threshold. For example, the object detection processing unit 63 calculates, for each of the one or more extracted measurement point groups, the maximum length along an appropriate direction. The appropriate direction may be any direction in the fixed coordinate system. The object detection processing unit 63 then excludes from the primary measurement point data, as non-moving-body measurement point groups, those measurement point groups whose calculated maximum length is equal to or greater than the predetermined length threshold, which is set to a value exceeding the maximum length assumed for a moving body. The predetermined length threshold may be a value exceeding the maximum length assumed for a detected vehicle, such as a bus or a truck, and may be set arbitrarily according to the size of the moving bodies to be detected; it may be set, for example, to a value within a range of 10 to 12 m.
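As a rough, self-contained sketch of this filtering process (step S11): neighbor-distance grouping followed by a length check. The single-linkage grouping, the 0.5 m gap, and the use of axis-aligned extents as the "length in any direction" are simplifying assumptions; only the 10 m threshold follows the 10 to 12 m range mentioned above.

```python
import math

def group_points(points, max_gap=0.5):
    """Single-linkage grouping: points within max_gap (assumed 0.5 m) of
    each other end up in the same group (union-find over point pairs)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= max_gap:
                parent[find(i)] = find(j)

    groups = {}
    for i, p in enumerate(points):
        groups.setdefault(find(i), []).append(p)
    return list(groups.values())

def filter_non_moving(groups, max_length=10.0):
    """Split groups into kept (candidate moving bodies) and excluded
    (non-moving-body groups) by extent; axis extents approximate the
    'length in any direction' of the text."""
    kept, excluded = [], []
    for g in groups:
        xs = [p[0] for p in g]
        ys = [p[1] for p in g]
        extent = max(max(xs) - min(xs), max(ys) - min(ys))
        (excluded if extent >= max_length else kept).append(g)
    return kept, excluded
```

The excluded groups correspond to the ones labeled as road surface irregularities or road-installed objects in step S13.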
[0041] FIGS. 5 and 6 are explanatory diagrams showing the filtering process of step S11. FIG. 5 shows a two-dimensional image of the primary measurement point data detected when a plurality of LiDARs are provided at the front, rear, left, and right of the vehicle 1. The object detection processing unit 63 extracts, from the primary measurement point data, measurement point groups in which a plurality of measurement points whose adjacent distances are within the predetermined distance are joined, and excludes from the primary measurement point data, as non-moving-body measurement point groups, those whose maximum length is equal to or greater than the predetermined length threshold. FIG. 6 shows the measurement point data (secondary measurement point data) obtained by excluding the non-moving-body measurement point groups from the primary measurement point data.
[0042] The non-moving-body measurement point groups excluded in step S11 are considered to be road surface irregularities such as curbs or road-installed objects such as guardrails, and are not used as measurement points for detecting moving bodies. The object detection processing unit 63 therefore labels and records the non-moving-body measurement point groups excluded in step S11 as road surface irregularities or road-installed objects (step S13).
[0043]
(Dynamic measurement point extraction process) The object detection processing unit 63 executes a dynamic measurement point extraction process on the secondary measurement point data from which the non-moving-body measurement point groups were excluded in step S11 (step S15). For example, from the secondary measurement point data, which is the measurement point data obtained by excluding the non-moving-body measurement point groups from the primary measurement point data, the object detection processing unit 63 excludes static measurement points, which are measurement points overlapping with the primary measurement point data acquired from the LiDAR 11 in the previous processing cycle, and extracts the remaining measurement points as dynamic measurement points. More specifically, using the secondary measurement point data and the primary measurement point data acquired in the previous processing cycle, the object detection processing unit 63 identifies and excludes static measurement points whose positions have not changed from the previous processing cycle to the current processing cycle, and extracts from the secondary measurement point data, as dynamic measurement points, those measurement points whose positions have changed from the previous processing cycle to the current processing cycle.
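A minimal sketch of this dynamic measurement point extraction (step S15). Treating "overlapping with the previous cycle's primary measurement point data" as "lying within a small tolerance of some previous-cycle point" is an assumption, and the tolerance value is hypothetical.

```python
import math

def extract_dynamic_points(secondary_points, previous_primary_points, tol=0.1):
    """Split the secondary measurement points into static points (those
    within tol of some previous-cycle primary point, i.e. unchanged in
    position) and dynamic points (those whose position has changed)."""
    dynamic, static = [], []
    for p in secondary_points:
        if any(math.dist(p, q) <= tol for q in previous_primary_points):
            static.append(p)
        else:
            dynamic.append(p)
    return dynamic, static
```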
[0044]
(Clustering process) Next, the object detection processing unit 63 executes a clustering process that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance and identifies first moving-body measurement point groups, composed of dynamic measurement points, and second moving-body measurement point groups, composed of dynamic measurement points and static measurement points (step S17). Specifically, the object detection processing unit 63 extracts, from the secondary measurement point data, one or more measurement point groups in which the distance between adjacent measurement points is within a predetermined distance. This predetermined distance may be the same as the distance used in the filtering of step S11, or it may be a different distance. Among the extracted measurement point groups, the object detection processing unit 63 identifies those composed of the dynamic measurement points extracted in step S15 as first moving-body measurement point groups, and those composed of dynamic and static measurement points as second moving-body measurement point groups.
[0045] In other words, in the dynamic measurement point extraction process (S15) and the clustering process (S17), the object detection processing unit 63 classifies the measurement point groups extracted from the secondary measurement point data into first moving-body measurement point groups, whose positions have changed significantly since the previous processing cycle, and second moving-body measurement point groups, whose positions have changed little.
[0046] FIGS. 7 and 8 are explanatory diagrams showing the dynamic measurement point extraction process (S15) and the clustering process (S17). The black circles indicate the measurement point groups extracted in the current processing cycle, and the white circles indicate the primary measurement point data acquired in the previous processing cycle. In the example shown in FIG. 7, the measurement point group extracted from the secondary measurement point data contains no static measurement points overlapping with the primary measurement point data of the previous processing cycle, so it is classified as a first moving-body measurement point group composed of dynamic measurement points. In the example shown in FIG. 8, on the other hand, the measurement point group extracted from the secondary measurement point data is composed of static measurement points overlapping with the primary measurement point data of the previous processing cycle (the measurement points within the broken line H1) and dynamic measurement points, which are the measurement points other than the static ones (the measurement points within the broken line H2). This measurement point group is therefore classified as a second moving-body measurement point group composed of dynamic and static measurement points.
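The classification illustrated in FIGS. 7 and 8 can be sketched as follows, given a grouped set of secondary measurement points and the static points identified in step S15. The function and its inputs are hypothetical; points are assumed to be hashable coordinate tuples.

```python
def classify_groups(groups, static_points):
    """A group containing no static point is a first moving-body group
    (purely dynamic, as in FIG. 7); a group mixing dynamic and static
    points is a second moving-body group (as in FIG. 8)."""
    static = set(static_points)
    first_groups, second_groups = [], []
    for g in groups:
        n_static = sum(1 for p in g if p in static)
        if n_static == 0:
            first_groups.append(g)
        elif n_static < len(g):
            second_groups.append(g)
        # groups that are entirely static are not moving-body candidates
    return first_groups, second_groups
```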
[0047]
(Matching process)
Next, the object detection processing unit 63 executes a matching process that associates the first moving object measurement point group and the second moving object measurement point group identified by the clustering process with the moving objects detected up to the previous processing cycle (steps S19 to S29). In this embodiment, the matching process includes a first matching process (steps S19 to S23), which uses simple processing, and a second matching process (steps S25 to S29), which associates the first moving object measurement point groups that could not be associated by the first matching process.
[0048]
"Associating the first moving object measurement point group and the second moving object measurement point group with the moving objects detected up to the previous processing cycle" means identifying and recording, from among the first and second moving object measurement point groups, the measurement point group in which the same moving object as in the already-detected moving object data has been detected. For example, if detected moving object data labeled with identification information for distinguishing individual moving objects has been recorded up to the previous processing cycle, together with the estimated moving speed and moving direction, a process is executed that identifies, among the first and second moving object measurement point groups identified in the current processing cycle, the measurement point group in which the same moving object as in the detected moving object data has been detected, labels it with the same identification information, and records it together with the moving speed and moving direction information.
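The association record described above might look like the following minimal sketch. The record layout, field names, and identifier format are illustrative assumptions; the disclosure specifies only that the matched group inherits the track's identification label and is stored with speed and direction.

```python
# Hypothetical sketch of the association record from paragraph [0048]: a
# measurement point group matched to an already-detected moving object
# inherits that object's identification label and is recorded together with
# the estimated speed and moving direction. All names are assumptions.

def associate(group, track, records):
    records.append({
        "id": track["id"],            # same identification label as the track
        "points": group,              # measurement point group of this cycle
        "speed": track["speed"],      # estimated moving speed
        "direction": track["direction"],  # estimated moving direction
    })
    return records

records = associate([(2.0, 1.0), (2.3, 1.1)],
                    {"id": "MO-7", "speed": 3.2, "direction": 90.0}, [])
print(records[0]["id"])  # -> MO-7
```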
[0049] The first matching process is a matching process that can be executed more simply (in a shorter time) than the second matching process described later. It is executed in order to complete the association for those first and second moving object measurement point groups that can be associated easily and reliably without resorting to the second matching process.
[0050] The first matching process may be performed using an existing matching technique. For example, the object detection processing unit 63 uses the first moving object measurement point group, the second moving object measurement point group, and the moving object data labeled as moving objects up to the previous processing cycle (detected moving object data) to search for first and second moving object measurement point groups that can be associated with the detected moving object data.

[Figure imgf000011_0001]

The distance D obtained for each pairing with the detected moving object data, the velocity components Vx_c1(n), Vy_c1(n) of each measurement point group, and the velocity components Vx_mo(m), Vy_mo(m) of each item of detected moving object data are input to the machine learning model 70 to determine which measurement point groups and which items of detected moving object data can be associated with one another. The machine learning model 70 is a model that has learned the motion patterns of moving objects, using as input data measurement point group data for learning that has been collected or generated in advance for various types of moving objects traveling at various speeds, turning rates, and orientations. A support vector machine is typically used as the machine learning model, but other machine learning models such as a random forest or a neural network may also be used.
[0057] In this way, the detected moving object data to be associated with the first and second moving object measurement point groups that were not associated by the first matching process is obtained. The object detection processing unit 63 records each first or second moving object measurement point group for which associated detected moving object data exists, in association with that detected moving object data, together with information on its speed, moving direction, and orientation (step S27).
[0058] On the other hand, for the first and second moving object measurement point groups for which no associated detected moving object data exists, the object detection processing unit 63 generates and records detected moving object data as a new moving object (step S29). When newly generating detected moving object data, the object detection processing unit 63 may use the first and second moving object measurement point groups that could not be associated with detected moving object data in each processing cycle. Specifically, the object detection processing unit 63 identifies the first and second moving object measurement point groups of multiple frames that are estimated to be measurement point groups of the same object, based on the distance between the first and second moving object measurement point groups across processing cycles (between frames) and on the moving speed in each processing cycle, and calculates the position and speed of those measurement point groups in each processing cycle as well as the size (length and width) of the object. The newly generated detected moving object data is used in the matching process from the next processing cycle onward.
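The new-track generation of step S29 could be sketched as follows. The function names, the dictionary layout, and the 0.1 s cycle period are illustrative assumptions; the disclosure specifies only that velocity and size are derived from the same object's clusters across frames.

```python
# Illustrative sketch of generating new detected-moving-object data: the
# track velocity comes from frame-to-frame centroid displacement, and the
# object size from the latest cluster's bounding box.

CYCLE_S = 0.1  # assumed processing-cycle period [s]

def new_track(frames):
    """frames: list of point lists for the same object, one per processing cycle."""
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))
    c0, c1 = centroid(frames[-2]), centroid(frames[-1])
    vx = (c1[0] - c0[0]) / CYCLE_S
    vy = (c1[1] - c0[1]) / CYCLE_S
    xs = [p[0] for p in frames[-1]]
    ys = [p[1] for p in frames[-1]]
    return {"pos": c1, "vel": (vx, vy),
            "size": (max(xs) - min(xs), max(ys) - min(ys))}

track = new_track([[(0.0, 0.0), (1.0, 0.0)], [(0.5, 0.0), (1.5, 0.0)]])
print(track["vel"])   # -> (5.0, 0.0)
print(track["size"])  # -> (1.0, 0.0)
```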
[0059] The object detection processing unit 63 repeats the processing of steps S11 to S29 in each processing cycle, associating the measurement point groups extracted from the measurement point data detected by the LiDAR 11 with the detected moving object data.
[0060] As described above, in this embodiment, the object detection processing unit 63 excludes, from the measurement point data of the LiDAR 11 (primary measurement point data), non-moving object measurement point groups whose size clearly exceeds that of a moving object, and extracts dynamic measurement points by excluding static measurement points whose position has not changed since the previous processing cycle. The object detection processing unit 63 also clusters the secondary measurement point data obtained by excluding the non-moving object measurement point groups from the primary measurement point data, classifies the resulting measurement point groups into first moving object measurement point groups consisting of dynamic measurement points and second moving object measurement point groups consisting of dynamic and static measurement points, and executes a matching process against the detected moving object data that has already been recorded.
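The per-cycle pipeline summarized above can be condensed into a short sketch. The thresholds (12 m length limit, 1 m adjacency distance, 0.1 m static-overlap tolerance) are illustrative assumptions, not values from the disclosure, and the greedy grouping is a simplification of a real clustering step.

```python
# Condensed sketch of the per-cycle pipeline: cluster the primary points by
# adjacency, drop oversized (non-moving-object) clusters, then drop points
# unchanged since the previous cycle to obtain the dynamic points.
import math

def cluster(points, d_max=1.0):
    # Greedy adjacency grouping; sufficient for this sketch, though a real
    # implementation would merge groups (e.g. single-linkage or a grid).
    groups = []
    for p in points:
        for g in groups:
            if any(math.dist(p, q) <= d_max for q in g):
                g.append(p)
                break
        else:
            groups.append([p])
    return groups

def pipeline(primary, prev_primary, len_max=12.0, d_max=1.0, tol=0.1):
    # Filtering: exclude clusters whose extent in x or y reaches len_max.
    kept = [g for g in cluster(primary, d_max)
            if max(max(p[0] for p in g) - min(p[0] for p in g),
                   max(p[1] for p in g) - min(p[1] for p in g)) < len_max]
    secondary = [p for g in kept for p in g]
    # Dynamic extraction: drop static points seen in the previous cycle.
    dynamic = [p for p in secondary
               if not any(math.dist(p, q) <= tol for q in prev_primary)]
    return secondary, dynamic

guardrail = [(i * 0.5, 5.0) for i in range(41)]   # 20 m static structure
car = [(0.0, 0.0), (0.5, 0.0)]                    # small moving cluster
sec, dyn = pipeline(guardrail + car, prev_primary=[(0.0, 0.0)])
print(sec)  # -> [(0.0, 0.0), (0.5, 0.0)]
print(dyn)  # -> [(0.5, 0.0)]
```

The 20 m chain is removed by the length filter, and the point overlapping the previous cycle is removed by the static-point test, leaving only the genuinely moving point.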
[0061] It is therefore possible to execute the matching process against the already-recorded detected moving object data separately for the first moving object measurement point groups, whose moving speed is estimated to be relatively high, and the second moving object measurement point groups, whose moving speed is estimated to be relatively low. The matching candidates can thus be narrowed down to either the first or the second moving object measurement point groups according to the speed in the detected moving object data, which improves the detection accuracy for moving objects. In this way, the object detection device according to this embodiment can associate and detect the same moving object from the primary measurement point data acquired over a series of processing cycles, even though the measurement point data of the LiDAR 11 carries no velocity information.
[0062] In this embodiment, the object detection processing unit 63 also executes a first matching process that determines whether a moving object in the detected moving object data whose speed is equal to or higher than a predetermined speed threshold can be associated with any of the first moving object measurement point groups, and determines whether moving object data whose speed is below the predetermined speed threshold can be associated with any of the first and second moving object measurement point groups. Since the matching candidates can be narrowed down according to the speed in the detected moving object data, the load of the matching process can be reduced.
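The candidate narrowing of paragraph [0062] reduces to a simple rule. The 5 m/s threshold below is an assumed value for illustration; the disclosure only states that a predetermined speed threshold is used.

```python
# Sketch of how the first matching process narrows its candidates: fast
# tracks are compared only against first (fully dynamic) groups, slow tracks
# against both kinds. Threshold value is an assumption.

SPEED_THRESHOLD = 5.0  # assumed speed threshold [m/s]

def candidates(track_speed, first_groups, second_groups):
    if track_speed >= SPEED_THRESHOLD:
        return list(first_groups)
    return list(first_groups) + list(second_groups)

first = ["G1"]
second = ["G2", "G3"]
print(candidates(8.0, first, second))  # -> ['G1']
print(candidates(1.0, first, second))  # -> ['G1', 'G2', 'G3']
```

A fast track cannot leave static overlap between cycles, so it never needs to be compared against a second (partly static) group, which is what cuts the matching load.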
[0063] In this embodiment, the object detection processing unit 63 also executes a second matching process that inputs the data of the first and second moving object measurement point groups, together with the moving object data detected up to the previous processing cycle, into a machine learning model, and determines whether the data of the first and second moving object measurement point groups can be associated with the moving object data detected up to the previous processing cycle. Based on the learned motion patterns of various moving objects, the data of the first and second moving object measurement point groups can thus be matched accurately against the moving object data detected up to the previous processing cycle.
[0064] In this embodiment, the object detection processing unit 63 also executes the second matching process, which uses the machine learning model, after removing the first and second moving object measurement point groups, and the detected moving object data, that could be associated by the simple matching process (the first matching process). The matching process against the detected moving object data can therefore be executed accurately while further reducing the processing time and processing load required for moving object detection.
[0065] In this embodiment, the predetermined length threshold in the filtering process is also set to the assumed maximum length of a moving object, and in the filtering process the object detection processing unit 63 labels and records measurement point groups whose length in any direction is equal to or greater than the predetermined length threshold as road surface irregularities or objects installed on the road. Measurement point groups exceeding the assumed size of a moving object can thus easily be identified as road surface irregularities or objects installed on the road.
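The labeling in paragraph [0065] amounts to a single comparison. The 12 m value and the label strings below are illustrative assumptions; the disclosure only requires that the threshold equal the assumed maximum length of a moving object.

```python
# Minimal sketch of the [0065] labeling: any cluster whose extent in some
# direction reaches the assumed maximum vehicle length is recorded as road
# infrastructure rather than a moving object. Threshold value is assumed.

MAX_VEHICLE_LENGTH = 12.0  # assumed maximum length of a moving object [m]

def label(cluster_length):
    if cluster_length >= MAX_VEHICLE_LENGTH:
        return "road_surface_or_installed_object"
    return "moving_object_candidate"

print(label(25.0))  # -> road_surface_or_installed_object
print(label(4.5))   # -> moving_object_candidate
```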
[0066] The above effects of the object detection device can likewise be achieved by the object detection method and by a computer program that executes the object detection processing.
[0067] Note that in the matching process, the first matching process may be omitted and only the second matching process may be executed; an object detection device that executes this processing method is also understood to fall within the technical scope of the present disclosure.
[0068] Although preferred embodiments of the present invention have been described above in detail with reference to the accompanying drawings, the present invention is not limited to these examples. It is evident that a person having ordinary knowledge in the technical field to which the present invention belongs could conceive of various alterations or modifications within the scope of the technical ideas set forth in the claims, and it is understood that these naturally also fall within the technical scope of the present invention.
[0069] For example, in the embodiment described above, the LiDAR 11 calculates the position and velocity of each measurement point and transmits the calculated position and velocity information to the object detection device 50. Alternatively, the LiDAR 11 may calculate the position of each measurement point and transmit the calculated position information to the object detection device 50, and the object detection device 50 may calculate the velocity information of the measurement points.
[Explanation of symbols]
[0070]
1: vehicle, 11, 11LF: radar sensor, 17: position detection sensor, 40: vehicle control device, 50: object detection device, 51: communication unit, 53: processing unit, 55: storage unit, 61: acquisition unit, 63: object detection processing unit, CR: sensor coordinate system, Cv: vehicle coordinate system, Cw: fixed coordinate system, Pf: erroneous measurement point, Pa_i, Pb_i: measurement points for determination, ar, av: azimuth angles

Claims

[Document name] Claims
[Claim 1]
An object detection device (100) that detects an object based on measurement point data acquired by a LiDAR (11), the device executing:
a filtering process that extracts, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, treats any extracted measurement point group whose length in any direction is equal to or greater than a predetermined length threshold as a non-moving object measurement point group, and excludes the non-moving object measurement point group from the primary measurement point data;
a dynamic measurement point extraction process that extracts, as dynamic measurement points, the measurement points of secondary measurement point data, which is the measurement point data obtained by excluding the non-moving object measurement point group from the primary measurement point data, excluding static measurement points, which are measurement points that overlap with the primary measurement point data used in the previous processing cycle;
a clustering process that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving object measurement point group composed of the dynamic measurement points and a second moving object measurement point group composed of the dynamic measurement points and the static measurement points; and
a matching process that associates the first moving object measurement point group and the second moving object measurement point group identified by the clustering process with moving object data detected up to the previous processing cycle.
[Claim 2]
The object detection device according to claim 1, wherein, in the matching process, the device executes a first matching process that determines whether a moving object in the detected moving object data whose speed is equal to or higher than a predetermined speed threshold can be associated with any of the first moving object measurement point groups, and determines whether moving object data in the detected moving object data whose speed is below the predetermined speed threshold can be associated with any of the first moving object measurement point groups and the second moving object measurement point groups.
[Claim 3]
The object detection device according to claim 2, wherein the object detection device comprises a machine learning model that has learned motion patterns of various moving objects, and wherein, in the matching process, the device executes a second matching process that inputs the data of the first moving object measurement point groups and the second moving object measurement point groups that could not be associated by the first matching process, together with the moving object data detected up to the previous processing cycle, into the machine learning model, and determines whether the data of the first moving object measurement point groups and the second moving object measurement point groups can be associated with the moving object data detected up to the previous processing cycle.
[Claim 4]
The object detection device according to claim 1, wherein the object detection device comprises a machine learning model that has learned motion patterns of various moving objects, and wherein, in the matching process, the device inputs the data of the first moving object measurement point groups and the second moving object measurement point groups, together with the moving object data detected up to the previous processing cycle, into the machine learning model, and determines whether the data of the first moving object measurement point groups and the second moving object measurement point groups can be associated with the moving object data detected up to the previous processing cycle.
[Claim 5]
The object detection device according to claim 1, wherein the predetermined length threshold in the filtering process is set to the assumed maximum length of a moving object, and wherein, in the filtering process, the measurement point groups whose length in any direction is equal to or greater than the predetermined length threshold are labeled and recorded as road surface irregularities or objects installed on the road.
[Claim 6]
An object detection method for detecting an object based on measurement point data acquired by a LiDAR (11), the method comprising:
extracting, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, treating any extracted measurement point group whose length in any direction is equal to or greater than a predetermined length threshold as a non-moving object measurement point group, and excluding the non-moving object measurement point group from the primary measurement point data (step S11);
extracting, as dynamic measurement points, the measurement points of secondary measurement point data, which is the measurement point data obtained by excluding the non-moving object measurement point group from the primary measurement point data, excluding static measurement points, which are measurement points that overlap with the primary measurement point data used in the previous processing cycle (step S15);
a clustering process that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving object measurement point group composed of the dynamic measurement points and a second moving object measurement point group composed of the dynamic measurement points and the static measurement points (step S17); and
a matching process that associates the first moving object measurement point group and the second moving object measurement point group identified by the clustering process with moving object data detected up to the previous processing cycle (steps S19 to S29).
[Claim 7]
A computer program that causes a computer to execute processing for detecting an object based on measurement point data acquired by a LiDAR (11), the processing comprising:
extracting, from primary measurement point data, which is the measurement point data acquired by the LiDAR (11), measurement point groups in which the distance between adjacent measurement points is within a predetermined distance, treating any extracted measurement point group whose length in any direction is equal to or greater than a predetermined length threshold as a non-moving object measurement point group, and excluding the non-moving object measurement point group from the primary measurement point data (step S11);
extracting, as dynamic measurement points, the measurement points of secondary measurement point data, which is the measurement point data obtained by excluding the non-moving object measurement point group from the primary measurement point data, excluding static measurement points, which are measurement points that overlap with the primary measurement point data used in the previous processing cycle (step S15);
a clustering process that extracts, from the secondary measurement point data, measurement point groups in which the distance between adjacent measurement points is within the predetermined distance, and identifies a first moving object measurement point group composed of the dynamic measurement points and a second moving object measurement point group composed of the dynamic measurement points and the static measurement points (step S17); and
a matching process that associates the first moving object measurement point group and the second moving object measurement point group identified by the clustering process with moving object data detected up to the previous processing cycle (steps S19 to S29).
PCT/IB2023/057175 2022-07-22 2023-07-13 Object detection device, object detection method, and computer program WO2024018337A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022116899 2022-07-22
JP2022-116899 2022-07-22

Publications (1)

Publication Number Publication Date
WO2024018337A1 true WO2024018337A1 (en) 2024-01-25

Family

ID=87553703

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/057175 WO2024018337A1 (en) 2022-07-22 2023-07-13 Object detection device, object detection method, and computer program

Country Status (1)

Country Link
WO (1) WO2024018337A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200085436A (en) * 2019-01-07 2020-07-15 영남대학교 산학협력단 Lidar system for object recognition using background estimation technique
CN113343840A (en) * 2021-06-02 2021-09-03 合肥泰瑞数创科技有限公司 Object identification method and device based on three-dimensional point cloud
US20220035003A1 (en) * 2020-07-29 2022-02-03 The Johns Hopkins University Method and apparatus for high-confidence people classification, change detection, and nuisance alarm rejection based on shape classifier using 3d point cloud data


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23749155

Country of ref document: EP

Kind code of ref document: A1