WO2022215348A1 - Target Detection Device (物標検出装置) - Google Patents
Target Detection Device
- Publication number
- WO2022215348A1 (PCT/JP2022/005723)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- sensor
- information
- detection information
- fusion
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S13/588—Velocity or trajectory determination systems; Sense-of-movement determination systems deriving the velocity value from the range measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a target detection device.
- Driving assistance systems and automated driving systems are being developed to achieve various objectives, including reducing traffic accidents, reducing the burden on drivers, improving fuel efficiency to lessen the burden on the global environment, and providing means of transportation for vulnerable people in order to realize a sustainable society.
- A plurality of sensors (cameras, radars, etc.) is installed in the vehicle in order to monitor the surroundings of the vehicle in place of the driver.
- A system has been developed that uses the recognition results of a plurality of sensors mounted on a vehicle to automatically apply the brakes in response to specific objects such as pedestrians and vehicles.
- Patent Document 1 discloses an object recognition device that includes a moving-target discrimination unit that determines whether an object detected by a radar device is a moving target, an object extraction unit that extracts a specific object from an image captured by a camera unit, and a specific-object discrimination unit.
- The specific-object discrimination unit determines whether the object detected by the radar device is a specific object based on the discrimination result of the moving-target discrimination unit and the extraction result of the object extraction unit.
- The present invention has been made in view of such problems, and its object is to provide a target detection device that can appropriately determine the reliability of a target.
- A target detection device according to the present invention detects targets around a vehicle based on sensor information measured by a plurality of sensors, and comprises a first sensor, a second sensor, an integration processing unit that integrates the target detection information of the first sensor and the target detection information of the second sensor in each cycle, an integration history management unit that manages the integration results of the integration processing unit in chronological order as an integration history, and a reliability determination unit that determines the reliability of a target type based on the integration history and the target detection information of the first sensor and the second sensor.
- the target detection information of the first sensor includes target type information.
- According to the present invention, it is possible to provide a target detection device that can appropriately determine the reliability of a target.
- FIG. 1 is a functional block diagram of a target detection device (sensor fusion device 1) according to a first embodiment
- FIG. 2 is a flowchart explaining the procedure for determining reliability and adding detailed type information in the reliability determination unit 50 and the detailed type assignment unit 60;
- FIG. 3 is a schematic diagram illustrating reliability determination and addition of detailed type information in a scenario in which a pedestrian crosses in front of the vehicle;
- FIG. 4 is a schematic diagram illustrating reliability determination and addition of detailed type information in a scenario in which the vehicle turns left toward a pedestrian;
- FIG. 5 is a schematic diagram illustrating reliability determination and addition of detailed type information in a scenario in which a preceding vehicle passes over a manhole;
- FIG. 6 is a schematic diagram illustrating reliability determination and addition of detailed type information in a scenario in which a pedestrian is hidden by an obstacle;
- FIG. 7 is a functional block diagram of a target detection device (sensor fusion device 1) according to a second embodiment
- FIG. 8 is a schematic diagram illustrating reliability determination and addition of detailed type information according to the second embodiment in one scenario.
- FIG. 1 is a functional block diagram of a target detection device (sensor fusion device 1) according to the first embodiment.
- The sensor fusion device 1 of the first embodiment includes an observation information processing unit 20, an integration processing unit 30, an integration history management unit 40, a reliability determination unit 50, and a detailed type assignment unit 60.
- the sensor fusion device 1 also receives output signals from the first sensor 10a, the second sensor 10b, and the own vehicle behavior detection sensor 10c.
- the first sensor 10a and the second sensor 10b are sensors that detect targets around the vehicle.
- The first sensor 10a is a sensor capable of determining the type of a target based on image data of the captured target (for example, a camera such as a far-infrared camera).
- The second sensor 10b is a sensor that can determine the presence and position of a target but does not have the function of determining its type (for example, a radar, a TOF (Time of Flight) sensor, or a combination thereof).
- the own vehicle behavior detection sensor 10c is a group of sensors that detect the speed, steering angle, yaw rate, etc. of the own vehicle.
- the vehicle behavior detection sensor 10c includes a wheel speed sensor, a steering angle sensor, an acceleration sensor, a gyro sensor, and the like.
- the detection information of the first sensor 10a includes at least the position, speed, type, and target ID of the detected target
- The detection information of the second sensor 10b includes at least the position, speed, and target ID of the detected target.
- the sensor fusion device 1 (electronic control device) and various sensors (first sensor 10a, second sensor 10b, etc.) according to the present embodiment include a computer (microcomputer) including an arithmetic device, a memory, and an input/output device.
- the computing device includes a processor and executes programs stored in memory. Part of the processing performed by the arithmetic device by executing the program may be executed by another arithmetic device (for example, hardware such as FPGA (Field Programmable Gate Array) and ASIC (Application Specific Integrated Circuit)).
- The memory includes ROM, which is a non-volatile storage element, and RAM.
- the ROM stores immutable programs (eg, BIOS) and the like.
- RAM is a high-speed, volatile memory device such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory), and stores programs executed by the arithmetic device and data used during their execution.
- The input/output device is an interface that transmits processing results of the electronic control device and sensors to the outside and receives data from the outside according to predetermined protocols.
- a program executed by the arithmetic unit is stored in a non-volatile memory, which is a non-temporary storage medium of the electronic control unit or sensor.
- The observation information processing unit 20 receives the detection information of the first sensor 10a and the second sensor 10b, as well as the speed, steering angle, yaw rate, etc. of the own vehicle output from the own vehicle behavior detection sensor 10c, and has the function of uniformly converting them into a common format.
- the code given as the target ID is the same code when the same target is tracked in the time direction.
- the above format includes at least time and coordinate information.
- The time indicates, for example, the fusion execution timing, and the coordinates refer to, for example, a coordinate system with the center of the vehicle as the origin, the front of the vehicle as the x-axis, and the left side of the vehicle as the y-axis.
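As a sketch of this format-conversion step, the following hypothetical helper converts a polar detection reported in a sensor's own frame into the common vehicle coordinate system described above (origin at the vehicle center, x forward, y to the left). The mounting-pose parameters are illustrative assumptions, not part of the patent:

```python
import math

def to_vehicle_frame(range_m, azimuth_rad, mount_x, mount_y, mount_yaw):
    """Convert a polar detection in a sensor's frame into the common
    vehicle frame (origin at vehicle center, x forward, y to the left).
    mount_* describe the sensor's assumed pose on the vehicle."""
    # Detection in the sensor's own Cartesian frame
    sx = range_m * math.cos(azimuth_rad)
    sy = range_m * math.sin(azimuth_rad)
    # Rotate by the sensor's mounting yaw, then translate by its mounting offset
    vx = mount_x + sx * math.cos(mount_yaw) - sy * math.sin(mount_yaw)
    vy = mount_y + sx * math.sin(mount_yaw) + sy * math.cos(mount_yaw)
    return vx, vy
```

In practice each sensor would carry its own mounting parameters, so every detection ends up in the one shared frame before grouping.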
- the integration processing unit 30 receives target detection information output from the observation information processing unit 20 and performs integration processing, which will be described later.
- The integration processing unit 30 includes a grouping unit 30a that groups a plurality of pieces of detection information, an integration unit 30b that integrates the grouped detection information to generate or update a fusion target, and a prediction update unit 30c that performs prediction updates for tracking the same target in the time series.
- The grouping unit 30a uses at least the position information in the target detection information to determine whether a plurality of pieces of detection information are detections of the same target, and groups the detection information determined to belong to the same target as one set. The pieces of detection information grouped by the grouping unit 30a are integrated by the integration unit 30b into a fusion target and given the same target ID. The grouping by the grouping unit 30a and the integration processing by the integration unit 30b are performed at predetermined intervals.
- In addition to the sameness determination among the pieces of detection information, the grouping unit 30a performs an identity determination between the estimation information of the fusion target obtained in the previous cycle and the pieces of detection information obtained in the current cycle.
- The identity determination with the fusion-target estimation information obtained in the previous cycle uses at least the position information included in that estimation information and the position information included in the pieces of detection information obtained in the current cycle.
- a plurality of pieces of detection information grouped by the grouping unit 30a are output to the integrating unit 30b.
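The position-based grouping described above can be sketched as nearest-neighbor gating: each current-cycle detection is assigned to the closest predicted fusion target within a distance gate, and ungated detections start new targets. The gate value and dictionary layout are illustrative assumptions:

```python
def group_detections(fusion_estimates, detections, gate_m=2.0):
    """Assign each current-cycle detection to the nearest predicted fusion
    target within a distance gate; ungated detections become new targets.
    fusion_estimates: list of (target_id, (x, y)) predicted positions.
    detections: list of dicts with "x" and "y" keys."""
    groups = {fid: [] for fid, _ in fusion_estimates}
    new_targets = []
    for det in detections:
        best_id, best_d = None, gate_m
        for fid, (px, py) in fusion_estimates:
            d = ((det["x"] - px) ** 2 + (det["y"] - py) ** 2) ** 0.5
            if d <= best_d:
                best_id, best_d = fid, d
        if best_id is None:
            new_targets.append(det)      # becomes a new fusion target
        else:
            groups[best_id].append(det)  # grouped under the existing target ID
    return groups, new_targets
```

A production system would use a proper assignment algorithm and the sensors' error characteristics rather than a fixed Euclidean gate; this only shows the shape of the per-cycle grouping step.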
- the integration unit 30b integrates the grouped pieces of detection information to generate or update a fusion target.
- The detection error characteristics of the first sensor 10a and the second sensor 10b are given to the integration unit 30b in advance as parameters, and integration can be performed based on positions and velocities corrected according to these error-characteristic parameters.
- the error characteristics may be defined by a covariance matrix, and the positions and velocities calculated by probability averaging may be used for integration.
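A minimal sketch of this probability-weighted averaging, shown per axis with scalar variances (the patent's covariance-matrix form generalizes this via matrix inverses):

```python
def fuse_axis(x_a, var_a, x_b, var_b):
    """Inverse-variance (probability-weighted) average of two measurements
    of the same quantity, e.g. one position axis from each sensor.
    Returns the fused value and its (reduced) variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * x_a + w_b * x_b) / (w_a + w_b)  # precision-weighted mean
    fused_var = 1.0 / (w_a + w_b)                  # fused uncertainty shrinks
    return fused, fused_var
```

The more accurate sensor (smaller variance) pulls the fused estimate toward its measurement, which is the intent of correcting by each sensor's error characteristics.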
- the error characteristics of each sensor may be estimated during integration processing in the integration unit 30b instead of being given to the integration unit 30b in advance.
- the detection information of the first sensor 10a and the second sensor 10b may include data of such error characteristics.
- the integration unit 30b outputs the detection result as the fusion target.
- Even when the detection information of only one sensor is grouped, it is expressed as being "integrated" into a fusion target.
- the position and speed information in the fusion target may be the same as the information detected by the sensor, or may indicate the internally estimated position and speed in chronological order according to the information detected by the sensor.
- The fusion target information, which is the result of the integration processing of the detection information in the integration unit 30b, is output to the prediction update unit 30c. Further, the fusion target and the detection information of the first sensor 10a and the second sensor 10b included in the fusion target are output to the integration history management unit 40.
- the integrated history management unit 40 has a function of managing the generated or updated fusion target information in chronological order.
- The prediction update unit 30c calculates an estimated value of the fusion target for the next cycle of integration processing, using the execution cycle of the integration processing, the position and speed information included in the fusion-target detection information, and the speed, steering angle, yaw rate, etc. of the own vehicle output from the own vehicle behavior detection sensor 10c. For example, the prediction update unit 30c uses at least the velocity of the fusion target and the execution cycle of the integration processing to calculate the change in the target's position over time, and estimates the position of the fusion target from that change, thereby obtaining the estimation information of the fusion target. The position and speed of the target may also be estimated using the speed, steering angle, yaw rate, etc. of the own vehicle, taking the turning behavior of the own vehicle into consideration. The estimation information of the fusion target obtained by the prediction update unit 30c is output to the grouping unit 30a as described above.
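As an illustrative sketch of this prediction update, the following hypothetical function advances a fusion target one cycle under a constant-velocity assumption and then compensates for the own vehicle's forward motion and yaw. The motion model and parameter names are assumptions; the patent leaves the exact model open:

```python
import math

def predict_next(x, y, vx, vy, dt, ego_speed=0.0, ego_yaw_rate=0.0):
    """Estimate a fusion target's position one integration cycle (dt) ahead
    in the vehicle frame, assuming constant target velocity and compensating
    for the ego vehicle's translation and turning during dt."""
    # Target moves at constant velocity in the vehicle frame
    px, py = x + vx * dt, y + vy * dt
    # Compensate for the ego vehicle's forward travel during dt
    px -= ego_speed * dt
    # Compensate for the ego vehicle's rotation: rotate the point by -yaw
    yaw = ego_yaw_rate * dt
    rx = px * math.cos(-yaw) - py * math.sin(-yaw)
    ry = px * math.sin(-yaw) + py * math.cos(-yaw)
    return rx, ry
```

The predicted position is what the grouping unit 30a compares against the next cycle's detections.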
- The integration history management unit 40 receives the fusion target output from the integration processing unit 30 and the detection information of the first sensor 10a and the second sensor 10b included in the fusion target, and stores and manages the fusion targets in chronological order for each target ID. The information managed by the integration history management unit 40 is output to the reliability determination unit 50 and the detailed type assignment unit 60.
- The reliability determination unit 50 receives the integration history of the fusion target from the integration history management unit 40, determines the reliability of the fusion target based on the integration history, and assigns the reliability information, which is the determination result, to the fusion target. Further, the detailed type assignment unit 60 assigns to the fusion target, as detailed type information, the type of the target indicated by the fusion target and its reliability information. The information generated by the detailed type assignment unit 60 is output to the driving control device 2, which executes vehicle control including automated driving and driving assistance.
- For each fusion target, the reliability of the specific type it indicates (pedestrian, car, cyclist, etc.) is determined according to the contents of the detection information of the fusion target obtained in the previous cycle and in the current cycle.
- The reliability is classified into at least three categories, "confirmed", "past", and "split", in descending order of reliability.
- One of the basic criteria for reliability determination is that the reliability is determined to be high when grouping is performed based on the detection information of the first sensor 10a. Another criterion is that the current cycle grouping is more reliable than the previous cycle grouping.
- "Split" has the lowest reliability. Specifically, for a given fusion target, if the detection information of the first sensor 10a that was grouped into it in the previous cycle is not grouped into it in the current cycle but instead exists as another fusion target in the current cycle, the reliability is determined to be "split". The reason "split" is less reliable than "past" is that, unlike "past", the consistency of grouping across multiple cycles is not maintained.
- Under the policy that the grouping of the current cycle is more correct, the correct behavior for the fusion target in question would have been not to integrate that detection information. In other words, grouping the first sensor's detection information in the previous cycle was actually an error, and the consistency of grouping is not maintained across multiple cycles. Therefore, "split" is judged to be less reliable than "past".
- the integrated history management unit 40 receives the results of grouping and integration of detected information from the integration processing unit 30 for each cycle, and manages the integration history for each cycle.
- the reliability determination unit 50 determines the reliability of the specific type in the fusion target obtained for each cycle.
- The detailed type assignment unit 60 assigns the fusion target a detailed type reflecting the reliability determination result.
- In step S100, the reliability determination unit 50 determines whether detection information obtained in the current cycle was grouped by the integration processing unit 30 into the fusion target of the target ID in question and specific type information was assigned. If the determination result is affirmative (YES), the process proceeds to step S130, and "confirmed" is assigned as the reliability information for the fusion target that was grouped and assigned specific type information. If the determination in step S100 is negative (NO), the process proceeds to step S110.
- In step S110, the reliability determination unit 50 receives the detection information obtained in the previous cycle from the integration history management unit 40 and determines whether that detection information was grouped into the corresponding fusion target in the previous cycle and specific type information was assigned. If YES, the process proceeds to step S120; if NO, the reliability is not determined and the process ends (END).
- In step S120, it is determined whether the detection information related to the specific type that was grouped and assigned in the previous cycle exists elsewhere in the current cycle (as a fusion target with another target ID). If the determination result in step S120 is YES, in step S140 the reliability of the specific type of the fusion target is determined to be "split", and detailed type information indicating this is added. If the determination result in step S120 is NO, in step S150 the reliability is determined to be "past", and detailed type information indicating this is added. This completes the procedure for determining reliability and adding detailed type information.
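The flow of steps S100 through S150 can be sketched as a small decision function. The boolean parameter names are illustrative, not taken from the patent:

```python
def judge_reliability(grouped_with_type_now, grouped_with_type_prev,
                      type_info_elsewhere_now):
    """Reliability decision following steps S100-S150 above.
    grouped_with_type_now:  first-sensor info with a specific type was
                            grouped into this fusion target this cycle.
    grouped_with_type_prev: the same held in the previous cycle.
    type_info_elsewhere_now: that type info now appears under another
                            target ID in the current cycle."""
    if grouped_with_type_now:          # S100 YES -> S130
        return "confirmed"
    if not grouped_with_type_prev:     # S110 NO -> no determination (END)
        return None
    if type_info_elsewhere_now:        # S120 YES -> S140
        return "split"
    return "past"                      # S120 NO -> S150
```

The three return strings correspond to the three reliability categories in descending order: "confirmed", "past", "split".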
- FIG. 3 is a schematic diagram illustrating the procedure for determining the reliability and detailed type when a pedestrian 200 crosses in front of the vehicle 210. In the following description, it is assumed that time progresses in the order (B) → (C) → (D) in FIG. 3.
- the detection range CFOV of the first sensor (camera) 10a is indicated by a dashed sector
- the detection range RFOV of the second sensor (radar) 10b is indicated by a solid sector.
- Detected information C1 and C2 of the first sensor 10a are indicated by circles
- detected information R1 of the second sensor 10b is indicated by a square.
- a fusion target F1 that is generated by integrating a plurality of pieces of detection information and to which detailed type information of "confirmed" is assigned with respect to a predetermined specific type is indicated by a solid-line ellipse.
- the fusion target F2 determined to be "split" is indicated by a dotted ellipse.
- When the pedestrian 200 crosses in front of the vehicle 210, in situation (B) the pedestrian 200 is in the overlapping portion of the detection range CFOV of the first sensor 10a and the detection range RFOV of the second sensor 10b, and is detected simultaneously by both sensors 10a and 10b.
- the integration unit 30b integrates the grouped detection information to generate the fusion target F1.
- the pedestrian 200 is still within the detection range CFOV of the first sensor 10a, but has moved outside the detection range RFOV of the second sensor 10b.
- Although the pedestrian 200 has already moved outside the detection range RFOV of the second sensor 10b, it is still detected as the target R1 by internal interpolation. Internal interpolation means that, to prevent a control target from going undetected, the sensor behaves as if it continues to detect a target that moves outside the detection range by interpolating it at the edge of the detection range.
- FIG. 4A shows a case where a vehicle 210 turning left (turning left) approaches a pedestrian 200 walking parallel to the left side of the vehicle.
- FIG. 4(B) shows the situation before the vehicle 210 starts to turn left, in which the pedestrian 200 exists in the overlapping portion of the detection range CFOV of the first sensor 10a and the detection range RFOV of the second sensor 10b.
- the detection information of both sensors 10a and 10b indicates the same position information.
- Then, the vehicle 210 starts turning left, and the pedestrian 200 approaches the left side of the vehicle 210.
- In step S110, it is determined whether the detection information was grouped into the fusion target F1 in the previous cycle and specific type information was added. In this example, the determination is YES, and the process proceeds to step S120.
- In step S120, it is determined whether the detection information related to the specific type that was grouped into the fusion target F1 in the previous cycle exists elsewhere in the current cycle.
- the determination is NO because the fusion target F1' is determined to have an unknown type.
- FIG. 5 shows the determination procedure when another preceding vehicle 230 is traveling in front of the own vehicle 210 and the preceding vehicle 230 passes a manhole 220 on the road on the way.
- another preceding vehicle 230 is traveling in front of the own vehicle 210 at approximately the same speed.
- The manhole 220 is not detected because it is blocked by the preceding vehicle 230.
- the second sensor 10b does not detect the manhole 220, but detects only another preceding vehicle 230 as a target.
- the integration unit 30b integrates the grouped detection information to generate the fusion target F1.
- FIG. 5(C) assumes that the preceding vehicle 230 has just passed over the manhole 220.
- FIG. 5(D) shows a case where the preceding vehicle 230 continues to travel further, moving ahead and away from the manhole 220.
- In this case, the determination in step S100 is NO, the subsequent determination in step S110 is YES, and the determination in step S120 is also YES, so the reliability is determined to be "split".
- That is, the detection information of the preceding vehicle 230 from the first sensor 10a and the detection information of the manhole 220 from the second sensor 10b had been erroneously integrated and identified as the same target.
- According to the determination procedure of FIG. 2, even if an erroneous reliability determination is made in one cycle, the error is corrected in the next cycle, ensuring the accuracy of the reliability determination.
- FIG. 6 shows a case where the pedestrian 200 is walking parallel to the own vehicle 210 ahead of it, but an obstacle 240 (such as a fence) along the way makes the pedestrian invisible from the own vehicle 210.
- the vehicle 210 is traveling forward, but the pedestrian 200 is still positioned to the left front of the vehicle 210 and is not obstructed by the obstacle 240.
- Various types of detection information are in a state as shown in FIG. 6(B).
- When the own vehicle 210 travels further, the situation shown in FIG. 6(C) is obtained.
- the first sensor 10a cannot capture the image of the pedestrian 200, and the detection information of the first sensor 10a has not been obtained.
- the second sensor 10b continuously obtains detection information based on the above-described internal interpolation function and the like, detects it as the target R1, and integrates it into the fusion target F1.
- In this case, NO is determined in step S100 and YES is determined in step S110; since the grouped specific type information does not exist elsewhere in the current cycle, the determination in step S120 is NO, and the reliability is determined to be "past".
- In this way, the specific type of the fusion target can be determined with high reliability, which makes it possible to improve the accuracy of automated driving and driving assistance.
- As described above, the integration history management unit 40 manages the integration results of the detection information of each sensor obtained in each cycle, and by comparing the integration result obtained in the previous cycle with that obtained in the current cycle, the specific type can be determined with high reliability.
- FIG. 7 is a functional block diagram of a target detection device (sensor fusion device 1) according to the second embodiment.
- the same reference numerals are given to the same configurations as in the first embodiment (FIG. 1), and duplicate descriptions will be omitted below.
- The second embodiment differs from the first in that the integration processing unit 30 has a movement prediction unit 30d.
- the movement prediction unit 30d predicts movement prediction information such as the movement speed and movement direction of the target according to the detection information of the first sensor 10a and the second sensor 10b and the detection information of the own vehicle behavior detection sensor 10c.
- the grouping unit 30a and the integration unit 30b of the integration processing unit 30 perform grouping and integration according to these prediction results of the movement prediction unit 30d.
- The movement prediction information predicted by the movement prediction unit 30d may include, in addition to the moving speed and direction of the target, its acceleration, the continuous deviation of its position, and the orientation of the target obtained from image recognition (for example, the direction of a pedestrian's face or line of sight), and is not limited to a specific combination.
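A minimal sketch of how such movement prediction information could inform grouping: a detection is accepted into a fusion target only if its speed and heading agree with the target's prediction. The thresholds and field names are illustrative assumptions; a real system could also weigh acceleration or image-derived orientation as noted above:

```python
import math

def prediction_agreement(pred, det, max_speed_diff=1.0, max_heading_diff=0.5):
    """Check whether a current-cycle detection matches a fusion target's
    movement prediction (speed in m/s, heading in radians).
    Returns True when both quantities agree within the thresholds."""
    speed_ok = abs(pred["speed"] - det["speed"]) <= max_speed_diff
    # Wrap the heading difference into [-pi, pi] before comparing
    dh = (pred["heading"] - det["heading"] + math.pi) % (2 * math.pi) - math.pi
    heading_ok = abs(dh) <= max_heading_diff
    return speed_ok and heading_ok
```

In the FIG. 8 scenario, a check like this is what keeps target C2 grouped with F1' while the inconsistent target R2 is split off into F2.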
- the situation in FIG. 8(B) is the same as the situation in FIG. 3(B).
- the integration unit 30b integrates the grouped detection information to generate the fusion target F1.
- the situation in FIG. 8(C) is similar to the situation in FIG. 3(C).
- the pedestrian 200 is still within the detection range CFOV of the first sensor 10a, but has moved outside the detection range RFOV of the second sensor 10b.
- the detection information of the first sensor 10a (target C2) and the detection information of the second sensor 10b (target R2) are therefore not grouped under the same target ID.
- target C2 is merged into the fusion target F1', while target R2 is merged into another fusion target F2.
- the grouping unit 30a and the integration unit 30b group the detection information of the first sensor 10a (target C2) into the fusion target F1', which is integrated based on the degree of matching with the movement prediction information of the previous cycle.
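The association decision above can be sketched as a simple gating check, under the assumption (not stated in the disclosure) that the degree of matching is a Euclidean distance against the predicted position; all names and the gate radius are illustrative:

```python
# Hypothetical sketch of the grouping decision: a detection is associated
# with an existing fusion target only if it falls within a gate radius of
# that target's position predicted from the previous cycle.
import math


def associate(detection, fusion_targets, gate=2.0):
    """Return the id of the fusion target whose predicted position best
    matches the detection, or None so that a new fusion target is created."""
    best_id, best_dist = None, gate
    for fid, (px, py) in fusion_targets.items():
        d = math.hypot(detection[0] - px, detection[1] - py)
        if d < best_dist:
            best_id, best_dist = fid, d
    return best_id


# F1' predicted at (19.0, 0.1): camera target C2 lands near it and is
# absorbed into F1'; radar target R2 does not, and starts a new target F2.
predicted = {"F1'": (19.0, 0.1)}
print(associate((19.2, 0.0), predicted))  # "F1'"
print(associate((30.0, 5.0), predicted))  # None
```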
- in the second embodiment, in addition to obtaining the same effects as the first embodiment, referring to the movement prediction information makes it possible to determine the specific type of the fusion target with even higher reliability.
- each configuration, function, processing unit, processing means, etc. described above may be realized in hardware, for example by designing part or all of them as an integrated circuit, or may be realized in software by having a processor interpret and execute a program that implements each function. Information such as the programs, tables, and files that implement each function can be stored in a storage device such as a memory, hard disk, or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, DVD, or BD.
- the control lines and information lines shown in the attached drawings are those considered necessary for explanation, and do not necessarily show all the control lines and information lines required for implementation. In practice, almost all configurations can be considered interconnected.
Description
FIG. 1 is a functional block diagram of the target detection device (sensor fusion device 1) according to the first embodiment. As shown in FIG. 1, the sensor fusion device 1 of the first embodiment includes an observation information processing unit 20, an integration processing unit 30, an integration history management unit 40, a reliability determination unit 50, and a detailed type assignment unit 60. The output signals of the first sensor 10a, the second sensor 10b, and the own-vehicle behavior detection sensor 10c are input to the sensor fusion device 1.
Next, the target detection device of the second embodiment will be described with reference to FIGS. 7 and 8. FIG. 7 is a functional block diagram of the target detection device (sensor fusion device 1) according to the second embodiment. In FIG. 7, the same reference numerals are given to the same configurations as in the first embodiment (FIG. 1), and duplicate descriptions are omitted below.
The present invention is not limited to the embodiments described above, and includes various modifications and equivalent configurations within the spirit of the appended claims. For example, the embodiments described above have been explained in detail for easy understanding of the present invention, and the present invention is not necessarily limited to those having all of the described configurations. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. Further, other configurations may be added to, deleted from, or substituted for part of the configuration of each embodiment.
2… driving control device
10a… first sensor
10b… second sensor
10c… own-vehicle behavior detection sensor
20… observation information processing unit
30… integration processing unit
30a… grouping unit
30b… integration unit
30c… prediction update unit
40… integration history management unit
50… reliability determination unit
200… pedestrian
210… own vehicle
220… manhole
230… preceding vehicle
240… obstacle
Claims (7)
- A target detection device that detects a target around a vehicle based on sensor information measured by a plurality of sensors, the target detection device comprising:
a first sensor;
a second sensor;
an integration processing unit that integrates, in each cycle, detection information of a target from the first sensor and detection information of the same target from the second sensor;
an integration history management unit that manages the integration results of the integration processing unit in time series as an integration history; and
a reliability determination unit that determines the reliability of the type of the target based on the integration history and the detection information of the target from the first sensor and the second sensor,
wherein the detection information of the target from the first sensor includes type information of the target.
- The target detection device according to claim 1, wherein the first sensor and the second sensor are configured to output position information of the target as detection information, and the integration processing unit performs the integration according to the position information.
- The target detection device according to claim 1, wherein the integration history management unit manages, as the integration result, the target ID and target type of a fusion target in time series.
- The target detection device according to claim 1, wherein the reliability determination unit is configured to execute:
a first step of determining whether detection information has been grouped into one fusion target and assigned specific type information in the current cycle;
a second step of determining whether detection information had been grouped into the corresponding fusion target and assigned specific type information in the previous cycle; and
a third step of determining whether other detection information related to the specific type grouped and assigned in the previous cycle exists in the current cycle.
- The target detection device according to claim 4, wherein the reliability determination unit determines the reliability of the determination of the specific type information according to the results of the determinations in the first to third steps.
- The target detection device according to claim 1, wherein the first sensor and the second sensor are configured to output position information of the target as detection information, the integration processing unit further comprises a movement prediction unit that predicts movement of the target, and the integration processing unit performs the integration according to the position information and the prediction of the movement of the target by the movement prediction unit.
- The target detection device according to claim 1, further comprising an own-vehicle behavior detection sensor that detects behavior of the own vehicle.
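One possible reading of the three-step determination recited in claim 4 can be sketched as follows. This is an interpretation only, not the claimed method itself; the function name, the per-cycle data layout, and the way step 3 is combined into a single flag are all assumptions:

```python
# Hypothetical sketch of the three-step reliability determination:
#   step 1: was a specific type assigned to this fusion target this cycle?
#   step 2: was it also assigned to the corresponding target last cycle?
#   step 3: does last cycle's specific type now appear on some OTHER
#           current fusion target (suggesting the detections split)?
def specific_type_reliability(curr, prev):
    """curr/prev map fusion-target id -> set of specific types assigned
    by grouping in that cycle; returns a per-target reliability flag."""
    flags = {}
    for fid, types in curr.items():
        step1 = bool(types)
        step2 = bool(prev.get(fid))
        prev_types = prev.get(fid, set())
        step3 = any(prev_types & other_types
                    for other_id, other_types in curr.items()
                    if other_id != fid)
        flags[fid] = step1 and step2 and not step3
    return flags


curr = {"F1": {"pedestrian"}, "F2": set()}
prev = {"F1": {"pedestrian"}}
print(specific_type_reliability(curr, prev))  # {'F1': True, 'F2': False}
```

Here F1 keeps its pedestrian type across both cycles and no other current target has inherited it, so its type determination is flagged reliable; F2 has no specific type this cycle and is not.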
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/552,253 US20240159895A1 (en) | 2021-04-08 | 2022-02-14 | Target detection device |
DE112022000744.3T DE112022000744T5 (de) | 2021-04-08 | 2022-02-14 | Zieldetektionsvorrichtung |
JP2023512841A JPWO2022215348A1 (ja) | 2021-04-08 | 2022-02-14 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-065863 | 2021-04-08 | ||
JP2021065863 | 2021-04-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022215348A1 true WO2022215348A1 (ja) | 2022-10-13 |
Family
ID=83546318
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/005723 WO2022215348A1 (ja) | 2021-04-08 | 2022-02-14 | 物標検出装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240159895A1 (ja) |
JP (1) | JPWO2022215348A1 (ja) |
DE (1) | DE112022000744T5 (ja) |
WO (1) | WO2022215348A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007310741A (ja) * | 2006-05-19 | 2007-11-29 | Fuji Heavy Ind Ltd | 立体物認識装置 |
JP2014222462A (ja) * | 2013-05-14 | 2014-11-27 | 株式会社デンソー | 衝突緩和装置 |
JP2018205879A (ja) * | 2017-05-31 | 2018-12-27 | 本田技研工業株式会社 | 物標認識システム、物標認識方法、およびプログラム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5806647B2 (ja) | 2012-07-12 | 2015-11-10 | 本田技研工業株式会社 | 物体認識装置 |
-
2022
- 2022-02-14 JP JP2023512841A patent/JPWO2022215348A1/ja active Pending
- 2022-02-14 WO PCT/JP2022/005723 patent/WO2022215348A1/ja active Application Filing
- 2022-02-14 US US18/552,253 patent/US20240159895A1/en active Pending
- 2022-02-14 DE DE112022000744.3T patent/DE112022000744T5/de active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007310741A (ja) * | 2006-05-19 | 2007-11-29 | Fuji Heavy Ind Ltd | 立体物認識装置 |
JP2014222462A (ja) * | 2013-05-14 | 2014-11-27 | 株式会社デンソー | 衝突緩和装置 |
JP2018205879A (ja) * | 2017-05-31 | 2018-12-27 | 本田技研工業株式会社 | 物標認識システム、物標認識方法、およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20240159895A1 (en) | 2024-05-16 |
JPWO2022215348A1 (ja) | 2022-10-13 |
DE112022000744T5 (de) | 2023-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6404634B2 (ja) | 予測的な先進運転支援システムの一貫性のある挙動生成 | |
KR102659056B1 (ko) | 주변 v2v 신호와 자체 차량의 센싱신호의 융합 시스템 및 방법 | |
US11789141B2 (en) | Omnidirectional sensor fusion system and method and vehicle including the same | |
US20200167576A1 (en) | Method, Computer Program Product, and Driver Assistance System for Determining One or More Lanes of a Road in an Environment of a Vehicle | |
US20210001883A1 (en) | Action selection device, computer readable medium, and action selection method | |
US10482332B2 (en) | Pedestrian determining apparatus for determining whether an object is a pedestrian crossing ahead of an own vehicle | |
US11748593B2 (en) | Sensor fusion target prediction device and method for vehicles and vehicle including the device | |
KR20150028258A (ko) | 정보 이용을 위한 방법 및 시스템 | |
CN111341148A (zh) | 用于处理多重反射信号的机动车的控制系统以及控制方法 | |
EP3467545A1 (en) | Object classification | |
US11919544B2 (en) | Method and device for operating an automated vehicle | |
US20220281476A1 (en) | Aiming device, driving control system, and method for calculating correction amount for sensor data | |
US11541885B2 (en) | Location prediction for dynamic objects | |
WO2022215348A1 (ja) | 物標検出装置 | |
KR20200133122A (ko) | 차량 충돌 방지 장치 및 방법 | |
JP6756507B2 (ja) | 環境認識装置 | |
JP6834020B2 (ja) | 物体認識装置および物体認識方法 | |
JP7374057B2 (ja) | 信号処理装置 | |
CN110709727A (zh) | 用于检测迎面车辆的车辆系统 | |
US11577753B2 (en) | Safety architecture for control of autonomous vehicle | |
US20230258792A1 (en) | Object Tracking Method by Using Sensor Fusion Technology and Vehicle Driving System by the Same | |
KR102682215B1 (ko) | 레이더 센서 포인트 클라우드 데이터를 이용한 물체 검출 기술 | |
WO2024084690A1 (ja) | 情報処理装置及び情報処理方法 | |
US20240166204A1 (en) | Vehicle Collision Threat Assessment | |
JP7454353B2 (ja) | 処理装置、及びそれを用いた外界認識装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22784333 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023512841 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022000744 Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18552253 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22784333 Country of ref document: EP Kind code of ref document: A1 |