WO2023033040A1 - Flare detection system, flare detection device, flare detection method, and flare detection program - Google Patents

Flare detection system, flare detection device, flare detection method, and flare detection program

Info

Publication number
WO2023033040A1
WO2023033040A1 (application PCT/JP2022/032758 / JP2022032758W)
Authority
WO
WIPO (PCT)
Prior art keywords
flare
point group
detection
extracting
point
Prior art date
Application number
PCT/JP2022/032758
Other languages
English (en)
Japanese (ja)
Inventor
大輔 高棹
Original Assignee
株式会社デンソー (DENSO Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO Corporation)
Publication of WO2023033040A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/487Extracting wanted echo signals, e.g. pulse detection

Definitions

  • The present disclosure relates to a flare detection technique for detecting a flare point group included in a point group of detection data acquired by a rangefinder that senses reflected light with respect to light irradiation.
  • Patent Document 1 discloses a technique for detecting, in a distance measuring device, abnormal data generated by a glossy surface or by multipath. This technique distinguishes normal data from abnormal data based on the pulse width of the received light pulse and invalidates the distance value of the abnormal data.
  • However, the data detected by the rangefinder may contain data caused by flares.
  • The light-receiving pulse width of a flare and that of reflected light from another object may become approximately the same, in which case the flare may not be detected accurately.
  • An object of the present disclosure is to provide a flare detection system capable of accurately detecting flare. Other objects are to provide a flare detection device, a flare detection method, and a flare detection program capable of accurately detecting flare.
  • A first aspect of the present disclosure is a flare detection system that has a processor and detects a flare point group included in a point group of detection data acquired by a rangefinder that senses reflected light with respect to light irradiation.
  • The processor is configured to execute: extracting a detection target cluster by clustering the point group according to distance values; extracting a corresponding point corresponding to a flare source from the point group that constitutes the detection target cluster; and detecting the flare point group by estimating a boundary between the flare point group and an entity reflection point group that includes the corresponding point.
  • A second aspect of the present disclosure is a flare detection device that has a processor and detects a flare point group included in a point group of detection data acquired by a distance measuring device that senses reflected light with respect to light irradiation.
  • The processor is configured to execute: extracting a detection target cluster by clustering the point group according to distance values; extracting a corresponding point corresponding to a flare source from the point group that constitutes the detection target cluster; and detecting the flare point group by estimating a boundary between the flare point group and an entity reflection point group that includes the corresponding point.
  • A third aspect of the present disclosure is a flare detection method executed by a processor to detect a flare point group included in a point group of detection data acquired by a rangefinder that senses reflected light with respect to light irradiation.
  • A fourth aspect of the present disclosure is a flare detection program that is stored in a storage medium and includes instructions executed by a processor to detect a flare point group included in a point group of detection data acquired by a rangefinder that senses reflected light with respect to light irradiation.
  • The instructions include: extracting a detection target cluster by clustering the point group according to distance values; extracting a corresponding point corresponding to a flare source from the point group that constitutes the detection target cluster; and detecting the flare point group by estimating a boundary between the flare point group and an entity reflection point group that includes the corresponding point.
  • In these aspects, the detection target cluster is extracted from the point group of the detection data by clustering according to distance values, and the entity reflection point group and the flare point group are then detected based on the corresponding point within the cluster. The flare point group can therefore be detected for each cluster discriminated per object, so flare point groups can be distinguished from point groups reflected by other objects. Flare can thus be detected accurately.
  • FIG. 1 is a block diagram showing the functional configuration of the flare detection system according to the first embodiment;
  • FIG. 2 is a flow chart showing a flare detection method according to the first embodiment;
  • FIG. 3 is a schematic diagram conceptually showing the flare detection method according to the first embodiment;
  • FIG. 4 is a graph showing an example of edge feature amount calculation;
  • The flare detection system 100 of the first embodiment shown in FIG. 1 detects a flare point group Pf included in a point group of detection data acquired by the LiDAR device 1 mounted on the host vehicle. From a viewpoint centered on the host vehicle, the host vehicle can also be called the ego-vehicle, and other vehicles can be called other road users.
  • The host vehicle is provided with an automated driving mode, divided into levels according to the degree of the driver's manual intervention in the driving task.
  • The automated driving mode may be achieved by autonomous cruise control, such as conditional driving automation, advanced driving automation, or full driving automation, in which the system performs all driving tasks when activated.
  • The automated driving mode may be provided by advanced driving assistance control, such as driving assistance or partial driving automation, in which the occupant performs some or all of the driving tasks.
  • The automated driving mode may be realized by either one of, a combination of, or switching between the autonomous cruise control and the advanced driving assistance control.
  • The host vehicle is equipped with the LiDAR device 1 shown in FIG. 1.
  • The LiDAR device 1 is a distance measuring device that measures the distance to a reflection point by detecting reflected light from the reflection point with respect to irradiation of light.
  • The LiDAR device 1 includes a light emitting unit 10, a light receiving unit 20, and a control circuit 30.
  • The light emitting unit 10 is a semiconductor element that emits directional laser light, such as a laser diode.
  • The light emitting unit 10 irradiates laser light toward the outside of the host vehicle in the form of an intermittent pulse beam.
  • The laser light is scanned by using an actuator to control the reflection angle of a mirror that reflects, toward the emission surface, the laser light emitted from the light emitting unit 10.
  • The scanning direction may be horizontal or vertical.
  • The light receiving unit 20 is composed of light receiving elements that are highly sensitive to light, such as SPADs (Single Photon Avalanche Diodes).
  • A plurality of light receiving elements are arranged in a two-dimensional array.
  • A set of adjacent light receiving elements constitutes one light receiving pixel (hereinafter also simply called a pixel).
  • Each light receiving element is exposed to light incident from the sensing area in the external world, which is determined by the angle of view of the light receiving unit 20.
  • The control circuit 30 controls the light emitting unit 10, the light receiving unit 20, and the actuator.
  • The control circuit 30 is a computer including at least one memory and at least one processor.
  • The memory is at least one type of non-transitory tangible storage medium, such as a semiconductor memory, a magnetic medium, or an optical medium, that non-temporarily stores computer-readable programs and data.
  • The memory stores various programs that are executed by the processor.
  • The control circuit 30 controls exposure and scanning of the plurality of pixels in the light receiving unit 20, and processes signals from the light receiving unit 20 into data.
  • The control circuit 30 executes reflected light detection, in which the light receiving unit 20 detects the reflected light of the light emitted from the light emitting unit 10.
  • The laser light emitted from the light emitting unit 10 hits an object within the sensing area and is reflected; the reflecting portion becomes a reflection point of the laser beam.
  • The laser light reflected at the reflection point enters the light receiving unit 20 through the incident surface and exposes it.
  • The control circuit 30 scans the plurality of pixels of the light receiving unit 20 to obtain reflected light at various angles within the field of view, thereby acquiring a distance image of the reflecting objects.
  • For each pixel, the control circuit 30 accumulates, for each distance, the intensity of the reflected light obtained by scanning within a certain period of time, or a value obtained based on that intensity (hereinafter, reflected light intensity). The control circuit 30 thereby obtains a histogram of distance versus reflected light intensity, and calculates the distance to the reflection point based on the reflected light intensity of each bin. Specifically, the control circuit 30 generates an approximation curve for the bins equal to or greater than a predetermined threshold value, and takes the extremum of the approximation curve as the distance to the reflection point for that pixel. By performing this processing for all pixels, the control circuit 30 can calculate a distance value for each pixel. In this way, the control circuit 30 can generate, as detection data, a point cloud image including the distance value, the reflected light intensity, and the received light pulse width for each pixel. The point cloud image can also be called a reflected light image or a range image.
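The histogram-to-distance step above can be sketched as follows. This is an illustrative reconstruction, not the disclosed implementation: the patent fits an approximation curve to bins above a threshold, which is stood in for here by a parabolic fit through the peak bin and its two neighbours; the function name and threshold handling are assumptions.

```python
def distance_from_histogram(bin_distances, intensities, threshold):
    """Estimate one pixel's reflection-point distance from a distance/intensity
    histogram. Simplified sketch: the 'approximation curve' of the disclosure is
    replaced by a parabola fitted to the peak bin and its neighbours, whose
    extremum gives a sub-bin distance estimate."""
    if max(intensities) < threshold:
        return None  # no bin reaches the predetermined threshold
    peak = max(range(len(intensities)), key=lambda i: intensities[i])
    if 0 < peak < len(intensities) - 1:
        y0, y1, y2 = intensities[peak - 1], intensities[peak], intensities[peak + 1]
        denom = y0 - 2 * y1 + y2
        # Vertex offset of the fitted parabola, in units of one bin width.
        offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
        step = bin_distances[1] - bin_distances[0]
        return bin_distances[peak] + offset * step
    return bin_distances[peak]  # peak at histogram edge: no neighbours to fit
```

With a symmetric peak the estimate coincides with the peak bin; with an asymmetric peak it shifts toward the heavier neighbour.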
  • the flare detection system 100 is connected to the LiDAR device 1 via at least one of, for example, a LAN (Local Area Network) line, wire harness, internal bus, wireless communication line, and the like. Flare detection system 100 includes at least one dedicated computer.
  • The dedicated computer constituting the flare detection system 100 may be an integrated ECU (Electronic Control Unit) that integrates the driving control of the host vehicle.
  • It may be a judgment ECU that judges driving tasks in the driving control of the host vehicle.
  • It may be a monitoring ECU that monitors the driving control of the host vehicle.
  • It may be an evaluation ECU that evaluates the driving control of the host vehicle.
  • It may be a navigation ECU that navigates the travel route of the host vehicle.
  • It may be a locator ECU that estimates the self-state quantities of the host vehicle.
  • It may be an actuator ECU that controls the travel actuators of the host vehicle.
  • The dedicated computer constituting the flare detection system 100 may be an HCU (Human Machine Interface Control Unit) that controls the presentation of information in the host vehicle.
  • It may also be a computer outside the host vehicle, for example one constituting an external center or a mobile terminal capable of communicating with the host vehicle.
  • The dedicated computer constituting the flare detection system 100 has at least one memory 101 and at least one processor 102.
  • The memory 101 is at least one type of non-transitory tangible storage medium, such as a semiconductor memory, a magnetic medium, or an optical medium, that non-temporarily stores computer-readable programs and data.
  • The processor 102 includes, as a core, at least one type of, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a RISC (Reduced Instruction Set Computer) CPU, a DFP (Data Flow Processor), or a GSP (Graph Streaming Processor).
  • The processor 102 executes a plurality of instructions contained in the flare detection program stored in the memory 101 to detect the flare point group Pf contained in the point group of detection data.
  • The flare detection system 100 thereby constructs a plurality of functional blocks for detecting the flare point group Pf included in the point group of the detection data.
  • As shown in FIG. 1, the functional blocks constructed in the flare detection system 100 include a clustering block 110, a source extraction block 120, and a flare detection block 130.
  • The clustering block 110 extracts valid clusters Cv, described later, by clustering the point group of the detection data detected by the LiDAR device 1 according to distance values.
  • The source extraction block 120 extracts a corresponding point Ps corresponding to the flare source from the point group forming a valid cluster Cv.
  • The flare detection block 130 detects the flare point group Pf by estimating the boundary B between the flare point group Pf and the entity reflection point group Pe that includes the corresponding point Ps.
  • The flare detection flow by which the flare detection system 100 detects the flare point group Pf included in the point group of detection data is explained below with reference to the flowchart.
  • This processing flow is repeatedly executed while the host vehicle or the LiDAR device 1 is activated.
  • Each "S" in this processing flow denotes a step executed by instructions included in the flare detection program.
  • First, the clustering block 110 acquires a distance image.
  • The clustering block 110 then clusters the point group of reflection points in the distance image according to the distance values. More specifically, the clustering block 110 determines whether a target reflection point satisfies a condition regarding the distance value (cluster condition) with respect to a specific reflection point on the same detection line. When determining that the target reflection point satisfies the cluster condition, the clustering block 110 puts the target reflection point and the specific reflection point into the same cluster. The clustering block 110 performs this processing for each reflection point on the detection line and classifies the point group into multiple clusters.
  • Next, the clustering block 110 classifies each cluster into valid clusters Cv and invalid clusters Ci (see the left frame in FIG. 3).
  • A valid cluster Cv is a detection target cluster for flare detection.
  • An invalid cluster Ci is a cluster excluded from flare detection targets.
  • FIG. 3 is a graph showing an example of the relationship between the vertical azimuth and distance of each point on a specific detection line in the vertical direction.
  • The clustering block 110 classifies clusters that satisfy an invalidity condition as invalid clusters Ci.
  • The invalidity condition is, for example, at least one of the following: the number of points forming the cluster is within a predetermined number (e.g., 5), and the distance width of the cluster is outside a predetermined range.
  • Clusters that do not satisfy the invalidity condition are treated as valid clusters Cv and serve as the target clusters for the flare detection described later.
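The clustering and validity check above can be sketched as follows, under stated assumptions: `gap_threshold` and `min_points` are illustrative values, not values from the disclosure, and only the point-count half of the invalidity condition is shown (the distance-width check is omitted for brevity).

```python
def cluster_detection_line(distances, gap_threshold=0.5, min_points=5):
    """Split one detection line of distance samples into clusters by distance
    continuity (the 'cluster condition' is approximated as: adjacent points
    whose distance values differ by at most gap_threshold join the same
    cluster), then drop clusters whose point count is within min_points."""
    clusters, current = [], []
    for d in distances:
        if current and abs(d - current[-1]) > gap_threshold:
            clusters.append(current)  # gap found: close the current cluster
            current = []
        current.append(d)
    if current:
        clusters.append(current)
    # Invalidity condition (point-count part): too-small clusters are excluded.
    return [c for c in clusters if len(c) > min_points]
```

Points belonging to one object stay contiguous in distance, so each returned valid cluster roughly corresponds to one object on the line.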
  • The source extraction block 120 calculates a score for each point of the valid cluster Cv as an evaluation value for identifying the flare source.
  • The score is, for example, a weighted sum of the received light pulse width and the reflected light intensity.
  • The weights of the two parameters are, for example, equal.
  • The score may instead simply be the magnitude of the reflected light intensity.
  • The score may be the magnitude of the received pulse width at a point where the reflected light intensity is saturated.
  • The score may also be the magnitude of the received pulse width.
  • The source extraction block 120 extracts, as the corresponding point Ps corresponding to the flare source, a point whose score based on the received light pulse width and the reflected light intensity reaches an extraction score range (see the center frame in FIG. 3).
  • The extraction score range is a numerical range in which the score is greater than or equal to a threshold.
  • The source extraction block 120 may simply use the highest score as the threshold, so that the point with the highest score is the point reaching the extraction score range.
  • The extraction score range is an example of the "extraction evaluation value range".
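The score-based extraction can be sketched as follows, using the simplest variant named above (highest score as the threshold, equal weights). The field names `pulse_width` and `intensity` and the weight values are assumptions for illustration.

```python
def extract_flare_source(points, w_pulse=0.5, w_intensity=0.5):
    """Return the index of the corresponding point Ps within one valid cluster.
    Score = weighted sum of received light pulse width and reflected light
    intensity (equal weights here); the highest-scoring point is taken as the
    point reaching the extraction score range."""
    scores = [w_pulse * p["pulse_width"] + w_intensity * p["intensity"]
              for p in points]
    return max(range(len(scores)), key=scores.__getitem__)
```

A flare source such as a retroreflector tends to saturate the intensity and widen the received pulse, so it dominates either term of the score.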
  • The flare detection block 130 filters the received light pulse widths of the valid cluster Cv to calculate an edge feature amount for the cluster (see FIG. 4). Furthermore, in S70, the flare detection block 130 tentatively determines the boundary B based on the edge feature amount. For example, the flare detection block 130 sets the boundary B between the point having the maximum or minimum edge feature amount and its adjacent point, provisionally determines the point group on the side of the boundary B that includes the corresponding point Ps as the entity reflection point group Pe, and provisionally determines the point group on the other side as the flare point group Pf (see the right frame in FIG. 3). After that, in S80, the flare detection block 130 determines whether or not a confirmation condition is satisfied based on the result of the provisional determination.
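The edge-feature step can be sketched as follows. The disclosure does not specify the filter, so a first-difference filter over the received pulse widths is used as a stand-in; the boundary is tentatively placed where the edge magnitude peaks.

```python
def tentative_boundary(pulse_widths):
    """Compute an edge feature by differencing received light pulse widths
    along one cluster and return the index of the first point after the
    tentative boundary B (placed where the edge magnitude is largest)."""
    edges = [pulse_widths[i + 1] - pulse_widths[i]
             for i in range(len(pulse_widths) - 1)]
    k = max(range(len(edges)), key=lambda i: abs(edges[i]))
    return k + 1  # boundary B lies between indices k and k+1
```

An entity surface and its flare artifact sit at similar distances but differ sharply in pulse width, so the largest pulse-width step is a natural boundary candidate.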
  • The confirmation condition is a condition based on at least one of the distance value, the reflected light intensity, and the received light pulse width included in the reflected light image.
  • The confirmation condition includes, for example, that the variation of the distance values of the points in the tentatively determined flare point group Pf is within a confirmed variation range.
  • The confirmed variation range is a range in which the magnitude of the variation is equal to or less than, or less than, a threshold.
  • The variation of the distance values is, for example, the average of the distance differences between adjacent pixels over the entire flare point group Pf.
  • The confirmation condition may include that the average reflected light intensity of the points in the tentatively determined flare point group Pf is within a confirmed intensity range.
  • The confirmed intensity range is a range in which the average reflected light intensity is equal to or less than, or less than, a threshold.
  • The confirmation condition may include that the average received light pulse width of the points in the tentatively determined flare point group Pf is within a confirmed pulse width range.
  • The confirmed pulse width range is a range in which the average received light pulse width is equal to or less than, or less than, a threshold.
  • The confirmation condition may include that the spread of the distance values in the tentatively determined flare point group Pf is within a confirmed spread range.
  • The spread of the distance values is, for example, the magnitude of the difference between the largest and smallest distance values.
  • The confirmed spread range is a range in which the magnitude of the spread is equal to or less than, or less than, a threshold.
  • The confirmation condition may include that the difference between the average received light pulse widths of the tentatively determined entity reflection point group Pe and flare point group Pf reaches a confirmed width difference range.
  • The confirmed width difference range is a range in which the magnitude of the difference between the average received pulse widths is greater than or equal to a threshold.
  • The confirmation condition may also include that the difference between the distance values of points near the boundary in the tentatively determined entity reflection point group Pe and the average distance value in the flare point group Pf is within a confirmed distance difference range.
  • The confirmed distance difference range is a range in which the difference between the distance values is equal to or less than, or less than, a threshold.
  • The confirmation condition may include that the difference in received light pulse width between a specific point of the entity reflection point group Pe near the boundary and a specific point of the flare point group Pf near the boundary is outside an unconfirmed boundary difference range.
  • The unconfirmed boundary difference range is a range in which the magnitude of the difference in received pulse width is equal to or less than, or less than, a threshold.
  • The confirmation condition may be determined to be satisfied when a plurality of specific sub-conditions among those above are satisfied at the same time.
  • For example, the confirmation condition may be determined to be satisfied when the sub-condition regarding the variation of the distance values and the sub-condition regarding the difference between the average received light pulse widths of the point groups Pe and Pf are both satisfied.
  • Alternatively, the confirmation condition may be determined to be satisfied when at least one of the above sub-conditions is satisfied, or when a specific one of the above sub-conditions is satisfied.
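The example pairing of sub-conditions can be sketched as follows: (1) the distance values in the tentative flare group vary little, and (2) the average received pulse widths of the two groups differ enough. Both thresholds are assumed values for illustration, not values from the disclosure.

```python
def confirm_boundary(flare_distances, entity_pulse_widths, flare_pulse_widths,
                     max_variation=0.2, min_pw_gap=2.0):
    """Check two sub-conditions of the confirmation condition:
    variation = mean absolute distance difference between adjacent points of
    the tentative flare group (must stay within the confirmed variation
    range), and the gap between the groups' average received pulse widths
    (must reach the confirmed width difference range)."""
    n = max(len(flare_distances) - 1, 1)
    variation = sum(abs(b - a)
                    for a, b in zip(flare_distances, flare_distances[1:])) / n
    mean = lambda xs: sum(xs) / len(xs)
    pw_gap = abs(mean(entity_pulse_widths) - mean(flare_pulse_widths))
    return variation <= max_variation and pw_gap >= min_pw_gap
```

Flare returns are smeared-out copies of one strong echo, so their distances are nearly constant while their pulse widths collapse relative to the entity side — which is exactly what the two checks test.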
  • When the confirmation condition is satisfied, the flare detection block 130 determines the side of the boundary B that includes the corresponding point Ps as the entity reflection point group Pe, and the side that does not include the corresponding point Ps as the flare point group Pf.
  • The flare detection block 130 sets a flag distinguishing the determined flare point group Pf from the entity reflection point group Pe, outputs the flag information to devices that use the detection information such as the reflected light image (e.g., an automated driving ECU), and then ends the series of processing.
  • When the flare detection block 130 determines that there is no boundary B satisfying the confirmation condition, it determines the entire valid cluster Cv as the entity reflection point group Pe in S100. Since no flare point group Pf exists in this case, the flare detection block 130 ends the series of processing without setting a flag.
  • As described above, the detection target cluster is extracted from the point group of the detection data by clustering according to distance values, and the entity reflection point group Pe and the flare point group Pf are then detected based on the corresponding point Ps within the cluster. The flare point group Pf can therefore be detected for each cluster distinguished per object, so the flare point group Pf can be distinguished from point groups reflected by other objects. Flare can thus be detected accurately.
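The overall flow — cluster, extract Ps, place a tentative boundary, confirm, flag — can be composed into one end-to-end sketch. All thresholds, the tuple layout `(distance, intensity, pulse_width)`, and the simplifications (first-difference edge filter, argmax score, two-sub-condition confirmation) are assumptions layered on the disclosed steps, not the disclosed implementation.

```python
def detect_flare_line(points, gap=0.5, min_points=5,
                      max_var=0.2, min_pw_gap=2.0):
    """points: list of (distance, intensity, pulse_width) along one detection
    line. Returns the indices flagged as belonging to a flare point group."""
    flare_idx = []
    # S20/S30: cluster indices by distance continuity.
    clusters, cur = [], [0]
    for i in range(1, len(points)):
        if abs(points[i][0] - points[i - 1][0]) > gap:
            clusters.append(cur)
            cur = []
        cur.append(i)
    clusters.append(cur)
    for c in clusters:
        if len(c) <= min_points:  # invalidity condition: invalid cluster Ci
            continue
        # S50/S60: corresponding point Ps = highest equal-weight score.
        ps = max(c, key=lambda i: 0.5 * points[i][2] + 0.5 * points[i][1])
        # S70: tentative boundary B at the largest pulse-width edge.
        edges = [points[c[j + 1]][2] - points[c[j]][2] for j in range(len(c) - 1)]
        k = max(range(len(edges)), key=lambda j: abs(edges[j]))
        left, right = c[:k + 1], c[k + 1:]
        entity, flare = (left, right) if ps in left else (right, left)
        # S80: confirmation condition (distance variation + pulse-width gap).
        d = [points[i][0] for i in flare]
        var = sum(abs(b - a) for a, b in zip(d, d[1:])) / max(len(d) - 1, 1)
        mean_pw = lambda idx: sum(points[i][2] for i in idx) / len(idx)
        if var <= max_var and abs(mean_pw(entity) - mean_pw(flare)) >= min_pw_gap:
            flare_idx.extend(flare)  # S90: flag the flare point group Pf
    return flare_idx
```

Because clustering comes first, a genuine second object at a different distance forms its own cluster and is never confused with the flare side of the boundary, which is the point of the disclosure.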
  • The flare detection system 100 may use, as the reflected light image for extracting the flare source, an image acquired in a weak light emission mode in which the light emission intensity is lower than usual.
  • The dedicated computer constituting the flare detection system 100 may be the control circuit 30 of the LiDAR device 1.
  • The LiDAR device 1 may be of a flash type that irradiates a relatively wide range with diffused laser light at once.
  • The dedicated computer constituting the flare detection system 100 may include, as a processor, at least one of a digital circuit and an analog circuit.
  • Digital circuits here include, for example, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), an SOC (System on a Chip), a PGA (Programmable Gate Array), and a CPLD (Complex Programmable Logic Device).
  • The flare detection system 100 may be implemented as a flare detection device, that is, a processing device (e.g., a processing ECU) mounted on the host vehicle.
  • The above-described embodiments and variations may also be implemented as a semiconductor device (such as a semiconductor chip) having at least one processor 102 and at least one memory 101 of the flare detection system 100.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A flare detection system has a processor and detects a flare point group included in a point group of detection data acquired by a LiDAR device that senses reflected light with respect to light irradiation. The processor is configured to extract a detection target cluster by clustering the point group according to a distance value. The processor is also configured to extract, from the point group constituting the detection target cluster, a corresponding point that corresponds to a flare generation source. The processor is further configured to detect the flare point group by estimating the boundary between the flare point group and an entity reflection point group that includes the corresponding point.
PCT/JP2022/032758 2021-09-06 2022-08-31 Flare detection system, flare detection device, flare detection method, and flare detection program WO2023033040A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-144816 2021-09-06
JP2021144816A JP2023037966A (ja) 2021-09-06 2021-09-06 Flare detection system, flare detection device, flare detection method, and flare detection program

Publications (1)

Publication Number Publication Date
WO2023033040A1 (fr) 2023-03-09

Family

ID=85411341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032758 WO2023033040A1 (fr) Flare detection system, flare detection device, flare detection method, and flare detection program

Country Status (2)

Country Link
JP (1) JP2023037966A (fr)
WO (1) WO2023033040A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004198323A * 2002-12-19 2004-07-15 Denso Corp Vehicle object recognition device
JP2012093312A * 2010-10-28 2012-05-17 Denso Corp Radar device
JP2014098635A * 2012-11-14 2014-05-29 Denso Corp Target detection system and target detection device
WO2020059565A1 * 2018-09-18 2020-03-26 パナソニックIpマネジメント株式会社 (Panasonic IP Management Co., Ltd.) Depth acquisition device, depth acquisition method, and program
US20200249326A1 * 2019-02-01 2020-08-06 Panosense Inc. Identifying and/or removing ghost detections from lidar sensor output
WO2021166902A1 * 2020-02-21 2021-08-26 株式会社デンソー (DENSO Corporation) Target recognition device


Also Published As

Publication number Publication date
JP2023037966A (ja) 2023-03-16

Similar Documents

Publication Publication Date Title
US10832064B2 (en) Vacant parking space detection apparatus and vacant parking space detection method
CN110458854B (zh) Road edge detection method and device
CN107272019B (zh) Curb detection method based on lidar scanning
US10614322B2 Object recognition device
JP6413898B2 (ja) Pedestrian determination device
CN112799098B (zh) Radar blind-zone monitoring method and device, electronic device, and storage medium
JP6717240B2 (ja) Target detection device
WO2020196513A1 (fr) Object detection device
CN109955829B (zh) Method and device for cleaning a lidar sensor
WO2013116598A1 (fr) Low-cost detection of lane-marking studs
WO2023033040A1 (fr) Flare detection system, flare detection device, flare detection method, and flare detection program
CN113646820B (zh) Detection device and detection method
WO2021038267A1 (fr) Object recognition method and object recognition device
JP7375838B2 (ja) Distance measurement correction device, distance measurement correction method, distance measurement correction program, and distance measuring device
JP2017116445A (ja) Object detection device
JP2022009231A (ja) Determination device, determination method, and determination program
CN114902282A (zh) System and method for efficiently sensing collision threats
US20230146935A1 Content capture of an environment of a vehicle using a priori confidence levels
WO2022176679A1 (fr) Distance measurement correction device, distance measurement correction method, distance measurement correction program, and distance measuring device
WO2023181948A1 (fr) Noise removal device, object detection device, and noise removal method
CN109795464B (zh) Braking method and device, and storage medium
JP7514664B2 (ja) Object recognition method and object recognition system
KR20230113343A (ko) Active sensor system and object detection
WO2021166912A1 (fr) Object detection device
JP2023059629A (ja) Road surface condition estimation system, road surface condition estimation device, road surface condition estimation method, and road surface condition estimation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22864622

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE