WO2020021306A1 - Method for material discrimination and respective implementation system - Google Patents

Method for material discrimination and respective implementation system

Info

Publication number
WO2020021306A1
Authority
WO
WIPO (PCT)
Prior art keywords
polarization
light beam
unit
point cloud
data
Prior art date
Application number
PCT/IB2018/055504
Other languages
English (en)
Inventor
Annemarie Ingrid HOLLECZEK
André ANTUNES DE CARVALHO ALBUQUERQUE
Alexandre Manuel RIBEIRO CORREIA
Pedro Manuel DE LIMA GOMES CALDELAS
Ângela R. RODRIGUES
Eduardo Jorge NUNES PEREIRA
Original Assignee
Bosch Car Multimedia Portugal, S.A.
Universidade Do Minho
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bosch Car Multimedia Portugal, S.A. and Universidade Do Minho
Publication of WO2020021306A1

Classifications

    • G01S7/4802 - Details of lidar systems (G01S17/00) using analysis of the echo signal for target characterisation; target signature; target cross-section
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates
    • G01S17/66 - Tracking systems using electromagnetic waves other than radio waves
    • G01S17/89 - Lidar systems specially adapted for mapping or imaging
    • G01S17/931 - Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/10 - Providing two-dimensional and co-ordinated display of distance and direction
    • G01S7/4814 - Constructional features, e.g. arrangements of optical elements, of transmitters alone
    • G01S7/4816 - Constructional features, e.g. arrangements of optical elements, of receivers alone
    • G01S7/499 - Details of lidar systems using polarisation effects
    • G01S7/51 - Display arrangements

Definitions

  • This application relates to a method for performing material discrimination and respective implementation system.
  • 3D LIDAR (Light Detection and Ranging) comprises a set of techniques that use laser light to measure the distance to a specific target.
  • By using appropriate optical scanning elements, or by illuminating/flashing a specific area of a target, it is possible to obtain 3D images with range information and backscattering properties of the target.
  • Such systems provide a 3D point cloud frame that can be processed by software in order to obtain additional information of the surrounding scene. Consequently, 3D LIDAR imaging has been widely recognized as an attractive possibility for vehicular applications such as hazard and collision avoidance and autonomous/assisted navigation.
  • object identification and recognition based on data provided by 3D LIDAR systems represent a complex multiparameter problem.
  • the obtained data still needs to be processed in order to recognize, discriminate and classify key elements such as vehicles, pedestrians, buildings or any other obstacle.
  • This kind of classification is of utmost relevance for autonomous and driver-assisted navigation, when hazard avoidance and self-steering decisions need to be made.
  • object recognition and mapping techniques and methodologies have been proposed, not only for terrestrial, but also for airborne applications. These techniques use the 3D information to determine the geometrical shape and edges of different objects in the illuminated scene to discriminate different types of targets.
  • object recognition based solely on pure geometrical and dimensional properties is difficult, in particular when the resolution of the LIDAR cameras is not very high and when different types of targets have similar geometry.
  • LIDAR sensors also provide information on the reflectivity properties of the illuminated targets by measuring the intensity of the reflected/backscattered light, as mentioned in document US9360554 B2.
  • the referred document discloses a LIDAR system comprising at least one emitter and a detector array covering a given field of view, where each emitter emits a single pulse or a multi-pulse packet of light that is sampled by the detector array. On each emitter cycle, the detector array samples the incoming signal intensity at a pre-determined sampling frequency that generates two or more samples per emitted light packet. This allows time-of-flight measurements from the signal portion of each emitted light packet after its reflection by one or more objects in the field of view and its detection at the detector.
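  • For illustration, a minimal sketch of the time-of-flight ranging principle referred to above (a generic calculation with hypothetical variable names, not code from the cited document):

```python
# Minimal illustration of time-of-flight ranging: range = c * dt / 2,
# where dt is the delay between emission and detection of the light packet.
C = 299_792_458.0  # speed of light in m/s

def tof_to_range(emit_time_s: float, detect_time_s: float) -> float:
    """Convert an emission/detection timestamp pair into a target range in metres."""
    dt = detect_time_s - emit_time_s
    return C * dt / 2.0

# Example: a 200 ns round trip corresponds to roughly 30 m.
print(tof_to_range(0.0, 200e-9))  # ~29.98 m
```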
  • the present application describes a method for material discrimination comprising the following steps:
  • the variations on the polarization properties of the reflected light beam in relation to the polarization properties of the emitted light beam are determined based on coefficients:
  • — χ, representing the ratio between the perpendicular polarization component of the current and the total current, given by ρ/(1+ρ);
  • — κ, representing the degree of polarization of the detected light, given by (ρ-1)/(1+ρ);
  • the values of the coefficients ρ, χ, κ and r are calculated for each field of view angle.
  • the 3D point cloud with polarization information is complemented with reflectivity data of the light beam.
  • the reflectivity data is determined by measuring the current generated by the reflected light beam.
  • the 3D point cloud with polarization information is complemented with velocity data.
  • the velocity data is determined by calculating the target's range difference between consecutive frames of the 3D point cloud.
  • the present application also describes a system for implementing the method for material discrimination described above.
  • Said system comprises:
  • an emission unit comprising a light source
  • a central processing unit provided with processing means adapted to control the operation of the emission unit and the detection unit by actuating the light source; said central processing unit being configured to measure the optical properties of the reflected light related to polarization and reflectivity data, as well as information regarding the target's range and velocity; said processing means being programmed to execute image processing techniques configured to generate a 3D point cloud with polarization information and with reflectivity and velocity data.
  • the light source is a linearly-polarized laser diode or a circularly polarized laser diode with a quarter-wave plate.
  • the light source is a unit comprising unpolarized light sources and polarizers.
  • the emission unit comprises a scanner.
  • the emission unit comprises beam shaping optics.
  • the emission unit comprises a polarizer.
  • the emission unit is a rotating housing.
  • the detection unit comprises a photodetector array with integrated micropolarizers and an optical unit.
  • the photodetector comprises a solid state detector array with a polarizing assembly, wherein each detector is divided into smaller subdetectors each with a micropolarizer with the polarizing axis at a specific orientation.
  • each detector is divided into four subdetectors with two micropolarizers oriented along a direction parallel to the polarization state of the emitter, and two micropolarizers oriented along a direction perpendicular to the polarization state of the emitter.
  • each detector is divided into four subdetectors with micropolarizers, wherein a first micropolarizer is oriented along a direction parallel to the polarization state of the emitter; a second micropolarizer is oriented 45° with respect to the polarization state of the emitter; a third micropolarizer is oriented -45° with respect to the polarization state of the emitter; and a fourth micropolarizer is oriented along a direction perpendicular to the polarization state of the emitter.
  • the detection unit comprises a polarization-rotating element, a polarizer, an optical unit and a photodetector array.
  • the polarization-rotating element is a rotating half-wave plate.
  • the polarization-rotating element is a liquid crystal polarization rotator.
  • the polarization-rotating element is a Faraday rotator.
  • Yet in another embodiment of the system, the polarization-rotating element is a Pockels cell.
  • the detection unit comprises a polarizing beam splitter, an optical unit, and two photodetectors (11).
  • the optical unit is a lens or an arrangement of lenses, prisms and/or mirrors.
  • the present application relates to a method for material discrimination, based on LIDAR techniques and on the properties of the scattered light, in particular the variations of the state of polarization of light backscattered from a target.
  • a target is assumed to be any fixed or moving body, such as, for example, buildings, traffic signs, vehicles, animals or humans, characterized by the polarization properties of its respective constituent materials.
  • the method and respective system now developed can be integrated with autonomous or assisted terrestrial vehicles' driving systems based on LIDAR sensors, but should not be limited to that implementation.
  • the present application intends to solve the problem of object identification and recognition in 3D images, providing a way to discriminate different types of materials such as fabrics, fur, wood, paints, metal or concrete.
  • the present technology combines information on the backscattered light parameters and image processing techniques for identification of people, vehicles, trees, traffic signs, buildings or animals. This approach allows performing up to a 6D analysis, where a 2D location in an image is combined with information regarding range, reflectivity, velocity and polarization of light, in order to provide material discrimination and consequently target classification.
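  • As an illustration only, the following sketch shows how such a 6D analysis could be organised in software as a per-point record combining the 2D image location with range, reflectivity, velocity and polarization; the field names are assumptions and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Point6D:
    """One entry of a 6D point cloud: 2D image location plus four measured attributes."""
    u: int               # horizontal pixel / field-of-view index
    v: int               # vertical pixel / field-of-view index
    range_m: float       # time-of-flight range
    reflectivity: float  # total backscattered intensity (e.g. sum of photocurrents)
    velocity_mps: float  # range difference between consecutive frames / frame period
    polarization: float  # e.g. coefficient rho = I_perp / I_par

# A single hypothetical point of the cloud:
cloud = [Point6D(u=120, v=45, range_m=32.7, reflectivity=0.61,
                 velocity_mps=-1.4, polarization=0.18)]
```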
  • the method is implemented by means of a system comprising an emission unit, a detection unit and a central processing unit.
  • the emission unit works as light source, and comprises a linearly-polarized laser diode.
  • the emission unit also includes a scanner or beam shaping optics in order to scan or illuminate targets respectively.
  • the detection unit comprises photodetectors, and is responsible for receiving the backscattered light from the targets.
  • the central processing unit controls the operation of both the emission and detection units, by actuating the laser source according to the configuration parameters of the emitted light. It is also configured to determine the range information and the optical properties of the reflected light, including reflectivity and polarization data. Additionally, it can also be configured to determine velocity information. Range, reflectivity and polarization can be directly obtained from the electrical signals at the photodetectors. On the other hand, velocity can be computed via signal processing techniques. Range information is determined by measuring the time-of-flight between emission and detection of the light beam. By means of image processing techniques, and using the range information applied to a 2D image, it is possible to generate a 3D point cloud, as illustrated in the sketch below.
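  • For illustration only, a minimal sketch (not part of the patent text) of how per-angle range measurements could be projected into such a 3D point cloud, assuming a simple spherical-to-Cartesian conversion with hypothetical array names:

```python
import numpy as np

def ranges_to_point_cloud(ranges_m, azimuths_rad, elevations_rad):
    """Project per-angle range measurements into Cartesian XYZ points."""
    r = np.asarray(ranges_m, dtype=float)
    az = np.asarray(azimuths_rad, dtype=float)
    el = np.asarray(elevations_rad, dtype=float)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=-1)

# One point straight ahead at 30 m:
print(ranges_to_point_cloud([30.0], [0.0], [0.0]))  # [[30. 0. 0.]]
```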
  • the velocity information can be determined by measuring the target's range difference between consecutive frames of said 3D point cloud. Note that this provides velocity values with respect to a camera sensor position.
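  • A minimal sketch of this range-difference velocity estimate, assuming two consecutive range frames acquired at a known frame period (all names are hypothetical):

```python
import numpy as np

def radial_velocity(range_prev_m, range_curr_m, frame_period_s):
    """Radial velocity per point from the range difference between consecutive frames.

    Negative values mean the target is approaching the sensor.
    """
    return (np.asarray(range_curr_m) - np.asarray(range_prev_m)) / frame_period_s

# A target that closes 0.5 m between two frames 50 ms apart moves at -10 m/s radially.
print(radial_velocity([30.0], [29.5], 0.05))  # [-10.]
```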
  • the reflectivity is measured according to the intensity of the light beam reflected from the target and can be quantified by the photocurrent generated in the photodetectors of the detection unit.
  • the sum of the electrical currents generated by said photodetectors, which are oriented along the parallel and perpendicular polarization components of the emitter, provides the total intensity of the light backscattered from the target.
  • a single photodetector or a photodetector array can be used to detect scattered light.
  • the field of view angle of the scattered light reaching the detector is determined by the deflection angle of the scanner.
  • each detector of the arrays can be sensitive only to light coming from a specific angle.
  • the detection unit may also comprise a set of other optical elements, such as beam splitters, or wave plates, or polarizers, that are arranged in a specific manner depending on the embodiment, as will be described later.
  • the access to the polarization properties of the reflected light provides additional information on the constituent material of a specific target. In fact, when a polarized light beam hits a specific target, the polarization properties of the reflected light are modified depending on the target's material.
  • Said light beam is defined by a component that is oriented along a direction parallel to the polarization state of the emission unit, and another component that is oriented along a perpendicular direction. Based on that, a set of parameters can be defined in order to quantify the modification of the polarization properties of said reflected light beam. Assuming for simplicity that the orthogonal axes of the reflected light are aligned with the directions parallel (∥) and perpendicular (⊥) to the state of polarization of the emission unit, the respective coefficients are defined in Eq. 1, Eq. 2 and Eq. 3:

    ρ = I⊥ / I∥    (Eq. 1)
    χ = I⊥ / (I∥ + I⊥) = ρ / (1 + ρ)    (Eq. 2)
    κ = (ρ - 1) / (1 + ρ)    (Eq. 3)

  • I∥ and I⊥ are the electrical currents, generated by the photodetectors of the detection unit, which are proportional to the intensities of the parallel and perpendicular polarization components of the reflected light, respectively.
  • Coefficient ρ represents the current ratio between the perpendicular and parallel polarization components of the reflected light, whereas coefficient χ gives the ratio between the perpendicular and the total current.
  • All the previous parameters can be used to evaluate the variation of the polarization properties of the reflected light, without having to modify the detection architecture. Similarly to the intensity measurements, different values of the previous coefficients can be obtained for each field of view angle, thus making it possible to obtain a 3D point cloud with polarization information.
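  • As a minimal sketch (not taken from the patent), the coefficients above could be evaluated per field-of-view angle from the two photocurrents as follows; the array names and the small regularisation constant are assumptions:

```python
import numpy as np

def polarization_coefficients(i_par, i_perp, eps=1e-12):
    """Per-point polarization coefficients from the parallel/perpendicular photocurrents.

    rho   = I_perp / I_par             (current ratio, Eq. 1)
    chi   = I_perp / (I_par + I_perp)  = rho / (1 + rho), Eq. 2
    kappa = (rho - 1) / (1 + rho)      (degree-of-polarization coefficient, Eq. 3)
    """
    i_par = np.asarray(i_par, dtype=float)
    i_perp = np.asarray(i_perp, dtype=float)
    rho = i_perp / (i_par + eps)          # eps avoids division by zero
    chi = rho / (1.0 + rho)
    kappa = (rho - 1.0) / (1.0 + rho)
    return rho, chi, kappa

rho, chi, kappa = polarization_coefficients([1.0, 0.8], [0.2, 0.8])
print(rho, chi, kappa)  # per-point maps that can be attached to the 3D point cloud
```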
  • the central processing unit uses the 3D point cloud with polarization information, which, complemented with reflectivity and velocity data and processed by means of image processing techniques, allows performing up to a 6D analysis able to provide material-type discrimination and therefore target classification.
  • Figure 1 illustrates one embodiment for the system architecture, where the reference signs represent:
  • 1 - light source; 2 - scanner or beam shaping optics
  • Figure 2 illustrates two different schemes for the photodetector array with integrated micropolarizers.
  • the polarizing grid is oriented along the parallel and perpendicular directions.
  • the integrated micropolarizers are oriented at the parallel and perpendicular directions and at 45° and -45°.
  • the reference signs represent:
  • Figure 3 illustrates one embodiment for the system architecture, where the reference signs represent:
  • Figure 4 illustrates one embodiment for the system architecture, where the reference signs represent:
  • a method for material discrimination is described, based on LIDAR techniques and on the properties of the reflected light, in particular the variations of the state of polarization of light backscattered from a target.
  • a polarized light beam is emitted, scanning or illuminating a target.
  • the reflected light beam from the target is detected, and the respective polarization properties are determined.
  • the polarization data is quantified by measuring the variation on the polarization properties of the reflected light beam in relation to the polarization properties of the emitted light beam.
  • the variations on the polarization properties of the reflected light beam in relation to the polarization properties of the emitted light beam are determined based on coefficients: ρ, representing the current ratio between the orthogonal polarization components of the reflected light; χ, representing the ratio between the perpendicular polarization component of the current and the total current; and κ, given by the degree of polarization of the detected light.
  • the central processing unit is configured to generate a 3D point cloud from a 2D image, captured from a camera sensor, complemented with range data extracted from the time-of-flight measurement between the emission and respective detection of a light beam. Then, by means of image processing techniques, a 3D point cloud with polarization information is generated.
  • reflectivity data of the light beam is also considered, being determined by measuring the current generated by the reflected light beam on the photodetectors.
  • the velocity parameter is also considered, and for that, the central processing unit is configured to measure the target's range difference between consecutive frames of the 3D point cloud.
  • the central processing unit is able to perform a 6D analysis used to discriminate materials and classify targets. A system for implementing the method for material discrimination is also described.
  • the system comprises an emission unit, a detection unit and a central processing unit.
  • the emission unit comprises a light source (1).
  • said light source (1) is a linearly-polarized laser diode, which either scans or illuminates targets by using a scanner or beam shaping optics (2).
  • the emission unit could be a rotating housing that contains the light source (1) and the detectors within.
  • an additional polarizer may be required after the light source (1) in order to obtain a highly linearly- polarized beam.
  • the detection unit comprises a photodetector array with integrated micropolarizers (5) and an optical unit (4), as illustrated in figure 1.
  • the optical unit (4) is a lens. The light backscattered from the targets is directly led to the lens, which focuses the reflected light onto the photodetector array with integrated micropolarizers (5).
  • Said photodetector (5) comprises a solid state detector array with a polarizing assembly, e.g. metallic grid, formed directly on top of the detectors, though other alternatives are possible.
  • the photodetector array (5) is composed of several detectors (6), which in turn are divided into smaller subdetectors, each with a micropolarizer with the polarizing axis at a specific orientation as illustrated in figure 2.
  • the micropolarizer is used to select the polarization component arriving at each subdetector, so different orientations of the micropolarizers enable detecting different polarization components in the same detector (6), improving the sensitivity of the detection unit to the polarization properties of the reflected light.
  • each detector (6) is divided into four subdetectors, with two of them oriented along the direction parallel (6.2) to the state of polarization of the emission unit, and the other two along the perpendicular direction (6.1).
  • each detector (6) is divided into four subdetectors with micropolarizers oriented (i) along a parallel direction (6.2), (ii) at 45° with respect to the parallel direction (6.3), (iii) at -45° with respect to the parallel direction (6.4) and (iv) along a perpendicular direction (6.1) in relation to the polarization state of the emission unit.
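  • With the four micropolarizer orientations of this embodiment (0°, 45°, -45° and 90° relative to the emitter polarization), the linear Stokes parameters and the degree of linear polarization of the backscattered light can be estimated per detector. The following sketch shows one common way to do this; the Stokes formulation and the variable names are assumptions rather than part of the patent text:

```python
import numpy as np

def linear_stokes(i_0, i_45, i_m45, i_90):
    """Linear Stokes parameters from the four subdetector intensities of one detector.

    i_0   : intensity behind the micropolarizer parallel to the emitter polarization
    i_45  : intensity at +45 degrees
    i_m45 : intensity at -45 degrees
    i_90  : intensity perpendicular to the emitter polarization
    """
    s0 = i_0 + i_90    # total intensity
    s1 = i_0 - i_90    # parallel/perpendicular balance
    s2 = i_45 - i_m45  # +/-45 degree balance
    dolp = np.sqrt(s1**2 + s2**2) / max(s0, 1e-12)  # degree of linear polarization
    return s0, s1, s2, dolp

# Light fully polarized along the emitter axis gives DoLP = 1.
print(linear_stokes(1.0, 0.5, 0.5, 0.0))
```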
  • the detection unit comprises a photodetector array (9), an optical unit (4), a polarization-rotating element (7) and a polarizer (8) .
  • the polarization-rotating element (7) is a rotating half-wave plate and the optical unit (4) is a lens.
  • the reflected light passes through the half-wave plate and the polarizer (8) before the focusing lens and photodetector array (9).
  • the principle of operation of this embodiment is as follows. The polarizer (8) acts as a polarization filter, as it lets only one of the polarization components reach the photodetector array (9).
  • when the fast axis of the half-wave plate is aligned with the axis of the polarizer (8), the perpendicular polarization component of the reflected light is blocked by the polarizer (8), and only the parallel polarization component reaches the photodetector array (9).
  • when the fast axis of the half-wave plate is aligned at 45° with respect to the parallel direction, the perpendicular and parallel polarization components are rotated by 90°. By doing so, only the perpendicular component of the reflected light passes through the polarizer (8) and reaches the photodetector array (9), as the half-wave plate rotates the perpendicular polarization component to the parallel direction.
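  • This half-wave-plate behaviour can be checked with a short Jones-calculus sketch (a generic optics calculation, not code from the patent): with the fast axis at 0° only the parallel component passes the polarizer, while at 45° the two components are swapped and only the perpendicular one passes.

```python
import numpy as np

def half_wave_plate(theta):
    """Jones matrix of an ideal half-wave plate with fast axis at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]])

POLARIZER_PAR = np.array([[1.0, 0.0], [0.0, 0.0]])  # passes only the parallel (x) component

E_par = np.array([1.0, 0.0])   # parallel polarization component of the reflected light
E_perp = np.array([0.0, 1.0])  # perpendicular polarization component

# Fast axis at 0 deg: parallel light passes, perpendicular light is blocked.
print(POLARIZER_PAR @ half_wave_plate(0.0) @ E_par)   # [1. 0.]
print(POLARIZER_PAR @ half_wave_plate(0.0) @ E_perp)  # [0. 0.]

# Fast axis at 45 deg: the components are swapped, so only the perpendicular one passes.
print(POLARIZER_PAR @ half_wave_plate(np.pi / 4) @ E_par)   # ~[0. 0.]
print(POLARIZER_PAR @ half_wave_plate(np.pi / 4) @ E_perp)  # [1. 0.]
```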
  • the polarization-rotating element (7) is a liquid crystal polarization rotator, a Faraday rotator or a Pockels cell.
  • the detection unit comprises a polarizing beam splitter (10), an optical unit (4) and two photodetectors (11).
  • the optical unit (4) is an arrangement formed by two lenses. The reflected light is led to the polarizing beam splitter (10), which then separates the incoming beam into two orthogonal polarization components, one parallel to the polarization state of the emission unit and the other perpendicular. The two orthogonal polarization components of the incoming light are respectively led to the lenses aligned with each photodetector (11), which focus the received light onto the photodetectors (11).
  • the two photodetectors (11) are each aligned with the parallel and perpendicular polarization states of the emitter, respectively, generating electrical currents which are proportional to the intensities of the parallel and perpendicular polarization components of the reflected light.
  • the lens that forms the optical unit (4) can be a single lens or a complex optical arrangement of lenses, prisms and/or mirrors, in order to correct for spherical aberrations or image distortions, or to improve the field of view or the angular resolution.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a method for material discrimination, based on LIDAR techniques, designed to solve the problem of target identification and recognition in 3D images by providing a means of distinguishing different types of materials such as fabrics, fur, wood, paints, metal or concrete. To this end, the technology of the invention combines information on the backscattered light parameters with image processing techniques in order to identify obstacles such as people, vehicles, trees, traffic signs, buildings or animals. The result is a 6D analysis in which a 2D location in an image is combined with information regarding the range, reflectivity, velocity and polarization of the light, in order to provide material discrimination and, consequently, target classification.
PCT/IB2018/055504 2018-07-23 2018-07-24 Procédé de discrimination de matériaux et système de mise en œuvre respectif WO2020021306A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PT11087318 2018-07-23
PT110873 2018-07-23

Publications (1)

Publication Number Publication Date
WO2020021306A1 true WO2020021306A1 (fr) 2020-01-30

Family

ID=63667943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/055504 WO2020021306A1 (fr) 2018-07-23 2018-07-24 Procédé de discrimination de matériaux et système de mise en œuvre respectif

Country Status (1)

Country Link
WO (1) WO2020021306A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023002237A1 (fr) 2021-07-21 2023-01-26 Bosch Car Multimedia Portugal, S.A. Capteur lidar multifonctionnel polarimétrique pour reconnaissance de cibles
US11656388B1 (en) 2022-06-08 2023-05-23 Toyota Motor Engineering & Manufacturing North America, Inc. LiDAR reflective fabric

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5327285A (en) * 1990-06-11 1994-07-05 Faris Sadeg M Methods for manufacturing micropolarizers
US7339670B2 (en) 2005-07-29 2008-03-04 Lockheed Martin Coherent Technologies, Inc. Wavelength normalized depolarization ratio lidar
JP2008171961A (ja) * 2007-01-10 2008-07-24 Nikon Corp レーザ装置、露光方法及び装置、並びにデバイス製造方法
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
DE102015200027A1 (de) * 2015-01-05 2016-07-07 Robert Bosch Gmbh Vorrichtung und Verfahren zum Bestimmen einer Eigenschaft eines Messpunkts
EP3182158A1 (fr) * 2015-12-18 2017-06-21 STMicroelectronics (Research & Development) Limited Appareil de télémétrie
US20180100731A1 (en) * 2016-10-07 2018-04-12 Arizona Board Of Regents On Behalf Of The University Of Arizona Depth and/or orientation tracker using division of focal plane polarization and/or color camera
WO2018127789A1 (fr) * 2017-01-03 2018-07-12 Innoviz Technologies Ltd. Systèmes lidar et procédés de détection et de classification d'objets



Similar Documents

Publication Publication Date Title
US10739460B2 (en) Time-of-flight detector with single-axis scan
US11073617B2 (en) Integrated illumination and detection for LIDAR based 3-D imaging
JP4350385B2 (ja) ターゲットマークを自動検索する方法、ターゲットマークを自動検索する装置、受信ユニット、測地計および測地システム
US4497065A (en) Target recognition system enhanced by active signature measurements
KR102135177B1 (ko) 능동형 이미징 시스템 구현 방법 및 장치
EP3710853B1 (fr) Système lidar à balayage et procédé avec filtrage spatial pour réduction de lumière ambiante
US8724104B2 (en) Coarse and fine projective optical metrology system
WO2020021311A1 (fr) Dispositif de télémétrie de véhicule terrestre et procédé de fonctionnement correspondant
DK2690459T3 (en) Device and method for identifying and documenting at least one object passing through a radiation field
US11681033B2 (en) Enhanced polarized light collection in coaxial LiDAR architecture
JP2010249818A (ja) レーザビーム画像コントラスト増強
JP2000337887A (ja) 移動体の自己位置標定装置
US20080094607A1 (en) Optical method and device for measuring a distance from an obstacle
WO2020021306A1 (fr) Procédé de discrimination de matériaux et système de mise en œuvre respectif
US20230176219A1 (en) Lidar and ambience signal fusion in lidar receiver
US11280907B2 (en) Depth imaging system
WO2020105527A1 (fr) Dispositif et système d'analyse d'image, et programme de commande
US20230122788A1 (en) Method and device for the recognition of blooming in a lidar measurement
JP7465958B2 (ja) 赤外線検知のためのシステムおよび方法
CN111595444B (zh) 一种运动目标光谱跟踪测量遥感系统以及方法
JP3690260B2 (ja) 車間距離計測方法
CN110446944A (zh) 基于spad的激光雷达系统
Chun et al. Target recognition study using polarimetric laser radar
CN117337404A (zh) 像素映射固态lidar发射机系统和方法
WO2017176410A1 (fr) Détecteur de temps de vol avec balayage à un seul axe

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18773605

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18773605

Country of ref document: EP

Kind code of ref document: A1