EP2972081B1 - Depth scanning with multiple emitters - Google Patents

Depth scanning with multiple emitters

Info

Publication number
EP2972081B1
Authority
EP
European Patent Office
Prior art keywords
beams
scene
mirror
light
emitters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP14765420.6A
Other languages
German (de)
English (en)
Other versions
EP2972081A4 (fr)
EP2972081A2 (fr)
Inventor
Alexander Shpunt
Ronen EINAT
Zafrir Mor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Publication of EP2972081A2
Publication of EP2972081A4
Application granted
Publication of EP2972081B1
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/04Optical or mechanical part supplementary adjustable parts
    • G01J1/0407Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings
    • G01J1/0477Prisms, wedges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning

Definitions

  • the present invention relates generally to methods and devices for projection and capture of optical radiation, and particularly to optical 3D mapping.
  • Optical 3D mapping refers to generating a 3D profile of the surface of an object by processing an optical image of the object.
  • This sort of 3D profile is also referred to as a 3D map, depth map or depth image, and 3D mapping is also referred to as depth mapping.
  • U.S. Patent Application Publication 2011/0279648 describes a method for constructing a 3D representation of a subject, which comprises capturing, with a camera, a 2D image of the subject. The method further comprises scanning a modulated illumination beam over the subject to illuminate, one at a time, a plurality of target regions of the subject, and measuring a modulation aspect of light from the illumination beam reflected from each of the target regions. A moving-mirror beam scanner is used to scan the illumination beam, and a photodetector is used to measure the modulation aspect. The method further comprises computing a depth aspect based on the modulation aspect measured for each of the target regions, and associating the depth aspect with a corresponding pixel of the 2D image.
  • U.S. Patent 8,018,579 describes a three-dimensional imaging and display system in which user input is optically detected in an imaging volume by measuring the path length of an amplitude modulated scanning beam as a function of the phase shift thereof. Visual image user feedback concerning the detected user input is presented.
  • U.S. Patent 7,952,781 describes a method of scanning a light beam and a method of manufacturing a microelectromechanical system (MEMS), which can be incorporated in a scanning device.
  • a scanning mirror includes a substrate that is patterned to include a mirror area, a frame around the mirror area, and a base around the frame.
  • a set of actuators operate to rotate the mirror area about a first axis relative to the frame, and a second set of actuators rotate the frame about a second axis relative to the base.
  • the scanning mirror can be fabricated using semiconductor processing techniques.
  • Drivers for the scanning mirror may employ feedback loops that operate the mirror for triangular motions.
  • WO9816801 A1 relates to a diode-laser-based vehicle detector and classifier, configured to measure the presence, speed, and three-dimensional profiles of vehicles passing beneath it.
  • the sensor uses a rotating mirror for pulsed laser range imaging.
  • US 2006145062 A1 relates to an optoelectronic detection device such as a laser scanner using several radiation transmitting modules and a deflection device to guide transmitted radiation.
  • WO0020825 A1 relates to a system for determining a distance or displacement to a target object by monitoring the electrical characteristics of a semiconductor laser aimed at the object, wherein light from the laser is reflected back into the laser.
  • Embodiments of the present invention that are described hereinbelow provide improved apparatus and methods for depth mapping using a scanning beam.
  • mapping apparatus which includes a transmitter, which is configured to emit, in alternation, at least two beams including pulses of light along respective beam axes that are mutually offset transversely relative to a scan line direction of a raster pattern.
  • a scanner is configured to scan the two or more beams in the raster pattern over a scene.
  • a receiver is configured to receive the light reflected from the scene and to generate an output indicative of a time of flight of the pulses to and from points in the scene.
  • a processor is coupled to process the output of the receiver so as to generate a 3D map of the scene.
  • the scanner includes a rotating mirror, which is configured to rotate so as to generate the raster pattern, and the transmitter is configured to direct the at least two beams to reflect from the mirror in alternation as the mirror scans over the raster pattern.
  • the receiver includes a detector, which is configured to receive the reflected light from the scene via the mirror as the mirror rotates.
  • the detector has a detection area, which is scanned over the scene by the rotation of the mirror, and the at least two beams may have respective emission areas that are scanned over the scene by the rotation of the mirror and, at any given location along the raster pattern, fall within the detection area.
  • the apparatus includes a beamsplitter, which is positioned so as to direct the beams emitted by the transmitter toward the mirror, while permitting the reflected light to reach the detector, wherein the beam axes and an optical axis of the reflected light are parallel between the beamsplitter and the micromirror.
  • the transmitter includes at least two emitters, which are respectively configured to emit the at least two beams and are mounted on a substrate in respective positions that are offset transversely relative to a projection of the scan line direction onto the substrate.
  • the transmitter includes an edge-emitting laser die containing an array of two or more parallel laser stripes, which serve as the at least two emitters.
  • the transmitter includes a vertically-emitting laser die containing an array of two or more vertical-cavity surface-emitting lasers (VCSELs), which serve as the at least two emitters.
  • the transmitter includes optics, including an array of at least two microlenses, which are respectively aligned with the at least two emitters so that the optics reduce an angular separation between the beam axes.
  • the respective positions of the at least two emitters may be diagonally offset relative to the projection of the scan line direction onto the substrate.
  • a method for mapping which includes emitting, in alternation, at least two beams including pulses of light along respective beam axes that are mutually offset transversely relative to a scan line direction of a raster pattern.
  • the two or more beams are scanned in the raster pattern over a scene.
  • the light reflected from the scene is received, and responsively to the received light, an output is generated, which is indicative of a time of flight of the pulses to and from points in the scene.
  • the output is processed so as to generate a 3D map of the scene.
  • U.S. Patent Application 2013/0207970 which is assigned to the assignee of the present patent application, describes depth engines that generate 3D mapping data by measuring the time of flight of a scanning beam.
  • a light transmitter such as a laser, directs short pulses of light toward a scanning mirror, which scans the light beam over a scene of interest within a certain scan range.
  • a receiver such as a sensitive, high-speed photodiode (for example, an avalanche photodiode) receives light returned from the scene via the same scanning mirror.
  • Processing circuitry measures the time delay between the transmitted and received light pulses at each point in the scan. This delay is indicative of the distance traveled by the light beam, and hence of the depth of the object at the point.
  • the processing circuitry uses the depth data thus extracted in producing a 3D map of the scene.
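  • As a minimal illustrative sketch (not part of the patent text), the time-of-flight relation described above amounts to computing the depth as half the round-trip path length; the Python names below are assumptions for illustration only:

```python
# Illustrative sketch only: depth from the round-trip time of flight of a pulse.
C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting point, i.e. half the round-trip path length."""
    return C * round_trip_time_s / 2.0

# A pulse returning 20 ns after emission corresponds to a point about 3 m away.
print(tof_depth_m(20e-9))  # ~2.998
```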
  • the light beam, along with the detection area of the receiver, is scanned over the scene in a raster pattern.
  • the scan resolution can be increased by decreasing the angular separation between successive raster lines, but this sort of resolution increase will come at the expense of reduced frame rate, since a larger number of raster lines is required to cover the scene.
  • the resolution may be increased at the expense of reduced field of view if the number of raster lines per frame is unchanged. Mechanical constraints put a limit on the degree to which the scanning speed of the mirror can be increased in order to offset these effects.
  • Embodiments of the present invention address these limitations by multiplexing two (or more) scanning spots, which are mutually offset in angle, along each raster line of the scan.
  • a transmitter emits at least two pulsed beams in alternation. The respective axes of these beams are mutually offset (in angle) transversely relative to the scan line direction of the raster.
  • a scanner, such as a moving mirror, scans the two or more beams in the raster pattern over a scene, thus generating, in effect, two or more scan lines that run parallel to each raster line.
  • a receiver receives the light reflected from the scene and generates an output indicative of the time of flight of the pulses to and from points in the scene, which can then be used to create a 3D map of the scene.
  • embodiments of the present invention effectively multiply the scan resolution of a depth mapping system by two or more, depending on the number of emitters that are used.
  • the emitters may comprise, for example, diode lasers or other solid-state sources, which can be pulsed on and off rapidly in turn, and thus multiply the density of spots that are sensed by the receiver within a given time without requiring any increase in the speed of the scanner itself.
  • the optimal rate and pattern of pulsing the emitters may be selected on the basis of the scan rate and the desired pixel resolution of the depth map.
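  • For illustration only (this is not the patent's algorithm), one simple way to relate these quantities, assuming each emitter places one spot per output pixel along its own scan line, is sketched below; all names and numbers are hypothetical:

```python
# Illustrative sketch: total transmitter pulse rate needed when the emitters
# fire in alternation, so that each emitter-generated scan line carries one
# spot per desired pixel.
def required_pulse_rate_hz(line_rate_hz: float, pixels_per_line: int,
                           num_emitters: int) -> float:
    return line_rate_hz * pixels_per_line * num_emitters

# A 10 kHz resonant line rate, 480 pixels per line and two emitters call for
# a combined pulse rate of 9.6 MHz, within the >=20 MHz capability noted below.
print(required_pulse_rate_hz(10_000, 480, 2))  # 9600000.0
```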
  • the scanner comprises a mirror, which oscillates (i.e., rotates about two perpendicular axes) so as to generate the raster pattern.
  • the beams from the transmitter reflect from the mirror in alternation as the mirror scans over the raster pattern.
  • the receiver comprises a detector, which receives the reflected light from the scene via the mirror, as well.
  • the components can be chosen and designed so that the detection area of the detector, which is scanned over the scene by the rotation of the mirror, is large enough, in angular terms, so that the emission areas of all the transmitted beams at any given location along the raster pattern fall within the detection area.
  • the two (or more) pulsed beams may be generated by respective emitters, which are mounted on a substrate in respective positions that are offset transversely relative to a projection of the scan line direction onto the substrate.
  • the transmitter may comprise, for example, an edge-emitting laser die containing an array of two or more parallel laser stripes, which serve as the emitters, or an array of two or more vertical-cavity surface-emitting lasers (VCSELs), or possibly, for high optical power, two or more individually-addressable groups of VCSELs.
  • the angular separation between the emitted beams depends on the relative offset between the emitters on the substrate and on the optics that are used in projecting the beams. In general, technological constraints dictate a certain minimal offset between the emitters (or groups of emitters in case of VCSEL arrays), which in turn places a lower limit on the angular separation of the beams in the far field. This angular separation determines the density of scan spots and may limit the resolution that can be achieved between the pixels in the 3D map.
  • the transmitter optics comprise an array of microlenses that are respectively aligned with the emitters and are configured to reduce the angular separation between the beam axes.
  • the respective positions of the emitters are diagonally offset relative to the projection of the scan line direction onto the substrate.
  • Fig. 1 is a schematic, pictorial illustration of a depth mapping system 20, in accordance with an embodiment of the present invention.
  • the system is based on a scanning depth engine 22, which captures 3D information in a volume of interest (VOI) 30 in a scene that includes one or more objects.
  • the objects comprise at least parts of the bodies of users 28 of the system.
  • Engine 22 outputs a sequence of frames containing depth data to a computer 24, which processes and extracts high-level information from the data. This high-level information may be provided, for example, to an application running on computer 24, which drives a display screen 26 accordingly.
  • Computer 24 processes data generated by engine 22 in order to reconstruct a depth map of VOI 30 containing users 28.
  • engine 22 emits pulses of light while scanning over the scene and measures the relative delay of the pulses reflected back from the scene.
  • a processor in engine 22 or in computer 24 then computes the 3D coordinates of points in the scene (including points on the surface of the users' bodies) based on the time of flight of the light pulses at each measured point (X,Y) in the scene.
  • This approach is advantageous in that it does not require the users to hold or wear any sort of beacon, sensor, or other marker. It gives the depth (Z) coordinates of points in the scene relative to the location of engine 22 and permits dynamic zooming and shift of the region that is scanned within the scene. Implementation and operation of the depth engine are described in greater detail hereinbelow.
  • Although computer 24 is shown in Fig. 1, by way of example, as a separate unit from depth engine 22, some or all of the processing functions of the computer may be performed by a suitable microprocessor and software or by dedicated circuitry within the housing of the depth engine or otherwise associated with the depth engine. As another alternative, at least some of these processing functions may be carried out by a suitable processor that is integrated with display screen 26 (in a television set, for example) or with any other suitable sort of computerized device, such as a game console or media player. The sensing functions of engine 22 may likewise be integrated into computer 24 or other computerized apparatus that is to be controlled by the depth output.
  • a set of Cartesian axes is marked in Fig. 1 .
  • the Z-axis is taken to be parallel to the optical axis of depth engine 22.
  • the frontal plane of the depth engine is taken to be the X-Y plane, with the X-axis as the horizontal.
  • These axes are defined solely for the sake of convenience.
  • the terms "vertical” and “horizontal” are used herein in describing the operation of depth engine 22 solely for the sake of clarity of explanation, to correspond to the example implementation that is shown in Fig. 1 , and not by way of limitation, since the depth engine could equally operate at a rotation of 90° relative to the pictured view.
  • Other geometrical configurations of the depth engine and its volume of interest may alternatively be used and are considered to be within the scope of the present invention.
  • engine 22 generates two or more beams 38, which scan VOI 30 in a raster pattern.
  • the depth engine may scan rapidly in the Y-direction, in a resonant scan of a scanning mirror with a fixed frequency, such as 1-20 kHz, while scanning more slowly in the X-direction at the desired frame rate (such as 1-300 Hz, which is typically not a resonant frequency of rotation).
  • the scanning rate is generally a limiting factor in the resolution of the scan.
  • the number of scan lines that can be traversed in each frame is limited by the resonant mirror frequency, regardless of the rate and range of the X-direction scan.
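  • As a rough worked example (not from the patent), the line budget implied by these figures can be computed as follows; the particular numbers are assumptions chosen within the ranges given above:

```python
# Illustrative arithmetic: the resonant mirror frequency caps the number of
# raster lines per frame, independent of the X-direction scan.
def lines_per_frame(resonant_hz: float, frame_rate_hz: float,
                    bidirectional: bool = True) -> float:
    periods = resonant_hz / frame_rate_hz
    # a bidirectional (zigzag) scan traces one line in each half-period
    return 2 * periods if bidirectional else periods

# A 10 kHz resonant scan at 30 frames/s allows at most ~667 lines per frame;
# two emitters pulsed in alternation effectively double this line density.
print(lines_per_frame(10_000, 30))  # ~666.7
```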
  • Embodiments of the present invention that are described herein increase the achievable resolution by using multiple emitters in engine 22 to generate and scan multiple parallel raster lines concurrently in each scan of the mirror.
  • Although multiple emitters are used in the present embodiment specifically for overcoming a limitation in the Y-direction scan rate, which is characteristic of the configuration of engine 22, the principles of the present invention may alternatively be used to enhance resolution in other sorts of scan configurations.
  • the range of the scan pattern of engine 22 may be adjusted during operation of system 20, as described in the above-mentioned U.S. Patent Application Publication 2013/0207970 .
  • the scan may be limited to a window 32, or the scan range may be controlled to focus on respective windows 34, 36 over users 28 while skipping over the space between them. These zoom capabilities enable enhanced resolution within the selected windows.
  • Fig. 2 is a schematic, pictorial illustration showing elements of an optical scanning head 40 that may be used in depth engine 22, in accordance with an embodiment of the present invention.
  • a transmitter 44 emits pulses of light toward a polarizing beamsplitter 60.
  • the transmitter comprises multiple emitters, which emit respective beams of light along axes that are mutually offset. These beams may comprise visible, infrared, and/or ultraviolet radiation (all of which are referred to as "light” in the context of the present description and in the claims).
  • the light from transmitter 44 reflects off beamsplitter 60 and is then directed by a folding mirror 62 toward a scanning micromirror 46.
  • a MEMS scanner 64 scans micromirror 46 in X- and Y-directions with the desired scan frequency and amplitude.
  • the micromirror scans beams 38 over the scene, typically via projection/collection optics, such as a suitable lens (not shown in the figures). Details of the micromirror and scanner are described in the above-mentioned U.S. Patent Application Publication 2013/0207970 , and techniques that may be used in producing these elements are described in the above-mentioned U.S. Patent 7,952,781 . In alternative embodiments (not shown), separate mirrors may be used for the X- and Y-direction scans, and other types of scanners - not necessarily MEMS-based - as are known in the art, may be used.
  • Light pulses returned from the scene strike micromirror 46, which reflects the light via turning mirror 62 through beamsplitter 60.
  • the optical axes of the transmitted beams, as defined by the axis of the projection optics, and of the reflected light, as defined by the axis of the collection optics, are parallel between beamsplitter 60 and mirror 62.
  • Receiver 48 senses the returned light pulses and generates corresponding electrical pulses.
  • Receiver 48 typically comprises a sensitive, high-speed photodetector, such as an avalanche photodiode (APD), along with a sensitive amplifier, such as a transimpedance amplifier (TIA), which amplifies the electrical pulses output by the photodetector. These pulses are indicative of the times of flight of the corresponding pulses of light.
  • receiver 48 typically has a collection angle of about 0.5-10°.
  • the collection angle of the receiver can be made as small as 0.05° or as large as 90°.
  • the pulses that are output by receiver 48 are processed by a controller 42 (or by computer 24) in order to extract depth (Z) values as a function of scan location (X,Y).
  • the data from engine 40 may be output to computer 24 via a suitable interface.
  • the overall area of beamsplitter 60 and the aperture of receiver 48 are considerably larger than the area of the transmitted beams.
  • Beamsplitter 60 may be accordingly patterned, i.e., the reflective coating extends over only the part of its surface on which the transmitted beam is incident.
  • the reverse side of the beamsplitter may have a bandpass coating, to prevent light outside the emission band of transmitter 44 from reaching the receiver.
  • It is desirable that micromirror 46 be as large as possible, within the inertial constraints imposed by the scanner.
  • the area of the micromirror may be about 5-50 mm².
  • Controller 42 coordinates the timing of the pulses emitted by transmitter 44 and of the scan pattern of micromirror 46. Specifically, the controller causes the multiple emitters in the transmitter to emit their respective pulses in alternation, so that each scan line generated by micromirror 46 actually traces two or more parallel scan lines, spaced a small distance apart, across VOI 30 (as shown below in Fig. 5 ).
  • the alternating pulse operation may be such as to cause each emitter to emit a single pulse in its turn, followed by the pulse from the next emitter; or it may alternatively generate more complex patterns, such as emission of two or more successive pulses by a given emitter, followed by two or more pulses from the next emitter, and so forth.
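  • A minimal sketch of such an alternation pattern is shown below (illustrative only, not the controller's actual implementation); the function and its parameters are hypothetical:

```python
# Illustrative sketch: emitter index for each successive transmitted pulse,
# either as a simple toggle or in groups of successive pulses per emitter.
from itertools import chain, cycle, islice, repeat

def firing_order(num_emitters: int, pulses_per_turn: int, total_pulses: int):
    turns = cycle(range(num_emitters))
    expanded = chain.from_iterable(repeat(e, pulses_per_turn) for e in turns)
    return list(islice(expanded, total_pulses))

print(firing_order(2, 1, 6))  # [0, 1, 0, 1, 0, 1]  - simple alternating toggle
print(firing_order(2, 2, 8))  # [0, 0, 1, 1, 0, 0, 1, 1]  - grouped pulses
```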
  • Although the optical head shown in Fig. 2 represents one possible design, alternative designs implementing similar principles are considered to be within the scope of the present invention.
  • optoelectronic module designs that are described in the above-mentioned U.S. Patent Application Publication 2013/0207970 may be adapted for multi-emitter operation.
  • a single emitter with an acousto-optic or electro-optic modulator in the transmit path between the transmitter and the beamsplitter could be used to generate multiple, alternating spots at a mutual (angular) offset.
  • the modulator changes the pointing angle of the transmitted beam by a small predefined amount (for example, by the inter-pixel separation of 1-10 mrad), which does not shift the beam significantly on the mirror.
  • Figs. 3A and 3B are schematic side and top views, respectively, of an optoelectronic emitter module used in transmitter 44, in accordance with an embodiment of the present invention.
  • a laser die 70 is mounted on a suitable substrate 72, such as a silicon optical bench (SiOB).
  • Laser die 70 in this embodiment is an edge-emitting device, containing an array of two parallel laser stripes 80, 82. (In other embodiments, not shown in the figures, the array may contain a larger number of stripes; or surface-emitting devices may be used, as shown in Figs. 7B-C .)
  • Stripes 80 and 82 are offset transversely relative to the projection of a scan line 96 of optical scanning head 40 onto substrate 72.
  • the projection of the scan line is defined by imaging successive points along the scan line from VOI 30 back onto the substrate.
  • the scan lines in the raster, as shown in Fig. 5, may typically define a zigzag or sinusoidal pattern, and the "projection" referred to in the present description and in the claims is taken along the central, essentially straight portion of the scan lines.
  • stripes 80 and 82 emit beams along respective axes 87 and 88, which are generally parallel but in this embodiment are offset transversely relative to the projection of the scan line.
  • the separation between the stripes, and hence the offset between the respective beams, is typically on the order of 30-50 µm, due to physical constraints of the semiconductor laser device, but larger or smaller separations are also possible.
  • the laser output beams from stripes 80 and 82 are collected by a microlens array 74, comprising microlenses 84, 86 that are respectively aligned with stripes 80, 82. (Alternatively, the microlenses may be formed directly on the output facet of laser die 70.)
  • the beams then reflect from a turning mirror, such as a prism 76 with a suitably-coated diagonal face, and are collimated by a projection lens 78.
  • lens 78 has a focal length of about 1-10 mm, which, together with the separation between the stripes, determines the angular separation between the beams from stripes 80 and 82 in the far field (i.e., as projected onto VOI 30).
  • Since receiver 48 has a collection angle of 0.5-10°, as described above, there is enough flexibility in setting the system parameters so that both illumination beams fall within the detection area of the receiver.
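  • The geometry behind these figures can be checked with a small-angle estimate (illustrative only, not a specification from the patent); the particular pitch and focal length below are assumptions within the stated ranges:

```python
# Illustrative estimate: far-field angle between the two beams is roughly the
# stripe separation divided by the projection focal length (small-angle approx.).
import math

def beam_separation_deg(stripe_pitch_m: float, focal_length_m: float) -> float:
    return math.degrees(stripe_pitch_m / focal_length_m)

# A 40 um stripe pitch behind a 4 mm projection lens gives ~10 mrad (~0.57 deg),
# comfortably inside a receiver collection angle of a few degrees.
print(beam_separation_deg(40e-6, 4e-3))  # ~0.573
```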
  • a common cylindrical collimating lens may be used for Y-axis collimation of both of stripes 80 and 82 (in place of microlens array 74), while another cylindrical collimating lens, with longer focal distance, is used for X-axis collimation (in place of lens 78).
  • a single projection lens 78 may be sufficient, without pre-collimation by microlens array 74.
  • Fig. 4 is a schematic representation of illumination areas 92, 94 and a sensing area 90 of optical scanning head 40, in accordance with an embodiment of the present invention.
  • Illumination areas 92 and 94 correspond to the far-field beam profiles of the beams emitted by stripes 80 and 82 of laser die 70. As explained above, both of these areas 92 and 94 fall within a sensing area 90 of receiver 48, at respective transverse offsets on either side of scan line 96.
  • Controller 42 triggers transmitter 44 so that stripes 80 and 82 are pulsed in alternation, and thus areas 92 and 94 are illuminated in alternation.
  • the pulse timing is tuned so that there is no interference between the pulses returned from the scene due to emitters 80 and 82.
  • receiver 48 will sense the light pulse reflected from either area 92 or area 94, but not both.
  • the transmitted pulse sequences of emitters 80 and 82 may be defined to be orthogonal or otherwise separable, and controller 42 may separate the received pulses by applying a suitable signal processing algorithm.
  • the spatial resolution of the depth engine is thus a function of the sizes of, and separations between, successive illumination areas 92 and 94.
  • Fig. 5 is a schematic representation of a raster scan pattern 100 formed by optical scanning head 40 using the arrangement described above, in accordance with an embodiment of the present invention.
  • scan line 96 follows the zigzag raster pattern shown in the figure. (Alternatively, for simultaneous operation of both emitters, as described above, the scan pattern is just two parallel raster patterns.)
  • Beam areas 92 and 94 scan along corresponding offset scan lines 104, 106 that are transversely displaced (in the X-direction) relative to scan line 96.
  • successively illuminated areas 92 and 94 are also axially displaced (in the Y-direction) relative to one another.
  • the horizontal spread between successive Y-direction passes of scan line 96 is exaggerated in Fig. 5 , as is the vertical distance between successive illuminated areas 92 and 94; and in practice, the scan points are substantially more densely packed in both X- and Y-directions.
  • scan pattern 100 covers the scan area with twice the density that would be achieved by using only a single emitter, i.e., the scan comprises twice as many vertical lines as it would with a single emitter, and the horizontal resolution of the scan at any given mirror scan rate and range may thus be roughly doubled.
  • the vertical resolution is limited by the pulse rate at which laser die 70 is operated and other considerations, such as the temporal resolution of receiver 48 and the specific method that is used to extract depth measurements from the system.
  • Although Fig. 5 shows only a small number of raster lines and a small number of illuminated areas on each line, in practice the laser die and receiver can typically operate at frequencies of 20 MHz or higher. Consequently, the vertical resolution is limited in practice by the optical and processing capabilities of depth engine 22, rather than the spot density in the vertical direction.
  • the simple scan pattern 100 that is shown in Fig. 5 can be enhanced in a number of ways.
  • scanner 64 may step mirror 46 in the X-direction in smaller increments, or in increments of varying size, so that scan lines 104 and 106 cover the scan area with greater density in the horizontal direction and thus enhance the horizontal resolution.
  • the beam projection optics in transmitter 44 may be designed to optically reduce the angular separation between illuminated areas to 0.5° or less, thus reducing the offset between scan lines 104 and 106 and enhancing the horizontal resolution of the scan. (The angular separation may be reduced, for example, by offsetting microlenses 84 and 86 slightly relative to beam axes 87 and 88, in the manner described below with reference to Fig. 7C .)
  • stripes 80 and 82 may be pulsed in different sorts of alternating patterns from the simple alternating toggle that is illustrated in Fig. 5 .
  • stripe 80 may be pulsed two or more times in immediate succession to illuminate a group of successive areas 92 along line 104, followed by two or more successive pulses of stripe 82, and so on in alternation.
  • the signals output by receiver 48 due to each of these groups may be averaged to form a single pixel in the resulting depth map.
  • This technique can be particularly useful when the operating pulse frequency of laser die 70 is greater than the pixel clock rate of engine 22, in order to enhance the signal/noise ratio of the time-of-flight measurements without compromising the resolution.
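  • A minimal sketch of this grouping step (illustrative only; not the patent's signal-processing chain) is given below, with hypothetical names:

```python
# Illustrative sketch: collapse consecutive groups of time-of-flight readings,
# one group per emitter burst, into single averaged depth-pixel values.
def average_bursts(tof_samples, pulses_per_pixel: int):
    pixels = []
    for i in range(0, len(tof_samples) - pulses_per_pixel + 1, pulses_per_pixel):
        group = tof_samples[i:i + pulses_per_pixel]
        pixels.append(sum(group) / pulses_per_pixel)
    return pixels

# Two noisy 20 ns readings and two noisy 30 ns readings average to two pixels.
print(average_bursts([19.8e-9, 20.2e-9, 30.1e-9, 29.9e-9], 2))  # [~2e-08, ~3e-08]
```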
  • the alternating patterns of the emitters may include simultaneous operation in orthogonal time sequences (as in certain methods that are used in communications).
  • Fig. 6 is a schematic representation of sensing area 90 and illumination areas 92, 94 of an optical scanning head, in accordance with another embodiment of the present invention.
  • This figure illustrates another way to enhance the horizontal resolution of the scan pattern:
  • the respective positions of the emitters in transmitter 44 are diagonally offset relative to the projection of scan line 96 onto substrate 72 of laser die 70.
  • substrate 72 may be rotated in the X-Y plane about the Z-axis defined by the optical axis of lens 78 (Fig. 3A).
  • the effective distance D between the illumination areas in the raster scan over VOI 30 will be reduced by the cosine of the rotation angle, as shown in Fig. 6 .
  • Although the figure shows a certain angular separation between illumination areas 92 and 94, at micromirror 46 the spatial separation between the corresponding spots is very small.
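  • The effect of the rotation can be expressed as a one-line calculation (illustrative only; the 60° value below is an arbitrary example, not a figure from the patent):

```python
# Illustrative arithmetic: rotating the emitter substrate by an angle reduces
# the effective transverse spacing between the scan lines by cos(angle).
import math

def effective_spacing(nominal_spacing: float, rotation_deg: float) -> float:
    return nominal_spacing * math.cos(math.radians(rotation_deg))

# A 60 degree rotation halves the effective spacing between the illumination areas.
print(effective_spacing(1.0, 60.0))  # ~0.5
```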
  • Fig. 7A is a schematic representation of sensing area 90 and illumination areas 110, 112, 114, 116 and 118 of an optical scanning head, in accordance with yet another embodiment of the present invention.
  • the larger number of illumination areas is achieved in this example by increasing the number of emitters in the transmitter.
  • an edge-emitting laser die with a larger number of stripes may be used, or a surface-emitting device may be used as shown in Figs. 7B and 7C.
  • all of the emitters are pulsed in alternation, so that the depth engine scans five lines in parallel, rather than only two as in the preceding examples, and thus achieves still higher resolution.
  • three, four, or six or more emitters may be operated together in this manner.
  • Figs. 7B and 7C schematically illustrate a beam transmitter 170 that may be used in producing the illumination areas of Fig. 7A , in accordance with an embodiment of the present invention.
  • Fig. 7B is a side view of the entire beam transmitter
  • Fig. 7C is a side view of a beam generator 172 that may be used in transmitter 170.
  • Transmitters of this sort and integrated optoelectronic modules based on such transmitters are described in greater detail in the above-mentioned U.S. Patent Application Publication 2013/0207970.
  • Such transmitters and modules may be used in scanning head 40, mutatis mutandis, in place of the devices that are described above.
  • Beam generator 172 comprises an array of surface-emitting devices 178, such as vertical-cavity surface-emitting lasers (VCSELs).
  • the beams emitted by devices 178 are collected by a corresponding array of microlenses 176, which direct the beams toward a collimation lens 175.
  • Devices 178 and microlenses 176 may conveniently be formed on opposing faces of a transparent optical substrate 180, which may be a suitable semiconductor wafer, such as a GaAs wafer.
  • As shown in Fig. 7C, the alignment between devices 178 and microlenses 176 is such that the locations of devices 178 are offset inwardly relative to the centers of the corresponding microlenses 176, thus giving rise to an angular spread between the individual beams transmitted by the microlenses.
  • the angular spread generated by microlenses 176 defines a single virtual focus from which all of the beams reaching collimation lens 175 appear to originate. Consequently, the angular offset between adjacent beams exiting the collimation lens, which respectively form illumination areas 110, 112, 114, 116 and 118, is reduced to approximately 0.5° or less, and all of the illumination areas thus fall within detection area 90.
  • devices 178 may be aligned with an outward offset relative to the corresponding microlenses 176, so that the beams transmitted by the microlenses converge to a real focus, with similar effect upon collimation.
  • the emitting devices are divided into groups, wherein each group acts as a single emitter, and the groups are aligned angularly as required for the overall performance described above.
  • multi-emitter transmitters may be used to provide redundancy in depth mapping and other such systems: In the event that one of the emitters fails, the system may still continue to operate using the other emitter(s). This approach may reduce the need for extended transmitter burn-in before deployment in the field.
  • a multi-emitter configuration of the type described above may be applied in a pico-projector to deliver HD performance using hybrid electronic-MEMS scanning capabilities.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Mechanical Optical Scanning Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Computer Vision & Pattern Recognition (AREA)

Claims (12)

  1. Mapping apparatus comprising:
    a transmitter (44), which is configured to emit, along respective beam axes (87, 88), at least two beams comprising pulses of light;
    a scanner (64), which is configured to scan a scene with the at least two beams and which comprises at least one rotating mirror (46) rotating so as to generate a raster pattern by scanning in a first direction at a first speed and in a second direction, perpendicular to the first direction, at a second speed, the second speed being lower than the first speed, wherein the transmitter directs the at least two beams to reflect from the at least one rotating mirror;
    a receiver (48), which comprises a detector that receives, via reflection from the at least one mirror, the light reflected from the scene as the at least one mirror rotates, and which is configured to generate an output indicative of a time of flight of the pulses to and from points in the scene; and
    a processor (24), which is coupled to process the output of the receiver so as to generate a three-dimensional map of the scene,
    wherein the scanner scans the at least two beams in a raster pattern (100), and the transmitter emits the at least two beams in alternation, with the beam axes mutually offset transversely relative to a scan line direction of the raster pattern, thereby creating a two-dimensional scan of each of the beams.
  2. The apparatus according to claim 1, wherein the detector has a detection area (90) that is scanned over the scene by the rotation of the at least one mirror, and wherein the at least two beams have respective illumination areas (92, 94) that are scanned over the scene by the rotation of the at least one mirror and, at any given location along the raster pattern, fall within the detection area.
  3. The apparatus according to claim 1, comprising a beamsplitter (60), which is positioned so as to direct the beams emitted by the transmitter toward the at least one mirror while permitting the reflected light to reach the detector, the beam axes and an optical axis of the reflected light being parallel between the beamsplitter and the at least one mirror.
  4. The apparatus according to claim 1, wherein the receiver has a detection area (90) that is scanned over the scene in synchronization with the at least two beams, and wherein the at least two beams have respective illumination areas (92, 94) that, at any given location along the raster pattern, fall within the detection area.
  5. The apparatus according to any of claims 1 to 4, wherein the transmitter comprises at least two emitters (80, 82), which are respectively configured to emit the at least two beams and are mounted on a substrate in respective positions that are offset transversely relative to a projection of the scan line direction onto the substrate.
  6. The apparatus according to claim 5, wherein the transmitter comprises an edge-emitting laser die containing an array of two or more parallel laser stripes, which serve as the at least two emitters.
  7. The apparatus according to claim 5, wherein the transmitter comprises a vertically-emitting laser die containing an array of two or more vertical-cavity surface-emitting lasers, which serve as the at least two emitters.
  8. The apparatus according to claim 5, wherein the transmitter comprises optics, including an array of at least two microlenses (176), which are respectively aligned with the at least two emitters so that the optics reduce an angular separation between the beam axes.
  9. The apparatus according to claim 5, wherein the respective positions of the at least two emitters are diagonally offset relative to the projection of the scan line direction onto the substrate.
  10. A method for mapping, comprising:
    emitting, along respective beam axes (87, 88), at least two beams comprising pulses of light;
    scanning a scene with the at least two beams by directing the beams to reflect from at least one rotating mirror (44) so as to generate a raster pattern by scanning in a first direction at a first speed and in a second direction, perpendicular to the first direction, at a second speed, the second speed being lower than the first speed;
    receiving the light reflected from the scene by detecting, via reflection from the at least one mirror, the light that is reflected from the scene as the at least one mirror rotates, and generating, in response to the received light, an output indicative of a time of flight of the pulses to and from points in the scene; and
    processing the output so as to generate a three-dimensional map of the scene,
    wherein the at least two beams are scanned so as to generate a raster pattern (100), and the at least two beams are emitted in alternation, with the beam axes mutually offset transversely relative to a scan line direction of the raster pattern, thereby creating a two-dimensional scan of each of the beams.
  11. The method according to claim 10, wherein receiving the light comprises collecting the light within a detection area (90) that is scanned over the scene in synchronization with the at least two beams, and wherein the at least two beams have respective illumination areas (92, 94) that, at any given location along the raster pattern, fall within the detection area.
  12. The method according to claim 10 or 11, wherein emitting the at least two beams comprises pulsing at least two emitters to emit the at least two beams, the at least two emitters being mounted on a substrate in respective positions that are offset transversely relative to a projection of the scan line direction onto the substrate.
EP14765420.6A 2013-03-15 2014-03-13 Depth scanning with multiple emitters Active EP2972081B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361786711P 2013-03-15 2013-03-15
PCT/IB2014/059711 WO2014141115A2 (fr) 2013-03-15 2014-03-13 Balayage en profondeur à l'aide de de multiples émetteurs

Publications (3)

Publication Number Publication Date
EP2972081A2 EP2972081A2 (fr) 2016-01-20
EP2972081A4 EP2972081A4 (fr) 2016-11-09
EP2972081B1 true EP2972081B1 (fr) 2020-04-22

Family

ID=51538229

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14765420.6A Active EP2972081B1 (fr) 2013-03-15 2014-03-13 Balayage en profondeur à l'aide de multiples émetteurs

Country Status (6)

Country Link
US (1) US9267787B2 (fr)
EP (1) EP2972081B1 (fr)
KR (1) KR101762525B1 (fr)
CN (1) CN105143820B (fr)
IL (1) IL241335B (fr)
WO (1) WO2014141115A2 (fr)

Families Citing this family (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3836539B1 (fr) 2007-10-10 2024-03-13 Gerard Dirk Smits Projecteur d'image avec suivi de lumière réfléchie
US10739460B2 (en) 2010-08-11 2020-08-11 Apple Inc. Time-of-flight detector with single-axis scan
WO2012054231A2 (fr) 2010-10-04 2012-04-26 Gerard Dirk Smits Système et procédé de projection en 3d et améliorations d'interactivité
US9715107B2 (en) 2012-03-22 2017-07-25 Apple Inc. Coupling schemes for gimbaled scanning mirror arrays
CN104221058B (zh) 2012-03-22 2017-03-08 苹果公司 装有万向接头的扫描镜阵列
WO2014016794A1 (fr) 2012-07-26 2014-01-30 Primesense Ltd. Miroir de balayage double axe
US8971568B1 (en) 2012-10-08 2015-03-03 Gerard Dirk Smits Method, apparatus, and manufacture for document writing and annotation with virtual ink
KR102124930B1 (ko) * 2013-08-16 2020-06-19 엘지전자 주식회사 공간 해상도가 가변되는 거리 정보를 획득할 수 있는 거리검출장치
CN104898124A (zh) * 2014-03-07 2015-09-09 光宝科技股份有限公司 深度检测的取样方法及其光学装置
US9810913B2 (en) 2014-03-28 2017-11-07 Gerard Dirk Smits Smart head-mounted projection system
CN106415309B (zh) * 2014-06-27 2019-07-30 Hrl实验室有限责任公司 扫描激光雷达及其制造方法
US9377533B2 (en) 2014-08-11 2016-06-28 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US9835853B1 (en) 2014-11-26 2017-12-05 Apple Inc. MEMS scanner with mirrors of different sizes
US9784838B1 (en) 2014-11-26 2017-10-10 Apple Inc. Compact scanner with gimbaled optics
DE102014118055A1 (de) * 2014-12-08 2016-06-09 Valeo Schalter Und Sensoren Gmbh Sendeeinrichtung, Empfangseinrichtung und Objekterfassungsvorrichtung für ein Kraftfahrzeug sowie Verfahren dafür
WO2016123618A1 (fr) * 2015-01-30 2016-08-04 Adcole Corporation Scanners optiques tridimensionnels et leurs procédés d'utilisation
US9798135B2 (en) 2015-02-16 2017-10-24 Apple Inc. Hybrid MEMS scanning module
WO2016168378A1 (fr) 2015-04-13 2016-10-20 Gerard Dirk Smits Vision artificielle destinée au mouvement propre, à la segmentation et à la classification d'objets
US9919425B2 (en) 2015-07-01 2018-03-20 Irobot Corporation Robot navigational sensor system
US9880267B2 (en) 2015-09-04 2018-01-30 Microvision, Inc. Hybrid data acquisition in scanned beam display
DE102015115011A1 (de) * 2015-09-08 2017-03-09 Valeo Schalter Und Sensoren Gmbh Laserscanner für Kraftfahrzeuge
US10503265B2 (en) 2015-09-08 2019-12-10 Microvision, Inc. Mixed-mode depth detection
US10063849B2 (en) 2015-09-24 2018-08-28 Ouster, Inc. Optical system for collecting distance information within a field
US9992477B2 (en) 2015-09-24 2018-06-05 Ouster, Inc. Optical system for collecting distance information within a field
US9897801B2 (en) 2015-09-30 2018-02-20 Apple Inc. Multi-hinge mirror assembly
US9703096B2 (en) 2015-09-30 2017-07-11 Apple Inc. Asymmetric MEMS mirror assembly
JP6854828B2 (ja) 2015-12-18 2021-04-07 ジェラルド ディルク スミッツ 物体のリアルタイム位置検知
WO2017112416A1 (fr) * 2015-12-20 2017-06-29 Apple Inc. Capteur de détection de lumière et de télémétrie
US10324171B2 (en) 2015-12-20 2019-06-18 Apple Inc. Light detection and ranging sensor
US9813673B2 (en) 2016-01-20 2017-11-07 Gerard Dirk Smits Holographic video capture and telepresence system
KR20200110823A (ko) 2016-01-29 2020-09-25 각코호진 메이지다이가쿠 레이저 스캔 시스템, 레이저 스캔 방법, 이동 레이저 스캔 시스템 및 프로그램
US10416292B2 (en) 2016-05-24 2019-09-17 Veoneer Us, Inc. Direct detection LiDAR system and method with frequency modulation (FM) transmitter and quadrature receiver
US10838062B2 (en) 2016-05-24 2020-11-17 Veoneer Us, Inc. Direct detection LiDAR system and method with pulse amplitude modulation (AM) transmitter and quadrature receiver
US10473784B2 (en) 2016-05-24 2019-11-12 Veoneer Us, Inc. Direct detection LiDAR system and method with step frequency modulation (FM) pulse-burst envelope modulation transmission and quadrature demodulation
US9766060B1 (en) * 2016-08-12 2017-09-19 Microvision, Inc. Devices and methods for adjustable resolution depth mapping
US10145680B2 (en) 2016-08-12 2018-12-04 Microvision, Inc. Devices and methods for providing depth mapping with scanning laser image projection
US10298913B2 (en) 2016-08-18 2019-05-21 Apple Inc. Standalone depth camera
CN109843500B (zh) 2016-08-24 2021-06-29 奥斯特公司 用于收集场内的距离信息的光学系统
US10488652B2 (en) 2016-09-21 2019-11-26 Apple Inc. Prism-based scanner
US20180081041A1 (en) * 2016-09-22 2018-03-22 Apple Inc. LiDAR with irregular pulse sequence
EP3532863A4 (fr) 2016-10-31 2020-06-03 Gerard Dirk Smits Lidar à balayage rapide avec sondage par voxel dynamique
EP3540378A4 (fr) * 2016-11-14 2020-08-19 Ningbo Onsight Co., Ltd. Procédé et appareil de balayage laser
GB201619921D0 (en) * 2016-11-24 2017-01-11 Cambridge Mechatronics Ltd SMA Depth mapping
US10771768B2 (en) 2016-12-15 2020-09-08 Qualcomm Incorporated Systems and methods for improved depth sensing
US10200683B2 (en) 2016-12-21 2019-02-05 Microvision, Inc. Devices and methods for providing foveated scanning laser image projection with depth mapping
US10261183B2 (en) 2016-12-27 2019-04-16 Gerard Dirk Smits Systems and methods for machine perception
US10942257B2 (en) 2016-12-31 2021-03-09 Innovusion Ireland Limited 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices
US10158845B2 (en) 2017-01-18 2018-12-18 Facebook Technologies, Llc Tileable structured light projection for wide field-of-view depth sensing
US10598771B2 (en) * 2017-01-18 2020-03-24 Analog Devices Global Unlimited Company Depth sensing with multiple light sources
US10419741B2 (en) 2017-02-24 2019-09-17 Analog Devices Global Unlimited Company Systems and methods for compression of three dimensional depth sensing
US10473766B2 (en) 2017-03-13 2019-11-12 The Charles Stark Draper Laboratory, Inc. Light detection and ranging (LiDAR) system and method
JP7246322B2 (ja) 2017-05-10 2023-03-27 ジェラルド ディルク スミッツ 走査ミラーシステム及び方法
WO2018213200A1 (fr) 2017-05-15 2018-11-22 Ouster, Inc. Émetteur d'imagerie optique doté d'une luminosité améliorée
DE102017005395B4 (de) * 2017-06-06 2019-10-10 Blickfeld GmbH LIDAR-Entfernungsmessung mit Scanner und FLASH-Lichtquelle
US11163042B2 (en) 2017-06-06 2021-11-02 Microvision, Inc. Scanned beam display with multiple detector rangefinding
US10830879B2 (en) 2017-06-29 2020-11-10 Apple Inc. Time-of-flight depth mapping with parallax compensation
US10838043B2 (en) 2017-11-15 2020-11-17 Veoneer Us, Inc. Scanning LiDAR system and method with spatial filtering for reduction of ambient light
US10613200B2 (en) 2017-09-19 2020-04-07 Veoneer, Inc. Scanning lidar system and method
US11460550B2 (en) 2017-09-19 2022-10-04 Veoneer Us, Llc Direct detection LiDAR system and method with synthetic doppler processing
EP3460519A1 (fr) * 2017-09-25 2019-03-27 Hexagon Technology Center GmbH Scanner laser
US11194022B2 (en) 2017-09-29 2021-12-07 Veoneer Us, Inc. Detection system with reflection member and offset detection array
US10684370B2 (en) 2017-09-29 2020-06-16 Veoneer Us, Inc. Multifunction vehicle detection system
EP3692391A4 (fr) * 2017-10-03 2020-11-11 Leddartech Inc. Instrument de télémètrie optique à impulsions multiples à forme d'onde complète
CN107656258A (zh) * 2017-10-19 2018-02-02 深圳市速腾聚创科技有限公司 激光雷达及激光雷达控制方法
US10591605B2 (en) 2017-10-19 2020-03-17 Gerard Dirk Smits Methods and systems for navigating a vehicle including a novel fiducial marker system
US11585901B2 (en) 2017-11-15 2023-02-21 Veoneer Us, Llc Scanning lidar system and method with spatial filtering for reduction of ambient light
JP7039948B2 (ja) * 2017-11-17 2022-03-23 株式会社デンソー 測距センサ
EP3493339B1 (fr) * 2017-12-04 2022-11-09 ams AG Dispositif semi-conducteur et procédé de mesures de la durée de vol et de la proximité
US11353556B2 (en) 2017-12-07 2022-06-07 Ouster, Inc. Light ranging device with a multi-element bulk lens system
EP3704510B1 (fr) * 2017-12-18 2022-10-05 Apple Inc. Détection de temps de vol à l'aide d'un réseau d'émetteurs adressable
WO2019148214A1 (fr) 2018-01-29 2019-08-01 Gerard Dirk Smits Systèmes lidar à balayage à largeur de bande élevée et hyper-résolution
CN112292608A (zh) 2018-02-23 2021-01-29 图达通爱尔兰有限公司 用于lidar系统的二维操纵系统
US11808888B2 (en) 2018-02-23 2023-11-07 Innovusion, Inc. Multi-wavelength pulse steering in LiDAR systems
US10760957B2 (en) 2018-08-09 2020-09-01 Ouster, Inc. Bulk optics for a scanning array
US10739189B2 (en) 2018-08-09 2020-08-11 Ouster, Inc. Multispectral ranging/imaging sensor arrays and systems
US11237256B2 (en) * 2018-09-19 2022-02-01 Waymo Llc Methods and systems for dithering active sensor pulse emissions
CN109254297B (zh) * 2018-10-30 2023-09-08 杭州欧镭激光技术有限公司 一种激光雷达的光路系统及一种激光雷达
US11543495B2 (en) * 2018-11-01 2023-01-03 Waymo Llc Shot reordering in LIDAR systems
CN109373897B (zh) * 2018-11-16 2020-07-31 广州市九州旗建筑科技有限公司 一种基于激光虚拟标尺的测量方法
CN114578381A (zh) * 2018-11-16 2022-06-03 上海禾赛科技有限公司 一种激光雷达系统
US11754682B2 (en) 2019-05-30 2023-09-12 Microvision, Inc. LIDAR system with spatial beam combining
US11796643B2 (en) 2019-05-30 2023-10-24 Microvision, Inc. Adaptive LIDAR scanning methods
US11828881B2 (en) 2019-05-30 2023-11-28 Microvision, Inc. Steered LIDAR system with arrayed receiver
US11480660B2 (en) 2019-07-09 2022-10-25 Microvision, Inc. Arrayed MEMS mirrors for large aperture applications
US11579256B2 (en) 2019-07-11 2023-02-14 Microvision, Inc. Variable phase scanning lidar system
US11579257B2 (en) 2019-07-15 2023-02-14 Veoneer Us, Llc Scanning LiDAR system and method with unitary optical element
US11474218B2 (en) 2019-07-15 2022-10-18 Veoneer Us, Llc Scanning LiDAR system and method with unitary optical element
WO2021029969A1 (fr) * 2019-08-13 2021-02-18 Apple Inc. Conditionnement optique de plan focal pour photonique intégrée
CN114208006A (zh) 2019-08-18 2022-03-18 苹果公司 具有电磁致动的力平衡微镜
EP4024855A4 (fr) * 2019-08-30 2022-09-28 LG Innotek Co., Ltd. Caméra temps de vol
DE102019125906A1 (de) * 2019-09-26 2021-04-01 Blickfeld GmbH Emitterarray für Lichtdetektion und -entfernungsmessung, LIDAR
US11313969B2 (en) 2019-10-28 2022-04-26 Veoneer Us, Inc. LiDAR homodyne transceiver using pulse-position modulation
US11733359B2 (en) 2019-12-03 2023-08-22 Apple Inc. Configurable array of single-photon detectors
CN113064136A (zh) * 2020-01-02 2021-07-02 隆达电子股份有限公司 Light-emitting element and light-emitting module
CN113075641A (zh) * 2020-01-03 2021-07-06 华为技术有限公司 TOF depth sensing module and image generation method
US11656340B2 (en) 2020-01-31 2023-05-23 Denso Corporation LIDAR device
US11372320B2 (en) 2020-02-27 2022-06-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11326758B1 (en) 2021-03-12 2022-05-10 Veoneer Us, Inc. Spotlight illumination system using optical element
US11732858B2 (en) 2021-06-18 2023-08-22 Veoneer Us, Llc Headlight illumination system using optical element
US11681028B2 (en) 2021-07-18 2023-06-20 Apple Inc. Close-range measurement of time of flight using parallax shift

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07198845A (ja) * 1993-12-28 1995-08-01 Nec Corp Distance and image measuring apparatus

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998016801A1 (fr) * 1996-10-11 1998-04-23 Schwartz Electro-Optics, Inc. Multi-lane sensor for an intelligent vehicle highway system
US6115111A (en) 1998-10-05 2000-09-05 Korah; John K. Semiconductor laser based sensing device
US20020071169A1 (en) 2000-02-01 2002-06-13 Bowers John Edward Micro-electro-mechanical-system (MEMS) mirror device
EP1373830B1 (fr) * 2001-04-04 2006-05-17 Instro Precision Limited Surface profile measurement
US20030090818A1 (en) 2001-11-02 2003-05-15 Wittenberger John Carl Co-aligned receiver and transmitter for wireless link
US20030227614A1 (en) 2002-06-05 2003-12-11 Taminiau August A. Laser machining apparatus with automatic focusing
DE10244641A1 (de) 2002-09-25 2004-04-08 Ibeo Automobile Sensor Gmbh Optoelectronic detection device
US7064876B2 (en) 2003-07-29 2006-06-20 Lexmark International, Inc. Resonant oscillating scanning device with multiple light sources
EP2937897A3 (fr) 2003-09-15 2016-03-23 Nuvotronics LLC Emballage de dispositif et procédés pour la fabrication et le test de celui-ci
IL165212A (en) 2004-11-15 2012-05-31 Elbit Systems Electro Optics Elop Ltd Device for scanning light
US8018579B1 (en) 2005-10-21 2011-09-13 Apple Inc. Three-dimensional imaging and display system
EP1949665A1 (fr) * 2005-10-27 2008-07-30 Hewlett-Packard Development Company, L.P. Apparatus and method of scanning light using an array of light sources
US7423821B2 (en) 2006-03-24 2008-09-09 Gentex Corporation Vision system
US7352499B2 (en) 2006-06-06 2008-04-01 Symbol Technologies, Inc. Arrangement for and method of projecting an image with pixel mapping
JP5012463B2 (ja) 2007-12-03 2012-08-29 セイコーエプソン株式会社 Scanning image display system and scanning image display device
US8519983B2 (en) * 2007-12-29 2013-08-27 Microvision, Inc. Input device for a scanned beam display
US8353457B2 (en) * 2008-02-12 2013-01-15 Datalogic ADC, Inc. Systems and methods for forming a composite image of multiple portions of an object from multiple perspectives
KR20100063996A (ko) 2008-12-04 2010-06-14 삼성전자주식회사 Scanner and image forming apparatus employing the same
US8305502B2 (en) 2009-11-11 2012-11-06 Eastman Kodak Company Phase-compensated thin-film beam combiner
US20110188054A1 (en) 2010-02-02 2011-08-04 Primesense Ltd Integrated photonics module for optical projection
US8279418B2 (en) * 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
EP2378310B1 (fr) * 2010-04-15 2016-08-10 Rockwell Automation Safety AG Unité de caméra pour durée de vol et système de surveillance optique
US8330804B2 (en) 2010-05-12 2012-12-11 Microsoft Corporation Scanned-beam depth mapping to 2D image
US9400503B2 (en) * 2010-05-20 2016-07-26 Irobot Corporation Mobile human interface robot
US8654152B2 (en) 2010-06-21 2014-02-18 Microsoft Corporation Compartmentalizing focus area within field of view
CN103053167B (zh) * 2010-08-11 2016-01-20 苹果公司 Scanning projector and image capture module for 3D mapping
WO2012027410A1 (fr) 2010-08-23 2012-03-01 Lighttime, Llc Lidar using MEMS scanning
DE102010037744B3 (de) * 2010-09-23 2011-12-08 Sick Ag Optoelectronic sensor
EP2710568B1 (fr) * 2011-05-03 2019-11-06 Shilat Optronics Ltd Terrain surveillance system
US9329080B2 (en) * 2012-02-15 2016-05-03 Apple Inc. Modular optics for scanning engine having beam combining optics with a prism intercepted by both beam axis and collection axis
US9651417B2 (en) 2012-02-15 2017-05-16 Apple Inc. Scanning depth engine
US8958911B2 (en) * 2012-02-29 2015-02-17 Irobot Corporation Mobile robot
WO2013140308A1 (fr) * 2012-03-22 2013-09-26 Primesense Ltd. Diffraction-based sensing of mirror position
EP2926225A4 (fr) * 2013-02-14 2016-07-27 Apple Inc Flexible room controls

Also Published As

Publication number Publication date
WO2014141115A2 (fr) 2014-09-18
KR101762525B1 (ko) 2017-07-27
IL241335A0 (en) 2015-11-30
US9267787B2 (en) 2016-02-23
CN105143820B (zh) 2017-06-09
IL241335B (en) 2019-02-28
WO2014141115A3 (fr) 2014-12-04
US20140313519A1 (en) 2014-10-23
KR20150131313A (ko) 2015-11-24
EP2972081A4 (fr) 2016-11-09
EP2972081A2 (fr) 2016-01-20
CN105143820A (zh) 2015-12-09

Similar Documents

Publication Publication Date Title
EP2972081B1 (fr) Depth scanning using multiple emitters
US20200371585A1 (en) Integrated optoelectronic module
US10739460B2 (en) Time-of-flight detector with single-axis scan
US10649072B2 (en) LiDAR device based on scanning mirrors array and multi-frequency laser modulation
US10305247B2 (en) Radiation source with a small-angle scanning array
KR102038533B1 (ko) Laser radar system and method for acquiring a target image
JP6111617B2 (ja) Laser radar device
US8427657B2 (en) Device for optical imaging, tracking, and position measurement with a scanning MEMS mirror
KR102020037B1 (ko) Hybrid lidar scanner
KR20180113924A (ko) Lidar system and method
US20200284882A1 (en) Lidar sensors and methods for the same
JP2017520755A (ja) 3D coarse laser scanner
US20210132196A1 (en) Flat optics with passive elements functioning as a transformation optics and a compact scanner to cover the vertical elevation field-of-view
US11796643B2 (en) Adaptive LIDAR scanning methods
CN108885260B (zh) Time-of-flight detector with single-axis scan
JP2021184067A (ja) Optical scanning device, object recognition device, and optical scanning method
AU2015203089B2 (en) Scanning depth engine

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150929

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: SHPUNT, ALEXANDER

Inventor name: MOR, ZAFRIR

Inventor name: EINAT, RONEN

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20161011

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 17/89 20060101ALI20161005BHEP

Ipc: G01S 7/481 20060101ALI20161005BHEP

Ipc: G02B 17/00 20060101ALI20161005BHEP

Ipc: G01C 3/08 20060101AFI20161005BHEP

Ipc: G01B 11/24 20060101ALI20161005BHEP

Ipc: G01C 3/00 20060101ALI20161005BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20180226

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

TPAC Observations filed by third parties

Free format text: ORIGINAL CODE: EPIDOSNTIPA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

INTC Intention to grant announced (deleted)
17Q First examination report despatched

Effective date: 20180509

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: APPLE INC.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20191216

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014064152

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1260705

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200515

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200824

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200822

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200723

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200722

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1260705

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200722

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014064152

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20210125

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210313

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210313

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20140313

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200422

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20231229

Year of fee payment: 11

Ref country code: GB

Payment date: 20240108

Year of fee payment: 11