EP2955544B1 - TOF camera system and method for measuring a distance with the system - Google Patents

TOF camera system and method for measuring a distance with the system

Info

Publication number
EP2955544B1
Authority
EP
European Patent Office
Prior art keywords
scene
pixels
depth map
indirect
camera system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP14171985.6A
Other languages
German (de)
English (en)
Other versions
EP2955544A1 (fr)
Inventor
Ward Van Der Tempel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Depthsensing Solutions NV SA
Original Assignee
Sony Depthsensing Solutions NV SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Depthsensing Solutions NV SA
Priority to EP14171985.6A (EP2955544B1)
Priority to BE2014/0578A (BE1022486B1)
Priority to JP2016572252A (JP2017517737A)
Priority to US15/317,367 (US10901090B2)
Priority to CN201580036607.8A (CN106662651B)
Priority to PCT/EP2015/063015 (WO2015189311A1)
Priority to KR1020177000302A (KR102432765B1)
Publication of EP2955544A1
Application granted
Publication of EP2955544B1
Priority to JP2020195625A (JP7191921B2)
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/487 Extracting wanted echo signals, e.g. pulse detection
    • G01S7/4876 Extracting wanted echo signals, e.g. pulse detection by removing unwanted signals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491 Details of non-pulse systems
    • G01S7/4911 Transmitters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491 Details of non-pulse systems
    • G01S7/493 Extracting wanted echo signals

Definitions

  • The invention relates (i) to a method for measuring a distance between an object of a scene and a Time-Of-Flight camera system and (ii) to the associated Time-Of-Flight camera system.
  • More particularly, the invention relates to the problem of direct and indirect reflections of light within a scene and to the errors in depth measurements induced by these multiple reflections. By "scene" is meant all the surfaces surrounding the object onto which a light beam could be directly or indirectly reflected.
  • Time-Of-Flight technology is a promising technology for depth perception.
  • The well-known basic operational principle of a standard TOF camera system 3 is illustrated in Figure 1.
  • The TOF camera system 3 captures 3D images of a scene 15 by analysing the time of flight of light from a dedicated illumination unit 18 to an object.
  • The TOF camera system 3 includes a camera, for instance a matrix of pixels 1, and data processing means 4.
  • The scene 15 is actively illuminated with a modulated light 16 at a predetermined wavelength using the dedicated illumination unit 18, for instance with light pulses of at least one predetermined frequency.
  • The modulated light is reflected back from objects within the scene.
  • A lens 2 collects the reflected light 17 and forms an image of the objects onto the imaging sensor 1 of the camera.
  • A delay is experienced between the emission of the modulated light, e.g. the so-called light pulses, and the reception at the camera of those reflected light pulses.
  • The distance between the reflecting objects and the camera may be determined as a function of the observed time delay and the constant speed of light.
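  • As an illustrative aside (not part of the patent text), the distance follows from the round-trip relation d = c * Δt / 2, since the emitted light travels to the object and back. A minimal Python sketch, with all names chosen here for illustration:

        # Distance from a measured round-trip time of flight (illustrative sketch).
        C = 299_792_458.0  # speed of light, in m/s

        def distance_from_delay(delay_s: float) -> float:
            """Return the object distance for a measured round-trip delay in seconds."""
            return C * delay_s / 2.0

        print(distance_from_delay(50e-9))  # a 50 ns round trip corresponds to ~7.5 m
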
  • A standard TOF camera system 9 is represented, comprising an illumination unit 8 for illuminating a scene 24 in multiple directions, a TOF sensor 6 for detecting the reflections of the emitted light, and processing means 7 for processing the data acquired by the TOF sensor 6.
  • As the light captured by the sensor 6 can originate from both the direct path 25 and the secondary reflection 26, the measured depth map 27, which represents the depth associated with each point of the scene, is thereby erroneous.
  • US 2013/0148102A1 proposes to compensate for multipath by fusing the results obtained from two spatially different illumination schemes: typically, one scheme achieves the highest possible lateral resolution, while the second one structures the emitted light, lowering the lateral resolution but limiting the impact of multiple reflections.
  • US 2014/055771A1 discloses a TOF-based camera system with an illumination module that illuminates only a given region of the field of view of the imaging sensor, which translates to a region of the pixels of the imaging sensor.
  • The acquired data of that pixel region is then typically read out and/or processed.
  • Next, a second pixel region is illuminated and processed. This procedure can be repeated, from a couple of times up to a few hundred or even a few thousand times, until the entire pixel array has been read out, possibly several times.
  • The full depth image is then reconstructed based on the results from the different pixel region acquisitions.
  • Another approach uses a set of different spatial patterns generated by, for example, a Digital Light Processing (DLP) projector.
  • The direct and indirect components can be separated, as the depth acquired from the black parts of the pattern is created only by the indirect signal originating from multipath.
  • The different patterns are chosen in such a way that each part of the scene is captured in a black situation; edge effects are countered by defining the patterns with sufficient overlaps.
  • However, the creation of these different patterns is expensive.
  • A solution therefore remains to be proposed for retrieving only the direct component of the reflected light in the most cost-effective way, in order to perform more accurate measurements of the distances between objects of a scene and the TOF camera system.
  • The present invention relates to a method for measuring a distance between an object of a scene and a Time-Of-Flight camera system, and for providing a depth map of the object, the Time-Of-Flight camera system comprising an illumination unit, patterning means, an imaging sensor having a matrix of pixels, and image processing means, the method comprising the following steps:
  • When processing the data, the method could comprise a step of identifying, for instance on an intermediary depth map (but not only there), peaks associated with elementary areas on which only indirect incident light beams can impinge, by noticing the appearance of peaks in the second depth map (29) that are not present in the first depth map.
  • The data could then be processed to eliminate the influence of indirect light beams and to obtain a final, accurate depth map of the scene.
  • The present invention also relates to a Time-Of-Flight (TOF) camera system for measuring a distance between an object of a scene and the TOF camera system, and for providing a depth map of the object, the TOF camera system comprising:
  • Advantageously, the modification of the illumination is performed by masking the illumination unit in a discrete way.
  • The patterning means could for instance be a mask preventing direct incident light beams from impinging on some elementary areas of the scene.
  • The patterning means could also comprise a series of identical pattern groups, enabling easier processing of the data.
  • FIG. 3 illustrates a TOF camera system 10 according to an embodiment of the invention.
  • The Time-Of-Flight camera system 10 comprises an illumination unit 20 for illuminating a scene 24 with a modulated light.
  • The light emitted by this illumination unit 20 is arranged to be suitable for measuring distances using the Time-Of-Flight technology.
  • The illumination unit 20 may be arranged for emitting light pulses with an appropriate pulse width. Indeed, when using pulses, the pulse width of each light pulse determines the camera range. For instance, for a pulse width of 50 ns, the range is limited to 7.5 m.
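  • As a worked check of that figure (illustrative arithmetic only): a pulse of width 50 ns spans c * T_p = 3 x 10^8 m/s * 50 x 10^-9 s = 15 m of travel, and since the light travels out and back, the unambiguous range is 15 m / 2 = 7.5 m.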
  • The illumination unit is arranged for emitting multi-directional light, as represented by the plurality of emitted light rays 25, 26 and 28 in Figure 3.
  • The TOF camera system further comprises an imaging sensor 21, typically comprising a matrix array of pixels, for receiving and detecting the reflected beams and forming an image of the scene 24.
  • The TOF camera system 10 further comprises patterning means 30 for creating a light pattern on the scene 24.
  • The light pattern may be a native laser speckle pattern obtained directly from laser-light interference, or be obtained from patterning means placed in front of the illumination unit 20, or a combination of both the laser speckle and the patterning means 30.
  • With the patterning means 30, the light emitted by the illumination unit 20 passes through these patterning means, causing the light to be modified and to form a light pattern on the scene, with delimited elementary areas 31, 32 of different intensities.
  • The light emitted by the illumination unit 20 may be blocked, or its intensity reduced, on given areas of the patterning means 30 and not blocked on other areas, resulting in the creation of areas with low light intensity 31 and high light intensity 32, respectively, on the scene.
  • These areas are represented by thick lines 31 and 32, but it should be understood that the light pattern created on the scene is not a solid or physical pattern attached to the scene 24, but the result of light effects originating from the patterning means 30 placed in front of the illumination unit 20.
  • The light pattern is projected onto the scene by the illumination unit 20.
  • The patterning means 30 can be filtering means, a mask, a grid, or any means enabling the illumination to be modified in a discrete way.
  • Ideally, the patterning means should provide a spatially periodic light pattern 31, 32, 45, 46 on the scene, so that the areas 31 where only secondary reflections are measured can easily be retrieved.
  • The patterning means 30 could also comprise a series of identical pattern groups 50, as represented in Figure 4, or a series of different pattern groups used sequentially in time, in synchrony with a multiple of the TOF camera frame rate.
  • Notably, the invention does not require a pattern with a contrast of 100%, and there is no need to align the patterning means with the image sensor.
  • The TOF camera system 10 further comprises processing means 5 for determining the time of flight of the light emitted by the illumination unit 20 and, thereby, the distance between an object of the scene 24 and the imaging sensor 21.
  • The processing means 5 are arranged for receiving data from the pixels of the imaging sensor 21 and for processing that data so as to eliminate the influence of the indirect light beams in the depth map of the object. The method for determining this distance, and a final and accurate depth map of the object, will be described in the following paragraphs.
  • The time of flight can be calculated in a separate processing unit which may be coupled to the TOF sensor 21, or may be integrated directly into the TOF sensor itself.
  • The processing means 5 have been represented coupled to the illumination unit 20, but the invention is not limited thereto.
  • A method for measuring a distance between an object of a scene and the Time-Of-Flight camera system, and for providing a depth map of the object, will now be described with reference to Figure 3, Figure 4, Figure 5 and Figure 6, the Time-Of-Flight camera system 10 comprising an illumination unit 20, an imaging sensor 21 having a matrix of pixels 22, 23, and image processing means 5.
  • The method comprises the step of modifying in a discrete way the illumination of the illumination unit 20 in order to illuminate elementary areas 31, 32 of the scene with different incident intensities, respectively, for distinguishing the direct incident light beams 25 from the indirect incident light beams 26, 28.
  • This modification can be performed, for instance, by creating a light pattern on the scene, the light pattern comprising delimited elementary areas with high and low light intensity.
  • This light pattern can be created by placing the patterning means 30 in front of the illumination unit 20 and thus projecting a light pattern onto the scene.
  • The pixels of the sensor 21 receive the beams reflected by these elementary areas 31, 32 and provide the image processing means 5 with corresponding data.
  • The light pattern projected on the scene can be retrieved on the intermediary depth map 29, as illustrated by Figure 3 and Figure 5.
  • Peaks 33 correspond to areas 31 of the scene 24 where only secondary reflections are measured.
  • On these areas 31, the pixels of the imaging sensor 21 should measure no light, or only light of small intensity, because by definition these areas are associated with areas of the patterning means 30 where the light is blocked or where the light intensity is reduced.
  • The light measured on pixel 23 is thus dominated by the secondary reflection 28, whereas the light measured on pixel 22 corresponds to both direct and indirect components 25, 26.
  • Identifying, for instance on an intermediary depth map 29, elementary areas 31 on which no direct incident light beam can impinge can be used to eliminate the influence of indirect light beams and to obtain a final and accurate depth map of the object.
  • The complex data obtained by the time-of-flight measurement of the light dominated by the indirect components in pixel 23 can, for example, be subtracted from the complex data obtained from both direct and indirect components in pixel 22 to form new complex data NC. If the indirect component contributions to the complex data are equal in pixels 22 and 23, the resulting complex data NC only contains information from the direct component. Even if pixel 23 still receives a small direct component due to the limited contrast of the patterning means 30, the resulting complex data NC will have a smaller amplitude but will have the correct phase representing the time of flight of the direct component.
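  • The following Python sketch illustrates this subtraction on synthetic numbers; the variable names and values are assumptions made here for illustration, not the patent's notation. Each pixel of the sensor is represented by a complex value whose phase encodes the time of flight:

        import numpy as np

        # Synthetic direct and indirect (multipath) components.
        direct = 1.0 * np.exp(1j * 0.8)    # direct component, phase 0.8 rad
        indirect = 0.3 * np.exp(1j * 2.0)  # indirect component, phase 2.0 rad

        c_22 = direct + indirect  # pixel 22: direct and indirect light
        c_23 = indirect           # pixel 23: indirect-dominated light

        nc = c_22 - c_23     # new complex data NC
        print(np.angle(nc))  # 0.8 rad: the phase of the direct component alone

        # With limited pattern contrast, pixel 23 leaks a small direct part;
        # NC then has a smaller amplitude but still the correct phase.
        nc_leaky = c_22 - (indirect + 0.1 * direct)
        print(np.angle(nc_leaky), abs(nc_leaky))  # 0.8 rad, amplitude 0.9
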
  • Figure 5 illustrates an intermediary depth map 29 of an object of the scene and the associated pixels of the TOF camera system.
  • Pixels 40 measure only indirect components and are associated with greater depths and with the peaks 33, whereas pixels 41 measure both direct and indirect components and are associated with the areas 34 of the depth map 29.
  • The identification of the pixels corresponding to the areas 31 can also be done using a signal-intensity map, where these pixels will have a lower intensity due to the missing direct components.
  • Confidence maps or noise maps can also be used to identify the pixels associated with the areas 31, for instance as in the sketch below.
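  • A minimal sketch of this identification step (the function name and the use of a fixed threshold are illustrative assumptions, not the patent's implementation):

        import numpy as np

        def indirect_only_mask(intensity_map: np.ndarray, threshold: float) -> np.ndarray:
            """Flag pixels lying behind the dark parts of the pattern: the missing
            direct component leaves them with a low measured intensity."""
            return intensity_map < threshold
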
  • In other words, the complex value measured by the pixel 23 can be subtracted from the complex value measured by the pixel 22 to form a new complex value NC.
  • An indirect component function can be constructed from the samples taken on the areas 31. This indirect component function can then be interpolated for all pixels having both direct and indirect components and subtracted from these pixels, leaving only the direct components.
  • The value associated with the indirect components is a continuous function which can easily be sampled by all the areas 31, because the indirect components originate from a Lambertian reflectance of the scene 24.
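  • A Python sketch of this sampling-and-interpolation step, assuming (as stated above) that the indirect component varies smoothly over the scene; the helper name and the use of SciPy's griddata are illustrative choices, not the patent's implementation:

        import numpy as np
        from scipy.interpolate import griddata

        def remove_indirect(complex_map: np.ndarray, indirect_mask: np.ndarray) -> np.ndarray:
            """Subtract an interpolated indirect-component function from every pixel.

            complex_map   -- per-pixel complex TOF data (direct + indirect components)
            indirect_mask -- True where only indirect light is measured (areas 31)
            """
            h, w = complex_map.shape
            yy, xx = np.mgrid[0:h, 0:w]
            points = np.column_stack([yy[indirect_mask], xx[indirect_mask]])
            values = complex_map[indirect_mask]
            # Interpolate the smooth indirect function over the whole frame, falling
            # back to nearest-neighbour outside the hull of the sampled areas.
            indirect = griddata(points, values, (yy, xx), method="linear")
            nearest = griddata(points, values, (yy, xx), method="nearest")
            indirect = np.where(np.isnan(indirect), nearest, indirect)
            return complex_map - indirect
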
  • The scene 40 of Figure 6 comprises, for instance, a first wall 43 with a door and a second wall 41 on which a cupboard 42, with a given depth, is mounted.
  • Indirect reflections originating from the cupboard 42 or from the wall 43 do not lead to similar measurements.
  • Hence, different spatial zones 45, 46 of the light pattern on the scene 40 can be determined.
  • Different shapes have been used in Figure 6, but it should be understood that both light sub-patterns 45 and 46 originate from the same patterning means 30 placed in front of the illumination unit 20 of the TOF camera system 10.
  • In this case, the scene is first segmented using the available depth data or any additional data useful for segmenting the scene.
  • A continuous function can then be associated with the indirect components, which can be sampled by the areas 31 belonging to each segment, respectively.
  • This indirect component function linked to each segment can then be used to compensate for the unwanted indirect components present in the pixels having both direct and indirect components, as sketched below.
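  • A sketch of this segment-wise variant, reusing the illustrative remove_indirect helper from above and assuming a label image produced by any segmentation of the depth data:

        import numpy as np

        def remove_indirect_per_segment(complex_map, indirect_mask, labels):
            """Compensate for multipath separately within each scene segment."""
            corrected = complex_map.copy()
            for seg in np.unique(labels):
                region = labels == seg
                samples = indirect_mask & region
                if not samples.any():
                    continue  # no dark-pattern samples fall inside this segment
                # Interpolate the indirect function from this segment's samples only.
                cleaned = remove_indirect(np.where(region, complex_map, 0), samples)
                corrected[region] = cleaned[region]
            return corrected
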

Claims (8)

  1. Method for measuring a distance between an object of a scene (24, 40) and a time-of-flight camera system (10), and for providing a depth map of the object, the time-of-flight camera system (10) comprising an illumination unit (20), patterning means (30), an imaging sensor (21) comprising a matrix of pixels (22, 23), and image processing means (5),
    the method comprising the following steps:
    illuminating the scene with said illumination unit (20) without using the patterning means (30);
    receiving, on the pixels of the matrix of the sensor (21), the beams reflected by the scene comprising both direct and indirect components, providing the image processing means (5) with corresponding data, and generating a first intermediary depth map (27);
    the method being characterised by the following steps:
    modifying in a discrete way the illumination of said illumination unit (20) with the patterning means (30) so as to illuminate elementary areas (31, 32) of the scene with different incident intensities, respectively, in order to distinguish the direct incident light beams (25) from the indirect incident light beams (26, 28);
    receiving, on the pixels of the matrix of the sensor (21), the beams reflected by said elementary areas (31, 32), providing the image processing means (5) with corresponding data, and generating a second intermediary depth map (29);
    processing said corresponding data so as to eliminate the influence of the indirect light beams in the depth map of the object, by identifying, on an intermediary depth map (29), the elementary areas (31) on which no direct incident light beam can impinge, through the appearance of peaks in the second intermediary depth map (29) that are absent from the first intermediary depth map (27), and by subtracting the complex data obtained in some (23) of the pixels of the matrix by the time-of-flight measurement of light dominated by the indirect components from the complex data obtained in the other pixels (22) of the matrix from both direct and indirect components, so as to form new complex data.
  2. Method according to claim 1, wherein the step of modifying the illumination is a step of masking the illumination unit (20) in a discrete way so as to create elementary areas (31) on the scene on which no direct incident light beam can impinge.
  3. Method according to claim 1 or 2, wherein the step of processing said corresponding data comprises the step of identifying elementary areas (31) on which only direct incident light beams can impinge.
  4. Method according to any one of the preceding claims, further comprising the step of determining two different spatial zones (45, 46) of the elementary areas on the scene (24, 40), associated with different sets of pixels of said matrix of pixels (22, 23), respectively.
  5. Time-of-flight (TOF) camera system (10) for measuring a distance between an object of a scene (24, 40) and the TOF camera system (10), and for providing a depth map of the object, the TOF camera system comprising:
    image processing means (5);
    patterning means (30);
    an illumination unit (20) for illuminating the scene (24) with a modulated light without using the patterning means (30);
    an imaging sensor (21) comprising a matrix of pixels (22, 23) for receiving, on the pixels of the matrix of the sensor (21), the beams reflected by the scene comprising both direct and indirect components, providing the image processing means (5) with corresponding data, and generating a first intermediary depth map (27);
    characterised in that:
    the TOF camera system (10) further comprises patterning means (30) for modifying in a discrete way the illumination of said illumination unit (20) so as to illuminate elementary areas (31, 32) of the scene with different incident intensities, respectively, in order to distinguish the direct incident light beams (25) from the indirect incident light beams (26, 28);
    wherein the imaging sensor (21) is arranged to receive, on the pixels of the matrix of the sensor (21), the beams reflected by said elementary areas (31, 32), to provide the image processing means (5) with corresponding data, and to generate a second intermediary depth map (29); and
    further characterised in that the image processing means (5) are arranged to eliminate the influence of the indirect light beams in the depth map of the object when processing said corresponding data, by identifying, on an intermediary depth map (29), the elementary areas (31) on which no direct incident light beam can impinge, through the appearance of peaks in the second depth map (29) that are absent from the first depth map (27), and by subtracting the complex data obtained in some (23) of the pixels of the matrix by the time-of-flight measurement of light dominated by the indirect components from the complex data obtained in the other pixels (22) of the matrix from both direct and indirect components, so as to form new complex data.
  6. TOF camera system (10) according to claim 5, wherein the patterning means (30) are placed in front of the illumination unit (20) so as to project a light pattern (31, 32, 45, 46) onto the scene.
  7. TOF camera system (10) according to claim 5 or 6, wherein the patterning means (30) comprise a series of identical pattern groups (50).
  8. TOF camera system (10) according to any one of claims 5 and 6, wherein the patterning means (30) comprise a mask preventing direct incident beams from impinging on some of the elementary areas (31) of the scene (24, 40).
EP14171985.6A 2014-06-11 2014-06-11 TOF camera system and method for measuring a distance with the system Active EP2955544B1 (fr)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP14171985.6A EP2955544B1 (fr) 2014-06-11 2014-06-11 TOF camera system and method for measuring a distance with the system
BE2014/0578A BE1022486B1 (fr) 2014-06-11 2014-07-25 A TOF camera system and a method for measuring a distance with the system
US15/317,367 US10901090B2 (en) 2014-06-11 2015-06-11 TOF camera system and a method for measuring a distance with the system
CN201580036607.8A CN106662651B (zh) 2014-06-11 2015-06-11 TOF camera system and method for measuring a distance with the system
JP2016572252A JP2017517737A (ja) 2014-06-11 2015-06-11 TOF camera system and method for measuring a distance with the system
PCT/EP2015/063015 WO2015189311A1 (fr) 2014-06-11 2015-06-11 Time-of-flight camera system and method for measuring a distance with the system
KR1020177000302A KR102432765B1 (ko) 2014-06-11 2015-06-11 TOF camera system and method for measuring a distance with the system
JP2020195625A JP7191921B2 (ja) 2014-06-11 2020-11-26 TOF camera system and method for measuring a distance with the system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP14171985.6A EP2955544B1 (fr) 2014-06-11 2014-06-11 TOF camera system and method for measuring a distance with the system

Publications (2)

Publication Number Publication Date
EP2955544A1 EP2955544A1 (fr) 2015-12-16
EP2955544B1 true EP2955544B1 (fr) 2020-06-17

Family

ID=50927987

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14171985.6A Active EP2955544B1 (fr) TOF camera system and method for measuring a distance with the system

Country Status (7)

Country Link
US (1) US10901090B2 (fr)
EP (1) EP2955544B1 (fr)
JP (2) JP2017517737A (fr)
KR (1) KR102432765B1 (fr)
CN (1) CN106662651B (fr)
BE (1) BE1022486B1 (fr)
WO (1) WO2015189311A1 (fr)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170323429A1 (en) 2016-05-09 2017-11-09 John Peter Godbaz Multiple patterns in time-of-flight camera apparatus
US10928489B2 (en) * 2017-04-06 2021-02-23 Microsoft Technology Licensing, Llc Time of flight camera
US10598768B2 (en) * 2017-05-24 2020-03-24 Microsoft Technology Licensing, Llc Multipath mitigation for time of flight system
EP3460508A1 (fr) * 2017-09-22 2019-03-27 ams AG Semiconductor body and method for time-of-flight measurements
US10215856B1 (en) 2017-11-27 2019-02-26 Microsoft Technology Licensing, Llc Time of flight camera
US11662433B2 (en) * 2017-12-22 2023-05-30 Denso Corporation Distance measuring apparatus, recognizing apparatus, and distance measuring method
US10901087B2 (en) * 2018-01-15 2021-01-26 Microsoft Technology Licensing, Llc Time of flight camera
CN108259744B (zh) * 2018-01-24 2020-06-23 北京图森智途科技有限公司 Image acquisition control method and device, image acquisition system, and TOF camera
JP7253323B2 (ja) * 2018-02-14 2023-04-06 オムロン株式会社 Three-dimensional measurement system and three-dimensional measurement method
CN111971578A (zh) 2018-03-29 2020-11-20 松下半导体解决方案株式会社 Distance information acquisition device, multipath detection device, and multipath detection method
KR20210008023A (ko) * 2018-05-09 2021-01-20 소니 세미컨덕터 솔루션즈 가부시키가이샤 Device and method
CN109459738A (zh) * 2018-06-06 2019-03-12 杭州艾芯智能科技有限公司 Method and system for mutual interference avoidance among multiple TOF cameras
US11609313B2 (en) * 2018-07-31 2023-03-21 Waymo Llc Hybrid time-of-flight and imager module
JP2020020681A (ja) 2018-08-01 2020-02-06 ソニーセミコンダクタソリューションズ株式会社 Light source device, image sensor, and sensing module
KR102570059B1 (ko) * 2018-08-16 2023-08-23 엘지이노텍 주식회사 Sensing method and apparatus
US11353588B2 (en) 2018-11-01 2022-06-07 Waymo Llc Time-of-flight sensor with structured light illuminator
US11029149B2 (en) 2019-01-30 2021-06-08 Microsoft Technology Licensing, Llc Multipath mitigation for time of flight system
CN113728246A (zh) * 2019-04-22 2021-11-30 株式会社小糸制作所 ToF camera, vehicle lamp, and automobile
US11070757B2 (en) 2019-05-02 2021-07-20 Guangzhou Tyrafos Semiconductor Technologies Co., Ltd Image sensor with distance sensing function and operating method thereof
CN111885322B (zh) * 2019-05-02 2022-11-22 广州印芯半导体技术有限公司 Image sensor with distance sensing function and operating method thereof
US11479849B2 (en) * 2019-06-03 2022-10-25 Taiwan Semiconductor Manufacturing Company, Ltd. Physical vapor deposition chamber with target surface morphology monitor
US11644571B2 (en) 2019-07-01 2023-05-09 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
CN112213730A (zh) * 2019-07-10 2021-01-12 睿镞科技(北京)有限责任公司 Three-dimensional ranging method and device
CN110378946B (zh) 2019-07-11 2021-10-01 Oppo广东移动通信有限公司 Depth map processing method and device, and electronic device
CN112824934A (zh) * 2019-11-20 2021-05-21 深圳市光鉴科技有限公司 TOF multipath-interference removal method, system, device and medium based on a modulated light field
CN112824935B (zh) * 2019-11-20 2023-02-28 深圳市光鉴科技有限公司 Depth imaging system, method, device and medium based on a modulated light field
CN111045030B (zh) * 2019-12-18 2022-09-13 奥比中光科技集团股份有限公司 Depth measurement device and method
KR20210084752A (ko) 2019-12-27 2021-07-08 삼성전자주식회사 Electronic device including a light source and a ToF sensor, and lidar system
US20210255327A1 (en) * 2020-02-17 2021-08-19 Mediatek Inc. Emission And Reception Of Patterned Light Waves For Range Sensing
WO2022201848A1 (fr) * 2021-03-22 2022-09-29 ソニーグループ株式会社 Information processing device, information processing method, and program
CN117043547A (zh) * 2021-03-26 2023-11-10 高通股份有限公司 Mixed-mode depth imaging
RU2770153C1 (ru) * 2021-06-15 2022-04-14 Самсунг Электроникс Ко., Лтд. Method for correcting the depth-measurement error of a TOF camera
CN113665904B (zh) * 2021-09-07 2023-04-07 钟放鸿 TOF-based detection method for missing packs in carton-boxed cigarettes
US11922606B2 (en) 2021-10-04 2024-03-05 Samsung Electronics Co., Ltd. Multipass interference correction and material recognition based on patterned illumination without frame rate loss
CN113945951B (zh) * 2021-10-21 2022-07-08 浙江大学 Multipath interference suppression method for TOF depth computation, and TOF depth computation method and device
WO2023113700A1 (fr) * 2021-12-17 2023-06-22 Ams Sensors Singapore Pte. Ltd. Method for generating a depth map
JP2023172742A (ja) * 2022-05-24 2023-12-06 Toppanホールディングス株式会社 Distance image capturing device and distance image capturing method

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040213463A1 (en) * 2003-04-22 2004-10-28 Morrison Rick Lee Multiplexed, spatially encoded illumination system for determining imaging and range estimation
US9002511B1 (en) * 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
JP2008309551A (ja) * 2007-06-13 2008-12-25 Nikon Corp Shape measuring method, storage medium, and shape measuring apparatus
US20100157280A1 (en) * 2008-12-19 2010-06-24 Ambercore Software Inc. Method and system for aligning a line scan camera with a lidar scanner for real time data fusion in three dimensions
US8610808B2 (en) * 2008-12-22 2013-12-17 Koninklijke Philips N.V. Color CMOS imager with single photon counting capability
US8491135B2 (en) * 2010-01-04 2013-07-23 Microvision, Inc. Interactive projection with gesture recognition
EP2395369A1 (fr) * 2010-06-09 2011-12-14 Thomson Licensing Time-of-flight imager
US9753128B2 (en) * 2010-07-23 2017-09-05 Heptagon Micro Optics Pte. Ltd. Multi-path compensation using multiple modulation frequencies in time of flight sensor
DE102011081561A1 (de) * 2011-08-25 2013-02-28 Ifm Electronic Gmbh Time-of-flight camera system with signal path monitoring
JP2013078433A (ja) * 2011-10-03 2013-05-02 Panasonic Corp Monitoring device and program
WO2013052781A1 (fr) * 2011-10-07 2013-04-11 Massachusetts Institute Of Technology Procédé et dispositif pour déterminer des informations de profondeur relatives à une scène voulue
US9329035B2 (en) * 2011-12-12 2016-05-03 Heptagon Micro Optics Pte. Ltd. Method to compensate for errors in time-of-flight range cameras caused by multiple reflections
JP6309459B2 (ja) * 2012-02-15 2018-04-11 Heptagon Micro Optics Pte. Ltd. Time-of-flight camera with stripe lighting
US9462255B1 (en) * 2012-04-18 2016-10-04 Amazon Technologies, Inc. Projection and camera system for augmented reality environment
US9696427B2 (en) * 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US9857166B2 (en) * 2012-09-19 2018-01-02 Canon Kabushiki Kaisha Information processing apparatus and method for measuring a target object
JP6071363B2 (ja) * 2012-09-19 2017-02-01 キヤノン株式会社 Distance measuring device and method
US9069080B2 (en) * 2013-05-24 2015-06-30 Advanced Scientific Concepts, Inc. Automotive auxiliary ladar sensor
DE102013109020B4 (de) * 2013-08-21 2016-06-09 Pmdtechnologies Gmbh Stray light reference pixel
US9874638B2 (en) * 2014-03-06 2018-01-23 University Of Waikato Time of flight camera system which resolves direct and multi-path radiation components
US20170323429A1 (en) * 2016-05-09 2017-11-09 John Peter Godbaz Multiple patterns in time-of-flight camera apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
KR102432765B1 (ko) 2022-08-17
JP2021039131A (ja) 2021-03-11
JP7191921B2 (ja) 2022-12-19
US10901090B2 (en) 2021-01-26
US20170123067A1 (en) 2017-05-04
CN106662651A (zh) 2017-05-10
KR20170041681A (ko) 2017-04-17
BE1022486B1 (fr) 2016-05-03
WO2015189311A1 (fr) 2015-12-17
JP2017517737A (ja) 2017-06-29
EP2955544A1 (fr) 2015-12-16
CN106662651B (zh) 2019-08-06

Similar Documents

Publication Publication Date Title
EP2955544B1 TOF camera system and method for measuring a distance with the system
US10255682B2 Image detection system using differences in illumination conditions
US9194953B2 3D time-of-light camera and method
CN110709722B Time-of-flight camera
US9952047B2 Method and measuring instrument for target detection and/or identification
KR20160045670A Time-of-flight camera system
EP2717069A1 Method for determining and/or compensating the range offset of a distance sensor
CN110121659A System for characterising the surroundings of a vehicle
JP2010175435A Three-dimensional information detection device and three-dimensional information detection method
CN112689776A Calibrating a depth sensing array using color image data
EP3620821A1 Time-of-flight camera and method for calibrating a time-of-flight camera
JP2015175644A Ranging system, information processing device, information processing method, and program
KR101802894B1 3D image acquisition system combining TOF and structured-light methods
JP2014070936A Error pixel detection device, error pixel detection method, and error pixel detection program
CN112014858A Distance image generating device that corrects ranging anomalies
JP2011080843A Three-dimensional shape measurement system and three-dimensional shape measurement method
WO2021049490A1 Image recording device, image generation system, image recording method, and image recording program
US11610339B2 Imaging processing apparatus and method extracting a second RGB ToF feature points having a correlation between the first RGB and TOF feature points
US11037316B2 Parallax calculation apparatus, parallax calculation method, and control program of parallax calculation apparatus
JP2017191082A Bright spot image acquisition device and bright spot image acquisition method
CN112946602A Multipath error compensation method and indirect time-of-flight distance calculation device with multipath error compensation
Kinder Design and development of a ranging-imaging spectrometer

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150610

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20170522

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY DEPTHSENSING SOLUTIONS N.V.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 7/491 20060101ALI20191002BHEP

Ipc: G01S 17/89 20060101AFI20191002BHEP

Ipc: G01S 17/46 20060101ALN20191002BHEP

Ipc: G01S 7/493 20060101ALI20191002BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 17/46 20060101ALN20191021BHEP

Ipc: G01S 7/491 20060101ALI20191021BHEP

Ipc: G01S 7/493 20060101ALI20191021BHEP

Ipc: G01S 17/89 20060101AFI20191021BHEP

INTG Intention to grant announced

Effective date: 20191106

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY DEPTHSENSING SOLUTIONS N.V.

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

INTC Intention to grant announced (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 7/491 20200101ALI20200317BHEP

Ipc: G01S 7/493 20060101ALI20200317BHEP

Ipc: G01S 17/46 20060101ALN20200317BHEP

Ipc: G01S 17/89 20200101AFI20200317BHEP

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAR Information related to intention to grant a patent recorded

Free format text: ORIGINAL CODE: EPIDOSNIGR71

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

INTG Intention to grant announced

Effective date: 20200507

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014066618

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1281996

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200715

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200918

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200917

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200917

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1281996

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201019

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201017

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014066618

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

26N No opposition filed

Effective date: 20210318

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210611

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210611

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20140611

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200617

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230527

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230523

Year of fee payment: 10

Ref country code: DE

Payment date: 20230523

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230523

Year of fee payment: 10