EP2932706A1 - Image acquisition method and device - Google Patents

Image acquisition method and device

Info

Publication number
EP2932706A1
Authority
EP
European Patent Office
Prior art keywords
image
account
integration
sensor
motion sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13818323.1A
Other languages
English (en)
French (fr)
Inventor
Fabien GAVANT
Laurent Alacoque
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commissariat a l'Energie Atomique et aux Energies Alternatives CEA
Original Assignee
Commissariat a l'Energie Atomique CEA
Commissariat a l'Energie Atomique et aux Energies Alternatives CEA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Commissariat a l'Energie Atomique CEA, Commissariat a l'Energie Atomique et aux Energies Alternatives CEA
Publication of EP2932706A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/684Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
    • H04N23/6845Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time by combination of a plurality of images sequentially taken
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • This application relates generally to image capture methods and devices. It relates more particularly to so-called image stabilization methods and devices, that is to say, methods and devices intended to avoid or limit the presence of visible artifacts in the image that can result from unwanted movements of the acquisition device during a shot.
  • an embodiment provides a method in which an integration period of an image sensor is divided into several sub-periods whose durations are chosen taking into account at least one output signal of a motion sensor.
  • an intermediate image acquired by the image sensor is read, and the image sensor is reset.
  • said intermediate images are combined to form a final image.
  • the at least one output signal of the motion sensor is taken into account to effect the combination of the intermediate images.
  • the combination does not take into account intermediate images having a signal-to-noise ratio below a threshold.
  • the combination does not take into account intermediate images acquired during an integration subperiod less than a threshold.
  • an image quality index is calculated taking into account said at least one output signal of the motion sensor.
  • the quality index is taken into account to divide or not the integration period into several sub-periods.
  • the image sensor and the motion sensor are part of the same image acquisition device.
  • the motion sensor is configured to provide signals representative of movements of the image acquisition device.
  • the combination takes into account the brightness level in the intermediate images to reconstruct a final image with a wide dynamic range.
  • Another embodiment provides an image acquisition device comprising an image sensor, a motion sensor, and a circuit adapted to divide an integration period of the image sensor into several integration sub-periods whose durations are chosen taking into account at least one output signal of the motion sensor.
  • the motion sensor is configured to provide signals representative of movements of the image acquisition device.
  • the device further comprises an optical compensation device, and a circuit adapted to control the optical compensation device taking into account the at least one output signal of the motion sensor.
  • the device does not include an optical compensation device.
  • Figure 1 schematically illustrates in block form an embodiment of an image acquisition device
  • FIGS. 2A, 2B and 3 illustrate the operation of an embodiment of an image acquisition method
  • FIG. 4 schematically illustrates in the form of blocks an alternative embodiment of an image acquisition device
  • Figure 5 is a schematic perspective view illustrating an embodiment in an integrated form of an image acquisition device.
  • FIGS. 2A, 2B, 3 and 5 are not drawn to scale.
  • only those elements useful for understanding the invention have been shown and will be described.

Detailed description
  • An image acquisition device, for example a digital camera, conventionally comprises an image sensor placed behind an optical system, the whole being mounted in a protective case.
  • the acquisition device may comprise an optical stabilizer comprising a device for measuring the movements of the housing, or sensor of the movements of the housing, and an optical compensation device of these movements.
  • the motion measuring device may comprise one or more motion sensors, for example of the gyroscope, accelerometer, etc. type, and be configured to provide signals representative of movements of the housing.
  • the optical compensation device may include actuation elements configured to move the image sensor or all or part of the optical system in response to a control signal.
  • the optical compensation device can be controlled by taking into account the output signals of the motion measuring device, so that, during the image acquisition phases, the image projected on the sensor is as independent as possible of the movements of the acquisition device.
  • a problem is that, when the movements of the acquisition device have large amplitudes, and/or when the focal length of the optical system is large, the optical compensation device can reach its end stops without being able to completely compensate for the measured movements.
  • the response time of the optical compensation system may be too slow to compensate for some rapid movements of the acquisition device, or the compensation system may not be accurate enough to accurately compensate for the measured movements.
  • artifacts, and especially blur, may then be present in the output image, especially when the integration period (or integration time) of the sensor is long.
  • According to an embodiment, in an image acquisition device comprising an image sensor and a device for measuring movements of the acquisition device, an integration period of the sensor is divided into one or more integration sub-periods whose durations are chosen taking into account the output signals of the motion measuring device. More particularly, when, during an image acquisition phase, movements of the acquisition device liable to significantly impact the rendering of the image are detected, the integration of the sensor is interrupted, an intermediate image or frame already integrated on the sensor is read, and the sensor is then immediately reset to start a new integration sub-period, and so on until the sum of the integration sub-periods is equal to the target integration period.
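A minimal Python sketch of this subdivision loop (hypothetical names; a simple accumulated-displacement threshold stands in for the quality criterion that the application computes from the motion-sensor signals):

```python
def split_integration(motion_trace, dt, T, threshold):
    """Divide a target integration period T into sub-periods: a new
    sub-period starts whenever the residual displacement accumulated since
    the last readout exceeds `threshold` (a simplified stand-in for the
    quality criterion derived from the motion measurements)."""
    n = round(T / dt)                  # number of motion samples within T
    sub_periods, acc, samples = [], 0.0, 0
    for v in motion_trace[:n]:
        acc += v * dt                  # accumulated residual displacement
        samples += 1
        if abs(acc) > threshold:       # motion would degrade the image:
            sub_periods.append(samples * dt)   # read out, reset the sensor
            acc, samples = 0.0, 0
    if samples:
        sub_periods.append(samples * dt)       # last sub-period ends at T
    return sub_periods
```

With no measured motion the function returns a single sub-period equal to T, matching the remark later in the description that a motionless device yields an undivided integration period.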
  • the intermediate images can be combined or accumulated by taking into account the data provided by the motion measuring device, so as to reconstruct a final image of higher sharpness (and equivalent brightness level) than the image that would have been obtained if the integration of the sensor had been performed at one time.
  • FIG. 1 schematically illustrates, in the form of blocks, an example of an embodiment of an image acquisition device 100.
  • the device 100 comprises an image sensor 101 (IMG), which can be mounted in a protective case (not shown), for example behind an optical system (not shown).
  • An image-supplying output of the sensor 101 is connected to a memory 103 (MEM) of the device 100, in which image data acquired by the sensor can be stored, for example for digital processing and/or while waiting to be recorded on another storage medium (not shown).
  • The device 100 further includes an image stabilization system.
  • the stabilization system comprises an optical stabilizer of the type mentioned above, that is to say comprising a device 105 (MS) adapted to measure movements of the device 100 (independently of possible movements of all or part of the scene seen by the sensor), and a device 107 (MC) for optical compensation of these movements.
  • the stabilization system comprises a calculation and control circuit 109 (CPU), for example a microcontroller, configured to receive output signals from the device 105, and to control the device 107 accordingly, so that the projected image on the sensor 101 is as independent as possible from the movements of the device 100.
  • the stabilization system further comprises a memory zone 113 (PSF), which may be distinct from the memory 103 or included in it, in which the circuit 109 can store data relating to the movements of the device 100.
  • the circuit 109 is further adapted to supply control signals to the image sensor 101, and to read and write to the memory 103.
  • FIG. 2A is a timing diagram schematically showing the evolution as a function of time, during an image acquisition phase, of the equivalent position Px of the device 100 after compensation of the movements of the device 100 by the device 107.
  • the curve Px of FIG. 2A does not represent all the displacements of the device 100 during the integration phase, but only the part of these displacements that is not compensated by the device 107, for example because of their excessive amplitude, or because they are movements too fast to compensate.
  • the curve Px can be obtained by comparing the output signals of the motion measuring device 105 with the control signals supplied to the compensation device 107, possibly taking into account the time response of the compensation device 107, or by means of displacement sensors of the compensation device itself.
  • it is assumed here that the image acquisition device 100 moves only in translation, and in only one direction of the image plane of the sensor.
  • the operating modes described are however compatible with more complex movements of the device 100, provided that these movements can be measured by the device 105.
  • a target integration period T of the sensor is chosen, for example automatically, taking into account the ambient light conditions, or by manual parameterization by the user.
  • At a time t0, the integration of the sensor 101 begins. From time t0 until the end of the image acquisition phase, the device 105 continuously measures the movements of the device 100 and transmits motion data to the circuit 109 which, in response, controls the optical compensation device 107 so that the image projected on the sensor 101 is as independent as possible of the movements of the device 100. In parallel, the circuit 109 determines the equivalent residual movements or displacements of the device 100, that is to say the movements of the device 100 not compensated by the device 107 (signal Px).
  • When the circuit 109 detects that the residual displacements of the device 100 are likely to cause a significant degradation of the rendering of the final image, it commands the interruption of the integration of the sensor 101, and an intermediate image is read and recorded in the memory 103. This marks the end of a first integration sub-period τ1 of the sensor. The sensor is then immediately reset and a second integration sub-period τ2 begins, and so on until the sum of the integration sub-periods is equal to the target integration period T. In the example shown, the period T is divided into four successive sub-periods τ1, τ2, τ3 and τ4, that is to say that four intermediate images are read during the image acquisition phase.
  • data relating to the residual displacements of the device 100 can be recorded in the memory zone 113.
  • the intermediate images are combined to reconstruct a final image sharper than the image that would have been obtained if the sensor integration had been performed at one time.
  • Taking into account the residual displacement data of the device 100 determined by the circuit 109, the intermediate images may, for example, be shifted relative to one another before being added, so as to compensate at least in part for these residual displacements.
  • other methods of estimating the residual displacements and of recombining the intermediate images can be used, for example a method using convolution techniques to match blocks of pixels representative of the same part of the scene.
  • the reconstruction of the final image can be fully performed after reading the last intermediate image. However, to minimize the amount of memory required for storing intermediate images, a partial reconstruction can be performed after each intermediate reading.
  • a first intermediate image is read at the end of the integration sub-period τ1, and is stored in the memory 103.
  • a second intermediate image is read and is directly combined with the first intermediate image, taking into account the residual displacements of the device 100 during the sub-period τ2.
  • a third intermediate image is read and is directly combined with the partially reconstructed image contained in the memory 103, taking into account the residual displacements of the device 100 during the sub-period τ3.
  • a fourth intermediate image is read and is directly combined with the already partially reconstructed image contained in the memory 103, taking into account the residual displacements of the device 100 during the sub-period τ4.
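As an illustration of this running recombination, here is a minimal Python sketch (hypothetical names; displacements are assumed to be whole pixels, whereas a real implementation would interpolate sub-pixel shifts):

```python
def accumulate(frames, shifts, h, w):
    """Running recombination of intermediate images: each frame is shifted
    back by the residual displacement (dy, dx) measured during its
    sub-period and added into a single accumulator, so only one full-size
    image needs to be kept in memory."""
    acc = [[0.0] * w for _ in range(h)]
    for frame, (dy, dx) in zip(frames, shifts):
        for y in range(h):
            for x in range(w):
                sy, sx = y + dy, x + dx   # source pixel before compensation
                if 0 <= sy < h and 0 <= sx < w:
                    acc[y][x] += frame[sy][sx]
    return acc
```

A bright point that drifted across the sensor during the integration is thus re-stacked onto a single pixel instead of leaving a blur trail.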
  • the circuit 109 can calculate, from the residual displacement data, the point spread matrix or function of the device 100, that is to say the deformation, caused by the residual displacements of the device 100, of a scene chosen so as to illuminate, in the absence of residual displacements, only one pixel of the sensor 101.
  • the point spread function can also be used to reconstruct the final image.
  • the circuit 109 calculates, taking into account the residual displacement data of the device 100, for example from the point spread function, a quality index JND of the image being acquired. This index can be used as a criterion by the circuit 109 to decide whether the integration of the sensor should be interrupted or whether it can be continued.
  • FIG. 2B represents the evolution as a function of time, during the image acquisition phase of FIG. 2A, of the quality index JND calculated by the circuit 109.
  • the index JND is fixed at a reference value, for example zero.
  • the circuit 109 recalculates the quality index JND taking into account the residual movements of the device 100.
  • when the index JND reaches a low threshold JNDmin (lower than the reference level set at time t0), the integration of the sensor is interrupted, an intermediate image is read, and a new integration sub-period starts.
  • the index JND is then reset to its initial value (zero in this example).
  • the threshold JNDmin defines a quality level setpoint required for each intermediate image. For a given sequence of movements during the integration phase, the higher the threshold JNDmin, the greater the number of integration sub-periods needed to meet this setpoint, and vice versa. The quality of the final image obtained by combining the intermediate images depends on the setpoint JNDmin.
  • the JND quality index is a perceptual quality index calculated from the point spread function by the method described in the article "A Perceptual Image Quality Assessment Metric That Handles Arbitrary Motion Blur" by Fabien Gavant et al. (Proc. SPIE 8293, Image Quality and System Performance IX, 829314, January 24, 2012).
  • the coordinates of the center of gravity of the matrix are calculated, then each coefficient of the matrix is weighted by its distance to the center of gravity, and the weighted coefficients are summed to obtain a standard deviation E.
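The spread computation just described can be sketched as follows (a minimal illustration; the function name and the assumption that the PSF matrix is given as a normalised list of rows are mine, not the application's):

```python
import math

def spread_indicator(psf):
    """Spread indicator E from the point spread matrix: compute the centre
    of gravity of the matrix, weight each coefficient by its distance to
    that centre, and sum the weighted coefficients."""
    total = sum(v for row in psf for v in row)
    cy = sum(y * v for y, row in enumerate(psf) for v in row) / total
    cx = sum(x * v for row in psf for x, v in enumerate(row)) / total
    # each coefficient weighted by its Euclidean distance to the centre
    return sum(v * math.hypot(y - cy, x - cx)
               for y, row in enumerate(psf)
               for x, v in enumerate(row))
```

A PSF concentrated on a single pixel gives E = 0 (no blur), while any spreading of the energy away from the centre of gravity increases E.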
  • FIG. 3 represents the acquisition of an image by a method of the type described with reference to FIGS. 1, 2A and 2B.
  • an integration period T' is divided into nine successive integration sub-periods referenced respectively τ1' to τ9'.
  • the sub-periods τ3', τ4', τ5' and τ8' are significantly shorter than the others, which means that, during these sub-periods, movements of the device 100 resulted in a rapid degradation of the quality of the image being acquired.
  • the corresponding intermediate images (hatched in FIG. 3) are consequently relatively noisy.
  • the integration of the sensor can be extended until the sum of the integration sub-periods actually taken into account in the construction of the final image is equal to or close to the integration period T'.
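This exclusion of noisy intermediate images can be sketched as follows (hypothetical names and selection rule; the minimum-duration criterion stands in for the signal-to-noise threshold mentioned earlier, and the returned deficit is the extra integration time a controller would need to schedule):

```python
def select_frames(durations, min_duration, target_T):
    """Discard sub-periods shorter than `min_duration` (their intermediate
    images are assumed too noisy to combine) and report how much extra
    integration time is needed for the retained sub-periods to add up to
    the target period `target_T`."""
    kept = [d for d in durations if d >= min_duration]
    deficit = max(0.0, target_T - sum(kept))
    return kept, deficit
```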
  • FIG. 4 schematically illustrates, in the form of blocks, an example of an alternative embodiment of an image acquisition device 400.
  • the device 400 comprises the same elements as the device 100, with the exception of the optical compensation device 107.
  • the acquisition device 400 does not include an optical stabilizer, but only a device 105 for measuring the movements of the acquisition device.
  • the operating modes described with reference to FIGS. 1, 2A, 2B and 3 are compatible with the device 400, with the difference that, where the device 100 takes into account its equivalent residual movements after optical compensation by the device 107, the device 400 directly takes into account the movements actually measured by the device 105.
  • An advantage of the device 400 is that it does not include an optical compensation device, which reduces its cost, weight and bulk.
  • FIG. 5 very schematically illustrates an embodiment in an integrated form of an image acquisition device 500 of the type described with reference to FIGS. 1 to 4.
  • the device 500 is made using a semiconductor chip stacking technology, or 3D technology.
  • An image sensor 501 comprising a matrix of photodiodes 502 is made in a first level of stacking. The photodiodes 502 occupy for example the entire surface of the stack so as to capture as much light as possible.
  • a memory 503, adapted to contain at least one image acquired by the sensor 501, is made in a second stacking level, under the sensor 501. Under the memory 503, a control circuit 509 is made in a third stacking level.
  • the device 500 further comprises a motion measuring device 505 comprising for example a gyroscope.
  • the device 505 can be integrated into one of the aforementioned stacking levels.
  • the device 505 can be made in MEMS (Micro-Electro-Mechanical Systems) technology.
  • the device 500 may further comprise an optical compensation device (not shown), for example comprising a liquid lens having an electrically controllable shape. An optical stabilization can thus be achieved by controlling the lens according to the motion information measured by the device 505, while maintaining a high level of integration.
  • An advantage of the device 500 of Figure 5 lies in its small size and low weight.
  • An advantage of the embodiments described with reference to FIGS. 1 to 4 is that they make it possible to obtain a sharp image regardless of the amplitude and speed of the movements of the image acquisition device, the focal length of the optical system, and the integration time of the sensor.
  • the segmentation of the integration period of the sensor occurs only when movements likely to affect the quality of the image being acquired are detected.
  • when no such movements are detected, the integration period of the sensor is not divided, and the final image is obtained directly, without a step of combining intermediate images (that is to say that, in this case, the integration period is divided into a single integration sub-period). This makes it possible to avoid introducing noise into the final image by segmenting the integration period unnecessarily when the acquisition device does not move.
  • the embodiments described are not limited to the particular examples of image acquisition devices described with reference to FIGS. 1, 4 and 5. More generally, the image stabilization method described in connection with FIGS. 1 to 5 may be implemented in any image acquisition device comprising at least one image sensor and a device for measuring the movements of the acquisition device.
  • so-called high dynamic range image acquisition methods may comprise successive acquisitions of several images of the same scene with different integration times, and the reconstruction, from these images, of a final image of homogeneous brightness level, having a large dynamic range.
  • Such methods make it possible in particular to limit the phenomena of overexposure or under-exposure when the scene to be acquired is strongly contrasted.
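A minimal sketch of such a wide-dynamic-range combination from intermediate images with different integration times (hypothetical names; pixels are assumed to saturate at 255 and frames are flattened to 1-D lists for brevity):

```python
def merge_hdr(frames, times, sat=255.0):
    """Sketch of wide-dynamic-range reconstruction: each pixel value is
    normalised by its integration time to a common radiance scale, and
    saturated pixels are excluded, so that short sub-periods recover the
    highlights while long ones recover the shadows."""
    n = len(frames[0])
    out = []
    for i in range(n):
        num = den = 0.0
        for frame, t in zip(frames, times):
            v = frame[i]
            if v < sat:               # ignore clipped (overexposed) pixels
                num += v / t
                den += 1.0
        out.append(num / den if den else sat)
    return out
```

A pixel that clips in the longest sub-period is still estimated correctly from the shorter ones, which is how the combination limits overexposure in strongly contrasted scenes.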

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
EP13818323.1A 2012-12-17 2013-12-17 Bilderfassungsverfahren und -vorrichtung Withdrawn EP2932706A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1262133A FR2999735B1 (fr) 2012-12-17 2012-12-17 Procede et dispositif d'acquisition d'image
PCT/FR2013/053114 WO2014096670A1 (fr) 2012-12-17 2013-12-17 Procede et dispositif d'acquisition d'image

Publications (1)

Publication Number Publication Date
EP2932706A1 (de) 2015-10-21

Family

ID=48468405

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13818323.1A Withdrawn EP2932706A1 (de) 2012-12-17 2013-12-17 Bilderfassungsverfahren und -vorrichtung

Country Status (4)

Country Link
US (1) US9860449B2 (de)
EP (1) EP2932706A1 (de)
FR (1) FR2999735B1 (de)
WO (1) WO2014096670A1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017211677A1 (de) * 2017-07-07 2019-01-10 Siemens Healthcare Gmbh Bewegungsabhängige Rekonstruktion von Magnetresonanzabbildungen
JP7258465B2 (ja) * 2018-01-12 2023-04-17 キヤノン株式会社 撮像装置およびその制御方法
US11704777B2 (en) * 2021-08-27 2023-07-18 Raytheon Company Arbitrary motion smear modeling and removal

Citations (2)

Publication number Priority date Publication date Assignee Title
US20100295953A1 (en) * 2009-05-21 2010-11-25 Canon Kabushiki Kaisha Image processing apparatus and method thereof
US20120086822A1 (en) * 2010-04-13 2012-04-12 Yasunori Ishii Blur correction device and blur correction method

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US6778210B1 (en) * 1999-07-15 2004-08-17 Olympus Optical Co., Ltd. Image pickup apparatus with blur compensation
JP4511766B2 (ja) 2000-07-10 2010-07-28 株式会社リコー 撮影装置および撮影装置における振れ補正方法
US7773115B2 (en) * 2004-12-15 2010-08-10 Texas Instruments Incorporated Method and system for deblurring digital camera images using reference image and motion estimation
JP2006270657A (ja) * 2005-03-25 2006-10-05 Casio Comput Co Ltd 撮像装置、及び固体撮像素子、固体撮像素子の駆動方法
JP4789767B2 (ja) * 2006-09-28 2011-10-12 キヤノン株式会社 撮像装置及びその制御方法
JP4488041B2 (ja) * 2007-08-02 2010-06-23 ソニー株式会社 像ぶれ補正装置及び撮像装置
WO2010123923A1 (en) * 2009-04-23 2010-10-28 Zoran Corporation Multiple exposure high dynamic range image capture
JP5300590B2 (ja) * 2009-05-21 2013-09-25 キヤノン株式会社 画像処理装置およびその方法
JP5797003B2 (ja) * 2011-05-10 2015-10-21 キヤノン株式会社 撮像装置及びその制御方法、プログラム、並びに記憶媒体


Non-Patent Citations (1)

Title
See also references of WO2014096670A1 *

Also Published As

Publication number Publication date
US9860449B2 (en) 2018-01-02
FR2999735B1 (fr) 2018-08-31
WO2014096670A1 (fr) 2014-06-26
US20150334304A1 (en) 2015-11-19
FR2999735A1 (fr) 2014-06-20

Similar Documents

Publication Publication Date Title
JP5198192B2 (ja) 映像復元装置および方法
JP5709911B2 (ja) 画像処理方法、画像処理装置、画像処理プログラムおよび撮像装置
WO2015135394A1 (en) Image acquisition method and image acquisition apparatus
KR101847392B1 (ko) 화상처리장치 및 그 제어 방법
EP2780763B1 (de) Verfahren und systeme zur aufnahme von bildfolgen mit kompensation der vergrösserungsariationen
FR3027143A1 (fr) Appareil mobile, notamment drone a voilure tournante, muni d'une camera video delivrant des sequences d'images corrigees dynamiquement de l'effet "wobble"
EP3114831B1 (de) Optimierte videorauschunterdrückung für heterogenes multisensorsystem
EP2932706A1 (de) Bilderfassungsverfahren und -vorrichtung
FR3062009B1 (fr) Generation adaptative d’une image a grande gamme dynamique d’une scene, a partir d’une pluralite d’images obtenues par lecture non destructive d’un capteur d’image.
FR2996034A1 (fr) Procede pour creer des images a gamme dynamique etendue en imagerie fixe et video, et dispositif d'imagerie implementant le procede.
FR3030791A1 (de)
KR101862643B1 (ko) 제어장치, 화상처리장치, 렌즈 장치, 화상처리 시스템, 제어 방법 및 화상처리방법
FR3057095A1 (fr) Procede de construction d'une carte de profondeur d'une scene et/ou d'une image entierement focalisee
JP2015109681A (ja) 画像処理方法、画像処理装置、画像処理プログラムおよび撮像装置
FR3023957A1 (fr) Dispositif de detection de mouvement
FR3066271B1 (fr) Capteur de mouvement et capteur d'images
JP2014064214A5 (de)
EP4066483B1 (de) Bilderfassung anhand von strahlungsempfindlichen elementen mit trägheitseffekt
CA2955368C (fr) Procede et dispositif de traitement de mouvements de hautes frequences dans un systeme optronique
BE1029415A1 (fr) Un système et un procédé de mesure de réactions d’un sujet, un programme d’ordinateur et un support lisible par ordinateur
WO2023036825A1 (fr) Dispositif de mesure des erreurs angulaires d'inclinaison d'un axe réel de rotation d'un élément rotatif et procédé
FR3123734A1 (fr) Procédé de traitement de données de pixels, dispositif et programme correspondant
WO2014072348A1 (fr) Alignement automatique des cameras dans un systeme 3d
FR3030086A1 (fr) Controle de l'affichage d'une image representative d'un objet capture par un dispositif d'acquisition d'images
FR3028611A1 (fr) Dispositif et procede de determination de positions de points dans un environnement tridimensionnel, dispositif de detection d'obstacles et vehicule equipe d'un tel dispositif

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150602

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20190723

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20191203