WO2011064379A1 - Système d'aide à la production d'images stéréoscopiques - Google Patents

Système d'aide à la production d'images stéréoscopiques

Info

Publication number
WO2011064379A1
WO2011064379A1 (application PCT/EP2010/068467)
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
viewing
optics
images
Prior art date
Application number
PCT/EP2010/068467
Other languages
English (en)
Inventor
Jacques Delacoux
Original Assignee
Transvideo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Transvideo filed Critical Transvideo
Priority to EP10785406A priority Critical patent/EP2508004A1/fr
Publication of WO2011064379A1 publication Critical patent/WO2011064379A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof

Definitions

  • The invention relates to the field of technical means for producing multidimensional images, commonly called 3D or stereoscopic images, which aim to recreate the sensation of relief specific to human vision.
  • Adjustment differences between the optics of capture systems cause anomalies in the viewer's perception of relief. The aim is therefore to produce images that are as perfect, or as good, as possible at the time of shooting.
  • The invention first relates to a device for processing at least two individual digital images, each obtained by a respective individual-image acquisition means having its own optics, this device including means for calculating, from the data of the at least two individual digital images:
  • focusing data of each acquisition means, for example the distance or the average distance at which each acquisition means is focused.
  • Each individual image may be obtained by a digital image acquisition means which includes optics and which is positioned in space differently from the acquisition means of another individual image.
  • Viewing means may allow display of an image resulting from the combination of individual images, in order to form a so-called « three-dimensional » image.
  • A device according to the invention may be associated with means for providing a user with 3D vision.
  • A device according to the invention is therefore notably designed to be used with a device for producing multidimensional or stereoscopic images, aiming to recreate the sensation of relief specific to human vision.
  • The invention therefore notably relates to a system consisting of one or several modules, with which:
  • - focusing and/or zoom (focal length) and/or iris (aperture) adjustment deviations may be diagnosed immediately.
  • Each diagnostic module may be used individually or in combination with one or several other ones.
  • The invention therefore notably relates to a device for processing digital images including at least one module capable of generating, in real time, graphical indications for assessing the quality of the images and certain disparities or anomalies which may occur during their capture by a system intended to create 3D images. Means may further be provided for correcting these anomalies on the basis of these graphical indications.
  • A device according to the invention may include means for calculating, from digital image data, focused-area data of each acquisition means and/or a difference between the focused-area data of at least two acquisition means; these means may themselves include at least one filter making it possible to discriminate the contours of the objects which exhibit the focusing characteristics.
  • With such a device it is possible, in an embodiment, to produce at least one portion of a contour of one or more focused areas of a 3D image, this contour portion being visually different from at least one contour portion of one or more areas of the same image which are not in focus.
  • A first type of contour may therefore be produced for one or more focused areas of an image, and a second type of contour, different from the first, for one or more other areas of the image.
  • The first type of contour may be in a first colour and the second type of contour in a second colour (a minimal sketch of such processing is given below).
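  • By way of illustration only, the following minimal sketch (in Python with OpenCV and NumPy, a toolchain the patent does not prescribe; the Laplacian-based sharpness measure, the threshold and the colour choices are assumptions) discriminates contours by local sharpness and draws the contours of focused areas in a first colour and the remaining contours in a second colour.

```python
# Minimal sketch (not the patented implementation): colour the contours of
# "focused" areas differently from the contours of out-of-focus areas.
# Assumptions: OpenCV/NumPy, a Laplacian-based sharpness measure and an
# arbitrary threshold.
import cv2
import numpy as np

def focus_contour_overlay(image_bgr, sharpness_threshold=120.0):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Edge strength: absolute Laplacian response, smoothed so that whole
    # regions (not isolated pixels) are classified as focused or not.
    edges = np.abs(cv2.Laplacian(gray, cv2.CV_64F))
    sharpness = cv2.GaussianBlur(edges, (15, 15), 0)

    focused_mask = (sharpness >= sharpness_threshold).astype(np.uint8) * 255
    blurred_mask = cv2.bitwise_not(focused_mask)

    overlay = image_bgr.copy()
    for mask, colour, thickness in (
        (focused_mask, (0, 255, 0), 2),   # first type of contour: green, thick
        (blurred_mask, (0, 0, 255), 1),   # second type of contour: red, thin
    ):
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(overlay, contours, -1, colour, thickness)
    return overlay
```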
  • A device further includes means for forming, on the viewing means, or on or in an image formed by these viewing means, vertical and/or horizontal lines, or lines along at least a first and/or a second direction on the screen, superimposed on said image, these lines forming what is called a grid, or viewing grid.
  • Such a device may then include means for displaying portions of lines of a first type and portions of lines of a second type depending on the characteristics of the portions of the image covered by these portions of lines.
  • Indications of a first type may be formed, for example on the focused areas of an image.
  • Indications of a second type may also be formed and displayed, for example relating to focused-area data and/or focal length data and/or aperture data; these indications may for example be located in the margin of the image or beside it, but still on the viewing means.
  • A device may include means for forming, on the viewing means, or on or in an image formed by these means, or close to such an image, histogram data for at least one of the images obtained by the digital image acquisition means.
  • A device may also include means for forming, on the viewing means, or on or in an image formed by these means, or close to such an image, at least one mark or reference mark or scale, together with cursor-forming means on this mark, in order to represent, for example as a function of the distance to the corresponding image acquisition means, differences between the focused-area data of the images obtained by the digital image acquisition means and/or differences between the focal length data and/or the aperture data of the digital image acquisition means.
  • The invention also relates to a device for acquiring, processing and viewing images, including: - at least two means for acquiring digital images.
  • A device according to the invention may be combined with means allowing an observer to view three-dimensional images.
  • The invention also relates to a method for processing synchronous and simultaneous digital images of the same scene, these images being obtained by at least two digital image acquisition means, each including optics and positioned differently in space, the method including the determination, from digital image data, of focusing area data and/or focusing data and/or focal length (zoom) data and/or aperture (iris) data of each optics, and/or histogram data of each image.
  • A distance between at least two acquisition means and/or a relative position of the optical axes of at least two acquisition means may then be adjusted. It is also possible to adjust one or several parameters of the system in order to correct defects or deficiencies in the viewed data.
  • a method according to the invention may therefore also include a correction of at least one of the means for acquiring digital images, or of its optics, depending on the displayed focusing area data of each acquisition means and/or on displayed focusing data of each optics and/or on displayed focal length data (zoom) of each optics and/or on displayed aperture (iris) data of each optics and/or on displayed histogram data of each image.
  • data of focused areas or adjusted areas of each acquisition means and/or a difference between data of focused areas of at least two acquisition means are calculated from digital image data and viewed.
  • Parallel lines may advantageously be formed, superposed on an image formed on these viewing means. Portions of lines of a first type and portions of lines of a second type may then be displayed, depending on the characteristics of the portions of the image covered by these portions of lines.
  • histogram data may further be formed for at least one of the images obtained by the digital image acquisition means.
  • At least one mark or reference mark or scale and cursor-forming means on this mark or scale are formed in order to represent, for example as a function of the distance to the corresponding image acquisition means, differences between data of focused areas of images obtained by the digital image acquisition means and/or differences between focal length data and/or aperture data of the digital image acquisition means.
  • Figs. 1A and 1B illustrate two embodiments of the invention.
  • Fig. 2 illustrates the display of an image obtained by a system according to the invention.
  • Fig. 3 illustrates in more detail two cameras of the system according to the invention.
  • Fig. 4 illustrates in more detail a screen for displaying an image according to the invention.
  • Fig. 5 illustrates another embodiment of the invention.
  • Fig. 6 illustrates an exemplary image obtained with a device according to the invention.
  • Fig. 7 illustrates a screen with a viewing grid superposed on the field of the screen.
  • Fig. 8 illustrates an exemplary embodiment of a dual cursor for displaying data differences between the images obtained by two viewing means.
  • Fig. 9 illustrates an example of images displayed on a viewing screen, on which so-called « focused » areas appear in a way which is different from the other areas.
  • The present document describes an application to a two-camera system (or, more generally, a system with two image sensors, or two digital image acquisition means).
  • The teaching of the invention may be generalized without any difficulty to a system equipped with more than two cameras or, more generally, more than two image sensors or digital image acquisition means.
  • These different means for acquiring digital images, in any number n (n > 2), are physically separated from each other but positioned in an adjustable way relative to each other. Each of them may produce an individual image of the same object 2 (see Figs. 2, 4, 5) or of the same scene, the different images being acquired simultaneously and in a synchronized way, each image notably depending on the position and on the characteristics of the image acquisition means which generates it. Each individual image thereby obtained is then combined with the other individual images in order to make an image having the characteristics of a so-called three-dimensional or « 3D » image, or which may be perceived as such by a user.
  • a first embodiment of the invention includes two image sensors 1, 3 (for example cameras) equipped with their respective optics 10, 30.
  • Each camera allows acquisition of digital images. These images may then be processed as explained below.
  • Both cameras are associated with means 5 coupling them in space, for example substantially in parallel so that their viewing axes are parallel with each other. Alternatively, their optical axes intersect at a convergence point C, as in Fig. 3.
  • At least one camera, or each camera, is rotatably mounted around at least one axis of rotation, for example locally perpendicular to a lateral displacement direction of this camera, or substantially coincident with, or in the vicinity of, this displacement direction.
  • The means 5 may include a rectilinear or curved bar on which the cameras are mounted, for example in a way adjustable along the bar, in order to be able to view an object 2.
  • Fig. 3 illustrates both cameras 1, 3 with a variable spacing d between them, but also with a variable angle a between them.
  • Point C represents the convergence point of the optical axes of both cameras 1, 3. This adjustment of the angle may be obtained by means for orienting the cameras (a short numerical sketch of this geometry is given below).
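  • For orientation only, the following short sketch (Python; the symmetric toe-in assumption and the variable names are not taken from the patent) relates the spacing d and the convergence angle a of Fig. 3 to the distance of the convergence point C.

```python
# Minimal geometric sketch, assuming a symmetric toe-in rig: each optical
# axis is tilted by a/2 towards the other, so the axes meet at point C at
# distance (d/2) / tan(a/2) in front of the midpoint of the baseline.
import math

def convergence_distance(d_meters, a_degrees):
    """Distance from the middle of the camera baseline to convergence point C."""
    half_angle = math.radians(a_degrees) / 2.0
    if half_angle <= 0:
        return float("inf")  # parallel axes: no finite convergence point
    return (d_meters / 2.0) / math.tan(half_angle)

# Example: two cameras 65 mm apart converging with a total angle of 2 degrees
# meet at roughly 1.86 m.
print(round(convergence_distance(0.065, 2.0), 2))
```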
  • Means such as the means 5 of Fig. 1A may make it possible to vary the distance between both cameras, but also the viewing angle of each camera relative to the other. Means are then provided for keeping the cameras fixed in a determined position once that position is reached. For example, each camera is associated with a system including one or several blocking screws for blocking its translational displacement and one or several blocking screws for blocking its rotational displacement.
  • both cameras may be positioned along different directions by being mounted on a frame 5' including the apertures corresponding to the objectives 10, 30 of the cameras.
  • a semi-transparent mirror 15 allows each camera 1, 3 to receive part of the radiation of an incident image which propagates along an optical axis 17.
  • Both image capture devices 1, 3 are synchronous with each other and simultaneously produce images which correspond to those which would be seen by the left eye and by the right eye of an observer, with a gap and a convergence that vary depending on the desired effect.
  • Each image will have characteristics which depend on the position of the camera which generates it.
  • different images may be displayed on a same viewing device, so as to create a sensation of relief for a spectator.
  • The latter may be provided with means delivering different content to his/her right eye and left eye, for example through specific spectacles 6 or by direct viewing, the human brain recombining the images to form what is commonly called « relief » and to allow distances to be appreciated.
  • Precautions should therefore be taken during capture to ensure that the images are of optimum, or at least good, quality, allowing this « 3D » reconstruction.
  • A lack of focusing adjustment may first occur.
  • A lack of focusing adjustment of one of the two objectives 10, 30 produces a blurred or shifted image for one of the observer's eyes, creating visual discomfort similar to that due to a lack of accommodation of an eye.
  • Another type of defect is a lack of adjustment of the aperture (iris) of one of the two objectives 10, 30, which produces an overexposed or underexposed image for one of the observer's eyes, causing a more or less bothersome stroboscopic phenomenon depending on its intensity. Moreover, a difference in exposure will be associated with a difference in depth of field between the two images.
  • Still another type of defect is a lack of adjustment of the focal length (zoom) of one of the two objectives 10, 30, which produces two images of different sizes, making three-dimensional perception impossible.
  • A device may include different electronic modules for processing an image, whether associated with each other or not. Each module (or set of means) makes it possible to detect one of the aforementioned defects.
  • Certain modules may use resources common to several modules.
  • a casing 20 may accommodate one or more of these modules.
  • Each module may be made as an electronic circuit and/or a set of software means programmed for producing the corresponding function.
  • The modules may be formed by FPGA-type programmable circuits associated with microprocessors or microcontrollers. The processing may also be performed in « digital signal processors » (DSPs).
  • A first module includes a set of filters enabling discrimination of the contours of the objects which exhibit the focusing characteristics.
  • These filters are applied to each of the images which will make up the three-dimensional image.
  • The output of the transfer function TF of these filters is added to the resulting image, producing a more or less enhanced coloured contour on the so-called « focused » areas.
  • An image 19 including thick contours 19₁, which correspond to focused areas of the image, is thereby illustrated in Fig. 9, while the areas with finer contours 19₂ are considered to be out of focus.
  • A second module is a so-called « iris flavour » module, dedicated to the aperture (iris) adjustment data of both objectives.
  • A third module is a so-called « zoom Bengal » module: from digital data stemming from the objectives, the zoom adjustment data may be interpreted graphically and show the difference between both objectives. Which of the objectives has to be corrected may thus be identified immediately.
  • With a programmable alarm it is possible to define acceptance tolerances between both adjustments.
  • A fourth module is a so-called « Dual Histogram » module: with this module it is possible to statistically analyze the digital data from both cameras. These data are displayed simultaneously, superimposed on each other in coded colours allowing each source to be recognized. With this representation method it is possible, among other things, to compensate for a possible lack of digital data from the objectives.
  • In Figs. 4 and 5 are illustrated the image of an object, but also the display of a dual histogram 11, for example as a function of the distance to the respective camera or image acquisition means: one of the histograms relates to the image obtained by one (1) of the acquisition means, the other histogram relates to the image obtained by the other acquisition means (3).
  • An illustration of such a dual histogram is given in Fig. 6. Thanks to the simultaneous viewing of both histograms, differences in the black levels, in the dynamics and in the white levels of the images are easily interpreted. It is also possible to diagnose synchronization errors of the camera sensors. It is further possible to display data of the objectives (for example aperture and/or focusing and/or zoom adjustment data) in another viewing field 13 (a sketch of such a dual-histogram overlay is given below).
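  • As a rough illustration of such a « Dual Histogram » display, the sketch below (Python with OpenCV and Matplotlib, an assumed toolchain; not the patented module) computes the luminance histograms of the two camera images and draws them superimposed in coded colours, so that differences in black level, dynamics and white level become visible.

```python
# Minimal sketch of a dual-histogram overlay: the two luminance histograms
# are drawn on top of each other in coded colours so that each source
# remains recognizable.
import cv2
import matplotlib.pyplot as plt

def dual_histogram(left_bgr, right_bgr, bins=256):
    curves = []
    for img in (left_bgr, right_bgr):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [bins], [0, 256]).ravel()
        curves.append(hist / hist.sum())  # normalize so exposures compare
    plt.plot(curves[0], color="red", label="camera 1")
    plt.plot(curves[1], color="blue", label="camera 3")
    plt.xlabel("luminance level")
    plt.ylabel("fraction of pixels")
    plt.legend()
    plt.show()

# Usage (file paths are placeholders):
# dual_histogram(cv2.imread("left.png"), cv2.imread("right.png"))
```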
  • An example of such a display is given in Fig. 8: it consists of two horizontal marks or lines 130, 132 (or horizontal scales), one of these marks or lines 130 indicating the difference between the focusing distances of both objectives, and the other mark or line 132 the difference between the aperture data of the same two objectives.
  • A cursor 131, 133 on each mark or line indicates the difference in focusing distance and in aperture, respectively, with respect to a central value (marked « 0 ») for which both objectives agree (if each of the cursors is in this position, then the focusing distances of both objectives are equal, as are the aperture data).
  • the cursor 131 may be of a first colour (red for example)
  • the cursor 133 may be of a second colour (blue for example) .
  • This type of display may be achieved for other pairs of data, or for a single datum (for example, it is possible to have only the mark or scale 130, with the single cursor 131, which would indicate the differences between the focusing distances of both objectives); a text-mode sketch of such a cursor scale is given below.
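  • The following text-mode sketch (Python; the metadata values, ranges and rendering are assumptions, since the patent does not specify how lens data are encoded) positions a cursor on a scale according to the difference between the two objectives, with « 0 » in the centre meaning that both adjustments agree.

```python
# Minimal sketch of a difference scale with a cursor (not the patented
# display): the cursor sits at the centre ("0") when both objectives agree
# and moves left/right with the signed difference.
def difference_scale(value_left, value_right, full_scale, width=41, label=""):
    diff = value_left - value_right
    # Clamp the normalized difference to [-1, 1] of the chosen full scale.
    t = max(-1.0, min(1.0, diff / full_scale))
    centre = width // 2
    pos = centre + round(t * centre)
    line = ["-"] * width
    line[centre] = "0"
    line[pos] = "|"          # the cursor
    return f"{label:<10}[{''.join(line)}]  diff={diff:+.2f}"

# Example with assumed lens metadata: focus distances in metres, iris in f-stops.
print(difference_scale(2.40, 2.55, full_scale=0.5, label="focus"))
print(difference_scale(4.0, 4.0, full_scale=1.0, label="iris"))
```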
  • A horizontal or vertical mark may also indicate the position of a specific zone, for example as a function of the distance to the corresponding camera or image acquisition means.
  • a device according to the invention may be integrated entirely or partly in a viewing system 4 such as a video monitor, a projector, or any other system for reproducing digital images.
  • means or modules for image processing according to the invention may also be integrated in a self-contained module 20 intended to produce an image which may be used with a viewing system 4 as described earlier.
  • A viewing grid, a so-called « 3D grid », may be displayed on the screen, superposed on an image.
  • An exemplary display of a grid 50, superposed on an image 59 on a screen 4 is given in
  • Such a grid includes lines 50₁, 50₂, …, 50ₙ, parallel with each other, here vertical lines (it is possible to have horizontal lines instead of vertical lines, or a combination of both). The spacing (or spacing value) between two consecutive lines 50ᵢ, 50ᵢ₊₁ of the grid may be adjustable, for example as a number of pixels of the incident image and/or as a percentage of the number of pixels making up a line of the incident signal.
  • The lines of the grid may be parameterized according to one or more parameters, for example different colours and/or intensities, so as to facilitate reading by an operator, regardless of the contents of the image. For example, focused areas are identified on the lines by a particular colour. As a given area of the image gradually goes out of focus, the colour of the line portions which cover this area is modified (a sketch of such a grid overlay is given below). Other types of indication may be applied, in connection with lines parallel to each other, in order to differentiate areas of an image having a first characteristic from areas of the image which do not have this first characteristic; for example, lines may be made more or less thick.
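  • By way of illustration, the sketch below (Python with OpenCV and NumPy, an assumed toolchain; the spacing, colours and sharpness criterion are arbitrary choices, not taken from the patent) draws vertical grid lines whose portions change colour according to whether the underlying portion of the image is in focus.

```python
# Minimal sketch of a "3D grid" overlay: vertical lines at an adjustable
# pixel spacing, drawn segment by segment; segments covering sharp
# (focused) areas are green, the others red.
import cv2
import numpy as np

def draw_focus_grid(image_bgr, spacing_px=64, segment_px=16, threshold=100.0):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.GaussianBlur(
        np.abs(cv2.Laplacian(gray, cv2.CV_64F)), (15, 15), 0)

    out = image_bgr.copy()
    h, w = gray.shape
    for x in range(spacing_px, w, spacing_px):           # one vertical line per step
        for y0 in range(0, h, segment_px):                # split it into segments
            y1 = min(y0 + segment_px, h)
            local = float(sharpness[y0:y1, max(0, x - 2):x + 3].mean())
            colour = (0, 255, 0) if local >= threshold else (0, 0, 255)
            cv2.line(out, (x, y0), (x, y1 - 1), colour, 1)
    return out
```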
  • Fig. 7 schematically illustrates a grid 50 on a screen 4.
  • the image which results from the combination of the data of images provided by both cameras 1, 3 is not represented.
  • Over the focused areas of the image, the portions of lines of the grid are in a first colour, for example green, which is illustrated in Fig. 7 by portions 502 of bold lines.
  • Over the areas which are not in focus, the portions of lines of the grid are in a second colour, for example red, which is illustrated in Fig. 7 by portions 501 of dashed lines.
  • a central portion 51 of the grid may be outlined so as to let the centre of the image remain visible.
  • the grid 50 may be used in a stereoscopic monitor 4 in order to determine the spacing between two objects from two different viewing devices.
  • In Fig. 6 is also shown a histogram 11, which indicates the intensity for each camera, for example as a function of the distance to the respective camera. This gives the operator the possibility of selecting the appropriate zone.
  • The display, on the same screen, of the focused areas (with the grid 50) and of a histogram gives an operator full information, both over time (the characteristics, for example the colour, of the lines of the grid 50 may change over time) and as a histogram, for example as a function of the distance to the respective camera.
  • The determination of the focusing area data of each acquisition means and/or of the focusing data of each optics and/or of the focal length (zoom) data of each optics and/or of the aperture (iris) data of each optics and/or of the histogram data of each image is carried out either:
  • in any viewing system such as a screen, a monitor or a projector, which is moreover connected to the cameras through cables 27; this is the case of Fig. 4;
  • or in a self-contained system 20 which has no specific viewing means but which has means for calculating the characteristics or data indicated above and possibly for modifying the contents of the resulting image. This is the case of Fig. 5.
  • The self-contained system 20 is connected to a viewing means 4 via cables 23 forming a video connection. It is moreover connected to the cameras through other cables 27.
  • The system 20 is self-contained: it may be displaced in space independently of the cameras 1, 3 and independently of the viewing means 4.
  • By applying the invention, it is possible to avoid problems of adjustment disparities between the cameras and the objectives, by providing a synthetic, simultaneous and intuitive view of the working conditions of both cameras and of their objectives.
  • The invention provides improved reliability of the image and allows errors to be corrected during the shooting of stereoscopic images. It also avoids many tedious tests which would otherwise have been necessary.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a device for processing digital images, itself made up of several modules and creating, in real time, graphical indications making it possible to assess the quality of the images and certain disparities or anomalies which may occur in their capture by a system intended to create images producing a sensation of relief for the viewer.
PCT/EP2010/068467 2009-11-30 2010-11-30 Système d'aide à la production d'images stéréoscopiques WO2011064379A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10785406A EP2508004A1 (fr) 2009-11-30 2010-11-30 Système d'aide à la production d'images stéréoscopiques

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
FR0958479A FR2953359B1 (fr) 2009-11-30 2009-11-30 Systeme d'aide a la realisation d'images stereoscopiques
FR0958479 2009-11-30
US32328710P 2010-04-12 2010-04-12
US61/323,287 2010-04-12

Publications (1)

Publication Number Publication Date
WO2011064379A1 true WO2011064379A1 (fr) 2011-06-03

Family

ID=41693172

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/068467 WO2011064379A1 (fr) 2009-11-30 2010-11-30 Système d'aide à la production d'images stéréoscopiques

Country Status (3)

Country Link
EP (1) EP2508004A1 (fr)
FR (1) FR2953359B1 (fr)
WO (1) WO2011064379A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160248986A1 (en) * 2014-02-27 2016-08-25 Sony Corporation Digital cameras having reduced startup time, and related devices, methods, and computer program products

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5496106A (en) * 1994-12-13 1996-03-05 Apple Computer, Inc. System and method for generating a contrast overlay as a focus assist for an imaging device
EP0642275B1 (fr) * 1993-09-01 1999-03-10 Canon Kabushiki Kaisha Appareil de prise de vue multioculaire
US6526232B1 (en) * 1999-04-16 2003-02-25 Fuji Photo Optical Co., Ltd. Lens control unit
US20030174233A1 (en) * 2002-03-12 2003-09-18 Casio Computer Co., Ltd. Photographing apparatus, and method and program for displaying focusing condition
US20090278958A1 (en) * 2008-05-08 2009-11-12 Samsung Electronics Co., Ltd. Method and an apparatus for detecting a composition adjusted

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6937284B1 (en) * 2001-03-20 2005-08-30 Microsoft Corporation Focusing aid for camera
KR101167243B1 (ko) * 2005-04-29 2012-07-23 삼성전자주식회사 효과적으로 히스토그램을 디스플레이하기 위한 디지털 영상처리 장치의 제어 방법, 및 이 방법을 사용한 디지털 영상처리 장치
US8350945B2 (en) * 2007-10-15 2013-01-08 Panasonic Corporation Camera body used in an imaging device with which the state of an optical system is verified by an operation member

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0642275B1 (fr) * 1993-09-01 1999-03-10 Canon Kabushiki Kaisha Appareil de prise de vue multioculaire
US5496106A (en) * 1994-12-13 1996-03-05 Apple Computer, Inc. System and method for generating a contrast overlay as a focus assist for an imaging device
US6526232B1 (en) * 1999-04-16 2003-02-25 Fuji Photo Optical Co., Ltd. Lens control unit
US20030174233A1 (en) * 2002-03-12 2003-09-18 Casio Computer Co., Ltd. Photographing apparatus, and method and program for displaying focusing condition
US20090278958A1 (en) * 2008-05-08 2009-11-12 Samsung Electronics Co., Ltd. Method and an apparatus for detecting a composition adjusted

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2508004A1 *

Also Published As

Publication number Publication date
FR2953359A1 (fr) 2011-06-03
EP2508004A1 (fr) 2012-10-10
FR2953359B1 (fr) 2012-09-21

Similar Documents

Publication Publication Date Title
TWI444661B (zh) 顯示裝置及其控制方法
EP2494402B1 (fr) Systèmes d'affichage stéréo
US8000521B2 (en) Stereoscopic image generating method and apparatus
US7440004B2 (en) 3-D imaging arrangements
US10134180B2 (en) Method for producing an autostereoscopic display and autostereoscopic display
JP2010531102A (ja) 色フィルタで立体画像を生成し表示する方法および装置
US11962746B2 (en) Wide-angle stereoscopic vision with cameras having different parameters
AU2011200146A1 (en) Method and apparatus for processing video games
JP2002223458A (ja) 立体映像作成装置
CN106713894B (zh) 一种跟踪式立体显示方法及设备
JP2004333661A (ja) 立体画像表示装置、立体画像表示方法および立体画像表示プログラム
JP4436080B2 (ja) 立体画像再現歪み出力装置、立体画像再現歪み出力方法および立体画像再現歪み出力プログラム
JP2006013851A (ja) 撮像表示装置および撮像表示方法
TWI462569B (zh) 三維影像攝相機及其相關控制方法
CN108259888A (zh) 立体显示效果的测试方法及系统
EP2408217A2 (fr) Procédé pour la présentation d'images 3D et appareil pour la présentation d'images 3D virtuelles
EP2508004A1 (fr) Système d'aide à la production d'images stéréoscopiques
CN110602478A (zh) 一种三维显示装置及系统
JP3425402B2 (ja) 立体画像を表示する装置および方法
JP2012227653A (ja) 撮像装置及び撮像方法
EP3419287A1 (fr) Appareil et procédé d'affichage d'une image 3d
KR101414094B1 (ko) 3차원 무안경 디스플레이장치의 정렬검사장치
CN104717488B (zh) 显示设备和显示方法
CN113516683B (zh) 评估移动速度对立体视影响的建模方法
KR20110092629A (ko) 적절한 시청 위치 알림 방법 및 이를 위한 무안경 방식 3차원 디스플레이 장치

Legal Events

Code  Title  Details
121   Ep: the epo has been informed by wipo that ep was designated in this application   (Ref document number: 10785406; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase   (Ref country code: DE)
REEP  Request for entry into the european phase   (Ref document number: 2010785406; Country of ref document: EP)
WWE   Wipo information: entry into national phase   (Ref document number: 2010785406; Country of ref document: EP)