EP2422224A1 - Verfahren zum erfassen und anzeigen von bild-daten eines objekts - Google Patents

Verfahren zum erfassen und anzeigen von bild-daten eines objekts (Method for capturing and displaying image data of an object)

Info

Publication number
EP2422224A1
Authority
EP
European Patent Office
Prior art keywords
image data
human
animal body
projected
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10721656A
Other languages
German (de)
English (en)
French (fr)
Inventor
Christian Evers
Gerd Hechtfischer
Andreas Schiessl
Ralf Jünemann
Andreas Paech
Olaf Ostwald
Marcus Schreckenberg
Georg Schummers
Alexander Rossmanith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rohde and Schwarz GmbH and Co KG
Tomtec Imaging Systems GmbH
Original Assignee
Rohde and Schwarz GmbH and Co KG
Tomtec Imaging Systems GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rohde and Schwarz GmbH and Co KG and Tomtec Imaging Systems GmbH
Publication of EP2422224A1
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/04: Texture mapping
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/887: Radar or analogous systems specially adapted for specific applications for detection of concealed objects, e.g. contraband or weapons
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V5/00: Prospecting or detecting by the use of ionising radiation, e.g. of natural or induced radioactivity
    • G01V5/20: Detecting prohibited goods, e.g. weapons, explosives, hazardous substances, contraband or smuggled objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Definitions

  • the invention relates to a method for capturing and displaying image data of one or more objects with respect to a human or animal body.
  • microwave radiation based systems have been developed which allow fast and reliable security screening of persons, for example at airports.
  • A system based on microwave radiation is known, for example, from US Pat. No. 6,965,340 B1.
  • This system is based on the fact that the objects to be detected have a distinctly different dielectric constant from the surrounding air or the surrounding textiles, which leads to significant contrast in the image reproduction. Detection takes place down to the skin surface of the persons to be examined, since the perfused skin tissue has such a high water content that total reflection occurs there. Garments made of textiles or leather, however, are easily penetrated by the microwave radiation. As a result, objects that are hidden in the textiles or on the skin surface can be detected with the system. The widespread introduction of such systems has so far failed, however, because the competent authorities considered that the image reproduction violated the privacy of the persons to be examined, especially in the facial and genital area.
  • The invention is therefore based on the object of providing a method and an apparatus for capturing and displaying image data of an object with respect to a human or animal body in which the image reproduction is abstracted in such a way that the privacy of the persons to be examined is preserved.
  • the acquired image data are not displayed directly, but indirectly, by being projected onto an artificial body representing the human or animal body.
  • The artificial body may be a so-called avatar, an abstractly modelled form of a typical human body which, similar to a computer animation, has human features and shows a person in a typical posture, but does not specifically depict the person to be examined.
  • the artificial body may also be an even more abstracted body, for example a cylinder or a plurality of cylindrical, conical, frustoconical or spherical bodies on which the image data are projected.
  • the facial features or other body geometries are distorted to the extent that the privacy of the person to be examined is respected.
  • The objects to be detected are distorted in their contours in the same way, but they are still detected by the system and remain recognizable in their coarse structure. In a specific case of suspicion, individual body regions can then be selected and rectified again using the inverse of the distortion, so that the detected objects can be displayed in their original structure, but only in connection with the body regions of the person to be examined.
  • The avatar is not shown directly, but only a development of the surface of the avatar with the objects projected onto it. This achieves a further abstraction of the representation. For example, the development of the trunk of the body can be trapezoidal, the arms and legs can be represented as rectangles, and the head area can be displayed as a circle.
  • It is also possible to represent individual body regions randomly scrambled for the viewer in the manner of a puzzle, without the viewer being able to assign the individual puzzle pieces to the individual body regions. If an object to be detected is located in a body region that is to be protected in terms of privacy, for example in the genital area, this is not easily recognizable to the viewer, because the body section shown is on the one hand too small and on the other hand strongly distorted. The privacy of the person to be examined therefore remains intact. The development may also take the form of, for example, a pattern of a virtual garment.
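The application does not specify how the puzzle-like scrambling is implemented. Purely as an illustration, the following Python sketch shows one way a developed surface image could be split into tiles and shuffled reproducibly, so that software holding the seed can undo the shuffle; the array `surface`, the tile size and the seed handling are assumptions, not details from the patent.

```python
import numpy as np

def scramble_tiles(surface, tile=32, seed=0):
    """Split a 2-D image into tiles and shuffle them reproducibly.

    Returns the scrambled image and the permutation needed to undo it.
    """
    h, w = surface.shape
    h, w = h - h % tile, w - w % tile          # crop to a whole number of tiles
    tiles = (surface[:h, :w]
             .reshape(h // tile, tile, w // tile, tile)
             .swapaxes(1, 2)
             .reshape(-1, tile, tile))
    perm = np.random.default_rng(seed).permutation(len(tiles))
    scrambled = (tiles[perm]
                 .reshape(h // tile, w // tile, tile, tile)
                 .swapaxes(1, 2)
                 .reshape(h, w))
    return scrambled, perm

def unscramble_tiles(scrambled, perm, tile=32):
    """Invert scramble_tiles for viewers who know the permutation."""
    h, w = scrambled.shape
    tiles = (scrambled
             .reshape(h // tile, tile, w // tile, tile)
             .swapaxes(1, 2)
             .reshape(-1, tile, tile))
    restored = np.empty_like(tiles)
    restored[perm] = tiles                     # undo the permutation
    return (restored
            .reshape(h // tile, w // tile, tile, tile)
            .swapaxes(1, 2)
            .reshape(h, w))
```

A seeded permutation keeps the scrambling bijective, which is in line with the requirement stated further below that the transformation must remain invertible for authorized viewers.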
  • The object is preferably not displayed in connection with the image data of the examined person, but on the avatar, so that the controlling person can recognize the body region in which the detected object is located and targeted further investigations can be made there. It is also possible to display only the position of the object, for example by a laser pointer. The position of the object can then either be displayed on the avatar on a screen, or the body region can be indicated directly on the person to be examined by a laser pointer, so that targeted further investigations can be made, for example by palpation.
  • The transformation used in the projection must be bijective, i.e. one-to-one, with respect to the reverse transformation used in the back projection.
  • The transformation used in the projection must be unambiguous insofar as the pixel from which a projected point originates can be unambiguously reconstructed.
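As a minimal illustration of this traceability requirement, the following sketch keeps, for every texel of the avatar texture, the index of the voxel it was taken from, so that the back projection is unambiguous. The geometric computation of `source_index` is assumed to happen elsewhere; all names are illustrative and not taken from the patent.

```python
import numpy as np

def project_with_index_map(voxel_values, source_index):
    """Project measured voxel values onto an avatar texture while keeping,
    for every texel, the index of the voxel it came from.

    voxel_values : 1-D array with one measured value per voxel
    source_index : 2-D integer array; for each texel, the voxel index chosen
                   by the (externally computed) geometric projection
    """
    texture = voxel_values[source_index]      # forward projection
    return texture, source_index              # the index map makes it invertible

def back_project(texture, source_index, n_voxels):
    """Reconstruct per-voxel values from the texture and the stored index map."""
    values = np.zeros(n_voxels, dtype=texture.dtype)
    values[source_index.ravel()] = texture.ravel()   # unambiguous: each texel
    return values                                    # knows its source voxel
```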
  • The method according to the invention is suitable not only for microwave scanners but for any type of imaging detector, for example X-ray scanners.
  • Fig. 1 is a block diagram of an embodiment of the device according to the invention,
  • Fig. 2 shows objects projected onto an avatar,
  • Fig. 3 shows a development of the surface of the avatar with the projected objects, and
  • Fig. 4 shows markers that identify the location of the detected objects on the avatar.
  • Fig. 1 shows a simplified block diagram of the device 1 according to the invention. A signal recording system, consisting of a transmitting antenna 4, a receiving antenna 5 and optionally an optical camera 6, can be moved around the person 2 to be observed by means of an electric motor 3, preferably a stepping motor.
  • The signal recording system is preferably movable through 360° around the person 2 to be observed. Preferably, this scanning takes place in several planes. However, a plurality of antennas may also be arranged in rows or in a matrix-like manner in order to scan the person 2 to be observed in parallel.
  • A radio-frequency unit 7 is connected to the transmitting antenna 4 via a transmitting unit 8.
  • the radio-frequency unit 7 is connected to the receiving antenna 5 via a receiving unit 9.
  • the signal received by the radio-frequency unit 7 is forwarded to a control unit 10 which composes image data from the received signal.
  • Control unit 10 also takes over the control of the motor 3 and the optical camera 6. If several antennas are distributed in a matrix, an adjustment of the transmitting antenna 4 and the receiving antenna 5 is not necessary. In each case, one antenna always works as a transmitting antenna and the signal is received by all other antennas. The motor 3 for spatial adjustment of the arrangement of the antennas 4 and 5 can then be omitted.
  • The invention is, however, not limited to such microwave scanners, in particular terahertz scanners.
  • Other methods that provide a corresponding volume data set, i.e. magnitude and phase data for each voxel (discrete spatial element), are also suitable, as long as they allow a three-dimensional surface representation of the human or animal body.
  • X-ray scanners are also suitable. Also included are scanners which generate the three-dimensional information only indirectly, by appropriate stereo evaluation.
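To make the notion of a volume data set with magnitude and phase per voxel more concrete, the following sketch derives a crude skin-surface depth map from such a volume by taking, along each line of sight, the first voxel whose reflectivity magnitude exceeds a threshold. This is a simplification for illustration only; the threshold, axis convention and array layout are assumptions, not details from the patent.

```python
import numpy as np

def surface_depth_map(complex_volume, threshold=0.5, axis=2):
    """Derive a crude surface (depth) map from a complex voxel volume.

    complex_volume : 3-D array of complex reflectivities (magnitude and phase)
    Returns the index, along `axis`, of the first voxel per line of sight
    whose magnitude exceeds `threshold`, used here as a stand-in for the skin
    surface at which the microwave signal is almost totally reflected.
    """
    magnitude = np.abs(complex_volume)          # magnitude; phase is ignored here
    above = magnitude > threshold
    depth = np.argmax(above, axis=axis)         # argmax returns the first True
    no_hit = ~above.any(axis=axis)              # rays that never exceed the threshold
    depth[no_hit] = -1
    return depth
```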
  • A corresponding pre-processing of the raw image data generated by the image acquisition takes place.
  • The raw image data are preferably processed first to improve the image quality.
  • For this purpose, the raw image data from the control unit 10 are first passed to a noise reduction processor 11, which performs a corresponding noise suppression. Reflections on the contour of the human or animal body generate signal components with low spatial frequency, which can be filtered out by the filter device 12 for suppressing these low-frequency signal components. Subsequently, one or more feature images are preferably generated for each recorded individual image.
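The patent does not disclose the exact algorithms of the noise reduction processor 11 and the filter device 12. A plausible stand-in, shown purely for illustration, is a median filter for noise suppression followed by subtraction of a Gaussian-smoothed background to remove the low-spatial-frequency components caused by reflections on the body contour; the function name and all parameter values are assumptions.

```python
import numpy as np
from scipy import ndimage

def preprocess_frame(raw, denoise_size=3, highpass_sigma=15.0):
    """Illustrative stand-in for units 11 and 12: denoise one raw image
    and suppress its low-spatial-frequency background.

    raw            : 2-D array with one reflectivity value per pixel
    denoise_size   : window of the median filter used for noise suppression
    highpass_sigma : width of the Gaussian estimate of the smooth body-contour
                     reflection that is subtracted (high-pass filtering)
    """
    denoised = ndimage.median_filter(raw, size=denoise_size)
    background = ndimage.gaussian_filter(denoised, sigma=highpass_sigma)
    return denoised - background   # concealed objects keep their fine detail
```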
  • For this, the data (e.g. RGB data) of the camera 6 can also be used. This processing is done in the image abstraction processor 13.
  • The result may be, for example, a cartoon-like representation of boundary lines. Also conceivable is a blending with the optical RGB data of the camera 6. Particularly suitable for the optical measurement of the depth information is a camera with depth imaging, for example a so-called TOF camera.
  • A fitting of the avatar, i.e. of the spatially low-detail standard model of a human which only allows limited deformations, to the depth map supplied by the camera 6 is preferably carried out in the unit 14.
  • In this way, the avatar is brought into a body posture which corresponds to the posture that the person 2 to be observed occupies at the time of the examination. This allows the observer of the avatar on the screen to better assign possibly detected objects to the corresponding body parts, because he sees the avatar in the same posture as the person to be observed.
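Fitting a deformable standard model to a depth map is a substantial task in itself; as a hedged first step, the sketch below shows only a rigid Kabsch alignment of avatar vertices to corresponding points derived from the camera's depth map. Point correspondences are assumed to be given, and this is not claimed to be the fitting procedure actually used in unit 14.

```python
import numpy as np

def rigid_align(avatar_pts, camera_pts):
    """Kabsch alignment of avatar vertices (N, 3) to corresponding camera
    points (N, 3). Returns rotation R and translation t such that
    camera_pts ~ avatar_pts @ R.T + t.
    """
    avatar_pts = np.asarray(avatar_pts, float)
    camera_pts = np.asarray(camera_pts, float)
    a_mean = avatar_pts.mean(axis=0)
    c_mean = camera_pts.mean(axis=0)
    H = (avatar_pts - a_mean).T @ (camera_pts - c_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_mean - R @ a_mean
    return R, t
```

A real posture fit would additionally optimize a small set of joint parameters of the avatar against the whole depth map, but the rigid step already illustrates the idea of matching the model to the camera data.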
  • the projection of the objects or the feature images with the objects on the surface of the avatar is preferably carried out in the unit 14.
  • The projection value used can be determined in various ways. In the simplest case, an averaging, preferably a weighted averaging, of the measured values from the different measurements is carried out. It is also conceivable, however, to select the measured value or feature image with the optimum contrast representation.
  • the optimal feature image depends mainly on the shooting angle. When the signal recording system is moved around the person 2 to be observed, there are usually one or more antenna positions in which the relevant pixel is reproduced with optimum contrast. The image data of this measurement will then be used for this pixel, while for other pixels, other image data from other measurements may be used.
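The per-pixel combination of several measurements described above could, for instance, look like the following sketch, which either averages the stack of feature images, weights them by local contrast, or picks per pixel the measurement with the highest local contrast. The local-standard-deviation contrast measure, the window size and the function name are assumptions, not details from the patent.

```python
import numpy as np
from scipy import ndimage

def fuse_measurements(stack, mode="weighted", window=9):
    """Combine feature images from several antenna positions per pixel.

    stack : array of shape (n_measurements, H, W)
    mode  : "mean" for a simple average, "weighted" for contrast-weighted
            averaging, "best" to pick per pixel the measurement with the
            highest local contrast (local standard deviation)
    """
    if mode == "mean":
        return stack.mean(axis=0)
    local_mean = ndimage.uniform_filter(stack, size=(1, window, window))
    local_sq = ndimage.uniform_filter(stack**2, size=(1, window, window))
    contrast = np.sqrt(np.maximum(local_sq - local_mean**2, 0.0))
    if mode == "weighted":
        w = contrast + 1e-12                     # avoid division by zero
        return (w * stack).sum(axis=0) / w.sum(axis=0)
    best = contrast.argmax(axis=0)               # per-pixel best measurement
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```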
  • the image with the objects projected on the avatar can be output to an image display device 16, such as a computer screen.
  • FIG. 2 shows the avatar 30, illustrated cartoon-like in the form of boundary lines, with the image data projected onto it; an object 31 is recognizable in the arm area, an object 32 in the trunk area and an object 33 in the thigh area. It can be seen that this very abstract representation of the avatar does not violate the privacy of the observed person 2.
  • A still further abstraction is achieved by generating not a three-dimensional representation of the avatar 30, but a development of the surface of the avatar 30 onto a certain geometry, preferably a planar geometry, with minimization of the length and angle errors.
  • A flat map, the pattern of a virtual garment or partial projections are suitable for this.
  • Such a representation is shown by way of example in FIG. 3.
  • The areas 40 and 41 correspond to the arm sections, the area 42 to the trunk and neck area, the area 43 to the head area and the area 44 to the leg and lumbar region.
  • the objects 31, 32 and 33 projected thereon can be seen, wherein the object 31 comes to rest in the partial area 40 of the right arm section, the object 32 in the partial area 42 of the trunk area and the object 33 in the partial area 44 of the leg region.
  • The privacy of the person 2 to be examined is completely preserved here, because no conclusions about individual body parts of the person can be drawn from the representation; nevertheless, it is clearly recognizable to the security personnel where on the body of the person 2 to be examined the detected objects 31-33 are located.
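A development with minimized length and angle errors is a non-trivial surface parameterization; as a much simpler illustration of the idea, the following sketch unwraps surface points onto a plane using a cylindrical approximation of the body. The coordinate conventions are assumptions, and the result is only a rough stand-in for the development produced by the processor described next.

```python
import numpy as np

def cylindrical_unwrap(points, values):
    """Very simple development: unwrap surface points onto a plane by
    mapping them to (arc length around the body axis, height).

    points : (N, 3) surface coordinates with the body axis along z
    values : (N,) projected image values belonging to those points
    Returns 2-D coordinates usable to rasterize a flat map of the surface.
    """
    points = np.asarray(points, float)
    x, y, z = points.T
    radius = np.hypot(x, y)
    theta = np.arctan2(y, x)                # angle around the body axis
    u = theta * radius.mean()               # arc length roughly preserves distances
    v = z
    return np.column_stack([u, v]), values
```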
  • For this purpose, a development processor 17 (wind off surface) is present in the device 1 shown schematically in FIG. 1.
  • The developed image data generated by the development processor 17 are also available as an image on the display device 16. If the direct display of the objects 31-33 shown in FIG. 2 in connection with image data of the surrounding body parts is not desired, because this does not distort the body parts sufficiently, and instead only an abstracted development is represented, as shown for example in FIG. 3, it is useful for at least the body regions in which the detected objects 31-33 are located to be marked on the avatar 30. This facilitates subsequent examinations, for example by palpation of the person to be examined.
  • This marking of the body regions in which the objects 31 to 33 are located is shown by way of example in FIG. 4.
  • no image data is projected onto the avatar at all, but only corresponding body regions are marked by arrows 51 to 53, for example.
  • the arrow 51 corresponds to the object 31, the arrow 52 to the object 32 and the arrow 53 to the object 33.
  • For this purpose, a corresponding marker device 18 (pointer avatar) is provided. These markers 51-53 are displayed on the avatar 30 on the display device 16 as an alternative image.
  • Alternatively, the position of the objects 31 to 33 can be indicated directly on the person 2 to be examined, for example by a directed light emission, in particular by a laser beam 25.
  • The security personnel then know exactly where the object is located and can, for example, carry out a targeted pat-down there.
  • For this purpose, a body marking device 19 (pointer person) is provided, which determines the corresponding body position data.
  • These body position data can then be forwarded to a laser controller 20, which in the exemplary embodiment drives a corresponding laser 21 and a corresponding motor 22 for positioning the laser beam 25.
  • The laser beam 25 is then directed at the corresponding body region on which the corresponding object 31 has been detected and generates a light spot there.
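Driving the motor 22 so that the laser beam 25 hits a detected object amounts to converting the object's 3-D body position into pan and tilt angles. The following sketch shows that conversion under assumed Cartesian coordinates with the z-axis pointing upwards; the mounting geometry and any motor calibration are illustrative assumptions, not details from the patent.

```python
import numpy as np

def laser_angles(target, laser_origin):
    """Pan/tilt angles (in degrees) that point a laser mounted at
    `laser_origin` at the 3-D position `target` of a detected object.
    """
    dx, dy, dz = np.asarray(target, float) - np.asarray(laser_origin, float)
    pan = np.degrees(np.arctan2(dy, dx))                 # rotation about the z-axis
    tilt = np.degrees(np.arctan2(dz, np.hypot(dx, dy)))  # elevation above horizontal
    return pan, tilt
```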
  • the device 1 shown in FIG. 1 has a voice control device 23 (language controller), which is connected to a loudspeaker 24 or a headset.
  • In the example case, the controlling person can then be given a corresponding hint by voice output of "an object on the right upper arm", "an object on the left of the belly" and "an object on the left thigh".
  • the output can also be in the form of an image in such a way that the microwave image of the detected objects 31-33 generated by the microwave scanner is underlaid with an optical image of the person 2 to be examined, which is acquired via the camera 6.
  • In this case, not the entire body of the person 2 to be examined is shown, but only small sections of those body regions in which the objects 31 to 33 were detected.
  • Instead of a body-like avatar 30, simpler projection geometries can also be used for the artificial body, such as a cylinder for parts of the body, for example the arms, a truncated cone for the trunk, etc. Individual projection geometries for each individual feature image are also conceivable, derived e.g. from the respective smoothed elevation profile of the optical data recorded with the camera 6. Any ambiguity in the mapping onto the projection geometry is then eliminated. However, each individual result image must then be viewed interactively in a film sequence.
  • An advantage of the representation of the development is also that the entire body surface, i.e. both the front and the back of the person 2 to be examined, can be displayed simultaneously.
  • A back-projection processor 26 is also present, whose input is connected to the projection processor 15.
  • The back-projection processor 26 serves, if necessary, to back-project the image data projected onto the artificial body, e.g. the avatar 30, so that the original image data with the body contours of the person 2 to be examined are available again.
  • This back projection is only performed if safety-relevant objects 31-33 were detected. It is possible to underlay the microwave image data recorded by the microwave image recording unit 3-4, 7-9 with optical image data taken by the camera 6. In this case, a back projection of the position is sufficient, i.e. the image information itself would not have to be transformed as well.
  • The projection processor 15 performs an encrypted transformation when projecting, and the back-projection processor 26 uses a back transformation which is bijective with respect to the transformation performed by the projection processor 15.
  • The encryption ensures that the reverse transformation is not possible without knowledge of the key, so that permission for the back projection can be limited to specially authorized members of the security personnel.
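The encryption scheme itself is not disclosed. One conceivable realization, sketched below with illustrative names, derives a pixel permutation from a secret key, so that the forward transformation remains bijective while the inverse is only computable with knowledge of the key.

```python
import hashlib
import numpy as np

def _keyed_permutation(n, key):
    """Derive a reproducible permutation of n elements from a secret key."""
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")
    return np.random.default_rng(seed).permutation(n)

def encrypt_projection(image, key):
    """Scramble pixel positions with a key-dependent bijection."""
    flat = image.ravel()
    perm = _keyed_permutation(flat.size, key)
    return flat[perm].reshape(image.shape)

def decrypt_projection(scrambled, key):
    """Inverse transformation; only possible with knowledge of the key."""
    flat = scrambled.ravel()
    perm = _keyed_permutation(flat.size, key)
    restored = np.empty_like(flat)
    restored[perm] = flat                      # invert the keyed permutation
    return restored.reshape(scrambled.shape)
```

In practice a standard authenticated cipher would likely be preferred; the keyed permutation is shown only because it makes the bijectivity requirement explicit.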
  • The invention is not limited to the illustrated embodiment. All elements described or shown above can be combined with one another as desired within the scope of the invention. Also conceivable is a combination of the above-mentioned physical space detection (using radio frequency (RF) or X-rays) with an optical TOF measurement (measurement of the depth profile). The TOF measurement, e.g. from several perspectives, could be used directly for the creation of the avatar. Another advantage arises from the limitation of the target volume. This could save recording or computing time in the reconstruction of the image data.
  • RF: radio frequency

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Hardware Design (AREA)
  • Electromagnetism (AREA)
  • Software Systems (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Alarm Systems (AREA)
EP10721656A 2009-04-23 2010-04-14 Verfahren zum erfassen und anzeigen von bild-daten eines objekts Withdrawn EP2422224A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102009018702 2009-04-23
DE102009034819 2009-07-27
PCT/EP2010/002298 WO2010121744A1 (de) 2009-04-23 2010-04-14 Verfahren zum erfassen und anzeigen von bild-daten eines objekts

Publications (1)

Publication Number Publication Date
EP2422224A1 true EP2422224A1 (de) 2012-02-29

Family

ID=42288646

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10721656A Withdrawn EP2422224A1 (de) 2009-04-23 2010-04-14 Verfahren zum erfassen und anzeigen von bild-daten eines objekts

Country Status (5)

Country Link
US (1) US20120038666A1 (ja)
EP (1) EP2422224A1 (ja)
JP (1) JP5477925B2 (ja)
DE (1) DE102010014880A1 (ja)
WO (1) WO2010121744A1 (ja)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012006670A1 (de) * 2012-02-18 2013-08-22 Hübner GmbH Verfahren zur Sichtbarmachung von dreidimensionalen Gegenständen auf einer Person
DE102012111201B4 (de) * 2012-11-21 2014-07-17 Eads Deutschland Gmbh Sensorsystem sowie Sensoreinrichtung dafür
DE102013225283B4 (de) 2013-12-09 2023-04-27 Rohde & Schwarz GmbH & Co. Kommanditgesellschaft Verfahren und Vorrichtung zum Erfassen einer Rundumansicht
JP2015132597A (ja) * 2013-12-10 2015-07-23 マスプロ電工株式会社 ミリ波撮像装置
KR20160130482A (ko) * 2014-03-07 2016-11-11 라피스캔 시스템스, 인코포레이티드 초광대역 검출기
US11280898B2 (en) 2014-03-07 2022-03-22 Rapiscan Systems, Inc. Radar-based baggage and parcel inspection systems
DE102014225592A1 (de) 2014-12-11 2016-06-16 Smiths Heimann Gmbh Personenidentifikation für mehrstufige Personenkontrollen
CN107167771B (zh) * 2017-04-28 2018-10-26 深圳市无牙太赫兹科技有限公司 一种微波成像系统的直达波抑制方法及系统

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1183996A (ja) * 1997-09-03 1999-03-26 Omron Corp ミリ波検出装置
US7365672B2 (en) * 2001-03-16 2008-04-29 Battelle Memorial Institute Detection of a concealed object
US7405692B2 (en) * 2001-03-16 2008-07-29 Battelle Memorial Institute Detecting concealed objects at a checkpoint
US6720905B2 (en) * 2002-08-28 2004-04-13 Personnel Protection Technologies Llc Methods and apparatus for detecting concealed weapons
US7202808B2 (en) * 2004-04-14 2007-04-10 Safeview, Inc. Surveilled subject privacy imaging
US7386150B2 (en) * 2004-11-12 2008-06-10 Safeview, Inc. Active subject imaging with body identification
US6965340B1 (en) 2004-11-24 2005-11-15 Agilent Technologies, Inc. System and method for security inspection using microwave imaging
US20070235652A1 (en) * 2006-04-10 2007-10-11 Smith Steven W Weapon detection processing
IL176411A0 (en) * 2006-06-19 2007-07-04 Ariel University Res And Dev C Hand-held device and method for detecting concealed weapons and hidden objects
CN103064125B (zh) * 2007-06-21 2016-01-20 瑞皮斯坎系统股份有限公司 用于提高受指引的人员筛查的系统和方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2010121744A1 *

Also Published As

Publication number Publication date
JP5477925B2 (ja) 2014-04-23
JP2012524921A (ja) 2012-10-18
US20120038666A1 (en) 2012-02-16
DE102010014880A1 (de) 2010-11-18
WO2010121744A1 (de) 2010-10-28

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20101125

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20121213

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20151103