EP0577491B1 - Verfahren und Vorrichtung zur Überwachung einer dreidimensionalen Szene unter Verwendung von Bildsensoren - Google Patents


Info

Publication number
EP0577491B1
EP0577491B1 (application EP19930401661 / EP93401661A)
Authority
EP
European Patent Office
Prior art keywords
scene
sensors
scenes
points
synthesised
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP19930401661
Other languages
English (en)
French (fr)
Other versions
EP0577491A1 (de)
Inventor
Francis Bretaudeau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Matra Cap Systemes
Original Assignee
Matra Cap Systemes
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matra Cap Systemes filed Critical Matra Cap Systemes
Publication of EP0577491A1 publication Critical patent/EP0577491A1/de
Application granted granted Critical
Publication of EP0577491B1 publication Critical patent/EP0577491B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction

Definitions

  • The present invention relates to the surveillance of an illuminated three-dimensional scene with a known or determinable geometric structure, making it possible in particular to detect anomalies larger than a determined value (representing, for example, intrusions) or, in an advanced embodiment, to locate a mobile object moving inside the scene.
  • The present invention aims to provide a scene monitoring or surveillance method and device which meet the requirements of practice better than those previously known, in particular in that they provide high selectivity while using only simple hardware means.
  • the reference model is purely geometric and only intervenes to establish the correspondence between homologous points in the fields of different sensors.
  • The comparison can be made between images taken at identical times, and hence under identical lighting conditions. Since the comparison is made between synthesized scenes, a change in external conditions that modifies the radiometric values (brightness or color) of a point has no effect on the comparison, provided that the sensors have similar response characteristics.
  • The term "scene" should not be interpreted as limited to the case of a continuous zone corresponding to the overlap of the sensor fields.
  • The scene can be made up of discrete portions of the fields, only those considered important being monitored.
  • The comparison of the synthesized scenes can be performed by determining the correlation coefficients between the two synthesized scenes over determined neighborhoods of the points of the scene, and by comparing these coefficients with a threshold, which will obviously always be less than unity.
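As a hedged illustration (not the patented implementation, whose exact formula the text does not give), the neighborhood comparison described above can be sketched as a normalized correlation coefficient between corresponding windows of the two synthesized scenes, with an alarm wherever the coefficient falls below a threshold chosen strictly less than 1:

```python
import numpy as np

def correlation_coefficient(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized correlation of two equal-sized neighborhoods."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 1.0

def detect_anomaly(scene1: np.ndarray, scene2: np.ndarray,
                   window: int = 5, threshold: float = 0.8) -> np.ndarray:
    """Boolean map: True where the two synthesized scenes decorrelate,
    i.e. where the neighborhood correlation drops below the threshold."""
    h, w = scene1.shape
    r = window // 2
    alarm = np.zeros((h, w), dtype=bool)
    for y in range(r, h - r):
        for x in range(r, w - r):
            n1 = scene1[y - r:y + r + 1, x - r:x + r + 1]
            n2 = scene2[y - r:y + r + 1, x - r:x + r + 1]
            alarm[y, x] = correlation_coefficient(n1, n2) < threshold
    return alarm
```

The window size and threshold here are illustrative choices; in the device they would depend on the sensor resolution and the minimum anomaly height to be detected.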
  • The 3D geometric model of the scene can be constituted from the images of the scene, in a reference state, provided by the sensors, stored either pixel by pixel or, more generally, using geometric elements such as facets.
  • The geometric model can also be established directly in mathematical form from knowledge of the terrain provided by earlier observations or using maps, and the two-dimensional geometric representation models of the sensors can be derived from knowledge of the position of the sensors relative to the scene.
  • The detection operations defined above can be repeated, for example at the image-taking rate of the sensors, which will usually be cameras. The evolution of an object in the scene can thus be followed.
  • The invention also provides a surveillance device for implementing the process defined above.
  • This device includes several sensors sensitive to a radiometric parameter in comparable wavelengths; means for storing a purely geometric three-dimensional reference model of a scene to be monitored, associating, with each point of the scene, homologous points in the sensor fields; means allowing, from the geometric model and from signals representative of the radiometric value of each image pixel provided by the sensors, the real scene to be synthesized by projection onto the three-dimensional geometric model; and means for comparing neighborhoods of determined dimensions of the synthesized scenes, allowing the deviations between them to be determined.
  • The invention is susceptible of numerous different modes of implementation, allowing it to be adapted both to the case where a three-dimensional geometric model such as a digital terrain model is available or obtainable, and to the case where it is necessary to determine a transformation establishing, at least approximately, a correspondence between each point of the field of one of the sensors and the homologous point of the field of the other sensor.
  • FIG. 1 shows a device having only two sensors 10 and 12, constituted for example by cameras sensitive in the visible range, whose angular fields overlap in the scene to be observed.
  • A point of the digital terrain model having coordinates (x, y, z) in the chosen reference axis system corresponds to a point of coordinates (ui, vi) and of radiometry Si in a two-dimensional frame of reference associated with the respective sensor. It is possible to determine, by calculation or by prior measurements, the transformations making it possible to pass from coordinates and radiometry in a frame of reference linked to the scene to coordinates and radiometry in a frame of reference linked to each sensor.
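The scene-to-sensor transformation is not spelled out in the text; a minimal sketch, assuming an ideal pinhole camera model (rotation R, translation t, focal length f in pixels, principal point (cx, cy) are all hypothetical parameters introduced here for illustration), could look like this:

```python
import numpy as np

def project_point(point_xyz, R, t, f, cx, cy):
    """Project a scene point (x, y, z) into a sensor's image frame.

    R, t : rotation matrix and translation vector taking scene-frame
           coordinates into the camera frame.
    f, cx, cy : focal length (pixels) and principal point of the camera.
    Returns pixel coordinates (u, v).
    """
    p_cam = R @ np.asarray(point_xyz, dtype=float) + t
    if p_cam[2] <= 0:
        raise ValueError("point is behind the camera")
    u = f * p_cam[0] / p_cam[2] + cx
    v = f * p_cam[1] / p_cam[2] + cy
    return u, v
```

Determining R and t "by calculation or by prior measurements", as the text puts it, corresponds to what is now usually called extrinsic camera calibration.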
  • A synthesized scene is thus constituted for each sensor by associating the real geometry, provided by the DEM, with the radiometric value, either directly provided by the sensor or subjected to preliminary processing, for example by filtering.
  • The synthesized scenes will remain almost identical to each other for this zone, even if there has been an evolution in the real radiometry. Even if, for example owing to different responses of the two sensors and lighting variations, the synthesized scenes do not remain strictly identical, they will in any case remain closely correlated in the absence of any modification of the geometry of the ground.
  • If the observed scene has undergone geometric modifications compared with the known model, following an intrusion, the radiometric value corresponding to a point of the disturbance, seen from one of the sensors, will be projected during the synthesis onto a geometric position from which it did not come, since the device only knows the geometric model, which does not contain this disturbance. The same applies to the radiometric value supplied, for the same point of the disturbance, by the other sensor. Consequently the synthesized scenes developed from the two sensors, each based on the radiometric information one sensor receives and on the reference geometric model, will differ. The deviations between the two synthesized scenes will depend on the height and volume of the disturbance.
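The synthesis step itself can be sketched as follows, under stated assumptions: each sensor's synthesized scene is built by projecting every model point into that sensor's image and reading back the pixel radiometry (nearest-neighbor sampling here is an illustrative simplification, and `project` stands for whichever scene-to-sensor transformation the device has determined):

```python
import numpy as np

def synthesize(points, image, project):
    """For each 3-D model point, sample the radiometry that one sensor
    attributes to it: project the point into that sensor's image frame
    and read the pixel there (nearest-neighbor sampling).

    points  : iterable of (x, y, z) model points
    image   : 2-D array of pixel radiometries from one sensor
    project : callable mapping (x, y, z) -> (u, v) image coordinates
    """
    out = np.empty(len(points))
    for k, p in enumerate(points):
        u, v = project(p)
        out[k] = image[int(round(v)), int(round(u))]
    return out
```

With an unchanged scene, `synthesize(points, img1, proj1)` and `synthesize(points, img2, proj2)` agree point for point (up to sensor response differences); a relief anomaly breaks the correspondence, because each sensor projects the disturbance's radiometry onto model positions it did not come from.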
  • The device can therefore be robust to variations in lighting conditions and insensitive to the passage of small objects on the ground.
  • The minimum detected size, and in particular height, depends in particular on the resolution of the sensors and on the distance between them.
  • The anomaly 14, shown in dashed lines, results in a modification of the image provided by the sensor 10 over the width l, which can be interpreted as the addition of a shadow on the image.
  • The device can be supplemented by sensors of other natures, or implement multiple associated systems working in different wavelengths.
  • It can implement, in addition to detection using a comparison between radiometric values supplied by two or more sensors (e.g. infrared), a temporal detection using a comparison between successive images.
  • the first method assumes that a digital model of the terrain has been produced and stored.
  • This digital model can be created from aerial photographs, maps, or even measurements in the field. From knowledge of this mathematical model, it is possible to determine by calculation, for a given position of each of the sensors (at least two for each scene), the transformations which associate, with each point of coordinates (x, y, z) of the digital model, the coordinates (ui, vi) of the image of this point in the coordinate system associated with the sensor of order i.
  • The radiometric signal S(ui, vi) supplied at each point of the image of sensor i is proportional to the flux re-emitted by the surface element associated with the point (x, y, z) of the model, the coefficient of proportionality being substantially constant over a neighborhood of a few pixels.
  • The detection scheme, in the case of two sensors 10 and 12 constituted by two cameras, is then that shown in FIG. 3.
  • With a point (x, y, z) diffusing a flux S, it is possible to associate the image points (u1, v1) of camera 1 and (u2, v2) of camera 2.
  • The detection operation is carried out by comparing the signals S1 and S2 associated with the same point (x, y, z), and this for each of the points (x, y, z).
  • This method makes it possible to be free of the consequences of differences in gain between the sensors and of effects, due to the distinct angular shooting conditions, having an influence on the backscatter towards the sensors.
  • In the presence of a disturbance, the synthesis model is reconstructed with a radiometry which is no longer that associated with the geometry of the observed zone.
  • Each point then receives a radiometry which did not come from the corresponding point of the reference scene, and this is true for all the points masked by the perturbation.
  • Detection is performed by issuing an alarm signal when the correlation coefficient is less than a threshold s, which is obviously always chosen to be less than 1.
  • Such a detection technique has the advantage of being insensitive to variations in dynamics, and in particular in illumination, which constitutes an important factor of robustness and makes it possible to dispense with any radiometric calibration.
  • A second approach should be used when a numerical model of the terrain is not initially known: it makes it possible to return to the initial conditions of the case where a DTM is available.
  • This grid can in particular be constituted by a mesh, obtained for example by scanning with a laser and locating, at each laser pulse, the point of the image representing the impact on the ground.
  • Figure 4 shows two sensors, constituted by cameras 10 and 12, whose fields have an overlap zone 16 of which a part or the whole constitutes the monitored scene. Lighting can be natural; at night, or to improve the sensitivity of short-distance detection, natural lighting can be replaced or supplemented by artificial lighting, visible or infrared. In the case of person detection, a ground resolution of 10x10 cm and a cadence of two detections per second suffice, which corresponds to a displacement of 1.5 m for a person at 10 km/h.
  • the sensors can be CCD infrared cameras with a matrix of 600x700 pixels.
  • The two cameras 10 and 12 are synchronized, for example by controlling camera 12 from the synchronization signal of camera 10.
  • The corresponding pixels of two images of the same scene supplied by the two cameras lie on two straight lines, called epipolar lines, each passing through the optical center O₁ or O₂ of the respective image: in each image, the pixel corresponding to the same point P of the scene lies in the image (constituted by an epipolar line) of the plane O₁PO₂.
  • The epipolar lines have the property that any pixel of the epipolar line in one image has a homologous pixel in the corresponding epipolar line of the other image, as long as the reference frames of the cameras remain unchanged.
  • An advantageous configuration is to place the cameras so that their frames (line and frame directions) are parallel and the points O₁ and O₂ are located on the same straight line parallel to the line direction.
  • Two epipolar lines such as 30₁ and 30₂ are then straight lines parallel to the line direction (FIG. 5A).
  • The processing can then be carried out line by line along the epipolar lines: it suffices to store in the card 26 the lines corresponding to the epipolar lines of the observed zones, plus a number of lines above and below corresponding to the extent of the neighborhood chosen for the correlation calculation.
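The memory saving this arrangement buys can be illustrated with a hedged sketch (the band width and the correlation formula are illustrative assumptions, not the patent's specification): with rectified cameras, homologous pixels share a row, so the comparison only ever needs a small band of rows around the row being processed.

```python
import numpy as np

def rowwise_correlation(img1: np.ndarray, img2: np.ndarray,
                        half_window: int = 2) -> np.ndarray:
    """With epipolar lines parallel to the image rows, compare the two
    synthesized scenes row by row: only 2*half_window + 1 rows need to
    be buffered around the row of order i currently being processed."""
    h, w = img1.shape
    coeffs = np.zeros(h)
    for i in range(half_window, h - half_window):
        band1 = img1[i - half_window:i + half_window + 1].astype(float)
        band2 = img2[i - half_window:i + half_window + 1].astype(float)
        a = band1 - band1.mean()
        b = band2 - band2.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        coeffs[i] = (a * b).sum() / denom if denom > 0 else 1.0
    return coeffs
```

Each iteration touches only the current band, which is why the device in FIG. 6 can get by with a few line packets of fast RAM instead of storing whole images.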
  • When it is not possible or convenient to adopt the arrangement of FIG. 5A, that of FIG. 5B is still advantageous.
  • The epipolar lines 32₁ and 32₂ are then inclined lines, but they make the same angle with respect to the lines of the two images.
  • FIG. 6 shows a possible constitution of the processing part of a device in the case of a terrain model using a stored grid. It will be assumed that the optical centers of the cameras 10 and 12 are arranged in the vicinity of the same straight line parallel to the scanning direction along the lines.
  • The memory 26 (FIG. 4) can comprise two memory spaces 26₁ and 26₂, corresponding to a packet of only a few lines of pixels, centered on the line of order i and its neighbors.
  • The processor can be viewed as having two processing channels 24₁ and 24₂ performing the projections and receiving, on respective inputs 34₁ and 34₂, the geometric models in digital form.
  • The processor also includes a unit 36 for comparing the projections, usually by determining an intercorrelation factor over each neighborhood and comparing it with a threshold.
  • The output of the unit 36 is applied to a block 38 for transferring the images to a recording device and possibly to a display device 40.
  • The processor 36 also has an output 40 controlling the passage from a packet of lines centered on line i to a packet centered on line i+1 when the comparison is complete. In other words, the processor drives the storage of line packets asynchronously.
  • The architecture shown in FIG. 6 makes it possible to limit the amount of fast write-and-read RAM required.
  • The invention is susceptible of numerous variant embodiments and of many applications other than those specifically mentioned.
  • The device can be used to locate a robot inside the scene, the robot constituting an anomaly.
  • The sensors may be placed on a displacement device (a rolling bridge carriage, for example) fitted with measuring means allowing the coordinates of the sensors relative to the terrain to be permanently available. It is thus possible, from a main memory storing a digital terrain model corresponding to the entire domain of evolution of the robot, to constantly update the processor's working memory to match the location of the sensors.
  • The device according to the invention has the advantage of being free from most of the faults of visualization systems mounted on the robots themselves, including the limitation of the field of vision in distance and the impossibility of observing the robot's zone of movement behind obstacles.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Claims (11)

  1. Method for monitoring a three-dimensional scene with a known or determinable geometric structure, enabling the appearance of a relief anomaly to be detected, comprising the following steps:
    establishing and storing a three-dimensional geometric reference model of the scene in a reference state, by associating with each point of the scene homologous points in the field of at least two sensors (10, 12) which are arranged at a distance from one another and are sensitive to a radiometric parameter (such as brightness) of the scene;
    periodically synthesizing the real scene on the three-dimensional geometric model from each of the representations of it supplied by the sensors, so as to produce a number of synthesized scenes equal to that of the sensors; and
    comparing neighborhoods belonging to the synthesized scenes with one another, in order to reveal deviations resulting from a modification of the three-dimensional geometric model, only those deviations being retained which correspond to a height anomaly exceeding a predetermined minimum value.
  2. Method according to claim 1, characterized in that the synthesized scenes are compared by determining intercorrelation coefficients between the two synthesized scenes over determined neighborhoods of the points of the scene, and in that the coefficients obtained are compared with a threshold.
  3. Method according to claim 1 or 2, characterized in that the geometric model of the scene is established from images of the scene supplied by the sensors in a reference state and stored pixel by pixel.
  4. Method according to claim 1 or 2, characterized in that the geometric model is formed in mathematical form from knowledge of the terrain provided by previous observations or with the aid of maps.
  5. Method according to claim 1 or 2, characterized in that the geometric model of the scene is established from landmark points with known coordinates (x, y, z) and an equation of a predetermined form.
  6. Method according to claim 1 or 2, characterized in that the geometric model is established from a grid which is laid over the scene by scanning with a laser and is registered by each of the sensors in the form of the location of the image point representing, at each laser pulse, the point of impact on the terrain.
  7. Method according to any one of the preceding claims, characterized in that successive images supplied by the sensors are furthermore compared in order to provide a temporal detection.
  8. Method according to any one of the preceding claims, characterized in that the sensors are at a mutual distance which is smaller than the distance between the sensors and the nearest points of the scene.
  9. Device for monitoring an illuminated three-dimensional scene with a known or determinable geometric structure, enabling relief anomalies in the scene to be detected, comprising at least two sensors (10, 12) sensitive to a radiometric parameter in comparable wavelengths, and further comprising: means for storing a purely geometric three-dimensional reference model of the scene to be monitored, associating with each point of the scene homologous points in the field of the sensors; means enabling the scenes to be synthesized, so as to produce a number of synthesized scenes equal to that of the sensors, each of the scenes being synthesized from the geometric model and from signals representative of the radiometric value of each of the image points supplied by one of the sensors, by projection onto the three-dimensional geometric model; and means for comparing neighborhoods of predetermined size over the whole of the synthesized scenes formed from the signals supplied by the different sensors, thereby making it possible to determine the deviations between the same neighborhoods in the two synthesized scenes.
  10. Device according to claim 9, characterized in that the sensors further comprise sensors operating in wavelength ranges different from those of the first two sensors.
  11. Monitoring device according to claim 10 for locating a mobile robot in the scene, characterized in that the sensors are carried by means for displacement relative to the scene and are connected to a memory for updating the stored geometric model as a function of the position of the sensors relative to the scene.
EP19930401661 1992-06-29 1993-06-28 Verfahren und Vorrichtung zur Überwachung einer dreidimensionalen Szene unter Verwendung von Bildsensoren Expired - Lifetime EP0577491B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR9207977A FR2693011B1 (fr) 1992-06-29 1992-06-29 Procédé et dispositif de surveillance d'une scène tridimensionnelle, mettant en Óoeuvre des capteurs d'imagerie.
FR9207977 1992-06-29

Publications (2)

Publication Number Publication Date
EP0577491A1 EP0577491A1 (de) 1994-01-05
EP0577491B1 true EP0577491B1 (de) 1999-03-17

Family

ID=9431309

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19930401661 Expired - Lifetime EP0577491B1 (de) 1992-06-29 1993-06-28 Verfahren und Vorrichtung zur Überwachung einer dreidimensionalen Szene unter Verwendung von Bildsensoren

Country Status (3)

Country Link
EP (1) EP0577491B1 (de)
DE (1) DE69323934T2 (de)
FR (1) FR2693011B1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10050083A1 (de) * 2000-10-10 2002-04-18 Sick Ag Vorrichtung und Verfahren zur Erfassung von Objekten

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CH691151A5 (fr) * 1994-06-09 2001-04-30 Edouard Menoud Dispositif de surveillance et d'alerte de la présence de corps en danger dans une piscine.
WO1997004428A1 (de) * 1995-07-20 1997-02-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Interaktives überwachungssystem
DE19621612C2 (de) * 1996-05-31 2001-03-01 C Vis Comp Vision Und Automati Vorrichtung zur Überwachung eines Gleisabschnittes in einem Bahnhof
DE19700811A1 (de) * 1997-01-13 1998-07-16 Heinrich Landert Verfahren und Vorrichtung zur Ansteuerung von Türanlage in Abhängigkeit von der Anwesenheit von Personen
DE19709799A1 (de) * 1997-03-10 1998-09-17 Bosch Gmbh Robert Einrichtung zur Videoüberwachung einer Fläche
DE19749136C2 (de) * 1997-11-06 2000-01-27 Geutebrueck Gmbh Verfahren und Vorrichtung zum Erfassen von Bewegungen
GB2352859A (en) * 1999-07-31 2001-02-07 Ibm Automatic zone monitoring using two or more cameras
DE10044689A1 (de) * 2000-09-08 2002-03-21 Idm Gmbh Infrarot Sensoren Vorrichtung zur Überwachung eines Bereichs eines Raumes
DE10330011B4 (de) * 2003-07-03 2005-05-12 Eads Deutschland Gmbh Verfahren zur Hinderniserkennung und Geländeklassifikation
TW200604047A (en) * 2004-07-22 2006-02-01 Siemens Ag Method to detect an obstruction on a railroad
ITUB20151930A1 (it) * 2015-07-03 2017-01-03 Soilmec Spa Sistema e metodo di sicurezza per la rilevazione di una condizione di rischio in una regione situata in prossimita' di una macchina operatrice, quale una macchina di perforazione o simile.

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2150724A (en) * 1983-11-02 1985-07-03 Christopher Hall Surveillance system
CA1284383C (en) * 1986-08-04 1991-05-21 Fmc Corporation Computer integrated gaging system
EP0402829A3 (de) * 1989-06-14 1991-06-12 Siemens Aktiengesellschaft Verfahren und Vorrichtung zum Detektieren eines Eindringlings mittels eines passiven Infrarot-Bewegungsmelders

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10050083A1 (de) * 2000-10-10 2002-04-18 Sick Ag Vorrichtung und Verfahren zur Erfassung von Objekten
US7116799B2 (en) 2000-10-10 2006-10-03 Sick Ag Apparatus and a method for the detection of objects

Also Published As

Publication number Publication date
DE69323934T2 (de) 1999-12-09
EP0577491A1 (de) 1994-01-05
DE69323934D1 (de) 1999-04-22
FR2693011B1 (fr) 1994-09-23
FR2693011A1 (fr) 1993-12-31

Similar Documents

Publication Publication Date Title
US8982363B2 (en) Method and apparatus to determine depth information for a scene of interest
Bodenmann et al. Generation of high‐resolution three‐dimensional reconstructions of the seafloor in color using a single camera and structured light
EP1364351B1 (de) Verfahren und einrichtung zum erkennung von fasern auf der grundlage von bildanalyse
CN103852754B (zh) 飞行时间(tof)测量系统中的干扰抑制的方法
CA2859900C (fr) Procede d'estimation de flot optique a partir d'un capteur asynchrone de lumiere
EP0577491B1 (de) Verfahren und Vorrichtung zur Überwachung einer dreidimensionalen Szene unter Verwendung von Bildsensoren
US10163256B2 (en) Method and system for generating a three-dimensional model
FR2908546A1 (fr) Stereo camera intrusion detection system
CN108027874A (zh) 使用深度摄像头的基于计算机视觉的安全系统
EP0146428A1 (de) Automatisches Steuerverfahren und Vorrichtung für bewegliche Körper, insbesondere für fahrerlose selbstangetriebene Wagen
WO2000052633A1 (en) System for recovery of degraded images
US20150373320A1 (en) Visual cognition system
WO2019158839A1 (fr) Systeme de signalement de depassement d'un seuil d'intensite sonore
FR2985070A1 (fr) Procede et systeme de detection de chutes de personnes
WO2012127312A1 (fr) Système de surveillance
EP3384462A1 (de) Verfahren zur charakterisierung einer szene durch berechnung der 3d-orientierung
Ablavsky Background models for tracking objects in water
Morel et al. Visual behavior based bio-inspired polarization techniques in computer vision and robotics
EP0604245B1 (de) Verfahren zur Erscheinungsdetektierung von Punktgegenständen in einem Bild
JP2012107943A (ja) 対象識別装置及び対象識別方法
FR3093613A1 (fr) Dispositif de contrôle de traitement agricole
FR2747199A1 (fr) Dispositif pour la localisation d'un objet mobile
EP3862980B1 (de) Verfahren zur herstellung einer virtuellen dreidimensionalen darstellung einer dammaussenschicht
WO2021255214A1 (fr) Procede et systeme de caracterisation geometrique de feux
WO2023052448A1 (fr) Procédé de détection d'obstacles

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE GB IT

17P Request for examination filed

Effective date: 19931221

17Q First examination report despatched

Effective date: 19970121

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE GB IT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT;WARNING: LAPSES OF ITALIAN PATENTS WITH EFFECTIVE DATE BEFORE 2007 MAY HAVE OCCURRED AT ANY TIME BEFORE 2007. THE CORRECT EFFECTIVE DATE MAY BE DIFFERENT FROM THE ONE RECORDED.

Effective date: 19990317

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 19990317

REF Corresponds to:

Ref document number: 69323934

Country of ref document: DE

Date of ref document: 19990422

GBV Gb: ep patent (uk) treated as always having been void in accordance with gb section 77(7)/1977 [no translation filed]

Effective date: 19990317

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20090611

Year of fee payment: 17

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110101