EP1189187B1 - Method and system for monitoring a predetermined area - Google Patents
Method and system for monitoring a predetermined area
- Publication number
- EP1189187B1 EP1189187B1 EP01119125.1A EP01119125A EP1189187B1 EP 1189187 B1 EP1189187 B1 EP 1189187B1 EP 01119125 A EP01119125 A EP 01119125A EP 1189187 B1 EP1189187 B1 EP 1189187B1
- Authority
- EP
- European Patent Office
- Prior art keywords
- detection device
- image detection
- area
- monitoring
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
- G08B13/19652—Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19669—Event triggers storage or change of storage policy
- G08B13/19697—Arrangements wherein non-video detectors generate an alarm themselves
Definitions
- the present invention relates to a method for monitoring a predetermined area having at least two zones, which represent a critical and a non-critical area and are separated by an alarm boundary, as well as to a corresponding system.
- video surveillance cameras are usually used to visualize the security area.
- several video surveillance cameras with different viewing areas are used here because, owing to the size of the security area, the field of view of a single video surveillance camera is not sufficient to cover it completely.
- an exemplary security area is divided into a critical and a non-critical area, which are separated by an alarm boundary.
- a variety of video surveillance cameras are used to monitor this alarm boundary.
- Each video surveillance camera is assigned a predetermined detection area, which visualizes a section of the security area and comprises part or all of the viewing area of the video surveillance camera, wherein an object entering a certain detection area is detected by the corresponding video surveillance camera. Furthermore, this video surveillance camera detects the complete movement of the object in its detection area, i.e. all position changes of the object are perceived. In the event that the object crosses the alarm boundary from the non-critical area in the direction of the critical area, the video surveillance camera triggers an alarm.
- Assuming that, for example, a security guard is to be able to carry out checks in the critical area without an alarm being triggered, the security guard may also cross the alarm boundary from the critical towards the non-critical area without triggering the alarm; yet an alarm would be triggered if the security guard then went back into the critical area. To avoid this, sensing the complete movement of the security guard in the coverage area of a video surveillance camera, i.e. so-called object tracking, allows the security guard to be recognized as permitted in the critical area because he had begun his surveillance mission there, and thus no alarm is triggered when the alarm boundary is crossed in this example.
- a major disadvantage of this method is that an alarm is nevertheless triggered in the event that the security guard crosses the alarm boundary from the critical towards the non-critical area in the detection area of a first video surveillance camera and then returns across the alarm boundary into the critical area in the detection area of a second video surveillance camera.
- WO 0008856 discloses a surveillance system with multiple cameras. Based on the known prior art, it is the object of the present invention to provide a way to trigger an alarm, when monitoring a predetermined area, with increased decision reliability when an unauthorized object enters a critical area.
- the object of the present invention is achieved by a method for monitoring a predetermined area on the basis of images, wherein the images are generated by at least a first and a second image capture device and the predetermined area has at least two zones, which represent a critical and a non-critical area and are separated by an alarm boundary.
- each image detection device is assigned a predetermined detection range, and an object entering the detection range of an image detection device is detected by the corresponding image detection device.
- This image capture device records data describing changes in the position of the object in the corresponding coverage area.
- An object entering the detection area of the first image capture device is detected by the first image capture device, which records data describing changes in the position of the object in the corresponding detection area, and these data are transferred from the first image capture device to the second image capture device.
- An advantage of the transfer according to the invention of the acquisition data of a first image capture device to a second image capture device is that the second image capture device has more data regarding the tracked object, so that the decision reliability in an alarm triggering is significantly increased. Furthermore, this object transfer between the monitoring devices enables seamless monitoring of an object that moves through the detection areas of several image capture devices.
- the detection regions of the first and second image capture devices are preferably designed such that they overlap in an overlap region or adjoin one another.
- the first image capture device transfers the data it has recorded, unrequested, to the second image capture device in the event that the first image capture device determines that the object will enter the detection area of the second image capture device.
- alternatively, the second image capture device may request the first image capture device to transfer the data recorded with respect to the object.
- the first image capture device transfers the data recorded by it to an evaluation device which is connected via a network to the first and the second image capture device and which in turn transfers the data to the second image capture device.
- the evaluation device can transfer its data unrequested to the second image capture device in the event that it is determined that the object will enter the capture area of the second image capture device.
- in the event that the second image capture device detects an object which enters the assigned detection area and which was previously in the detection area of the first image capture device, the second image capture device can request the evaluation device to transfer the recorded data.
- an alarm is triggered when the object, starting from the non-critical region in the direction of the critical region, crosses the alarm boundary.
- no alarm is triggered if the object in the detection area of the second image capture device has crossed the alarm boundary from the non-critical area in the direction of the critical area and has previously crossed the alarm boundary from the critical area in the direction of the non-critical area in the detection area of the first image capture device.
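The alarm logic just described can be illustrated with a minimal sketch (not part of the patent text). It assumes, purely for illustration, that each tracked object carries a small history of alarm-boundary crossings that is handed over between image capture devices, and that an alarm is suppressed when a matching crossing from the critical to the non-critical area was recorded earlier:

```python
# Minimal sketch (illustrative only, not the patented implementation):
# an object keeps its crossing history across camera handovers, and an
# alarm is raised only for an "unmatched" crossing into the critical area.

def should_trigger_alarm(crossing_history, new_crossing):
    """crossing_history: list of past crossings, each either
    'critical_to_noncritical' or 'noncritical_to_critical'.
    new_crossing: the crossing that has just been detected."""
    if new_crossing != "noncritical_to_critical":
        return False  # leaving the critical area never triggers an alarm
    # Suppress the alarm if the object previously left the critical area,
    # e.g. a guard returning from a legitimate check; the handover keeps
    # this history even when the two crossings happen in the detection
    # areas of different cameras.
    if "critical_to_noncritical" in crossing_history:
        return False
    return True


# Guard left the critical area in camera 1, returns in camera 2: no alarm.
print(should_trigger_alarm(["critical_to_noncritical"], "noncritical_to_critical"))  # False
# Intruder enters the critical area for the first time: alarm.
print(should_trigger_alarm([], "noncritical_to_critical"))  # True
```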
- the data describing the change in position of the object in the detection area of an image capture device comprises at least information representing a motion vector, a mean velocity, and motion direction components.
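As a rough illustration of what such handed-over data could look like, the following sketch bundles the quantities named above into one record; all field names and units are assumptions for illustration and are not taken from the patent:

```python
# Hypothetical handover payload for the "visual handover" between two
# image capture devices; field names and units are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class HandoverData:
    object_id: int                       # locally assigned track identifier
    motion_vector: tuple                 # (dx, dy) between the last two center points
    mean_velocity: float                 # average speed over the object's lifetime
    direction_h: float                   # horizontal movement direction component
    direction_v: float                   # vertical movement direction component
    crossing_history: list = field(default_factory=list)
    entry_section: tuple = None          # expected entry section of the receiving
                                         # camera's detection area (x, y, w, h)

payload = HandoverData(
    object_id=7,
    motion_vector=(4.0, -1.0),
    mean_velocity=1.3,
    direction_h=1.0,
    direction_v=-0.25,
    crossing_history=["critical_to_noncritical"],
    entry_section=(0, 100, 40, 60),
)
print(payload.object_id, payload.mean_velocity)
```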
- on the basis of the data describing the changes in position of the object in the detection area of the first image capture device, area data can be determined which represent a section of the detection area of the second image capture device into which the object will enter from the detection area of the first image capture device.
- the transferred data preferably include information that represents the section of the detection area of the second image capture device into which the object will enter.
- for an object to be detected in a detection area, a so-called detection threshold must be exceeded, wherein in the section of the detection area of the second image capture device into which the object will enter, the detection threshold is lowered on the basis of the data transferred by the first image capture device.
- An advantage of lowering the detection threshold is that the detection reliability for objects in the detection areas of image capture devices to which data is transferred from other image capture devices is increased. This is particularly advantageous for the case where an object is very well detected in the detection area of one image capture device, e.g. due to different visibility conditions, while it is difficult to detect in the detection area of another, adjacent image capture device.
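A minimal sketch of this threshold lowering, assuming the detection threshold is kept as a per-pixel threshold map for the second camera's detection area and the expected entry section is given as a rectangle (both assumptions for illustration):

```python
import numpy as np

# Illustrative threshold values; the patent does not prescribe concrete numbers.
DEFAULT_THRESHOLD = 30
LOWERED_THRESHOLD = 15

def build_threshold_map(height, width, entry_rect=None):
    """Return a per-pixel detection-threshold map; the threshold is lowered
    inside the rectangle (x, y, w, h) into which the handed-over object is
    expected to enter, so that it is detected more reliably there."""
    thresholds = np.full((height, width), DEFAULT_THRESHOLD, dtype=np.uint8)
    if entry_rect is not None:
        x, y, w, h = entry_rect
        thresholds[y:y + h, x:x + w] = LOWERED_THRESHOLD
    return thresholds

t = build_threshold_map(240, 320, entry_rect=(0, 100, 40, 60))
print(int(t.min()), int(t.max()))  # 15 30
```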
- the data transferred from the first image capture device to the second image capture device can be used to perform an identification of the object, which is considerably simplified on the basis of the transferred data.
- Fig. 1 shows a side view of an arrangement of monitoring devices 1, 2 according to a preferred embodiment of the present invention.
- the monitoring devices 1, 2 can be designed as conventional video surveillance cameras or as infrared or thermal imaging cameras.
- the monitoring devices 1, 2 are preferably respectively mounted on masts 3, 4, which are firmly anchored in a terrain to be monitored 8.
- the monitoring devices 1, 2 can be attached to any, rigid or movable objects, for example to building parts.
- both surveillance cameras each have a so-called viewing area 6, 7 of a length 11.
- the length 11 of the viewing areas 5, 6, 7 depends on an orientation angle of the monitoring devices 1, 2, as well as on the detection properties of the corresponding monitoring devices 1, 2.
- the detection properties correspond, for example, to the optical properties of the camera optics used.
- the viewing areas 5, 6, 7 of various monitoring devices 1, 2 are such that they overlap each other by predetermined lengths 9, 10, so that a complete monitoring of the area to be monitored is made possible.
- Fig. 2 shows a three-dimensional illustration of the viewing areas 6, 7 of the monitoring devices 1, 2 from Fig. 1.
- the viewing areas 6, 7 are shown for clarity with a rectangular base having a base side of length 11, although in reality these viewing areas 6, 7 will as a rule be formed as so-called "viewing cones".
- the viewing areas 6, 7 preferably overlap one another in such a way that larger objects, which enter the viewing area 7, for example, starting from the viewing area 6, can also be completely detected in the overlapping area of the viewing areas 6, 7 of the length 10.
- Fig. 3 shows a plan view of the arrangement of the monitoring devices 1, 2 according to Fig. 1 in a first embodiment.
- the monitoring devices 1, 2 are used in the illustrated embodiment, for example, to prevent unauthorized approach of objects to a runway 12 on an airport site.
- a control line KL1 is monitored by means of the monitoring devices 1, 2, and when an object, starting from a permitted area EB, crosses this control line KL1 in the direction of the runway 12, an alarm is triggered.
- if a person or a vehicle moves along a direction indicated by an arrow 13 from the permitted area EB and crosses the control line KL1 in the direction of the runway 12, an alarm is triggered when the control line KL1 is crossed at point 16.
- the corresponding object first enters the viewing area 6 of the monitoring device 1 and then traverses this viewing area 6 and the overlapping area that this viewing area 6 forms with the viewing area 7 of the monitoring device 2.
- the object is detected by the monitoring device 1 at the time it enters the viewing area 6 and its movement is tracked by means of object tracking to the point where the object enters the overlapping area.
- Object tracking means that the complete course of movement of the object is recorded in the field of view 6, 7 of the monitoring device 1, 2.
- the object is handed over from the monitoring device 1 to the monitoring device 2 by means of so-called "visual handover" or object transfer and is subsequently tracked further by the monitoring device 2 up to the point 16, where the alarm is triggered.
- visual handover or object transfer means that the data recorded by the monitoring device 1 with respect to the object is provided to the monitoring device 2.
- a second variant is indicated by an arrow 14, which illustrates that an arbitrary object, such as a person or a vehicle, again moves in the direction of the arrow 14 and crosses the control line KL1, starting from the permitted area EB, in the direction of the runway 12.
- in this case, the alarm is triggered at point 17.
- the corresponding object enters the viewing area 6 of the monitoring device 1 and leaves this viewing area 6 again before entering the viewing area 7 of the monitoring device 2 and then crossing the control line KL1 at point 17.
- when the object enters the viewing area 7 of the monitoring device 2, it is determined that this object previously traversed the viewing area 6 of the monitoring device 1, whereupon the object is handed over from the monitoring device 1 to the monitoring device 2.
- an advantage of the object transfer according to the invention is that the alarm-triggering monitoring device 1, 2 has more data regarding the tracked object than in the case of object tracking by a single monitoring device 1, 2, so that the decision reliability in alarm triggering is significantly increased. For example, approximately 75% of the movement of the object moving along the arrow 13 takes place in the viewing area 6 of the monitoring device 1 and only 25% in the viewing area 7 of the monitoring device 2, which triggers the alarm.
- a third variant is shown by an arrow 15.
- An object traversing the path described by the arrow 15 could, for example, be an employee of the control personnel who walks the area between the control line KL1 and the runway 12 in order to remove any disturbing objects.
- the space between the control line KL1 and the runway 12 can be defined as a restricted permitted area EEB in which certain objects are allowed to be present.
- In order to prevent such a triggering of a false alarm, the monitoring device 2 must know the "history" of the control employee, i.e., in the present example, the path the employee of the control personnel has traveled along the arrow 15 in the viewing area 6 of the monitoring device 1. Accordingly, if the object tracking of the control employee in both viewing areas 6, 7, by both the monitoring device 1 and the monitoring device 2, is taken into account in a decision, it can be determined that the control employee first crossed the control line KL1 out of this space and then returned into this space by crossing the control line KL1 again. That is, in the present example no alarm is triggered because the crossing of the control line KL1 is a legitimate crossing.
- an essential advantage of the present invention is that, in an area to be monitored which is divided into several viewing areas 6, 7 of several monitoring devices 1, 2, any object can be monitored completely by means of object transfer between the monitoring devices 1, 2, and the triggering of an alarm can, where necessary, be decided with a very high decision reliability.
- Another significant advantage of the object transfer is that the detection reliability of objects in the viewing areas 6, 7 of the monitoring devices 1, 2 is increased, which will be explained in more detail below.
- An object enters, for example, the viewing area 6, where it is detected by the monitoring device 1 and wherein changes in the position of the object are tracked and corresponding data is recorded.
- a notification is sent from the monitoring device 1 to the monitoring device 2.
- the notification preferably comprises information regarding the direction in which the object is moving and the recorded data, so that the monitoring device 2 can determine at which location the object will enter the viewing area 7.
- the monitoring device 2 can thus more easily detect the object when it enters the viewing area 7.
- for the detection of an object, a detection threshold must be exceeded; this threshold value is preferably lowered in the monitoring device 2 on the basis of the data sent by the monitoring device 1, so that the detection threshold is reduced accordingly.
- the lowering of the threshold value can take place for a sub-zone of the field of vision 7 into which the object detected by monitoring device 1 is likely to enter, or for the entire field of vision 7.
- a monitoring device 1, 2 which detects the entry of an object into its viewing area 6, 7 and tracks changes in position of this object can record and analyze data in order to determine whether the object has possibly entered from the viewing area 6, 7 of another monitoring device 1, 2.
- the corresponding monitoring device 1, 2 can then request data about the motion sequence of the object in the viewing area 6, 7 of this other monitoring device 1, 2 from the other monitoring device 1, 2 thus determined.
- Fig. 4 shows an illustration of an arrangement of monitoring devices 19, 20 according to another preferred embodiment of the present invention.
- the monitoring devices 19, 20 can in turn be designed as conventional video surveillance cameras or as infrared or thermal imaging cameras.
- the monitoring devices 19, 20 are preferably each mounted on masts, which are firmly anchored in the ground.
- the monitoring devices 19, 20 can be attached to any, rigid or movable objects, for example to building parts.
- the monitoring devices 19, 20 serve, for example, to monitor a prison building.
- both surveillance cameras each have a viewing area 21, 22.
- the size of the monitorable viewing area 21, 22 depends on the orientation of the monitoring devices 19, 20, as well as on the detection characteristics of the corresponding monitoring devices 19, 20.
- the detection characteristics correspond to the optical properties of the camera optics used.
- objects detected in both viewing areas 21, 22 of the monitoring devices 19, 20 are tracked.
- for example, no alarm is to be triggered when an object such as a security guard passes from a fire escape 25 into an allowed area EB.
- by contrast, an alarm is to be triggered when an object, such as an escapee, passes from a window 24 into the allowed area EB.
- Fig. 5 shows an illustrative block diagram of an embodiment of an inventive system for image analysis.
- a plurality of surveillance cameras 1, 2 preferably each provide a video signal for monitoring the viewing area covered by the corresponding camera 1, 2.
- the video signals of the cameras 1, 2 are each fed to an analog/digital converter 29 and digitized.
- the camera 1, 2 may be a conventional video surveillance camera or an infrared or thermal imaging camera.
- the digitized pixel data of the video signals of the respective cameras 1, 2 are respectively stored in a memory device 30, wherein a memory device 30 preferably comprises two different image memories, or one image memory having two different memory areas. In the first image memory or first memory area, the digitized pixel data representing a respective current image are stored, and in the second image memory or second memory area, the digitized pixel data representing a reference image are stored.
- in an image difference determining means 31, the digitized pixel data of the current image are compared with those of the reference image of each video signal in order to determine differences between the two images respectively generated by one and the same camera 1, 2.
- in this way, a binary image can be generated in which pixels with the binary value "0" represent pixels whose data are unchanged with respect to the pixel data of the respective reference image, while pixels with the binary value "1" represent marked pixels, i.e. pixels of the current image for which an image change has been detected relative to the pixel data of the reference image.
- the generated binary images are examined in an object determining device 32 for coherent marked pixels, wherein all contiguous marked pixels are assigned to one object, i.e. objects are extracted from the binary images. Accordingly, an object corresponds to a contiguous image area that has changed within a certain time period, which depends on the refresh cycle of the second image memory or the second memory area, respectively.
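A minimal sketch of such an object extraction, assuming 4-connectivity and a simple flood fill; the object extractor described in the patent is only characterized functionally, so this is just one possible realization:

```python
import numpy as np

def extract_objects(binary_image):
    """Group contiguous marked pixels (value 1) into objects and return a list
    of bounding rectangles as (x, y, width, height, marked_pixel_count)."""
    visited = np.zeros_like(binary_image, dtype=bool)
    objects = []
    h, w = binary_image.shape
    for sy in range(h):
        for sx in range(w):
            if binary_image[sy, sx] != 1 or visited[sy, sx]:
                continue
            stack, pixels = [(sy, sx)], []
            visited[sy, sx] = True
            while stack:                      # flood fill over 4-neighbours
                y, x = stack.pop()
                pixels.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and binary_image[ny, nx] == 1 and not visited[ny, nx]:
                        visited[ny, nx] = True
                        stack.append((ny, nx))
            ys = [p[0] for p in pixels]
            xs = [p[1] for p in pixels]
            objects.append((min(xs), min(ys),
                            max(xs) - min(xs) + 1, max(ys) - min(ys) + 1,
                            len(pixels)))
    return objects

img = np.zeros((6, 8), dtype=np.uint8)
img[1:3, 1:5] = 1                             # one contiguous changed region
print(extract_objects(img))                   # [(1, 1, 4, 2, 8)]
```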
- Object data of the extracted objects are stored in an object list, the objects being defined, for example, as a rectangle or the like circumscribing the maximum horizontal and vertical extent of the marked pixel area.
- the current object list is compared with a stored object list of the previous image and updated. In this case, the objects extracted from the current binary image are assigned to the objects found in the previous image by a plausibility check, such as checking for minimum distance, similar shape or the like, and objects to which no object has been assigned over a certain period of time are deleted again.
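A minimal sketch of such a plausibility check, reduced to the minimum-distance criterion and assuming objects are represented only by their center points (shape comparison and the deletion timeout are omitted):

```python
import math

def match_objects(current_centers, previous_centers, max_distance=20.0):
    """Assign each object of the current image to the nearest object of the
    previous image, provided that nearest neighbour is plausibly close.
    Returns a dict: index in current list -> index in previous list or None."""
    assignment = {}
    for i, (cx, cy) in enumerate(current_centers):
        best, best_d = None, max_distance
        for j, (px, py) in enumerate(previous_centers):
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = j, d
        assignment[i] = best      # None means: treat as a newly appeared object
    return assignment

print(match_objects([(12, 10), (200, 40)], [(10, 9), (120, 120)]))
# {0: 0, 1: None}
```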
- the object data thus generated are evaluated in an evaluation device 33 for detecting alarm-relevant objects and for alarm triggering.
- data is preferably calculated which results from the difference between a detection point, for example the center point of a new object, and a stored center point of an assigned object of the preceding image. Based on this data, the current object list can be updated.
- an arbitrary object can thus be tracked in the field of view of a camera 1, 2 over the entire period of its detection by calculating data preferably comprising a distance traveled, a horizontal and a vertical direction component, and an average velocity of the object, taking into account its lifetime so far.
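A minimal sketch of how such motion data could be derived, assuming the tracker stores one center point per processed image and the time step dt between images is known (both assumptions):

```python
import math

def motion_data(track, dt=1.0):
    """track: list of (x, y) center points, one per processed image.
    Returns the distance travelled s, the horizontal/vertical direction
    components of the last step, the lifetime T so far, and the average
    velocity v over that lifetime."""
    if len(track) < 2:
        return None
    s = sum(math.hypot(track[i + 1][0] - track[i][0],
                       track[i + 1][1] - track[i][1])
            for i in range(len(track) - 1))
    r_h = track[-1][0] - track[-2][0]     # horizontal direction component
    r_v = track[-1][1] - track[-2][1]     # vertical direction component
    lifetime = (len(track) - 1) * dt
    return {"s": s, "R_H": r_h, "R_V": r_v, "T": lifetime, "v": s / lifetime}

print(motion_data([(0, 0), (3, 4), (6, 8)], dt=0.5))
# {'s': 10.0, 'R_H': 3, 'R_V': 4, 'T': 1.0, 'v': 10.0}
```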
- the digitized pixel data corresponding to the object are respectively read from the first image memory or first memory area, wherein image content features for the object are extracted from the read-out image section by known image processing methods.
- as features, for example, the size of the extracted rectangle and the number of marked pixels found within the rectangle are used. All features of the extracted and tracked objects are compared with required feature criteria stored in a storage device 34, and preferably an alarm is triggered when all criteria are met.
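A minimal sketch of such a criteria check, assuming the stored feature criteria are simple minimum/maximum bounds on the extracted features (the criterion names and values are illustrative):

```python
# Illustrative alarm check: all stored feature criteria must be met before
# an alarm is triggered; names and bounds are assumptions, not patent values.
CRITERIA = {
    "width":       (10, 200),      # rectangle width in pixels (min, max)
    "height":      (20, 300),      # rectangle height in pixels
    "pixel_count": (50, 10000),    # marked pixels inside the rectangle
    "velocity":    (0.2, 15.0),    # average velocity of the tracked object
}

def is_alarm_object(features):
    return all(lo <= features[name] <= hi for name, (lo, hi) in CRITERIA.items())

obj = {"width": 35, "height": 80, "pixel_count": 900, "velocity": 1.4}
print(is_alarm_object(obj))        # True -> trigger alarm, switch camera to monitor
```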
- when an alarm is triggered, the camera 1, 2 in whose field of view the corresponding object was detected is selected, and the corresponding video signal of the current image of the selected camera 1, 2 is connected by a switch 48 to a monitor 35, on which the alarm objects are displayed with the corresponding vectors.
- Fig. 6 shows a detailed view of the memory device 30 from Fig. 5.
- the memory device 30 preferably comprises two image memories 36, 37, the digitized pixel data of the current image being stored in the image memory 36. Furthermore, at predetermined intervals the digitized pixel data are also stored in the second image memory 37 and are used as the reference image until new pixel data are stored in the image memory 37.
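A minimal sketch of this double-buffered storage, assuming the reference image is simply refreshed every N processed frames (the refresh rule is an assumption; the patent only states that it happens at intervals):

```python
import numpy as np

class FrameStore:
    """Holds the current image (image memory 36) and a reference image
    (image memory 37); the reference is refreshed at fixed intervals."""
    def __init__(self, refresh_every=50):
        self.current = None                  # corresponds to image memory 36
        self.reference = None                # corresponds to image memory 37
        self.refresh_every = refresh_every
        self._count = 0

    def push(self, frame):
        self.current = frame
        if self.reference is None or self._count % self.refresh_every == 0:
            self.reference = frame.copy()    # becomes the new reference image
        self._count += 1

store = FrameStore(refresh_every=2)
for value in (10, 11, 12):
    store.push(np.full((2, 2), value, dtype=np.uint8))
print(int(store.current[0, 0]), int(store.reference[0, 0]))  # 12 12
```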
- the digitized pixel data from the image memories 36, 37 are compared with each other in the image difference determining means 31. Furthermore, the digitized pixel data can be read out of the image memory 36 by the evaluation device 33 for feature extraction with respect to alarm-relevant objects.
- Fig. 7 shows a detailed view of the image difference determining means 31 from Fig. 5.
- the image difference determination device 31 preferably comprises a subtraction device 38, an absolute value formation device 39 and a threshold value comparison device 40.
- in the subtraction device 38, the digitized pixel data of the current image are compared with the digitized pixel data of the reference image, and differences are determined for respective pixels corresponding to each other. From these differences, absolute values are formed for the individual pixels in the absolute value formation device 39, which are compared in the threshold value comparison device 40 with a predetermined threshold that represents a decision threshold for a pixel change. By this decision threshold, for example, changes caused by signal noise are eliminated. When the threshold value is exceeded, a binary value "1" is generated for the corresponding pixel, i.e. an image change has been detected for the pixel and this pixel is therefore noted or marked. If the value falls below the threshold, a binary value "0" is assigned to the pixel.
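A minimal sketch of this subtraction / absolute value / threshold chain using NumPy; the concrete decision threshold is an illustrative assumption:

```python
import numpy as np

def binary_difference(current, reference, threshold=30):
    """Combine subtraction device 38, absolute value formation device 39 and
    threshold value comparison device 40 in one step: pixels whose absolute
    difference exceeds the decision threshold are marked with 1, others with 0."""
    diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

reference = np.zeros((3, 3), dtype=np.uint8)
current = reference.copy()
current[1, 1] = 200                    # a strong change in a single pixel
print(binary_difference(current, reference))
# [[0 0 0]
#  [0 1 0]
#  [0 0 0]]
```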
- the object determination device 32 preferably comprises a binary image memory 41, an object extractor 42 and an object correlator 43.
- in the binary image memory 41, the binary image generated in the image difference determining means 31 is stored. This binary image is examined by the object extractor 42 for contiguous marked pixels; objects are thus extracted and the corresponding object data are stored in an object list.
- a significant advantage of the determination of objects is that in the further course of processing no longer individual pixels, but only the extracted objects are used, which increases the processing speed considerably.
- the current object list is compared with a stored object list of the previous image and updated, wherein the extracted from the current binary image objects are assigned to the objects found in the previous image by plausibility check.
- Fig. 9 shows a detailed view of the evaluation device 33 from Fig. 5.
- the evaluation device 33 preferably comprises an object tracking device 44, a feature extraction device 45 and an alarm object checking device 46.
- the object tracking device 44 calculates data that is used to track an object in the field of view of a corresponding camera 1, 2. Furthermore, in the event that an object leaves the viewing area of a camera 1, 2, the object tracking device 44 determines whether the object enters the field of view of another camera 1, 2. The object tracking device 44 can thus send a signal to the selection device 28 to cause the video signal of the corresponding camera 1, 2 to be switched to the monitor 35 by means of the switch 48.
- the feature extraction device 45 reads out the image data in the area of alarm-relevant object rectangles from the first image memory 36 of Fig. 6 and extracts, in this image section, image content features for a corresponding object according to known image processing methods. This feature extraction, however, only takes place for alarm-relevant objects, i.e. for objects that have a predetermined direction, size, speed, etc.
- the features of the extracted and tracked objects are compared with the required feature criteria stored in memory device 34.
- Fig. 10 shows a block diagram of another variant of the system according to Fig. 5, in which a reduction stage 47 is interposed between the analog/digital converter 29 and the memory device 30.
- the reduction stage 47 serves to reduce the amount of data of a video signal, for example by combining individual pixel data in groups into new pixel data which are stored in the memory device 30.
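A minimal sketch of such a reduction stage, assuming 2x2 blocks of pixels are combined into one new pixel by summation (block size and combination rule are assumptions):

```python
import numpy as np

def reduce_frame(frame, block=2):
    """Combine block x block groups of pixel data into one new pixel (here by
    summing), reducing the amount of video data to be stored downstream."""
    h, w = frame.shape
    h, w = h - h % block, w - w % block          # drop incomplete edge blocks
    view = frame[:h, :w].reshape(h // block, block, w // block, block)
    return view.sum(axis=(1, 3))

frame = np.arange(16, dtype=np.uint16).reshape(4, 4)
print(reduce_frame(frame))
# [[10 18]
#  [42 50]]
```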
- Fig. 11 shows a memory section of the binary image memory 41 from Fig. 8.
- the memory section is shown in the form of a two-dimensional coordinate system 49 with the x-axis in the horizontal direction and the y-axis in the vertical direction. Marked pixels are marked with crosses, i.e. "X".
- two marked pixel areas 50, 51 that have changed with respect to the reference image have been extracted as objects 1 and 2.
- the extracted objects are rectangular, with object 1 corresponding to the pixel area 50 having a height H1 and a width B1, and object 2 corresponding to the pixel area 51 having a height H2 and a width B2.
- for each extracted object, data can be determined which comprise the coordinates x, y of the corresponding object center point, the respective object height H, the respective object width B and the number Px of the marked binary pixels. These data are entered in an object list 52.
- in the example shown, one of the extracted objects has a height of 2 pixels and a width of 4 pixels and comprises a total of 5 marked pixels.
- Fig. 12 shows an updated object list in which the data computed by the object tracker 44 has been supplemented.
- the current detection center point of an object is represented by the coordinates x_n and y_n, and the last stored center point of the object by the coordinates x_n-1 and y_n-1.
- the values H_n-1, B_n-1 and Px_n-1 give the last stored height, width and number of marked pixels of the object.
- the updated object list is supplemented by the determined values for the magnitude of a motion vector s, a mean velocity v, the previous duration of presence T, and the movement direction components R_H and R_V.
- the object 1 has, for example, a current detection center point (2; 0).
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Alarm Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Burglar Alarm Systems (AREA)
Claims (17)
- Method for monitoring a predetermined area on the basis of images which are generated by at least a first and a second image capture device (1, 2), according to which the predetermined area comprises at least two zones which represent a critical area (12) and a non-critical area (EB) separated from one another by an alarm boundary (KL1), and according to which
each image capture device is assigned a predetermined detection area (6, 7),
an object entering the detection area (6) of the first image capture device (1) is detected by the first image capture device when a detection threshold predetermined for the detection area of the first image capture device is exceeded, according to which data describing changes in the position of the object in the corresponding detection area are recorded by the first image capture device, and
according to which these data are transferred from the first image capture device to the second image capture device (2), and a detection threshold predetermined for the detection area of the second image capture device is lowered if the object leaves the detection area of the first image capture device in the direction of the detection area of the second image capture device.
- Method for monitoring a predetermined area according to claim 1, characterized in that the detection areas (6, 7) of the first and second image capture devices (1, 2) overlap in an overlap region or adjoin one another.
- Method for monitoring a predetermined area according to claim 1 or claim 2, characterized in that the first image capture device (1) transfers the data unrequested to the second image capture device (2) in the event that the first image capture device determines that the object will enter the detection area (7) of the second image capture device.
- Method for monitoring a predetermined area according to claim 1 or claim 2, characterized in that the second image capture device detects an object entering the associated detection area (7) and, in the event that the second image capture device determines that the object was previously in the detection area (6) of the first image capture device (1), requests the first image capture device to transfer the recorded data to it.
- Method for monitoring a predetermined area according to claim 1 or claim 2, characterized in that the first image capture device (1) transfers the data to a data processing device which is connected via a network to the first and the second image capture device (2), and this data processing device transfers the data to the second image capture device.
- Method for monitoring a predetermined area according to claim 5, characterized in that the data processing device transfers the data unrequested to the second image capture device (2) in the event that it is determined that the object will enter the detection area (7) of the second image capture device.
- Method for monitoring a predetermined area according to claim 5, characterized in that the second image capture device (2) detects an object which enters the associated detection area (7) and, in the event that the second image capture device determines that the object was previously in the detection area (6) of the first image capture device (1), requests the data processing device to transfer the recorded data to it.
- Method for monitoring a predetermined area according to one of claims 1 to 7, characterized in that an alarm is triggered when the object crosses the alarm boundary (KL1) from the non-critical area (EB) in the direction of the critical area (12).
- Method for monitoring a predetermined area according to one of claims 1 to 8, characterized in that no alarm is triggered when the object, in the detection area (7) of the second image capture device (2), crosses the alarm boundary (KL1) from the non-critical area (EB) in the direction of the critical area (12) and has previously, in the detection area (6) of the first image capture device (1), crossed the alarm boundary from the critical area in the direction of the non-critical area.
- Method for monitoring a predetermined area according to one of claims 1 to 9, characterized in that the image capture devices (1, 2) are designed as video surveillance cameras, infrared cameras or thermal imaging cameras.
- Method for monitoring a predetermined area according to one of claims 1 to 10, characterized in that the data describing the changes in the position of the object in the detection area (6, 7) of an image capture device (1, 2) comprise at least information representing a motion vector, a mean velocity and movement direction components.
- Method for monitoring a predetermined area according to one of claims 1 to 11, characterized in that, on the basis of the data describing the changes in position of the object in the detection area (6) of the first image capture device (1), area data are determined which represent a section of the detection area of the second image capture device (2) into which the object will enter.
- Method for monitoring a predetermined area according to one of claims 1 to 12, characterized in that the data transferred from the first to the second image capture device (1, 2) comprise information representing the section of the detection area (7) of the second image capture device into which the object will enter.
- Method for monitoring a predetermined area according to claim 5 or claim 6, characterized in that the detection threshold value predetermined for the detection area (7) of the second image capture device (2) is lowered for a partial zone of the detection area of the second image capture device into which the object will enter.
- Method for monitoring a predetermined area according to claim 2, characterized in that the second image capture device (2) uses the data recorded by the first image capture device (1) describing the changes in the position of the object in the overlap region for an identification of the object.
- System for monitoring a predetermined area by means of at least a first and a second image capture device (1, 2), the predetermined area having at least two zones which represent a critical area (12) and a non-critical area (EB) separated from one another by an alarm boundary (KL1), the first and the second image capture device each comprising: a device for assigning a predetermined detection area (6, 7), a capture device for capturing or detecting an object which enters the associated detection area, a recording device for recording data which describe changes in the position of the object in the associated detection area, the system further comprising: a device for transferring the data recorded by the first image capture device (1) to the second image capture device (2); and the system being designed to carry out a method according to one of claims 1 to 15.
- System for monitoring a predetermined area according to claim 16, characterized in that the image capture devices (1, 2) are designed as video surveillance cameras, infrared cameras or thermal imaging cameras.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10042935 | 2000-08-31 | ||
DE2000142935 DE10042935B4 (de) | 2000-08-31 | 2000-08-31 | Verfahren zum Überwachen eines vorbestimmten Bereichs und entsprechendes System |
Publications (3)
Publication Number | Publication Date |
---|---|
EP1189187A2 EP1189187A2 (fr) | 2002-03-20 |
EP1189187A3 EP1189187A3 (fr) | 2009-05-27 |
EP1189187B1 true EP1189187B1 (fr) | 2016-02-03 |
Family
ID=7654518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP01119125.1A Expired - Lifetime EP1189187B1 (fr) | 2000-08-31 | 2001-08-08 | Procédé et système de surveillance d'une zône prédéterminée |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1189187B1 (fr) |
DE (1) | DE10042935B4 (fr) |
ES (1) | ES2563452T3 (fr) |
NO (1) | NO329869B1 (fr) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7221775B2 (en) | 2002-11-12 | 2007-05-22 | Intellivid Corporation | Method and apparatus for computerized image background analysis |
US8547437B2 (en) | 2002-11-12 | 2013-10-01 | Sensormatic Electronics, LLC | Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view |
DE10310636A1 (de) | 2003-03-10 | 2004-09-30 | Mobotix Ag | Überwachungsvorrichtung |
US7286157B2 (en) | 2003-09-11 | 2007-10-23 | Intellivid Corporation | Computerized method and apparatus for determining field-of-view relationships among multiple image sensors |
US7280673B2 (en) | 2003-10-10 | 2007-10-09 | Intellivid Corporation | System and method for searching for changes in surveillance video |
US7346187B2 (en) | 2003-10-10 | 2008-03-18 | Intellivid Corporation | Method of counting objects in a monitored environment and apparatus for the same |
FI117662B (fi) * | 2004-06-29 | 2006-12-29 | Videra Oy | AV-järjestelmä sekä ohjain |
KR20070045223A (ko) * | 2004-08-05 | 2007-05-02 | 마쯔시다덴기산교 가부시키가이샤 | 감시 장치 및 그를 컴퓨터에서 실행시키기 위한 프로그램을 저장한 저장매체. |
JP4829290B2 (ja) | 2005-03-25 | 2011-12-07 | センサーマティック・エレクトロニクス・エルエルシー | インテリジェントなカメラ選択および対象追跡 |
US9036028B2 (en) | 2005-09-02 | 2015-05-19 | Sensormatic Electronics, LLC | Object tracking and alerts |
US7671728B2 (en) | 2006-06-02 | 2010-03-02 | Sensormatic Electronics, LLC | Systems and methods for distributed monitoring of remote sites |
US7825792B2 (en) | 2006-06-02 | 2010-11-02 | Sensormatic Electronics Llc | Systems and methods for distributed monitoring of remote sites |
JP4318724B2 (ja) | 2007-02-14 | 2009-08-26 | パナソニック株式会社 | 監視カメラ及び監視カメラ制御方法 |
WO2017060083A1 (fr) * | 2015-10-06 | 2017-04-13 | Philips Lighting Holding B.V. | Système de comptage de personnes et d'éclairage intégré |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2183878B (en) * | 1985-10-11 | 1989-09-20 | Matsushita Electric Works Ltd | Abnormality supervising system |
CA2155719C (fr) * | 1994-11-22 | 2005-11-01 | Terry Laurence Glatt | Systeme de surveillance video avec cameras pilotes et asservies |
GB2337146B (en) * | 1998-05-08 | 2000-07-19 | Primary Image Limited | Method and apparatus for detecting motion across a surveillance area |
US6359647B1 (en) * | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
-
2000
- 2000-08-31 DE DE2000142935 patent/DE10042935B4/de not_active Expired - Lifetime
-
2001
- 2001-08-08 EP EP01119125.1A patent/EP1189187B1/fr not_active Expired - Lifetime
- 2001-08-08 ES ES01119125.1T patent/ES2563452T3/es not_active Expired - Lifetime
- 2001-08-27 NO NO20014158A patent/NO329869B1/no not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
ES2563452T3 (es) | 2016-03-15 |
NO20014158D0 (no) | 2001-08-27 |
DE10042935A1 (de) | 2002-03-14 |
NO329869B1 (no) | 2011-01-17 |
EP1189187A3 (fr) | 2009-05-27 |
DE10042935B4 (de) | 2005-07-21 |
EP1189187A2 (fr) | 2002-03-20 |
NO20014158L (no) | 2002-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE60034555T2 (de) | System zur objekterkennung und -verfolgung | |
EP1189187B1 (fr) | Procédé et système de surveillance d'une zône prédéterminée | |
DE4410406C2 (de) | Optoelektronische Vorrichtung zum Überwachen der Umgebung eines Fahrzeugs | |
DE112010003000B4 (de) | Sichtsystem zum Überwachen von Menschen in dynamischen Umgebungen | |
DE102005026876B4 (de) | Fahrzeugumgebungs-Überwachungsvorrichtung | |
DE69322306T2 (de) | Gegenstandserkennungssystem mittels Bildverarbeitung | |
DE102012221563B4 (de) | Funktionsdiagnose und validierung eines fahrzeugbasierten bildgebungssystems | |
EP0815539B1 (fr) | Procede pour detecter des objets en mouvement dans des images se succedant dans le temps | |
DE102012215544A1 (de) | Überwachung einer Bahnstrecke | |
DE4430016C2 (de) | Bewegungsmelder und ein Verfahren zur Bewegungsmeldung | |
DE102004018813A1 (de) | Verfahren zur Erkennung und/oder Verfolgung von Objekten | |
DE10030421A1 (de) | Fahrzeugumgebungsüberwachungssystem | |
DE19709799A1 (de) | Einrichtung zur Videoüberwachung einer Fläche | |
EP2701133B1 (fr) | Procédés et dispositifs destinés à l'enregistrement d'image d'un véhicule en excès de vitesse | |
EP1531342B1 (fr) | Procédé de détection des piétons | |
WO2009003793A2 (fr) | Dispositif pour identifier et/ou classifier des modèles de mouvements dans une séquence d'images d'une scène de surveillance, procédé et programme informatique | |
DE19621612C2 (de) | Vorrichtung zur Überwachung eines Gleisabschnittes in einem Bahnhof | |
DE10049366A1 (de) | Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System | |
DE102018008282A1 (de) | Vorrichtung und Verfahren zum Erfassen von Flugobjekten | |
EP0777864B1 (fr) | Systeme et procede d'evaluation d'images | |
DE19937928A1 (de) | Einrichtung zum Erkennen eines beweglichen Körpers und Einrichtung zum Überwachen eines Kraftfahrzeugs | |
EP3614299A1 (fr) | Procédé et dispositif de détection des objets dans des installations | |
EP1103821A2 (fr) | Installation de surveillance | |
DE19744694B4 (de) | Videobewegungsmeldeeinrichtung | |
EP3833576B1 (fr) | Systeme de camera de surveillance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
AX | Request for extension of the european patent |
Free format text: AL;LT;LV;MK;RO;SI |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
AX | Request for extension of the european patent |
Extension state: AL LT LV MK RO SI |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 7/18 20060101ALI20090417BHEP Ipc: G08B 13/196 20060101ALI20090417BHEP Ipc: G08B 13/194 20060101ALI20090417BHEP Ipc: G08B 15/00 20060101AFI20020128BHEP |
|
17P | Request for examination filed |
Effective date: 20091127 |
|
AKX | Designation fees paid |
Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
17Q | First examination report despatched |
Effective date: 20101229 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SECURITON GMBH |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20150318 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20150928 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D Free format text: NOT ENGLISH |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 774012 Country of ref document: AT Kind code of ref document: T Effective date: 20160215 Ref country code: CH Ref legal event code: NV Representative=s name: BOVARD AG, CH Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D Free format text: LANGUAGE OF EP DOCUMENT: GERMAN |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: FG2A Ref document number: 2563452 Country of ref document: ES Kind code of ref document: T3 Effective date: 20160315 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 50116527 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: SE Ref legal event code: TRGR |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 16 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160603 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 50116527 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20161104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160808 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 17 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160203 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 18 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20200826 Year of fee payment: 20 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20200827 Year of fee payment: 20 Ref country code: ES Payment date: 20200902 Year of fee payment: 20 Ref country code: GB Payment date: 20200826 Year of fee payment: 20 Ref country code: FR Payment date: 20200826 Year of fee payment: 20 Ref country code: LU Payment date: 20200820 Year of fee payment: 20 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: CH Payment date: 20200826 Year of fee payment: 20 Ref country code: AT Payment date: 20200824 Year of fee payment: 20 Ref country code: BE Payment date: 20200825 Year of fee payment: 20 Ref country code: SE Payment date: 20200826 Year of fee payment: 20 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R071 Ref document number: 50116527 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MK Effective date: 20210807 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: PE20 Expiry date: 20210807 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK07 Ref document number: 774012 Country of ref document: AT Kind code of ref document: T Effective date: 20210808 |
|
REG | Reference to a national code |
Ref country code: SE Ref legal event code: EUG |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MK Effective date: 20210808 |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: FD2A Effective date: 20211126 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION Effective date: 20210807 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION Effective date: 20210809 |