EP1656650A1 - Method and system for detecting a body in a zone located near an interface

Method and system for detecting a body in a zone located near an interface

Info

Publication number
EP1656650A1
Authority
EP
European Patent Office
Prior art keywords
data
interface
representative
bodies
green
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP04767924A
Other languages
English (en)
French (fr)
Other versions
EP1656650B1 (de)
Inventor
Thierry Cohignac
Frédéric Guichard
Christophe Migliorini
Fanny Rousson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MG International SA
Original Assignee
Vision IQ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision IQ
Publication of EP1656650A1
Application granted
Publication of EP1656650B1
Anticipated expiration
Expired - Lifetime (current legal status)


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02: Alarms for ensuring the safety of persons
    • G08B 21/08: Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool; responsive to an abnormal condition of a body of water
    • G08B 21/082: Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool, responsive to an abnormal condition of a body of water, by monitoring electrical characteristics of the water

Definitions

  • the present invention relates to a method, a system and devices for detecting a body in an area located near an interface between two liquid and/or gaseous media, in particular of the water/air type.
  • “near” also means “at the interface”.
  • Problem posed. The problem concerns the detection of the presence of bodies in the vicinity of a water/air interface.
  • the invention more particularly sets out to solve these various problems in, among others, four applications, one of which is an alarm if a stationary body is located under the interface.
  • the device described in that patent uses principles of detection and localization of bodies relative to the interface that are different from those which are the subject of the present application.
  • Solution. The present invention solves the problem of detecting bodies located in the vicinity of a water/air type interface by proposing a method and a system making it possible to evaluate the position of a body relative to an interface, particularly of the water/air type, to discriminate moving bodies from stationary bodies, to generate alerts, to compile statistics, to provide trajectory information and to allow the detection of entries or exits of bodies in the monitored area.
  • the invention relates to a method for detecting a body in an area located near an interface between two liquid and/or gaseous media, especially of the water/air type.
  • the body is lit by electromagnetic radiation comprising at least two different wavelengths, in particular situated in ranges corresponding to the near infrared on the one hand and to green-blue on the other hand.
  • the media have different absorption coefficients depending on the wavelengths of the electromagnetic radiation.
  • the method comprises the following steps: (a) the step of choosing, from the wavelengths of the electromagnetic radiation, at least two wavelengths or two wavelength ranges; (b) the step of producing, for each wavelength or wavelength range, an image of the interface and of the area; (c) the step of producing electrical signals representative of each image; (d) the step of digitizing the electrical signals so as to produce data corresponding to each image; (e) the step of extracting, from the data corresponding to each image, two groups of data respectively representative of at least one part of the body in the near infrared range and in the green-blue range; (f) the step of comparing the groups of data. Steps (c) to (f) are hereinafter referred to as the process of deducing the presence of a body.
  • the method further comprises the step of integrating over time the results of the step of comparing the groups of data.
  • the method further comprises the step of triggering an alarm if a human-sized body is detected under the interface for a time greater than a determined threshold.
  • the method is such that, in order to extract from the data corresponding to each image two groups of data respectively representative of at least one part of the body in the near infrared range and in the green-blue range, caps (within the meaning of the present invention) are generated.
  • the method further comprises the following steps: the step of associating characteristics with each cap, and the step of deducing the presence of a group of data representative of at least part of the body if the characteristics exceed a predetermined threshold SC.
  • the method is such that, to compare the groups of data, one searches for the data representative of at least one part of the body in the green-blue range for which there is not, in a determined geometric neighborhood, corresponding data representative of at least one part of the body in the near infrared range. Thus, in the case of a positive search, it can be concluded that the body is located under the interface.
  • the method is such that, to compare the groups of data, one searches for the data representative of at least one part of the body in the green-blue range for which there is, in a determined geometric neighborhood, corresponding data representative of at least one part of the body in the near infrared range. Thus, in the case of a positive search, it can be concluded that the body is located at least in part above the interface.
  • the method is more particularly intended to discriminate between a stationary body and a moving body.
  • in order to integrate the results of the comparison of the groups of data over time, the method further comprises the step of iterating, at determined time intervals, the process of deducing the presence of the body.
  • the invention also relates to a system for detecting a body in an area located near an interface between two liquid and/or gaseous media, in particular of the water/air type.
  • the body is lit by electromagnetic radiation comprising at least two different wavelengths, in particular situated in ranges corresponding to the near infrared on the one hand and to green-blue on the other hand.
  • the media have different absorption coefficients depending on the wavelengths of the electromagnetic radiation.
  • the system includes: (a) selection means for choosing, from the wavelengths of the electromagnetic radiation, at least two wavelengths or two wavelength ranges; (b) image-capture means for producing, for each wavelength or wavelength range, an image of the interface and of the area; (c) conversion means for producing electrical signals representative of each image; (d) digitization means for digitizing the electrical signals so as to produce data corresponding to each image; (e) computer processing means for extracting, from the data corresponding to each image, two groups of data respectively representative of at least one part of the body in the near infrared range and in the green-blue range; (f) calculation means for comparing the groups of data.
  • the conversion means, the digitization means, the computer processing means and the calculation means are hereinafter called the means for deducing the presence of a body.
  • the system further comprises integration means for integrating over time the results of the calculation means comparing the data groups.
  • the system further comprises activation means for activating an alarm if a body of human size is detected under the interface for a time greater than a determined threshold.
  • the system is such that the computer processing means make it possible to generate caps (within the meaning of the present invention).
  • the system is such that the computer processing means make it possible: - to associate characteristics with each cap, - to deduce the presence of a group of data representative of at least one part of the body if the characteristics exceed a predetermined threshold SC.
  • the system is such that the calculation means make it possible to search for data representative of at least one part of the body in the green-blue range for which there is not, in a determined geometric neighborhood, corresponding data representative of at least one part of the body in the near infrared range. It results from this combination of technical features that, in the event of a positive search, it can be concluded that the body is located under the interface.
  • the system is such that the calculation means make it possible to search for the data representative of at least one part of said body in the green-blue range for which there is, in a determined geometric neighborhood, data corresponding representative of at least part of said body in the near infrared range. It results from the combination of technical features that in the event of a positive search, it can be concluded that said body is located at least partially above the interface.
  • the system is more particularly intended to discriminate between a stationary body and a moving body.
  • the system is such that the integration means, for integrating over time the results of the calculation means, make it possible to iterate, at determined time intervals, the means for deducing the presence of said body.
  • FIG. 7 represents a flowchart of the computer processing means.
  • FIG. 8 represents a general schematic view of the system according to the invention.
  • Pixel, pixel value. A pixel is an elementary area of an image obtained by creating a tiling, generally regular, of said image; the value of a pixel is the number associated with that elementary area (for example a gray level).
  • a sensor such as a video camera, or a thermal or acoustic camera
  • FIG. 1a represents an image 101 (symbolized by a man, swimming on the surface of a swimming pool, the contours of which are not perfectly visible).
  • In FIG. 1b, a tiling 102 of pixels 103 has been superimposed on this image.
  • a tiling has been shown in the figure on which the values of the pixels have been indicated.
  • Adjacent pixels Two pixels of the tiling are said to be adjacent if their edges or corners touch.
  • Path on a tiling. A path on a tiling is an ordered and finite set of pixels in which each pixel is adjacent to the next one (in the sense of the ordering). The size of a path is given by the number of pixels making it up.
  • FIG. 2a represents a tiling 202 of 16 pixels 203, among which 3 pixels, called A, B and C, have been highlighted. It can be noted that the pixels A and B are adjacent and that the pixels B and C are adjacent. There is therefore a path (A->B->C) which connects these pixels. The set of pixels {A, B, C} is therefore connected. In FIG. 2b, a tiling has also been shown.
  • Each pair of pixels in the set is linked by a path of pixels belonging to the set; the set of pixels {A, B, C, E, F, I} is therefore connected.
  • In FIG. 2c the same tiling 202 has been shown as in FIG. 2b, selecting the set of pixels {A, C, F, N, P}.
  • There is a path A->C->F which connects the pixels A, C and F, but there is no path of pixels belonging to the set connecting N and P, or N to A.
  • The set of pixels {A, C, F, N, P} is therefore not connected.
  • The set {A, C, F}, however, is connected.
  • Pixel adjacent to a set. A pixel which does not belong to a set is said to be adjacent to said set when it is joined to at least one pixel belonging to said set.
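To make the adjacency and connectedness definitions above concrete, here is a minimal Python sketch (not part of the patent; all function names and coordinates are illustrative) that checks whether a set of pixels, given as (row, column) positions, is connected under the 8-adjacency used in the figures.

```python
from collections import deque

# A pixel is identified by its (row, col) position in the tiling.
# Two pixels are adjacent if their edges or corners touch (8-adjacency).
def adjacent(p, q):
    (r1, c1), (r2, c2) = p, q
    return p != q and abs(r1 - r2) <= 1 and abs(c1 - c2) <= 1

# A set of pixels is connected if every pair of its pixels is linked by a
# path of pixels belonging to the set, each pixel adjacent to the next.
def is_connected(pixels):
    pixels = set(pixels)
    if not pixels:
        return True
    start = next(iter(pixels))
    seen, queue = {start}, deque([start])
    while queue:                          # breadth-first traversal of the set
        p = queue.popleft()
        for q in pixels - seen:
            if adjacent(p, q):
                seen.add(q)
                queue.append(q)
    return seen == pixels

# Illustrative coordinates (not the actual layout of FIG. 2):
print(is_connected([(0, 0), (0, 1), (1, 2)]))   # True: a path exists through (0, 1)
print(is_connected([(0, 0), (1, 2), (3, 0)]))   # False: (3, 0) cannot be reached
```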
  • Cap, level of a cap. An upper cap of level v is a connected set of pixels whose values are all strictly greater than a predetermined value v and whose adjacent pixels all have a value less than or equal to v; a lower cap is defined symmetrically. The level of an upper (or lower) cap is said predetermined value v.
  • FIGS. 3a, 3b, 4a and 4b represent images composed of tilings 302 (resp. 402) of pixels 303 (resp. 403) on which the pixel values have been indicated.
  • FIG. 3a represents (inside the region 304 delimited by the bold line 305) a set of 4 pixels. This set has the following properties: it is connected in the sense of the definition given; the values of all the pixels of the set are greater than 1; some of the (twelve) pixels adjacent to the set have a value greater than 1. The set of pixels considered is therefore not an upper cap of level 1.
  • this set of pixels has the following properties: it is connected in the sense of the definition given; the values of all the pixels in the set are greater than 2; the (twelve) pixels contiguous to the set all have a value less than or equal to 2.
  • This set of pixels is therefore an upper cap of level 2.
  • FIG. 3b represents a set 306 of eight pixels having the following properties: it is connected in the sense of the definition given; the values of all the pixels of the set are greater than 1; the (eighteen) pixels contiguous to the set all have a value less than or equal to 1.
  • The set of pixels considered is therefore an upper cap of level 1.
  • FIG. 4a represents a tiling 402 of pixels 403.
  • a bold line 405 isolates a set 404 of ten pixels distributed over two zones 404a and 404b.
  • This set of pixels 404 has the following properties: it is not connected within the meaning of the definition given; the values of all the pixels are greater than 1; the (twenty-five) pixels adjacent to the set all have a value less than or equal to 1. The ten pixels of this non-connected set therefore do not constitute an upper cap of level 1.
  • FIG. 4b represents a set 406 of twelve pixels having the following properties: it is connected in the sense of the definition given; the values of the pixels are not all greater than 1; the (twenty-four) pixels adjacent to the set all have a value less than or equal to 1. This set is therefore not an upper cap of level 1.
  • Characteristic(s) associated with a cap. We call characteristic(s) associated with a cap: one or more values obtained by predefined arithmetic and/or logical operations from the pixel values of the cap, and/or the positions of the pixels in the tiling, and/or the level of the cap. For example, a characteristic could be the sum of the differences between the value of each pixel of the cap and the level of the cap, or the size (number of pixels) of said cap.
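As a purely illustrative sketch of the cap notion (under the cap definition reconstructed above, assuming 8-adjacency, and using numpy and scipy.ndimage, none of which is mentioned in the patent), the connected components of the pixels whose value exceeds a level v coincide with the upper caps of level v, and the characteristics named in the text (size and contrast) can be computed per cap:

```python
import numpy as np
from scipy import ndimage

def upper_caps(image, level):
    """Return the upper caps of the given level together with two of the
    characteristics used later in the text: the area (number of pixels)
    and the contrast (sum of the differences between each pixel value of
    the cap and the level of the cap)."""
    mask = image > level
    # 8-connectivity: pixels whose edges or corners touch belong to the same cap,
    # so every pixel adjacent to a component necessarily has a value <= level.
    labels, n = ndimage.label(mask, structure=np.ones((3, 3), dtype=int))
    caps = []
    for k in range(1, n + 1):
        values = image[labels == k]
        caps.append({"level": level,
                     "area": int(values.size),
                     "contrast": float(np.sum(values - level))})
    return caps

# Toy image in the spirit of FIG. 3: one connected region of values > 2.
img = np.array([[1, 1, 1, 1],
                [1, 3, 4, 1],
                [1, 3, 3, 1],
                [1, 1, 1, 1]])
print(upper_caps(img, 2))   # one cap: level 2, area 4, contrast (1+2+1+1) = 5
```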
  • Figure 5 represents a schematic view of the system allowing the detection of bodies located in the vicinity of a water/air type interface. Since the green-blue 501 and near infrared 502 images are not necessarily taken from the same observation point, the data or the images can advantageously be re-expressed in a common virtual reference frame 503.
  • the virtual reference frame may correspond to the surface of the water 504, so that a point on the surface of the water 505, seen by the green-blue camera 506 and seen by the near infrared camera 507, will be at the same place 508 in the common virtual coordinate system. In this way, two points that are close in real space will correspond to close points in this common virtual reference frame.
  • FIG. 6 represents, in the case of a swimming pool, a general view of the system allowing the detection of bodies located in the vicinity of a water / air type interface, in particular the detection and monitoring of swimmers.
  • the system according to the invention comprises means, hereinafter described, for detecting a body 601 in a zone 603 located near an interface 602 between two liquid 604 and/or gaseous 605 media, in particular of the water/air type; said body being illuminated by electromagnetic radiation comprising at least two different wavelengths, in particular situated in ranges corresponding to the near infrared on the one hand and to green-blue on the other hand; said media having different absorption coefficients as a function of the wavelengths of the electromagnetic radiation.
  • “near” also means “at the interface”.
  • the system further comprises the following means: a video camera 606a, equipped with a filter making it possible to produce at least one video image in the wavelength range from 300 to 700 nm (hereinafter called the green-blue range);
  • a video camera 606b, equipped with a filter making it possible to produce at least one video image in the wavelength range from 780 to 1100 nm (hereinafter called the near infrared range). These cameras make it possible to produce video images of said interface 602 and of said zone 603, from at least two observation points 607a and 607b. These images are represented by electrical signals 608a and 608b.
  • Each of the observation points 607a and 607b is located on one side of said interface 602. In this case, the observation points 607a and 607b are located above the swimming pool.
  • the video cameras 606a and 606b and their housings are located above the water, in the open air.
  • the system further includes digital conversion means 609 for producing digital data from electrical signals 608a and 608b representative of the green-blue and near infrared video images.
  • the cameras 606a and 606b are equipped with polarizing filters 611a and 611b at least partially eliminating the reflections of light on said interface in said images.
  • This variant embodiment is particularly suitable in the case of a swimming pool reflecting the rays of the sun or those of artificial lighting.
  • Said system further comprises computer processing means 700 described below.
  • FIG. 7 represents a flow diagram of the computer processing means 700.
  • the computer processing means 700 make it possible to discriminate the data corresponding to green-blue video images of a part of a real body (FIG. 1a) from those corresponding to apparent green-blue video images (FIG. 1b) generated by said interface 602.
  • the computer processing means 700 also make it possible to discriminate the data corresponding to near infrared video images of a part of a real body (FIG. 1a) from those corresponding to apparent near infrared video images (FIG. 1b) generated by said interface 602.
  • Said computer processing means 700 comprise calculation means, in particular a processor 701, and a memory 702.
  • the computer processing means 700 comprise extraction means 712 making it possible to extract a group of data representative of at least one part of the body in the near infrared range.
  • the computer processing means 700 further comprise extraction means 713 making it possible to extract a group of data representative of at least one part of the body in the green-blue range.
  • in order to extract groups of data representative of at least one part of the body in the near infrared range and in the green-blue range, the extraction means 712 and 713: generate caps; associate characteristics with each cap; and deduce the presence of a group of data representative of at least one part of the body if the characteristics exceed a predetermined threshold SC.
  • An example of a characteristic associated with a cap may be its area defined by the number of pixels constituting it.
  • Another characteristic associated with a cap can be its contrast defined as the sum of the differences between the value of each pixel of the cap and the level of the cap.
  • a data group representative of a part of a body could then be a cap having a contrast greater than a threshold SC and an area between a minimum threshold and a maximum threshold representative of the minimum and maximum dimensions of the body parts sought.
  • the computer processing means 700 make it possible to identify, among the groups of extracted data, those which do not correspond to a part of a swimmer.
  • the system includes means making it possible to eliminate the caps corresponding to reflections, lane lines or mats, as well as to any object potentially present in a swimming pool and not corresponding to a part of a swimmer.
  • Selection may be made, for example, by calculating the level of the caps, which must be below a threshold SR corresponding to the average gray level of the reflections; by calculating the alignment of the caps, corresponding to the usual position of the lane lines; or by estimating the shape of the caps, which should not be rectangular, in order to eliminate the mats.
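The selection rules just described can be sketched as a simple filter over the caps produced earlier. The threshold names (sc, sr, area_min, area_max) and the dictionary layout are assumptions made for illustration, not the patent's actual data structures.

```python
def select_swimmer_caps(caps, sc, area_min, area_max, sr):
    """Keep only the caps that may correspond to a body part: contrast above
    the threshold SC, area between the minimum and maximum dimensions of the
    sought body parts, and level below the threshold SR (average gray level
    of the reflections).  Alignment and rectangularity tests, used to reject
    lane lines and mats, would be added along the same lines."""
    kept = []
    for cap in caps:
        if cap["contrast"] <= sc:
            continue                                   # not contrasted enough
        if not (area_min <= cap["area"] <= area_max):
            continue                                   # too small or too large
        if cap["level"] >= sr:
            continue                                   # probably a reflection
        kept.append(cap)
    return kept
```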
  • the extraction means 712 and 713 may proceed other than by means of the extraction of caps.
  • the extraction means 712 and 713 can extract groups of pixels sharing one or more predetermined properties, and then associate characteristics with each group of pixels, and deduce the presence of a group of data representative of at least a body part if the characteristics exceed a predetermined SC threshold.
  • the predetermined property or properties may for example be chosen so as to exclude the appearance of the water / air interface in the image.
  • Said computer processing means 700 also comprise comparison means 714, for comparing said groups of data.
  • said comparison means 714 search for data representative of at least part of said body in the green-blue range for which there is not, in a geometric comparison neighborhood, corresponding data representative of at least a part of said body in the near infrared range; thus, in the event of a positive search, it can be concluded that said body is located under the interface.
  • a geometric comparison neighborhood may be, for example, a circular neighborhood with a radius of 50 cm, centered on the center of gravity of a cap extracted in the green-blue image, in which the caps extracted in the near infrared image are searched for. If the search is negative, the swimmer is considered to be below the surface of the water.
  • one also searches for data representative of at least one part of said body in the green-blue range for which there is, in a geometric comparison neighborhood, corresponding data representative of at least one part of said body in the near infrared range; thus, in the event of a positive search, it can be concluded that said body is located at least in part above the interface.
  • the geometric comparison neighborhood may again be a circular neighborhood with a radius of 50 cm, centered on the center of gravity of a cap extracted in the green-blue image, in which the caps extracted in the near infrared image are searched for. If the search is positive, the swimmer is considered to be at least partly above the surface of the water.
  • the caps extracted in the green-blue image and those extracted in the near infrared image are matched if the shortest distance between them (between their two closest pixels) is less than 30 cm.
  • the unmatched caps of the green-blue image will then be considered to correspond to a swimmer below the surface of the water.
  • the matched caps of the green-blue image will be considered to correspond to swimmers partly above the surface of the water.
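A hedged sketch of this comparison step follows. It assumes the caps of both images have already been re-expressed in the common virtual reference frame with centroids in metres, and uses the 0.5 m circular neighborhood of the example above; the function and field names are illustrative only.

```python
import math

def classify_green_blue_caps(gb_caps, nir_caps, radius_m=0.5):
    """For each green-blue cap, look for a near infrared cap whose center of
    gravity lies within a circular neighborhood of the given radius.
    Matched caps are taken to be swimmers at least partly above the surface;
    unmatched caps are taken to be swimmers below the surface."""
    results = []
    for gb in gb_caps:
        matched = any(math.dist(gb["centroid"], nir["centroid"]) < radius_m
                      for nir in nir_caps)
        results.append((gb, "above or at the surface" if matched else "under water"))
    return results
```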
  • the geometric comparison neighborhood is not necessarily fixed in advance.
  • the geometric comparison neighborhood relating to the infrared and green-blue caps, respectively, can be calculated as a function of geometric considerations relating to the positions of said caps, and possibly also as a function of geometric considerations specific to the environment, in particular the orientation of the cameras relative to the interface or the orientation, in the images, of the normal to the interface.
  • since the caps from the infrared cameras relate to the body parts located above the interface, the corresponding green-blue caps are searched for in a geometric comparison neighborhood calculated according to the orientation of the normal to the interface.
  • the system described in the present invention can be used as a complement to a system based on stereovision such as that described in patent No. FR 00/15803.
  • the system described in patent No. FR 00/15803 detects a body under the surface of the water and: if there is, in a determined geometric neighborhood, corresponding data representative of at least one part of said body in the near infrared range, it can be concluded that said body is located at least partly above the interface; if there is not, in a determined geometric neighborhood, corresponding data representative of at least a part of said body in the near infrared range, it can be concluded that said body is located below the interface.
  • the system described in the present invention can advantageously use stereovision principles such as those described in patent No. FR 00/15803.
  • said system includes time-integration means 703, associated with a clock 704, to iterate at determined time intervals said process for deducing the presence of a body described above.
  • the video images are taken at determined time intervals from said observation point.
  • said computer processing means 700 include totalizers 705 for calculating the number of times the body is detected during a determined period of time T1.
  • Said computer processing means 700 further comprise discriminators 706 for discriminating, at a point in said zone, between the bodies which are present a number of times greater than a determined threshold SI and the bodies which are present a number of times lower than said determined threshold SI.
  • the bodies present a number of times greater than said threshold SI are hereinafter designated the stationary bodies;
  • the bodies present a number of times lower than said threshold SI are hereinafter designated the moving bodies.
  • said computer processing means 700 further comprise means for calculating the number of times a body is detected as being stationary and new during a determined period of time T2. Said time period T2 is chosen to be greater than the duration of the phenomena that are observed, and in particular greater than T1.
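The totalizer/discriminator behaviour can be sketched as follows; the grid-cell bookkeeping, class name and parameters are illustrative assumptions, not the patent's implementation.

```python
import time
from collections import defaultdict, deque

class StationaryBodyDetector:
    """Counts, for each point of the monitored zone (approximated here by a
    coarse grid cell), how many times a body is detected during a sliding
    period T1; a body detected more than SI times is considered stationary,
    fewer times, moving."""
    def __init__(self, t1_seconds, si_threshold):
        self.t1 = t1_seconds
        self.si = si_threshold
        self.hits = defaultdict(deque)        # grid cell -> detection timestamps

    def report_detection(self, cell, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[cell]
        q.append(now)
        while q and now - q[0] > self.t1:     # keep only detections within T1
            q.popleft()

    def is_stationary(self, cell):
        return len(self.hits[cell]) > self.si
```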
  • Said computer processing means 700 furthermore comprise transmission means 716 for transmitting an alert signal 711 according to the detection criteria described above.
  • an additional step of integration over time may advantageously be carried out by accumulation of images from the same green-blue and/or near infrared camera.
  • the accumulated image is calculated for example by averaging the gray levels of the pixels of the successive images taken over a determined time interval.
  • An accumulated image obtained by accumulation of images from a green-blue camera will be called green-blue accumulated image.
  • an accumulated image obtained by accumulation of images from a near infrared camera will be called an accumulated near infrared image.
  • the extraction means 712 and 713 can then also use the accumulated green-blue and / or near infrared images.
  • the extraction means 712 may extract only the caps of the green-blue image for which there is not, in the accumulated green-blue image, a similar cap located in a neighborhood.
  • Extraction means 712 and 713 can then also use composite images consisting of accumulated green-blue images and green-blue images as well as composite images consisting of accumulated near infrared and near infrared images.
  • the extraction means 712 could use the difference between the green-blue image and the accumulated green-blue image.
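As an illustration of this accumulation step, the sketch below maintains a running average of the gray levels (here an exponential moving average, a simplification of the plain averaging over a determined time interval described above) and exposes the difference image that the extraction means could use; the names and the smoothing factor are assumptions.

```python
import numpy as np

class AccumulatedImage:
    """Maintains an accumulated image as a running average of the gray levels
    of successive frames (an exponential moving average is used here as a
    simplification of the plain averaging over a determined time interval)."""
    def __init__(self, alpha=0.05):
        self.alpha = alpha        # weight of the newest frame
        self.mean = None

    def update(self, frame):
        frame = frame.astype(np.float32)
        if self.mean is None:
            self.mean = frame.copy()
        else:
            self.mean = (1.0 - self.alpha) * self.mean + self.alpha * frame
        return self.mean

    def difference(self, frame):
        """Difference between the current image and the accumulated image,
        which the extraction means could use to ignore static structures
        such as lane lines or the pool bottom."""
        return np.abs(frame.astype(np.float32) - self.mean)
```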
  • FIG. 8 represents a general schematic view of the system according to the invention.
  • the system makes it possible to detect a body 801 in a zone 802 located near an interface 803 between two liquid 812 and / or gaseous 813 media, in particular of the water / air type.
  • the body 801 is illuminated by electromagnetic radiation 804 comprising at least two different wavelengths, in particular situated in ranges corresponding to the near infrared on the one hand and to green-blue on the other hand.
  • the media 812 and 813 have different absorption coefficients as a function of the wavelengths of the electromagnetic radiation.
  • the system comprises: (a) selection means 814 for choosing, from the wavelengths of the electromagnetic radiation 804, at least two wavelengths or two wavelength ranges; (b) image-capture means 815 for producing, for each wavelength or wavelength range, an image 805 of the interface and of the area; (c) conversion means 816 for producing electrical signals 806 representative of each image 805; (d) digitization means 817 for digitizing the electrical signals 806 so as to produce data 807 corresponding to each image; (e) computer processing means 818 for extracting, from the data 807 corresponding to each image 805, two groups of data respectively representative of at least one part of the body 801 in the near infrared range and in the green-blue range; (f) calculation means 819 for comparing the groups of data 807.
  • the conversion means 816, the digitization means 817, the computer processing means 818 and the calculation means 819 are hereinafter called the means for deducing the presence of a body 801. It is thus possible to detect the presence of a body 801 and/or to determine the position of the detected body with respect to the interface 803, by discriminating between a body 801 located under the interface 803 and a body 801 located at least in part above the interface 803.
  • the system further comprises integration means 820 for integrating over time the results of the calculation means 819 comparing the data groups 807.
  • the system further comprises activation means 821 for activating an alarm 808 if a body of human size is detected under the interface for a time greater than a determined threshold.
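Putting the pieces together, a minimal end-to-end sketch of the deduction means and the alarm activation might look as follows; every name, the frame format and the timing logic are assumptions made for illustration only.

```python
def monitor_pool(frames, extract_caps, classify, alarm_delay_s, frame_period_s):
    """For each pair of simultaneous green-blue / near infrared images
    (steps (c)-(d) assumed already performed upstream), extract the candidate
    caps, compare them, and raise an alarm if a body stays under the
    interface longer than the determined threshold."""
    time_under_water = 0.0
    for gb_frame, nir_frame in frames:
        gb_caps = extract_caps(gb_frame)          # step (e), green-blue range
        nir_caps = extract_caps(nir_frame)        # step (e), near infrared range
        classified = classify(gb_caps, nir_caps)  # step (f), comparison of the groups
        body_under = any(label == "under water" for _, label in classified)
        time_under_water = time_under_water + frame_period_s if body_under else 0.0
        if time_under_water > alarm_delay_s:
            yield "ALARM: body detected under the interface"
```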

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)
  • Emergency Alarm Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
EP04767924A 2003-07-28 2004-07-28 Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche Expired - Lifetime EP1656650B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0350378A FR2858450B1 (fr) 2003-07-28 2003-07-28 Procede et systeme pour detecter un corps dans une zone situee a proximite d'une interface
PCT/FR2004/050363 WO2005013226A1 (fr) 2003-07-28 2004-07-28 Procede et systeme pour detecter un corps dans une zone situee a proximite d'une interface

Publications (2)

Publication Number Publication Date
EP1656650A1 (de) 2006-05-17
EP1656650B1 EP1656650B1 (de) 2008-03-05

Family

ID=34043805

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04767924A Expired - Lifetime EP1656650B1 (de) 2003-07-28 2004-07-28 Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche

Country Status (8)

Country Link
US (1) US7583196B2 (de)
EP (1) EP1656650B1 (de)
JP (1) JP4766492B2 (de)
AT (1) ATE388460T1 (de)
DE (1) DE602004012283D1 (de)
ES (1) ES2303092T3 (de)
FR (1) FR2858450B1 (de)
WO (1) WO2005013226A1 (de)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008066619A1 (en) * 2006-10-19 2008-06-05 Travis Sparks Pool light with safety alarm and sensor array
US7839291B1 (en) * 2007-10-02 2010-11-23 Flir Systems, Inc. Water safety monitor systems and methods
US8390685B2 (en) * 2008-02-06 2013-03-05 International Business Machines Corporation Virtual fence
US8345097B2 (en) * 2008-02-15 2013-01-01 Harris Corporation Hybrid remote digital recording and acquisition system
WO2012145800A1 (en) * 2011-04-29 2012-11-01 Preservation Solutions Pty Ltd Monitoring the water safety of at least one person in a body of water
US8544120B1 (en) * 2012-03-02 2013-10-01 Lockheed Martin Corporation Device for thermal signature reduction
CN103646511A (zh) * 2013-11-25 2014-03-19 银川博聚工业产品设计有限公司 游泳池溺水动态监控装置
US20170167151A1 (en) * 2015-12-10 2017-06-15 Elazar Segal Lifesaving system and method for swimming pool
US10329785B2 (en) 2016-04-08 2019-06-25 Robson Forensic, Inc. Lifeguard positioning system
WO2019156977A1 (en) * 2018-02-06 2019-08-15 Overhead Door Corporation Secure exit lane door
JP7313811B2 (ja) * 2018-10-26 2023-07-25 キヤノン株式会社 画像処理装置、画像処理方法、及びプログラム
CN109584509B (zh) * 2018-12-27 2020-08-11 太仓市小车东汽车服务有限公司 一种基于红外线与可见光组合的游泳池溺水监测方法
CN115278119B (zh) * 2022-09-30 2022-12-06 中国科学院长春光学精密机械与物理研究所 用于红外辐射特性测量的红外相机积分时间自动调整方法

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0683451B2 (ja) * 1986-08-01 1994-10-19 東芝エンジニアリング株式会社 水没検知システム
US4779095A (en) * 1986-10-28 1988-10-18 H & G Systems, Inc. Image change detection system
GB8811355D0 (en) * 1988-05-13 1997-09-17 Secr Defence An electro-optical detection system
US4862257A (en) * 1988-07-07 1989-08-29 Kaman Aerospace Corporation Imaging lidar system
JPH0378577A (ja) * 1989-08-19 1991-04-03 Mitsubishi Electric Corp 真空装置
US5043705A (en) * 1989-11-13 1991-08-27 Elkana Rooz Method and system for detecting a motionless body in a pool
GB9115537D0 (en) * 1991-07-18 1991-09-04 Secr Defence An electro-optical detection system
US5959534A (en) * 1993-10-29 1999-09-28 Splash Industries, Inc. Swimming pool alarm
US5638048A (en) * 1995-02-09 1997-06-10 Curry; Robert C. Alarm system for swimming pools
FR2741370B1 (fr) * 1995-11-16 1998-05-29 Poseidon Systeme de surveillance d'une piscine pour la prevention des noyades
US6963354B1 (en) * 1997-08-07 2005-11-08 The United States Of America As Represented By The Secretary Of The Navy High resolution imaging lidar for detecting submerged objects
US6628835B1 (en) * 1998-08-31 2003-09-30 Texas Instruments Incorporated Method and system for defining and recognizing complex events in a video sequence
US6327220B1 (en) * 1999-09-15 2001-12-04 Johnson Engineering Corporation Sonar location monitor
FR2802653B1 (fr) * 1999-12-21 2003-01-24 Poseidon Procede et systeme pour detecter un objet devant un fond
JP2002077897A (ja) * 2000-08-25 2002-03-15 Nippon Hoso Kyokai <Nhk> オブジェクト抽出型tvカメラ
DE60111074T2 (de) * 2000-12-06 2006-05-04 Poseidon Verfahren und vorrichtung zur detektion eines körpers in der nähe einer wasser/luft schichtgrenze
SG95652A1 (en) * 2001-05-25 2003-04-23 Univ Nanyang Drowning early warning system
US6642847B1 (en) * 2001-08-31 2003-11-04 Donald R. Sison Pool alarm device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005013226A1 *

Also Published As

Publication number Publication date
FR2858450B1 (fr) 2005-11-11
US20070052697A1 (en) 2007-03-08
JP4766492B2 (ja) 2011-09-07
WO2005013226A1 (fr) 2005-02-10
ES2303092T3 (es) 2008-08-01
FR2858450A1 (fr) 2005-02-04
DE602004012283D1 (de) 2008-04-17
EP1656650B1 (de) 2008-03-05
US7583196B2 (en) 2009-09-01
JP2007500892A (ja) 2007-01-18
ATE388460T1 (de) 2008-03-15

Similar Documents

Publication Publication Date Title
EP1656650B1 (de) Verfahren und system zur erkennung eines körpers in einer zone in der nähe einer grenzfläche
FR3065307A1 (fr) Dispositif de capture d'une empreinte d'une partie corporelle.
FR3081248A1 (fr) Systeme et procede de determination d’un emplacement pour le placement d'un paquet
FR2882160A1 (fr) Procede de capture d'images comprenant une mesure de mouvements locaux
EP1240622B1 (de) Verfahren und vorrichtung zur detektion eines gegenstandes in bezug auf eine oberfläche
EP2994901B1 (de) Kompakter detektor menschlicher präsenz
FR2832528A1 (fr) Determination d'un illuminant d'une image numerique en couleur par segmentation et filtrage
EP1340104B1 (de) Verfahren und vorrichtung zur detektion eines körpers in der nähe einer wasser/luft schichtgrenze
EP3388976B1 (de) Betrugserkennungsverfahren
FR2934741A1 (fr) Dispositif interactif et procede d'utilisation.
EP0577491B1 (de) Verfahren und Vorrichtung zur Überwachung einer dreidimensionalen Szene unter Verwendung von Bildsensoren
FR2717269A1 (fr) Système et procédé pour discriminer des cibles pouvant représenter des mines.
WO2001081858A1 (fr) Procede de mesurage d'un objet tridimensionnel, ou d'un ensemble d'objets
EP2756483B1 (de) Verfahren und system zur erfassung und verarbeitung von bildern zur bewegungsdetektion
EP3770806A1 (de) Verfahren zur videoüberwachung der überschreitung einer linie durch personen, entsprechendes computerprogramm und entsprechende vorrichtung
FR2817624A1 (fr) Procede, systeme et dispositif pour detecter un corps a proximite d'une interface de type eau/air
FR2860302A1 (fr) Capteur de detection infrarouge
FR2911984A1 (fr) Procede pour identifier des points symboliques sur une image d&#39;un visage d&#39;une personne
EP2329474B1 (de) Verfahren und system für szenenbeobachtung
FR2817625A1 (fr) Procede pour detecter des corps nouveaux dans une scene eclairee par des lumieres non forcement contraintes
EP4254250A1 (de) Verfahren und vorrichtung zur erkennung eines mobilen geräte-zuschauers auf der basis von tiefendaten
WO2014053437A1 (fr) Procédé de comptage de personnes pour appareil stéréoscopique et appareil stéréoscopique de comptage de personnes correspondant
FR3135812A1 (fr) Procédé de surveillance automatique des personnes dans un bassin d’eau, programme d’ordinateur et dispositif associés
FR3141788A1 (fr) Système de surveillance volumétrique d’un espace et programme d’ordinateur correspondant.
Gleason et al. Improved supervised classification of underwater military munitions using height features derived from optical imagery

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060228

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20060811

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: MG INTERNATIONAL

111Z Information provided on other rights and legal means of execution

Free format text: FR

Effective date: 20070919

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

REF Corresponds to:

Ref document number: 602004012283

Country of ref document: DE

Date of ref document: 20080417

Kind code of ref document: P

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2303092

Country of ref document: ES

Kind code of ref document: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080805

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080606

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

26N No opposition filed

Effective date: 20081208

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: FR

Ref legal event code: GC

Ref country code: FR

Ref legal event code: AU

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080906

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080606

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 13

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230711

Year of fee payment: 20

Ref country code: LU

Payment date: 20230711

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230831

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230710

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20231103

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20230929

Year of fee payment: 20

REG Reference to a national code

Ref country code: NL

Ref legal event code: MK

Effective date: 20240727

REG Reference to a national code

Ref country code: ES

Ref legal event code: FD2A

Effective date: 20240802

REG Reference to a national code

Ref country code: BE

Ref legal event code: MK

Effective date: 20240728

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20240727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20240727