EP1224632B1 - Detection device - Google Patents

Detection device

Info

Publication number
EP1224632B1
EP1224632B1 EP20010954036 EP01954036A
Authority
EP
European Patent Office
Prior art keywords
signal
sensor
detection device
designed
characterised
Prior art date
Legal status
Active
Application number
EP20010954036
Other languages
German (de)
French (fr)
Other versions
EP1224632A1 (en)
Inventor
André HAUFE
Current Assignee
Iris GmbH IG Infrared and Intelligent Sensors
Original Assignee
Iris GmbH IG Infrared and Intelligent Sensors
Priority date
Filing date
Publication date
Family has litigation
Priority to DE10034976 priority Critical
Priority to DE2000134976 priority patent/DE10034976B4/en
Application filed by Iris GmbH IG Infrared and Intelligent Sensors filed Critical Iris GmbH IG Infrared and Intelligent Sensors
Priority to PCT/EP2001/008067 priority patent/WO2002007106A1/en
Publication of EP1224632A1 publication Critical patent/EP1224632A1/en
Application granted granted Critical
Publication of EP1224632B1 publication Critical patent/EP1224632B1/en
First worldwide family litigation filed
Application status is Active legal-status Critical
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual entry or exit registers

Description

  • The invention relates to a detection device for detecting persons or objects and their direction of movement, comprising a sensor arrangement for detecting electromagnetic radiation with the wavelength of visible and/or invisible light which is reflected or emitted by a person or an object, and an evaluation unit which is connected to the sensor arrangement and is designed to derive a signal from the radiation detected by the radiation sensor arrangement and to emit a detection signal for, as far as possible, every object or person detected by the radiation sensor arrangement. In particular, the invention relates to a counting device for persons which is connected to such a detection device.
  • One application of such detection devices is to detect people crossing the entry or exit area of a means of transport in order to count the passengers entering or leaving it. From DE 42 20 508 and EP 0 515 635, detection devices are known which, with respect to the intended direction of movement of the passengers, have sensor elements arranged one behind the other and which determine the direction of movement of detected persons by correlating the radiation detected by the sensor elements. Such detection devices are thus able to determine not only the presence of an object or a person, as a simple light barrier does, but also its direction of movement. One problem, however, is to reliably recognize people who are not passing through but, for example, merely standing in the entrance area of a bus, or to distinguish signals originating from different persons who are in close proximity to one another.
  • One approach to solving the latter problem is described in DE 197 21 741. There it is proposed to form a continuous distance signal for detected objects and to compare the distance function obtained in this way with predetermined or stored distance characteristics of known objects in order to obtain information about the number, the movement or the type of the objects. According to DE 197 21 741, this is done by means of an active signal generator/detector arrangement; active means that the detector picks up the radiation emitted by the signal generator and reflected by the object or the person.
  • From DE 197 32 153 it is known to associate two images of a person taken from different viewpoints on the basis of characteristic image features in order to obtain spatial information.
  • From DE 42 20 508 a detection device is known which is able to detect, in a matrix-like manner, the intensity distribution of the heat radiation emanating from a person.
  • US 5,187,688 and US 5,255,301 each show an arrangement of a plurality of sensor elements with which the distance between a respective sensor element and the surface of a person can be determined by transit-time measurement.
  • It is an object of the present invention to provide a detection device which allows more accurate detection or counting of objects or persons in a simple manner.
  • According to the invention, this object is achieved with a detection device of the type mentioned in the introduction which comprises individualizing means that are connected to the evaluation unit and designed to obtain information individualizing an object or person, and that are connected to a memory designed to store at least a section of the history signal together with the information individualizing the object or person as a characteristic parameter associated with the history signal.
  • For this purpose, the detection device comprises parameter determination means which are connected to the evaluation unit (18; 36) and designed to emit an additional signal, and the evaluation unit (18; 36) is designed to form the characteristic parameter as a function of the additional signal. The parameter determination means (16, 18.1) comprise a radiation source (16) for radiation detectable by the sensor arrangement (12, 14; 32, 34) and an evaluation module (36.1'). The radiation sensor arrangement comprises a sensor matrix (32.1), and both the radiation source and the radiation sensor arrangement are connected to the evaluation module. The evaluation module is designed to form, from the radiation emitted by the radiation source (44), reflected by a person or an object and detected by the sensor matrix (32.1), a matrix as an additional signal which determines the characteristic parameter and represents the information individualizing the respective person, the matrix corresponding to the three-dimensional surface contour of a detected object or a detected person.
  • The parameter is derived from the history signal and an additional signal. This additional signal is either obtained by an additional passive sensor or derived from an active radiation source. The parameter is multi-dimensional, i.e. a matrix with several values, which in particular individualize a person.
  • The invention is based on the idea of combining, in a manner known per se, a passively obtainable progress signal with at least one characteristic parameter, so that an at least two-dimensional signal or parameter matrix results which combines information about the time course of the radiation detected by the sensor arrangement with additional information. Such an arrangement makes it possible, in a manner known per se from DE 42 20 508 or EP 0 515 635, to derive a motion signal from the waveform signals by signal correlation and to assign this motion signal, by means of the characteristic parameter or parameters, as reliably as possible to an individual object or an individual person. Preferably, the characteristic parameter describes a person-individual feature such as hair color, size, stature, etc.
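  • As a minimal illustration of such a combined record, the following Python sketch stores a section of the progress signal together with an individualizing parameter; the names DetectionRecord, history_signal and parameters are assumptions chosen for the example rather than terms used in the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DetectionRecord:
    """One detected person/object: a section of the passively acquired
    progress (history) signal plus individualizing parameters."""
    history_signal: List[float]                      # time course of detected radiation
    parameters: Dict[str, float] = field(default_factory=dict)

# Example: a short signal section combined with a body-height parameter.
record = DetectionRecord(history_signal=[0.1, 0.4, 0.9, 0.5, 0.2],
                         parameters={"height_m": 1.78})
memory = [record]                                    # plays the role of the memory
```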
  • The additional characteristic parameter could, in a passive arrangement, be determined solely from the signal morphology. According to the invention, however, an idea supporting the invention is to provide the detection device with additional means for determining the characteristic parameter. Among the multitude of conceivable additional means, two alternatives have proven, in a way that could not have been predicted, to be particularly suitable: a radiation source for realizing an active arrangement of the detection device, or, additionally, an additional sensor for detecting a further signal besides the radiation, e.g. an acoustic signal or an odor signal.
  • In the claimed active arrangement with a radiation source, the additional parameter is determined by evaluating the radiation reflected by an object or a person in relation to the radiation emitted by the radiation source. In this way, information about the transit time of a signal from the radiation source via a reflecting person to the sensor arrangement, or about the degree of reflection, can be obtained.
  • The preferred frequency or wavelength range of the electromagnetic radiation for whose detection the sensor arrangement is designed is the range above 1400 nm. In an active arrangement with a radiation source, this wavelength range also applies to the radiation source. It has been found that in this wavelength range both a favorable signal-to-noise ratio and a high degree of eye safety can be achieved. In particular, the radiation power in this wavelength range can be more than 1000 times greater than, for example, in the range of 1050 nm without this being associated with a health risk.
  • In principle, embodiments of the detection device are preferred which are designed for arrangement at entry and exit openings such as doors of vehicles or rooms.
  • A preferred application of the detection device is passenger counting, for example in buses. In particular for this application, the detection device is preferably connected to a location encoder such as a GPS receiver. In this way, the passenger numbers determined by the detection device by means of a counting unit for entering and exiting passengers can be assigned to specific routes or stops of a bus. Together with an optional evaluation unit, integrated vehicle management thus becomes possible. This can be extended to a whole fleet if the detection devices and location encoders of different vehicles can be connected to a control centre via radio.
  • In a preferred arrangement, the radiation source is arranged, for example, in the entrance area of a vehicle such that the radiation emanating from the radiation source strikes a person crossing the entrance area from above and is reflected by the person's head, so that the size of the person can be determined from the transit time of the signal. The characteristic parameter to be stored then corresponds to the size of the person. The synchronously recorded progress signal can be uniquely assigned to a person of the corresponding size by means of the characteristic parameter. Since most persons differ in size at least within certain limits, a largely individual assignment of progress signals is possible in this way, so that even progress signals originating from two different persons who are close to one another can be assigned to the respective person.
  • An essential difference from the device known from DE 197 21 741 is that, for example in the case of person-size determination, it is not the distance function - i.e. the change in distance - that is stored and compared with other distance functions in order to form the characteristic parameter, but only the minimum of the distance between the radiation source and sensor arrangement on the one hand and the top of a person's head on the other.
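  • A minimal sketch of this idea follows, assuming the sensor and radiation source are mounted at a known height above the floor; the function name person_height and all numeric values are illustrative only.

```python
def person_height(distance_samples, mounting_height_m):
    """Estimate body height from successive distance readings: only the
    minimum distance (closest approach, i.e. the top of the head) is used."""
    return mounting_height_m - min(distance_samples)

# e.g. sensor mounted 2.10 m above the floor, closest reflection at 0.32 m
print(person_height([1.95, 0.80, 0.32, 0.41, 1.90], 2.10))   # ~1.78 m
```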
  • Basically, both the solutions known from DE 42 20 508 and EP 0 515 635 and the solution known from DE 197 21 741 rely solely on the correlation of two waveforms or functions. In the solution proposed here, the characteristic parameter is not derived from a comparison or correlation of functions or waveforms, but is formed from a single signal. This signal can originate, for example, from an infrasound sensor for detecting heart sounds and thus the heart rate, from the arrangement already described for detecting the person's size, or - according to the invention - from a sensor matrix arrangement onto which an image of the persons crossing an entrance area is projected, so that a parameter characterizing the contour of the person can be obtained from the image.
  • The sensor matrix arrangement can be connected to a radiation source of the type described above to form an active sensor, so that a three-dimensional contour of a detected person can be recorded as a characteristic parameter.
  • For receiving such or other signals individualizing a person, at least one corresponding sensor is preferably provided in each case. This sensor is preferably switched on when the history signal indicates that the detected person is currently closest to the sensor. Alternatively, the sensor remains continuously on and only that portion of its signal which was recorded at the time of closest approach to the sensor is utilized. For this purpose, the detection device preferably comprises corresponding location or distance determination means and a selection unit connected to them, which selects the corresponding sensor-derived signal section for further processing.
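  • The selection of the signal section recorded at the time of closest approach could look roughly as follows; this is a sketch assuming the additional-sensor signal and the distance readings share the same time base, and the names and window width are illustrative.

```python
import numpy as np

def select_closest_section(sensor_signal, distances, half_width=25):
    """Return only the portion of the continuously running sensor signal that
    was recorded around the moment of minimum distance (closest approach)."""
    sensor_signal = np.asarray(sensor_signal, dtype=float)
    t_min = int(np.argmin(distances))                 # sample index of closest approach
    lo = max(0, t_min - half_width)
    hi = min(len(sensor_signal), t_min + half_width + 1)
    return sensor_signal[lo:hi]
```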
  • In more differentiated embodiments of the invention, a plurality of characteristic parameters or parameter profiles can be obtained simultaneously and combined with one another in order to allow an even more precise differentiation of the information obtained and thus an even clearer individualization of the detected persons. Further preferred embodiments are mentioned in the subclaims.
  • These include in particular detection devices with an additional sensor for individual characteristics such as height, shape, hair color, heart tones or smell of a person or an object.
  • The invention will now be explained in more detail with reference to embodiments.
  • The figures for the embodiments show:
  • FIG. 1
    an unclaimed first variant of a detection device with an active sensor unit;
    FIG. 2
    an unclaimed detection device similar to FIG. 1 with a passive sensor unit and an additional sensor for a person-specific feature;
    FIG. 3
    an unclaimed detection device with a passive sensor array for receiving a multi-dimensional person-individual feature; and
    FIG. 4
    a detection device according to the invention similar to FIG. 3 with an active sensor matrix for receiving a multi-dimensional person-individual feature.
  • Evaluation unit 18 is also connected to a memory 20 and a counting unit 22.
  • The sensor 12 and the radiation source 16 are both connected to the distance module 18.1 of the evaluation unit 18. In the distance module 18.1, the phase relationship between the radiation emitted by the radiation source 16 and the radiation received by the sensor 12 is determined, and from it the transit time required for the signal emitted by the radiation source 16 and reflected by an object to be picked up by the sensor 12. In this way, the distance between the radiation source 16 and sensor 12 on the one hand and a reflecting surface on the other hand can be determined. Instead of evaluating the transit time, the distance to a reflecting object can also be determined directly via the wavelength of the signal emitted by the radiation source 16 and the phase relationship between the emitted and received radiation. The technologies required for this purpose are known in principle. Since the radiation source 16 and the sensor 12 are arranged vertically above the entrance of a bus, for example, and the distance to the floor is known, the size of a person passing through the entrance area can be deduced from the minimum of a sequence of successive distance measurements. This minimum is stored in the memory 20 as the person's size and represents a parameter characteristic of the person.
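  • As an illustration of the phase evaluation, the following sketch converts a measured phase shift of an amplitude-modulated signal into a distance and then into a person height; the modulation frequency, phase value and mounting height are assumed example values.

```python
import math

C = 299_792_458.0                     # speed of light in m/s

def distance_from_phase(phase_shift_rad, modulation_freq_hz):
    """Phase-shift ranging: the reflected signal lags the emitted one by
    delta_phi = 2*pi*f_mod*(2*d/c), hence d = c*delta_phi/(4*pi*f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

d = distance_from_phase(0.268, 20e6)  # ~0.32 m to the reflecting surface (head)
height = 2.10 - d                     # ~1.78 m with a mounting height of 2.10 m
```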
  • Simultaneously with the size determination, the two sensors 12 and 14 record the radiation signals reflected or emitted by a person, and these signals are correlated. Due to the movement of a person 24 who, for example, is entering the bus, the two radiation sensors 12 and 14 record similar course signals that are time-shifted relative to one another. From the spacing of the two sensors 12 and 14 and the time offset between the waveform signals recorded by them, the direction of movement and the speed of a boarding or alighting person 24 can be determined.
  • In this way, the following information is obtained:
  • If the signal received by the sensor 12 changes compared to the signal received by the sensor 14 or vice versa, this is an indication of a reflecting or radiating object in the detection range of the sensors 12 and 14.
  • Changes in the radiation background are registered synchronously by both sensors 12 and 14 and can therefore be suppressed. If the evaluation of the progress signals obtained from the sensors 12 and 14 shows that the two progress signals correlate with each other in time such that the correlation exceeds a certain level, the speed of an object can be determined from the time offset of the signals.
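  • A sketch of this correlation step, assuming two equally sampled progress signals and a known sensor spacing; the mapping of the sign of the lag to boarding or alighting depends on the sensor order and is only an example here.

```python
import numpy as np

def motion_from_signals(sig_a, sig_b, sensor_spacing_m, sample_rate_hz):
    """Estimate time offset, direction and speed from two progress signals."""
    a = np.asarray(sig_a, dtype=float) - np.mean(sig_a)   # remove common background
    b = np.asarray(sig_b, dtype=float) - np.mean(sig_b)
    corr = np.correlate(a, b, mode="full")
    lag = int(np.argmax(corr)) - (len(b) - 1)             # offset in samples
    if lag == 0:
        return None                                       # no usable time offset
    dt = lag / sample_rate_hz
    return {"direction": "in" if dt > 0 else "out",       # assumed sign convention
            "speed_m_s": abs(sensor_spacing_m / dt)}
```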
  • Since, as already explained at the outset, not all correlated signals can be assigned to one person, and since a person may also remain standing in the entrance area of a bus so that the course of the two signals recorded by the sensors 12 and 14 changes only little, the information determined by the correlation module 18.2 is to be linked with the information determined by the distance module 18.1. A person standing in the entrance area of a bus can easily be identified by means of the distance module 18.1. In the memory 20, the size information about a person is stored in association with the history signal recorded from that person. The combination of the two pieces of information is highly characteristic of a person and makes it possible to recognize a person not only when boarding but also when alighting.
  • Since linking the size information with the information obtained from the comparison of the progress signals permits a greater individualization of boarding and alighting persons, these persons can also be counted more accurately. The assignment of the information obtained with the aid of the distance module 18.1 to the information obtained with the aid of the correlation module 18.2, the targeted storage of this information and the retrieval of the stored information are carried out by the assignment module 18.3.
  • Taking into account the direction information from the correlation module 18.2, the assignment module 18.3 can identify a person as boarding or alighting. The counting unit 22 is connected to the assignment module 18.3 and designed such that a counter is incremented by one for each person identified by the assignment module 18.3 as boarding and decremented by one for each alighting person. The count in the counting unit 22 thus indicates the number of persons who are, for example, in a bus. For this purpose, the counting unit can be connected to a plurality of evaluation units 18 which are assigned to several entrance areas of a means of transport.
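  • A minimal counting-unit sketch fed with direction events from one or more assignment modules; the class and event names are illustrative.

```python
class CountingUnit:
    """Occupancy counter: +1 for each boarding person, -1 for each alighting person."""
    def __init__(self):
        self.count = 0

    def register(self, direction):
        if direction == "in":
            self.count += 1
        elif direction == "out":
            self.count -= 1

counter = CountingUnit()
for event in ["in", "in", "out", "in"]:   # events from the assignment module(s)
    counter.register(event)
print(counter.count)                      # -> 2 persons currently on board
```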
  • The detection device 10 'in FIG. 2 has a passive sensor unit formed by the sensors 12 and 14 for receiving the progress signal. In addition, an additional sensor 26 is provided, which receives a person-individual characteristic, such as the hair color or heart sounds or the like. The evaluation of the additional signal is carried out by an evaluation module 18.1 'of the evaluation unit 18'. The assignment to the recorded by the sensors 12 and 14 history signal takes place, as already for FIG. 1 described by the assignment module 18.3. The evaluated additional signal is associated with the history signal stored in the memory 20.
  • The detection device 30 in FIG. 3 is similar in construction to the detection device 10 of FIG. 1. Again, two infrared sensors 32 and 34, an evaluation unit 36, a memory 38 and a counting unit 40 are provided. An active radiation source such as the radiation source 16 of FIG. 1 is not provided.
  • Instead, at least the sensor 32 contains a plurality of sensor elements 32.1 in a matrix-like arrangement. The sensor elements 32.1 are located in the focal plane of an imaging device such as a converging lens 32.2. The radiation emitted by a person 42 is thus projected onto the sensor matrix 32.1 as an image of the person 42.
  • Each person results in a largely individual projection pattern which is characteristic of the respective person 42. This projection pattern is fed to an image module 36.1 of the evaluation unit 36. In the image module 36.1, a characteristic pattern is extracted from the projection pattern as a characteristic parameter and stored in the memory 38.
  • In parallel with the formation of the characteristic pattern, progress signals are recorded by means of the sensors 32 and 34. It is sufficient if the sensor 34 contains only one sensor element and if only one sensor element of the sensor matrix 32.1 is used for the course signal of the sensor 32.
  • The two progress signals are, as already described for FIG. 1, correlated with one another in a correlation module 36.2 of the evaluation unit 36 in order to obtain movement information. This movement information is stored in the memory 38 in association with the corresponding characteristic pattern.
  • An assignment module 36.3 of the evaluation unit 36 works analogously to the assignment module 18.3 of FIG. 1 and, depending on the stored output values of the image module 36.1 and of the correlation module 36.2, outputs for each boarding or alighting person a signal which serves to control the counting unit 40 and accordingly increments or decrements the counter in it.
  • According to the invention, the detection device 30 'differs FIG. 4 from the detection device 30 FIG. 3 essentially in that it comprises a radiation source 44, which makes it possible to expand the sensor matrix 32.1 into an active sensor unit. By means of the radiation source 44 and the sensor matrix 32.1, it is possible to form a three-dimensional contour of an object or a person in the detection area of the sensor matrix 32.1. This is done by evaluating the radiation detected by the sensor matrix 32.1 with respect to the radiation emitted by the radiation source 44 in an evaluation module 36.1 '. The evaluation module 36.1 'is for this purpose connected to the radiation source 44 and the sensor matrix 32.1 and designed so that from the radiation emitted by the radiation source 44, reflected by a person or an object and detected by the sensor matrix 32.1 radiation forms a matrix that the three-dimensional surface contour corresponds to the detected object or person. This matrix is stored as a characteristic parameter and the person individualizing information in the memory 38 associated with the history signal.
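  • A sketch of how such a contour matrix could be derived from per-element transit times, assuming a known mounting height above the floor; the 3x3 matrix and all numeric values are illustrative.

```python
import numpy as np

C = 299_792_458.0                              # speed of light in m/s

def contour_matrix(transit_times_s, mounting_height_m):
    """Convert per-element transit times of the sensor matrix into a matrix of
    surface heights above the floor (the three-dimensional surface contour)."""
    t = np.asarray(transit_times_s, dtype=float)
    distances = C * t / 2.0                    # out-and-back path halved
    return mounting_height_m - distances

# Round-trip example: heights -> transit times -> recovered contour matrix
heights = np.array([[0.00, 1.60, 0.00],
                    [1.55, 1.78, 1.55],
                    [0.00, 1.60, 0.00]])
tof = 2.0 * (2.10 - heights) / C
print(np.round(contour_matrix(tof, 2.10), 2))
```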
  • By comparing matrices, a person who has boarded can be recognized again when later alighting. For this purpose, the assignment module 36.3 is designed to compare matrices acquired when persons board with those matrices that were detected when persons alight. The boarding and alighting direction results from the progress signal. For the matrix comparison, the assignment module 36.3 is also designed to transform matrices, in particular to rotate them, in order to be able to take into account the different orientation of boarding and alighting persons and the resulting change in the contour images to be compared.
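  • A sketch of such a matrix comparison with an additional 180-degree rotation; a plain normalised correlation is assumed as the similarity measure, and the function names and the choice of measure are illustrative.

```python
import numpy as np

def similarity(m1, m2):
    """Normalised correlation of two equally sized contour matrices."""
    a, b = m1 - m1.mean(), m2 - m2.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def match_alighting(contour_out, stored_boarding_contours):
    """Compare an alighting person's contour matrix with the stored boarding
    matrices, also trying a 180-degree rotation for the reversed orientation."""
    best_id, best_score = None, -1.0
    for person_id, contour_in in stored_boarding_contours.items():
        score = max(similarity(contour_out, contour_in),
                    similarity(contour_out, np.rot90(contour_in, 2)))
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id, best_score
```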
  • Various variations of the described and claimed concepts make it possible to achieve the desired accuracy and individualization of a detection device.

Claims (15)

  1. Detection device (30') for detecting persons (42) or objects and their direction of movement comprising:
    - a radiation sensor arrangement (32.1) for detecting electromagnetic radiation with the wavelength of visible and/or invisible light, which emanates from a person or an object, and
    - an evaluation unit (36') that is connected to the sensor arrangement (32, 34) and is designed to form a variation signal, which corresponds to the time variation of the radiation detected by the radiation sensor arrangement, characterised in that the detection device further comprises
    - individualising means (32.1, 36.1'; 44) that are connected to the evaluation unit (36') and are designed to obtain information that individualises an object or person, and that are connected to a memory (38) which is designed to store at least a portion of the variation signal and the information individualizing the object or the person as characteristic parameters in association with the variation signal, and
    in that the detection device further comprises
    - parameter determining means (36.1', 44) that are connected to the evaluation unit (36') and that are designed to emit an additional signal and in that the evaluation unit (36') is designed to form the characteristic parameters as a function of the additional signal,
    wherein
    - the radiation sensor arrangement comprises a sensor matrix (32.1),
    - the parameter-determining means comprise a radiation source (44) for radiation which can be detected by the sensor matrix (32.1) as well as an evaluation module (36.1'),
    - and both the radiation source and the sensor matrix (32.1) are connected to the evaluation module (36.1'),
    - and the evaluation module is designed to form a matrix as an additional signal from the radiation sent from the radiation source (44), reflected by a person or an object and detected by the sensor matrix (32.1), which matrix determines the characteristic parameters and which represents information individualising the respective person, whereby the matrix corresponds to the three-dimensional surface contour of a detected object or a detected person.
  2. Detection device (30') according to claim 1, characterised in that the radiation source (44) is an infrared light source which preferably emits radiation in a wavelength range greater than 1400 nm.
  3. Detection device (30') according to claim 1 or 2, characterised in that the evaluation module (36.1) is connected with the radiation source (44) and the sensor arrangement (32.1) and is designed to determine the running times of a signal sent from the radiation source (44), reflected by an object or a person and received from the sensor arrangement (32, 34) as an additional signal.
  4. Detection device (30') according to one of claims 1 to 3, characterised in that the radiation source (44) is designed to emit a coded signal and in that the evaluation unit (36') is designed to determine the proportion of the coded signal in the radiation received by the sensor matrix (32.1).
  5. Detection device (30') according to claim 4, characterised in that the coded signal is a periodic signal and in that the evaluation unit (36') is designed to determine the running time of a reflected signal as a function of the phase relationship between a coded signal received by the sensor arrangement (32, 34) and a coded signal sent by the radiation source (44).
  6. Detection device (30') according to one of claims 1 to 5, characterised in that the sensor arrangement (32, 34) comprises at least two sensor elements and in that the evaluation unit (36') is designed to form at least two variation signals for different sensor elements.
  7. Detection device (30') according to one of claims 1 to 6, characterised in that the evaluation unit (36') is designed to compare sections of one or more variation signals which have been recorded simultaneously or with a time-shift.
  8. Detection device (30') according to claim 7, characterised in that the evaluation unit (36') is designed to form a correlation coefficient as the result of the comparison of the variation signal sections.
  9. Detection device (30') according to claim 7 or 8, characterised in that the evaluation unit (36') is designed to perform a comparison of signal sections originating from various sensor elements a plurality of times, in that the signal sections for each comparison are time shifted relative to one another by different time differences, and in that a running time signal is formed which corresponds to the time shift which has the greatest similarity or best correlation with the compared signal sections.
  10. Detection device (30') according to claim 9, characterised in that the evaluation unit (36') is designed to form a velocity signal from the running time signal and from a predetermined spacing of said sensor elements, to which the signal sections used to form the running time signal relate.
  11. Detection device (30') according to one of claims 1 to 10, characterised in that the sensor matrix comprises several sensor elements which are arranged in the form of a matrix and in that the evaluation unit (36') is designed to compare time-shifted signal sections originating from different sensor elements and to derive from the signal section comparison a directional signal, in that from the spatial arrangement of such sensor elements which are assigned to the signal sections of greatest similarity a directional vector is produced.
  12. Detection device (30') according to one of claims 1 to 11, characterised by an additional sensor which is designed to detect hair colour and emit an additional signal dependent on the hair colour.
  13. Detection device (30') according to one of claims 1 to 11, characterised by an additional sensor, which is designed in the form of a microphone for detecting an acoustic signal such as for example cardiac sounds and emitting an additional signal dependent on the acoustic signal.
  14. Detection device (30') according to one of claims 1 to 11, characterised by an additional sensor which is designed for detecting a scent signal and emitting an additional signal dependent on the scent signal.
  15. Detection device (30') according to one of claims 1 to 14 and a counting device for moving persons or objects, characterised in that the counting device (40) is connected to the detecting device.
EP20010954036 2000-07-13 2001-07-12 Detection device Active EP1224632B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE10034976 2000-07-13
DE2000134976 DE10034976B4 (en) 2000-07-13 2000-07-13 Detecting device for detecting persons
PCT/EP2001/008067 WO2002007106A1 (en) 2000-07-13 2001-07-12 Detection device

Publications (2)

Publication Number Publication Date
EP1224632A1 EP1224632A1 (en) 2002-07-24
EP1224632B1 true EP1224632B1 (en) 2009-12-16

Family

ID=7649374

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20010954036 Active EP1224632B1 (en) 2000-07-13 2001-07-12 Detection device

Country Status (12)

Country Link
US (1) US6774369B2 (en)
EP (1) EP1224632B1 (en)
JP (1) JP5064637B2 (en)
CN (1) CN1243327C (en)
AT (1) AT452387T (en)
AU (1) AU7640101A (en)
BR (1) BR0106974B1 (en)
DE (2) DE10034976B4 (en)
ES (1) ES2337232T3 (en)
MX (1) MXPA02002509A (en)
RU (1) RU2302659C2 (en)
WO (1) WO2002007106A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8152198B2 (en) * 1992-05-05 2012-04-10 Automotive Technologies International, Inc. Vehicular occupant sensing techniques
GB2366369B (en) * 2000-04-04 2002-07-24 Infrared Integrated Syst Ltd Detection of thermally induced turbulence in fluids
DE102004009541A1 (en) * 2004-02-23 2005-09-15 Iris-Gmbh Infrared & Intelligent Sensors User controllable acquisition system
WO2006135354A2 (en) * 2005-06-08 2006-12-21 Nielsen Media Research, Inc. Methods and apparatus for indirect illumination in electronic media rating systems
WO2007106806A2 (en) * 2006-03-13 2007-09-20 Nielsen Media Research, Inc. Methods and apparatus for using radar to monitor audiences in media environments
EP2308035A4 (en) * 2008-06-13 2011-10-19 Tmt Services And Supplies Pty Ltd Traffic control system and method
EP2267674A1 (en) * 2009-06-11 2010-12-29 Koninklijke Philips Electronics N.V. Subject detection
DE102009027027A1 (en) 2009-06-18 2010-12-30 Iris-Gmbh Infrared & Intelligent Sensors Survey system and operating procedures for a survey system
DE202009011048U1 (en) 2009-09-24 2009-12-31 Vitracom Ag Device for determining the number of persons crossing a passage
US8587657B2 (en) * 2011-04-13 2013-11-19 Xerox Corporation Determining a number of objects in an IR image
SG194714A1 (en) * 2011-05-03 2013-12-30 Shilat Optronics Ltd Terrain surveillance system
DE102011053639A1 (en) * 2011-09-15 2013-03-21 Viscan Solutions GmbH Golf Course Management System
CN103576428B (en) * 2012-08-02 2015-11-25 光宝科技股份有限公司 Laser projection system has security mechanisms
GB201219097D0 (en) 2012-10-24 2012-12-05 Metrasens Ltd Apparatus for detecting ferromagnetic objects at a protected doorway assembly
DE102013204145A1 (en) * 2013-02-27 2014-09-11 Init Innovative Informatikanwendungen In Transport-, Verkehrs- Und Leitsystemen Gmbh Arrangement and method for monitoring movement of persons in buildings
JP6280722B2 (en) * 2013-10-25 2018-02-14 矢崎エナジーシステム株式会社 Image analysis system, analysis apparatus, and analysis method
KR101582726B1 (en) 2013-12-27 2016-01-06 재단법인대구경북과학기술원 Apparatus and method for recognizing distance of stereo type
CN103955980B (en) * 2014-05-13 2017-02-15 温州亿通自动化设备有限公司 Human body model feature based bus passenger flow statistic device and processing method
CN104183040B (en) * 2014-08-21 2016-04-20 成都易默生汽车技术有限公司 Passenger carriage overloading detection system and detection method thereof
CN110045422A (en) 2014-12-18 2019-07-23 梅特拉森斯有限公司 Security system and the method for detecting contraband
DE102015202232A1 (en) * 2015-02-09 2016-08-11 Iris-Gmbh Infrared & Intelligent Sensors Data Acquisition System

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4769697A (en) * 1986-12-17 1988-09-06 R. D. Percy & Company Passive television audience measuring systems
US5101194A (en) * 1990-08-08 1992-03-31 Sheffer Eliezer A Pattern-recognizing passive infrared radiation detection system
JP2749191B2 (en) 1990-11-06 1998-05-13 新川電機株式会社 Counting method of height by passing people
DE4040811A1 (en) * 1990-12-14 1992-07-09 Iris Gmbh Infrared & Intellige Direction-selective counting and switching device
JP2963236B2 (en) * 1991-05-02 1999-10-18 エヌシーアール インターナショナル インコーポレイテッド Counting method of passing people
NL9200283A (en) * 1992-02-17 1993-09-16 Aritech Bv Monitoring system.
DE4220508C2 (en) * 1992-06-22 1998-08-20 Iris Gmbh Infrared & Intellige A device for detecting persons
US5555512A (en) * 1993-08-19 1996-09-10 Matsushita Electric Industrial Co., Ltd. Picture processing apparatus for processing infrared pictures obtained with an infrared ray sensor and applied apparatus utilizing the picture processing apparatus
JP2978374B2 (en) * 1992-08-21 1999-11-15 松下電器産業株式会社 An image processing apparatus and method and control device for an air conditioner
JP2874563B2 (en) * 1994-07-07 1999-03-24 日本電気株式会社 Laser surveying instrument
JPH09161115A (en) * 1995-12-05 1997-06-20 Nippon Telegr & Teleph Corp <Ntt> Entrance/exit management sensor and its signal processing method
JP3521637B2 (en) * 1996-08-02 2004-04-19 オムロン株式会社 Passing people measuring apparatus and entrance and exit toll management system using the same
JP3233584B2 (en) * 1996-09-04 2001-11-26 松下電器産業株式会社 Passing people detection device
DE19721741A1 (en) * 1997-05-24 1998-11-26 Apricot Technology Gmbh Moving object detection and counting method
IL122846A (en) * 1998-01-04 2003-06-24 Visonic Ltd Passive infra-red intrusion sensing system covering downward zone
US6037594A (en) * 1998-03-05 2000-03-14 Fresnel Technologies, Inc. Motion detector with non-diverging insensitive zones
JP4016526B2 (en) * 1998-09-08 2007-12-05 富士ゼロックス株式会社 3D object identification device
IL130398A (en) * 1999-06-09 2003-11-23 Electronics Line E L Ltd Method and apparatus for detecting moving objects, particularly intrusions
DE19962201A1 (en) 1999-09-06 2001-03-15 Holger Lausch Determination of people activity within a reception area using cameras and sensors

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8242476B2 (en) 2005-12-19 2012-08-14 Leddartech Inc. LED object detection system and method combining complete reflection traces from individual narrow field-of-view channels
US8436748B2 (en) 2007-06-18 2013-05-07 Leddartech Inc. Lighting system with traffic management capabilities
US8600656B2 (en) 2007-06-18 2013-12-03 Leddartech Inc. Lighting system with driver assistance capabilities
US8723689B2 (en) 2007-12-21 2014-05-13 Leddartech Inc. Parking management system and method using lighting system
US8310655B2 (en) 2007-12-21 2012-11-13 Leddartech Inc. Detection and ranging methods and systems
US8842182B2 (en) 2009-12-22 2014-09-23 Leddartech Inc. Active 3D monitoring system for traffic detection
USRE47134E1 (en) 2011-05-11 2018-11-20 Leddartech Inc. Multiple-field-of-view scannerless optical rangefinder in high ambient background light
US8908159B2 (en) 2011-05-11 2014-12-09 Leddartech Inc. Multiple-field-of-view scannerless optical rangefinder in high ambient background light
US9378640B2 (en) 2011-06-17 2016-06-28 Leddartech Inc. System and method for traffic side detection and characterization
US9235988B2 (en) 2012-03-02 2016-01-12 Leddartech Inc. System and method for multipurpose traffic detection and characterization
WO2016128081A1 (en) 2015-02-09 2016-08-18 Iris-Gmbh Infrared & Intelligent Sensors Control system
DE102015202223A1 (en) 2015-02-09 2016-08-11 Iris-Gmbh Infrared & Intelligent Sensors Control system
US10488492B2 (en) 2015-09-08 2019-11-26 Leddartech Inc. Discretization of detection zone
WO2019092277A1 (en) 2017-11-13 2019-05-16 Iris-Gmbh Infrared & Intelligent Sensors Detection system

Also Published As

Publication number Publication date
CN1393005A (en) 2003-01-22
MXPA02002509A (en) 2004-09-10
AU7640101A (en) 2002-01-30
DE10034976A1 (en) 2002-01-31
EP1224632A1 (en) 2002-07-24
JP5064637B2 (en) 2012-10-31
ES2337232T3 (en) 2010-04-22
AT452387T (en) 2010-01-15
RU2002109242A (en) 2004-02-10
RU2302659C2 (en) 2007-07-10
CN1243327C (en) 2006-02-22
BR0106974A (en) 2002-05-21
DE10034976B4 (en) 2011-07-07
WO2002007106A1 (en) 2002-01-24
JP2004504613A (en) 2004-02-12
US20020148965A1 (en) 2002-10-17
BR0106974B1 (en) 2012-12-11
DE50115259D1 (en) 2010-01-28
US6774369B2 (en) 2004-08-10


Legal Events

Date Code Title Description
AX Request for extension of the european patent to:

Free format text: AL;LT;LV;MK;RO;SI

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

17P Request for examination filed

Effective date: 20020724

17Q First examination report despatched

Effective date: 20030521

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 50115259

Country of ref document: DE

Date of ref document: 20100128

Kind code of ref document: P

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20091216

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2337232

Country of ref document: ES

Kind code of ref document: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100416

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100317

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

26N No opposition filed

Effective date: 20100917

BERE Be: lapsed

Owner name: IRIS-GMBH INFRARED & INTELLIGENT SENSORS

Effective date: 20100731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091216

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100731

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100712

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100712

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 16

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 17

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 18

PGFP Annual fee paid to national office [announced from national office to epo]

Ref country code: GB

Payment date: 20180725

Year of fee payment: 18

PGFP Annual fee paid to national office [announced from national office to epo]

Ref country code: DE

Payment date: 20190808

Year of fee payment: 19

Ref country code: FR

Payment date: 20190724

Year of fee payment: 19

Ref country code: IT

Payment date: 20190723

Year of fee payment: 19

Ref country code: TR

Payment date: 20190702

Year of fee payment: 19

Ref country code: ES

Payment date: 20190822

Year of fee payment: 19