AU2013209360A1 - Apparatus and method for the identification and documentation of at least one object driving through a radiation field - Google Patents

Apparatus and method for the identification and documentation of at least one object driving through a radiation field

Info

Publication number
AU2013209360A1
Authority
AU
Australia
Prior art keywords
camera
sensor
position data
data
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2013209360A
Inventor
Michael Lehning
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jenoptik Robot GmbH
Original Assignee
Jenoptik Robot GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jenoptik Robot GmbH filed Critical Jenoptik Robot GmbH
Publication of AU2013209360A1 publication Critical patent/AU2013209360A1/en

Classifications

    • G01S 13/867 - Combinations of radar systems with non-radar systems; combination of radar systems with cameras
    • G01S 13/92 - Radar or analogous systems specially adapted for traffic control, for velocity measurement
    • G01S 17/42 - Lidar systems determining position data of a target; simultaneous measurement of distance and other co-ordinates
    • G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G08G 1/0175 - Traffic control systems for road vehicles; identifying vehicles by photographing, e.g. when violating traffic rules
    • H04N 13/25 - Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics
    • H04N 13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects

Abstract

The invention is directed to an apparatus and a method for identifying and documenting at least one object (8) driving through a measurement field (5). The apparatus includes a radiation-measuring sensor (1) and two cameras (2, 3) which are arranged relative to one another such that the sensor radiation field (1.4) and the two object fields (2.3, 3.3) form a common overlapping area which determines the measurement field (5). In carrying out the method according to the invention, sensor position data are obtained by means of the sensor (1) at measuring times, and image recordings are made by the cameras (2, 3) at triggering times, the camera position data being calculated from the picture data of these image recordings. The sensor position data and the camera position data are compared with one another and checked for plausibility. In case of plausibility, the object (8) is securely identified. (Fig. 3)

Description

APPARATUS AND METHOD FOR THE IDENTIFICATION AND DOCUMENTATION OF AT LEAST ONE OBJECT DRIVING THROUGH A RADIATION FIELD

1. FIELD OF THE INVENTION

The invention is directed to an apparatus and a method for identifying and documenting at least one object driving through a sensor radiation field.

2. BACKGROUND TO THE INVENTION

An apparatus and method for identifying and documenting at least one object driving through a sensor radiation field is known, for example, from EP 2 284 568 A2.

Various approaches for identifying and documenting objects, such as are used particularly in the field of traffic monitoring, are known from the prior art. In this regard, difficulties can arise particularly because more than one object can be located in the radiation field at the same time and cannot be identified with certainty due to an inexact determination of position or due to bent-beam reflections.

In order to take into account the various circumstances under which stationary objects and moving objects are monitored, EP 2 284 568 A2 suggests a system for acquiring objects in which a radar sensor is provided as a radiation-measuring sensor having a transmitter for transmitting a measurement beam into a sensor radiation field and a receiver for receiving reflected portions of the measurement beam. The radar sensor serves to measure the speed of an object located at a greater distance from the radar sensor. On a horizontal line, a first camera is arranged on one side of the radar sensor and a second camera on the other side, the object fields of both cameras lying within the sensor radiation field. The optical axes of the two cameras are oriented parallel to one another and to the axis of the sensor. A picture of a scene is taken by each camera at at least one triggering time. Owing to the horizontal distance of the cameras from one another, the scene is captured by each camera at a somewhat different angle. The horizontal deviation (horizontal disparity) which is therefore noted when comparing these pictures, which were taken at the same time, allows spatial information about the object to be determined. The object can be tracked over a timed sequence of photographs and the speed of the object at a shorter distance can be derived therefrom. Two cameras with a high resolving power are used so that, in addition to identification, it is also possible to document the object with a sufficiently high quality. The measurement data of the sensor and the image data of the cameras are processed by means of a computing unit.

A device according to EP 2 284 568 A2 is disadvantageous due to the high computing power and computing time required for fast processing of the information of the two high-resolution cameras. Fast processing of the information is required for monitoring moving traffic. Further, an alternative capture of measurement data or image data at different distance ranges is provided therein, so that an identification of an object is not verified but is carried out alternatively using either the radar device or the camera.

Against this backdrop, it would be advantageous to improve the identification and documentation of an object driving through a radiation field.
3. SUMMARY OF THE INVENTION

In an apparatus containing a sensor with a transmitter for transmitting a measurement beam into a sensor radiation field covering a measurement field and a receiver unit for receiving portions of the measurement beam which are reflected by at least one object for forming measurement data, a first camera and a second camera which are arranged at a base distance relative to one another and which each supply picture data, and a computing unit for processing the measurement data and the picture data, the present invention, in one aspect thereof, provides that the two cameras are arranged in known positional relationships with respect to one another and to the sensor such that their object fields overlap with the sensor radiation field and the overlapping area formed in this way determines the measurement field. In so doing, the first camera has a higher resolving power than the second camera.

The computing unit is formed in such a way that it determines sensor position data from measurement data acquired at a plurality of measuring times within a measurement duration and can store these sensor position data so as to be associated with the measuring times. The computing unit is further formed in such a way that, with knowledge of the camera parameters and the spatial arrangement of the two cameras and by means of a method of image recognition, it can calculate camera position data of the at least one object from the picture data of the two cameras acquired at at least one same triggering time and can store these camera position data so as to be associated with the at least one triggering time, and in such a way that, taking into account the positional relationships and the time correlation between the at least one triggering time and the measuring times, it can compare the sensor position data and the camera position data with one another and check them for plausibility.
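The plausibility check at the heart of this aspect can be made concrete with a short sketch. The following Python fragment is purely illustrative and not part of the patent disclosure; all names and numeric values (sensor_to_camera_frame, SENSOR_YAW, the 0.5 m tolerance) are assumptions. It converts sensor position data (distance and angle relative to the sensor axis) into the coordinate frame of the first camera using the known positional relationship, then checks whether the stereoscopically obtained camera position data lie within a permissible tolerance.

```python
import math

# Assumed positional relationship between sensor and first camera:
# translation of the sensor origin in the camera frame (metres) and the
# yaw angle between sensor axis and camera axis (radians).
SENSOR_OFFSET_X = 0.30
SENSOR_OFFSET_Y = 0.00
SENSOR_YAW = math.radians(2.0)
TOLERANCE_M = 0.5  # permissible deviation for plausibility

def sensor_to_camera_frame(distance_m, angle_rad):
    """Convert sensor position data (distance, angle to the sensor axis)
    into x/y coordinates in the first camera's frame."""
    # Cartesian position in the sensor frame (y along the sensor axis).
    xs = distance_m * math.sin(angle_rad)
    ys = distance_m * math.cos(angle_rad)
    # Rotate by the known yaw between the axes, then translate.
    xc = xs * math.cos(SENSOR_YAW) - ys * math.sin(SENSOR_YAW) + SENSOR_OFFSET_X
    yc = xs * math.sin(SENSOR_YAW) + ys * math.cos(SENSOR_YAW) + SENSOR_OFFSET_Y
    return xc, yc

def is_plausible(sensor_pos, camera_pos):
    """Sensor and camera position data are plausible when they can be
    converted into one another within the permissible tolerance."""
    xs, ys = sensor_to_camera_frame(*sensor_pos)
    xc, yc = camera_pos
    return math.hypot(xs - xc, ys - yc) <= TOLERANCE_M

# Example: the radar reports 42.0 m at 1.5 degrees to the sensor axis; the
# stereoscopic evaluation places the object at (-0.05, 42.1) m in the
# camera frame -> deviation of roughly 0.1 m, so the object is identified.
print(is_plausible((42.0, math.radians(1.5)), (-0.05, 42.1)))  # True
```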
Some of the terms used in describing the invention will be explained in the following for the sake of clarity.

An angular field is determined by the quotient of the length of a line of a receiver matrix of a digital camera and the focal length of the associated camera objective. Consequently, a small angular field only allows digital imaging of a small object field, whereas a larger angular field allows digital imaging of a larger object field.

"Driving through" a radiation field, as it pertains to an object, is equivalent to passing through or moving through in some other way. An object may be, for example, a vehicle or an individual such as a pedestrian, or an animal.

By methods of image recognition is meant methods known from the prior art by which known features of objects are found in pictures or picture data and which are based, e.g., on edge detection.

"Identification" means to associate an object with obtained measurement data, which in the case of the present method includes a verification of the association. For example, when a measured speed is associated with a measured distance, the associated object is not identified until a distance has been determined for the object by means of a second detection method, which distance is correlated thereto.

Sensor position data and camera position data which are compared with one another are considered plausible when they relate to one another in a particular way. Thus they can be converted into one another exactly or can lie within a permissible tolerance. Comparison between the sensor position data and the camera position data presents a possibility for verifying the position of an object by means of two different methods (use of a radiation-measuring sensor vs. use of cameras).

Camera position data of an object, particularly the distance of the object, are obtained by means of the apparatus according to the invention by using two cameras with different camera parameters and applying known stereoscopic methods. In this regard, it is advantageous that one of the cameras has only a low resolving power, so that the required computing power of the apparatus is comparatively reduced without leading to disadvantages when calculating and processing the camera position data. However, combining two cameras with appreciably different resolutions is not suitable for producing a two-dimensional spatial image or for three-dimensional imaging.

For legally credible proof of a traffic violation, it is required that an image (picture) be produced. In order to grasp a two-dimensional spatial image or a three-dimensional image, the human brain requires, at the point in time at which the image is viewed, image information of roughly the same quality. Thus, if a two-dimensional spatial image or a three-dimensional image is to be produced by a combination of at least two pictures that were taken from angles that are offset slightly with respect to one another, these pictures must have roughly the same quality, for example, the same resolution.

In a preferred embodiment of the apparatus according to the invention, the resolving power of the first camera is higher than the resolving power of the second camera by a factor of at least two. A factor of at least four is advantageous in order to achieve a substantial reduction of the required computing power.

A so-called violation camera, which works as a single-shot camera and by means of which at least one high-resolution picture of a violating vehicle can be taken, can be used as the first camera.
Owing to the high resolving power, relevant details of the violating vehicle such as the license plate and the vehicle operator can be captured and displayed. The angular field of the first camera is preferably small, for example, about 10°.

The second camera can be a sequence camera which is synchronized with the violation camera and by which a sequence of pictures is taken, which sequence is temporally associated with a violation, e.g., a speed limit being exceeded. The resolving power of the second camera is lower than that of the first camera, while the angular field of the second camera is preferably larger than the angular field of the first camera. In an advantageous manner, an object such as a violating vehicle can be detected in appreciably different positions in at least two temporally successive pictures by means of a larger angular field. The picture data for obtaining the camera position data are then obtained at the same triggering time of the cameras.

In a further embodiment of the apparatus according to the invention, the second camera can be a freely running sequence camera, so that the triggering times of the two cameras do not coincide. In this case there is no sequence picture synchronous in time with a violation picture, but an artificial sequence picture can be interpolated or extrapolated from at least two sequence pictures, taking into account the known camera parameters of the sequence camera, and can be associated with the triggering time of the violation camera. Sequence pictures directly before and after the triggering time of the violation picture are preferably used.
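As an illustration of the free-running case, the following Python sketch interpolates an artificial object position at the violation camera's triggering time from the two sequence pictures taken directly before and after it. It is not part of the patent disclosure; the function name, the frame format and the timing values are assumptions.

```python
def interpolate_position(t_trigger, frame_a, frame_b):
    """Linearly interpolate the object position at the violation camera's
    triggering time from two free-running sequence pictures.

    frame_a and frame_b are (timestamp_s, (x_px, y_px)) tuples for the
    sequence pictures directly before and after the triggering time.
    A sketch only; a real system would also account for exposure
    midpoints, rolling shutter and the known camera parameters.
    """
    (t_a, (xa, ya)), (t_b, (xb, yb)) = frame_a, frame_b
    if not t_a <= t_trigger <= t_b:
        raise ValueError("trigger time must lie between the two frames")
    w = (t_trigger - t_a) / (t_b - t_a)  # interpolation weight in [0, 1]
    return (xa + w * (xb - xa), ya + w * (yb - ya))

# Example: sequence pictures at t=0.00 s and t=0.04 s (25 fps free-running
# camera); the violation picture was triggered at t=0.015 s.
print(interpolate_position(0.015, (0.00, (812.0, 390.0)),
                           (0.04, (860.0, 391.5))))  # (830.0, 390.5625)
```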
In further embodiments of the apparatus according to the invention, the second camera can also be suitable for, and used as, a single-shot camera for taking a violation picture. Accordingly, the second camera can be provided for taking one or more violation pictures within close range of the apparatus according to the invention. Also with respect to the angular field of the cameras, there is no need for similar cameras: the angular fields can be identical but can also advantageously differ.

The radiation-measuring sensor can be a radar device or a laser scanner. The radar device can have a measurement beam with a small aperture angle of preferably less than or equal to 5°. Only distances may be derived from the measurement data thereof as sensor position data. A radar device of this type is not suitable for tracking an object; repeated measurements serve only for verification of the measurement data. The radar device can also be a radar device with an aperture angle of preferably 20° or more, by which the distance from the object and the solid angle between the sensor axis of the radar device and the object can be detected. It allows an unambiguous determination of sensor position data of the object in the sensor radiation field and is suitable for tracking the object.

In accordance with a further aspect, the present invention provides a detection and documenting method in which a sensor directs a measurement beam into a sensor radiation field, receives portions of the measurement beam which are reflected at at least one object, and measurement data are formed at measuring times during a measurement duration. Sensor position data are derived from the measurement data and are stored so as to be associated with the measuring times. A first camera with a relatively higher resolving power and a second camera with a relatively lower resolving power are arranged at a base distance relative to one another and relative to the sensor in such a way that their object fields overlap with the sensor radiation field, and a common overlapping area is formed which determines a measurement field. Picture data are obtained by the two cameras at at least one same triggering time, and the position of the at least one object relative to at least one of the two cameras is calculated from these picture data, with knowledge of the camera parameters and the positional relationships of the cameras relative to one another, by means of a method of image recognition, and is stored as camera position data so as to be associated with the at least one triggering time. The sensor position data at at least one measuring time and the camera position data at at least one triggering time are compared with one another, taking into account the positional relationships of the two cameras relative to the sensor, and are checked for plausibility in order to identify the object.

In case of plausibility, the picture data are stored and, as the case may be, provided with an electronic time stamp and date stamp. Pertinent details of the pictures, e.g., the license plate, can be extracted from the picture or pictures and, e.g., retrievably stored in another file. The pictures can be displayed, for example, on a monitor, or the picture data can be printed out as a picture. Absence of plausibility can be caused by bent-beam measurements, for example.
The at least one triggering time preferably occurs between two measuring times within the measurement duration, and the plausibility of the camera position data is checked with the sensor position data associated with one of the two measuring times. It is also advantageous when a triggering time occurs between two measuring times within the measurement duration and the plausibility of the camera position data is checked with the sensor position data associated with both measuring times. It is most advantageous in technical respects relating to computation when a triggering time coincides with a measuring time and the plausibility of the camera position data is checked with the sensor position data associated with this measuring time.

The at least one triggering time can also occur after the measurement duration; in this case, with knowledge of the sensor position data associated with the individual measuring times, sensor position data are extrapolated for the triggering time so that they can be compared with the camera position data.
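A minimal sketch of this extrapolation, assuming constant speed over the short gap and a simple list of (time, distance) sensor position data, could look as follows in Python; the function name and all values are illustrative only, not from the patent.

```python
def extrapolate_distance(measurements, t_trigger):
    """Extrapolate a sensor distance to a triggering time lying after the
    measurement duration, using the speed derived from the last two
    measurements. `measurements` is a time-ordered list of
    (time_s, distance_m) pairs; constant speed is assumed."""
    (t1, d1), (t2, d2) = measurements[-2], measurements[-1]
    radial_speed = (d2 - d1) / (t2 - t1)  # m/s along the sensor axis
    return d2 + radial_speed * (t_trigger - t2)

# Example: the object closed from 50.0 m to 47.2 m between the last two
# measuring times; the cameras were triggered 0.05 s after the last one.
print(extrapolate_distance([(0.00, 50.0), (0.10, 47.2)], 0.15))  # 45.8 m
```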
In carrying out the method according to the invention, it is preferred that the measurement beam is emitted by means of the radiation-measuring sensor over a measurement duration. The measurement beam partially impinges on an object moving in the radiation field. Portions of the measurement beam are reflected by the object, acquired by the receiver unit and converted into measurement signals. Measurement data of the object are formed from the measurement signals, and position data of the object with respect to the sensor (sensor position data) are obtained in turn from these measurement data. In case an angle-resolving radar device or a laser scanner is used, the sensor position data are formed by distance data and angle data. If a radar device which is not angle-resolving and which emits a measurement beam in the form of only a very narrow radiation cone is used, the sensor position data are given from knowledge of the orientation of the sensor axis and the obtained distance data. In this case, there is a certain fuzziness of the sensor position data due to the expansion of the measurement beam around the sensor axis.

The method according to the invention can also be combined with methods for optical vehicle classification and optical flow methods.

Further aspects and preferred features of the invention will become apparent to the skilled addressee from the following description of a number of preferred, but not exclusive, embodiments of the present invention, provided with reference to the accompanying drawings.

4. BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 shows a schematic illustration of an apparatus for detecting, identifying and documenting an object passing through a sensor field in accordance with a first embodiment of the present invention, in which the sensor and the two cameras are accommodated in a common housing;

Fig. 2 shows a schematic illustration of a second embodiment of an apparatus according to the invention, in which the two cameras are arranged at a distance from the sensor and the optical axes thereof are oriented parallel to one another; and

Fig. 3 shows a schematic illustration of a third embodiment of an apparatus according to the invention, in which the optical axes of the two cameras intersect.

5. DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION

A first embodiment of an apparatus according to the invention is shown in Fig. 1. As in all of the embodiments, its essential elements include a sensor 1 with a transmitter 1.2 for emitting a measurement beam and a receiver unit 1.3 for receiving reflected measurement radiation, a first camera 2, a second camera 3 and a computing unit 6.

For understanding the apparatus and the method, there is no need to differentiate between the axes of the transmitter 1.2 and the receiver unit 1.3, the latter of which can also have a plurality of axes, e.g., when the sensor 1 is an angle-resolving radar sensor. These axes, which are oriented parallel to one another and at a negligible distance from one another, are referred to collectively as a sensor axis 1.1. The transmitter 1.2 emits a measurement beam which forms a sensor radiation field 1.4 around the sensor axis 1.1. Parameters of the two cameras 2, 3 which are important for the apparatus and method include the optical axes 2.1, 3.1 of the cameras 2, 3, their angular fields 2.2, 3.2 and object fields 2.3, 3.3, and their base distance 10 with respect to one another.

The three embodiments illustrated in the figures differ from one another primarily by different spatial arrangements of the sensor 1 and the two cameras 2, 3 relative to one another and with respect to the angular positions of the sensor axis 1.1 and of the two camera axes 2.1, 3.1, referred to hereinafter as the positional relationship. The range of variation shown herein is intended to demonstrate the adaptability of the apparatus to different measurement tasks, e.g., speed monitoring, monitoring of red-light violations, or presence monitoring. In an advantageous manner, each of the two cameras 2, 3 can be optimally arranged for its actual purpose, e.g., for a close-up shot of a vehicle in which the face of the driver and/or the license plate are/is discernible, for capturing the vehicle in its environment, e.g., also so as to be visible together with a switched traffic light, or for taking picture sequences in which the forward movement of the vehicle is documented.

In every case, the sensor 1 and the two cameras 2, 3 must be arranged relative to one another such that the sensor radiation field 1.4 and the two object fields 2.3, 3.3 overlap, as is shown, for example, in Fig. 1 in a smaller far zone and in Fig. 2. The overlapping area formed in this way forms a measurement field 5. Only an object 8 which moves through this measurement field 5 can be identified and documented by the apparatus or method. The measuring times and the triggering times preferably lie within a measurement duration while the object 8 drives through the measurement field 5. However, they can also occur before or after the measurement duration. Association can then be carried out, as the case may be, by means of extrapolation with knowledge of the speed derived from the sensor data. The spatial positional relationships relative to one another must also be known in order that the sensor data and camera data obtained by the apparatus can be converted into one another and accordingly compared, which will be described later. For the following description, the measurement field 5 lies on a roadway 9. The installation site of the sensor 1 with respect to the roadway edge 9.1 and the roadway 9, and the orientation of the sensor axis 1.1, can be selected as a reference basis for this purpose.
In the first embodiment, shown in Fig. 1, the first camera 2 and the second camera 3 are arranged with the sensor 1 on a horizontal line, and the optical axis 2.1 of the first camera, the sensor axis 1.1 and the optical axis 3.1 of the second camera are oriented parallel to one another. A measurement beam which is emitted through the transmitter 1.2 forms a sensor radiation field 1.4. If an object 8 moves through this sensor radiation field 1.4, a portion of the measurement beam is reflected onto the receiver unit 1.3. The reception signals which are generated in this way are fed to the computing unit 6, where they are processed to form measurement data (hereinafter sensor data) which are stored by the computing unit 6 in each instance so as to be associated with a measuring time.

The two cameras 2, 3 are optimized for different uses. Thus the first camera 2 has a comparatively higher resolving power through the use of a receiver matrix with, e.g., 16 megapixels and a comparatively smaller angular field 2.2 of, e.g., 10°, through which the object field 2.3 of the first camera is determined along the camera axis 2.1 thereof. The second camera 3 has a comparatively lower resolving power through the use of a receiver matrix with, e.g., 4 megapixels and a comparatively larger angular field 3.2 of, e.g., 30°, through which the object field 3.3 of the second camera is determined along the optical axis 3.1 thereof. The two optical axes 2.1, 3.1 have a base distance 10 of, e.g., 25 cm relative to one another. Different sections of a scene with different amounts of picture data (hereinafter camera data) can be acquired by the two cameras 2, 3.

The sensor 1 and the two cameras 2, 3 are signal-connected to the computing unit 6. The sensor data and camera data generated respectively by the sensor 1 and by the two cameras 2, 3 are stored and processed in the computing unit 6. Position data of the object 8 which are associated with measuring times, and by means of which a temporary position of the object 8 relative to the sensor 1 is determined, are derived from the sensor data. These position data (hereinafter sensor position data) can relate only to the distance of the object 8 from the sensor 1 (sensor distances) when, e.g., the sensor radiation field 1.4 is very narrow, so that the temporary position is given with sufficient exactness by the distance and the orientation of the sensor axis 1.1. The sensor position data can also relate to sensor distances and angles when the sensor 1 is, e.g., a radar device with a broad sensor radiation field 1.4 and an angle-resolving receiver unit 1.3, or a laser scanner.

Camera data are acquired through the two cameras 2, 3 at the same triggering time. Using known methods of image recognition and with knowledge of the base distance 10, position data of the object 8 relative to the cameras 2, 3 (hereinafter camera position data) are calculated from those camera data comprising at least a partial image of the object 8. While the angular position of the object 8 relative to the optical axes of the cameras 2, 3 can be calculated for each camera 2, 3 independently of the other from the imaging of the object 8 on the receiver matrices of the cameras 2, 3, it is necessary, as is well known, to reckon the camera data of both cameras 2, 3 together in order to calculate the distance from the cameras 2, 3 (camera distance).
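The distance calculation from the camera data of both cameras can be sketched as follows for the parallel-axis arrangement of the first embodiment. The sketch is not from the patent: the pinhole model, the pixel counts (a 4896-pixel line for the 16-megapixel camera, a 2448-pixel line for the 4-megapixel camera) and the function names are assumptions chosen to match the example values above (10° and 30° angular fields, 25 cm base distance). Because the two receiver matrices differ in resolution and angular field, the pixel coordinates are first converted into viewing angles, which serve as the common currency of the two cameras.

```python
import math

def pixel_to_angle(x_px, n_pixels, angular_field_deg):
    """Viewing angle of a pixel column relative to the optical axis for a
    pinhole camera with the given line length and angular field."""
    f_px = (n_pixels / 2) / math.tan(math.radians(angular_field_deg) / 2)
    return math.atan((x_px - n_pixels / 2) / f_px)

def camera_distance(x1_px, x2_px, base_m=0.25):
    """Triangulate the camera distance from one picture of each camera,
    taken at the same triggering time, with parallel optical axes.
    Camera 2 sits base_m to the right (positive x) of camera 1."""
    a1 = pixel_to_angle(x1_px, n_pixels=4896, angular_field_deg=10.0)
    a2 = pixel_to_angle(x2_px, n_pixels=2448, angular_field_deg=30.0)
    # Viewing rays: x = z * tan(a1) from camera 1 and
    # x = base_m + z * tan(a2) from camera 2; equating both gives z.
    return base_m / (math.tan(a1) - math.tan(a2))

# Example: the object appears 120 px right of centre in the picture of
# camera 1 and 13 px left of centre in that of camera 2.
print(camera_distance(4896 / 2 + 120, 2448 / 2 - 13))  # ~35 m
```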
The computing unit 6 is designed to compare the sensor position data and the camera position data with one another and to check the plausibility thereof. To this end, the positional relationship of the sensor 1 relative to the two cameras 2, 3 must be known and stored in the computing unit 6. A monitor is connected as display unit 7 to the computing unit 6, and the sensor position data, camera position data and picture data can be visualized in their entirety by means of this monitor.

Figs. 2 and 3 show two further embodiments of an apparatus according to the present invention, in which the two optical axes of the cameras 2, 3 form an identical angle with the sensor axis 1.1 (Fig. 2) or in which all of the aforementioned axes form different angles with one another (Fig. 3). Accordingly, there is a different position of the measurement field 5, which is known with knowledge of the positional relationships and the sensor parameters and camera parameters. The camera distances can be calculated from the camera data of the two cameras 2, 3 by means of known trigonometric relationships such as the sine rule.
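For the converging arrangement, such a sine-rule computation might look like the following Python sketch (illustrative only; the angle convention and names are assumptions): the two cameras and the object form a triangle whose base is the base distance 10, and the viewing-ray angles at the two cameras determine the angle at the object.

```python
import math

def camera_distance_sine_rule(alpha_deg, beta_deg, base_m=0.25):
    """Distance from the first camera to the object when the optical axes
    intersect (Fig. 3): solve the triangle camera 1 - camera 2 - object.

    alpha_deg: angle at camera 1 between the baseline and the viewing ray
               to the object; beta_deg: the corresponding angle at camera 2.
    """
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    gamma = math.pi - alpha - beta  # angle at the object
    # Sine rule: the side camera 1 -> object is opposite the angle beta,
    # and the baseline is opposite the angle gamma.
    return base_m * math.sin(beta) / math.sin(gamma)

# Example: viewing rays at 89.8 and 89.9 degrees to the baseline form a
# long, thin triangle, i.e. an object roughly 48 m away.
print(camera_distance_sine_rule(89.8, 89.9))  # ~47.7 m
```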
The detection methodology according to the invention will be described in the following with reference to Fig. 3, but it is to be understood that it applies also to the other embodiments. To implement the method, the sensor 1 and the two cameras 2, 3 are arranged in a known positional relationship to one another and to the roadway 9 so that, as was explained above, a measurement field 5 is formed on the roadway 9. A measurement beam is emitted into a sensor radiation field 1.4 by the sensor 1. When the object 8 travels through the sensor radiation field 1.4, the measurement beam is partially reflected by the object 8 onto the receiver unit 1.3 of the sensor 1. The reception signals formed in this way are acquired at measuring times, converted into sensor data by the computing unit 6 and stored so as to be associated with the measuring times.

During the measurement duration, i.e., the duration of travel through the measurement field 5, the two cameras 2, 3 are triggered at least once at the same time. This triggering time is determined from the sensor data by the computing unit 6, e.g., when given sensor position data, which can also be only sensor distances, are derived from the sensor data. These sensor position data allow the conclusion to be drawn that the object 8 is located in the measurement field 5. The triggering time can then coincide with the next occurring measuring time or can also lie between successive measuring times.

A possibility for checking the sensor position data and camera position data for plausibility consists in that, by means of the sensor position data at a measuring time and with knowledge of the positional relationships, camera position data are calculated for this measuring time and are then compared with the camera position data which are determined from the picture data and which were obtained at a triggering time that is at least close in time. When these camera position data match, it can be confirmed that an object 8 measured by the sensor 1 is also the object 8 which is imaged by the two cameras 2, 3 and which has the derived camera position data, thus identifying the object 8.

A first triggering time, for example, can be derived from the sensor data when an object 8 moves into the measurement field 5, or when an object 8 is located in the measurement field 5 and it has been verified that a speed limit has been exceeded. Later triggering times can likewise be derived from the sensor data, or they follow the first triggering time at time intervals which are predetermined by the computing unit 6.

If the object 8 has been identified, the camera data are stored long-term for documentation. For example, a high-resolution picture of the object 8 is taken with the first camera 2, in which picture pertinent details such as the license plate of the vehicle and/or the face of the driver of the vehicle are clearly discernible.
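Putting the pieces together, the identification decision for a triggering time lying between two measuring times (compare the method claims below) could be sketched as follows in Python; as before, this is an illustrative reading, not the patented implementation, and all names and the 0.5 m tolerance are assumptions.

```python
def identify(sensor_track, t_trigger, camera_distance_m, tol_m=0.5):
    """Identify the object when the camera distance at the triggering time
    agrees with the sensor distances at the two bracketing measuring
    times. `sensor_track` is a time-ordered list of (time_s, distance_m)
    sensor position data; a sketch of the comparison logic only."""
    for (t0, d0), (t1, d1) in zip(sensor_track, sensor_track[1:]):
        if t0 <= t_trigger <= t1:
            # Interpolate the sensor distance to the triggering time and
            # check it against the stereoscopically measured distance.
            w = (t_trigger - t0) / (t1 - t0)
            d_expected = d0 + w * (d1 - d0)
            return abs(d_expected - camera_distance_m) <= tol_m
    return False  # triggering time outside the measurement duration

# Example: radar track while the vehicle drives through the measurement
# field; the cameras triggered at t=0.13 s and the stereoscopic evaluation
# yields 44.6 m, matching the interpolated radar distance of 44.62 m.
track = [(0.0, 48.0), (0.1, 45.4), (0.2, 42.8)]
print(identify(track, 0.13, 44.6))  # True
```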
Table of Reference Numerals and Features present in the drawings

1 sensor
1.1 sensor axis
1.2 transmitter
1.3 receiver unit
1.4 sensor radiation field
2 first camera
2.1 optical axis of the first camera
2.2 angular field of the first camera
2.3 object field of the first camera
3 second camera
3.1 optical axis of the second camera
3.2 angular field of the second camera
3.3 object field of the second camera
5 measurement field
6 computing unit
7 display unit
8 object
9 roadway
9.1 roadway edge
10 base distance

Claims (12)

1. Apparatus for identifying and documenting at least one object driving through a measurement field, including (i) a sensor with a transmitter for transmitting a measurement beam into a sensor radiation field covering a measurement field and a receiver unit for receiving portions of the measurement beam which are reflected by at least one object for forming measurement data, (ii) a first and a second camera which are arranged at a base distance relative to one another and which each supply picture data, and (iii) a computing unit for processing the measurement data and the picture data, characterized in that the two cameras are arranged in known positional relationships with respect to one another and to the sensor such that their object fields overlap with the sensor radiation field and the overlapping area formed in this way determines the measurement field, in that the first camera has a higher resolving power than the second camera, and in that the computing unit is arranged to determine sensor position data from measurement data acquired at a plurality of measuring times within a measurement duration and store these sensor position data so as to be associated with the measuring times, the computing unit further arranged to use the camera parameters, data about the spatial arrangement of the two cameras and a method of image recognition in calculating camera position data of the at least one object from the picture data of the two cameras acquired at at least one same triggering time, the computing unit further arranged to store these camera position data so as to be associated with the at least one triggering time, and the computing unit further arranged for taking into account the positional relationships and the time correlation between the at least one triggering time and the measuring times in comparing the sensor position data and the camera position data with one another and checking them for plausibility.
2. Apparatus according to claim 1, characterized in that the angular field of the first camera is smaller than the angular field of the second camera so that a smaller object field can be imaged with the first camera than with the second camera.
3. Apparatus according to claim 1, characterized in that the first camera is a single-shot camera and the second camera is a sequence camera.
4. Apparatus according to claim 1 or 2, characterized in that the resolving power of the first camera is higher than the resolving power of the second camera by a factor of at least two.
5. Apparatus according to one of claims 1 to 3, characterized in that the sensor is a radar device.
6. Apparatus according to one of claims 1 to 4, characterized in that the sensor is a laser scanner.
7. Method for identifying and documenting at least one object driving through a measurement field, in which a sensor directs a measurement beam into a sensor radiation field, receives portions of the measurement beam which are reflected at at least one object, and measurement data are formed at measuring times during a measurement duration, sensor position data being derived therefrom and stored so as to be associated with the measuring times, characterized in that a first camera with a first resolving power and a second camera with a second resolving power lower than that of the first camera are arranged at a base distance relative to one another and relative to the sensor in such a way that their object fields overlap with the sensor radiation field, wherein an overlapping area formed in this way determines a measurement field, in that picture data are obtained by the two cameras at at least one same triggering time, and the position of the at least one object relative to at least one of the two cameras is calculated from these picture data with knowledge of the camera parameters and the positional relationships of the cameras relative to one another by means of a method of image recognition and is stored as camera position data so as to be associated with the at least one triggering time, and in that the sensor position data at at least one measuring time and the camera position data at at least one triggering time are compared with one another taking into account the positional relationships of the two cameras relative to the sensor and the time correlation between the at least one triggering time and the measuring times and are checked for plausibility in order to identify the object, and in the event of congruency the picture data of the two cameras obtained at at least one same triggering time are stored so that they can be visualized for documentation.
8. Method according to claim 7, characterized in that the at least one same triggering time occurs between two measuring times, and the plausibility of the camera position data is checked with the sensor position data associated with one of the two measuring times.
9. Method according to claim 7, characterized in that the at least one triggering time lies between two measuring times, and the plausibility of the camera position data is checked with the sensor position data associated with the two measuring times.
10. Method according to claim 7, characterized in that the at least one triggering time coincides with a measuring time, and the plausibility of the camera position data is checked with the sensor position data associated with the measuring time.
11. Method according to one of claims 7 to 10, characterized in that the sensor position data and/or camera position data are distances and angles.
12. Method according to one of claims 7 to 10, characterized in that the sensor position data and/or camera position data are distances.
AU2013209360A 2012-07-27 2013-07-26 Apparatus and method for the identification and documentation of at least one object driving through a radiation field Abandoned AU2013209360A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012106860.7A DE102012106860A1 (en) 2012-07-27 2012-07-27 Device and method for identifying and documenting at least one object passing through a radiation field
DE102012106860.7 2012-07-27

Publications (1)

Publication Number Publication Date
AU2013209360A1 true AU2013209360A1 (en) 2014-02-13

Family

ID=48793947

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2013209360A Abandoned AU2013209360A1 (en) 2012-07-27 2013-07-26 Apparatus and method for the identification and documentation of at least one object driving through a radiation field

Country Status (8)

Country Link
EP (1) EP2690459B1 (en)
CN (1) CN103578278A (en)
AU (1) AU2013209360A1 (en)
DE (1) DE102012106860A1 (en)
DK (1) DK2690459T3 (en)
ES (1) ES2628895T3 (en)
LT (1) LT2690459T (en)
PL (1) PL2690459T3 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903440B (en) * 2014-04-03 2016-06-08 浙江宇视科技有限公司 A kind of electric police grasp shoot method and device
CN104050811B (en) * 2014-06-13 2017-05-24 深圳市砝石激光雷达有限公司 Laser motor vehicle model classification system and method
FR3027407B1 (en) * 2014-10-21 2016-11-04 Morpho ROAD CONTROL UNIT
CN106710221A (en) * 2017-03-15 2017-05-24 深圳市鹏之艺建筑设计有限公司 Smart city intelligent transportation system and evaluation method thereof
DE102017210408A1 (en) * 2017-06-21 2018-12-27 Conti Temic Microelectronic Gmbh Camera system with different shutter modes
CN108806265B (en) * 2018-01-30 2020-10-30 胡海明 License plate search-based violation detection system
CN110910633A (en) * 2018-09-14 2020-03-24 阿里巴巴集团控股有限公司 Road condition information processing method, device and system
JP7375304B2 (en) * 2019-01-30 2023-11-08 住友電気工業株式会社 Radio wave sensor condition evaluation device, radio wave sensor system, radio wave sensor evaluation method, computer program, and radio wave sensor adjustment method
CN112804481B (en) * 2020-12-29 2022-08-16 杭州海康威视系统技术有限公司 Method and device for determining position of monitoring point and computer storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2998791B2 (en) * 1996-10-31 2000-01-11 日本電気株式会社 3D structure estimation device
DE10011263A1 (en) * 2000-03-08 2001-09-13 Bosch Gmbh Robert Object detection system for adaptive cruise control system of vehicle, includes radar sensor with large and small detection ranges
DE10149115A1 (en) * 2001-10-05 2003-04-17 Bosch Gmbh Robert Object detection device for motor vehicle driver assistance systems checks data measured by sensor systems for freedom from conflict and outputs fault signal on detecting a conflict
WO2006123615A1 (en) * 2005-05-19 2006-11-23 Olympus Corporation Distance measuring apparatus, distance measuring method and distance measuring program
FR2921027B1 (en) * 2007-09-17 2010-02-26 Valeo Vision DRIVING AIDS DEVICE FOR A MOTOR VEHICLE HAVING A STEREOSCOPIC IMAGE CAPTURE SYSTEM
EP2178059A1 (en) * 2008-10-16 2010-04-21 ROBOT Visual Systems GmbH Device to record traffic infringements and to check number plates
US8284996B2 (en) * 2008-12-17 2012-10-09 Automated Speed Technologies, LLC Multiple object speed tracking system
US20100177162A1 (en) * 2009-01-15 2010-07-15 Charles Macfarlane Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream
US7978122B2 (en) * 2009-08-13 2011-07-12 Tk Holdings Inc. Object sensing system
DE102010012811B4 (en) * 2010-03-23 2013-08-08 Jenoptik Robot Gmbh Method for measuring speeds and associating the measured speeds with appropriate vehicles by collecting and merging object tracking data and image tracking data

Also Published As

Publication number Publication date
EP2690459B1 (en) 2017-03-29
LT2690459T (en) 2017-07-25
CN103578278A (en) 2014-02-12
EP2690459A2 (en) 2014-01-29
DK2690459T3 (en) 2017-07-17
DE102012106860A1 (en) 2014-02-13
ES2628895T3 (en) 2017-08-04
EP2690459A3 (en) 2014-10-22
PL2690459T3 (en) 2017-10-31


Legal Events

Date Code Title Description
MK1 Application lapsed section 142(2)(a) - no request for examination in relevant period