DK2690459T3 - Device and method for identifying and documenting at least one object passing through a radiation field - Google Patents
- Publication number
- DK2690459T3
- Authority
- DK
- Denmark
- Prior art keywords
- camera
- sensor
- measurement
- position data
- cameras
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/91—Radar or analogous systems specially adapted for specific applications for traffic control
- G01S13/92—Radar or analogous systems specially adapted for specific applications for traffic control for velocity measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/017—Detecting movement of traffic to be counted or controlled identifying vehicles
- G08G1/0175—Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Radar Systems Or Details Thereof (AREA)
- Traffic Control Systems (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
- Analysing Materials By The Use Of Radiation (AREA)
Description
The invention relates to a device and a method for identifying and documenting at least one object passing through a sensor radiation field, as is known in generic form from EP 2 284 568 A2 .
From the prior art a number of different approaches to the identification and documentation of objects are known, such as are used in particular in the area of traffic monitoring. One of the difficulties with these approaches is in particular that multiple objects can be located in the radiation field at the same time, and cannot be reliably identified due to an inaccurate position determination or reflections of bent beams.
In order to take into account the varying circumstances in the monitoring of stationary and moving objects, in document EP 2 284 568 A2 a system for detecting an object is proposed, in which a radar sensor is present as a radiation measuring sensor, with a transmitter for sending a measuring beam into a sensor radiation field and a receiver for receiving reflected components of the measuring radiation. The radar sensor is used to provide a measurement of the speed of an object located a substantial distance from the radar sensor. Along a horizontal, a first camera is situated on one side of the radar sensor and a second camera on the other side, wherein the object fields of the cameras are located within the sensor radiation field. The optical axes of the two cameras are aligned parallel to each other and to the axis of the sensor. The cameras each record an image of a scene at at least one trigger time. As a result of the horizontal distance between the cameras, the scene is recorded by each of the cameras at a slightly different angle. The horizontal deviation (horizontal disparity) obtained from a comparison of the synchronously recorded images therefore allows spatial information about the object to be determined. Using a temporal sequence of recorded images the object can be tracked, from which the velocity of the object at a fairly short distance can be derived.
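The disparity-to-distance relationship exploited in such a stereo arrangement can be sketched for the idealized case of two parallel-axis pinhole cameras. This is an illustrative aid, not part of the patent; the baseline, focal length and disparity values are hypothetical.

```python
def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Distance of a scene point from two parallel-axis pinhole cameras,
    given the horizontal disparity between its image positions.
    Idealized model: depth = baseline * focal_length / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return baseline_m * focal_px / disparity_px

# hypothetical values: 0.25 m baseline, 2000 px focal length, 20 px disparity
print(depth_from_disparity(0.25, 2000.0, 20.0))  # 25.0 m
```

Note how the computed depth is inversely proportional to the disparity: distant objects produce small disparities, which is why depth accuracy degrades with range in such setups.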
In addition to its identification, in order to also allow the documentation of the object with sufficient quality, two cameras with a high resolution are used. The measurement data of the sensor and the image data from the cameras are processed using a processing unit. A disadvantage of a device in accordance with document EP 2 284 568 A2 is the high computing power and long computation time required to rapidly process the information from the two high-resolution cameras. High-speed processing of the information is necessary for monitoring flowing road traffic. In addition, an alternative acquisition of measurement data or image data in different distance ranges is provided here, which means an identification of an object is not verified, but is instead effected either by using the radar device or by using the cameras.
The object of the invention is to propose an improved facility for identifying and documenting an object passing through a radiation field.
This object is achieved for a device containing a sensor, having a transmitter for transmitting measuring radiation into a sensor radiation field overlapping a measuring field and a receiver unit for receiving parts of the measuring radiation reflected by at least one object, in order to form measurement data, a first camera and a second camera, which are arranged at a base distance from each other and each provide image data, as well as a computer unit for processing the measurement data and the image data, by virtue of the two cameras being arranged in known relative positions with respect to each other and to the sensor, such that their object fields overlap with the sensor radiation field and the overlapping area thus formed determines the measuring field. The first camera has a higher resolution than the second camera.
The computer unit is designed such that from measurement data obtained at a plurality of measurement times within a measurement period, it determines sensor position data and can store said sensor position data, assigned to the measurement times. The computer unit is also designed such that, from the image data of the two cameras obtained at at least one identical trigger time, given knowledge of the camera parameters and the spatial arrangement of the two cameras, using a method for image recognition it can compute camera position data of the at least one object and can store said position data, assigned to the at least one trigger time, and that said computer unit can compare the sensor position data and the camera position data and check their plausibility, taking into consideration the relative positions and the temporal correlation between the at least one trigger time and the measurement times.
For the sake of clarity, some of the terms used in the description of the invention will be explained in the following. A field of view [image field angle] is defined by the ratio of the length of a line of a receiver matrix of a digital camera and the focal length of the associated camera lens. A small field of view [image field angle] consequently only allows a digital image of a small object field, while a larger field of view [image field angle] allows a digital image of a larger object field.
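The ratio described above can be turned into an angle using the usual pinhole relation. The following is a minimal sketch with hypothetical sensor-line and focal lengths, not values from the patent:

```python
import math

def field_of_view_deg(sensor_line_mm: float, focal_length_mm: float) -> float:
    """Full field-of-view angle spanned by one line of the receiver
    matrix behind a lens of the given focal length (pinhole model)."""
    return math.degrees(2.0 * math.atan(sensor_line_mm / (2.0 * focal_length_mm)))

# The same sensor line behind a longer lens yields a smaller field of view:
print(field_of_view_deg(8.8, 50.0))  # ~10 degrees: long lens, small object field
print(field_of_view_deg(8.8, 16.0))  # ~31 degrees: short lens, large object field
```

This makes the statement in the text concrete: for a fixed receiver-matrix line length, a longer focal length shrinks the field of view and thus the imaged object field.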
The passage of an object through a radiation field is equivalent to a passage or any other form of through passage. An object can be, for example, a vehicle or an individual, such as a pedestrian or an animal.
Methods of image recognition are defined as known techniques from the prior art, with which known features of objects can be found in images or image data, which are based, for example, on edge detection.
The term identification means the assignment of an object to measured data obtained, which in the case of the present method includes a verification of the assignment. For example, if a measured speed is assigned to a measured distance, the associated object only counts as being identified as such if a distance which is correlated thereto has been determined for the object via a second detection procedure.
Sensor position data and camera position data which are compared to each other are then considered plausible if they relate to each other in a specific way. For example, they can be convertible into each other, or lie within a permitted tolerance. Comparing the sensor position data and the camera position data produces a means of verifying the position of an object by means of two different methods (use of a radiation measuring sensor vs. the use of cameras).
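The criterion of "lying within a permitted tolerance" can be sketched as follows. The common coordinate frame and the 0.5 m tolerance are illustrative assumptions, not values from the patent:

```python
def positions_plausible(sensor_xy, camera_xy, tolerance_m=0.5):
    """Sensor and camera position data, already converted into a common
    frame, count as plausible if they agree within the tolerance."""
    dx = sensor_xy[0] - camera_xy[0]
    dy = sensor_xy[1] - camera_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance_m

print(positions_plausible((24.8, 1.9), (25.0, 2.0)))  # True: about 0.22 m apart
print(positions_plausible((24.8, 1.9), (30.0, 2.0)))  # False: over 5 m apart
```

A mismatch of this kind is exactly what flags, for example, a bent-beam reflection: the radar reports a position that the cameras cannot confirm.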
By means of the device according to the invention, camera position data of an object, in particular its distance, are obtained using two cameras with different camera parameters by the application of known stereoscopic methods. An advantage of this is that one of the cameras has only a low resolution, which reduces the required computing power of the device without introducing disadvantages in the computing and processing of the camera position data. For creating a two-dimensional spatial image or a three-dimensional image, however, the combination of two cameras with significantly different resolutions is not suitable. Providing legally tenable evidence of a traffic offence requires the creation of an image (picture). To be able to interpret a two-dimensional spatial image or a three-dimensional image, the human brain requires image information of roughly the same quality at the point in time at which the image is viewed. Therefore, if a two-dimensional spatial image or a three-dimensional image is to be created by a combination of at least two images taken from slightly different angles, then these images need to have roughly the same quality, for example roughly the same resolution. A preferred embodiment of the device according to the invention is one in which the resolution capability of the first camera is at least a factor of 2 higher than the resolution capability of the second camera. A factor of at least four is advantageous in order to obtain a significant reduction of the required computing power.
As the first camera, a single-image camera, also known as a traffic camera, can be used, by means of which at least one high-resolution image of an infringing vehicle can be recorded. Due to the high resolution capability, details of interest of the infringing vehicle, such as the registration number and the driver of the vehicle, can be captured and displayed. The field of view angle of the first camera is preferably small, for example approximately 10°.
The second camera can be a sequence camera synchronized with the traffic camera, with which a sequence of images is recorded, the sequence being temporally assigned to an infringement, such as a speeding violation. The resolution of the second camera is lower than that of the first camera, while the field of view of the second camera is preferably larger than the field of view of the first camera. The larger field of view angle makes it possible to conveniently detect an object such as an infringing vehicle in at least two successive images in significantly different positions. The image data for obtaining the camera position data are then obtained at an identical trigger time of the cameras.
In a further design of the device according to the invention the second camera can be a free-running sequence camera, which means the trigger times of the two cameras do not coincide. In this case, there is indeed no time-synchronized sequence image for one infringement image, but by taking account of the known camera parameters of the sequence camera an artificial sequence image can be interpolated or extrapolated from at least two sequence images, where said image is assigned to the trigger time of the infringement camera. Preferably, sequence images are used which occur immediately before and after the trigger time of the infringement image.
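The interpolation of an artificial sequence-image position at the trigger time of the infringement camera can be sketched, in its simplest linear form, as follows; the frame times and positions are hypothetical:

```python
def interpolate_position(t_trigger, t1, pos1, t2, pos2):
    """Linearly interpolate an object position at the infringement
    camera's trigger time from the two sequence images that occur
    immediately before and after it."""
    if not (t1 <= t_trigger <= t2) or t1 == t2:
        raise ValueError("trigger time must lie between two distinct frame times")
    a = (t_trigger - t1) / (t2 - t1)  # fractional position between the frames
    return tuple(p1 + a * (p2 - p1) for p1, p2 in zip(pos1, pos2))

# sequence frames at t = 0.00 s and t = 0.10 s, trigger at t = 0.04 s
print(interpolate_position(0.04, 0.0, (30.0, 2.0), 0.1, (28.0, 2.0)))
```

A real implementation would interpolate in image or world coordinates using the known camera parameters of the sequence camera; the linear form above only holds over intervals in which the object's speed is roughly constant.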
In other designs of the device according to the invention, the second camera can be suitable, as a single-image camera, for recording an infringement image and can be used as such. In this way the second camera can be provided for recording one or more infringement images in close proximity to the device according to the invention.
Also, with regard to the field of view of the cameras, the cameras do not need to be of the same type. The fields of view can be the same size, but advantageously they can also be different from each other.
The radiation measuring sensor can be a radar device or a laser scanner.
The radar device can have a measuring radiation with a small opening angle of preferably 5° or less. From these measurement data, only distances can be derived as sensor position data. Such a radar device is not suitable for tracking an object. Multiple measurements can only be used for verification of the measurement data.
The radar device can also be such a device having an opening angle of preferably 20° or greater, with which the distance to the object and the spatial angle between the sensor axis of the radar device and the object can be detected. It allows an unambiguous determination of sensor position data of the object in the sensor radiation field and is suitable for tracking the object.
The problem is also solved by a method in which a sensor directs a measuring radiation into a sensor radiation field, receives components of the measuring radiation reflected at at least one object, and measurement data are formed at measurement times during a measurement period. Sensor position data are derived from the measurement data and stored, assigned to the measurement times. A first camera with a relatively higher and a second camera with a relatively lower resolution are arranged at a base distance from each other and from the sensor, such that their object fields overlap with the sensor radiation field, wherein a common overlapping region is formed that defines a measuring field. At at least one identical trigger time, the two cameras are used to obtain image data from which the position of the at least one object with respect to at least one of the two cameras can be computed, given knowledge of the camera parameters and the relative positions of the cameras to each other, via an image recognition method using the base distance, and said position is stored as camera position data, assigned to the at least one trigger time. The sensor position data at at least one measurement time and the camera position data at at least one trigger time are compared with each other, taking into account the positions of the two cameras relative to the sensor, and checked for plausibility in order to identify the object.
If plausibility is found, the image data are stored and possibly provided with an electronic time and date stamp. Interesting details of the images, e.g. the registration number, can be extracted from the image or images and stored, e.g. in another file, so that they are retrievable. The images can be displayed, for example on a monitor, or the image data can be printed out as an image. A lack of plausibility can be caused, for example, by bent beam measurements that may occur.
The at least one trigger time preferably occurs within the measuring period between two measuring times, and the plausibility of the camera position data is verified with the sensor position data assigned to one of the two measurement times.
It is also advantageous if a trigger time point occurs between two measuring times within the measurement period, and the plausibility of the camera position data is verified with the sensor position data assigned to the two measuring times. It is most favourable from a computational point of view if a trigger time coincides with a measuring time and the plausibility of the camera position data is verified with the sensor position data assigned to this measurement time. The at least one trigger time may also occur after the measurement period, and given knowledge of the sensor position data assigned to individual measuring times, sensor position data are extrapolated for the trigger time in order to compare said position data with the camera position data.
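The extrapolation of sensor position data to a trigger time after the measurement period can be sketched as follows, assuming a roughly constant radial speed estimated from the last two measurements; all numeric values are hypothetical:

```python
def extrapolate_distance(times_s, distances_m, t_trigger_s):
    """Extrapolate a sensor distance to a trigger time outside the
    measurement period, using the rate of change from the last two
    measurements (constant-speed assumption)."""
    (t1, t2), (d1, d2) = times_s[-2:], distances_m[-2:]
    rate = (d2 - d1) / (t2 - t1)  # radial speed relative to the sensor
    return d2 + rate * (t_trigger_s - t2)

# object approaching at roughly 20 m/s; trigger 0.05 s after the last measurement
print(extrapolate_distance([0.0, 0.1], [40.0, 38.0], 0.15))
```

The same linear form covers the interpolation cases mentioned in the text, since a trigger time between two measuring times simply yields a fraction between 0 and 1 instead of an overshoot.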
In the implementation of the method according to the invention, a measuring radiation is emitted by means of the radiation measuring sensor over a measuring period. Components of the measuring radiation are incident on an object which moves in the radiation field. Components of the measuring radiation are reflected from the object, detected by the receiver unit and converted into measuring signals. From the measuring signals, measuring data of the object are formed, from which in turn position data for the object in relation to the sensor (sensor position data) are obtained. If an angle-resolving radar device or a laser scanner is used, the sensor position data are formed by distance and angle data. In the case of a non-angle-resolving radar device, which emits a measuring radiation only in the form of a very narrow radiation cone, the sensor position data are obtained from knowledge of the alignment of the sensor axis and the resulting distance data. In this case, due to the broadening of the measuring beam about the sensor axis, a certain amount of blurring of the sensor position data occurs.
The method according to the invention can also be combined with methods for optical vehicle classification and optical flow methods.
The invention will hereafter be explained in further detail on the basis of exemplary embodiments and by reference to a drawing. They show:
Fig. 1 a first exemplary embodiment of a device according to the invention in which the sensor and the two cameras are accommodated in a common housing,
Fig. 2 a second exemplary embodiment of a device according to the invention, in which the two cameras are arranged remotely from the sensor and their optical axes are aligned parallel to each other, and
Fig. 3 a third exemplary embodiment of a device according to the invention, in which the optical axes of the two cameras intersect.

A first exemplary embodiment of a device according to the invention is shown in Fig. 1. As its essential elements it comprises, as do all embodiments, a sensor 1, with a transmitter 1.2 for emitting a measuring radiation and a receiver unit 1.3 for receiving reflected measuring radiation, and a first camera 2, a second camera 3 and a computer unit 6.
Understanding the device and the method does not require any differentiation between the axes of the transmitter 1.2 and the receiver unit 1.3, which can also comprise a plurality of axes, e.g. if the sensor 1 is an angle-resolving radar sensor. These axes, which are aligned parallel to each other and are spaced a negligible distance apart, are understood together as a single sensor axis 1.1. The transmitter 1.2 emits a measuring radiation, which forms a sensor radiation field 1.4 around the sensor axis 1.1. Important parameters of the two cameras 2, 3 for the device and the method are their optical axes 2.1, 3.1, their fields of view 2.2, 3.2, their object fields 2.3, 3.3 and the base distance 10 separating them.
The following three exemplary embodiments identified for a device differ primarily in different spatial arrangements of the sensor 1 and the two cameras 2, 3 in relation to each other, and with regard to the angular positions of the sensor axis 1.1 and the two camera axes 2.1, 3.1 in relation to each other, hereafter designated as the relative position. The range of variation shown here is intended to demonstrate the ability of the device to be adapted to different measuring tasks, such as speed monitoring, monitoring of red light violations, or presence monitoring functions. The two cameras 2, 3 can each be optimally arranged for their individual purpose, e.g. for large-scale recording of a vehicle with a driver's face and/or registration number visible, for recording the vehicle in its surroundings, e.g. also visible together with a switched traffic light, or for recording image sequences in which the progress of the vehicle is documented.
In each case the sensor 1 and the two cameras 2, 3 must be arranged relative to each other such that the sensor radiation field 1.4 and the two object fields 2.3, 3.3 overlap, as for example in Fig. 1, in a remote area reduced in size, and in Fig. 2. The overlap area thus formed creates a measuring field 5. Only an object 8 which moves through this measuring field 5 can be identified and documented with the device or the method. The measuring times and the trigger times preferably occur within a measuring period during which the object 8 passes through the measuring field 5. They can, however, also occur before or after it. An assignment via an extrapolation can then be effected, if necessary with knowledge of the speed derived from the sensor data. The relative spatial positions with respect to one another must also be known, in order to be able to convert the sensor and camera data obtained with the device into each other and thus to be able to compare them, as explained below.
For the following description, the measuring field 5 is assumed to lie on a road surface 9. As a suitable frame of reference, the position of the sensor 1 in relation to the edge of the road surface 9.1, the road surface 9 and the orientation of the sensor axis 1.1 can be selected.
In the first exemplary embodiment, shown in Figure 1, the first camera 2 and the second camera 3 are arranged with the sensor 1 on a horizontal line, and the optical axis of the first camera 2.1, the sensor axis 1.1 and the optical axis of the second camera 3.1 are oriented parallel to each other. A measuring radiation emitted by the transmitter 1.2 forms a sensor radiation field 1.4. If an object 8 moves through this sensor radiation field 1.4, part of the measuring radiation is reflected onto the receiver unit 1.3. The received signals thereby generated are fed to the computing unit 6 and processed there into measurement data (hereinafter sensor data), which are each stored by the computing unit 6, assigned to a measuring time.
The two cameras 2, 3 are optimized for different applications. Thus the first camera 2 has a comparatively greater resolution due to the use of a receiver matrix with e.g. 16 megapixels and a comparatively smaller field of view 2.2 of e.g. 10°, by means of which the object field of the first camera 2.3 is defined along the camera axis 2.1.
The second camera 3 has a comparatively smaller resolution due to the use of a receiver matrix with e.g. 4 megapixels and a comparatively larger field of view 3.2 of e.g. 30°, by means of which the object field of the second camera 3.3 is defined along the optical axis 3.1 thereof.
The two optical axes 2.1, 3.1 have a base distance 10 between them of e.g. 25 cm. By means of the two cameras 2, 3, different extracts of a scene can be captured, each with a different data volume of image data (hereinafter camera data).
The sensor 1 and the two cameras 2, 3 are connected to the computer unit 6 in a signal sense. There, the sensor data or camera data generated by the sensor 1 and by the two cameras 2, 3 respectively are stored and processed.
From the sensor data, position data of the object 8 which are assigned to measuring times are derived, by means of which a temporary position of the object 8 relative to the sensor 1 is determined. These position data (hereafter sensor position data) can only relate to the distance of the object 8 from the sensor 1 (sensor distances) if e.g. the sensor radiation field 1.4 is very narrow, so that the temporary position in each case is given sufficiently precisely by the distance and the orientation of the sensor axis 1.1. The sensor position data can also relate to sensor distances and angles if the sensor 1 is e.g. a radar device with a wide sensor radiation field 1.4 and an angle-resolving receiver unit 1.3, or a laser scanner.
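For an angle-resolving sensor, the distance/angle pairs described above can be converted into Cartesian coordinates in the sensor frame before comparison with camera position data. A minimal sketch; the angle convention (measured from the sensor axis) is an assumption for illustration:

```python
import math

def sensor_polar_to_xy(distance_m, angle_deg):
    """Convert a sensor distance and the angle measured from the
    sensor axis into x/y coordinates in the sensor's own frame
    (x along the sensor axis, y across it)."""
    a = math.radians(angle_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))

print(sensor_polar_to_xy(25.0, 0.0))  # object on the sensor axis
print(sensor_polar_to_xy(25.0, 5.0))  # object 5 degrees off-axis
```

For a non-angle-resolving sensor with a very narrow beam, the same conversion applies with the angle fixed at 0°, which is exactly the blurring limitation noted in the text.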
Using each of the two cameras 2, 3, camera data are captured at an identical trigger time. From those camera data, which at least comprise a partial image of the object 8, using known methods of image recognition and given knowledge of the base distance 10, position data of the object 8 relative to the cameras 2, 3 (hereafter camera position data) are computed. While the angular position of the object 8 relative to the optical axes of the cameras 2, 3 can be calculated for each camera 2, 3 independently from the stored image of the object 8 on the receiver matrices of the cameras 2, 3, the camera data from both cameras 2, 3 are, as is well known, required to compute the distance to the cameras 2, 3 (camera distance). The computer unit 6 is designed to compare the sensor position data and the camera position data against each other and to check their plausibility. To this end the relative position of the sensor 1 to the two cameras 2, 3 must be known and stored in the computer unit 6. The computer unit 6 is connected to a monitor as an output unit 7, by means of which the sensor position data, camera position data and image data can be visualized as a whole.
In Figs. 2 and 3 two exemplary embodiments are shown, in which the two optical axes of the cameras 2, 3 enclose an identical angle with the sensor axis 1.1 (Fig. 2), or all of the axes mentioned enclose different angles with each other (Fig. 3). This results in a different position of the measuring field 5, which is known given knowledge of the relative positions and the sensor and camera parameters. The camera distances can be calculated from the camera data of the two cameras 2, 3 using known trigonometric relationships, such as the sine law.
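The sine-law calculation mentioned above can be sketched for the triangle formed by the baseline and the two lines of sight. The angle convention (angles measured between the baseline and each line of sight) and the numeric values are assumptions for illustration:

```python
import math

def range_by_sine_law(base_m, angle_left_deg, angle_right_deg):
    """Distance from the left camera to an object sighted by two cameras
    with intersecting optical axes, via the law of sines in the triangle
    baseline / two lines of sight. Angles are measured between the
    baseline and each line of sight."""
    gamma_deg = 180.0 - angle_left_deg - angle_right_deg  # angle at the object
    if gamma_deg <= 0:
        raise ValueError("lines of sight do not converge")
    return base_m * math.sin(math.radians(angle_right_deg)) / math.sin(math.radians(gamma_deg))

# hypothetical: 0.25 m baseline; nearly parallel lines of sight give a long range
print(range_by_sine_law(0.25, 89.8, 89.9))  # roughly 48 m
```

The small angle at the object for near-parallel sightings shows why a short baseline limits range accuracy: a tiny error in either sighting angle swings the computed distance considerably.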
In the following, the sequence of a method according to the invention will be explained by reference to Figure 3.
To implement the method, the sensor 1 and the two cameras 2, 3 are arranged in a known relative position to each other and to the road surface 9, so that, as already explained, a measuring field 5 is formed on the road surface 9.
By means of the sensor 1 a measuring radiation is emitted into a sensor radiation field 1.4. When an object 8 passes through the sensor radiation field 1.4, components of the measuring radiation are reflected from the object 8 onto the receiver unit 1.3 of the sensor 1. At measuring times the received signals thus formed are detected, converted by the computing unit 6 into sensor data and, assigned to the measuring times, stored.
During the measurement period, the duration of the passage through the measuring field 5, the two cameras 2, 3 are triggered at least once at the same time. This trigger time is determined by the computer unit 6 from the sensor data, e.g. when pre-defined sensor position data, which can also only be sensor distances, are derived from the sensor data which imply that the object 8 is located in the measuring field 5.
The trigger time can then subsequently coincide with a following measurement time, or else fall between consecutive measurement times.
One possibility for checking the sensor position data and camera position data for plausibility consists in using the sensor position data at a measuring time, and given the knowledge of the relative positions, to calculate camera position data at this measuring time, that are then compared with the camera position data obtained from the image data, which were obtained at a trigger time at least close to it. In the event of a match it can then be confirmed that an object 8 measured with the sensor 1 is also the object 8 imaged by the two cameras 2, 3 with the derived camera position data, with which the object 8 is identified.
From the sensor data, e.g., a first trigger time can be derived when an object 8 drives into the measuring field 5, or when an object 8 is located in the measuring field 5 and its excessive speed has been verified. Later trigger times can likewise be derived from the sensor data, or they follow the first trigger time at time intervals specified by the computer unit 6.
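The two trigger variants described above can be sketched as follows; the function names and the fixed-interval schedule are illustrative assumptions rather than the patent's concrete logic:

```python
def first_trigger(in_field: bool, speed: float, speed_limit: float) -> bool:
    """A first trigger may fire when the object 8 is located in the measuring
    field 5 and its excessive speed has been verified."""
    return in_field and speed > speed_limit

def later_triggers(first_t: float, interval: float, count: int) -> list[float]:
    """Later trigger times following the first trigger time at fixed time
    intervals, as may be specified by the computer unit 6."""
    return [first_t + k * interval for k in range(1, count + 1)]
```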
If the object 8 has been identified, the camera data are archived for documentation purposes. With the first camera 2, for example, a high-resolution image of the object 8 is recorded in which details of interest, such as the registration number of the vehicle and/or the face of the driver, are clearly visible.
List of reference numerals

- 1 sensor
- 1.1 sensor axis
- 1.2 transmitter
- 1.3 receiver unit
- 1.4 sensor radiation field
- 2 first camera
- 2.1 optical axis of the first camera
- 2.2 field of view of the first camera
- 2.3 object field of the first camera
- 3 second camera
- 3.1 optical axis of the second camera
- 3.2 field of view of the second camera
- 3.3 object field of the second camera
- 5 measuring field
- 6 computer unit
- 7 output unit
- 8 object
- 9 road surface
- 9.1 road surface edge
- 10 base distance
Claims (12)
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102012106860.7A DE102012106860A1 (en) | 2012-07-27 | 2012-07-27 | Device and method for identifying and documenting at least one object passing through a radiation field |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| DK2690459T3 (en) | 2017-07-17 |
Family
ID=48793947
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| DK13176747.7T DK2690459T3 (en) | 2012-07-27 | 2013-07-16 | Device and method for identifying and documenting at least one object passing through a radiation field |
Country Status (8)
| Country | Link |
|---|---|
| EP (1) | EP2690459B1 (en) |
| CN (1) | CN103578278A (en) |
| AU (1) | AU2013209360A1 (en) |
| DE (1) | DE102012106860A1 (en) |
| DK (1) | DK2690459T3 (en) |
| ES (1) | ES2628895T3 (en) |
| LT (1) | LT2690459T (en) |
| PL (1) | PL2690459T3 (en) |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103903440B (en) * | 2014-04-03 | 2016-06-08 | 浙江宇视科技有限公司 | A kind of electric police grasp shoot method and device |
| CN104050811B (en) * | 2014-06-13 | 2017-05-24 | 深圳市砝石激光雷达有限公司 | Laser motor vehicle model classification system and method |
| FR3027407B1 (en) * | 2014-10-21 | 2016-11-04 | Morpho | ROAD CONTROL UNIT |
| CN106710221A (en) * | 2017-03-15 | 2017-05-24 | 深圳市鹏之艺建筑设计有限公司 | Smart city intelligent transportation system and evaluation method thereof |
| DE102017210408A1 (en) * | 2017-06-21 | 2018-12-27 | Conti Temic Microelectronic Gmbh | Camera system with different shutter modes |
| CN108806265B (en) * | 2018-01-30 | 2020-10-30 | 胡海明 | Violation detection system based on license plate search |
| CN110910633A (en) * | 2018-09-14 | 2020-03-24 | 阿里巴巴集团控股有限公司 | Road condition information processing method, device and system |
| JP7375304B2 (en) * | 2019-01-30 | 2023-11-08 | 住友電気工業株式会社 | Radio wave sensor condition evaluation device, radio wave sensor system, radio wave sensor evaluation method, computer program, and radio wave sensor adjustment method |
| CN112804481B (en) * | 2020-12-29 | 2022-08-16 | 杭州海康威视系统技术有限公司 | Method and device for determining position of monitoring point and computer storage medium |
| FR3153454A1 (en) | 2023-09-22 | 2025-03-28 | Idemia Identity & Security France | Image acquisition device for road control machines |
| FR3155907A1 (en) | 2023-11-24 | 2025-05-30 | Idemia Identity & Security France | Installation kit for an image acquisition device for road control machines |
| FR3161974A1 (en) | 2024-08-23 | 2025-11-07 | Idemia Identity & Security France | Method for configuring a road traffic control unit |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2998791B2 (en) * | 1996-10-31 | 2000-01-11 | 日本電気株式会社 | 3D structure estimation device |
| DE10011263A1 (en) * | 2000-03-08 | 2001-09-13 | Bosch Gmbh Robert | Object detection system for adaptive cruise control system of vehicle, includes radar sensor with large and small detection ranges |
| DE10149115A1 (en) * | 2001-10-05 | 2003-04-17 | Bosch Gmbh Robert | Object detection device for motor vehicle driver assistance systems checks data measured by sensor systems for freedom from conflict and outputs fault signal on detecting a conflict |
| WO2006123615A1 (en) * | 2005-05-19 | 2006-11-23 | Olympus Corporation | Distance measuring apparatus, distance measuring method and distance measuring program |
| FR2921027B1 (en) * | 2007-09-17 | 2010-02-26 | Valeo Vision | DRIVING AIDS DEVICE FOR A MOTOR VEHICLE HAVING A STEREOSCOPIC IMAGE CAPTURE SYSTEM |
| EP2178059A1 (en) * | 2008-10-16 | 2010-04-21 | ROBOT Visual Systems GmbH | Device to record traffic infringements and to check number plates |
| WO2010077316A1 (en) * | 2008-12-17 | 2010-07-08 | Winkler Thomas D | Multiple object speed tracking system |
| US20100177162A1 (en) * | 2009-01-15 | 2010-07-15 | Charles Macfarlane | Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream |
| US7978122B2 (en) * | 2009-08-13 | 2011-07-12 | Tk Holdings Inc. | Object sensing system |
| DE102010012811B4 (en) * | 2010-03-23 | 2013-08-08 | Jenoptik Robot Gmbh | Method for measuring speeds and associating the measured speeds with appropriate vehicles by collecting and merging object tracking data and image tracking data |
2012
- 2012-07-27 DE DE102012106860.7A patent/DE102012106860A1/en not_active Withdrawn
2013
- 2013-07-16 EP EP13176747.7A patent/EP2690459B1/en active Active
- 2013-07-16 ES ES13176747.7T patent/ES2628895T3/en active Active
- 2013-07-16 PL PL13176747T patent/PL2690459T3/en unknown
- 2013-07-16 LT LTEP13176747.7T patent/LT2690459T/en unknown
- 2013-07-16 DK DK13176747.7T patent/DK2690459T3/en active
- 2013-07-26 AU AU2013209360A patent/AU2013209360A1/en not_active Abandoned
- 2013-07-29 CN CN201310323121.5A patent/CN103578278A/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| AU2013209360A1 (en) | 2014-02-13 |
| EP2690459A3 (en) | 2014-10-22 |
| EP2690459B1 (en) | 2017-03-29 |
| EP2690459A2 (en) | 2014-01-29 |
| PL2690459T3 (en) | 2017-10-31 |
| LT2690459T (en) | 2017-07-25 |
| CN103578278A (en) | 2014-02-12 |
| ES2628895T3 (en) | 2017-08-04 |
| DE102012106860A1 (en) | 2014-02-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| DK2690459T3 (en) | Device and method for identifying and documenting at least one object passing through a radiation field | |
| US11238730B2 (en) | System and method for detecting and recording traffic law violation events | |
| Sochor et al. | Comprehensive data set for automatic single camera visual speed measurement | |
| US9898670B2 (en) | Method and device for observing the environment of a vehicle | |
| KR101647370B1 (en) | road traffic information management system for g using camera and radar | |
| EP0912970B1 (en) | Machine and method for detecting traffic offenses with dynamic aiming systems | |
| US8238610B2 (en) | Homography-based passive vehicle speed measuring | |
| KR101343975B1 (en) | Sudden detection system | |
| EP3606032B1 (en) | Method and camera system combining views from plurality of cameras | |
| KR102008263B1 (en) | Convergence detector and traffic enforcement system therewith | |
| US20150254976A1 (en) | Traffic monitoring system and traffic monitoring method | |
| WO2018072669A1 (en) | Radiation inspection system and method | |
| WO2010077316A1 (en) | Multiple object speed tracking system | |
| JP2008140370A (en) | Stereo camera intrusion detection system | |
| KR20200064873A (en) | Method for detecting a speed employing difference of distance between an object and a monitoring camera | |
| JP2010236891A (en) | Position coordinate conversion method between camera coordinate system and world coordinate system, vehicle-mounted apparatus, road side photographing apparatus, and position coordinate conversion system | |
| AU2013398544B2 (en) | A method of determining the location of a point of interest and the system thereof | |
| Altekar et al. | Infrastructure-based sensor data capture systems for measurement of operational safety assessment (osa) metrics | |
| US10852436B2 (en) | Imaging system and method for monitoring a field of view | |
| WO2020021306A1 (en) | Method for material discrimination and respective implementation system | |
| WO2018062368A1 (en) | Image pickup device and image pickup system | |
| KR101057837B1 (en) | Road security and traffic control system using laser | |
| KR20130062489A (en) | Device for tracking object and method for operating the same | |
| CN102981010A (en) | Method for verifying speed of appropriate vehicle by using camera | |
| KR20050036179A (en) | A forward area monitoring device of vehicle and method thereof |