AU2014202300B2 - Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module - Google Patents


Info

Publication number: AU2014202300B2
Authority: AU (Australia)
Prior art keywords: radar sensor, tracking radar, tracking, monitoring system, traffic monitoring
Legal status: Active
Application number: AU2014202300A
Other versions: AU2014202300A1 (en)
Inventor: Dima Proefrock
Current Assignee: Jenoptik Robot GmbH
Original Assignee: Jenoptik Robot GmbH
Application filed by Jenoptik Robot GmbH
Publication of AU2014202300A1
Publication of AU2014202300B2 (application granted)

Classifications

    • G01S 13/26 — Systems for measuring distance only using transmission of interrupted, pulse-modulated waves, wherein the transmitted pulses use a frequency- or phase-modulated carrier wave
    • G01S 13/584 — Velocity or trajectory determination systems based on the Doppler effect, adapted for simultaneous range and velocity measurements
    • G01S 13/726 — Radar-tracking systems; multiple target tracking
    • G01S 13/878 — Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G01S 13/91 — Radar or analogous systems specially adapted for traffic control
    • G01S 13/92 — Radar or analogous systems specially adapted for traffic control, for velocity measurement
    • G01S 13/867 — Combination of radar systems with cameras
    • G08G 1/052 — Detecting movement of traffic to be counted or controlled, with provision for determining speed or overspeed

Abstract

TRAFFIC MONITORING SYSTEM FOR SPEED MEASUREMENT AND ASSIGNMENT OF MOVING VEHICLES IN A MULTI-TARGET RECORDING MODULE

The invention relates to a traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module. The object of finding a new possibility for more accurate speed measurement and reliable assignment of vehicles (3), wherein the problems with regard to assigning speeds to vehicles (3) at relatively great distances and measurement uncertainties associated therewith are reduced, is achieved according to the invention by virtue of the fact that, besides a first, azimuthally measuring tracking radar sensor (11), at least one further tracking radar sensor (12; 13; 14) having an antenna structure that is of the same type but is operated at a different frequency is arranged in a manner rotated by a defined angle (φ) relative to the antenna structure of the first tracking radar sensor (11), such that the further tracking radar sensor (12; 13; 14) forms an antenna structure for acquiring at least one signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles (3) and provides measurement data at its sensor output, and means for fusing the measurement data generated by the first and the at least one further tracking radar sensor (11; 12; 13; 14) to form 3D radar image data are present, wherein a tracking method with the aid of a sequence of the fused 3D radar image data is provided.

Description

TRAFFIC MONITORING SYSTEM FOR SPEED MEASUREMENT AND ASSIGNMENT OF MOVING VEHICLES IN A MULTI-TARGET RECORDING MODULE

Technical Field

[0001] The invention relates to a traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module, in particular for punishing speeding and other traffic violations, such as e.g. red light violations.

Background

[0002] It is known from the prior art to acquire the speed of moving objects and their two-dimensional position on different roadway lines by tracking traffic scenes recorded sequentially along the roadway direction, the image of which can be generated by a wide variety of sensors, such as e.g. camera systems, laser scanner systems or tracking radar systems.

[0003] Owing to their particularly high angular resolution and the comparatively large object field, tracking radar sensors having one transmitting and two receiving antennas are used more often than the other methods. In this case, preferably an FSK (frequency shift keying) modulated signal is reflected from the surroundings and the reflections are picked up by the two receiving antennas. The movement of the object results in a measurable frequency shift between transmission and reception signals (Doppler effect), from which the object speed can be derived directly. The distance between the object and the antennas can be calculated from the phase angle of different FSK frequencies. Owing to the spatial offset of the two receiving antennas, a phase offset arises between the identical FSK frequencies, from which it is possible to determine the angle of the object in the plane spanned by the antenna normal and the translation vector of the receiving antennas (stereo effect). The position of an object can thus be expressed in polar coordinates.
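The evaluation chain of paragraph [0003] can be sketched numerically. The following Python sketch is illustrative only and not part of the patent disclosure: it applies the standard Doppler, FSK-phase and stereo-phase relations; the carrier frequency (24.125 GHz K-band), FSK frequency step and antenna spacing used in the example values are assumptions.

```python
import math

C = 3.0e8  # speed of light [m/s]

def radial_speed(doppler_hz, carrier_hz):
    """Radial speed from the Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_hz * C / (2.0 * carrier_hz)

def fsk_range(phase_diff_rad, delta_f_hz):
    """Range from the phase difference between two FSK carrier
    frequencies spaced delta_f apart: R = c * dphi / (4 * pi * df)."""
    return C * phase_diff_rad / (4.0 * math.pi * delta_f_hz)

def azimuth_angle(phase_offset_rad, carrier_hz, antenna_spacing_m):
    """Azimuth from the phase offset between the two spatially offset
    receiving antennas (stereo effect): sin(a) = lambda * dphi / (2*pi*d)."""
    lam = C / carrier_hz
    return math.asin(lam * phase_offset_rad
                     / (2.0 * math.pi * antenna_spacing_m))
```

For example, a Doppler shift of about 4.8 kHz at 24.125 GHz corresponds to a radial speed of roughly 30 m/s; together with `fsk_range` and `azimuth_angle` this yields the polar position mentioned in the text.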
[0004] However, it repeatedly proves problematic to assign the measured speeds unequivocally to the respective vehicle within a bunch of vehicles, so that the substantive facts cannot be challenged when traffic violations are punished.
[0005] In this regard, US 8 294 595 B1 discloses a method for detecting moving vehicles, wherein a measurement is performed as to whether a number of vehicles are present in a video data sequence from a camera system, a number of speed measurements are performed by a radar system depending on the number of vehicles, and a decision is made as to whether the speed of a bunch of vehicles among that number exceeds a threshold, whereupon - in the event of speeding being ascertained - a report is issued. In that case, however, there is detection uncertainty as to which vehicle in the bunch the speeding should demonstrably be assigned to.

[0006] The prior art already discloses numerous methods for obtaining unambiguous vehicle assignments precisely in the area of multi-lane roadways. As an example, EP 1 990 655 A1 shall be cited here, wherein, by means of the measured distance and its combination with the associated measurement angle, the roadway line used by the vehicle is determined and the vehicle is thus identified unequivocally. The same basic principle of determining the position of the vehicle is also employed in EP 2 048 515 A1, wherein, for determining red light violations, the probability of crossing the stop line on red is predicted from the measured speed and the distance of the vehicle in order, in the event of a violation, to initiate evidence photographs at defined distances from the stop line. What is disadvantageous about these assignment methods is the high computational complexity required for finding the vehicle at a defined position with a specific speed and the model computation carried out for tracking each vehicle.
[0007] Furthermore, a motor vehicle radar system is known from EP 1 922 562 B1, wherein, in order to identify or exclude roadway obstructions or roadway boundaries, two antenna structures having directional characteristics that differ in mutually orthogonal spatial directions are arranged at a distance from one another, and the antenna structures differ in a third spatial direction, as a result of which the azimuthal angle position and/or horizontal extent of an object are/is determined from the phase difference between the signals and the elevation and/or vertical extent of an object are/is determined from the amplitude of the signals. In this case, the different directional characteristics of the two horizontally spaced radar receiving antennas are obtained by opposite vertical deviation of the radar axes from the main direction, and the amplitudes of objects acquired in the differently oriented radar lobes are evaluated. The resultant disadvantageously reduced resolution of the azimuth angle, which is possible only in the overlap region of the radar lobes, and the inevitably increased measurement noise in the azimuthal angle measurement can be tolerated only because an exact speed measurement relative to stationary obstructions and moving objects is not required in that case.

[0008] For improving the reliability and/or the accuracy of the evaluation of tracking radar data or video data during the measurement of speeds, distances and dimensions of vehicles and their unambiguous assignment, a combination of both recording systems by means of data correlation, in the sense of a transformation of the radar data into the video data, is also undertaken, as is known for example from DE 10 2010 012 811 A1 or from EP 2 284 568 A2. Although the combination simplifies the assignment of radar data to specific vehicles, the accuracy of the radar data remains inadequate at relatively great distances owing to the lack of azimuth-angle resolution, and remains erroneous in the near range on account of different tracking points for different vehicle classes and owing to known effects (such as deflected beam, cosine effect, etc.).

[0009] It is desirable to find a new possibility for more accurate speed measurement and reliable assignment of vehicles in a multi-target recording module, wherein the problems with regard to assigning speed measurement data to vehicles at relatively great distances, and the measurement uncertainties associated therewith, are reduced. It is also desirable to increase the reliability of the speed data and position data with respect to verified vehicle classes and to reduce the data processing complexity during vehicle tracking.

Object of the Invention

[0010] It is an object of the present invention to substantially overcome or at least ameliorate one or more of the disadvantages of the prior art, or to at least provide a useful alternative.
Summary

[0011] There is provided herein a traffic monitoring system for speed measurement and reliable assignment of moving vehicles with a multi-target recording module, comprising a first tracking radar sensor having a transmitting antenna and two spatially separated receiving antennas, which form a first antenna structure for acquiring horizontal distances, extents and azimuthal angle positions of the moving vehicles and have a sensor output with a regular sequence of measurement data containing at least one radial speed component directed to the sensor from azimuth angle positions and distances; wherein at least one further tracking radar sensor, having an antenna structure of the same type as that of the transmitting and receiving antennas of the first tracking radar sensor but operated at a different frequency, is arranged in direct proximity to the first tracking radar sensor; the antenna structure of the at least one further tracking radar sensor is rotated by a defined angle relative to the antenna structure of the first tracking radar sensor, such that the further tracking radar sensor forms an antenna structure for acquiring at least one signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles and has a sensor output with a regular sequence of measurement data containing at least one radial speed component directed to the radar sensor from elevation angle positions; and means for fusing the measurement data generated by the first and the at least one further tracking radar sensor to form three-dimensional (3D) radar image data are present, wherein a tracking method with the aid of a sequence of the fused 3D radar image data is provided with regard to each of the vehicles identified in a corresponding fashion from the measurement data of the first and of the at least one further tracking radar sensor.
[0012] In one embodiment, said antenna structures of the first tracking radar sensor and of the at least one further tracking radar sensor are arranged rotated by integral multiples (n = 1...3) of 45° with respect to one another, as a result of which the at least one acquired signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles has a reconstruction error of a vertical with respect to a horizontal vehicle position of 1:1.

[0013] In another embodiment, the antenna structures of the first tracking radar sensor and of the at least one further tracking radar sensor are arranged rotated by integral multiples (n = 1...5) of 30° with respect to one another, as a result of which the at least one acquired signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles has a reconstruction error of a vertical with respect to a horizontal vehicle position of 1.55:1.

[0014] One particularly preferred embodiment of the invention arises if the antenna structures of the first tracking radar sensor and of the one further tracking radar sensor are arranged rotated by 90° with respect to one another, as a result of which the acquired signal component can be used directly for the measurement of vertical distances, extents and elevation angle positions of the moving vehicles.

[0015] Particular advantages arise if the first tracking radar sensor and the at least one further tracking radar sensor are combined with a video camera, wherein means for fusing the horizontal and vertical measurement data of the tracking radar sensors with video data of the video camera unit are provided. The latter can be an individual video camera in the simplest case. Expediently, however, the first tracking radar sensor and the at least one further tracking radar sensor are combined with a stereo video camera unit comprising two video cameras having a basic distance, wherein means for fusing the horizontal and vertical measurement data of the tracking radar sensors with video data of the two video cameras are provided.

[0016] When using the combination of tracking radar sensor unit and video camera, it furthermore proves advantageous that an ROI window for determining vehicle features and for tracking within a video image is assigned to each of the vehicles identified in a corresponding fashion from the measurement data of the first and of the at least one further tracking radar sensor. In this case, the basic distance between the two video cameras is expediently arranged horizontally. However, there are cases in which - without any disadvantages for the data processing - the basic distance between the two video cameras is preferably arranged vertically.

[0017] In one preferred embodiment of a combination of tracking radar sensor unit and stereo video camera unit, the first tracking radar sensor and the at least one further tracking radar sensor are arranged within the basic distance between the two video cameras, wherein the tracking radar sensors are arranged along a common axis of the basic distance between the two video cameras.

[0018] In a further advantageous embodiment, the first tracking radar sensor and the at least one further tracking radar sensor are arranged within the basic distance between the two video cameras, wherein the tracking radar sensors are arranged orthogonally with respect to a common axis of the basic distance between the two video cameras.
[0019] However - in particular for a permanently installed combination of tracking radar sensor unit and stereo video camera unit - it can also be expedient that the first and the at least one further tracking radar sensor are arranged outside the basic distance between the two video cameras. In this case, the first tracking radar sensor and the at least one further tracking radar sensor are arranged spatially separately from the stereo camera unit, wherein, after their spatially separate installation, a calibration for determining a transformation specification is provided.

[0020] The invention is based on the fundamental consideration that conventional tracking radar sensors achieve a high-resolution angle measurement only in one coordinate (azimuth). That is particularly disadvantageous if approaching vehicles are measured at great distances from an elevated point (e.g. a bridge), since the varying height of the vehicles then also has an influence as an additional error source. This problem is solved according to the invention by virtue of the fact that a customary tracking radar sensor, which has in the azimuth direction one transmitting antenna and two spatially separated receiving antennas and thus measures accurately only in this coordinate, is accompanied by a further radar sensor of the same type which, however, is arranged rotated by an angle, for angle resolution in the height coordinate (elevation).

[0021] Particularly advantageously, this radar sensor system can also be combined with one or more video cameras in order to better document the image-based evidence of a traffic violation. In this case, primarily the finding and tracking of vehicles in the video stream are simplified.
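How the rotated sensor contributes the missing elevation coordinate can be sketched under a small-angle approximation (an assumption of this sketch, not taken from the patent text): a sensor rotated by φ measures, to first order, the combination u·cos φ + v·sin φ of the azimuth angle u and the elevation angle v, so v can be recovered from the pair of measurements. The function name and linearized model are illustrative.

```python
import math

def reconstruct_elevation(theta_ref_rad, theta_rot_rad, phi_rad):
    """Recover the elevation angle v from the azimuth-only sensor
    (theta_ref = u) and a sensor rotated by phi, whose small-angle
    measurement model is assumed to be theta_rot = u*cos(phi) + v*sin(phi).
    For phi = 90 deg the rotated sensor measures elevation directly."""
    u = theta_ref_rad
    return (theta_rot_rad - u * math.cos(phi_rad)) / math.sin(phi_rad)
```

Note the 1/sin φ factor: the smaller the rotation angle, the more the rotated sensor's measurement noise is amplified in the reconstructed elevation, which is consistent with the growing vertical-to-horizontal reconstruction error quoted above for 45° and 30° rotations.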
[0022] The invention demonstrates a new possibility which realizes a more accurate speed measurement and reliable assignment of vehicles in a multi-target recording module, wherein the problems with regard to assigning speed measurement data to vehicles at relatively great distances and the measurement uncertainties associated therewith are reduced. Furthermore, the reliability of the speed data and position data with respect to verified vehicle classes is increased and the data processing complexity during vehicle tracking in a video stream can be significantly reduced.

Brief Description of the Drawings

[0023] Preferred embodiments of the invention will be described hereinafter, by way of examples only, with reference to the accompanying drawings, wherein:

[0024] figure 1 shows the principle of the invention in one preferred embodiment, showing a) the sensor configuration, b) the horizontal radar cones and c) the vertical radar cones;

[0025] figure 2 shows an illustration of the problem cases of a tracking radar during distance and speed measurement, the vertical resolution problem being illustrated in the comparison of a) a truck and b) a passenger car and c) in the case of a roadway with a gradient;

[0026] figure 3 shows a schematic illustration of the vertical radar resolution for different angles φ of the rotation of the radar sensors for a) φ = 90°, b) φ = 45°, wherein c) and d) contain the associated triangulation representations of the measurements;

[0027] figure 4 shows four different sensor configurations with triangulation representations thereabove for a) two orthogonal radar sensors, b) three radar sensors with 45° rotation in each case, c) three radar sensors comprising one sensor with 45° rotation and two sensors directed identically, and d) four radar sensors with 30° rotation in each case with respect to one another;

[0028] figure 5 shows one preferred embodiment of the invention in conjunction with a stereo camera, a) to d) illustrating different orientations of the radar sensors with respect to the video camera base and with respect to the traffic scene to be recorded;

[0029] figure 6 shows two expedient embodiments of the invention with different spatial arrangements of the combination system comprising radar sensors and video cameras;

[0030] figure 7 shows an illustration of the advantageous reduction of the computational complexity in a video-radar combination system.

Detailed Description

[0031] The basic prerequisite for understanding the invention explained below is - on account of the multi-target capability of the traffic monitoring system - so-called vehicle tracking.
[0032] Tracking denotes the process of following one or more vehicles within the observed scene, such that direction of travel, speed, position and, if appropriate, further features such as shape, dimensions, vehicle class, color and motor vehicle license plate can be read.

[0033] The tracking can be effected by one sensor alone or by the combination of the values from different sensors. If the values from different sensors are processed - which sensors can also be of the same type - then the term sensor fusion is employed in this connection.

[0034] Tracking for a road traffic measuring system consists, in principle, of a staggered procedure. In a first stage, the sensor supplies the raw data. In general, said raw data are interrogated at regular intervals (e.g. 100 times/sec.) and yield an incomplete picture of the road scene. A radar system is assumed hereinafter as an exemplary sensor. In the radar system, distance, angle and speed component values are supplied as incomplete measurement parameters in a first stage. These speed values comprise the radial component of the speed, i.e. the speed component in the direction of the radar system.

[0035] Said raw data are analyzed in a first processing stage (feature generation) and produce so-called feature vectors. These feature vectors summarize the raw data at a lower level and describe a measurement event that can be assigned to a vehicle with high probability.

[0036] The objective of this stage is to filter out background information and to reduce the raw data to a suitable set of feature vectors that can be utilized in further processing stages.

[0037] In the case of the tracking radar, from the multiplicity of measurements only those raw-data measurements which comply with specific quality criteria (signal levels) are extracted and forwarded toward the outside. Within the quality criteria, different confidence is placed in the different features. The actual tracking method follows, which incorporates the prepared feature vectors. The tracking method has to implement a series of tasks, which can be subdivided as follows:

1. initializing a model,
2. tracking a model,
3. discarding a model.
[0038] The initializing and discarding stages are also often referred to as the birth and death of a model.

[0039] A model within the meaning of the invention is the calculated trajectory of a vehicle with the above-designated properties of speed, position and direction of travel, determined from the feature vectors.

[0040] In principle, tracking can be regarded from outside as a black box into which feature vectors enter and from which information about the models emerges. Beginning with the initialization phase, feature vectors are observed over a relatively long period of time. If these feature vectors comprise consistent information, e.g. that the position is propagating in regular steps, then a new model is initialized by the tracking. If new feature vectors then enter in the further step, they are compared with existing models. For this purpose, the distance of the measurement vector with respect to the model parameters is evaluated. If a measurement vector lies in the vicinity of a model, then the model is updated taking account of this measurement vector. This process is designated as the update phase in the context of tracking. If no feature vector could be assigned to the model, the model is still retained. In any event, however, statistics are maintained as to when a feature vector could be assigned to a model and when not.

[0041] After the conclusion of this update phase, the model is advanced virtually on the roadway in accordance with its assigned traveling properties. This means that the new position is predicted. After the prediction of the model, in the next measurement cycle, an assignment of the new measurement vectors to the existing model parameters is once again tested. This process proceeds cyclically. After each prediction step, a check is made to determine whether the model has exceeded e.g. predefined photographic lines (trigger lines). In this case, a tracking module sends information toward the outside. This information can be used e.g. to initiate photographs. In principle, as many trigger lines as desired can be parameterized in the radar sensor. However, as the number increases there is an increase in the data loading of the interface between the radar sensor and a data processing system connected thereto.

[0042] The birth of a model is one of the most difficult phases within tracking, since it is necessary to examine the space of the feature vectors collected in the past.
[0043] A likewise important method is instigating the death of the models. In the simplest case, the model leaves the measurement space and can then be deleted from the list of existing models. The list of existing models is necessary in order to track a plurality of models simultaneously (multi-target capability). A more difficult case arises if the model is still within the measurement space in principle, but no assignment of feature vectors could be performed over a relatively long period of time. In this case, the model also dies, since the statistical uncertainty can be regarded as too high. Such a process can happen, for example, if the tracking assumes in principle that vehicles maintain a specific direction of travel, but the measured vehicle does not exhibit this traveling behavior, e.g. a vehicle turning right. Another possibility of the model dying in the measurement region may be caused by the fact that the model initialization was inadequate or erroneous.

[0044] The implementation of tracking can be realized by means of traditional recursive estimation filters (Kalman filters). Alternatively, however, it is also possible to use other (non-recursive) methods. However, non-recursive methods require more computation resources and are not particularly suitable for highly dynamic changes of features, such as e.g. a change in the direction of travel.

[0045] The advantage of modern tracking methods is that a highly accurate position estimation can be carried out even with incomplete measurement data, such as e.g. only the radial speed.
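The predict/update/death cycle of paragraphs [0040]-[0045] can be illustrated by a minimal constant-velocity alpha-beta tracker along the roadway axis - a deliberately simplified stand-in for the recursive (Kalman) estimation filters the text mentions, not the patent's actual method. All class and parameter names, gains and thresholds are illustrative assumptions.

```python
class TrackModel:
    """Minimal 1D constant-velocity alpha-beta track model (a simplified
    Kalman-filter stand-in): predict, update with an assigned feature
    vector, count misses toward model death, and check trigger lines."""

    def __init__(self, pos, vel, dt=0.01, alpha=0.5, beta=0.1):
        self.pos, self.vel, self.dt = pos, vel, dt
        self.alpha, self.beta = alpha, beta
        self.misses = 0  # consecutive cycles without an assigned feature vector

    def predict(self):
        # advance the model virtually on the roadway (prediction step)
        self.pos += self.vel * self.dt
        return self.pos

    def update(self, measured_pos):
        # a feature vector was assigned: correct position and speed
        r = measured_pos - self.pos
        self.pos += self.alpha * r
        self.vel += self.beta * r / self.dt
        self.misses = 0

    def miss(self, max_misses=50):
        # no feature vector assigned this cycle; True signals model death
        self.misses += 1
        return self.misses > max_misses

    def crossed(self, trigger_line):
        # after each prediction step: has a photographic line been passed?
        return self.pos >= trigger_line
```

Initializing a model corresponds to constructing `TrackModel` from consistent feature vectors, and the cyclic predict/update loop with the miss counter mirrors the birth, update and death phases described above.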
[0046] The basic construction of a radar system according to the invention - which is illustrated in one preferred and particularly clear variant in three views in figure 1 - comprises, in accordance with figure 1a, a tracking radar unit 1 having, besides a conventional first tracking radar sensor 11 having a vertically oriented transmitting antenna 111 and two spatially separated receiving antennas 112 parallel thereto, a further tracking radar sensor 12 of identical type, wherein the latter is arranged in a manner rotated relative to the first tracking radar sensor 11, such that its transmitting and receiving antennas 121 and 122, respectively, are not oriented vertically, but rather in a rotated fashion - here horizontally. As will be explained further below, angles of significantly less than 90° can also be chosen. [0047] While the first radar sensor 11, on account of its antenna structure and the orientation thereof, can perform exclusively high-resolution azimuthal angle measurements, the further radar sensor 12 performs exclusively high-resolution angle measurements of elevation. As a result, the tracking radar unit 1 is developed from the original 2D radar (only tracking radar sensor 11) into a 2 x 2D radar (two tracking radar sensors 11 and 12 rotated in relation to one another) which actually offers a highly accurate 3D resolution. [0048] This 3D resolution becomes usable within the superimposition of the radar cones 2, as shown for the azimuth coordinate in figure 1b and for the elevation in figure 1c. If the distance d (a few centimeters) is small in comparison with the observed measurement space (up to a few hundred meters), the spatial offset of the tracking radar sensors 11 and 12 has virtually no influence on the superimposition. Rather, the overlap regions 23 and 24 are critically defined by the horizontal and vertical aperture angles of the radar cones 21 and 22. If both angles are identical, the radar cones 21 and 22 overlap almost perfectly.
The overlap region decreases as the difference between the aperture angles increases. Outside the differently sized overlap regions 23 and 24, accurate azimuth angle measurements are possible in the non-overlapping part of the first radar cone 21 and accurate elevation angle measurements in the corresponding part of the second radar cone 22. [0049] As will be explained further below, the radar sensors 11 and 12 can also have the distance d in an arbitrary direction (i.e. they can be arranged vertically one above the other or else in an obliquely offset manner). [0050] Figure 2 shows problem cases in which, given a lack of angle resolution in elevation, the tracking radar measures the distance erroneously, so that the speed can be determined only less accurately. [0051] The illustration in figure 2a shows at the top the radar measurement situation and underneath the frontal image of a truck 31 that is generated therefrom, wherein the frontmost radar measurement point 25, owing to the height of the truck 31 above the roadway 4, leads to a shorter distance measurement than in the case of the same radar measurement situation with a passenger car 32 as illustrated in figure 2b. The differing height of the radar measurement points 25 becomes particularly clear in the frontal images underneath with the aid of the framed front views 311 and 321, which are often extracted from the overall image for further image processing for the purpose of vehicle identification.
[0052] Figure 2c shows a similarly erroneous distance measurement for the case where the vehicle 3 moves on an inclined roadway 4, as a result of which, owing to the changing height of the acquired vehicle positions 33, 34 and 35, the distance measurement is adversely affected and the accuracy of the speed measurement decreases. [0053] Figure 3a shows, for the configuration of the radar sensors 11 and 12 as chosen in figure 1, the basic measurement situation as projection of an object in space onto the respective measurement plane of the radar sensors 11 and 12. The associated figure 3c illustrates the same situation as a two-dimensional triangulation scheme. This makes it clear that each of the radar sensors 11 and 12 completely acquires exactly one angle measurement, i.e. the measurement of the radar sensor 12 rotated by 90° directly yields a height measurement. [0054] Figure 3b then shows a modified orientation of the radar sensor 12 by virtue of the latter being rotated by 45° relative to the first radar sensor 11. In this case, both the horizontal and the height measurement are incorporated in the measurement value of the radar sensor 12. From the measurement value, it is possible to reconstruct the horizontal position and the height with the aid of the measurement data from the first radar sensor 11. In this case, the ratio of the reconstruction error of the vertical with respect to the horizontal object position is 1:1 [sin(45°) : sin(45°)]. Possible measurement errors of the radar sensor 12 on the horizontal measurement therefore have a greater effect during the reconstruction than in the case of figure 3a. If the angle γ (depicted only in figure 4) is reduced further, e.g. to 30° or even to 5°, then the ratio of the reconstruction error of the vertical with respect to the horizontal object position of the radar sensor 12 increases to a ratio according to [sin(90° - γ) : sin(γ)].
In this case, therefore, a height measurement error has an inversely proportional effect, namely with 1/sin(γ). The triangulation shown in this respect in figure 3d illustrates the loss of measurement certainty. [0055] Figures 4a-4d show modifications of the tracking radar unit 1 according to the invention and in each case indicate above them a triangulation scheme for the respective sensor configuration. Merely for comparison, figure 4a once again shows the embodiment from figure 1.
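The error ratio [sin(90° - γ) : sin(γ)] and its 1/sin(γ) growth can be checked numerically; the short sketch below merely evaluates the formula from [0054] and is not part of the patent.

```python
import math

def reconstruction_error_ratio(gamma_deg):
    """Ratio of the vertical to the horizontal reconstruction error for a
    further radar sensor rotated by the angle gamma relative to the first
    sensor, per [sin(90 deg - gamma) : sin(gamma)] from the description."""
    g = math.radians(gamma_deg)
    return math.sin(math.radians(90) - g) / math.sin(g)

# gamma = 90 deg: the rotated sensor measures the height directly (figure 3a);
# gamma = 45 deg: ratio 1:1 (figure 3b);
# small gamma: the ratio grows roughly like 1/sin(gamma).
for gamma in (90, 45, 30, 5):
    print(gamma, round(reconstruction_error_ratio(gamma), 2))
```

Evaluating the loop confirms the text: at 45° the ratio is exactly 1.0, and at 5° it exceeds 11, i.e. a small height measurement error is amplified by more than an order of magnitude in the reconstruction.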
[0056] Figure 4b describes a radar unit 1 having three radar sensors 11, 12 and 13, wherein the further radar sensor 12 is rotated as in figure 4a, but yet another radar sensor 13 is rotated by an angle of 45° with respect to the first radar sensor 11. The ratio of the reconstruction error of the vertical with respect to the horizontal object position of the further radar sensor 13 is 1:1. Thus, the measurement of the radar sensor 13 contributes to the reduction of the measurement noise with respect to the measurement of the radar sensors 11 and 12. [0057] In a configuration such as is shown in figure 4c, although both radar sensors 11 and 13 oriented with γ = 0° yield only the 2D object coordinates in the azimuth, overall the measurement noise of the total measurement decreases as a result. [0058] Figure 4d indicates an advantageous example for the tracking radar unit 1 in which, apart from the first radar sensor 11, three further radar sensors 12, 13 and 14 are also arranged, in a manner rotated in each case by integral multiples of 30° relative to the first radar sensor 11. As a result of the largest possible spacing of the rotation angles γ between 0° and 90°, each individual radar sensor 11 to 14 makes an optimum contribution to the reduction of the measurement uncertainties in all three spatial dimensions. If, by contrast, a specific dimension is intended to be preferred in terms of its measurement certainty, then other spacings of the rotation angles φ can also be chosen. In this regard, in the example from figure 4c, the horizontal resolution is preferred at the expense of the vertical resolution. [0059] The arrangement along a straight line is intended in this example merely to better elucidate the systematic position change; a compact arrangement in a rectangle is preferred for reasons of the more uniform overlap regions 23, 24, ... of the radar cones 2.
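The statement in [0058] that evenly spaced rotation angles reduce the overall measurement uncertainty can be illustrated with a small least-squares sketch. The measurement model below, in which each sensor observes one projection of the 2D object position with unit noise, is a simplifying assumption made only for this illustration.

```python
import math

def position_uncertainty(angles_deg):
    """Trace of the least-squares covariance (up to the per-sensor noise
    variance) when each sensor i observes the projection
    u_i = x*cos(g_i) + y*sin(g_i) of the object position (x, y)."""
    n = [[0.0, 0.0], [0.0, 0.0]]       # normal matrix A^T A
    for a in angles_deg:
        c, s = math.cos(math.radians(a)), math.sin(math.radians(a))
        n[0][0] += c * c; n[0][1] += c * s
        n[1][0] += s * c; n[1][1] += s * s
    det = n[0][0] * n[1][1] - n[0][1] * n[1][0]
    return (n[0][0] + n[1][1]) / det   # trace of the 2x2 inverse

two = position_uncertainty([0, 90])           # figure 4a configuration
four = position_uncertainty([0, 30, 60, 90])  # figure 4d configuration
print(four < two)    # four evenly spaced sensors give lower uncertainty
```

Under these assumptions the two-sensor configuration gives a combined uncertainty of 2.0 noise variances, while four sensors spaced at multiples of 30° reduce it to about 1.23, matching the qualitative claim of the description.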
It should be pointed out that the juxtaposition of more than two radar sensors - shown in figures 4b-4d along a line for reasons of better illustration of the rotation mode - is generally effected by a two-dimensional arrangement over an area, as described below with regard to figure 5. [0060] Figure 5 shows the advantageous combination of the tracking radar unit 1 according to the invention with a video camera unit 5. Without restriction of generality, a preferred use of two video cameras 51 and 52 arranged with a basic distance b in a stereo configuration is assumed below.
[0061] Various possibilities for realizing a video-radar combination system 6 are illustrated in figures 5a-5d. [0062] In a first embodiment, figure 5a exhibits, between the video cameras 51 and 52 arranged at the basic distance b, two radar sensors 11 and 12 rotated by 90°, said radar sensors being arranged vertically centrally along the basic axis 53 of the video cameras 51 and 52 in a manner horizontally closely adjacent at the distance d (only designated in figure 1). In this way, what is achieved is not only a very compact arrangement but also an almost complete overlap of the radar cones 21 and 22 (only depicted in figure 1) with the fields of view of the stereo camera images (camera FOV - field of view). [0063] A fully equivalent variant with respect to figure 5a is achieved with the video-radar combination system 6 illustrated in figure 5b. In this configuration, the radar sensors 11 and 12 rotated by 90° in relation to one another are arranged in a manner vertically closely adjacent orthogonally with respect to the basic axis 53 and centrally between the two video cameras 51 and 52. [0064] Figure 5c illustrates a further variant of the video-radar combination system 6, which differs from the two variants mentioned above in that the basic axis 53 of the video cameras 51 and 52 is arranged vertically. That has advantages for attaching the combination system 6 to a slender mast and does not entail any disadvantages at all for the data fusion with the tracking radar system. [0065] In this modification, the tracking radar system consists of three radar sensors 11, 12 and 13 rotated by 45° in each case in relation to one another, analogously to figure 4b. In this case, the first radar sensor 11 and, relative thereto, a radar sensor 12 rotated by 90° and a radar sensor 13 rotated by 45° are arranged in an equilateral triangle whose centroid lies centrally on the basic axis 53 between the two video cameras 51 and 52.
The same configuration of the radar sensors 11, 12 and 13 can, of course, also be applied to the horizontal normal position of the video cameras 51 and 52. [0066] A further modification of the vertically arranged stereo video camera unit 5 is illustrated in figure 5d with three further radar sensors 12, 13 and 14, which are arranged relative to the first radar sensor 11 in each case by a different integral multiple of the angle φ = 30° (analogously to figure 4d). These four radar sensors 11, 12, 13 and 14 are arranged in a rectangle between the cameras 51 and 52, the rectangle (in each case with regard to the sensor centroids) being arranged centrally and symmetrically with respect to the basic axis 53 of the video cameras 51 and 52. This configuration, which can also be arranged as a rhombus (diamond) between the video cameras 51 and 52, can in the same way likewise be applied to the horizontal normal position of the video cameras 51 and 52. [0067] The fact that the video-radar combination system 6 can also be modified even further can be gathered from figure 6. The compact construction described above is illustrated again in the background of this illustration. [0068] It is also possible, however, to install the tracking radar unit 1 spatially separately from the video camera unit 5, for example on different sides of the roadway 4. That is unobjectionable as long as the radar cones 21 and 22 and the camera FOV have sufficient overlap regions for a near region of interest of the measurements and of the tracking (analogously to the overlap regions 23, 24 of the radar sensors 11, 12 as shown in figure 1). The required fusion of the radar data with the camera data in this case necessitates an additional calibration for creating a transformation specification, but this arises only once in the case of the fixed installation.
[0069] The particular advantages of the video-radar combination system in the embodiment in accordance with figure 5a or 5b will be described in greater detail below with reference to figure 7. [0070] Vehicles that approach the measuring system (tracking radar sensor unit 1 and stereo video camera unit 5) are acquired by the tracking radar sensor unit 1 at a distance of approximately 100 m and, given corresponding quality and continuity of the signal profile, a model is born in the tracking algorithm. Since the sensor combination consisting of at least two radar sensors 11 and 12 uses tracking radar sensors having multi-target capability, a plurality of objects can be tracked simultaneously in this case.
[0071] For each object (vehicle 3), the tracking algorithm yields the current values and a prediction, which is adjusted against the subsequent, then current values. The values are: speed, direction, acceleration, distance, angle with respect to the measurement axis of the radar sensors 11 and 12 in three-dimensional space (3D space), and vehicle height, width and length. [0072] In this case, the vehicle width and length are determined in accordance with a method in DE 10 2012 107 445, not previously published. The vehicle height is determined analogously to the method described therein. [0073] Since the transformation parameters between the tracking radar sensor unit 1 and the video camera unit 5 are known or can be converted into one another, it is possible to use the a priori knowledge from the tracking data of the tracking radar sensor unit 1 in the far field (100 to 50 m) in order to determine an ROI space 55 (region of interest) in the form of a 3D space in the near field (50 to 10 m), in particular a parallelepiped, above the roadway 4 (or a roadway line thereof). The video stream of the video camera unit 5 is then evaluated for each tracked vehicle 3 exclusively and in a delimited manner in the ROI space 55 defined therefor. [0074] The height of the ROI space 55 as a parallelepiped (which is not depicted as such since it is composed of the sequentially recorded image data of the camera FOV 54) results substantially (i.e. taking account of required tolerance allowances known a priori) from the evaluation of the radar signals in the vertical direction, which provides information about the real height of the vehicle 3. The width of the parallelepiped analogously results from the evaluation of the radar signals in the horizontal direction, which provides information about the width of the vehicle 3.
[0075] The length of the parallelepiped (ROI space 55) is derived depending on the length of the vehicle 3, which results from the evaluation of the radar signals in the horizontal and vertical directions, with possible assistance from a priori knowledge of the vehicle width and vehicle height for certain vehicle classes. [0076] Approaching vehicles 3 are acquired by the video camera unit 5, but identified only once the individual images have been analyzed by image processing algorithms. In this case, the algorithms have to search through the entire acquired image region. Since, as is known, these algorithms require a high computational complexity, a very powerful and expensive data processing system is a prerequisite. Moreover, such systems have an increased space requirement and considerable heat generation. [0077] Through the use of the novel tracking radar sensor unit 1 in combination with a video camera unit 5 (in accordance with figure 5 or figure 6) there is now - as illustrated schematically in figure 7a - the possibility of evaluating only specific regions of the video images in order to reduce the computational complexity. Solely by means of the video camera unit 5, however, the vehicles 3 cannot be reliably identified in this case, and the error rate of the derived speed increases significantly. [0078] With a 2D radar (in accordance with the prior art), although it would theoretically already be possible to improve this by means of a dynamic adaptation of the image excerpt, that would bring about only a slight improvement, since, for a reliable and robust image analysis in the case of an approaching vehicle, the image excerpt would have to be altered from image to image both in terms of size and shape and in terms of position. However, the resolution of an angle in only one horizontal plane is not sufficient for this purpose. [0079] The acquisition range of the video-radar combination system 6 is a 3D space that is really present.
With the aid of two orthogonal 2D radars (together the 3D radar of the tracking radar sensor unit 1), vehicles 3 can be resolved completely in their different positions 33, 34, 35 and a reliable prediction of the movement curve of the vehicle 3 can be effected with the aid of the tracking algorithm. [0080] On the basis of these data determined during radar tracking in the far field (100 to 50 m), a prediction for an ROI space 55 of the video camera unit 5 in the form of a 3D space can be reliably established already before entry into the near field (50 to 10 m). If said 3D space is projected onto individual 2D images of the video stream in a manner dependent on the temporal progression, for each individual video image this results in very specific excerpts (depicted in figure 7a as a snapshot, "a slice", of the ROI space 55) which yield robust results with minimal complexity if they are analyzed by tracking image processing algorithms. That part of the ROI space 55 which delimits the image data for the current position 35 of the vehicle 3 and is depicted in figure 7a is illustrated schematically in terms of its alteration by means of the dashed positions also for the positions 33 and 34 of the vehicle 3 recorded earlier. This change is also clearly discernible in the illustration of the tracked image recordings, shown underneath in figure 7b, by means of the image excerpts lying one behind another, which are designated in a simplified manner by the associated selected positions 33, 34 and 35 of the vehicle 3. [0081] Consequently, virtually the entire performance of the data processing systems used can be brought to bear on the image data lying as far as possible in a region of high disparity of the stereo video camera unit 5. The data that are then to be processed lead to a very accurate image tracking result and thus also to very accurate values, in particular speed values.
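The projection of the predicted ROI space onto individual video images described in [0080] can be sketched with a simple pinhole camera model. The focal length, principal point, tolerance margin and the camera-aligned coordinate frame below are illustrative assumptions; the patent itself only presupposes a calibrated transformation specification between radar and camera coordinates.

```python
# Sketch: deriving a per-frame ROI rectangle from radar tracking data via a
# pinhole camera model. All numeric parameters are assumed for illustration.

FOCAL_PX = 1200.0          # assumed focal length in pixels
CX, CY = 960.0, 600.0      # assumed principal point (image centre)

def roi_from_track(x, y, z, width, height, margin=1.2):
    """Project the radar-predicted vehicle box (lateral offset x, height y,
    distance z, all in metres in a camera-aligned frame) to an image-plane
    ROI, padded by a tolerance margin as mentioned in [0074]."""
    u = CX + FOCAL_PX * x / z                  # horizontal image position
    v = CY - FOCAL_PX * y / z                  # vertical image position
    w_px = FOCAL_PX * width * margin / z       # ROI width in pixels
    h_px = FOCAL_PX * height * margin / z      # ROI height in pixels
    return (u - w_px / 2, v - h_px / 2, w_px, h_px)

# As the vehicle approaches (z shrinks), the ROI grows and shifts, which is
# the frame-to-frame adaptation of the image excerpt sketched in figure 7a.
far = roi_from_track(0.0, -1.0, 50.0, 1.8, 1.5)    # far field (50 m)
near = roi_from_track(0.0, -1.0, 10.0, 1.8, 1.5)   # near field (10 m)
print(far[2] < near[2])    # the near-field ROI is wider
```

Restricting the image analysis to such a rectangle per frame is what allows the computational effort of the video evaluation to be concentrated on the relevant excerpt, as the description argues in [0080] and [0081].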
[0082] In this case, it is of particular importance in the installation of the video-radar combination system 6 that the calculated ROI space 55 lies in the complete acquisition range both of the tracking radar sensor unit 1 and of the stereo video camera unit 5. [0083] By means of the present combination of the tracking radar sensor unit 1 according to the invention on the basis of real 3D recording of vehicle movement data by means of additional vertical angle measurement with the video tracking known per se by means of a stereo video camera unit 5, a significant improvement in the accuracy and reliability of the speed measurement and the assignment of the vehicle data acquired in the video to the vehicle data measured by the radar is achieved and a reduction of the computational complexity required for the video tracking can be registered.
Reference signs
1 tracking radar unit
11 (first) tracking radar sensor
111 transmitting antenna
112 receiving antenna
12, 13, 14 (further) tracking radar sensor
2 radar cone
21 first radar cone
22 second radar cone
23 (horizontal) overlap region
24 (vertical) overlap region
25 radar measurement point
3 vehicle
31 truck
311, 321 front view
32 passenger car
33, 34, 35 vehicle position
4 roadway
5 video camera unit
51, 52 video camera
53 basic axis (of the video cameras)
54 camera field of view (FOV)
55 ROI space
6 camera-radar combination unit
b basic width (of the stereo camera unit)
d distance (between the tracking radar sensors)
n integer
φ angle

Claims (13)

1. A traffic monitoring system for speed measurement and reliable assignment of moving vehicles with a multi-target recording module, comprising a first tracking radar sensor having a transmitting antenna and two spatially separated receiving antennas, which form a first antenna structure for acquiring horizontal distances, extents and azimuthal angle positions of the moving vehicles and have a sensor output with a regular sequence of measurement data containing at least one radial speed component directed to the radar sensor from azimuth angle positions and distances, wherein at least one further tracking radar sensor, having an antenna structure that is of the same type as the antenna structure of the transmitting and receiving antennas of the first tracking radar sensor but is operated at a different frequency in comparison therewith, is arranged in direct proximity to the first tracking radar sensor, the antenna structure of the at least one further tracking radar sensor is rotated by a defined angle relative to the antenna structure of the first tracking radar sensor, such that the further tracking radar sensor forms an antenna structure for acquiring at least one signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles and has a sensor output with a regular sequence of measurement data containing at least one radial speed component directed to the radar sensor from elevation angle positions, and means for fusing the measurement data generated by the first and the at least one further tracking radar sensor to form three-dimensional (3D) radar image data are present, wherein a tracking method with the aid of a sequence of the fused 3D radar image data is provided with regard to each of the vehicles identified in a corresponding fashion from the measurement data of the first and of the at least one further tracking radar sensor.
2. The traffic monitoring system as claimed in claim 1, wherein the antenna structures of the first tracking radar sensor and of the at least one further tracking radar sensor are arranged in a manner rotated by integral multiples (n = 1...3) of 45° with respect to one another, as a result of which the at least one acquired signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles has a reconstruction error of a vertical with respect to a horizontal vehicle position of 1:1.
3. The traffic monitoring system as claimed in claim 1, wherein the antenna structures of the first tracking radar sensor and of the at least one further tracking radar sensor are arranged in a manner rotated by integral multiples (n = 1...5) of 30° with respect to one another, as a result of which the at least one acquired signal component for the reconstruction of vertical distances, extents and elevation angle positions of the moving vehicles has a reconstruction error of a vertical with respect to a horizontal vehicle position of 1.55:1.
4. The traffic monitoring system as claimed in claim 1, 2 or 3, wherein the antenna structures of the first tracking radar sensor and of the one further tracking radar sensor are arranged in a manner rotated by 90° with respect to one another, as a result of which the acquired signal component can be used directly for the measurement of vertical distances, extents and elevation angle positions of the moving vehicles.
5. The traffic monitoring system as claimed in any one of claims 1 to 4, wherein the first tracking radar sensor and the at least one further tracking radar sensor are combined with a video camera, wherein means for fusing the horizontal and vertical measurement data of the tracking radar sensors of the tracking radar unit with video data of the video camera unit are provided.
6. The traffic monitoring system as claimed in any one of claims 1 to 4, wherein the first tracking radar sensor and the at least one further tracking radar sensor are combined with a stereo video camera unit comprising two video cameras having a basic distance, wherein means for fusing the horizontal and vertical measurement data of the tracking radar sensors of the tracking radar unit with video data of the video cameras are provided.
7. The traffic monitoring system as claimed in either of claims 5 and 6, wherein an ROI window for determining vehicle features and for tracking within a video image is assigned to each of the vehicles identified in a corresponding fashion from the measurement data of the first and of the at least one further tracking radar sensor.
8. The traffic monitoring system as claimed in claim 6, wherein the basic distance between the two video cameras is arranged horizontally.
9. The traffic monitoring system as claimed in claim 6, wherein the basic distance between the two video cameras is arranged vertically.
10. The traffic monitoring system as claimed in claim 6, wherein the first tracking radar sensor and the at least one further tracking radar sensor are arranged within the basic distance between the two video cameras, wherein the tracking radar sensors are arranged along a basic axis of the basic distance between the two video cameras.
11. The traffic monitoring system as claimed in claim 6, wherein the first tracking radar sensor and the at least one further tracking radar sensor are arranged within the basic distance between the two video cameras, wherein the tracking radar sensors are arranged orthogonally with respect to a basic axis of the basic distance between the two video cameras.
12. The traffic monitoring system as claimed in claim 6, wherein the first and the at least one further tracking radar sensor are arranged outside the basic distance between the two video cameras.
13. The traffic monitoring system as claimed in claim 12, wherein the first tracking radar sensor and the at least one further tracking radar sensor are arranged spatially separately from the stereo video camera unit, wherein, after the spatially separate installation thereof, a calibration for determining a transformation specification is provided. JENOPTIK Robot GmbH Patent Attorneys for the Applicant/Nominated Person SPRUSON & FERGUSON
AU2014202300A 2013-04-30 2014-04-29 Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module Active AU2014202300B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013104443.3A DE102013104443B4 (en) 2013-04-30 2013-04-30 Traffic monitoring system for measuring the speed and allocation of moving vehicles in a multi-target recording module
DE102013104443.3 2013-04-30

Publications (2)

Publication Number Publication Date
AU2014202300A1 AU2014202300A1 (en) 2014-11-13
AU2014202300B2 true AU2014202300B2 (en) 2016-01-21

Family

ID=50543417

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2014202300A Active AU2014202300B2 (en) 2013-04-30 2014-04-29 Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module

Country Status (4)

Country Link
EP (1) EP2799901A3 (en)
CN (1) CN104134354A (en)
AU (1) AU2014202300B2 (en)
DE (1) DE102013104443B4 (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015101292A1 (en) * 2015-01-29 2016-08-04 Valeo Schalter Und Sensoren Gmbh Method for detecting an object in an environmental region of a motor vehicle by checking a spatial deviation of measuring points, control device, driver assistance system and motor vehicle
CN104966400A (en) * 2015-06-11 2015-10-07 山东鼎讯智能交通股份有限公司 Integrated multi-object radar speed measurement snapshot system and method
US10145951B2 (en) * 2016-03-30 2018-12-04 Aptiv Technologies Limited Object detection using radar and vision defined image detection zone
ITUA20164219A1 (en) * 2016-06-09 2017-12-09 Kria S R L PROCEDURE AND SYSTEM FOR DETECTION OF VEHICLE TRAFFIC, CORRESPONDENT IT PRODUCT
JP6597517B2 (en) * 2016-08-10 2019-10-30 株式会社デンソー Target detection device
EP3315998B1 (en) * 2016-10-25 2021-12-08 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH Apparatus and method for determining a speed of a vehicle
CN108242172A (en) * 2016-12-26 2018-07-03 天津曼洛尔科技有限公司 A kind of traffic trip monitoring system
CN106710240B (en) * 2017-03-02 2019-09-27 公安部交通管理科学研究所 The passing vehicle for merging multiple target radar and video information tracks speed-measuring method
CN106878687A (en) * 2017-04-12 2017-06-20 吉林大学 A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
CN108109394B (en) * 2017-12-07 2020-09-29 重庆交通大学 System and method for detecting traffic parameters of single geomagnetic vehicle based on vector model
EP3499265B1 (en) * 2017-12-12 2020-08-19 Veoneer Sweden AB Determining object motion and acceleration vector in a vehicle radar system
BE1026002B1 (en) * 2018-02-12 2019-09-09 Niko Nv ELECTRICAL OR ELECTRONIC DEVICE MODULE COMPRISING AT LEAST ONE RADAR SENSOR
US11035943B2 (en) * 2018-07-19 2021-06-15 Aptiv Technologies Limited Radar based tracking of slow moving objects
DE102018118150A1 (en) * 2018-07-26 2020-01-30 S.M.S Smart Microwave Sensors Gmbh System for controlling traffic routing at an intersection
CN109031234B (en) * 2018-08-09 2022-06-28 南京信息工程大学 Method for rapidly acquiring radar reflectivity data three-dimensional isosurface
CN109918900B (en) * 2019-01-28 2022-08-16 锦图计算技术(深圳)有限公司 Sensor attack detection method, device, equipment and computer readable storage medium
CN111845730B (en) * 2019-04-28 2022-02-18 郑州宇通客车股份有限公司 Vehicle control system and vehicle based on barrier height
CN110361735B (en) * 2019-07-22 2023-04-07 成都纳雷科技有限公司 Vehicle speed measuring method and device based on speed measuring radar
CN110333486A (en) * 2019-08-13 2019-10-15 四川朝阳公路试验检测有限公司 A kind of tunnel Ground Penetrating Radar detection system and its application method
CN110672091B (en) * 2019-09-29 2023-05-23 哈尔滨飞机工业集团有限责任公司 Flexible drag nacelle positioning system of time domain aircraft
CN110850396B (en) * 2019-11-29 2022-08-09 哈尔滨工程大学 Electric simulator applied to deep sea black box search and exploration positioning system and track generation method thereof
CN111856440B (en) * 2020-07-21 2024-04-05 阿波罗智联(北京)科技有限公司 Position detection method, device, equipment and readable storage medium
CN114648871B (en) * 2020-12-18 2024-01-02 富士通株式会社 Speed fusion method and device
CN115472022B (en) * 2022-09-06 2024-03-22 同盾科技有限公司 Fusion speed measuring method, fusion speed measuring device, storage medium and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2284568A2 (en) * 2009-08-13 2011-02-16 TK Holdings Inc. Object sensing system
US20120112953A1 (en) * 2009-09-16 2012-05-10 Broadcom Corporation Integrated and configurable radar system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005189107A (en) * 2003-12-25 2005-07-14 Toshiba Corp Radar system
DE102005042729A1 (en) 2005-09-05 2007-03-08 Valeo Schalter Und Sensoren Gmbh Motor vehicle radar system with horizontal and vertical resolution
CN100380134C (en) * 2005-10-20 2008-04-09 武汉大学 Small size antenna array aperture expanding and space signal processing method
DE102007022373A1 (en) 2007-05-07 2008-11-13 Robot Visual Systems Gmbh Method for conclusively detecting the speed of a vehicle
US7592945B2 (en) * 2007-06-27 2009-09-22 Gm Global Technology Operations, Inc. Method of estimating target elevation utilizing radar data fusion
ES2393459T3 (en) 2007-10-11 2012-12-21 Jenoptik Robot Gmbh Procedure for the detection and documentation of traffic violations at a traffic light
US8294595B1 (en) 2009-09-21 2012-10-23 The Boeing Company Speed detector for moving vehicles
DE102010012811B4 (en) 2010-03-23 2013-08-08 Jenoptik Robot Gmbh Method for measuring speeds and associating the measured speeds with appropriate vehicles by collecting and merging object tracking data and image tracking data
DE102012107445B8 (en) 2012-08-14 2016-04-28 Jenoptik Robot Gmbh Method for classifying moving vehicles


Also Published As

Publication number Publication date
DE102013104443A1 (en) 2014-10-30
EP2799901A3 (en) 2015-03-25
AU2014202300A1 (en) 2014-11-13
CN104134354A (en) 2014-11-05
EP2799901A2 (en) 2014-11-05
DE102013104443B4 (en) 2022-03-17

Similar Documents

Publication Publication Date Title
AU2014202300B2 (en) Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module
US10445928B2 (en) Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
US10937231B2 (en) Systems and methods for updating a high-resolution map based on binocular images
CN109102702A (en) Vehicle speed measurement method based on fusion of video encoder server and radar signals
CN109085570A (en) Automobile detecting following algorithm based on data fusion
CN108509972A (en) Obstacle feature extraction method based on millimeter-wave radar and lidar
KR20200067629A (en) Method and device to process radar data
CN109471096B (en) Multi-sensor target matching method and device and automobile
JP2017067756A (en) Object detection device and object detection method
KR20120130199A (en) Method and Device for Determining The Speed of Travel and Coordinates of Vehicles and Subsequently Identifying Same and Automatically Recording Road Traffic Offences
JP2004530144A (en) How to provide image information
CN111856507B (en) Environment sensing implementation method, intelligent mobile device and storage medium
Kim et al. Low-level sensor fusion network for 3d vehicle detection using radar range-azimuth heatmap and monocular image
CN113743171A (en) Target detection method and device
CN103578278A (en) Device and method for identifying and documenting at least one object passing through an irradiation field
Li et al. Automatic parking slot detection based on around view monitor (AVM) systems
CN111819466A (en) Apparatus, system, and method for locating a target in a scene
CN114063061A (en) Method for monitoring a vehicle by means of a plurality of sensors
CN112784679A (en) Vehicle obstacle avoidance method and device
CN113034586A (en) Road inclination angle detection method and detection system
CN115690713A (en) Binocular camera-based radar-vision fusion event detection method
CN115151836A (en) Method for detecting a moving object in the surroundings of a vehicle and motor vehicle
WO2020133041A1 (en) Vehicle speed calculation method, system and device, and storage medium
Hoffmann et al. Filter-based segmentation of automotive SAR images
da Silva et al. A priori knowledge-based STAP for traffic monitoring applications: First results

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)