CN114648871B - Speed fusion method and device - Google Patents

Speed fusion method and device

Info

Publication number
CN114648871B
CN114648871B (application CN202011503615.8A)
Authority
CN
China
Prior art keywords
speed
target object
sensors
radar
object corresponding
Prior art date
Legal status
Active
Application number
CN202011503615.8A
Other languages
Chinese (zh)
Other versions
CN114648871A (en)
Inventor
张兆宇
底欣
田军
Current Assignee
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date
Filing date
Publication date
Application filed by Fujitsu Ltd
Priority to CN202011503615.8A
Priority to JP2021199666A
Publication of CN114648871A
Application granted
Publication of CN114648871B

Links

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G08G 1/0125 Traffic data processing
    • G08G 1/052 Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of the present application provide a speed fusion method and device. The method includes: determining the speed of a target object corresponding to each group of sensors, wherein, when both the radar and the camera detect the target object and the first speed is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group of sensors in which the radar and the camera are located; when the first speed is less than or equal to the threshold, the speed average is taken as the speed of the target object corresponding to that group of sensors; and, when the speeds of the target object corresponding to a second number of the first number of groups of sensors are speed averages or radial speeds, determining the speed of the target object in the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and a first included angle between the speed directions of the target object corresponding to the second number of groups of sensors.

Description

Speed fusion method and device
Technical Field
The application relates to the technical field of information processing.
Background
With the development of urban traffic, traffic conditions are increasingly complex. Compared with the relatively simple conditions of highways, urban roads carry many kinds of participants, such as cars, bicycles, trucks, buses and pedestrians. These participants are difficult to control and their movements involve many uncertainties, which is especially evident at urban intersections. In addition, once an accident occurs at an intersection, it causes serious congestion and great traffic pressure, which is particularly severe in cities with high population density. Therefore, how to improve travel efficiency and relieve traffic pressure is an urgent problem for urban traffic management.
In recent years, intelligent traffic systems are increasingly being applied to urban traffic management. The existing intelligent traffic system generally performs target detection based on road videos shot by a monitoring camera, and performs analysis and traffic management according to detection results. In addition, a technique of traffic management based on the signal of the surveillance radar has also appeared.
It should be noted that the foregoing description of the background art is only for the purpose of facilitating a clear and complete description of the technical solutions of the present application and for the convenience of understanding by those skilled in the art. The above-described solutions are not considered to be known to the person skilled in the art simply because they are set forth in the background section of the present application.
Disclosure of Invention
Since the fields of view of the radar and the camera are limited, it is necessary to deploy multiple sets of cameras and radars on the road to detect the target object from different directions, so that data fusion of the results of the multiple sets of camera and radar detection is required to determine the position and/or speed of the target object.
The inventors have found that, in existing schemes, if the target object moves too fast, the speed detected by the camera has low accuracy; in addition, when the target object is detected only by the radar, only its radial speed can be obtained, not its real speed.
In view of at least one of the above problems, embodiments of the present application provide a speed fusion method and apparatus.
According to an aspect of the embodiments of the present application, there is provided a speed fusion method applied to a traffic perception system, the system including a first number of groups of sensors, the first number being greater than or equal to 2, each group of sensors including a radar and/or a camera deployed at the same location, different groups of sensors being deployed at different locations, the method including:
determining the speed of a target object corresponding to each group of sensors, wherein, for each group of sensors, when both the radar and the camera detect the target object and a first speed determined based on video data obtained by the camera is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group of sensors in which the radar and the camera are located; when the first speed is less than or equal to the threshold, a speed average is taken as the speed of the target object corresponding to that group of sensors, the speed average being the average of the radial speed and the first speed; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the group of sensors in which the radar or the camera is located;
when the speeds of the target object corresponding to a second number of the first number of groups of sensors are speed averages or radial speeds, determining the speed of the target object in the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and a first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, the second number being greater than or equal to 2 and less than or equal to the first number.
According to another aspect of embodiments of the present application, there is provided a speed fusion device for use in a traffic sensing system, the system including a first number of sets of sensors, the first number being greater than or equal to 2, each set of sensors including radar and/or cameras deployed at the same location, the different sets of sensors being deployed at different locations, the device comprising:
a first determining unit configured to determine the speed of a target object corresponding to each group of sensors, wherein, for each group of sensors, when both the radar and the camera detect the target object and a first speed determined based on video data obtained by the camera is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group of sensors in which the radar and the camera are located; when the first speed is less than or equal to the threshold, a speed average is taken as the speed of the target object corresponding to that group of sensors, the speed average being the average of the radial speed and the first speed; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the group of sensors in which the radar or the camera is located;
a second determining unit configured to, when the speeds of the target object corresponding to a second number of the first number of groups of sensors are speed averages or radial speeds, determine the speed of the target object in the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and a first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, the second number being greater than or equal to 2 and less than or equal to the first number.
One of the beneficial effects of the embodiments of the present application is that, for each group of sensors, the speed of the target object corresponding to that group is determined according to the type of sensor that detected the speed and a preset threshold. When the speed of the target object corresponding to a group of sensors is determined based on the radar, the speed of the target object in the electronic map system is determined according to the speeds of the target object corresponding to the groups of sensors and the included angle between their speed directions. This avoids the problems that the camera detects speed with low accuracy when the target object moves too fast and that the real speed of the target object cannot be obtained when it is detected only by the radar, thereby improving the accuracy of the electronic map.
Specific embodiments of the present application are disclosed in detail below with reference to the following description and drawings, indicating the manner in which the principles of the present application may be employed. It should be understood that the embodiments of the present application are not limited in scope thereby. The embodiments of the present application include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
Elements and features described in one drawing or one implementation of an embodiment of the present application may be combined with elements and features shown in one or more other drawings or implementations. Furthermore, in the drawings, like reference numerals designate corresponding parts throughout the several views, and may be used to designate corresponding parts as used in more than one embodiment.
The accompanying drawings, which are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 is a schematic illustration of a velocity fusion method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a method for determining a velocity of a target object corresponding to each set of sensors according to an embodiment of the present application;
fig. 3A and fig. 3B are schematic diagrams illustrating the positional relationship between the first included angle and the second included angle in the embodiment of the present application;
FIG. 4 is a schematic diagram of a real speed direction of a target object in an electronic map coordinate system according to an embodiment of the present application;
FIG. 5 is a schematic illustration of a velocity fusion device according to an embodiment of the present application;
fig. 6 is a schematic diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
The foregoing and other features of the present application will become apparent from the following description, with reference to the accompanying drawings. In the specification and drawings, there have been specifically disclosed specific embodiments of the present application which are indicative of some of the embodiments in which the principles of the present application may be employed, it being understood that the present application is not limited to the described embodiments, but, on the contrary, the present application includes all modifications, variations and equivalents falling within the scope of the appended claims.
In the embodiments of the present application, the terms "first," "second," and the like are used to distinguish between different elements from each other by reference, but do not denote a spatial arrangement or a temporal order of the elements, and the elements should not be limited by the terms. The term "and/or" includes any and all combinations of one or more of the associated listed terms. The terms "comprises," "comprising," "including," "having," and the like, are intended to reference the presence of stated features, elements, components, or groups of components, but do not preclude the presence or addition of one or more other features, elements, components, or groups of components.
In the embodiments of the present application, the singular forms "a", "an" and "the" include plural referents and should be construed broadly to mean "one" or "one type" rather than being limited to "only one"; further, the term "comprising" should be understood to include both the singular and the plural, unless the context clearly indicates otherwise. Furthermore, the term "according to" should be understood as "at least partially according to", and the term "based on" should be understood as "based at least partially on", unless the context clearly indicates otherwise.
In the embodiments of the present application, the coordinates of each object in the electronic map coordinate system reflect the position of that object in the real world. The video data obtained by the camera correspond to a video coordinate system, in which pixel coordinates give the positions of targets; the radar data obtained by the radar correspond to a radar coordinate system, whose origin is the position of the radar. The information perceived by the cameras and radars is ultimately identified as geographic position information on the high-precision electronic map and can be provided to nearby traffic participants. Therefore, the speed data measured by the radar and by the video need to be converted into the electronic map coordinate system.
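For planar velocities, converting from a sensor's local coordinate system into the map coordinate system amounts to a rotation by the sensor's known orientation. The sketch below illustrates that step under this assumption; `to_map_frame` and `sensor_heading` are illustrative names, not terms from the patent:

```python
import math

def to_map_frame(vx, vy, sensor_heading):
    """Rotate a velocity vector (vx, vy) expressed in a sensor's
    local frame into the electronic map frame, assuming the sensor
    frame is rotated by sensor_heading (radians) relative to the
    map frame."""
    c, s = math.cos(sensor_heading), math.sin(sensor_heading)
    return (c * vx - s * vy, s * vx + c * vy)
```

A sensor facing 90 degrees off the map's x-axis reports a velocity of (1, 0) for a target that, in map coordinates, is moving along (0, 1).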
Various implementations of the examples of the present application are described below with reference to the accompanying drawings. These embodiments are merely illustrative and are not limiting of the examples of the present application.
Example of the first aspect
An embodiment of the present application provides a speed fusion method. Fig. 1 is a schematic diagram of the speed fusion method of this embodiment. The method is applied to a traffic perception system that includes a first number of groups of sensors, each group including a camera and/or a radar deployed at the same location, with different groups deployed at different locations. Referring to Fig. 1, the method includes:
101, determining the speed of a target object corresponding to each group of sensors, wherein, when both the radar and the camera detect the target object and a first speed determined based on video data obtained by the camera is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group of sensors in which the radar and the camera are located; when the first speed is less than or equal to the threshold, the speed average, i.e., the average of the radial speed and the first speed, is taken as the speed of the target object corresponding to that group; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the group of sensors in which the radar or the camera is located;
102, when the speeds of the target object corresponding to a second number of the first number of groups of sensors are speed averages or radial speeds, determining the speed of the target object in the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and a first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, wherein the first number and the second number are greater than or equal to 2, and the second number is less than or equal to the first number.
Therefore, for each group of sensors, the speed of the target object corresponding to that group is determined according to the type of sensor that detected the speed and a preset threshold. When the speed of the target object corresponding to a group of sensors is determined based on the radar, the speed of the target object in the electronic map system is determined according to the speeds of the target object corresponding to the groups of sensors and the included angle between their speed directions. This avoids the problems that the camera detects speed with low accuracy when the target object moves too fast and that the real speed of the target object cannot be obtained when it is detected only by the radar, and improves the accuracy of the electronic map.
In some embodiments, the traffic perception system includes two types of sensors, cameras and radars (e.g., microwave radars). The cameras and radars may be deployed above or beside a road. The sensors deployed at one location form a group; each group may include both a camera and a radar, or only one type of sensor, which is not limited in this embodiment. The perception system includes multiple groups of sensors deployed at different locations. The detection range of the traffic perception system may be set according to actual needs, for example, a certain range centered at an intersection, with one group of sensors deployed in the north-south direction of the intersection and another group in the east-west direction, and so on, which is not exemplified further here. The targets detected by the traffic perception system may include various types of motor vehicles, non-motor vehicles and pedestrians.
In some embodiments, the camera obtains surveillance video of the road as video data, while the radar transmits radar signals toward a preset area of the road, receives the reflected signals, and obtains radar data from them. When the radar and the camera detect objects on the road, the data are recorded frame by frame. Each frame of radar data (hereinafter, a radar frame) contains second target measurement data of the targets detected by the radar, such as speed and position; each frame of the video detection result (hereinafter, a video frame) contains first target measurement data of the targets detected by the camera, such as speed and position. The second target measurement data include the radial speed v_r of the target object in the radar coordinate system, where "radial" refers to the direction of the line connecting the radar and the target object, either toward or away from the radar. The first speed in the first target measurement data in the video coordinate system is v_i, and the direction of the first speed is the real speed direction of the target object. For the implementation of obtaining the radial speed from radar data and the first speed from video data, reference may be made to the prior art, which is not described here.
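As a minimal illustration of why a radial speed alone cannot reflect the real speed, the radar only observes the projection of the true velocity onto the radar-target line; the function name below is illustrative:

```python
import math

def radial_speed(true_speed, angle):
    """Radial speed seen by a radar: the projection of the true
    velocity onto the line joining radar and target, where `angle`
    (radians) is between the true velocity and that line."""
    return true_speed * math.cos(angle)
```

A target crossing perpendicular to the line of sight has a radial speed near zero even at a high true speed, which is why the method below fuses the measurements of at least two sensor groups.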
In some embodiments, determining the speed of the target object corresponding to each of the first number of groups of sensors means fusing, within each group, the speed data obtained by its sensors; the speed of the target object corresponding to a group of sensors is the speed fusion result of that group, as described case by case below.
For example, when a group of sensors includes only one type of sensor and the sensor detects a target object, the speed of the target object detected by the sensor is taken as the speed of the target object corresponding to the group of sensors. For example, when a group of sensors includes only radar and the radar detects a target object, a radial velocity detected by the radar is taken as a velocity of the target object corresponding to the group of sensors, and when a group of sensors includes only a camera and the camera detects the target object, a first velocity detected by the camera is taken as a velocity of the target object corresponding to the group of sensors.
For example, when a group of sensors includes two types of sensors, but only one type of sensor detects a target object, the speed of the target object detected by the one type of sensor is taken as the speed of the target object corresponding to the group of sensors. For example, when a group of sensors includes a radar and a camera, and only the radar detects a target object, the radial velocity detected by the radar is used as the velocity of the target object corresponding to the group of sensors, and when only the camera detects the target object, the first velocity detected by the camera is used as the velocity of the target object corresponding to the group of sensors.
For example, a group of sensors may include both types of sensors, with both detecting the target object. Since the first speed detected by the camera has low accuracy when the target object moves too fast, this embodiment decides which type of sensor determines the speed corresponding to the group by comparing the first speed with a threshold. When the first speed is greater than the threshold, its accuracy is low, so the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group. When the first speed is less than or equal to the threshold, the speed average, i.e., the average of the radial speed and the first speed, is taken as the magnitude of the speed of the target object corresponding to the group (the direction of that speed may be the same as that of the radial speed), so as to avoid the detection error that a single sensor might introduce. The threshold may be set as needed and is not limited in this embodiment.
Fig. 2 is a schematic diagram of a method for determining a speed of a target object corresponding to a set of sensors according to this embodiment, as shown in fig. 2, the method includes:
201, judging whether only a radar or a camera detects a target object in the group, executing 202 when the judging result is yes, otherwise executing 205;
202, judging whether a radar detects a target object, executing 203 when the judging result is yes, otherwise executing 204;
203, taking the radial speed detected by the radar as the speed of a target object corresponding to the group of sensors;
204, taking the first speed detected by the camera as the speed of the target object corresponding to the group of sensors;
205, when the radar and the camera detect the target object, judging whether the first speed is less than or equal to a threshold value, executing 206 when the judgment result is yes, otherwise executing 207;
206, taking the average value of the speeds as the speeds of the target objects corresponding to the group of sensors;
207, taking the radial velocity detected by the radar as the velocity of the target object corresponding to the set of sensors.
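The per-group decision of steps 201-207 can be sketched as follows; `group_speed` and its arguments are hypothetical names, with `None` standing for "this sensor did not detect the target object":

```python
def group_speed(radar_radial, camera_speed, threshold):
    """In-group speed fusion for one group of sensors.
    Mirrors the flowchart branches: radar-only (203), camera-only
    (204), average when the camera speed is trustworthy (206),
    radar radial speed otherwise (207)."""
    if radar_radial is None and camera_speed is None:
        return None                                   # no detection
    if camera_speed is None:
        return radar_radial                           # step 203
    if radar_radial is None:
        return camera_speed                           # step 204
    if camera_speed <= threshold:
        return (radar_radial + camera_speed) / 2.0    # step 206
    return radar_radial                               # step 207
```

For instance, with a threshold of 10, a radar radial speed of 8 and a camera speed of 6 fuse to their average 7, while a camera speed of 12 exceeds the threshold and the radar value 8 is kept.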
In some embodiments, when the speed of the target object corresponding to a group of sensors is determined based on the radar (i.e., the speed is the radial speed of the radar, or the average of the radial speed and the first speed), the direction of the radial speed may deviate from the real speed direction of the target object by a certain angle, so the radial speed may not reflect the real magnitude and direction of the speed of the target object. This embodiment therefore performs a second fusion of the radar-based speeds of the target object corresponding to the respective groups of sensors, that is, the real speed of the target object is determined according to the radar-based speeds of the target object corresponding to at least two groups of sensors (hereinafter, a second number of groups of sensors).
In some embodiments, at 102, the speed of the target object in the electronic map system is determined according to the speeds of the target object corresponding to the second number of groups of sensors and the first included angle between the speed directions of the target object corresponding to those groups.
In some embodiments, when the second number equals 2, a speed component of the speed corresponding to one group of sensors along the speed direction of the target object is determined according to the speeds of the target object corresponding to the two groups of sensors and the first included angle between their speed directions, and this speed component is taken as the speed of the target object in the electronic map system, i.e., the real speed.
For example, the following formula 1 may be used to determine the speed of the target object in the electronic map system:

V = | v_1 / cos(arctan(cot β - v_2 / (v_1 sin β))) |   (formula 1)

where v_1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, v_2 is the speed of the target object corresponding to the other group of sensors, and V is the speed of the target object in the electronic map system.
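Formula 1 can be sketched directly; `fused_speed` is a hypothetical name, and β is in radians:

```python
import math

def fused_speed(v1, v2, beta):
    """Formula 1: magnitude of the real speed recovered from two
    radial speeds v1, v2 whose directions differ by the first
    included angle beta (radians)."""
    t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
    alpha1 = math.atan(t)  # second included angle between v and v1
    return abs(v1 / math.cos(alpha1))
```

For a target with real speed 10 whose velocity makes angles of 0.3 and 0.8 rad with the two radial directions (so β = 0.5), the radial speeds are the projections 10·cos(0.3) and 10·cos(0.8), and the formula recovers 10.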
In some embodiments, when the second number is greater than 2, for every two groups among the second number of groups of sensors, a speed component of the speed corresponding to one group along the speed direction of the target object is determined according to the speeds of the target object corresponding to the two groups and the first included angle between their speed directions; the average of the speed components obtained for all pairs of groups is then taken as the speed of the target object in the electronic map system.
For example, when the second number equals 3, formula 1 above is applied to the first and second groups, the first and third groups, and the second and third groups, respectively, to obtain velocity components V1, V2 and V3; the average (V1 + V2 + V3)/3 is then taken as the magnitude of the speed of the target object in the electronic map system.
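The pairwise averaging can be sketched as below; the dictionary shapes of `radial_speeds` and `pair_angles`, and the function names, are illustrative assumptions rather than structures specified by the patent:

```python
import math
from itertools import combinations

def pair_speed(v1, v2, beta):
    """Formula 1 applied to one pair of sensor groups."""
    t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
    return abs(v1 / math.cos(math.atan(t)))

def fused_speed_avg(radial_speeds, pair_angles):
    """Average the pairwise formula-1 estimates over all pairs.
    radial_speeds: {group_id: radar-based speed}
    pair_angles: {(i, j): first included angle beta, with i < j}"""
    vals = [pair_speed(radial_speeds[i], radial_speeds[j], pair_angles[(i, j)])
            for i, j in combinations(sorted(radial_speeds), 2)]
    return sum(vals) / len(vals)
```

With three groups seeing a target of real speed 10 at angles 0.2, 0.5 and 0.9 rad from its velocity, every pair estimates 10, so the average is 10 as well.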
In some embodiments, the method may further include:
103, when the speeds of the target object corresponding to a second number of the first number of groups of sensors are speed averages or radial speeds, determining the speed direction of the target object in the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors, the first included angle between the speed directions of the target object corresponding to those groups, and the relative positional relationship between the target object and the radar.
In some embodiments, when the second number equals 2, a second included angle between the speed direction of the target object in the electronic map system and the direction of the speed corresponding to one of the two groups of sensors is determined according to the speeds of the target object corresponding to the two groups of sensors and the first included angle; the speed direction of the target object in the electronic map system is then determined according to the positional relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
For example, the second included angle α_1 is determined using the following formula 2:

α_1 = | arctan(cot β - v_2 / (v_1 sin β)) |   (formula 2)

where v_1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, and v_2 is the speed of the target object corresponding to the other group of sensors; the included angle between the real speed and a radial speed is assumed to lie in (0°, 90°).
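Formula 2, together with the sign test on cot β - v_2/(v_1 sin β) described in this embodiment, can be sketched as follows; `second_angle` is a hypothetical name:

```python
import math

def second_angle(v1, v2, beta):
    """Formula 2: the second included angle alpha_1 between the real
    velocity and the direction of v1, plus a flag telling whether
    v1 and v2 lie on the same side of the real velocity (the
    Fig. 3A geometry) or on opposite sides (Fig. 3B)."""
    t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
    return abs(math.atan(t)), t > 0  # t > 0 -> Fig. 3A geometry
```

With real speed 10 and radial directions at 0.3 and 0.8 rad on the same side of the velocity (β = 0.5), the function returns α_1 = 0.3 and the same-side flag; moving the second radial direction to 0.2 rad on the other side (β still 0.5) returns α_1 = 0.3 with the opposite-side flag.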
Fig. 3A and Fig. 3B are schematic diagrams of the positional relationship between the first included angle and the second included angle in this embodiment. The speeds of the target object corresponding to the two groups of sensors are the radar-detected radial speeds v1 and v2, respectively; the included angle between v1 and v2 is β, and the true velocity of the target object is v. The included angle between v and v1 is the second included angle α1, and the included angle between v and v2 is the second included angle α2. There are two possible relationships among v, v1 and v2: as shown in Fig. 3A, v1 and v2 are both on the same side of v, and β = α2 - α1; as shown in Fig. 3B, v1 and v2 are on opposite sides of v, and β = α2 + α1.
In some embodiments, if cot β - v2/(v1 sin β) is positive, the first included angle and the second included angle are in the relationship shown in Fig. 3A; if cot β - v2/(v1 sin β) is negative, they are in the relationship shown in Fig. 3B. The positional relationship between the first included angle and the second included angle, and hence the direction of the true velocity of the target object, can therefore be determined from the sign of cot β - v2/(v1 sin β).
In some embodiments, only the offset of the target object's velocity direction relative to one radial direction can be determined from the positional relationship between the first and second included angles and the magnitude of the second included angle; that is, the speed direction of the target object under the electronic map system is offset by the second included angle from the direction of the speed corresponding to one group of sensors, the side of the offset being given by the sign of cot β - v2/(v1 sin β). The speed direction of the target object under the electronic map system (i.e., the true speed direction, or the absolute speed direction in the electronic map coordinate system) is then determined by further combining the relative positional relationship between the target object and the radar, as described below with reference to Fig. 4.
Fig. 4 is a schematic diagram of the true velocity direction of the target object. Fig. 4 shows an electronic map coordinate system in which the positive direction of the Y-axis represents due north in the real world, the negative direction of the Y-axis due south, the positive direction of the X-axis due east, and the negative direction of the X-axis due west; this is only an example, and the present embodiment is not limited thereto. The true velocity direction of the target object 1 is v1, where the speed corresponding to one group of sensors is the radial speed detected by the radar, denoted v11, and the second included angle between v1 and v11 is α11; the relative positional relationship between the target object 1 and the radar is that the target object lies in the southwest direction r1 of the radar. The true velocity direction of the target object 2 is v2, where the speed corresponding to one group of sensors is the radial speed detected by the radar, denoted v12, and the second included angle between v2 and v12 is α12; the relative positional relationship between the target object 2 and the radar is that the target object lies in the southeast direction r2 of the radar. The true velocity direction of the target object 1 is therefore (180° - α11 - r1) west of due north, and the true velocity direction of the target object 2 is (180° - α12 - r2) east of due north.
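The second-angle formula together with the sign test of cot β - v2/(v1 sin β) described above can be sketched as follows (a hedged sketch: angles are in radians, the arctan form follows the speed formula stated later in the text, and the function name and return encoding are illustrative):

```python
import math

def second_angle_and_case(v1, v2, beta):
    """Second included angle between the true velocity and radial
    speed v1, for two radial speeds v1, v2 whose directions differ by
    the first included angle beta (radians).  The sign of t selects
    the geometry: t > 0 -> Fig. 3A (v1 and v2 on the same side of v),
    t < 0 -> Fig. 3B (v1 and v2 on opposite sides of v)."""
    t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
    alpha1 = abs(math.atan(t))
    return alpha1, ("Fig. 3A" if t > 0 else "Fig. 3B")
```

For radial speeds that are exact projections of one true velocity, the function recovers the true second angle and the correct figure case.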
In some embodiments, when the second number is greater than 2, for every two groups of sensors among the second number of groups, a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the two groups is determined according to formula 2) above, and the average of the second included angles corresponding to all pairs is calculated; the speed direction of the target object under the electronic map system is then determined according to the positional relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar. For the specific processing of calculating the second included angle and determining the speed direction of the target object under the electronic map system, reference may be made to the implementation in which the second number is equal to 2, and the repetition is not described again.
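The pairwise averaging of second included angles described above can be sketched as follows (a minimal sketch; the radial speeds and pairwise first included angles are hypothetical values, not from the patent, and angles are in radians):

```python
import math

def second_angle(v_a, v_b, beta):
    # Formula 2): second included angle for one pair of radial speeds
    return abs(math.atan(1.0 / math.tan(beta) - v_b / (v_a * math.sin(beta))))

# Hypothetical radial speeds (m/s) and pairwise first included angles
# (radians) for three sensor groups; illustrative values only.
pairs = [(10.0, 8.0, math.radians(40)),   # groups 1 and 2
         (10.0, 9.5, math.radians(70)),   # groups 1 and 3
         (8.0, 9.5, math.radians(30))]    # groups 2 and 3
avg_second_angle = sum(second_angle(a, b, t) for a, b, t in pairs) / len(pairs)
```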
In some embodiments, the speed and direction of the target object under the electronic map system can be output to the electronic map for dynamic display as a fusion perception result.
It should be noted that fig. 1 above is only a schematic illustration of an embodiment of the present application, but the present application is not limited thereto. For example, the order of execution among the operations may be appropriately adjusted, and other operations may be added or some of the operations may be reduced. Those skilled in the art can make appropriate modifications in light of the above, and are not limited to the description of fig. 1 above.
As can be seen from the above embodiments, for each group of sensors, the speed of the target object corresponding to that group is determined according to the type of sensor that detected the speed and a predetermined threshold; and when the speeds of the target object corresponding to the groups of sensors are determined based on radar, the speed of the target object under the electronic map system is determined from those speeds and the included angle between their directions. This avoids both the low accuracy of the camera-detected speed when the target object moves too fast and the inability to obtain the true speed of the target object when it is detected only by a radar, thereby improving the accuracy of the electronic map.
Embodiments of the second aspect
The embodiment of the application provides a speed fusion device. Since the principle of solving the problem by this device is similar to that of the embodiment of the first aspect, the specific implementation thereof may refer to the embodiment of the first aspect, and the description will not be repeated here.
Fig. 5 is a schematic diagram of a speed fusion apparatus according to an embodiment of the present application, applied to a traffic perception system, where the system includes a first number of groups of sensors, each group of sensors includes cameras and/or radars deployed at the same location, and different groups of sensors are deployed at different locations. As shown in fig. 5, the speed fusion apparatus 500 includes:
a first determining unit 501, configured to determine the speed of the target object corresponding to each group of sensors, wherein, for each group of sensors: when both the radar and the camera detect the target object and a first speed determined based on the video data obtained by the camera is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group in which the radar and the camera are located; when the first speed is less than or equal to the threshold, a speed average, i.e. the average of the radial speed and the first speed, is taken as the speed of the target object corresponding to that group; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the group in which the radar or the camera is located;
a second determining unit 502, configured to determine, when the speeds of the target object corresponding to a second number of groups among the first number of groups of sensors are speed averages or radial speeds, the speed magnitude of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and the first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, wherein the second number is greater than or equal to 2 and less than or equal to the first number.
In some embodiments, the apparatus may optionally further comprise:
and a third determining unit 503, configured to determine, when the speeds of the target object corresponding to a second number of groups among the first number of groups of sensors are speed averages or radial speeds, the speed direction of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors, the first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, and the relative positional relationship between the target object and the radar.
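The per-group rule implemented by the first determining unit 501 can be sketched as follows (a hedged sketch only: the function name, the use of `None` for "not detected", and the threshold value are illustrative assumptions, not from the patent):

```python
def speed_for_group(radar_radial, camera_speed, threshold):
    """Select the speed of the target object for one sensor group.

    radar_radial / camera_speed are None when the corresponding sensor
    did not detect the target.  Returns (speed, kind), where kind
    records which rule fired: "radial", "average", or "first"."""
    if radar_radial is not None and camera_speed is not None:
        if camera_speed > threshold:
            # camera speed is unreliable above the threshold: use radar only
            return radar_radial, "radial"
        # otherwise take the average of the radial speed and the first speed
        return (radar_radial + camera_speed) / 2.0, "average"
    if radar_radial is not None:
        return radar_radial, "radial"   # only the radar saw the target
    return camera_speed, "first"        # only the camera saw the target
```

Only groups whose result is "radial" or "average" then feed the second and third determining units.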
In some embodiments, when the second number is equal to 2, the third determining unit 503 determines a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the sensors according to the speed of the target object corresponding to the second number of sensors and the first included angle; and determining the speed direction of the target object under the electronic map system according to the position relation between the first included angle and the second included angle and the relative position relation between the target object and the radar.
In some embodiments, when the second number is greater than 2, for each two sets of sensors in the second number set of sensors, the third determining unit 503 determines, according to the magnitude of the speed corresponding to each two sets of sensors and the first included angle of the speed direction of the target object corresponding to the two sets of sensors, a second included angle of the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one set of sensors, and calculates an average value of the second included angles corresponding to each two sets of sensors; and determining the speed direction of the target object under the electronic map system according to the position relation between the first included angle and the second included angle and the relative position relation between the target object and the radar.
In some embodiments, the third determining unit 503 determines the second included angle α1 using the following formula: α1 = |arctan(cot β - v2/(v1 sin β))|;
where v1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, and v2 is the speed of the target object corresponding to the other group of sensors.
In some embodiments, when the second number is greater than 2, for each two sets of sensors in the second number set of sensors, the second determining unit 502 determines a speed component of a speed corresponding to one set of sensors in the speed direction of the target object according to the speed of the target object corresponding to each two sets of sensors and a first included angle of the speed direction of the target object corresponding to the two sets of sensors, and calculates an average value of the speed components corresponding to each two sets of sensors, and uses the average value as the speed of the target object under the electronic map system.
In some embodiments, when the second number is equal to 2, the second determining unit 502 determines the speed of the target object under the electronic map system using the following formula:
V = |v1/cos(arctan(cot β - v2/(v1 sin β)))|;
where v1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, v2 is the speed of the target object corresponding to the other group of sensors, and V is the speed magnitude of the target object under the electronic map system.
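The formula above evaluates directly; a minimal sketch (angles in radians, function name illustrative):

```python
import math

def map_speed(v1, v2, beta):
    """V = |v1 / cos(arctan(cot(beta) - v2 / (v1 * sin(beta))))|:
    speed magnitude under the electronic map system from two radial
    speeds v1, v2 separated by the first included angle beta."""
    return abs(v1 / math.cos(math.atan(1.0 / math.tan(beta)
                                       - v2 / (v1 * math.sin(beta)))))
```

When v1 and v2 are exact radial projections of one true velocity, the function recovers that velocity's magnitude.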
The implementation manners of the first determining unit 501, the second determining unit 502, and the third determining unit 503 may refer to steps 101-103 of the embodiment of the first aspect, and are not described herein again.
Therefore, for each group of sensors, the speed of the target object corresponding to that group is determined according to the type of sensor that detected the speed and a predetermined threshold; and when the speeds of the target object corresponding to the groups of sensors are determined based on radar, the speed of the target object under the electronic map system is determined from those speeds and the included angle between their directions. This avoids both the low accuracy of the camera-detected speed when the target object moves too fast and the inability to obtain the true speed of the target object when it is detected only by a radar, thereby improving the accuracy of the electronic map.
Embodiments of the third aspect
Embodiments of the present application provide a data processing device, which may be, for example, a computer, a server, a workstation, a laptop, a smart phone, etc.; embodiments of the present application are not so limited.
Fig. 6 is a schematic diagram of a data processing apparatus according to an embodiment of the present application. As shown in fig. 6, a data processing apparatus 600 of an embodiment of the present application may include: at least one interface (not shown in fig. 6), a processor (e.g., a central processing unit (CPU)) 601, and a memory 602 coupled to the processor 601. The memory 602 may store various data; further, it stores a speed fusion program 603, which is executed under the control of the processor 601, as well as various preset values, predetermined conditions, and the like.
In one embodiment, the functions of the speed fusion apparatus 500 according to the embodiment of the second aspect may be integrated into the processor 601 to implement the speed fusion method according to the embodiment of the first aspect, applied to a traffic perception system that includes a first number of groups of sensors, each group including cameras and/or radars deployed at the same location, with different groups deployed at different locations. For example, the processor 601 may be configured to: determine the speed of the target object corresponding to each group of sensors, wherein, for each group, when both the radar and the camera detect the target object and a first speed determined based on the video data obtained by the camera is greater than a threshold, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the group in which the radar and the camera are located; when the first speed is less than or equal to the threshold, a speed average, i.e. the average of the radial speed and the first speed, is taken as the speed of the target object corresponding to that group; and when only the radar or only the camera detects the target object, the radial speed or the first speed is taken as the speed of the target object corresponding to the group in which the radar or the camera is located; and, when the speeds of the target object corresponding to a second number of groups among the first number of groups of sensors are speed averages or radial speeds, determine the speed magnitude of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors and the first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, wherein the first number and the second number are greater than or equal to 2, and the second number is less than or equal to the first number.
For example, the processor 601 may be further configured to: when the speeds of the target object corresponding to a second number of groups among the first number of groups of sensors are speed averages or radial speeds, determine the speed direction of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of groups of sensors, the first included angle between the speed directions of the target object corresponding to the second number of groups of sensors, and the relative positional relationship between the target object and the radar.
In some embodiments, the implementation manner of the processor 601 may refer to the embodiment of the first aspect, which is not described herein.
In another embodiment, the speed fusion apparatus 500 according to the embodiment of the second aspect may be configured separately from the processor 601, for example, the speed fusion apparatus 500 may be configured as a chip connected to the processor 601, and the functions of the speed fusion apparatus 500 are implemented by the control of the processor 601.
Notably, the data processing apparatus 600 may also include a display 605 and an I/O device 604, or may not necessarily include all of the components shown in fig. 6; in addition, the data processing apparatus 600 may further include components not shown in fig. 6, for which reference may be made to the related art.
In embodiments of the present application, processor 601, also sometimes referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device, which processor 601 receives inputs and controls the operation of the various components of data processing apparatus 600.
In an embodiment of the present application, the memory 602 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable medium, a volatile memory, a non-volatile memory, or other suitable device. It may store various kinds of information, and may further store programs for processing such information; the processor 601 may execute the programs stored in the memory 602 to realize information storage, processing, and the like. The functions of the other components are similar to those of the prior art and will not be described in detail here. The various components of the data processing apparatus 600 may be implemented by dedicated hardware, firmware, software, or combinations thereof without departing from the scope of the present application.
Therefore, for each group of sensors, the speed of the target object corresponding to that group is determined according to the type of sensor that detected the speed and a predetermined threshold; and when the speeds of the target object corresponding to the groups of sensors are determined based on radar, the speed of the target object under the electronic map system is determined from those speeds and the included angle between their directions. This avoids both the low accuracy of the camera-detected speed when the target object moves too fast and the inability to obtain the true speed of the target object when it is detected only by a radar, thereby improving the accuracy of the electronic map.
The present embodiments also provide a computer readable program, wherein the program, when executed in a data processing apparatus, causes the data processing apparatus to perform the speed fusion method according to the first aspect of the embodiments.
The present embodiment also provides a storage medium storing a computer readable program, wherein the computer readable program causes a data processing apparatus to execute the speed fusion method according to the first aspect of the embodiment.
The apparatus and method of the present application may be implemented by hardware, or may be implemented by hardware in combination with software. The present application relates to a computer readable program which, when executed by a logic means, enables the logic means to carry out the apparatus or constituent means described above, or enables the logic means to carry out the various methods or steps described above. The present application also relates to a storage medium such as a hard disk, a magnetic disk, an optical disk, a DVD, a flash memory, or the like for storing the above program.
The methods/apparatus described in connection with the embodiments of the present application may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. One or more of the functional blocks shown in the figures and/or one or more combinations of the functional blocks may correspond to software modules or hardware modules of the computer program flow. These software modules may correspond to the individual steps shown in the figures, respectively. These hardware modules may be implemented, for example, by solidifying the software modules using a Field Programmable Gate Array (FPGA).
A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium; or the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The software modules may be stored in the memory of the mobile terminal or in a memory card that is insertable into the mobile terminal. For example, if the apparatus (e.g., mobile terminal) employs a MEGA-SIM card of a relatively large capacity or a flash memory device of a large capacity, the software module may be stored in the MEGA-SIM card or the flash memory device of a large capacity.
One or more of the functional blocks described in the figures and/or one or more combinations of functional blocks may be implemented as a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof for performing the functions described herein. One or more of the functional blocks described with respect to the figures and/or one or more combinations of functional blocks may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in communication with a DSP, or any other such configuration.
The present application has been described in connection with specific embodiments, but it should be apparent to those skilled in the art that these descriptions are intended to be illustrative and not limiting. Various modifications and alterations of this application may occur to those skilled in the art in light of the spirit and principles of this application, and are to be seen as within the scope of this application.
Regarding the above-described implementations disclosed in the examples of the present application, the following supplementary notes are also disclosed:
1. a speed fusion device for use in a traffic perception system, the system comprising a first number of sets of sensors, the first number being greater than or equal to 2, each set of sensors comprising radar and/or cameras deployed in a common location, the different sets of sensors being deployed in different locations, the device comprising:
a first determining unit configured to determine a speed of a target object corresponding to each set of sensors, wherein, for each set of sensors, when a radar and a camera both detect the target object and a first speed determined based on video data obtained by the camera is greater than a threshold, a radial speed of the target object detected by the radar is taken as a speed of the target object corresponding to a sensor of the set where the radar and the camera are located, and when the first speed is less than or equal to the threshold, a speed average value is taken as a speed of the target object corresponding to a sensor of the set where the radar and the camera are located, the speed average value being an average value of the radial speed and the first speed; when only a radar or only a camera detects the target object, taking the radial speed or the first speed as the speed of the target object corresponding to the sensor of the group where the radar or the camera is located;
And the second determining unit is used for determining the speed of the target object under the electronic map system according to the speed of the target object corresponding to the second number group of sensors and the first included angle of the speed direction of the target object corresponding to the second number group of sensors when the speed of the target object corresponding to the second number group of sensors is a speed average value or a radial speed, wherein the second number is more than or equal to 2 and less than or equal to the first number.
2. The apparatus of appendix 1, wherein the apparatus further comprises:
and a third determining unit, configured to determine, when the speed of the target object corresponding to the second number of sensors in the first number of sensors is a speed average value or a radial speed, a speed direction of the target object under the electronic map system according to the speed of the target object corresponding to the second number of sensors, a first included angle of the speed direction of the target object corresponding to the second number of sensors, and a relative positional relationship between the target object and the radar.
3. The apparatus according to supplementary note 2, wherein when the second number is equal to 2, the third determining unit determines a second included angle between a speed direction of the target object under the electronic map system and a direction of a speed corresponding to one of the sets of sensors according to a magnitude of the speed of the target object corresponding to the second number of sets of sensors and the first included angle; and determining the speed direction of the target object under an electronic map system according to the position relation between the first included angle and the second included angle and the relative position relation between the target object and the radar.
4. The apparatus according to supplementary note 1, wherein when the second number is greater than 2, the second determining unit determines, for each two sets of sensors in the second number of sets of sensors, a velocity component of a velocity corresponding to one set of sensors in a velocity direction of the target object according to a magnitude of a velocity of the target object corresponding to each two sets of sensors and a first angle of a velocity direction of the target object corresponding to the two sets of sensors, respectively, and calculates an average value of the velocity components corresponding to each two sets of sensors, taking the average value as a velocity magnitude of the target object under an electronic map system.
5. The apparatus according to supplementary note 2, wherein when the second number is greater than 2, the third determining unit determines, for each two sets of sensors in the second number of sets of sensors, a second angle between a speed direction of the target object under the electronic map system and a direction of a speed corresponding to one of the sets of sensors, based on a magnitude of the speed corresponding to each two sets of sensors and a first angle between the speed direction of the target object corresponding to the two sets of sensors, respectively, and calculates an average value of the second angles corresponding to each two sets of sensors; and determining the speed direction of the target object under an electronic map system according to the position relation between the first included angle and the second included angle and the relative position relation between the target object and the radar.
6. The apparatus according to supplementary note 1, wherein when the second number is equal to 2, the second determining unit determines the speed magnitude of the target object under the electronic map system using the following formula:
V = |v1/cos(arctan(cot β - v2/(v1 sin β)))|;
where v1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, v2 is the speed of the target object corresponding to the other group of sensors, and V is the speed magnitude of the target object under the electronic map system.
7. The apparatus according to supplementary note 3 or 5, wherein the third determining unit determines the second included angle α1 using the following formula: α1 = |arctan(cot β - v2/(v1 sin β))|;
where v1 is the speed of the target object corresponding to one group of sensors, β is the first included angle, and v2 is the speed of the target object corresponding to the other group of sensors.
8. A speed fusion method applied to a traffic perception system, the system comprising a first number of groups of sensors, each group of sensors comprising cameras and/or radars deployed in the same location, the locations where different groups of sensors are deployed being different, the method comprising:
determining the speed of a target object corresponding to each group of sensors, wherein when the radar and the camera detect the target object and a first speed determined based on video data obtained by the camera is larger than a threshold value, the radial speed of the target object detected by the radar is used as the speed of the sensor of the group of the radar and the camera corresponding to the target object, and when the first speed is smaller than or equal to the threshold value, a speed average value is used as the speed of the target object corresponding to the sensor of the group of the radar and the camera, and the speed average value is the average value of the radial speed and the first speed; when only a radar or only a camera detects the target object, taking the radial speed or the first speed as the speed of the target object corresponding to the sensor of the group where the radar or the camera is located;
And when the speed of the target object corresponding to the second number of the first number of the sets of sensors is a speed average value or a radial speed, determining the speed of the target object under an electronic map system according to the speed of the target object corresponding to the second number of the sets of sensors and a first included angle of the speed direction of the target object corresponding to the second number of the sets of sensors, wherein the second number is more than or equal to 2 and less than or equal to the first number.
9. The method of supplementary note 8, wherein the method further comprises:
and when the speed of the target object corresponding to the second number of the first number of the sensors is a speed average value or a radial speed, determining the speed direction of the target object under an electronic map system according to the speed of the target object corresponding to the second number of the sensors, a first included angle of the speed direction of the target object corresponding to the second number of the sensors and the relative position relation between the target object and the radar.
10. The method according to supplementary note 8, wherein when the second number is equal to 2, a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the sets of sensors is determined according to the magnitudes of the speeds of the target object corresponding to the second number of sets of sensors and the first included angle; and the speed direction of the target object under the electronic map system is determined according to the relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
11. The method according to supplementary note 8, wherein when the second number is greater than 2, for every two sets of sensors among the second number of sets of sensors, a velocity component of the velocity corresponding to one of the two sets of sensors in the velocity direction of the target object is determined according to the magnitudes of the velocities of the target object corresponding to the two sets of sensors and the first included angle between the velocity directions of the target object corresponding to the two sets of sensors, and the average of the velocity components over all such pairs is calculated and taken as the speed magnitude of the target object under the electronic map system.
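A minimal sketch of the pairwise averaging in supplementary note 11, assuming each set of sensors reports a speed magnitude together with the known direction along which that speed was measured (an assumption about the input representation, which the note does not spell out). Each pair of sets yields one candidate full-speed estimate via the two-set relation of supplementary note 13, and the candidates are averaged:

```python
import itertools
import math

def fuse_pairwise(readings):
    """readings: list of (speed, direction_in_radians) pairs, one per
    set of sensors.  For every pair of sets, recover a candidate speed
    magnitude of the target object with the two-set formula, then
    return the average of all candidates."""
    candidates = []
    for (v1, d1), (v2, d2) in itertools.combinations(readings, 2):
        beta = abs(d1 - d2)  # first included angle between the two speed directions
        t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
        candidates.append(abs(v1 / math.cos(math.atan(t))))
    return sum(candidates) / len(candidates)
```

With noise-free projections of a single true velocity, every pair recovers the same magnitude, so the average equals it; with noisy measurements the averaging smooths the per-pair estimates.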
12. The method according to supplementary note 9, wherein when the second number is greater than 2, for every two sets of sensors among the second number of sets of sensors, a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the two sets of sensors is determined according to the magnitudes of the speeds corresponding to the two sets of sensors and the first included angle between the speed directions of the target object corresponding to the two sets of sensors, and the average of the second included angles over all such pairs is calculated; and the speed direction of the target object under the electronic map system is determined according to the relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
13. The method according to supplementary note 8, wherein when the second number is equal to 2, the speed magnitude of the target object under the electronic map system is determined using the following formula:
V = |v1 / cos(arctan(cot β - v2 / (v1 sin β)))|;
where v1 is the speed of the target object corresponding to one set of sensors, β is the first included angle, v2 is the speed of the target object corresponding to the other set of sensors, and V is the speed of the target object under the electronic map system.
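The formula above can be checked numerically. The sketch below assumes β is the known angle between the two measured speed directions (in radians), and that the second included angle satisfies α1 = |arctan(cot β - v2/(v1 sin β))|, which is the value that makes V = |v1 / cos α1| consistent with the formula above:

```python
import math

def fuse_two(v1, v2, beta):
    """Return (V, alpha1): the speed magnitude of the target object under
    the electronic map system, and the second included angle between its
    speed direction and the direction of the speed corresponding to the
    first set of sensors."""
    t = 1.0 / math.tan(beta) - v2 / (v1 * math.sin(beta))
    alpha1 = abs(math.atan(t))            # second included angle
    V = abs(v1 / math.cos(math.atan(t)))  # fused speed magnitude
    return V, alpha1
```

The formula inverts the projection geometry: if the true velocity has magnitude V and makes angle α1 with the first measurement direction, then v1 = V cos α1 and v2 = V cos(β - α1), and eliminating α1 gives the expression above. For example, V = 10, α1 = 30°, β = 80° gives v1 ≈ 8.66 and v2 ≈ 6.43, from which fuse_two recovers V and α1.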
14. The method according to supplementary note 10, wherein the second included angle α1 is determined using the following formula: α1 = |arctan(cot β - v2 / (v1 sin β))|;
where v1 is the speed of the target object corresponding to one set of sensors, β is the first included angle, and v2 is the speed of the target object corresponding to the other set of sensors.
15. The method according to supplementary note 13, wherein the second included angle α1 is determined using the following formula: α1 = |arctan(cot β - v2 / (v1 sin β))|;
where v1 is the speed of the target object corresponding to one set of sensors, β is the first included angle, and v2 is the speed of the target object corresponding to the other set of sensors.
16. A storage medium storing a computer-readable program, wherein the computer-readable program causes the computer to execute the speed fusion method according to any one of supplementary notes 8 to 15.

Claims (10)

1. A speed fusion device for use in a traffic perception system, the system comprising a first number of sets of sensors, the first number being greater than or equal to 2, each set of sensors comprising a radar and/or a camera deployed at a common location, different sets of sensors being deployed at different locations, the device comprising:
a first determining unit configured to determine the speed of the target object corresponding to each set of sensors, wherein, for each set of sensors, when the set includes only one type of sensor and that sensor detects the target object, the speed of the target object detected by the sensor is taken as the speed of the target object corresponding to that set of sensors;
when a set of sensors includes two types of sensors: when both the radar and the camera detect the target object and a first speed determined based on the video data obtained by the camera is greater than a threshold value, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the set of sensors containing the radar and the camera; when the first speed is less than or equal to the threshold value, a speed average value, namely the average of the radial speed and the first speed, is taken as the speed of the target object corresponding to that set of sensors; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the set of sensors containing the radar or the camera;
and a second determining unit configured to determine, when the speed of the target object corresponding to a second number of sets of sensors among the first number of sets of sensors is a speed average value or a radial speed, the speed of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of sets of sensors and the first included angle between the speed directions of the target object corresponding to the second number of sets of sensors, wherein the second number is greater than or equal to 2 and less than or equal to the first number.
2. The apparatus of claim 1, wherein the apparatus further comprises:
and a third determining unit configured to determine, when the speed of the target object corresponding to a second number of sets of sensors among the first number of sets of sensors is a speed average value or a radial speed, the speed direction of the target object under the electronic map system according to the speeds of the target object corresponding to the second number of sets of sensors, the first included angle between the speed directions of the target object corresponding to the second number of sets of sensors, and the relative positional relationship between the target object and the radar.
3. The apparatus according to claim 2, wherein when the second number is equal to 2, the third determining unit determines a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the sets of sensors according to the magnitudes of the speeds of the target object corresponding to the second number of sets of sensors and the first included angle; and determines the speed direction of the target object under the electronic map system according to the relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
4. The apparatus according to claim 1, wherein when the second number is greater than 2, for every two sets of sensors among the second number of sets of sensors, the second determining unit determines a velocity component of the velocity corresponding to one of the two sets of sensors in the velocity direction of the target object according to the magnitudes of the velocities of the target object corresponding to the two sets of sensors and the first included angle between the velocity directions of the target object corresponding to the two sets of sensors, and calculates the average of the velocity components over all such pairs as the speed magnitude of the target object under the electronic map system.
5. The apparatus according to claim 2, wherein when the second number is greater than 2, for every two sets of sensors among the second number of sets of sensors, the third determining unit determines a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the two sets of sensors according to the magnitudes of the speeds corresponding to the two sets of sensors and the first included angle between the speed directions of the target object corresponding to the two sets of sensors, and calculates the average of the second included angles over all such pairs; and determines the speed direction of the target object under the electronic map system according to the relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
6. The apparatus according to claim 1, wherein when the second number is equal to 2, the second determining unit determines the speed magnitude of the target object under the electronic map system using the following formula:
V = |v1 / cos(arctan(cot β - v2 / (v1 sin β)))|;
where v1 is the speed of the target object corresponding to one set of sensors, β is the first included angle, v2 is the speed of the target object corresponding to the other set of sensors, and V is the speed of the target object under the electronic map system.
7. The apparatus according to claim 3 or 5, wherein the third determining unit determines the second included angle α1 using the following formula: α1 = |arctan(cot β - v2 / (v1 sin β))|;
where v1 is the speed of the target object corresponding to one set of sensors, β is the first included angle, and v2 is the speed of the target object corresponding to the other set of sensors.
8. A speed fusion method applied to a traffic perception system, the system comprising a first number of sets of sensors, each set of sensors comprising a camera and/or a radar deployed at a common location, different sets of sensors being deployed at different locations, the method comprising:
determining the speed of the target object corresponding to each set of sensors, wherein, when a set of sensors includes only one type of sensor and that sensor detects the target object, the speed of the target object detected by the sensor is taken as the speed of the target object corresponding to that set of sensors;
when a set of sensors includes two types of sensors: when both the radar and the camera detect the target object and a first speed determined based on the video data obtained by the camera is greater than a threshold value, the radial speed of the target object detected by the radar is taken as the speed of the target object corresponding to the set of sensors containing the radar and the camera; when the first speed is less than or equal to the threshold value, a speed average value, namely the average of the radial speed and the first speed, is taken as the speed of the target object corresponding to that set of sensors; and when only the radar or only the camera detects the target object, the radial speed or the first speed, respectively, is taken as the speed of the target object corresponding to the set of sensors containing the radar or the camera;
and when the speed of the target object corresponding to a second number of sets of sensors among the first number of sets of sensors is a speed average value or a radial speed, determining the speed of the target object under an electronic map system according to the speeds of the target object corresponding to the second number of sets of sensors and the first included angle between the speed directions of the target object corresponding to the second number of sets of sensors, wherein the second number is greater than or equal to 2 and less than or equal to the first number.
9. The method of claim 8, wherein the method further comprises:
and when the speed of the target object corresponding to a second number of sets of sensors among the first number of sets of sensors is a speed average value or a radial speed, determining the speed direction of the target object under an electronic map system according to the speeds of the target object corresponding to the second number of sets of sensors, the first included angle between the speed directions of the target object corresponding to the second number of sets of sensors, and the relative positional relationship between the target object and the radar.
10. The method of claim 8, wherein when the second number is equal to 2, a second included angle between the speed direction of the target object under the electronic map system and the direction of the speed corresponding to one of the sets of sensors is determined according to the magnitudes of the speeds of the target object corresponding to the second number of sets of sensors and the first included angle; and the speed direction of the target object under the electronic map system is determined according to the relationship between the first included angle and the second included angle and the relative positional relationship between the target object and the radar.
CN202011503615.8A 2020-12-18 2020-12-18 Speed fusion method and device Active CN114648871B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011503615.8A CN114648871B (en) 2020-12-18 2020-12-18 Speed fusion method and device
JP2021199666A JP2022097411A (en) 2020-12-18 2021-12-08 Speed fusion method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011503615.8A CN114648871B (en) 2020-12-18 2020-12-18 Speed fusion method and device

Publications (2)

Publication Number Publication Date
CN114648871A CN114648871A (en) 2022-06-21
CN114648871B true CN114648871B (en) 2024-01-02

Family

ID=81991129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011503615.8A Active CN114648871B (en) 2020-12-18 2020-12-18 Speed fusion method and device

Country Status (2)

Country Link
JP (1) JP2022097411A (en)
CN (1) CN114648871B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104134354A (en) * 2013-04-30 2014-11-05 业纳遥控设备有限公司 Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module
CN109255349A (en) * 2017-07-14 2019-01-22 富士通株式会社 Object detection method, device and image processing equipment
CN110033476A (en) * 2018-01-11 2019-07-19 富士通株式会社 Target velocity estimation method, device and image processing equipment
CN111856448A (en) * 2020-07-02 2020-10-30 山东省科学院海洋仪器仪表研究所 Marine obstacle identification method and system based on binocular vision and radar
CN112034450A (en) * 2019-06-03 2020-12-04 富士通株式会社 Article detection method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10094919B2 (en) * 2015-10-06 2018-10-09 GM Global Technology Operations LLC Radar-vision fusion for target velocity estimation
US10782395B2 (en) * 2017-12-20 2020-09-22 Nxp B.V. True velocity vector estimation using V2X

Also Published As

Publication number Publication date
JP2022097411A (en) 2022-06-30
CN114648871A (en) 2022-06-21

Similar Documents

Publication Publication Date Title
CN110174093B (en) Positioning method, device, equipment and computer readable storage medium
JP6522076B2 (en) Method, apparatus, storage medium and program product for lateral vehicle positioning
CN111507130B (en) Lane-level positioning method and system, computer equipment, vehicle and storage medium
CN114578343A (en) Data fusion method and device
CN109974734A (en) A kind of event report method, device, terminal and storage medium for AR navigation
US20140365109A1 (en) Apparatus and method for recognizing driving lane
US20200143175A1 (en) Scenario detection apparatus and method
LU502288B1 (en) Method and system for detecting position relation between vehicle and lane line, and storage medium
EP1590771A1 (en) Real-time obstacle detection with a calibrated camera and known ego-motion
US10832428B2 (en) Method and apparatus for estimating a range of a moving object
CN111025308A (en) Vehicle positioning method, device, system and storage medium
CN110940974B (en) Object detection device
CN114384505A (en) Method and device for determining radar deflection angle
Kim et al. External vehicle positioning system using multiple fish-eye surveillance cameras for indoor parking lots
JP2021135286A (en) Method for converting coordinates, device, and data processor
JP2018084960A (en) Self-position estimation method and self-position estimation device
CN114648871B (en) Speed fusion method and device
CN112902911B (en) Ranging method, device, equipment and storage medium based on monocular camera
CN111612812B (en) Target object detection method, detection device and electronic equipment
CN110784680B (en) Vehicle positioning method and device, vehicle and storage medium
CN111605481A (en) Congestion car following system and terminal based on look around
CN104931024B (en) Obstacle detector
CN110539748A (en) congestion car following system and terminal based on look around
JP6961125B2 (en) Vehicle posture measuring device and vehicle posture measuring method
CN113147746A (en) Method and device for detecting ramp parking space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant