CN115963485A - Speed detection method, device, equipment and readable storage medium - Google Patents
- Publication number
- CN115963485A (application number CN202111180088.6A)
- Authority
- CN
- China
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The application provides a speed detection method, apparatus, device and readable storage medium. The speed detection method includes: establishing a first correspondence, where the first correspondence represents the relationship among the lateral velocity, the longitudinal velocity and a velocity included angle of the detection target at the k-th moment, the velocity included angle being the angle between the lateral velocity and the movement velocity of the detection target at the k-th moment; establishing a second correspondence, where the second correspondence represents the relationship among the azimuth angle, the lateral velocity, the longitudinal velocity and the radial velocity of the detection target at the k-th moment; and determining the lateral velocity and the longitudinal velocity based on the first correspondence and the second correspondence. The technical solution of this application can effectively improve the accuracy of speed detection.
Description
Technical Field
The present application relates to the field of measurement technologies, and in particular, to a speed detection method, apparatus, device, and readable storage medium.
Background
Currently, the lateral velocity and the longitudinal velocity of a detection target are generally determined by measuring the distance and the radial velocity of a plurality of reflection points of the detection target, where the reflection points are generated by the detection target reflecting the electromagnetic waves of a radar. However, a distant detection target may not generate a plurality of reflection points; alternatively, the azimuth-angle separation between adjacent reflection points generated by the detection target may be small. Both cases lead to low accuracy in determining the lateral velocity and the longitudinal velocity.
Disclosure of Invention
The embodiments of the application provide a speed detection method, apparatus, device and readable storage medium to solve the above problems in the related art. The technical solution is as follows:
in a first aspect, an embodiment of the present application provides a speed detection method, including:
establishing a first corresponding relation, wherein the first corresponding relation is used for representing the relation among the transverse speed, the longitudinal speed and a speed included angle of the detection target at the kth moment, and the speed included angle is the included angle between the transverse speed and the movement speed of the detection target at the kth moment;
establishing a second corresponding relation, wherein the second corresponding relation is used for representing the relation among the azimuth angle, the transverse speed, the longitudinal speed and the radial speed of the detection target at the kth moment;
based on the first and second correspondences, a lateral velocity and a longitudinal velocity are determined.
In a second aspect, an embodiment of the present application provides a speed detection apparatus, including:
the first establishing module is used for establishing a first corresponding relation, and the first corresponding relation is used for representing the relation among the transverse speed, the longitudinal speed and the speed included angle of the detection target at the kth moment, wherein the speed included angle is the included angle between the transverse speed and the movement speed of the detection target at the kth moment;
the second establishing module is used for establishing a second corresponding relation, and the second corresponding relation is used for representing the relation among the azimuth angle, the transverse speed, the longitudinal speed and the radial speed of the detection target at the kth moment;
and the determining module is used for determining the transverse speed and the longitudinal speed based on the first corresponding relation and the second corresponding relation.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the speed detection method described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that, when executed on a computer, cause the computer to perform the method of any one of the above-described aspects.
The advantages or beneficial effects of the above technical solution include at least the following: the first correspondence and the second correspondence of the detection target at the k-th moment are used to determine the lateral velocity and the longitudinal velocity of the detection target at the k-th moment. As a result, speed detection is possible with only one reflection point generated by the detection target, rather than a plurality of reflection points; the influence of the distance factor and of the radar's azimuth-angle resolution on detection accuracy is reduced, and the accuracy of speed detection is improved.
The foregoing summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present application will be readily apparent by reference to the drawings and the following detailed description.
Drawings
In the drawings, like reference characters designate like or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are not to be considered limiting of its scope.
FIG. 1 is a diagram illustrating the detection of a target by a radar in the related art;
FIG. 2A is a flow chart illustrating a speed detection method according to an embodiment of the present application;
FIG. 2B is a schematic diagram of detection of a detection target according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a first correspondence determination process according to an embodiment of the present application;
FIG. 4A is a schematic flowchart of obtaining a first position according to an embodiment of the present application;
FIG. 4B is a schematic diagram of obtaining a first pre-selection detection box according to an embodiment of the present application;
FIG. 4C is a schematic diagram of obtaining a semantically segmented image according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of detecting an azimuth of a target according to an embodiment of the present application;
FIG. 6 is a flow chart illustrating a speed detection method according to another embodiment of the present application;
FIG. 7 is a diagram illustrating various application scenarios according to an embodiment of the present application;
FIG. 8 is a block diagram of a speed detection device according to an embodiment of the present application;
fig. 9 is a block diagram of an electronic device for implementing a speed detection method according to an embodiment of the present application.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present application. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
As shown in fig. 1, in the related art, to detect the lateral velocity and the longitudinal velocity of a detection target, a radar generally transmits electromagnetic waves toward the detection target so that a plurality of reflection points of the detection target reflect them; the radar can then detect the radial velocity and the azimuth angle of the plurality of reflection points. A plurality of correspondences among the radial velocity, the lateral velocity and the longitudinal velocity are established through the azimuth angles, and the lateral velocity and the longitudinal velocity can be determined from these correspondences, thereby realizing detection. The correspondences among the radial velocity, the lateral velocity and the longitudinal velocity established through the azimuth angle are shown in formula (1):

v_ri = v_x * cos(θ_i) + v_y * sin(θ_i)   (1)

where v_ri is the radial velocity of the i-th reflection point P_i of the detection target moving at velocity V, θ_i is the azimuth angle of the i-th reflection point P_i of the detection target, v_x is the longitudinal velocity of the detection target, and v_y is the lateral velocity of the detection target. FIG. 1 shows the case where the detection target generates a first reflection point P_1 and a second reflection point P_2.
However, this detection method is not applicable when the detection target is far from the radar or when the azimuth-angle spacing between adjacent reflection points is small. For example, when the distance between the detection target and the radar exceeds 80 m, the detection target cannot generate a plurality of reflection points; alternatively, when the azimuth-angle spacing of adjacent reflection points is small, the radar cannot resolve them. Either way, the accuracy of speed detection is low.
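Concretely, the related-art approach solves the overdetermined system of formula (1) from several reflection points by least squares. A minimal sketch (function and variable names are our own, for illustration only):

```python
import numpy as np

def solve_velocity_from_reflections(radial_speeds, azimuths):
    """Least-squares solution of v_ri = v_x*cos(theta_i) + v_y*sin(theta_i).

    Needs at least two reflection points with distinct azimuth angles --
    exactly the condition that fails for distant targets.
    """
    azimuths = np.asarray(azimuths, dtype=float)
    # One row per reflection point: [cos(theta_i), sin(theta_i)]
    A = np.column_stack([np.cos(azimuths), np.sin(azimuths)])
    solution, *_ = np.linalg.lstsq(A, np.asarray(radial_speeds, dtype=float), rcond=None)
    return solution  # array([v_x, v_y])
```

When the azimuths are nearly equal, the matrix rows become almost parallel and the solution is ill-conditioned, which is the accuracy problem described above.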
In order to solve the above technical problem, the present application provides a speed detection method. As shown in fig. 2A, the speed detection method may include:
step S201, establishing a first corresponding relationship, where the first corresponding relationship is used to represent a relationship between a lateral speed, a longitudinal speed, and a speed included angle of the detection target at the kth time, where the speed included angle is an included angle between the lateral speed and a movement speed of the detection target.
The lateral velocity is the lateral component of the movement velocity, and the longitudinal velocity is the longitudinal component of the movement velocity. Illustratively, as shown in fig. 2B, the first correspondence may be represented by the following formula (2):

tan(α_k) = v_x,k / v_y,k   (2)

where α_k, v_x,k and v_y,k are, in turn, the velocity included angle, the longitudinal velocity and the lateral velocity of the detection target at the k-th moment.
Step S202, establishing a second corresponding relation, wherein the second corresponding relation is used for representing the relation among the azimuth angle, the transverse speed, the longitudinal speed and the radial speed of the detection target at the kth moment.
The radial velocity may be obtained by a radar, such as a millimeter-wave radar or a lidar. The second correspondence can be expressed by the following formula (3):

v_r,k = v_x,k * cos(θ_k) + v_y,k * sin(θ_k)   (3)

where v_r,k and θ_k are, in turn, the radial velocity and the azimuth angle of the detection target at the k-th moment; v_r,k represents the radial component of the movement velocity V_k of the detection target at the k-th moment.
And step S203, determining the transverse speed and the longitudinal speed based on the first corresponding relation and the second corresponding relation.
According to this scheme, the first correspondence and the second correspondence of the detection target at the k-th moment are used to determine the lateral velocity and the longitudinal velocity of the detection target at the k-th moment. Speed detection therefore requires the detection target to generate only one reflection point instead of a plurality of reflection points, which reduces the influence of the distance factor and of the radar's azimuth-angle resolution on detection accuracy and improves the accuracy of speed detection.
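Steps S201-S203 reduce to solving two equations in two unknowns. A minimal sketch, assuming the first correspondence takes the form tan(α_k) = v_x,k / v_y,k, consistent with the definition of the velocity included angle as the angle between the lateral velocity and the movement velocity:

```python
import math

def lateral_longitudinal_speed(v_r, theta, alpha):
    """Solve the two correspondences of steps S201-S202 for one reflection point.

    Assumed first correspondence:  tan(alpha) = v_x / v_y
    Second correspondence:         v_r = v_x*cos(theta) + v_y*sin(theta)
    """
    t = math.tan(alpha)
    v_y = v_r / (t * math.cos(theta) + math.sin(theta))  # lateral velocity
    v_x = t * v_y                                        # longitudinal velocity
    return v_x, v_y
```

Only one radial velocity v_r and one azimuth θ are needed, so a single reflection point suffices.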
In one embodiment, as shown in fig. 3, step S201 may include:
step S301, a first position of the detection target at the k-1 moment and a second position at the k moment are obtained.
Illustratively, the first and second positions may be acquired using a position sensor, which may be a camera, a laser sensor, or the like.
Step S302, determining a displacement included angle between a first position and a second position;
and step S303, taking the displacement included angle as a speed included angle.
In one example, referring to fig. 2B, the camera and the radar are at the same position (e.g., the lidar is located at the same position as the camera), and the displacement included angle α_k of the detection target can be determined by the following formula (4):

α_k = arctan( (x_k − x_{k−1}) / (y_k − y_{k−1}) )   (4)

In another example, if the time interval ΔT between the k-th moment and the (k−1)-th moment is greater than the acquisition interval of the camera, n position coordinates (x_m, y_m) may be acquired while the detection target moves from the first position to the second position, where m = 1, 2, …, n, n is an integer and n ≥ 1, and the displacement included angle α_k can be determined by the following formula (5):

α_k = (1/(n−1)) * Σ_{m=2..n} arctan( (x_m − x_{m−1}) / (y_m − y_{m−1}) )   (5)

Referring to fig. 2B, the included angle α_k obtained above is taken as the velocity included angle, as in formula (6):

tan(α_k) = v_x,k / v_y,k = (x_k − x_{k−1}) / (y_k − y_{k−1})   (6)
in the related art, the lateral distance variation y between the second position and the first position is generally utilized k -y k-1 Longitudinal distance change x k -x k-1 And the time interval delta T between the k time and the k-1 time to determine the transverse speed of the detection target at the k time. For example, the lateral velocity v y,k Can be determined by the following equation (7):
however, since Δ T may be inaccurate, the determination capacity of the lateral velocity is susceptible to a time error, resulting in a large error in the lateral velocity. E.g. y k -y k-1 =0.6m, Δ T =30ms, then v y,k =20m/s; if Δ T has a timing error of 2ms, then acquired Δ T =28ms, and finally v is calculated y,k =21.4m/s, yielding a velocity error of 1.4 m/s.
In this embodiment, the displacement included angle of the detection target moving from the first position to the second position is used as the velocity included angle of the detection target at the k-th moment, which eliminates the timing error, reduces the velocity error and improves the accuracy of speed detection. In addition, obtaining more than two position coordinates while the detection target moves from the first position to the second position and using them to determine the velocity included angle has two benefits. First, because the time interval between the k-th moment and the (k−1)-th moment is greater than the acquisition interval of the camera, the measurement time window is lengthened, which effectively suppresses the timing error and improves the accuracy of speed detection. Second, averaging the plurality of included angles has a smoothing effect, making the speed detection more accurate.
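The displacement-angle computation can be sketched as follows, averaging the per-step angles over the sampled positions — one plausible reading of the averaging described above (x is the longitudinal coordinate, y the lateral coordinate):

```python
import math

def displacement_angle(xs, ys):
    """Average per-step displacement included angle, measured from the lateral axis.

    xs, ys: position coordinates sampled while the target moves from the
    first position to the second position (at least two samples).
    """
    steps = range(1, len(xs))
    angles = [math.atan2(xs[m] - xs[m - 1], ys[m] - ys[m - 1]) for m in steps]
    return sum(angles) / len(angles)
```

Because only position ratios enter the angle, the timestamp ΔT cancels out entirely, which is why the timing error discussed above disappears.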
In one embodiment, as shown in fig. 4A, acquiring the first position of the detection target at the k-1 st time may include:
step S401, a first image of the detection target at the k-1 moment is acquired.
The first image may be acquired by a camera, and a resolution of the camera to the position change of the detection target is greater than a resolution of the radar to the position change of the detection target.
S402, inputting a first image into a target detection model to obtain a first preselected detection frame; the target detection model is obtained by training a deep learning network model based on a plurality of sample images; the sample image comprises an image of a detection target;
step S403, determining first particle information from a first preselection detection frame;
in step S404, coordinate conversion is performed on the first particle information to obtain a first position.
In one example, as shown in fig. 4B, the first image is input into the target detection model, and a detection frame whose Intersection over Union (IoU) with the annotation frame 410 is 0.8 or more is determined as the first preselected detection frame. The first preselected detection frame 420 may be a rectangular frame, and the pixel coordinate of the i-th vertex A_i of the first preselected detection frame 420 is (u_1i, v_1i), where 1 ≤ i ≤ 4. The pixel coordinates (u_1, v_1) of the first particle B1 are determined from the pixel coordinates of the vertices of the first preselected detection frame 420, e.g. as their average.

Based on the pixel coordinates (u_1, v_1) of the first particle B1 and the mapping relation with the world coordinate system, the first position coordinate (x_{k−1}, y_{k−1}) of the first particle at the (k−1)-th moment is determined, where the mapping relation between the pixel coordinates (u_1, v_1) of the first particle B1 and the world coordinate system is shown in formula (8):

s * [u_1, v_1, 1]^T = M * [x_{k−1}, y_{k−1}, 1]^T   (8)

where M is a conversion matrix determined by the intrinsic and extrinsic parameters of the camera, and s is a scale factor.
Correspondingly, a second image of the detection target can be acquired by the camera at the k-th moment, and the second position coordinate (x_k, y_k) of the detection target at the k-th moment can be determined by a method similar to that used for the first position coordinate.
In this embodiment, inputting the first image into the target detection model for target detection is fast, so the position information can be determined quickly. Moreover, redundant pixels in the first image are removed, so the first preselected detection frame retains, as far as possible, only the pixels of the detection target, which improves the accuracy of the first particle information and thus the accuracy of the obtained first position.
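The coordinate conversion of step S404 can be sketched as a ground-plane homography. Here `M_inv` stands in for the inverse of the conversion matrix M; in practice it would come from the camera's intrinsic and extrinsic calibration:

```python
import numpy as np

def pixel_to_ground(u, v, M_inv):
    """Map particle pixel coordinates (u, v) to ground-plane coordinates (x, y).

    M_inv: inverse of the 3x3 conversion matrix mapping ground-plane points
    to homogeneous pixel coordinates (an assumption about the form of the
    patent's matrix M).
    """
    p = M_inv @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # divide out the homogeneous scale
```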
In one embodiment, determining first particle information from a first pre-selected detection box may comprise:
taking the center of the first preselection detection frame as first particle information; or,
and performing semantic segmentation on the first pre-selected detection frame to obtain a semantic segmentation image with a detection target, and determining first particle information from the semantic segmentation image.
Illustratively, as shown in fig. 4B, where the first preselected detection frame 420 is a rectangular frame, the geometric center of the first preselected detection frame 420 may be taken as the pixel coordinates of the first particle B1.

Alternatively, as shown in fig. 4C, semantic segmentation is performed on the first preselected detection frame to further remove redundant pixels and retain the pixels of the detection target, yielding a semantic segmentation image 430 containing the detection target. Based on the circumscribed-circle contour or inscribed-circle contour (not shown) of the semantic segmentation image 430, the pixel coordinate corresponding to the center of that contour is determined as the pixel coordinate of the first particle.

This improves the accuracy of determining the first particle information and hence the accuracy of speed detection.
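The two ways of picking the first particle can be sketched as follows; the mask-based variant uses the pixel centroid as a simple stand-in for the circle-centre construction described above:

```python
import numpy as np

def particle_from_box(box):
    """Geometric centre of a rectangular detection frame (u1, v1, u2, v2)."""
    u1, v1, u2, v2 = box
    return (u1 + u2) / 2.0, (v1 + v2) / 2.0

def particle_from_mask(mask):
    """Centroid of the target pixels in a binary semantic-segmentation mask."""
    vs, us = np.nonzero(mask)  # rows are v (image y), columns are u (image x)
    return us.mean(), vs.mean()
```

The segmentation variant is less sensitive to background pixels inside the frame, at the cost of running a segmentation model.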
In one embodiment, as shown in fig. 5, the azimuth angle of the detection target may be determined by the following steps:
step S501, acquiring a second position and a radial distance of a detection target at the kth moment;
and step S502, determining the azimuth angle of the detection target according to the second position and the radial distance.
In one example, referring also to fig. 2B, the second position is determined from the second image acquired by the camera at the k-th moment. The radial distance is obtained by the radar transmitting an electromagnetic wave to a reflection point of the detection target (e.g., the first reflection point P_1) and receiving the reflected electromagnetic wave.
Determining the azimuth angle of the detection target at the k-th time may include:
determining the lateral distance between the detection target and the camera from the second position coordinate; and

determining the azimuth angle of the detection target at the k-th moment based on the lateral distance and the radial distance measured by the radar. For example, the azimuth angle θ_k is determined by the following formula (9):

θ_k = arcsin( y_k / r_k )   (9)

where y_k is the lateral distance between the detection target and the camera, and r_k is the radial distance of the detection target at the k-th moment.
In another example, the relationship among the radial velocity v_r,k, the lateral velocity v_y,k, the longitudinal velocity v_x,k and the azimuth angle θ_k of the detection target at the k-th moment may be determined by the following formula (10):

v_r,k = v_x,k * cos(θ_k) + v_y,k * sin(θ_k)   (10)
In this embodiment, the azimuth angle of the detection target is determined from the lateral distance between the detection target and the camera (acquired by the camera) and the radial distance acquired by the radar. This azimuth angle is more accurate than the azimuth angle measured directly by the radar, so substituting it for the radar-measured azimuth improves the accuracy of speed detection.
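Assuming the arcsine relation implied by the variable definitions around formula (9), the fused azimuth computation is a one-liner:

```python
import math

def azimuth(lateral_distance, radial_distance):
    """Azimuth from camera lateral distance y_k and radar radial distance r_k.

    Assumes theta_k = arcsin(y_k / r_k); valid while |y_k| <= r_k.
    """
    return math.asin(lateral_distance / radial_distance)
```

This combines the camera's finer lateral-position resolution with the radar's accurate range, which is the fusion idea of steps S501-S502.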
In one embodiment, as shown in fig. 6, the speed detection method may further include:
step S601, establishing a measurement model of the detection target based on the first corresponding relation and the second corresponding relation;
step S602, establishing a process model of the detection target based on a preset constant velocity (CV) motion model;
and S603, based on the measurement model and the process model, estimating the optimal transverse speed and the optimal longitudinal speed of the detection target at the kth moment by adopting Unscented Kalman Filtering (UKF).
On this basis, a measurement model and a process model of the detection target are established, and unscented Kalman filtering is used to fuse the position information acquired by the camera with the radial distance and radial velocity acquired by the radar, estimating the optimal lateral velocity and optimal longitudinal velocity of the detection target at the k-th moment. This can effectively improve the accuracy of speed detection.
In one embodiment, the measurement model may include:

tan(α_k) = (x_k − x_{k−1}) / (y_k − y_{k−1})
sin(θ_k) = y_k / r_k
v_r = v_x * cos(θ_k) + v_y * sin(θ_k)

where x_{k−1} and y_{k−1} are the longitudinal distance and lateral distance of the detection target at the (k−1)-th moment, and x_k, y_k, r_k, v_r, v_x, v_y, α_k and θ_k are respectively the longitudinal distance, lateral distance, radial distance, radial velocity, longitudinal velocity, lateral velocity, displacement/velocity included angle and azimuth angle of the detection target at the k-th moment.
In one embodiment, the process model includes:

x_k = x_{k−1} + v_x,k−1 * ΔT
y_k = y_{k−1} + v_y,k−1 * ΔT
v_x,k = v_x,k−1
v_y,k = v_y,k−1

where x_k, y_k, v_x,k and v_y,k are respectively the longitudinal distance, lateral distance, longitudinal velocity and lateral velocity of the detection target at the k-th moment; x_{k−1}, y_{k−1}, v_x,k−1 and v_y,k−1 are respectively the longitudinal distance, lateral distance, longitudinal velocity and lateral velocity of the detection target at the (k−1)-th moment; and ΔT is the time interval between the k-th moment and the (k−1)-th moment.
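The constant-velocity process model corresponds to a linear state transition. The sketch below shows one prediction step on the state [x, y, v_x, v_y]; in a full implementation this transition would serve as the state-transition function of the unscented Kalman filter:

```python
import numpy as np

def cv_predict(state, dt):
    """One constant-velocity prediction step: positions advance, velocities persist."""
    F = np.array([[1.0, 0.0, dt, 0.0],
                  [0.0, 1.0, 0.0, dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    return F @ np.asarray(state, dtype=float)
```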
Fig. 7 is a diagram illustrating various application scenarios according to an embodiment of the present application. As shown in fig. 7, the speed detection method of the embodiments may be applied to an autonomous vehicle, for example in scenarios such as lane-change cut-in, lane-change cut-out, and intersection crossing involving the target vehicle 710. In these scenarios, the target vehicle 710 can accurately detect the lateral velocity and longitudinal velocity of detection targets (including detected vehicles, pedestrians, etc.), which helps the target vehicle 710 perform path planning, obstacle avoidance and the like in the automatic driving scenario.
Fig. 8 is a block diagram of a speed detection apparatus according to an embodiment of the present application. As shown in fig. 8, the speed detecting apparatus 800 may include:
a first establishing module 810, configured to establish a first corresponding relationship, where the first corresponding relationship is used to represent a relationship between a lateral speed, a longitudinal speed, and a speed included angle of a detected target at a kth time, where the speed included angle is an included angle between the lateral speed and a movement speed of the detected target at the kth time;
a second establishing module 820, configured to establish a second corresponding relationship, where the second corresponding relationship is used to represent a relationship among an azimuth angle, a lateral speed, a longitudinal speed, and a radial speed of the detected target at the kth time;
a determining module 830 is configured to determine the lateral speed and the longitudinal speed based on the first corresponding relationship and the second corresponding relationship.
In one embodiment, the first establishing module 810 may include:
the first acquisition submodule is used for acquiring a first position of a detection target at the k-1 th moment and a second position of the detection target at the k-1 th moment;
the first determining submodule is used for determining a displacement included angle between the first position and the second position;
and setting a submodule for taking the displacement included angle as a speed included angle.
In one embodiment, the first obtaining sub-module may include:
the acquisition unit is used for acquiring a first image of the detection target at the k-1 th moment;
the identification unit is used for inputting the first image into the target detection model to obtain a first preselected detection frame; the target detection model is obtained by training a deep learning network model based on a plurality of sample images; the sample image comprises an image of a detection target;
a determining unit, configured to determine first particle information from the first pre-selection detection frame;
and the conversion unit is used for carrying out coordinate conversion on the first particle information to obtain a first position.
In one embodiment, the determining unit may be configured to:
taking the center of a first pre-selection detection frame as first particle information; or,
and performing semantic segmentation on the first pre-selected detection frame to obtain a semantic segmentation image with a detection target, and determining first particle information from the semantic segmentation image.
In one embodiment, the second establishing module 820 may include:
the second acquisition sub-module is used for acquiring a second position and a radial distance of the detection target at the kth moment;
and the second determining submodule is used for determining the azimuth angle of the detection target according to the second position and the radial distance.
In one embodiment, the speed detection apparatus may further include:
the measurement model establishing module is used for establishing a measurement model of the detection target based on the first corresponding relation and the second corresponding relation;
the process model establishing module is used for establishing a process model of the detected target based on a preset uniform motion model;
and the estimation module is used for estimating the optimal transverse speed and the optimal longitudinal speed of the detection target at the kth moment by adopting unscented Kalman filtering based on the measurement model and the process model.
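A minimal sketch of the measurement function h(·) that such an unscented Kalman filter would evaluate on each sigma point is shown below. The patent's actual measurement equations appear as images in the original, so the radial-speed projection and both angle conventions here are reconstructions from the textual description, not the patent's formula.

```python
import math

def predict_measurement(state):
    # state = [x, y, v_x, v_y]: longitudinal/transverse distance and speed
    # of the detection target at moment k, using the symbols from the text
    x, y, vx, vy = state
    r = math.hypot(x, y)           # radial distance r_k
    theta = math.atan2(y, x)       # azimuth angle theta_k
    v_r = (vx * x + vy * y) / r    # radial speed: velocity projected onto
                                   # the line of sight
    alpha = math.atan2(vx, vy)     # speed included angle alpha_k
    return [x, y, r, v_r, alpha, theta]
```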
In one embodiment, the measurement model may include:
where x_{k-1} and y_{k-1} are respectively the longitudinal distance and the transverse distance of the detection target at the k-1 th moment, and x_k, y_k, r_{k,R}, v_{r,k}, v_{x,k}, v_{y,k}, α_k and θ_k are respectively the longitudinal distance, the transverse distance, the radial distance, the radial speed, the longitudinal speed, the transverse speed, the displacement included angle/speed included angle and the azimuth angle of the detection target at the k th moment.
In one embodiment, a process model includes:
where x_k, y_k, v_{x,k} and v_{y,k} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k th moment; x_{k-1}, y_{k-1}, v_{x,k-1} and v_{y,k-1} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k-1 th moment; and ΔT is the time interval between the k th moment and the k-1 th moment.
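The uniform (constant-velocity) process model described by this paragraph reduces to the state update sketched below, using the symbols from the text.

```python
def cv_predict(state, dt):
    # state = [x_{k-1}, y_{k-1}, v_{x,k-1}, v_{y,k-1}]; dt = DeltaT.
    # Constant-velocity model: positions advance by speed * dt,
    # speeds carry over unchanged.
    x, y, vx, vy = state
    return [x + vx * dt, y + vy * dt, vx, vy]
```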
The functions of each module in each apparatus in the embodiment of the present application may refer to corresponding descriptions in the above method, and are not described herein again.
Fig. 9 is a block diagram of an electronic device for implementing a speed detection method according to an embodiment of the present application. As shown in Fig. 9, the electronic device includes: a memory 910 and a processor 920, the memory 910 having stored therein instructions executable on the processor 920. The processor 920, when executing the instructions, implements the speed detection method in the above-described embodiments. There may be one or more memories 910 and processors 920. The electronic device is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the present application described and/or claimed herein.
The electronic device may further include a communication interface 930 for communicating with an external device for interactive data transmission. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor 920 may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to an interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). The bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in Fig. 9, but this does not mean that there is only one bus or one type of bus.
Optionally, in an implementation, if the memory 910, the processor 920 and the communication interface 930 are integrated on a chip, the memory 910, the processor 920 and the communication interface 930 may complete communication with each other through an internal interface.
It should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general-purpose processor may be a microprocessor or any conventional processor. It is noted that the processor may be a processor supporting the Advanced RISC Machine (ARM) architecture.
Embodiments of the present application provide a computer-readable storage medium (such as the above-mentioned memory 910) storing computer instructions, which when executed by a processor, implement the method provided in embodiments of the present application.
Alternatively, the memory 910 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of an electronic device for implementing the speed detection method, and the like. Further, the memory 910 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 910 may optionally include memory located remotely from the processor 920, which may be connected via a network to an electronic device for implementing the speed detection method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Any process or method description in the flowcharts or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes other implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. All or a portion of the steps of the method of the above embodiments may be performed by associated hardware that is instructed by a program, which may be stored in a computer-readable storage medium, that when executed, includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module may also be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. The storage medium may be a read-only memory, a magnetic or optical disk, or the like.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present application, and these should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (18)
1. A speed detection method, comprising:
establishing a first corresponding relation, wherein the first corresponding relation is used for representing the relation among the transverse speed, the longitudinal speed and the speed included angle of the detection target at the kth moment, and the speed included angle is the included angle between the transverse speed and the movement speed of the detection target at the kth moment;
establishing a second corresponding relation, wherein the second corresponding relation is used for representing the relation among the azimuth angle, the transverse speed, the longitudinal speed and the radial speed of the detection target at the kth moment;
and determining the transverse speed and the longitudinal speed based on the first corresponding relation and the second corresponding relation.
2. The method of claim 1, wherein establishing the first correspondence comprises:
acquiring a first position of the detection target at a k-1 moment and a second position of the detection target at the k moment;
determining a displacement included angle between the first position and the second position;
and taking the displacement included angle as the speed included angle.
3. The method of claim 2, wherein obtaining the first position of the detection target at the k-1 st time comprises:
acquiring a first image of the detection target at the k-1 moment;
inputting the first image into a target detection model to obtain a first preselected detection frame; the target detection model is obtained by training a deep learning network model based on a plurality of sample images; the sample image comprises an image of the detection target;
determining first particle information from the first pre-selected detection frame;
and performing coordinate conversion on the first particle information to obtain a first position.
4. The method of claim 3, wherein determining first particle information from said first preselected detection frame comprises:
taking the center of the first pre-selection detection frame as the first particle information; or,
and performing semantic segmentation on the first pre-selected detection frame to obtain a semantic segmentation image with the detection target, and determining the first particle information from the semantic segmentation image.
5. The method of claim 1, wherein the establishing the second correspondence comprises:
acquiring a second position and a radial distance of the detection target at the kth moment;
and determining the azimuth angle of the detection target according to the second position and the radial distance.
6. The method of any of claims 1 to 5, further comprising:
establishing a measurement model of the detection target based on the first corresponding relation and the second corresponding relation;
establishing a process model of the detection target based on a preset uniform motion model;
and estimating the optimal transverse speed and the optimal longitudinal speed of the detection target at the kth moment by adopting unscented Kalman filtering based on the measurement model and the process model.
7. The method of claim 6, wherein the measurement model comprises:
where x_{k-1} and y_{k-1} are respectively the longitudinal distance and the transverse distance of the detection target at the k-1 th moment, and x_k, y_k, r_{k,R}, v_{r,k}, v_{x,k}, v_{y,k}, α_k and θ_k are respectively the longitudinal distance, the transverse distance, the radial distance, the radial speed, the longitudinal speed, the transverse speed, the displacement included angle/speed included angle and the azimuth angle of the detection target at the k th moment.
8. The method of claim 6, wherein the process model comprises:
where x_k, y_k, v_{x,k} and v_{y,k} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k th moment; x_{k-1}, y_{k-1}, v_{x,k-1} and v_{y,k-1} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k-1 th moment; and ΔT is the time interval between the k th moment and the k-1 th moment.
9. A speed detection device, comprising:
the device comprises a first establishing module, a second establishing module and a third establishing module, wherein the first establishing module is used for establishing a first corresponding relation, and the first corresponding relation is used for representing the relation among the transverse speed, the longitudinal speed and a speed included angle of a detection target at the kth moment, wherein the speed included angle is the included angle between the transverse speed and the movement speed of the detection target at the kth moment;
a second establishing module, configured to establish a second corresponding relationship, where the second corresponding relationship is used to characterize a relationship among the azimuth angle, the lateral speed, the longitudinal speed, and the radial speed of the detection target at the kth time;
a determining module for determining the lateral velocity and the longitudinal velocity based on the first correspondence and the second correspondence.
10. The apparatus of claim 9, wherein the first establishing means comprises:
the first acquisition submodule is used for acquiring a first position of the detection target at the k-1 th moment and a second position of the detection target at the k th moment;
the first determining submodule is used for determining a displacement included angle between the first position and the second position;
and a setting submodule, used for taking the displacement included angle as the speed included angle.
11. The apparatus of claim 10, wherein the first acquisition submodule comprises:
the acquisition unit is used for acquiring a first image of the detection target at the k-1 th moment;
the identification unit is used for inputting the first image into a target detection model to obtain a first preselected detection frame; the target detection model is obtained by training a deep learning network model based on a plurality of sample images; the sample image comprises an image of the detection target;
a determination unit configured to determine first particle information from the first pre-selected detection frame;
and the conversion unit is used for carrying out coordinate conversion on the first particle information to obtain a first position.
12. The apparatus of claim 11, wherein the determining unit is configured to:
take the center of the first pre-selection detection frame as the first particle information; or,
perform semantic segmentation on the first pre-selected detection frame to obtain a semantic segmentation image with the detection target, and determine the first particle information from the semantic segmentation image.
13. The apparatus of claim 9, wherein the second establishing means comprises:
the second acquisition sub-module is used for acquiring a second position and a radial distance of the detection target at the kth moment;
and the second determining submodule is used for determining the azimuth angle of the detection target according to the second position and the radial distance.
14. The apparatus of any one of claims 9 to 13, further comprising:
the measurement model establishing module is used for establishing a measurement model of the detection target based on the first corresponding relation and the second corresponding relation;
the process model establishing module is used for establishing a process model of the detection target based on a preset uniform motion model;
and the estimation module is used for estimating the optimal transverse speed and the optimal longitudinal speed of the detection target at the kth moment by adopting unscented Kalman filtering based on the measurement model and the process model.
15. The apparatus of claim 14, wherein the measurement model comprises:
where x_{k-1} and y_{k-1} are respectively the longitudinal distance and the transverse distance of the detection target at the k-1 th moment, and x_k, y_k, r_{k,R}, v_{r,k}, v_{x,k}, v_{y,k}, α_k and θ_k are respectively the longitudinal distance, the transverse distance, the radial distance, the radial speed, the longitudinal speed, the transverse speed, the displacement included angle/speed included angle and the azimuth angle of the detection target at the k th moment.
16. The apparatus of claim 14, wherein the process model comprises:
where x_k, y_k, v_{x,k} and v_{y,k} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k th moment; x_{k-1}, y_{k-1}, v_{x,k-1} and v_{y,k-1} are respectively the longitudinal distance, the transverse distance, the longitudinal speed and the transverse speed of the detection target at the k-1 th moment; and ΔT is the time interval between the k th moment and the k-1 th moment.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A computer readable storage medium having stored therein computer instructions which, when executed by a processor, implement the method of any one of claims 1-8.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111180088.6A CN115963485A (en) | 2021-10-11 | 2021-10-11 | Speed detection method, device, equipment and readable storage medium |
PCT/CN2022/124576 WO2023061355A1 (en) | 2021-10-11 | 2022-10-11 | Velocity detection method and apparatus, device and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111180088.6A CN115963485A (en) | 2021-10-11 | 2021-10-11 | Speed detection method, device, equipment and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115963485A true CN115963485A (en) | 2023-04-14 |
Family
ID=85888244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111180088.6A Pending CN115963485A (en) | 2021-10-11 | 2021-10-11 | Speed detection method, device, equipment and readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115963485A (en) |
WO (1) | WO2023061355A1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103901425B (en) * | 2014-04-23 | 2017-08-11 | 武汉大学 | A kind of radar and method for measuring moving object lateral velocity |
US10094919B2 (en) * | 2015-10-06 | 2018-10-09 | GM Global Technology Operations LLC | Radar-vision fusion for target velocity estimation |
EP3572839A1 (en) * | 2018-05-23 | 2019-11-27 | Aptiv Technologies Limited | Method of estimating a velocity magnitude of a moving target in a horizontal plane and radar detection system |
US11255957B2 (en) * | 2020-01-27 | 2022-02-22 | Ford Global Technologies, Llc | Target velocity detection |
CN111366928B (en) * | 2020-02-21 | 2022-07-01 | 北京小马慧行科技有限公司 | Vehicle speed determination method and device, storage medium and processor |
- 2021-10-11 CN CN202111180088.6A patent/CN115963485A/en active Pending
- 2022-10-11 WO PCT/CN2022/124576 patent/WO2023061355A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2023061355A1 (en) | 2023-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109345596B (en) | Multi-sensor calibration method, device, computer equipment, medium and vehicle | |
CN109975773B (en) | Millimeter wave radar calibration method, device, equipment and storage medium | |
CN110927708B (en) | Calibration method, device and equipment of intelligent road side unit | |
CN110471409B (en) | Robot inspection method and device, computer readable storage medium and robot | |
CN110561423A (en) | pose transformation method, robot and storage medium | |
CN107784671B (en) | Method and system for visual instant positioning and drawing | |
US20220277581A1 (en) | Hand pose estimation method, device and storage medium | |
CN110263713A (en) | Method for detecting lane lines, device, electronic equipment and storage medium | |
CN111177295A (en) | Image-building ghost eliminating method and device, computer-readable storage medium and robot | |
CN111536990A (en) | On-line external reference mis-calibration detection between sensors | |
CN112327329A (en) | Obstacle avoidance method, target device, and storage medium | |
CN111722297B (en) | Target existence probability calculation method and device, electronic equipment and storage medium | |
US11620846B2 (en) | Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device | |
CN111523619B (en) | Target existence probability calculation method and device, electronic equipment and storage medium | |
CN112689842A (en) | Target detection method and device | |
CN115952248B (en) | Pose processing method, device, equipment, medium and product of terminal equipment | |
CN115963485A (en) | Speed detection method, device, equipment and readable storage medium | |
CN114216412B (en) | Angle information acquisition method and device for traffic participants and storage medium | |
CN113203424B (en) | Multi-sensor data fusion method and device and related equipment | |
CN117893992A (en) | Target detection method, model training method, electronic device and vehicle | |
KR20230006628A (en) | method and device for processing image, electronic equipment, storage medium and computer program | |
CN116295353A (en) | Positioning method, device and equipment of unmanned vehicle and storage medium | |
CN111596288B (en) | Method and device for measuring speed, vehicle-mounted terminal and vehicle-mounted speed measuring system | |
CN114964204A (en) | Map construction method, map using method, map constructing device, map using equipment and storage medium | |
CN114460551A (en) | On-site automatic calibration method and device based on millimeter wave radar and vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||