CN112529935A - Target vehicle motion state identification method and device - Google Patents

Target vehicle motion state identification method and device

Info

Publication number
CN112529935A
CN112529935A
Authority
CN
China
Prior art keywords
vehicle
target
speed
vehicle body
target vehicle
Prior art date
Legal status
Granted
Application number
CN201910879023.7A
Other languages
Chinese (zh)
Other versions
CN112529935B (en)
Inventor
商燕
黄洋文
Current Assignee
Shanghai Goldway Intelligent Transportation System Co Ltd
Original Assignee
Shanghai Goldway Intelligent Transportation System Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Goldway Intelligent Transportation System Co Ltd
Priority to CN201910879023.7A
Publication of CN112529935A
Application granted
Publication of CN112529935B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence

Abstract

Embodiments of the invention provide a method and a device for identifying the motion state of a target vehicle. The method includes: determining the relative speed of the target vehicle according to two frames of target images; obtaining the body yaw angle of the host vehicle according to vehicle data of the host vehicle, the body yaw angle being the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time; obtaining the absolute speed of the target vehicle according to its relative speed, the body yaw angle and the host vehicle's speed information, and obtaining the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the speed information; and identifying the motion state of the target vehicle according to the absolute speed and the position difference. Because the absolute speed and the position difference are obtained directly from the relative speed, the body yaw angle and the speed information, the motion state of the target vehicle is judged directly, which reduces the amount of calculation.

Description

Target vehicle motion state identification method and device
Technical Field
Embodiments of the invention relate to computer technology, and in particular to a method and a device for identifying the motion state of a target vehicle.
Background
As vehicle-related research continues to develop, identifying the running state of target vehicles on the road plays an increasingly important role: such identification provides preliminary information for downstream technologies such as track association and target identification and tracking.
At present, identifying the motion state of a target vehicle in the prior art usually requires matching corner points between two frames of target images: corner points are detected on the first frame, their matching points on the second frame are found with an optical flow algorithm, a matrix representing the camera motion is computed from the relationship between the matched points, and the epipolar geometric relationship between images on a moving platform is used to compensate for or estimate the camera motion, thereby identifying the motion state of the target vehicle.
However, both finding the matching points with an optical flow algorithm and computing the camera motion matrix from the corner-point relationships are complex processes, which results in a large amount of calculation and low processing efficiency.
Disclosure of Invention
Embodiments of the invention provide a method and a device for identifying the motion state of a target vehicle, so as to reduce the large amount of calculation that such identification otherwise requires.
In a first aspect, an embodiment of the present invention provides a method for identifying a motion state of a target vehicle, including:
determining the relative speed of a target vehicle according to two frames of target images, the two frames of target images being images, acquired by a vehicle-mounted camera of the host vehicle, that include the target vehicle;
obtaining a body yaw angle of the host vehicle according to vehicle data of the host vehicle, the vehicle data including speed information, the speed information including the speeds of the host vehicle at the first and second shooting times corresponding to the two frames of target images, and the body yaw angle being the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time;
obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the body yaw angle and the speed information, and obtaining the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the speed information; and
identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference.
In one possible design, the obtaining an absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle, and the vehicle speed information includes:
converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time into the same direction according to the body yaw angle; and
obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
In one possible design, the obtaining a position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information includes:
rotating the second body coordinate system corresponding to the host vehicle at the second shooting time according to the body yaw angle, and translating it according to the speed information of the host vehicle, to obtain a processed second body coordinate system that coincides with the first body coordinate system corresponding to the first shooting time;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate of the target vehicle in the processed second body coordinate system; and
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
In one possible design, the determining the relative speed of the target vehicle from the two frames of target images includes:
obtaining, from the two frames of target images, the image coordinates of the target vehicle in each frame in an image coordinate system;
converting the image coordinates of the target vehicle in the two frames of target images into a third body coordinate system according to the correspondence between the image coordinate system and the body coordinate system;
obtaining a relative driving distance of the target vehicle according to a third vehicle body coordinate of the target vehicle corresponding to the first shooting time in the third vehicle body coordinate system and a fourth vehicle body coordinate of the target vehicle corresponding to the second shooting time in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting moment and the second shooting moment and the relative driving distance.
In one possible design, the vehicle data further includes wheel deflection information, and the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frames of target images respectively;
the method for obtaining the vehicle body yaw angle of the vehicle according to the vehicle data corresponding to the shooting time of the two frames of target images sent by the sensor of the vehicle comprises the following steps:
obtaining the body yaw angle of the host vehicle according to the body wheelbase, the speed information and the wheel deflection information of the host vehicle, wherein the body yaw angle is positively correlated with the speed and the wheel deflection angle of the host vehicle, and negatively correlated with the body wheelbase.
In one possible design, the identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference of the target vehicle in the two frames of target images includes:
if the absolute speed of the target vehicle is not 0, judging whether the position difference of the target vehicle is 0;
if so, determining that the motion state of the target vehicle is stationary; and
if not, determining that the motion state of the target vehicle is in driving.
In one possible design, the method further includes:
if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
In a second aspect, an embodiment of the present invention provides a target vehicle motion state identification device, including:
a determining module, configured to determine the relative speed of a target vehicle according to two frames of target images, the two frames of target images being images, acquired by a vehicle-mounted camera of the host vehicle, that include the target vehicle;
a processing module, configured to obtain a body yaw angle of the host vehicle according to vehicle data of the host vehicle, the vehicle data including speed information, the speed information including the speeds of the host vehicle at the first and second shooting times corresponding to the two frames of target images, and the body yaw angle being the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time;
the processing module being further configured to obtain the absolute speed of the target vehicle according to the relative speed of the target vehicle, the body yaw angle and the speed information, and to obtain the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the speed information; and
and the identification module is used for identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value.
In one possible design, the processing module is specifically configured to:
converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time into the same direction according to the body yaw angle; and
obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
In one possible design, the processing module is specifically configured to:
rotating the second body coordinate system corresponding to the host vehicle at the second shooting time according to the body yaw angle, and translating it according to the speed information of the host vehicle, to obtain a processed second body coordinate system that coincides with the first body coordinate system corresponding to the first shooting time;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate of the target vehicle in the processed second body coordinate system; and
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
In one possible design, the determining module is specifically configured to:
obtaining, from the two frames of target images, the image coordinates of the target vehicle in each frame in an image coordinate system;
converting the image coordinates of the target vehicle in the two frames of target images into a third body coordinate system according to the correspondence between the image coordinate system and the body coordinate system;
obtaining a relative driving distance of the target vehicle according to a third vehicle body coordinate of the target vehicle corresponding to the first shooting time in the third vehicle body coordinate system and a fourth vehicle body coordinate of the target vehicle corresponding to the second shooting time in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting moment and the second shooting moment and the relative driving distance.
In one possible design, the vehicle data further includes wheel deflection information, and the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frames of target images respectively;
the processing module is specifically configured to:
obtaining the body yaw angle of the host vehicle according to the body wheelbase, the speed information and the wheel deflection information of the host vehicle, wherein the body yaw angle is positively correlated with the speed and the wheel deflection angle of the host vehicle, and negatively correlated with the body wheelbase.
In one possible design, the identification module is specifically configured to:
if the absolute speed of the target vehicle is not 0, judging whether the position difference of the target vehicle is 0;
if so, determining that the motion state of the target vehicle is stationary; and
if not, determining that the motion state of the target vehicle is in driving.
In one possible design, the identification module is further configured to:
if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
In a third aspect, an embodiment of the present invention provides a target vehicle motion state identification device, including:
a memory for storing a program;
a processor for executing the program stored by the memory, the processor being adapted to perform the method as described above in the first aspect and any one of the various possible designs of the first aspect when the program is executed.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, including instructions, which, when executed on a computer, cause the computer to perform the method as described above in the first aspect and any one of various possible designs of the first aspect.
Embodiments of the invention provide a method and a device for identifying the motion state of a target vehicle. The method includes: determining the relative speed of the target vehicle according to two frames of target images, the two frames of target images being images, acquired by a vehicle-mounted camera of the host vehicle, that include the target vehicle; obtaining the body yaw angle of the host vehicle according to vehicle data of the host vehicle, the vehicle data including speed information, the speed information including the speeds of the host vehicle at the first and second shooting times corresponding to the two frames of target images, and the body yaw angle being the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time; obtaining the absolute speed of the target vehicle according to the relative speed, the body yaw angle and the speed information, and obtaining the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the speed information; and identifying the motion state of the target vehicle according to the absolute speed and the position difference.
Because the relative speed of the target vehicle is determined from the two frames of target images, and the absolute speed and the position difference of the target vehicle are then obtained from the relative speed, the body yaw angle and the speed information, the motion state of the target vehicle is judged directly: no matching points need to be searched and no camera motion matrix needs to be computed, which reduces the amount of calculation and improves the efficiency of identifying the motion state of the target vehicle.
Drawings
To illustrate the embodiments of the invention or the prior-art technical solutions more clearly, the drawings needed in their description are briefly introduced below. The drawings described here show some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a first scene schematic diagram of a method for identifying a motion state of a target vehicle according to an embodiment of the present invention;
fig. 2 is a first flowchart of a method for identifying a motion state of a target vehicle according to an embodiment of the present invention;
fig. 3 is a second flowchart of a method for identifying a motion state of a target vehicle according to an embodiment of the present invention;
fig. 4 is a second scene schematic diagram of the method for identifying a motion state of a target vehicle according to an embodiment of the present invention;
fig. 5 is a third scene schematic diagram of the method for identifying a motion state of a target vehicle according to the embodiment of the present invention;
fig. 6 is a fourth scene schematic diagram of the method for identifying a motion state of a target vehicle according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a target vehicle motion state identification device according to an embodiment of the present invention;
fig. 8 is a schematic hardware structure diagram of a target vehicle motion state identification device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a scene schematic diagram of a method for identifying the motion state of a target vehicle according to an embodiment of the invention. As shown in fig. 1, a vehicle-mounted camera is mounted on the host vehicle and captures the road condition in front of it. The vehicle-mounted camera may be, for example, a dashboard camera, or a camera built into the vehicle; those skilled in the art will understand that this embodiment does not limit the specific implementation of the vehicle-mounted camera, as long as it can capture the road condition in front of the host vehicle.
Specifically, the vehicle-mounted camera sends the captured images to a server in real time, and the server processes them to identify the motion state of the target vehicle. In the present invention, a target vehicle is any vehicle captured by the vehicle-mounted camera: all vehicles on the road that the camera can capture (cars, trucks, electric vehicles, bicycles, etc.) can be regarded as target vehicles. As shown in fig. 1, vehicles 102, 103 and 104 are all target vehicles.
In an optional embodiment, after identifying the motion state of the target vehicle, the server may send that state to the host vehicle, so that the driver can learn the motion state of the preceding target vehicle in time, thereby improving driving safety.
At present, when identifying the motion state of a target vehicle, the conventional method generally uses the epipolar geometric relationships between images on a moving platform, such as the planar homography matrix, the fundamental matrix and the trifocal tensor, to compensate for or estimate the motion of the vehicle-mounted camera. Corner points are first detected on the first frame image with a corner detection algorithm, their matching points on the second frame are found with an optical flow algorithm, and a matrix representing the camera motion is then computed from the relationship between the matched points of the two frames.
However, the conventional method has the following disadvantages: 1. it requires a three-dimensional sensor to acquire depth information for the corner points, which limits its application scenarios; 2. both finding the matching points with the optical flow algorithm and computing the camera motion matrix from the corner-point relationships are complex processes, so the amount of calculation is very large.
To solve these problems, the present invention provides a method for identifying the motion state of a target vehicle that overcomes the large amount of calculation required in the prior art. The method is described below with reference to specific embodiments, starting with fig. 2, a first flowchart of the method for identifying the motion state of a target vehicle according to an embodiment of the invention. As shown in fig. 2, the method includes:
s201, determining the relative speed of the target vehicle according to two frames of target images, wherein the two frames of target images are images including the target vehicle, acquired by a vehicle-mounted camera of the vehicle.
In this embodiment, the vehicle-mounted camera sends the captured image data to the server in real time. Those skilled in the art will understand that this image data may be a video, from which the server extracts multiple frames.
Specifically, the server determines the relative speed of the target vehicle according to two frames of target images, a target image being an image that includes the target vehicle. It should be noted that in this embodiment the term "target vehicle" covers every vehicle that appears in the images captured by the vehicle-mounted camera, not one specified vehicle; that is, any image that contains a vehicle can be regarded as a target image.
In an alternative implementation, the two frames may be any two target images: two adjacent frames, or two frames separated by an interval. This embodiment imposes no limitation here, as long as both frames include the target vehicle.
If the target images include several target vehicles, each target vehicle is processed separately to determine its relative speed. Taking one target vehicle A as an example, the distance the target vehicle has advanced can be determined from the two frames of target images, and its relative speed then follows from the time interval between the frames; alternatively, the relative speed may be determined from the distance between the target vehicle and the host vehicle in the two images, the speed of the host vehicle, and so on. The implementation is not specifically limited in this embodiment.
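As a hedged illustration of the first alternative above (distance advanced divided by the frame interval), the following Python sketch computes a relative speed from two body-frame positions of the target; the function name and the coordinate convention are assumptions made here, since the patent does not fix an API:

```python
import math

def relative_speed(p1, p2, t1, t2):
    """Relative speed of the target vehicle between two frames.

    p1, p2: (x, y) positions of the target in the host vehicle's body frame
    at the first and second shooting times (metres); t1, t2: the shooting
    times (seconds). Names and units are illustrative assumptions.
    """
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    distance = math.hypot(dx, dy)   # relative driving distance
    return distance / (t2 - t1)     # metres per second
```

For example, a target that closes 1.5 m on the host over a 0.1 s frame interval has a relative speed of 15 m/s.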
S202, obtaining a body yaw angle of the host vehicle according to vehicle data of the host vehicle, the vehicle data including speed information, the speed information including the speeds of the host vehicle at the first and second shooting times corresponding to the two frames of target images, and the body yaw angle being the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time.
In this embodiment, a vehicle sensor is provided in the host vehicle to detect its vehicle data. Specifically, if the time corresponding to the earlier of the two target images is the first shooting time and the time corresponding to the later one is the second shooting time, the sensor can report the speed of the host vehicle at each of those times, which yields the speed information of the host vehicle.
The vehicle data reflect the running state of the host vehicle and may further include, for example, wheel deflection information (the wheel deflection angles of the host vehicle at the first and second shooting times) and parameters of the host vehicle itself, such as the body wheelbase.
In one possible implementation, the body yaw angle of the host vehicle may be obtained by processing the speed information and the wheel deflection information with a preset model; alternatively, it may be obtained directly from the difference between the wheel deflection angles at the first and second shooting times. No limitation is imposed here, as long as the angle through which the host vehicle has rotated at the second shooting time relative to the first shooting time can be obtained from the vehicle data.
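One common way to realize the correlations stated in the claims (yaw angle growing with speed and wheel deflection angle, shrinking with wheelbase) is a kinematic bicycle model. The sketch below is an assumption of that kind, not the patent's prescribed formula; it averages the two sampled speeds and wheel angles over the interval between the shooting times:

```python
import math

def body_yaw_angle(v1, v2, delta1, delta2, wheelbase, dt):
    """Body yaw angle accumulated between the two shooting times.

    Kinematic-bicycle sketch: yaw rate = v * tan(delta) / wheelbase,
    integrated over dt using the mean of the speeds (m/s) and wheel
    deflection angles (rad) sampled at the two shooting times.
    """
    v = 0.5 * (v1 + v2)              # mean host-vehicle speed
    delta = 0.5 * (delta1 + delta2)  # mean wheel deflection angle
    yaw_rate = v * math.tan(delta) / wheelbase
    return yaw_rate * dt             # radians
```

Consistent with the claim, the result grows with the speed and the wheel angle and shrinks as the wheelbase grows.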
S203, obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information.
Specifically, the relative speed of the target vehicle can be understood as the traveling speed the target vehicle would have if the host vehicle were treated as stationary, while the absolute speed of the target vehicle is its actual traveling speed over the ground.
In one possible implementation, the speeds at the first and second shooting times may be converted into the same direction according to the body yaw angle, and the absolute speed of the target vehicle is then obtained by adding or subtracting the relative speed of the target vehicle (the direction of the relative speed obtained from the two frames of target images coincides with the direction of the speed in the earlier frame) and the direction-converted speed of the host vehicle.
Alternatively, the direction of the relative speed of the target vehicle and the direction of the speed at the second shooting time may be converted into the same direction according to the body yaw angle, so as to obtain the absolute speed of the target vehicle at the second shooting time.
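The direction conversion described above can be sketched as a planar rotation. In the sketch below, the host's heading at the first shooting time is taken as the +x axis and the relative velocity is assumed to be expressed in that frame; both conventions are assumptions made for illustration:

```python
import math

def absolute_speed(v_rel, v_host_t2, yaw):
    """Absolute speed of the target vehicle at the second shooting time.

    v_rel: (vx, vy) relative velocity of the target in the body frame of
    the first shooting time; v_host_t2: host speed at the second shooting
    time (m/s); yaw: body yaw angle (rad). The host's velocity at the
    second time points along its new heading, so it is rotated by the yaw
    angle back into the first frame before the two velocities are summed.
    """
    ax = v_rel[0] + v_host_t2 * math.cos(yaw)
    ay = v_rel[1] + v_host_t2 * math.sin(yaw)
    return math.hypot(ax, ay)
```

With a straight-driving host at 20 m/s and a target pulling away from it at -5 m/s along the heading, the absolute speed comes out at 15 m/s; a stationary roadside object, whose relative velocity exactly cancels the host's, comes out at 0.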
Meanwhile, in this embodiment, the position difference of the target vehicle between the two frames of target images is obtained according to the body yaw angle and the speed information; the position difference indicates whether the position of the target vehicle has changed between the two frames.
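The rotate-then-translate alignment described in the possible designs (the second body frame rotated by the yaw angle and shifted by the host's displacement so that it coincides with the first body frame) can be sketched as follows; the translation vector, which would in practice be integrated from the speed information, is passed in directly here as an assumption:

```python
import math

def position_difference(p1, p2, yaw, translation):
    """Position difference of the target vehicle between the two frames.

    p1: target coordinate in the first body frame; p2: target coordinate
    in the second body frame; yaw: body yaw angle (rad); translation:
    (tx, ty) host displacement between the shooting times, expressed in
    the first body frame. p2 is rotated and shifted into the first frame,
    and the Euclidean distance to p1 is returned.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    x = c * p2[0] - s * p2[1] + translation[0]
    y = s * p2[0] + c * p2[1] + translation[1]
    return math.hypot(x - p1[0], y - p1[1])
```

A parked target stays put under this alignment: if the host drives 1 m straight ahead, the target's second-frame coordinate shifts back by exactly that metre, and the position difference is 0.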
S204, identifying the motion state of the target vehicle according to the absolute speed and the position difference of the target vehicle.
In the present embodiment, the motion state of the target vehicle may include stationary and in travel. When the absolute speed of the target vehicle is not 0 and the position difference is not 0, the target vehicle may be considered to have moved, thereby determining that the motion state of the target vehicle is in travel; otherwise, as long as one of the two is 0, the target vehicle may be considered not to have moved, thereby determining that the motion state of the target vehicle is stationary.
The motion state of the target vehicle is identified together according to the absolute speed and the position difference value of the target vehicle, so that the accuracy of identifying the motion state of the target vehicle is ensured, and the wrong judgment caused by identification only according to one parameter is avoided.
In an optional embodiment, the motion state of the target vehicle may further include an initial starting state, a high-speed driving state, an upcoming stopping state, and the like. For example, the motion state of the target vehicle may be determined by comparing the absolute speed and the position difference with preset threshold intervals, which is not limited in this embodiment.
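As a minimal sketch (not part of the patent) of the threshold-interval comparison described above, the extended states might be classified as follows; all state names and threshold values here are assumptions for illustration only:

```python
def classify_extended_state(abs_speed, pos_diff,
                            stop_eps=0.1, start_max=2.0, high_min=20.0):
    """Illustrative threshold-interval classification; state names and
    thresholds (m/s, m) are assumed, not taken from the patent."""
    # Stationary unless both cues indicate movement.
    if abs_speed <= stop_eps or pos_diff <= stop_eps:
        return "stationary"
    if abs_speed <= start_max:
        return "initial-start"
    if abs_speed >= high_min:
        return "high-speed"
    return "in-travel"
```

The two-cue guard at the top mirrors the rule that a single non-zero parameter alone does not establish movement.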
The method for identifying the motion state of the target vehicle provided by the embodiment of the invention comprises the following steps: and determining the relative speed of the target vehicle according to the two frames of target images, wherein the two frames of target images are images including the target vehicle, acquired by the vehicle-mounted camera of the vehicle. And obtaining the vehicle body yaw angle of the vehicle according to the vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises the speeds of the vehicle at the first shooting time and the second shooting time corresponding to the two frames of target images respectively, and the vehicle body yaw angle is the angle rotated by the vehicle relative to the first shooting time at the second shooting time. And obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information. And identifying the motion state of the target vehicle according to the absolute speed and the position difference value of the target vehicle. The relative speed of the target vehicle is determined according to the two frames of target images, the absolute speed of the target vehicle is obtained according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, the position difference value of the target vehicle is obtained, the motion state of the target vehicle is directly judged, a matching point does not need to be searched, a camera motion matrix does not need to be calculated, the calculated amount is reduced, and the recognition efficiency of the motion state of the target vehicle is improved.
On the basis of the foregoing embodiment, the following describes in detail the method for identifying a moving state of a target vehicle according to an embodiment of the present invention with reference to fig. 3 to 6, where fig. 3 is a second flowchart of the method for identifying a moving state of a target vehicle according to an embodiment of the present invention, fig. 4 is a second schematic view of a scene of the method for identifying a moving state of a target vehicle according to an embodiment of the present invention, fig. 5 is a third schematic view of a scene of the method for identifying a moving state of a target vehicle according to an embodiment of the present invention, and fig. 6 is a fourth schematic view of a scene of the method for identifying a moving state of a target vehicle according to an embodiment of the present invention.
As shown in fig. 3, the method includes:
S301, obtaining image coordinates of the target vehicle in the two frames of target images on the image coordinate system according to the two frames of target images.
Specifically, the two frames of target images are images including the target vehicle acquired by a vehicle-mounted camera of the host vehicle. The image coordinate system may be understood as a planar rectangular coordinate system, or may also be a three-dimensional coordinate system; it may, for example, take the central point of the image as the origin, or take any vertex of the image as the origin, selected according to actual requirements. After the image coordinate system is established, the image coordinates of the target vehicle on the image coordinate system are acquired, such as the image coordinates of each pixel point of the vehicle, or the image coordinates of the 4 outermost vertices of the vehicle.
In a possible implementation manner, two frames of target images may be input to the target detection module and the target tracking module, where the target detection module is configured to detect a target vehicle included in the target images, the target tracking module is configured to detect whether the target vehicles included in the two frames of target images are the same vehicle, and the target detection module and the target tracking module may determine the location of the target vehicle in the image coordinate system, so as to obtain the image coordinates of the target vehicle.
S302, converting the image coordinates of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system.
In this embodiment, there is a correspondence between the image coordinate system and the vehicle body coordinate system. For example, a correspondence matrix between the image coordinate system and the vehicle body coordinate system may be obtained according to an inverse perspective projection method and camera parameters (the focal length, the mounting height, and the like of the vehicle-mounted camera), and the image coordinates are then converted into the third vehicle body coordinate system according to the correspondence matrix.
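A minimal sketch of applying such a correspondence matrix: assuming the inverse perspective projection yields a 3x3 homography mapping image pixels to ground-plane body coordinates, a pixel is converted by a projective transform followed by perspective division (the matrix `h` is a hypothetical placeholder, not a matrix given in the patent):

```python
def image_to_body(u, v, h):
    """Map an image pixel (u, v) to body-frame ground coordinates using
    a 3x3 correspondence matrix h (row-major nested lists): projective
    transform followed by perspective division."""
    x = h[0][0] * u + h[0][1] * v + h[0][2]
    y = h[1][0] * u + h[1][1] * v + h[1][2]
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    return x / w, y / w
```

In practice `h` would be computed once from the camera's focal length and mounting height, then reused for every frame.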
Specifically, the body coordinate system is a special moving coordinate system for describing the motion of the automobile, the origin of the system coincides with the mass center of the vehicle, when the vehicle is in a static state on a horizontal road surface, the X axis of the body coordinate system is parallel to the ground and points to the front of the vehicle, the Z axis of the body coordinate system points to the upper part through the mass center of the vehicle, and the Y axis points to the left side of the driver.
Those skilled in the art will understand that each vehicle corresponds to its own body coordinate system, and in the present embodiment, the third body coordinate system is the body coordinate system corresponding to the vehicle converted from the image coordinate system.
And S303, obtaining the relative travel distance of the target vehicle according to the third body coordinate of the target vehicle corresponding to the first shooting time in the third body coordinate system and the fourth body coordinate of the target vehicle corresponding to the second shooting time in the third body coordinate system.
Specifically, in the two frames of target images, the target vehicle in the previous frame corresponds to a first image coordinate and the target vehicle in the next frame corresponds to a second image coordinate. After conversion into the third body coordinate system, the first image coordinate is converted into a third body coordinate and the second image coordinate into a fourth body coordinate, where the third body coordinate and the fourth body coordinate are the coordinates of the target vehicle in the third body coordinate system corresponding to the first shooting time and the second shooting time, respectively.
When the target vehicle is placed in the third body coordinate system of the host vehicle, the host vehicle may be considered stationary (it is not really stationary) in order to acquire the relative travel distance of the target vehicle; that is, the movement of the target vehicle takes the host vehicle as the reference object. Because the movement of the target vehicle is measured in the same third body coordinate system (that of the host vehicle), even if the host vehicle is moving, this embodiment merely acquires the relative travel distance of the target vehicle in the third body coordinate system.
In an alternative embodiment, assuming that the body coordinate systems of the first and second photographing times move or rotate, the body coordinate system may be processed (e.g., moved or rotated correspondingly) so that the first and second photographing times correspond to the same body coordinate system.
And S304, obtaining the relative speed of the target vehicle according to the time difference and the relative driving distance between the first shooting time and the second shooting time.
Specifically, a time difference exists between the first shooting time and the second shooting time, and the relative speed of the target vehicle can be obtained according to the time difference and the relative driving distance.
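Steps S303-S304 can be sketched as the body-frame displacement divided by the time gap; the coordinate and time inputs below are hypothetical, and units (metres, seconds) are assumed:

```python
import math

def relative_speed(p_first, p_second, t_first, t_second):
    """Relative speed per S303-S304: displacement of the target between
    its third and fourth body coordinates, divided by the time gap
    between the first and second shooting times."""
    dx = p_second[0] - p_first[0]
    dy = p_second[1] - p_first[1]
    return math.hypot(dx, dy) / (t_second - t_first)
```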
And S305, obtaining the vehicle body yaw angle of the vehicle according to the vehicle body wheelbase, the vehicle speed information and the wheel deflection information of the vehicle, wherein the vehicle speed, the wheel deflection angle and the vehicle body yaw angle of the vehicle are positively correlated, and the vehicle body wheelbase and the vehicle body yaw angle of the vehicle are negatively correlated.
Specifically, the vehicle body wheelbase is the distance from the front axle center to the rear axle center of the vehicle, which is a fixed parameter of the vehicle. In this embodiment, the vehicle data further includes wheel deflection information, which includes the wheel deflection angles of the host vehicle at the first shooting time and the second shooting time corresponding to the two frames of target images. The wheel deflection angle may be, for example, the deflection angle of a front wheel, because steering of the vehicle is mainly performed by the front wheels; in some special vehicles, it may instead be the deflection angle of a rear wheel.
The vehicle body yaw angle of the vehicle can be obtained by the vehicle speed and the wheel yaw angle at the first shooting time, the vehicle speed and the wheel yaw angle at the second shooting time, and the vehicle body wheelbase.
In a possible implementation manner, the correspondence between the body wheelbase, the vehicle speed information, and the wheel deflection information of the host vehicle and its body yaw angle can be obtained from a bicycle model of the vehicle. The bicycle model is first introduced: it is a model that simplifies the motion of the vehicle. Specifically, motion in the vertical direction is ignored, so the vehicle is described as an object moving on a two-dimensional plane (which can be understood as a top-down view from the sky).
Secondly, assuming that the vehicle is constructed like a bicycle, i.e. the two front wheels of the vehicle share the same wheel deflection angle, rotation speed, and other data, as do the two rear wheels, the front and rear tires can each be described by a single wheel, thereby achieving a simplified description of the motion of the vehicle on a two-dimensional plane.
As further described below in conjunction with FIG. 4, the motion of the vehicle is described using a bicycle model. Assume that at a given shooting time the front wheel of the vehicle is located at the position indicated by point A, the rear wheel at the position indicated by point B, and the center of mass at the position indicated by point C, where v_x is the speed of the vehicle at the current moment, β is the slip angle of the vehicle, ψ is the yaw angle of the vehicle, l_f is the distance from the center of mass of the vehicle to point A, l_r is the distance from the center of mass of the vehicle to point B (l_f + l_r is the vehicle body wheelbase), and δ_f is the wheel deflection angle of the front wheel. The relationship between the body wheelbase, the vehicle speed information, and the wheel deflection information of the host vehicle and the body yaw angle of the host vehicle can be obtained by the bicycle model described in fig. 4, as shown in formula one:

ω_d = v_x · cos(β) · tan(δ_f) / L    (formula one)

wherein ω_d is the yaw rate of the vehicle body (the body yaw angle over the interval between the two shooting times is ω_d multiplied by that interval), and L is the body wheelbase, i.e. l_f + l_r; for the remaining parameters, refer to the description of the parameters in fig. 4. For the specific manner of deriving formula one from the bicycle model, reference may be made to the prior art, and details are not repeated here.
Those skilled in the art can understand that, as long as the vehicle speed of the host vehicle is positively correlated with both the wheel deflection angle and the vehicle body yaw angle, and the vehicle body wheelbase is negatively correlated with the vehicle body yaw angle, the specific implementation of obtaining the vehicle body yaw angle can be selected according to actual requirements.
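Under the kinematic bicycle-model assumption, the body yaw angle over the interval between the two shooting times might be sketched as the yaw rate of formula one integrated over that interval; the variable names and the zero-slip-angle default below are assumptions:

```python
import math

def body_yaw_angle(speed, wheel_angle, wheelbase, dt, slip_angle=0.0):
    """Kinematic bicycle-model sketch of formula one:
    yaw rate = v * cos(beta) * tan(delta_f) / L, integrated over dt.
    Positively correlated with speed and wheel angle, negatively
    correlated with the wheelbase, as stated in the text."""
    yaw_rate = speed * math.cos(slip_angle) * math.tan(wheel_angle) / wheelbase
    return yaw_rate * dt
```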
and S306, converting the speed of the vehicle at the first shooting moment and the speed of the vehicle at the second shooting moment to the same direction according to the vehicle body yaw angle.
If the vehicle body yaw angle is not 0, it indicates that the direction of the host vehicle has changed between the first and second shooting times, and the direction of the host vehicle's speed has changed accordingly; it is therefore necessary to convert the vehicle speed of the host vehicle at the first shooting time and at the second shooting time to the same direction according to the vehicle body yaw angle.
Specifically, the description will be made with reference to fig. 5. Assume that the speed of the host vehicle at the first shooting time (V1) is in the direction indicated by arrow 1, the speed at the second shooting time (V2) is in the direction indicated by arrow 2, and the angle ω_d between the two directions is the vehicle body yaw angle. The speeds can then be converted to the same direction through the following formula two:

V3 = V2 × cos(ω_d)    (formula two)

wherein V3 is the speed obtained by converting V2 into the direction of V1.
Alternatively, the direction of the vehicle speed at the first shooting time may be converted into the vehicle speed at the second shooting time, so as to realize conversion into the same direction.
In another alternative implementation, if the vehicle body yaw angle is 0, it indicates that the direction of the own vehicle has not changed, and the direction of the vehicle speed of the corresponding own vehicle is the same at the first shooting time and the second shooting time, and the speed directions at the two times are not changed even if the above steps are performed.
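Formula two can be sketched directly as a projection of the second-shot speed onto the first-shot heading (a one-line helper; the function name is an illustrative choice):

```python
import math

def align_speed(v2, body_yaw_angle):
    """Formula two: V3 = V2 * cos(omega_d), converting the speed at the
    second shooting time into the direction of the first shooting time.
    When the yaw angle is 0, the speed is returned unchanged."""
    return v2 * math.cos(body_yaw_angle)
```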
And S307, processing is carried out according to the relative speed of the target vehicle and the vehicle speed at the second shooting time after the direction is converted, and the absolute speed of the target vehicle is obtained.
In the present embodiment, the relative speed of the target vehicle is obtained from the two frames of target images. Those skilled in the art can understand that the direction of the relative speed of the target vehicle is actually consistent with the speed direction of the host vehicle at the first shooting time; therefore, in the preferred embodiment, the speed at the second shooting time is converted into the direction of the speed at the first shooting time to obtain the direction-converted vehicle speed at the second shooting time, i.e. V3 obtained in fig. 5.
Specifically, the relative speed of the target vehicle is its traveling speed with the host vehicle as the reference (the host vehicle is considered stationary), whereas the absolute speed of the target vehicle is its traveling speed with the ground as the reference, that is, the actual traveling speed of the target vehicle; the absolute speed is obtained by adding or subtracting the relative speed of the target vehicle and the direction-converted vehicle speed at the second shooting time.
Alternatively, if the direction of the speed of the host vehicle at the first shooting time is changed to be consistent with the direction of the speed of the host vehicle at the second shooting time, the direction of the relative speed of the target vehicle may be changed again, and the technical solution of the present invention may be implemented as well.
If the vehicle speed of the vehicle is 0, the relative speed of the target vehicle is consistent with the absolute speed of the target vehicle, and it can be understood by those skilled in the art that each shooting time corresponds to the relative speed and the absolute speed of the target vehicle, and the present embodiment performs real-time processing according to the image.
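The combination in S307 might be sketched as follows. The sign convention (addition vs. subtraction) depends on how the relative speed was defined when it was computed; addition is assumed here, which is an assumption rather than something the patent fixes:

```python
import math

def absolute_speed(rel_speed, ego_speed_2, body_yaw_angle):
    """Absolute target speed per S307: the relative speed (expressed in
    the first-shot heading) combined with the ego speed at the second
    shooting time projected onto that heading via formula two.
    Addition is assumed; the actual sign depends on the relative-speed
    convention."""
    v3 = ego_speed_2 * math.cos(body_yaw_angle)  # formula two
    return rel_speed + v3
```

Note that when the ego speed is 0, the result reduces to the relative speed, matching the remark above.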
And S308, according to the yaw angle of the vehicle body, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting time of the vehicle, and according to the vehicle speed information of the vehicle, performing translation processing on the second vehicle body coordinate system to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is the same as the first vehicle body coordinate system corresponding to the first shooting time.
Specifically, assume that the coordinate system of the host vehicle at the first shooting time is the first body coordinate system and that at the second shooting time is the second body coordinate system. If the body yaw angle is not 0, the traveling direction of the host vehicle has changed between the first and second shooting times, and the direction of the corresponding body coordinate system has changed as well. As shown in fig. 6, assume the body coordinate system at the first shooting time is the x-y coordinate system and that at the second shooting time is the s-t coordinate system; it can be seen that the body coordinate system has rotated by the angle θ (equivalent to the body yaw angle) between the two shooting times. The second body coordinate system (s-t coordinate system) is therefore rotated according to the body yaw angle, so that it points in the same direction as the first body coordinate system.
Meanwhile, if the vehicle speed of the host vehicle is not 0, the position of the host vehicle has changed (moved forward or backward) between the first and second shooting times, and the position of the corresponding body coordinate system has changed as well. The travel distance of the host vehicle is obtained from the vehicle speed at the first shooting time, the vehicle speed at the second shooting time, and the time interval between the two shooting times, and the second body coordinate system is then translated to the same position as the first body coordinate system according to that travel distance.
And after rotation processing and translation processing, obtaining a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is the same as the first vehicle body coordinate system corresponding to the first shooting time.
The second body coordinate system corresponding to the second shooting moment is subjected to rotation processing and translation processing according to the yaw angle of the body and the speed information of the vehicle, so that target vehicles of front and rear frames are converted into the same body coordinate system, more accurate motion state judgment can be achieved, and compared with the traditional method, a matrix representing vehicle motion does not need to be calculated, and the calculation amount is reduced.
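The rotation-then-translation of S308, applied to a target point, can be sketched as a 2-D rigid transform; the rotation direction and the form of the ego-displacement inputs are assumptions about conventions the text does not fix:

```python
import math

def to_first_frame(point, yaw, ego_dx, ego_dy):
    """Express a second-shot body-frame point in the first-shot body
    frame: rotate by the body yaw angle, then translate by the host
    vehicle's displacement between the two shots (assumed to be given
    already in the first-shot frame)."""
    x, y = point
    x_rot = x * math.cos(yaw) - y * math.sin(yaw)  # rotation step of S308
    y_rot = x * math.sin(yaw) + y * math.cos(yaw)
    return x_rot + ego_dx, y_rot + ego_dy          # translation step of S308
```

Once both frames' coordinates live in the same system, the position difference of S310 is just the distance between the two points.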
S309, acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system.
And S310, obtaining a position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
And specifically, acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system, and then obtaining a position difference value of the target vehicle in the two frames of target images according to the first body coordinate and the second body coordinate, wherein the position difference value is used for indicating whether the position of the target vehicle changes.
S311, if the absolute speed of the target vehicle is not 0, determining whether the position difference of the target vehicle is 0; if so, executing S312, and if not, executing S313.
Specifically, if it is determined that the absolute speed of the target vehicle is not 0, it may be preliminarily determined that the target vehicle has moved, and it may be determined whether the position difference of the target vehicle is 0, so as to more accurately determine whether the target vehicle has moved.
In an alternative embodiment, if the absolute speed of the target vehicle is 0, it may be determined that the target vehicle does not move according to the absolute speed, and thus the moving state of the target vehicle is determined to be stationary.
And S312, determining that the motion state of the target vehicle is static.
And S313, determining the motion state of the target vehicle as running.
If the position difference of the target vehicle is 0, it may be determined that the motion state of the target vehicle is stationary; if the position difference of the target vehicle is not 0, it may be determined that the motion state of the target vehicle is in travel.
In an alternative embodiment, the position difference of the target vehicle may be determined first, and then the absolute speed of the target vehicle, which is not limited herein, as long as it is ensured that the motion state of the target vehicle is determined to be in travel only under the condition that the absolute speed of the target vehicle is not 0 and the position difference of the target vehicle is not 0.
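Steps S311-S313, together with the optional branch for zero absolute speed, can be sketched as:

```python
def motion_state(absolute_speed, position_diff):
    """S311-S313: the state is 'in-travel' only when both the absolute
    speed and the position difference are non-zero; otherwise it is
    'stationary'."""
    if absolute_speed == 0:
        # Optional branch: zero absolute speed alone implies stationary.
        return "stationary"
    # S311: speed is non-zero, so check the position difference.
    return "stationary" if position_diff == 0 else "in-travel"
```

As the text notes, the two checks may be performed in either order without changing the result.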
The method for identifying the motion state of the target vehicle provided by the embodiment of the invention comprises the following steps: and obtaining the image coordinates of the target vehicle in the two frames of target images on the image coordinate system according to the two frames of target images. And converting the image coordinate systems of the target vehicles in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system. And obtaining the relative travel distance of the target vehicle according to the third body coordinate of the target vehicle corresponding to the first shooting time in the third body coordinate system and the fourth body coordinate of the target vehicle corresponding to the second shooting time in the third body coordinate system. And obtaining the relative speed of the target vehicle according to the time difference value and the relative driving distance between the first shooting moment and the second shooting moment. And obtaining the vehicle body yaw angle of the vehicle according to the vehicle body wheelbase, the vehicle speed information and the wheel deflection information of the vehicle, wherein the vehicle speed, the wheel deflection angle and the vehicle body yaw angle of the vehicle are positively correlated, and the vehicle body wheelbase and the vehicle body yaw angle of the vehicle are negatively correlated. The speed of the vehicle at the first shooting time and the speed of the vehicle at the second shooting time are switched to the same direction according to the yaw angle of the vehicle body. And processing according to the relative speed of the target vehicle and the vehicle speed at the second shooting time after the direction conversion to obtain the absolute speed of the target vehicle. 
And according to the vehicle body yaw angle, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting time of the vehicle, and according to the vehicle speed information of the vehicle, performing translation processing on the second vehicle body coordinate system to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is the same as the first vehicle body coordinate system corresponding to the first shooting time. And acquiring first body coordinates of the target vehicle in a first body coordinate system and second body coordinates in a second processed body coordinate system. And obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate. And if the absolute speed of the target vehicle is not 0, judging whether the position difference value of the target vehicle is 0, and if so, determining that the motion state of the target vehicle is static. If not, determining that the motion state of the target vehicle is in driving. The second body coordinate system corresponding to the second shooting moment is subjected to rotation processing and translation processing, so that target vehicles of front and rear frames are converted into the same body coordinate system, whether the positions of the target vehicles change or not is determined, and more accurate motion state judgment can be achieved.
Fig. 7 is a schematic structural diagram of a target vehicle motion state identification device according to an embodiment of the present invention. As shown in fig. 7, the apparatus 70 includes: a determination module 701, a processing module 702, and an identification module 703.
A determining module 701, configured to determine a relative speed of a target vehicle according to two frames of target images, where the two frames of target images are images of the target vehicle acquired by a vehicle-mounted camera of the vehicle;
the processing module 702 is configured to obtain a vehicle body yaw angle of the vehicle according to vehicle data of the vehicle, where the vehicle data includes vehicle speed information, the vehicle speed information includes speeds of the vehicle at a first shooting time and a second shooting time corresponding to the two frames of target images, respectively, and the vehicle body yaw angle is an angle rotated by the vehicle at the second shooting time relative to the first shooting time;
the processing module 702 is further configured to obtain an absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle, and the vehicle speed information, and obtain a position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information;
the identifying module 703 is configured to identify a motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference.
In one possible design, the processing module 702 is specifically configured to:
switching the speed of the own vehicle at the first photographing time and the speed of the own vehicle at the second photographing time to the same direction according to the yaw angle of the vehicle body;
and processing according to the relative speed of the target vehicle and the speed of the second shooting moment after the direction conversion to obtain the absolute speed of the target vehicle.
In one possible design, the processing module 702 is specifically configured to:
according to the vehicle body yaw angle, performing rotation processing on a second vehicle body coordinate system corresponding to the vehicle at the second shooting time, and according to the vehicle speed information of the vehicle, performing translation processing on the second vehicle body coordinate system to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is the same as the first vehicle body coordinate system corresponding to the first shooting time;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate of the processed second body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
In one possible design, the determining module 701 is specifically configured to:
obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system;
obtaining a relative driving distance of the target vehicle according to a third vehicle body coordinate of the target vehicle corresponding to the first shooting time in the third vehicle body coordinate system and a fourth vehicle body coordinate of the target vehicle corresponding to the second shooting time in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting moment and the second shooting moment and the relative driving distance.
In one possible design, the vehicle data further includes wheel deflection information, and the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frames of target images respectively;
the processing module 702 is specifically configured to:
obtaining the vehicle body yaw angle of the own vehicle according to the vehicle body wheelbase of the own vehicle, the vehicle speed information and the wheel deflection information, wherein the vehicle body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the own vehicle, and negatively correlated with the vehicle body wheelbase of the own vehicle.
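The stated correlations (yaw angle positively correlated with speed and wheel deflection angle, negatively correlated with wheelbase) match a kinematic bicycle model, whose yaw rate is v·tan(δ)/L. The sketch below uses that model with the two samples averaged, which is an assumption rather than the patent's exact formula:

```python
import math

def body_yaw_angle(v1, v2, delta1, delta2, wheelbase, dt):
    """Approximate the body yaw angle accumulated over dt with a
    kinematic bicycle model: yaw_rate = v * tan(delta) / wheelbase,
    with speed and wheel deflection angle averaged over the two
    shooting times."""
    v = 0.5 * (v1 + v2)              # mean speed (m/s)
    delta = 0.5 * (delta1 + delta2)  # mean wheel deflection angle (rad)
    return v * math.tan(delta) / wheelbase * dt
```

Doubling the wheelbase halves the yaw angle, consistent with the negative correlation stated above.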
In one possible design, the identification module 703 is specifically configured to:
if the absolute speed of the target vehicle is not 0, determining whether the position difference value of the target vehicle is 0;
if the position difference value is 0, determining that the motion state of the target vehicle is static;
and if the position difference value is not 0, determining that the motion state of the target vehicle is driving.
In one possible design, the identification module 703 is further configured to:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is static.
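Taken together, the decision logic of the identification module reduces to a small function. The tolerance `eps` is an added assumption to absorb measurement noise, since a computed speed or position difference is rarely exactly 0:

```python
def classify_motion_state(absolute_speed, position_difference, eps=1e-3):
    """Return the motion state of the target vehicle.

    A (near-)zero absolute speed, or a non-zero speed paired with a
    (near-)zero position difference, is reported as static; any other
    combination is reported as driving.
    """
    if abs(absolute_speed) <= eps:
        return "static"
    if abs(position_difference) <= eps:
        return "static"
    return "driving"
```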
The apparatus provided in this embodiment may be used to implement the technical solutions of the above method embodiments, and the implementation principles and technical effects are similar, which are not described herein again.
Fig. 8 is a schematic diagram of the hardware structure of a target vehicle motion state recognition apparatus according to an embodiment of the present invention. As shown in Fig. 8, the target vehicle motion state recognition apparatus 80 of this embodiment includes a processor 801 and a memory 802, wherein:
the memory 802 is configured to store computer-executable instructions;
the processor 801 is configured to execute the computer-executable instructions stored in the memory to implement the steps of the target vehicle motion state identification method in the above embodiments. For details, reference may be made to the related description of the above method embodiments.
Alternatively, the memory 802 may be separate from, or integrated with, the processor 801.
When the memory 802 is provided separately, the target vehicle motion state identification device further includes a bus 803 for connecting the memory 802 and the processor 801.
An embodiment of the present invention further provides a computer-readable storage medium in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the target vehicle motion state identification method performed by the above target vehicle motion state identification apparatus is implemented.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; the division into modules is only a logical division, and other divisions are possible in practice: multiple modules may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in another form.
The integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present application.
It should be understood that the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor.
The memory may include a high-speed RAM, and may further include a non-volatile memory (NVM) such as at least one magnetic disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disc, or the like.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (16)

1. A method for identifying a motion state of a target vehicle is characterized by comprising the following steps:
determining the relative speed of a target vehicle according to two frames of target images, wherein the two frames of target images are images which are acquired by a vehicle-mounted camera of the vehicle and comprise the target vehicle;
obtaining a vehicle body yaw angle of the own vehicle according to vehicle data of the own vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the own vehicle at a first shooting time and a second shooting time respectively corresponding to the two frames of target images, and the vehicle body yaw angle is the angle through which the own vehicle has rotated at the second shooting time relative to the first shooting time;
obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information;
and identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value.
2. The method of claim 1, wherein the deriving an absolute speed of the target vehicle from the relative speed of the target vehicle, the body yaw angle, and the vehicle speed information comprises:
converting the speed of the own vehicle at the first shooting time and the speed of the own vehicle at the second shooting time into the same direction according to the vehicle body yaw angle;
and obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
3. The method of claim 1, wherein said deriving a difference in position of said target vehicle in said two frames of target images based on said vehicle body yaw angle and said vehicle speed information comprises:
rotating a second vehicle body coordinate system corresponding to the own vehicle at the second shooting time according to the vehicle body yaw angle, and translating the second vehicle body coordinate system according to the vehicle speed information of the own vehicle, to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system coincides with a first vehicle body coordinate system corresponding to the first shooting time;
acquiring a first vehicle body coordinate of the target vehicle in the first vehicle body coordinate system and a second vehicle body coordinate of the target vehicle in the processed second vehicle body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
4. The method of claim 1, wherein determining a relative speed of a target vehicle from the two frames of target images comprises:
obtaining image coordinates, in an image coordinate system, of the target vehicle in the two frames of target images according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into coordinates in a third vehicle body coordinate system according to the correspondence between the image coordinate system and the vehicle body coordinate system;
obtaining a relative driving distance of the target vehicle according to a third vehicle body coordinate of the target vehicle corresponding to the first shooting time in the third vehicle body coordinate system and a fourth vehicle body coordinate of the target vehicle corresponding to the second shooting time in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting moment and the second shooting moment and the relative driving distance.
5. The method according to claim 1, wherein the vehicle data further includes wheel deflection information including wheel deflection angles of the own vehicle at first and second photographing times corresponding to the two frames of target images, respectively;
wherein the obtaining a vehicle body yaw angle of the own vehicle according to vehicle data, sent by a sensor of the own vehicle, corresponding to the shooting times of the two frames of target images comprises:
obtaining the vehicle body yaw angle of the own vehicle according to the vehicle body wheelbase of the own vehicle, the vehicle speed information and the wheel deflection information, wherein the vehicle body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the own vehicle, and negatively correlated with the vehicle body wheelbase of the own vehicle.
6. The method of claim 1, wherein identifying the motion state of the target vehicle based on the absolute velocity of the target vehicle and the difference in the position of the target vehicle in the two target images comprises:
if the absolute speed of the target vehicle is not 0, determining whether the position difference value of the target vehicle is 0;
if the position difference value is 0, determining that the motion state of the target vehicle is static;
and if the position difference value is not 0, determining that the motion state of the target vehicle is driving.
7. The method of claim 6, further comprising:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is static.
8. A target vehicle motion state recognition device characterized by comprising:
a determining module, configured to determine the relative speed of a target vehicle according to two frames of target images, wherein the two frames of target images are images which are acquired by a vehicle-mounted camera of the own vehicle and which include the target vehicle;
a processing module, configured to obtain a vehicle body yaw angle of the own vehicle according to vehicle data of the own vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the own vehicle at a first shooting time and a second shooting time respectively corresponding to the two frames of target images, and the vehicle body yaw angle is the angle through which the own vehicle has rotated at the second shooting time relative to the first shooting time;
wherein the processing module is further configured to obtain the absolute speed of the target vehicle according to the relative speed of the target vehicle, the vehicle body yaw angle and the vehicle speed information, and to obtain the position difference value of the target vehicle in the two frames of target images according to the vehicle body yaw angle and the vehicle speed information;
and an identification module, configured to identify the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value.
9. The apparatus of claim 8, wherein the processing module is specifically configured to:
converting the speed of the own vehicle at the first shooting time and the speed of the own vehicle at the second shooting time into the same direction according to the vehicle body yaw angle;
and obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
10. The apparatus of claim 8, wherein the processing module is specifically configured to:
rotating a second vehicle body coordinate system corresponding to the own vehicle at the second shooting time according to the vehicle body yaw angle, and translating the second vehicle body coordinate system according to the vehicle speed information of the own vehicle, to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system coincides with a first vehicle body coordinate system corresponding to the first shooting time;
acquiring a first vehicle body coordinate of the target vehicle in the first vehicle body coordinate system and a second vehicle body coordinate of the target vehicle in the processed second vehicle body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinate and the second vehicle body coordinate.
11. The apparatus of claim 8, wherein the determining module is specifically configured to:
obtaining image coordinates, in an image coordinate system, of the target vehicle in the two frames of target images according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into coordinates in a third vehicle body coordinate system according to the correspondence between the image coordinate system and the vehicle body coordinate system;
obtaining a relative driving distance of the target vehicle according to a third vehicle body coordinate of the target vehicle corresponding to the first shooting time in the third vehicle body coordinate system and a fourth vehicle body coordinate of the target vehicle corresponding to the second shooting time in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting moment and the second shooting moment and the relative driving distance.
12. The apparatus according to claim 8, wherein the vehicle data further includes wheel deflection information including wheel deflection angles of the own vehicle at first and second photographing times corresponding to the two frames of target images, respectively;
the processing module is specifically configured to:
obtaining the vehicle body yaw angle of the own vehicle according to the vehicle body wheelbase of the own vehicle, the vehicle speed information and the wheel deflection information, wherein the vehicle body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the own vehicle, and negatively correlated with the vehicle body wheelbase of the own vehicle.
13. The apparatus of claim 8, wherein the identification module is specifically configured to:
if the absolute speed of the target vehicle is not 0, determining whether the position difference value of the target vehicle is 0;
if the position difference value is 0, determining that the motion state of the target vehicle is static;
and if the position difference value is not 0, determining that the motion state of the target vehicle is driving.
14. The apparatus of claim 13, wherein the identification module is further configured to:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is static.
15. A target vehicle motion state recognition device characterized by comprising:
a memory for storing a program;
a processor for executing the program stored by the memory, the processor being configured to perform the method of any of claims 1 to 7 when the program is executed.
16. A computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN201910879023.7A 2019-09-18 2019-09-18 Target vehicle motion state identification method and device Active CN112529935B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910879023.7A CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910879023.7A CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Publications (2)

Publication Number Publication Date
CN112529935A true CN112529935A (en) 2021-03-19
CN112529935B CN112529935B (en) 2023-05-16

Family

ID=74974931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910879023.7A Active CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Country Status (1)

Country Link
CN (1) CN112529935B (en)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003149256A (en) * 2001-11-14 2003-05-21 Mitsubishi Electric Corp Vehicle speed measuring instrument
JP2008026030A (en) * 2006-07-18 2008-02-07 Denso Corp Vehicle obstacle detector and vehicle control system
JP2010198552A (en) * 2009-02-27 2010-09-09 Konica Minolta Holdings Inc Driving state monitoring device
CN104865579A (en) * 2014-02-21 2015-08-26 株式会社电装 Vehicle-installed Obstacle Detection Apparatus Having Function For Judging Motion Condition Of Detected Object
US20150239472A1 (en) * 2014-02-21 2015-08-27 Denso Corporation Vehicle-installed obstacle detection apparatus having function for judging motion condition of detected object
CN106054191A (en) * 2015-04-06 2016-10-26 通用汽车环球科技运作有限责任公司 Wheel detection and its application in object tracking and sensor registration
US20180082581A1 (en) * 2016-09-16 2018-03-22 Kabushiki Kaisha Toshiba Travel speed calculation device and travel speed calculation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SONG W ET AL.: "Real-time obstacles detection and status classification for collision warning in a vehicle active safety system", IEEE Transactions on Intelligent Transportation Systems *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022246767A1 (en) * 2021-05-27 2022-12-01 华为技术有限公司 Method and device for determining steering intention of target vehicle
CN113239464A (en) * 2021-06-02 2021-08-10 北京汽车集团越野车有限公司 Method and device for determining section of vehicle body
CN113239464B (en) * 2021-06-02 2024-01-30 北京汽车集团越野车有限公司 Method and device for determining vehicle body section
CN116088014A (en) * 2023-03-03 2023-05-09 北京理工大学 Main car information coordinate system conversion method and system based on longitude and latitude information
CN116088014B (en) * 2023-03-03 2023-07-04 北京理工大学 Main car information coordinate system conversion method and system based on longitude and latitude information

Also Published As

Publication number Publication date
CN112529935B (en) 2023-05-16

Similar Documents

Publication Publication Date Title
CN109435942B (en) Information fusion-based parking space line and parking space recognition method and device
JP6785620B2 (en) Predictive suspension control for vehicles using stereo camera sensors
CN112529935B (en) Target vehicle motion state identification method and device
US10776946B2 (en) Image processing device, object recognizing device, device control system, moving object, image processing method, and computer-readable medium
JP6316161B2 (en) In-vehicle image processing device
CN110555407B (en) Pavement vehicle space identification method and electronic equipment
US9396553B2 (en) Vehicle dimension estimation from vehicle images
JP6769477B2 (en) Image processing device, imaging device, mobile device control system, image processing method, and program
US10832428B2 (en) Method and apparatus for estimating a range of a moving object
CN111572633B (en) Steering angle detection method, device and system
CN111386530A (en) Vehicle detection method and apparatus
EP3259732A1 (en) Method and device for stabilization of a surround view image
CN110197104B (en) Distance measurement method and device based on vehicle
CN108376384B (en) Method and device for correcting disparity map and storage medium
CN111160070A (en) Vehicle panoramic image blind area eliminating method and device, storage medium and terminal equipment
JP5832850B2 (en) Lane monitoring system and lane monitoring method
CN114037977B (en) Road vanishing point detection method, device, equipment and storage medium
JP2017211791A (en) Image processing device, imaging device, mobile body equipment control system, image processing method, and program
CN106874837B (en) Vehicle detection method based on video image processing
CN114494200A (en) Method and device for measuring trailer rotation angle
JP4144464B2 (en) In-vehicle distance calculation device
CN114863096A (en) Semantic map construction and positioning method and device for indoor parking lot
JP7021131B2 (en) Posture estimation device, posture estimation method, posture estimation program and recording medium
Deigmoeller et al. Road Surface Scanning using Stereo Cameras for Motorcycles.
JP4847303B2 (en) Obstacle detection method, obstacle detection program, and obstacle detection apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant