CN112529935B - Target vehicle motion state identification method and device - Google Patents


Info

Publication number
CN112529935B
CN112529935B (granted publication of application CN201910879023.7A)
Authority
CN
China
Prior art keywords
vehicle, target, target vehicle, speed, coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910879023.7A
Other languages
Chinese (zh)
Other versions
CN112529935A (en)
Inventor
商燕 (Shang Yan)
黄洋文 (Huang Yangwen)
Current Assignee
Shanghai Goldway Intelligent Transportation System Co Ltd
Original Assignee
Shanghai Goldway Intelligent Transportation System Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Goldway Intelligent Transportation System Co Ltd filed Critical Shanghai Goldway Intelligent Transportation System Co Ltd
Priority to CN201910879023.7A
Publication of CN112529935A
Application granted
Publication of CN112529935B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the invention provide a target vehicle motion state identification method and device. The method includes: determining the relative speed of a target vehicle according to two frames of target images; obtaining the body yaw angle of the host vehicle according to vehicle data of the host vehicle, the body yaw angle being the angle through which the host vehicle has rotated at the second shooting moment relative to the first shooting moment; obtaining the absolute speed of the target vehicle according to its relative speed, the body yaw angle and the vehicle speed information, and obtaining the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the vehicle speed information; and identifying the motion state of the target vehicle according to its absolute speed and position difference. Because the absolute speed and the position difference are obtained directly from the relative speed, the body yaw angle and the vehicle speed information, the motion state of the target vehicle can be judged directly, which reduces the amount of calculation.

Description

Target vehicle motion state identification method and device
Technical Field
The embodiment of the invention relates to computer technology, and in particular to a target vehicle motion state identification method and device.
Background
With the continuous development of vehicle-related research, identifying the running state of target vehicles on the road plays an increasingly important role, and can provide preliminary information for subsequent technologies such as track association, target identification and tracking.
At present, identifying the motion state of a target vehicle usually requires matching corner points between two frames of target images: corner points are first detected on the first frame, an optical flow algorithm is used to find the matching points on the second frame, a matrix representing the camera motion is then calculated from the relationship between the matched points of the two frames, and the epipolar geometry between images on a moving platform is used to compensate for or estimate the motion of the camera, thereby identifying the motion state of the target vehicle.
However, both finding the matching points with the optical flow algorithm and calculating the camera motion matrix from the corner-point correspondences are complex processes, so the amount of calculation is large and the processing efficiency is low.
Disclosure of Invention
The embodiment of the invention provides a method and a device for identifying the motion state of a target vehicle, so as to solve the problem of the large amount of calculation required to identify the motion state of a target vehicle.
In a first aspect, an embodiment of the present invention provides a method for identifying a motion state of a target vehicle, including:
determining the relative speed of a target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle, which are acquired by a vehicle-mounted camera of the vehicle;
obtaining a body yaw angle of the vehicle according to vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting moment and a second shooting moment which correspond to the two frames of target images respectively, and the body yaw angle is an angle of the vehicle rotating relative to the first shooting moment at the second shooting moment;
obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information;
and identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value.
In one possible design, the obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information includes:
Converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time into the same direction according to the yaw angle of the vehicle body;
and processing according to the relative speed of the target vehicle and the speed of the second shooting moment after the direction conversion to obtain the absolute speed of the target vehicle.
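These two steps can be sketched as follows. This is a minimal illustration assuming a planar two-dimensional velocity model; the function names and sign conventions are hypothetical, since the patent does not fix them:

```python
import math

def rotate(v, theta):
    """Rotate a 2-D vector v = (vx, vy) by theta radians (counter-clockwise)."""
    vx, vy = v
    c, s = math.cos(theta), math.sin(theta)
    return (c * vx - s * vy, s * vx + c * vy)

def absolute_velocity(v_rel, v_host_t2, body_yaw):
    """Absolute velocity of the target vehicle.

    v_rel      -- relative velocity of the target, expressed in the direction
                  of the first shooting moment (from the two target images)
    v_host_t2  -- host vehicle velocity at the second shooting moment
    body_yaw   -- angle the host body rotated between the two moments

    The host velocity at t2 is rotated back into the direction used at t1 so
    that both velocities are expressed in the same direction, then summed.
    """
    vh = rotate(v_host_t2, body_yaw)
    return (v_rel[0] + vh[0], v_rel[1] + vh[1])
```

For example, a target pulling away at 2 m/s relative to a host driving straight ahead at 10 m/s with no yaw would have an absolute velocity of (12, 0) m/s.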
In one possible design, the obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information includes:
according to the yaw angle of the vehicle body, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting moment of the vehicle body, and performing translation processing on the second vehicle body coordinate system according to the speed information of the vehicle body to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is identical to a first vehicle body coordinate system corresponding to the first shooting moment;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinates and the second vehicle body coordinates.
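A sketch of this rotation-and-translation of the second body coordinate system (all names are hypothetical; the host displacement between the two moments is assumed to be integrated from the vehicle speed information):

```python
import math

def second_frame_to_first(p_t2, body_yaw, host_disp):
    """Map a point observed in the second body coordinate system into the
    first one: rotate by the body yaw angle, then translate by the host
    vehicle's displacement between the two shooting moments (expressed in
    the first body coordinate system)."""
    x, y = p_t2
    c, s = math.cos(body_yaw), math.sin(body_yaw)
    return (c * x - s * y + host_disp[0], s * x + c * y + host_disp[1])

def position_difference(p_t1, p_t2, body_yaw, host_disp):
    """Distance between the target's first-frame position and its
    second-frame position re-expressed in the first body coordinate system.
    Zero (up to noise) means the target has not moved in the world."""
    q = second_frame_to_first(p_t2, body_yaw, host_disp)
    return math.hypot(q[0] - p_t1[0], q[1] - p_t1[1])
```

For instance, a stationary obstacle seen 20 m ahead at the first moment and 15 m ahead at the second, while the host drove 5 m straight, yields a position difference of zero.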
In one possible design, the determining the relative speed of the target vehicle according to the two frames of target images includes:
obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinate system of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system;
obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting time and the second shooting time and the relative running distance.
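The steps above can be sketched as follows. The 3x3 matrix H standing in for the image-to-body correspondence is an assumption (for example a ground-plane homography from an offline camera calibration); the patent does not specify how the correspondence is obtained:

```python
import math

def image_to_body(pt, H):
    """Apply a 3x3 planar homography H mapping image pixel coordinates to
    road-plane coordinates in the body coordinate system."""
    u, v = pt
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

def relative_speed(img_t1, img_t2, H, t1, t2):
    """Relative speed of the target: convert both image positions to body
    coordinates, take the relative distance travelled, and divide by the
    time difference between the two shooting moments."""
    p1 = image_to_body(img_t1, H)
    p2 = image_to_body(img_t2, H)
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / (t2 - t1)
```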
In one possible design, the vehicle data further includes wheel deflection information, where the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frame target images, respectively;
the obtaining of the body yaw angle of the host vehicle according to the vehicle data, sent by a sensor of the vehicle and corresponding to the shooting moments of the two frames of target images, comprises the following steps:
and obtaining the body yaw angle of the vehicle according to the body wheelbase of the vehicle, the vehicle speed information and the wheel deflection information, wherein the vehicle speed of the vehicle, the wheel deflection angle and the body yaw angle are positively correlated, and the body wheelbase of the vehicle and the body yaw angle are negatively correlated.
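The stated correlations (positive with speed and wheel deflection angle, negative with wheelbase) match the kinematic bicycle model; the following is a sketch under that assumption, since the patent does not give the exact formula:

```python
import math

def body_yaw_angle(wheelbase, speeds, wheel_angles, dt):
    """Kinematic bicycle model: yaw rate = v * tan(delta) / wheelbase.
    The angle turned between the two shooting moments is approximated by
    integrating the average of the yaw rates at the two samples over dt.

    speeds       -- (v_t1, v_t2), host speeds at the two shooting moments
    wheel_angles -- (delta_t1, delta_t2), wheel deflection angles in radians
    """
    r1 = speeds[0] * math.tan(wheel_angles[0]) / wheelbase
    r2 = speeds[1] * math.tan(wheel_angles[1]) / wheelbase
    return 0.5 * (r1 + r2) * dt
```

Consistent with the claim, the result grows with speed and wheel angle and shrinks as the wheelbase grows.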
In one possible design, the identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value of the target vehicle in the two frames of target images includes:
if the absolute speed of the target vehicle is not 0, judging whether the position difference value of the target vehicle is 0 or not;
if yes, determining that the motion state of the target vehicle is stationary;
if not, determining the motion state of the target vehicle as running.
In one possible design, the method further comprises:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
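Taken together, the last two designs amount to the following decision rule (eps is a hypothetical noise tolerance; the claims themselves compare against exactly 0):

```python
def identify_motion_state(abs_speed, pos_diff, eps=1e-3):
    """Stationary if the absolute speed is (near) zero, or if it is non-zero
    but the position difference between the two frames is (near) zero;
    otherwise the target vehicle is running."""
    if abs(abs_speed) < eps:
        return "stationary"
    return "stationary" if abs(pos_diff) < eps else "running"
```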
In a second aspect, an embodiment of the present invention provides a target vehicle motion state identifying apparatus, including:
The determining module is used for determining the relative speed of the target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle, and the images are acquired by a vehicle-mounted camera of the vehicle;
the processing module is used for obtaining a body yaw angle of the vehicle according to the vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting moment and a second shooting moment corresponding to the two frames of target images respectively, and the body yaw angle is an angle of the vehicle rotating relative to the first shooting moment at the second shooting moment;
the processing module is further used for obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information;
and the identification module is used for identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value.
In one possible design, the processing module is specifically configured to:
Converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time into the same direction according to the yaw angle of the vehicle body;
and processing according to the relative speed of the target vehicle and the speed of the second shooting moment after the direction conversion to obtain the absolute speed of the target vehicle.
In one possible design, the processing module is specifically configured to:
according to the yaw angle of the vehicle body, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting moment of the vehicle body, and performing translation processing on the second vehicle body coordinate system according to the speed information of the vehicle body to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is identical to a first vehicle body coordinate system corresponding to the first shooting moment;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinates and the second vehicle body coordinates.
In one possible design, the determining module is specifically configured to:
Obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinate system of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system;
obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting time and the second shooting time and the relative running distance.
In one possible design, the vehicle data further includes wheel deflection information, where the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frame target images, respectively;
the processing module is specifically configured to:
and obtaining the body yaw angle of the vehicle according to the body wheelbase of the vehicle, the vehicle speed information and the wheel deflection information, wherein the vehicle speed of the vehicle, the wheel deflection angle and the body yaw angle are positively correlated, and the body wheelbase of the vehicle and the body yaw angle are negatively correlated.
In one possible design, the identification module is specifically configured to:
if the absolute speed of the target vehicle is not 0, judging whether the position difference value of the target vehicle is 0 or not;
if yes, determining that the motion state of the target vehicle is stationary;
if not, determining the motion state of the target vehicle as running.
In one possible design, the identification module is further configured to:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
In a third aspect, an embodiment of the present invention provides a target vehicle motion state identifying apparatus, including:
a memory for storing a program;
a processor for executing the program stored by the memory, the processor being adapted to perform the method of the first aspect and any of the various possible designs of the first aspect as described above when the program is executed.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect above and any of the various possible designs of the first aspect.
The embodiment of the invention provides a target vehicle motion state identification method and device. The method includes: determining the relative speed of a target vehicle according to two frames of target images, the two frames of target images being images including the target vehicle acquired by a vehicle-mounted camera of the host vehicle; obtaining the body yaw angle of the host vehicle according to vehicle data of the host vehicle, the vehicle data including vehicle speed information, the vehicle speed information including the speeds of the host vehicle at the first and second shooting moments corresponding to the two frames of target images, and the body yaw angle being the angle through which the vehicle body has rotated at the second shooting moment relative to the first shooting moment; obtaining the absolute speed of the target vehicle according to its relative speed, the body yaw angle and the vehicle speed information, and obtaining the position difference of the target vehicle between the two frames of target images according to the body yaw angle and the vehicle speed information; and identifying the motion state of the target vehicle according to its absolute speed and position difference.
Because the relative speed of the target vehicle is determined from the two frames of target images, and its absolute speed and position difference are obtained from the relative speed, the body yaw angle and the vehicle speed information, the motion state of the target vehicle can be judged directly: no matching points need to be found and no camera motion matrix needs to be calculated, which reduces the amount of calculation and improves the efficiency of identifying the motion state of the target vehicle.
Drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a first schematic view of a scene of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 2 is a first flowchart of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 3 is a second flowchart of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 4 is a second schematic view of a scene of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 5 is a third schematic view of a scene of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 6 is a fourth schematic view of a scene of the target vehicle motion state identification method provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a target vehicle motion state identification device provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of the hardware structure of a target vehicle motion state identification device provided by an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a first schematic view of a scene of the target vehicle motion state identification method provided by an embodiment of the present invention. As shown in fig. 1, a vehicle-mounted camera is mounted on the host vehicle and used to capture the road situation in front of the vehicle. The vehicle-mounted camera may be, for example, a driving recorder, or a camera built into the vehicle; as long as it can capture the road situation in front of the vehicle, its specific implementation is not limited.
Specifically, the vehicle-mounted camera sends the captured images to a server in real time, and the server processes them to identify the motion state of the target vehicle. In the present invention, a target vehicle is any vehicle captured by the vehicle-mounted camera; as those skilled in the art will understand, every vehicle on the road that the camera can capture (car, truck, electric vehicle, bicycle, etc.) can be considered a target vehicle. As shown in fig. 1, the vehicles 102, 103 and 104 are all target vehicles.
In an alternative embodiment, after the server identifies the motion state of the target vehicle, the motion state of the target vehicle may be sent to the host vehicle, so that a driver of the host vehicle may timely acquire the motion state of the target vehicle ahead, thereby improving driving safety.
At present, when identifying the motion state of a target vehicle, the conventional method generally uses the epipolar geometry between images on a moving platform, such as a planar homography matrix, a fundamental matrix or a trifocal tensor, to compensate for or estimate the motion of the vehicle-mounted camera. The method matches corner points between two frames of images: corner points are first detected on the first frame using a corner detection algorithm, the matching points on the second frame are found with an optical flow algorithm, and a matrix representing the camera motion is then calculated from the relationship between the matched points of the two frames.
However, the conventional method has the following disadvantages: 1. It requires a three-dimensional sensor to acquire depth information for the corner points, which limits its application scenarios. 2. Finding the matching points with the optical flow algorithm and calculating the camera motion matrix from the corner-point correspondences are complex processes, so the amount of calculation is very large.
In order to solve these problems in the prior art, the present invention provides a method for identifying the motion state of a target vehicle that avoids the large amount of calculation required by the prior art. The method is described below with reference to specific embodiments, first with reference to fig. 2. Fig. 2 is a first flowchart of the target vehicle motion state identification method provided by an embodiment of the present invention; as shown in fig. 2, the method includes:
s201, determining the relative speed of a target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle, and the images are acquired by an onboard camera of the vehicle.
In this embodiment, the vehicle-mounted camera sends the captured image data to the server in real time, and it will be understood by those skilled in the art that the image data captured by the vehicle-mounted camera may be video, and the server processes the video captured by the vehicle-mounted camera, thereby obtaining a multi-frame image.
Specifically, the server determines the relative speed of the target vehicle according to two of the target images, wherein the target image is an image including the target vehicle, and it should be noted that, in this embodiment, the target vehicle refers to all vehicles included in the image captured by the vehicle-mounted camera, and is not a specific vehicle, that is, the image may be considered as the target image as long as the vehicle is included in the image.
In an alternative implementation manner, the two frame target images may be any two frame target images, for example, the two frame target images may be two adjacent frame target images, or may be two spaced frame target images, which is not limited in this embodiment, as long as the two frame target images include a target vehicle.
If the target images include a plurality of target vehicles, each target vehicle is processed separately to determine its relative speed. Taking target vehicle A as an example, the distance advanced by the target vehicle can be determined from the two frames of target images, and its relative speed then determined from the time interval between the two frames; alternatively, the relative speed may be determined from the distance between the target vehicle and the host vehicle in the two frames, the speed of the host vehicle, and so on. This embodiment does not limit the specific implementation.
S202, obtaining the yaw angle of the vehicle body of the vehicle according to the vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting moment and a second shooting moment corresponding to two frames of target images respectively, and the yaw angle of the vehicle body is the rotation angle of the vehicle at the second shooting moment relative to the first shooting moment.
In this embodiment, a vehicle sensor is provided in the host vehicle to detect its vehicle data. Specifically, taking the moment corresponding to the earlier of the two frames of target images as the first shooting moment and the moment corresponding to the later frame as the second shooting moment, the sensor can send the speed of the host vehicle at the first shooting moment and at the second shooting moment, thereby providing the vehicle speed information of the host vehicle.
Specifically, the vehicle data is data reflecting the running state of the host vehicle. It may further include, for example, wheel deflection information, which may include the wheel deflection angles of the host vehicle at the first and second shooting moments, as well as parameters of the vehicle (such as the body wheelbase).
In one possible implementation, for example, the yaw angle of the vehicle body may be obtained by processing the vehicle speed information, the wheel yaw information, and the preset model, or the yaw angle of the vehicle body may be directly obtained from a difference between the wheel yaw angles at the first photographing time and the second photographing time, which is not limited herein, so long as the angle at which the vehicle body rotates at the second photographing time relative to the first photographing time can be obtained from the vehicle data.
S203, obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information.
Specifically, the relative speed of the target vehicle may be understood as the running speed of the target vehicle relative to the host vehicle (that is, treating the host vehicle as stationary), whereas the absolute speed of the target vehicle is its actual running speed.
In one possible implementation, for example, the directions of the host vehicle's speeds at the first and second shooting moments may be converted into the same direction according to the body yaw angle, and the relative speed of the target vehicle (whose direction, as obtained from the two frames of target images, is in practice the same as the direction of the speed in the earlier frame) may then be added to or subtracted from the direction-converted speed of the host vehicle, thereby obtaining the absolute speed of the target vehicle.
Alternatively, the direction of the relative speed of the target vehicle and the direction of the speed at the second shooting moment may be converted into the same direction according to the body yaw angle, thereby obtaining the absolute speed of the target vehicle at the second shooting moment.
Meanwhile, in the embodiment, according to the yaw angle and the vehicle speed information of the vehicle body, a position difference value of the target vehicle in the two frames of target images is obtained, wherein the position difference value is used for indicating whether the position of the target vehicle in the two frames of target images changes or not.
S204, identifying the motion state of the target vehicle according to the absolute speed and the position difference value of the target vehicle.
In the present embodiment, the motion state of the target vehicle may include stationary and running. When the absolute speed of the target vehicle is 0 and the position difference is also 0, the target vehicle may be considered not to have moved, and the motion state of the target vehicle is therefore determined to be stationary; otherwise, as long as one of the two is not 0, the target vehicle is considered to have moved, and the motion state of the target vehicle is determined to be running.
The motion state of the target vehicle is jointly identified according to the absolute speed and the position difference value of the target vehicle, so that the accuracy of the motion state identification of the target vehicle is ensured, and the erroneous judgment caused by identification according to only one parameter is avoided.
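This joint decision can be sketched as a minimal function; the tolerance thresholds and the function name below are illustrative assumptions, not values from this disclosure:

```python
# Hedged sketch of the joint decision in S204: the target vehicle is
# classified as stationary only when both the absolute speed and the
# position difference are (near) zero; otherwise it is running.
SPEED_EPS = 0.1   # m/s, illustrative tolerance below which speed is treated as 0
POS_EPS = 0.05    # m, illustrative tolerance below which the position difference is treated as 0

def classify_motion_state(abs_speed: float, pos_diff: float) -> str:
    """Return 'stationary' only when both indicators are (near) zero."""
    if abs(abs_speed) < SPEED_EPS and abs(pos_diff) < POS_EPS:
        return "stationary"
    return "running"
```

Using both indicators, as the text notes, avoids a misjudgment that either one alone could cause.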
In an alternative embodiment, the movement state of the target vehicle may further include an initial start state, a high-speed driving state, an impending stop state, and the like, which is not limited in this embodiment, and the movement state of the target vehicle may be determined by comparing the absolute speed and the position difference with a preset threshold interval, for example, which is not limited in this embodiment.
The method for identifying the motion state of the target vehicle provided by the embodiment of the invention comprises the following steps: and determining the relative speed of the target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle and acquired by an onboard camera of the vehicle. And obtaining the yaw angle of the vehicle body of the vehicle according to the vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting time and a second shooting time corresponding to two frames of target images respectively, and the yaw angle of the vehicle body is the angle of the vehicle body rotating relative to the first shooting time at the second shooting time. And obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information. The motion state of the target vehicle is identified based on the absolute speed and the position difference of the target vehicle. The relative speed of the target vehicle is determined according to the two frames of target images, the absolute speed of the target vehicle is obtained according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and the position difference value of the target vehicle is obtained, so that the motion state of the target vehicle is directly judged, a matching point is not required to be found, a camera motion matrix is not required to be calculated, the calculated amount is reduced, and the recognition efficiency of the motion state of the target vehicle is improved.
On the basis of the above embodiment, the method for identifying the motion state of the target vehicle according to the embodiment of the present invention will be described in further detail with reference to fig. 3 to 6, where fig. 3 is a flowchart of a second method for identifying the motion state of the target vehicle according to the embodiment of the present invention, fig. 4 is a schematic diagram of a second scene of the method for identifying the motion state of the target vehicle according to the embodiment of the present invention, fig. 5 is a schematic diagram of a third scene of the method for identifying the motion state of the target vehicle according to the embodiment of the present invention, and fig. 6 is a schematic diagram of a fourth scene of the method for identifying the motion state of the target vehicle according to the embodiment of the present invention.
As shown in fig. 3, the method includes:
S301, obtaining image coordinates of the target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images.
Specifically, the two frames of target images are images including the target vehicle acquired by the on-board camera of the host vehicle. The image coordinate system may be understood as a planar rectangular coordinate system, or may also be a three-dimensional coordinate system; it may take the center point of the image as its origin, or any vertex of the image, selected according to actual requirements. After the image coordinate system is established, the image coordinates of the target vehicle on the image coordinate system are acquired, such as the image coordinates of each pixel of the vehicle, or the image coordinates of the 4 corner points of the vehicle.
In one possible implementation manner, two frames of target images may be input to a target detection module and a target tracking module, where the target detection module is configured to detect a target vehicle included in the target images, and the target tracking module is configured to detect whether the target vehicles included in the two frames of target images are the same vehicle, and determine, through the target detection module and the target tracking module, a location of the target vehicle in an image coordinate system, so as to obtain image coordinates of the target vehicle.
S302, converting the image coordinates of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the correspondence between the image coordinate system and the vehicle body coordinate system.
In this embodiment, there is a correspondence between the image coordinate system and the vehicle body coordinate system, for example, a correspondence matrix between the image coordinate system and the vehicle body coordinate system may be obtained according to the inverse perspective projection method and camera parameters (focal length, erection height, etc. of the vehicle-mounted camera), and then the image coordinate system is converted into a third vehicle body coordinate system according to the correspondence matrix.
Specifically, the vehicle body coordinate system is a special dynamic coordinate system for describing the motion of the vehicle, the origin of the vehicle coordinate system coincides with the mass center of the vehicle, when the vehicle is in a static state on a horizontal road surface, the X axis of the vehicle body coordinate system is parallel to the ground and points to the front of the vehicle, the Z axis of the vehicle body coordinate system points to the upper side through the mass center of the vehicle, and the Y axis points to the left side of a driver.
As will be appreciated by those skilled in the art, each vehicle corresponds to a respective vehicle body coordinate system, and in this embodiment, the third vehicle body coordinate system is a vehicle body coordinate system corresponding to the host vehicle to which the image coordinate system is converted.
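The conversion in S302 can be sketched by applying the correspondence matrix to an image point. The use of a 3x3 homography (as produced by inverse perspective mapping) and the matrix values below are illustrative assumptions; a real matrix would come from the camera's focal length, erection height, and other parameters:

```python
# Illustrative sketch of S302: map a pixel (u, v) to the host vehicle's body
# coordinate system through a 3x3 correspondence matrix H obtained offline
# from inverse perspective mapping and camera parameters. H here holds
# made-up values, not a real calibration.
def image_to_body(H, u, v):
    """Apply homography H to pixel (u, v); return (x, y) in body coordinates."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w  # homogeneous normalization

H = [[0.02, 0.0, -6.4],    # hypothetical calibration values
     [0.0, 0.05, -12.0],
     [0.0, 0.0, 1.0]]
```

With this example matrix, the image center pixel (320, 240) maps to the body-frame origin.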
And S303, obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system.
Specifically, in the two frames of target images, the target vehicle in the previous frame of target image corresponds to a first image coordinate, the target vehicle in the next frame of target image corresponds to a second image coordinate, after the image coordinate is converted into a third vehicle body coordinate system, the corresponding first image coordinate is converted into a third vehicle body coordinate in the third vehicle body coordinate system, the second image coordinate is converted into a fourth vehicle body coordinate in the third vehicle body coordinate system, wherein the third vehicle body coordinate and the fourth vehicle body coordinate are coordinates of the target vehicle corresponding to the first shooting time and the second shooting time in the third vehicle body coordinate system.
When the target vehicle is placed in the third body coordinate system of the host vehicle, the host vehicle may be regarded as stationary (it is not actually stationary) so as to acquire the relative travel distance of the target vehicle; that is, the movement of the target vehicle takes the host vehicle as a reference. Because the movement of the target vehicle is measured in the same third body coordinate system (both coordinates belong to the host vehicle), even if the host vehicle is moving, this embodiment acquires only the relative travel distance of the target vehicle in the third body coordinate system.
In an alternative embodiment, assuming that the body coordinate systems of the first photographing moment and the second photographing moment are moved or rotated, the body coordinate systems may be processed (e.g., moved or rotated correspondingly) such that the first photographing moment and the second photographing moment correspond to the same body coordinate system.
S304, obtaining the relative speed of the target vehicle according to the time difference value between the first shooting time and the second shooting time and the relative driving distance.
Specifically, a time difference exists between the first shooting time and the second shooting time, and the relative speed of the target vehicle can be obtained according to the time difference and the relative driving distance.
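Steps S303–S304 can be sketched directly: the Euclidean distance between the two body-frame positions divided by the time difference gives the relative speed. The coordinates and times below are made-up examples:

```python
import math

# Sketch of S303-S304: the relative travel distance is the distance between
# the target vehicle's body-frame positions at the two photographing times;
# dividing by the time difference yields the relative speed.
def relative_speed(p1, p2, t1, t2):
    """p1, p2: (x, y) body coordinates at the first/second photographing time;
    t1, t2: the corresponding photographing times in seconds."""
    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return dist / (t2 - t1)
```

For instance, a target that moves 5 m in the body frame over 0.5 s has a relative speed of 10 m/s.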
S305, obtaining the body yaw angle of the vehicle according to the body wheelbase, the vehicle speed information and the wheel deflection information of the vehicle, wherein the vehicle speed, the wheel deflection angle and the body yaw angle of the vehicle are positively correlated, and the body wheelbase and the body yaw angle of the vehicle are negatively correlated.
Specifically, the wheelbase is the distance from the center of the front axle to the center of the rear axle of the vehicle, which is a fixed parameter of the vehicle, and in this embodiment, the vehicle data further includes wheel deflection information, where the wheel deflection information includes the wheel deflection angles of the vehicle at the first capturing moment and the second capturing moment corresponding to the two frames of target images respectively, where the wheel deflection angles may be, for example, the deflection angles of the front wheels, because the steering of the vehicle is mainly performed by the front wheels, and in some special vehicles, the wheel deflection angles may also be, for example, the deflection angles of the rear wheels.
The yaw angle of the vehicle body of the vehicle can be obtained by the vehicle speed and the wheel deflection angle at the first shooting time, the vehicle speed and the wheel deflection angle at the second shooting time and the vehicle body wheelbase.
In one possible implementation, the correspondence between the body wheelbase, the vehicle speed information, the wheel deflection information, and the body yaw angle can be obtained from a bicycle model of the vehicle. The bicycle model is first described: it is a model that simplifies the motion of the vehicle by omitting motion in the vertical direction, so that the vehicle is described as an object moving on a two-dimensional plane (which can be understood as a top-down, bird's-eye view).
Secondly, assuming that the structure of the vehicle is like a bicycle, that is, the front two wheels of the vehicle have consistent data such as wheel deflection angles, rotation speeds and the like, and the rear two wheels are also similar, the front and rear tires can be described by one wheel respectively, so that the simplified description of the motion of the vehicle on a two-dimensional plane is realized.
Further description is given below with reference to fig. 4, in which the motion of the vehicle is described by the bicycle model. As shown in fig. 4, assume that at a given photographing time the front wheel of the vehicle is located at point A, the rear wheel at point B, and the centroid of the vehicle at point C. In the figure, v_x is the vehicle speed of the host vehicle at the current moment, β is the slip angle of the vehicle, ψ is the yaw (heading) angle of the vehicle, l_f is the distance from the vehicle centroid to point A, and l_r is the distance from the vehicle centroid to point B (the sum of l_f and l_r is the body wheelbase); δ_f is the deflection angle of the front wheel. Through the bicycle model introduced with fig. 4, the relation between the body wheelbase, the vehicle speed information, and the wheel deflection information of the host vehicle and its body yaw angle can be obtained, as shown in formula one:

    ω_d = v_x · cos β · tan δ_f / L    (formula one)

where ω_d represents the body yaw rate, from which the body yaw angle over the photographing interval is obtained, and L represents the body wheelbase, that is, the sum of l_f and l_r; the remaining parameters are as described above for fig. 4. The derivation of formula one from the bicycle model can be found in the prior art and is not repeated here.
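A sketch of S305 under the standard kinematic bicycle model follows. The model form (yaw rate ω = v·cos β·tan δ_f / L, with slip angle β = atan(l_r/L · tan δ_f)), the averaging over the two photographing times, and all names are illustrative assumptions, not necessarily the patent's exact formula:

```python
import math

# Hedged sketch of S305 under the kinematic bicycle model: the body yaw
# angle over the interval is approximated as the mean of the yaw rates at
# the two photographing times multiplied by the time difference dt.
def body_yaw_angle(v1, v2, delta1, delta2, l_f, l_r, dt):
    """v1, v2: host vehicle speeds at the two photographing times (m/s);
    delta1, delta2: front-wheel deflection angles (rad);
    l_f, l_r: distances from the centroid to the front/rear axle (m)."""
    L = l_f + l_r  # body wheelbase
    def yaw_rate(v, delta):
        beta = math.atan(l_r / L * math.tan(delta))  # slip angle
        return v * math.cos(beta) * math.tan(delta) / L
    return 0.5 * (yaw_rate(v1, delta1) + yaw_rate(v2, delta2)) * dt
```

Note the stated correlations hold: the result grows with vehicle speed and wheel deflection angle, and shrinks as the wheelbase L grows.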
As will be appreciated by those skilled in the art, as long as the vehicle speed of the host vehicle and the wheel deflection angle are both positively correlated with the body yaw angle, and the body wheelbase of the host vehicle is negatively correlated with the body yaw angle, the specific implementation for obtaining the body yaw angle can be selected according to actual requirements.
S306, converting, according to the body yaw angle, the speed of the host vehicle at the first photographing time and the speed of the host vehicle at the second photographing time into the same direction.
If the body yaw angle is not 0, the orientation of the host vehicle has changed between the first photographing time and the second photographing time, and the direction of the host vehicle's speed has changed correspondingly; it is then necessary to convert the speed of the host vehicle at the first photographing time and the speed at the second photographing time into the same direction according to the body yaw angle.
Specifically, referring to fig. 5, assume that the direction of the host vehicle's speed at the first photographing time (V_1) is the direction indicated by arrow 1, and the direction of the speed at the second photographing time (V_2) is the direction indicated by the other arrow in the figure; the included angle ω_d between the two directions is the body yaw angle. The speed can be converted into the same direction through the following formula two:

    V_3 = V_2 × cos ω_d    (formula two)

where V_3 is the speed obtained after converting the direction of V_2 to coincide with the direction of V_1.
Alternatively, the direction of the speed of the host vehicle at the first photographing time may be converted into the direction of the speed at the second photographing time, so that the two speeds are likewise converted into the same direction. The implementations are similar and are not repeated here; any conversion that brings the two directions into agreement may be selected according to actual requirements.
In another alternative implementation manner, if the yaw angle of the vehicle body is 0, it indicates that the direction of the vehicle is not changed, and the direction of the vehicle speed of the corresponding vehicle is the same at the first photographing time and the second photographing time, so that the speed direction of the two times is not changed even if the above steps are performed.
S307, processing is carried out according to the relative speed of the target vehicle and the speed of the second shooting moment after the direction conversion, and the absolute speed of the target vehicle is obtained.
In the present embodiment, the relative speed of the target vehicle is obtained from the two frames of target images. As will be understood by those skilled in the art, the direction of the relative speed of the target vehicle is substantially the same as the direction of the host vehicle's speed at the first photographing time; therefore, in the preferred embodiment, the direction of the speed at the second photographing time is converted into the direction of the speed at the first photographing time to obtain the converted speed at the second photographing time, that is, V_3 obtained in fig. 5 above.
Specifically, the relative speed of the target vehicle is a running speed with the host vehicle as a reference (the host vehicle is considered to be stationary), and the relative speed of the target vehicle and the vehicle speed at the second photographing time after the direction conversion are added or subtracted, so that the absolute speed of the target vehicle, which is the running speed of the target vehicle with the ground as a reference, that is, the actual running speed of the target vehicle, can be obtained.
Alternatively, if the direction of the host vehicle's speed at the first photographing time is converted to coincide with the direction at the second photographing time, the direction of the relative speed of the target vehicle may be converted as well, and the technical scheme of the present invention can be implemented similarly; those skilled in the art will understand that the direction of the converted relative speed then coincides with the direction of the vehicle speed at the second photographing time.
If the speed of the vehicle is 0, the relative speed of the target vehicle and the absolute speed of the target vehicle are consistent, and as will be understood by those skilled in the art, each capturing moment corresponds to the relative speed and the absolute speed of the respective target vehicle, and the present embodiment performs real-time processing according to the image.
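Steps S306–S307 can be sketched together: the host speed at the second photographing time is projected onto the first-time direction per formula two, then combined with the relative speed. The simple additive combination shown is an illustrative assumption for the case where both speeds point along the same converted axis:

```python
import math

# Sketch of S306-S307: formula two (V3 = V2 * cos(omega_d)) converts the
# host speed at the second photographing time into the direction of the
# first; the target's absolute speed is then the relative speed plus the
# host's converted speed (the host was treated as stationary when the
# relative speed was measured).
def absolute_speed(v_rel, v2, omega_d):
    """v_rel: target's relative speed; v2: host speed at the second time;
    omega_d: body yaw angle in radians."""
    v3 = v2 * math.cos(omega_d)  # host speed after direction conversion
    return v_rel + v3
```

When the host is stationary (v2 = 0), the absolute speed equals the relative speed, matching the text above.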
S308, according to the yaw angle of the vehicle body, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting moment of the vehicle, and according to the speed information of the vehicle, performing translation processing on the second vehicle body coordinate system to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is identical to the first vehicle body coordinate system corresponding to the first shooting moment.
Specifically, assume that the coordinate system corresponding to the host vehicle at the first photographing time is the first vehicle body coordinate system, and the vehicle body coordinate system corresponding to the second photographing time is the second vehicle body coordinate system. If the body yaw angle is not 0, the traveling direction of the host vehicle has changed from the first photographing time to the second photographing time, and the orientation of the host vehicle's body coordinate system has changed correspondingly. As shown in fig. 6, assume that the body coordinate system corresponding to the first photographing time is the x-y coordinate system and that corresponding to the second photographing time is the s-t coordinate system; it can be seen that the body coordinate system has rotated by an angle θ (equivalent to the body yaw angle) from the first photographing time to the second photographing time. The second vehicle body coordinate system (the s-t coordinate system) is therefore rotated according to the body yaw angle, so that it is rotated into the same orientation as the first vehicle body coordinate system.
Meanwhile, if the vehicle speed information of the vehicle is not 0, it indicates that the position of the vehicle changes (advances or retreats) from the first photographing time to the second photographing time, and the position of the corresponding vehicle body coordinate system of the vehicle also changes, and the driving distance of the vehicle can be obtained according to the vehicle speed at the first photographing time, the vehicle speed at the second photographing time and the time interval between the first photographing time and the second photographing time, so that the translation processing is performed on the second vehicle body coordinate system according to the driving distance, and the second vehicle body coordinate system is translated to the same position as the first vehicle body coordinate system.
After rotation processing and translation processing, a processed second vehicle body coordinate system is obtained, wherein the processed second vehicle body coordinate system is the same as the first vehicle body coordinate system corresponding to the first shooting moment.
By performing the rotation processing and the translation processing on the second vehicle body coordinate system corresponding to the second photographing time according to the body yaw angle and the vehicle speed information, the target vehicle in the two frames is converted into the same vehicle body coordinate system, so that a more accurate motion state judgment can be achieved; compared with the traditional method, no matrix representing the camera motion needs to be calculated, which reduces the amount of calculation.
S309, acquiring a first body coordinate of the target vehicle in a first body coordinate system and a second body coordinate in a processed second body coordinate system.
And S310, obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinates and the second vehicle body coordinates.
Whether the position of the target vehicle changes or not can be determined according to the same vehicle body coordinate system, specifically, a first vehicle body coordinate of the target vehicle in the first vehicle body coordinate system and a second vehicle body coordinate in the processed second vehicle body coordinate system are obtained, and then a position difference value of the target vehicle in two frames of target images is obtained according to the first vehicle body coordinate and the second vehicle body coordinate, wherein the position difference value is used for indicating whether the position of the target vehicle changes or not.
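Steps S308–S310 can be sketched with a planar rigid transform: rotate the second-frame coordinates by the body yaw angle and translate them by the host's travel, then take the distance to the first-frame coordinates. The parameter names and example values are illustrative assumptions:

```python
import math

# Sketch of S308-S310: map the target's second-frame body coordinates into
# the first body coordinate system (rotation by the body yaw angle theta,
# translation by the host's travel tx, ty), then the position difference is
# a plain Euclidean distance in the common frame.
def to_first_frame(p, theta, tx, ty):
    """Map a point (x, y) from the second body frame into the first one."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx,
            s * p[0] + c * p[1] + ty)

def position_difference(p_first, p_second, theta, tx, ty):
    q = to_first_frame(p_second, theta, tx, ty)
    return math.hypot(q[0] - p_first[0], q[1] - p_first[1])
```

For a stationary target, the host's own travel is exactly cancelled by the transform, so the position difference comes out as 0.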
S311, if the absolute speed of the target vehicle is not 0, judging whether the position difference of the target vehicle is 0; if yes, S312 is executed, and if not, S313 is executed.
Specifically, if it is determined that the absolute speed of the target vehicle is not 0, it may be determined that the target vehicle has moved initially, so as to determine whether the position difference of the target vehicle is 0, thereby more accurately determining whether the target vehicle has moved.
In an alternative embodiment, if the absolute speed of the target vehicle is 0, it may be determined that the target vehicle is not moving according to the absolute speed, so as to determine that the motion state of the target vehicle is stationary.
S312, determining that the motion state of the target vehicle is stationary.
S313, determining the motion state of the target vehicle as running.
If the position difference of the target vehicle is 0, it may be determined that the motion state of the target vehicle is stationary; if the position difference of the target vehicle is not 0, it may be determined that the motion state of the target vehicle is running.
In an alternative embodiment, the position difference of the target vehicle may be judged first, and the absolute speed of the target vehicle judged afterwards; this is not limited, so long as it is ensured that the motion state of the target vehicle is determined to be running whenever the absolute speed of the target vehicle is not 0 or the position difference of the target vehicle is not 0.
The method for identifying the motion state of the target vehicle provided by the embodiment of the invention comprises the following steps: and obtaining the image coordinates of the target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images. And converting the image coordinate system of the target vehicle in the two frames of target images into a third vehicle body coordinate system according to the corresponding relation between the image coordinate system and the vehicle body coordinate system. And obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system. And obtaining the relative speed of the target vehicle according to the time difference value between the first shooting time and the second shooting time and the relative driving distance. And obtaining the body yaw angle of the vehicle according to the body wheelbase, the vehicle speed information and the wheel deflection information of the vehicle, wherein the vehicle speed, the wheel deflection angle and the body yaw angle of the vehicle are positively correlated, and the body wheelbase and the body yaw angle of the vehicle are negatively correlated. The speed of the own vehicle at the first photographing time and the speed of the own vehicle at the second photographing time are converted to the same direction according to the yaw angle of the vehicle body. And processing according to the relative speed of the target vehicle and the vehicle speed at the second shooting moment after the direction conversion to obtain the absolute speed of the target vehicle. 
And according to the body yaw angle, performing rotation processing on the second vehicle body coordinate system corresponding to the second photographing time, and according to the vehicle speed information, performing translation processing on the second vehicle body coordinate system to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is identical to the first vehicle body coordinate system corresponding to the first photographing time. A first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system are acquired. The position difference of the target vehicle in the two frames of target images is obtained according to the first body coordinate and the second body coordinate. If the absolute speed of the target vehicle is not 0, whether the position difference of the target vehicle is 0 is judged; if yes, the motion state of the target vehicle is determined to be stationary, and if not, the motion state of the target vehicle is determined to be running. Through the rotation processing and the translation processing on the second vehicle body coordinate system corresponding to the second photographing time, the target vehicle in the two frames is converted into the same vehicle body coordinate system to determine whether its position has changed, so that a more accurate motion state judgment can be achieved; compared with the traditional method, no matrix representing the camera motion needs to be calculated, which reduces the amount of calculation, and judging according to both the absolute speed and the position difference of the target vehicle improves the accuracy of the recognition.
Fig. 7 is a schematic structural diagram of a target vehicle motion state recognition device according to an embodiment of the present invention. As shown in fig. 7, the apparatus 70 includes: a determination module 701, a processing module 702 and an identification module 703.
A determining module 701, configured to determine a relative speed of a target vehicle according to two frame target images, where the two frame target images are images including the target vehicle acquired by an on-board camera of the vehicle;
A processing module 702, configured to obtain a body yaw angle of the host vehicle according to vehicle data of the host vehicle, where the vehicle data includes vehicle speed information, the vehicle speed information includes speeds of the host vehicle at a first photographing time and a second photographing time corresponding to the two frames of target images respectively, and the body yaw angle is the angle by which the vehicle body rotates at the second photographing time relative to the first photographing time;
the processing module 702 is further configured to obtain an absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body, and the vehicle speed information, and obtain a position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information;
An identifying module 703, configured to identify a motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference.
In one possible design, the processing module 702 is specifically configured to:
converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time into the same direction according to the yaw angle of the vehicle body;
and processing according to the relative speed of the target vehicle and the speed of the second shooting moment after the direction conversion to obtain the absolute speed of the target vehicle.
In one possible design, the processing module 702 is specifically configured to:
according to the yaw angle of the vehicle body, performing rotation processing on a second vehicle body coordinate system corresponding to the second shooting moment of the vehicle body, and performing translation processing on the second vehicle body coordinate system according to the speed information of the vehicle body to obtain a processed second vehicle body coordinate system, wherein the processed second vehicle body coordinate system is identical to a first vehicle body coordinate system corresponding to the first shooting moment;
acquiring first body coordinates of the target vehicle in the first body coordinate system and second body coordinates in the processed second body coordinate system;
and obtaining the position difference of the target vehicle in the two frames of target images according to the first body coordinates and the second body coordinates.
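A minimal sketch of this rotate-and-translate step, under stated assumptions: the patent does not give the formula, so the rotation matrix, the sign conventions, and the use of the host displacement between the two shooting times as the translation are illustrative choices, and all names are hypothetical.

```python
import math

def position_difference(p1, p2, yaw, host_displacement):
    """Sketch: express the target's coordinate p2 (given in the second
    body frame) in the first body frame by undoing the host's rotation
    (yaw, radians) and translation between the two shooting times, then
    subtract the target's first-time coordinate p1. Coordinates are
    (x, y) pairs; a zero result means the target did not move in the
    world between the two frames."""
    c, s = math.cos(yaw), math.sin(yaw)
    x2, y2 = p2
    # Rotate the second body frame back by the body yaw angle...
    xr = c * x2 - s * y2
    yr = s * x2 + c * y2
    # ...then translate by the host displacement between the two times,
    # so both coordinates live in the same (first) body frame.
    return (xr + host_displacement[0] - p1[0],
            yr + host_displacement[1] - p1[1])
```

For example, a stationary target 5 m ahead while the host drives straight 1 m forward appears at 4 m in the second frame; after the translation is undone, the position difference is zero.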
In one possible design, the determining module 701 is specifically configured to:
obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into coordinates in a third body coordinate system according to the correspondence between the image coordinate system and the body coordinate system;
obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting time and the second shooting time and the relative running distance.
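Once the image coordinates have been mapped into the body frame (the image-to-body conversion itself, e.g. via camera calibration, is assumed done here), the relative speed computation is straightforward. A hedged sketch with illustrative names:

```python
def relative_speed(coord_t1, coord_t2, t1, t2):
    """Sketch: relative travel distance between the target's body-frame
    coordinates at the two shooting times, divided by the time gap.
    coord_t1/coord_t2 are (x, y) pairs in the same body coordinate
    system; t1 and t2 are the two shooting times in seconds."""
    dx = coord_t2[0] - coord_t1[0]
    dy = coord_t2[1] - coord_t1[1]
    distance = (dx * dx + dy * dy) ** 0.5  # relative running distance
    return distance / (t2 - t1)            # speed = distance / time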
In one possible design, the vehicle data further includes wheel deflection information, where the wheel deflection information includes wheel deflection angles of the vehicle at a first shooting time and a second shooting time corresponding to the two frame target images, respectively;
The processing module 702 is specifically configured to:
and obtaining the body yaw angle of the host vehicle according to the body wheelbase of the host vehicle, the vehicle speed information and the wheel deflection information, where the body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the host vehicle, and negatively correlated with the body wheelbase.
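The stated correlations (positive with speed and wheel deflection angle, negative with wheelbase) are exactly those of a kinematic bicycle model, so one plausible concrete form is sketched below. This is an assumption: the patent gives only the correlations, not a formula, and the averaging of the two measurements over the inter-frame interval is an illustrative choice.

```python
import math

def body_yaw_angle(wheelbase, speeds, wheel_angles, dt):
    """Hedged sketch via a kinematic bicycle model: yaw rate
    = v * tan(delta) / L, integrated over the inter-frame interval dt
    using the average speed and average wheel deflection angle at the
    two shooting times. wheelbase L in metres, speeds in m/s,
    wheel_angles (delta) in radians, dt in seconds."""
    v = (speeds[0] + speeds[1]) / 2.0
    delta = (wheel_angles[0] + wheel_angles[1]) / 2.0
    yaw_rate = v * math.tan(delta) / wheelbase  # rad/s
    return yaw_rate * dt                        # yaw angle over dt
```

The formula reproduces the claimed behaviour: the angle grows with speed and wheel deflection, shrinks with a longer wheelbase, and is zero when the wheels are straight.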
In one possible design, the identification module 703 is specifically configured to:
if the absolute speed of the target vehicle is not 0, determining whether the position difference of the target vehicle is 0;
if yes, determining that the motion state of the target vehicle is stationary;
if not, determining the motion state of the target vehicle as running.
In one possible design, the identification module 703 is further configured to:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
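The decision logic above fits in a few lines. One assumption is made explicit in the sketch: real measurements are noisy, so the "is 0" tests are written with a small tolerance `eps` rather than exact equality, and the scalar inputs and names are illustrative.

```python
def classify_motion_state(absolute_speed, position_difference, eps=1e-3):
    """Sketch of the identification step: a target is stationary if its
    absolute speed is (near) zero, or if its speed is non-zero but its
    position difference across the two frames is (near) zero; otherwise
    it is running. eps is an assumed noise tolerance."""
    if abs(absolute_speed) <= eps:
        return "stationary"
    # Non-zero absolute speed: consult the position difference to rule
    # out apparent motion induced by the host vehicle's own movement.
    if abs(position_difference) <= eps:
        return "stationary"
    return "running"
```

Checking the position difference as well as the speed is the point of the scheme: it makes the stationary/running decision robust when the speed estimate alone is unreliable.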
The device provided in this embodiment may be used to implement the technical solution of the foregoing method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
Fig. 8 is a schematic diagram of the hardware structure of a target vehicle motion state identification device according to an embodiment of the present invention. As shown in Fig. 8, the target vehicle motion state identification device 80 of this embodiment includes a processor 801 and a memory 802, where:
A memory 802 for storing computer-executable instructions;
the processor 801 is configured to execute computer-executable instructions stored in the memory to implement the steps executed by the target vehicle motion state identification method in the above-described embodiment. Reference may be made in particular to the relevant description of the embodiments of the method described above.
Alternatively, the memory 802 may be separate or integrated with the processor 801.
When the memory 802 is provided separately, the target vehicle motion state identification apparatus further includes a bus 803 for connecting the memory 802 and the processor 801.
An embodiment of the present invention further provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the target vehicle motion state identification method performed by the target vehicle motion state identification device above.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The integrated modules, when implemented in the form of software functional modules, may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present application.
It should be understood that the above processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be executed directly by a hardware processor, or by a combination of hardware and software modules in a processor.
The memory may comprise high-speed RAM, and may further comprise non-volatile memory (NVM), such as at least one magnetic disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disk, or the like.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The storage medium may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (14)

1. A target vehicle motion state recognition method, characterized by comprising:
determining the relative speed of a target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle, which are acquired by a vehicle-mounted camera of the vehicle;
obtaining a body yaw angle of the vehicle according to vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting moment and a second shooting moment which correspond to the two frames of target images respectively, and the body yaw angle is an angle of the vehicle rotating relative to the first shooting moment at the second shooting moment;
Obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information;
identifying a motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value;
the step of obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information comprises the following steps:
according to the body yaw angle, rotating a second body coordinate system of the host vehicle corresponding to the second shooting time, and translating the second body coordinate system according to the vehicle speed information, to obtain a processed second body coordinate system, wherein the processed second body coordinate system coincides with a first body coordinate system corresponding to the first shooting time;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system;
And obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinates and the second vehicle body coordinates.
2. The method according to claim 1, wherein the obtaining the absolute speed of the target vehicle from the relative speed of the target vehicle, the vehicle body yaw angle, and the vehicle speed information includes:
converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time to the same direction according to the body yaw angle;
and obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
3. The method of claim 1, wherein determining the relative speed of the target vehicle from the two frames of target images comprises:
obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into coordinates in a third body coordinate system according to the correspondence between the image coordinate system and the body coordinate system;
Obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system;
and obtaining the relative speed of the target vehicle according to the time difference between the first shooting time and the second shooting time and the relative running distance.
4. The method according to claim 1, wherein the vehicle data further includes wheel deflection information including wheel deflection angles of the own vehicle at first and second photographing times corresponding to the two frames of target images, respectively;
wherein the obtaining the body yaw angle of the host vehicle according to the vehicle data, sent by a sensor of the host vehicle, corresponding to the shooting times of the two frames of target images comprises:
obtaining the body yaw angle of the host vehicle according to the body wheelbase of the host vehicle, the vehicle speed information and the wheel deflection information, wherein the body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the host vehicle, and negatively correlated with the body wheelbase.
5. The method of claim 1, wherein the identifying the motion state of the target vehicle based on the absolute speed of the target vehicle and the difference in the positions of the target vehicle in the two frames of target images comprises:
if the absolute speed of the target vehicle is not 0, judging whether the position difference value of the target vehicle is 0 or not;
if yes, determining that the motion state of the target vehicle is stationary;
if not, determining the motion state of the target vehicle as running.
6. The method of claim 5, wherein the method further comprises:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
7. A target vehicle motion state recognition apparatus characterized by comprising:
the determining module is used for determining the relative speed of the target vehicle according to two frames of target images, wherein the two frames of target images are images comprising the target vehicle, and the images are acquired by a vehicle-mounted camera of the vehicle;
the processing module is further used for obtaining a body yaw angle of the vehicle according to the vehicle data of the vehicle, wherein the vehicle data comprises vehicle speed information, the vehicle speed information comprises speeds of the vehicle at a first shooting moment and a second shooting moment corresponding to the two frames of target images respectively, and the body yaw angle is an angle of the vehicle rotating relative to the first shooting moment at the second shooting moment;
The processing module is further used for obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle, the yaw angle of the vehicle body and the vehicle speed information, and obtaining the position difference value of the target vehicle in the two frames of target images according to the yaw angle of the vehicle body and the vehicle speed information;
the identification module is used for identifying the motion state of the target vehicle according to the absolute speed of the target vehicle and the position difference value;
the processing module is specifically configured to:
according to the body yaw angle, rotating a second body coordinate system of the host vehicle corresponding to the second shooting time, and translating the second body coordinate system according to the vehicle speed information, to obtain a processed second body coordinate system, wherein the processed second body coordinate system coincides with a first body coordinate system corresponding to the first shooting time;
acquiring a first body coordinate of the target vehicle in the first body coordinate system and a second body coordinate in the processed second body coordinate system;
and obtaining the position difference value of the target vehicle in the two frames of target images according to the first vehicle body coordinates and the second vehicle body coordinates.
8. The apparatus of claim 7, wherein the processing module is specifically configured to:
converting the speed of the host vehicle at the first shooting time and the speed of the host vehicle at the second shooting time to the same direction according to the body yaw angle;
and obtaining the absolute speed of the target vehicle according to the relative speed of the target vehicle and the direction-converted speed at the second shooting time.
9. The apparatus of claim 7, wherein the determining module is specifically configured to:
obtaining image coordinates of a target vehicle in the two frames of target images on an image coordinate system according to the two frames of target images;
converting the image coordinates of the target vehicle in the two frames of target images into coordinates in a third body coordinate system according to the correspondence between the image coordinate system and the body coordinate system;
obtaining the relative running distance of the target vehicle according to the third vehicle body coordinate of the target vehicle corresponding to the first shooting moment in the third vehicle body coordinate system and the fourth vehicle body coordinate of the target vehicle corresponding to the second shooting moment in the third vehicle body coordinate system;
And obtaining the relative speed of the target vehicle according to the time difference between the first shooting time and the second shooting time and the relative running distance.
10. The apparatus according to claim 7, wherein the vehicle data further includes wheel deflection information including wheel deflection angles of the own vehicle at first and second photographing times corresponding to the two frames of target images, respectively;
the processing module is specifically configured to:
and obtaining the body yaw angle of the host vehicle according to the body wheelbase of the host vehicle, the vehicle speed information and the wheel deflection information, wherein the body yaw angle is positively correlated with the vehicle speed and the wheel deflection angle of the host vehicle, and negatively correlated with the body wheelbase.
11. The apparatus of claim 7, wherein the identification module is specifically configured to:
if the absolute speed of the target vehicle is not 0, judging whether the position difference value of the target vehicle is 0 or not;
if yes, determining that the motion state of the target vehicle is stationary;
if not, determining the motion state of the target vehicle as running.
12. The apparatus of claim 11, wherein the identification module is further configured to:
and if the absolute speed of the target vehicle is 0, determining that the motion state of the target vehicle is stationary.
13. A target vehicle motion state recognition apparatus characterized by comprising:
a memory for storing a program;
a processor for executing the program stored by the memory, the processor being for performing the method of any one of claims 1 to 6 when the program is executed.
14. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 6.
CN201910879023.7A 2019-09-18 2019-09-18 Target vehicle motion state identification method and device Active CN112529935B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910879023.7A CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910879023.7A CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Publications (2)

Publication Number Publication Date
CN112529935A CN112529935A (en) 2021-03-19
CN112529935B true CN112529935B (en) 2023-05-16

Family

ID=74974931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910879023.7A Active CN112529935B (en) 2019-09-18 2019-09-18 Target vehicle motion state identification method and device

Country Status (1)

Country Link
CN (1) CN112529935B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113226885A (en) * 2021-05-27 2021-08-06 华为技术有限公司 Method and device for determining steering intention of target vehicle
CN113239464B (en) * 2021-06-02 2024-01-30 北京汽车集团越野车有限公司 Method and device for determining vehicle body section
CN116088014B (en) * 2023-03-03 2023-07-04 北京理工大学 Main car information coordinate system conversion method and system based on longitude and latitude information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003149256A (en) * 2001-11-14 2003-05-21 Mitsubishi Electric Corp Vehicle speed measuring instrument
JP2008026030A (en) * 2006-07-18 2008-02-07 Denso Corp Vehicle obstacle detector and vehicle control system
JP2010198552A (en) * 2009-02-27 2010-09-09 Konica Minolta Holdings Inc Driving state monitoring device
CN104865579A (en) * 2014-02-21 2015-08-26 株式会社电装 Vehicle-installed Obstacle Detection Apparatus Having Function For Judging Motion Condition Of Detected Object
CN106054191A (en) * 2015-04-06 2016-10-26 通用汽车环球科技运作有限责任公司 Wheel detection and its application in object tracking and sensor registration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6615725B2 (en) * 2016-09-16 2019-12-04 株式会社東芝 Travel speed calculation device and travel speed calculation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
《Real-time obstacles detection and status classification for collision warning in a vehicle active safety system》;Song W et al.;《IEEE Transactions on intelligent transportation systems》;20170525;第758-773页 *

Also Published As

Publication number Publication date
CN112529935A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN109887033B (en) Positioning method and device
CN112529935B (en) Target vehicle motion state identification method and device
CN107111879B (en) Method and apparatus for estimating vehicle's own motion by panoramic looking-around image
JP6670071B2 (en) Vehicle image recognition system and corresponding method
JP6785620B2 (en) Predictive suspension control for vehicles using stereo camera sensors
JP6350374B2 (en) Road surface detection device
CN110910453A (en) Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
US10832428B2 (en) Method and apparatus for estimating a range of a moving object
JP2010193428A (en) Roll angle correction method and device
US20210174113A1 (en) Method for limiting object detection area in a mobile system equipped with a rotation sensor or a position sensor with an image sensor, and apparatus for performing the same
CN113734157B (en) Memory parking method, device, equipment, storage medium and program product
CN111386530A (en) Vehicle detection method and apparatus
CN114091521B (en) Method, device and equipment for detecting vehicle course angle and storage medium
JP2018205950A (en) Environment map generation apparatus for estimating self vehicle position, self vehicle position estimation device, environment map generation program for estimating self vehicle position, and self vehicle position estimation program
CN108376384B (en) Method and device for correcting disparity map and storage medium
CN112308899B (en) Trailer angle identification method and device
JP5832850B2 (en) Lane monitoring system and lane monitoring method
CN114037977B (en) Road vanishing point detection method, device, equipment and storage medium
CN106874837B (en) Vehicle detection method based on video image processing
CN115346191A (en) Method and apparatus for calibration
CN114494200A (en) Method and device for measuring trailer rotation angle
CN114863096A (en) Semantic map construction and positioning method and device for indoor parking lot
CN114648743A (en) Three-dimensional traffic sign detection
CN112184605A (en) Method, equipment and system for enhancing vehicle driving visual field
CN114705121B (en) Vehicle pose measurement method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant