CN115588185A - Driving route generation method and device, electronic equipment and computer readable medium - Google Patents


Info

Publication number
CN115588185A
CN115588185A (application CN202211422241.6A)
Authority
CN
China
Prior art keywords
vehicle
obstacle vehicle
obstacle
generating
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211422241.6A
Other languages
Chinese (zh)
Other versions
CN115588185B (en)
Inventor
李敏
张�雄
洪炽杰
兰莎郧
罗鸿
艾永军
申苗
陶武康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GAC Aion New Energy Automobile Co Ltd
Original Assignee
GAC Aion New Energy Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GAC Aion New Energy Automobile Co Ltd
Priority to CN202211422241.6A
Publication of CN115588185A
Application granted
Publication of CN115588185B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0011 Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 Detecting or categorising vehicles

Abstract

The invention discloses a driving route generation method and device, an electronic device, and a computer-readable medium. One embodiment of the method comprises: determining the two-dimensional coordinates of each wheel grounding point of an obstacle vehicle according to a foreground image acquired by a front camera of a target vehicle; determining the three-dimensional coordinates of each wheel grounding point according to the two-dimensional coordinates; determining the detected heading angle of the obstacle vehicle as an initial obstacle vehicle heading angle; generating a unit vector of the obstacle vehicle heading angle according to the initial heading angle and the three-dimensional coordinates; generating the obstacle vehicle heading angle according to the unit vector; generating a rotation matrix of the obstacle vehicle according to the heading angle; generating the center coordinates of the obstacle vehicle according to the rotation matrix; generating a bounding box of the obstacle vehicle according to the center coordinates and the rotation matrix; and generating a driving route based on the bounding box. This embodiment reduces how often the traveling vehicle fails to safely avoid the obstacle vehicle and improves driving safety.

Description

Driving route generation method and device, electronic equipment and computer readable medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a driving route generation method and apparatus, an electronic device, and a computer-readable medium.
Background
The heading angle of an obstacle vehicle can represent the driving posture of the obstacle vehicle and provides a basis for planning the driving route of the ego vehicle. Currently, when determining the heading angle of an obstacle vehicle, the following method is generally adopted: the heading angle is determined by a pre-trained neural network model.
However, the inventors have found that determining the heading angle of an obstacle vehicle in the above manner often suffers from the following technical problems:
Firstly, when the training set used to train the neural network model is small, the accuracy of the heading angle determined by the model is low, so the traveling vehicle frequently cannot safely avoid the obstacle vehicle and safety is poor.
Secondly, when the obstacle vehicle is severely occluded, the accuracy of the obstacle vehicle position generated by the neural network model is low, so the traveling vehicle frequently cannot safely avoid the obstacle vehicle and safety is poor.
The above information disclosed in this Background section is only for enhancement of understanding of the background of the inventive concept and, therefore, may contain information that does not constitute prior art already known in this country to a person of ordinary skill in the art.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a driving route generation method, apparatus, electronic device, and computer readable medium to solve one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a driving route generation method, including: determining respective two-dimensional coordinates of respective wheel grounding points of an obstacle vehicle according to a foreground image acquired by a front camera of a target vehicle; determining respective three-dimensional coordinates of the wheel grounding points in a vehicle coordinate system of the target vehicle based on the two-dimensional coordinates; determining a heading angle of the obstacle vehicle detected by a target detection model as an initial obstacle vehicle heading angle; generating a unit vector of the obstacle vehicle heading angle according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates; generating the obstacle vehicle heading angle according to the unit vector; generating a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle; generating center coordinates of the obstacle vehicle in the vehicle coordinate system based on the rotation matrix; generating a bounding box of the obstacle vehicle according to the center coordinates and the rotation matrix; and generating a driving route of the target vehicle based on the bounding box.
In a second aspect, some embodiments of the present disclosure provide a driving route generation apparatus, including: a first determination unit configured to determine respective two-dimensional coordinates of respective wheel grounding points of the obstacle vehicle based on a foreground image acquired by a front camera of the target vehicle; a second determination unit configured to determine, based on the respective two-dimensional coordinates, respective three-dimensional coordinates of the respective wheel grounding points in a vehicle coordinate system of the target vehicle; a third determination unit configured to determine a heading angle of the obstacle vehicle detected by the target detection model as an initial obstacle vehicle heading angle; a first generating unit configured to generate a unit vector of the obstacle vehicle heading angle according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates; a second generating unit configured to generate the obstacle vehicle heading angle according to the unit vector; a third generating unit configured to generate a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle; a fourth generating unit configured to generate center coordinates of the obstacle vehicle in the vehicle coordinate system based on the rotation matrix; a fifth generating unit configured to generate a bounding box of the obstacle vehicle based on the center coordinates and the rotation matrix; and a sixth generating unit configured to generate a travel route of the target vehicle based on the bounding box.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the disclosure provide a computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method described in any implementation manner of the first aspect.
The above embodiments of the present disclosure have the following advantages: with the driving route generation method of some embodiments of the disclosure, the number of times a traveling vehicle cannot safely avoid an obstacle vehicle is reduced, and driving safety is improved. Specifically, safety is poor because, when the training set used to train the neural network model is small, the accuracy of the heading angle determined by the model is low, so the traveling vehicle frequently cannot safely avoid the obstacle vehicle. Based on this, the driving route generation method of some embodiments of the present disclosure first determines the two-dimensional coordinates of each wheel grounding point of the obstacle vehicle from a foreground image acquired by a front camera of the target vehicle, yielding the coordinates of the wheel grounding points in the foreground image. Then, the three-dimensional coordinates of the wheel grounding points in the vehicle coordinate system of the target vehicle are determined from the two-dimensional coordinates. Next, the heading angle of the obstacle vehicle detected by the target detection model is taken as the initial obstacle vehicle heading angle, which is used to generate the heading angle unit vector. A unit vector of the obstacle vehicle heading angle is then generated from the initial heading angle and the three-dimensional coordinates, and the obstacle vehicle heading angle is generated from this unit vector, improving its accuracy. Next, a rotation matrix of the obstacle vehicle in the vehicle coordinate system is generated from the heading angle; the rotation matrix characterizes the rotational pose of the obstacle vehicle. From the rotation matrix, the center coordinates of the obstacle vehicle in the vehicle coordinate system are generated, characterizing its position. From the center coordinates and the rotation matrix, a bounding box of the obstacle vehicle is generated, characterizing its rotation direction and rotational attitude. Finally, a travel route of the target vehicle is generated based on the bounding box and can serve as the route along which the target vehicle travels. Because the heading angle is not determined directly by a neural network model but is generated from the three-dimensional coordinates of the wheel grounding points together with the initial heading angle detected by the target detection model, its accuracy is improved. Determining the rotation matrix and center coordinates further yields the bounding box, and thus the position and posture of the obstacle vehicle. The generated travel route therefore further reduces how often the traveling vehicle cannot safely avoid the obstacle vehicle, improving driving safety.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow diagram of some embodiments of a travel route generation method according to the present disclosure;
FIG. 2 is a schematic illustration of respective wheel grounding points of an obstacle vehicle in accordance with some embodiments of a travel route generation method of the present disclosure;
FIG. 3 is a schematic block diagram of some embodiments of a travel route generation apparatus according to the present disclosure;
FIG. 4 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates a flow 100 of some embodiments of a driving route generation method according to the present disclosure. The driving route generation method comprises the following steps:
step 101, determining each two-dimensional coordinate of each wheel grounding point of the obstacle vehicle according to the foreground image acquired by the front camera of the target vehicle.
In some embodiments, an execution subject (e.g., a computing device) of the travel route generation method may determine the respective two-dimensional coordinates of the respective wheel grounding points of the obstacle vehicle from a foreground image acquired by a front camera of the target vehicle. The target vehicle may be a currently driving unmanned vehicle. The foreground image may be a captured image of the obstacle vehicle. The respective wheel grounding points may be the contact points of the four wheels of the obstacle vehicle with the ground. The respective two-dimensional coordinates may be the coordinates of the respective wheel grounding points in the foreground image.
In practice, the execution subject may input the foreground image into a pre-trained wheel grounding point detection model to obtain the two-dimensional coordinates of each wheel grounding point. The wheel grounding point detection model may be a neural network model that takes a vehicle image as input data and outputs the two-dimensional coordinates of the recognized wheel grounding points. For example, the neural network model may be a convolutional neural network model. Thereby, the respective two-dimensional coordinates of the respective wheel grounding points can be obtained and used to generate their respective three-dimensional coordinates.
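As an illustration only (this sketch is not part of the patent; the model interface, the wheel ordering, and the image size are assumptions), the detection step above might be wrapped as follows, where `model` stands in for the pre-trained wheel grounding point detection model:

```python
import numpy as np

def detect_wheel_grounding_points(foreground_image, model):
    """Run a (hypothetical) wheel-grounding-point detection model and
    return the 2D pixel coordinates of the four wheel grounding points.

    `model` is assumed to map an image to a (4, 2) array of (u, v)
    pixel coordinates ordered wheel 0..3; this interface is illustrative.
    """
    points_2d = np.asarray(model(foreground_image), dtype=float)
    assert points_2d.shape == (4, 2), "expected four (u, v) grounding points"
    return points_2d

# Stub "model" standing in for the CNN described in the text.
stub_model = lambda img: [[320, 400], [420, 400], [430, 460], [310, 460]]
pts = detect_wheel_grounding_points(np.zeros((480, 640, 3)), stub_model)
```

In a real system the stub would be replaced by inference on the trained network; the wrapper only validates the output shape.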
The execution subject may be a vehicle-mounted terminal of the target vehicle, or may be a server that is connected to the vehicle-mounted terminal of the target vehicle by communication.
Step 102, determining three-dimensional coordinates of the wheel grounding points in the vehicle coordinate system of the target vehicle according to the two-dimensional coordinates.
In some embodiments, the execution body may determine the respective three-dimensional coordinates of the respective wheel grounding points in the vehicle coordinate system of the target vehicle from the respective two-dimensional coordinates. The vehicle coordinate system of the target vehicle may be the vehicle coordinate system of the currently driving unmanned vehicle. In practice, the execution body may convert the two-dimensional coordinates into the vehicle coordinate system to obtain the three-dimensional coordinates. Thereby, the respective three-dimensional coordinates of the respective wheel grounding points are obtained and can be used to generate the unit vector of the obstacle vehicle heading angle.
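The patent does not specify how the conversion is done. A common approach, sketched here under stated assumptions (pinhole intrinsics `K`, a camera whose y-axis points down with a horizontal optical axis, a known camera height, and an optional hypothetical camera-to-vehicle transform `T_vc`), is to intersect the viewing ray through the pixel with the ground plane:

```python
import numpy as np

def ground_point_to_vehicle(u, v, K, cam_height, T_vc=None):
    """Back-project a wheel grounding point pixel (u, v) onto the ground
    plane and express it in the vehicle coordinate system.

    Assumes a pinhole camera whose optical axis is horizontal and whose
    y-axis points down, so the ground plane is y = cam_height in camera
    coordinates. T_vc is an optional 4x4 camera-to-vehicle transform.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray through pixel
    scale = cam_height / ray[1]                     # hit ground plane y = cam_height
    p_cam = ray * scale                             # 3D point in camera frame
    if T_vc is None:
        return p_cam
    return (T_vc @ np.append(p_cam, 1.0))[:3]

# Example: principal point (320, 240), focal length 800 px, camera 1.5 m up.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
p = ground_point_to_vehicle(320, 340, K, cam_height=1.5)  # point 12 m ahead
```

Because the grounding points lie on the road surface, one scalar (the camera height) suffices to resolve the depth ambiguity of a single pixel.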
Step 103, determining the heading angle of the obstacle vehicle detected by the target detection model as an initial obstacle vehicle heading angle.
In some embodiments, the execution agent may determine the heading angle of the obstacle vehicle detected by the target detection model as the initial obstacle vehicle heading angle. The target detection model may be a detection model for detecting the heading angle of the obstacle vehicle. The target detection model may be, but is not limited to, one of the following: a YOLO target detection model and a CenterNet target detection model. Thus, the initial obstacle vehicle heading angle can be obtained and used to generate the obstacle vehicle heading angle unit vector.
Step 104, generating a unit vector of the obstacle vehicle heading angle according to the initial obstacle vehicle heading angle and each three-dimensional coordinate.
In some embodiments, the execution subject may generate the unit vector of the heading angle of the obstacle vehicle according to the initial heading angle of the obstacle vehicle and the respective three-dimensional coordinates.
In practice, the execution body may generate the unit vector of the obstacle vehicle heading angle from the initial obstacle vehicle heading angle and the respective three-dimensional coordinates in various ways.
In some optional implementations of some embodiments, the executing agent may generate the unit vector of the heading angle of the obstacle vehicle according to the initial heading angle of the obstacle vehicle and the respective three-dimensional coordinates by:
firstly, inputting the initial obstacle vehicle course angle and a unit vector of a preset obstacle vehicle course angle into a preset course angle constraint function to obtain initial obstacle vehicle course angle constraint. The unit vector of the preset course angle of the obstacle vehicle can be the unit vector of the course angle of the obstacle vehicle. The preset course angle constraint function may be a constraint function for constraining the unit vector of the course angle of the preset obstacle vehicle and the course angle of the initial obstacle vehicle. For example, the predetermined course angle constraint function may be
Figure DEST_PATH_IMAGE001
And
Figure DEST_PATH_IMAGE002
. Wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE003
the initial obstacle vehicle heading angle of the current frame detected by the above-mentioned target detection model,
Figure DEST_PATH_IMAGE004
the initial obstacle vehicle heading angle of the last frame detected by the target detection model may be used.
Figure DEST_PATH_IMAGE005
The unit vector of the heading angle of the preset obstacle vehicle can be used. Wherein, the first and the second end of the pipe are connected with each other,
Figure DEST_PATH_IMAGE006
may be an obstacle vehicle heading angle. Here, the unit vector of the heading angle of the preset obstacle vehicle may be regarded as an unknown number. Course angle of obstacle vehicle
Figure DEST_PATH_IMAGE007
As may be the unknowns.
Secondly, for every two three-dimensional coordinates adjacent along the horizontal axis among the three-dimensional coordinates, the following steps are executed:
In a first substep, the two three-dimensional coordinates are composed into a three-dimensional coordinate vector. As shown in FIG. 2, the three-dimensional coordinate vectors composed of two coordinates adjacent along the horizontal axis may include [vector] and [vector], where [vector] may be the vector from the three-dimensional coordinate of the No. 0 wheel grounding point to that of the No. 1 wheel grounding point, and [vector] may be the vector from the three-dimensional coordinate of the No. 3 wheel grounding point to that of the No. 2 wheel grounding point.
In a second substep, the composed three-dimensional coordinate vector and the preset obstacle vehicle heading angle unit vector are input into a preset horizontal-axis three-dimensional coordinate vector constraint function to obtain a horizontal-axis three-dimensional coordinate vector constraint. The preset horizontal-axis constraint function may be a constraint function on the preset obstacle vehicle heading angle unit vector; for example, it may be [formula] and [formula]. Requiring the product of a horizontal-axis three-dimensional coordinate vector and the preset unit vector to equal 0 expresses that the vector and the unit vector are perpendicular.
Thirdly, for every two three-dimensional coordinates adjacent along the longitudinal axis among the three-dimensional coordinates, the following steps are executed:
In a first substep, the two three-dimensional coordinates are composed into a three-dimensional coordinate vector. As shown in FIG. 2, the three-dimensional coordinate vectors composed of two coordinates adjacent along the longitudinal axis may include [vector] and [vector], where [vector] may be the vector from the three-dimensional coordinate of the No. 3 wheel grounding point to that of the No. 0 wheel grounding point, and [vector] may be the vector from the three-dimensional coordinate of the No. 2 wheel grounding point to that of the No. 1 wheel grounding point.
In a second substep, the composed three-dimensional coordinate vector and the preset obstacle vehicle heading angle unit vector are input into a preset longitudinal-axis three-dimensional coordinate vector constraint function to obtain a longitudinal-axis three-dimensional coordinate vector constraint. The preset longitudinal-axis constraint function may be a constraint function on the preset obstacle vehicle heading angle unit vector; for example, it may be [formula] and [formula]. Requiring the product of the unit vector of a longitudinal-axis three-dimensional coordinate vector and the preset obstacle vehicle heading angle unit vector to equal 1 expresses that the two coincide.
Fourthly, an obstacle vehicle heading angle unit vector constraint is established according to the initial obstacle vehicle heading angle constraint and the obtained horizontal-axis and longitudinal-axis three-dimensional coordinate vector constraints. The obstacle vehicle heading angle unit vector constraint may be a constraint built from the initial obstacle vehicle heading angle constraint and the obtained horizontal-axis and longitudinal-axis three-dimensional coordinate vector constraints. In practice, the execution agent may combine the initial obstacle vehicle heading angle constraint, the horizontal-axis three-dimensional coordinate vector constraints, and the longitudinal-axis three-dimensional coordinate vector constraints into a system of equations, and use the combined system as the obstacle vehicle heading angle unit vector constraint.
Fifthly, the unit vector of the obstacle vehicle heading angle is generated according to the obstacle vehicle heading angle unit vector constraint. In practice, the execution body may solve the constraint system for the preset unit vector by the least squares method, and then apply vector normalization to the solved vector to obtain the obstacle vehicle heading angle unit vector. The generated unit vector can then be used to generate the obstacle vehicle heading angle.
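The constraint solving described above can be sketched numerically. Everything below is an illustration rather than the patent's code: the wheel ordering (0/1 front, 3/2 rear), the equal weighting of the prior rows, the use of only the ground-plane (x, y) components, and the use of numpy's `lstsq` are all assumptions:

```python
import numpy as np

def heading_unit_vector(wheel_pts, theta0):
    """Solve for the obstacle-vehicle heading unit vector by least squares.

    wheel_pts: (4, 2) ground-plane (x, y) coordinates of wheel grounding
    points, ordered 0 front-left, 1 front-right, 2 rear-right, 3 rear-left
    (illustrative ordering). theta0: initial heading from the detector.

    Constraint rows of A n = b:
      lateral vectors (0->1, 3->2)          dot n = 0  (perpendicular)
      longitudinal unit vectors (3->0, 2->1) dot n = 1  (parallel)
      prior rows: n ~ (cos theta0, sin theta0)
    """
    p = np.asarray(wheel_pts, float)
    lat = [p[1] - p[0], p[2] - p[3]]                  # left-right axles
    lon = [p[0] - p[3], p[1] - p[2]]                  # rear-to-front sides
    lon = [v / np.linalg.norm(v) for v in lon]
    A = np.array(lat + lon + [[1, 0], [0, 1]], dtype=float)
    b = np.array([0, 0, 1, 1, np.cos(theta0), np.sin(theta0)])
    n, *_ = np.linalg.lstsq(A, b, rcond=None)
    return n / np.linalg.norm(n)                      # normalize, as in step five

# Axis-aligned vehicle (true heading 0) with a slightly noisy prior of 0.1 rad.
n = heading_unit_vector([[2, 1], [2, -1], [0, -1], [0, 1]], theta0=0.1)
```

The geometric rows dominate the prior here, so the recovered unit vector stays close to the true heading even when the detector's initial angle is off.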
Step 105, generating the obstacle vehicle heading angle according to the unit vector of the obstacle vehicle heading angle.
In some embodiments, the execution subject may generate the obstacle vehicle heading angle from the obstacle vehicle heading angle unit vector. The obstacle vehicle heading angle can represent the rotation direction and angle of the obstacle vehicle. In practice, the execution body may input the first element of the unit vector into the arccosine function to obtain the magnitude of the obstacle vehicle heading angle. Then, the sign of the second element of the unit vector may be used to obtain the rotation direction. Here, the rotation direction may be indicated by a positive or negative sign. Finally, the rotation direction and the heading angle magnitude may be combined into the obstacle vehicle heading angle. Therefore, the rotation direction and rotation angle of the obstacle vehicle can be obtained, further improving the accuracy of the heading angle.
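A minimal sketch of this angle recovery (not the patent's code; the sign convention is an assumption, and the result is equivalent to `atan2(ny, nx)`):

```python
import math

def heading_angle(n):
    """Recover the heading angle from a heading unit vector n = (nx, ny).

    Following the text: the arccosine of the first element gives the
    magnitude, and the sign of the second element gives the direction.
    """
    magnitude = math.acos(max(-1.0, min(1.0, n[0])))  # clamp for safety
    direction = 1.0 if n[1] >= 0 else -1.0
    return direction * magnitude

a = heading_angle((math.cos(0.5), math.sin(0.5)))    # positive heading
b = heading_angle((math.cos(-0.7), math.sin(-0.7)))  # negative heading
```

Clamping the arccosine argument guards against unit vectors whose first component drifts marginally outside [-1, 1] after normalization.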
And 106, generating a rotation matrix of the obstacle vehicle in a vehicle coordinate system according to the heading angle of the obstacle vehicle.
In some embodiments, the execution body may generate a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle. The rotation direction of the obstacle vehicle may be a clockwise direction or a counterclockwise direction. The rotation matrix may represent the rotation posture of the obstacle vehicle relative to the target vehicle. In practice, the execution body may generate the rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the heading angle of the obstacle vehicle in various ways.
In some optional implementations of some embodiments, the executing entity may generate a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle by:
In the first step, in response to determining that the heading angle of the obstacle vehicle is greater than a first preset value, the rotation direction of the obstacle vehicle is determined to be the clockwise direction. For example, the first preset value may be 0. Here, the specific setting of the first preset value is not limited.
And a second step of determining the rotation direction of the obstacle vehicle as a counterclockwise direction in response to determining that the heading angle of the obstacle vehicle is smaller than the first preset value.
And thirdly, determining whether the front camera is horizontally installed.
And fourthly, in response to determining that the front camera is horizontally installed, determining a second preset value as the obstacle vehicle pitch angle and the obstacle vehicle roll angle, respectively. In practice, the values of the obstacle vehicle pitch angle and the obstacle vehicle roll angle may each be determined as the second preset value. For example, the second preset value may be 0. Here, the specific setting of the second preset value is not limited.
And fifthly, generating a rotation matrix according to the obstacle vehicle pitch angle, the obstacle vehicle roll angle, and the obstacle vehicle heading angle. In practice, first, the execution body may rotate by the obstacle vehicle pitch angle about the x-axis to obtain an x-axis rotation matrix. Then, a rotation by the obstacle vehicle roll angle about the y-axis gives a y-axis rotation matrix. Then, a rotation by the obstacle vehicle heading angle about the z-axis gives a z-axis rotation matrix. Finally, the product of the x-axis rotation matrix, the y-axis rotation matrix, and the z-axis rotation matrix is determined as the rotation matrix of the obstacle vehicle. Thereby, the rotation matrix can be generated, and thus the rotation posture of the obstacle vehicle can be further obtained.
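The composition in the fifth step can be sketched as follows (a plain-Python sketch; the function names and angle conventions are illustrative, with pitch about x, roll about y, and heading about z as the step describes). With a horizontally installed camera, pitch and roll are 0 and the result reduces to the z-axis rotation alone.

```python
import math

def axis_rotations(pitch, roll, yaw):
    # Per-axis rotation matrices: pitch about x, roll about y, yaw about z.
    cx, sx = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(roll), math.sin(roll)
    cz, sz = math.cos(yaw), math.sin(yaw)
    rx = [[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]]
    ry = [[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]]
    rz = [[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]]
    return rx, ry, rz

def matmul(a, b):
    # Plain 3x3 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_matrix(pitch, roll, yaw):
    # Product of the x-, y-, and z-axis rotation matrices, as in the
    # fifth step.
    rx, ry, rz = axis_rotations(pitch, roll, yaw)
    return matmul(matmul(rx, ry), rz)
```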
And step 107, generating the central coordinates of the obstacle vehicle in the vehicle coordinate system according to the rotation matrix.
In some embodiments, the execution body may generate center coordinates of the obstacle vehicle in a vehicle coordinate system based on the rotation matrix. In practice, the execution body may generate the center coordinates of the obstacle vehicle in the vehicle coordinate system from the rotation matrix in various ways.
In some optional implementations of some embodiments, the executing body may generate the center coordinates of the obstacle vehicle in the vehicle coordinate system from the rotation matrix by:
firstly, inputting the foreground image into a vehicle type detection model trained in advance to obtain vehicle type information of the obstacle vehicle. The vehicle type detection model can be used for detecting type information of the obstacle vehicle. The vehicle type detection model may be, but is not limited to, one of the following: a CooVally detection model and an OpenVINO detection model. The vehicle type information may include a vehicle type. For example, the vehicle type may be, but is not limited to, one of the following: trucks, buses.
And secondly, selecting preset vehicle type information with the same preset vehicle type as the vehicle type from the preset vehicle type information set as target preset vehicle type information. The preset vehicle type information in the preset vehicle type information set may include a preset vehicle type and preset vehicle information corresponding to the preset vehicle type. The preset vehicle information may include a preset wheel base, a preset vehicle length, a preset vehicle width, and a preset vehicle height.
And thirdly, determining a preset wheel base included in the target preset vehicle type information as the vehicle wheel base of the obstacle vehicle.
And fourthly, determining the preset vehicle length included by the target preset vehicle type information as the vehicle length of the obstacle vehicle.
And fifthly, determining the preset vehicle width included in the target preset vehicle type information as the vehicle width of the obstacle vehicle.
And sixthly, determining the preset vehicle height included by the target preset vehicle type information as the vehicle height of the obstacle vehicle.
Seventh, a wheel grounding point is selected from the respective wheel grounding points as a target wheel grounding point. The target wheel ground point may be any wheel ground point of an obstacle vehicle.
And an eighth step of determining coordinates of the target wheel grounding point in a coordinate system of the obstacle vehicle as obstacle vehicle coordinates based on the vehicle wheel base, the vehicle length, the vehicle width, the vehicle height, and the three-dimensional coordinates of the target wheel grounding point. In practice, first, the execution body may determine a ratio of the vehicle wheel base to the vehicle length. Then, the product of the above ratio and one half of the vehicle length may be determined as the x-coordinate of the target wheel grounding point in the coordinate system of the obstacle vehicle. Then, one half of the vehicle width may be determined as the y-coordinate of the target wheel grounding point in the coordinate system of the obstacle vehicle. Finally, the z-coordinate of the target wheel grounding point in the coordinate system of the obstacle vehicle is determined by means of the established coordinate system of the obstacle vehicle. For example, the z-coordinate of the target wheel grounding point in the coordinate system of the obstacle vehicle may be 0.
And a ninth step of determining a bottom edge center coordinate of the obstacle vehicle based on the three-dimensional coordinate of the target wheel grounding point, the obstacle vehicle coordinate, and the rotation matrix. In practice, first, the execution body may determine a product of the rotation matrix and the obstacle vehicle coordinates. Then, a difference between the three-dimensional coordinate of the target wheel grounding point and the product may be determined as a bottom side center coordinate of the obstacle vehicle.
Tenth, the center coordinates of the obstacle vehicle in the vehicle coordinate system are determined based on the vehicle height and the bottom edge center coordinates. In practice, first, the execution body may determine the x-coordinate of the bottom edge center coordinate as the x-coordinate of the center coordinate. Then, the y-coordinate of the bottom edge center coordinate is determined as the y-coordinate of the center coordinate. Finally, the sum of the z-coordinate of the bottom edge center coordinate and one half of the vehicle height is determined as the z-coordinate of the center coordinate. For example, the vehicle coordinate system may be a coordinate system established with the bottom center as an origin. Thereby, the center coordinates of the obstacle vehicle in the vehicle coordinate system can be generated, and thus the bounding box of the obstacle vehicle can be further generated.
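The ninth and tenth steps can be sketched as follows, assuming the fraction placeholders in the original text denote one half (that assumption, and the function name, are illustrative only): the bottom edge center is the grounding point minus the rotated local wheel coordinates, and the box center is that point lifted by half the vehicle height.

```python
def center_from_ground_point(ground_pt, local_pt, rotation, vehicle_height):
    # Ninth step: bottom edge center = three-dimensional grounding-point
    # coordinates minus (rotation matrix x local wheel coordinates).
    rotated = [sum(rotation[i][k] * local_pt[k] for k in range(3))
               for i in range(3)]
    bottom = [ground_pt[i] - rotated[i] for i in range(3)]
    # Tenth step: keep x and y; lift z by half the vehicle height
    # (assumed value of the placeholder fraction).
    return [bottom[0], bottom[1], bottom[2] + vehicle_height / 2.0]
```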
And step 108, generating a bounding box of the obstacle vehicle according to the central coordinate and the rotation matrix.
In some embodiments, the execution body may generate a bounding box of the obstacle vehicle based on the center coordinates and the rotation matrix. In practice, the execution body described above may generate the bounding box of the obstacle vehicle from the center coordinates and the rotation matrix in various ways.
In some optional implementations of some embodiments, the executing entity may generate the bounding box of the obstacle vehicle according to the central coordinate and the rotation matrix by:
first, bounding box coordinate information of the obstacle vehicle is determined based on the vehicle length, the vehicle width, and the vehicle height. In practice, the execution body may determine bounding box coordinate information of the obstacle vehicle by using a bounding box algorithm based on the vehicle length, the vehicle width, and the vehicle height to determine frame data of a bounding box. For example, the bounding box algorithm described above may be, but is not limited to, one of: AABB bounding boxes, OBB bounding boxes.
And secondly, generating bounding box position information according to the rotation matrix and the central coordinate. In practice, first, the execution body may determine the rotation matrix as a bounding box rotation posture. The center coordinates can then be determined as bounding box center coordinates. Finally, the bounding box rotational attitude and the bounding box center coordinate may be combined into bounding box position information.
And thirdly, generating a bounding box of the obstacle vehicle according to the bounding box coordinate information and the bounding box position information. In practice, the execution body may combine the bounding box coordinate information and the bounding box position information to obtain bounding box information, and create the bounding box. Thus, the bounding box of the obstacle vehicle can be generated, and the coordinate information and position information of the obstacle vehicle can be obtained.
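The combination in the second and third steps amounts to bundling the frame data with the pose; a minimal sketch (the record layout and field names are hypothetical, not defined by the original):

```python
def build_bounding_box(frame_data, rotation, center):
    # Bundle the bounding-box frame data (coordinate information) with
    # the rotation attitude and center coordinate (position information)
    # into one bounding-box record.
    return {
        "coordinates": frame_data,
        "rotation": rotation,
        "center": center,
    }
```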
The above technical solution serves as an invention point of the embodiments of the present disclosure and solves the technical problem mentioned in the background art: when the obstacle vehicle is severely occluded, the accuracy of the obstacle vehicle position generated by the neural network model is low, so that the traveling vehicle frequently fails to safely avoid the obstacle vehicle and safety is poor. The factor causing these failures and the poor safety is as follows: when the occlusion of the obstacle vehicle is severe, the accuracy of the obstacle vehicle position generated by the neural network model is low. If this factor is addressed, the effects of reducing the number of times the traveling vehicle cannot safely avoid the obstacle vehicle and improving traveling safety can be achieved. To achieve this effect, when the obstacle vehicle is severely occluded and the generated obstacle vehicle position has low accuracy, the center coordinates of the obstacle vehicle in the vehicle coordinate system are generated by using the rotation matrix. Then, the center coordinates and the rotation matrix are processed to generate the bounding box of the obstacle vehicle. The bounding box may represent the position information and coordinate information of the obstacle vehicle. Therefore, the accuracy of the generated obstacle vehicle position is improved, the number of times the traveling vehicle cannot safely avoid the obstacle vehicle is reduced, and traveling safety is improved.
And step 109, generating a running route of the target vehicle based on the bounding box.
In some embodiments, the execution subject may generate the travel route of the target vehicle based on the bounding box. The travel route may be a route on which the target vehicle travels. In practice, the execution body may generate the travel route of the target vehicle based on the bounding box in various ways.
In some optional implementations of some embodiments, the executing body may generate the driving route of the target vehicle based on the bounding box by:
first, the shortest distance between the obstacle vehicle and the target vehicle is determined according to the bounding box. In practice, the execution body may determine the coordinates and the position information of the bounding box as the coordinates and the position information of the obstacle vehicle. Then, the distance between the coordinates of the obstacle vehicle and the coordinates of the target vehicle may be determined as the shortest distance between the obstacle vehicle and the target vehicle.
And a second step of determining the original driving route of the target vehicle as the driving route of the target vehicle in response to the fact that the shortest distance is greater than or equal to a preset shortest distance.
And thirdly, generating an optimal running route of the target vehicle for avoiding the obstacle vehicle as the running route of the target vehicle in response to the fact that the shortest distance is smaller than the preset shortest distance. In practice, the executing body may generate the driving route of the target vehicle by using an obstacle avoidance path planning algorithm. For example, the preset shortest distance may be 1 meter.
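The route decision in the second and third steps reduces to a threshold check; a sketch in which the obstacle avoidance path planning algorithm is abstracted into a callable (the names and the 1-meter default are illustrative, taken from the example value above):

```python
def choose_route(shortest_distance, original_route, plan_avoidance_route,
                 preset_shortest_distance=1.0):
    # Second step: keep the original route when the obstacle is at least
    # the preset shortest distance away.
    if shortest_distance >= preset_shortest_distance:
        return original_route
    # Third step: otherwise, plan an optimal route that avoids the
    # obstacle vehicle.
    return plan_avoidance_route()
```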
In some optional implementations of some embodiments, the execution body may generate an optimal travel route of the target vehicle avoiding the obstacle vehicle as the travel route of the target vehicle according to the coordinate information and the position information of the bounding box. In practice, the executing subject may generate the driving route of the target vehicle by using an obstacle avoidance path planning algorithm.
Alternatively, after step 109, the executing body may further control the target vehicle to travel according to the travel route.
The above embodiments of the present disclosure have the following advantages: by means of the driving route generation method of some embodiments of the present disclosure, the number of times that the traveling vehicle cannot safely avoid the obstacle vehicle is reduced, and driving safety is improved. Specifically, the reason the traveling vehicle frequently cannot safely avoid the obstacle vehicle, and safety is poor, is that: when the obstacle vehicle is severely occluded, the accuracy of the obstacle vehicle position generated by the neural network model is low. Based on this, the driving route generation method of some embodiments of the present disclosure first determines respective two-dimensional coordinates of respective wheel grounding points of the obstacle vehicle from a foreground image acquired by a front camera of the target vehicle. Thereby, the coordinates of the respective wheel grounding points in the foreground image can be determined. Then, respective three-dimensional coordinates of the wheel grounding points in a vehicle coordinate system of the target vehicle are determined based on the respective two-dimensional coordinates. Thereby, the respective three-dimensional coordinates of the respective wheel grounding points can be obtained. Next, the heading angle of the obstacle vehicle detected by the target detection model is determined as an initial obstacle vehicle heading angle. Thus, the initial obstacle vehicle heading angle can be obtained and used for generating the obstacle vehicle heading angle unit vector. Next, a unit vector of the heading angle of the obstacle vehicle is generated according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates.
Thereby, the unit vector of the heading angle of the obstacle vehicle can be obtained, and the accuracy of the heading angle of the obstacle vehicle can be improved. Then, the heading angle of the obstacle vehicle is generated according to the unit vector of the heading angle of the obstacle vehicle. Therefore, the heading angle of the obstacle vehicle can be obtained, and its accuracy is improved. Then, a rotation matrix of the obstacle vehicle in the vehicle coordinate system is generated according to the heading angle of the obstacle vehicle. Thus, the rotation matrix can characterize the rotation posture of the obstacle vehicle. Next, center coordinates of the obstacle vehicle in the vehicle coordinate system are generated based on the rotation matrix. Thus, the center coordinates can characterize the coordinate position of the obstacle vehicle. Next, a bounding box of the obstacle vehicle is generated based on the center coordinates and the rotation matrix. Thus, the bounding box can characterize the rotation direction and rotation posture of the obstacle vehicle. Finally, a travel route of the target vehicle is generated based on the bounding box. Thus, the travel route can serve as the route along which the target vehicle travels. The heading angle of the obstacle vehicle is not determined directly by the neural network model; instead, it is generated from the three-dimensional coordinates of the wheel grounding points of the obstacle vehicle and the initial obstacle vehicle heading angle detected by the target detection model, so the accuracy of the heading angle of the obstacle vehicle is improved. Moreover, because the rotation matrix of the obstacle vehicle and the center coordinates of the obstacle vehicle are determined, the bounding box of the obstacle vehicle is further generated, and the position and posture information of the obstacle vehicle is further determined.
Therefore, the travel route of the target vehicle is generated; through the generated travel route, the number of times that the traveling vehicle cannot safely avoid the obstacle vehicle can be further reduced, and driving safety is improved.
With further reference to fig. 3, as an implementation of the methods illustrated in the above figures, the present disclosure provides some embodiments of a driving route generation apparatus, which correspond to those method embodiments illustrated in fig. 1, and which may be applied in particular to various electronic devices.
As shown in fig. 3, a travel route generation device 300 of some embodiments includes: a first determination unit 301, a second determination unit 302, a third determination unit 303, a first generation unit 304, a second generation unit 305, a third generation unit 306, a fourth generation unit 307, a fifth generation unit 308, and a sixth generation unit 309. Wherein the first determining unit 301 is configured to determine respective two-dimensional coordinates of respective wheel grounding points of the obstacle vehicle from a foreground image acquired by a front camera of the target vehicle; the second determining unit 302 is configured to determine, from the respective two-dimensional coordinates, respective three-dimensional coordinates of the respective wheel grounding points in the vehicle coordinate system of the target vehicle; the third determining unit 303 is configured to determine the heading angle of the obstacle vehicle detected by the target detection model as an initial obstacle vehicle heading angle; the first generating unit 304 is configured to generate a unit vector of the heading angle of the obstacle vehicle according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates; the second generating unit 305 is configured to generate an obstacle vehicle heading angle from the obstacle vehicle heading angle unit vector; the third generating unit 306 is configured to generate a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle; the fourth generation unit 307 is configured to generate center coordinates of the obstacle vehicle in the vehicle coordinate system, based on the rotation matrix; the fifth generating unit 308 is configured to generate a bounding box of the obstacle vehicle based on the center coordinates and the rotation matrix; the sixth generating unit 309 is configured to generate the travel route of the target vehicle based on the bounding box.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations, features and advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described herein again.
Referring now to FIG. 4, shown is a block diagram of an electronic device 400 (e.g., a computing device) suitable for use in implementing some embodiments of the present disclosure. The electronic device in some embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, electronic device 400 may include a processing device 401 (e.g., central processing unit, graphics processor, etc.) that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic apparatus 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 408 including, for example, magnetic tape, hard disk, etc.; and a communication device 409. The communication device 409 may allow the electronic device 400 to communicate with other devices, either wirelessly or by wire, to exchange data. While fig. 4 illustrates an electronic device 400 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing device 401, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: determining each two-dimensional coordinate of each wheel grounding point of the obstacle vehicle according to the foreground image acquired by the front camera of the target vehicle; determining three-dimensional coordinates of the wheel contact points in a vehicle coordinate system of the target vehicle based on the two-dimensional coordinates; determining the course angle of the obstacle vehicle detected by the target detection model as an initial obstacle vehicle course angle; generating a unit vector of the heading angle of the obstacle vehicle according to the initial heading angle of the obstacle vehicle and the three-dimensional coordinates; generating a course angle of the obstacle vehicle according to the course angle unit vector of the obstacle vehicle; generating a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the course angle of the obstacle vehicle; generating center coordinates of the obstacle vehicle in the vehicle coordinate system based on the rotation matrix; generating a bounding box of the obstacle vehicle according to the central coordinate and the rotation matrix; and generating a driving route of the target vehicle based on the bounding box.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a first determining unit, a second determining unit, a third determining unit, a first generating unit, a second generating unit, a third generating unit, a fourth generating unit, a fifth generating unit, and a sixth generating unit. Where the names of the units do not in some cases constitute a limitation on the units themselves, the first determination unit may also be described as a "unit that determines respective two-dimensional coordinates of respective wheel contact points of the obstacle vehicle from a foreground image acquired by a front camera of the target vehicle", for example.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), application Specific Integrated Circuits (ASICs), application Specific Standard Products (ASSPs), systems on a chip (SOCs), complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is only exemplary of the preferred embodiments of the present disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept. For example, it encompasses technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure.

Claims (9)

1. A travel route generation method, comprising:
determining two-dimensional coordinates of respective wheel ground-contact points of an obstacle vehicle according to a foreground image captured by a front camera of a target vehicle;
determining, according to the respective two-dimensional coordinates, three-dimensional coordinates of the respective wheel ground-contact points in a vehicle coordinate system of the target vehicle;
determining a heading angle of the obstacle vehicle detected by a target detection model as an initial obstacle vehicle heading angle;
generating an obstacle vehicle heading angle unit vector according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates;
generating an obstacle vehicle heading angle according to the obstacle vehicle heading angle unit vector;
generating a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle;
generating center coordinates of the obstacle vehicle in the vehicle coordinate system according to the rotation matrix;
generating a bounding box of the obstacle vehicle according to the center coordinates and the rotation matrix;
generating a travel route of the target vehicle based on the bounding box.
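The first two steps of claim 1 amount to back-projecting image points onto the road surface. Below is a minimal sketch of that inverse projection, assuming a pinhole camera with known intrinsics (fx, fy, cx, cy) mounted level at a known height with its axes aligned to the vehicle; none of these specifics are stated in the claim, so the function name and the coordinate conventions are illustrative assumptions.

```python
import numpy as np

def wheel_pixel_to_vehicle_frame(u, v, fx, fy, cx, cy, cam_height):
    """Back-project a wheel ground-contact pixel (u, v) into the vehicle
    coordinate system, assuming the point lies on a flat ground plane and the
    camera is mounted level at cam_height above the road, axes aligned with
    the vehicle (an illustrative assumption, not disclosed in the claim)."""
    # Normalized ray direction in the camera frame (x right, y down, z forward).
    x = (u - cx) / fx
    y = (v - cy) / fy
    # Intersect the ray with the ground plane: the camera's y-axis points down,
    # so the road lies at y = cam_height. Valid only for pixels below the horizon.
    depth = cam_height / y
    x_cam = x * depth            # lateral offset in the camera frame
    z_cam = depth                # forward distance in the camera frame
    # Vehicle frame assumed here: x forward, y left, z up; ground points have z = 0.
    return np.array([z_cam, -x_cam, 0.0])
```

A point imaged 100 pixels below the principal point of a camera at 1.5 m height with fy = 1000 therefore lands 15 m ahead of the vehicle.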
2. The method of claim 1, wherein said generating an obstacle vehicle heading angle unit vector according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates comprises:
inputting the initial obstacle vehicle heading angle and a preset obstacle vehicle heading angle unit vector into a preset heading angle constraint function to obtain an initial obstacle vehicle heading angle constraint;
for every two three-dimensional coordinates adjacent along the transverse axis, performing the following steps:
forming a three-dimensional coordinate vector from the two three-dimensional coordinates;
inputting the formed three-dimensional coordinate vector and the preset obstacle vehicle heading angle unit vector into a preset transverse-axis three-dimensional coordinate vector constraint function to obtain a transverse-axis three-dimensional coordinate vector constraint;
for every two three-dimensional coordinates adjacent along the longitudinal axis, performing the following steps:
forming a three-dimensional coordinate vector from the two three-dimensional coordinates;
inputting the formed three-dimensional coordinate vector and the preset obstacle vehicle heading angle unit vector into a preset longitudinal-axis three-dimensional coordinate vector constraint function to obtain a longitudinal-axis three-dimensional coordinate vector constraint;
establishing an obstacle vehicle heading angle unit vector constraint according to the initial obstacle vehicle heading angle constraint, the obtained transverse-axis three-dimensional coordinate vector constraints, and the obtained longitudinal-axis three-dimensional coordinate vector constraints;
generating the obstacle vehicle heading angle unit vector according to the obstacle vehicle heading angle unit vector constraint.
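Claim 2 does not disclose its constraint functions, but a natural reading is that vectors between transversely adjacent wheel points should be perpendicular to the heading, vectors between longitudinally adjacent points should be parallel to it, and the result should stay near the detector's initial estimate. The sketch below scores those three soft constraints over a grid of candidate yaws; the least-squares weighting and the grid search are illustrative assumptions, not the claimed method.

```python
import numpy as np

def fit_heading_unit_vector(initial_yaw, transverse_pairs, longitudinal_pairs, w_init=1.0):
    """Estimate the obstacle-vehicle heading unit vector h = (cos yaw, sin yaw)
    by scanning candidate yaws and summing three soft penalties:
      * deviation from the detector's initial heading angle,
      * dot products of transverse wheel-point vectors with h (should be 0),
      * 2-D cross products of longitudinal wheel-point vectors with h (should be 0)."""
    best_yaw, best_cost = initial_yaw, float("inf")
    for yaw in np.linspace(initial_yaw - np.pi / 4, initial_yaw + np.pi / 4, 721):
        h = np.array([np.cos(yaw), np.sin(yaw)])
        cost = w_init * (yaw - initial_yaw) ** 2
        for a, b in transverse_pairs:      # perpendicularity: dot product ≈ 0
            d = np.subtract(b, a)[:2]
            cost += float(d @ h) ** 2
        for a, b in longitudinal_pairs:    # parallelism: 2-D cross product ≈ 0
            d = np.subtract(b, a)[:2]
            cost += (d[0] * h[1] - d[1] * h[0]) ** 2
        if cost < best_cost:
            best_yaw, best_cost = yaw, cost
    return np.array([np.cos(best_yaw), np.sin(best_yaw)]), best_yaw
```

With clean wheel geometry aligned to the x-axis and a slightly biased initial yaw, the geometric constraints pull the estimate back toward the true heading.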
3. The method of claim 1, wherein said generating a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle comprises:
in response to determining that the obstacle vehicle heading angle is greater than a first preset value, determining a rotation direction of the obstacle vehicle as a clockwise direction;
in response to determining that the obstacle vehicle heading angle is less than the first preset value, determining the rotation direction of the obstacle vehicle as a counterclockwise direction;
determining whether the front camera is horizontally mounted;
in response to determining that the front camera is horizontally mounted, determining a second preset value as each of an obstacle vehicle pitch angle and an obstacle vehicle roll angle;
generating the rotation matrix according to the obstacle vehicle pitch angle, the obstacle vehicle roll angle, and the obstacle vehicle heading angle.
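When the front camera is level, claim 3 fixes pitch and roll to a preset value, typically zero, so the general Euler-angle rotation collapses to a single rotation about the vertical axis. The following sketch assumes that zero-pitch, zero-roll case and a z-up vehicle frame (conventions the claim does not specify):

```python
import numpy as np

def yaw_rotation_matrix(heading_angle):
    """Rotation of the obstacle vehicle about the vertical (z) axis of the ego
    vehicle frame. With pitch and roll preset to 0, the full Euler-angle
    matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll) reduces to Rz(yaw)."""
    c, s = np.cos(heading_angle), np.sin(heading_angle)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
```

A zero heading angle yields the identity, and a 90° heading rotates the obstacle's forward axis onto the ego's lateral axis.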
4. The method of claim 1, wherein said generating a travel route of the target vehicle based on the bounding box comprises:
determining a shortest distance between the obstacle vehicle and the target vehicle according to the bounding box;
in response to determining that the shortest distance is greater than or equal to a preset shortest distance, determining an original travel route of the target vehicle as the travel route of the target vehicle;
in response to determining that the shortest distance is less than the preset shortest distance, generating an optimal travel route by which the target vehicle avoids the obstacle vehicle as the travel route of the target vehicle.
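The decision logic of claim 4 is a simple threshold test on the obstacle's bounding box. A minimal sketch, using corner-to-point distance as a stand-in for the undisclosed shortest-distance computation (a full implementation would use point-to-polygon distance):

```python
import numpy as np

def choose_route(original_route, avoidance_route, box_corners, ego_xy, preset_shortest):
    """Keep the original route when every bounding-box corner is at least
    preset_shortest away from the ego position; otherwise switch to an
    avoidance route. Routes are opaque objects; only the decision is shown."""
    shortest = min(np.hypot(x - ego_xy[0], y - ego_xy[1]) for x, y in box_corners)
    return original_route if shortest >= preset_shortest else avoidance_route
```

For example, a box 10 m ahead clears a 5 m threshold, while the same box seen from 1 m away triggers the avoidance route.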
5. The method of claim 1, wherein said generating a travel route of the target vehicle based on the bounding box comprises:
generating, according to position information of the bounding box, an optimal travel route by which the target vehicle avoids the obstacle vehicle as the travel route of the target vehicle.
6. The method according to any one of claims 1-5, wherein the method further comprises:
controlling the target vehicle to travel according to the travel route.
7. A travel route generation device, comprising:
a first determining unit configured to determine two-dimensional coordinates of respective wheel ground-contact points of an obstacle vehicle according to a foreground image captured by a front camera of a target vehicle;
a second determining unit configured to determine, according to the respective two-dimensional coordinates, three-dimensional coordinates of the respective wheel ground-contact points in a vehicle coordinate system of the target vehicle;
a third determining unit configured to determine a heading angle of the obstacle vehicle detected by a target detection model as an initial obstacle vehicle heading angle;
a first generating unit configured to generate an obstacle vehicle heading angle unit vector according to the initial obstacle vehicle heading angle and the respective three-dimensional coordinates;
a second generating unit configured to generate an obstacle vehicle heading angle according to the obstacle vehicle heading angle unit vector;
a third generating unit configured to generate a rotation matrix of the obstacle vehicle in the vehicle coordinate system according to the obstacle vehicle heading angle;
a fourth generating unit configured to generate center coordinates of the obstacle vehicle in the vehicle coordinate system according to the rotation matrix;
a fifth generating unit configured to generate a bounding box of the obstacle vehicle according to the center coordinates and the rotation matrix;
a sixth generating unit configured to generate a travel route of the target vehicle based on the bounding box.
8. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
9. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-6.
CN202211422241.6A 2022-11-15 2022-11-15 Driving route generation method and device, electronic equipment and computer readable medium Active CN115588185B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211422241.6A CN115588185B (en) 2022-11-15 2022-11-15 Driving route generation method and device, electronic equipment and computer readable medium


Publications (2)

Publication Number Publication Date
CN115588185A true CN115588185A (en) 2023-01-10
CN115588185B CN115588185B (en) 2023-03-14

Family

ID=84782990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211422241.6A Active CN115588185B (en) 2022-11-15 2022-11-15 Driving route generation method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN115588185B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200089245A1 (en) * 2018-09-14 2020-03-19 Peyman Yadmellat System and method for hierarchical planning in autonomous vehicles
CN110962844A (en) * 2019-10-28 2020-04-07 纵目科技(上海)股份有限公司 Vehicle course angle correction method and system, storage medium and terminal
US20200149906A1 (en) * 2017-08-31 2020-05-14 Guangzhou Xiaopeng Motors Technology Co., Ltd. Path planning method, system and device for autonomous driving
CN114973198A (en) * 2022-05-27 2022-08-30 智道网联科技(北京)有限公司 Course angle prediction method and device of target vehicle, electronic equipment and storage medium
CN115185271A (en) * 2022-06-29 2022-10-14 禾多科技(北京)有限公司 Navigation path generation method and device, electronic equipment and computer readable medium
CN115257727A (en) * 2022-09-27 2022-11-01 禾多科技(北京)有限公司 Obstacle information fusion method and device, electronic equipment and computer readable medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant