CN113978456A - Reversing image method with target display - Google Patents

Reversing image method with target display

Info

Publication number
CN113978456A
CN113978456A (application CN202111211609.XA)
Authority
CN
China
Prior art keywords
target
coordinates
image
steering
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111211609.XA
Other languages
Chinese (zh)
Other versions
CN113978456B (en)
Inventor
田雨禾
吴迪
李红吉
曹包华
陈浩
刘光远
王志伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Faw Fusheng Group Co ltd
Original Assignee
Changchun Faw Fusheng Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Faw Fusheng Group Co ltd filed Critical Changchun Faw Fusheng Group Co ltd
Priority to CN202111211609.XA priority Critical patent/CN113978456B/en
Publication of CN113978456A publication Critical patent/CN113978456A/en
Application granted granted Critical
Publication of CN113978456B publication Critical patent/CN113978456B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/0956 — Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W40/00 — Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 — Estimation or calculation of such parameters related to ambient conditions
    • B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 — Display means

Abstract

The invention belongs to the field of vehicle networking (Internet of Vehicles) and relates to a reversing-image method with target display. The method acquires the steering-wheel angle and steering direction and computes the front-wheel angle; computes the reversing travel area from the front-wheel angle; detects targets in the rear-view image and computes each target's position from its pixel coordinates; relates the target positions to the travel area; computes the rear-wheel trajectory lines; and displays the fused reversing-assist image. A conventional reversing image shows only the reversing trajectory, and the driver cannot reliably judge from estimation alone whether a target or obstacle behind the vehicle poses a collision risk. The present method automatically detects targets in the rear-view image and, by computing the distance between each target and the travel area in real time, prompts the user when a target presents a collision hazard. The invention improves the safety-assist performance of the reversing image and lets the user perceive potential risks more intuitively while reversing.

Description

Reversing image method with target display
Technical Field
The invention belongs to the technical field of vehicle networking and relates to a reversing-image method with target display.
Background
The reversing image, also called a parking-assist system, is widely used for reversing and driving-safety assistance on large, medium, and small vehicles. When the driver shifts into reverse, the system automatically switches to a camera at the rear of the vehicle and shows the scene behind the vehicle on the central control display. More advanced reversing-image systems overlay guide lines on the video; as the steering wheel turns, the guide lines update in real time so that the reversing trajectory is traced accurately.
As technology advances, more and more vehicles are equipped with automatic parking controllers. However, the efficiency of automatic parking and its coverage of parking scenarios are still limited, so the reversing image remains practical.
Considering the vehicle network layout, an automatic parking controller usually integrates the automatic-parking, 360-degree surround-view, and reversing-image functions. Because the cameras used on vehicles are ultra-wide-angle or fisheye lenses with severe distortion, a user can see pedestrians or obstacles in the reversing image but cannot easily judge their positions.
Disclosure of Invention
The invention aims to solve the technical problem that the reversing image in the prior art cannot automatically detect whether a target presents a collision hazard, and provides a reversing-image method with target display.
To improve the safety-assist capability of the reversing image, the invention realizes a reversing-image method with target display. The method not only displays the reversing trajectory but also detects specific targets using a deep-learning-based target-display technology. The detected targets include pedestrians, riders, carts, buckets, garbage cans, strollers, roadblocks, parking-space ground locks, cats, dogs, and the like, together with each target's location information.
To solve the above technical problems, the invention adopts the following technical solution, described below with reference to the accompanying drawings:
It is noted that, herein, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between them. Also, the terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus.
A reversing-image method with target display comprises the following steps:
Step 1: acquire the steering-wheel angle and steering direction, and compute the front-wheel angle;
Step 2: compute the reversing travel area from the front-wheel angle;
Step 3: detect targets in the rear-view image and compute each target's position from its pixel coordinates;
Step 4: relate the target positions to the travel area;
Step 5: compute the rear-wheel trajectory lines;
Step 6: fuse the rear-view image, target-box information, and trajectory lines into one display.
In step 1, the steering-wheel angle and steering direction are acquired and the front-wheel angle is computed, as follows:
The steering-wheel angle and steering direction are read from the vehicle network bus; dividing the steering-wheel angle by the steering ratio gives the front-wheel angle γ. The steering direction is denoted D, with -1 for a left turn and +1 for a right turn.
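Step 1 can be sketched minimally in Python (the patent specifies no implementation; the function name and the example steering-ratio value are illustrative):

```python
def front_wheel_angle(steering_wheel_deg: float, steering_ratio: float,
                      direction: int) -> float:
    """Front-wheel angle gamma: steering-wheel angle divided by the
    steering ratio. direction D is -1 for left steer, +1 for right steer,
    per the sign convention stated in the text."""
    assert direction in (-1, 1)
    return steering_wheel_deg / steering_ratio

# Example: a 180-degree steering-wheel turn with an assumed ratio of 16
gamma = front_wheel_angle(180.0, 16.0, direction=1)  # 11.25 degrees
```

The steering ratio is vehicle-specific and would in practice come from a calibration table rather than a constant.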
In step 2, the reversing travel area is computed from the front-wheel angle, as follows:
When the front-wheel angle γ = 0, the reversing trajectory of the vehicle is a straight line; when γ ≠ 0, it is a circle. The radius of the rear-axle-center circle is R = L_d / tan γ, where L_d is the wheelbase. Taking the initial rear-axle-center coordinates as (0, 0), the center O of the rear-axle trajectory lies at (D × R, 0), and every point on the vehicle body travels on a circle centered at O.
Further, the travel area is computed as follows:
The rear-axle trajectory is sampled every distance Δ; the sampling points are (x0, y0), (x1, y1), …, (xn, yn), where (x0, y0) is the initial coordinate (0, 0). [The sampling-trajectory formulas are rendered as images in the source and are not reproduced here.]
φ is the arc angle swept when the vehicle reverses through the distance Δ.
The sampling points of arc A are (x′0, y′0), (x′1, y′1), …, (x′h, y′h). [Formula image omitted in the source.]
The sampling points of arc B are (x″0, y″0), (x″1, y″1), …, (x″m, y″m). [Formula image omitted in the source.]
Here w is the vehicle width, d1 the distance from the vehicle front to the rear axle, and d2 the distance from the vehicle tail to the rear axle.
The region between arc A and arc B is the area the vehicle is about to traverse; any object in this region poses a collision risk.
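Because the sampling formulas appear only as images in the source, the following Python sketch reconstructs the rear-axle sampling from the stated geometry (radius R = L_d / tan γ, circle center at (D·R, 0), arc step φ = Δ/R). It is an assumption-based illustration, not the patent's exact formulas:

```python
import math

def sample_rear_axle_track(gamma_deg, direction, wheelbase, delta, n):
    """Sample n+1 points of the rear-axle reversing track.

    Assumed reconstruction: turning radius R = wheelbase / tan(gamma),
    circle centre O = (direction * R, 0), arc angle per step phi = delta / R.
    When gamma == 0 the track is a straight line, as stated in step 2.
    """
    if gamma_deg == 0:
        # Straight reverse; the sign convention for y is illustrative.
        return [(0.0, k * delta) for k in range(n + 1)]
    R = wheelbase / math.tan(math.radians(gamma_deg))
    cx = direction * R                 # centre of the trajectory circle
    phi = delta / R                    # radians swept per distance delta
    pts = []
    for k in range(n + 1):
        a = k * phi
        pts.append((cx - direction * R * math.cos(a), R * math.sin(a)))
    return pts
```

Arcs A and B would be sampled the same way with radii offset by the vehicle half-width w/2 and the overhangs d1, d2.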
In step 3, targets are detected in the rear-view image and each target's position is computed from its pixel coordinates, as follows:
Targets in the rear-view image are detected with a deep-learning object-recognition algorithm. The detected position information consists of pixel coordinates in the image, from which the target's position coordinates in the physical world are computed.
Let the pixel coordinate be (X, Y). Because the fisheye camera is distorted, the distortion-corrected coordinate (Xc, Yc) must be computed from the intrinsic matrix K and the distortion coefficients (k0, k1, k2, k3).
[The correction formulas are rendered as images in the source and are not reproduced here.]
(u, v) are the normalized image-plane coordinates, θ is the incident angle, and θd is the incident angle after distortion;
the corrected coordinates (X)c,Yc) Coordinates (X) mapped to the ground by mapping matrix Aw,Yw) The calculation process is as follows:
Figure BDA0003309150560000045
Figure BDA0003309150560000046
(Xt,Yt,Zt) Is the mapped coordinates.
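A hedged sketch of the step-3 correction, assuming the common equidistant fisheye model that matches the θd polynomial quoted later in the text; the Newton-iteration inverse, the intrinsics layout, and the homography values are illustrative assumptions, not the patent's (image-only) formulas:

```python
import math

def undistort_pixel(X, Y, K, dist):
    """Map a distorted pixel (X, Y) to corrected normalized coordinates.

    K = (fx, fy, cx, cy) intrinsics; dist = (k0, k1, k2, k3).
    Assumes the equidistant fisheye model, whose distorted radius equals
    theta_d = theta + k0*theta^3 + k1*theta^5 + k2*theta^7 + k3*theta^9;
    the polynomial is inverted by Newton iteration.
    """
    fx, fy, cx, cy = K
    u, v = (X - cx) / fx, (Y - cy) / fy
    r_d = math.hypot(u, v)             # distorted radius == theta_d
    if r_d == 0:
        return 0.0, 0.0
    k0, k1, k2, k3 = dist
    theta = r_d                        # initial guess for the incident angle
    for _ in range(10):                # Newton's method on theta_d(theta) - r_d
        f = theta + k0*theta**3 + k1*theta**5 + k2*theta**7 + k3*theta**9 - r_d
        df = 1 + 3*k0*theta**2 + 5*k1*theta**4 + 7*k2*theta**6 + 9*k3*theta**8
        theta -= f / df
    scale = math.tan(theta) / r_d      # undistorted radius is tan(theta)
    return u * scale, v * scale

def map_to_ground(xc, yc, A):
    """Map corrected coordinates to ground coordinates via a 3x3 homography A
    (calibration-dependent; values here are assumptions). The homogeneous
    result (Xt, Yt, Zt) is divided through by Zt."""
    Zt = A[2][0]*xc + A[2][1]*yc + A[2][2]
    return ((A[0][0]*xc + A[0][1]*yc + A[0][2]) / Zt,
            (A[1][0]*xc + A[1][1]*yc + A[1][2]) / Zt)
```

In practice K, the distortion coefficients, and A come from offline camera calibration.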
In step 4, the target positions are related to the travel area, as follows:
The physical coordinate (Xw, Yw) of the target and its distance from the travel area along the x axis serve as the judgment basis; target information is displayed by the following rules:
(1) when the target is within 1 meter of the vehicle, it is marked with a red box;
(2) when the target is more than 1 meter from the vehicle and inside the travel track, it is marked with a yellow box;
(3) when the target is more than 1 meter from the vehicle and outside the travel track, it is marked with a green box.
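The three display rules above can be sketched directly (the color names and the 1-meter threshold come from the text; the function name is illustrative, and whether "within 1 meter" is inclusive is an assumption):

```python
def target_box_color(distance_m: float, inside_track: bool) -> str:
    """Colour rule from step 4: red within 1 m of the vehicle; beyond 1 m,
    yellow inside the predicted travel track and green outside it.
    The <= boundary at exactly 1 m is an assumption."""
    if distance_m <= 1.0:
        return "red"
    return "yellow" if inside_track else "green"
```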
In step 5, the rear-wheel trajectory lines are computed, as follows:
As in the sampling above, the left and right rear-wheel trajectories are sampled every distance Δ.
The left-rear-wheel sampling points are (X′0, Y′0), (X′1, Y′1), …, (X′i, Y′i). [Formula image omitted in the source.]
The right-rear-wheel sampling points are (X″0, Y″0), (X″1, Y″1), …, (X″j, Y″j). [Formula image omitted in the source.]
Any point (x, y) on a trajectory line is mapped to the pixel coordinate (xP, yP) of the rear-view image; the calculation proceeds as follows:
Compute the camera-coordinate-system coordinates (xc, zc, yc). [Formula image omitted in the source.]
From the camera coordinates (xc, zc, yc), compute the normalized image-plane coordinates (u, v). [Formula image omitted in the source.]
Applying the distortion operation to the normalized image-plane coordinates (u, v) yields the coordinate (xP, yP) mapped into the rear-view image. [Formula image omitted in the source.]
θd = θ + k0·θ^3 + k1·θ^5 + k2·θ^7 + k3·θ^9
where θ is the incident angle, θd is the distorted incident angle, and K is the intrinsic matrix.
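A sketch of the step-5 trajectory-to-pixel projection. The θd polynomial is quoted from the text; the extrinsic model (camera at height h with its axis parallel to the ground) and all parameter values are simplifying assumptions, since the patent's projection formulas appear only as images:

```python
import math

def world_to_pixel(x, y, cam_height, K, dist):
    """Project a ground point (x, y) to rear-view image pixel (xP, yP).

    Simplified extrinsics (assumption): camera at height cam_height,
    optical axis parallel to the ground, z pointing toward the point.
    Uses theta_d = theta + k0*theta^3 + k1*theta^5 + k2*theta^7 + k3*theta^9
    as given in the text; K = (fx, fy, cx, cy).
    """
    xc, yc, zc = x, cam_height, y       # camera coordinates (assumed axes)
    u, v = xc / zc, yc / zc             # normalized image-plane coordinates
    r = math.hypot(u, v)
    theta = math.atan(r)                # incident angle
    k0, k1, k2, k3 = dist
    theta_d = theta + k0*theta**3 + k1*theta**5 + k2*theta**7 + k3*theta**9
    s = theta_d / r if r else 1.0       # distortion scaling
    fx, fy, cx, cy = K
    return fx * s * u + cx, fy * s * v + cy
```

Sampling each rear-wheel arc and projecting every sample through this function yields the pixel polyline overlaid on the rear-view image.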
In step 6, the rear-view image, target-box information, and trajectory lines are fused and displayed, as follows:
The rear-view image, target-box information, and trajectory lines are fused into one image, which is shown on the vehicle's central control screen.
Compared with the prior art, the invention has the following beneficial effects:
A conventional reversing image shows only the reversing trajectory, and the driver cannot reliably judge from estimation alone whether a target or obstacle behind the vehicle poses a collision risk. The present method automatically detects targets in the rear-view image and, by computing the distance between each target and the travel area in real time, prompts the user when a target presents a collision hazard. The invention improves the safety-assist performance of the reversing image and lets the user perceive potential risks more intuitively while reversing.
Drawings
The invention is further described with reference to the accompanying drawings in which:
FIG. 1 is a flowchart of the reversing-image method with target display according to the present invention;
FIG. 2 is a schematic diagram of the reversing trajectory.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions. The described embodiments are only some, not all, of the embodiments of the invention; the embodiments described with reference to the drawings are illustrative and are not to be construed as limiting the invention. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, and are used merely for convenience in describing the present invention and for simplifying the description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore, should not be taken as limiting the scope of the present invention.
The invention is described in detail below with reference to the attached drawing figures:
Referring to FIGS. 1 and 2, a reversing-image method with target display comprises the following steps:
Step 1: acquire the steering-wheel angle and steering direction, and compute the front-wheel angle.
The steering-wheel angle and steering direction are read from the vehicle network bus; dividing the steering-wheel angle by the steering ratio gives the front-wheel angle γ. The steering direction is denoted D, with -1 for a left turn and +1 for a right turn.
Step 2: compute the reversing travel area from the front-wheel angle.
When the front-wheel angle γ = 0, the reversing trajectory of the vehicle is a straight line; when γ ≠ 0, it is a circle, the radius of the rear-axle-center circle being R = L_d / tan γ, where L_d is the wheelbase. Taking the initial rear-axle-center coordinates as (0, 0), the center O of the rear-axle trajectory lies at (D × R, 0), and every point on the vehicle body travels on a circle centered at O.
The travel area is computed as follows:
The rear-axle trajectory is sampled every distance Δ; the sampling points are (x0, y0), (x1, y1), …, (xn, yn), where (x0, y0) is the initial coordinate (0, 0). [The sampling-trajectory formulas are rendered as images in the source and are not reproduced here.]
The sampling points of arc A are (x′0, y′0), (x′1, y′1), …, (x′h, y′h). [Formula image omitted in the source.]
The sampling points of arc B are (x″0, y″0), (x″1, y″1), …, (x″m, y″m). [Formula image omitted in the source.]
Here w is the vehicle width, d1 the distance from the vehicle front to the rear axle, and d2 the distance from the vehicle tail to the rear axle.
The region between arc A and arc B is the area the vehicle is about to traverse; any object in this region poses a collision risk.
Step 3: detect targets and compute each target's position from its pixel coordinates.
Targets in the rear-view image are detected with a deep-learning object-recognition algorithm; the detected position information consists of pixel coordinates in the image, from which the target's position coordinates in the physical world are computed.
Let the pixel coordinate be (X, Y). Because the fisheye camera is distorted, the distortion-corrected coordinate (Xc, Yc) must be computed from the intrinsic matrix K and the distortion coefficients (k0, k1, k2, k3). [Formula image omitted in the source.]
The corrected coordinate (Xc, Yc) is mapped to the physical coordinate (Xw, Yw) through the mapping matrix A. [Formula image omitted in the source.]
step four: determining a target position and a driving area;
physical coordinates (X) of the objectw,Yw) And the distance of the driving area on the x axis is taken as a judgment basis, and the user is prompted according to the following rules:
(4) marking the target with a red frame when the target is within 1 meter of the vehicle;
(5) marking the target out by a yellow frame when the target is 1 meter away from the vehicle and in the driving track;
(6) when the target is 1 meter away from the vehicle and outside the driving track, the target is marked by a green frame.
Step 5: compute the rear-wheel trajectory lines.
As before, the left and right rear-wheel trajectories are sampled every distance Δ.
The left-rear-wheel sampling points are (X′0, Y′0), (X′1, Y′1), …, (X′i, Y′i). [Formula image omitted in the source.]
The right-rear-wheel sampling points are (X″0, Y″0), (X″1, Y″1), …, (X″j, Y″j). [Formula image omitted in the source.]
The two rear-wheel trajectories are in physical-world coordinates. Any point (x, y) on a trajectory line is mapped to the pixel coordinate (xP, yP) of the rear-view image; the calculation proceeds as follows:
Map the physical coordinate (x, y) to the camera coordinate (xc, zc, yc). [Formula image omitted in the source.]
From the camera coordinates (xc, zc, yc), compute the imaging-plane coordinates (u, v). [Formula image omitted in the source.]
Applying the distortion operation to the imaging-plane coordinates (u, v) yields the coordinate (xP, yP) mapped into the rear-view image. [Formula image omitted in the source.]
Step 6: display the reversing-assist image.
The image fusing the rear-view camera image, the target information, and the rear-wheel trajectory lines is shown on the vehicle's central control screen.
The above description is intended only to illustrate the present invention; the scope of the invention is defined by the appended claims, which are intended to cover all modifications, equivalents, and improvements within the spirit and scope of the invention. Matters not described in detail in this specification are well within the skill of those in the art.

Claims (8)

1. A reversing-image method with target display, characterized by comprising the following steps:
Step 1: acquire the steering-wheel angle and steering direction, and compute the front-wheel angle;
Step 2: compute the reversing travel area from the front-wheel angle;
Step 3: detect targets in the rear-view image and compute each target's position from its pixel coordinates;
Step 4: relate the target positions to the travel area;
Step 5: compute the rear-wheel trajectory lines;
Step 6: fuse the rear-view image, target-box information, and trajectory lines into one display.
2. The reversing-image method with target display according to claim 1, characterized in that:
in step 1, the steering-wheel angle and steering direction are acquired and the front-wheel angle is computed, as follows:
the steering-wheel angle and steering direction are read from the vehicle network bus; dividing the steering-wheel angle by the steering ratio gives the front-wheel angle γ; the steering direction is denoted D, with -1 for a left turn and +1 for a right turn.
3. The reversing-image method with target display according to claim 2, characterized in that:
in step 2, the reversing travel area is computed from the front-wheel angle, as follows:
when the front-wheel angle γ = 0, the reversing trajectory of the vehicle is a straight line; when γ ≠ 0, it is a circle; the radius of the rear-axle-center circle is R = L_d / tan γ, where L_d is the wheelbase; taking the initial rear-axle-center coordinates as (0, 0), the center O of the rear-axle trajectory lies at (D × R, 0), and every point on the vehicle body travels on a circle centered at O.
4. The reversing-image method with target display according to claim 3, characterized in that:
the travel area is computed as follows:
the rear-axle trajectory is sampled every distance Δ; the sampling points are (x0, y0), (x1, y1), …, (xn, yn), where (x0, y0) is the initial coordinate (0, 0); [the sampling-trajectory formulas are rendered as images in the source and are not reproduced here;]
φ is the arc angle swept when the vehicle reverses through the distance Δ;
the sampling points of arc A are (x′0, y′0), (x′1, y′1), …, (x′h, y′h); [formula image omitted in the source;]
the sampling points of arc B are (x″0, y″0), (x″1, y″1), …, (x″m, y″m); [formula image omitted in the source;]
w is the vehicle width, d1 the distance from the vehicle front to the rear axle, and d2 the distance from the vehicle tail to the rear axle;
the region between arc A and arc B is the area the vehicle is about to traverse; any object in this region poses a collision risk.
5. The reversing-image method with target display according to claim 4, characterized in that:
in step 3, targets are detected in the rear-view image and each target's position is computed from its pixel coordinates, as follows:
targets in the rear-view image are detected with a deep-learning object-recognition algorithm; the detected position information consists of pixel coordinates in the image, from which the target's position coordinates in the physical world are computed;
let the pixel coordinate be (X, Y); because the fisheye camera is distorted, the distortion-corrected coordinate (Xc, Yc) must be computed from the intrinsic matrix K and the distortion coefficients (k0, k1, k2, k3); [the correction formulas are rendered as images in the source and are not reproduced here;]
(u, v) are the normalized image-plane coordinates, θ is the incident angle, and θd is the incident angle after distortion;
the corrected coordinate (Xc, Yc) is mapped to the ground coordinate (Xw, Yw) through the mapping matrix A; [formula images omitted in the source;] (Xt, Yt, Zt) denotes the homogeneous result of the mapping, from which Xw = Xt/Zt and Yw = Yt/Zt.
6. The reversing-image method with target display according to claim 5, characterized in that:
in step 4, the target positions are related to the travel area, as follows:
the physical coordinate (Xw, Yw) of the target and its distance from the travel area along the x axis serve as the judgment basis; target information is displayed by the following rules:
(1) when the target is within 1 meter of the vehicle, it is marked with a red box;
(2) when the target is more than 1 meter from the vehicle and inside the travel track, it is marked with a yellow box;
(3) when the target is more than 1 meter from the vehicle and outside the travel track, it is marked with a green box.
7. The reversing-image method with target display according to claim 6, characterized in that:
in step 5, the rear-wheel trajectory lines are computed, as follows:
as in the sampling above, the left and right rear-wheel trajectories are sampled every distance Δ;
the left-rear-wheel sampling points are (X′0, Y′0), (X′1, Y′1), …, (X′i, Y′i); [formula image omitted in the source;]
the right-rear-wheel sampling points are (X″0, Y″0), (X″1, Y″1), …, (X″j, Y″j); [formula image omitted in the source;]
any point (x, y) on a trajectory line is mapped to the pixel coordinate (xP, yP) of the rear-view image, as follows:
compute the camera-coordinate-system coordinates (xc, zc, yc); [formula image omitted in the source;]
from the camera coordinates (xc, zc, yc), compute the normalized image-plane coordinates (u, v); [formula image omitted in the source;]
applying the distortion operation to the normalized image-plane coordinates (u, v) yields the coordinate (xP, yP) mapped into the rear-view image; [formula image omitted in the source;]
θd = θ + k0·θ^3 + k1·θ^5 + k2·θ^7 + k3·θ^9;
θ is the incident angle, θd is the distorted incident angle, and K is the intrinsic matrix.
8. The reversing-image method with target display according to claim 7, characterized in that:
in step 6, the rear-view image, target-box information, and trajectory lines are fused into one image, which is displayed on the vehicle's central control screen.
CN202111211609.XA 2021-10-18 2021-10-18 Reversing image display method with target display Active CN113978456B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111211609.XA CN113978456B (en) 2021-10-18 2021-10-18 Reversing image display method with target display

Publications (2)

Publication Number Publication Date
CN113978456A true CN113978456A (en) 2022-01-28
CN113978456B CN113978456B (en) 2024-04-02

Family

ID=79739192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111211609.XA Active CN113978456B (en) 2021-10-18 2021-10-18 Reversing image display method with target display

Country Status (1)

Country Link
CN (1) CN113978456B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100861543B1 (en) * 2007-04-06 2008-10-02 주식회사 만도 Parking assistant apparatus and method for avoiding collision with obstacle
CN101582164A (en) * 2009-06-24 2009-11-18 北京锦恒佳晖汽车电子系统有限公司 Image processing method of parking assist system
DE102010028911A1 (en) * 2010-05-12 2011-11-17 Robert Bosch Gmbh Method for monitoring movement of vehicle i.e. fork lift lorry, involves detecting collision hazard of vehicle by obstruction placed in region of curved travel path, or leaving curved travel path by vehicle
CN106740866A (en) * 2016-11-28 2017-05-31 南京安驾信息科技有限公司 The computational methods and device of a kind of steering wheel angle
CN112339762A (en) * 2020-10-29 2021-02-09 三一专用汽车有限责任公司 Reversing image safety early warning method, mixer truck and computer readable storage medium
CN112598734A (en) * 2020-12-17 2021-04-02 的卢技术有限公司 Image-based method for accurately positioning pedestrians around vehicle body


Also Published As

Publication number Publication date
CN113978456B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
CN110517521B (en) Lane departure early warning method based on road-vehicle fusion perception
US11688183B2 (en) System and method of determining a curve
CN105539430B Intelligent human-vehicle interactive parking method based on a handheld terminal
CN101894271B (en) Visual computing and prewarning method of deviation angle and distance of automobile from lane line
CN107507131B (en) 360-degree panoramic reverse image generation method based on single camera
CN105678787A (en) Heavy-duty lorry driving barrier detection and tracking method based on binocular fisheye camera
CN106054174A (en) Fusion method for cross traffic application using radars and camera
CN110203210A Lane departure warning method, terminal device and storage medium
CN107133985A Automatic calibration method for a vehicle-mounted camera based on lane line vanishing points
CN109624851B (en) Augmented reality-based driving assistance method and system and readable storage medium
CN103204104B (en) Monitored control system and method are driven in a kind of full visual angle of vehicle
CN108909625B (en) Vehicle bottom ground display method based on panoramic all-round viewing system
CN111547045B (en) Automatic parking method and device for vertical parking spaces
US20140160287A1 (en) Guide method of a reverse guideline system
CN113320474A (en) Automatic parking method and device based on panoramic image and human-computer interaction
Krasner et al. Automatic parking identification and vehicle guidance with road awareness
CN113518995A (en) Method for training and using neural networks to detect self-component position
WO2022062000A1 (en) Driver assistance method based on transparent a-pillar
CN116101325A (en) Narrow road traffic processing method and narrow road traffic processing device
CN109814115B (en) Angle identification and correction method for vertical parking
CN113978456A (en) Reversing image method with target display
CN111547048B (en) Automatic parking method and device for inclined parking spaces
CN113353071B (en) Narrow area intersection vehicle safety auxiliary method and system based on deep learning
CN115657037A (en) Detection method and system for judging whether vehicles can pass through barrier road section
CN111547046B (en) Parallel parking space pre-occupation type automatic parking method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant