CN113592945A - Parking target expected pose calculation method and system, vehicle and storage medium - Google Patents

Parking target expected pose calculation method and system, vehicle and storage medium

Info

Publication number
CN113592945A
CN113592945A
Authority
CN
China
Prior art keywords
parking
parking space
expected
point
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110823860.5A
Other languages
Chinese (zh)
Other versions
CN113592945B (en)
Inventor
徐榕
王晟
汪哲文
党建民
邱利宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd filed Critical Chongqing Changan Automobile Co Ltd
Priority to CN202110823860.5A priority Critical patent/CN113592945B/en
Publication of CN113592945A publication Critical patent/CN113592945A/en
Application granted granted Critical
Publication of CN113592945B publication Critical patent/CN113592945B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W 30/06: Automatic manoeuvring for parking
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a parking target expected pose calculation method and system, a vehicle, and a storage medium, wherein the method comprises the following steps: S1, receiving the parking space information input by sensing, and establishing a parking coordinate system according to the received parking space information; S2, determining the expected heading angle at parking completion; S3, preliminarily determining the straight line on which the vehicle center-point coordinate lies at the parking target depth and the straight line on which it lies at the parking lateral position; S4, solving for the intersection of the two straight lines from step S3 to obtain the preliminarily determined vehicle center-point coordinate of the target expected pose; S5, judging whether the coordinate obtained in step S4 satisfies the expected parking pose constraints; and S6, if not, correcting and optimizing the expected pose at parking completion, and if so, ending the process. According to the scene signals input by sensing, the method and the device can determine the expected pose at parking completion based on parking rules for different scenes.

Description

Parking target expected pose calculation method and system, vehicle and storage medium
Technical Field
The invention belongs to the technical field of parking, and particularly relates to a parking target expected pose calculation method and system, a vehicle, and a storage medium.
Background
Facing increasingly complex parking environments, the automatic parking system, as an important intelligent driver-assistance system, realizes automatic parking through sensing, detection, and decision planning, solves the problem of difficult parking in many complex scenes, and is favored by more and more car buyers. Correct sensing and perception are an essential link for safe, collision-free automatic parking: they enable scene reconstruction, intelligent decision logic, and accurate lateral and longitudinal control. Reconstructing the obstacle information around the target parking space from the sensing information and determining the final expected parking coordinate point from the reconstructed scene are the basis of accurate control and of realizing the parking function. Actual parking scenes are very complicated; rule-based decision logic for the expected parking target position can hardly cover all scenes, and the finally calculated target position may be too close to an obstacle, or too far from or too close to an edge of the parking space, and thus may not be the optimal expected position.
Disclosure of Invention
The invention aims to provide a parking target expected pose calculation method and system, a vehicle, and a storage medium, which can determine the expected pose at parking completion based on parking rules for different scenes according to the scene signals input by sensing.
In a first aspect, the present invention provides a method for calculating an expected pose of a parking target, including the steps of:
s1, receiving parking space information input by sensing, and establishing a parking coordinate system according to the received parking space information;
s2, determining an expected heading angle when parking is completed;
s3, preliminarily determining a straight line where the coordinates of the center point of the vehicle at the parking target depth are located and a straight line where the coordinates of the center point of the vehicle at the parking transverse position are located;
s4, solving the coordinates of the intersection point of the two straight lines in the step S3 to obtain the coordinates of the vehicle center point of the preliminarily determined target expected pose;
s5, judging whether the coordinates of the target expected pose vehicle center point obtained in the step S4 meet the expected parking pose limiting condition or not;
and S6, if the constraints are not satisfied, correcting and optimizing the expected pose at parking completion, and if they are satisfied, ending the process.
Optionally, a parking coordinate system is established according to the parking space information detected by sensing, specifically:
in the parking space searching process, the near-end parking space vertex detected first is taken as point P0 and the near-end parking space vertex detected later as point P1; the P0P1 direction is taken as the x-axis and its perpendicular as the y-axis to establish the coordinate system.
Optionally, the parking space and environment information detected by sensing is converted into the four coordinate vertices of the image parking space, the four coordinate vertices of the ultrasonic parking space, and the real or virtual status of each coordinate point, which together serve as the sensing input signals;
a vertex of the image parking space is a real point if it was actually detected by image processing, and a virtual point if it was calculated from the extension line of a detected parking space side line; a vertex of the ultrasonic parking space is a real point if it was given according to the position of an actual obstacle, and a virtual point if there is no actual obstacle and it was inferred from the surrounding environment.
Optionally, the expected target heading angle references either the image boundary or the ultrasonic boundary, selected according to the input image parking space signal and ultrasonic parking space signal: when the ultrasonic parking space boundaries on both sides leave a clearance such that the perpendicular distance to the image parking space center line meets half the vehicle width plus a calibration value, the expected parking heading angle references the angle of the image parking space; otherwise, it references the ultrasonic boundary closer to the image parking space center line.
Optionally, the straight lines on which the center point of the expected parking target pose lies in the x-direction and in the y-direction are determined according to the input parking space boundary information and the obstacle type; the intersection of the two straight lines is the target position of the vehicle center point at parking completion.
Optionally, after the expected parking pose is obtained by preliminary calculation, it is judged whether the vehicle center-point coordinate of the target pose satisfies the expected parking pose constraints; if not, the heading and the center-point coordinate are adjusted based on the current target pose; otherwise, the calculation is complete.
In a second aspect, the system for calculating the expected pose of the parking target comprises a memory, a controller and a sensor group, wherein the controller is respectively connected with the sensor group and the memory, the controller receives parking space information detected by the sensor group, a computer readable program is stored in the memory, and when the controller calls the computer readable program, the method for calculating the expected pose of the parking target can be executed.
In a third aspect, a vehicle according to the present invention employs the parking target expected pose calculation system according to the present invention.
In a fourth aspect, the present invention provides a storage medium having a computer readable program stored therein, where the computer readable program is capable of executing the steps of the method for calculating the expected pose of a parking target according to the present invention when the computer readable program is called by a controller.
The invention has the following advantages: after the initial target expected pose is obtained, the method corrects the expected pose step by step according to different evaluation indices, so that the final expected target pose keeps the distance from each vehicle edge to the obstacles within a proper range as far as possible.
Drawings
FIG. 1 is a flow chart of the present embodiment;
FIG. 2 is a schematic view of a parking space coordinate system;
FIG. 3 is a schematic view of parallelism evaluation;
FIG. 4 is a schematic diagram of target pose optimization;
FIG. 5 is a flow chart of target pose optimization.
Detailed Description
The invention will be further explained with reference to the drawings.
As shown in fig. 1, in the present embodiment, a method for calculating an expected pose of a parking target includes the following steps:
s1, receiving the parking space information input by sensing, and establishing a parking coordinate system according to the received parking space information, specifically:
the parking space coordinate system shown in fig. 2 is established according to the parking space information sensed and detected in the process of searching the parking space by the vehicle, if a lineation parking space exists, a near-end parking space vertex detected firstly by a camera is taken as a point P0, a near-end parking space vertex detected later is taken as a point P1, a point P0P1 is taken as an x-axis direction, and the vertical direction is taken as a y-direction to establish the coordinate system.
The parking space and environment information detected by sensing is converted into the four coordinate vertices of the image parking space, the four coordinate vertices of the ultrasonic parking space, and the real or virtual status of each coordinate point, which together serve as the input signals. A vertex of the image parking space is a real point if it was actually detected by image processing, and a virtual point if it was calculated from the extension line of a detected parking space side line; a vertex of the ultrasonic parking space is a real point if it was given according to the position of an actual obstacle, and a virtual point if there is no actual obstacle and it was inferred from the surrounding environment.
S2, determining an expected heading angle when parking is completed, specifically:
the parking space type is judged according to four vertex coordinates of the ultrasonic parking space and the image parking space which are input by sensing, the parking space is divided into three types, namely a vertical parking space, a parallel parking space and an inclined parking space according to the lengths of line segments (namely the parking space width and the parking space depth) of the vertex coordinate points of the parking space in the x-axis direction and the y-axis direction and the inclination angles of the line segments P2P0 and P3P1, wherein the expected heading angle of the vertical parking space and the inclined parking space is the average value of the included angles of the line segments P2P0 and P3P 1. The expected parking course angle of the parallel parking spaces is the same as the slope of the line segment P0P1 or the slope of the line segment P2P3 according to the difference of the depths of the parking spaces in the y direction; the P2 is the first detected far-end parking space vertex, and the P3 is the last detected far-end parking space vertex. When no obstacles such as curbs exist in the parallel parking spaces, the parking expected course angle is the same as the slope of the line segment P0P1 when the depth of the parking spaces in the parallel parking spaces is greater than the depth (the vehicle width plus the reserved space distance) required for parking, and when obstacles such as curbs exist in the parallel parking spaces, the parking expected course angle is the same as the slope of the line segment P2P 3.
S3, preliminarily determining a straight line where the center point coordinates of the vehicle at the parking target depth are located and a straight line where the center point coordinates of the vehicle at the parking transverse position are located, specifically:
The reference segment for the target pose is determined according to the parking space type, whether a curb exists in segment P2P3, the obstacle type, and the parking space depth. If an obstacle exists in segment P2P3, the distance between the nearest vehicle point and the obstacle must be greater than or equal to a first calibration value; if no obstacle exists in segment P2P3, the vehicle must not cross the rear boundary line while its gap to that boundary line does not exceed a second calibration value. The straight line on which the vehicle center-point coordinate lies at the parking target depth is obtained by offsetting the front or rear boundary line of the parking space.
In this embodiment, the equation of the straight line of the line segment P0P1 (i.e. the front boundary line) is:
A01*X+B01*Y+C01=0 (1);
where A01, B01, and C01 are the coefficients of the equation of the straight line on which segment P0P1 lies.
In this embodiment, the equation of the straight line of the line segment P2P3 (i.e. the rear boundary line) is:
A23*X+B23*Y+C23=0 (2);
where A23, B23, and C23 are the coefficients of the equation of the straight line on which segment P2P3 lies.
The distance Dist_Y between the vehicle center point at the target pose and the reference side line is determined according to the parking space depth and the type of obstacle in segment P2P3; offsetting the depth reference straight line by Dist_Y gives the equation of the straight line on which the vehicle center-point coordinate of the target expected pose lies:
Ay*X+By*Y+Cy=0 (3);
where Ay, By, and Cy are the coefficients of the straight-line equation of the target expected pose depth.
Whether an obstacle intrudes into the image parking space is judged from the input image parking space vertex coordinates and ultrasonic parking space vertex coordinates. If no obstacle intrudes on either side, the x-direction position of the target expected pose is centered between the two sides of the image parking space; if an obstacle intrudes on one side and the remaining space still meets the parking requirement, the x-direction position of the target expected pose references both the image parking space and the ultrasonic parking space, and the obstacle boundary curves are fitted, namely the straight-line equations of segments P2P0 and P3P1, where:
the equation of the straight line of the line segment P2P0 is:
A20*X+B20*Y+C20=0 (4);
wherein: a. the20、B20、C20Respectively, the coefficients of the equation of the straight line on which the line segment P2P0 is located.
The equation of the straight line of the line segment P3P1 is:
A31*X+B31*Y+C31=0 (5);
wherein: a. the31、B31、C31Respectively, the coefficients of the equation of the straight line in which the line segment P3P1 is located.
The reference boundary straight line is offset toward the parking space center line by a rule-determined distance to obtain the equation of the straight line on which the vehicle center-point coordinate lies at the parking lateral position:
Ax*X+Bx*Y+Cx=0 (6)
wherein: a. thex、Bx、CxRespectively are coefficients of a linear equation where the coordinates of the center point of the vehicle at the transverse parking position are located.
If obstacles intrude into the image parking space on both sides and the parking area represented by the ultrasonic parking space meets the parking conditions, the x-direction position of the target expected pose is centered with reference to the ultrasonic parking space, and the straight line on which the vehicle center point lies at the parking target depth is obtained:
Ay*X+By*Y+Cy=0 (7);
wherein: ay, By、CyAnd marking the coefficient of the linear equation of the center point of the vehicle for the depth of the parking target.
S4, solving for the intersection of line (6) and line (7) gives the preliminarily determined vehicle center-point coordinate (X', Y') of the target expected pose:
X'=(Bx*Cy-By*Cx)/(Ax*By-Ay*Bx) (8);
Y'=(Ay*Cx-Ax*Cy)/(Ax*By-Ay*Bx) (9);
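Solving the two line equations for their intersection is a direct application of Cramer's rule; a minimal sketch, in which the function name and the parallel-line guard are illustrative additions:

```python
def line_intersection(ax, bx, cx, ay, by, cy):
    """Intersection (X', Y') of the lines ax*X + bx*Y + cx = 0 and
    ay*X + by*Y + cy = 0 by Cramer's rule; None if the lines are parallel."""
    det = ax * by - ay * bx
    if abs(det) < 1e-12:
        return None  # parallel lines: no unique desired-pose point
    x = (bx * cy - by * cx) / det
    y = (ay * cx - ax * cy) / det
    return x, y

# Lines x = 1 (1*X + 0*Y - 1 = 0) and y = 2 (0*X + 1*Y - 2 = 0) meet at (1, 2).
```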
S5, judging whether the vehicle center-point coordinate of the target expected pose obtained in step S4 satisfies the expected parking pose constraints, specifically:
according to whether the vehicle boundary points VH_1, VH_2, VH_3, and VH_4 exceed the parking space boundaries at the end of parking at the preliminarily calculated target pose, a fitness function for evaluating the expected pose is determined:
P=k1*A1+k2*A2+...+k8*A8 (10)
where A1-A8 represent the parallelism between each vehicle boundary and the corresponding parking space boundary at the target pose, k1-k8 are the parallelism weights of the respective sides, and P is the fitness function.
Taking the left vehicle boundary as an example, the parallelism is calculated as shown in fig. 3: n sampling points are taken equidistantly on the left vehicle boundary, and the mean and standard deviation of the perpendicular distances from the sampling points to the parking space boundary are calculated as the evaluation index A1.
D_aver=(d1+d2+...+dn)/n (11);
D_std=sqrt(((d1-D_aver)^2+(d2-D_aver)^2+...+(dn-D_aver)^2)/n) (12);
The smaller the difference between the average distance D_aver from the sampling points to the parking space boundary and the reference distance, and the smaller the standard deviation D_std of the sampling-point distances, the smaller the evaluation index A1; here di is the perpendicular distance from sampling point i to the parking space boundary.
A2-A8 are calculated in the same way.
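The mean and standard deviation of equations (11) and (12) can be checked with a short sketch. How D_aver and D_std combine into the single index A1 is not fully specified in the text, so the sum used below is an assumed weighting:

```python
import math

def parallelism_index(distances, ref_dist):
    """Parallelism evaluation for one vehicle edge: mean and standard
    deviation of sampled edge-to-boundary distances, combined into one
    score (the combination is an assumption, not from the patent)."""
    n = len(distances)
    d_aver = sum(distances) / n                                  # eq. (11)
    d_std = math.sqrt(sum((d - d_aver) ** 2 for d in distances) / n)  # eq. (12)
    return abs(d_aver - ref_dist) + d_std
```

A perfectly parallel edge at exactly the reference distance scores 0, matching the "smaller is better" reading of A1.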
S6, correcting and optimizing the expected pose at parking completion, specifically:
as shown in fig. 4, taking the vehicle center-point coordinate of the target expected pose from step S4 as the circle center, a radius R is set, N points are randomly selected on the circle as the next-generation expected target points, and a pose change angle δ is set. The fitness function value of each new expected pose is calculated; if the previous generation already attains the minimum fitness, the calculation ends; otherwise the coordinate-and-angle combination with the minimum fitness becomes the seed expected pose of the next generation, until the set maximum number of iterations is reached, as shown in fig. 5.
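The refinement loop of figs. 4 and 5 can be sketched as follows; the radius, number of sampled points, pose change angle, and iteration limit are placeholder values, and `refine_pose` is a hypothetical name:

```python
import math
import random

def refine_pose(x, y, heading, fitness, radius=0.1, n_points=8,
                delta=math.radians(1.0), max_iter=20):
    """Iteratively refine the expected pose: sample candidate centers on a
    circle of radius R around the current pose, perturb the heading by
    +/- delta, and keep the candidate with the lowest fitness.  Stops when
    the parent generation is already minimal or max_iter is reached."""
    best = (x, y, heading)
    best_f = fitness(*best)
    for _ in range(max_iter):
        candidates = []
        for _ in range(n_points):
            t = random.uniform(0.0, 2.0 * math.pi)
            cx = best[0] + radius * math.cos(t)
            cy = best[1] + radius * math.sin(t)
            for h in (best[2] - delta, best[2], best[2] + delta):
                candidates.append((cx, cy, h))
        cand = min(candidates, key=lambda p: fitness(*p))
        if fitness(*cand) >= best_f:
            break  # previous generation already attains the minimum
        best, best_f = cand, fitness(*cand)
    return best
```

Because the seed is only replaced on strict improvement, the returned pose never has worse fitness than the initial one.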
In this embodiment, a system for calculating an expected pose of a parking target includes a memory, a controller, and a sensor group, where the controller is connected to the sensor group and the memory, respectively, the controller receives information about a parking space detected by the sensor group, and a computer readable program is stored in the memory, and when the controller calls the computer readable program, the steps of the method for calculating an expected pose of a parking target according to this embodiment may be executed.
In the present embodiment, a vehicle employs the parking target expected pose calculation system as described in the present embodiment.
In the present embodiment, a storage medium has stored therein a computer-readable program capable of executing the steps of the parking target expected pose calculation method as described in the present embodiment when called by a controller.

Claims (9)

1. A parking target expected pose calculation method is characterized by comprising the following steps:
s1, receiving parking space information input by sensing, and establishing a parking coordinate system according to the received parking space information;
s2, determining an expected heading angle when parking is completed;
s3, preliminarily determining a straight line where the coordinates of the center point of the vehicle at the parking target depth are located and a straight line where the coordinates of the center point of the vehicle at the parking transverse position are located;
s4, solving the coordinates of the intersection point of the two straight lines in the step S3 to obtain the coordinates of the vehicle center point of the preliminarily determined target expected pose;
s5, judging whether the coordinates of the target expected pose vehicle center point obtained in the step S4 meet the expected parking pose limiting condition or not;
and S6, if the constraints are not satisfied, correcting and optimizing the expected pose at parking completion, and if they are satisfied, ending the process.
2. The parking target expected pose calculation method according to claim 1, characterized in that: establishing a parking coordinate system according to the parking space information detected by sensing, which specifically comprises the following steps:
in the parking space searching process, the near-end parking space vertex detected first is taken as point P0 and the near-end parking space vertex detected later as point P1; the P0P1 direction is taken as the x-axis and its perpendicular as the y-axis to establish the coordinate system.
3. The parking target expected pose calculation method according to claim 2, characterized in that: the parking space and environment information detected by sensing is converted into the four coordinate vertices of the image parking space, the four coordinate vertices of the ultrasonic parking space, and the real or virtual status of each coordinate point, which together serve as the sensing input signals;
a vertex of the image parking space is a real point if it was actually detected by image processing, and a virtual point if it was calculated from the extension line of a detected parking space side line; a vertex of the ultrasonic parking space is a real point if it was given according to the position of an actual obstacle, and a virtual point if there is no actual obstacle and it was inferred from the surrounding environment.
4. The parking target expected pose calculation method according to any one of claims 1 to 3, characterized in that: the expected target heading angle references either the image boundary or the ultrasonic boundary, selected according to the input image parking space signal and ultrasonic parking space signal; when the ultrasonic parking space boundaries on both sides leave a clearance such that the perpendicular distance to the image parking space center line meets half the vehicle width plus a calibration value, the expected parking heading angle references the angle of the image parking space, and otherwise it references the ultrasonic boundary closer to the image parking space center line.
5. The parking target expected pose calculation method according to claim 4, characterized in that: the straight lines on which the center point of the expected parking target pose lies in the x-direction and in the y-direction are determined according to the input parking space boundary information and the obstacle type, and the intersection of the two straight lines is the target position of the vehicle center point at parking completion.
6. The parking target expected pose calculation method according to claim 1, 2, 3, or 5, characterized in that: after the expected parking pose is obtained by preliminary calculation, it is judged whether the vehicle center-point coordinate of the target pose satisfies the expected parking pose constraints; if not, the heading and the center-point coordinate are adjusted based on the current target pose, and otherwise the calculation is complete.
7. A parking target expected pose calculation system, comprising a memory, a controller, and a sensor group, wherein the controller is connected to the sensor group and the memory respectively and receives the parking space information detected by the sensor group, characterized in that: the memory stores a computer-readable program which, when invoked by the controller, is capable of executing the steps of the parking target expected pose calculation method according to any one of claims 1 to 6.
8. A vehicle, characterized in that: it employs the parking target expected pose calculation system according to claim 7.
9. A storage medium having a computer-readable program stored therein, characterized in that: the computer readable program, when invoked by a controller, is capable of executing the steps of the method for calculating a desired pose of a parking target according to any one of claims 1 to 6.
CN202110823860.5A 2021-07-21 2021-07-21 Method, system, vehicle and storage medium for calculating expected pose of parking target Active CN113592945B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110823860.5A CN113592945B (en) 2021-07-21 2021-07-21 Method, system, vehicle and storage medium for calculating expected pose of parking target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110823860.5A CN113592945B (en) 2021-07-21 2021-07-21 Method, system, vehicle and storage medium for calculating expected pose of parking target

Publications (2)

Publication Number Publication Date
CN113592945A true CN113592945A (en) 2021-11-02
CN113592945B CN113592945B (en) 2024-04-02

Family

ID=78248672

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110823860.5A Active CN113592945B (en) 2021-07-21 2021-07-21 Method, system, vehicle and storage medium for calculating expected pose of parking target

Country Status (1)

Country Link
CN (1) CN113592945B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114013428A (en) * 2021-11-29 2022-02-08 江苏大学 Dynamic parking path planning method based on intermolecular acting force
CN114511842A (en) * 2022-04-19 2022-05-17 北京主线科技有限公司 Vehicle pose determining method, device, equipment and medium
CN114852060A (en) * 2022-05-23 2022-08-05 广州小鹏自动驾驶科技有限公司 Parking control method, parking control device, vehicle, and storage medium
CN115482668A (en) * 2022-09-22 2022-12-16 安徽江淮汽车集团股份有限公司 Visual parking space management method and system, electronic equipment and computer readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140106126A (en) * 2013-02-26 2014-09-03 (주)팜비젼 Auto parking method based on around view image
DE102017126539A1 (en) * 2017-11-13 2019-05-16 Valeo Schalter Und Sensoren Gmbh Automatic reverse reverse parking
CN110775052A (en) * 2019-08-29 2020-02-11 浙江零跑科技有限公司 Automatic parking method based on fusion of vision and ultrasonic perception
CN111137277A (en) * 2018-11-05 2020-05-12 陕西汽车集团有限责任公司 Method for setting automatic parking position of unmanned mining vehicle
CN111547049A (en) * 2020-05-22 2020-08-18 北京罗克维尔斯科技有限公司 Vehicle parking control method and device and vehicle
CN112339748A (en) * 2020-11-09 2021-02-09 东风汽车集团有限公司 Method and device for correcting vehicle pose information through environment scanning in automatic parking
CN112793563A (en) * 2021-02-04 2021-05-14 武汉理工大学 Automatic parking method, device, storage medium and computer equipment

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114013428A (en) * 2021-11-29 2022-02-08 江苏大学 Dynamic parking path planning method based on intermolecular acting force
CN114511842A (en) * 2022-04-19 2022-05-17 北京主线科技有限公司 Vehicle pose determining method, device, equipment and medium
CN114511842B (en) * 2022-04-19 2022-07-12 北京主线科技有限公司 Vehicle pose determining method, device, equipment and medium
CN114852060A (en) * 2022-05-23 2022-08-05 广州小鹏自动驾驶科技有限公司 Parking control method, parking control device, vehicle, and storage medium
CN114852060B (en) * 2022-05-23 2024-04-09 广州小鹏汽车科技有限公司 Parking control method, parking control device, vehicle and storage medium
CN115482668A (en) * 2022-09-22 2022-12-16 安徽江淮汽车集团股份有限公司 Visual parking space management method and system, electronic equipment and computer readable storage medium
CN115482668B (en) * 2022-09-22 2023-09-19 安徽江淮汽车集团股份有限公司 Visual parking space management method and system

Also Published As

Publication number Publication date
CN113592945B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
CN113592945A (en) Parking target expected pose calculation method and system, vehicle and storage medium
US6985619B1 (en) Distance correcting apparatus of surroundings monitoring system and vanishing point correcting apparatus thereof
WO2020151212A1 (en) Calibration method for extrinsic camera parameter of on-board camera system, and calibration system
US8340896B2 (en) Road shape recognition device
JP5867176B2 (en) Moving object position and orientation estimation apparatus and method
KR102249769B1 (en) Estimation method of 3D coordinate value for each pixel of 2D image and autonomous driving information estimation method using the same
JP6317456B2 (en) Method and control device for detecting relative yaw angle changes in a vehicle stereo / video system
CN110979313B (en) Automatic parking positioning method and system based on space map
CN111510704B (en) Method for correcting camera dislocation and device using same
CN111062318B (en) Sensor sharing optimal node selection method based on entropy weight method
CN111369617A (en) 3D target detection method of monocular view based on convolutional neural network
JP2020126626A (en) Method for providing robust object distance estimation based on camera by performing pitch calibration of camera more precisely with fusion of information acquired through camera and information acquired through v2v communication and device using the same
CN112927309A (en) Vehicle-mounted camera calibration method and device, vehicle-mounted camera and storage medium
KR101869266B1 (en) Lane detection system based on extream learning convolutional neural network and method thereof
CN107480592B (en) Multi-lane detection method and tracking method
KR102433544B1 (en) Vehicle path restoration system through sequential image analysis and vehicle path restoration method using the same
CN110413942B (en) Lane line equation screening method and screening module thereof
KR101394770B1 (en) Image stabilization method and system using curve lane model
CN113936259A (en) Intelligent automobile body attitude control method and system based on visual perception
CN113140002A (en) Road condition detection method and system based on binocular stereo camera and intelligent terminal
KR100472823B1 (en) Method for detecting lane and system therefor
CN115731305A (en) Monocular camera three-dimensional lane line sensing method, system and electronic equipment
JP7344744B2 (en) Roadside edge detection method and roadside edge detection device
JP7344743B2 (en) Occupancy map creation method and occupancy map creation device
KR20220131378A (en) Positioning method, apparatus, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant