CN112541952A - Parking scene camera calibration method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN112541952A
CN112541952A (application CN202011421203.XA)
Authority
CN
China
Prior art keywords: point, marker post, marker, image, ground
Prior art date
Legal status: Pending
Application number
CN202011421203.XA
Other languages
Chinese (zh)
Inventor
孙巍巍
师小凯
唐俊
Current Assignee
Beijing Elite Road Technology Co ltd
Original Assignee
Beijing Elite Road Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Elite Road Technology Co ltd
Priority to CN202011421203.XA
Publication of CN112541952A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30264 Parking


Abstract

The invention discloses a parking scene camera calibration method and device, computer equipment, and a storage medium, belonging to the field of image pattern recognition. The method comprises the following steps: acquiring an image shot by the camera equipment; performing straight-line detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point (vanishing point) from these lines; using the initial y-direction blanking point, selecting at least 3 line segments perpendicular to the ground in the image as marker posts, where the end of each marker post on the ground is its foot point and the other end is its head point; and inputting the selected marker posts into an analysis module, solving the camera parameters and an optimized y-direction blanking point, and determining the mapping relation between the image coordinate system and the world coordinate system.

Description

Parking scene camera calibration method and device, computer equipment and storage medium
Technical Field
The invention belongs to the field of image pattern recognition and relates to a method for calibrating a camera, in particular to a calibration method for parking scenes.
Background
With China's economic development, vehicle ownership continues to grow, and the difficulty of parking in cities is widespread, so effective management of parking spaces is key to solving the parking problem. Against this background, many intelligent parking systems have emerged, among which video-based capture devices have received much attention.
Many parking systems on the market use video to capture vehicles entering and leaving parking spaces on both sides of the road. Managing parking spaces in this way has the following advantages:
1. the labor investment can be reduced, and the operation cost of the parking system is saved;
2. the charge is fair and transparent, and unnecessary disputes are reduced;
3. the parking evidence chain constructed from video is simple and clear and is widely accepted by vehicle owners.
However, a parking system based on video monitoring is limited by 2D image processing and cannot accurately judge the relationship between a vehicle and a parking space. This leads, on the one hand, to complaints caused by false snapshots and, on the other hand, to an inability to effectively judge how a vehicle is parked; parking space management becomes disordered especially under abnormal parking conditions such as straddle parking, inclined parking, and side parking.
In the field of computer vision measurement, camera calibration is an indispensable step for recovering three-dimensional (3D) space information from two-dimensional (2D) image information. Therefore, to implement the parking space management scheme described above, the first technical problem that must be overcome is the inability to accurately calibrate a camera in a parking scene.
To address these problems, a camera calibration method suitable for parking scenes is provided, which solves the existing problem of low calibration accuracy by constructing an accurate relation between the image coordinate system and the world coordinate system.
Disclosure of Invention
The invention provides a parking scene camera calibration method which can calibrate a camera in general scenes and, in particular, can accurately calculate camera parameters in motor-vehicle parking scenes. It helps improve the robustness of camera calibration and has the advantages of wide applicability, an accurate and simple solution, and high robustness.
The technical solution of the purpose of the invention is as follows:
the embodiment of the invention provides a parking scene camera calibration method, which comprises the following steps:
Step 1: acquiring an image shot by the camera equipment;
Step 2: performing straight-line detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point using the at least two straight lines perpendicular to the ground;
Step 3: selecting at least 3 line segments perpendicular to the ground in the image as marker posts using the initial y-direction blanking point, where one end of each marker post is located on the ground and is the marker post foot point, and the other end is the marker post head point;
Step 4: solving the camera parameters and the optimized y-direction blanking point using the marker posts selected in step 3, and determining the mapping relation between the image coordinate system and the world coordinate system.
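Solving the initial y-direction blanking point in step 2 amounts to a least-squares intersection of the detected vertical lines. A minimal sketch in Python, using homogeneous coordinates and an SVD rather than the cascaded Hough transform mentioned later in the description (the function name is illustrative, not part of the patent):

```python
import numpy as np

def vanishing_point(lines):
    """Estimate the common intersection of a set of image line segments.

    lines: list of ((x1, y1), (x2, y2)) segments that are (nearly) parallel
    in 3D, e.g. vertical edges detected in the image.  Each segment gives a
    homogeneous line l = p1 x p2; the blanking (vanishing) point v minimises
    |l . v| over all lines, i.e. the smallest right singular vector of the
    stacked line coefficients.
    """
    L = []
    for (x1, y1), (x2, y2) in lines:
        p1 = np.array([x1, y1, 1.0])
        p2 = np.array([x2, y2, 1.0])
        L.append(np.cross(p1, p2))     # homogeneous line through p1, p2
    _, _, vt = np.linalg.svd(np.asarray(L))
    v = vt[-1]                          # null-space direction
    return v[:2] / v[2]                 # inhomogeneous image coordinates
```

With exactly two lines this reduces to their exact intersection; with more, it is the least-squares consensus point.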
Optionally, the method further includes step 5: inputting the actual distance of the marker post;
and obtaining the foot point and head point of each marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, comparing the distance between each foot point and head point with the actual marker post distance, and, if the error exceeds a set threshold, feeding the optimized y-direction blanking point back into step 3 for iterative optimization until the error is smaller than the threshold.
Optionally, in step 5, at most 10 iterations are performed; if the error between the back-projected marker post distance output by the 10th calculation and the actual marker post distance is still greater than the threshold, the actual distance of the marker post is re-entered or manual fitting is performed.
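The optional step-5 refinement above (recalibrate until the back-projected length matches the entered real length, capped at 10 rounds) can be sketched generically; the three callables stand in for the patent's step-3/step-4 modules and are assumptions, not a real API:

```python
def iterate_calibration(select_posts, solve_camera, backproject_len,
                        vy0, real_len, threshold, max_iter=10):
    """Refine until the back-projected marker post length matches the
    entered real length, giving up after max_iter rounds (10 in the text).

    select_posts(vy)          -> marker posts picked using blanking point vy
    solve_camera(posts)       -> (camera params, optimized blanking point)
    backproject_len(posts, p) -> back-projected post length under params p
    """
    vy = vy0
    for _ in range(max_iter):
        posts = select_posts(vy)            # step 3: re-pick posts from vy
        params, vy = solve_camera(posts)    # step 4: params + optimized vy
        if abs(backproject_len(posts, params) - real_len) < threshold:
            return params, vy
    return None  # caller should re-enter the distance or fit manually
```

Returning `None` models the patent's fallback of re-measuring the post or fitting manually after 10 failed iterations.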
Optionally, the step 4 specifically includes the following steps:
step 4-1: verifying the rationality of the marker post data selected in step 3;
step 4-2: determining the distribution range of the roll angle;
step 4-3: obtaining the equation of a candidate horizontal line by enumerating the roll angle and the product f·tanθ of the focal length f and the horizontal-line slope tanθ;
step 4-4: by judging the intersection point Vy of any two marker posts in the y directioni,yThe distribution around a straight line VzVy vertical to the horizontal line is used to find the blanking point and the main point in the z directionP;
step 4-5: obtaining the optimized y-direction blanking point and its corresponding horizontal line equation by enumerating the focal length f, and obtaining the corresponding camera parameters.
Optionally, if the input marker post data is not reasonable, one of the marker posts is corrected to be parallel to the y axis.
Optionally, the step 4-2 specifically includes the following steps:
step 4-2-1: obtaining the intersection points of any two marker posts in the y direction and their mean value V̄y;
Step 4-2-2: making a circle by taking the ideal principal point as a circle center and taking the maximum tolerance of the real principal point deviating from the ideal principal point as a radius;
step 4-2-3: through the point V̄y, drawing the two tangent lines to the circle, tangent at the points Tl and Ty;
Step 4-2-4: and taking the maximum included angle and the minimum included angle of the tangent point and the circle center connecting line as the maximum value and the minimum value of the roll angle.
Optionally, the step 4-3 specifically includes the following steps:
step 4-3-1: setting a constraint condition for solving a horizontal line;
step 4-3-2: enumerating the roll angle within the enumeration range from step 4-2, enumerating f·tanθ with a step size of 1 in the range 1 to 2000, and solving the z-direction blanking point;
step 4-3-3: through the z-direction blanking point, constructing a candidate horizontal line with slope tanθ;
step 4-3-4: constructing a cost function J that measures the distances d(Fi, Fi′) and d(Fj, Fj′) between each marker post foot point and its cross-projected counterpart, wherein Hi, Hj are the head points of any two marker posts, Fi, Fj are the foot points of the two marker posts, Cij is the intersection point of the line connecting the two head points Hi, Hj with the candidate horizontal line, Fj′ is the intersection point of the line FiCij with marker post j, and Fi′ is the intersection point of the line FjCij with marker post i;
step 4-3-5: under each combination of the ρ and f·tanθ parameters, solving the cost J for any two marker posts, and selecting the ρ and f·tanθ parameters corresponding to the 10 smallest costs J to construct candidate horizontal line equations.
Optionally, step 4-4 specifically includes the following steps:
step 4-4-1: assuming the ideal principal point P0 is located at the center of the image, and finding the ideal z-direction blanking point VZ0;
Step 4-4-2: let VZ0Is slid by Δ d along the horizontal line within a certain range to obtain VZdFinding the corresponding straight line V perpendicular to the horizontal lineZdVy
Step 4-4-3: calculating the intersection point of any two marker posts in the y direction to VZdVyDistance d ofi
Step 4-4-4: for all Δ d
Figure BDA0002822439130000031
Finding out the delta d corresponding to the minimum value;
step 4-4-5: calculating the principal point Pd and the actual z-direction blanking point VZ.
Optionally, step 4-5 specifically includes the following steps:
step 4-5-1: obtaining the enumeration range of the focal length f;
step 4-5-2: enumerating a focal length f in the enumeration range of f, and calculating a corresponding optimized y-direction blanking point;
step 4-5-3: connecting the optimized y-direction blanking point with each marker post head point, calculating the distance from the corresponding marker post foot point to that straight line, and taking the optimized y-direction blanking point and horizontal line equation for which max d(Fi, HiVy) is minimal.
Optionally, the method further includes selecting a wheel center point as a head point of the marker post, and a tangent point of the wheel to the ground as a foot point of the marker post, the tangent point of the wheel to the ground being found by connecting the initial y-direction blanking point or the optimized y-direction blanking point with the wheel center point.
In another aspect, a parking scene camera calibration apparatus is provided, the apparatus includes:
the acquisition module is used for acquiring a scene image;
the preprocessing module is used for carrying out linear detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point by utilizing the at least two straight lines perpendicular to the ground;
the marker post extraction module is used for selecting not less than 3 line segments vertical to the ground from the image as marker posts by utilizing the initial y-direction blanking points, one end of each marker post is positioned on the ground and is a marker post foot point, and the other end of each marker post is a marker post head point;
an analysis module: and the system is used for solving camera parameters and optimized y-direction blanking points by using the selected benchmarks and determining the mapping relation between the image coordinate system and the world coordinate system.
Optionally, the parking scene camera calibration apparatus further includes a detection module, used for inputting the actual distance of the marker post, obtaining the foot point and head point of each marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, comparing the distance between them with the actual marker post distance, and, if the error exceeds a set threshold, feeding the optimized y-direction blanking point back into the marker post extraction module for iterative optimization until the error is smaller than the threshold.
In another aspect, a computer device for parking scene calibration is provided, the device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, the memory having stored therein at least one instruction, the instruction being loaded and executed by the processor to implement the operations performed by the parking scene camera calibration method as described above.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction is stored, and the instruction is loaded and executed by a processor to implement the operations performed by the parking scene camera calibration method as described above.
The invention provides a parking scene camera calibration method which firstly obtains an image shot by a camera device; then performs straight-line detection on the image to obtain at least two straight lines perpendicular to the ground and solves an initial y-direction blanking point from them; next selects at least 3 line segments perpendicular to the ground in the image as marker posts using the initial y-direction blanking point, where one end of each marker post is located on the ground and is the marker post foot point and the other end is the marker post head point; and finally solves the camera parameters and the optimized y-direction blanking point using the selected marker posts, determining the mapping relation between the image coordinate system and the world coordinate system.
According to the method, after the initial y-direction blanking point of the acquired image is solved by a traditional blanking point solving method, new marker posts are determined using the initial y-direction blanking point, and the optimized y-direction blanking point and the camera parameters under the optimal condition are solved again. This overcomes the inaccuracy of line selection when the blanking point is solved for the first time, which can otherwise cause large blanking point errors and inaccurate camera parameter calibration. The camera parameters are thus calibrated accurately and the robustness of camera calibration is improved; the method has the advantages of wide applicability, accurate and convenient solution, and high robustness.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a flow chart of a parking scene camera calibration method of the present invention;
FIG. 2 is a flowchart of a parking scene camera calibration method according to an embodiment of the invention;
FIG. 3 is a flowchart of the method for determining the mapping between the image coordinate system and the world coordinate system by solving the camera parameters and the optimized y-direction blanking points according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of a post calibration in accordance with an embodiment of the present invention;
FIG. 5 is a schematic illustration of a range of distributions of estimated roll angles ρ in an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a parking scene camera calibration apparatus according to an embodiment of the present invention;
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in an embodiment of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiment of the present invention, and it is obvious that the described embodiment is a part of the embodiment of the present invention, but not all of the embodiment. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In an embodiment, an optimization scheme of a parking scene camera calibration method is provided, which includes the following steps, please refer to fig. 2:
201 (i.e., step 1): acquiring an image shot by the camera equipment;
202 (i.e., step 2): performing straight line detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point by using the at least two straight lines perpendicular to the ground;
203 (i.e., step 3): selecting at least 3 line segments vertical to the ground as a marker post in the image by utilizing the initial y-direction blanking point, wherein one end of the marker post is positioned on the ground and is a marker post foot point, and the other end of the marker post is a marker post head point;
204 (i.e., step 4): solving the camera parameters and the optimized y-direction blanking point using the marker posts selected in step 3, and determining the mapping relation between the image coordinate system and the world coordinate system.
205 (i.e., step 5): inputting the actual distance of the marker post; obtaining the foot point and head point of each marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, comparing the distance between them with the actual marker post distance, and, if the error exceeds a set threshold, feeding the optimized y-direction blanking point back into step 3 for iterative optimization until the error is smaller than the threshold.
Fig. 2 is a flowchart of the optimization scheme. Compared with fig. 1, an "iteration" is added as step 5: the optimized y-direction blanking point is used again as an input parameter, and the operations of step 3 and step 4 are repeated, so that the marker posts obtained with the optimized y-direction blanking point as reference are more accurate, yielding more accurate camera parameters and a further optimized y-direction blanking point. This continues until the difference between the back-projected head-to-foot distance of each marker post under the latest camera parameters and the actual marker post distance is smaller than the threshold, at which point the result is output.
On this basis, an optimized implementation sets the maximum number of iterations to 10: when no camera parameters meeting the requirements are obtained after 10 iterations, the actual distance of the marker post can be checked and re-entered, or manual fitting can be performed.
In one embodiment, a specific implementation of selecting marker post points is provided: the wheels that are common in parking scenes are used as the main source for extracting marker post points, and other objects perpendicular to the ground with known height, such as pedestrians, telegraph poles, and curbs, can also be selected as marker posts. The calibration method is the same regardless of the object chosen; taking wheels as marker posts as an example, the specific calibration method is as follows:
Firstly, an image shot by the camera device is acquired. Then, straight-line detection is performed on the image to obtain at least two straight lines perpendicular to the ground, and the initial y-direction blanking point is solved from them; this can be done with the mature cascaded Hough transform. The calculated initial y-direction blanking point serves as one of the references for selecting marker post points: wheels are detected and identified in the image and used as the basis for extracting marker post points. Specifically, the centers of three wheels in the image are selected and each is connected to the initial y-direction blanking point; the intersection of this connecting line with the boundary between the wheel and the ground is the most accurate wheel tangent point obtainable under the current conditions. The wheel tangent point is used as the marker post foot point and the wheel center as the marker post head point, thereby determining three marker posts. With these three marker posts as initial data, the camera parameters and the optimized y-direction blanking point are solved, and the mapping relation between the image coordinate system and the world coordinate system is determined.
A more optimized embodiment compares, after the mapping relation is obtained, the back-projected marker post with the actual marker post distance entered by a user; if the error exceeds a set threshold, iteration is carried out as described above. In each iteration, the marker post points are obtained using the y-direction blanking point and the wheel center as the two references: their connecting line intersects the wheel-ground boundary at the wheel tangent point, and the line segment determined by the wheel center and the wheel tangent point is the marker post input to the current analysis.
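The wheel-tangent construction described above is a line-line intersection in homogeneous coordinates; a minimal sketch, where the wheel-ground boundary is approximated by two points on the wheel-ground edge (names are illustrative):

```python
import numpy as np

def wheel_post(vy, wheel_center, ground_pts):
    """Return (head point, foot point) of a wheel marker post: the foot is
    where the line from the y-direction blanking point vy through the wheel
    centre crosses the wheel/ground boundary line."""
    def h(p):
        return np.array([p[0], p[1], 1.0])   # homogeneous image point
    ray = np.cross(h(vy), h(wheel_center))             # vy -> wheel centre
    ground = np.cross(h(ground_pts[0]), h(ground_pts[1]))
    x = np.cross(ray, ground)                          # line intersection
    foot = x[:2] / x[2]
    return wheel_center, (foot[0], foot[1])
```

For a roughly overhead camera the blanking point of verticals lies far off-image, so the ray is nearly vertical and the foot point lands just below the wheel centre.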
In one embodiment, the camera parameters obtained in step 4 include focal length, camera optical center coordinates, principal point coordinates, camera roll angle, and camera pitch angle.
In one embodiment, an implementation of "solving camera parameters and optimized y-direction blanking points to determine mapping relationship between image coordinate system and world coordinate system" in step 4 is provided, please refer to fig. 3 to 5. The overall process of implementing step 4 is as follows, please refer to fig. 3:
301, namely step 4-1, correcting the selected marker posts;
302, namely step 4-2, determining the distribution range of the roll angle rho;
303, namely step 4-3, setting a horizontal line parallel to the horizon in the world coordinate system through the z-direction blanking point, and obtaining the equations of candidate horizontal lines by enumerating the roll angle ρ and the product f·tanθ of the focal length and the horizontal-line slope;
304, namely step 4-4, finding the z-direction blanking point and the principal point P by judging the distribution of the intersection points Vy_{i,j} of any two marker posts in the y direction around the straight line VzVy perpendicular to the horizontal line;
305, namely step 4-5, obtaining the optimized y-direction blanking point and its corresponding horizontal line equation by enumerating the focal length f, and obtaining the corresponding camera parameters.
Wherein, the step 4-1 corrects the selected marker post, which is specifically realized by the following method, please refer to fig. 4:
Posts 1/2/3 in fig. 4 are three marker posts obtained using the initial y-direction blanking point. Because cameras in a parking scene always acquire the scene image obliquely, when marker posts that are parallel to each other and perpendicular to the ground are projected onto the image, their intersection points should lie below the image. If an intersection point lies above the image, the input data is obviously abnormal. Fig. 4 shows one possible abnormality, in which the intersection point of marker post 1 and marker post 2 lies above the image. For this situation the following correction can be adopted: extend the selected marker posts, and if the y-direction intersection point of any pair of marker posts lies above the image, correct one of the marker posts (such as marker post 1) to be parallel to the y axis.
The determining of the distribution range of the roll angle ρ in step 4-2 is specifically realized by the following method, please refer to fig. 5:
step 4-2-1: solving the y-direction intersection point Vy_{i,j} of any two marker posts i, j among the three marker posts perpendicular to the ground, and calculating the mean value V̄y of the pairwise intersection points;
Step 4-2-2: let the ideal principal point P0Is located at the center of the image, with the ideal principal point P0As a circle center, a real principal point PRealDeviation from the ideal principal point P0The maximum tolerance R is the radius of a circle;
step 4-2-3: through the point V̄y, drawing the two tangent lines to the circle, tangent at the points Tl and Ty;
Step 4-2-4: in a straight line PTl,PTyThe maximum included angle and the minimum included angle are the maximum value and the minimum value of the roll angle rho, and the value range of rho is as follows: rhomin<ρ<ρmax
The equation of the candidate horizontal line l1 in step 4-3 is obtained specifically by the following method:
step 4-3-1: setting a constraint condition for solving a horizontal line;
Firstly, for any two marker posts i, j, the intersection point P_{i,j} of the line connecting their head points with the line connecting their foot points is calculated; then, for all intersection points P_{i,j}, the robust mean of their y coordinates is found. With HL denoting all possible horizontal lines, the constraint restricts the candidate horizontal lines to those passing near this robust mean.
step 4-3-2: enumerating ρ within the enumeration range of the roll angle described in step 4-2, enumerating f·tanθ in the range 1 to 2000 with a step size of 1, and solving the z-direction blanking point;
The specific method adopted in the present embodiment is to enumerate ρ with step Δρ = 1 within [max(ρmin, −30), min(ρmax, +30)] and to enumerate f·tanθ from 1 to 2000 with step size 1, obtaining the z-direction blanking point (u_vz, v_vz) from the expression of the z-direction blanking point in the image coordinate system under the pinhole model.
Step 4-3-3: passing through a blanking point in the z direction, and constructing a candidate horizontal line l1 with the slope tan theta;
step 4-3-4: constructing a cost function J that measures the distances d(Fi, Fi′) and d(Fj, Fj′) between each marker post foot point and its cross-projected counterpart, wherein Hi, Hj are the head points of any two marker posts i, j, Fi, Fj are their foot points, Cij is the intersection point of the line connecting the two head points Hi, Hj with the candidate horizontal line l1, Fj′ is the intersection point of the line FiCij with marker post j, and Fi′ is the intersection point of the line FjCij with marker post i;
step 4-3-5: under each combination of the ρ and f·tanθ parameters, solving the cost J for any two marker posts i, j, and selecting the ρ and f·tanθ combinations corresponding to the 10 smallest costs J to construct candidate horizontal line equations.
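One consistent reading of the step 4-3-4 cost is that it penalises how far the cross-projected foot points Fi′, Fj′ land from the actual foot points: for the true horizontal line, the line FiCij passes through Fj exactly. A sketch under that assumption, using homogeneous-coordinate helpers (names illustrative):

```python
import numpy as np

def _h(p):
    return np.array([p[0], p[1], 1.0])     # homogeneous image point

def _line(p, q):
    return np.cross(_h(p), _h(q))          # line through two points

def _meet(l1, l2):
    x = np.cross(l1, l2)                   # intersection of two lines
    return x[:2] / x[2]

def pair_cost(Hi, Fi, Hj, Fj, horizon):
    """Cost for one pair of marker posts against a candidate horizontal
    line.  Hi, Hj: head points; Fi, Fj: foot points; horizon: homogeneous
    line (a, b, c).  Zero when the candidate horizon is consistent."""
    Cij = _meet(_line(Hi, Hj), horizon)          # head line ∩ horizon
    Fj_p = _meet(_line(Fi, Cij), _line(Hj, Fj))  # Fi-Cij ∩ post j -> Fj'
    Fi_p = _meet(_line(Fj, Cij), _line(Hi, Fi))  # Fj-Cij ∩ post i -> Fi'
    return (np.linalg.norm(Fj_p - np.asarray(Fj, float))
            + np.linalg.norm(Fi_p - np.asarray(Fi, float)))
```

For posts of equal real height, the head line and foot line of a pair meet on the true horizon, which is why the cost vanishes there.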
In step 4-4, the z-direction blanking point and the principal point P are found by judging the distribution of the y-direction intersection points Vy_{i,j} of any two marker posts i, j around the straight line VzVy perpendicular to the horizontal line, as follows:
step 4-4-1: assuming the ideal principal point P0 is located at the center of the image, and finding the ideal z-direction blanking point VZ0;
Step 4-4-2: let VZ0Is slid along the horizontal line l1 within a certain range, and a corresponding straight line V perpendicular to the horizontal line is determinedZdVy
Step 4-4-3: calculating the intersection point V of any two marker posts in the y directionyiTo VZdVyDistance d ofi
Step 4-4-4: for all Δ d
Figure 12
Finding out the delta d corresponding to the minimum value;
step 4-4-5: calculating the principal point Pd and the z-direction blanking point VZ.
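Steps 4-4-1 to 4-4-5 reduce to a one-dimensional search over the shift Δd along the horizontal line; a sketch under that reading (function and variable names are illustrative):

```python
import numpy as np

def best_shift(vz0, horizon_dir, y_intersections, shifts):
    """Slide the ideal z-direction blanking point vz0 by each candidate
    delta-d along the horizontal line; for each shift, measure how well the
    pairwise y-direction intersection points cluster around the line
    through the shifted point perpendicular to the horizon, and keep the
    best shift.  horizon_dir is a vector along the horizontal line."""
    u = np.asarray(horizon_dir, float)
    u /= np.linalg.norm(u)
    pts = np.asarray(y_intersections, float)
    best = None
    for dd in shifts:
        vzd = np.asarray(vz0, float) + dd * u
        # distance of a point to the line through vzd perpendicular to the
        # horizon is the component of (p - vzd) along the horizon direction
        cost = np.abs((pts - vzd) @ u).sum()
        if best is None or cost < best[1]:
            best = (dd, cost)
    return best[0]
```

The winning shift fixes VZd, from which the principal point Pd and the actual VZ follow.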
In this embodiment, step 4-5 obtains the optimized y-direction blanking point and its corresponding horizontal line equation by enumerating the focal length f, thereby obtaining the corresponding camera parameters, specifically as follows:
step 4-5-1: obtaining the enumeration range of the focal length f;
the method for finding the range of the focal length f used in this embodiment is as follows: determined by each pair of benchmarks
Figure 14
Projected to a straight line
Figure 13
To obtain
Figure BDA0002822439130000086
Find out in a straight line
Figure BDA0002822439130000087
The highest end and the shortest projection point are positioned on the upper surface of the projection lens to obtain VyInfimum of blanking points
Figure BDA0002822439130000088
And supremum
Figure BDA0002822439130000089
According to the expression of the blanking point in the y direction
Figure 15
Figure BDA00028224391300000811
Obtaining an enumerated range f of focal lengths fmin<f<fmax
step 4-5-2: enumerating a focal length f in the enumeration range of f, and calculating the corresponding optimized y-direction blanking point;
specifically, in this embodiment the focal length f is enumerated over fmin < f < fmax, and the position of the blanking point Vy is calculated from the y-direction blanking-point expression.
step 4-5-3: connecting the optimized y-direction blanking point corresponding to f with each marker post head point to determine a straight line, and calculating the distance from each marker post foot point to that line; the optimized y-direction blanking point and horizontal-line equation that minimize max d(Fi, HiVy) are taken.
Specifically: connect Vy with each head point Hi and calculate the distance di from the foot point Fi to this straight line; compute max d(Fi, HiVy). Over all candidate horizontal lines, the y-direction blanking point and horizontal line corresponding to the minimum of max d(Fi, HiVy) are the final result.
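The selection rule of step 4-5-3 (keep the y-direction blanking point whose worst foot-to-line distance over all marker posts is smallest) can be sketched as follows, assuming the candidate Vy positions have already been produced by enumerating f; all coordinates are illustrative:

```python
import math

def point_line_distance(p, a, b):
    """Distance from point p to the infinite line through points a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
    return num / math.hypot(bx - ax, by - ay)

def select_vy(candidates, posts):
    """posts: list of (foot, head) image points. Score each candidate
    V_y by max_i d(F_i, line(H_i, V_y)) and return the candidate with
    the smallest worst-case distance."""
    def score(vy):
        return max(point_line_distance(foot, head, vy) for foot, head in posts)
    return min(candidates, key=score)

# Two posts whose foot-head lines both pass through (10, -100),
# so that candidate scores exactly zero.
posts = [((0.0, 100.0), (5.0, 0.0)), ((20.0, 100.0), (15.0, 0.0))]
candidates = [(10.0, -100.0), (0.0, -100.0), (10.0, -50.0)]
vy = select_vy(candidates, posts)
```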
In an embodiment, a parking scene camera calibration device is provided, corresponding one-to-one to the parking scene camera calibration methods in the first, second, and third embodiments. As shown in fig. 6, the parking scene camera calibration apparatus includes: an acquisition module 1001, a preprocessing module 1002, a marker post extraction module 1003, an analysis module 1004, and a detection module 1005. The functional modules are explained in detail as follows:
an obtaining module 1001 configured to obtain a scene image;
the preprocessing module 1002 is configured to perform line detection on the image to obtain at least two lines perpendicular to the ground, and obtain an initial y-direction blanking point by using the at least two lines perpendicular to the ground;
a marker post extraction module 1003, configured to select, as a marker post, not less than 3 line segments perpendicular to the ground from the image by using the initial y-direction blanking point, where one end of the marker post is located on the ground and is a marker post foot point, and the other end of the marker post is a marker post head point;
an analysis module 1004, configured to solve the camera parameters and the optimized y-direction blanking point by using the selected marker posts, and to determine the mapping relation between the image coordinate system and the world coordinate system;
a detection module 1005, configured to receive the input actual distance of the marker post, obtain the foot point and the head point of the marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, and compare the distance between them with the actual distance of the marker post; if the error exceeds a set threshold, the optimized y-direction blanking point is brought back to the marker post extraction module for iterative optimization until the error is smaller than the threshold.
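The detection module's consistency check can be sketched as below; `back_project`, the relative-error measure, and the toy data are illustrative stand-ins, not the patent's implementation:

```python
def validate_calibration(posts_img, actual_lengths, back_project, threshold=0.05):
    """Back-project each post's image foot/head point to world
    coordinates, compare the recovered length with the measured one,
    and report whether the calibration passes. posts_img is a list of
    (foot, head) image points; back_project maps an image point to a
    world point (x, y, z)."""
    errors = []
    for (foot, head), true_len in zip(posts_img, actual_lengths):
        wf, wh = back_project(foot), back_project(head)
        est = ((wf[0] - wh[0]) ** 2 + (wf[1] - wh[1]) ** 2
               + (wf[2] - wh[2]) ** 2) ** 0.5
        errors.append(abs(est - true_len) / true_len)
    return max(errors) <= threshold, errors

# Toy back-projection (identity onto the z = 0 plane) used only to
# exercise the function; a real one comes from the solved mapping.
ok, errors = validate_calibration([((0.0, 0.0), (0.0, 3.0))], [3.0],
                                  lambda p: (p[0], p[1], 0.0))
```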
In one embodiment of the present invention, the step of the analysis module solving the camera parameters and the optimized y-direction blanking point is as follows:
step 4-1: verifying the rationality of the marker post data selected in step 3;
step 4-2: determining the distribution range of the roll angle;
step 4-3: obtaining equations of candidate horizontal lines by enumerating the roll angle and the product f·tanθ of the focal length and the horizontal-line slope;
step 4-4: obtaining the z-direction blanking point and the principal point P by judging the distribution of the y-direction intersection points Vyi,y of any two marker posts around the straight line VzVy perpendicular to the horizontal line;
step 4-5: acquiring the optimized y-direction blanking point and its corresponding horizontal-line equation by enumerating the focal length f, and acquiring the corresponding camera parameters.
Wherein, the step 4-2 specifically comprises:
step 4-2-1: solving the y-direction intersection point Vyi,y of any two marker posts i, j among the three marker posts perpendicular to the ground, and calculating the mean of the pairwise y-direction intersection points;
step 4-2-2: letting the ideal principal point P0 be located at the center of the image, and making a circle with P0 as its center and with the maximum tolerance R by which the real principal point PReal may deviate from P0 as its radius;
step 4-2-3: through the mean intersection point, drawing the two tangent lines of the circle, tangent at the points Tl, Ty;
step 4-2-4: the maximum and minimum included angles of the straight lines PTl and PTy give the maximum and minimum values of the roll angle ρ, so the range of ρ is: ρmin < ρ < ρmax.
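Steps 4-2-2 through 4-2-4 amount to drawing the two tangents from an external point (the mean y-direction intersection) to the principal-point tolerance circle. A minimal sketch under the assumption that the roll-angle bounds are the angles of the two tangent directions; all coordinates are illustrative:

```python
import math

def roll_angle_range(q, center, radius):
    """Range of angles spanned by the two tangents from the external
    point q (e.g. the mean y-direction intersection of the marker
    posts) to the circle of principal-point uncertainty (center P0,
    radius R). Returns (rho_min, rho_max) in degrees."""
    dx, dy = center[0] - q[0], center[1] - q[1]
    d = math.hypot(dx, dy)
    if d <= radius:
        raise ValueError("point lies inside the principal-point circle")
    theta0 = math.atan2(dy, dx)       # direction from q to the circle center
    alpha = math.asin(radius / d)     # half-angle subtended by the circle
    return math.degrees(theta0 - alpha), math.degrees(theta0 + alpha)

# Illustrative 1920x1080 image: P0 at the image center, tolerance 100 px,
# mean intersection point well below the image.
lo, hi = roll_angle_range(q=(960.0, 2000.0), center=(960.0, 540.0), radius=100.0)
```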
Wherein, the step 4-3 specifically comprises the following steps:
step 4-3-1: setting a constraint condition for solving the horizontal line;
firstly, for any two marker posts i, j, calculating the intersection point Pi,j of the line connecting their head points with the line connecting their foot points; subsequently, over all intersection points Pi,j, finding the robust mean in the y direction; with HL denoting all possible horizontal lines, the constraint relates HL to this robust mean (the exact constraint formula appears only as an image in the original);
step 4-3-2: enumerating ρ within the enumeration range of the roll angle from step 4-2, enumerating f·tanθ in the range of 1 to 2000 with a step size of 1, and solving the z-direction blanking point;
the specific method adopted in this embodiment is to enumerate ρ over [max(ρmin, -30), min(ρmax, +30)] with step Δρ (Δρ = 1), and to enumerate f·tanθ from 1 to 2000 with step 1, obtaining the z-direction blanking point (uvz, vvz).
step 4-3-3: through the z-direction blanking point, constructing a candidate horizontal line l1 with slope tanθ;
step 4-3-4: constructing the cost function J (the cost formula appears only as an image in the original), wherein Hi, Hj are the head points of any two marker posts i, j; Fi, Fj are the foot points of the two marker posts; Cij is the intersection of the line through the head points Hi, Hj with the candidate horizontal line l1; Fj' is the intersection of the line FiCij with marker post j; and Fi' is the intersection of the line FjCij with marker post i;
step 4-3-5: for each pair of ρ and f·tanθ parameters, solving the cost J of any two marker posts i, j; selecting the ρ and f·tanθ values corresponding to the 10 smallest costs J and constructing the candidate horizontal-line equations.
Wherein, the step 4-4 specifically comprises the following steps:
step 4-4-1: assuming the ideal principal point P0 is located at the center of the image, finding the ideal z-direction blanking point VZ0;
step 4-4-2: sliding VZ0 along the horizontal line l1 by Δd within a certain range to obtain VZd, and determining the corresponding straight line VZdVy perpendicular to the horizontal line;
step 4-4-3: calculating the distance di from the y-direction intersection point Vyi of any two marker posts to the line VZdVy;
step 4-4-4: over all Δd, finding the Δd that minimizes the aggregate of the distances di (the aggregate formula appears only as an image in the original);
step 4-4-5: calculating the principal point Pd and the z-direction blanking point VZ.
Wherein, the step 4-5 specifically comprises:
step 4-5-1: obtaining the enumeration range of the focal length f;
the method used in this embodiment to find the range of f is as follows: the y-direction intersection points determined by each pair of marker posts are projected onto a straight line (the line and the projected quantities are given only as image formulas in the original); from the highest and lowest projection points on that line, an infimum and a supremum of the y-direction blanking point Vy are obtained; substituting these bounds into the expression for the y-direction blanking point yields the enumeration range fmin < f < fmax of the focal length f.
step 4-5-2: enumerating a focal length f in the enumeration range of f, and calculating the corresponding optimized y-direction blanking point;
specifically, in this embodiment the focal length f is enumerated over fmin < f < fmax, and the position of the blanking point Vy is calculated from the y-direction blanking-point expression.
step 4-5-3: connecting the optimized y-direction blanking point corresponding to f with each marker post head point to determine a straight line, and calculating the distance from each marker post foot point to that line; the optimized y-direction blanking point and horizontal-line equation that minimize max d(Fi, HiVy) are taken.
Specifically: connect Vy with each head point Hi and calculate the distance di from the foot point Fi to this straight line; compute max d(Fi, HiVy). Over all candidate horizontal lines, the y-direction blanking point and horizontal line corresponding to the minimum of max d(Fi, HiVy) are the final result.
For specific limitations of the parking scene camera calibration device, reference may be made to the above limitations of the parking scene camera calibration method, which are not repeated here. All or part of the modules in the parking scene camera calibration device can be realized by software, hardware, or a combination thereof. The modules can be embedded, in hardware form, in or be independent of the processor in the computer device, or be stored, in software form, in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided that includes a memory, a processor, and a computer program stored in the memory and executable on the processor. The memory has stored therein at least one instruction that is loaded and executed by the processor to perform operations as performed by the parking scene camera calibration method described above. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The computer program is executed by a processor to implement a parking scene camera calibration method.
In one embodiment, a computer-readable storage medium is provided, having at least one instruction stored therein, which is loaded and executed by a processor to perform the operations performed by the parking scene camera calibration method as described above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (14)

1. A parking scene camera calibration method is characterized by comprising the following steps:
step 1: acquiring an image shot by the camera equipment;
step 2: performing straight line detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point by using the at least two straight lines perpendicular to the ground;
step 3: selecting at least 3 line segments perpendicular to the ground in the image as marker posts by utilizing the initial y-direction blanking point, wherein one end of each marker post is located on the ground and is the marker post foot point, and the other end is the marker post head point;
step 4: solving the camera parameters by using the marker posts selected in step 3, optimizing the y-direction blanking point, and determining the mapping relation between the image coordinate system and the world coordinate system.
2. The parking scene camera calibration method as claimed in claim 1, further comprising step 5: inputting the actual distance of the marker post;
obtaining the foot point and the head point of the marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, and comparing the distance between them with the actual distance of the marker post; if the error exceeds a set threshold, the optimized y-direction blanking point is brought back to step 3 for iterative optimization until the error is smaller than the threshold.
3. The method for calibrating a parking scene camera as claimed in claim 2, wherein in step 5 at most 10 iterations are performed; if the error between the back-projected marker post distance output by the 10th calculation and the actual distance is still greater than the threshold, the actual distance of the marker post is determined again or manual fitting is performed.
4. The parking scene camera calibration method according to claim 1, wherein the step 4 specifically includes the steps of:
step 4-1: verifying the rationality of the marker post data selected in step 3;
step 4-2: determining the distribution range of the roll angle;
step 4-3: acquiring equations of candidate horizontal lines by enumerating the roll angle and the product f·tanθ of the focal length and the horizontal-line slope;
step 4-4: obtaining the z-direction blanking point and the principal point P by judging the distribution of the y-direction intersection points Vyi,y of any two marker posts around the straight line VzVy perpendicular to the horizontal line;
step 4-5: acquiring the optimized y-direction blanking point and its corresponding horizontal-line equation by enumerating the focal length f, and acquiring the corresponding camera parameters.
5. The parking scene camera calibration method as recited in claim 4, wherein if the input marker post data is not reasonable, one of the marker posts is corrected to be parallel to the y-axis.
6. The parking scene camera calibration method according to claim 4, wherein the step 4-2 specifically comprises the steps of:
step 4-2-1: finding the y-direction intersection point of any two marker posts and the mean of these intersection points;
step 4-2-2: making a circle with the ideal principal point as its center and with the maximum tolerance by which the real principal point may deviate from the ideal principal point as its radius;
step 4-2-3: through the mean intersection point, making two tangent lines of the circle, tangent at the points Tl, Ty;
step 4-2-4: taking the maximum and minimum included angles of the lines connecting the tangent points with the circle center as the maximum and minimum values of the roll angle.
7. The parking scene camera calibration method according to claim 4, wherein the step 4-3 specifically comprises the steps of:
step 4-3-1: setting a constraint condition for solving the horizontal line;
step 4-3-2: enumerating the roll angle within the enumeration range of the roll angle in step 4-2, enumerating f·tanθ with a step size of 1 in the range of 1 to 2000, and solving the z-direction blanking point;
step 4-3-3: through the z-direction blanking point, constructing a candidate horizontal line with slope tanθ;
step 4-3-4: constructing the cost function J (formula shown as an image in the original), wherein Hi, Hj are the head points of any two marker posts; Fi, Fj are the foot points of the two marker posts; Cij is the intersection of the line connecting the head points Hi, Hj with the candidate horizontal line; Fj' is the intersection of the line FiCij with marker post j; and Fi' is the intersection of the line FjCij with marker post i;
step 4-3-5: under each ρ and f·tanθ parameter condition, solving the cost J of any two marker posts, and selecting the ρ and f·tanθ parameters corresponding to the 10 smallest costs J to construct the candidate horizontal-line equations.
8. The parking scene camera calibration method according to claim 4, wherein the step 4-4 specifically comprises the following steps:
step 4-4-1: assuming the ideal principal point P0 is located at the center of the image, and finding the ideal z-direction blanking point VZ0;
step 4-4-2: sliding VZ0 by Δd along the horizontal line within a certain range to obtain VZd, and finding the corresponding straight line VZdVy perpendicular to the horizontal line;
step 4-4-3: calculating the distance di from the y-direction intersection point of any two marker posts to VZdVy;
step 4-4-4: over all Δd, finding the Δd that minimizes the aggregate of the distances di (the aggregate formula appears only as an image in the original);
step 4-4-5: calculating the principal point Pd and the actual z-direction blanking point VZ.
9. The parking scene camera calibration method according to claim 4, wherein the step 4-5 specifically includes the steps of:
step 4-5-1: obtaining the enumeration range of the focal length f;
step 4-5-2: enumerating the focal length f within the enumeration range of f, and calculating the corresponding optimized y-direction blanking point;
step 4-5-3: connecting the optimized y-direction blanking point with each marker post head point to form a straight line, calculating the distance from each marker post foot point to the straight line, and taking the optimized y-direction blanking point and horizontal-line equation that minimize max d(Fi, HiVy).
10. The method according to claim 1, wherein a center point of a wheel is selected as a head point of the post, a tangent point of the wheel to the ground is selected as a foot point of the post, and the tangent point of the wheel to the ground is found by connecting the initial y-direction blanking point or the optimized y-direction blanking point with the center point of the wheel.
11. A parking scene camera calibration device, characterized in that the device comprises:
the acquisition module is used for acquiring a scene image;
the preprocessing module is used for carrying out linear detection on the image to obtain at least two straight lines perpendicular to the ground, and solving an initial y-direction blanking point by utilizing the at least two straight lines perpendicular to the ground;
the marker post extraction module is used for selecting not less than 3 line segments vertical to the ground from the image as marker posts by utilizing the initial y-direction blanking points, one end of each marker post is positioned on the ground and is a marker post foot point, and the other end of each marker post is a marker post head point;
an analysis module, configured to solve the camera parameters and the optimized y-direction blanking point by using the selected marker posts, and to determine the mapping relation between the image coordinate system and the world coordinate system.
12. The parking scene camera calibration apparatus as set forth in claim 11, further comprising:
a detection module, configured to receive the input actual distance of the marker post, obtain the foot point and the head point of the marker post in the world coordinate system through back projection according to the mapping relation between the image coordinate system and the world coordinate system, and compare the distance between them with the actual distance of the marker post; if the error exceeds a set threshold, the optimized y-direction blanking point is brought back to the marker post extraction module for iterative optimization until the error is smaller than the threshold.
13. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the memory has stored therein at least one instruction that is loaded and executed by the processor to perform operations performed by the parking scene camera calibration method according to any one of claims 1 to 10.
14. A computer-readable storage medium having stored thereon at least one instruction, which is loaded and executed by a processor to perform operations performed by the parking scene camera calibration method according to any one of claims 1 to 10.
CN202011421203.XA 2020-12-08 2020-12-08 Parking scene camera calibration method and device, computer equipment and storage medium Pending CN112541952A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011421203.XA CN112541952A (en) 2020-12-08 2020-12-08 Parking scene camera calibration method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011421203.XA CN112541952A (en) 2020-12-08 2020-12-08 Parking scene camera calibration method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112541952A true CN112541952A (en) 2021-03-23

Family

ID=75018396

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011421203.XA Pending CN112541952A (en) 2020-12-08 2020-12-08 Parking scene camera calibration method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112541952A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104036512A (en) * 2014-06-25 2014-09-10 西北工业大学 Novel Tsai's camera calibration improved method based on orthogonal vanishing points
CN104392450A (en) * 2014-11-27 2015-03-04 苏州科达科技股份有限公司 Method for determining focal length and rotary angles of camera, camera calibration method and camera calibration system
EP3098754A1 (en) * 2015-05-29 2016-11-30 Accenture Global Services Limited Video camera scene translation
EP3098755A1 (en) * 2015-05-29 2016-11-30 Accenture Global Services Limited Local caching for object recognition
WO2019233330A1 (en) * 2018-06-05 2019-12-12 上海商汤智能科技有限公司 Vehicle-mounted camera self-calibration method and apparatus, and vehicle driving method and apparatus
CN111508027A (en) * 2019-01-31 2020-08-07 杭州海康威视数字技术股份有限公司 Method and device for calibrating external parameters of camera
CN111667536A (en) * 2019-03-09 2020-09-15 华东交通大学 Parameter calibration method based on zoom camera depth estimation
CN110533923A (en) * 2019-08-29 2019-12-03 北京精英路通科技有限公司 Parking management method, device, computer equipment and storage medium
CN110930459A (en) * 2019-10-29 2020-03-27 北京经纬恒润科技有限公司 Vanishing point extraction method, camera calibration method and storage medium
CN111724446A (en) * 2020-05-20 2020-09-29 同济大学 Zoom camera external parameter calibration method for building three-dimensional reconstruction

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ZHENG TANG;YEN-SHUO LIN;KUAN-HUI LEE;JENQ-NENG HWANG;JEN-HUI CHUANG: "ESTHER: Joint Camera Self-Calibration and Automatic Radial Distortion Correction From Tracking of Walking Humans", IEEE ACCESS, vol. 7, 31 December 2019 (2019-12-31) *
ZHENG, YUAN;YOU, XINHUA: "An accurate and practical calibration method for roadside camera using two vanishing points", NEUROCOMPUTING, no. 5, 31 December 2016 (2016-12-31) *
余烨; 刘晓平; 徐伟; 韩江洪: "Research on camera calibration methods for building reconstruction", Journal of Graphics, no. 04, 15 August 2012 (2012-08-15) *
蔡鸣; 孙秀霞; 刘树光; 徐嵩; 刘希: "Accurate camera focal-length self-calibration method based on the infinite homography of vanishing points", Acta Optica Sinica, no. 005, 31 December 2014 (2014-12-31) *
霍炬; 杨卫; 杨明: "Camera self-calibration method based on geometric characteristics of vanishing points", Acta Optica Sinica, no. 02, 15 February 2010 (2010-02-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116912333A (en) * 2023-09-12 2023-10-20 安徽炬视科技有限公司 Camera attitude self-calibration method based on operation fence calibration rod
CN116912333B (en) * 2023-09-12 2023-12-26 安徽炬视科技有限公司 Camera attitude self-calibration method based on operation fence calibration rod

Similar Documents

Publication Publication Date Title
CN109522804B (en) Road edge identification method and system
CN109583280A (en) Lane detection method, apparatus, equipment and storage medium
US20140063252A1 (en) Method for calibrating an image capture device
EP2759959A2 (en) Method and system for detecting multi-lanes
CN108573215B (en) Road reflective area detection method and device and terminal
CN104636724A (en) Vehicle-mounted camera rapid pedestrian and vehicle detection method based on goal congruence
WO2020133488A1 (en) Vehicle detection method and device
CN107851390B (en) Step detection device and step detection method
CN112541952A (en) Parking scene camera calibration method and device, computer equipment and storage medium
WO2020087322A1 (en) Lane line recognition method and device, and vehicle
CN114705121A (en) Vehicle pose measuring method and device, electronic equipment and storage medium
CN113609148A (en) Map updating method and device
CN114966632A (en) Laser radar calibration method and device, electronic equipment and storage medium
CN114863388A (en) Method, device, system, equipment, medium and product for determining obstacle orientation
CN110751040A (en) Three-dimensional object detection method and device, electronic equipment and storage medium
CN114754761A (en) Optimization method and device for lane line of high-precision map, electronic equipment and storage medium
CN112991327B (en) Steel grid welding system, method and terminal equipment based on machine vision
CN116740680A (en) Vehicle positioning method and device and electronic equipment
CN115507815A (en) Target ranging method and device and vehicle
CN111462243A (en) Vehicle-mounted streaming media rearview mirror calibration method, system and device
CN114049394B (en) Monocular distance measuring method, device, equipment and storage medium
CN116259022A (en) Tracking method based on visual lane line, electronic equipment, medium and vehicle
CN116148820A (en) Laser radar calibration method, computer equipment, readable storage medium and motor vehicle
CN115841517A (en) Structural light calibration method and device based on DIC double-circle cross ratio
CN115546216A (en) Tray detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination