CN113096187A - Method for automatically acquiring relative position of vehicle and obstacle - Google Patents

Method for automatically acquiring relative position of vehicle and obstacle

Info

Publication number
CN113096187A
Authority
CN
China
Prior art keywords
vehicle
coordinate system
calibration
sphere
coordinates
Prior art date
Legal status
Granted
Application number
CN202110487005.1A
Other languages
Chinese (zh)
Other versions
CN113096187B (en)
Inventor
周奎
丁松
付勇智
张友兵
杨亚会
张宇丰
刘瀚文
赵毅
Current Assignee
Hubei University of Automotive Technology
Original Assignee
Hubei University of Automotive Technology
Priority date
Filing date
Publication date
Application filed by Hubei University of Automotive Technology
Priority to CN202110487005.1A
Publication of CN113096187A
Application granted
Publication of CN113096187B
Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/174 Segmentation; Edge detection involving the use of two or more images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20164 Salient point detection; Corner detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20224 Image subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a method for automatically acquiring the relative position of a vehicle and an obstacle. A camera is arranged above the middle of a vehicle-mounted laser radar calibration field, and sphere feature calibration objects are arranged at two opposite corners of the field. The world coordinates of the sphere centers of the calibration objects in the field are detected, the outer contour of the vehicle to be calibrated is detected, the circumscribed rectangle of the vehicle contour is calculated, and the world coordinates of the center of that rectangle are obtained. The center of the circumscribed rectangle of the vehicle contour is taken as the origin of the vehicle body coordinate system, the X axis and the Y axis of which are set parallel to the two sides of the rectangle, and the coordinates of the sphere centers in the vehicle body coordinate system are finally calculated through the rotation-translation relationship between the vehicle body coordinate system and the world coordinate system. The invention can automatically acquire the vehicle body pose and its position relative to obstacles, reduces the requirement on the vehicle parking pose when the laser radar is calibrated, and improves the efficiency and precision of laser radar calibration.

Description

Method for automatically acquiring relative position of vehicle and obstacle
Technical Field
The invention relates to the technical field of automatic driving and intelligent connected vehicles, in particular to sensor calibration, and specifically to a method for automatically acquiring the relative position of a vehicle and an obstacle.
Background
Calibration of the vehicle-mounted laser radar of an intelligent connected or automatic driving vehicle is usually completed in a specific scene, and calibration of the vehicle-mounted sensors is a prerequisite step for the development and testing of such vehicles. Because the calibration scene and the feature markers are fixed in position, the vehicle must be parked in a specified pose when the laser radar is calibrated so that the pre-stored relative position relationship between the specific calibration objects and the vehicle body remains valid. In existing practice the vehicle is usually positioned manually, and there is no standardized, automated way of acquiring the relative position relationship between the vehicle body and the feature calibration objects. Manually adjusting the vehicle pose is inefficient and difficult, and introduces uncertainty into the precision of the calibration result, up to and including calibration failure.
Disclosure of Invention
The invention aims to provide a method for automatically acquiring the relative position of a vehicle and an obstacle, so as to solve the technical problems that, when the vehicle-mounted laser radar of an intelligent connected or automatic driving vehicle is calibrated in a specific scene, there is no standardized and automated way of acquiring the relative position relationship between the vehicle body and the feature calibration objects, the vehicle body is difficult to position, the calibration efficiency and precision of the vehicle-mounted laser radar are low, and calibration easily fails.
In order to solve the above technical problem, the invention provides a method for automatically acquiring the relative position of a vehicle and an obstacle. A camera that captures the state of the calibration field is arranged above the middle of the vehicle-mounted laser radar calibration field, and sphere feature calibration objects are arranged at two opposite corners of the field. The world coordinates of the sphere centers of the sphere feature calibration objects in the calibration field are then detected, the outer contour of the vehicle to be calibrated is detected, the circumscribed rectangle of the vehicle contour is calculated, and the world coordinates of the center of that rectangle are obtained. The center of the circumscribed rectangle of the vehicle contour is taken as the origin of the vehicle body coordinate system, the X axis and the Y axis of the vehicle body coordinate system are set parallel to the two sides of the circumscribed rectangle, and the coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are finally calculated through the rotation-translation relationship between the vehicle body coordinate system and the world coordinate system.
Preferably, the method for automatically acquiring the relative position of the vehicle and the obstacle specifically comprises the following steps:
(1) respectively arranging a group of sphere feature calibration objects at two opposite corners of the vehicle-mounted laser radar calibration field, each group consisting of three matte-surfaced spheres arranged in an equilateral triangle;
(2) placing a camera above the middle of the vehicle-mounted laser radar calibration field, at a height that gives the camera a complete and clear view of the calibration field, with the imaging plane of the camera kept parallel to the plane of the calibration field; the vehicle is then parked in the middle area of the calibration field with the heading angle of the vehicle head between 0 and 180 degrees;
(3) calibrating an internal reference matrix and an external reference matrix of a camera positioned above the middle of a vehicle-mounted laser radar calibration site, and storing the internal and external reference matrices of the camera in a calibration operation computer;
(4) loading and saving an image taken before the vehicle to be calibrated enters the calibration field, converting the image to grayscale, applying Gaussian filtering and smoothing to the grayscale image, then detecting the circles in the filtered image and extracting the circle center coordinates, i.e. the coordinates of the sphere centers in the two-dimensional pixel coordinate system; the sphere center coordinates are converted from the pixel coordinate system to the world coordinate system, giving the (two-dimensional) coordinates of the sphere centers in the world coordinate system, which are stored in the calibration computer;
(5) loading the image taken before the vehicle to be calibrated enters the calibration field and the (two-dimensional) world coordinates of the sphere centers obtained from it, performing background modeling on the Gaussian-filtered and smoothed image of step (4) with a Gaussian mixture model, and saving the modeling result, i.e. the background image;
(6) on the premise of ensuring that the heading angle of the vehicle head is within 0-180 degrees, the vehicle is parked in a calibration field at any pose, and a current foreground frame image after the vehicle to be calibrated enters the calibration field is obtained and loaded;
(7) acquiring and loading the current scene frame image P_vehicle of the vehicle parked in the calibration field; then, taking the Gaussian-filtered current foreground frame image P_vehicle and the background frame image P_empty pre-stored in the calibration computer as inputs, processing the images with a background difference method, taking the pixel regions that deviate from the background by more than a certain threshold as the region occupied by the vehicle to be calibrated in the image, and obtaining a foreground pixel map containing only the vehicle to be calibrated;
(8) performing binarization, Gaussian filtering and smoothing on the foreground pixel map obtained in the step (7), then detecting the outer contour information of the calibrated vehicle to be obtained, extracting two diagonal points at the outermost periphery in the outer contour of the vehicle and obtaining the coordinates of the two diagonal points in a pixel coordinate system, and constructing an external rectangle of the outer contour of the calibrated vehicle by using the two diagonal points;
(9) taking the geometric center of the circumscribed rectangle constructed in step (8) as the origin of the vehicle body coordinate system and placing that origin on a plane above the ground at a height that still gives the camera a clear view of the whole calibration field, drawing perpendicular lines from the origin of the vehicle body coordinate system to two sides of the circumscribed rectangle, and obtaining the coordinates, in the pixel coordinate system, of the intersection points of these perpendiculars with the sides of the circumscribed rectangle;
(10) respectively calculating coordinates of the origin of the vehicle body coordinate system, the intersection point of the perpendicular line and the side of the circumscribed rectangle in the step (9) in the world coordinate system through coordinate conversion formulas under the pixel coordinate system and the world coordinate system, and then calculating the current heading angle of the vehicle body (namely the rotation angle of the vehicle body coordinate system relative to the world coordinate system);
(11) and finally, calculating the coordinates of the sphere center of the sphere feature calibration object under the vehicle body coordinate system.
Preferably, in step (1) the radius of each sphere is 0.6 m, the distance between the centers of adjacent spheres in each group of sphere feature calibration objects is 2.2 m, and the center of each group is placed at a distance of 8-10 m from the origin of the vehicle coordinate system.
Preferably, in step (3) the internal reference matrix and the external reference matrix of the camera arranged above the middle of the calibration field are calibrated with the Zhang Zhengyou checkerboard calibration method; the coordinate conversion formula between the pixel coordinate system and the world coordinate system is:
Z_c [u, v, 1]^T = K [R T] [X_w, Y_w, Z_w, 1]^T    (1)
where u, v are the coordinates of a point in the pixel coordinate system, X_w, Y_w, Z_w are the coordinates of the same point in the world coordinate system, Z_c is the projective scale factor, K is the internal reference matrix of the camera, and R, T are the external reference matrices of the camera.
Preferably, the step (4) loads and saves, with the cvLoadImage() function in OpenCV, an image taken before the vehicle to be calibrated enters the calibration field;
the step (4) is to carry out graying processing on the image through a cvtColor () function;
detecting a circle in the image subjected to Gaussian filtering and smoothing by using a Hough gradient method;
the step (4) outputs the coordinates of the centers of all detected circles through the ellipse fitting function fitEllipse() in OpenCV, expressed as the matrix O_cam:
O_cam = [(u_1, v_1); (u_2, v_2); ...; (u_6, v_6)]
where (u_1, v_1) .. (u_6, v_6) are the coordinates, in the two-dimensional pixel coordinate system, of the centers of the six spheres in the two groups of sphere feature calibration objects; the coordinates of the sphere centers in the world coordinate system are expressed as O_word:
O_word = [(x_w1, y_w1, r); (x_w2, y_w2, r); ...; (x_w6, y_w6, r)]
where (x_w1, y_w1) .. (x_w6, y_w6) are the horizontal and vertical coordinates of the six sphere centers in the two-dimensional world coordinate system and r is the radius of the spheres.
Preferably, in step (5) the Gaussian mixture background model BackgroundSubtractorMOG() in OpenCV is used to perform background modeling on the Gaussian-filtered and smoothed image of step (4).
Preferably, the step (8) is to detect the outer contour information of the calibration vehicle to be acquired by using a Canny algorithm.
Preferably, the calculation formula for calculating the current heading angle θ of the vehicle body in the step (10) is as follows:
θ = arctan((y_D - y_C) / (x_D - x_C))    (2)
where (x_C, y_C) are the world coordinates of the origin of the vehicle body coordinate system and (x_D, y_D) are the world coordinates of the intersection point of the perpendicular drawn towards the vehicle head with the circumscribed rectangle.
at this time, the coordinate calculation formula of the sphere center of each sphere under the vehicle body coordinate system is as follows:
x' = (x - x_C) cos θ + (y - y_C) sin θ,  y' = -(x - x_C) sin θ + (y - y_C) cos θ    (3)
where (x, y) are the coordinates of the sphere center in the two-dimensional world coordinate system, (x', y', r - k) are the coordinates of the sphere center in the vehicle body coordinate system, r is the sphere radius, k is the height of the XY plane of the vehicle body coordinate system above the ground, and r - k is the Z coordinate of the sphere center in the vehicle body coordinate system.
Preferably, in step (11) the coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are calculated through formula (2) and formula (3); the Z coordinate of every sphere center in the vehicle body coordinate system is r - k, and the set of sphere center coordinates in the vehicle body coordinate system is expressed as O_vehicle:
O_vehicle = [(x_V1, y_V1, z_V1); (x_V2, y_V2, z_V2); ...; (x_V6, y_V6, z_V6)]
where (x_V1, y_V1, z_V1) .. (x_V6, y_V6, z_V6) are the coordinates of the centers of the six spheres in the two groups of sphere feature calibration objects in the vehicle body coordinate system.
Preferably, the calibration scene of the vehicle-mounted laser radar is a static scene.
By adopting the technical scheme, the invention has the following beneficial effects:
the method for automatically acquiring the relative position of the vehicle and the obstacle has reasonable conception and simple operation flow, and is a method for quickly determining the coordinate of a vehicle body coordinate system and the coordinate of a characteristic calibration object in the vehicle body coordinate system when an intelligent networked vehicle or an automatic driving vehicle-mounted laser radar is calibrated; the invention utilizes the image processing technology to acquire the image information in the calibration field in real time to determine the relative position relationship between the vehicle and the characteristic calibration object, can randomly place the vehicle in the specified moving range, greatly reduces the requirement on the placement position during the calibration of the vehicle-mounted laser radar, and lays a technical foundation for the standardization, the flow and the automation of the laser radar calibration step.
The invention determines the relative position between the vehicle and the characteristic calibration object based on the camera, allows the vehicle to be parked randomly in a specified range, does not need to strictly ensure the unique vehicle body pose, greatly improves the efficiency and the accuracy of vehicle-mounted laser radar calibration, and can provide a feasible technical scheme and engineering landing conditions for the fast and full-automatic calibration of the vehicle-mounted laser radar of the automatic driving vehicle in batch production.
The invention effectively relaxes the requirement on the vehicle pose when the vehicle-mounted laser radar is calibrated, improves the calibration efficiency and precision of the laser radar, and overcomes the drawbacks of traditional feature-marker-based laser radar calibration methods, which require the vehicle to be placed in a specified pose, are difficult to position accurately, and suffer from larger errors.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description in the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a diagram showing a state of a camera in the method for automatically acquiring the relative position between a vehicle and an obstacle according to the present invention;
FIG. 2 is a top view of a calibration site for the method for automatically acquiring the relative position of a vehicle and an obstacle according to the present invention;
FIG. 3 is a schematic view of a process for determining world coordinates of the center of sphere of a sphere feature calibration object in the method for automatically acquiring relative positions of a vehicle and an obstacle according to the present invention;
FIG. 4 is a schematic diagram of a background difference processing procedure in the method for automatically acquiring relative positions of a vehicle and an obstacle according to the present invention;
FIG. 5 is a schematic diagram of a vehicle outer contour detection output flow in the method for automatically acquiring the relative position between a vehicle and an obstacle according to the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The present invention will be further explained with reference to specific embodiments.
The method of the invention for automatically acquiring the relative position of a vehicle and an obstacle is implemented as follows. A camera that captures the state of the calibration field is arranged above the middle of the vehicle-mounted laser radar calibration field, and sphere feature calibration objects are arranged at two opposite corners of the field. The world coordinates of the sphere centers of the sphere feature calibration objects in the calibration field are detected, the outer contour of the vehicle to be calibrated is detected, the circumscribed rectangle of the vehicle contour is calculated, and the world coordinates of the center of that rectangle are obtained. Taking the center of the circumscribed rectangle of the vehicle contour as the origin of the vehicle body coordinate system, the X axis and the Y axis of the vehicle body coordinate system are set parallel to the two sides of the circumscribed rectangle. The coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are finally calculated through the rotation-translation relationship between the vehicle body coordinate system and the world coordinate system, giving the relative position relationship between the sphere feature calibration objects and the vehicle body.
The invention relates to a method for automatically acquiring the relative position of a vehicle and an obstacle, which specifically comprises the following steps:
(1) A group of sphere feature calibration objects is arranged at each of two opposite corners of the vehicle-mounted laser radar calibration field (as shown in fig. 1; in this embodiment, the upper left and lower right corners of the vehicle-mounted laser radar calibration scene of an automatic driving vehicle). Each group consists of three matte-surfaced spheres arranged in an equilateral triangle, with the specific layout shown in fig. 1. The radius of each sphere is 0.6 m, the distance between the centers of adjacent spheres in each group is 2.2 m, and the center of each group is placed at a distance of 8-10 m from the origin of the vehicle coordinate system.
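As a rough geometric sketch of this layout (the placement bearing relative to the vehicle-frame origin is an assumption; the patent fixes only the 2.2 m side length and the 8-10 m distance of the group center), the world XY coordinates of one group of sphere centers can be generated as follows:

    import numpy as np

    def sphere_group_centres(d=9.0, side=2.2, bearing_deg=45.0):
        """World XY coordinates of the three sphere centres of one group: an
        equilateral triangle of side `side` whose centroid lies `d` metres from
        the vehicle-frame origin along the (assumed) bearing `bearing_deg`."""
        centroid = d * np.array([np.cos(np.radians(bearing_deg)),
                                 np.sin(np.radians(bearing_deg))])
        circumradius = side / np.sqrt(3.0)            # centroid-to-vertex distance
        angles = np.radians([90.0, 210.0, 330.0])     # vertices 120 degrees apart
        return centroid + circumradius * np.column_stack([np.cos(angles), np.sin(angles)])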
(2) A high-definition camera is arranged above the middle of the vehicle-mounted laser radar calibration field, with its imaging plane kept parallel to the plane of the calibration field. The camera height is h meters, where h may take any feasible value that gives the camera a complete and clear view of the calibration field, as shown in fig. 1. The vehicle is then parked in the middle area of the calibration field shown in fig. 2, with the heading angle of the vehicle head kept between 0 and 180 degrees.
(3) The internal reference matrix K and the external reference matrices R and T of the camera above the middle of the calibration field (i.e. the rotation and translation between the camera coordinate system and the world coordinate system) are calibrated with the Zhang Zhengyou checkerboard calibration method, and the internal and external reference matrices of the camera are stored in the calibration computer. The coordinate conversion formula between the pixel coordinate system and the world coordinate system is:
Z_c [u, v, 1]^T = K [R T] [X_w, Y_w, Z_w, 1]^T    (1)
where u and v are the coordinates of a point in the pixel coordinate system, X_w, Y_w, Z_w are the coordinates of the same point in the world coordinate system, Z_c is the projective scale factor, K is the internal reference matrix of the camera, and R, T are the external reference matrices of the camera.
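As a minimal sketch of how formula (1) can be used in practice, the following Python helper back-projects a pixel onto a horizontal plane of known height Z_w (for example the sphere-center plane Z_w = r), assuming the usual convention that R and T map world coordinates into the camera frame; all names are illustrative:

    import numpy as np

    def pixel_to_world_on_plane(uv, K, R, T, z_w):
        """Invert formula (1) for a point known to lie on the horizontal plane
        Z_w = z_w (e.g. the sphere-centre plane Z_w = r, or the ground Z_w = 0).
        uv: (u, v) pixel coordinates; K: 3x3 intrinsic matrix;
        R: 3x3 rotation, T: length-3 translation (world -> camera convention)."""
        r1, r2, r3 = R[:, 0], R[:, 1], R[:, 2]
        # For a fixed Z_w, formula (1) collapses to a plane homography:
        # s * [u, v, 1]^T = K [r1  r2  (r3*z_w + T)] [X_w, Y_w, 1]^T
        H = K @ np.column_stack([r1, r2, r3 * z_w + np.ravel(T)])
        xw = np.linalg.solve(H, np.array([uv[0], uv[1], 1.0]))
        xw /= xw[2]                        # normalise the homogeneous coordinate
        return float(xw[0]), float(xw[1])  # (X_w, Y_w) on the chosen plane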
(4) The image taken before the vehicle to be calibrated enters the calibration field is loaded and saved with the cvLoadImage() function in OpenCV and converted to grayscale with the cvtColor() function. The grayscale image is Gaussian-filtered and smoothed (with the GaussianBlur() function), circles in the filtered image are detected with the Hough gradient method (cvHoughCircles() function), and the coordinates of the centers of all detected circles are output through the ellipse fitting function fitEllipse() in OpenCV and expressed as the matrix O_cam:
O_cam = [(u_1, v_1); (u_2, v_2); ...; (u_6, v_6)]
where (u_1, v_1) .. (u_6, v_6) are the coordinates, in the two-dimensional pixel coordinate system, of the centers of the six spheres in the two groups of sphere feature calibration objects. The pixel coordinates (u_1, v_1) .. (u_6, v_6) of the sphere centers are converted to the world coordinate system with the coordinate conversion formula (1) and stored in the calibration computer; only the coordinates of the sphere centers in the two-dimensional world coordinate system are extracted (the Z axis is not considered for now), expressed as O_word:
O_word = [(x_w1, y_w1, r); (x_w2, y_w2, r); ...; (x_w6, y_w6, r)]
The flow is shown in fig. 3, where (x_w1, y_w1) .. (x_w6, y_w6) are the horizontal and vertical coordinates of the calibration sphere centers in the two-dimensional world coordinate system and r is the sphere radius. Once the commissioning stage is complete this step does not need to be repeated; the data stored in the calibration computer can be used directly in subsequent operations.
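A minimal sketch of step (4) in Python with OpenCV; the file name and the Hough parameters are illustrative and would need tuning for a real calibration field:

    import cv2
    import numpy as np

    img = cv2.imread("empty_field.png")                  # image taken before the vehicle enters
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)         # grayscale conversion
    blur = cv2.GaussianBlur(gray, (5, 5), 1.5)           # Gaussian filtering and smoothing

    # Hough gradient method: the six spheres appear as circles in the top view.
    circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=30, minRadius=10, maxRadius=120)
    centers_px = circles[0, :, :2] if circles is not None else np.empty((0, 2))

    # Each pixel-space centre can then be mapped to the world coordinate system on
    # the sphere-centre plane Z_w = r (r = 0.6 m) with the helper sketched above:
    # O_word = [pixel_to_world_on_plane(uv, K, R, T, z_w=0.6) for uv in centers_px]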
(5) The image taken before the vehicle to be calibrated enters the calibration field is loaded to obtain the world coordinates of the sphere centers. The Gaussian-filtered and smoothed image of step (4) is used as the input for building the background model: background modeling is performed with the Gaussian mixture background model BackgroundSubtractorMOG() in OpenCV and the modeling result, i.e. the background frame image P_empty, is saved. Because the calibration scene is a fixed scene, the background frame image P_empty built from the Gaussian mixture model does not need to be updated in real time.
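A minimal sketch of step (5); the patent names the MOG mixture-of-Gaussians model, while this sketch uses the closely related MOG2 subtractor that ships with the core cv2 module (the MOG variant lives in the opencv-contrib bgsegm module). File names and parameters are illustrative:

    import cv2

    backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                                 detectShadows=False)
    empty = cv2.GaussianBlur(cv2.imread("empty_field.png"), (5, 5), 1.5)
    for _ in range(50):                       # feed the static empty-scene frame repeatedly
        backsub.apply(empty)                  # so that the mixture model converges to it
    P_empty = backsub.getBackgroundImage()    # background frame image P_empty
    cv2.imwrite("P_empty.png", P_empty)       # the scene is static, so no live update is needed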
(6) Provided that the heading angle of the vehicle head stays within 0-180 degrees, the vehicle is parked in the calibration field in an arbitrary pose (position + orientation), and the current foreground frame image P_vehicle taken after the vehicle to be calibrated has entered the calibration field is acquired and loaded. After the foreground frame image P_vehicle has been loaded, the pose of the vehicle must not change until the calibration is completed. As shown in fig. 2, the vehicle enters the parking area from below and leaves it from above.
(7) The current scene frame image P_vehicle of the vehicle parked in the calibration field is acquired and loaded. The denoised current foreground frame image P_vehicle and the background frame image P_empty pre-stored in the calibration computer are then taken as inputs and processed with a background difference method: the current foreground frame image P_vehicle and the background frame image P_empty are subtracted, the pixel regions deviating from the background by more than a certain threshold T (T may take any reasonable value) are taken as the region occupied by the vehicle to be calibrated, and the position, size and shape of the vehicle to be calibrated in the image are output directly, i.e. a foreground pixel map containing only the vehicle to be calibrated is obtained; the flow is shown in fig. 4. Because the calibration field is a static scene with essentially constant illumination and the camera is fixed and does not shake, the known weaknesses of the background difference method are avoided and the accuracy of the extracted target region is improved.
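A minimal sketch of the background difference of step (7); the threshold value and file names are illustrative:

    import cv2

    T_DIFF = 40                                               # illustrative threshold
    P_empty = cv2.imread("P_empty.png", cv2.IMREAD_GRAYSCALE)
    P_vehicle = cv2.imread("vehicle_in_field.png", cv2.IMREAD_GRAYSCALE)
    P_vehicle = cv2.GaussianBlur(P_vehicle, (5, 5), 1.5)

    diff = cv2.absdiff(P_vehicle, P_empty)                    # |P_vehicle - P_empty|
    _, fg_mask = cv2.threshold(diff, T_DIFF, 255, cv2.THRESH_BINARY)
    foreground = cv2.bitwise_and(P_vehicle, P_vehicle, mask=fg_mask)  # vehicle-only pixels
    cv2.imwrite("fg_mask.png", fg_mask)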
(8) The foreground pixel map obtained in step (7) is binarized, Gaussian-filtered and smoothed, the outer contour of the vehicle to be calibrated is detected with the Canny algorithm, the two outermost diagonal points of the vehicle contour are extracted together with their coordinates in the pixel coordinate system, and the circumscribed rectangle of the outer contour of the vehicle to be calibrated is constructed from these two diagonal points; the flow is shown in fig. 5.
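A minimal sketch of step (8); the file name, thresholds and the image orientation assumed in the last comment are illustrative:

    import cv2
    import numpy as np

    fg_mask = cv2.imread("fg_mask.png", cv2.IMREAD_GRAYSCALE)    # foreground map from step (7)
    fg_smooth = cv2.GaussianBlur(fg_mask, (5, 5), 1.5)           # binarise + smooth
    _, fg_bin = cv2.threshold(fg_smooth, 127, 255, cv2.THRESH_BINARY)
    edges = cv2.Canny(fg_bin, 50, 150)                           # outer-contour information

    # Two outermost diagonal points of the contour in the pixel coordinate system,
    # and the circumscribed rectangle / its geometric centre C built from them.
    ys, xs = np.nonzero(edges)
    u_min, v_min = xs.min(), ys.min()                            # top-left diagonal point
    u_max, v_max = xs.max(), ys.max()                            # bottom-right diagonal point
    u_cen, v_cen = (u_min + u_max) / 2.0, (v_min + v_max) / 2.0  # centre C in pixels

    # Under the assumption that the vehicle head points towards the top of the image,
    # the perpendicular feet of step (9) are D = (u_cen, v_min) and F = (u_max, v_cen).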
(9) The geometric center point C of the circumscribed rectangle constructed in step (8) is taken as the origin of the vehicle body coordinate system, which is placed on a plane k meters above the ground (k may take any reasonable value, i.e. one that still leaves the camera a clear view covering the whole calibration field). The coordinates of point C in the image coordinate system are (u_cen, v_cen). Perpendicular lines are drawn from point C to two sides of the circumscribed rectangle, and the coordinates of their intersection points with the rectangle sides in the image coordinate system, D(u_trs, v_trs) and F(u_lon, v_lon), are obtained. Point D is the intersection of the perpendicular drawn towards the vehicle head with the circumscribed rectangle, and point F is the intersection of the perpendicular drawn towards the right side of the vehicle body with the circumscribed rectangle; the direction from point C to point F is the direction of the X axis of the vehicle body coordinate system, and the direction from point C to point D is the direction of the Y axis of the vehicle body coordinate system, as shown in fig. 2.
(10) The coordinates (x_C, y_C), (x_D, y_D) and (x_F, y_F) of point C, point D and point F in the world coordinate system are calculated with formula (1), and the heading angle θ of the vehicle body is then calculated as:
θ = arctan((y_D - y_C) / (x_D - x_C))    (2)
at this time, the coordinate calculation formula of the sphere center of each sphere under the vehicle body coordinate system is as follows:
x' = (x - x_C) cos θ + (y - y_C) sin θ,  y' = -(x - x_C) sin θ + (y - y_C) cos θ    (3)
where (x, y) are the coordinates of the sphere center in the two-dimensional world coordinate system, (x', y', r - k) are the coordinates of the sphere center in the vehicle body coordinate system, r is the sphere radius, k is the height of the XY plane of the vehicle body coordinate system above the ground, and r - k is the Z coordinate of the sphere center in the vehicle body coordinate system.
(11) The coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are calculated with formula (2) and formula (3); the set of sphere center coordinates in the vehicle body coordinate system is expressed as O_vehicle:
O_vehicle = [(x_V1, y_V1, z_V1); (x_V2, y_V2, z_V2); ...; (x_V6, y_V6, z_V6)]
where (x_V1, y_V1, z_V1) .. (x_V6, y_V6, z_V6) are the coordinates of the centers of the six spheres in the two groups of sphere feature calibration objects in the vehicle body coordinate system; since the origin C of the vehicle body coordinate system lies on a plane k meters above the ground, the value of every sphere center on the Z axis of the vehicle body coordinate system is r - k.
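As a minimal sketch of steps (9) to (11), using the same planar rotation-translation convention as formulas (2) and (3) above; the default k = 1.2 m and all variable names are illustrative, not values taken from the patent:

    import numpy as np

    def world_to_body(O_word_xy, C_w, D_w, k=1.2, r=0.6):
        """Transform sphere-centre world XY coordinates into the vehicle body frame.
        O_word_xy: (N, 2) world XY of the sphere centres; C_w, D_w: world XY of the
        body-frame origin C and of point D (towards the vehicle head).
        k: height of the body-frame XY plane above the ground (illustrative value);
        r: sphere radius (0.6 m in the described embodiment)."""
        C_w, D_w = np.asarray(C_w, float), np.asarray(D_w, float)
        theta = np.arctan2(D_w[1] - C_w[1], D_w[0] - C_w[0])   # heading angle, cf. formula (2)
        c, s = np.cos(theta), np.sin(theta)
        R_wb = np.array([[ c, s],
                         [-s, c]])                             # world -> body rotation
        xy_body = (np.asarray(O_word_xy, float) - C_w) @ R_wb.T   # cf. formula (3)
        z_body = np.full((len(xy_body), 1), r - k)             # sphere centres sit at Z = r - k
        return np.hstack([xy_body, z_body])                    # O_vehicle: one (x', y', z') row per sphere

    # Example call with hypothetical values:
    # O_vehicle = world_to_body(O_word_xy, C_w=(0.1, -0.2), D_w=(0.1, 2.0))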
The method is simple to operate: it uses image processing to acquire image information from the calibration field in real time and determine the relative position relationship between the vehicle body and the feature calibration objects, reduces the difficulty of placing the vehicle, improves the calibration efficiency and precision of the laser radar, and allows the vehicle to be parked freely within the specified area while still achieving the calibration goal.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for automatically acquiring the relative position of a vehicle and an obstacle, characterized in that a camera for capturing the state of the calibration field is arranged above the middle of a vehicle-mounted laser radar calibration field and sphere feature calibration objects are arranged at two opposite corners of the calibration field; the world coordinates of the sphere centers of the sphere feature calibration objects in the calibration field are then detected, the outer contour of the vehicle to be calibrated is detected, the circumscribed rectangle of the vehicle contour is calculated, and the world coordinates of the center of the circumscribed rectangle are obtained; the center of the circumscribed rectangle of the vehicle contour is taken as the origin of the vehicle body coordinate system, the X axis and the Y axis of the vehicle body coordinate system are set parallel to the two sides of the circumscribed rectangle, and the coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are finally calculated through the rotation-translation relationship between the vehicle body coordinate system and the world coordinate system.
2. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 1, specifically comprising the steps of:
(1) respectively arranging a group of sphere feature calibration objects at two opposite corners of the vehicle-mounted laser radar calibration field, each group consisting of three matte-surfaced spheres arranged in an equilateral triangle;
(2) placing a camera above the middle of a vehicle-mounted laser radar calibration field, wherein the height of the camera needs to ensure that the camera has a complete and clear view field of the calibration field, keeping the imaging plane of the camera horizontal to the plane of the calibration field, and then parking a vehicle in the middle area of the calibration field and ensuring that the heading angle of a vehicle head is between 0 and 180 degrees;
(3) calibrating an internal reference matrix and an external reference matrix of a camera positioned above the middle of a vehicle-mounted laser radar calibration site, and storing the internal and external reference matrices of the camera in a calibration operation computer;
(4) loading and storing an image when a calibrated vehicle does not enter a calibration field, carrying out gray processing on the image, carrying out Gaussian filtering and smoothing processing on the obtained gray image, then detecting a circle in the image after the Gaussian filtering and smoothing processing, extracting a circle center coordinate, namely a coordinate of a sphere center under a two-dimensional pixel coordinate system, converting the coordinate of the sphere center under the two-dimensional pixel coordinate system into a world coordinate system, obtaining the coordinate of the sphere center under the world coordinate system, and storing the coordinate in a calibration computer;
(5) loading an image when a vehicle to be calibrated does not enter a calibration field to obtain the world coordinates of the sphere center of the sphere, performing background modeling on the image subjected to Gaussian filtering and smoothing in the step (4) through a Gaussian mixture model, and storing a modeling result, namely a background image;
(6) on the premise of ensuring that the heading angle of the vehicle head is within 0-180 degrees, the vehicle is parked in a calibration field at any pose, and a current foreground frame image after the vehicle to be calibrated enters the calibration field is obtained and loaded;
(7) acquiring and loading the current scene frame image P_vehicle of the vehicle parked in the calibration field; then, taking the Gaussian-filtered current foreground frame image P_vehicle and the background frame image P_empty pre-stored in the calibration computer as inputs, processing the images with a background difference method, taking the pixel regions that deviate from the background by more than a certain threshold as the region occupied by the vehicle to be calibrated in the image, and obtaining a foreground pixel map containing only the vehicle to be calibrated;
(8) performing binarization, Gaussian filtering and smoothing on the foreground pixel map obtained in the step (7), then detecting the outer contour information of the calibrated vehicle to be obtained, extracting two diagonal points at the outermost periphery in the outer contour of the vehicle and obtaining the coordinates of the two diagonal points in a pixel coordinate system, and constructing an external rectangle of the outer contour of the calibrated vehicle by using the two diagonal points;
(9) setting the origin of the vehicle body coordinate system on a plane which is higher than the ground and can ensure that the camera has a clear view covering the whole calibration field by taking the geometric central point of the circumscribed rectangle constructed in the step (8) as the origin of the vehicle body coordinate system, respectively drawing perpendicular lines from the origin of the vehicle body coordinate system to two sides of the circumscribed rectangle, and acquiring coordinates of the intersection points of the perpendicular lines and the sides of the circumscribed rectangle in an image coordinate system;
(10) respectively calculating coordinates of the origin of the vehicle body coordinate system, the intersection point of the perpendicular line and the side of the circumscribed rectangle in the step (9) in the world coordinate system through coordinate conversion formulas under the pixel coordinate system and the world coordinate system, and then calculating the course angle of the vehicle body at the moment;
(11) and finally, calculating the coordinates of the sphere center of the sphere feature calibration object under the vehicle body coordinate system.
3. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: the radius of each sphere in the step (1) is 0.6m, the distance between the centers of adjacent spheres in each group of sphere feature calibration objects is 2.2m, and the center of each group of sphere feature calibration objects is placed at the distance of 8-10m from the origin of the vehicle coordinate system.
4. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: the step (3) is to calibrate an internal reference matrix and an external reference matrix of the camera which are arranged above the middle position of the calibration field by using a Zhang-Zhengyou calibration method; and the coordinate conversion formulas under the pixel coordinate system and the world coordinate system are as follows:
Z_c [u, v, 1]^T = K [R T] [X_w, Y_w, Z_w, 1]^T    (1)
in the above formula (1), u and v are the coordinates of a point in the pixel coordinate system, X_w, Y_w, Z_w are the coordinates of the same point in the world coordinate system, Z_c is the projective scale factor, K is the internal reference matrix of the camera, and R, T are the external reference matrices of the camera.
5. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: the step (4) loads and saves, with the cvLoadImage() function in OpenCV, an image taken before the vehicle to be calibrated enters the calibration field;
the step (4) is to carry out graying processing on the image through a cvtColor () function;
detecting a circle in the image subjected to Gaussian filtering and smoothing by using a Hough gradient method;
the step (4) outputs the coordinates of the centers of all detected circles through the ellipse fitting function fitEllipse() in OpenCV, expressed as the matrix O_cam:
O_cam = [(u_1, v_1); (u_2, v_2); ...; (u_6, v_6)]
where (u_1, v_1) .. (u_6, v_6) are the coordinates, in the pixel coordinate system, of the centers of the six spheres in the two groups of sphere feature calibration objects; the coordinates of the sphere centers in the world coordinate system are expressed as O_word:
O_word = [(x_w1, y_w1, r); (x_w2, y_w2, r); ...; (x_w6, y_w6, r)]
where (x_w1, y_w1) .. (x_w6, y_w6) are the horizontal and vertical coordinates of the centers of the six spheres of the sphere feature calibration objects in the two-dimensional world coordinate system and r is the radius of the spheres.
6. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: in step (5), background modeling is performed on the Gaussian-filtered and smoothed image of step (4) with the Gaussian mixture background model BackgroundSubtractorMOG() in OpenCV.
7. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: and (8) detecting the outer contour information of the calibration vehicle to be acquired by using a Canny algorithm.
8. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 2, characterized in that: the calculation formula for calculating the current heading angle theta of the vehicle body in the step (10) is as follows:
θ = arctan((y_D - y_C) / (x_D - x_C))    (2)
where (x_C, y_C) are the world coordinates of the origin of the vehicle body coordinate system and (x_D, y_D) are the world coordinates of the intersection point of the perpendicular drawn towards the vehicle head with the circumscribed rectangle;
at this time, the coordinate calculation formula of the sphere center of each sphere under the vehicle body coordinate system is as follows:
x' = (x - x_C) cos θ + (y - y_C) sin θ,  y' = -(x - x_C) sin θ + (y - y_C) cos θ    (3)
where (x, y) are the coordinates of the sphere center in the two-dimensional world coordinate system, (x', y', r - k) are the coordinates of the sphere center in the vehicle body coordinate system, r is the sphere radius, k is the height of the XY plane of the vehicle body coordinate system above the ground, and r - k is the Z coordinate of the sphere center in the vehicle body coordinate system.
9. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 8, characterized in that: in step (11) the coordinates of the sphere centers of the sphere feature calibration objects in the vehicle body coordinate system are calculated through formula (2) and formula (3); the Z coordinate of every sphere center in the vehicle body coordinate system is r - k, and the set of sphere center coordinates in the vehicle body coordinate system is expressed as O_vehicle:
O_vehicle = [(x_V1, y_V1, z_V1); (x_V2, y_V2, z_V2); ...; (x_V6, y_V6, z_V6)]
where (x_V1, y_V1, z_V1) .. (x_V6, y_V6, z_V6) are the coordinates of the centers of the six spheres in the two groups of sphere feature calibration objects in the vehicle body coordinate system.
10. The method for automatically acquiring the relative position of the vehicle and the obstacle according to claim 1, characterized in that: and the calibration scene of the vehicle-mounted laser radar is a static scene.
CN202110487005.1A 2021-05-03 2021-05-03 Method for automatically acquiring relative position of vehicle and obstacle Expired - Fee Related CN113096187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110487005.1A CN113096187B (en) 2021-05-03 2021-05-03 Method for automatically acquiring relative position of vehicle and obstacle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110487005.1A CN113096187B (en) 2021-05-03 2021-05-03 Method for automatically acquiring relative position of vehicle and obstacle

Publications (2)

Publication Number Publication Date
CN113096187A (en) 2021-07-09
CN113096187B (en) 2022-05-17

Family

ID=76681258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110487005.1A Expired - Fee Related CN113096187B (en) 2021-05-03 2021-05-03 Method for automatically acquiring relative position of vehicle and obstacle

Country Status (1)

Country Link
CN (1) CN113096187B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102592A1 (en) * 2009-10-30 2011-05-05 Valeo Vision System of gauging a camera suitable for equipping a vehicle
CN103871071A (en) * 2014-04-08 2014-06-18 北京经纬恒润科技有限公司 Method for camera external reference calibration for panoramic parking system
CN106405555A (en) * 2016-09-23 2017-02-15 百度在线网络技术(北京)有限公司 Obstacle detecting method and device used for vehicle-mounted radar system
CN107133988A (en) * 2017-06-06 2017-09-05 科大讯飞股份有限公司 The scaling method and calibration system of camera in vehicle-mounted panoramic viewing system
CN107966700A (en) * 2017-11-20 2018-04-27 天津大学 A kind of front obstacle detecting system and method for pilotless automobile
CN109849782A (en) * 2017-11-30 2019-06-07 比亚迪股份有限公司 Virtual panoramic auxiliary driving device and its display methods, vehicle
CN108490830A (en) * 2018-03-16 2018-09-04 合肥工业大学 A kind of control device and its control method of the automatic car washing based on machine vision
CN109767473A (en) * 2018-12-30 2019-05-17 惠州华阳通用电子有限公司 A kind of panorama parking apparatus scaling method and device
CN109557525A (en) * 2019-01-31 2019-04-02 浙江工业大学 A kind of automatic calibration method of laser radar formula vehicle overall dimension measuring instrument
CN110488234A (en) * 2019-08-30 2019-11-22 北京百度网讯科技有限公司 Outer ginseng scaling method, device, equipment and the medium of vehicle-mounted millimeter wave radar
CN111145260A (en) * 2019-08-30 2020-05-12 广东星舆科技有限公司 Vehicle-mounted binocular calibration method
CN111862672A (en) * 2020-06-24 2020-10-30 北京易航远智科技有限公司 Parking lot vehicle self-positioning and map construction method based on top view
CN112233188A (en) * 2020-10-26 2021-01-15 南昌智能新能源汽车研究院 Laser radar-based roof panoramic camera and calibration method thereof
CN112224132A (en) * 2020-10-28 2021-01-15 武汉极目智能技术有限公司 Vehicle panoramic all-around obstacle early warning method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhu Zhengze et al., "Cooperative control of connected autonomous driving vehicles based on delay compensation", Journal of System Simulation *
Wang Pei et al., "Negative obstacle detection algorithm based on single-line lidar and vision fusion", Computer Engineering *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115083208A (en) * 2022-07-20 2022-09-20 深圳市城市交通规划设计研究中心股份有限公司 Human-vehicle conflict early warning method, early warning analysis method, electronic device and storage medium
CN115083208B (en) * 2022-07-20 2023-02-03 深圳市城市交通规划设计研究中心股份有限公司 Human-vehicle conflict early warning method, early warning analysis method, electronic device and storage medium
CN118397109A (en) * 2024-06-27 2024-07-26 杭州海康威视数字技术股份有限公司 Vehicle-mounted camera calibration method and device, electronic equipment and machine-readable storage medium

Also Published As

Publication number Publication date
CN113096187B (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN113096187B (en) Method for automatically acquiring relative position of vehicle and obstacle
US20220148213A1 (en) Method for fully automatically detecting chessboard corner points
CN104835173B (en) A kind of localization method based on machine vision
CN112223285B (en) Robot hand-eye calibration method based on combined measurement
CN106326892B (en) Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN106340044B (en) Join automatic calibration method and caliberating device outside video camera
CN112308916B (en) Target pose recognition method based on image target
CN110823252B (en) Automatic calibration method for multi-line laser radar and monocular vision
CN111508027B (en) Method and device for calibrating external parameters of camera
CN107194399A (en) A kind of vision determines calibration method, system and unmanned plane
CN113156411A (en) Vehicle-mounted laser radar calibration method
WO2021017352A1 (en) Laser radar-camera joint calibration target and joint calibration method
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN113379712A (en) Steel bridge bolt disease detection method and system based on computer vision
CN114972531B (en) Corner detection method, equipment and readable storage medium
CN115014371A (en) Positioning and mapping method and device for grain transfer vehicle of intelligent grain depot and storage medium
CN115792865A (en) Camera and mechanical laser radar-based external parameter calibration method, system, medium and vehicle
CN110543612B (en) Card collection positioning method based on monocular vision measurement
CN112102396B (en) Method, device, equipment and storage medium for positioning vehicle under bridge crane
CN112907677B (en) Camera calibration method and device for single-frame image and storage medium
CN117315046A (en) Method and device for calibrating looking-around camera, electronic equipment and storage medium
CN110853103B (en) Data set manufacturing method for deep learning attitude estimation
CN114202548A (en) Forklift pallet positioning method and device, storage medium and electronic equipment
CN110969661B (en) Image processing device and method, and position calibration system and method
CN116758142A (en) Defect component positioning method and device for image geographic registration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220517