CN111459176B - Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle

Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle

Info

Publication number
CN111459176B
Authority
CN
China
Prior art keywords
point
vehicle
value
robot
coordinate system
Prior art date
Legal status
Active
Application number
CN202010258584.8A
Other languages
Chinese (zh)
Other versions
CN111459176A (en)
Inventor
陈鹏飞
叶云波
吴迪
吕恕
Current Assignee
Chongqing Googol Changjiang Research Institute Co ltd
Original Assignee
Chongqing Googol Changjiang Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Googol Changjiang Research Institute Co ltd filed Critical Chongqing Googol Changjiang Research Institute Co ltd
Priority to CN202010258584.8A priority Critical patent/CN111459176B/en
Publication of CN111459176A publication Critical patent/CN111459176A/en
Application granted granted Critical
Publication of CN111459176B publication Critical patent/CN111459176B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/70Energy storage systems for electromobility, e.g. batteries
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/7072Electromobility specific charging systems or methods for batteries, ultracapacitors, supercapacitors or double-layer capacitors

Abstract

The invention relates to an automatic charging positioning control method, a calibration method and a vehicle attitude calculation method for new energy vehicles based on a point laser and a 2D camera. The control method comprises a vehicle locating stage, a camera positioning stage and a vehicle attitude correction stage. The invention calculates the offset iteratively until the states measured at the robot's photographing point and measuring points are, under that offset, consistent with the states recorded for the calibration template; the offset can then be taken as the position and attitude offset of the vehicle relative to the calibration template. After the offset is compensated onto the template action trajectory, the new trajectory fits the current vehicle position and operations such as charging can be completed.

Description

Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle
Technical Field
The invention relates to the technical field of automation, in particular to a new energy vehicle automatic charging positioning control method, a calibration method and a vehicle attitude calculation method based on point laser and a 2D camera.
Background
Because of their relatively short driving range, new energy vehicles need to be charged in a large number of scenarios. For domestic use, manual charging may be acceptable. In factories and unmanned automated parking environments, however, vehicle arrival times and frequencies are uncertain and the charging period is long; assigning dedicated staff to wait in the charging area, charge each arriving vehicle and pull out the gun when charging finishes is a waste of manpower and cost.
This has created a need to charge new energy vehicles automatically with a six-axis robot. After the vehicle opens the outer cover, which cannot be opened from outside, the robot must complete the actions of opening the inner cover, inserting the gun, pulling the gun and closing the cover. Because vehicles do not stop in exactly the same position, the robot needs external precision positioning guidance to correct the offset so that it can complete these actions accurately and without collision.
Traditional positioning methods rely on sensors added at the vehicle end, as when an AGV enters an automatic charging pile, or they force the vehicle into a fixed position with a mechanical device so that the charging port position is constant and the robot can follow a fixed trajectory. These methods have limitations: a mechanical limiting device cannot adapt to vehicles of different specifications and sizes, and vehicles from different manufacturers cannot all be fitted with the same sensor, so vehicle compatibility is poor.
Disclosure of Invention
The invention aims to accurately determine the position and attitude of the vehicle, transmit the position offset to the robot and guide the robot's movement. It is compatible with different vehicles and flexible: adding a new vehicle type incurs almost no additional hardware cost and only requires a worker to complete one round of debugging and calibration. To this end the invention provides an automatic charging positioning control method, a calibration method and a vehicle attitude calculation method for new energy vehicles based on a point laser and a 2D camera.
The technical scheme of the invention is an automatic charging positioning control method for new energy vehicles based on a point laser and a 2D camera, in which the 2D camera and the point laser are both mounted on the robot end tool and move with it, characterized by comprising the following steps:
the vehicle locating stage specifically comprises the following steps:
(1.1) the robot moves step by step in a detection pose from the parking-space locating start point towards the other end, photographing and detecting after each movement;
(1.2) judging whether the vehicle inner cover is detected; if so, entering step (1.4); if not, stepping once by a fixed distance towards the locating end point;
(1.3) judging whether the locating end point has been reached; if so, no matching vehicle has been detected and the process ends; if not, returning to step (1.1);
(1.4) when the vehicle inner cover is present in the image, judging which vehicle template the inner cover matches, computing the matched template, calling it up, and setting it as the detection template used by the subsequent detection program;
(1.5) calculating a position compensation amount by combining the moving distance and the pixel deviation value;
the camera location stage specifically includes:
(2.1) moving the robot to a photographing point according to the calculated position compensation amount;
(2.2) photographing and detecting, and calculating pixel deviation values and new compensation amounts of the robot;
(2.3) judging whether the deviation is smaller than a threshold value, if not, returning to the step (2.1), and if so, entering the next step;
the vehicle posture correcting stage specifically comprises the following steps:
(3.1) moving to the specified measurement points according to the current compensation amount, acquiring data and calculating the attitude transformation;
the secondary positioning stage of the camera specifically comprises the following steps:
(4.1) the robot moves to the photographing point with the new compensation amount;
(4.2) photographing and detecting, and calculating pixel deviation values and new compensation amounts of the robot;
(4.3) judging whether the deviation is smaller than a threshold value, if not, returning to the step (4.1), and if so, entering the next step;
the vehicle distance correcting stage specifically comprises the following steps:
(5.1) the robot moves to the depth measurement point with the new compensation amount;
(5.2) measuring depth and calculating a depth difference from the template;
(5.3) judging whether the depth difference is smaller than a threshold value; if not, returning to step (5.1), and if so, entering the next step;
finally, judging whether the number of cycles has reached the threshold; if not, returning to step (2.1); if so, ending.
As preferable: the step (2.2) further comprises: converting the image pixel difference into an actual robot movement distance using the pixel-to-actual-distance ratio k1, and moving the robot by that value; after the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new pixel-to-actual-distance ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation; the process is repeated until the difference is smaller than the set threshold, after which the vehicle posture correction stage is entered.
As preferable: the step (3.1) further comprises: on the basis of the measured offset compensation, moving to a plurality of measuring points with a fixed spacing and measuring their depth data in sequence; calculating the spatial attitude change of the vehicle from the result and transmitting it to the robot.
As preferable: the step (4.2) further comprises: converting the image pixel difference into an actual robot movement distance using the pixel-to-actual-distance ratio k1, and moving the robot by that value; after the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new pixel-to-actual-distance ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation.
As preferable: the step (5.2) further comprises:
collecting the current depth value and comparing it with the value collected during template calibration; the difference, converted by the set movement ratio k2, drives the robot to move back or forth along the direction facing the vehicle; measuring again after the movement, and if the measured difference is smaller than the set threshold, considering this stage complete; otherwise, correcting the movement ratio k2 from the depth differences measured before and after the movement and substituting it into the next calculation.
The other technical scheme of the invention is a calibration method for new energy vehicles based on a point laser and a 2D camera, characterized by comprising the following steps:
camera template picture acquisition, which specifically includes:
(1.1) adjusting the robot pose so that the camera photographs the vehicle inner cover with the inner cover located at the centre of the image; after the image is captured, processing it so that the area outside the inner cover is set to white, eliminating interference features;
depth point pose acquisition, which specifically includes:
(2.1) adjusting the robot pose so that the point laser line is as parallel to the ground as possible and perpendicular to the initial locating movement direction, and strikes the centre of the inner cover, so that the mapped position of the laser spot on the inner cover changes only slightly when approaching or leaving the vehicle;
(2.2) taking this position as the depth acquisition point and completing acquisition and recording of the depth point laser measurement value;
attitude measurement point acquisition, which specifically includes:
(3.1) taking the depth acquisition point as reference, keeping the robot attitude Euler angles ABC unchanged, and taking set XYZ offsets as attitude measurement points;
the offset directions are the two directions other than the direction of approach to the vehicle; that is, if the robot X-axis motion direction is towards or away from the vehicle, offsets are set in the robot Y-axis and Z-axis directions; when the robot position of the depth recording point is (x, y, z, a, b, c), the robot positions of the acquisition points are (x, y+n1, z+m1, a, b, c), (x, y+n2, z+m2, a, b, c), (x, y+n3, z+m3, a, b, c) and so on, and the laser spot must still fall on the inner cover at each acquisition point;
after acquisition and recording of the laser measurement value at each point is completed, the data required for calculation has been acquired.
The invention also provides a vehicle attitude calculation method for new energy vehicles using a point laser and a 2D camera, characterized by comprising the following steps:
the camera location specifically includes:
(1.1) acquiring an image by using a camera after reaching a photographing point;
(1.2) feature point extraction: respectively extracting characteristic points in a template picture and an operation acquisition picture by adopting an image characteristic descriptor mode;
(1.3) feature point matching and optimization: performing a primary matching of the two groups of feature points extracted in step (1.2); because direct matching produces a large number of false matching pairs, a primary matching screening is carried out with a k-nearest-neighbour algorithm; the matches are then screened again by calculating the fundamental matrix and homography matrix between the images, obtaining a group of matching pairs with higher confidence;
(1.4) pixel difference calculation: n groups of matching pairs are obtained in step (1.3); for a feature point P_n1(X_n1, Y_n1) in the screened template image there is a corresponding point P_m1(X_m1, Y_m1) in the operation acquisition image, and for this matching point pair the pixel difference is L_1 = sqrt((X_n1 - X_m1)^2 + (Y_n1 - Y_m1)^2);
(1.5) actual motion calculation: when the template is calibrated, an initial motion ratio K_init = L_real / L_image is calculated from the actual distance L_real of the features on the vehicle cover and the pixel difference L_image;
Vehicle attitude calculation specifically includes:
(2.1) taking P_0 as origin and retaining the coordinate axis directions of the original robot coordinate system, constructing an observation coordinate system O_p; in the observation coordinate system the coordinates of each observation point are P_pi(d_xi, d_yi, d_zi), eliminating the influence of the P_0 point coordinates;
(2.2) in the observation coordinate system O_p, the laser distance values L_0, L_1, L_2, L_3 ... L_n of the measuring points can be used as coordinate values along the distance-direction coordinate axis A_xis1 to construct a point set P_pt on the vehicle cover; if the Y-axis is the selected axis, the coordinate value of the i-th mapping point on the cover surface in the coordinate system O_p is P_pti(d_xi, L_i, d_zi);
(2.3) substituting the coordinates of the point set P_pt on the vehicle cover in the coordinate system O_p into the spatial plane equation Ax + By + Cz + D = 0; solving the over-determined equation gives the vehicle cover plane equation coefficients A, B, C, D, where (A, B, C) is the normal vector of the plane;
(2.4) the mapping point of the depth measurement point P_0 on the vehicle cover is denoted P_pt0 in the observation coordinate system O_p; a point P_axis is selected on any coordinate axis A_xis2 other than the direction axis A_xis1, and the point P'_axis obtained by projecting P_axis onto the plane along the coordinate axis direction A_xis1 is calculated;
(2.5) with P_pt0 as origin, the direction of the vector from P_pt0 to P'_axis is the positive direction of the new coordinate axis A_xis2, and the plane normal vector in the direction away from the point P_pt0 is taken as the positive direction of the new coordinate axis A_xis3; the vehicle cover coordinate system O_car is constructed according to the right-hand rule;
(2.6) in the observation coordinate system O_p, the three axis unit vectors representing the attitude of the observation coordinate system O_p itself are X(1, 0, 0), Y(0, 1, 0), Z(0, 0, 1); stacked as columns they form the matrix A = [[1, 0, 0], [0, 1, 0], [0, 0, 1]], i.e. the identity matrix;
in the observation coordinate system, the three axis unit vectors representing the attitude of the vehicle cover coordinate system O_car are X(x_1, y_1, z_1), Y(x_2, y_2, z_2), Z(x_3, y_3, z_3); stacked as columns they form the matrix B = [[x_1, x_2, x_3], [y_1, y_2, y_3], [z_1, z_2, z_3]];
the change from the attitude of coordinate system B to the attitude of coordinate system A can be represented by the rotation matrix R_B_A, with the relationship
R_B_A · B = A,
so that, with B^-1 the inverse of B, the rotation matrix is R_B_A = A · B^-1;
The physical meaning of the rotation matrix is the attitude relation of the inner cover of the vehicle relative to the robot observation coordinate system;
(2.7) when the template is calibrated, the rotation matrix R_B_A_std of the template position relative to the observation coordinate system is obtained; in subsequent work, the rotation matrix R_B_A_work of the new vehicle position relative to the observation coordinate system is measured; the attitude change R_work_std between the new position where the vehicle stops during work and the template position is the attitude the robot needs to compensate, and after compensation the relative attitude relation is the same as for the calibration template;
after the transformation matrix is obtained, the Euler angle change values (A, B, C) can be solved according to the Euler angle sequence form corresponding to the robot, and these values are compensated onto the robot trajectory;
the depth distance correction specifically comprises the following steps:
(3.1) when the template is calibrated, the distance from the depth measurement point to the vehicle cover is L_std; since this value is measured at actual scale, the initial motion ratio can be set to K_init = 1; during work, the depth measurement point measures the distance to the vehicle cover as L_work, and the robot compensates a movement distance d_l = (L_work - L_std) * K in the approach direction;
(3.2) because the laser line cannot be adjusted to be exactly perpendicular to the vehicle cover surface, the position of the laser spot on the cover changes slightly when the robot moves in the approach direction, so the change in measured distance is inconsistent with the movement value and the ratio K needs to be corrected; the correction method is:
the depth of the last measurement is L_1, the corresponding actual movement distance is d_L, and the depth of this measurement is L_2; then K = d_L / (L_1 - L_2);
after the calculation is completed, K is checked to prevent abnormal values; when K > 0 and K < 2*K_init, K is considered valid, otherwise K = K_init.
As preferable: in the step (1.4), since the attitude and spatial position of the two image-capture positions may differ, the positions at which the object projects onto the image plane differ between the two shots, so the members L_1, L_2, L_3 ... L_k of the matching-point pixel difference group G_L are not all equal; only when the pixel differences of part of the matching point pairs are smaller than a threshold value is the camera positioning stage considered complete, after which the point laser attitude verification stage is entered for attitude correction;
for the matching-point pixel difference group G_L, the values are sorted from small to large and then judged starting from the first position; if a value meets the condition it is taken as the resulting pixel difference, otherwise the next position is judged in turn; for the i-th value V_i it is judged whether it is smaller than the comparison minimum V_min; if so, the comparison value V_compare is set from the comparison minimum, otherwise the comparison value is set to V_compare = V_i * K, where K is an amplification factor; the proportion of values in the pixel difference group that are smaller than the comparison value V_compare relative to the total number is calculated, and when this proportion is greater than the set threshold, this value is taken as the pixel difference result of the current positioning;
when the attitude correction and the depth correction are completed, the spatial relationship of the camera relative to the vehicle cover at the photographing point is considered identical to that when the template was established, and the pixel difference between matching points is ideally 0; the variance of the pixel differences of all matching point pairs about 0 can be used as a score to check the matching result, i.e. score = (1/n)·Σ L_i^2; after the calculation is completed, K is checked to prevent abnormal values; when K > 0 and K < 2*K_init, K is considered valid, otherwise K = K_init.
As preferable: in the step (1.4), the calculated pixel difference L_i consists of two parts, d_x and d_y; the robot's corresponding motion changes along the image d_x and d_y axis directions are K*d_x and K*d_y, and these values are transmitted to the robot to complete the movement; however, since the attitude and spatial position of the two image-capture positions differ, the scale for converting pixels into actual distance also changes as the actual distance between the camera and the inner cover feature changes, so in order to reduce the number of motion adjustments the scale K needs to be corrected.
As preferable: the correction includes: the pixel difference measured last time is d_x1, d_y1, corresponding to actual movement distances L_x, L_y; the pixel difference measured this time is d_x2, d_y2; then K_x = L_x / (d_x1 - d_x2) and K_y = L_y / (d_y1 - d_y2); after the calculation is completed, K is checked to prevent abnormal values; when K > 0 and K < 2*K_init, K is considered valid, otherwise K = K_init.
Compared with the prior art, the invention has the beneficial effects that:
the automatic charging system can be combined with a robot to realize automatic charging actions on different vehicles, unmanned automatic charging is realized, and labor is reduced.
The hardware of the invention only uses one point laser and 2D camera, so that the cost is low.
According to the invention, due to the adoption of a real-time correction and deviation iterative calculation mode, the sensor does not need to be calibrated to reduce distortion errors, the accuracy of the sensor mounting position and the flatness of the mounting surface are not required, and the sensor can be mounted in any combination. Meanwhile, the calibration of the mutual position relationship among the camera, the point laser and the robot is not needed. The calibration process is simple, no additional auxiliary materials such as a calibration plate are needed, and the later maintenance is simple.
The newly added support vehicle type does not need to add extra hardware, and the system has flexibility.
Drawings
FIG. 1 is a schematic view of the fixture of the present invention;
FIG. 2 is a flow chart of the steps of the present invention;
FIG. 3 shows before-and-after views of the template image acquisition process of the present invention;
FIG. 4 is a schematic view of a laser acquisition point of the present invention with 4 attitude acquisition points;
FIG. 5 is a schematic diagram showing feature point extraction (color spots are feature points) according to the present invention;
FIG. 6 is a schematic diagram of the optimized post-filter matching of the present invention;
fig. 7 is a schematic diagram of the coordinate system of the present invention.
Detailed Description
The invention will be further described in detail below with reference to the accompanying drawings:
referring to the fixture structure shown in fig. 1, the present invention relies on a 2D camera and a point laser to implement algorithm positioning, where the 2D camera and the point laser are mounted on a robot end tool and move with the robot.
The calculated result is an offset, i.e. the spatial change between the vehicle's current stopping position and the calibration template position, which is transmitted to the robot as six degrees of freedom (spatial position changes X, Y, Z and attitude Euler angles A, B, C) and used to compensate the template trajectory.
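Purely as an illustration (not part of the patent's specification), the following minimal Python sketch shows how such a six-degree-of-freedom offset could be packed into a homogeneous transform and applied to one pose of the template trajectory; the 'zyx' Euler sequence and the helper name offset_to_transform are assumptions and would have to match the actual robot controller's convention.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def offset_to_transform(x, y, z, a, b, c, order="zyx"):
    """Build a 4x4 homogeneous transform from the six-DOF offset (assumed convention)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler(order, [a, b, c], degrees=True).as_matrix()
    T[:3, 3] = [x, y, z]
    return T


# Example: shift one waypoint of a calibrated template trajectory by the offset.
offset_T = offset_to_transform(2.0, -1.5, 0.8, 0.3, -0.2, 0.1)   # illustrative values
template_pose = np.eye(4)                                         # one template waypoint
compensated_pose = offset_T @ template_pose
```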
The functions realized by using the 2D camera are as follows:
firstly, searching the image for the hood feature position acquired during calibration;
secondly, calculating the xy pixel difference, in the image coordinate system, between the current hood feature and the template hood feature position.
The functions realized by using point lasers are as follows:
measuring the depth of a laser point;
secondly, calculating the attitude Euler angle of the vehicle according to the plurality of acquisition points;
comparing the current distance with the template distance, and adjusting the distance between the robot and the vehicle.
The positioning process corrects in real time according to the instantaneous measurement results and iterates the deviation until it matches; because the measurement is not completed in a single step, the absolute accuracy requirement during measurement is low and the sensors do not need additional calibration.
Referring to the flow steps shown in fig. 2, the present invention has five steps executed during operation, wherein steps 2 to 5 are required to be repeated 2 to 3 times as a small cycle in order to ensure the final accuracy.
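As a reading aid only, the following minimal Python sketch outlines this run-time flow; the stage functions (locate_vehicle, camera_position, correct_attitude, correct_depth) are hypothetical names standing in for the five stages described below, not functions defined by the patent.

```python
def run_positioning(stages, cycles=3):
    # Stage 1: step along the parking space until the inner cover is found.
    offset = stages.locate_vehicle()
    # Stages 2-5 form a small cycle repeated 2-3 times to ensure final accuracy.
    for _ in range(cycles):
        offset = stages.camera_position(offset)    # stage 2: in-image offsets from the 2D camera
        offset = stages.correct_attitude(offset)   # stage 3: A, B, C from point-laser depths
        offset = stages.camera_position(offset)    # stage 4: re-position after the rotation
        offset = stages.correct_depth(offset)      # stage 5: approach-direction distance
    return offset                                  # compensation applied to the template trajectory
```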
The steps are described as follows:
The vehicle locating stage comprises the following steps:
This stage corrects the spatial offset XYZ. The robot moves step by step in a detection pose from one end of the parking space to the other, photographing and detecting after each movement. When the vehicle inner cover appears in an image, the stage judges which vehicle template the inner cover matches, sets that vehicle type as the detection template used by the subsequent detection program, and the camera positioning stage is entered.
The camera positioning stage:
This stage corrects the spatial offset XYZ. The camera positioning stage calculates the coordinate difference, in image coordinates, between the extracted features and the template features, converts the image pixel difference into an actual robot movement distance using the pixel-to-actual-distance ratio k1, and moves the robot by that value. After the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation. The process is repeated until the difference is smaller than the set threshold, and the vehicle posture correction stage is then entered.
The vehicle posture correcting stage comprises the following steps:
This stage corrects the spatial offset ABC. On the basis of the measured offset compensation, the robot moves to a plurality of measurement points with a fixed spacing and the depth data of the measurement points are measured in sequence. The spatial attitude change of the vehicle is calculated from the result and transmitted to the robot, after which the camera secondary positioning stage is entered.
The camera secondary positioning stage:
This stage corrects the spatial offset XYZ. Because stage 3 changed the attitude, the image position changes when the original XYZ offset is used to reach the photographing point, so the coordinate difference between the extracted features and the template features in image coordinates must be calculated again; the image pixel difference is converted into an actual robot movement distance using the pixel-to-actual-distance ratio k1 and the robot moves by that value. After the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation. The process is repeated until the difference is smaller than the set threshold, and the vehicle distance correction stage is then entered.
The vehicle distance correction stage:
This stage corrects the spatial offset XYZ. The robot moves to the distance measurement point on the basis of the measured offset compensation; the current depth value is collected and compared with the value collected during template calibration, and the difference, converted by the set movement ratio k2, drives the robot to move back or forth along the direction facing the vehicle. The measurement is repeated after the movement; if the measured difference is smaller than the set threshold this stage is considered complete, otherwise the movement ratio k2 is corrected from the depth differences measured before and after the movement and substituted into the next calculation.
When a vehicle of a different type is newly added, calibration is needed, and the vehicle position during calibration is used as the template position for subsequent compensation calculations. After the robot trajectory points have been taught for that vehicle position, the calibration process of the invention begins.
The calibration step of the invention is mainly divided into three parts, namely camera template picture acquisition, depth point pose acquisition and attitude measurement point acquisition; the steps are briefly described as follows:
Camera template picture acquisition:
Please refer to the before-and-after effect of template image acquisition shown in fig. 3. The robot pose is adjusted so that the camera photographs the vehicle inner cover with the inner cover as close to the centre of the image as possible; after the image is captured it is processed so that the area outside the inner cover is set to white, eliminating interference features.
Depth point pose acquisition:
Referring to the laser acquisition point position shown in fig. 4, the robot pose is adjusted so that the point laser line is as parallel to the ground as possible and perpendicular to the initial locating movement direction, and strikes the centre of the inner cover, so that the mapped position of the laser spot on the inner cover changes only slightly when approaching or leaving the vehicle. This position is taken as the depth acquisition point, and acquisition and recording of the depth point laser measurement value are completed.
Attitude measurement point acquisition:
Taking the depth acquisition point as reference, the robot attitude Euler angles ABC are kept unchanged and set XYZ offsets are taken as attitude measurement points. The offset directions are the two directions other than the direction of approach to the vehicle; that is, if the robot X-axis motion direction is towards or away from the vehicle, offsets are set in the robot Y- and Z-axis directions. When the robot position of the depth recording point is (x, y, z, a, b, c), the robot positions of the acquisition points are (x, y+n1, z+m1, a, b, c), (x, y+n2, z+m2, a, b, c), (x, y+n3, z+m3, a, b, c) and so on; more than three groups of acquisition points are required, the laser spot must still fall on the inner cover at each of them, and acquisition and recording of the laser measurement value at each point are then completed.
The three processes are completed, so that the data required by calculation can be completely acquired.
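A minimal sketch of the attitude measurement point generation described above; the numeric offsets are illustrative only, and the assumption that X is the approach direction (so offsets are applied in Y and Z) follows the example in the text.

```python
def attitude_measurement_points(depth_point, yz_offsets):
    """depth_point is (x, y, z, a, b, c); yz_offsets is a list of (n_i, m_i) pairs."""
    x, y, z, a, b, c = depth_point
    # Euler angles a, b, c stay fixed; at least three offset groups are required,
    # and every offset must keep the laser spot on the inner cover.
    return [(x, y + n, z + m, a, b, c) for n, m in yz_offsets]


points = attitude_measurement_points(
    depth_point=(850.0, 120.0, 410.0, 180.0, 0.0, 90.0),              # example values only
    yz_offsets=[(30.0, 0.0), (0.0, 30.0), (-30.0, 0.0), (0.0, -30.0)],
)
```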
The principle of the implementation process of the invention:
the core idea of the algorithm is as follows: and continuously and iteratively calculating the offset, so that the states measured by the photographing point and the measuring point of the robot are consistent with the calibration template under the offset, and the offset can be considered to be the position and posture offset of the vehicle relative to the calibration template. After the offset is compensated to the track of the template action, the new track is suitable for the current position of the vehicle, and operations such as charging can be completed.
The offset contains six degrees of freedom X, Y, Z, A, B, C. Two of the X, Y, Z values are obtained by the camera positioning calculation and the remaining one is corrected by the depth distance; the A, B, C degrees of freedom are obtained by the vehicle attitude calculation.
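Purely to illustrate this decomposition (the structure below is not defined by the patent), the six components could be collected as follows, with two translation components from camera positioning, the third from the depth distance, and the three Euler angles from the attitude calculation; X is assumed to be the approach direction.

```python
from dataclasses import dataclass


@dataclass
class Offset6D:
    x: float   # approach-direction translation (from depth distance correction)
    y: float   # in-image translation (from camera positioning)
    z: float   # in-image translation (from camera positioning)
    a: float   # Euler angle (from vehicle attitude calculation)
    b: float
    c: float


def assemble_offset(depth_dx, camera_dy, camera_dz, euler_abc):
    a, b, c = euler_abc
    return Offset6D(depth_dx, camera_dy, camera_dz, a, b, c)
```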
A. Camera positioning:
The camera positioning mainly comprises several stages: image acquisition, feature point extraction, feature point matching and optimization, pixel difference calculation and actual motion calculation.
Image acquisition: after reaching the photographing point, an image is acquired with the camera;
please refer to the extracted feature point effect shown in fig. 5;
Feature point extraction: feature points are extracted from the template picture and the working picture respectively, using image feature descriptors.
Feature point matching and optimization
Referring to the schematic effect of optimized matching after filtering shown in fig. 6, the two sets of feature points extracted in the previous step are first matched directly; because direct matching produces a large number of false matching pairs, a first screening is performed with a k-nearest-neighbour algorithm, and the matches are then screened again by calculating the fundamental matrix and homography matrix between the images, yielding a group of matching pairs with higher confidence.
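A minimal OpenCV sketch of this extract/match/screen pipeline; ORB descriptors, the 0.75 ratio test and the RANSAC thresholds are assumptions, since the patent only specifies image feature descriptors, k-nearest-neighbour screening and fundamental/homography matrix re-screening.

```python
import cv2
import numpy as np


def match_features(template_img, work_img):
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(template_img, None)
    kp2, des2 = orb.detectAndCompute(work_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    raw = matcher.knnMatch(des1, des2, k=2)
    good = []
    for pair in raw:                                   # k-NN ratio-test screening
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # re-screen with fundamental and homography matrices (error handling omitted)
    _, mask_f = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    _, mask_h = cv2.findHomography(pts1, pts2, cv2.RANSAC, 5.0)
    keep = (mask_f.ravel() == 1) & (mask_h.ravel() == 1)
    return pts1[keep], pts2[keep]
```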
Calculating pixel difference:
In the third step, n groups of matching pairs are obtained; for a feature point P_n1(X_n1, Y_n1) in the screened template image there is a corresponding point P_m1(X_m1, Y_m1) in the working image, and for this matching point pair the pixel difference is L_1 = sqrt((X_n1 - X_m1)^2 + (Y_n1 - Y_m1)^2).
Since the two pictures may be captured from different poses and spatial positions, the positions at which the object projects onto the image plane differ between the two shots, so the members L_1, L_2, L_3 ... L_k of the matching-point pixel difference group G_L are not all equal; only when the pixel differences of part of the matching point pairs are smaller than the threshold is the camera positioning stage considered complete, and the point laser attitude verification stage is then entered for attitude correction.
For the matching-point pixel difference group G_L, the values are sorted from small to large and then judged starting from the first position; if a value meets the condition it is taken as the resulting pixel difference, otherwise the next position is judged in turn. For the i-th value V_i it is judged whether it is smaller than the comparison minimum V_min; if so, the comparison value V_compare is set from the comparison minimum, otherwise the comparison value is set to V_compare = V_i * K, where K is an amplification factor. The proportion of values in the pixel difference group that are smaller than the comparison value V_compare relative to the total number is then calculated, and when this proportion is greater than the set threshold, this value is taken as the pixel difference result of the current positioning.
Similarly, when the attitude correction and the depth correction are completed, the spatial relationship of the camera relative to the vehicle cover at the photographing point is considered identical to that when the template was established, and the pixel difference between matching points is ideally 0. The variance of the pixel differences of all matching point pairs about 0 can therefore be used as a score to check the matching result, i.e. score = (1/n)·Σ L_i^2. When the score is less than the set threshold, the result is considered valid.
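The selection rule above is stated somewhat ambiguously; the sketch below implements one reading of it (candidates tested in ascending order against an amplified comparison value), together with the variance-about-zero score. The amplification factor and acceptance ratio are assumptions, not values given by the patent.

```python
import numpy as np


def select_pixel_diff(diffs, amplification=1.5, ratio_threshold=0.6):
    diffs = np.sort(np.asarray(diffs, dtype=float))
    for v in diffs:                                  # start from the smallest value
        v_compare = v * amplification                # V_compare = V_i * K (assumed rule)
        ratio = np.count_nonzero(diffs < v_compare) / diffs.size
        if ratio > ratio_threshold:
            return v                                 # pixel difference result
    return float(diffs[-1])                          # fallback: largest value


def match_score(diffs):
    diffs = np.asarray(diffs, dtype=float)
    return float(np.mean(diffs ** 2))                # variance of the diffs about 0
```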
Actual motion calculation:
when the template is calibrated, the actual distance L is measured according to the features on the vehicle cover real Pixel difference L image The initial motion proportion K can be calculated init =L real /L image
In step four, the calculated pixel difference L i From d x 、d y Two parts are formed, the robot is arranged in the image d x 、d y The corresponding motion change in the axial direction is K x d x 、K*d y And transmitting the value to the robot to finish the movement.
However, since the pose and the spatial position of the two image capturing positions are different, the scale of converting the pixel point into the actual distance will also change with the change of the actual distance between the camera and the inner cover feature, and therefore, in order to reduce the number of motion adjustment, the scale K needs to be corrected.
The correction method is as follows: the pixel difference measured last time is d_x1, d_y1, corresponding to actual movement distances L_x, L_y; the pixel difference measured this time is d_x2, d_y2; then K_x = L_x / (d_x1 - d_x2) and K_y = L_y / (d_y1 - d_y2). After the calculation is completed, K is checked to prevent abnormal values: when K > 0 and K < 2*K_init, K is considered valid, otherwise K = K_init.
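A minimal sketch of this per-axis ratio correction and validity check; it is applied independently to the x and y image directions.

```python
def update_scale(k_init, actual_motion, pixel_diff_before, pixel_diff_after):
    """K = actual movement / change in measured pixel difference, clamped to (0, 2*K_init)."""
    delta = pixel_diff_before - pixel_diff_after
    if delta == 0:
        return k_init                                 # no usable information, keep fallback
    k = actual_motion / delta
    return k if 0 < k < 2 * k_init else k_init        # reject abnormal values


# Example with illustrative numbers only.
k_x = update_scale(k_init=0.42, actual_motion=12.6, pixel_diff_before=30.0, pixel_diff_after=2.0)
```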
B. Vehicle attitude calculation:
For the set of robot attitude measurement points, the depth measurement point P_0(X_0, Y_0, Z_0) is used as reference; for the i-th attitude measurement point, its coordinate value in the robot coordinate system is P_i(X_0+dx_i, Y_0+dy_i, Z_0+dz_i). The offset differs each time the attitude is measured, i.e. X_0, Y_0, Z_0 are located differently each time. P_0 can therefore be taken as origin, the coordinate axis directions of the original robot coordinate system retained, and an observation coordinate system O_p constructed; in the observation coordinate system each observation point has coordinates P_pi(dx_i, dy_i, dz_i), which eliminates the influence of the P_0 point coordinates and simplifies the calculation.
Because the distance movement direction of approaching/leaving the vehicle is a fixed A_xis axis when the measurement points are selected in the robot XYZ directions, the value of the observation points on the A_xis axis does not change, i.e.
dx_1 = dx_2 = dx_3 = ... = dx_n = 0, or dy_1 = dy_2 = dy_3 = ... = dy_n = 0, or dz_1 = dz_2 = dz_3 = ... = dz_n = 0. In the observation coordinate system O_p, the laser distance values L_0, L_1, L_2, L_3 ... L_n of the measuring points can be used as coordinate values along the distance-direction coordinate axis A_xis1 to construct a point set P_pt on the vehicle cover. If the Y-axis is the selected axis, the coordinate value of the i-th mapping point on the cover surface in the coordinate system O_p is P_pti(dx_i, L_i, dz_i).
The coordinates of the point set P_pt on the vehicle cover in the coordinate system O_p are substituted into the spatial plane equation Ax + By + Cz + D = 0; solving the over-determined equation gives the vehicle cover plane equation coefficients A, B, C, D, where (A, B, C) is the normal vector of the plane.
The mapping point of the depth measurement point P_0 on the vehicle cover is denoted P_pt0 in the observation coordinate system O_p; a point P_axis is selected on any coordinate axis A_xis2 other than the direction axis A_xis1, and the point P'_axis obtained by projecting P_axis onto the plane along the coordinate axis direction A_xis1 is calculated.
With P_pt0 as origin, the direction of the vector from P_pt0 to P'_axis is the positive direction of the new coordinate axis A_xis2, and the plane normal vector in the direction away from the point P_pt0 is taken as the positive direction of the new coordinate axis A_xis3. The vehicle cover coordinate system O_car is constructed according to the right-hand rule.
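A simplified numpy sketch of fitting the hood plane and building the hood coordinate axes in the observation frame O_p. It assumes Y is the approach direction A_xis1, as in the example above; the axis labelling and sign conventions are illustrative rather than prescribed by the patent.

```python
import numpy as np


def fit_plane(points):
    """Least-squares fit of A*x + B*y + C*z + D = 0 to an n x 3 point set."""
    pts = np.asarray(points, dtype=float)
    M = np.hstack([pts, np.ones((len(pts), 1))])
    _, _, vt = np.linalg.svd(M)              # smallest singular vector solves M v ~ 0
    return vt[-1]                            # (A, B, C, D)


def hood_frame(points, p_pt0, p_axis, approach=(0.0, 1.0, 0.0)):
    """Columns of the returned matrix are the hood axes expressed in O_p."""
    A, B, C, D = fit_plane(points)
    n_vec = np.array([A, B, C])
    normal = n_vec / np.linalg.norm(n_vec)
    p_axis = np.asarray(p_axis, dtype=float)
    approach = np.asarray(approach, dtype=float)
    # project p_axis onto the plane along the approach direction A_xis1
    t = -(np.dot(n_vec, p_axis) + D) / np.dot(n_vec, approach)
    p_axis_proj = p_axis + t * approach
    axis2 = p_axis_proj - np.asarray(p_pt0, dtype=float)   # direction P_pt0 -> P'_axis
    axis2 /= np.linalg.norm(axis2)
    axis3 = normal                                          # plane normal direction
    axis1 = np.cross(axis2, axis3)                          # right-handed completion
    return np.column_stack([axis1, axis2, axis3])
```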
Fig. 7 shows the coordinate system relationships, with the robot base coordinate system O_rob as reference, between the observation coordinate system O_p beneath it and the vehicle cover coordinate system O_car.
In the observation coordinate system O_p, the three axis unit vectors representing the attitude of the observation coordinate system O_p itself are X(1, 0, 0), Y(0, 1, 0), Z(0, 0, 1); stacked as columns they form the matrix A = [[1, 0, 0], [0, 1, 0], [0, 0, 1]], i.e. the identity matrix.
In the observation coordinate system, the three axis unit vectors representing the attitude of the vehicle cover coordinate system O_car are X(x_1, y_1, z_1), Y(x_2, y_2, z_2), Z(x_3, y_3, z_3); stacked as columns they form the matrix B = [[x_1, x_2, x_3], [y_1, y_2, y_3], [z_1, z_2, z_3]].
The change from the attitude of coordinate system B to the attitude of coordinate system A can be represented by the rotation matrix R_B_A, with the relationship
R_B_A · B = A,
so that, with B^-1 the inverse of B, the rotation matrix is R_B_A = A · B^-1.
The physical meaning of the rotation matrix is the attitude relation of the inner cover of the vehicle relative to the robot observation coordinate system.
When the template is calibrated, the rotation matrix R_B_A_std of the template position relative to the observation coordinate system is obtained; in subsequent work, the rotation matrix R_B_A_work of the new vehicle position relative to the observation coordinate system is measured. The attitude change R_work_std between the new position where the vehicle stops during work and the template position is the attitude the robot needs to compensate; after compensation the relative attitude relation is the same as for the calibration template.
After the transformation matrix is obtained, the Euler angle change values (A, B, C) can be solved according to the Euler angle sequence form corresponding to the robot (such as zyx, xyz, etc.), and these values are compensated onto the robot trajectory.
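A minimal numpy/scipy sketch of this step: B_std and B_work stack the hood axis unit vectors as columns (as in the matrices above) and A is the identity, so each rotation is A·B^-1. The 'zyx' Euler sequence and the composition order used for R_work_std are assumptions that must match the robot controller's convention.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def compensation_euler(B_std, B_work, order="zyx"):
    A = np.eye(3)
    R_std = A @ np.linalg.inv(B_std)            # template attitude vs. observation frame
    R_work = A @ np.linalg.inv(B_work)          # current attitude vs. observation frame
    R_change = R_work @ np.linalg.inv(R_std)    # attitude change to compensate (one reading)
    return Rotation.from_matrix(R_change).as_euler(order, degrees=True)
```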
C. Depth-distance correction:
When the template is calibrated, the depth measurement point measures the distance to the vehicle cover as L_std; since this value is measured at actual scale, the initial motion ratio can be set to K_init = 1. During work, the depth measurement point measures the distance to the vehicle cover as L_work, and the robot compensates a movement distance d_l = (L_work - L_std) * K in the approach direction.
However, since the laser line cannot be adjusted to be completely perpendicular to the vehicle cover surface, when the robot moves in the approaching direction, the position of the laser spot on the vehicle cover slightly changes, and the measured distance changes are inconsistent with the movement value, and the proportional value K needs to be corrected.
The correction method comprises the following steps:
The depth of the last measurement is L_1, the corresponding actual movement distance is d_L, and the depth of this measurement is L_2; then K = d_L / (L_1 - L_2). After the calculation is completed, K is checked to prevent abnormal values: when K > 0 and K < 2*K_init, K is considered valid, otherwise K = K_init.
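A minimal sketch of this depth correction: the approach-direction compensation is (L_work - L_std)·K and the ratio is re-estimated from the last motion, with the same validity clamp as above.

```python
def depth_compensation(l_work, l_std, k):
    return (l_work - l_std) * k              # movement along the approach direction


def update_depth_ratio(k_init, d_l, l_prev, l_now):
    if l_prev == l_now:
        return k_init
    k = d_l / (l_prev - l_now)               # actual motion over measured depth change
    return k if 0 < k < 2 * k_init else k_init
```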
The foregoing description is only of the preferred embodiments of the invention, and all changes and modifications that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (9)

1. An automatic charging positioning control method for a new energy vehicle based on a point laser and a 2D camera, comprising a 2D camera and a point laser which are both mounted on a robot end tool and move with it, the method being characterized by comprising the following steps:
the vehicle locating stage specifically comprises the following steps:
(1.1) the robot moves step by step in a detection pose from the parking-space locating start point towards the other end, photographing and detecting after each movement;
(1.2) judging whether the vehicle inner cover is detected; if so, entering step (1.4); if not, stepping once by a fixed distance towards the locating end point;
(1.3) judging whether the locating end point has been reached; if so, no matching vehicle has been detected and the process ends; if not, returning to step (1.1);
(1.4) when the vehicle inner cover is present in the image, judging which vehicle template the inner cover matches, computing the matched template, calling it up, and setting it as the detection template used by the subsequent detection program;
(1.5) calculating a position compensation amount by combining the moving distance and the pixel deviation value;
the camera location stage specifically includes:
(2.1) moving the robot to a photographing point according to the calculated position compensation amount;
(2.2) photographing and detecting, and calculating pixel deviation values and new compensation amounts of the robot;
(2.3) judging whether the deviation is smaller than a threshold value, if not, returning to the step (2.1), and if so, entering the next step;
the vehicle posture correcting stage specifically comprises the following steps:
(3.1) moving to the specified measurement points according to the current compensation amount, acquiring data and calculating the attitude transformation;
the secondary positioning stage of the camera specifically comprises the following steps:
(4.1) the robot moves to the photographing point with the new compensation amount;
(4.2) photographing and detecting, and calculating pixel deviation values and new compensation amounts of the robot;
(4.3) judging whether the deviation is smaller than a threshold value, if not, returning to the step (4.1), and if so, entering the next step;
the vehicle distance correcting stage specifically comprises the following steps:
(5.1) the robot moves to the depth measurement point with the new compensation amount;
(5.2) measuring depth and calculating a depth difference from the template;
(5.3) judging whether the depth difference is smaller than a threshold value; if not, returning to step (5.1), and if so, entering the next step;
finally, judging whether the number of cycles has reached the threshold; if not, returning to step (2.1); if so, ending.
2. The method for controlling automatic charging and positioning of a new energy vehicle based on point laser and 2D camera according to claim 1, wherein the step (2.2) further comprises: converting the image pixel difference into an actual robot movement distance using the pixel-to-actual-distance ratio k1, and moving the robot by that value; after the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new pixel-to-actual-distance ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation; the process is repeated until the difference is smaller than the set threshold, after which the vehicle posture correction stage is entered.
3. The method for automatic charging and positioning control of a new energy vehicle based on point laser and 2D camera according to claim 1, wherein the step (3.1) further comprises: on the basis of the measured offset compensation, the method moves to a plurality of measuring points with fixed point spacing, and depth data of the measuring points are measured in sequence; and calculating the spatial attitude change of the vehicle according to the result, and transmitting the spatial attitude change to the robot.
4. The method for automatic charging and positioning control of a new energy vehicle based on point laser and 2D camera according to claim 1, wherein the step (4.2) further comprises: converting the image pixel difference into an actual robot movement distance using the pixel-to-actual-distance ratio k1, and moving the robot by that value; after the movement, photographing and calculation are repeated to obtain a new feature pixel difference, and a new pixel-to-actual-distance ratio k1 is calculated from the actual movement value and the change in pixel difference before and after the movement and substituted into the next calculation.
5. The method for automatic charging and positioning control of a new energy vehicle based on point laser and 2D camera according to claim 1, wherein the step (5.2) further comprises:
collecting the current depth value and comparing it with the value collected during template calibration; the difference, converted by the set movement ratio k2, drives the robot to move back or forth along the direction facing the vehicle; measuring again after the movement, and if the measured difference is smaller than the set threshold, considering this stage complete; otherwise, correcting the movement ratio k2 from the depth differences measured before and after the movement and substituting it into the next calculation.
6. A vehicle attitude calculation method for a new energy vehicle using a point laser and a 2D camera, characterized by comprising the following steps:
the camera location specifically includes:
(1.1) acquiring an image by using a camera after reaching a photographing point;
(1.2) feature point extraction: respectively extracting characteristic points in a template picture and an operation acquisition picture by adopting an image characteristic descriptor mode;
(1.3) feature point matching and optimization: performing a primary matching of the two groups of feature points extracted in step (1.2); because direct matching produces a large number of false matching pairs, a primary matching screening is carried out with a k-nearest-neighbour algorithm; the matches are then screened again by calculating the fundamental matrix and homography matrix between the images, obtaining a group of matching pairs with higher confidence;
(1.4) pixel difference calculation: n groups of matching pairs are obtained in step (1.3); for a feature point P_n1(X_n1, Y_n1) in the screened template image there is a corresponding point P_m1(X_m1, Y_m1) in the operation acquisition image, and for this matching point pair the pixel difference is L_1 = sqrt((X_n1 - X_m1)^2 + (Y_n1 - Y_m1)^2);
(1.5) actual motion calculation: when the template is calibrated, an initial motion ratio K_init = L_real / L_image is calculated from the actual distance L_real of the features on the vehicle cover and the pixel difference L_image;
vehicle attitude calculation specifically includes:
(2.1) taking P_0 as origin and retaining the coordinate axis directions of the original robot coordinate system, constructing an observation coordinate system O_p; in the observation coordinate system the coordinates of each observation point are P_pi(d_xi, d_yi, d_zi), eliminating the influence of the P_0 point coordinates;
(2.2) in the observation coordinate system O_p, the laser distance values L_0, L_1, L_2, L_3 ... L_n of the measuring points can be used as coordinate values along the distance-direction coordinate axis A_xis1 to construct a point set P_pt on the vehicle cover; if the Y-axis is the selected axis, the coordinate value of the i-th mapping point on the cover surface in the coordinate system O_p is P_pti(d_xi, L_i, d_zi);
(2.3) substituting the coordinates of the point set P_pt on the vehicle cover in the coordinate system O_p into the spatial plane equation Ax + By + Cz + D = 0; solving the over-determined equation gives the vehicle cover plane equation coefficients A, B, C, D, where (A, B, C) is the normal vector of the plane;
(2.4) the mapping point of the depth measurement point P_0 on the vehicle cover is denoted P_pt0 in the observation coordinate system O_p; a point P_axis is selected on any coordinate axis A_xis2 other than the direction axis A_xis1, and the point P'_axis obtained by projecting P_axis onto the plane along the coordinate axis direction A_xis1 is calculated;
(2.5) with P_pt0 as origin, the direction of the vector from P_pt0 to P'_axis is the positive direction of the new coordinate axis A_xis2, and the plane normal vector in the direction away from the point P_pt0 is taken as the positive direction of the new coordinate axis A_xis3; the vehicle cover coordinate system O_car is constructed according to the right-hand rule;
(2.6) in the observation coordinate system O_p, the three axis unit vectors representing the attitude of the observation coordinate system O_p itself are X(1, 0, 0), Y(0, 1, 0), Z(0, 0, 1); stacked as columns they form the matrix A = [[1, 0, 0], [0, 1, 0], [0, 0, 1]], i.e. the identity matrix;
in the observation coordinate system, the three axis unit vectors representing the attitude of the vehicle cover coordinate system O_car are X(x_1, y_1, z_1), Y(x_2, y_2, z_2), Z(x_3, y_3, z_3); stacked as columns they form the matrix B = [[x_1, x_2, x_3], [y_1, y_2, y_3], [z_1, z_2, z_3]];
the change from the attitude of coordinate system B to the attitude of coordinate system A can be represented by the rotation matrix R_B_A, with the relationship
R_B_A · B = A,
so that, with B^-1 the inverse of B, the rotation matrix is
R_B_A = A · B^-1;
The physical meaning of the rotation matrix is the attitude relation of the inner cover of the vehicle relative to the robot observation coordinate system;
(2.7) when the template is calibrated, the rotation matrix R_B_A_std of the template position relative to the observation coordinate system is obtained; in subsequent work, the rotation matrix R_B_A_work of the new vehicle position relative to the observation coordinate system is measured; the attitude change R_work_std between the new position where the vehicle stays during work and the template position is the attitude that the robot needs to compensate, and after this attitude is compensated the relative attitude relation is the same as that of the calibration template;
After the transformation matrix is obtained, the Euler angle change values (A, B, C) can be solved according to the Euler angle sequence convention of the robot, and these values are compensated into the robot trajectory (a sketch of steps (2.3)–(2.7) follows this list);
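As a hedged illustration of steps (2.3)–(2.7), the sketch below fits the hood plane by homogeneous least squares, builds a hood coordinate frame from P_pt0 and P_axis, and extracts Euler angles from the resulting rotation matrix. The axis labelling inside the hood frame, the normal-orientation rule, and the Z-Y-X Euler convention are assumptions; the patent only fixes the overall procedure, and the real Euler sequence depends on the robot.

```python
import numpy as np

def hood_pose(points, p_pt0, p_axis, p0=None):
    """Sketch of steps (2.3)-(2.6): fit the hood plane, build the hood frame,
    and return its rotation relative to the observation frame O_p.
    points: N x 3 point set P_pt on the hood; p_pt0, p_axis, p0: 3-vectors in O_p.
    """
    # (2.3) homogeneous least squares for A x + B y + C z + D = 0 (overdetermined system).
    M = np.hstack([points, np.ones((len(points), 1))])
    A, B, C, D = np.linalg.svd(M)[2][-1]
    normal = np.array([A, B, C])
    normal /= np.linalg.norm(normal)
    if p0 is not None and np.dot(normal, p_pt0 - p0) < 0:
        normal = -normal          # orient the normal away from P_0 (one reading of step 2.5)

    # (2.5) hood frame: origin P_pt0, one axis along P_pt0 -> P_axis, one along the normal.
    a2 = p_axis - p_pt0
    a2 -= np.dot(a2, normal) * normal   # project out any residual normal component
    a2 /= np.linalg.norm(a2)
    a3 = normal
    a1 = np.cross(a2, a3)               # completes a right-hand system (axis labelling illustrative)

    # (2.6) B holds the hood-frame axis unit vectors (as columns) in O_p; the observation
    # frame is the identity, so R_B_A = A @ inv(B) reduces to inv(B).
    Bmat = np.column_stack([a1, a2, a3])
    return np.linalg.inv(Bmat)

def euler_zyx_deg(R):
    """One possible Euler extraction (Z-Y-X); the actual sequence depends on the robot."""
    ay = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    az = np.arctan2(R[1, 0], R[0, 0])
    ax = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([az, ay, ax])
```

At calibration time the rotation returned here would be stored as R_B_A_std; at run time R_B_A_work is computed the same way, and under an assumed composition order the compensation rotation could be formed as R_B_A_work @ np.linalg.inv(R_B_A_std).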
the depth distance correction specifically comprises the following steps:
(3.1) when the template is calibrated, the depth measurement point measures the vehicle-cover distance L_std; since this value is a measurement of the actual dimension, the initial motion ratio can be set to K_init = 1; during work, the depth measurement point measures the vehicle-cover distance L_work, and the robot compensates the movement distance d_l = (L_work − L_std) * K along the approach direction;
(3.2) because the laser line cannot be adjusted to be perfectly perpendicular to the vehicle-cover surface, the position of the laser spot on the cover changes slightly when the robot moves along the approach direction, so the measured distance change is not consistent with the movement value and the ratio K needs to be corrected; the correction method is as follows:
the depth of the last measurement is L_1, corresponding to the actual movement distance d_L, and the depth of the current measurement is L_2; the corrected ratio K is then calculated from these values;
after the calculation is completed, K is checked to prevent abnormal values: when K > 0 and K < 2*K_init, K is considered valid; otherwise K = K_init (a minimal sketch of this update follows these steps);
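A minimal sketch of the depth-ratio update in steps (3.1)–(3.2) is given below; the update formula K = d_L / (L_1 − L_2) is an assumption inferred from the role of the ratio rather than a formula quoted from the claim, while the validity window (0, 2*K_init) is taken from step (3.2).

```python
def corrected_depth_ratio(L1, L2, d_L, K_init=1.0):
    """Update the depth motion ratio K after moving d_L along the approach direction.

    L1 is the previous depth reading, L2 the current one. The formula below is an
    assumed reading of step (3.2); the validity check comes from the claim text.
    """
    delta = L1 - L2
    if delta == 0:
        return K_init              # no measurable change; keep the previous ratio
    K = d_L / delta
    return K if 0 < K < 2 * K_init else K_init
```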
7. The method of calculating the pose of a new energy vehicle by a point laser and a 2D camera according to claim 6, wherein in step (1.4), since the poses and spatial positions of the two image-capturing positions may differ, different objects project to different positions on the image plane, so the members L_1, L_2, L_3 … L_k of the pixel difference group G_L for the matching points will not all have equal pixel differences; only when the pixel differences of a sufficient portion of the matching point pairs are smaller than a threshold is the camera positioning stage considered complete, and the point-laser attitude verification stage is entered for attitude correction;
for the matching-point pixel difference group G_L, the values in the group are sorted from small to large; starting from the first position, each value is judged in turn, and if it is not accepted as the resulting pixel difference the position index is advanced; for the i-th value V_i it is judged whether it is smaller than the comparison minimum value V_min; if it is smaller than the comparison value V_compare the comparison minimum value is updated, otherwise the comparison value is set to V_compare = V_i * K, where K is an amplification factor; the ratio of the number of values in the pixel difference group smaller than the comparison value V_compare to the total number is then calculated, and when this ratio is larger than a set threshold, that value is taken as the pixel difference result of the current positioning;
when the attitude correction and the depth correction are completed, the spatial relationship of the camera relative to the vehicle cover at the photographing point is considered identical to that when the template was established, and ideally the pixel difference between the matching points is 0; the variance of the pixel differences of all matching point pairs about 0 can therefore be used as a score to check the matching result, and when the score is less than the set threshold the result is considered valid.
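The selection rule in this claim is only partly recoverable from the translation; the sketch below follows the recoverable structure (sort the group, grow a comparison value V_compare = V_i * K, accept once a large enough share of the group lies below it) and adds the variance-style score. The amplification factor and thresholds are placeholders, not values from the patent.

```python
import numpy as np

def select_pixel_difference(diffs, amplify=1.5, ratio_threshold=0.6):
    """Pick a representative pixel difference from group G_L (claim 7 sketch).

    diffs: per-pair pixel-difference magnitudes; 'amplify' plays the role of the
    amplification factor K and 'ratio_threshold' the set threshold.
    """
    values = np.sort(np.asarray(diffs, dtype=float))
    n = len(values)
    for v in values:
        v_compare = v * amplify
        share = np.count_nonzero(values < v_compare) / n
        if share > ratio_threshold:
            return v               # pixel difference result of the current positioning
    return values[-1]              # fallback if no value gathers enough of the group

def match_score(dx, dy):
    """Variance of the pixel differences about 0, used to validate the final result."""
    d2 = np.asarray(dx) ** 2 + np.asarray(dy) ** 2
    return float(np.mean(d2))
```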
8. The method of calculating the pose of a new energy vehicle by a point laser and a 2D camera according to claim 6, wherein in step (1.4) the calculated pixel difference L_i consists of two components d_x and d_y; the corresponding motion changes of the robot along the image d_x and d_y axis directions are K*d_x and K*d_y, and these values are transmitted to the robot to complete the movement; however, since the pose and spatial position of the two image-capturing positions differ, the scale for converting a pixel into an actual distance also changes with the actual distance between the camera and the inner-cover feature; therefore, to reduce the number of motion adjustments, the scale K needs to be corrected.
9. The method of calculating the pose of a new energy vehicle by a point laser and a 2D camera according to claim 8, wherein the correcting includes: the pixel differences measured last time are d_x1, d_y1, corresponding to the actual movement distances L_x, L_y; the pixel differences measured this time are d_x2, d_y2; the corrected scale is then calculated per axis from these values; after the calculation is completed, K is checked to prevent abnormal values: when K > 0 and K < 2*K_init, K is considered valid; otherwise K = K_init.
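For claims 8 and 9, the per-axis scale correction can be sketched as follows; the formula K = L / (d1 − d2) for each axis is an assumption inferred from the description rather than a formula quoted from the claim, while the (0, 2*K_init) validity check is taken from the claim text.

```python
def corrected_scale(d_x1, d_y1, d_x2, d_y2, L_x, L_y, K_init):
    """Per-axis correction of the pixel-to-distance scale (claims 8-9 sketch)."""
    def axis_ratio(L, d1, d2):
        delta = d1 - d2
        if delta == 0:
            return K_init          # no measurable change; keep the previous scale
        K = L / delta
        return K if 0 < K < 2 * K_init else K_init

    return axis_ratio(L_x, d_x1, d_x2), axis_ratio(L_y, d_y1, d_y2)
```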
CN202010258584.8A 2020-04-03 2020-04-03 Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle Active CN111459176B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010258584.8A CN111459176B (en) 2020-04-03 2020-04-03 Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle

Publications (2)

Publication Number Publication Date
CN111459176A (en) 2020-07-28
CN111459176B (en) 2023-09-01

Family

ID=71685895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010258584.8A Active CN111459176B (en) 2020-04-03 2020-04-03 Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle

Country Status (1)

Country Link
CN (1) CN111459176B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112170124B (en) * 2020-09-29 2021-12-14 广汽本田汽车有限公司 Visual positioning method and device for vehicle body and vehicle frame

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11205283B2 (en) * 2017-02-16 2021-12-21 Qualcomm Incorporated Camera auto-calibration with gyroscope

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107886545A (en) * 2017-11-03 2018-04-06 西安航天精密机电研究所 Electric automobile changes vision system, scaling method and the battery localization method of electricity
CN108508897A (en) * 2018-04-20 2018-09-07 杭州蓝芯科技有限公司 A kind of robot automatic charging alignment system and method for view-based access control model
CN109727290A (en) * 2018-12-26 2019-05-07 南京理工大学 Zoom camera dynamic calibrating method based on monocular vision triangle telemetry
CN209776188U (en) * 2019-01-21 2019-12-13 河南埃尔森智能科技有限公司 Unmanned charging system of car based on 3D vision technique
CN109827507A (en) * 2019-01-22 2019-05-31 上海蔚来汽车有限公司 Method for electrically is changed away from the vision positioning of camera based on fixed-focus and changes electric system
CN109977954A (en) * 2019-04-01 2019-07-05 上海电气集团股份有限公司 The identification of electric vehicle charge interface and localization method and system
CN110340887A (en) * 2019-06-12 2019-10-18 西安交通大学 A method of the oiling robot vision guide based on image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王超. Research on laser scanning positioning technology for the pose of electric vehicle charging ports. China Masters' Theses Full-text Database (Electronic Journal). 2020, Chapter 3: Design of the charging-port pose scanning laser positioning system. *

Similar Documents

Publication Publication Date Title
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN106457562B (en) Method and robot system for calibration machine people
CN108109174B (en) Robot monocular guidance method and system for randomly sorting scattered parts
CN110238845B (en) Automatic hand-eye calibration method and device for optimal calibration point selection and error self-measurement
JP6180087B2 (en) Information processing apparatus and information processing method
JP4021413B2 (en) Measuring device
JP5992184B2 (en) Image data processing apparatus, image data processing method, and image data processing program
US8520067B2 (en) Method for calibrating a measuring system
CN111595333A (en) Modularized unmanned vehicle positioning method and system based on visual inertial laser data fusion
CN110146099B (en) Synchronous positioning and map construction method based on deep learning
JP6165745B2 (en) Calibration method for on-board computer-based vision system
CN110136204B (en) Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera
US20220230348A1 (en) Method and apparatus for determining a three-dimensional position and pose of a fiducial marker
JPH06131420A (en) Method and device for supporting construction
JP6626338B2 (en) Information processing apparatus, control method for information processing apparatus, and program
CN114519738A (en) Hand-eye calibration error correction method based on ICP algorithm
CN111459176B (en) Automatic charging positioning control method, calibration method and vehicle attitude calculation method for vehicle
CN114347008A (en) Industrial robot-based method and device for grabbing workpieces out of order and intelligent terminal
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
JP2003089086A (en) Robot controller
CN111899303B (en) Novel feature matching and relative positioning method considering space inverse projection constraint
JP2002046087A (en) Three-dimensional position measuring method and apparatus, and robot controller
CN111145267A (en) IMU (inertial measurement unit) assistance-based 360-degree panoramic view multi-camera calibration method
CN113459841B (en) Automatic charging control method and device based on uncalibrated binocular vision
CN112584041B (en) Image identification dynamic deviation rectifying method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant