CN109283889B - Visual positioning compensation method - Google Patents
- Publication number
- CN109283889B CN109283889B CN201811108550.XA CN201811108550A CN109283889B CN 109283889 B CN109283889 B CN 109283889B CN 201811108550 A CN201811108550 A CN 201811108550A CN 109283889 B CN109283889 B CN 109283889B
- Authority
- CN
- China
- Prior art keywords
- camera
- calculating
- proportion
- rotation
- translation
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/404—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for compensation, e.g. for backlash, overshoot, tool offset, tool wear, temperature, machine construction errors, load, inertia
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/34—Director, elements to supervisory
- G05B2219/34005—Motion control chip, contains digital filter as control compensator
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to the technical field of automatic positioning, in particular to a visual positioning compensation method. The method uses a rotating platform and at least two cameras and comprises the following steps. Step 1: the cameras scan the product on the rotating platform and extract the contour line of the product. Step 2: the translation proportion and the rotation proportion of each camera with respect to the rotating platform are calculated. Step 3: the product contour line extracted by each camera is converted into translation XY and rotation Angle according to the translation proportion and the rotation proportion. Step 4: integral compensation and station compensation are carried out according to the translation XY and the rotation Angle. The invention can rapidly calculate the difference between the actual product position and the theoretical position and compensate through the calculated difference, thereby achieving the purpose of positioning compensation.
Description
Technical Field
The invention relates to the technical field of automatic positioning, in particular to a visual positioning compensation method.
Background
Currently, industrial products are of many types, and automatic positioning may be involved in their production, for example when printing and attaching on an automatic production line, where all parts must be installed in place quickly and accurately.
However, some processing plants still adopt the traditional manual or mechanical positioning methods, which have the following defects: (1) positioning a single product takes a long time, and the inspector's eyes fatigue easily and are affected by mood, so the positioning result is hard to guarantee; (2) mechanical positioning requires high mechanical precision and is complex to operate.
With the development of science and technology, the requirements of the market and of users on product precision and quality are ever higher; the traditional positioning methods can no longer meet them, so improving positioning precision is a problem the industry urgently needs to solve.
Disclosure of Invention
The present invention provides a visual positioning compensation method for solving the above technical problems.
In order to solve the technical problems, the invention adopts the following technical scheme: a visual positioning compensation method comprises a rotating platform and at least two cameras, and further comprises the following steps:
step 1: the camera scans the product of the rotating platform and extracts the contour line of the product;
step 2: calculating the translation proportion and the rotation proportion of each camera and the rotating platform;
step 3: converting the product contour line extracted by each camera into translation XY and rotation Angle according to the translation proportion and the rotation proportion;
step 4: carrying out integral compensation and station compensation according to the translation XY and the rotation Angle.
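The four steps above can be read as a pipeline. The sketch below is illustrative only — the patent describes a procedure, not an implementation — so each stage is injected as a callable and all names and data shapes are assumptions:

```python
def visual_positioning_compensation(scan, calibrate, convert, compensate):
    """Pipeline sketch of steps 1-4; camera and platform specifics are
    supplied by the caller so the flow itself stays visible."""
    contours = scan()                      # step 1: scan product, extract contour lines
    ratios = calibrate()                   # step 2: translation/rotation proportions
    xy, angle = convert(contours, ratios)  # step 3: contour lines -> translation XY, Angle
    return compensate(xy, angle)           # step 4: overall and station compensation
```

In a real system `scan` would drive the cameras and `compensate` the motion axes; here they are stubs.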
Further, the specific steps of calculating the translation and rotation ratio of each camera to the rotating platform are as follows:
step 2.1: presetting the values of moveX, moveY and roAngle;
step 2.2: moving the rotary platform to the original point position to acquire a reference line;
step 2.3: controlling the rotary platform to move along the X-axis direction, acquiring characteristics, and calculating the moving xScale proportion;
step 2.4: controlling the rotary platform to move along the direction of the Y axis, acquiring characteristics, and calculating the moving yScale proportion;
step 2.5: controlling the rotating platform to rotate, driven by the rotating shaft, acquiring the characteristics, and calculating the roScale movement proportion.
Further, the specific steps of controlling the rotary platform to move along the X-axis direction, acquiring the characteristics, and calculating the moving xScale ratio are as follows:
step 2.3.1: calculating an intersection baseMidPt1 of a reference line before the rotary platform moves along the direction of the X axis and the central line of the camera;
step 2.3.2: calculating an intersection point curMidPt1 of the reference line and the central line after the rotary platform moves;
step 2.3.3: the formula of the X movement proportion calculation is as follows:
further, the method comprises the specific steps of controlling the rotating platform to move along the Y-axis direction and acquiring the characteristics, and calculating the moving yScale ratio:
step 2.4.1: calculating an intersection baseMidPt2 of a reference line before the rotary platform moves along the Y-axis direction and the center line of the camera;
step 2.4.2: calculating an intersection point curMidPt2 of the reference line and the central line after the rotary platform moves;
step 2.4.3: the formula for calculating the Y movement proportion is as follows:
further, the method comprises the following specific steps of controlling the rotation of the rotating platform and obtaining characteristics, and calculating the movement proportion of the robScale:
step 2.5.1: calculating the intersection point baseMidPt3 of the reference line before the rotation of the rotary platform and the center line of the camera;
step 2.5.2: calculating an intersection point curMidPt3 of the reference line and the central line after the rotation of the rotary platform;
step 2.5.3: the equation for the angle moving proportion is as follows:
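Steps 2.3 to 2.5 share one pattern: command a known platform motion (moveX, moveY or roAngle), measure how far the reference-line/centre-line intersection shifts in the image, and divide. The ratio formulas themselves are images in the original document, so the sketch below is one plausible reading under that assumption; all function names are illustrative:

```python
import math

def shift(before, after):
    """Distance the reference-line/centre-line intersection moved,
    e.g. from baseMidPt1 to curMidPt1."""
    return math.hypot(after[0] - before[0], after[1] - before[1])

def move_ratio(before, after, commanded):
    """Generic movement ratio: observed image shift per unit of
    commanded platform motion. With the X-move pair and moveX this
    yields xScale; with the Y-move pair and moveY, yScale; with the
    rotation pair and roAngle, roScale."""
    return shift(before, after) / commanded
```

One ratio is computed per camera, since each camera sees the platform motion at a different scale.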
further, the specific steps of converting the product contour line extracted by each camera into translation XY and rotation Angle according to the translation proportion and the rotation proportion are as follows:
step 3.1: positioning the product contour line extracted by each camera;
step 3.2: calculating the intersection point baseMidPt4 of the datum line and the central line of each camera;
step 3.3: calculating an intersection curMidPt4 of the product contour line extracted by each camera and the central line;
step 3.4: calculating the distance Dis between curMidPt4 and baseMidPt4 of each camera;
step 3.5: the rotation angle A, Y axis offset value offectY and the X axis offset value offectX for each camera are calculated.
Further, the specific steps of performing overall compensation and station compensation according to the translation XY and the rotation Angle comprise:
step 4.1: according to actual product requirements, a user inputs data X1, Y1, Y2 and a configured product length proLen to obtain compensation data X on an X axis, compensation data Y on a Y axis and angle compensation data A.
Further, the method also comprises step 5: mechanical compensation.
Further, the rotating platform is a turntable.
The invention has the beneficial effects that it can rapidly calculate the difference between the actual product position and the theoretical position and compensate through the calculated difference, thereby achieving the purpose of positioning compensation.
Drawings
Fig. 1 is a schematic view of the installation position of the camera according to the present invention.
FIG. 2 is a schematic representation of the movement of the rotating platform of the present invention in the direction of the X-axis and the acquisition of features.
FIG. 3 is a schematic representation of the movement of the rotating platform of the present invention in the direction of the Y-axis and capture features.
Fig. 4 is a schematic view of the rotating platform of the present invention rotated by a rotating shaft and acquiring features.
FIG. 5 is a schematic diagram of case 1 of Y-axis compensation and angle according to the present invention.
FIG. 6 is a schematic diagram of case 2 of Y-axis compensation and angle according to the present invention.
FIG. 7 is a schematic diagram of case 3 of Y-axis compensation and angle according to the present invention.
Reference numerals: 1-a rotating platform; 2-camera.
Detailed Description
In order to facilitate understanding by those skilled in the art, the present invention is further described below with reference to the examples and the attached drawings, which are not intended to limit it.
The invention provides a visual positioning compensation method comprising a rotating platform 1 and at least two cameras 2. In this embodiment the number of cameras 2 is 5, and their installation positions above the rotating platform 1 are shown in figure 1; each point on the rotating platform 1 corresponds to an XY-axis coordinate. The method further comprises the following steps:
step 1: the camera 2 scans the product of the rotating platform 1 and extracts a contour line;
step 2: calculating the translation and rotation proportion of each camera 2 to the rotating platform 1;
step 3: converting the product contour line extracted by each camera 2 into translation XY and rotation Angle according to the translation proportion and the rotation proportion;
step 4: carrying out integral compensation and station compensation according to the translation XY and the rotation Angle.
In the visual positioning compensation method described in this embodiment, the specific steps of calculating the translation and rotation ratio of each camera 2 and the rotating platform 1 are as follows:
step 2.1: presetting the values of moveX, moveY and roAngle; when rotating, the product edge must remain within the image range of the 5 cameras 2;
step 2.2: the rotating platform 1 moves to the original point position to obtain a datum line;
step 2.3: controlling the rotary platform to move along the X-axis direction, acquiring characteristics, and calculating the moving xScale proportion;
step 2.4: controlling the rotary platform to move along the direction of the Y axis, acquiring characteristics, and calculating the moving yScale proportion;
step 2.5: controlling the rotating platform to rotate, driven by the rotating shaft, acquiring the characteristics, and calculating the roScale movement proportion.
As shown in fig. 2, in the visual positioning compensation method according to this embodiment, the specific steps of controlling the rotation platform to move along the X-axis direction and acquiring the features include:
step 2.3.1: calculating an intersection baseMidPt1 of a reference line before the rotary platform moves along the direction of the X axis and the central line of the camera;
step 2.3.2: calculating an intersection point curMidPt1 of the reference line and the central line after the rotary platform moves;
step 2.3.3: the formula of the X movement proportion calculation is as follows:
the method comprises the following steps of acquiring features, wherein the acquiring features include but are not limited to printing acquisition and reference positioning acquisition, and specifically, when a printed product is positioned, the acquired features are contour edges of the product; in the reference positioning, the whole product features are obtained by referring to the reference features to perform comparison and adjustment.
As shown in fig. 3, in the visual positioning compensation method according to this embodiment, the step of controlling the rotation platform to move along the Y-axis direction and obtain the features includes the specific steps of:
step 2.4.1: calculating an intersection baseMidPt2 of a reference line before the rotary platform moves along the Y-axis direction and the center line of the camera;
step 2.4.2: calculating an intersection point curMidPt2 of the reference line and the central line after the rotary platform moves;
step 2.4.3: the formula for calculating the Y movement proportion is as follows:
as shown in fig. 4, in the visual positioning compensation method according to this embodiment, the specific steps of controlling the rotation of the rotation platform and obtaining the features and calculating the robcale movement ratio include:
step 2.5.1: calculating the intersection point baseMidPt3 of the reference line before the rotation of the rotary platform and the center line of the camera;
step 2.5.2: calculating an intersection point curMidPt3 of the reference line and the central line after the rotation of the rotary platform;
step 2.5.3: the equation for the angle moving proportion is as follows:
the 5 cameras 2 are named camera No. 1, camera No. 2, camera No. 3, camera No. 4, and phase No. 5, respectively. Wherein, the difference value of the central point X is taken by the No. 3 camera and the No. 5 camera, and the difference value of the central point Y is taken by the No. 1 camera, the No. 2 camera and the No. 4 camera.
In the visual positioning compensation method described in this embodiment, the specific steps of converting the product contour line extracted by each camera into translation XY and rotation Angle according to the translation proportion and the rotation proportion are as follows:
step 3.1: positioning the product contour line extracted by each camera;
step 3.2: calculating the intersection point baseMidPt4 of the datum line and the central line of each camera;
step 3.3: calculating an intersection curMidPt4 of the product contour line extracted by each camera and the central line;
step 3.4: calculating the distance Dis between curMidPt4 and baseMidPt4 of each camera;
step 3.5: the rotation angle A, Y axis offset value offectY and the X axis offset value offectX for each camera are calculated.
The following is one of the specific calculation processes in this embodiment. Assuming the X-direction distance is tmpDisX, the corresponding displacements of camera No. 1, camera No. 2 and camera No. 4 are reductionDis1X, reductionDis2X and reductionDis4X.
The values of moveX, moveY and roAngle are preset by the user and are used for calculating the translation proportion and the rotation proportion. Let Dis be the distance between the reference point and the intersection of a line extracted by the camera 2 with the central line; the formula Dis = X1 × XS + Y1 × YS + Angle × AS can then be deduced, where X1, Y1 and Angle are the motion data of the rotating platform 1, XS is the moving xScale proportion of the corresponding camera 2 (the X motion proportion), YS is the moving yScale proportion (the Y motion proportion), and AS is the roScale proportion (the rotation motion proportion).
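The deduced relation Dis = X1 × XS + Y1 × YS + Angle × AS is linear in the platform motion. A one-line sketch, with names following the text:

```python
def predicted_dis(x1, y1, angle, xs, ys, as_):
    """Distance a camera's intersection point moves for platform motion
    (X1, Y1, Angle), per Dis = X1*XS + Y1*YS + Angle*AS; xs, ys, as_
    are that camera's xScale, yScale and roScale proportions."""
    return x1 * xs + y1 * ys + angle * as_
```

With five cameras this gives five such equations, which is what lets the later steps separate the X, Y and rotation contributions.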
Translation gives the following equations (where Dis3 refers to the reference-point distance of camera No. 3 and Dis5 to that of camera No. 5):
reductionDis1X=xScaleCam1×tmpDisX;
reductionDis2X=xScaleCam2×tmpDisX;
reductionDis4X=xScaleCam4×tmpDisX.
therefore, the following formula can be obtained by translation, and the current intersection point distance of the camera 1, the camera 2 and the camera 4 is subtracted by the X-direction distance tmpdivx, so that the displacements of the camera 1, the camera 2 and the camera 4 caused by the corresponding operations are respectively:
afterReductionDis1Y=Dis1-reductionDis1X;
afterReductionDis2Y=Dis2-reductionDis2X;
afterReductionDis4Y=Dis4-reductionDis4X.
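The three reduction equations all subtract a camera's X-induced component from its measured distance, leaving the Y-direction part. Sketched generically, with the Dis/xScaleCam/tmpDisX names taken from the text:

```python
def after_reduction_y(dis, x_scale_cam, tmp_dis_x):
    """afterReductionDisY = Dis - xScaleCam * tmpDisX: remove the part
    of a camera's intersection-point distance caused by the X-direction
    move tmpDisX, so that only the Y-direction contribution remains."""
    return dis - x_scale_cam * tmp_dis_x
```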
the X-axis movement ratios of the 5 cameras 2 are: xscale cam1, xscale cam2, xscale cam3, xscale cam4, and xscale cam 5.
The Y-axis movement ratios of the 5 cameras 2 are: yScaleCam1, yScaleCam2, yScaleCam3, yScaleCam4, and yScaleCam 5.
The rotation ratios of the 5 cameras 2 are: rotayscale 1, rotayscale 2, rotaxscale 3, rotayscale 4, and rotaxscale 5.
Calculating the rotation angle A: the movement value of camera No. 1 is converted into the movement value of camera No. 2 as:
The rotation scale of camera No. 1 converted into that of camera No. 2 is:
calculating a Y-axis offset value offectY (averaging the Y-direction motion amounts corresponding to the camera No. 1, the camera No. 2 and the camera No. 4 to obtain a final Y-direction motion amount):
calculating the X-axis deviation value (obtaining the platform X-direction movement amount corresponding to the No. 3 camera and the No. 5 camera): average value offectX of camera No. 3 and camera No. 5 after removing influence values caused by Y-axis movement and rotation:
in the visual positioning compensation method of this embodiment, the specific steps of performing the overall compensation and the station compensation according to the translation XY and the rotation Angle include:
step 4.1: according to actual product requirements, a user inputs data X1, Y1, Y2 and a configured product length proLen to obtain compensation data X on the X axis, compensation data Y on the Y axis and angle compensation data A.
The compensation data X on the X axis is the input X; for the compensation data Y on the Y axis and the angle compensation data A there are the following cases:
case 1 is shown in fig. 5: when |Y1| = |Y2|, then A = 0;
case 2 is shown in fig. 6: when the value of Y1 is not equal to the value of Y2 and Y1 is a negative number, the angle is negated;
case 3 is shown in fig. 7: Y = 0; when Y1 is 0 and Y2 is greater than zero, the angle value is negated; when Y2 is 0 and Y1 is less than zero, the angle value is negated.
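The cases above govern only the sign handling of the angle compensation; the angle formula itself is an image in the original. The branch sketch below is a reconstruction of partly garbled text and may differ from the patent's intent — `angle` is assumed to be the precomputed magnitude:

```python
def adjust_angle(y1, y2, angle):
    """Sign handling for the angle compensation data A (figs. 5-7).
    Case 1: |Y1| == |Y2| gives A = 0; the other cases negate the
    precomputed angle when the stated sign conditions hold."""
    if abs(y1) == abs(y2):
        return 0.0                          # case 1 (fig. 5)
    if y1 != y2 and y1 < 0:
        return -angle                       # case 2 (fig. 6)
    if (y1 == 0 and y2 > 0) or (y2 == 0 and y1 < 0):
        return -angle                       # case 3 (fig. 7)
    return angle
```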
in the visual positioning compensation method described in this embodiment, step 5: mechanical compensation; the mechanical compensation method comprises the following specific steps of:
step 5.1: extracting a reference line of each camera of a reference station;
step 5.2: extracting a corresponding line of each camera of the compensation station;
step 5.3: calculating the deviation value of the reference line and the corresponding line;
step 5.4: calculating the intersection baseMidPt of the datum line and the central line of each camera;
step 5.5: calculating the intersection point curMidPt of the corresponding line and the central line of each camera;
step 5.6: calculating the distance Dis between curMidPt and baseMidPt of each camera;
step 5.7: the remaining process is consistent with that of step 3.
In the visual positioning compensation method described in this embodiment, the camera 2 is a CCD camera, but it is not limited to a CCD camera and may be a camera of another model.
In the visual positioning compensation method according to this embodiment, the rotating platform 1 is a turntable.
Although the present invention has been described with reference to the above preferred embodiments, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (8)
1. A visual positioning compensation method comprises a rotating platform and at least two cameras, and is characterized in that: further comprising the steps of:
step 1: the camera scans the product of the rotating platform and extracts the contour line of the product;
step 2: calculating the translation proportion and the rotation proportion of each camera and the rotating platform;
step 3: converting the product contour line extracted by each camera into translation XY and rotation Angle according to the translation proportion and the rotation proportion;
step 4: carrying out integral compensation and station compensation according to the translation XY and the rotation Angle;
the specific steps of converting the product contour line extracted by each camera into translation XY and rotation Angle according to the translation proportion and the rotation proportion are as follows:
step 3.1: positioning the product contour line extracted by each camera;
step 3.2: calculating the intersection point baseMidPt4 of the datum line and the central line of each camera;
step 3.3: calculating an intersection curMidPt4 of the product contour line extracted by each camera and the central line;
step 3.4: calculating the distance Dis between curMidPt4 and baseMidPt4 of each camera;
step 3.5: the rotation angle A, Y axis offset value offectY and the X axis offset value offectX for each camera are calculated.
2. The visual positioning compensation method of claim 1, wherein: the specific steps of calculating the translation and rotation proportion of each camera and the rotating platform are as follows:
step 2.1: presetting the values of moveX, moveY and roAngle;
step 2.2: moving the rotary platform to the original point position to acquire a reference line;
step 2.3: controlling the rotary platform to move along the X-axis direction, acquiring characteristics, and calculating the moving xScale proportion;
step 2.4: controlling the rotary platform to move along the direction of the Y axis, acquiring characteristics, and calculating the moving yScale proportion;
step 2.5: controlling the rotating platform to rotate, driven by the rotating shaft, acquiring the characteristics, and calculating the roScale movement proportion.
3. The visual positioning compensation method of claim 2, wherein: controlling the rotary platform to move along the X-axis direction and acquiring characteristics, and calculating the moving xScale proportion specifically comprises the following steps:
step 2.3.1: calculating an intersection baseMidPt1 of a reference line before the rotary platform moves along the direction of the X axis and the central line of the camera;
step 2.3.2: calculating an intersection point curMidPt1 of the reference line and the central line after the rotary platform moves;
step 2.3.3: the formula of the X movement proportion calculation is as follows:
4. the visual positioning compensation method of claim 2, wherein: controlling the rotary platform to move along the direction of the Y axis and acquiring the characteristics, and calculating the moving yScale proportion specifically comprises the following steps:
step 2.4.1: calculating an intersection baseMidPt2 of a reference line before the rotary platform moves along the Y-axis direction and the center line of the camera;
step 2.4.2: calculating an intersection point curMidPt2 of the reference line and the central line after the rotary platform moves;
step 2.4.3: the formula for calculating the Y movement proportion is as follows:
5. the visual positioning compensation method of claim 2, wherein: the method comprises the following specific steps of controlling the rotation of the rotating platform and obtaining characteristics, and calculating the movement proportion of the robScale:
step 2.5.1: calculating the intersection point baseMidPt3 of the reference line before the rotation of the rotary platform and the center line of the camera;
step 2.5.2: calculating an intersection point curMidPt3 of the reference line and the central line after the rotation of the rotary platform;
step 2.5.3: the equation for the angle moving proportion is as follows:
6. the visual positioning compensation method of claim 1, wherein: the specific steps of carrying out integral compensation and station compensation according to the translation XY and the rotation Angle Angle comprise:
step 4.1: according to actual product requirements, a user inputs data X1, Y1, Y2 and a configured product length proLen to obtain compensation data X on an X axis, compensation data Y on a Y axis and angle compensation data A.
7. The visual positioning compensation method of claim 1, wherein: further comprising step 5: mechanical compensation.
8. The visual positioning compensation method of claim 1, wherein: the rotary platform is a turntable.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811108550.XA CN109283889B (en) | 2018-09-21 | 2018-09-21 | Visual positioning compensation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109283889A CN109283889A (en) | 2019-01-29 |
CN109283889B true CN109283889B (en) | 2021-09-21 |
Family
ID=65181418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811108550.XA Active CN109283889B (en) | 2018-09-21 | 2018-09-21 | Visual positioning compensation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109283889B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112018424B (en) * | 2020-07-30 | 2021-12-31 | 惠州市德赛电池有限公司 | Automatic correction method for battery production line |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103453889A (en) * | 2013-09-17 | 2013-12-18 | 深圳市创科自动化控制技术有限公司 | Calibrating and aligning method of CCD (Charge-coupled Device) camera |
CN104460698A (en) * | 2014-11-03 | 2015-03-25 | 北京凌云光技术有限责任公司 | UVW platform calibrating method and device |
- 2018-09-21: application CN201811108550.XA; patent CN109283889B, status Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103453889A (en) * | 2013-09-17 | 2013-12-18 | 深圳市创科自动化控制技术有限公司 | Calibrating and aligning method of CCD (Charge-coupled Device) camera |
CN104460698A (en) * | 2014-11-03 | 2015-03-25 | 北京凌云光技术有限责任公司 | UVW platform calibrating method and device |
Non-Patent Citations (1)
Title |
---|
Development of a machine-vision-based positioning and laminating system for top-and-bottom-cover packaging boxes; Zheng Yunlong; China Master's Theses Full-text Database (Electronic Journal), Engineering Science and Technology I; 2018-02-15; main text pp. 12-15, 31-32 *
Also Published As
Publication number | Publication date |
---|---|
CN109283889A (en) | 2019-01-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111300422B (en) | Robot workpiece grabbing pose error compensation method based on visual image | |
CN110103217B (en) | Industrial robot hand-eye calibration method | |
WO2016165391A1 (en) | Parallel connection platform tracking control device and method using visual equipment as sensor | |
CN108106535B (en) | Line laser calibration method and line laser calibration device based on robot | |
CN105234802B (en) | A kind of small bulb tooling order turntable Polishing machining device and cutter presetting cutter method | |
CN108326850B (en) | Method and system for robot to accurately move mechanical arm to reach specified position | |
CN108714701A (en) | A kind of Machining of Shaft-type Parts device | |
CN112381827B (en) | Rapid high-precision defect detection method based on visual image | |
CN108562233B (en) | Utilize the axis part diameter size On-line Measuring Method of conic section invariant | |
CN107808401A (en) | The hand and eye calibrating method of the one camera of mechanical arm tail end | |
CN113592955A (en) | Circular workpiece plane coordinate high-precision positioning method based on machine vision | |
CN109671122A (en) | Trick camera calibration method and device | |
CN107804708A (en) | A kind of pivot localization method of placement equipment feeding rotary shaft | |
CN112365502B (en) | Calibration method based on visual image defect detection | |
CN111047586B (en) | Pixel equivalent measuring method based on machine vision | |
CN109283889B (en) | Visual positioning compensation method | |
CN112318107A (en) | Large-scale part hole shaft automatic assembly centering measurement method based on depth camera | |
CN105773661B (en) | Workpiece translational motion rotates scaling method under horizontal machine people's fixed camera | |
CN111251189B (en) | Visual positioning method for casting polishing | |
CN113793313B (en) | High-precision tool setting method and device for machining full-surface micro-pit structure of thin-wall spherical shell type micro-component | |
CN113932710A (en) | Combined type vision cutter geometric parameter measuring system and method | |
CN104424601B (en) | Centering assembly method and device for special-shaped body assembly parts | |
WO2020113978A1 (en) | Method for calculating center position of hole located on plane | |
CN106272414B (en) | Pitch ear auricle assembling method of servo-controlling | |
CN106951814B (en) | Circular grating eccentricity calculation method of encoder eccentricity adjustment system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||