CN114387228A - Automatic workpiece locating and aligning method based on line structured light - Google Patents


Info

Publication number
CN114387228A
CN114387228A
Authority
CN
China
Prior art keywords
coordinate
camera
image
workpiece
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111595872.3A
Other languages
Chinese (zh)
Inventor
梅雪松
运侠伦
吴卓成
李晓
赵亮
梅岭南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Chaotong Intelligent Manufacturing Technology Research Institute Co ltd
Original Assignee
Wuxi Chaotong Intelligent Manufacturing Technology Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Chaotong Intelligent Manufacturing Technology Research Institute Co ltd
Priority: CN202111595872.3A
Publication: CN114387228A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0004: Industrial image inspection
    • G06T 5/20: Image enhancement or restoration using local operators
    • G06T 7/11: Region-based segmentation
    • G06T 7/13: Edge detection
    • G06T 7/75: Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)
    • G06T 2207/30164: Workpiece; Machine component
    • G06T 2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the field of electronic detection methods, in particular to an automatic workpiece locating and aligning method based on line structured light. The method comprises the following steps: S1, fixing the workpiece; S2, projecting line structured light; S3, acquiring a light bar image; S4, image processing; S5, acquiring three-dimensional coordinates of points on the light bar; and S6, calculating the deviation from the ideal position. The invention offers high machining efficiency, low labor cost, flexible operation and high machining precision, can effectively shorten the machining cycle, is suitable for the single-piece, small-batch mode, and can better meet current market demand.

Description

Automatic workpiece locating and aligning method based on line structured light
Technical Field
The invention relates to the field of electronic detection methods, in particular to an automatic workpiece locating and aligning method based on line structured light.
Background
In the field of machining, workpiece positioning is an important step in the machining process. Workpiece clamping and machining are performed after positioning, so positioning affects both the efficiency of the whole machining process and the machining precision. In a mass-production mode, a dedicated high-precision jig may be employed. In single-piece, small-batch production, however, alignment is still mostly performed manually by operators, which sharply reduces machining precision and efficiency, greatly increases labor cost, and lowers the utilization rate of the machining center. The conventional approach therefore cannot meet current market demand well. The present invention accordingly provides an automatic locating and aligning method. Unlike traditional workpiece positioning, it does not require accurate alignment of the initial posture of the workpiece, and replaces passive locating with active locating and pose adjustment. The method offers high machining efficiency, low labor cost, flexible operation and high machining precision, effectively shortens the machining cycle, and is well suited to the single-piece, small-batch mode.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides an automatic workpiece locating and aligning method based on line structured light, which can be used for automatic locating and pose adjustment of a workpiece on a horizontal machining center.
The technical scheme for realizing the purpose of the invention is as follows:
An automatic workpiece locating and aligning method based on line structured light comprises the following steps:
S1, fixing the workpiece: mounting the workpiece on the rotary worktable of a horizontal machining center with a clamp or fixing element;
S2, projecting line structured light: projecting line structured light onto the workpiece surface with a laser, so that a bright stripe appears on the surface;
S3, acquiring a light bar image: capturing the light bar with a CCD camera to obtain a two-dimensional image of the light bar;
S4, image processing: performing image filtering, image segmentation and contour extraction on the acquired image;
S5, acquiring three-dimensional coordinates of points on the light bar: converting the two-dimensional image coordinates into three-dimensional actual coordinates by means of CCD camera calibration;
S6, calculating the deviation from the ideal position: the three-dimensional actual coordinates accurately describe the position of the workpiece on the machining center, and comparing them with the correct position of the workpiece yields the positional deviation, comprising a rotation amount and a translation amount;
S7, adjusting the pose of the workpiece: adjusting the workpiece to the correct position via the rotary worktable.
Further, in S3, the CCD camera is calibrated to obtain the conversion relationship between three-dimensional and two-dimensional coordinates. The CCD camera comprises a left camera and a right camera; in addition to calibrating the internal and external parameters of each camera, the positional relationship between the two cameras must also be calibrated. First, a camera calibration target is placed in the field of view of both cameras. Let the translations between the left and right camera coordinate systems and the world coordinate system be T_1 and T_2, and the rotation matrices be R_1 and R_2. A calibration point on the target has coordinates P_l = (X_cl, Y_cl, Z_cl)^T in the left camera coordinate system, P_r = (X_cr, Y_cr, Z_cr)^T in the right camera coordinate system, and P = (X_w, Y_w, Z_w)^T in the world coordinate system. Then:
P_l = R_1·P + T_1,  P_r = R_2·P + T_2

Eliminating the world point P gives

P_l = R·P_r + T

where

R = R_1·R_2^(-1),  T = T_1 - R_1·R_2^(-1)·T_2
by the above formula, the conversion relation between the left camera coordinate system and the right camera coordinate system can be obtained;
The optical axes of the two cameras are parallel and perpendicular to the image plane, and the distance between the two cameras is d. C_l and C_r denote the left and right camera coordinate systems. Let the three-dimensional point P be (X_cl, Y_cl, Z_cl) in the left camera coordinate system and (X_cr, Y_cr, Z_cr) in the right camera coordinate system. The relationship between the pixel coordinate system and the image coordinate system is:

u_l = x_l/dx + u_0,  v_l = y_l/dy + v_0
u_r = x_r/dx + u_0,  v_r = y_r/dy + v_0

where u_0 and v_0 are camera intrinsic parameters, (u_l, v_l) and (u_r, v_r) are the coordinates of P in the left and right pixel coordinate systems, and (x_l, y_l), (x_r, y_r) are the coordinates of P in the image coordinate systems;
combining the relationship between the image coordinate system and the camera coordinate system, we can obtain:
u_l = α·X_cl/Z_cl + u_0,  v_l = β·Y_cl/Z_cl + v_0
u_r = α·X_cr/Z_cr + u_0,  v_r = β·Y_cr/Z_cr + v_0

where f is the focal length of the camera and α = f/dx, β = f/dy are camera intrinsic parameters. Through the above formulas, coordinate conversion between the image coordinate system and the world coordinate system can be realized.
Further, in S4, after the light bars are acquired, the center of the light bars is extracted by using a gray scale gravity center method, where a calculation formula of the gray scale gravity center method is as follows:
X_c = Σ_{i=1}^{N} X_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i),  Y_c = Σ_{i=1}^{N} Y_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i)

where (X_i, Y_i) are the coordinates of a pixel point in the light bar cross-section, I(X_i, Y_i) (i = 1, 2, …, N) is the gray value of that pixel, N is the number of pixels in the cross-section, and (X_c, Y_c) are the coordinates of the light bar center. Thus the coordinates of the center point of the light bar are obtained.
In S6, the coordinate points are projected onto the plane Y = 0. Take any two points A_i and B_j on the projection, with coordinates A_i(X_Ai, Z_Ai) and B_j(X_Bj, Z_Bj). Through A_i draw a line segment A_iC_i,j parallel to the X_w axis, and through B_j draw a line segment B_jC_i,j parallel to the Z_w axis; the two segments intersect at C_i,j, with coordinates C_i,j(X_Bj, Z_Ai). Let α_i,j be the angle between A_iC_i,j and A_iB_j. Then:

tan α_i,j = (Z_Bj - Z_Ai) / (X_Bj - X_Ai)
it is possible to obtain:
α_i,j = arctan[(Z_Bj - Z_Ai) / (X_Bj - X_Ai)]
the final required rotation angle is:
α = (2 / (N·(N - 1))) · Σ_{i=1}^{N-1} Σ_{j=i+1}^{N} α_i,j
wherein N is the number of the obtained coordinate points;
If a measured coordinate point is (x_i, y_i) and the corresponding coordinate point at the correct position is (X_i, Y_i), the translation component of the positional deviation is:

ΔX = (1/N)·Σ_{i=1}^{N} (X_i - x_i),  ΔY = (1/N)·Σ_{i=1}^{N} (Y_i - y_i)
after the technical scheme is adopted, the invention has the following positive effects:
(1) the invention has the advantages of high processing efficiency, low labor cost, flexible operation and high processing precision, can effectively shorten the processing period, is suitable for a single-piece small-batch mode, and can better meet the current market demand.
Drawings
In order that the present disclosure may be more readily and clearly understood, reference is now made to the following detailed description of the present disclosure taken in conjunction with the accompanying drawings, in which
FIG. 1 is a work flow diagram;
FIG. 2 is a diagram of the locating system;
FIG. 3 is a diagram of a camera system model;
fig. 4 is a schematic view of the rotation angle.
Detailed Description
An automatic workpiece locating and aligning method based on line structured light comprises the following steps:
S1, fixing the workpiece: mounting the workpiece on the rotary worktable of a horizontal machining center with a clamp or fixing element;
S2, projecting line structured light: projecting line structured light onto the workpiece surface with a laser, so that a bright stripe appears on the surface;
S3, acquiring a light bar image: capturing the light bar with a CCD camera to obtain a two-dimensional image of the light bar;
S4, image processing: performing image filtering, image segmentation and contour extraction on the acquired image;
S5, acquiring three-dimensional coordinates of points on the light bar: converting the two-dimensional image coordinates into three-dimensional actual coordinates by means of CCD camera calibration;
S6, calculating the deviation from the ideal position: the three-dimensional actual coordinates accurately describe the position of the workpiece on the machining center, and comparing them with the correct position of the workpiece yields the positional deviation, comprising a rotation amount and a translation amount;
S7, adjusting the pose of the workpiece: adjusting the workpiece to the correct position via the rotary worktable.
Three-dimensional information of the workpiece surface is acquired with a binocular line structured light measurement technique. The measurement system consists of a laser, CCD cameras and a computer system; the CCD cameras comprise a left camera and a right camera.
The software part consists of structured light scanning control, image processing, contour extraction and the like. When the laser projects line structured light onto the surface of the workpiece, a light bar is formed on the surface of the workpiece, the light bar containing three-dimensional coordinate information of the measured surface. Furthermore, the position finding system adopts two CCD cameras for shooting the surface of the workpiece from different directions to simulate human eyes so as to obtain the three-dimensional information of the surface of the workpiece, thereby obtaining the coordinate information of multiple points on the surface of the workpiece. The three-dimensional coordinate information on the light bars is converted into two-dimensional image coordinate system information by the camera. In this step, calibration of the camera is essential to obtain the conversion relationship between the actual three-dimensional coordinates and the two-dimensional coordinates of the image. The conversion between the three-dimensional coordinate and the two-dimensional coordinate can be realized through the parameter matrix obtained by calibrating the camera.
And performing image preprocessing on the obtained light bar image containing the coordinate information and extracting the light bar center. To obtain a clear and complete picture with high quality, the original image must be preprocessed by filtering, image segmentation and the like, so that the precision is improved. And then carrying out edge detection and feature point extraction on the image so as to obtain the required light stripe center information.
The obtained stripe center coordinates are converted into three-dimensional world coordinates, and the positional deviation from the correct position is calculated through the mathematical operations described below. This positional deviation comprises a rotation amount and a translation amount.
Example 1
The specific working flow of the method is shown in fig. 1, and the locating system is shown in fig. 2. First, the workpiece is mounted on the rotary worktable of a horizontal machining center with a clamp or fixing element; it does not need to be placed precisely at its correct position. Line structured light from a laser is projected onto the workpiece surface, producing a bright stripe; the points on the light bar carry the three-dimensional coordinate information of the corresponding object points on the workpiece. A CCD camera captures the light bar to obtain its two-dimensional image. The image is then filtered, segmented and contour-extracted, and the matrix obtained through camera calibration converts the two-dimensional image coordinates into three-dimensional actual coordinates. This series of three-dimensional coordinates accurately describes the position of the workpiece on the machining center; comparing it with the correct position yields the positional deviation, comprising a rotation amount and a translation amount. Finally, the workpiece is adjusted to the correct position by the rotary worktable.
The purpose of camera calibration is to obtain the conversion relationship between three-dimensional and two-dimensional coordinates; in addition to calibrating the internal and external parameters of each camera, the positional relationship between the two cameras must also be calibrated. First, a camera calibration target is placed in the field of view of both cameras. Let the translations between the left and right camera coordinate systems and the world coordinate system be T_1 and T_2, and the rotation matrices be R_1 and R_2. A calibration point on the target has coordinates P_l = (X_cl, Y_cl, Z_cl)^T in the left camera coordinate system, P_r = (X_cr, Y_cr, Z_cr)^T in the right camera coordinate system, and P = (X_w, Y_w, Z_w)^T in the world coordinate system. Then:
P_l = R_1·P + T_1,  P_r = R_2·P + T_2

Eliminating the world point P gives

P_l = R·P_r + T

where

R = R_1·R_2^(-1),  T = T_1 - R_1·R_2^(-1)·T_2
through the above formula, the conversion relationship between the left and right camera coordinate systems can be obtained.
The camera system is shown in fig. 3, where the optical axes of the two cameras are parallel and perpendicular to the image plane. The distance between the two cameras is d, and C_l and C_r denote the left and right camera coordinate systems. Let the three-dimensional point P be (X_cl, Y_cl, Z_cl) in the left camera coordinate system and (X_cr, Y_cr, Z_cr) in the right camera coordinate system. The relationship between the pixel coordinate system and the image coordinate system is:

u_l = x_l/dx + u_0,  v_l = y_l/dy + v_0
u_r = x_r/dx + u_0,  v_r = y_r/dy + v_0

where u_0 and v_0 are camera intrinsic parameters, (u_l, v_l) and (u_r, v_r) are the coordinates of P in the left and right pixel coordinate systems, and (x_l, y_l), (x_r, y_r) are the coordinates of P in the image coordinate systems.
Combining the relationship between the image coordinate system and the camera coordinate system, we can obtain:
u_l = α·X_cl/Z_cl + u_0,  v_l = β·Y_cl/Z_cl + v_0
u_r = α·X_cr/Z_cr + u_0,  v_r = β·Y_cr/Z_cr + v_0

where f is the focal length of the camera and α = f/dx, β = f/dy are camera intrinsic parameters. The coordinate conversion between the image coordinate system and the world coordinate system can be realized through the above formulas.
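The pinhole projection used above can be illustrated with a short sketch (`project` is an illustrative helper name; the parameter values in the test are arbitrary):

```python
import numpy as np

def project(P_cam, f, dx, dy, u0, v0):
    # Pinhole projection as in the text: image coordinates x = f*X/Z,
    # y = f*Y/Z, then pixel coordinates u = x/dx + u0, v = y/dy + v0,
    # i.e. u = alpha*X/Z + u0 and v = beta*Y/Z + v0 with
    # alpha = f/dx, beta = f/dy.
    X, Y, Z = P_cam
    alpha, beta = f / dx, f / dy
    return alpha * X / Z + u0, beta * Y / Z + v0
```

A point on the optical axis, e.g. `project((0.0, 0.0, 1.0), 1.0, 1.0, 1.0, 320.0, 240.0)`, lands at the principal point (u_0, v_0), as expected.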
After the CCD camera obtains the light bar image, the image should be preprocessed and the light bar center extracted. Preprocessing mainly removes image noise so that the image better reflects the real light bar, preparing for the subsequent center extraction. This stage comprises image filtering, image segmentation and light bar center extraction.
First, median filtering is used to remove impulse noise without affecting image edge information. Then histogram processing removes irrelevant information and makes the important feature information stand out, improving image quality. Next, image segmentation extracts the bright regions in the image to facilitate subsequent feature extraction; a threshold segmentation method is used here. Threshold segmentation relies on the gray-level difference between target and background: a reasonable threshold is selected, and each pixel in the image is compared with it to decide whether the pixel belongs to the target or the background. After segmentation, a clear light bar image is obtained, and an image thinning operation then extracts the light bar from the image. The light bar center is extracted with the gray gravity center method, which takes the center of the gray distribution of the pixels as the center point of the light bar, reducing the error caused by uneven gray distribution along the bar. The calculation formula of the gray gravity center method is as follows:
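A minimal pure-NumPy stand-in for the filtering and segmentation steps described above (in practice an image library such as OpenCV would typically supply these operations; the helper names are illustrative):

```python
import numpy as np

def median_filter3(img):
    # 3x3 median filter: removes impulse (salt-and-pepper) noise while
    # largely preserving edges; border pixels are left unchanged here.
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.median(img[i - 1:i + 2, j - 1:j + 2])
    return out

def threshold_segment(img, t):
    # Pixels brighter than threshold t are treated as light-bar
    # foreground (1), the rest as background (0).
    return (img > t).astype(np.uint8)
```

An isolated bright pixel (impulse noise) in an otherwise dark neighborhood is replaced by the neighborhood median and disappears, while a solid bright stripe survives the filter.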
X_c = Σ_{i=1}^{N} X_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i),  Y_c = Σ_{i=1}^{N} Y_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i)

where (X_i, Y_i) are the coordinates of a pixel point in the light bar cross-section, I(X_i, Y_i) (i = 1, 2, …, N) is the gray value of that pixel, N is the number of pixels in the cross-section, and (X_c, Y_c) are the coordinates of the light bar center. Thus the coordinates of the center point of the light bar are obtained.
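For a single cross-section of the stripe, the gray gravity center formula reduces to an intensity-weighted mean of the pixel positions; a one-dimensional sketch (illustrative helper):

```python
import numpy as np

def stripe_center(gray_row):
    # Gray-level centre of gravity of one light-bar cross-section:
    #   X_c = sum(X_i * I_i) / sum(I_i)
    gray_row = np.asarray(gray_row, dtype=float)
    idx = np.arange(gray_row.size)
    return float((idx * gray_row).sum() / gray_row.sum())
```

Because the result is a weighted mean, it is sub-pixel: an asymmetric intensity profile yields a fractional center position.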
Through the steps, the coordinate information of the surface of the workpiece can be obtained, and then the position deviation between the surface of the workpiece and the correct position is calculated.
As shown in fig. 4, the coordinate points on the workpiece surface obtained in the above steps are projected onto the plane Y = 0. Take any two points A_i and B_j on the projection, with coordinates A_i(X_Ai, Z_Ai) and B_j(X_Bj, Z_Bj). Through A_i draw a line segment A_iC_i,j parallel to the X_w axis, and through B_j draw a line segment B_jC_i,j parallel to the Z_w axis; the two segments intersect at C_i,j, with coordinates C_i,j(X_Bj, Z_Ai). Let α_i,j be the angle between A_iC_i,j and A_iB_j. Then:

tan α_i,j = (Z_Bj - Z_Ai) / (X_Bj - X_Ai)
it is possible to obtain:
α_i,j = arctan[(Z_Bj - Z_Ai) / (X_Bj - X_Ai)]
the final required rotation angle is:
α = (2 / (N·(N - 1))) · Σ_{i=1}^{N-1} Σ_{j=i+1}^{N} α_i,j
wherein N is the number of coordinate points obtained.
If a measured coordinate point is (x_i, y_i) and the corresponding coordinate point at the correct position is (X_i, Y_i), the translation component of the positional deviation is:

ΔX = (1/N)·Σ_{i=1}^{N} (X_i - x_i),  ΔY = (1/N)·Σ_{i=1}^{N} (Y_i - y_i)
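Under the assumption that the rotation angle is the average of α_i,j over all point pairs (the original equation images are unavailable, so the averaging convention is our reading), the deviation computation can be sketched as:

```python
import numpy as np

def rotation_angle(points):
    # Average, over all point pairs (A_i, B_j) with i < j, of the angle
    # between segment A_iB_j and the X_w axis in the Y = 0 projection.
    pts = np.asarray(points, dtype=float)  # rows of (X, Z)
    angles = []
    for i in range(len(pts) - 1):
        for j in range(i + 1, len(pts)):
            dX = pts[j, 0] - pts[i, 0]
            dZ = pts[j, 1] - pts[i, 1]
            angles.append(np.arctan2(dZ, dX))
    return float(np.mean(angles))

def translation(measured, correct):
    # Mean per-axis offset between measured and nominal coordinate points.
    return np.mean(np.asarray(correct, float) - np.asarray(measured, float), axis=0)
```

For points lying on a straight edge, every pair yields the same angle, so the average simply reduces noise from individual measurements.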
and finally, sending the position deviation to a machining center, and placing the workpiece at a correct position through a rotary worktable.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (4)

1. An automatic workpiece locating and aligning method based on line structured light, characterized in that the method comprises the following steps:
S1, fixing the workpiece: mounting the workpiece on the rotary worktable of a horizontal machining center with a clamp or fixing element;
S2, projecting line structured light: projecting line structured light onto the workpiece surface with a laser, so that a bright stripe appears on the surface;
S3, acquiring a light bar image: capturing the light bar with a CCD camera to obtain a two-dimensional image of the light bar;
S4, image processing: performing image filtering, image segmentation and contour extraction on the acquired image;
S5, acquiring three-dimensional coordinates of points on the light bar: converting the two-dimensional image coordinates into three-dimensional actual coordinates by means of CCD camera calibration;
S6, calculating the deviation from the ideal position: the three-dimensional actual coordinates accurately describe the position of the workpiece on the machining center, and comparing them with the correct position of the workpiece yields the positional deviation, comprising a rotation amount and a translation amount;
S7, adjusting the pose of the workpiece: adjusting the workpiece to the correct position via the rotary worktable.
2. The method of claim 1, characterized in that: in S3, the CCD camera is calibrated to obtain the conversion relationship between three-dimensional and two-dimensional coordinates. The CCD camera comprises a left camera and a right camera; in addition to calibrating the internal and external parameters of each camera, the positional relationship between the two cameras must also be calibrated. First, a camera calibration target is placed in the field of view of both cameras. Let the translations between the left and right camera coordinate systems and the world coordinate system be T_1 and T_2, and the rotation matrices be R_1 and R_2. A calibration point on the target has coordinates P_l = (X_cl, Y_cl, Z_cl)^T in the left camera coordinate system, P_r = (X_cr, Y_cr, Z_cr)^T in the right camera coordinate system, and P = (X_w, Y_w, Z_w)^T in the world coordinate system. Then:
P_l = R_1·P + T_1,  P_r = R_2·P + T_2

Eliminating the world point P gives

P_l = R·P_r + T

where

R = R_1·R_2^(-1),  T = T_1 - R_1·R_2^(-1)·T_2
by the above formula, the conversion relation between the left camera coordinate system and the right camera coordinate system can be obtained;
The optical axes of the two cameras are parallel and perpendicular to the image plane, and the distance between the two cameras is d. C_l and C_r denote the left and right camera coordinate systems. Let the three-dimensional point P be (X_cl, Y_cl, Z_cl) in the left camera coordinate system and (X_cr, Y_cr, Z_cr) in the right camera coordinate system. The relationship between the pixel coordinate system and the image coordinate system is:

u_l = x_l/dx + u_0,  v_l = y_l/dy + v_0
u_r = x_r/dx + u_0,  v_r = y_r/dy + v_0

where u_0 and v_0 are camera intrinsic parameters, (u_l, v_l) and (u_r, v_r) are the coordinates of P in the left and right pixel coordinate systems, and (x_l, y_l), (x_r, y_r) are the coordinates of P in the image coordinate systems;
combining the relationship between the image coordinate system and the camera coordinate system, we can obtain:
u_l = α·X_cl/Z_cl + u_0,  v_l = β·Y_cl/Z_cl + v_0
u_r = α·X_cr/Z_cr + u_0,  v_r = β·Y_cr/Z_cr + v_0

where f is the focal length of the camera and α = f/dx, β = f/dy are camera intrinsic parameters. The coordinate conversion between the image coordinate system and the world coordinate system can be realized through the above formulas.
3. The method of claim 1, wherein the method comprises: in S4, after the light bars are obtained, the light bar centers are extracted by a gray scale gravity center method, where the calculation formula of the gray scale gravity center method is:
X_c = Σ_{i=1}^{N} X_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i),  Y_c = Σ_{i=1}^{N} Y_i·I(X_i, Y_i) / Σ_{i=1}^{N} I(X_i, Y_i)

where (X_i, Y_i) are the coordinates of a pixel point in the light bar cross-section, I(X_i, Y_i) (i = 1, 2, …, N) is the gray value of that pixel, N is the number of pixels in the cross-section, and (X_c, Y_c) are the coordinates of the light bar center. Thus the coordinates of the center point of the light bar are obtained.
4. The method of claim 1, characterized in that: in S6, the coordinate points are projected onto the plane Y = 0. Take any two points A_i and B_j on the projection, with coordinates A_i(X_Ai, Z_Ai) and B_j(X_Bj, Z_Bj). Through A_i draw a line segment A_iC_i,j parallel to the X_w axis, and through B_j draw a line segment B_jC_i,j parallel to the Z_w axis; the two segments intersect at C_i,j, with coordinates C_i,j(X_Bj, Z_Ai). Let α_i,j be the angle between A_iC_i,j and A_iB_j. Then:

tan α_i,j = (Z_Bj - Z_Ai) / (X_Bj - X_Ai)
it is possible to obtain:
α_i,j = arctan[(Z_Bj - Z_Ai) / (X_Bj - X_Ai)]
the final required rotation angle is:
α = (2 / (N·(N - 1))) · Σ_{i=1}^{N-1} Σ_{j=i+1}^{N} α_i,j
wherein N is the number of the obtained coordinate points;
If a measured coordinate point is (x_i, y_i) and the corresponding coordinate point at the correct position is (X_i, Y_i), the translation component of the positional deviation is:

ΔX = (1/N)·Σ_{i=1}^{N} (X_i - x_i),  ΔY = (1/N)·Σ_{i=1}^{N} (Y_i - y_i)
CN202111595872.3A 2021-12-24 2021-12-24 Automatic workpiece locating and aligning method based on line structured light Pending CN114387228A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111595872.3A CN114387228A (en) 2021-12-24 2021-12-24 Automatic workpiece locating and aligning method based on line structured light


Publications (1)

Publication Number Publication Date
CN114387228A true CN114387228A (en) 2022-04-22

Family

ID=81197560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111595872.3A Pending CN114387228A (en) 2021-12-24 2021-12-24 Automatic workpiece locating and aligning method based on line structured light

Country Status (1)

Country Link
CN (1) CN114387228A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination