CN110021027B - Edge cutting point calculation method based on binocular vision - Google Patents


Info

Publication number
CN110021027B
CN110021027B (application CN201910311925.0A)
Authority
CN
China
Prior art keywords
point
trimming
dimensional
points
theoretical
Prior art date
Legal status
Active
Application number
CN201910311925.0A
Other languages
Chinese (zh)
Other versions
CN110021027A (en)
Inventor
魏志博
邢威
张楠楠
Current Assignee
Yi Si Si Hangzhou Technology Co ltd
Original Assignee
Isvision Hangzhou Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Isvision Hangzhou Technology Co Ltd filed Critical Isvision Hangzhou Technology Co Ltd
Priority to CN201910311925.0A
Publication of CN110021027A
Application granted
Publication of CN110021027B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G06T3/04
    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Abstract

The invention discloses a binocular-vision-based trimming point calculation method comprising the following steps: measure the trimming edge by binocular vision to obtain its three-dimensional coordinates; fit a plane I and project the three-dimensional trimming coordinates onto it to obtain the projection points Nj; project the theoretical coordinate and theoretical normal of the trimming point onto plane I and establish a straight line L; calculate the distance from each Nj to the line L to obtain the point set Nj′; take a cross product to obtain a direction vector n″ and establish a straight line L′; calculate, along the theoretical projection normal n′, the distance D from each Nj to L′, and divide the points Nj into N groups by the value of D with a preset step length; select the group with the largest D values whose point count exceeds m as the group containing the trimming point; average the coordinates of all three-dimensional projection points in that group to obtain the measured trimming-point coordinate. The method achieves a coordinate measurement accuracy of tens of microns with millisecond-level measurement times, meeting the inspection requirements of precision manufacturing industries such as the automotive industry.

Description

Edge cutting point calculation method based on binocular vision
Technical Field
The invention relates to the field of feature detection, in particular to a binocular vision-based trimming point calculation method.
Background
With the development of automation technology and rising social productivity, the manufacturing industry exercises ever stricter control over product quality. Approaches to improving manufacturing quality fall into two categories: 1. error avoidance, which uses high-precision machining equipment, advanced control structures and highly reliable sensor mechanisms to improve the quality of machined parts at the source; this approach is complex to operate and costly, and no complete real-time solution exists at present; 2. error compensation, which combines modern measurement technology, new statistical tolerancing methods and existing engineering- and statistical-control methods to study the dimensional distribution of products, analyse the working state of dies and fixtures, and then control and adjust them. The latter approach is attracting increasing attention; its key enabling technology is measurement, since only measurement yields effective data on the measured object and thereby provides a basis for subsequent control.
Among the many inspected features, the trimming point is an important one. Taking an automobile body-in-white as an example, whether the trimming points are accurately positioned directly determines the straightness of the workpiece edge and underpins whether later assembly and styling requirements can be met. The traditional checking fixture, widely used since the 1970s, is inexpensive but slow and imprecise (about 0.1 mm), falling short of the accuracy required for precision-instrument manufacturing. The other common method detects trimming points with a coordinate measuring machine (CMM); although highly accurate, it is slow, can measure only a limited number of points on a spatial boundary and therefore cannot fully describe its geometric features, requires manual intervention during measurement, and the results must be compensated for probe damage and probe radius.
Disclosure of Invention
In order to solve these problems, the invention provides a binocular-vision-based trimming point calculation method that effectively improves measurement precision and efficiency; its trimming-point measurement accuracy can reach tens of microns.
A binocular vision-based trimming point calculation method comprises the following steps:
1) collecting left and right images of the trimming characteristics by using a binocular vision measuring system;
2) performing image preprocessing on the left and right images and extracting the two-dimensional point coordinates Nt, t = 1, 2, …, s, of the trimming contour edge in each image, where s is the number of two-dimensional points on the trimming contour edge;
3) converting the trimming contour points in the left image and the right image into three-dimensional space by using an epipolar matching method to obtain three-dimensional point coordinate data of the trimming contour;
4) acquiring a three-dimensional point cloud of the plane containing the measured trimming edge with a three-dimensional scanner, fitting a plane to the point cloud to obtain a fitting plane I, and projecting the three-dimensional point coordinate data of the trimming contour obtained in step 3) onto the fitting plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour;
5) projecting the coordinate P and the theoretical normal n of the theoretical point of the edge cutting point to be detected to the fitting plane I to obtain a theoretical projection point P 'and a theoretical projection normal n';
6) establishing a straight line L from the theoretical projection point P′(b1, b2, b3) and the theoretical projection normal n′(k1, k2, k3):

(X − b1)/k1 = (Y − b2)/k2 = (Z − b3)/k3

where (X, Y, Z) is any point on the straight line L;
7) calculating the distance from each three-dimensional projection point Nj to the straight line L; when the distance is smaller than a threshold T, with T = 0.2–0.7 mm, recording that projection point to obtain the point set Nj′, j = 1, 2, …, m′, where m′ is the number of recorded three-dimensional projection points;
8) cross-multiplying the normal of the fitting plane I with the theoretical projection normal n′ to obtain a direction vector n″(k4, k5, k6), and establishing a straight line L′ from the theoretical projection point P′ and the direction vector n″:

(X′ − b1)/k4 = (Y′ − b2)/k5 = (Z′ − b3)/k6

where (X′, Y′, Z′) is any point on the straight line L′;
9) calculating, along the theoretical projection normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′; dividing the points Nj into N groups by the value of D with a preset step length; selecting the group with the largest D values and more than m points as the group containing the trimming point; averaging the coordinates of all three-dimensional projection points in that group gives the measured coordinate of the trimming point to be measured.
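The plane-projection and point-to-line computations underlying steps 4)–9) can be sketched with NumPy as follows. This is a generic geometric sketch, not the patented implementation; the function names and toy coordinates are illustrative only.

```python
import numpy as np

def project_to_plane(pts, c, n):
    """Orthogonally project points onto the plane through c with unit normal n."""
    pts = np.asarray(pts, dtype=float)
    d = (pts - c) @ n                       # signed distances to the plane
    return pts - np.outer(d, n)

def point_line_dist(pts, p, v):
    """Distance from each point to the 3-D line through p with direction v."""
    v = v / np.linalg.norm(v)
    w = np.asarray(pts, dtype=float) - p
    # Remove the component along v; the norm of the remainder is the distance.
    return np.linalg.norm(w - np.outer(w @ v, v), axis=1)

# Toy setup: fitting plane z = 0 (c = origin, normal = e_z)
c, n = np.zeros(3), np.array([0.0, 0.0, 1.0])
pts = project_to_plane(np.array([[1.0, 0.1, 0.5], [2.0, -0.1, 0.3]]), c, n)
p_proj = np.zeros(3)                        # plays the role of P'
n_proj = np.array([1.0, 0.0, 0.0])          # plays the role of n'
d_to_L = point_line_dist(pts, p_proj, n_proj)   # step 7: keep points with d < T
n2 = np.cross(n, n_proj)                    # step 8: direction vector of L'
d_to_Lp = point_line_dist(pts, p_proj, n2)  # step 9: distances D to L'
print(d_to_L, d_to_Lp)                      # → [0.1 0.1] [1. 2.]
```

With the toy plane z = 0, projection simply zeroes the z-coordinate, so the two distances are easy to verify by hand.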
Further, the left and right images are gray-scale images.
Further, the image preprocessing comprises Gaussian filtering, image binarization and edge detection.
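The preprocessing chain above — Gaussian filtering, binarization, edge detection — can be illustrated with a dependency-free NumPy sketch. In practice a library such as OpenCV would be used; the kernel size and the crude mean-based threshold here are illustrative choices, not values taken from the patent.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D normalized Gaussian kernel of half-width `radius`."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def preprocess(gray, sigma=1.0, radius=2):
    """Gaussian smoothing, global binarization, gradient-magnitude edges."""
    k = gaussian_kernel(sigma, radius)
    # Separable Gaussian blur: convolve rows, then columns.
    blur = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1,
                               gray.astype(float))
    blur = np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, blur)
    binary = (blur > blur.mean()).astype(np.uint8)   # crude global threshold
    gy, gx = np.gradient(binary.astype(float))       # edges where binary value changes
    edges = np.hypot(gx, gy) > 0
    return binary, edges

img = np.zeros((32, 32)); img[8:24, 8:24] = 255.0    # bright square on dark field
binary, edges = preprocess(img)
print(binary[16, 16], edges.sum() > 0)               # → 1 True
```

The edge mask produced this way marks the contour of the binarized region, which is the input to the two-dimensional contour-point extraction of step 2).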
Further, the coordinate P and the theoretical normal n of the theoretical point are digital-analog standard data values of the edge cutting point to be detected, which are pre-stored in the system.
Furthermore, the number of the edge cutting points to be detected is one or more.
Further, the method comprises a step 10) of taking the difference between the measured coordinate obtained in step 9) and the coordinate of the theoretical point; when the difference is smaller than a tolerance threshold the trimming point is judged qualified, otherwise unqualified.
Further, the fitting plane I is fitted by a least square method.
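A common way to realize this least-squares plane fit is a singular value decomposition of the centred point cloud; the sketch below assumes this standard formulation (the patent does not prescribe a particular solver, and the names are illustrative).

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a 3-D point cloud.

    Returns (centroid, unit normal): the plane passes through the centroid,
    and its normal is the right singular vector of the centred points
    associated with the smallest singular value (direction of least variance).
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # rows of vt are singular vectors
    return centroid, vt[-1]

# Example: noisy samples of the plane z = 0
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 0.001 * rng.standard_normal(200)
c, n = fit_plane(np.column_stack([xy, z]))
print(np.abs(n[2]))   # close to 1: recovered normal is ~(0, 0, ±1)
```

The sign of the normal is arbitrary, which is why the example checks its absolute z-component.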
Further, the preset step length is 0.2-0.5 mm.
Further, m is more than or equal to 2.
The method provided by the application can effectively avoid the defect of poor quality of the trimming edge point cloud obtained by binocular vision, the three-dimensional point cloud of the plane where the measured trimming edge is located is obtained by a three-dimensional scanner, point cloud plane fitting is carried out, the point set of the trimming edge to be measured, the theoretical coordinate point and the normal vector are all projected into the fitting plane, the optimal measuring coordinate of the trimming edge point to be measured is screened and calculated by taking the theoretical value of the point to be measured as guidance, and then the trimming edge quality is detected; the method can achieve the coordinate measuring precision of tens of microns, the measuring time is in millisecond level, and the method can meet the detection requirements of precision machining manufacturing industries such as the automobile industry and the like.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Detailed Description
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and examples.
A binocular vision-based trimming point calculation method comprises the following steps:
1) collecting left and right gray images of the trimming characteristics by using a binocular vision measuring system;
2) performing image preprocessing such as Gaussian filtering, binarization and edge detection on the left and right images, and extracting the two-dimensional point coordinates Nt, t = 1, 2, …, s, of the trimming contour edge in each image, where s is the number of two-dimensional points on the trimming contour edge;
3) converting the trimming contour points in the left image and the right image into three-dimensional space by using an epipolar matching method to obtain three-dimensional point coordinate data of the trimming contour;
4) acquiring a three-dimensional point cloud of the plane containing the measured trimming edge with a three-dimensional scanner, fitting a plane to the point cloud by the least-squares method to obtain a fitting plane I, and projecting the three-dimensional point coordinate data of the trimming contour obtained in step 3) onto the fitting plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour;
5) projecting the coordinate P and the theoretical normal n of the theoretical point of the edge cutting point to be detected to a fitting plane I to obtain a theoretical projection point P 'and a theoretical projection normal n';
in this embodiment, the coordinate P and the theoretical normal n of the theoretical point are digital-analog standard data values of the edge cutting point to be detected, which are pre-stored in the system.
In this embodiment there is one trimming point to be measured.
6) establishing a straight line L from the theoretical projection point P′(b1, b2, b3) and the theoretical projection normal n′(k1, k2, k3):

(X − b1)/k1 = (Y − b2)/k2 = (Z − b3)/k3

7) calculating the distance from each three-dimensional projection point Nj to the straight line L; when the distance is smaller than a threshold T, with T = 0.2–0.7 mm, recording that projection point to obtain the point set Nj′, j = 1, 2, …, m′, where m′ is the number of recorded three-dimensional projection points;
8) cross-multiplying the normal of the fitting plane I with the theoretical projection normal n′ to obtain a direction vector n″(k4, k5, k6), and establishing a straight line L′ from the theoretical projection point P′ and the direction vector n″:

(X′ − b1)/k4 = (Y′ − b2)/k5 = (Z′ − b3)/k6
9) calculating, along the theoretical projection normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′; dividing the points Nj into N groups by the value of D with a preset step length; selecting the group with the largest D values and more than m points as the group containing the trimming point; and averaging the coordinates of all three-dimensional projection points in that group as the measured coordinate of the trimming point to be measured.
In this embodiment the step length is 0.3 mm and m takes the value 3. The specific process is as follows: calculate, along the theoretical projection normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′, and assign the points Nj to groups from near to far according to

0.3n < D < 0.3(n + 1), n = 0, 1, 2, 3, …, N;

that is, distances from 0 to 0.3 form group 1, 0.3 to 0.6 group 2, 0.6 to 0.9 group 3, and so on until all three-dimensional projection points Nj are exhausted, giving N groups in total. The group of three-dimensional projection points lying farthest along the theoretical projection normal n′ and containing more than 3 points is then screened out, and the coordinates of that group are averaged as the final measured coordinate of the trimming point to be measured.
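The grouping-and-averaging rule just described can be sketched as follows, assuming the embodiment's step length of 0.3 and point-count threshold of 3; the function name and toy data are illustrative.

```python
import numpy as np

def pick_trim_point(pts, D, step=0.3, min_count=3):
    """Bin projection points by their distance D to line L' with the given
    step length, take the farthest bin containing more than `min_count`
    points, and return the mean coordinate of that bin."""
    bins = np.floor(np.asarray(D) / step).astype(int)  # group n: step*n <= D < step*(n+1)
    for n in sorted(set(bins), reverse=True):          # farthest group first
        members = np.asarray(pts)[bins == n]
        if len(members) > min_count:
            return members.mean(axis=0)
    return None                                        # no group is large enough

# Toy data: a dense cluster near D ~ 0.9 plus one stray far point
pts = np.array([[0.9, 0.0, 0.0]] * 4 + [[1.5, 0.0, 0.0]])
D = np.array([0.9, 0.91, 0.92, 0.93, 1.5])
print(pick_trim_point(pts, D))   # → [0.9 0.  0. ]
```

The stray point at D = 1.5 sits alone in its bin and is rejected, so the cluster's average is returned — this is how the farthest sufficiently populated group wins.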
10) taking the difference between the measured coordinate obtained in step 9) and the coordinate of the theoretical point; when the difference is smaller than a tolerance threshold the trimming point is judged qualified, otherwise unqualified.
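Step 10) reduces to a tolerance comparison. The sketch below interprets the difference value as the Euclidean distance between measured and theoretical coordinates; the patent does not specify which norm is used, and the tolerance value here is illustrative.

```python
import numpy as np

def check_trim_point(measured, theoretical, tol=0.1):
    """Qualified if the deviation from the theoretical point is within tolerance."""
    deviation = float(np.linalg.norm(np.asarray(measured) - np.asarray(theoretical)))
    return deviation < tol

print(check_trim_point([10.02, 5.01, 0.0], [10.0, 5.0, 0.0], tol=0.1))  # → True
```

A per-axis comparison would be an equally plausible reading of the claim; only the comparison against the tolerance threshold is prescribed.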
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable others skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications thereof. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (8)

1. A binocular-vision-based trimming point calculation method, characterized by comprising the following steps:
1) collecting left and right images of the trimming characteristics by using a binocular vision measuring system;
2) performing image preprocessing on the left and right images and extracting the two-dimensional point coordinates Nt, t = 1, 2, …, s, of the trimming contour edge in each image, where s is the number of two-dimensional points on the trimming contour edge;
3) converting the trimming contour points in the left image and the right image into three-dimensional space by using an epipolar matching method to obtain three-dimensional point coordinate data of the trimming contour;
4) acquiring a three-dimensional point cloud of the plane containing the measured trimming edge with a three-dimensional scanner, fitting a plane to the point cloud to obtain a fitting plane I, and projecting the three-dimensional point coordinate data of the trimming contour obtained in step 3) onto the fitting plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour and m ≥ 2;
5) projecting the coordinate P and the theoretical normal n of the theoretical point of the edge cutting point to be detected to the fitting plane I to obtain a theoretical projection point P 'and a theoretical projection normal n';
6) establishing a straight line L from the theoretical projection point P′(b1, b2, b3) and the theoretical projection normal n′(k1, k2, k3):

(X − b1)/k1 = (Y − b2)/k2 = (Z − b3)/k3

where (X, Y, Z) is any point on the straight line L;

7) calculating the distance from each three-dimensional projection point Nj to the straight line L; when the distance is smaller than a threshold T, with T = 0.2–0.7 mm, recording that projection point to obtain the point set Nj′, j = 1, 2, …, m′, where m′ is the number of recorded three-dimensional projection points;

8) cross-multiplying the normal of the fitting plane I with the theoretical projection normal n′ to obtain a direction vector n″(k4, k5, k6), and establishing a straight line L′ from the theoretical projection point P′ and the direction vector n″:

(X′ − b1)/k4 = (Y′ − b2)/k5 = (Z′ − b3)/k6

where (X′, Y′, Z′) is any point on the straight line L′;

9) calculating, along the theoretical projection normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′; dividing the points Nj into N groups by the value of D with a preset step length; selecting the group with the largest D values and more than m points as the group containing the trimming point; and averaging the coordinates of all three-dimensional projection points in that group as the measured coordinate of the trimming point to be measured.
2. The binocular vision-based cut edge point calculation method of claim 1, wherein: the left and right images are grey-scale images.
3. The binocular vision-based cut edge point calculation method of claim 1, wherein: the image preprocessing comprises Gaussian filtering, binarization image and edge detection.
4. The binocular vision-based cut edge point calculation method of claim 1, wherein: and the coordinate P and the theoretical normal n of the theoretical point are digital-analog standard data values of the edge cutting point to be detected, which are pre-stored in the system.
5. The binocular vision-based cut edge point calculation method of claim 1, wherein: the number of the edge cutting points to be detected is one or more.
6. The binocular vision-based cut edge point calculation method of claim 1, wherein: and 10) making a difference value between the measured coordinate obtained in the step 9) and the coordinate of the theoretical point, judging that the trimming point is qualified when the difference value is smaller than a tolerance threshold, and otherwise, judging that the trimming point is unqualified.
7. The binocular vision-based cut edge point calculation method of claim 1, wherein: the fitting plane I is fitted by a least square method.
8. The binocular vision-based cut edge point calculation method of claim 1, wherein: the preset step length is 0.2-0.5 mm.
CN201910311925.0A 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision Active CN110021027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910311925.0A CN110021027B (en) 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision


Publications (2)

Publication Number Publication Date
CN110021027A CN110021027A (en) 2019-07-16
CN110021027B (en) 2021-03-16

Family

ID=67191768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910311925.0A Active CN110021027B (en) 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision

Country Status (1)

Country Link
CN (1) CN110021027B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112710313A (en) * 2020-12-31 2021-04-27 广州极飞科技股份有限公司 Overlay path generation method and device, electronic equipment and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
JPH09259283A (en) * 1996-03-22 1997-10-03 Matsushita Electric Ind Co Ltd Method and device for calculating projection area of three-dimensional model
CN108615699A (en) * 2018-05-29 2018-10-02 深圳信息职业技术学院 A kind of wafer alignment system and method and the optical imaging device for wafer alignment
CN109248963A (en) * 2018-11-27 2019-01-22 东莞市骏毅机电科技有限公司 A kind of transmitting drawing die trimming structure of power battery shell

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20080122799A1 (en) * 2001-02-22 2008-05-29 Pryor Timothy R Human interfaces for vehicles, homes, and other applications


Also Published As

Publication number Publication date
CN110021027A (en) 2019-07-16

Similar Documents

Publication Publication Date Title
CN102589435B (en) Efficient and accurate detection method of laser beam center under noise environment
CN111738985B (en) Visual detection method and system for weld joint contour
CN112614098B (en) Blank positioning and machining allowance analysis method based on augmented reality
CN111062940B (en) Screw positioning and identifying method based on machine vision
CN103292701A (en) Machine-vision-based online dimensional measurement method of precise instrument
CN107133565B (en) Line laser-based laser engraving line feature extraction method
CN107462587B (en) Precise visual inspection system and method for concave-convex mark defects of flexible IC substrate
CN111047588A (en) Imaging measurement method for size of shaft type small part
CN109483887B (en) Online detection method for contour accuracy of forming layer in selective laser melting process
CN110146017B (en) Industrial robot repeated positioning precision measuring method
CN112729112B (en) Engine cylinder bore diameter and hole site detection method based on robot vision
CN115482195B (en) Train part deformation detection method based on three-dimensional point cloud
CN105184792B (en) A kind of saw blade wear extent On-line Measuring Method
CN108662989A (en) A kind of car light profile quality determining method based on 3 D laser scanning
CN114037675A (en) Airplane sample plate defect detection method and device
CN109990711B (en) Appearance quality detection method for punched nickel-plated steel strip
CN115546125A (en) Method for error detection and track deviation correction of additive manufacturing cladding layer based on point cloud information
CN113450379A (en) Method and device for extracting and analyzing profile line of section of special-shaped workpiece
CN116402792A (en) Space hole site butt joint method based on three-dimensional point cloud
CN110021027B (en) Edge cutting point calculation method based on binocular vision
CN108627103A (en) A kind of 2D laser measurement methods of parts height dimension
CN116880353A (en) Machine tool setting method based on two-point gap
CN116681912A (en) Rail gauge detection method and device for railway turnout
CN109902694A (en) A kind of extracting method of square hole feature
CN114111576B (en) Aircraft skin gap surface difference detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee after: Yi Si Si (Hangzhou) Technology Co.,Ltd.

Address before: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee before: ISVISION (HANGZHOU) TECHNOLOGY Co.,Ltd.