CN110021027A - Trimming point calculation method based on binocular vision - Google Patents

Trimming point calculation method based on binocular vision

Info

Publication number
CN110021027A
Authority
CN
China
Prior art keywords
point
trimming
coordinate
three-dimensional projection
binocular vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910311925.0A
Other languages
Chinese (zh)
Other versions
CN110021027B (en)
Inventor
魏志博
邢威
张楠楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yi Si Si Hangzhou Technology Co ltd
Original Assignee
Isvision Hangzhou Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isvision Hangzhou Technology Co Ltd filed Critical Isvision Hangzhou Technology Co Ltd
Priority to CN201910311925.0A priority Critical patent/CN110021027B/en
Publication of CN110021027A publication Critical patent/CN110021027A/en
Application granted granted Critical
Publication of CN110021027B publication Critical patent/CN110021027B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/04 Context-preserving transformations, e.g. by using an importance map
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a trimming point calculation method based on binocular vision, comprising the following steps: binocular vision measurement obtains the three-dimensional coordinates of the trimming contour; plane fitting yields a fitted plane I, and the trimming three-dimensional coordinates are projected onto the fitted plane I to obtain projection points Nj; the theoretical coordinate and theoretical normal of the trimming point are projected onto plane I and a straight line L is established; the distance from each Nj to the straight line L is calculated to obtain the point set Nj′; a cross product yields the vector n″, from which a straight line L′ is established; along the theoretical projected normal n′, the distance D from each Nj to L′ is calculated, and the projection points Nj are divided into N groups according to D with a preset step length; the group with the largest D value whose point count exceeds m is selected as the group containing the trimming point; the coordinates of all three-dimensional projection points in that group are averaged to obtain the measured coordinate of the trimming point. The measured coordinate accuracy of this method reaches the order of tens of micrometers, and the measurement time is at the millisecond level, which meets the inspection requirements of precision manufacturing industries such as the automobile industry.

Description

Trimming point calculation method based on binocular vision
Technical field
The present invention relates to the field of feature detection, and in particular to a trimming point calculation method based on binocular vision.
Background technique
With the development of automation technology and the growth of productivity, manufacturing quality control has become increasingly strict. Approaches to improving manufacturing quality fall into two classes: 1. error avoidance, which uses high-precision process equipment, advanced control structures and highly reliable sensor detection mechanisms to fundamentally improve the quality of machined parts, but is complicated to operate and expensive, and no complete real-time solution exists at present; 2. error compensation, which combines modern measurement technology and new statistical tolerancing methods with existing engineering-control and statistical-control methods, studies the distribution characteristics of product dimensions, analyzes the machining state of dies and fixtures, and then controls and adjusts them accordingly; this approach has received more and more attention, and the key technology for implementing it is measurement, which obtains valid data about the measured object and provides the basis for the subsequent control scheme.
Among the many features to be inspected, trimming points are an important category. Taking an automobile body-in-white as an example, whether the positions of the trimming points are accurate directly determines the straightness of the workpiece edge and is the basis for judging whether later installation and configuration design meet requirements. The traditional checking fixture has been a widely used detection tool since the 1970s; it is low in cost, but detection is slow and accuracy is low (about 0.1 mm), which cannot reach the detection accuracy required for precision instrument manufacturing. Another detection method uses a coordinate measuring machine (CMM) to inspect trimming points; although its accuracy is high, measurement is very slow, only a limited number of points on a spatial edge can be measured, the geometric characteristics of the spatial edge cannot be described comprehensively, the measurement process requires manual intervention, and the measurement results require probe error and probe radius compensation.
Summary of the invention
To solve the above problems, the present invention proposes a trimming point calculation method based on binocular vision, which effectively improves measurement accuracy and efficiency; the trimming point measurement accuracy of this method can reach the order of tens of micrometers.
A trimming point calculation method based on binocular vision comprises the following steps:
1) acquiring left and right images of the trimming feature with a binocular vision measurement system;
2) preprocessing the left and right images and extracting the two-dimensional point coordinates Nt of the trimming contour edge in the left and right images, t = 1, 2, …, s, where s is the number of two-dimensional points on the trimming contour edge;
3) converting the trimming contour points in the left and right images into three-dimensional space by epipolar matching to obtain the three-dimensional point coordinate data of the trimming contour;
4) acquiring the three-dimensional point cloud of the plane where the measured trimming is located with a three-dimensional scanner, performing point-cloud plane fitting to obtain a fitted plane I, and then projecting the three-dimensional point coordinate data of the trimming contour obtained in step 3) onto the fitted plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour;
5) projecting the coordinate P of the theoretical point of the trimming point to be measured and its theoretical normal n onto the fitted plane I to obtain a theoretical projection point P′ and a theoretical projected normal n′;
6) establishing a straight line L from the theoretical projection point P′(b1, b2, b3) and the theoretical projected normal n′(k1, k2, k3):
(X - b1)/k1 = (Y - b2)/k2 = (Z - b3)/k3
where (X, Y, Z) is any point on the straight line L;
7) calculating the distance from each three-dimensional projection point Nj to the straight line L; when the distance is less than T, T = 0.2–0.7 mm, recording that projection point to obtain the point set Nj′, j = 1, 2, …, m′, where m′ is the number of three-dimensional projection points recorded;
8) taking the cross product of the normal of the fitted plane I and the theoretical projected normal n′ to obtain a direction vector n″(k4, k5, k6), and establishing a straight line L′ from the theoretical projection point P′ and the direction vector n″:
(X′ - b1)/k4 = (Y′ - b2)/k5 = (Z′ - b3)/k6
where (X′, Y′, Z′) is any point on the straight line L′;
9) along the theoretical projected normal n′, calculating in turn the distance D from each three-dimensional projection point Nj to the straight line L′ and dividing the projection points Nj into N groups according to D with a preset step length; selecting the group with the largest D value whose point count is greater than m as the group containing the trimming point; and averaging the coordinates of all three-dimensional projection points in that group to obtain the measured coordinate of the trimming point to be measured.
Further, the left and right images are grayscale images.
Further, the image preprocessing includes Gaussian filtering, image binarization and edge detection.
Further, the coordinate P of the theoretical point and the theoretical normal n are nominal data values from the digital (CAD) model of the trimming point to be measured, pre-stored in the system.
Further, there may be one or more trimming points to be measured.
Further, the method includes a step 10) of taking the difference between the measured coordinate obtained in step 9) and the coordinate of the theoretical point; when the difference is less than a tolerance threshold, the trimming point is judged to be qualified, otherwise the trimming point is judged to be unqualified.
Further, the fitted plane I is obtained by least-squares fitting.
Further, the preset step length is 0.2–0.5 mm.
Further, m ≥ 2.
The method provided by the present application effectively avoids the drawback that the trimming point cloud obtained by binocular vision alone is of poor quality: the three-dimensional point cloud of the plane where the measured trimming is located is acquired with a three-dimensional scanner, point-cloud plane fitting is performed, the trimming point set to be measured and the theoretical coordinate point and normal vector are projected onto the fitted plane, and the theoretical values of the point to be measured are used as guidance to screen and compute the optimal measured coordinate of the trimming point, which is then used to inspect trimming quality. The measured coordinate accuracy of this method reaches the order of tens of micrometers, and the measurement time is at the millisecond level, which meets the inspection requirements of precision manufacturing industries such as the automobile industry.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the method of the present invention.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the drawings and embodiments.
A trimming point calculation method based on binocular vision comprises the following steps:
1) Left and right grayscale images of the trimming feature are acquired with a binocular vision measurement system;
2) Image preprocessing such as Gaussian filtering, image binarization and edge detection is applied to the left and right images, and the two-dimensional point coordinates Nt of the trimming contour edge are extracted from the left and right images, t = 1, 2, …, s, where s is the number of two-dimensional points on the trimming contour edge;
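For illustration only, a minimal OpenCV/Python sketch of this preprocessing step is given below; the Gaussian kernel size, the Otsu binarization and the Canny thresholds are assumed choices, since the patent does not specify these parameters.

```python
import cv2
import numpy as np

def extract_contour_points(gray_image):
    """Gaussian filtering, binarization and edge detection on one grayscale image;
    returns the 2D pixel coordinates Nt of the trimming contour edge."""
    blurred = cv2.GaussianBlur(gray_image, (5, 5), 0)                # assumed 5x5 kernel
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # assumed Otsu binarization
    edges = cv2.Canny(binary, 50, 150)                               # assumed Canny limits
    ys, xs = np.nonzero(edges)                                       # edge pixel locations
    return np.stack([xs, ys], axis=1).astype(np.float64)             # s x 2 array of (u, v)
```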
3) The trimming contour points in the left and right images are converted into three-dimensional space by epipolar matching, yielding the three-dimensional point coordinate data of the trimming contour;
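The patent does not detail the epipolar matching itself; the hedged sketch below assumes the left/right correspondences and the calibrated 3x4 projection matrices are already available, and shows only the triangulation of matched contour points into 3D with OpenCV.

```python
import cv2
import numpy as np

def triangulate_contour(pts_left, pts_right, P_left, P_right):
    """pts_left / pts_right: matched 2D contour points (k x 2);
    P_left / P_right: 3x4 camera projection matrices from stereo calibration.
    Returns a k x 3 array of trimming-contour points in 3D."""
    pts_l = pts_left.T.astype(np.float64)                            # 2 x k
    pts_r = pts_right.T.astype(np.float64)
    pts_4d = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)    # 4 x k homogeneous
    return (pts_4d[:3] / pts_4d[3]).T                                # de-homogenize -> k x 3
```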
4) The three-dimensional point cloud of the plane where the measured trimming is located is acquired with a three-dimensional scanner, point-cloud plane fitting is performed by the least-squares method to obtain the fitted plane I, and the three-dimensional point coordinate data of the trimming contour obtained in step 3) are projected onto the fitted plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour;
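A minimal NumPy sketch of the plane fitting and projection is given below, assuming the scanner point cloud is available as an n x 3 array; the SVD formulation shown is one common way to realize a least-squares plane fit and is not necessarily the inventors' exact implementation.

```python
import numpy as np

def fit_plane(cloud):
    """Least-squares plane through an n x 3 point cloud.
    Returns (centroid c, unit normal nI) so the plane is {x : nI . (x - c) = 0}."""
    c = cloud.mean(axis=0)
    _, _, vt = np.linalg.svd(cloud - c)
    nI = vt[-1]                                   # direction of smallest variance
    return c, nI / np.linalg.norm(nI)

def project_to_plane(points, c, nI):
    """Orthogonal projection of k x 3 points onto plane (c, nI) -> projection points Nj."""
    d = (points - c) @ nI                         # signed distances to the plane
    return points - np.outer(d, nI)
```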
5) The coordinate P of the theoretical point of the trimming point to be measured and its theoretical normal n are projected onto the fitted plane I to obtain the theoretical projection point P′ and the theoretical projected normal n′;
In this embodiment, the coordinate P and the theoretical normal n of the theoretical point are nominal data values from the digital (CAD) model of the trimming point to be measured, pre-stored in the system.
There is one trimming point to be measured;
6) A straight line L is established from the theoretical projection point P′(b1, b2, b3) and the theoretical projected normal n′(k1, k2, k3): (X - b1)/k1 = (Y - b2)/k2 = (Z - b3)/k3, where (X, Y, Z) is any point on the straight line L;
7) The distance from each three-dimensional projection point Nj to the straight line L is calculated; when the distance is less than T, T = 0.2–0.7 mm, the projection point is recorded, giving the point set Nj′, j = 1, 2, …, m′, where m′ is the number of three-dimensional projection points recorded;
8) The cross product of the normal of the fitted plane I and the theoretical projected normal n′ gives the direction vector n″(k4, k5, k6), and a straight line L′ is established from the theoretical projection point P′ and the direction vector n″: (X′ - b1)/k4 = (Y′ - b2)/k5 = (Z′ - b3)/k6, where (X′, Y′, Z′) is any point on the straight line L′;
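The sketch below illustrates the geometry of steps 5) to 8) in NumPy: the theoretical point P and normal n are projected onto the fitted plane, the projection points Nj are screened by their distance to the line L, and the direction n″ of the line L′ is obtained by a cross product. The value T = 0.5 mm is an assumed choice within the 0.2–0.7 mm range stated above, and the function names are placeholders, not part of the patent.

```python
import numpy as np

def point_line_distance(points, p0, direction):
    """Distance from each k x 3 point to the line through p0 with the given direction."""
    d = direction / np.linalg.norm(direction)
    v = points - p0
    return np.linalg.norm(v - np.outer(v @ d, d), axis=1)

def screen_by_line_L(Nj, P, n, c, nI, T=0.5):
    """Steps 5)-8): returns the filtered set Nj', the theoretical projection point P',
    the theoretical projected normal n' and the cross-product direction n''.
    nI is assumed to be a unit plane normal (as returned by fit_plane above)."""
    P_proj = P - ((P - c) @ nI) * nI              # theoretical projection point P'
    n_proj = n - (n @ nI) * nI                    # theoretical projected normal n'
    n_proj /= np.linalg.norm(n_proj)
    keep = point_line_distance(Nj, P_proj, n_proj) < T   # distance to line L less than T
    Nj_prime = Nj[keep]
    n_dd = np.cross(nI, n_proj)                   # direction vector n'' of line L'
    return Nj_prime, P_proj, n_proj, n_dd
```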
9) Along the theoretical projected normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′ is calculated in turn, and the projection points Nj are divided into N groups according to D with a preset step length; the group with the largest D value whose point count is greater than m is selected as the group containing the trimming point; the coordinates of all three-dimensional projection points in that group are averaged to obtain the measured coordinate of the trimming point to be measured.
In this embodiment, the step length is 0.3 mm and m = 3. The specific process is as follows: along the theoretical projected normal n′, the distance D from each three-dimensional projection point Nj to the straight line L′ is calculated in turn, and the projection points Nj are divided into N groups from near to far according to the following rule:
0.3n < D < 0.3(n + 1), n = 0, 1, 2, 3, …, N;
That is, 0–0.3 is group 1, 0.3–0.6 is group 2, 0.6–0.9 is group 3, and so on, until all three-dimensional projection points Nj are exhausted, giving N groups in total;
The group whose point count is greater than 3 and which lies farthest along the theoretical projected normal n′ is selected; the coordinates of the three-dimensional projection points in this group are averaged to give the final measured coordinate of the trimming point to be measured.
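As a hedged illustration of step 9), the NumPy sketch below bins the projection points by their distance D to the line L′ with the 0.3 mm step and the threshold m = 3 used in this embodiment, keeps the farthest bin containing more than m points, and averages it; variable names are placeholders, and the half-open binning step·n ≤ D < step·(n + 1) approximates the 0.3n < D < 0.3(n + 1) rule above.

```python
import numpy as np

def measured_trimming_point(Nj, P_proj, n_dd, step=0.3, m=3):
    """Step 9): group the projection points Nj by their distance D to the line L'
    (through P' with direction n''), pick the farthest group with more than m
    points, and average its coordinates."""
    d = n_dd / np.linalg.norm(n_dd)
    v = Nj - P_proj
    D = np.linalg.norm(v - np.outer(v @ d, d), axis=1)   # distance of each Nj to L'
    groups = (D // step).astype(int)                     # group index n: step*n <= D < step*(n+1)
    for g in sorted(np.unique(groups), reverse=True):    # farthest group first
        members = Nj[groups == g]
        if len(members) > m:
            return members.mean(axis=0)                  # measured coordinate of the trimming point
    return None                                          # no group satisfied the count threshold
```

Combined with the screening sketch above, a call such as measured_trimming_point(Nj, P_proj, n_dd) would return the coordinate that step 10) then compares against the nominal value.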
Step 10) is also included: the difference between the measured coordinate obtained in step 9) and the coordinate of the theoretical point is taken; when the difference is less than a tolerance threshold, the trimming point is judged to be qualified, otherwise the trimming point is judged to be unqualified.
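A minimal sketch of this qualification check follows; the patent does not state whether the difference is taken per axis or as a Euclidean distance, so the Euclidean norm and the tolerance value used here are assumptions.

```python
import numpy as np

def judge_trimming_point(measured, nominal, tol=0.2):
    """Step 10): compare the measured coordinate with the nominal coordinate.
    tol is an assumed tolerance threshold in mm; returns True if qualified."""
    deviation = np.linalg.norm(np.asarray(measured) - np.asarray(nominal))
    return deviation < tol
```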
The foregoing description of specific exemplary embodiments of the present invention is presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teachings. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and their practical application, so as to enable others skilled in the art to make and use various exemplary embodiments of the invention as well as various alternatives and modifications thereof. The scope of the invention is intended to be defined by the appended claims and their equivalents.

Claims (9)

1. A trimming point calculation method based on binocular vision, characterized by comprising the following steps:
1) acquiring left and right images of the trimming feature with a binocular vision measurement system;
2) preprocessing the left and right images and extracting the two-dimensional point coordinates Nt of the trimming contour edge in the left and right images, t = 1, 2, …, s, where s is the number of two-dimensional points on the trimming contour edge;
3) converting the trimming contour points in the left and right images into three-dimensional space by epipolar matching to obtain three-dimensional point coordinate data of the trimming contour;
4) acquiring the three-dimensional point cloud of the plane where the measured trimming is located with a three-dimensional scanner, performing point-cloud plane fitting to obtain a fitted plane I, and projecting the three-dimensional point coordinate data of the trimming contour obtained in step 3) onto the fitted plane I to obtain the three-dimensional projection points of the trimming contour, denoted Nj, j = 1, 2, …, m, where m is the number of three-dimensional projection points of the trimming contour;
5) projecting the coordinate P of the theoretical point of the trimming point to be measured and its theoretical normal n onto the fitted plane I to obtain a theoretical projection point P′ and a theoretical projected normal n′;
6) establishing a straight line L from the theoretical projection point P′(b1, b2, b3) and the theoretical projected normal n′(k1, k2, k3): (X - b1)/k1 = (Y - b2)/k2 = (Z - b3)/k3, where (X, Y, Z) is any point on the straight line L;
7) calculating the distance from each three-dimensional projection point Nj to the straight line L; when the distance is less than T, T = 0.2–0.7 mm, recording that projection point to obtain the point set Nj′, j = 1, 2, …, m′, where m′ is the number of three-dimensional projection points recorded;
8) taking the cross product of the normal of the fitted plane I and the theoretical projected normal n′ to obtain a direction vector n″(k4, k5, k6), and establishing a straight line L′ from the theoretical projection point P′ and the direction vector n″: (X′ - b1)/k4 = (Y′ - b2)/k5 = (Z′ - b3)/k6, where (X′, Y′, Z′) is any point on the straight line L′;
9) along the theoretical projected normal n′, calculating in turn the distance D from each three-dimensional projection point Nj to the straight line L′ and dividing the projection points Nj into N groups according to D with a preset step length; selecting the group with the largest D value whose point count is greater than m as the group containing the trimming point; and averaging the coordinates of all three-dimensional projection points in that group as the measured coordinate of the trimming point to be measured.
2. The trimming point calculation method based on binocular vision according to claim 1, characterized in that the left and right images are grayscale images.
3. The trimming point calculation method based on binocular vision according to claim 1, characterized in that the image preprocessing includes Gaussian filtering, image binarization and edge detection.
4. The trimming point calculation method based on binocular vision according to claim 1, characterized in that the coordinate P of the theoretical point and the theoretical normal n are nominal data values from the digital model of the trimming point to be measured, pre-stored in the system.
5. The trimming point calculation method based on binocular vision according to claim 1, characterized in that there are one or more trimming points to be measured.
6. The trimming point calculation method based on binocular vision according to claim 1, characterized by further comprising a step 10) of taking the difference between the measured coordinate obtained in step 9) and the coordinate of the theoretical point; when the difference is less than a tolerance threshold, the trimming point is judged to be qualified, otherwise the trimming point is judged to be unqualified.
7. The trimming point calculation method based on binocular vision according to claim 1, characterized in that the fitted plane I is obtained by least-squares fitting.
8. The trimming point calculation method based on binocular vision according to claim 1, characterized in that the preset step length is 0.2–0.5 mm.
9. The trimming point calculation method based on binocular vision according to claim 1, characterized in that m ≥ 2.
CN201910311925.0A 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision Active CN110021027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910311925.0A CN110021027B (en) 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910311925.0A CN110021027B (en) 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision

Publications (2)

Publication Number Publication Date
CN110021027A true CN110021027A (en) 2019-07-16
CN110021027B CN110021027B (en) 2021-03-16

Family

ID=67191768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910311925.0A Active CN110021027B (en) 2019-04-18 2019-04-18 Edge cutting point calculation method based on binocular vision

Country Status (1)

Country Link
CN (1) CN110021027B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112710313A (en) * 2020-12-31 2021-04-27 广州极飞科技股份有限公司 Overlay path generation method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09259283A (en) * 1996-03-22 1997-10-03 Matsushita Electric Ind Co Ltd Method and device for calculating projection area of three-dimensional model
US20110032203A1 (en) * 2000-02-22 2011-02-10 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
CN108615699A (en) * 2018-05-29 2018-10-02 深圳信息职业技术学院 A kind of wafer alignment system and method and the optical imaging device for wafer alignment
CN109248963A (en) * 2018-11-27 2019-01-22 东莞市骏毅机电科技有限公司 A kind of transmitting drawing die trimming structure of power battery shell

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09259283A (en) * 1996-03-22 1997-10-03 Matsushita Electric Ind Co Ltd Method and device for calculating projection area of three-dimensional model
US20110032203A1 (en) * 2000-02-22 2011-02-10 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
CN108615699A (en) * 2018-05-29 2018-10-02 深圳信息职业技术学院 A kind of wafer alignment system and method and the optical imaging device for wafer alignment
CN109248963A (en) * 2018-11-27 2019-01-22 东莞市骏毅机电科技有限公司 A kind of transmitting drawing die trimming structure of power battery shell

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112710313A (en) * 2020-12-31 2021-04-27 广州极飞科技股份有限公司 Overlay path generation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN110021027B (en) 2021-03-16

Similar Documents

Publication Publication Date Title
CN106863014B (en) A kind of five-axle number control machine tool linear axis geometric error detection method
CN107186548B (en) A kind of five-axle number control machine tool rotating shaft geometric error detection method
CN102519400B (en) Large slenderness ratio shaft part straightness error detection method based on machine vision
CN102800096B (en) Robustness estimation algorithm of camera parameter
CN103292701A (en) Machine-vision-based online dimensional measurement method of precise instrument
CN107044821A (en) A kind of measuring method and system of contactless tubing object
CN109978938A (en) A kind of pillow spring detection method based on machine vision
CN112614098A (en) Blank positioning and machining allowance analysis method based on augmented reality
CN111047588A (en) Imaging measurement method for size of shaft type small part
CN103759672A (en) Vision measurement method for ice cream stick plane contour dimensions
CN109990711B (en) Appearance quality detection method for punched nickel-plated steel strip
CN115060452B (en) Panoramic error detection method applied to large wind tunnel spray pipe molded surface
CN112734858A (en) Binocular calibration precision online detection method and device
CN108627103A (en) A kind of 2D laser measurement methods of parts height dimension
CN110940271A (en) Method for detecting, monitoring and intelligently carrying and installing large-scale industrial manufacturing of ships and the like based on space three-dimensional measurement and control network
CN108844469B (en) Method and system for testing workpiece step height based on laser
CN110021027A (en) A kind of trimming point calculating method based on binocular vision
CN109115127A (en) A kind of sub-pix peak point extraction algorithm based on Bezier
CN106645168A (en) Detection method for surface concave-convex defect of boom cylinder of crane
CN114111576B (en) Aircraft skin gap surface difference detection method
CN113743483B (en) Road point cloud error scene analysis method based on spatial plane offset analysis model
CN113155648B (en) Material micro-deformation measurement method and system based on impact test
CN115685164A (en) Three-dimensional laser imager working parameter testing system and method
CN112414316B (en) Strain gauge sensitive grid size parameter measuring method
CN109902694A (en) A kind of extracting method of square hole feature

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee after: Yi Si Si (Hangzhou) Technology Co.,Ltd.

Address before: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee before: ISVISION (HANGZHOU) TECHNOLOGY Co.,Ltd.