CN116563292A - Measurement method, detection device, detection system, and storage medium - Google Patents


Info

Publication number
CN116563292A
CN116563292A (Application No. CN202310841520.4A)
Authority
CN
China
Prior art keywords
image
points
clustering
detected
physical coordinates
Prior art date
Legal status
Granted
Application number
CN202310841520.4A
Other languages
Chinese (zh)
Other versions
CN116563292B (en)
Inventor
许沈榕
朱云龙
郑军
Current Assignee
Matrixtime Robotics Shanghai Co ltd
Original Assignee
Jushi Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Jushi Technology Shenzhen Co ltd
Priority to CN202310841520.4A
Publication of CN116563292A
Application granted
Publication of CN116563292B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30148Semiconductor; IC; Wafer

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a measurement method, a detection apparatus, a detection system, and a storage medium. A plurality of images of an object to be detected are obtained by an image acquisition device photographing the object at different positions, and the physical coordinates corresponding to a plurality of specified pixel points of the area where the object is located are calculated for each image. The physical coordinates corresponding to all the specified pixel points are then integrated by clustering to obtain the physical coordinates of a plurality of clustered actual points, which are used for determining actual size information related to the object. By determining the physical coordinates corresponding to the specified pixel points in each object image and integrating them by clustering into clustered actual points, the invention ensures that the clustered actual points determined from the object images under multiple fields of view are accurate, which in turn ensures the accuracy of the determined size information related to the object and effectively avoids false detection.

Description

Measurement method, detection device, detection system, and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a measurement method, a detection apparatus, a detection system, and a storage medium.
Background
Electronic components today are highly integrated and refined, and correspondingly higher demands are placed on the speed and efficiency of inspecting electronic component products and their integrated products; Automated Optical Inspection (AOI) technology emerged in response. Its greatest advantages are saving labor, reducing cost, improving production efficiency, unifying inspection standards and eliminating interference from human factors, thereby ensuring the stability, repeatability and accuracy of inspection results, discovering product defects in time, and guaranteeing shipment quality.
Taking three-dimensional measurement of a wafer as an example, AOI technology is applied to detect the height of solder balls on the wafer surface in order to judge whether the wafer meets manufacturing standards. In the prior art, detection is mainly performed on a wafer-surface image under a single field of view; however, the detection accuracy is strongly affected by pixel size and sub-pixel extraction, so the recovered physical positions of the solder balls deviate considerably from their actual positions, the calculated solder-ball heights disagree with the actual heights, and false detection occurs.
Disclosure of Invention
The invention aims to provide a measuring method, a detecting device, a detecting system and a storage medium, so as to solve the problems existing in the prior art.
Embodiments of the invention may be implemented as follows:
in a first aspect, the present invention provides a measurement method comprising:
obtaining a plurality of images of the object to be detected, wherein the images of the object to be detected are obtained by shooting the object to be detected at different positions by image acquisition equipment;
obtaining physical coordinates corresponding to a plurality of specified pixel points of an area where the object to be detected is located in each object image;
clustering and integrating the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected; and the physical coordinates of the plurality of clustering actual points are used for determining the actual size information related to the object to be detected.
Optionally, the step of obtaining physical coordinates corresponding to a plurality of specified pixel points of the region where the object to be detected is located in each image of the object to be detected includes:
acquiring the pixel position of an optical center of the image acquisition equipment in an imaging area;
determining the pixel position and the height value of each specified pixel point in each object image to be detected;
and calculating the physical coordinates corresponding to the specified pixel points by using the equipment physical coordinates corresponding to each image of the object to be detected, the image object conversion rule of the image acquisition equipment in the imaging area, the pixel positions of the specified pixel points, and the height values of the specified pixel points in the image of the object to be detected.
Optionally, the physical coordinates of the device include an X-axis coordinate value of the device, a Y-axis coordinate value of the device, and a Z-axis coordinate value of the device;
the step of calculating the physical coordinates corresponding to the specified pixel point by using the physical coordinates of the device corresponding to the image of the object to be detected, the image conversion rule of the image acquisition device in the imaging area, the pixel position of the specified pixel point and the height value of the specified pixel point in the image corresponding to the object to be detected, includes:
acquiring a transverse coordinate value accumulated value and a longitudinal coordinate value accumulated value between the specified pixel point and the optical center based on the pixel position of the optical center and the pixel position of the specified pixel point;
taking the sum of the X-axis coordinate value of the equipment and the accumulated value of the transverse coordinate value as a target X-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
taking the sum of the Y-axis coordinate value of the equipment and the accumulated value of the longitudinal coordinate value as a target Y-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
and taking the sum of the Z-axis coordinate value of the equipment and the height value of the specified pixel point in the corresponding image of the object to be detected as a target Z-axis coordinate value in the physical coordinates corresponding to the specified pixel point.
Optionally, the physical coordinate corresponding to the specified pixel point is an actual point; the step of integrating the physical coordinates corresponding to all the specified pixel points by clustering to obtain the physical coordinates of a plurality of clustering actual points of the object to be detected comprises the following steps:
clustering all the actual points based on a preset clustering pixel threshold value and physical coordinates corresponding to all the specified pixel points to obtain a plurality of clustering sets; said set of clusters comprising at least one of said actual points;
and calculating the mean value of coordinate values corresponding to all the actual points in the clustering set aiming at each clustering set to obtain the physical coordinates of the clustering actual points corresponding to the clustering set.
Optionally, the step of clustering all the actual points to obtain a plurality of cluster sets based on a preset cluster pixel threshold and physical coordinates corresponding to all the specified pixel points includes:
obtaining an image object conversion mean value in the imaging area;
determining a clustering actual threshold based on the preset clustering pixel threshold and the image object conversion mean;
and clustering all the actual points based on the clustering actual threshold value to obtain a plurality of clustering sets.
Optionally, the image object conversion rule is obtained by:
obtaining a plurality of calibration plate images of the checkerboard calibration plate at different placement positions;
and determining an image object conversion rule in the imaging area based on all the calibration plate images.
Optionally, the checkerboard calibration plate comprises $m \times n$ feature points, of which $(m-2) \times (n-2)$ feature points are feature corner points;
the step of determining an image object conversion rule in the imaging area based on all the calibration plate images comprises the following steps:
for any calibration plate image, extracting the pixel position of each feature point in the calibration plate image;
for any feature corner point in the calibration plate image, calculating the image object conversion proportion information at the feature corner point based on the actual spacing of the corner points and the pixel positions of the four feature points adjacent to the feature corner point;
traversing each feature corner point in the calibration plate image to obtain the image object conversion proportion information at each feature corner point in the calibration plate image; the image object conversion proportion information at a feature corner point comprises the lateral image object conversion proportion and the longitudinal image object conversion proportion at the feature corner point;
traversing each calibration plate image to obtain the image object conversion proportion information at each feature corner point in each calibration plate image;
performing surface fitting on the image object conversion proportion information of all feature corner points in all the calibration plate images to determine the image object conversion rule; the image object conversion rule is used for calculating the lateral image object conversion ratio and the longitudinal image object conversion ratio at any pixel point in the imaging region.
In a second aspect, the present invention provides a measurement device comprising:
the equipment measuring module is used for obtaining an image object conversion rule of the image acquisition equipment in the imaging area;
the image acquisition module is used for acquiring a plurality of images of the object to be detected, wherein the images of the object to be detected are obtained by shooting the object to be detected at different positions by the image acquisition equipment;
a data processing module for:
calculating physical coordinates corresponding to a plurality of specified pixel points of an area where the object to be detected is located in each object image, based on the image object conversion rule;
clustering and integrating the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected; and the physical coordinates of the plurality of clustering actual points are used for determining the actual size information related to the object to be detected.
In a third aspect, the present invention provides a detection apparatus comprising: a memory storing a software program that when executed by the detection device implements the measurement method according to any one of the preceding embodiments.
In a fourth aspect, the present invention provides a detection system comprising a communicatively coupled image acquisition device and a detection device as described in the previous embodiments.
In a fifth aspect, the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the measurement method of any one of the preceding embodiments.
Compared with the prior art, embodiments of the invention provide a measurement method, a detection apparatus, a detection system, and a storage medium, in which the physical coordinates corresponding to a plurality of specified pixel points of the area where the object to be detected is located are calculated for each object image; the physical coordinates corresponding to all the specified pixel points are then integrated by clustering to obtain the physical coordinates of a plurality of clustered actual points, which are used for determining actual size information related to the object. By obtaining the physical coordinates corresponding to the specified pixel points in each object image and integrating them by clustering into clustered actual points, the clustered actual points determined from the object images under multiple fields of view are accurate, which in turn ensures the accuracy of the determined size information related to the object and effectively avoids false detection.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting the scope; a person skilled in the art may obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of overlapping the same object in two view images.
Fig. 2 is a schematic flow chart of a measurement method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an image of a calibration plate according to an embodiment of the present invention.
Fig. 4 is a second flow chart of a measurement method according to an embodiment of the invention.
Fig. 5 is a third flow chart of a measurement method according to an embodiment of the invention.
Fig. 6 is a schematic diagram of image superposition to be tested according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a measurement device according to an embodiment of the present invention.
Fig. 8 is a schematic structural diagram of a detection device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
It should be noted that the features of the embodiments of the present invention may be combined with each other without conflict.
Electronic components today are highly integrated and refined, and correspondingly higher demands are placed on the speed and efficiency of inspecting electronic component products and their integrated products; Automated Optical Inspection (AOI) technology emerged in response. Its greatest advantages are saving labor, reducing cost, improving production efficiency, unifying inspection standards and eliminating interference from human factors, thereby ensuring the stability, repeatability and accuracy of inspection results, discovering product defects in time, and guaranteeing shipment quality.
Taking three-dimensional measurement of a wafer as an example, AOI technology is applied to detect the height of solder balls on the wafer surface in order to judge whether the wafer meets manufacturing standards. In the prior art, detection is mainly performed on a wafer-surface image under a single field of view; however, the detection accuracy is strongly affected by pixel size and sub-pixel extraction, so the recovered physical positions of the solder balls deviate considerably from their actual positions, the calculated solder-ball heights disagree with the actual heights, and false detection occurs.
Therefore, the inventors considered achieving measurement based on object images from multiple fields of view; however, this involves operations such as data merging, and several problems remain to be solved:
1. Pixel resolution (which can be understood as the physical length corresponding to a unit pixel) is mostly calculated based on a calibration board (checkerboard or dot matrix). Although the pixel resolution calculated with current mature camera models can fully correct an image, it still cannot be guaranteed to conform to a standard scale, and sub-pixel-level errors easily occur (a sub-pixel being a position between two physical pixels). Especially when facing objects of larger size, sub-pixel-level errors in the pixel resolution accumulate over the object's larger extent. Therefore, it must be considered how to calculate the pixel resolution accurately so as to minimize sub-pixel-level error.
2. Ideally, given two images with overlapping fields of view, the two image regions containing the same object should coincide exactly. In practice, however, due to factors such as the camera's internal pixel size, image distortion and mechanism motion, the same object may appear with different sizes in images of different fields of view, so that a certain deviation exists between the two image regions. For example, referring to fig. 1, ideally the annular regions in the two images would overlap completely, but in practice a certain deviation remains when they are overlaid. Therefore, it must be considered how to effectively fuse the data corresponding to multi-view images of the same object.
3. The area occupied by the same object A may differ between two view images; for example, the first view image contains all of object A while the second contains only part of it. The region of object A in the first view image then extends beyond the overlap region of the two fields of view. If the physical coordinate information of object A under the two views is merged directly, the data proportions become skewed (there are more data points in the overlap region), which ultimately causes size-calculation errors. Therefore, it must be considered how to integrate the data of the field-of-view overlap region.
Based on the discovery of the above technical problems, the inventors made creative efforts to propose the technical solutions below to solve or improve upon them. It should be noted that the drawbacks of the above prior-art solutions were identified by the inventors only after practice and careful study; therefore, the process of discovering the above problems, and the solutions that the embodiments of the present application propose hereinafter for these problems, should be regarded as the inventors' contribution to the present application, and not as matter already known to those skilled in the art.
Here, an application scenario of the embodiment of the present invention will be described.
In the automatic detection process of related devices or elements in the integrated circuit field, the measurement method provided by the embodiment of the invention can be used for determining the physical position information related to the measured object so as to further calculate the actual size information related to the measured object based on the physical position information, and comparing the actual size information with the standard size information to judge whether the devices or elements have size defects.
Based on the measurement method provided by the embodiments of the invention, the physical coordinates corresponding to each clustered actual point can be accurately determined using the multi-view images of the object to be measured, so that the actual size information related to the object can be determined from those physical coordinates to judge whether the object has a size defect.
The measuring method provided by the invention can be applied to the detection equipment, and the detection equipment can be integrated with the image acquisition equipment or can be independent computer equipment, and is not limited herein.
The measuring method provided by the invention is described in detail below through examples and with reference to the attached drawings.
Referring to fig. 2, fig. 2 is a flow chart of a measurement method according to an embodiment of the present invention. The execution subject of the method may be the above detection device, and the method includes the following steps S102 to S104:
S102, obtaining a plurality of images of the object to be detected.
In this embodiment, the plurality of images of the object to be measured are obtained by the image acquisition device photographing the object at different positions. The images may be transmitted in real time as they are acquired by the image acquisition device, or they may be acquired in advance and stored in the storage space of the detection device, from which the detection device can read them directly.
Optionally, the image acquisition device may be fixed on a motion axis by a slide rail, so that it can slide along the motion axis to different positions to photograph the object to be detected.
S103, obtaining physical coordinates corresponding to a plurality of specified pixel points of the region where the object to be detected is located in each object image.
In this embodiment, the pixel coordinates of a plurality of specified pixels in the region where the object to be detected is located in each object to be detected image may be converted into a uniform physical coordinate system by using an image object conversion rule of the image acquisition device in the imaging region, so as to obtain the physical coordinates corresponding to each specified pixel in each object to be detected image. The physical coordinate system is the world coordinate system, and the image object conversion rule can represent the conversion rule between the pixel distance and the actual physical distance in the image acquired by the image acquisition device in the imaging area.
Alternatively, the test object may be, but is not limited to: an integrated device such as a non-patterned wafer, a patterned wafer, an unpackaged die, a packaged Chip, a System on a Chip (SoC) Chip, or some electronic component.
Taking the package chip as an example, each designated pixel point may correspond to each solder ball on the package chip, or each designated pixel point may correspond to an element or device integrated on the package chip. This example is merely an example and is not intended to be limiting herein.
S104, clustering and integrating physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected.
In this embodiment, physical coordinates of specified pixel points in all the images of the object to be detected are fused by using a clustering method, so as to obtain physical coordinates of a plurality of clustered actual points. Wherein the physical coordinates of the actual points of the clusters may depend on the physical coordinates corresponding to the at least one specified pixel point. Thus, the physical coordinates of the plurality of clustered physical points can be used to determine physical size information associated with the test object.
Taking the packaged chip as an example, the actual spacing between each solder ball of the packaged chip, the height and width of each solder ball, and the like can be calculated. This example is merely an example and is not intended to be limiting herein.
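As a purely illustrative sketch (not part of the patent text), actual size information such as the solder-ball pitch can be derived directly from the physical coordinates of the clustered actual points; all values and names below are hypothetical:

```python
import numpy as np

# Physical coordinates (X, Y, Z) of two clustered actual points, e.g. two
# solder-ball centers recovered from the multi-view images (made-up values).
ball_a = np.array([12.350, 8.120, 0.413])
ball_b = np.array([12.850, 8.118, 0.409])

pitch = np.linalg.norm(ball_a[:2] - ball_b[:2])  # in-plane ball pitch
height_diff = abs(ball_a[2] - ball_b[2])         # coplanarity / height check
print(f"pitch = {pitch:.4f}, height difference = {height_diff:.4f}")
```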
According to the measuring method provided by the embodiment of the invention, the physical coordinates of the clustering actual points are obtained by determining the physical coordinates corresponding to the specified pixel points in each image of the object to be measured and then clustering and integrating, so that the physical coordinates of the clustering actual points determined by the image of the object to be measured under multiple fields of view are accurate, the accuracy of the determined information of the actual size related to the object to be measured is further ensured, and false detection can be effectively avoided.
In an optional implementation, in the process of calculating the physical coordinates corresponding to the specified pixel points, the image object conversion rule of the image acquisition device in the imaging area can be used to eliminate camera distortion. The image object conversion rule may be used to calculate the lateral image object conversion ratio and the longitudinal image object conversion ratio at any pixel point within the imaging region, which together characterize the pixel resolution at that pixel point.
A process for calculating the image object conversion rule using a checkerboard calibration plate is described here; it includes steps S1011 to S1012.
S1011, obtaining a plurality of calibration plate images of the checkerboard calibration plate at different placement positions.
In this embodiment, the plurality of calibration plate images are obtained by the image acquisition device, held at a fixed position, photographing the checkerboard calibration plate placed at different positions.
S1012, determining an image object conversion rule in the imaging area based on all the calibration plate images.
Alternatively, the checkerboard calibration plate may include $m \times n$ feature points, of which $(m-2) \times (n-2)$ are feature corner points. A feature point is a vertex of any lattice (black or white) in the checkerboard calibration plate, and a feature corner point is a common vertex of four lattices in the checkerboard calibration plate.
For example, referring to fig. 3, the checkerboard calibration plate shown in fig. 3 includes 77 feature points, of which 45 are feature corner points (the common vertices of four lattices). That is, in fig. 3, feature points such as P1 and P7, which lie on the outermost rows and columns, are not feature corner points. The specification of the checkerboard calibration plate is not limited herein and depends on actual application needs.
Optionally, since the degree of distortion of the image acquisition device differs at different positions in the imaging area, the image object conversion proportion information at each feature corner point in each calibration plate image can be calculated first, and the image object conversion rule is then fitted from the image object conversion proportion information at all feature corner points in all the calibration plate images. Therefore, in connection with fig. 4, the sub-steps of step S1012 may include S001 to S005.
S001, for any calibration plate image, extracting the pixel position of each feature point in the calibration plate image.
It is understood that the pixel positions of the feature points represent the positions of the feature points in the pixel coordinate system. Referring to fig. 3, the pixel coordinate system is a coordinate system having an upper left corner of an imaging region of the image capturing apparatus as an origin of coordinates.
S002, for any feature corner point in the calibration plate image, calculating the image object conversion proportion information at the feature corner point according to the actual spacing of the corner points and the pixel positions of the four feature points adjacent to the feature corner point.
Assume the number of calibration plate images is $S$ and the total number of feature corner points across the $S$ calibration plate images is $Q$. Taking the $q$-th feature corner point as an example, the sub-steps of step S002 may include S0011 to S0012.
S0011, constructing the image object conversion equation based on the actual spacing of the corner points, the pixel position of the feature corner point, and the pixel positions of the four feature points adjacent to the feature corner point.
For the $q$-th feature corner point, let $(u_q, v_q)$ be its pixel position and let $(u_q^{(i)}, v_q^{(i)})$, $i = 1, \dots, 4$, be the pixel positions of its four adjacent feature points, each lying one actual corner spacing away from the corner along one pixel axis. The corresponding image object conversion equation can then be written as the overdetermined system

$$k_{x,q}\,\lvert u_q^{(i)} - u_q \rvert = d \quad \text{(horizontal neighbors)}, \qquad k_{y,q}\,\lvert v_q^{(i)} - v_q \rvert = d \quad \text{(vertical neighbors)}$$

where $d$ is the actual spacing of the corner points, and $k_{x,q}$, $k_{y,q}$ are the lateral and longitudinal image object conversion proportions at the $q$-th feature corner point, which together characterize the pixel resolution there; $k_{x,q}$ and $k_{y,q}$ are the unknowns to be solved in the image object conversion equation.
S0012, solving the image object conversion equation to obtain the image object conversion proportion information at the feature corner point.
For the $q$-th feature corner point, stacking the four equations above as $A_q \mathbf{k}_q = \mathbf{b}_q$, the image object conversion proportion information obtained by solving is expressed as:

$$\mathbf{k}_q = \begin{pmatrix} k_{x,q} \\ k_{y,q} \end{pmatrix} = \left(A_q^{\mathsf T} A_q\right)^{-1} A_q^{\mathsf T} \mathbf{b}_q$$

where $A_q$ is the coefficient matrix of the image object conversion equation corresponding to the $q$-th feature corner point and $\mathbf{b}_q$ is the vector of corner spacings. The image object conversion proportion information at the $q$-th feature corner point comprises the lateral image object conversion proportion $k_{x,q}$ and the longitudinal image object conversion proportion $k_{y,q}$ at that corner point.
Referring to fig. 3, if the $q$-th feature corner point is P4, the four adjacent feature points corresponding to it are P2, P3, P5, and P6. An image object conversion equation can be constructed based on the actual corner spacing and the pixel positions of P2, P3, P4, P5, and P6, and solving the equation yields the image object conversion proportion information at P4. This is merely an example and is not limiting; the feature point labels given in fig. 3 are only an exemplary labeling for ease of understanding and do not constrain the actual order of the feature points.
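The per-corner computation of S0011 to S0012 can be sketched as follows. This is a minimal illustration under the reconstruction assumed above (each of the four grid-adjacent feature points lies one actual corner spacing $d$ from the corner along one pixel axis, solved in least squares); the function name and all values are hypothetical:

```python
import numpy as np

def corner_conversion_ratio(corner, neighbors, d):
    """Estimate the lateral/longitudinal image object conversion ratios
    (physical length per pixel) at one feature corner from its four
    grid-adjacent feature points, each an actual spacing d away."""
    rows, rhs = [], []
    for (u, v) in neighbors:
        du, dv = abs(u - corner[0]), abs(v - corner[1])
        if du > dv:                      # horizontal neighbor
            rows.append([du, 0.0])
        else:                            # vertical neighbor
            rows.append([0.0, dv])
        rhs.append(d)
    (kx, ky), *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return kx, ky

# Corner at (412.3, 305.8) px, neighbors roughly 50 px away, spacing d = 1.0 mm.
kx, ky = corner_conversion_ratio(
    (412.3, 305.8),
    [(362.1, 305.9), (462.6, 305.7), (412.4, 255.6), (412.2, 356.1)],
    d=1.0)
```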
S003, traversing each feature corner point in the calibration plate image to obtain the image object conversion proportion information at each feature corner point in the calibration plate image.
S004, traversing each calibration plate image to obtain the image object conversion proportion information at each feature corner point in each calibration plate image.
It will be appreciated that performing steps S0011 and S0012 for each of the $Q$ feature corner points yields a set of discrete result data comprising the image object conversion proportion information at all $Q$ feature corner points: $(k_{x,1}, k_{y,1}),\ (k_{x,2}, k_{y,2}),\ \dots,\ (k_{x,Q}, k_{y,Q})$.
S005, performing surface fitting on the image object conversion proportion information of all feature corner points in all the calibration plate images to determine the image object conversion rule.
The discrete result data obtained above contain systematic random error and need to be smoothed to suppress noise. Alternatively, surface fitting may be performed based on the distortion characteristics of the camera to achieve the smoothing effect; the basic surface equation used may take the following quadratic form:

$$k(u, v) = a_0 + a_1 u + a_2 v + a_3 u^2 + a_4 u v + a_5 v^2$$

where $a_0, \dots, a_5$ are the surface equation coefficients to be fitted and $(u, v)$ is the pixel-position parameter of the surface equation.
Therefore, the substeps of step S005 may include S0051 to S0053:
S0051, for each feature corner point in each calibration plate image, constructing the surface fitting equation corresponding to the feature corner point according to the pixel position of the feature corner point and the image object conversion proportion information at that feature corner point.
It will be appreciated that substituting the pixel position and the image object conversion proportion information of each of the $Q$ feature corner points into the basic surface equation constructs $2Q$ surface fitting equations; for the $q$-th feature corner point, the corresponding surface fitting equations read:

$$k_{x,q} = a_0^x + a_1^x u_q + a_2^x v_q + a_3^x u_q^2 + a_4^x u_q v_q + a_5^x v_q^2$$
$$k_{y,q} = a_0^y + a_1^y u_q + a_2^y v_q + a_3^y u_q^2 + a_4^y u_q v_q + a_5^y v_q^2$$

where $a_0^x, \dots, a_5^x$ represent all the lateral image object conversion distribution coefficients and $a_0^y, \dots, a_5^y$ represent all the longitudinal image object conversion distribution coefficients; these are the coefficients that need to be solved.
S0052, solving the $2Q$ surface fitting equations by the least squares method to obtain the image object conversion distribution information in the imaging region.
In this embodiment, the image object conversion distribution information in the imaging area comprises all the lateral image object conversion distribution coefficients and all the longitudinal image object conversion distribution coefficients; the expressions obtained by solving are:

$$\mathbf{a}^x = \left(M^{\mathsf T} M\right)^{-1} M^{\mathsf T} \mathbf{k}_x, \qquad \mathbf{a}^y = \left(M^{\mathsf T} M\right)^{-1} M^{\mathsf T} \mathbf{k}_y$$

where $M$ is the coefficient matrix determined by the $2Q$ surface fitting equations, whose $q$-th row $(1, u_q, v_q, u_q^2, u_q v_q, v_q^2)$ is built from the pixel position $(u_q, v_q)$ of the $q$-th feature corner point, with $(u_1, v_1)$ being the pixel position of the 1st feature corner point and $(u_Q, v_Q)$ that of the $Q$-th; $\mathbf{k}_x = (k_{x,1}, \dots, k_{x,Q})^{\mathsf T}$ and $\mathbf{k}_y = (k_{y,1}, \dots, k_{y,Q})^{\mathsf T}$ collect the lateral and longitudinal image object conversion proportions at the 1st through $Q$-th feature corner points.
S0053, constructing an expression of an image object conversion rule in the imaging area based on the image object conversion distribution information.
In this embodiment, the expression of the image object conversion rule may be:

$$k_x(u, v) = a_0^x + a_1^x u + a_2^x v + a_3^x u^2 + a_4^x u v + a_5^x v^2$$
$$k_y(u, v) = a_0^y + a_1^y u + a_2^y v + a_3^y u^2 + a_4^y u v + a_5^y v^2$$

where $(u, v)$ represents the pixel position of any pixel point within the imaging region, and $k_x(u, v)$, $k_y(u, v)$ are respectively the lateral and longitudinal image object conversion ratios at the pixel point $(u, v)$ in the imaging area.
Thus, as long as the pixel position of any pixel point in the imaging area is known, the pixel position can be substituted into a calculation formula of an image object conversion rule to obtain the transverse image object conversion ratio and the longitudinal image object conversion ratio at the pixel point.
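Steps S0051 to S0053 can be sketched as below, assuming the quadratic basic surface reconstructed above; `fit_conversion_surface` and `eval_surface` are hypothetical helper names, and the data values are made up:

```python
import numpy as np

def design_row(u, v):
    # Basis of the assumed quadratic basic surface k(u, v).
    return [1.0, u, v, u * u, u * v, v * v]

def fit_conversion_surface(corner_pixels, k_values):
    """Least-squares fit of the surface coefficients from the (u, v) pixel
    positions of the feature corners and the ratios measured there."""
    M = np.array([design_row(u, v) for (u, v) in corner_pixels])
    coeffs, *_ = np.linalg.lstsq(M, np.asarray(k_values), rcond=None)
    return coeffs

def eval_surface(coeffs, u, v):
    # Image object conversion ratio at an arbitrary pixel (u, v).
    return float(np.dot(design_row(u, v), coeffs))

# Pixel positions of feature corners and the ratios measured at them (made up).
corners = [(100, 80), (150, 80), (100, 130), (150, 130), (200, 180), (250, 230)]
kx_list = [0.0101, 0.0100, 0.0102, 0.0101, 0.0103, 0.0104]
ky_list = [0.0099, 0.0100, 0.0100, 0.0101, 0.0102, 0.0102]

ax = fit_conversion_surface(corners, kx_list)  # lateral coefficients a^x
ay = fit_conversion_surface(corners, ky_list)  # longitudinal coefficients a^y
kx_any = eval_surface(ax, 123.4, 98.7)         # k_x at any pixel position
```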
In this embodiment, within the imaging region, the lateral image object conversion ratio may represent the pixel resolution (in this embodiment, the actual physical distance corresponding to one unit pixel) in the horizontal-axis (x-axis) direction of the pixel coordinate system, and the longitudinal image object conversion ratio may represent the pixel resolution in the vertical-axis (y-axis) direction of the pixel coordinate system.
The description of steps S1011 to S1012 and their sub-steps above details the process of determining the image object conversion rule in the imaging region. It will be appreciated that the detection device may pre-calculate the image object conversion rule; alternatively, the image object conversion rule in the imaging region may be determined before the image acquisition device leaves the factory and then configured in the detection device.
Based on the image object conversion rule, a procedure of determining physical coordinates corresponding to a specified pixel point is described below.
In an alternative implementation, referring to fig. 5, the sub-steps of step S103 shown above in fig. 2 may include S1031 to S1033.
S1031, acquiring pixel positions of the optical center of the image acquisition device in the imaging area.
It will be appreciated that performing the above step S001 on each calibration plate image yields the pixel position of each feature point in that calibration plate image. In this embodiment, the pixel position of the optical center may be obtained by performing camera calibration based on the pixel positions of the feature points in all the calibration plate images. The camera calibration method may be an existing one, for example Zhang's calibration, which is not limited herein.
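As an illustrative sketch only (not the patented method itself), the optical-center pixel position could be obtained in practice with an existing Zhang-style calibration such as OpenCV's; the file names and board size below are placeholders:

```python
import cv2
import numpy as np

board = (9, 6)  # inner-corner count of the checkerboard (placeholder)
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in ["calib_01.png", "calib_02.png", "calib_03.png"]:  # placeholders
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                       gray.shape[::-1], None, None)
u0, v0 = K[0, 2], K[1, 2]  # pixel position of the optical center
```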
S1032, determining the pixel position and the height value of each specified pixel point in each object image.

In this embodiment, the image acquisition device may be equipped with a depth camera, so that the height value of each specified pixel point in the image of the object to be measured can be obtained.
S1033, calculating the physical coordinates corresponding to the specified pixel points by using the device physical coordinates corresponding to each object image, the image object conversion rule of the image acquisition device in the imaging area, the pixel positions of the specified pixel points, and the height values of the specified pixel points in the corresponding object images.
In this embodiment, each image of the object to be measured corresponds to a set of device physical coordinates, namely the physical position of the image acquisition device on the motion axis at the moment that object image was acquired.
Alternatively, the device physical coordinates may include a device X-axis coordinate value, a device Y-axis coordinate value, and a device Z-axis coordinate value, and the substep of S1033 may include:
S10331, obtaining the lateral coordinate accumulated value and the longitudinal coordinate accumulated value between the specified pixel point and the optical center based on the pixel position of the optical center and the pixel position of the specified pixel point;
S10332, taking the sum of the device X-axis coordinate value and the lateral coordinate accumulated value as the target X-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
S10333, taking the sum of the device Y-axis coordinate value and the longitudinal coordinate accumulated value as the target Y-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
S10334, taking the sum of the device Z-axis coordinate value and the height value of the specified pixel point in its corresponding object image as the target Z-axis coordinate value in the physical coordinates corresponding to the specified pixel point.
Therefore, for any specified pixel point $g$ in any object image, the corresponding physical coordinates can be calculated as:

$$\Delta X_g = \pm \sum_{u = u_0}^{u_g} k_x(u, v_g), \qquad \Delta Y_g = \pm \sum_{v = v_0}^{v_g} k_y(u_g, v)$$
$$X_g = X_c + \Delta X_g, \qquad Y_g = Y_c + \Delta Y_g, \qquad Z_g = Z_c + h_g$$

where $(u_0, v_0)$ is the pixel position of the optical center; $(u_g, v_g)$ is the pixel position of the specified pixel point $g$ in its corresponding object image; $\Delta X_g$ and $\Delta Y_g$ are the lateral and longitudinal coordinate accumulated values between the specified pixel point and the optical center, obtained by accumulating the per-pixel image object conversion ratios along the respective axes, with the sign following the direction from the optical center to the pixel point; $X_c$, $Y_c$, $Z_c$ are respectively the device X-axis, Y-axis and Z-axis coordinate values, i.e. the device physical coordinates when the object image containing pixel point $g$ was acquired; $h_g$ is the height value of the specified pixel point $g$ in that object image; and $X_g$, $Y_g$, $Z_g$ are respectively the target X-axis, Y-axis and Z-axis coordinate values, i.e. the physical coordinates corresponding to the specified pixel point $g$.
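Under the reconstruction above (accumulating the per-pixel conversion ratios between the optical center and the specified pixel point), S10331 to S10334 might be sketched as follows; `eval_surface`, `ax`, and `ay` are the hypothetical helpers from the earlier fitting sketch:

```python
def pixel_to_physical(u_g, v_g, h_g, u0, v0, device_xyz, ax, ay):
    """Convert a specified pixel (u_g, v_g) with height value h_g into
    physical coordinates, given the optical center (u0, v0), the device
    physical coordinates at capture time, and the fitted surfaces."""
    Xc, Yc, Zc = device_xyz
    # Accumulate the physical size of each pixel along each axis (signed).
    dX = sum(eval_surface(ax, u, v_g)
             for u in range(int(min(u0, u_g)), int(max(u0, u_g))))
    dY = sum(eval_surface(ay, u_g, v)
             for v in range(int(min(v0, v_g)), int(max(v0, v_g))))
    dX *= 1 if u_g >= u0 else -1
    dY *= 1 if v_g >= v0 else -1
    return Xc + dX, Yc + dY, Zc + h_g
```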
In an optional implementation, the physical coordinates corresponding to each specified pixel point constitute one actual point, and clustering of all the actual points can be achieved based on a preset clustering pixel threshold; the sub-steps of step S104 may include S1041 to S1042:

S1041, clustering all the actual points based on the preset clustering pixel threshold and the physical coordinates corresponding to all the specified pixel points, to obtain a plurality of cluster sets.
In this embodiment, the cluster set may include at least one actual point.
Optionally, the preset clustering pixel threshold represents a pixel distance, and clustering can only be performed after converting it into an actual (physical) value. Accordingly, step S1041 may include the following sub-steps S10411 to S10413.
S10411, obtaining an image object conversion mean value in the imaging area.
In this embodiment, the image object conversion mean may represent the average pixel resolution in the imaging area and can be calculated from the lateral and longitudinal image object conversion ratios of every pixel point in the imaging area:

$$\bar{k} = \frac{1}{2W} \sum_{j=1}^{W} \left( k_x(u_j, v_j) + k_y(u_j, v_j) \right)$$

where $W$ is the number of all pixel points in the imaging area and $(u_j, v_j)$ is the pixel position of the $j$-th pixel point.
S10412, determining an actual clustering threshold based on a preset clustering pixel threshold and an image object conversion mean value.
In this embodiment, the clustering actual threshold may be calculated as $r = T \cdot \bar{k}$, where $T$ is the preset clustering pixel threshold. The value of $T$ may be set according to actual requirements, for example 0.5 pixel; this is merely an example and is not limiting.
S10413, clustering all actual points based on the clustering actual threshold value to obtain a plurality of clustering sets.
Suppose there are H object images; the object to be measured may be fully in frame in some of them and only partially in frame in others. Since the physical coordinates corresponding to each specified pixel point constitute one actual point, the physical coordinates corresponding to all the specified pixel points of one object image form one actual point set; the actual points can therefore be divided into H actual point sets.
Then, taking one of the actual point sets as the reference, for any actual point in the reference set, all actual points lying within the circular area centered at that actual point with radius $r$ may be regarded as one point (referred to as a clustered actual point); that is, the cluster set corresponding to a clustered actual point comprises all the actual points within that circular area.

If the reference set includes F actual points, then for each of the F actual points, all the actual points within its corresponding circular area can be found to form a cluster set. Because the object to be measured may be only partially in frame in some images, some cluster sets contain only one actual point while others contain more than one.
For example, taking two object images (image 1 and image 2) as an example, referring to fig. 6, it can be seen that the object to be measured is fully in frame in image 1 and only partially in frame in image 2; the black points in the figure may be regarded as the actual points corresponding to specified pixel points.

During clustering, the actual points corresponding to image 1 are taken as the reference: for each actual point of image 1, all actual points within the circular area centered at that point with radius $r$ may constitute one cluster set. From the overlay of all the actual points of images 1 and 2 given in fig. 6, it can be seen that the overlapped field of view contains a fully overlapping region and a non-fully overlapping region. In the fully overlapping region (where the actual points from the two images coincide exactly), each circular area contains only one actual point; in the non-fully overlapping region, each circular area contains two actual points.
It should be noted that the above example is merely illustrative; in practical applications, neither the number of object images nor the extent to which the object to be measured is in frame in each image is limited, and the actual application conditions prevail.
S1042, for each cluster set, calculating the mean of the coordinate values of all the actual points in the cluster set to obtain the physical coordinates of the clustered actual point corresponding to that cluster set.
In this embodiment, suppose a cluster set contains $n$ actual points; the physical coordinates $P_c$ of the clustered actual point corresponding to that cluster set are calculated as:

$$P_c = \frac{1}{n} \sum_{i=1}^{n} P_i$$

where $P_i$ is the physical coordinates of the $i$-th actual point in the cluster set.
It can be understood that when more object images are involved, the point counts in the fully overlapping and non-fully overlapping regions may differ. After clustering, the cluster sets contain different numbers of actual points (possibly 1, 2, 3, or more), but since each cluster set is regarded as one clustered actual point, the difference in point counts is eliminated, so that the points in the overlapping and non-overlapping regions of the fields of view carry consistent weight.
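Steps S1041 and S1042 can be sketched as below, assuming the radius-based grouping described above with the first view's actual point set as the reference and in-plane (X, Y) distances; all names and values are illustrative:

```python
import numpy as np

def cluster_actual_points(point_sets, r):
    """Group the actual points of all views: each actual point of the
    reference set (first view) collects every actual point within radius r,
    and each cluster set is replaced by the mean of its members, i.e. the
    physical coordinates of one clustered actual point."""
    everything = np.vstack(point_sets)   # all actual points from all views
    clustered = []
    for p in point_sets[0]:              # reference actual point set
        dist = np.linalg.norm(everything[:, :2] - p[:2], axis=1)
        members = everything[dist <= r]  # cluster set for this actual point
        clustered.append(members.mean(axis=0))
    return np.array(clustered)

# Two views of the same object; rows are (X, Y, Z) physical coordinates.
view1 = np.array([[0.00, 0.00, 0.40], [1.00, 0.00, 0.41]])
view2 = np.array([[0.01, 0.00, 0.42]])   # object only partially in frame
centers = cluster_actual_points([view1, view2], r=0.05)
```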
It should be noted that, in the above method embodiment, the execution sequence of each step is not limited by the drawing, and the execution sequence of each step is based on the actual application situation.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
First, the invention uses a checkerboard calibration plate to determine the image object conversion rule of the image acquisition device in the imaging region. The image object conversion rule can be used to calculate the lateral and longitudinal image object conversion ratios at any pixel point in the imaging area, which together characterize the pixel resolution at that pixel point, so that the pixel-resolution error at every pixel point in the imaging area can be controlled within 0.5 pixel.
Secondly, the invention uses the image object conversion rule to convert the pixel coordinates of the specified pixel points in object images of different fields of view into the same physical coordinate system, obtaining the physical coordinates (i.e., the actual points) corresponding to the specified pixel points. Compared with the traditional approach of converting with a single full-image pixel resolution, this not only mitigates the deviation caused by image distortion and reduces the difficulty of distortion correction and image stitching, but also avoids the accumulation of sub-pixel-level errors when facing objects of larger size, so the determined actual points are more accurate.
Thirdly, a plurality of cluster sets are determined by clustering, and all the actual points in each cluster set are regarded as one clustered actual point, so that the points in the overlapping and non-overlapping regions of the fields of view carry consistent weight. This effectively avoids errors in calculating the size of the object to be measured caused by the difference in point-count proportions between the fully overlapping and non-fully overlapping regions, thereby improving measurement accuracy.
In order to perform the respective steps of the above-described method embodiments and of the various possible implementations, an implementation of a measuring device is given below.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a measurement device according to an embodiment of the invention. The measuring apparatus 200 includes: an image acquisition module 220, a data processing module 230.
The image acquisition module 220 is configured to obtain a plurality of images of an object to be detected, where the plurality of images of the object to be detected are obtained by the image acquisition device capturing images of the object to be detected at different positions;
a data processing module 230 for: calculating physical coordinates corresponding to a plurality of specified pixel points of an area where the object to be detected is located in each object image; clustering and integrating the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected; and the physical coordinates of the plurality of clustering actual points are used for determining the actual size information related to the object to be detected.
It will be apparent to those skilled in the art that the measuring apparatus 200 may further include a device measurement module 210, and the device measurement module 210 may be configured to implement the steps S1011-S1012 and the sub-steps thereof; the data processing module 230 may be configured to implement the steps S103, S104 and their respective sub-steps described above. For convenience and brevity, the specific working process of the measuring device 200 described above may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a detection apparatus according to an embodiment of the present invention. The detection device 300 comprises a processor 310, a memory 320 and a bus 330, the processor 310 being connected to the memory 320 via the bus 330.
The memory 320 may be used to store a software program, for example, a software program corresponding to the measurement device 200 provided in an embodiment of the present invention. The processor 310 performs various functional applications and data processing by running software programs stored in the memory 320 to implement the measurement method as provided by the embodiments of the present invention.
The Memory 320 may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Flash memory, Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), etc.
The processor 310 may be an integrated circuit chip with signal processing capability. It may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
It will be appreciated that the configuration shown in fig. 8 is merely illustrative; the detection device 300 may include more or fewer components than shown in fig. 8, or have a different configuration. The components shown in fig. 8 may be implemented in hardware, software, or a combination thereof.
An embodiment of the invention further provides a detection system, which comprises an image acquisition device and the above detection device, communicatively connected to each other.
An embodiment of the invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the measurement method disclosed in the above embodiments. The computer-readable storage medium may be, but is not limited to, any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a PROM, an EPROM, an EEPROM, a flash memory, a magnetic disk, or an optical disk.
In summary, embodiments of the invention provide a measurement method, a detection device, a detection system, and a storage medium. For each image of the object to be detected, physical coordinates corresponding to a plurality of specified pixel points of the region where the object is located are calculated; the physical coordinates corresponding to all the specified pixel points are then clustered and integrated into the physical coordinates of a plurality of clustering actual points, which are used to determine actual size information related to the object. Because the physical coordinates of the clustering actual points are integrated from specified pixel points observed in images taken under multiple fields of view, the determined coordinates are accurate, the derived actual size information is reliable, and false detections can be effectively avoided.
The present invention is not limited to the above embodiments; any change or substitution that can readily occur to those skilled in the art within the technical scope of the present invention is intended to fall within its scope. The protection scope of the invention is therefore defined by the claims.
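As an illustrative, non-limiting example, the flow summarized above can be sketched in Python. The sketch assumes each image has already yielded the (x, y, z) physical coordinates of its specified pixel points, and uses a simple greedy distance-threshold clustering for illustration; all function and variable names are assumptions, not terms used in this disclosure.

```python
import numpy as np

def cluster_and_integrate(per_image_points, actual_threshold):
    """Pool per-image physical coordinates, cluster them by physical
    distance, and average each cluster into one clustering actual point.
    Greedy threshold clustering is used here purely for illustration."""
    points = np.vstack(per_image_points)      # all (x, y, z) actual points
    clusters = []                             # each cluster: list of row indices
    for i, p in enumerate(points):
        for c in clusters:
            # join the first cluster whose current mean is close enough
            if np.linalg.norm(points[c].mean(axis=0) - p) <= actual_threshold:
                c.append(i)
                break
        else:
            clusters.append([i])              # otherwise start a new cluster
    # the mean of each cluster is the physical coordinate of one
    # clustering actual point
    return np.array([points[c].mean(axis=0) for c in clusters])
```

With the clustering actual points in hand, actual size information follows from distances between them, for example np.linalg.norm(pts[0] - pts[1]).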

Claims (11)

1. A measurement method, comprising:
obtaining a plurality of images of an object to be detected, wherein the images are captured by an image acquisition device photographing the object to be detected at different positions;
obtaining physical coordinates corresponding to a plurality of specified pixel points of a region where the object to be detected is located in each image of the object to be detected;
and clustering and integrating the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected, wherein the physical coordinates of the plurality of clustering actual points are used for determining actual size information related to the object to be detected.
2. The method according to claim 1, wherein the step of obtaining physical coordinates corresponding to a plurality of specified pixel points of the region where the object to be detected is located in each image of the object to be detected comprises:
acquiring the pixel position of an optical center of the image acquisition device in an imaging area;
determining the pixel position and the height value of each specified pixel point in each image of the object to be detected;
and calculating the physical coordinates corresponding to the specified pixel point by using device physical coordinates corresponding to each image of the object to be detected, an image-object conversion rule of the image acquisition device in the imaging area, the pixel position of the specified pixel point, and the height value of the specified pixel point in the corresponding image of the object to be detected.
3. The method of claim 2, wherein the device physical coordinates comprise a device X-axis coordinate value, a device Y-axis coordinate value, and a device Z-axis coordinate value;
and the step of calculating the physical coordinates corresponding to the specified pixel point by using the device physical coordinates corresponding to the image of the object to be detected, the image-object conversion rule of the image acquisition device in the imaging area, the pixel position of the specified pixel point, and the height value of the specified pixel point in the corresponding image comprises:
acquiring an accumulated lateral coordinate value and an accumulated longitudinal coordinate value between the specified pixel point and the optical center based on the pixel position of the optical center and the pixel position of the specified pixel point;
taking the sum of the device X-axis coordinate value and the accumulated lateral coordinate value as a target X-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
taking the sum of the device Y-axis coordinate value and the accumulated longitudinal coordinate value as a target Y-axis coordinate value in the physical coordinates corresponding to the specified pixel point;
and taking the sum of the device Z-axis coordinate value and the height value of the specified pixel point in the corresponding image of the object to be detected as a target Z-axis coordinate value in the physical coordinates corresponding to the specified pixel point.
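As an illustrative, non-limiting sketch of the computation in claim 3: the example below assumes, for brevity, constant lateral and longitudinal image-object conversion ratios, whereas the claim accumulates per-pixel conversion values between the optical center and the specified pixel point; all names are illustrative.

```python
def pixel_to_physical(device_xyz, optical_center_px, pixel_px, height_value,
                      lateral_ratio, longitudinal_ratio):
    """Claim-3 sketch: target physical coordinates of a specified pixel point.

    lateral_ratio / longitudinal_ratio stand in for the accumulated
    lateral / longitudinal coordinate values between the optical center
    and the pixel; they are assumed constant here for brevity."""
    du = pixel_px[0] - optical_center_px[0]        # pixel offset from optical center
    dv = pixel_px[1] - optical_center_px[1]
    x = device_xyz[0] + du * lateral_ratio         # device X + lateral accumulated value
    y = device_xyz[1] + dv * longitudinal_ratio    # device Y + longitudinal accumulated value
    z = device_xyz[2] + height_value               # device Z + height value at the pixel
    return (x, y, z)
```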
4. The method of claim 1, wherein the physical coordinates corresponding to each specified pixel point constitute an actual point, and the step of clustering and integrating the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected comprises:
clustering all the actual points based on a preset clustering pixel threshold and the physical coordinates corresponding to all the specified pixel points to obtain a plurality of cluster sets, each cluster set comprising at least one of the actual points;
and, for each cluster set, calculating the mean of the coordinate values of all the actual points in the cluster set to obtain the physical coordinates of the clustering actual point corresponding to the cluster set.
5. The method of claim 4, wherein the step of clustering all the actual points based on a preset clustering pixel threshold and the physical coordinates corresponding to all the specified pixel points to obtain a plurality of cluster sets comprises:
obtaining an image-object conversion mean value in the imaging area;
determining a clustering actual threshold based on the preset clustering pixel threshold and the image-object conversion mean value;
and clustering all the actual points based on the clustering actual threshold to obtain the plurality of cluster sets.
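An illustrative one-step sketch of claim 5, assuming the image-object conversion mean is expressed in physical units per pixel (e.g. mm/pixel); names are illustrative.

```python
import numpy as np

def clustering_actual_threshold(preset_pixel_threshold, conversion_ratios):
    """Claim-5 sketch: scale the pixel-domain clustering threshold by the
    image-object conversion mean to obtain a physical-domain threshold."""
    conversion_mean = float(np.mean(conversion_ratios))  # assumed mm per pixel
    return preset_pixel_threshold * conversion_mean      # pixels * mm/pixel -> mm
```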
6. The method of claim 2, wherein the image-object conversion rule is obtained by:
obtaining a plurality of calibration plate images of a checkerboard calibration plate at different placement positions;
and determining the image-object conversion rule in the imaging area based on all the calibration plate images.
7. The method of claim 6, wherein the checkerboard calibration plate comprises a plurality of feature points, a subset of which are feature corner points;
and the step of determining the image-object conversion rule in the imaging area based on all the calibration plate images comprises:
extracting, for each calibration plate image, the pixel position of each feature point in the calibration plate image;
calculating, for each feature corner point in the calibration plate image, image-object conversion ratio information at the feature corner point based on the actual corner spacing and the pixel positions of the four feature points adjacent to the feature corner point, the image-object conversion ratio information at the feature corner point comprising a lateral image-object conversion ratio and a longitudinal image-object conversion ratio at the feature corner point;
traversing each feature corner point in the calibration plate image to obtain the image-object conversion ratio information at each feature corner point in the calibration plate image;
traversing each calibration plate image to obtain the image-object conversion ratio information at each feature corner point in each calibration plate image;
and performing surface fitting on the image-object conversion ratio information of all the feature corner points in all the calibration plate images to determine the image-object conversion rule, the image-object conversion rule being used for calculating the lateral image-object conversion ratio and the longitudinal image-object conversion ratio at any pixel point in the imaging area.
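An illustrative sketch of the corner-wise ratio computation and the surface fit in claim 7, under two assumptions not fixed by the claim: the four neighbours of a feature corner point lie one corner spacing away along the image axes, and a second-order bivariate polynomial serves as the fitted surface; names are illustrative.

```python
import numpy as np

def corner_ratios(p_left, p_right, p_up, p_down, corner_spacing):
    """Lateral / longitudinal image-object conversion ratios at one feature
    corner point, from the pixel positions of its four neighbouring feature
    points and the known actual corner spacing (assumed in mm). The two
    neighbours along each axis are two corner spacings apart."""
    lateral = 2.0 * corner_spacing / abs(p_right[0] - p_left[0])    # mm/pixel along X
    longitudinal = 2.0 * corner_spacing / abs(p_down[1] - p_up[1])  # mm/pixel along Y
    return lateral, longitudinal

def fit_ratio_surface(u, v, ratios):
    """Least-squares fit of ratio(u, v) ~ a + b*u + c*v + d*u*v + e*u^2 + f*v^2
    over all feature corner points of all calibration plate images."""
    u, v, ratios = (np.asarray(a, dtype=float) for a in (u, v, ratios))
    A = np.column_stack([np.ones_like(u), u, v, u * v, u**2, v**2])
    coeffs, *_ = np.linalg.lstsq(A, ratios, rcond=None)
    return coeffs

def ratio_at(coeffs, u, v):
    """Evaluate the fitted conversion ratio at any pixel position (u, v)."""
    a, b, c, d, e, f = coeffs
    return a + b*u + c*v + d*u*v + e*u**2 + f*v**2
```

One surface would be fitted for the lateral ratios and another for the longitudinal ratios; evaluating both at a pixel yields the conversion ratio information used in the preceding claims.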
8. A measurement apparatus, comprising:
an image acquisition module, configured to obtain a plurality of images of an object to be detected, wherein the images are captured by an image acquisition device photographing the object to be detected at different positions;
and a data processing module, configured to:
calculate physical coordinates corresponding to a plurality of specified pixel points of the region where the object to be detected is located in each image of the object to be detected;
and cluster and integrate the physical coordinates corresponding to all the specified pixel points to obtain physical coordinates of a plurality of clustering actual points of the object to be detected, the physical coordinates of the plurality of clustering actual points being used for determining actual size information related to the object to be detected.
9. A detection device, comprising a memory and a processor, wherein the memory stores a software program which, when executed by the processor, performs the measurement method of any one of claims 1-7.
10. A detection system, comprising an image acquisition device and the detection device of claim 9, which are communicatively connected.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the measurement method of any one of claims 1-7.
CN202310841520.4A 2023-07-11 2023-07-11 Measurement method, detection device, detection system, and storage medium Active CN116563292B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310841520.4A CN116563292B (en) 2023-07-11 2023-07-11 Measurement method, detection device, detection system, and storage medium

Publications (2)

Publication Number Publication Date
CN116563292A true CN116563292A (en) 2023-08-08
CN116563292B CN116563292B (en) 2023-09-26

Family

ID=87486570

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310841520.4A Active CN116563292B (en) 2023-07-11 2023-07-11 Measurement method, detection device, detection system, and storage medium

Country Status (1)

Country Link
CN (1) CN116563292B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506705A (en) * 2011-10-17 2012-06-20 罗艺 Method and device for obtaining coordinates of positioning mark on PCB (Printed Circuit Board) and patch device
CN109003311A (en) * 2018-08-22 2018-12-14 上海庄生晓梦信息科技有限公司 A kind of fish-eye scaling method
US20200226789A1 (en) * 2019-01-14 2020-07-16 Beijing Boe Optoelectronics Technology Co., Ltd. Camera calibration plate, camera calibration method and device, and image acquisition system
CN113393439A (en) * 2021-06-11 2021-09-14 重庆理工大学 Forging defect detection method based on deep learning
CN113513981A (en) * 2021-06-15 2021-10-19 西安交通大学 Multi-target parallel measurement method, system, equipment and storage medium based on binocular stereo vision
CN114331924A (en) * 2022-03-15 2022-04-12 四川焱飞科技有限公司 Large workpiece multi-camera vision measurement method
CN115187612A (en) * 2022-07-08 2022-10-14 南京邮电大学 Plane area measuring method, device and system based on machine vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MEI Hongxiang: "A Target Localization Method Based on Image Coordinate Detection", Computer and Digital Engineering (计算机与数字工程), vol. 44, no. 3, pages 438-444 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117710488A (en) * 2024-01-17 2024-03-15 苏州市欧冶半导体有限公司 Camera internal parameter calibration method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN116563292B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
CN111263142B (en) Method, device, equipment and medium for testing optical anti-shake of camera module
CN116563292B (en) Measurement method, detection device, detection system, and storage medium
CN106570907B (en) Camera calibration method and device
CN116386028B (en) Image layering identification method and device for processing tee pipe fitting
CN108844462A (en) A kind of size detecting method, device, equipment and system
CN111627073B (en) Calibration method, calibration device and storage medium based on man-machine interaction
CN112308930A (en) Camera external parameter calibration method, system and device
CN113012234A (en) High-precision camera calibration method based on plane transformation
CN112308934A (en) Calibration detection method and device, storage medium and computing equipment
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN111699513B (en) Calibration plate, internal parameter calibration method, machine vision system and storage device
CN112985258B (en) Calibration method and measurement method of three-dimensional measurement system
KR102023087B1 (en) Method for camera calibration
CN104677911B (en) Inspection apparatus and method for machine vision inspection
CN112102415A (en) Depth camera external parameter calibration method, device and equipment based on calibration ball
CN112504156A (en) Structural surface strain measurement system and measurement method based on foreground grid
CN107783310B (en) Calibration method and device of cylindrical lens imaging system
CN116205993A (en) Double-telecentric lens high-precision calibration method for 3D AOI
CN111145247A (en) Vision-based position detection method, robot and computer storage medium
CN115861443A (en) Multi-camera internal reference calibration method and device, electronic equipment and storage medium
Sun et al. A new method of camera calibration based on the segmentation model
CN116124393A (en) Bridge multipoint dynamic deflection measuring method and device during off-axis measurement
CN112927299B (en) Calibration method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231117
Address after: Room 801, No. 1126 Shenbin South Road, Minhang District, Shanghai, 201107
Patentee after: MATRIXTIME ROBOTICS (SHANGHAI) Co.,Ltd.
Address before: 518000 No. 4, Floor 4, Huashen Building, 1038 Aiguo Road, Xinyi Community, Huangbei Street, Luohu District, Shenzhen, Guangdong Province
Patentee before: Jushi Technology (Shenzhen) Co.,Ltd.