CN116559176A - Detection method, detection device, electronic equipment and storage medium - Google Patents

Detection method, detection device, electronic equipment and storage medium

Info

Publication number
CN116559176A
CN116559176A
Authority
CN
China
Prior art keywords
feature
detected
coordinates
camera
current station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310583292.5A
Other languages
Chinese (zh)
Inventor
曹康
罗小军
孙高磊
黎国栋
凌桂林
潘望
李静
党梓豪
姚屏
吴丰礼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Topstar Technology Co Ltd
Original Assignee
Guangdong Topstar Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Topstar Technology Co Ltd filed Critical Guangdong Topstar Technology Co Ltd
Priority to CN202310583292.5A priority Critical patent/CN116559176A/en
Publication of CN116559176A publication Critical patent/CN116559176A/en
Legal status: Pending

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a detection method, a detection device, an electronic device, and a storage medium. The method comprises: acquiring the image coordinates of at least one feature to be detected of an object to be detected at the current station, together with the camera affine relationships between the current station and at least two cameras at different detection points; determining the platform coordinates of each feature to be detected at the current station according to the image coordinates and the camera affine relationships; and determining whether each feature to be detected is qualified according to its platform coordinates. By obtaining the pre-calibrated camera affine relationships between the current station and the different cameras at the different detection points, the cameras can visually identify the image coordinates of the features on the object during detection, and the physical coordinates of those features on the current station's platform are then determined from the camera affine relationships. Quality detection of the object is thus assisted by coordinate transformation, improving both detection accuracy and detection efficiency.

Description

Detection method, detection device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of automation technologies, and in particular, to a detection method, a detection device, an electronic device, and a storage medium.
Background
With the development of technology, more and more manufacturers adopt automated schemes to produce and manufacture products. Throughout the production process, the quality detection link is particularly important. Performing quality detection, defect identification, and similar tasks automatically is one of the industrial-upgrade priorities that manufacturers currently emphasize.
Currently, for appearance quality inspection of products, the related art adopts machine vision, computer vision, and similar methods: the appearance of a product is photographed and the images are analysed to detect whether the product meets production requirements. However, the detection accuracy and efficiency of such visual recognition remain limited.
Disclosure of Invention
The application provides a detection method, a detection device, electronic equipment and a storage medium, so as to improve detection efficiency and detection precision.
According to an aspect of the present application, there is provided a detection method, the method comprising:
acquiring image coordinates of at least one feature to be detected of an object to be detected on a current station and camera affine relations between the current station and at least two cameras on different detection points;
determining the platform coordinates of each feature to be detected at the current station according to the image coordinates and the affine relationship of the camera;
and determining whether the feature to be detected is qualified or not according to the platform coordinates of the feature to be detected.
According to another aspect of the present application, there is provided a detection apparatus including:
the image coordinate acquisition module is used for acquiring image coordinates of at least one feature to be detected of the object to be detected on the current station and camera affine relations between the current station and at least two cameras on different detection points;
the platform coordinate determining module is used for determining the platform coordinate of each feature to be detected at the current station according to the image coordinate and the affine relationship of the camera;
and the feature detection module is used for determining whether each feature to be detected is qualified according to the platform coordinates of that feature.
According to another aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the detection method described in any one of the embodiments of the present application.
According to another aspect of the present application, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the detection method according to any embodiment of the present application.
According to the technical scheme of the present application, the pre-calibrated camera affine relationships between the current station and the different cameras at the different detection points are obtained; during inspection of the object, the cameras visually identify the image coordinates of the features on the object, and the physical coordinates of those features on the current station's platform are then determined from the camera affine relationships. Quality inspection of the object is thus assisted by coordinate transformation, improving inspection accuracy and inspection efficiency.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
In order to describe the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the present application, and other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1A is a flow chart of a detection method according to a first embodiment of the present application;
fig. 1B is a schematic diagram of a calibration board with a combination of a circular feature point and a two-dimensional code according to a first embodiment of the present application;
FIG. 2A is a schematic illustration of a glass product quality inspection suitable for use in accordance with embodiment II of the present application;
FIG. 2B is a schematic diagram of a camera affine relationship calibration process adapted according to a second embodiment of the present application;
fig. 3 is a schematic structural diagram of a detection device according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device implementing the detection method according to the embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1A is a flowchart of a detection method according to an embodiment of the present application. The method may be applied to a quality detection scene for the appearance indexes of a glass product and may be performed by a detection device; the detection device may be implemented in the form of hardware and/or software and configured in an electronic device. As shown in fig. 1A, the method includes:
s110, obtaining image coordinates of at least one feature to be detected of the object to be detected on the current station and camera affine relations between the current station and at least two cameras on different detection points.
The object to be detected may be a commodity or product that requires appearance or quality detection, for example a glass product: after silk-screened glass is manufactured, its length, width, silk-screen pattern, marks, and the like need to be checked to determine whether the product is qualified. In a quality-inspection scene for glass products, several quality-inspection platforms may be arranged as distinct stations so that different objects are inspected in parallel, accelerating the process and improving efficiency. It will be appreciated that the cameras used for visual identification generally need to be fixed, while the current station (the quality-inspection platform) may be designed to be movable, so that the station carries the object (for example, by single-axis unidirectional translation) past the cameras, allowing every part of the object to be identified for quality inspection. Each position of the current station at which a camera takes a picture during this movement is a detection point. In general, several cameras are provided, which overcomes errors that a single camera might introduce. The camera affine relationship is then the coordinate-transformation relationship of each camera between each detection point and the current station. That is, every camera undergoes a coordinate-transformation calibration between every detection point and the current station, and the calibrated camera affine relationships later help the user inspect the object through coordinate transformation.
The feature to be detected may be any of various appearance attributes of the object; for example, in the quality-inspection link for silk-screened glass, the features to be detected may include, but are not limited to, glass edges, positioning lines, ink edges, and character mark points. There may thus be multiple features to be detected on one object. The image coordinates of a feature are its coordinates in the camera's image coordinate system. Any feature-recognition image-processing algorithm in the related art may be used to identify the features, and the embodiments of the present application are not limited in this respect.
In an alternative embodiment, the camera affine relationship between the current station and at least two cameras at different detection points may be determined by the following method: acquiring calibration plate images of calibration plates placed on a current station, shot by at least two cameras, on different detection points; the calibration plate comprises a circular checkerboard and two-dimensional codes, and the two-dimensional codes are embedded in circular feature points of the checkerboard; determining the center image coordinates of all the round feature points on the calibration plate and the center physical coordinates corresponding to the two-dimensional codes according to the calibration plate image; and determining camera affine relations of different cameras and the current station on different detection points according to the circle center image coordinates and the circle center physical coordinates.
Before the image coordinates of the features and the camera affine relationships are obtained, the camera affine relationships can be calibrated. The embodiments of the present application take the calibration plate shown in fig. 1B as an example: the plate is rectangular overall, with circular feature points evenly distributed in a checkerboard and a two-dimensional code embedded in the middle of each circular feature point. The information encoded in each two-dimensional code can be the preset physical coordinates of that circle's centre on the calibration plate; for example, the first circular feature point from the left in the first row may have coordinates (1, 1), and so on. A camera photographs the calibration plate to obtain a calibration-plate image. On one hand, a circle-recognition algorithm identifies, from the calibration-plate image, the circle-centre image coordinates of each circular feature point in the camera's image coordinate system; on the other hand, decoding the two-dimensional codes yields the physical coordinates of each circle centre appearing in the image. Coordinate transformation in three-dimensional space then yields the conversion relationship between the circular feature points in the image coordinate system and in the calibration-plate coordinate system (that is, the current-station coordinate system) — the camera affine relationship. Whatever the number of cameras, each camera is calibrated in advance at each detection point, so that during subsequent detection it can be quickly determined from the camera affine relationships whether features on the object meet production requirements.
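As a minimal sketch of this calibration step (the patent does not specify an implementation; the function names and the exact solving method are illustrative assumptions), the affine parameters can be solved from three non-collinear circle-centre correspondences, with image coordinates from circle recognition and physical coordinates decoded from the two-dimensional codes:

```python
def affine_from_3_points(img_pts, phys_pts):
    # Solve the six affine parameters mapping image coords (u, v) to
    # platform coords (x, y): x = ax*u + bx*v + cx, y = ay*u + by*v + cy,
    # from exactly three non-collinear correspondences (Cramer's rule).
    (u1, v1), (u2, v2), (u3, v3) = img_pts
    det = u1 * (v2 - v3) - v1 * (u2 - u3) + (u2 * v3 - u3 * v2)
    if abs(det) < 1e-12:
        raise ValueError("calibration points are collinear")
    params = []
    for k in (0, 1):  # k = 0 solves the x-equation, k = 1 the y-equation
        b1, b2, b3 = (p[k] for p in phys_pts)
        a = (b1 * (v2 - v3) - v1 * (b2 - b3) + (b2 * v3 - b3 * v2)) / det
        b = (u1 * (b2 - b3) - b1 * (u2 - u3) + (u2 * b3 - u3 * b2)) / det
        c = (u1 * (v2 * b3 - v3 * b2) - v1 * (u2 * b3 - u3 * b2)
             + b1 * (u2 * v3 - u3 * v2)) / det
        params.append((a, b, c))
    return params  # [(ax, bx, cx), (ay, by, cy)]

def apply_affine(params, u, v):
    # Map an image coordinate to a platform coordinate.
    (ax, bx, cx), (ay, by, cy) = params
    return (ax * u + bx * v + cx, ay * u + by * v + cy)
```

With more than three circle centres in a view, a least-squares solve would be used instead; three non-collinear points are the minimum the patent requires in each field of view.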
It should be noted that the calibration plate image includes at least three non-collinear circular feature points. It will be appreciated that three non-collinear points may define a plane and that a calibration plate placed on the current station may assist the camera in obtaining plane information for the current station.
In this embodiment the camera affine relationships are calibrated in advance. Because the platform of the current station moves during detection and passes several detection points, the coordinate-transformation relationships of the different cameras at the different detection points are determined from the physical coordinates carried by the two-dimensional-code circular feature points, which improves both the efficiency and the accuracy of calibrating the coordinate transformations.
In an optional implementation manner, the determining the affine relationship between the different cameras and the cameras of the current station on different detection points according to the circle center image coordinates and the circle center physical coordinates may include: according to the circle center image coordinates and the circle center physical coordinates determined by different cameras on different detection points, respectively determining coordinate transformation matrixes between the different cameras and the current station; the different coordinate transformation matrices are saved as camera affine relations.
Each camera acquires images at the different detection points while the current station moves. From each camera's image at each detection point, a coordinate-conversion matrix between that camera and the current station at that detection point can be determined; the matrix is in fact obtained by performing coordinate conversion in three-dimensional space using the circle-centre image coordinates and circle-centre physical coordinates extracted from the different pictures. It should be noted that each camera obtains one coordinate-conversion matrix per detection point, and all of these matrices are stored together in a library table, which serves as the camera affine relationships and can conveniently be invoked later.
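The library table described here can be sketched as a simple in-memory lookup keyed by detection point and camera (the key layout and the parameter values below are illustrative assumptions only; the patent does not prescribe a storage format):

```python
# (detection_point_index, camera_index) -> affine parameters
# (ax, bx, cx, ay, by, cy), giving x = ax*u + bx*v + cx and
# y = ay*u + by*v + cy. The values are placeholders standing in
# for real calibration output.
affine_table = {
    (1, 1): (0.5, 0.0, 120.0, 0.0, 0.5, 35.0),
    (1, 2): (0.5, 0.0, 180.0, 0.0, 0.5, 35.0),
}

def image_to_platform(point_idx, cam_idx, u, v):
    # Invoke the stored coordinate-conversion parameters for this
    # camera at this detection point.
    ax, bx, cx, ay, by, cy = affine_table[(point_idx, cam_idx)]
    return (ax * u + bx * v + cx, ay * u + by * v + cy)
```

A missing key raises `KeyError`, signalling that a camera/detection-point pair was never calibrated.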
The coordinate-conversion relationships between all cameras and the calibration plate at all detection points are determined in advance, so they can be invoked directly during detection to help calculate whether the appearance attributes of the object meet the production standard, improving detection efficiency.
S120, determining the platform coordinates of each feature to be detected at the current station according to the image coordinates and the affine relationship of the camera.
According to the image coordinates of the features obtained by each camera at the different detection points of the current station, and the pre-calibrated camera affine relationships, the coordinates of each feature on the plane of the current station — the platform coordinates — can be obtained. When the image coordinates of a feature are acquired, certain special feature points of it can be determined immediately; these points serve as the basis of the image coordinates and, once converted to platform coordinates, also as the basis of detection.
S130, determining whether the feature to be detected is qualified or not according to the platform coordinates of the feature to be detected.
The qualification of the features on the surface of the object is judged according to their platform coordinates. Sizes or distances can be calculated from the platform coordinates of different features to determine whether the positions of the features on the surface of the object meet the production requirements.
For example, in a quality-inspection scene for silk-screened glass, the platform coordinates of the positioning line and the glass edge are identified, the distance between them is calculated, and whether that distance passes qualification verification is judged against the production requirements.
In an alternative embodiment, the feature to be measured includes an edge point feature and an index point feature of the object to be measured; the determining whether the feature to be detected is qualified according to the platform coordinates of each feature to be detected may include: determining the distance from the index point feature to the edge of the object to be detected according to the index platform coordinates of the index point feature and the edge platform coordinates of the edge point feature; and determining whether the feature to be detected is qualified or not according to the distance from the index point feature to the edge of the object to be detected.
The edge point feature may be a point on the edge of the object's surface appearance, such as a vertex or a specially marked point. In the production of silk-screened glass, identifying edge points effectively helps determine the edge line of the glass; a vertex or a pre-marked point on the glass edge may be identified, though the feature is of course not limited to these two cases. The index point feature may be an appearance feature such as a pattern, character, or mark silk-screened onto the glass, because every silk-screened appearance feature has a corresponding production index; if such a feature is defective, the glass cannot be sold as a good product.
Then the edge platform coordinates may be the physical coordinates of the edge point feature on the current station platform. Similarly, the index platform coordinates may be physical coordinates of the index point feature on the current station platform. After the image is identified, the position of the edge of the object to be detected is calculated through the coordinates (the number is not limited) of the edge platform, and whether the corresponding feature to be detected is qualified is judged through the distance from the coordinates of the index platform to the edge. Further, the index point feature may include a character mark point feature. In a quality inspection scenario of a glass article, the index point features may also include glass edges, ink edges, alignment lines, and the like.
In an optional embodiment, the determining the distance from the index point feature to the edge of the object to be measured according to the index platform coordinate of the index point feature and the edge platform coordinate of the edge point feature may include: according to the edge platform coordinates of the features of each edge point, determining a fitting straight line of the edge of the object to be detected at the current station; and determining the distance from the characteristic of the index point to the edge of the object to be measured according to the fitting straight line and the coordinates of the index platform.
The edge platform coordinates of at least two edge points are used to fit an edge straight line or line segment; any fitting algorithm in the related art may be adopted, which is not described herein. The distance from the index platform coordinates of the index point feature to the fitted straight line is then calculated, so that the qualification of the index point feature can be judged from this distance.
For example, to detect whether the silk-screen position of a character mark on silk-screened glass meets the good-product standard: the shape of the character mark is identified by a camera and its vertex found; the vertex's platform coordinates are obtained by converting from image coordinates; points on the edge of the piece of glass are identified by the camera and the edge straight line is fitted; the distance from the vertex's platform coordinates to the fitted straight line is calculated; and this distance is compared with the standard distance in the inspection index to judge whether the good-product standard is met.
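A sketch of this check (hypothetical helper names; the patent leaves the fitting algorithm to the related art — orthogonal least squares is one common choice) fits the edge line to the edge-point platform coordinates and measures the index point's perpendicular distance to it:

```python
import math

def fit_edge_line(points):
    # Orthogonal least-squares line through the edge-point platform
    # coordinates: returns the centroid and a unit direction vector.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal direction
    return (mx, my), (math.cos(theta), math.sin(theta))

def point_to_line_distance(p, line):
    # Perpendicular distance |direction x (p - centroid)|.
    (mx, my), (dx, dy) = line
    return abs(dx * (p[1] - my) - dy * (p[0] - mx))

def is_qualified(index_pt, edge_pts, nominal, tolerance):
    # Compare the measured index-to-edge distance against the
    # production index (nominal distance and tolerance are assumed
    # to come from the product's control requirements).
    d = point_to_line_distance(index_pt, fit_edge_line(edge_pts))
    return abs(d - nominal) <= tolerance
```

For instance, a character-mark vertex at platform coordinates (0.5, 3) against an edge fitted from three collinear edge points lies 3 units from the edge, so it passes if the inspection index specifies 3 with a 0.1 tolerance.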
According to the technical scheme of this embodiment, the pre-calibrated camera affine relationships between the current station and the different cameras at the different detection points are obtained; during inspection of the object, the cameras visually identify the image coordinates of the features on the object, and the physical coordinates of those features on the current station's platform are then determined from the camera affine relationships. Quality inspection of the object is thus assisted by coordinate transformation, improving inspection accuracy and inspection efficiency.
Example two
Fig. 2A is a schematic diagram of multi-station silk-screened-glass quality inspection according to a second embodiment of the present application. This is a preferred embodiment provided on the basis of the foregoing embodiments, illustrated with the multi-station quality inspection of rectangular silk-screened glass products. As shown in fig. 2A, the details are as follows:
In general, as shown in fig. 2A, the coordinate systems of the multiple cameras at each station must be correlated at the different platform points before the dimensions of the silk-screened glass are measured. During quality detection, a glass product is fed to station 1; station 1 moves the product along the long side of the glass plate while the different cameras simultaneously measure the required measurement items on the product's short side at several detection points and calculate the related results. It should be noted that in this embodiment station 1 is provided with 4 cameras and station 2 with 6; in actual production, increasing the number of cameras can improve detection accuracy, and aiming different cameras at different detection objects (features to be detected) can improve detection efficiency. After station 1 finishes, the product moves to station 2, which moves the product along its long side while several cameras simultaneously measure the required measurement items on the product's long side at several platform points and calculate the related results. Finally, whether the product meets the requirements is judged against the preset control requirements for the product.
First, the camera affine relationships between the different detection points and the current station are determined. Using the checkerboard calibration plate that combines circular feature points with two-dimensional codes (as shown in fig. 1B), at least 3 non-collinear circular feature points must be captured in each camera's field of view during calibration, and the calibration plate must not move relative to the carrying platform.
When station 1 performs camera affine-relationship calibration, the calibration plate is placed steadily on the carrying platform, and the platform is first moved to the first detection point required during measurement (the detection points can be set by a technician according to the product's characteristics and actual conditions). At this detection point, the circle-centre coordinates of all recognizable circular feature points in each camera's field of view (the circle-centre image coordinates) are identified, and the calibration-plate physical coordinates corresponding to the two-dimensional-code information inside each circle (the circle-centre physical coordinates) are decoded. From the circle-centre image coordinates P_j(u, v) and the corresponding circle-centre physical coordinates Q_j(x, y), the camera-to-calibration-plate conversion relation A_j is calculated such that P_j A_j = Q_j (j denotes the camera index, j = 1…4, since station 1 has 4 cameras; n ≥ 3, meaning at least 3 circle-centre coordinates and their in-circle two-dimensional codes must be identified), namely:
wherein, in homogeneous coordinates,

\[
\begin{bmatrix} x_1 & y_1 & 1 \\ \vdots & \vdots & \vdots \\ x_n & y_n & 1 \end{bmatrix}
=
\begin{bmatrix} u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots \\ u_n & v_n & 1 \end{bmatrix}
A_j .
\]
All the detection points P_i (i denotes the detection-point index, i = 1, 2, 3, …) are then completed in turn. After the different cameras in station 1 have been calibrated at all detection points, the camera affine relationships of station 1 between each camera and the calibration plate at every detection point take the general form T1_ij, i.e. the conversion relation calibrated at detection point i for camera j (j denotes the camera index, j = 1…4).
Similarly, when calibrating the camera affine relationships of station 2, the operating steps are the same as for station 1. The camera affine relationships between all cameras of station 2 and the calibration plate at each detection point then take the general form T2_ij (i denotes the detection-point index, i = 1, 2, 3, …; j denotes the camera index, j = 5…10, since station 2 is provided with 6 cameras, its camera indices run from 5 to 10).
through calibrating the affine relation of the camera in advance, the general formula can be directly called when the quality inspection of the glass product is carried out later, and the general formula can call the coordinate transformation matrix in the stored library table.
The product is moved to the first detection point on station 1; the features to be detected in the cameras' fields of view — glass edges, ink edges, positioning lines, or character mark points — are identified, and the image coordinates P_jk(PX_jk, PY_jk) of the corresponding feature points are determined, where j denotes the camera index, j = 1…4, and k denotes the index of the point feature measured at that detection point (there may be multiple features to be detected), k = 1, 2, 3, ….
These are converted, through T1_j (the camera affine relationship calibrated at the first detection point position), into the coordinates Q_jk(x_jk, y_jk) in the calibration-plate coordinate system, where j represents the camera serial number, j = 1…4; and k represents the serial number of the point feature measured at that point position, k = 1, 2, 3, ….
Then, for all detection point positions P_i of station 1 in sequence, the features to be detected in the camera fields of view are identified, yielding the image coordinates P1_ijk(PX1_ijk, PY1_ijk) of the feature vertices (i represents the serial number of the detection point, i = 1, 2, 3, …; j represents the camera serial number, j = 1…4; k represents the serial number of the feature to be measured at that point, k = 1, 2, 3, …), which are converted into the coordinates Q1_ijk(x1_ijk, y1_ijk) of the station platform coordinate system at the corresponding point (with i, j, and k as above):
After all coordinates of station 1 have been converted, each feature to be measured is judged using the point-to-line distance as the evaluation criterion. Assume that the feature coordinates to be detected on station 1 are Pp(px, py), and that the head and tail coordinates of the fitted line segment are Lp_1(Lx_1, Ly_1) and Lp_2(Lx_2, Ly_2), respectively. The distance from the point to the line segment is M1_i (i represents the serial number of the measured item of the product, i = 1, 2, 3, …):
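The point-to-line-segment distance M1_i used as the evaluation criterion can be computed by projecting the point onto the segment and clamping the projection parameter to the endpoints. The following is an illustrative sketch; the function name is an assumption, not taken from the patent.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Distance from point p to the segment with head a and tail b."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    denom = np.dot(ab, ab)
    if denom == 0.0:            # degenerate segment: head equals tail
        return float(np.linalg.norm(p - a))
    # Project p onto the line through a and b, clamped to the segment.
    t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))
```

With Pp as `p` and Lp_1, Lp_2 as `a` and `b`, this returns the measured distance M1_i for one item.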
After the calculation of station 1 is completed and the glass moves to station 2, the same steps as for station 1 are carried out, completing the product feature point measurement P2_ijk(PX2_ijk, PY2_ijk) and its conversion Q2_ijk(x2_ijk, y2_ijk) at all point positions P_i of station 2, where i represents the serial number of the point position, i = 1, 2, 3, …; j represents the camera serial number, j = 5…10; and k represents the serial number of the point feature measured at that point, k = 1, 2, 3, ….
Similarly, assume that the feature coordinates to be detected at station 2 are Pp2(p2x, p2y), and that the head and tail coordinates of the fitted line segment are Lp2_1(Lx2_1, Ly2_1) and Lp2_2(Lx2_2, Ly2_2), respectively. The distance from the point to the line segment is M2_i (i represents the serial number of the measured item of the product, i = 1, 2, 3, …).
Finally, according to the preset product control requirements, the standard value is S_i, the positive tolerance is U_i, and the negative tolerance is D_i (in each case i represents the serial number of the measured item of the product, i = 1, 2, 3, …). M1_i and M2_i are recorded together as M_i. If the following judging conditions are all satisfied, the feature to be measured is qualified; if any judging condition is not satisfied, the feature to be measured is unqualified:
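A minimal sketch of the final tolerance judgment follows. Because the judging-condition formula is not reproduced in this text, the inequality S_i − D_i ≤ M_i ≤ S_i + U_i (treating the negative tolerance D_i as a magnitude) is an assumption made for the example, as is the function name.

```python
def is_qualified(measured, standard, pos_tol, neg_tol):
    """Return True only if every measured distance M_i lies within
    [S_i - D_i, S_i + U_i].

    Assumption: the negative tolerance D_i is given as a magnitude and
    the acceptance band is the closed interval above; the source's
    exact judging conditions are not shown in this text.
    """
    return all(s - d <= m <= s + u
               for m, s, u, d in zip(measured, standard, pos_tol, neg_tol))
```

Any single out-of-band item makes the whole feature set unqualified, matching the "any judging condition not satisfied" rule.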
example III
Fig. 3 is a schematic structural diagram of a detection device according to a third embodiment of the present application. As shown in fig. 3, the apparatus 300 includes:
an image coordinate obtaining module 310, configured to obtain image coordinates of at least one feature to be measured of the object to be measured at the current station, and camera affine relations between the current station and at least two cameras at different detection points;
the platform coordinate determining module 320 is configured to determine the platform coordinate of each feature to be measured at the current station according to the image coordinate and the affine relationship of the camera;
the feature to be measured detection module 330 is configured to determine whether the feature to be measured is qualified according to the platform coordinates of each feature to be measured.
In this technical scheme, the camera affine relationships between the different cameras and the current station at different detection points, calibrated in advance, are obtained. During inspection of an object to be detected, the image coordinates of the features to be detected on the object are identified visually by the cameras, and the physical coordinates of those features on the current station platform are then determined according to the camera affine relationships. Quality inspection of the object to be detected is thereby assisted by coordinate transformation, improving detection accuracy and detection efficiency.
In an alternative embodiment, the feature to be measured detection module 330 may include:
the distance determining unit is used for determining the distance from the index point feature to the edge of the object to be detected according to the index platform coordinate of the index point feature and the edge platform coordinate of the edge point feature;
and the qualification testing unit is used for determining whether the feature to be tested is qualified or not according to the distance from the index point feature to the edge of the object to be tested.
In an alternative embodiment, the index point feature comprises a character mark point feature.
In an alternative embodiment, the distance determining unit may include:
the straight line fitting subunit is used for determining a fitting straight line of the edge of the object to be measured at the current station according to the edge platform coordinates of the edge point characteristics;
and the distance determining subunit is used for determining the distance from the characteristic of the index point to the edge of the object to be measured according to the fitting straight line and the coordinates of the index platform.
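The straight-line fitting subunit and distance-determining subunit above can be sketched as follows. The use of a total-least-squares (PCA) fit is an assumption made for illustration, since the text only states that a straight line is fitted to the edge platform coordinates; the function names are likewise hypothetical.

```python
import numpy as np

def fit_edge_line(edge_pts):
    """Fit a straight line through edge-point platform coordinates.

    Uses total least squares via SVD (an assumed fitting choice) and
    returns a point on the line (the centroid) and a unit direction.
    """
    pts = np.asarray(edge_pts, dtype=float)
    centroid = pts.mean(axis=0)
    # The principal direction of the centred points is the line direction.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def point_to_line_distance(p, point_on_line, direction):
    """Perpendicular distance from an index point to the fitted line."""
    d = np.asarray(p, dtype=float) - point_on_line
    # Remove the component along the line; the remainder is the offset.
    return float(np.linalg.norm(d - np.dot(d, direction) * direction))
```

The qualification unit would then compare this distance against the standard value and tolerances, as in the method embodiment.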
In an alternative embodiment, the apparatus 300 further comprises:
the calibration plate image acquisition module is used for acquiring calibration plate images of the calibration plate placed on the current station, shot by at least two cameras, on different detection points; the calibration plate comprises a circular checkerboard and two-dimensional codes, and the two-dimensional codes are embedded in circular feature points of the checkerboard;
the circle center coordinate determining module is used for determining circle center image coordinates of all the round feature points on the calibration plate and circle center physical coordinates corresponding to the two-dimensional codes according to the calibration plate image;
and the affine relation determining module is used for determining the affine relation of cameras of different cameras and the current station on different detection points according to the circle center image coordinates and the circle center physical coordinates.
In an alternative embodiment, the affine relation determining module may include:
the coordinate transformation matrix determining unit is used for respectively determining coordinate transformation matrixes between different cameras and the current station according to the circle center image coordinates and the circle center physical coordinates determined by the different cameras on the different detection points;
and the camera affine relation determining unit is used for storing different coordinate transformation matrixes as camera affine relations.
In an alternative embodiment, the calibration plate image includes at least three non-collinear circular feature points.
The detection device provided by the embodiments of the present application can execute the detection method provided by any embodiment of the present application, and has the corresponding functional modules and beneficial effects for executing the detection method.
Example IV
Fig. 4 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processing devices, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or the computer program loaded from the storage unit 18 into the RAM 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the detection method.
In some embodiments, the detection method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the detection method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the detection method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor capable of receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out the methods of the present application may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in the cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solutions of the present application are achieved, and the present application is not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (10)

1. A method of detection, the method comprising:
acquiring image coordinates of at least one feature to be detected of an object to be detected on a current station and camera affine relations between the current station and at least two cameras on different detection points;
determining the platform coordinates of each feature to be detected at the current station according to the image coordinates and the affine relationship of the camera;
and determining whether the feature to be detected is qualified or not according to the platform coordinates of the feature to be detected.
2. The method of claim 1, wherein the feature to be measured comprises an edge point feature and an index point feature of the object to be measured; determining whether the feature to be detected is qualified according to the platform coordinates of the feature to be detected, including:
determining the distance from the index point feature to the edge of the object to be detected according to the index platform coordinate of the index point feature and the edge platform coordinate of the edge point feature;
and determining whether the feature to be detected is qualified or not according to the distance from the index point feature to the edge of the object to be detected.
3. The method of claim 2, wherein the index point feature comprises a character mark point feature.
4. The method of claim 2, wherein determining the distance of the index point feature to the edge of the object to be measured based on the index plateau coordinates of the index point feature and the edge plateau coordinates of the edge point feature comprises:
determining a fitting straight line of the edge of the object to be detected at the current station according to the edge platform coordinates of the edge point characteristics;
and determining the distance from the index point characteristic to the edge of the object to be detected according to the fitting straight line and the index platform coordinates.
5. The method according to any of claims 1-4, wherein the camera affine relationship between the current workstation and at least two cameras at different detection points is determined by:
acquiring calibration plate images of the calibration plates placed on the current station, shot by at least two cameras, on different detection points; the calibration plate comprises a circular checkerboard and two-dimensional codes, and the two-dimensional codes are embedded in circular feature points of the checkerboard;
determining the center image coordinates of all the round feature points on the calibration plate and the center physical coordinates corresponding to the two-dimensional code according to the calibration plate image;
and determining camera affine relations of different cameras and the current station on different detection points according to the circle center image coordinates and the circle center physical coordinates.
6. The method of claim 5, wherein determining camera affine relationships of different cameras and the current station at different detection points based on the center image coordinates and the center physical coordinates comprises:
according to the circle center image coordinates and the circle center physical coordinates determined by different cameras on different detection points, respectively determining coordinate transformation matrixes between the different cameras and the current station;
and saving different coordinate transformation matrixes as affine relations of the camera.
7. The method of claim 5, wherein the calibration plate image includes at least three non-collinear circular feature points.
8. A detection apparatus, characterized by comprising:
the image coordinate acquisition module is used for acquiring image coordinates of at least one feature to be detected of the object to be detected on the current station and camera affine relations between the current station and at least two cameras on different detection points;
the platform coordinate determining module is used for determining the platform coordinate of each feature to be detected at the current station according to the image coordinate and the affine relationship of the camera;
and the feature detection module to be detected is used for determining whether the feature to be detected is qualified or not according to the platform coordinates of each feature to be detected.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the detection method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores computer instructions for causing a processor to implement the detection method of any one of claims 1-7 when executed.
CN202310583292.5A 2023-05-22 2023-05-22 Detection method, detection device, electronic equipment and storage medium Pending CN116559176A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310583292.5A CN116559176A (en) 2023-05-22 2023-05-22 Detection method, detection device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310583292.5A CN116559176A (en) 2023-05-22 2023-05-22 Detection method, detection device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116559176A true CN116559176A (en) 2023-08-08

Family

ID=87496305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310583292.5A Pending CN116559176A (en) 2023-05-22 2023-05-22 Detection method, detection device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116559176A (en)

Similar Documents

Publication Publication Date Title
CN114220757B (en) Wafer detection alignment method, device and system and computer medium
CN107945184A (en) A kind of mount components detection method positioned based on color images and gradient projection
CN104634242A (en) Point adding system and method of probe
CN113030123B (en) AOI detection feedback system based on Internet of things
CN108627104A (en) A kind of dot laser measurement method of parts height dimension
CN115797359A (en) Detection method and device based on solder paste on circuit board and storage medium
CN113781434A (en) Defect detection method and device, intelligent terminal and computer readable storage medium
CN114628301A (en) Positioning precision determination method of wafer transmission system
CN109712115B (en) Automatic PCB detection method and system
CN108627103A (en) A kind of 2D laser measurement methods of parts height dimension
CN116124081B (en) Non-contact workpiece detection method and device, electronic equipment and medium
CN110006903A (en) Printed circuit board rechecks system, marker method and reinspection method
CN112109374A (en) Method for positioning and controlling assembling and disassembling of bending die based on computer vision system
CN116559176A (en) Detection method, detection device, electronic equipment and storage medium
CN114734444B (en) Target positioning method and device, electronic equipment and storage medium
CN107734324B (en) Method and system for measuring illumination uniformity of flash lamp and terminal equipment
CN116298785A (en) Electronic signal testing method and device, electronic equipment and storage medium
CN115908581A (en) Vehicle-mounted camera pitch angle calibration method, device, equipment and storage medium
CN115422617A (en) Frame image size measuring method, device and medium based on CAD
CN115035481A (en) Image object distance fusion method, device, equipment and storage medium
CN111640096B (en) Method, device and terminal for detecting appearance of electronic product
CN104677906A (en) Image information detecting method
CN110874837A (en) Automatic defect detection method based on local feature distribution
CN107894421B (en) Method for detecting and marking defects of castings by combining photogrammetric system and light pen measuring system
CN116258714B (en) Defect identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination