CN110838099A - Foreign matter detection method, device and system and terminal equipment - Google Patents

Foreign matter detection method, device and system and terminal equipment

Info

Publication number
CN110838099A
CN110838099A (application CN201910957953.XA)
Authority
CN
China
Prior art keywords
image
definition
camera
images
detection method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910957953.XA
Other languages
Chinese (zh)
Inventor
陈军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yanmade Technology Co Ltd
Original Assignee
Shenzhen Yanmade Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yanmade Technology Co Ltd filed Critical Shenzhen Yanmade Technology Co Ltd
Priority to CN201910957953.XA priority Critical patent/CN110838099A/en
Publication of CN110838099A publication Critical patent/CN110838099A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/0008 Industrial image inspection checking presence/absence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

The application belongs to the technical field of foreign matter detection and provides a foreign matter detection method, device, system and terminal equipment. The foreign matter detection method comprises the following steps: acquiring a plurality of images of a detection object; obtaining the definition of each image; comparing the definitions to obtain the image with the highest definition; and performing image processing on that image to detect whether foreign matter is present. The method is automatic: because the presence of foreign matter is detected through image processing, the reliability and accuracy of foreign matter detection are improved and the accuracy and reliability of the detection result are ensured. Moreover, because a plurality of images are examined and the clearest one is selected from them, adequate image definition is guaranteed, which further improves the accuracy of foreign matter detection.

Description

Foreign matter detection method, device and system and terminal equipment
Technical Field
The present application belongs to the technical field of foreign object detection, and in particular, to a foreign object detection method, apparatus, system, and terminal device.
Background
In industrial settings it is sometimes necessary to ensure that the surfaces of electronic devices, such as film-attachment surfaces, gold-plated surfaces and welding surfaces, are smooth and free of foreign matter. If foreign matter such as dust is present on a surface, it must first be removed, or the device must be treated as defective. Although production is now generally carried out in dust-free workshops, fine particles such as dust are still widely present in the air, and quality control often requires that devices, particularly valuable ones, pass a dust-free inspection before the material is used.
In most cases, the surface of the device is inspected manually on site with a microscope for fine particles such as dust. This method is inefficient and costly.
Disclosure of Invention
In view of this, the present application provides a foreign object detection method, apparatus, system and terminal device, so as to solve the problem of low efficiency of the existing method for manually detecting a foreign object.
A first aspect of embodiments of the present application provides a foreign object detection method, including:
acquiring a plurality of images of a detection object;
acquiring the definition of each image;
comparing the definition of each image to obtain an image with the highest definition;
and carrying out image processing on the image with the highest definition, and detecting whether foreign matters exist.
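A minimal sketch of these four steps in NumPy follows. This is an illustration, not the patent's implementation: the helper names, the absolute-difference definition measure and the mean-difference matching score are all assumptions made for the sketch.

```python
import numpy as np

def definition(gray: np.ndarray) -> float:
    """Definition (sharpness) score: sum of |f(x, y) - f(x+1, y)|."""
    g = gray.astype(np.int32)                 # avoid uint8 wrap-around
    return float(np.abs(g[1:, :] - g[:-1, :]).sum())

def detect_foreign_matter(images, template, threshold=0.95):
    """Steps S102-S104: score each image, keep the clearest, then compare
    it with a foreign-matter-free template image of the same size."""
    best = max(images, key=definition)        # image with highest definition
    diff = np.abs(best.astype(np.int32) - template.astype(np.int32))
    match = 1.0 - diff.mean() / 255.0         # 1.0 means identical images
    return bool(match < threshold)            # True: foreign matter suspected
```

The comparison step is a pure tournament over the scores, so any of the definition measures described later in the embodiments can be swapped in for `definition` without changing the rest of the pipeline.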
A second aspect of the embodiments of the present application provides a foreign matter detection apparatus including:
the image acquisition module is used for acquiring a plurality of images of the detection object;
the definition acquisition module is used for acquiring the definition of each image;
the comparison module is used for comparing the definition of each image to obtain the image with the highest definition;
and the image processing module is used for carrying out image processing on the image with the highest definition and detecting whether foreign matters exist.
A third aspect of embodiments of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the foreign object detection method as described in the first aspect of embodiments of the present application when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program that, when executed by a processor, implements the steps of the foreign object detection method provided in the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a foreign object detection system including:
a camera for photographing a detection object;
a moving mechanism moving in a certain direction; and
a terminal device;
the camera shoots a detection object to obtain a plurality of images in the process of moving along the corresponding moving direction under the action of the moving mechanism, the camera outputs the images to the terminal equipment, and the terminal equipment executes the steps of the foreign matter detection method provided by the first aspect of the embodiment of the application.
A sixth aspect of embodiments of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute a foreign object detection method as provided in the first aspect of embodiments of the present application.
Compared with the prior art, the embodiments of the invention have the following beneficial effects. The foreign matter detection method is automatic: a plurality of acquired images of the detection object are processed and analyzed to obtain the definition of each image, the image with the highest definition, i.e. the clearest image, is found, and that image is then analyzed to detect whether foreign matter is present. First, because the presence of foreign matter is detected through image processing, the reliability and accuracy of foreign matter detection are improved and the accuracy and reliability of the detection result are ensured. Second, because a plurality of images are examined and the clearest one is selected from them, adequate image definition is guaranteed, which further improves the accuracy of foreign matter detection.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a foreign object detection method according to an embodiment of the present application;
fig. 2 is a first structural schematic diagram of a foreign object detection system according to a second embodiment of the present application;
fig. 3 is a second structural schematic diagram of a foreign object detection system according to a second embodiment of the present application;
figs. 4-a to 4-j are schematic diagrams illustrating the definition of 10 images detected by the foreign object detection system according to the second embodiment of the present application;
fig. 5 is a schematic structural diagram of a foreign object detection apparatus according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the order of writing each step in this embodiment does not mean the order of execution, and the order of execution of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiment of the present invention.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical means described in the present application, the following description will be given by way of specific embodiments.
Referring to fig. 1, it is a flowchart of an implementation process of a foreign object detection method provided in a first embodiment of the present application, and for convenience of description, only a part related to the embodiment of the present application is shown.
The foreign matter detection method includes:
step S101: several images of the test object are acquired.
In this embodiment, the foreign matter detection method is applied to detecting whether foreign matter, such as fine dust particles, is present on the surface of an electronic device; the detection object is therefore the electronic device and the foreign matter is dust. An image of the detection object is accordingly an image of the surface of the electronic device.
Images of the surface of the electronic device are captured with an image acquisition device such as a camera. The camera photographs the surface to obtain a plurality of images; the exact number is set according to actual needs, but there must be at least two, and the more images there are, the easier it is to find a clear one in the subsequent steps.
The images differ in definition. During photographing, the camera can be fixed and can photograph the surface of the electronic device continuously, so that the resulting images share the same angle and position. Even with a fixed camera, images taken at different focus positions have different definitions, so the focus position can be varied to obtain a plurality of images with different definitions. Alternatively, the camera can be fixed to a moving mechanism that moves in a certain direction. For example, a camera mounting frame may be movably assembled on a guide pillar in the up-down direction, with the camera fixed on the mounting frame so that the camera can move up and down; the camera's position on the guide pillar can be adjusted manually or electrically by a drive motor. In this embodiment, both the camera and the moving mechanism act under control instructions. Specifically, since the foreign matter detection method is executed by the terminal equipment, the terminal equipment outputs a photographing instruction to the camera, instructing it to photograph the detection object, and outputs a motion instruction to the moving mechanism, instructing it to move in a certain direction, so that the camera photographs while the moving mechanism is in motion.
The electronic device is arranged directly below the camera with its surface facing the camera, so the camera can capture a plurality of images while moving in the up-down direction. To keep the differences between images small, the movement range of the camera mounting frame on the guide pillar, i.e. the movement range of the camera, is small, for example 1 cm. When arranging the camera and the guide pillar, it must also be ensured that the camera's best shooting point lies within this movement range. For example, if the best shooting point is at the middle of the range, then as the camera moves from top to bottom the images become clearer and clearer from the highest point to the middle, are clearest at the middle, and become more and more blurred from the middle to the lowest point.
Of course, the images need not all share the same angle and position; however they are captured, the detection area on the surface of the electronic device must appear in each image.
Step S102: and acquiring the definition of each image.
After acquiring a plurality of images of the surface of the electronic device, the terminal device processes each image to obtain the definition of each image.
For a black-and-white photograph, the sharpest image is one in which edges are crisp and sharp rather than blurred. Therefore, after acquiring the images of the surface of the electronic device, each image is converted to grayscale to obtain a corresponding black-and-white image. A black-and-white image is stored as gray-scale dot matrix data with gray values in the range 0 to 255, where 0 represents black and 255 represents white. In the data, a sharp image shows the largest gray difference (jump) between adjacent points at an edge, with no transitional values in between (transitional values would create a gradual, blurred effect). It is like a chessboard in which white squares (gray value 255) change directly to black squares (gray value 0) with no intermediate gray squares: the alternation of black and white makes the image look striking and clear.
Then, a gradation gradient value of each black-and-white image, preferably, a gradation gradient value of an edge of each black-and-white image is acquired. Mathematically, the degree of change is expressed in terms of a gradient, with a larger gradient indicating a larger change (steeper) and a smaller gradient indicating a smaller change (gentler). The larger the gradient at the edge, the clearer the edge and the clearer the image appears; the smaller the gradient, the more blurred the image, i.e. the larger the grey gradient value, the sharper the image is represented.
The gradient of an image can in principle be obtained with first-order or second-order partial derivatives. However, the gray-scale dot matrix of an image is stored in matrix form as discrete values and cannot be differentiated like a mathematical line or curve. One specific way of calculating the gray gradient value is given below:
for each pixel in the image, the calculation also uses the pixel in the same column and the next row; the absolute difference between the gray values of the two pixels is computed, yielding a matrix, and the sum of all elements of that matrix is the definition of the image.
Specifically, for a pixel f(x, y) in the image, the positional relationship with its eight neighboring pixels is:

f(x−1, y−1)  f(x−1, y)  f(x−1, y+1)
f(x, y−1)    f(x, y)    f(x, y+1)
f(x+1, y−1)  f(x+1, y)  f(x+1, y+1)

the absolute variance function is calculated as:
F(k)=∑∑|f(x,y)-f(x+1,y)|
the absolute variance function is a specific gray gradient function; the resulting F(k) is the gray gradient value of the corresponding image, and this gray gradient value is the definition of the image. The function means the following: for any pixel f(x, y), take f(x+1, y), the pixel in the same column and the next row, and compute |f(x, y) − f(x+1, y)|, where f(x, y) and f(x+1, y) both denote gray values. This yields a matrix, and the double summation Σ Σ denotes the sum of all elements of that matrix.
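The double summation maps directly onto two loops. The following sketch (an illustration, not the patent's code) computes F(k) exactly as written, with x indexing rows and y indexing columns:

```python
def absolute_variance(gray):
    """F(k) = sum over x, y of |f(x, y) - f(x+1, y)|, where f(x+1, y)
    is the pixel in the same column and the next row."""
    rows = len(gray)
    cols = len(gray[0])
    total = 0
    for x in range(rows - 1):          # the last row has no pixel below it
        for y in range(cols):
            total += abs(gray[x][y] - gray[x + 1][y])
    return total
```

A chessboard-like patch with direct 0/255 jumps scores the maximum possible value, while a uniform patch scores 0, matching the intuition described above.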
Of course, besides the absolute variance function, the gray gradient function may also be a Roberts gradient sum function, a gradient vector square function, a Brenner function, a Laplacian function, or the like, and any of these may be used to obtain the gray gradient value.
For example, the Roberts gradient sum function is calculated as:

F(k) = Σ Σ ( |f(x, y) − f(x+1, y+1)| + |f(x+1, y) − f(x, y+1)| )
In addition, the gradient of the image can be obtained by convolving the image with a template: convolving with an operator such as Roberts, Prewitt or Sobel yields the gradient values of the whole image as a matrix, and the mean square error (variance) of this gradient matrix is then taken as the gray gradient value.
It should be noted that, in addition to the image sharpness detecting process, this embodiment also provides another image sharpness detecting method, which includes:
dividing the image into a plurality of image areas according to preset area parameters and obtaining the gray values of those image areas. The preset area parameters can be set by the user according to the actual application scenario and are used when detecting image definition; they include the number of image areas, their positions and their sizes, where position and size can be expressed in image dimensions or in pixels. For example, for a camera module with a resolution of 2592 × 1944, i.e. 2592 pixels per row and 1944 pixels per column, the size of an image area may be expressed as 300 pixels × 200 pixels;
acquiring the maximum gray value (the largest of the acquired gray values) and the minimum gray value (the smallest); calculating the mean and the mean square error of the gray values, and computing their sum (mean plus mean square error) and difference (mean minus mean square error); averaging the gray values lying between that sum and the maximum gray value to obtain the maximum gray average value, and averaging the gray values lying between the minimum gray value and that difference to obtain the minimum gray average value;
calculating the image definition according to the maximum gray average value and the minimum gray average value, wherein the calculation formula of the image definition is as follows:
MTF=(I’max-I’min)/(I’max+I’min)
wherein MTF represents image sharpness, I 'max represents maximum gray level average value, and I' min represents minimum gray level average value.
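The region-based MTF measure above can be sketched as follows. The uniform grid layout and the use of the standard deviation for the "mean square error" are interpretive assumptions; the text leaves the region layout to the user.

```python
import numpy as np

def mtf_definition(gray: np.ndarray, rows: int = 4, cols: int = 4) -> float:
    """MTF = (I'max - I'min) / (I'max + I'min) over area-mean gray values."""
    g = gray.astype(np.float64)
    h, w = g.shape
    # Mean gray value of each area in a uniform rows x cols grid.
    means = np.array([
        g[r * h // rows:(r + 1) * h // rows,
          c * w // cols:(c + 1) * w // cols].mean()
        for r in range(rows) for c in range(cols)
    ])
    mu, sigma = means.mean(), means.std()   # mean and mean square error
    hi = means[means >= mu + sigma]         # values up near the maximum
    lo = means[means <= mu - sigma]         # values down near the minimum
    if hi.size == 0 or lo.size == 0:        # degenerate low-contrast case
        return 0.0
    i_max, i_min = hi.mean(), lo.mean()     # I'max and I'min
    total = i_max + i_min
    return float((i_max - i_min) / total) if total else 0.0
```

A half-black, half-white image scores 1.0 (maximum contrast transfer), while a uniform image scores 0.0.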
Step S103: and comparing the definition of each image to obtain the image with the highest definition.
After the gray gradient values of the images are obtained, they are compared to find the maximum gray gradient value, which corresponds to the highest definition; the image with the maximum gray gradient value is the image with the highest definition, i.e. the clearest image.
Step S104: and carrying out image processing on the image with the highest definition, and detecting whether foreign matters exist.
There are many methods for detecting foreign objects, such as template matching, deep learning-based methods, and the like.
The template comparison method comprises the following steps:
comparing the image with the highest definition with a standard template image to obtain a matching value of the image and the standard template image, wherein the standard template image is an image without foreign matters;
if the matching value of the two images is greater than or equal to a set threshold, the image with the highest definition matches the standard template image closely and the difference between them is small; since the standard template image contains no foreign matter, it is determined that the image with the highest definition also contains no foreign matter. If the matching value is smaller than the set threshold, the match is poor and the difference between the two images is large, so it is determined that the image with the highest definition contains foreign matter. For example, with a threshold of 95%: if the matching value between the image with the highest definition and the standard template image is 97% (97% > 95%), it is determined that no foreign matter is present; if the matching value is 80% (80% < 95%), it is determined that foreign matter is present.
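The thresholded comparison can be sketched as below. The 95% default follows the example in the text, but the patent does not fix a particular matching formula; the normalized cross-correlation used here is an assumption of the sketch.

```python
import numpy as np

def match_value(image: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between two same-size grayscale images."""
    a = image.astype(np.float64).ravel()
    b = template.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 1.0

def has_foreign_matter(image, template, threshold=0.95):
    """Match below the threshold: the image differs too much from the
    foreign-matter-free template, so foreign matter is suspected."""
    return match_value(image, template) < threshold
```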
The overall process of the deep learning method is as follows: first, a number of standard foreign-matter-free pictures and a number of pictures containing foreign matter are collected as two classes of samples; these samples are then fed into a deep learning tool to train a discrimination model. In actual detection, the clearest image is input into the discrimination model, which identifies whether foreign matter is present in it.
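The patent names no specific deep learning tool or model. As a deliberately trivial stand-in for the trained discrimination model, the sketch below classifies an image by its nearest class centroid over the two sample sets; a real system would substitute an actual learned classifier here.

```python
import numpy as np

def train_centroids(clean_images, defect_images):
    """Average each class of flattened training images into one centroid."""
    clean = np.mean([im.astype(np.float64).ravel() for im in clean_images], axis=0)
    defect = np.mean([im.astype(np.float64).ravel() for im in defect_images], axis=0)
    return clean, defect

def predict_has_foreign_matter(image, centroids):
    """Classify by the nearer centroid: closer to the defect class
    means foreign matter is suspected."""
    v = image.astype(np.float64).ravel()
    clean, defect = centroids
    return bool(np.linalg.norm(v - defect) < np.linalg.norm(v - clean))
```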
In summary, the foreign matter detection method is automatic: the plurality of acquired images of the detection object are processed and analyzed to obtain the definition of each image, the image with the highest definition, i.e. the clearest image, is found, and that image is then processed to detect whether foreign matter is present. Because the presence of foreign matter is detected through image processing, the reliability and accuracy of foreign matter detection are improved and the accuracy and reliability of the detection result are ensured. Moreover, because a plurality of images are examined and the clearest one is selected from them, adequate image definition is guaranteed, which further improves the accuracy of foreign matter detection.
Fig. 2 is a schematic view of a first structure of a foreign object detection system according to a second embodiment of the present application.
The foreign object detection system includes a camera 201, a movement mechanism 202 and a terminal device 203. The camera 201 photographs the surface of the inspected object, i.e. the electronic device, to obtain a plurality of images, and may be a conventional camera. The movement mechanism 202 can move in a certain direction, such as up-down or left-right, and drives the camera 201 along the corresponding direction; it may include a slide rail, a lead screw, a rack or the like, along which the camera 201 moves.
The camera 201 photographs the surface of the electronic device while being moved in the corresponding direction by the movement mechanism 202, obtaining a plurality of images. An automatic photographing interval may be set inside the camera 201 so that it photographs at a fixed time interval; alternatively, the camera 201 may be controlled by the terminal device 203, which sends it a photographing instruction and a photographing interval. Accordingly, the terminal device 203 may be connected to the camera 201 by wire through a data line or wirelessly through a wireless communication device. The camera 201 may output each image to the terminal device 203 as it is captured or output all images after photographing is complete, and images captured by the camera 201 may also be uploaded to the terminal device 203 manually.
The movement mechanism 202 may be controlled manually or electrically by a drive motor; in the electric case, it may be controlled either by a separate control device or by the terminal device 203. When the movement mechanism 202 is controlled by the terminal device 203, the terminal device 203 may be connected to it by wire or wirelessly.
The terminal device 203 receives the plurality of images captured by the camera 201 and runs a software program implementing the foreign object detection method provided by the present application. The terminal device 203 therefore processes the plurality of images according to that method; for details, see the implementation of the foreign object detection method in the first embodiment of the present application, which is not repeated here.
Fig. 3 is a schematic view of a second structure of the foreign object detection system according to the second embodiment of the present application.
The foreign object detection system includes a camera 301, a moving mechanism and a terminal device 304. The detection object 306, i.e. the electronic device, is arranged on a carrier board 305 to facilitate photographing. The camera 301 is used to photograph the detection object 306 to obtain a plurality of images, and may be a conventional camera.
The moving mechanism includes a guide pillar 302 extending in a certain direction; in this embodiment, the guide pillar 302 extends in the up-down direction, i.e. it is arranged vertically, and the detection object 306 is arranged below it. A camera mounting frame 303 is arranged on the guide pillar 302 and movably assembled with it in the up-down direction; the camera 301 is fixed on the mounting frame 303 with its lens facing downward. The guide pillar 302 may be a guide rail with a sliding slot, with the mounting frame 303 carrying a slider or roller that engages the slot; alternatively, a lead screw may be arranged on the guide pillar 302 in the up-down direction, with the mounting frame 303 assembled on the lead screw so as to move up and down along the guide pillar. The structure and fit between the guide pillar 302 and the mounting frame 303 are therefore not unique. The mounting frame 303 may be moved along the guide pillar 302 manually or under electric control. Together, the guide pillar 302 and the mounting frame 303 form a linear reciprocating module; such an assembly is conventional and is not described in detail.
The camera 301 points vertically downward. Suppose the most suitable distance from the inspection object 306 corresponds approximately to a position D at which the picture taken is sharpest; D is called the sharpest-picture point. As the camera 301 approaches D, the pictures it takes become progressively sharper; as it moves away from D, they become progressively more blurred. Exploiting this property, the sharpest picture can be obtained by shooting continuously while moving, without adjusting the optical parameters of the camera 301, so that shooting and recognition run as a fully automatic process under computer-software control, with the computer driving both the reciprocating motion and the camera. After the system is built, it must be debugged to guarantee that the Nth picture is the sharpest. The debugging method is as follows: tune the movement speed and the shooting interval so that the sharpness of the M pictures taken varies as follows: from the 1st to the Nth, sharpness gradually improves, peaking at the Nth; from the Nth to the last, sharpness gradually degrades. In addition, at least one picture on either side of the Nth must still be sharp enough to meet the requirements; that is, if the Nth picture is the sharpest, the (N-1)th and (N+1)th pictures are also very sharp and differ little from the Nth.
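The debugging criterion above, sharpness rising monotonically up to the Nth picture and falling afterwards, lends itself to an automatic check. The following is only an illustrative sketch (the function name and the example sharpness scores are hypothetical; the scores could be the gray-gradient values described later in this application):

```python
def is_valid_sweep(sharpness):
    """Check that per-image sharpness scores rise to a single peak and then
    fall, as the debugging procedure requires.  The peak must not sit at
    either end of the sweep, so that its neighbours exist and are themselves
    near-peak sharp."""
    n = sharpness.index(max(sharpness))  # index of the sharpest image (the "Nth")
    rising = all(sharpness[i] < sharpness[i + 1] for i in range(n))
    falling = all(sharpness[i] > sharpness[i + 1]
                  for i in range(n, len(sharpness) - 1))
    return 0 < n < len(sharpness) - 1 and rising and falling

# Hypothetical scores from a well-tuned sweep: peak at the 6th image.
print(is_valid_sweep([2.0, 3.1, 4.5, 6.0, 7.8, 8.9, 7.5, 5.9, 4.2, 2.8]))  # True
```

If the check fails, the movement speed or the shooting interval would be adjusted and the sweep repeated.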
In this embodiment, the terminal device 304 controls the camera 301 and the moving mechanism over a wired connection. The terminal device 304 outputs a photographing instruction to the camera 301 to make it shoot automatically at a fixed time interval, and outputs a motion instruction to the moving mechanism to make it move in the up-down direction at a fixed speed.
The camera 301 moves up and down over the range [D - L, D + L], where L may be chosen freely as long as the range covers the best photographing point; setting L to 1 cm is generally sufficient.
At the start of each inspection, the camera 301 begins moving down from the point D - L at speed V, taking a picture every T milliseconds until it reaches D + L, where it stops shooting and then returns to D - L to prepare for the next inspection. Suppose M images are taken, numbered 1, 2, 3, ..., N, N+1, N+2, ..., M. For example, as shown in Figs. 4-a to 4-j, ten images are taken; as can be seen from the figures, the 6th image (Fig. 4-f) is the sharpest, while the 5th and 7th images (Figs. 4-e and 4-g) differ little from it in sharpness. Fig. 4-f is finally selected for dust detection.
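The sweep timing determines how many images M are captured and at which heights: at speed V with a shot every T milliseconds, consecutive shots are V·T/1000 apart across [D - L, D + L]. The sketch below is illustrative only; the function name and the units (millimetres, mm/s) are assumptions, not part of the embodiment:

```python
def sweep_positions(D, L, V, T_ms):
    """Positions (mm) at which photos are taken while the camera moves from
    D - L to D + L at speed V (mm/s), shooting every T_ms milliseconds."""
    step = V * T_ms / 1000.0          # distance travelled between two shots
    positions = []
    p = D - L
    while p <= D + L + 1e-9:          # small epsilon guards float round-off
        positions.append(p)
        p += step
    return positions

# With D = 50 mm, L = 10 mm, V = 20 mm/s and a shot every 100 ms the camera
# records an image every 2 mm, i.e. M = 11 images over the 20 mm sweep.
pos = sweep_positions(50, 10, 20, 100)
print(len(pos), pos[0], pos[-1])  # 11 40.0 60.0
```

In practice V and T are the two knobs the debugging procedure above tunes so that the sharpest position D falls well inside the numbered sequence.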
The camera 301 outputs the M captured images to the terminal device 304, which runs a software program implementing the foreign object detection method provided by the present application. The terminal device 304 therefore processes the images by that method; for details, refer to the implementation of the foreign object detection method provided by the first embodiment of the present application, which is not repeated here.
The foreign matter detection system therefore needs no manual focusing of the camera. It avoids the complex manual focusing procedure; it avoids the large operator-to-operator variation of manual focusing, which makes picture sharpness hard to guarantee; and it avoids the camera being knocked out of position during manual focusing, which would make every subsequent shot blurred. As a result, the sharpest image can be acquired reliably and accurately, the reliability and accuracy of foreign matter detection are improved, the detection results are accurate and dependable, and detection efficiency is markedly improved.
Fig. 5 shows a block diagram of a foreign object detection device provided in the third embodiment of the present application, corresponding to the foreign object detection method described in the above embodiment of the foreign object detection method.
Referring to fig. 5, the foreign object detection apparatus 400 includes:
an image acquisition module 401, configured to acquire a plurality of images of a detection object;
a definition obtaining module 402, configured to obtain a definition of each image;
a comparing module 403, configured to compare the definitions of the images to obtain an image with the highest definition;
an image processing module 404, configured to perform image processing on the image with the highest definition, and detect whether a foreign object exists.
Optionally, the acquiring the definition of each image includes:
graying the plurality of images to obtain corresponding black-and-white images;
and acquiring the gray gradient value of each black-and-white image, wherein the gray gradient value is the definition of the image.
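The graying and gray-gradient steps can be sketched as follows. This is an illustrative implementation only (NumPy is assumed; the application does not prescribe a particular gradient operator, and mean absolute forward differences are used here as one simple choice):

```python
import numpy as np

def to_gray(rgb):
    """Gray an RGB image with standard luminance weights (one possible
    graying step; the embodiment only requires some grayscale conversion)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def sharpness(gray):
    """Mean absolute gray-level gradient of a black-and-white image: a
    well-focused image has stronger local contrast, so a larger value."""
    g = gray.astype(np.float64)
    gx = np.abs(np.diff(g, axis=1)).mean()   # horizontal differences
    gy = np.abs(np.diff(g, axis=0)).mean()   # vertical differences
    return gx + gy

def sharpest_index(images):
    """Comparing step: index of the image with the highest sharpness."""
    return max(range(len(images)), key=lambda i: sharpness(images[i]))
```

A checkerboard pattern scores far higher than a flat gray field, which is the property the comparing module relies on to pick the in-focus frame.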
Optionally, the foreign object detection apparatus 400 further includes a control instruction output module configured to:
outputting a photographing instruction to a camera, wherein the photographing instruction is used for instructing the camera to photograph the detection object;
and outputting a motion instruction to a motion mechanism, wherein the motion instruction is used for indicating the motion mechanism to move along a certain direction so as to control the camera to take pictures in the motion process.
Optionally, the performing image processing on the image with the highest definition, and detecting whether a foreign object exists includes:
comparing the image with the highest definition with a standard template image to obtain a matching value of the image and the standard template image; the standard template image is an image without foreign matters;
if the matching value of the two is larger than or equal to the threshold value, judging that the image with the highest definition has no foreign matters, otherwise, judging that the image with the highest definition has the foreign matters.
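A minimal sketch of this template comparison follows, using normalized cross-correlation as the matching value. This metric is an assumption for illustration; the embodiment fixes only the threshold rule, not the specific matching measure:

```python
import numpy as np

def match_value(image, template):
    """Normalized cross-correlation between a captured image and the
    foreign-matter-free standard template of the same size.
    1.0 means the images match exactly."""
    a = image.astype(np.float64) - image.mean()
    b = template.astype(np.float64) - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 1.0

def has_foreign_matter(image, template, threshold=0.95):
    """Decision rule of the method: a matching value below the threshold
    means a foreign object is judged to be present.  The threshold value
    here is a placeholder chosen for illustration."""
    return match_value(image, template) < threshold
```

A clean capture matches the template almost perfectly, while a dust particle or other foreign matter disturbs the correlation and pushes the matching value below the threshold.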
It should be noted that, because the information interaction and execution processes between the above devices/modules are based on the same concept as the foreign object detection method embodiment of the present application, their specific functions and technical effects can be found in the foreign object detection method embodiment and are not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, the above division into functional modules is merely illustrative; in practical applications the functions may be allocated to different functional modules as needed, that is, the internal structure of the foreign object detection apparatus 400 may be divided into different functional modules to perform all or part of the functions described above. The functional modules in the embodiments may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit; the integrated unit may be implemented in hardware or as a software functional unit. The specific names of the functional modules serve only to distinguish them from one another and do not limit the scope of protection of the application. For the specific working process of each functional module, refer to the corresponding process in the foregoing foreign object detection method embodiment, which is not repeated here.
Fig. 6 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application. As shown in Fig. 6, the terminal device 500 includes a processor 502, a memory 501, and a computer program 503 stored in the memory 501 and executable on the processor 502. There is at least one processor 502; one is shown in Fig. 6 as an example. When executing the computer program 503, the processor 502 carries out the steps of the foreign object detection method described above, i.e., the steps shown in Fig. 1.
For the specific implementation process of the terminal device 500, refer to the foreign object detection method embodiment above.
Illustratively, the computer program 503 may be partitioned into one or more modules/units that are stored in the memory 501 and executed by the processor 502 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 503 in the terminal device 500.
The terminal device 500 may be a computing device such as a desktop computer, a notebook, or a palmtop computer, or a device such as a camera or a mobile phone that has both an image capturing function and a data processing function. The terminal device 500 may include, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that Fig. 6 is only an example of the terminal device 500 and does not limit it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the terminal device 500 may further include input-output devices, a network access device, a bus, and so on.
The processor 502 may be a CPU (Central Processing Unit), another general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 501 may be an internal storage unit of the terminal device 500, such as a hard disk or internal memory. The memory 501 may also be an external storage device of the terminal device 500, such as a plug-in hard disk, an SMC (Smart Media Card), an SD (Secure Digital) card, or a flash card fitted to the terminal device 500. Further, the memory 501 may include both an internal storage unit and an external storage device of the terminal device 500. The memory 501 stores the operating system, application programs, a boot loader, data, and other programs, such as the program code of the computer program 503; it may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when being executed by a processor, the computer program implements the steps in the foregoing foreign object detection method embodiments.
If implemented as a software functional unit and sold or used as a stand-alone product, the integrated unit may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the foreign object detection method embodiments above may be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of those embodiments are carried out. The computer program comprises computer program code, which may take the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/terminal device, a recording medium, computer memory, ROM (Read-Only Memory), RAM (Random Access Memory), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunication signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. A foreign object detection method, comprising:
acquiring a plurality of images of a detection object;
acquiring the definition of each image;
comparing the definition of each image to obtain an image with the highest definition;
and carrying out image processing on the image with the highest definition, and detecting whether foreign matters exist.
2. The foreign object detection method according to claim 1, wherein the acquiring the sharpness of each image includes:
graying the plurality of images to obtain corresponding black and white images;
and acquiring the gray gradient value of each black-and-white image, wherein the gray gradient value is the definition of the image.
3. The foreign object detection method according to claim 1, further comprising:
outputting a photographing instruction to a camera, wherein the photographing instruction is used for instructing the camera to photograph the detection object;
and outputting a motion instruction to a motion mechanism, wherein the motion instruction is used for indicating the motion mechanism to move along a certain direction so as to control the camera to take pictures in the motion process.
4. The foreign object detection method according to any one of claims 1 to 3, wherein the performing image processing on the image with the highest definition, and detecting whether a foreign object exists, includes:
comparing the image with the highest definition with a standard template image to obtain a matching value of the image and the standard template image; the standard template image is an image without foreign matters;
if the matching value of the two is larger than or equal to the threshold value, judging that the image with the highest definition has no foreign matters, otherwise, judging that the image with the highest definition has the foreign matters.
5. A foreign matter detection device, characterized by comprising:
the image acquisition module is used for acquiring a plurality of images of the detection object;
the definition acquisition module is used for acquiring the definition of each image;
the comparison module is used for comparing the definition of each image to obtain the image with the highest definition;
and the image processing module is used for carrying out image processing on the image with the highest definition and detecting whether foreign matters exist.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the foreign object detection method according to any one of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the foreign object detection method according to any one of claims 1 to 4.
8. A foreign object detection system, comprising:
a camera for photographing a detection object;
a moving mechanism moving in a certain direction; and
a terminal device;
the camera photographs a detection object to obtain a plurality of images in the process of moving along the corresponding moving direction under the action of the moving mechanism, the camera outputs the plurality of images to the terminal equipment, and the terminal equipment executes the steps of the foreign matter detection method according to any one of claims 1 to 4.
9. The system of claim 8, wherein the moving mechanism includes a guide post extending along the certain direction, the guide post is provided with a camera mounting bracket, the camera mounting bracket is movably assembled with the guide post along the certain direction, and the camera is fixed on the camera mounting bracket.
CN201910957953.XA 2019-10-10 2019-10-10 Foreign matter detection method, device and system and terminal equipment Pending CN110838099A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910957953.XA CN110838099A (en) 2019-10-10 2019-10-10 Foreign matter detection method, device and system and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910957953.XA CN110838099A (en) 2019-10-10 2019-10-10 Foreign matter detection method, device and system and terminal equipment

Publications (1)

Publication Number Publication Date
CN110838099A true CN110838099A (en) 2020-02-25

Family

ID=69575382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910957953.XA Pending CN110838099A (en) 2019-10-10 2019-10-10 Foreign matter detection method, device and system and terminal equipment

Country Status (1)

Country Link
CN (1) CN110838099A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522074A (en) * 2020-05-29 2020-08-11 深圳市燕麦科技股份有限公司 Microphone detection device and microphone detection method
CN111601227A (en) * 2020-05-29 2020-08-28 深圳市燕麦科技股份有限公司 Microphone detection device and microphone detection method
CN111639708A (en) * 2020-05-29 2020-09-08 深圳市燕麦科技股份有限公司 Image processing method, image processing apparatus, storage medium, and device
CN111812106A (en) * 2020-09-15 2020-10-23 沈阳风驰软件股份有限公司 Method and system for detecting glue overflow of appearance surface of wireless earphone
CN113538431A (en) * 2021-09-16 2021-10-22 深圳市鑫信腾科技股份有限公司 Display screen flaw positioning method and device, terminal equipment and system
CN113873231A (en) * 2021-09-26 2021-12-31 江西盛泰精密光学有限公司 System and method for monitoring baking of camera module

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793918A (en) * 2014-03-07 2014-05-14 深圳市辰卓科技有限公司 Image definition detecting method and device
CN103927749A (en) * 2014-04-14 2014-07-16 深圳市华星光电技术有限公司 Image processing method and device and automatic optical detector
CN106204524A (en) * 2016-06-23 2016-12-07 凌云光技术集团有限责任公司 A kind of method and device of evaluation image quality
CN106570028A (en) * 2015-10-10 2017-04-19 比亚迪股份有限公司 Mobile terminal, fuzzy image deletion method and fuzzy picture deletion device
CN109714535A (en) * 2019-01-15 2019-05-03 南京信息工程大学 A kind of auto-focusing machine vision metrology device and method based on color difference
US20190228254A1 (en) * 2018-01-24 2019-07-25 Denso Ten Limited Extraneous-matter detecting apparatus and extraneous-matter detecting method
US20190294917A1 (en) * 2018-03-22 2019-09-26 Rota Technologies LLC Debris detection system and method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793918A (en) * 2014-03-07 2014-05-14 深圳市辰卓科技有限公司 Image definition detecting method and device
CN103927749A (en) * 2014-04-14 2014-07-16 深圳市华星光电技术有限公司 Image processing method and device and automatic optical detector
CN106570028A (en) * 2015-10-10 2017-04-19 比亚迪股份有限公司 Mobile terminal, fuzzy image deletion method and fuzzy picture deletion device
CN106204524A (en) * 2016-06-23 2016-12-07 凌云光技术集团有限责任公司 A kind of method and device of evaluation image quality
US20190228254A1 (en) * 2018-01-24 2019-07-25 Denso Ten Limited Extraneous-matter detecting apparatus and extraneous-matter detecting method
US20190294917A1 (en) * 2018-03-22 2019-09-26 Rota Technologies LLC Debris detection system and method
CN109714535A (en) * 2019-01-15 2019-05-03 南京信息工程大学 A kind of auto-focusing machine vision metrology device and method based on color difference

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG ZAIHUA et al.: "Image Sharpness Evaluation Function Based on Edge Feature Extraction", Computer Engineering and Applications *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522074A (en) * 2020-05-29 2020-08-11 深圳市燕麦科技股份有限公司 Microphone detection device and microphone detection method
CN111601227A (en) * 2020-05-29 2020-08-28 深圳市燕麦科技股份有限公司 Microphone detection device and microphone detection method
CN111639708A (en) * 2020-05-29 2020-09-08 深圳市燕麦科技股份有限公司 Image processing method, image processing apparatus, storage medium, and device
CN111601227B (en) * 2020-05-29 2021-12-31 深圳市燕麦科技股份有限公司 Microphone detection device and microphone detection method
CN111812106A (en) * 2020-09-15 2020-10-23 沈阳风驰软件股份有限公司 Method and system for detecting glue overflow of appearance surface of wireless earphone
CN111812106B (en) * 2020-09-15 2020-12-08 沈阳风驰软件股份有限公司 Method and system for detecting glue overflow of appearance surface of wireless earphone
CN113538431A (en) * 2021-09-16 2021-10-22 深圳市鑫信腾科技股份有限公司 Display screen flaw positioning method and device, terminal equipment and system
CN113873231A (en) * 2021-09-26 2021-12-31 江西盛泰精密光学有限公司 System and method for monitoring baking of camera module
CN113873231B (en) * 2021-09-26 2024-05-03 江西盛泰精密光学有限公司 Monitoring system and method for baking camera module

Similar Documents

Publication Publication Date Title
CN110838099A (en) Foreign matter detection method, device and system and terminal equipment
US10210415B2 (en) Method and system for recognizing information on a card
CN109479082B (en) Image processing method and apparatus
CN109829904B (en) Method and device for detecting dust on screen, electronic equipment and readable storage medium
CN109085113B (en) Automatic focusing method and device for cervical exfoliated cell detection device
CN106934806B (en) It is a kind of based on text structure without with reference to figure fuzzy region dividing method out of focus
US9482855B2 (en) Microscope system
CN108764139B (en) Face detection method, mobile terminal and computer readable storage medium
CN109214996B (en) Image processing method and device
CN112396073A (en) Model training method and device based on binocular images and data processing equipment
CN113066088A (en) Detection method, detection device and storage medium in industrial detection
US20090021595A1 (en) Low Memory Auto-Focus and Exposure System for Large Multi-Frame Image Acquisition
CN108107611B (en) Self-adaptive defect detection method and device and electronic equipment
CN112461853B (en) Automatic focusing method and system
CN112950618B (en) Appearance defect detection method and system
CN112272267A (en) Shooting control method, shooting control device and electronic equipment
CN111486790A (en) Full-size detection method and device for battery
CN109995985B (en) Panoramic image shooting method and device based on robot and robot
CN115578291A (en) Image brightness correction method, storage medium and electronic device
CN115330859A (en) Automatic focusing and automatic centering method and system based on machine vision
RU2647645C1 (en) Method of eliminating seams when creating panoramic images from video stream of frames in real-time
Lin et al. Depth recovery from motion blurred images
CN114785953A (en) SFR-based camera automatic focusing method and device
CN114885095A (en) Image acquisition control method and device, image acquisition system and readable storage medium
CN114219758A (en) Defect detection method, system, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200225
