WO2022104900A1 - Dirty image detection method, dirty image detection device, and dirty image detection mechanism - Google Patents

Dirty image detection method, dirty image detection device, and dirty image detection mechanism

Info

Publication number
WO2022104900A1
Authority
WO
WIPO (PCT)
Prior art keywords
dirt
area
unit
image
detection area
Prior art date
Application number
PCT/CN2020/132652
Other languages
English (en)
Chinese (zh)
Inventor
罗涛
Original Assignee
诚瑞光学(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 诚瑞光学(深圳)有限公司
Publication of WO2022104900A1 publication Critical patent/WO2022104900A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Definitions

  • the present application relates to the technical field of contamination detection, and in particular to a contamination image detection method, a contamination image detection device, and a contamination image detection mechanism.
  • A mobile phone lens is an optical element, assembled from lens elements, that is used for image capture.
  • the surface of a mobile phone lens is easily contaminated through direct or indirect contact, which degrades the phone's imaging quality. Contamination detection of the lens is therefore an essential process.
  • at present, contamination detection relies mainly on manual identification under a high-power microscope, which involves a large workload and a degree of subjectivity, affecting the stability of lens inspection.
  • a dirty image detection method, comprising: acquiring an image of the lens to be detected; determining the range of the detection area; determining the dirt particles within the detection area; and detecting the presence or absence of dirt in the detection area.
  • a dirty image detection device comprising:
  • the image acquisition unit is used to acquire an image of the lens to be detected
  • the first processing unit is used to determine the range of the detection area
  • the second processing unit is configured to determine the dirt particles within the detection area
  • a detection unit configured to detect whether there is dirt in the detection area.
  • the determining of the dirt particles within the detection area includes: dividing the detection area into multiple local areas; capturing multiple pixel points in each local area; selecting a pixel point to be determined; comparing the pixel point with the surrounding pixel points to obtain the gray-level differences between them; and, based on the difference values, determining whether the pixel point is a dirt particle;
  • the detecting of whether there is dirt in the detection area based on the dirt particles includes one, two, or three of the following methods: determining whether there is individual-type dirt in the detection area; determining whether there is aggregation-type dirt in the detection area; and determining whether there is region-type dirt in the detection area;
  • the determining of whether there is individual-type dirt in the detection area includes: selecting the dirt particles to be determined; calculating the area of the dirt particles; and, based on that area, determining whether the dirt particles are individual-type dirt;
  • the determining of whether there is aggregation-type dirt in the detection area includes: selecting the dirt particles to be determined; defining an aggregation distance; connecting each dirt particle with the other dirt particles within the aggregation distance to form a dirt particle set; calculating the area of the dirt particle set; and, based on that area, determining whether the set is aggregation-type dirt;
  • the determining of whether there is region-type dirt in the detection area includes: defining a judgment area; calculating the number and area of the dirt particles in the judgment area; and, based on that number and area, determining whether the judgment area is region-type dirt.
  • a dirty image detection device comprising:
  • the image acquisition unit is used to acquire an image of the lens to be detected
  • the first processing unit is used to determine the range of the detection area
  • the second processing unit is configured to determine the dirt particles within the detection area
  • a detection unit which is used to detect whether there is dirt in the detection area
  • the second processing unit includes: a dividing unit for dividing the detection area into a plurality of local areas; a capturing unit for capturing a plurality of pixel points in each local area; an image selection unit for selecting the pixel point to be determined; a comparison unit for comparing the pixel point with the surrounding pixel points to obtain their gray-level differences; and a dirt particle acquisition unit for determining, based on the difference values, whether the pixel point is a dirt particle;
  • the detection unit includes one, two or three of the following units: a first determination unit for determining whether there is individual-type dirt in the detection area; a second determination unit for determining whether there is aggregation-type dirt in the detection area; and a third determination unit for determining whether there is region-type dirt in the detection area;
  • the first determination unit includes: a first selection unit for selecting the dirt particles to be determined; a first calculation unit for calculating the area of the dirt particles; and a first determination subunit for determining whether the dirt particles are individual-type dirt;
  • the second determination unit includes: a second selection unit for selecting the dirt particles to be determined; a first definition unit for defining an aggregation distance; a dirt particle connecting unit for connecting each dirt particle with the other dirt particles within the aggregation distance to form a dirt particle set; a second calculation unit for calculating the area of the dirt particle set; and a second determination subunit for determining whether the set is aggregation-type dirt;
  • the third determination unit includes: a second definition unit for defining a judgment area; a third calculation unit for calculating the number and area of the dirt particles in the judgment area; and a third determination subunit for determining whether the judgment area is region-type dirt.
  • the present application also provides a dirty image detection mechanism for implementing the above dirty image detection method, which includes a camera for capturing the image of the lens, a driving assembly for driving the camera to move vertically, and a light-emitting assembly for illuminating the lens.
  • the image of the lens is acquired first; the range of the detection area is then determined from that image; the dirt particles within the detection area are obtained next; and finally, the dirt in the detection area is judged based on those particles.
  • FIG. 1 is a schematic flowchart of a dirty image detection method provided by a first embodiment of the present application.
  • FIG. 2 is a schematic partial flowchart of a dirty image detection method provided by a second embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a dirty image detection method provided by a third embodiment of the present application.
  • FIG. 4 is a schematic partial flowchart of a dirty image detection method provided by a fourth embodiment of the present application.
  • FIG. 5 is a schematic partial flowchart of a dirty image detection method provided by a fifth embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a dirty image detection method provided by a sixth embodiment of the present application.
  • FIG. 7 is a schematic partial flowchart of a dirty image detection method provided by a seventh embodiment of the present application.
  • FIG. 8 is a schematic partial flowchart of a dirty image detection method provided by an eighth embodiment of the present application.
  • FIG. 9 is a schematic partial flowchart of a dirty image detection method provided by a ninth embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a dirty image detection device provided by the present application.
  • FIG. 11 is a schematic structural diagram of a detection unit of the dirty image detection device as provided in FIG. 10 .
  • FIG. 12 is a schematic structural diagram of the first determination unit of the detection unit provided in FIG. 11 .
  • FIG. 13 is a schematic structural diagram of the second determination unit of the detection unit provided in FIG. 11 .
  • FIG. 14 is a schematic structural diagram of the third determination unit of the detection unit provided in FIG. 11 .
  • FIG. 15 is a schematic structural diagram of the second processing unit of the dirty image detection apparatus as provided in FIG. 10 .
  • FIG. 16 is a schematic structural diagram of a dirty image detection mechanism provided by the present application.
  • FIG. 1 shows a schematic flowchart of a dirty image detection method provided by a first embodiment of the present application.
  • the first embodiment of the present application will be described in detail below with reference to FIG. 1 .
  • S100 Acquire an image of the lens to be detected. The image contains the lens under test and also its periphery.
  • S200 Determine the range of the detection area based on the image.
  • the main function of this step is to remove the image around the lens to be detected, so as to avoid the interference of the peripheral image to the contamination detection.
  • S300 Determine the dirt particles within the detection area.
  • Dirty particles refer to particles that may be judged as dirty.
  • S400 Determine the dirt in the detection area based on the dirt particles.
  • the candidate dirt particles identified above are judged by comparing them against the characteristics of dirt.
  • different lens models have different criteria for judging dirt.
  • the dirt particles on the lens can be detected stably and objectively without manual operation, which greatly improves the dirt detection efficiency.
  • FIG. 2 shows a partial schematic flowchart of a dirty image detection method provided by the second embodiment of the present application.
  • the second embodiment of the present application will be described in detail below with reference to FIG. 2 .
  • the center coordinates of the detection area are calculated by a visual algorithm.
  • the center coordinates of the detection area are calculated by a caliper algorithm.
  • the caliper algorithm first divides the detection area into uniform blocks, then takes sample points in each block, and finally obtains a circle or a straight line through circle fitting or line fitting, from which the center coordinates of the detection area are derived.
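The fitting step of the caliper approach can be sketched with a least-squares circle fit. The following is a minimal illustration, not the patent's actual algorithm; it assumes the sampled edge points from the divided blocks are already available as (x, y) pairs:

```python
import math

def fit_circle(points):
    """Least-squares (Kasa) circle fit through sampled edge points.

    Minimizes sum((x^2 + y^2) + D*x + E*y + F)^2; the center is
    (-D/2, -E/2) and the radius is sqrt(cx^2 + cy^2 - F).
    """
    n = float(len(points))
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    syy = sum(y * y for _, y in points)
    sxy = sum(x * y for x, y in points)
    sz = sum(x * x + y * y for x, y in points)
    sxz = sum(x * (x * x + y * y) for x, y in points)
    syz = sum(y * (x * x + y * y) for x, y in points)
    # Normal equations: A @ [D, E, F] = b
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    b = [-sxz, -syz, -sz]
    D, E, F = _solve3(A, b)
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - F)

def _solve3(A, b):
    # Gaussian elimination with partial pivoting on a 3x3 system.
    m = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x
```

The same normal-equations pattern extends to the straight-line case mentioned in the text.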
  • S220 Define the radius of the detection area.
  • the radius of the detection area is defined according to the model of the lens.
  • FIG. 3 shows a schematic flowchart of a dirty image detection method provided by a third embodiment of the present application.
  • the third embodiment of the present application will be described in detail below with reference to FIG. 3 .
  • step S250 is also included.
  • Bright spots are spots formed by light hitting the lens.
  • bright spots appear in the image of the lens and are easily mistaken for dirt, which distorts the judgment result; removing the bright spots that would affect the result therefore ensures the accuracy of the contamination determination.
  • FIG. 4 shows a schematic flowchart of part of the dirty image detection method provided by the fourth embodiment of the present application.
  • the fourth embodiment of the present application will be described in detail below with reference to FIG. 4 .
  • Step S250 specifically further includes the following steps.
  • S251 Define the maximum long-axis radius of the circumscribed ellipse of dirt.
  • S252 Remove the bright spots whose circumscribed-ellipse short-axis radius is greater than the maximum long-axis radius of dirt.
  • since the maximum long-axis radius of the circumscribed ellipse of dirt has been defined, any bright spot larger than that radius cannot be dirt.
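The size filter of S251-S252 can be sketched as follows. This is only an illustration under stated assumptions: each blob is a list of (x, y) pixel coordinates, and half the largest pairwise pixel distance stands in as a crude long-axis radius instead of a true circumscribed-ellipse fit:

```python
import math

def remove_large_bright_spots(blobs, max_dirt_long_radius):
    """Drop bright spots that are too large to be dirt.

    Sketch: a blob's size is approximated as half of its largest
    pairwise pixel distance; blobs exceeding the defined maximum
    long-axis radius of dirt cannot be dirt and are removed.
    """
    kept = []
    for blob in blobs:
        radius = max(math.dist(p, q) for p in blob for q in blob) / 2.0
        if radius <= max_dirt_long_radius:
            kept.append(blob)
    return kept
```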
  • FIG. 5 shows a schematic flowchart of part of the dirty image detection method provided by the fifth embodiment of the present application.
  • the fifth embodiment of the present application will be described in detail below with reference to FIG. 5 .
  • Step S300 specifically further includes the following steps.
  • S310 Divide the detection area into multiple local areas; the detection area is thus composed of multiple local areas.
  • S320 Capture multiple pixel points in the local area.
  • a pixel is one of the small squares that make up an image; each square has a definite position and an assigned color value, and together the squares' colors and positions determine how the image appears.
  • a grayscale digital image carries only one sampled intensity per pixel, and such images are usually displayed in shades of gray ranging from the darkest black to the brightest white.
  • the range between black and white is divided into a number of levels called "gray levels", generally from 0 to 255, with white at 255 and black at 0; for this reason, black-and-white images are also called grayscale images.
  • Steps S330 and S340 are repeated multiple times to compare and determine all the pixel points.
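The pixel comparison of steps S330-S340 can be sketched as follows. The threshold and the use of the mean absolute difference over the 8-neighborhood are assumptions for illustration; the patent does not fix a specific aggregation rule:

```python
def is_dirt_candidate(img, x, y, diff_threshold):
    """Compare a pixel's gray level with its surrounding pixels.

    `img` is a 2D list of gray levels. The pixel at (x, y) is a
    dirt candidate when the mean absolute gray-level difference to
    its in-bounds 8-neighbors exceeds `diff_threshold`.
    """
    h, w = len(img), len(img[0])
    diffs = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if (dx or dy) and 0 <= nx < w and 0 <= ny < h:
                diffs.append(abs(img[y][x] - img[ny][nx]))
    return sum(diffs) / len(diffs) > diff_threshold
```

Repeating this test over every pixel of a local area yields the candidate dirt particles for that area.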
  • a dynamic threshold method may be used to assist in completing step S300.
  • the dynamic threshold method divides the image into blocks and selects a separate threshold for each block.
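A minimal sketch of such block-wise thresholding, assuming dark dirt on a brighter background and an assumed `offset` parameter (the patent does not specify the per-block rule):

```python
def dynamic_threshold(img, block_size, offset):
    """Block-wise (dynamic) threshold over a 2D list of gray levels.

    Each block gets its own threshold derived from the block mean,
    so uneven illumination across the lens does not swamp a single
    global threshold.
    """
    h, w = len(img), len(img[0])
    mask = [[0] * w for _ in range(h)]
    for by in range(0, h, block_size):
        for bx in range(0, w, block_size):
            ys = range(by, min(by + block_size, h))
            xs = range(bx, min(bx + block_size, w))
            vals = [img[y][x] for y in ys for x in xs]
            mean = sum(vals) / len(vals)
            for y in ys:
                for x in xs:
                    # Flag pixels darker than the local mean by
                    # more than `offset` as dirt candidates.
                    if mean - img[y][x] > offset:
                        mask[y][x] = 1
    return mask
```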
  • FIG. 6 shows a schematic flowchart of the dirty image detection method provided by the sixth embodiment of the present application.
  • the sixth embodiment of the present application will be described in detail below with reference to FIG. 6 .
  • Step S400 includes any one or more of the following three methods.
  • S410 Determine whether there is contamination of the individual type in the detection area.
  • this method is mainly used to determine dirty particles with a large area.
  • S420 Determine whether or not there is aggregated contamination in the detection area.
  • this method is mainly used to determine a plurality of dirt particles that are relatively close together or relatively concentrated.
  • S430 Determine whether there is area-type contamination in the detection area.
  • this method is mainly used to determine a plurality of relatively dispersed dirt particles.
  • FIG. 7 shows a schematic flowchart of part of a dirty image detection method provided by the seventh embodiment of the present application.
  • the seventh embodiment of the present application will be described in detail below with reference to FIG. 7 .
  • Step S410 specifically includes the following steps.
  • S411 Select the dirty particles to be determined.
  • in step S230, the range of the detection area and the coordinates of each point within it are obtained.
  • the area of the dirty particles can be obtained according to the coordinates of the dirty particles.
  • S413 Based on the area of the dirt particles, determine whether the dirt particles belong to individual types of dirt.
  • the data of the contamination is collected.
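Steps S411-S413 can be sketched as a simple area filter. Here a particle is assumed to be stored as a list of its pixel coordinates, so its area is its pixel count, and the threshold is an assumed per-lens-model value:

```python
def individual_dirt(particles, min_area):
    """Individual-type check: a particle whose pixel-count area
    reaches the per-lens-model threshold is individual dirt."""
    return [p for p in particles if len(p) >= min_area]
```

In practice the threshold would come from the dirt criteria collected for the specific lens model.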
  • FIG. 8 shows a schematic flowchart of part of the dirty image detection method provided by the eighth embodiment of the present application.
  • the eighth embodiment of the present application will be described in detail below with reference to FIG. 8 .
  • Step S420 specifically includes the following steps.
  • the size of the aggregation distance is determined by the lens model.
  • in step S230, the range of the detection area and the coordinates of each point within it are obtained.
  • the area of the set of dirty particles can be obtained according to the coordinates of the set of dirty particles.
  • S425 Based on the area of the contamination particle set, it is determined whether the contamination particle set belongs to aggregated contamination.
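Steps S421-S425 can be sketched with a union-find over particle centroids. This is an illustration under assumptions: particles are lists of pixel coordinates, centroids stand in for exact particle-to-particle distance, and both thresholds are assumed values:

```python
import math

def aggregated_dirt(particles, agg_distance, min_total_area):
    """Aggregation-type check: connect particles whose centroids lie
    within the aggregation distance into sets, then flag any set
    whose total pixel area reaches the threshold."""
    cent = [(sum(x for x, _ in p) / len(p), sum(y for _, y in p) / len(p))
            for p in particles]
    parent = list(range(len(particles)))

    def find(i):
        # Path-halving union-find root lookup.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            if math.dist(cent[i], cent[j]) <= agg_distance:
                parent[find(i)] = find(j)

    groups = {}
    for i, p in enumerate(particles):
        groups.setdefault(find(i), []).append(p)
    # A set is aggregated dirt when its combined area is large enough.
    return [g for g in groups.values()
            if sum(len(p) for p in g) >= min_total_area]
```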
  • a closed operation may be used to assist in completing step S420.
  • the closing operation is defined as dilation followed by erosion.
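The closing operation mentioned above can be sketched on a binary mask with a 3x3 structuring element (the element size is an assumption; the patent does not specify one):

```python
def _hood(img, x, y):
    # In-bounds values of the 3x3 neighborhood around (x, y),
    # including the pixel itself.
    h, w = len(img), len(img[0])
    return [img[y + dy][x + dx]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if 0 <= x + dx < w and 0 <= y + dy < h]

def dilate(img):
    # A pixel turns on if any neighbor (or itself) is on.
    return [[1 if any(_hood(img, x, y)) else 0
             for x in range(len(img[0]))] for y in range(len(img))]

def erode(img):
    # A pixel stays on only if its whole in-bounds neighborhood is on.
    return [[1 if all(_hood(img, x, y)) else 0
             for x in range(len(img[0]))] for y in range(len(img))]

def close_binary(img):
    # Closing = dilation followed by erosion; it bridges small gaps
    # so that nearby dirt particles merge before their combined
    # area is measured.
    return erode(dilate(img))
```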
  • FIG. 9 shows a schematic flowchart of part of the dirty image detection method provided by the ninth embodiment of the present application.
  • the ninth embodiment of the present application will be described in detail below with reference to FIG. 9 .
  • Step S430 specifically includes the following steps.
  • the detection area is divided into a plurality of judgment areas.
  • S432 Calculate the number and area of dirt particles in the determination area.
  • in step S230, the range of the detection area and the coordinates of each point within it are obtained; the total area of the dirt particles can thus be computed from the coordinates of each dirt particle.
  • S433 Based on the number and area of the dirt particles, it is determined whether the determination area belongs to the area-type dirt.
  • the data of the contamination is collected.
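Steps S431-S433 can be sketched as a count-and-area test per judgment area. This is illustrative only: each judgment area is assumed to already hold the particles that fall inside it, and both thresholds are assumed per-lens-model values:

```python
def regional_dirt(judgment_areas, min_count, min_total_area):
    """Region-type check: a judgment area is region-type dirt when
    it contains at least `min_count` particles whose total pixel
    area reaches `min_total_area`. Returns the flagged indices."""
    flagged = []
    for idx, particles in enumerate(judgment_areas):
        total_area = sum(len(p) for p in particles)
        if len(particles) >= min_count and total_area >= min_total_area:
            flagged.append(idx)
    return flagged
```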
  • a dirty image detection device 1000 including:
  • the image acquisition unit 1100 is used to acquire an image of the lens to be detected
  • the first processing unit 1200 is used to determine the range of the detection area
  • the second processing unit 1300 is used to determine the dirt particles within the detection area;
  • the detection unit 1400 is used to detect whether there is dirt in the detection area.
  • the second processing unit 1300 specifically includes the following units.
  • the dividing unit 1310 is used to divide the detection area into a plurality of partial areas
  • the capturing unit 1320 is used to capture a plurality of pixel points in the local area;
  • the image selection unit 1330 is used to select the pixel point to be determined;
  • the comparison unit 1340 is used to compare the pixel point with the surrounding pixel points to obtain their gray-level differences;
  • the dirty particle acquiring unit 1350 is configured to determine whether the pixel point is a dirty particle based on the difference value.
  • the detection unit 1400 includes one, two or three of the following three units:
  • the first determination unit 1410 is used to determine whether there is individual-type dirt in the detection area;
  • the second determination unit 1420 is configured to determine whether there is aggregated contamination in the detection area
  • the third determination unit 1430 is used to determine whether there is regional-type contamination in the detection area.
  • the first determination unit 1410 includes:
  • the first selection unit 1411 is used to select the dirt particles to be determined;
  • the first calculation unit 1412 is used to calculate the area of the dirt particles
  • the first determination subunit 1413 is used to determine whether the dirt particles belong to individual-type dirt.
  • the second determination unit 1420 includes:
  • the second selection unit 1421 is used to select the dirt particles to be determined;
  • the first definition unit 1422 is used to define an aggregation distance;
  • the dirt particle connecting unit 1423 is used to connect each dirt particle with the other dirt particles within the aggregation distance to form a dirt particle set;
  • the second calculation unit 1424 is used to calculate the area of the dirty particle set
  • the second determination subunit 1425 is used to determine whether the set of dirt particles belongs to aggregated dirt.
  • the third determination unit 1430 includes:
  • the second definition unit 1431 is used to define the judgment area;
  • the third calculation unit 1432 is used to calculate the number and area of the dirt particles in the judgment area;
  • the third determination sub-unit 1433 is used to determine whether the determination area belongs to the area-type contamination.
  • the present application also provides a dirty image detection mechanism, which includes a camera 10 for capturing an image of the lens, a driving assembly 20 for driving the camera 10 to move vertically, and a light-emitting assembly 30 for illuminating the lens.
  • the drive assembly 20 includes a power element 21 , a lead screw (not shown in the figure), a moving nut (not shown in the figure), a guide rail 22 and a slider 23 .
  • the output shaft of the power element 21 is connected to the lead screw; the moving nut is sleeved on and engaged with the lead screw and fixedly connected to the slider 23; the slider 23 is slidably arranged on the guide rail 22; and the camera 10 is mounted on the slider 23 by a mounting member 40. In this way, the power element 21 can drive the camera 10 to move vertically.
  • the power element 21 is a rotary servo motor, which improves the precision and stability of the camera's point-to-point movement.
  • the camera 10 uses a bi-telecentric lens with a shallow depth of field and high resolution to ensure that dirt is clearly imaged; further, the camera 10 is a CMOS global-shutter camera, ensuring fast and stable image acquisition.
  • the light-emitting assembly 30 combines a green light with a condenser lens, installed at a 45-degree angle to the lens axis and illuminating from below, so that the surface of the finished lens under test shows multiple light spots; these spots are brightest at the center and darken gradually outward, giving the lens surface an uneven light distribution that reveals even slight dirt more clearly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Studio Devices (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

A dirty image detection method, a dirty image detection device, and a dirty image detection mechanism are provided. A lens dirt acquisition method comprises the following steps: obtaining an image of a lens under detection (S100); determining the range of a detection area based on the image (S200); determining dirt particles within the detection area (S300); and determining the dirt in the detection area based on the dirt particles (S400). The dirty image detection device (1000) comprises an image acquisition unit (1100), a first processing unit (1200), a second processing unit (1300), and a detection unit (1400). With the above dirty image detection method and device, dirt particles on a lens can be detected stably and objectively without manual operation, greatly improving dirt detection efficiency.
PCT/CN2020/132652 2020-11-17 2020-11-30 Procédé de détection d'image de saleté, dispositif de détection d'image de saleté et mécanisme de détection d'image de saleté WO2022104900A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011282276.5 2020-11-17
CN202011282276.5A CN112102319B (zh) 2020-11-17 2020-11-17 脏污图像检测方法、脏污图像检测装置及脏污图像检测机构

Publications (1)

Publication Number Publication Date
WO2022104900A1 true WO2022104900A1 (fr) 2022-05-27

Family

ID=73785682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/132652 WO2022104900A1 (fr) 2020-11-17 2020-11-30 Procédé de détection d'image de saleté, dispositif de détection d'image de saleté et mécanisme de détection d'image de saleté

Country Status (2)

Country Link
CN (1) CN112102319B (fr)
WO (1) WO2022104900A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116183940A (zh) * 2023-02-07 2023-05-30 泰州奥尔斯顿生物科技有限公司 基于污点分布鉴别的生物检测分析装置

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN113570597A (zh) * 2021-09-01 2021-10-29 南通中煌工具有限公司 基于人工智能的泥头车车厢脏污程度的判定方法及系统

Citations (4)

Publication number Priority date Publication date Assignee Title
US20160004144A1 (en) * 2014-07-04 2016-01-07 The Lightco Inc. Methods and apparatus relating to detection and/or indicating a dirty lens condition
CN106231297A (zh) * 2016-08-29 2016-12-14 深圳天珑无线科技有限公司 摄像头的检测方法及装置
CN111246204A (zh) * 2020-03-24 2020-06-05 昆山丘钛微电子科技有限公司 一种基于相对亮度偏差的脏污检测方法和装置
CN111726612A (zh) * 2020-07-07 2020-09-29 歌尔科技有限公司 镜头模组脏污检测方法、系统、设备及计算机存储介质

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US6317223B1 (en) * 1998-12-14 2001-11-13 Eastman Kodak Company Image processing system for reducing vertically disposed patterns on images produced by scanning
CN102413354B (zh) * 2011-10-05 2014-04-30 深圳市联德合微电子有限公司 一种手机摄像模组自动光学检测方法、装置及系统
CN102410974A (zh) * 2011-12-14 2012-04-11 华北电力大学 气流输送管道中颗粒料粒度分布及形状分布在线测量方法

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US20160004144A1 (en) * 2014-07-04 2016-01-07 The Lightco Inc. Methods and apparatus relating to detection and/or indicating a dirty lens condition
CN106231297A (zh) * 2016-08-29 2016-12-14 深圳天珑无线科技有限公司 摄像头的检测方法及装置
CN111246204A (zh) * 2020-03-24 2020-06-05 昆山丘钛微电子科技有限公司 一种基于相对亮度偏差的脏污检测方法和装置
CN111726612A (zh) * 2020-07-07 2020-09-29 歌尔科技有限公司 镜头模组脏污检测方法、系统、设备及计算机存储介质

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116183940A (zh) * 2023-02-07 2023-05-30 泰州奥尔斯顿生物科技有限公司 基于污点分布鉴别的生物检测分析装置
CN116183940B (zh) * 2023-02-07 2024-05-14 广东蓝莺高科有限公司 基于污点分布鉴别的生物检测分析装置

Also Published As

Publication number Publication date
CN112102319B (zh) 2021-02-12
CN112102319A (zh) 2020-12-18

Similar Documents

Publication Publication Date Title
CN110766684B (zh) 一种基于机器视觉的定子表面缺陷检测系统及检测方法
WO2022082904A1 (fr) Procédé, appareil et dispositif de détection de taches de lentille
WO2022104900A1 (fr) Procédé de détection d'image de saleté, dispositif de détection d'image de saleté et mécanisme de détection d'image de saleté
CN103743761B (zh) 一种镜片水印疵病图像检测装置
CN110108711A (zh) 圆环侧壁缺陷的视觉检测系统
CN107610085A (zh) 一种基于计算机视觉的焊点缺陷检测系统
CN110751635B (zh) 一种基于帧间差分和hsv颜色空间的口腔检测方法
CN107274403B (zh) 一种浮选表面质量的评价方法
CN111551350A (zh) 一种基于U_Net网络的光学镜片表面划痕检测方法
CN109461156B (zh) 基于视觉的螺纹密封塞装配检测方法
JP7393313B2 (ja) 欠陥分類装置、欠陥分類方法及びプログラム
TWI512284B (zh) 玻璃氣泡瑕疵檢測系統
CN114577805A (zh) 一种MiniLED背光面板缺陷检测方法及装置
JP2017227474A (ja) 照明装置、及び、画像検査装置
CN110570412B (zh) 一种零件误差视觉判断系统
US11493453B2 (en) Belt inspection system, belt inspection method, and recording medium for belt inspection program
CN105606623A (zh) 一种基于线阵相机的钢板表面缺陷检测系统
CN111008960B (zh) 基于机器视觉的铝电解电容底部外观检测方法及装置
CN109990742A (zh) 基于图像处理技术的板栗检测方法
CN111833350A (zh) 机器视觉检测方法与系统
CN111724375A (zh) 一种屏幕检测方法及系统
CN104881652A (zh) 一种基于玉米穗凸性特征的行数自动检测算法
CN111815705B (zh) 激光跟踪仪滤光保护镜片污染识别方法、装置及电子设备
CN115272173A (zh) 锡球缺陷检测方法及其装置、计算机设备、存储介质
JP2007285868A (ja) 輝度勾配検出方法、欠陥検出方法、輝度勾配検出装置および欠陥検出装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20962191

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20962191

Country of ref document: EP

Kind code of ref document: A1