CN112862832A - Dirt detection method based on concentric circle segmentation positioning - Google Patents

Dirt detection method based on concentric circle segmentation positioning

Info

Publication number
CN112862832A
CN112862832A (application CN202011634906.0A)
Authority
CN
China
Prior art keywords
dirt
region
concentric circle
value
block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011634906.0A
Other languages
Chinese (zh)
Other versions
CN112862832B (en)
Inventor
胡露
董泽成
吴峰
肖仁涛
宋凯静
林映庭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Shine Photics Co Ltd
Original Assignee
Chongqing Shine Photics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Shine Photics Co Ltd filed Critical Chongqing Shine Photics Co Ltd
Priority to CN202011634906.0A priority Critical patent/CN112862832B/en
Publication of CN112862832A publication Critical patent/CN112862832A/en
Application granted granted Critical
Publication of CN112862832B publication Critical patent/CN112862832B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20132 Image cropping

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of camera detection and specifically discloses a dirt detection method based on concentric circle segmentation positioning. The method comprises: acquiring multi-frame image data; performing gray-level mapping, grayscale conversion and edge cropping on the images; performing concentric circle segmentation and region division on the cropped image; calculating a regional deviation ratio and a gradient from the divided regions; and then performing dirt detection against a multi-region dual control standard: if the thresholds are exceeded, the module is judged dirty, otherwise it is judged clean. By adopting the technical scheme of the invention, dirt on the camera module can be detected effectively, the accuracy of dirt detection is improved, and different users can conveniently adjust the dirt control standard on their own.

Description

Dirt detection method based on concentric circle segmentation positioning
Technical Field
The invention relates to the technical field of camera detection, in particular to a dirt detection method based on concentric circle segmentation positioning.
Background
At present, cameras are widely used in mobile phones, tablets, notebooks, security, automotive, medical and monitoring applications, and lens types such as wide-angle, macro, telephoto and fixed-focus lenses have been derived from them. Among the index tests performed by a module factory, the most challenging is dirt detection on the camera module; different customers specify different dirt standards, so the module factory needs a universal, accurate and flexible dirt detection method.
The existing dirt testing methods mainly fall into the following categories: first, traditional manual visual inspection; second, a brightness-difference dirt detection algorithm, which partitions the image into blocks, computes the average brightness of each block, and differences it against the average brightness of the blocks to its left, right, above and below to obtain dirt feature information; third, a surface-fitting dirt detection algorithm, which fits a template surface to the image to generate a template image and then subtracts the template image from the original grayscale image to obtain dirt feature information; and fourth, a frequency-domain dirt detection algorithm, which Fourier-transforms the image, applies frequency filtering and enhancement, inverse-transforms it, and then applies block-wise threshold detection to obtain dirt feature information.
However, the above methods all have their own defects. The first, manual inspection, is inefficient, causes visual fatigue, and is unsuitable for large-scale operation; the second has limited applicability and lower accuracy; the third has a high misjudgment rate and requires many samples to establish the control standard; and the fourth is not mature, is suited to single-sample detection, and has high technical requirements.
Therefore, a dirt detection method based on concentric circle segmentation positioning that can improve detection accuracy is needed.
Disclosure of Invention
The invention provides a dirt detection method based on concentric circle segmentation positioning, which can improve the detection accuracy.
In order to solve the technical problem, the present application provides the following technical solutions:
a dirt detection method based on concentric circle segmentation positioning comprises the following steps:
Step 1: shooting multiple frames of images of the camera module under a white field and storing them in a buffer bmp24buffer[i][j], where i is the frame index, i = 0,1,...,N-1; j is the index over the bmp pixels of each frame, j = 0,1,...,width*height*3-1; width is the pixel width and height is the pixel height;
Step 2: for each frame of image data in bmp24buffer[i][j], obtaining the maxima of the three RGB channels max{B_imax}, max{G_imax}, max{R_imax} and the channel pixel sum bmp24bufferBGR[j];
Step 3: scaling the per-channel pixel sums from step 2 by the channel maxima to obtain a gray-level map Bmp24[i];
Step 4: converting the gray-level map from step 3 into a grayscale map Graybuffer[i];
Step 5: performing edge cropping on the grayscale map from step 4 to obtain a cropped image Shearbuffer[i];
Step 6: obtaining the optical center of the cropped image from step 5 and, taking the optical center as the coordinate origin, dividing the cropped image into N*M concentric-circle sector blocks, where N is the number of concentric circles and M is the number of division blocks per concentric circle;
Step 7: computing the average brightness value of each concentric-circle sector block and the average brightness value of its neighborhood blocks, namely:
[Equation image: Blockave[i], the average brightness value of region block i]
[Equation image: Blockneighbor[i], the average brightness value of the neighborhood blocks of block i]
where Blockave[i] is the average brightness value of the region block, Blockneighbor[i] is the average brightness value of the neighborhood blocks, i ∈ N*M-P, k = 0,1,2,...,n-1, and n is the number of neighborhood blocks of the region block;
Step 8: obtaining the deviation ratio Blockdeviationratio[i] between the region block and its neighborhood blocks from the block average brightness value and the neighborhood average brightness value of step 7, namely:
[Equation image: Blockdeviationratio[i], the deviation ratio between region block i and its neighborhood blocks]
Step 9: classifying the region blocks into regions according to their positions and formulating the control standard;
Step 10: using the regions classified in step 9, calculating the maximum gradient value between each region block and its neighborhood, namely:
Blockgradient[i]=max{Blockdeviationratio[i]-Blockdeviationratio[n]}
where Blockgradient[i] is the maximum gradient value between each region block and its neighborhood, i and n belong to N*M-P, and n indexes the neighborhood blocks;
Step 11: completing the dual control standard of dirt detection using the deviation ratio from step 8 and the maximum gradient value from step 10: when a region's deviation value exceeds its threshold, checking whether its gradient exceeds the gradient threshold; if it does, dirt is detected, otherwise the region is judged clean.
The basic scheme principle and the beneficial effects are as follows:
in the scheme, a gray-scale image is obtained by processing a plurality of shot images, a shearing image is obtained by performing edge shearing processing on the gray-scale image, the shearing image is divided to obtain a plurality of concentric circular sector-shaped region blocks, the average brightness value of the concentric circular sector-shaped region blocks and the average brightness value of adjacent region blocks of the region blocks are obtained, and then the deviation ratio between the region blocks and the adjacent region blocks is obtained; and calculating the maximum gradient value obtained by combining the deviation ratio with each region and the neighborhood, so that the dirt can be accurately identified. Moreover, by adjusting the dual-clamping standard, the dirty sewage can be relaxed or tightened, and the operation is convenient.
In conclusion, the scheme effectively solves the dirt detection problem in automated module-factory production, improves the accuracy of dirt detection, and makes it convenient for different users to adjust the dirt control standard on their own.
Further, the method also comprises step 12: judging the dirt size: when dirt is detected, judging whether it is single-region dirt or multi-region dirt; if single-region dirt, locating the dirt position and outputting a dirt picture; if multi-region dirt, locating the dirt size and position through neighborhood aggregation and outputting a dirt picture.
Outputting the dirt picture facilitates subsequent manual review and analysis of the accuracy of dirt detection.
Further, the step 9 specifically includes:
Step 9-1: classifying the region blocks into a central region, four corner regions, edge-attenuation regions and other field-of-view regions according to their positions;
Step 9-2: for n module samples, collecting the deviation ratios of step 8 by the classified regions and recording the maximum value of each region;
Step 9-3: obtaining the deviation-ratio control threshold of each region from the per-region maxima of the n samples.
Further, in step 6, when the pixel count of an edge region block is lower than the minimum edge-block pixel count pixelmin, that is:
[Equation image: pixelmin, the minimum pixel count of an edge region block]
where N is the number of concentric circles and M is the number of division blocks per concentric circle,
aggregation is performed, yielding N*M-P final concentric-circle sector blocks, where P is the number of aggregated blocks.
Further, in step 3, the gray-level map is:
Bmp24[i]=bmp24bufferBGR[i]*GrayLevel/X_max
where X_max is the maximum of channel X (one of the R, G, B channels) and GrayLevel is the gray level.
Further, in step 4, the grayscale map is:
Graybuffer[i]=0.2990*Bmp24[3*i]+0.5870*Bmp24[3*i+1]+0.1140*Bmp24[3*i+2]
where i = 0,1,...,(width*height-1), width is the pixel width, and height is the pixel height.
Further, in step 5, the grayscale map from step 4 is cropped at the top, bottom, left and right edges to obtain the cropped image Shearbuffer[i_c], where i_c = 0,1,...,(width_c)*(height_c)-1, width_c is the pixel width after cropping the left and right edges, and height_c is the pixel height after cropping the top and bottom edges.
Further, in step 3, the gray level GrayLevel is between 160 and 220.
Drawings
FIG. 1 is a flowchart of the dirt detection method based on concentric circle segmentation positioning according to the embodiment;
FIG. 2 is a concentric circle segmentation diagram of an image in the dirt detection method based on concentric circle segmentation positioning according to the embodiment;
FIG. 3 is a sample image in the dirt detection method based on concentric circle segmentation positioning according to the embodiment;
FIG. 4 is a sample image in the dirt detection method based on concentric circle segmentation positioning according to the embodiment.
Detailed Description
The following is further detailed by way of specific embodiments:
examples
As shown in fig. 1, the dirt detection method based on concentric circle segmentation positioning in this embodiment includes the following steps:
step 1: in a proper white field, shooting a plurality of frames of images of the camera module, and storing the images into a buffer zone bmp24buffer [ i ] [ j ], wherein i is a frame number, i is 0,1, and N-1, j is the number of pixels bmp (bitmap) per frame, j is 0,1, and width is 3-1, and height is a pixel width and height is a pixel height. For the convenience of analysis, N takes 3 frames and goes to step 2.
Step 2: for each frame of image data in bmp24buffer[i][j], obtain the maxima of the three RGB channels max{B_imax}, max{G_imax}, max{R_imax} and the per-channel pixel sums, namely:
[Equation image: bmp24bufferBGR[3*j+n], the per-channel pixel sum over the N frames]
where bmp24bufferBGR[3*j+n] is the per-channel pixel sum, i is the frame index, i = 0,1,2; j is the per-frame pixel index, j = 0,1,...,(width*height-1); width is the pixel width, height is the pixel height; and n indexes the RGB channels, n = 0,1,2.
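As an illustration only, the following Python sketch shows one plausible realisation of step 2: the N captured BGR frames are summed per pixel and per channel, and each channel's maximum is recorded. The function name accumulate_frames, the use of NumPy and the exact accumulation rule are assumptions, not the patent's own implementation.

import numpy as np

def accumulate_frames(frames):
    # frames: list of H x W x 3 uint8 BGR images (the bmp24buffer[i] frames)
    stack = np.stack(frames).astype(np.float64)                  # N x H x W x 3
    bgr_sum = stack.sum(axis=0)                                   # per-pixel channel sums (bmp24bufferBGR analogue)
    channel_max = tuple(stack[..., c].max() for c in range(3))   # max{B_imax}, max{G_imax}, max{R_imax}
    return bgr_sum, channel_max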
Step 3: scale the per-channel RGB pixel sums from step 2 by the channel maxima to enhance contrast and obtain the gray-level map, namely:
Bmp24[i]=bmp24bufferBGR[i]*GrayLevel/X_max
where Bmp24[i] is the gray-level map, X_max is the maximum of channel X (one of the R, G, B channels), and GrayLevel is the gray level, generally between 160 and 220.
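A minimal sketch of the contrast stretch in step 3, assuming the summed channel buffer is rescaled so that each channel's maximum maps to GrayLevel; whether X_max is taken from the summed buffer (as here) or from the per-frame maxima of step 2 is an assumption.

import numpy as np

def contrast_stretch(bgr_sum, gray_level=200):
    # bgr_sum: H x W x 3 per-channel pixel sums from step 2
    x_max = bgr_sum.max(axis=(0, 1), keepdims=True)   # per-channel maximum (X_max analogue)
    return bgr_sum * gray_level / x_max               # Bmp24 analogue, values in [0, gray_level]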
Step 4: convert the gray-level map from step 3 into a grayscale map:
Graybuffer[i] = 0.2990*Bmp24[3*i] + 0.5870*Bmp24[3*i+1] + 0.1140*Bmp24[3*i+2]
where Graybuffer[i] is the grayscale map, i = 0,1,...,(width*height-1), width is the pixel width, and height is the pixel height.
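A sketch of the luma conversion in step 4 using the standard BT.601 weights; the R, G, B channel order of Bmp24 is an assumption.

import numpy as np

def to_gray(bmp24):
    # bmp24: H x W x 3 gray-level map, channel order R, G, B (assumed)
    r, g, b = bmp24[..., 0], bmp24[..., 1], bmp24[..., 2]
    return 0.2990 * r + 0.5870 * g + 0.1140 * b       # Graybuffer analogue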
Step 5: crop the top, bottom, left and right edges of the grayscale map from step 4 to obtain the cropped image Shearbuffer[i_c], where i_c = 0,1,...,(width_c)*(height_c)-1, width_c is the pixel width after cropping the left and right edges, and height_c is the pixel height after cropping the top and bottom edges.
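A sketch of the edge cropping in step 5; the margin sizes are placeholders, since the patent does not state width_c and height_c numerically.

def crop_edges(gray, margin_x=20, margin_y=20):
    # gray: H x W grayscale map; returns the Shearbuffer analogue
    h, w = gray.shape
    return gray[margin_y:h - margin_y, margin_x:w - margin_x]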
As shown in fig. 2, step 6: obtain the optical center of the cropped image from step 5 and, taking the optical center as the coordinate origin, divide the cropped image into N*M concentric-circle sector blocks. Considering that a small number of pixel samples in an edge block would cause edge detection errors, when the pixel count of an edge block is lower than pixelmin, that is:
[Equation image: pixelmin, the minimum pixel count of an edge region block, defined in terms of N and M]
aggregation is performed, yielding N*M-P final concentric-circle sector blocks, where P is the number of aggregated blocks.
Step 7: compute the average brightness value of each concentric-circle sector block and the average brightness value of its neighborhood blocks, namely:
[Equation image: Blockave[i], the average brightness value of region block i]
[Equation image: Blockneighbor[i], the average brightness value of the neighborhood blocks of block i]
where Blockave[i] is the average brightness value of the region block, Blockneighbor[i] is the average brightness value of the neighborhood blocks, i ∈ N*M-P, k = 0,1,2,...,n-1, and n is the number of neighborhood blocks of the region block.
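A sketch of step 7: the mean brightness of every sector block, and for each block the mean of its neighbouring blocks' means. The neighbourhood relation is passed in explicitly because the patent does not spell out which blocks count as neighbours; the dictionary-based representation is an assumption.

import numpy as np

def block_means(gray, labels):
    # Blockave analogue: average brightness per sector block
    return {int(b): float(gray[labels == b].mean()) for b in np.unique(labels)}

def neighbour_means(block_ave, neighbours):
    # Blockneighbor analogue; neighbours maps block id -> list of neighbouring block ids
    return {b: float(np.mean([block_ave[n] for n in nbrs]))
            for b, nbrs in neighbours.items() if nbrs}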
Step 8: from the block average brightness value and the neighborhood average brightness value of step 7, obtain the deviation ratio between the region block and its neighborhood blocks, namely:
[Equation image: Blockdeviationratio[i], the deviation ratio between region block i and its neighborhood blocks]
where Blockdeviationratio[i] is the deviation ratio between the concentric-circle sector block and its neighborhood blocks.
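A sketch of the deviation ratio of step 8. Because the equation image is not reproduced in the text, the normalisation below (absolute difference divided by the neighbourhood mean) is an assumption; non-zero neighbourhood brightness is also assumed.

def deviation_ratio(block_ave, block_neighbour):
    # Blockdeviationratio analogue
    return {b: abs(block_ave[b] - block_neighbour[b]) / block_neighbour[b]
            for b in block_neighbour}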
Step 9: classify the region blocks into regions according to their positions and set the control standard.
Step 9-1: classify the region blocks into a central region, four corner regions, edge-attenuation regions and other field-of-view regions according to their positions.
Step 9-2: for n module samples, collect the deviation ratios of step 8 by the classified regions and record the maximum value of each region.
Step 9-3: obtain the deviation-ratio control threshold of each region from the per-region maxima of the n module samples.
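A sketch of the calibration in step 9: deviation ratios measured on n known-good module samples are grouped by region, and the per-region maximum becomes that region's control threshold. The region names and the dictionary interfaces are assumptions.

def calibrate_thresholds(samples, region_of_block):
    # samples: list of {block_id: deviation_ratio} dicts from clean modules
    # region_of_block: block id -> region name (centre / corner / edge / other)
    thresholds = {}
    for sample in samples:
        for blk, ratio in sample.items():
            region = region_of_block[blk]
            thresholds[region] = max(thresholds.get(region, 0.0), ratio)
    return thresholds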
Step 10: using the regions classified in step 9, calculate the maximum gradient value between each region block and its neighborhood, namely:
Blockgradient[i]=max{Blockdeviationratio[i]-Blockdeviationratio[n]}
where Blockgradient[i] is the maximum gradient value between each region block and its neighborhood, i and n belong to N*M-P, and n indexes the neighborhood blocks.
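A sketch of step 10: for each block, the largest difference between its deviation ratio and the deviation ratios of its neighbouring blocks, mirroring the Blockgradient formula above under the same assumed neighbourhood relation.

def max_gradient(dev_ratio, neighbours):
    # Blockgradient analogue
    return {b: max(dev_ratio[b] - dev_ratio[n] for n in nbrs)
            for b, nbrs in neighbours.items() if nbrs}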
Step 11: multi-region dual-control-standard dirt detection: complete the dual control standard of dirt detection using the deviation ratio from step 8 and the maximum gradient value from step 10: when a region's deviation ratio exceeds its threshold, check whether its gradient exceeds the gradient threshold; if it does, the region is judged dirty, otherwise it is judged qualified.
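A sketch of the dual control standard of step 11: a block is flagged only when its deviation ratio exceeds its region's ratio threshold and its maximum gradient exceeds the gradient threshold. Per-region gradient thresholds are an assumption; the patent only requires that both checks pass.

def detect_dirt(dev_ratio, gradient, region_of_block, ratio_thr, grad_thr):
    dirty = []
    for blk, ratio in dev_ratio.items():
        region = region_of_block[blk]
        if ratio > ratio_thr[region] and gradient.get(blk, 0.0) > grad_thr[region]:
            dirty.append(blk)   # both thresholds exceeded: dirt detected in this block
    return dirty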
As shown in fig. 3 and fig. 4, step 12: judge the dirt size: when dirt is detected, judge whether it is single-region dirt or multi-region dirt. If single-region dirt, locate the dirt position and output a dirt picture; if multi-region dirt, locate the dirt size and position through neighborhood aggregation and output a dirt picture.
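A sketch of the neighbourhood aggregation used in step 12 to decide whether flagged blocks form a single-region or multi-region stain: adjacent dirty blocks are grouped by a simple flood fill, and each group can then be reported with its position and extent. The grouping rule is an assumption.

def group_dirty_blocks(dirty, neighbours):
    # dirty: list of flagged block ids; neighbours: block id -> adjacent block ids
    groups, seen = [], set()
    for blk in dirty:
        if blk in seen:
            continue
        stack, group = [blk], []
        while stack:
            b = stack.pop()
            if b in seen:
                continue
            seen.add(b)
            group.append(b)
            stack.extend(n for n in neighbours.get(b, []) if n in dirty)
        groups.append(group)
    return groups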
The above is merely an embodiment of the present invention, and the invention is not limited to the field of this embodiment. Common general knowledge, such as well-known specific structures and characteristics in the scheme, is not described here in excessive detail; a person skilled in the art knows the common technical knowledge in this field before the application date or priority date, has access to the prior art in this field, has the ability to apply conventional experimental means, and can, in light of the teaching provided in this application, perfect and implement the scheme in combination with his or her own abilities. Some typical known structures or known methods should not become obstacles to the implementation of the present invention by those skilled in the art. It should be noted that several changes and modifications can be made by those skilled in the art without departing from the structure of the present invention; these should also be regarded as falling within the protection scope of the present invention and will not affect the effect of the implementation of the invention or the practicability of the patent. The scope of protection of this application shall be determined by the content of the claims, and the description of embodiments and the like in the specification may be used to interpret the content of the claims.

Claims (8)

1. A dirt detection method based on concentric circle segmentation positioning is characterized by comprising the following steps:
step 1: shooting multiple frames of images of the camera module under a white field and storing them in a buffer bmp24buffer[i][j], where i is the frame index, i = 0,1,...,N-1; j is the index over the bmp pixels of each frame, j = 0,1,...,width*height*3-1; width is the pixel width and height is the pixel height;
step 2: for each frame of image data in bmp24buffer[i][j], obtaining the maxima of the three RGB channels max{B_imax}, max{G_imax}, max{R_imax} and the channel pixel sum bmp24bufferBGR[j];
step 3: scaling the per-channel pixel sums from step 2 by the channel maxima to obtain a gray-level map Bmp24[i];
step 4: converting the gray-level map from step 3 into a grayscale map Graybuffer[i];
step 5: performing edge cropping on the grayscale map from step 4 to obtain a cropped image Shearbuffer[i];
step 6: obtaining the optical center of the cropped image from step 5 and, taking the optical center as the coordinate origin, dividing the cropped image into N*M concentric-circle sector blocks, where N is the number of concentric circles and M is the number of division blocks per concentric circle;
step 7: computing the average brightness value of each concentric-circle sector block and the average brightness value of its neighborhood blocks, namely:
[Equation image: Blockave[i], the average brightness value of region block i]
[Equation image: Blockneighbor[i], the average brightness value of the neighborhood blocks of block i]
where Blockave[i] is the average brightness value of the region block, Blockneighbor[i] is the average brightness value of the neighborhood blocks, i ∈ N*M-P, k = 0,1,2,...,n-1, and n is the number of neighborhood blocks of the region block;
step 8: obtaining the deviation ratio Blockdeviationratio[i] between the region block and its neighborhood blocks from the block average brightness value and the neighborhood average brightness value of step 7, namely:
[Equation image: Blockdeviationratio[i], the deviation ratio between region block i and its neighborhood blocks]
step 9: classifying the region blocks into regions according to their positions and formulating the control standard;
step 10: using the regions classified in step 9, calculating the maximum gradient value between each region block and its neighborhood, namely:
Blockgradient[i]=max{Blockdeviationratio[i]-Blockdeviationratio[n]}
where Blockgradient[i] is the maximum gradient value between each region block and its neighborhood, i and n belong to N*M-P, and n indexes the neighborhood blocks;
step 11: completing the dual control standard of dirt detection using the deviation ratio from step 8 and the maximum gradient value from step 10: when a region's deviation value exceeds its threshold, checking whether its gradient exceeds the gradient threshold; if it does, dirt is detected, otherwise the region is judged clean.
2. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 1, wherein the method further comprises step 12: judging the dirt size: when dirt is detected, judging whether it is single-region dirt or multi-region dirt; if single-region dirt, locating the dirt position and outputting a dirt picture; if multi-region dirt, locating the dirt size and position through neighborhood aggregation and outputting a dirt picture.
3. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 2, wherein the step 9 specifically includes:
step 9-1: classifying the region blocks into a central region, four corner regions, edge-attenuation regions and other field-of-view regions according to their positions;
step 9-2: for n module samples, collecting the deviation ratios of step 8 by the classified regions and recording the maximum value of each region;
step 9-3: obtaining the deviation-ratio control threshold of each region from the per-region maxima of the n samples.
4. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 3, wherein in step 6, when the pixel count of an edge region block is lower than the minimum edge-block pixel count pixelmin, that is:
[Equation image: pixelmin, the minimum pixel count of an edge region block]
where N is the number of concentric circles and M is the number of division blocks per concentric circle,
aggregation is performed, yielding N*M-P final concentric-circle sector blocks, where P is the number of aggregated blocks.
5. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 4, wherein in step 3, the gray-level map is:
Bmp24[i]=bmp24bufferBGR[i]*GrayLevel/X_max
where X_max is the maximum of channel X (one of the R, G, B channels) and GrayLevel is the gray level.
6. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 5, wherein in step 4, the grayscale map is:
Graybuffer[i]=0.2990*Bmp24[3*i]+0.5870*Bmp24[3*i+1]+0.1140*Bmp24[3*i+2]
where i = 0,1,...,(width*height-1), width is the pixel width, and height is the pixel height.
7. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 6, wherein in step 5, the grayscale map from step 4 is cropped at the top, bottom, left and right edges to obtain the cropped image Shearbuffer[i_c], where i_c = 0,1,...,(width_c)*(height_c)-1, width_c is the pixel width after cropping the left and right edges, and height_c is the pixel height after cropping the top and bottom edges.
8. The dirt detection method based on concentric circle segmentation positioning as claimed in claim 7, wherein in step 3, the gray level GrayLevel is between 160 and 220.
CN202011634906.0A 2020-12-31 2020-12-31 Dirt detection method based on concentric circle segmentation positioning Active CN112862832B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011634906.0A CN112862832B (en) 2020-12-31 2020-12-31 Dirt detection method based on concentric circle segmentation positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011634906.0A CN112862832B (en) 2020-12-31 2020-12-31 Dirt detection method based on concentric circle segmentation positioning

Publications (2)

Publication Number Publication Date
CN112862832A (en) 2021-05-28
CN112862832B (en) 2022-07-12

Family

ID=76000527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011634906.0A Active CN112862832B (en) 2020-12-31 2020-12-31 Dirt detection method based on concentric circle segmentation positioning

Country Status (1)

Country Link
CN (1) CN112862832B (en)



Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009017158A (en) * 2007-07-04 2009-01-22 Panasonic Corp Camera inspection device
WO2012171834A2 (en) * 2011-06-17 2012-12-20 Robert Bosch Gmbh Method and device for detecting impairment of visibility through a pane
CN102682432A (en) * 2012-05-11 2012-09-19 中国科学院半导体研究所 Inferior-quality fingerprint grayscale image enhancement method on basis of three gaussian filtering
EP2680227A1 (en) * 2012-06-27 2014-01-01 Clarion Co., Ltd. Water and drying mark detection
CN104980730A (en) * 2014-04-01 2015-10-14 宁波舜宇光电信息有限公司 Method for positioning optical center on the basis of concentric circle theory
CN107024541A (en) * 2015-10-08 2017-08-08 株式会社日立电力解决方案 Defect detecting method and its device
CN106791804A (en) * 2016-11-23 2017-05-31 歌尔股份有限公司 For the smear detecting method and device of camera module
CN107945158A (en) * 2017-11-15 2018-04-20 上海摩软通讯技术有限公司 A kind of dirty method and device of detector lens
JP2019176300A (en) * 2018-03-28 2019-10-10 パナソニックIpマネジメント株式会社 Dirt detection apparatus, camera, computer program, and recording media
CN109187581A (en) * 2018-07-12 2019-01-11 中国科学院自动化研究所 The bearing finished products plate defects detection method of view-based access control model
CN109767428A (en) * 2018-12-26 2019-05-17 中国科学院西安光学精密机械研究所 A kind of dirty detection method of camera module
CN111476750A (en) * 2019-01-04 2020-07-31 宁波舜宇光电信息有限公司 Method, device and system for carrying out stain detection on imaging module and storage medium
CN110446025A (en) * 2019-06-25 2019-11-12 盐城华昱光电技术有限公司 Camera module detection system and method applied to electronic equipment
CN111246204A (en) * 2020-03-24 2020-06-05 昆山丘钛微电子科技有限公司 Relative brightness deviation-based dirt detection method and device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570582A (en) * 2021-07-30 2021-10-29 上海集成电路制造创新中心有限公司 Camera cover plate cleanliness detection method and detection device
CN113570582B (en) * 2021-07-30 2022-07-29 上海集成电路制造创新中心有限公司 Camera cover plate cleanliness detection method and detection device
CN116758071A (en) * 2023-08-17 2023-09-15 青岛冠宝林活性炭有限公司 Intelligent detection method for carbon electrode dirt under visual assistance
CN116758071B (en) * 2023-08-17 2023-11-03 青岛冠宝林活性炭有限公司 Intelligent detection method for carbon electrode dirt under visual assistance
CN117495860A (en) * 2024-01-02 2024-02-02 江苏圣点世纪科技有限公司 Method and system for detecting abnormality of camera module of vein equipment
CN117495860B (en) * 2024-01-02 2024-04-12 江苏圣点世纪科技有限公司 Method and system for detecting abnormality of camera module of vein equipment

Also Published As

Publication number Publication date
CN112862832B (en) 2022-07-12

Similar Documents

Publication Publication Date Title
CN112862832B (en) Dirt detection method based on concentric circle segmentation positioning
CN108460757B (en) Mobile phone TFT-LCD screen Mura defect online automatic detection method
US20110135156A1 (en) Method of Locating License Plate of Moving Vehicle
US7599519B2 (en) Method and apparatus for detecting structural elements of subjects
Shi et al. Single image dehazing in inhomogeneous atmosphere
JP2005505870A (en) Method and apparatus for identifying different regions of an image
CN116703910B (en) Intelligent detection method for quality of concrete prefabricated bottom plate
CN107292828B (en) Image edge processing method and device
CN110648330B (en) Defect detection method for camera glass
US20120320433A1 (en) Image processing method, image processing device and scanner
CN116188468B (en) HDMI cable transmission letter sorting intelligent control system
CN113781406A (en) Scratch detection method and device for electronic component and computer equipment
CN111145105A (en) Image rapid defogging method and device, terminal and storage medium
CN110689003A (en) Low-illumination imaging license plate recognition method and system, computer equipment and storage medium
CN113744326B (en) Fire detection method based on seed region growth rule in YCRCB color space
CN103607558A (en) Video monitoring system, target matching method and apparatus thereof
CN105229665A (en) To the enhancing analysis of the snakelike belt wear assessment based on image
CN116152255B (en) Modified plastic production defect judging method
TWI498830B (en) A method and system for license plate recognition under non-uniform illumination
CN109165659B (en) Vehicle color identification method based on superpixel segmentation
KR20060007901A (en) Apparatus and method for automatic extraction of salient object from an image
CN110633705A (en) Low-illumination imaging license plate recognition method and device
CN110688979A (en) Illegal vehicle tracking method and device
CN111695374A (en) Method, system, medium, and apparatus for segmenting zebra crossing region in monitoring view
CN112560929B (en) Oil spilling area determining method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 400900 the 3 building of the 207 electronic workshop of the electronic information industrial park, Chongqing.

Applicant after: Shengtai Photoelectric Technology Co.,Ltd.

Address before: 400900 the 3 building of the 207 electronic workshop of the electronic information industrial park, Chongqing.

Applicant before: Chongqing Sheng Tai optoelectronic Co.,Ltd.

GR01 Patent grant