CN113722672A - Method for detecting and calculating stray light noise of VR Lens - Google Patents

Method for detecting and calculating stray light noise of VR Lens

Info

Publication number
CN113722672A
Authority
CN
China
Prior art keywords
gray
area
noise
image
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110818940.1A
Other languages
Chinese (zh)
Other versions
CN113722672B (en)
Inventor
张玉潘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Weiya Intelligent Technology Co ltd
Original Assignee
Xiamen Weiya Intelligence Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen Weiya Intelligence Technology Co ltd filed Critical Xiamen Weiya Intelligence Technology Co ltd
Priority to CN202110818940.1A priority Critical patent/CN113722672B/en
Publication of CN113722672A publication Critical patent/CN113722672A/en
Application granted granted Critical
Publication of CN113722672B publication Critical patent/CN113722672B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume

Abstract

The invention provides a method for detecting and calculating stray-light noise of a VR Lens, which comprises the following steps: S1, acquiring an image; S2, processing the image and capturing the stray-light noise; S3, partitioning the area where the stray-light noise is located, calculating the area of each partition, and determining the stray-light noise value from those areas. At present, VR Lens inspection still relies mainly on visual inspection: the inspector's eyes are brought close to the product, only the Lens on one side of the VR product can be checked at a time, efficiency is low, operators tire easily, and long-term work can impair eyesight. Inspectors differ in visual acuity and quality-control skill, so a high detection rate is hard to guarantee, the Lens cannot be partitioned for an accurate, graded defect judgment, and the inspection process is not traceable. The invention achieves intelligent VR Lens inspection by effectively analyzing the VR image and capturing the stray-light noise.

Description

Method for detecting and calculating stray light noise of VR Lens
Technical Field
The invention relates to the field of automatic optical inspection, and in particular to a method for detecting and calculating stray-light noise of a VR Lens.
Background
In current VR Lens use, abnormalities in the thread or sawtooth structure on the surface of the Fresnel Lens cause stray light/noise to appear in the image that the VR display presents to the user's eyes through the VR Lens. No equipment of this type is currently deployed on VR production lines, so during VR production the Lens is inspected mainly by eye: the inspector's eyes are brought close to the product, only the Lens on one side of the VR product can be checked at a time, efficiency is low, operators tire easily, and long-term work can impair eyesight. Inspectors differ in visual acuity and quality-control skill, so a high detection rate is hard to guarantee, the Lens cannot be partitioned for an accurate, graded defect judgment, and the inspection process is not traceable.
Disclosure of Invention
The invention provides a method for detecting and calculating stray-light noise of a VR Lens, which is used to realize intelligent inspection of the VR Lens.
The invention provides a method for detecting and calculating stray-light noise of a VR Lens, which comprises the following steps:
S1, acquiring an image;
S2, processing the image and capturing the stray-light noise;
S3, partitioning the area where the stray-light noise is located, calculating the area of each partition, and determining the stray-light noise value from those areas.
Further, S1 includes:
S101, setting an image format with grid lines;
S102, taking a photograph with a camera to obtain the image.
Further, the image format in S101 specifically uses a black background with green grid lines.
Further, S101 further includes setting a central ring and a peripheral ring, one inside the other and centered on the position in the image format that faces the lens, so that both rings intersect the grid lines.
Further, S2 specifically includes:
S201, performing gray-level conversion on the image;
S202, setting a gray threshold and capturing the stray-light noise that exceeds the gray threshold.
Further, S3 includes:
S301, dividing the image into n detection areas according to the gray threshold, denoted Block1, Block2, ..., Block n;
S302, collecting, in each detection area, the area of the pixels exceeding the gray threshold, and calculating the total pixel area.
Further, in S302 the calculation formula is Total Area = Area1 + Area2 + ... + Area n.
Further, the gray threshold is 40-120.
Still further, S302 further includes calculating the average gray value of the stray light:
calculating, for each Block, the average gray value (Gray_Average Value) of the pixels corresponding to the stray light/noise, summing the average gray values of the detection areas, and averaging them to obtain the total average gray value (Total Gray_Average Value).
Further, the total average gray value is calculated as: Total Gray_Average Value = (Gray_Average Value1 + Gray_Average Value2 + ... + Gray_Average Value n) / n.
Compared with the prior art, the invention achieves intelligent inspection of the VR Lens by effectively analyzing the VR image and capturing the stray-light noise.
Drawings
FIG. 1 is a diagram of an image format according to an embodiment of the present invention;
FIG. 2 is a photograph of a camera according to an embodiment of the present invention;
FIG. 3 is a diagram of an image format with a central ring and a peripheral ring according to an embodiment of the present invention;
FIG. 4 is a diagram of the block partitioning effect according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention are described clearly and completely below. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention.
The invention provides a method for detecting and calculating stray-light noise of a VR Lens, which comprises the following steps:
S1, acquiring an image;
S2, processing the image and capturing the stray-light noise;
S3, partitioning the area where the stray-light noise is located, calculating the area of each partition, and determining the stray-light noise value from those areas.
According to the embodiment of the invention, the VR image is effectively analyzed and the stray-light noise is captured, so that intelligent inspection of the VR Lens is realized. In addition, in S3 the area where the stray-light noise is located is partitioned, which enables partitioned, quantitative analysis of the VR Lens inspection and achieves accurate OK/NG judgment and grading of the VR Lens.
Optionally, S1 includes:
S101, setting an image format with grid lines;
as shown in FIG. 1, the image format in S101 is specifically a pattern with a black background and green grid lines;
S102, taking a photograph with a camera to obtain the image.
As shown in FIG. 2, the camera uses a small-aperture lens.
In the embodiment of the invention, the grid lines can be used to partition the image, which facilitates effective inspection of each part of the image.
Specifically, S101 further includes setting a central ring and a peripheral ring, one inside the other and centered on the position in the image format that faces the lens, so that both rings intersect the grid lines.
As shown in FIG. 3, the central ring and the peripheral ring are both blue ellipses, and the green grid lines intersect the blue rings. The centers of the central ring and the peripheral ring coincide and face the camera lens.
By setting the central ring and the peripheral ring, the embodiment of the invention allows the curvature of the image format to be adjusted according to the two rings, so that the image format matches the image captured by the camera.
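A minimal sketch of generating such a test pattern is given below, assuming OpenCV; the image dimensions, grid spacing, and ring sizes are illustrative assumptions, since the patent does not specify them.

```python
# Sketch only: test pattern with a black background, green grid lines, and two
# concentric blue ellipses (outer "peripheral" ring, inner "central" ring).
# Sizes, spacing, and colors are assumed values for illustration.
import cv2
import numpy as np

def make_test_pattern(width=1920, height=1080, grid_step=80):
    pattern = np.zeros((height, width, 3), dtype=np.uint8)  # black background

    green = (0, 255, 0)  # OpenCV uses BGR colour order
    for x in range(0, width, grid_step):
        cv2.line(pattern, (x, 0), (x, height - 1), green, 1)
    for y in range(0, height, grid_step):
        cv2.line(pattern, (0, y), (width - 1, y), green, 1)

    blue = (255, 0, 0)
    center = (width // 2, height // 2)  # assumed lens-facing position: image centre
    cv2.ellipse(pattern, center, (width // 3, height // 3), 0, 0, 360, blue, 2)  # peripheral ring
    cv2.ellipse(pattern, center, (width // 6, height // 6), 0, 0, 360, blue, 2)  # central ring
    return pattern

if __name__ == "__main__":
    cv2.imwrite("test_pattern.png", make_test_pattern())
```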
Optionally, S2 specifically includes:
S201, performing gray-level conversion on the image;
S202, setting a gray threshold and capturing the stray-light noise that exceeds the gray threshold.
Specifically, the green portion of the image is captured and converted to grayscale; the gray threshold after conversion lies in the range Grayscale 40-120 (it can be customized according to the actual situation), and the image is partitioned accordingly.
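The gray-level conversion and threshold capture of S201/S202 could be sketched as follows; the file name and the specific threshold of 40 are illustrative assumptions within the 40-120 range given above.

```python
# Sketch only: convert the captured photo to grayscale and flag pixels whose
# gray level exceeds the threshold as candidate stray-light noise.
import cv2

def capture_stray_light(image_path, gray_threshold=40):
    img = cv2.imread(image_path)                  # captured photo of the VR Lens
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # S201: gray-level conversion
    # S202: binary mask of pixels exceeding the gray threshold
    _, noise_mask = cv2.threshold(gray, gray_threshold, 255, cv2.THRESH_BINARY)
    return gray, noise_mask
```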
Specifically, S3 includes:
S301, dividing the image into n detection areas according to the gray threshold, denoted Block1, Block2, ..., Block n;
S302, collecting, in each detection area, the area of the pixels exceeding the gray threshold, and calculating the total pixel area.
In the embodiment of the present invention, the image is divided into 9 detection regions of interest (Block1, Block2, Block3, Block4, Block5, Block6, Block7, Block8, Block9; the number of partitions can be set as required, and 9 regions of interest are used here as an example). The 4 corner points of each of the 9 blocks are located and a corresponding mask is constructed, yielding 9 ROI regions, and the average gray value is calculated for each of the 9 regions. Four minimal grid frames in the image form one Block, and the effect is shown in FIG. 4.
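A simplified sketch of the 9-block partition follows. The patent locates each Block from 4 grid corner points and builds a mask from them; the even 3x3 rectangular split and the helper name partition_blocks are assumptions made to keep the example short.

```python
# Sketch only: split the grayscale image into a 3x3 grid of rectangular blocks,
# build a binary mask per block, and record each block's average gray value.
import numpy as np

def partition_blocks(gray, rows=3, cols=3):
    h, w = gray.shape
    blocks = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            mask = np.zeros((h, w), dtype=np.uint8)
            mask[y0:y1, x0:x1] = 255                      # ROI mask for this block
            mean_gray = float(gray[y0:y1, x0:x1].mean())  # average gray of the block
            blocks.append({"mask": mask, "mean_gray": mean_gray,
                           "bbox": (x0, y0, x1, y1)})
    return blocks
```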
When calculating the stray-light area, the pixel area corresponding to the stray light/noise of each Block is calculated as shown in FIG. 4. The sum of the areas of all Blocks is the total pixel area (Total Area) of the stray light/noise in the region to be inspected on the current VR Lens, so that the stray-light area can be controlled either over all Blocks of the product or for a particular Block. The formula is: Total Area = Area1 + Area2 + Area3 + Area4 + Area5 + Area6 + Area7 + Area8 + Area9.
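The per-Block and total area calculation could look like the following sketch, which reuses the hypothetical partition_blocks helper above and assumes the same gray threshold.

```python
# Sketch only: count, block by block, the pixels whose gray value exceeds the
# threshold (Area1 ... Area9) and sum them into the Total Area.
def total_stray_light_area(gray, blocks, gray_threshold=40):
    areas = []
    for block in blocks:
        x0, y0, x1, y1 = block["bbox"]
        roi = gray[y0:y1, x0:x1]
        areas.append(int((roi > gray_threshold).sum()))  # Area_i: pixels above threshold
    return areas, sum(areas)                             # Total Area = Area1 + ... + Area n
```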
In particular, S302 further includes calculating the average gray value of the stray light:
calculating, for each Block, the average gray value (Gray_Average Value) of the pixels corresponding to the stray light/noise, summing the average gray values of the detection areas, and averaging them to obtain the total average gray value (Total Gray_Average Value).
Specifically, the total average gray value is calculated as: Total Gray_Average Value = (Gray_Average Value1 + Gray_Average Value2 + ... + Gray_Average Value n) / n.
The average gray value of the pixels corresponding to the stray light/noise of each Block is calculated by the algorithm; the sum of these average gray values divided by the total number of Blocks gives the total average gray value (Total Gray_Average Value) of the stray light/noise in the region to be inspected on the current VR Lens. As a result, the average stray-light gray value can be controlled either over all Blocks of the product or for a particular Block. The calculation formula is: Total Gray_Average Value = (Gray_Average Value1 + Gray_Average Value2 + Gray_Average Value3 + Gray_Average Value4 + Gray_Average Value5 + Gray_Average Value6 + Gray_Average Value7 + Gray_Average Value8 + Gray_Average Value9) / 9.
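A corresponding sketch of the average-gray calculation is given below; it assumes that Gray_Average Value is taken over the stray-light pixels of each Block, as described above, and reuses the hypothetical partition_blocks output and gray threshold.

```python
# Sketch only: per block, average the gray values of the pixels flagged as stray
# light, then average the per-block values into the Total Gray_Average Value.
def total_average_gray(gray, blocks, gray_threshold=40):
    block_means = []
    for block in blocks:
        x0, y0, x1, y1 = block["bbox"]
        roi = gray[y0:y1, x0:x1]
        stray = roi[roi > gray_threshold]
        # Gray_Average Value_i (0 if the block contains no stray-light pixels)
        block_means.append(float(stray.mean()) if stray.size else 0.0)
    total_avg = sum(block_means) / len(block_means)  # divide by the block count n
    return block_means, total_avg
```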
The embodiment of the invention is suitable for in-line production: it replaces manual inspection, provides fast, stable and reliable detection, allows VR Lens inspection to be partitioned and quantified in a customized way within the field of view, supports user-defined control limits, and achieves accurate OK/NG judgment and grading of products. Both the process and the result are controllable and traceable.
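As an illustration of how such control limits might be applied, the following sketch compares the Total Area and Total Gray_Average Value against user-defined limits to produce an OK/NG result and a coarse grade; all limit values and the two-level grading are assumptions, not values from the patent.

```python
# Sketch only: apply user-defined control limits to the totals computed above.
def judge_lens(total_area, total_avg_gray, area_limit=5000, gray_limit=80):
    if total_area > area_limit or total_avg_gray > gray_limit:
        return "NG"
    # Grade A: very little stray light; Grade B: measurable but within limits
    grade = "A" if total_area < area_limit // 2 and total_avg_gray < gray_limit // 2 else "B"
    return f"OK (grade {grade})"
```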
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention is described in detail with reference to the above embodiments, those skilled in the art will understand that, after reading this specification, they may modify the specific embodiments or substitute equivalents for some of their features, and such modifications and variations do not depart from the scope of the claims of the present application.

Claims (10)

1. A VR Lens stray-light noise detection and calculation method, characterized by comprising the following steps:
S1, acquiring an image;
S2, processing the image and capturing the stray-light noise;
S3, partitioning the area where the stray-light noise is located, calculating the area of each partition, and determining the stray-light noise value from those areas.
2. The method of claim 1, wherein S1 includes:
S101, setting an image format with grid lines;
S102, taking a photograph with a camera to obtain the image.
3. The method of claim 2, wherein the image format in S101 uses a black background with green grid lines.
4. The method of claim 3, wherein S101 further comprises setting a central ring and a peripheral ring, one inside the other and centered on the position in the image format that faces the Lens, such that both rings intersect the grid lines.
5. The method of claim 1, wherein S2 specifically includes:
S201, performing gray-level conversion on the image;
S202, setting a gray threshold and capturing the stray-light noise that exceeds the gray threshold.
6. The method of claim 5, wherein S3 includes:
S301, dividing the image into n detection areas according to the gray threshold, denoted Block1, Block2, ..., Block n;
S302, collecting, in each detection area, the area of the pixels exceeding the gray threshold, and calculating the total pixel area.
7. The method of claim 6, wherein in S302 the calculation formula is Total Area = Area1 + Area2 + ... + Area n.
8. The method of claim 5, wherein the gray threshold is 40-120.
9. The method of claim 6, wherein S302 further comprises calculating the average gray value of the stray light:
calculating, for each Block, the average gray value (Gray_Average Value) of the pixels corresponding to the stray light/noise, summing the average gray values of the detection areas, and averaging them to obtain the total average gray value (Total Gray_Average Value).
10. The method of claim 9, wherein the total average gray value is calculated as: Total Gray_Average Value = (Gray_Average Value1 + Gray_Average Value2 + ... + Gray_Average Value n) / n.
CN202110818940.1A 2021-07-20 2021-07-20 Method for detecting and calculating stray light noise of VR Lens Active CN113722672B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110818940.1A CN113722672B (en) 2021-07-20 2021-07-20 Method for detecting and calculating stray light noise of VR Lens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110818940.1A CN113722672B (en) 2021-07-20 2021-07-20 Method for detecting and calculating stray light noise of VR Lens

Publications (2)

Publication Number Publication Date
CN113722672A true CN113722672A (en) 2021-11-30
CN113722672B CN113722672B (en) 2022-04-05

Family

ID=78673636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110818940.1A Active CN113722672B (en) 2021-07-20 2021-07-20 Method for detecting and calculating stray light noise of VR Lens

Country Status (1)

Country Link
CN (1) CN113722672B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1689386A (en) * 2002-08-08 2005-10-26 大日本印刷株式会社 Electromagnetic shielding sheet
US20060281221A1 (en) * 2005-06-09 2006-12-14 Sharad Mehrotra Enhanced routing grid system and method
US20060290695A1 (en) * 2001-01-05 2006-12-28 Salomie Ioan A System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
CN101738217A (en) * 2008-11-18 2010-06-16 株式会社三丰 Scale track configuration for absolute optical encoder
CN102216941A (en) * 2008-08-19 2011-10-12 数字标记公司 Methods and systems for content processing
US20120059850A1 (en) * 2010-09-06 2012-03-08 Jonathan Binnings Bent Computerized face photograph-based dating recommendation system
CN102460632A (en) * 2009-05-20 2012-05-16 迈普尔平版印刷Ip有限公司 Method of generating a two-level pattern for lithographic processing and pattern generator using the same
US20170351708A1 (en) * 2016-06-06 2017-12-07 Think-Cell Software Gmbh Automated data extraction from scatter plot images
CN108801601A (en) * 2018-04-13 2018-11-13 歌尔科技有限公司 Test method, equipment and the storage medium of the spuious optical noise of Fresnel Lenses

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG YE et al.: "Design on the photographic lamps for panoramic imaging", AOPC 2020: Optics Ultra Precision Manufacturing and Testing *
XU Jiesu et al.: "Stray light test method for the mirrors of a laser gravitational-wave telescope", Infrared and Laser Engineering *

Also Published As

Publication number Publication date
CN113722672B (en) 2022-04-05

Similar Documents

Publication Publication Date Title
CN106851264B (en) Camera module lens surface detection method and device
CN101209207A (en) Eyelid detecting apparatus, eyelid detecting method and program thereof
CN109471276B (en) Method and device for detecting color cast defect of liquid crystal display
CN105812790B (en) Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card
CN109167997A (en) A kind of video quality diagnosis system and method
CN108827597B (en) Light spot uniformity detection method and detection system of structured light projector
CN114881915A (en) Symmetry-based mobile phone glass cover plate window area defect detection method
CN110648330B (en) Defect detection method for camera glass
CN109461156B (en) Threaded sealing plug assembly detection method based on vision
US10375383B2 (en) Method and apparatus for adjusting installation flatness of lens in real time
CN104777172A (en) Quick and intelligent defective optical lens detection device and method
CN112819844B (en) Image edge detection method and device
CN116993682B (en) Cornea shaping mirror flaw area extraction method based on image data analysis
CN115131354A (en) Laboratory plastic film defect detection method based on optical means
CN110880184A (en) Method and device for carrying out automatic camera inspection based on optical flow field
CN114612418A (en) Method, device and system for detecting surface defects of mouse shell and electronic equipment
CN108965749A (en) Defect pixel detection and means for correcting and method based on texture recognition
CN113722672B (en) Method for detecting and calculating stray light noise of VR Lens
CN110446025B (en) Camera module detection system and method applied to electronic equipment
CN109544535B (en) Peeping camera detection method and system based on optical filtering characteristics of infrared cut-off filter
CN114820597B (en) Smelting product defect detection method, device and system based on artificial intelligence
CN111008960A (en) Aluminum electrolytic capacitor bottom appearance detection method and device based on machine vision
CN112102319B (en) Dirty image detection method, dirty image detection device, and dirty image detection mechanism
CN115690089A (en) Image enhancement preprocessing method and system for weak defect detection
KR20160123455A (en) Method for measuring a pollution level of spot welding electrode tip using images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 201a, Jinfeng Building, information optoelectronic Park, torch high tech Zone, Xiamen, Fujian Province
Patentee after: Xiamen Weiya Intelligent Technology Co.,Ltd.
Address before: Room 201a, Jinfeng Building, information optoelectronic Park, torch high tech Zone, Xiamen, Fujian Province
Patentee before: XIAMEN WEIYA INTELLIGENCE TECHNOLOGY Co.,Ltd.