CN108198175B - Detection method, detection device, computer equipment and computer readable storage medium - Google Patents


Info

Publication number
CN108198175B
Authority
CN
China
Prior art keywords
image
optical element
frames
light
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711465117.7A
Other languages
Chinese (zh)
Other versions
CN108198175A (en)
Inventor
吴安平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711465117.7A priority Critical patent/CN108198175B/en
Publication of CN108198175A publication Critical patent/CN108198175A/en
Priority to PCT/CN2018/123585 priority patent/WO2019129004A1/en
Application granted granted Critical
Publication of CN108198175B publication Critical patent/CN108198175B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/50 Depth or shape recovery
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for detecting breakage of an optical element. The optical element includes a camera optical element of a camera module, and the detection method comprises the following steps: acquiring a first image; processing the first image to determine whether a crack pattern is present in the first image; and confirming that the camera optical element is broken when a crack pattern is present in the first image. The invention also discloses a device for detecting breakage of an optical element, a computer device, and a computer-readable storage medium. The detection method, detection device, computer device, and computer-readable storage medium of the embodiments of the invention determine whether the camera optical element is broken by identifying whether a crack pattern is present in the first image acquired by the camera module. The computer device can therefore detect by itself whether the optical element is intact, which improves the intelligence of the computer device.

Description

Detection method, detection device, computer equipment and computer readable storage medium
Technical Field
The present invention relates to the field of imaging technologies, and in particular, to a detection method, a detection apparatus, a computer device, and a computer-readable storage medium.
Background
A conventional mobile phone is generally equipped with devices that contain optical elements, such as a camera. During use of the mobile phone, an optical element may be broken by an external force. However, current mobile phones cannot detect by themselves whether an optical element is broken, so their intelligence is poor.
Disclosure of Invention
The embodiments of the invention provide a detection method, a detection device, a computer device, and a computer-readable storage medium.
The invention provides a method for detecting breakage of an optical element. The optical element comprises a camera optical element of a camera module, and the detection method comprises the following steps:
acquiring a first image;
processing the first image to determine whether a crack pattern is present in the first image; and
confirming that the camera optical element is broken when a crack pattern is present in the first image.
The invention provides a detection device for optical element breakage. The optical element comprises a camera optical element of a camera module, and the detection device comprises a first acquisition module, a first processing module and a first confirmation module. The first acquisition module is used for acquiring a first image. The first processing module is used for processing the first image to determine whether a crack pattern is present in the first image. The first confirmation module is used for confirming that the camera optical element is broken when a crack pattern is present in the first image.
The invention provides a computer device comprising a memory and a processor. The memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the detection method described above.
The present invention provides one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the detection methods described above.
The detection method, detection device, computer device, and computer-readable storage medium of the embodiments of the invention determine whether the camera optical element is broken by identifying whether a crack pattern is present in the first image acquired by the camera module. The computer device can therefore detect by itself whether the optical element is intact, which improves the intelligence of the computer device.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow diagram of a detection method according to some embodiments of the present invention.
FIG. 2 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 3 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 4 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 5 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 6 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 7 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 8 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 9 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 10 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 11 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 12 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 13 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 14 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 15 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 16 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 17 is a block diagram of a computer device according to some embodiments of the invention.
FIG. 18 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 19 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 20 is a schematic flow chart of a detection method according to some embodiments of the present invention.
FIG. 21 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 22 is a block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 23 is a block diagram of image processing circuitry in accordance with certain implementations of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended to illustrate the invention, and are not to be construed as limiting the invention.
Referring to fig. 1, fig. 2 and fig. 23, the present invention provides a method for detecting breakage of an optical element. The optical element includes a camera optical element 411 of a camera module. The detection method comprises the following steps:
01: acquiring a first image;
02: processing the first image to determine whether the first image has a crack pattern; and
04: the camera optical element 411 is confirmed to be broken when the first image has a crack pattern.
Referring to fig. 2, the present invention further provides a device 10 for detecting breakage of an optical element. The optical element includes a camera optical element 411 of a camera module. The method for detecting breakage of an optical element according to the embodiments of the present invention can be implemented by the detection device 10 according to the embodiments of the present invention. The detection device 10 comprises a first acquisition module 11, a first processing module 12 and a first confirmation module 15. Step 01 may be implemented by the first acquisition module 11, step 02 may be implemented by the first processing module 12, and step 04 may be implemented by the first confirmation module 15. That is, the first acquisition module 11 may be used to acquire the first image. The first processing module 12 may be configured to process the first image to determine whether the first image has a crack pattern. The first confirmation module 15 may be used to confirm that the camera optical element 411 is broken when the first image has a crack pattern. Further, the first confirmation module 15 may be used to confirm that the camera optical element 411 is not broken when no crack pattern is present in the first image.
Referring to fig. 3, the present invention further provides a computer device 100. The computer device 100 includes a memory 51 and a processor 52. The memory 51 stores computer-readable instructions 511. The computer-readable instructions 511, when executed by the processor 52, cause the processor 52 to perform the operations of acquiring the first image, processing the first image to determine whether the first image has a crack pattern, and confirming that the camera optical element 411 is broken when the first image has a crack pattern.
In an embodiment of the present invention, the camera module is an infrared camera 41, and the first image in step 01 is formed by the infrared camera 41 capturing the infrared component of ambient light reflected by an object. The camera optical element 411 of the camera module is a lens component of the infrared camera 41, such as a filter, a focusing lens, or a dustproof lens.
In some embodiments, the computer device 100 may be a mobile phone, a tablet computer, a notebook computer, a smart bracelet, a smart watch, a smart helmet, smart glasses, or the like.
Taking a mobile phone as an example, it can be understood that an existing mobile phone is usually equipped with devices that have optical elements, such as a visible-light camera, an infrared camera 41, and a fill-light element. During use of the mobile phone, an optical element may be broken by external factors, such as the phone being dropped. However, current mobile phones cannot detect by themselves whether an optical element is broken, so their intelligence is poor.
When the camera optical element 411 of the infrared camera 41 is broken, the infrared camera 41 photographs the crack pattern on the broken camera optical element 411. The detection method, the detection device 10, and the computer device 100 of the embodiments of the present invention therefore determine whether the camera optical element 411 is broken by recognizing whether a crack pattern is present in the first image acquired by the camera module. In this way, the computer device 100 can detect by itself whether the optical element is intact, which improves the intelligence of the computer device 100.
Further, in some embodiments, upon confirming that the camera optical element 411 is broken, a prompting module (e.g., a speaker or a display screen) in the detection device 10 or the computer device 100 may prompt the user by voice or text to replace the broken camera optical element 411.
Referring to FIG. 4, in some embodiments, step 02 of processing the first image to determine whether the first image has a crack pattern includes:
021: constructing a multi-stage crack pattern classifier based on the Haar-like rectangular feature set;
022: training a multi-stage crack pattern classifier by adopting a positive crack pattern sample and a negative crack pattern sample; and
023: and detecting whether the first image has a crack pattern or not by adopting the trained multistage crack pattern classifier.
Referring to fig. 5, in some embodiments, the first processing module 12 includes a first constructing unit 121, a first training unit 122 and a first detecting unit 123. Step 021 may be implemented by the first construction unit 121, step 022 may be implemented by the first training unit 122, and step 023 may be implemented by the first detection unit 123. That is, the first construction unit 121 may be configured to construct a multi-stage crack pattern classifier based on a Haar-like rectangular feature set. The first training unit 122 may be used to train a multi-stage crack pattern classifier using positive and negative crack pattern samples. The first detection unit 123 may be configured to detect whether a crack pattern exists in the first image by using the trained multi-stage crack pattern classifier.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of constructing a multi-stage crack pattern classifier based on the Haar-like rectangular feature set, training the multi-stage crack pattern classifier using the positive crack pattern samples and the negative crack pattern samples, and detecting whether the crack pattern exists in the first image using the trained multi-stage crack pattern classifier.
Specifically, a large number of positive crack pattern samples and negative crack pattern samples are first selected, where the positive crack pattern samples are samples that contain a crack pattern of the camera optical element 411 and the negative crack pattern samples are samples that do not. Next, resolution normalization is applied to the positive and negative crack pattern samples to reduce the image size, which facilitates the subsequent rapid detection of image features. Then, in view of the morphological characteristics of crack patterns, a Haar-like rectangular feature set capable of detecting line features, center-surround features, diagonal features, and edge features is used to construct a multi-stage crack pattern classifier. Specifically, a rectangular feature value is calculated for each Haar-like rectangular feature; several of these rectangular feature values are then selected to form a plurality of weak classifiers, each weak classifier comprising one or more rectangular feature values. The weak classifiers in turn form a plurality of strong classifiers, each strong classifier comprising a plurality of weak classifiers. All strong classifiers are cascaded to form the multi-stage crack pattern classifier. Next, the multi-stage crack pattern classifier is trained with a large number of resolution-normalized positive and negative crack pattern samples to correct the weight corresponding to each weak classifier, so that the loss function of the finally output image classification result is smaller than a preset loss value. Finally, the trained multi-stage crack pattern classifier is used to detect whether a crack pattern is present in the first image. The multi-stage crack pattern classifier is formed by cascading a plurality of strong classifiers, and each strong classifier is composed of a plurality of weak classifiers; this multi-stage structure can accurately extract image features and improves the accuracy of image classification.
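As an illustration of this detection step only (the patent does not give an implementation), the sketch below assumes a Haar cascade has already been trained offline on positive and negative crack pattern samples, for example with OpenCV's opencv_traincascade tool; the file name "crack_cascade.xml", the normalized resolution, and the detection parameters are placeholders rather than values from the source.

    import cv2

    # Assumption: "crack_cascade.xml" is a multi-stage Haar cascade trained offline
    # on resolution-normalized positive (cracked) and negative (intact) samples.
    CRACK_CASCADE_PATH = "crack_cascade.xml"

    def normalize_resolution(gray, size=(320, 240)):
        # Shrink the sample so the rectangular features can be scanned quickly.
        return cv2.resize(gray, size, interpolation=cv2.INTER_AREA)

    def has_crack_pattern(first_image_path, cascade_path=CRACK_CASCADE_PATH):
        # Returns True when the trained multi-stage classifier reports at least
        # one crack-pattern region in the first image.
        cascade = cv2.CascadeClassifier(cascade_path)
        gray = cv2.imread(first_image_path, cv2.IMREAD_GRAYSCALE)
        gray = normalize_resolution(gray)
        detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
        return len(detections) > 0

    if __name__ == "__main__":
        print("crack pattern found:", has_crack_pattern("first_image.png"))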
Referring to FIG. 6, in some embodiments, the first image includes a plurality of frames. Step 02 of processing the first image to determine whether the first image has a crack pattern includes:
021: processing the multiple frames of first images to judge whether crack patterns exist in the multiple frames of first images;
the method for detecting the breakage of the optical element further comprises:
031: comparing whether the crack positions of the crack patterns in any two frames of first images are consistent;
032: when the crack positions in any two frames of the first image coincide, it is confirmed that the camera optical element 411 is broken.
Referring to fig. 7, in some embodiments, the detecting device 10 further includes a first comparing module 13. Step 021 may be implemented by the first processing module 12, step 031 may be implemented by the first comparing module 13, and step 032 may be implemented by the first confirming module 15. That is, the first processing module 12 may be further configured to process the plurality of frames of the first image to determine whether the crack pattern exists in each of the plurality of frames of the first image. The first comparing module 13 may be configured to compare whether the crack positions of the crack patterns in any two frames of the first image are consistent. The first confirmation module 15 may also be used to confirm that the camera optical element 411 is broken when the crack positions in any two frames of the first image coincide. At this time, when the crack pattern exists in all of the plurality of frames of the first image, but the crack positions in any two frames of the first image do not coincide, it is confirmed that the camera optical element 411 is not broken.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, enable the processor 52 to further perform the operations of processing the plurality of first images to determine whether crack patterns exist in each of the plurality of first images, comparing whether crack positions of the crack patterns in any two first images are consistent, and confirming that the camera optical element 411 is broken when the crack positions in any two first images are consistent.
Specifically, after the infrared camera 41 captures multiple frames of the first image, the processor 52 may detect each frame with the trained multi-stage crack pattern classifier to determine whether a crack pattern exists in each frame. Upon detecting that a crack pattern is present in every frame of the first image, the processor 52 extracts the crack position in the corresponding first image from each identified crack pattern. Subsequently, the processor 52 compares the crack positions in the multiple frames of the first image two by two, and when every comparison shows that the crack positions in the corresponding two frames are consistent, the camera optical element 411 is considered broken. Here, the crack positions in two frames of the first image being consistent means that the difference in distance between the crack positions in the two frames is smaller than a preset distance difference. It can be understood that when the camera optical element 411 of the infrared camera 41 is broken, the position of the crack pattern is unchanged in every frame of the first image captured by the infrared camera 41. Therefore, whether the camera optical element 411 is broken can be further determined from the crack position, which improves the accuracy of detecting breakage of the camera optical element 411.
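The per-frame consistency check can be pictured as follows. This is a minimal sketch under assumptions the patent does not state: each frame contributes one detected crack bounding box (or None when no crack pattern is found), the crack position is taken as the box center, and MAX_POSITION_DIFF stands in for the preset distance difference.

    import math

    MAX_POSITION_DIFF = 5.0  # pixels; placeholder for the "preset distance difference"

    def crack_center(box):
        # box is (x, y, w, h) from the cascade detector; use its center as the crack position.
        x, y, w, h = box
        return (x + w / 2.0, y + h / 2.0)

    def positions_consistent(box_a, box_b, max_diff=MAX_POSITION_DIFF):
        (xa, ya), (xb, yb) = crack_center(box_a), crack_center(box_b)
        return math.hypot(xa - xb, ya - yb) < max_diff

    def camera_element_broken(per_frame_boxes):
        # Broken only if every frame shows a crack pattern and the crack position
        # is consistent between every pair of frames.
        if not per_frame_boxes or any(box is None for box in per_frame_boxes):
            return False
        return all(
            positions_consistent(a, b)
            for i, a in enumerate(per_frame_boxes)
            for b in per_frame_boxes[i + 1:]
        )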
Referring to fig. 8, in some embodiments, the optical elements further include a fill-in optical element of the infrared fill-in lamp 42. The method for detecting the breakage of the optical element further comprises:
051: when the optical element 411 of the camera is not damaged, the infrared fill-in light 42 and the infrared camera 41 are turned on;
052: acquiring a second image according to the infrared light emitted by the infrared light supplement lamp 42;
053: processing the second image to determine whether the second image has a light pattern; and
054: and confirming that the light supplementing optical element is broken when the second image has the light pattern.
Referring to fig. 9, in some embodiments, the detection device 10 further includes a first turn-on module 21, a second acquisition module 22, a second processing module 23, and a second confirmation module 25. Step 051 may be implemented by the first turn-on module 21, step 052 may be implemented by the second acquisition module 22, step 053 may be implemented by the second processing module 23, and step 054 may be implemented by the second confirmation module 25. That is, the first turn-on module 21 may be configured to turn on the infrared fill-in lamp 42 and the infrared camera 41 when it is determined that the camera optical element 411 is not broken. The second acquisition module 22 may be configured to acquire a second image according to the infrared light emitted by the infrared fill-in lamp 42. The second processing module 23 may be used to process the second image to determine whether the second image has a light pattern. The second confirmation module 25 may be configured to confirm that the fill-in optical element is broken when the second image has a light pattern. Further, the second confirmation module 25 may be configured to confirm that the fill-in optical element is not broken when the second image does not have a light pattern.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of: turning on the infrared fill-in lamp 42 and the infrared camera 41 when it is determined that the camera optical element 411 is not broken, acquiring a second image according to the infrared light emitted by the infrared fill-in lamp 42, processing the second image to determine whether a light pattern exists in the second image, and confirming that the fill-in optical element is broken when a light pattern exists in the second image.
The method for detecting breakage of an optical element in the embodiments of the invention can further be used to detect whether the fill-in optical element of the infrared fill-in lamp 42 is broken. The fill-in optical element of the infrared fill-in lamp 42 is a lens or a lamp cover in the infrared fill-in lamp 42. When the fill-in optical element is broken, the light it emits can no longer diffuse uniformly into the scene, and because of the cracks in the fill-in optical element, light stripes appear in the infrared light diffused into the scene. Therefore, when it is confirmed that the camera optical element 411 of the infrared camera 41 is not broken, the infrared fill-in lamp 42 and the infrared camera 41 can be turned on at the same time. The infrared camera 41 captures the light emitted by the infrared fill-in lamp 42 to obtain the second image. The processor 52 processes the second image to detect whether a light pattern is present, and confirms that the fill-in optical element is broken when a light pattern is present. In this way, on the one hand, the computer device 100 can detect the integrity of the fill-in optical element by itself, which improves the intelligence of the computer device 100. On the other hand, the light emitted by the infrared fill-in lamp 42 is usually infrared laser light, and when the fill-in optical element is broken, the infrared laser passing through the crack can easily injure human eyes. The computer device 100 can therefore turn off the infrared fill-in lamp 42 promptly when it detects that the fill-in optical element is broken, which prevents the infrared laser from injuring the user's eyes, improves the safety of using the computer device 100, and improves the user experience.
Referring to fig. 10, in some embodiments, the step 053 of processing the second image to determine whether the second image has the light pattern includes:
0531: constructing a multi-level optical pattern classifier based on a Haar-like rectangular feature set;
0532: training a multi-stage light pattern classifier by adopting positive light pattern samples and negative light pattern samples; and
0533: and detecting whether the second image has the light pattern or not by adopting the trained multi-stage light pattern classifier.
Referring to fig. 11, in some embodiments, the second processing module 23 includes a second constructing unit 231, a second training unit 232, and a second detecting unit 233. Step 0531 may be implemented by the second constructing unit 231, step 0532 may be implemented by the second training unit 232, and step 0533 may be implemented by the second detecting unit 233. That is, the second constructing unit 231 may be configured to construct a multi-stage light pattern classifier based on the Haar-like rectangular feature set. The second training unit 232 may be used to train the multi-stage light pattern classifier using positive light pattern samples and negative light pattern samples. The second detecting unit 233 may be configured to detect whether the second image has a light pattern using the trained multi-stage light pattern classifier.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of constructing a multi-stage light pattern classifier based on the Haar-like rectangular feature set, training the multi-stage light pattern classifier using the positive light pattern samples and the negative light pattern samples, and detecting whether a light pattern exists in the second image using the trained multi-stage light pattern classifier.
Specifically, a large number of positive light pattern samples and negative light pattern samples are first selected, where the positive light pattern samples are samples that contain a light pattern and the negative light pattern samples are samples that do not. Next, resolution normalization is applied to the positive and negative light pattern samples to reduce the image size, which facilitates the subsequent rapid detection of image features. Then, in view of the morphological characteristics of light patterns, a Haar-like rectangular feature set capable of detecting line features, center-surround features, diagonal features, and edge features is used to construct a multi-stage light pattern classifier. Specifically, a rectangular feature value is calculated for each Haar-like rectangular feature; several of these rectangular feature values are then selected to form a plurality of weak classifiers, each weak classifier comprising one or more rectangular feature values. The weak classifiers in turn form a plurality of strong classifiers, each strong classifier comprising a plurality of weak classifiers. All strong classifiers are cascaded to form the multi-stage light pattern classifier. Next, the multi-stage light pattern classifier is trained with a large number of resolution-normalized positive and negative light pattern samples to correct the weight corresponding to each weak classifier, so that the loss function of the finally output image classification result is smaller than a preset loss value. Finally, the trained multi-stage light pattern classifier is used to detect whether a light pattern is present in the second image. The multi-stage light pattern classifier is formed by cascading a plurality of strong classifiers, and each strong classifier is composed of a plurality of weak classifiers; this multi-stage structure can accurately extract image features and improves the accuracy of image classification.
Referring to fig. 12, in some embodiments, the second image comprises a plurality of frames, and the step 053 of processing the second image to determine whether the second image has the light pattern comprises:
0534: processing the multiple frames of second images to judge whether the multiple frames of second images all have light pattern or not;
the method for detecting the breakage of the optical element further comprises:
0541: comparing whether the positions of the light patterns in any two frames of second images are consistent; and
0542: and when the positions of the light fringes in any two frames of second images are consistent, confirming that the light supplementing optical element is broken.
Referring to fig. 13, in some embodiments, the detection device 10 further includes a second comparison module 24. Step 0534 may be implemented by the second processing module 23. Step 0541 may be implemented by the second comparison module 24. Step 0542 may be implemented by the second confirmation module 25. That is, the second processing module 23 is further configured to process the multiple frames of the second image to determine whether a light pattern exists in each of the multiple frames of the second image. The second comparison module 24 may be used to compare whether the positions of the light patterns in any two frames of the second image are consistent. The second confirmation module 25 may further be configured to confirm that the fill-in optical element is broken when the positions of the light patterns in any two frames of the second image are consistent. Further, the second confirmation module 25 may be configured to confirm that the fill-in optical element is not broken when the light pattern positions in any two frames of the second image are not consistent.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of processing the multiple frames of the second image to determine whether a light pattern exists in each of the multiple frames of the second image, comparing whether the positions of the light patterns in any two frames of the second image are consistent, and confirming that the fill-in optical element is broken when the positions of the light patterns in any two frames of the second image are consistent.
Specifically, when it is confirmed that the camera optical element 411 of the infrared camera 41 is not broken, the infrared fill-in lamp 42 and the infrared camera 41 are turned on at the same time. After the infrared camera 41 captures the infrared light emitted by the infrared fill-in lamp 42 to obtain multiple frames of the second image, the processor 52 may detect each frame with the trained multi-stage light pattern classifier to determine whether a light pattern exists in each frame. Upon detecting that a light pattern is present in every frame of the second image, the processor 52 extracts the position of the light pattern in the corresponding second image from each identified light pattern. Subsequently, the processor 52 compares the light pattern positions in the multiple frames of the second image two by two, and when every comparison shows that the light pattern positions in the corresponding two frames are consistent, the fill-in optical element is considered broken. Here, the light pattern positions in two frames of the second image being consistent means that the difference in distance between the light pattern positions in the two frames is smaller than a preset distance difference. It can be understood that when the fill-in optical element of the infrared fill-in lamp 42 is broken, since the distance between the infrared fill-in lamp 42 and the infrared camera 41 is fixed, the position of the light pattern is unchanged in every frame of the second image captured by the infrared camera 41. Therefore, whether the fill-in optical element of the infrared fill-in lamp 42 is broken can be further determined from the light pattern position, which improves the accuracy of detecting breakage of the fill-in optical element.
Referring to fig. 14, in some embodiments, the optical element further includes a structured light optical element of the structured light projector 43, where the structured light optical element refers to a collimating element, a diffractive element, or the like in the structured light projector 43. The method for detecting the breakage of the optical element further comprises:
061: upon confirming that the camera optical element 411 is not broken, the structured light projector 43 and the infrared camera 41 are turned on;
062: acquiring a third image from the infrared light emitted from the structured light projector 43;
063: processing the third image to judge whether depth abnormal information exists in the third image; and
065: confirming that the structured light optical element is broken when there is depth anomaly information in the third image.
Referring to fig. 15, in some embodiments, the detection device 10 further includes a second turn-on module 31, a third acquisition module 32, a third processing module 33, and a third confirmation module 35. Step 061 may be implemented by the second turn-on module 31, step 062 may be implemented by the third acquisition module 32, step 063 may be implemented by the third processing module 33, and step 065 may be implemented by the third confirmation module 35. That is, the second turn-on module 31 may be used to turn on the structured light projector 43 and the infrared camera 41 upon confirming that the camera optical element 411 is not broken. The third acquisition module 32 may be configured to acquire a third image based on the infrared light emitted by the structured light projector 43. The third processing module 33 may be configured to process the third image to determine whether depth anomaly information exists in the third image. The third confirmation module 35 may be used to confirm that the structured light optical element is broken when depth anomaly information exists in the third image.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of turning on the structured light projector 43 and the infrared camera 41 when it is determined that the camera optical element 411 is not broken, acquiring a third image based on infrared light emitted from the structured light projector 43, processing the third image to determine whether depth anomaly information exists in the third image, and determining that the structured light optical element is broken when the depth anomaly information exists in the third image.
The method for detecting breakage of an optical element in the embodiments of the invention can further be used to detect whether the structured light optical element of the structured light projector 43 is broken. The structured light optical element of the structured light projector 43 may be a collimating lens or a Diffractive Optical Element (DOE) in the structured light projector 43. When the structured light optical element is broken and the depth information of the current scene is calculated from the image captured by the infrared camera 41 and a preset reference image, part of the depth information of the current scene may become abnormally large. Therefore, when it is confirmed that the camera optical element 411 of the infrared camera 41 is not broken, the structured light projector 43 and the infrared camera 41 can be turned on at the same time. The infrared camera 41 captures the infrared light projected by the structured light projector 43 to obtain the third image. The processor 52 processes the third image to detect whether depth anomaly information is present, and confirms that the structured light optical element is broken when depth anomaly information is present. In this way, on the one hand, the computer device 100 can detect the integrity of the structured light optical element by itself, which improves the intelligence of the computer device 100. On the other hand, the light emitted by the structured light projector 43 is usually infrared laser light, and when the structured light optical element is broken, the infrared laser from the structured light projector 43 can easily injure human eyes. The computer device 100 can therefore turn off the structured light projector 43 promptly when it detects that the structured light optical element is broken, which prevents the infrared laser from injuring the user's eyes, improves the safety of using the computer device 100, and improves the user experience.
Referring to fig. 16, in some embodiments, step 063 of processing the third image to determine whether depth anomaly information exists in the third image comprises:
0631: calculating the scattered point offset of each pixel point in the third image according to the third image and a preset reference image;
0632: calculating the depth information of each pixel point in the third image according to the scatter offset;
0633: calculating the number of pixel points with adjacent depth information larger than a preset threshold; and
0634: and determining that the depth abnormal information exists in the third image when the number of the adjacent pixel points is larger than the preset number.
Referring to fig. 17, in some embodiments, the third processing module 33 includes a first calculating unit 331, a second calculating unit 332, a third calculating unit 333, and a confirming unit 334. Step 0631 may be implemented by the first calculating unit 331, step 0632 may be implemented by the second calculating unit 332, step 0633 may be implemented by the third calculating unit 333, and step 0634 may be implemented by the confirming unit 334. That is, the first calculating unit 331 is configured to calculate the speckle offset of each pixel point in the third image according to the third image and the preset reference image, the second calculating unit 332 is configured to calculate the depth information of each pixel point in the third image according to the speckle offset, the third calculating unit 333 is configured to count the number of adjacent pixel points whose depth information is greater than the preset threshold, and the confirming unit 334 is configured to determine that depth anomaly information exists in the third image when the number of such adjacent pixel points is greater than the preset number.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to perform the operations of calculating the speckle offset of each pixel point in the third image according to the third image and the preset reference image, calculating the depth information of each pixel point in the third image according to the speckle offset, counting the number of adjacent pixel points whose depth information is greater than the preset threshold, and determining that depth anomaly information exists in the third image when the number of such adjacent pixel points is greater than the preset number.
Specifically, the processor 52 first extracts a plurality of input image blocks from the third image, where any two input image blocks have the same center point but different sizes. The processor 52 then extracts a plurality of matching search windows from the preset reference image; each matching search window is centered on the point in the preset reference image that corresponds to the center point of an input image block, and matching blocks are extracted within a certain range around that point. Next, the sum of absolute differences (SAD) of corresponding pixels between the input image block and each matching block of the same size in the matching search window is computed with a parallel algorithm, yielding the SAD between every input image block and all of its matching blocks. For each input image block, the matching block with the smallest SAD value is then selected as the best matching block for that input image block. Next, the processor 52 subtracts the coordinate of the point in the preset reference image that corresponds to the center of the input image block from the coordinate of the center point of the best matching block, obtaining the best offset for that input image block. This best offset is the speckle offset of the pixel point at the center of the corresponding input image block; its sign indicates whether the center point of the input image block lies in front of or behind the preset reference pattern, and the larger the speckle offset, the farther the perpendicular distance between the center point of the input image block and the plane of the preset reference pattern, whose depth is known. The processor 52 calculates the speckle offset of each pixel point in the third image one by one in this manner. Finally, based on the calculated offset of each pixel point in the third image, the processor 52 calculates the depth information of each pixel point from a depth calculation formula, using the known depth distance of the preset reference pattern and the baseline distance between the structured light projector 43 and the infrared camera 41. The processor 52 then determines from the calculated depth information whether the third image contains a plurality of adjacent pixel points whose depth information is greater than the preset threshold, and determines that depth anomaly information exists in the third image when the number of such adjacent pixel points exceeds the preset number. In this way, whether the structured light optical element of the structured light projector 43 is broken can be determined from the depth anomaly information in the third image.
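The block matching and depth computation can be sketched roughly as below. The patent does not give the search strategy or the depth formula, so this sketch assumes a purely horizontal speckle offset, a single square SAD window per pixel, and one common reference-plane triangulation formula; the reference depth, focal length, baseline, threshold, and count values are illustrative placeholders, and the adjacency test is simplified to a plain pixel count.

    import numpy as np

    # Illustrative parameters only (not from the patent).
    REF_DEPTH_MM = 500.0         # known depth of the preset reference pattern
    FOCAL_PX = 600.0             # focal length of the infrared camera, in pixels
    BASELINE_MM = 25.0           # baseline between structured light projector and camera
    DEPTH_THRESHOLD_MM = 2000.0  # stands in for the "preset threshold"
    MIN_ANOMALY_PIXELS = 200     # stands in for the "preset number"

    def speckle_offset(third_image, reference, cx, cy, block=11, search=30):
        # Minimum-SAD match of the block centered at (cx, cy) against candidate
        # blocks along the same row of the preset reference image.
        # Caller should keep (cx, cy) far enough from the image border.
        half = block // 2
        patch = third_image[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(np.int32)
        best_d, best_sad = 0, None
        for d in range(-search, search + 1):
            x = cx + d
            if x - half < 0 or x + half + 1 > reference.shape[1]:
                continue
            cand = reference[cy - half:cy + half + 1, x - half:x + half + 1].astype(np.int32)
            sad = int(np.abs(patch - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_d = sad, d
        return best_d

    def offset_to_depth(d_px):
        # One common reference-plane triangulation formula; the exact formula used
        # in the patent is not disclosed, so this form is an assumption.
        return (FOCAL_PX * BASELINE_MM * REF_DEPTH_MM) / (FOCAL_PX * BASELINE_MM + d_px * REF_DEPTH_MM)

    def depth_anomaly_present(depth_map):
        # Simplified anomaly test: count pixels whose depth exceeds the threshold
        # instead of checking full adjacency.
        return int((depth_map > DEPTH_THRESHOLD_MM).sum()) > MIN_ANOMALY_PIXELS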
Referring to fig. 18, in some embodiments, the third image includes a plurality of frames, and the step 063 of processing the third image to determine whether the depth anomaly information exists in the third image includes:
0635: processing the multiple frames of third images to judge whether depth abnormal information exists in the multiple frames of third images;
the method for detecting the breakage of the optical element further comprises:
0641: comparing whether the position information of the abnormal depth information in any two frames of third images is consistent; and
0642: and confirming that the structured light optical element is broken when the position information of the depth anomaly information in any two frames of third images is consistent.
Referring to fig. 19, in some embodiments, the detection device 10 further includes a third comparing module 34. Step 0635 may be implemented by the third processing module 33, step 0641 may be implemented by the third comparing module 34, and step 0642 may be implemented by the third confirmation module 35. That is, the third processing module 33 may be configured to process the multiple frames of the third image to determine whether depth anomaly information exists in each of the multiple frames of the third image. The third comparing module 34 may be used to compare whether the position information of the depth anomaly information in any two frames of the third image is consistent. The third confirmation module 35 may be configured to confirm that the structured light optical element is broken when the position information of the depth anomaly information in any two frames of the third image is consistent.
Referring back to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of processing the multiple frames of the third image to determine whether depth anomaly information exists in each of the multiple frames of the third image, comparing whether the position information of the depth anomaly information in any two frames of the third image is consistent, and confirming that the structured light optical element is broken when the position information of the depth anomaly information in any two frames of the third image is consistent.
Specifically, upon confirming that the camera optical element 411 of the infrared camera 41 is not broken, the structured light projector 43 and the infrared camera 41 are turned on at the same time. After the infrared camera 41 captures the infrared laser light projected by the structured light projector 43 to obtain multiple frames of the third image, the processor 52 detects each frame to determine whether depth anomaly information exists in each frame of the third image. Upon detecting that depth anomaly information is present in every frame of the third image, the processor 52 extracts its position information in the corresponding third image from each piece of identified depth anomaly information. Subsequently, the processor 52 compares the position information of the depth anomaly information in the multiple frames of the third image two by two, and when every comparison shows that the position information in the corresponding two frames is consistent, the structured light optical element is considered broken. Here, the position information of the depth anomaly information in two frames of the third image being consistent means that the difference in distance between the position information in the two frames is smaller than a preset distance difference. It can be understood that when the structured light optical element of the structured light projector 43 is broken, the crack in the structured light optical element is fixed and the distance between the structured light projector 43 and the infrared camera 41 is also fixed; therefore, the position information of the depth anomaly information that the processor 52 calculates from each frame of the third image captured by the infrared camera 41 remains constant. Whether the structured light optical element of the structured light projector 43 is broken can thus be further determined from the position information of the depth anomaly information, which improves the accuracy of detecting breakage of the structured light optical element.
In some embodiments, when it is determined that the camera optical element 411 of the infrared camera 41 is not broken, the infrared fill-in lamp 42 and the infrared camera 41 may be turned on at the same time. Whether the fill-in optical element of the infrared fill-in lamp 42 is broken is then determined according to whether a light pattern exists in each of the multiple frames of second images captured by the infrared camera 41 and whether the positions of the light patterns in any two frames of the second image are consistent. When the fill-in optical element is not broken, the structured light projector 43, the infrared fill-in lamp 42, and the infrared camera 41 are turned on at the same time, and whether the structured light optical element of the structured light projector 43 is broken is determined according to whether depth anomaly information exists in each of the multiple frames of third images captured by the infrared camera 41 and whether the position information corresponding to the depth anomaly information in any two frames of the third image is consistent. In this way, when the fill-in optical element is not broken, the infrared fill-in lamp 42 can perform its normal fill-light function, so the third image acquired by the infrared camera 41 is brighter, which facilitates the calculation of depth information and further improves the accuracy of detecting breakage of the structured light optical element.
Referring to fig. 20, in some embodiments, the method for detecting the breakage of the optical element further includes:
001: acquiring the movement speed of the camera module;
002: and (5) judging whether the movement speed is greater than a preset speed value or not, and entering the step (01) to acquire a first image when the movement speed is greater than the preset movement speed value.
Referring to fig. 21, in some embodiments, the detection device 10 further includes a fourth acquisition module 16 and a fourth processing module 17. Step 001 may be implemented by the fourth acquisition module 16 and step 002 may be implemented by the fourth processing module 17. That is, the fourth acquisition module 16 may be configured to acquire the movement speed of the camera module, and the fourth processing module 17 may be configured to determine whether the movement speed is greater than the preset speed value and to proceed to step 01 to acquire the first image when the movement speed is greater than the preset speed value.
Referring to fig. 3, in some embodiments, the computer readable instructions 511, when executed by the processor 52, cause the processor 52 to further perform the operations of acquiring the movement speed of the camera module, determining whether the movement speed is greater than the preset speed value, and proceeding to step 01 to acquire the first image when the movement speed is greater than the preset speed value.
The movement speed of the camera module can be detected with a speed sensor. When the movement speed of the camera module is high, the computer device 100 may have been dropped. At this time the infrared camera 41 is turned on to detect whether the camera optical element 411 of the infrared camera 41 is broken. When the camera optical element 411 is not broken, the infrared fill-in lamp 42 and the infrared camera 41 may be turned on at the same time to detect whether the fill-in optical element of the infrared fill-in lamp 42 is broken, and the structured light projector 43 and the infrared camera 41 may then be turned on at the same time to detect whether the structured light optical element of the structured light projector 43 is broken. Alternatively, when the camera optical element 411 is not broken, the structured light projector 43 and the infrared camera 41 may be turned on at the same time to detect whether the structured light optical element of the structured light projector 43 is broken, and the infrared fill-in lamp 42 and the infrared camera 41 may then be turned on at the same time to detect whether the fill-in optical element of the infrared fill-in lamp 42 is broken. Alternatively, when the camera optical element 411 is not broken, the infrared fill-in lamp 42 and the infrared camera 41 may be turned on at the same time to detect whether the fill-in optical element of the infrared fill-in lamp 42 is broken, and when the fill-in optical element is not broken, the structured light projector 43, the infrared fill-in lamp 42, and the infrared camera 41 may be turned on at the same time to detect whether the structured light optical element of the structured light projector 43 is broken. In this way, breakage detection of the optical elements is performed only when the movement speed of the computer device 100 is high, instead of every time an optical element is used, which reduces the power consumption of the computer device 100.
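One of the orderings described above can be summarized with the control-flow sketch below. The device-control callables (speed_sensor_read, turn_on, turn_off, capture_frames) and the three per-element detection routines are hypothetical placeholders for the hardware drivers and classifiers discussed earlier, and PRESET_SPEED is an illustrative value; only the sequencing follows the text.

    PRESET_SPEED = 5.0  # illustrative preset speed value, e.g. in m/s

    def run_breakage_detection(speed_sensor_read, turn_on, turn_off, capture_frames,
                               camera_element_broken, fill_light_element_broken,
                               structured_light_element_broken):
        # Only run the checks when a drop is suspected, to save power.
        if speed_sensor_read() <= PRESET_SPEED:
            return "detection skipped"

        turn_on("infrared_camera")
        if camera_element_broken(capture_frames("first_image")):
            return "camera optical element broken"

        turn_on("infrared_fill_light")
        if fill_light_element_broken(capture_frames("second_image")):
            turn_off("infrared_fill_light")  # avoid stray infrared laser at the crack
            return "fill-in optical element broken"

        # The fill light may stay on so the third image is bright enough for depth calculation.
        turn_on("structured_light_projector")
        if structured_light_element_broken(capture_frames("third_image")):
            turn_off("structured_light_projector")
            return "structured light optical element broken"
        return "optical elements intact"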
The present invention also provides one or more non-transitory computer-readable storage media containing computer-executable instructions. The computer-executable instructions, when executed by the one or more processors 52, cause the processors 52 to perform the method for detecting breakage of an optical element described in any of the above embodiments.
For example, the computer-executable instructions, when executed by the one or more processors 52, cause the processors 52 to perform the operations of:
01: acquiring a first image;
02: processing the first image to determine whether the first image has a crack pattern; and
04: the camera optical element 411 is confirmed to be broken when the first image has a crack pattern.
As another example, the computer-executable instructions, when executed by the one or more processors 52, cause the processors 52 to perform the operations of:
021: constructing a multi-stage crack pattern classifier based on the Haar-like rectangular feature set;
022: training a multi-stage crack pattern classifier by adopting a positive crack pattern sample and a negative crack pattern sample; and
023: and detecting whether the first image has a crack pattern or not by adopting the trained multistage crack pattern classifier.
FIG. 22 is a schematic diagram of the internal modules of the computer device 100 in one embodiment. As shown in fig. 22, the computer device 100 includes a processor 52, a memory 51 (e.g., a non-volatile storage medium), an internal memory 54, a display 55, and an input device 56, which are connected by a system bus 53. The memory 51 of the computer device 100 stores an operating system and computer-readable instructions 511 (shown in FIG. 3). The computer-readable instructions 511 can be executed by the processor 52 to implement the method for detecting breakage of an optical element of any of the above embodiments. The processor 52 may be used to provide computing and control capabilities to support the operation of the entire computer device 100. The internal memory 54 of the computer device 100 provides an environment in which the computer-readable instructions 511 in the memory 51 run. The display 55 of the computer device 100 may be a liquid crystal display or an electronic ink display, and the input device 56 may be a touch layer covering the display 55, a button, a trackball, or a touch pad provided on the housing of the computer device 100, or an external keyboard, touch pad, or mouse. The computer device 100 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (e.g., a smart bracelet, a smart watch, a smart helmet, or smart glasses). It will be understood by those skilled in the art that the configuration shown in fig. 22 is only a schematic diagram of the part of the configuration related to the solution of the present invention and does not limit the computer device 100 to which the solution of the present invention is applied; a specific computer device 100 may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
Referring to FIG. 23, a computer device 100 according to an embodiment of the invention includes an image processing circuit 80. The image processing circuit 80 may be implemented using hardware and/or software components and may include various processing units that define an ISP (Image Signal Processing) pipeline. FIG. 23 is a diagram of the image processing circuit 80 in one embodiment; for convenience of explanation, only the aspects of the image processing technique related to the embodiment of the present invention are shown.
As shown in FIG. 23, the image processing circuit includes an ISP processor 82 (which may be the processor 52 or a part of the processor 52) and control logic 83. The image data captured by the infrared camera 41 is first processed by the ISP processor 82, which analyzes the image data to capture image statistics that may be used to determine one or more control parameters of the infrared camera 41. The infrared camera 41 may include a camera optical element 411 and an image sensor 412, the camera optical element 411 including one or more lenses. The image sensor 412 may acquire the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by the ISP processor 82. The sensor 81 (e.g., a gyroscope) may provide parameters for image processing (e.g., anti-shake parameters) to the ISP processor 82 based on the sensor interface type. The sensor interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 412 may also send raw image data to the sensor 81; the sensor 81 may provide the raw image data to the ISP processor 82 based on the sensor interface type, or may store the raw image data in the memory 51.
The ISP processor 82 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 82 may perform one or more image processing operations on the raw image data and gather statistical information about the image data. The image processing operations may be performed with the same or different bit depth precision.
The ISP processor 82 may also receive image data from the memory 51. For example, the sensor interface sends raw image data to the memory 51, and the raw image data in the memory 51 is then provided to the ISP processor 82 for processing.
Upon receiving raw image data from the image sensor interface, from the sensor 81 interface, or from the memory 51, the ISP processor 82 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 62 for additional processing before being displayed. The ISP processor 82 receives the data to be processed from the memory 51 and performs image data processing on it. The image data processed by the ISP processor 82 may be output to the display screen 55 for viewing by a user and/or further processed by a graphics processing unit (GPU). In addition, the output of the ISP processor 82 may also be sent to the image memory 62, and the display screen 55 may read the image data from the memory 51. In one embodiment, the memory 51 may be configured to implement one or more frame buffers. The output of the ISP processor 82 may also be sent to an encoder/decoder 84 for encoding/decoding the image data; the encoded image data may be saved and decoded before being displayed on the display screen 55. The encoder/decoder 84 may be implemented by a CPU, a GPU, or a coprocessor.
The statistical data determined by the ISP processor 82 may be sent to the control logic unit 83. For example, the statistical data may include image sensor statistics such as auto-exposure, auto-focus, flicker detection, black level compensation, lens shading correction, and the like. Control logic 83 may include a processing element and/or microcontroller that executes one or more routines (e.g., firmware) that determine camera control parameters and ISP processor 82 control parameters based on the received statistical data. For example, the control parameters of the infrared camera 41 may include sensor 81 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters.
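As a purely illustrative example of the kind of routine such control logic might run, the following sketch turns a single statistic (mean luminance) into an exposure control value using a simple proportional rule; the target level and exposure limits are assumptions, not values disclosed in this document.

```python
def update_exposure(current_exposure_us, mean_luminance,
                    target=118, min_us=100, max_us=33000):
    """Proportional auto-exposure step driven by a mean-luminance statistic."""
    if mean_luminance <= 0:
        return max_us  # scene is black; expose as long as allowed
    corrected = current_exposure_us * (target / mean_luminance)
    return int(min(max(corrected, min_us), max_us))
```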
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which can be stored in a non-volatile computer readable storage medium, and when executed, can include the processes of the above embodiments of the methods. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (22)

1. A method for detecting breakage of an optical element, the method being used in a computer device, the computer device comprising a camera module, the optical element comprising a camera optical element of the camera module, the camera optical element being located within the camera module during detection of the camera optical element using the detection method; the detection method comprises the following steps:
acquiring the movement speed of the camera module;
when the movement speed is greater than a preset movement speed value, the camera module acquires a first image;
processing the first image to determine whether a crack pattern is present in the first image; and
confirming that the camera optical element is broken when a crack pattern is present in the first image, wherein the confirming that the camera optical element is broken occurs during use of the computer device.
2. The detection method according to claim 1, wherein the camera module is an infrared camera, and the first image is formed by the infrared camera capturing infrared light in the ambient light reflected by an object.
3. The detection method according to claim 1, wherein the step of processing the first image to determine whether the first image has a crack pattern comprises:
constructing a multi-stage crack pattern classifier based on the Haar-like rectangular feature set;
training the multi-stage crack pattern classifier by using positive crack pattern samples and negative crack pattern samples; and
detecting whether the first image has a crack pattern by using the trained multi-stage crack pattern classifier.
4. The detection method according to claim 1, wherein the first image comprises a plurality of frames, and wherein the plurality of frames of the first image are processed to determine whether crack patterns exist in all of the plurality of frames of the first image, and when crack patterns exist in all of the plurality of frames of the first image, the detection method further comprises:
comparing whether the crack positions of the crack patterns in any two frames of the first image are consistent; and
confirming that the camera optical element is broken when the crack positions in any two frames of the first image are consistent.
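Purely for illustration and not as part of the claims, the multi-frame check of claim 4 could be sketched as follows. The bounding-box format and the pixel tolerance are assumptions, since the claim does not define when two crack positions count as consistent; the intuition is that a crack on the lens keeps the same image position across frames while scene content moves.

```python
def crack_confirmed(per_frame_boxes, tolerance_px=10):
    # per_frame_boxes: one list of (x, y, w, h) crack detections per frame.
    if not per_frame_boxes or any(len(boxes) == 0 for boxes in per_frame_boxes):
        return False  # at least one frame shows no crack pattern
    ref_x, ref_y, _, _ = per_frame_boxes[0][0]
    for boxes in per_frame_boxes[1:]:
        x, y, _, _ = boxes[0]
        if abs(x - ref_x) > tolerance_px or abs(y - ref_y) > tolerance_px:
            return False  # the pattern moved, so it is probably scene content
    return True  # same position in every frame: confirm the element is broken
```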
5. The detection method according to any one of claims 2 to 4, wherein the camera module is an infrared camera, the optical element further includes a light supplement optical element of an infrared light supplement lamp, and the detection method further includes:
turning on the infrared light supplement lamp and the infrared camera when it is confirmed that the camera optical element is not broken;
acquiring a second image according to the infrared light emitted by the infrared light supplement lamp;
processing the second image to determine whether a light pattern exists in the second image; and
confirming that the light supplement optical element is broken when the second image has the light pattern.
6. The detection method of claim 5, wherein the step of processing the second image to determine whether the second image has a light pattern comprises:
constructing a multi-stage light pattern classifier based on a Haar-like rectangular feature set;
training the multi-stage light pattern classifier by using positive light pattern samples and negative light pattern samples; and
detecting whether the second image has the light pattern by using the trained multi-stage light pattern classifier.
7. The detection method according to claim 6, wherein the second image comprises a plurality of frames, and wherein the plurality of frames of the second image are processed to determine whether the plurality of frames of the second image all have the light pattern, and when the plurality of frames of the second image all have the light pattern, the detection method further comprises:
comparing whether the positions of the light patterns in any two frames of the second image are consistent; and
confirming that the light supplement optical element is broken when the positions of the light patterns in any two frames of the second image are consistent.
8. The detection method according to any one of claims 2 to 4, wherein the camera module is an infrared camera, the optical element further comprises a structured light optical element of a structured light projector, and the detection method further comprises:
turning on the structured light projector and the infrared camera upon confirming that the camera optical element is not broken;
acquiring a third image according to the infrared light emitted by the structured light projector;
processing the third image to determine whether depth anomaly information exists in the third image; and
confirming that the structured light optical element is broken when depth anomaly information exists in the third image.
9. The detection method according to claim 8, wherein the step of processing the third image to determine whether depth anomaly information exists in the third image comprises:
calculating the scatter offset of each pixel point in the third image according to the third image and a preset reference image;
calculating the depth information of each pixel point in the third image according to the scatter offset;
calculating the number of adjacent pixel points of which the depth information is greater than a preset threshold; and
determining that depth anomaly information exists in the third image when the number is greater than a preset number.
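Purely for illustration and not as part of the claims, the computation of claim 9 could be sketched as follows, assuming the per-pixel scatter (speckle) offset against the preset reference image has already been estimated, for example by block matching. The triangulation constants, the thresholds, and the use of 4-connected regions to approximate "adjacent pixel points" are all assumptions for illustration.

```python
import numpy as np
from scipy import ndimage  # assumed available; used only to group adjacent pixels


def depth_anomaly_present(offset_px, baseline_mm=30.0, focal_px=500.0,
                          depth_threshold_mm=2000.0, count_threshold=500):
    # Depth from triangulation: Z = baseline * focal_length / |offset|.
    disparity = np.maximum(np.abs(offset_px), 1e-6)  # avoid division by zero
    depth_mm = baseline_mm * focal_px / disparity

    # Pixels whose depth information exceeds the preset threshold.
    mask = depth_mm > depth_threshold_mm

    # Count adjacent (4-connected) flagged pixels; report an anomaly if the
    # largest connected group exceeds the preset number.
    labels, n_regions = ndimage.label(mask)
    if n_regions == 0:
        return False
    largest = np.bincount(labels.ravel())[1:].max()
    return largest > count_threshold
```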
10. The detection method according to claim 8, wherein the third image comprises a plurality of frames, a plurality of frames of the third image are processed to determine whether the depth anomaly information exists in the plurality of frames of the third image, and when the depth anomaly information exists in all of the plurality of frames of the third image, the detection method further comprises:
comparing whether the position information of the depth anomaly information in any two frames of the third image is consistent; and
confirming that the structured light optical element is broken when the position information of the depth anomaly information in any two frames of the third image is consistent.
11. A device for detecting breakage of an optical element, wherein the optical element comprises a camera optical element of a camera module in a computer device, and the camera optical element is located within the camera module during detection of the camera optical element using the detection device; the detection device comprises:
a fourth acquisition module for acquiring the movement speed of the camera module;
when the movement speed is greater than a preset speed value, the camera module acquires a first image;
a first processing module for processing the first image to determine whether a crack pattern exists in the first image; and
a first confirmation module for confirming that the camera optical element is broken when a crack pattern is present in the first image, wherein the confirming that the camera optical element is broken occurs during use of the computer device.
12. The detection device according to claim 11, wherein the camera module is an infrared camera, and the first image is formed by the infrared camera capturing infrared light in the ambient light reflected by an object.
13. The detection apparatus according to claim 11, wherein the first processing module comprises:
a first construction unit for constructing a multi-stage crack pattern classifier based on a Haar-like rectangular feature set;
a first training unit for training the multi-stage crack pattern classifier using positive crack pattern samples and negative crack pattern samples; and
a first detection unit, configured to detect whether a crack pattern exists in the first image by using the trained multi-stage crack pattern classifier.
14. The apparatus according to claim 11, wherein the first image comprises a plurality of frames, and wherein the plurality of frames of the first image are processed to determine whether a crack pattern exists in each of the plurality of frames of the first image, and wherein when a crack pattern exists in each of the plurality of frames of the first image, the apparatus further comprises:
a first comparison module for comparing whether the crack positions of the crack patterns in any two frames of the first image are consistent; and
the first confirmation module is further used for confirming that the camera optical element is broken when the crack positions in any two frames of the first image are consistent.
15. The detection device according to any one of claims 12 to 14, wherein the camera module is an infrared camera, the optical element further includes a light supplementing optical element of an infrared light supplement lamp, and the detection device further includes:
a first starting module for turning on the infrared light supplement lamp and the infrared camera when it is confirmed that the camera optical element is not broken;
a second acquisition module for acquiring a second image according to the infrared light emitted by the infrared light supplement lamp;
a second processing module, configured to process the second image to determine whether there is a light pattern in the second image; and
a second confirmation module for confirming that the light supplementing optical element is broken when the second image has the light pattern.
16. The detection apparatus according to claim 15, wherein the second processing module comprises:
a second construction unit for constructing a multi-stage light pattern classifier based on a Haar-like rectangular feature set;
a second training unit for training the multi-stage light pattern classifier by using positive light pattern samples and negative light pattern samples; and
a second detection unit for detecting whether the second image has the light pattern by using the trained multi-stage light pattern classifier.
17. The detecting device according to claim 16, wherein the second image comprises a plurality of frames, and wherein the plurality of frames of the second image are processed to determine whether the plurality of frames of the second image all have the light pattern, and when the plurality of frames of the second image all have the light pattern, the detecting device further comprises:
a second comparison module for comparing whether the positions of the light patterns in any two frames of the second image are consistent; and
the second confirmation module is further used for confirming that the light supplementing optical element is broken when the positions of the light patterns in any two frames of the second image are consistent.
18. The detection device of any one of claims 12 to 14, wherein the camera module is an infrared camera, the optical element further comprises a structured light optical element of a structured light projector, and the detection device further comprises:
a second enabling module for turning on the structured light projector and the infrared camera when it is confirmed that the camera optical element is not broken;
a third acquisition module for acquiring a third image according to the infrared light emitted by the structured light projector;
a third processing module, configured to process the third image to determine whether depth anomaly information exists in the third image; and
a third confirmation module for confirming that the structured light optical element is broken when depth anomaly information exists in the third image.
19. The detection apparatus according to claim 18, wherein the third processing module comprises:
a first calculating unit for calculating the scatter offset of each pixel point in the third image according to the third image and a preset reference image;
a second calculating unit for calculating the depth information of each pixel point in the third image according to the scatter offset;
a third calculating unit for calculating the number of adjacent pixel points whose depth information is greater than a preset threshold; and
a confirming unit for determining that depth anomaly information exists in the third image when the number is greater than a preset number.
20. The apparatus according to claim 18, wherein the third image comprises a plurality of frames, and wherein a plurality of frames of the third image are processed to determine whether the depth anomaly information exists in the plurality of frames of the third image, and when the depth anomaly information exists in all of the plurality of frames of the third image, the apparatus further comprises:
a third comparison module for comparing whether the position information of the depth anomaly information in any two frames of the third image is consistent; and
the third confirmation module is further used for confirming that the structured light optical element is broken when the position information of the depth anomaly information in any two frames of the third image is consistent.
21. A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the detection method of any one of claims 1 to 10.
22. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the detection method of any one of claims 1 to 10.
CN201711465117.7A 2017-12-28 2017-12-28 Detection method, detection device, computer equipment and computer readable storage medium Active CN108198175B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711465117.7A CN108198175B (en) 2017-12-28 2017-12-28 Detection method, detection device, computer equipment and computer readable storage medium
PCT/CN2018/123585 WO2019129004A1 (en) 2017-12-28 2018-12-25 Detection method, detection device, computer device, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711465117.7A CN108198175B (en) 2017-12-28 2017-12-28 Detection method, detection device, computer equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108198175A CN108198175A (en) 2018-06-22
CN108198175B true CN108198175B (en) 2021-09-10

Family

ID=62585450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711465117.7A Active CN108198175B (en) 2017-12-28 2017-12-28 Detection method, detection device, computer equipment and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN108198175B (en)
WO (1) WO2019129004A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108989710B (en) * 2018-07-16 2020-09-04 维沃移动通信有限公司 Infrared supplementary lighting module failure detection method, terminal device and computer readable storage medium
CN109063761B (en) * 2018-07-20 2020-11-03 北京旷视科技有限公司 Diffuser falling detection method and device and electronic equipment
CN110769246B (en) * 2019-09-06 2023-04-11 华为技术有限公司 Method and device for detecting faults of monitoring equipment
CN112419228B (en) * 2020-10-14 2022-04-05 高视科技(苏州)有限公司 Method and device for detecting three-dimensional edge defect of cover plate
CN112991264B (en) * 2021-02-05 2023-08-11 西安理工大学 Method for detecting crack defect of monocrystalline silicon photovoltaic cell

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102025901A (en) * 2009-09-23 2011-04-20 国基电子(上海)有限公司 Camera module and detection method thereof
CN101819163A (en) * 2010-06-03 2010-09-01 成都精密光学工程研究中心 Detection device of subsurface defect of optical element and method thereof
US8766192B2 (en) * 2010-11-01 2014-07-01 Asm Assembly Automation Ltd Method for inspecting a photovoltaic substrate
KR20150023907A (en) * 2012-06-28 2015-03-05 펠리칸 이매징 코포레이션 Systems and methods for detecting defective camera arrays, optic arrays, and sensors
CN203133000U (en) * 2013-03-15 2013-08-14 安徽工程大学 Image-based glass defect online detection device
CN103399021B (en) * 2013-08-15 2015-11-04 厦门大学 The detection method of the sub-surface crack of a kind of transparent optical element
CN104132944B (en) * 2014-07-11 2017-02-15 西安交通大学 Method for detecting subsurface damage degree characterization parameters of spherical optical element
CN104777172B (en) * 2015-04-30 2018-10-09 重庆理工大学 A kind of quick, intelligent detection device of optical lens substandard products and method

Also Published As

Publication number Publication date
CN108198175A (en) 2018-06-22
WO2019129004A1 (en) 2019-07-04

Similar Documents

Publication Publication Date Title
CN108198175B (en) Detection method, detection device, computer equipment and computer readable storage medium
CN102147856B (en) Image recognition apparatus and its control method
CN108716983B (en) Optical element detection method and device, electronic equipment, storage medium
KR102270674B1 (en) Biometric camera
US9152850B2 (en) Authentication apparatus, authentication method, and program
US9953428B2 (en) Digital camera unit with simultaneous structured and unstructured illumination
CN108716982B (en) Optical element detection method, optical element detection device, electronic equipment and storage medium
US9058519B2 (en) System and method for passive live person verification using real-time eye reflection
TWI696391B (en) Projector, detection method and detection device thereof, image capturing device, electronic device, and computer readable storage medium
CN109327626B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
US10891479B2 (en) Image processing method and system for iris recognition
CN107563979B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN110121031B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN108600740A (en) Optical element detection method, device, electronic equipment and storage medium
CN109683698B (en) Payment verification method and device, electronic equipment and computer-readable storage medium
US10121067B2 (en) Image processing apparatus that determines processing target area of an image based on degree of saliency, image processing method, and storage medium
US10163009B2 (en) Apparatus and method for recognizing iris
CN104090656A (en) Eyesight protecting method and system for smart device
CN109068060B (en) Image processing method and device, terminal device and computer readable storage medium
CN107563329B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN107454335B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN108760245B (en) Optical element detection method and device, electronic equipment, readable storage medium storing program for executing
US20170185839A1 (en) Apparatus and method for acquiring iris image outdoors and indoors
CN108259769B (en) Image processing method, image processing device, storage medium and electronic equipment
CN108875545B (en) Method, device and system for determining light state of face image and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860
Applicant after: OPPO Guangdong Mobile Communications Co.,Ltd.
Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860
Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.
GR01 Patent grant