WO2022016797A1 - Optical information detection method, apparatus, and device - Google Patents

Optical information detection method, apparatus, and device

Info

Publication number
WO2022016797A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
field
full
target
feature
Prior art date
Application number
PCT/CN2020/138123
Other languages
English (en)
French (fr)
Inventor
杨鹏
李文健
王兆民
Original Assignee
奥比中光科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 奥比中光科技集团股份有限公司 filed Critical 奥比中光科技集团股份有限公司
Publication of WO2022016797A1 publication Critical patent/WO2022016797A1/zh
Priority to US17/732,773 priority Critical patent/US20220254067A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/0008Industrial image inspection checking presence/absence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures

Definitions

  • the present application belongs to the field of optical technology, and in particular relates to an optical information detection method, device and equipment.
  • Depth measurement systems based on structured light or time-of-flight (TOF) technology include a transmitter and a receiver, and the optical information of the transmitter and receiver needs to be detected before the system is used, for example, the relative rotation angle information between the diffractive optical element and the light source in the transmitter, and the optical-axis deflection angle information between the transmitter and the receiver.
  • the existing optical information detection method uses grayscale similarity to calculate image features, from which the optical parameter information is calculated.
  • when image features are calculated using grayscale similarity, the zero-order projection pattern needs to be globally unique.
  • however, when the projection patterns are regularly arranged, the formed zero-order projection pattern is not unique, and the optical information cannot be accurately calculated in this case.
  • Embodiments of the present application provide an optical information detection method, apparatus, and device, which can solve the problem that optical information cannot be accurately calculated when projection patterns are regularly arranged.
  • an embodiment of the present application provides an optical information detection method, including:
  • acquiring a first image collected by a first imaging module, where the first image includes a target full-field projection pattern;
  • performing feature extraction on the first image to obtain first feature information; acquiring second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information; calculating a first mapping relationship between the first feature information and the second feature information; mapping the first graphic information to the target full-field projection pattern according to the first mapping relationship to obtain second graphic information corresponding to the target full-field projection pattern; and, according to the second graphic information, target optical information is calculated.
  • the first feature information is the target full-field speckle points in the target full-field projection pattern
  • the second feature information is the reference full-field speckle points in the reference full-field projection pattern
  • the calculating the first mapping relationship between the first feature information and the second feature information includes: calculating an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points; and then,
  • a mapping relationship between the target full-field speckle points and the reference full-field speckle points is determined according to the preset local speckle correspondence and the initial mapping relationship.
  • the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.
  • the projection assembly for projecting the target full-field projection pattern includes a light source and a diffractive optical element, and the light source is an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs);
  • the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.
  • the first feature information includes one or more of an internal feature of the target full-field speckle, a corner point feature of the target full-field speckle, and an edge curve feature of the target full-field speckle.
  • the second feature information includes one or more of an internal feature of the reference full-field speckle, a corner point feature of the reference full-field speckle, and an edge curve feature of the reference full-field speckle.
  • the target optical information is the included angle between the optical axes of the second imaging module and the projection assembly
  • the calculating target optical information according to the second graphic information includes: acquiring a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module; calculating a second mapping relationship between the two calibration patterns; mapping the second graphic information to the second calibration pattern to obtain third graphic information; and, according to the third graphic information,
  • the included angle between the optical axes of the second imaging module and the projection assembly is calculated.
  • an embodiment of the present application provides an optical information detection apparatus, including:
  • a first acquisition unit configured to acquire a first image collected by a first imaging module, where the first image includes a target full-field projection pattern
  • an extraction unit configured to perform feature extraction on the first image to obtain first feature information
  • a second acquiring unit configured to acquire second feature information and first graphic information of the preset reference full-field projection pattern, where the graphic information includes zero-order information and/or secondary information;
  • a first calculation unit configured to calculate a first mapping relationship between the first feature information and the second feature information
  • a first processing unit configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, and obtain second graphic information corresponding to the target full-field projection pattern;
  • the second calculation unit is configured to calculate target optical information according to the second graphic information.
  • the first feature information is the target full-field speckle points in the target full-field projection pattern
  • the second feature information is the reference full-field speckle points in the reference full-field projection pattern
  • the first calculation unit is specifically configured to: calculate an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points; and then,
  • a mapping relationship between the target full-field speckle points and the reference full-field speckle points is determined according to the preset local speckle correspondence and the initial mapping relationship.
  • the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.
  • the projection assembly for projecting the target full-field projection pattern includes a light source and a diffractive optical element, and the light source is an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs);
  • the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.
  • the first feature information includes one or more of an internal feature of the target full-field speckle, a corner point feature of the target full-field speckle, and an edge curve feature of the target full-field speckle.
  • the second feature information includes one or more of an internal feature of the reference full-field speckle, a corner point feature of the reference full-field speckle, and an edge curve feature of the reference full-field speckle.
  • the target optical information is the included angle between the optical axes of the second imaging module and the projection assembly
  • the second calculation unit is specifically configured to: acquire the first and second calibration patterns, calculate the second mapping relationship between them, and map the second graphic information to the second calibration pattern to obtain the third graphic information, from which
  • the included angle between the optical axes of the second imaging module and the projection assembly is calculated.
  • an embodiment of the present application provides an optical information detection device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the optical information detection method described in the first aspect above.
  • an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program, when executed by a processor, implements the optical information detection method described in the first aspect above.
  • the first image collected by the first imaging module is acquired, the first image including the target full-field projection pattern; feature extraction is performed on the first image to obtain first feature information; second feature information and first graphic information of the preset reference full-field projection pattern are acquired, the graphic information including zero-order information and/or secondary information; a first mapping relationship between the first feature information and the second feature information is calculated; the first graphic information is mapped to the target full-field projection pattern according to the first mapping relationship to obtain second graphic information corresponding to the target full-field projection pattern; and target optical information is calculated according to the second graphic information.
  • the above method is compatible with both regularly and irregularly arranged projection patterns projected by the projection assembly; that is, in the speckle pattern projected by the projection assembly, the optical information can be accurately detected regardless of whether the zero-order speckle pattern is globally unique.
  • FIG. 1 is a schematic diagram of an optical information detection system provided in a first embodiment of the present application
  • FIG. 2 is a schematic flowchart of an optical information detection method provided in a second embodiment of the present application.
  • FIG. 3 is a schematic diagram of a target full-field projection pattern in an optical information detection method provided by a second embodiment of the present application.
  • FIG. 4 is a schematic diagram of a target full-field projection pattern in an optical information detection method provided by a second embodiment of the present application.
  • FIG. 5 is a schematic flowchart of the refinement of S104 in an optical information detection method provided by the second embodiment of the present application.
  • FIG. 6 is a schematic flowchart of the refinement of S106 in an optical information detection method provided by the second embodiment of the present application.
  • FIG. 7 is a schematic diagram of an optical information detection device provided by a third embodiment of the present application.
  • FIG. 8 is a schematic diagram of an optical information detection device provided by a fourth embodiment of the present application.
  • the term "if" may be contextually interpreted as "when" or "once" or "in response to determining" or "in response to detecting".
  • the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined" or "in response to the determination" or "once [the described condition or event] is detected" or "in response to detection of [the described condition or event]".
  • references in this specification to "one embodiment” or “some embodiments” and the like mean that a particular feature, structure or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in other embodiments", etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • the terms "comprising", "including", "having" and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • FIG. 1 is a schematic diagram of an optical information detection system provided by the first embodiment of the present application.
  • the optical information detection system includes a projection assembly, a projection screen, a first imaging module, and a device with an optical information detection function that is communicatively connected to the projection assembly and the first imaging module, respectively.
  • the device with the optical information detection function may be a server, a processor, or the like.
  • the projection assembly is used for projecting a projection pattern onto the projection screen; the first imaging module is used to collect a first image projected onto the projection screen, the first image including the target full-field projection pattern.
  • FIG. 2 is a schematic flowchart of an optical information detection method provided by a second embodiment of the present application.
  • the execution body of an optical information detection method in this embodiment is a device having an optical information detection function, for example, a server, a processor, and the like.
  • the optical information detection method shown in FIG. 2 may include:
  • S101 Acquire a first image collected by a first imaging module, where the first image includes a target full-field projection pattern.
  • the device acquires a first image collected by the first imaging module, where the first image includes a full-field projection pattern of the target.
  • the projection component projects the projection pattern onto the projection screen, and the first imaging module collects the projection pattern on the projection screen.
  • the image collected by the first imaging module is the first image, and the projection pattern included in the first image is the target full-field projection pattern.
  • the projection assembly and the first imaging module may be disposed on two sides of the projection screen, respectively, or may be disposed on the same side of the projection screen.
  • the projection screen includes a glass plate, paper, and a polycarbonate (PC) plate arranged in sequence along the direction of the beam emitted by the projection assembly.
  • so that the first imaging module can collect a clear projection pattern, the transmittance of the above-mentioned glass plate, paper, and PC plate should be no less than 90%.
  • in order to collect the full-field first image, the first imaging module can use a wide-angle imaging module, or the distance between the first imaging module and the projection screen can be appropriately lengthened, so as to prevent the field of view of the first imaging module from being too large and degrading the quality of the collected first image.
  • the projection assembly includes a light source and a diffractive optical element DOE, the DOE is arranged on the light exit path of the light source, and the diffractive optical element is used to project the light generated by the light source to the projection screen to form a projection pattern.
  • the light source may include an edge emitting laser, a vertical cavity surface emitting laser (VCSEL), an array of VCSELs, or an LED.
  • the VCSEL array can be arranged regularly or irregularly.
  • the light generated by the light source can be visible light, infrared light, ultraviolet light, invisible light, etc.
  • the light source also supports coded projection schemes composed of different patterns, such as speckle, block, cross, stripe, and specific-symbol patterns. It can be understood that the wavelength of the light that can be collected by the first imaging module should be the same as the wavelength of the light projected by the projection assembly.
  • when the projection pattern projected by the projection assembly is an infrared speckle pattern, the first imaging module is correspondingly an infrared camera.
  • S102 Perform feature extraction on the first image to obtain first feature information.
  • the device performs feature extraction on the collected first image, extracts feature information of the target full-field projection pattern in the first image, and obtains the first feature information.
  • the first feature information may include one of internal features of the target full-field speckle, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle or more.
  • FIG. 3 and FIG. 4 are schematic diagrams of target full-field projection patterns, also illustrating the internal features of the full-field speckle points, the corner point features of the full-field speckle, and the edge curve features of the full-field speckle. It can be understood that the diagrams in FIG. 3 and FIG. 4 are only two forms of the target full-field projection pattern, and are not limitations on the target full-field projection pattern.
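The following is a minimal illustrative sketch (an editor's assumption, not code from the patent) of how the three feature types named above could be extracted from a grayscale speckle image with OpenCV; the thresholds and function choices are placeholders.

```python
import cv2
import numpy as np

def extract_speckle_features(first_image):
    """Return speckle centroids, corner points, and the outer edge curve
    of a grayscale full-field speckle image (uint8)."""
    # Binarize so every bright speckle becomes a connected component.
    _, binary = cv2.threshold(first_image, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Internal features: centroids of the individual speckle blobs.
    _, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    speckle_points = centroids[1:]                 # drop background label 0

    # Corner point features of the overall pattern.
    corners = cv2.goodFeaturesToTrack(first_image, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)

    # Edge curve features: outer contour of the full-field region.
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE,
                              np.ones((15, 15), np.uint8))
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    edge_curve = max(contours, key=cv2.contourArea)
    return speckle_points, corners, edge_curve
```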
  • S103 Acquire second feature information and first graphic information of a preset reference full-field projection pattern, where the graphic information includes zero-order information and/or secondary information.
  • the second feature information of the preset reference full-field projection pattern is stored in the device, where the preset reference full-field projection pattern is a pre-collected projection pattern projected by the projection assembly onto the projection screen.
  • the second feature information may include one or more of an internal feature of the reference full-field speckle, a corner point feature of the reference full-field speckle, and an edge curve feature of the reference full-field speckle.
  • for the internal features of the reference full-field speckle points, the corner point features of the reference full-field speckle, and the edge curve features of the reference full-field speckle, reference may be made to the related descriptions of the internal features, corner point features, and edge curve features of the target full-field speckle in S102, which are not repeated here.
  • the device acquires first graphic information, where the graphic information includes zero-order information and/or secondary information.
  • the zero-order information may include zero-order dots, zero-order speckle patterns, and the like; the secondary information may include coordinates of speckles, grayscale information, and the like.
  • S104 Calculate a first mapping relationship between the first feature information and the second feature information.
  • the device calculates a first mapping relationship between the first feature information and the second feature information according to the first feature information and the second feature information.
  • the first mapping relationship also substantially represents the relationship between the preset reference full-field projection pattern and the target full-field projection pattern.
  • the first feature information is the target full-field speckle points in the target full-field projection pattern
  • the second feature information is the reference full-field speckle points in the reference full-field projection pattern
  • S104 may include S1041-S1042; as shown in FIG. 5, S1041-S1042 are as follows:
  • S1041 Calculate an initial mapping relationship between the target full-field speckle and the reference full-field speckle.
  • the first feature information is the target full-field speckle points in the target full-field projection pattern
  • the second feature information is the reference full-field speckle points in the reference full-field projection pattern.
  • the device calculates the initial mapping relationship between the target full-field speckle and the reference full-field speckle according to the target full-field speckle and the reference full-field speckle.
  • for example, the initial mapping relationship may be calculated using the homography between the reference full-field speckle points and the target full-field speckle points; when calculated according to the edge curve features of the full-field speckle, the initial mapping relationship may be calculated using the homography between the edge curves in the reference full-field projection pattern and those in the target full-field projection pattern, as sketched below.
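Since the initial mapping is described as a homography between corresponding point sets, it could be estimated as follows; this is a hedged sketch using OpenCV's RANSAC-based fit, with the point matching assumed to be done already.

```python
import cv2
import numpy as np

def initial_mapping(ref_pts, tgt_pts):
    """ref_pts, tgt_pts: Nx2 arrays of corresponding speckle points
    (or sampled edge-curve points), N >= 4. Returns a 3x3 homography that
    maps reference-pattern coordinates to target-pattern coordinates."""
    H, _ = cv2.findHomography(np.asarray(ref_pts, np.float32),
                              np.asarray(tgt_pts, np.float32),
                              cv2.RANSAC, 3.0)
    return H
```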
  • S1042 Determine a mapping relationship between the target full-field speckle and the reference full-field speckle according to a preset local speckle correspondence relationship and the initial mapping relationship.
  • the preset speckle local correspondence is stored in the device, and the preset speckle local correspondence may include speckle similarity information, speckle local nearest neighbor information, and the like.
  • the device determines the mapping relationship between the target full-field speckle and the reference full-field speckle according to the preset speckle local correspondence and the initial mapping relationship.
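One plausible realization of S1042, modeling the preset local speckle correspondence as "nearest target speckle within a small radius" (the patent equally allows speckle-block similarity; the radius is an illustrative parameter):

```python
import cv2
import numpy as np
from scipy.spatial import cKDTree

def refine_mapping(H0, ref_pts, tgt_pts, radius=5.0):
    """Re-estimate the reference-to-target mapping using only speckle pairs
    that agree with a local nearest-neighbor rule under the initial H0."""
    ref_pts = np.asarray(ref_pts, np.float32)
    tgt_pts = np.asarray(tgt_pts, np.float32)

    # Warp every reference speckle with the initial mapping.
    warped = cv2.perspectiveTransform(ref_pts.reshape(-1, 1, 2), H0).reshape(-1, 2)

    # Keep a pair only if a target speckle lies within `radius` pixels.
    dist, idx = cKDTree(tgt_pts).query(warped, distance_upper_bound=radius)
    ok = np.isfinite(dist)

    # Re-estimate the homography from the accepted one-to-one pairs.
    H, _ = cv2.findHomography(ref_pts[ok], tgt_pts[idx[ok]], cv2.RANSAC, 2.0)
    return H, ok
```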
  • S105 Map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern.
  • the device maps the first graphic information to the target full-field projection pattern according to the first mapping relationship, and obtains the graphic information of the first graphic information within the target full-field projection pattern, which is the second graphic information.
  • for example, the device maps the zero-order information to the target full-field projection pattern according to the first mapping relationship, and obtains the zero-order information within the target full-field projection pattern, which is the second graphic information.
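Mapping the zero-order information through the first mapping relationship then reduces to a point transform; a sketch, assuming the first mapping is the homography H computed above:

```python
import cv2
import numpy as np

def map_graphic_info(H, ref_graphic_pts):
    """Project first graphic information (e.g. zero-order point coordinates
    in the reference pattern) into the target full-field projection pattern."""
    pts = np.asarray(ref_graphic_pts, np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```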
  • S106 Calculate target optical information according to the second graphic information.
  • the device may calculate the target optical information according to the corresponding second graphic information in the target full-field projection pattern.
  • the calculation rule can be preset in the device, and the calculation rule is determined according to the type of target optical information.
  • the target optical information is calculated according to the calculation rule.
  • the target optical information may include light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.
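As an illustration of such calculation rules, the sketch below (an assumption, not a formula given in the patent) computes a light/dark distribution as block-wise mean intensity, and a field angle from the measured extent of the pattern on the screen together with a known projection distance:

```python
import numpy as np

def light_dark_distribution(image, cell=64):
    """Mean intensity per cell x cell block of the target full-field pattern."""
    h, w = image.shape
    grid = image[:h - h % cell, :w - w % cell].reshape(
        h // cell, cell, w // cell, cell)
    return grid.mean(axis=(1, 3))

def field_angle_deg(pattern_width_mm, distance_mm):
    """Full angle subtended at the projector by a pattern of the given width
    on a screen at the given distance (pinhole geometry assumption)."""
    return float(np.degrees(2.0 * np.arctan(pattern_width_mm /
                                            (2.0 * distance_mm))))
```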
  • in one implementation, the projection assembly for projecting the target full-field projection pattern includes a light source and a diffractive optical element, the light source being an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs); the target optical information may include relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.
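For the relative deflection information, one hedged reading is to fit a similarity transform between the designed emitter layout and the matched observed speckle positions and read off its rotation; cv2.estimateAffinePartial2D is an assumed implementation choice:

```python
import cv2
import numpy as np

def relative_deflection_deg(design_pts, observed_pts):
    """design_pts: Nx2 designed emitter/speckle layout; observed_pts: Nx2
    matched observed positions (N >= 2). Returns the rotation angle of the
    best-fit similarity transform, read as the DOE-vs-source deflection."""
    M, _ = cv2.estimateAffinePartial2D(np.asarray(design_pts, np.float32),
                                       np.asarray(observed_pts, np.float32))
    # M = [[s*cos(a), -s*sin(a), tx], [s*sin(a), s*cos(a), ty]]
    return float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))
```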
  • the VCSEL array, under the control of the processor, can project different combined light dot-matrix patterns through the DOE, where the DOE performs operations such as replicating, overlapping and/or rotating the original pattern formed by the VCSEL array to form the final speckle pattern on the projection screen. Therefore, the speckle pattern and the VCSELs of the array form a one-to-one correspondence in spatial position, and the speckle pattern can then be used to detect whether an individual VCSEL in the array light source is missing.
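Given that one-to-one correspondence, a missing-VCSEL check can be sketched as follows (positions, search radius, and the mapping H are assumptions for illustration):

```python
import cv2
import numpy as np
from scipy.spatial import cKDTree

def find_missing_vcsels(H, expected_ref_pts, detected_tgt_pts, radius=4.0):
    """expected_ref_pts: Kx2 speckle positions in the reference pattern, one
    per VCSEL (exploiting the one-to-one spatial correspondence). Returns the
    indices of emitters with no detected speckle near their mapped position."""
    expected = cv2.perspectiveTransform(
        np.asarray(expected_ref_pts, np.float32).reshape(-1, 1, 2),
        H).reshape(-1, 2)
    dist, _ = cKDTree(np.asarray(detected_tgt_pts)).query(
        expected, distance_upper_bound=radius)
    return np.flatnonzero(~np.isfinite(dist))   # likely-missing VCSELs
```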
  • the target optical information is the angle between the optical axis of the second imaging module and the projection assembly
  • S106 may include S1061-S1064, as shown in FIG. 6, S1061-S1064 are as follows:
  • S1061 Acquire a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module.
  • in this embodiment, the optical information detection apparatus includes a second imaging module, and the target optical information finally to be calculated is the included angle between the optical axes of the second imaging module and the projection assembly.
  • the optical information detection system further includes a second imaging module and a calibration pattern projector.
  • the calibration pattern projector is used to project the calibration pattern on the projection screen.
  • the calibration pattern is not limited to checkerboards, dots, etc.; the purpose is to provide a sufficient number of features.
  • the imaging modules are used for collecting the above calibration pattern. It should be noted that when the above-mentioned calibration pattern projector is turned on, the projection assembly needs to be turned off. For ease of distinction, the calibration pattern collected by the first imaging module is denoted as the first calibration pattern, and the calibration pattern collected by the second imaging module is denoted as the second calibration pattern.
  • S1062 Calculate a second mapping relationship between the first calibration pattern and the second calibration pattern.
  • the device calculates a second mapping relationship between the first calibration pattern and the second calibration pattern.
  • the second mapping relationship characterizes the relationship between the first imaging module and the second imaging module.
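A sketch of S1062 under the assumption of a checkerboard calibration pattern: detect the same corner grid in both modules' images and fit a homography between them (the board size is an illustrative parameter):

```python
import cv2

def second_mapping(first_img, second_img, board=(9, 6)):
    """Fit the second mapping relationship between the two calibration
    patterns from matched checkerboard corners."""
    ok1, c1 = cv2.findChessboardCorners(first_img, board)
    ok2, c2 = cv2.findChessboardCorners(second_img, board)
    if not (ok1 and ok2):
        raise RuntimeError("calibration pattern not found in both images")
    # Corner ordering is consistent across the two detections, so the
    # point correspondence is implicit; H2 maps first-pattern pixels to
    # second-pattern pixels.
    H2, _ = cv2.findHomography(c1, c2, cv2.RANSAC, 2.0)
    return H2
```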
  • S1063 Map the second graphic information to the second calibration pattern according to the second mapping relationship to obtain third graphic information of the second calibration pattern.
  • the device maps the second graphic information to the second calibration pattern according to the second mapping relationship, and obtains the graphic information of the second graphic information within the second calibration pattern, that is, the third graphic information.
  • for example, the device maps the zero-order information to the second calibration pattern according to the second mapping relationship, and obtains the zero-order information within the second calibration pattern, which is the third graphic information.
  • S1064 Calculate the included angle between the optical axis of the second imaging module and the projection assembly according to the third graphic information.
  • the device determines the coordinates of the mapped points in the first calibration pattern according to the third graphic information, and calculates the included angle between the optical axes of the projection assembly and the second imaging module according to the coordinates of the mapped points in the first calibration pattern.
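The patent does not spell out this computation; the sketch below shows one plausible pinhole-model reading, in which the mapped zero-order point marks where the projection assembly's optical axis intersects the calibration pattern, and the angle to the second imaging module's axis is recovered from that pixel using the module's (assumed pre-calibrated) intrinsic matrix K:

```python
import numpy as np

def optical_axis_angle_deg(K, mapped_pt):
    """K: 3x3 intrinsics of the second imaging module; mapped_pt: (u, v)
    pixel of the projected zero-order point in that module's image."""
    # Back-project the pixel to a viewing ray in the camera frame.
    ray = np.linalg.inv(K) @ np.array([mapped_pt[0], mapped_pt[1], 1.0])
    ray /= np.linalg.norm(ray)
    axis = np.array([0.0, 0.0, 1.0])        # the module's own optical axis
    return float(np.degrees(np.arccos(np.clip(ray @ axis, -1.0, 1.0))))
```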
  • the first image collected by the first imaging module is acquired, the first image including the target full-field projection pattern; feature extraction is performed on the first image to obtain first feature information; second feature information and first graphic information of the preset reference full-field projection pattern are acquired, the graphic information including zero-order information and/or secondary information; a first mapping relationship between the first feature information and the second feature information is calculated; the first graphic information is mapped to the target full-field projection pattern according to the first mapping relationship to obtain second graphic information corresponding to the target full-field projection pattern; and target optical information is calculated according to the second graphic information.
  • the above method is compatible with both regularly and irregularly arranged projection patterns projected by the projection assembly; that is, in the speckle pattern projected by the projection assembly, the optical information can be accurately detected regardless of whether the zero-order speckle pattern is globally unique.
  • FIG. 7 is a schematic diagram of an optical information detection apparatus provided by a third embodiment of the present application.
  • the included units are used to execute the steps in the embodiments corresponding to FIG. 2 and FIG. 5 to FIG. 6; for details, refer to the relevant descriptions in the embodiments corresponding to FIG. 2 and FIG. 5 to FIG. 6.
  • for convenience of explanation, only the parts related to this embodiment are shown.
  • referring to FIG. 7, the optical information detection apparatus 7 includes: a first acquisition unit 710, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern; an extraction unit 720, configured to perform feature extraction on the first image to obtain first feature information; a second acquisition unit 730, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information; a first calculation unit 740, configured to calculate a first mapping relationship between the first feature information and the second feature information; a first processing unit 750, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and a second calculation unit 760, configured to calculate target optical information according to the second graphic information.
  • the first feature information is the target full-field speckle points in the target full-field projection pattern
  • the second feature information is the reference full-field speckle points in the reference full-field projection pattern
  • the first calculation unit 740 is specifically configured to: calculate an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points; and determine, according to the preset local speckle correspondence and the initial mapping relationship, the mapping relationship between the target full-field speckle points and the reference full-field speckle points.
  • the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.
  • the projection assembly for projecting the target full-field projection pattern includes a light source and a diffractive optical element, the light source being an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs); the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.
  • the first feature information includes one or more of internal features of the target full-field speckle, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle.
  • the second feature information includes one or more of an internal feature of the reference full-field speckle, a corner point feature of the reference full-field speckle, and an edge curve feature of the reference full-field speckle.
  • the target optical information is the angle between the optical axis of the second imaging module and the projection assembly;
  • the second calculation unit 760 is specifically configured to: acquire the first calibration pattern collected by the first imaging module and the second calibration pattern collected by the second imaging module; calculate a second mapping relationship between the first calibration pattern and the second calibration pattern; map the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern; and calculate, according to the third graphic information, the included angle between the optical axes of the second imaging module and the projection assembly.
  • FIG. 8 is a schematic diagram of an optical information detection device provided by a fourth embodiment of the present application.
  • the optical information detection device 8 of this embodiment includes: a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80, for example, an optical information detection program.
  • when the processor 80 executes the computer program 82, the steps in each of the foregoing optical information detection method embodiments are implemented, for example, steps 101 to 106 shown in FIG. 2.
  • alternatively, when the processor 80 executes the computer program 82, the functions of the modules/units in each of the foregoing apparatus embodiments are implemented, for example, the functions of the modules 710 to 760 shown in FIG. 7.
  • the computer program 82 may be divided into one or more modules/units, and the one or more modules/units are stored in the memory 81 and executed by the processor 80 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 82 in the optical information detection device 8 .
  • the computer program 82 can be divided into a first acquisition unit, an extraction unit, a second acquisition unit, a first calculation unit, a first processing unit, and a second calculation unit, and the specific functions of each unit are as follows:
  • a first acquisition unit, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern; an extraction unit, configured to perform feature extraction on the first image to obtain first feature information; a second acquisition unit, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information; a first calculation unit, configured to calculate a first mapping relationship between the first feature information and the second feature information; a first processing unit, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and a second calculation unit, configured to calculate target optical information according to the second graphic information.
  • the optical information detection device may include, but is not limited to, a processor 80 and a memory 81 .
  • FIG. 8 is only an example of the optical information detection device 8 and does not constitute a limitation on the optical information detection device 8; it may include more or fewer components than shown, or combine some components, or include different components; for example, the optical information detection device may further include an input/output device, a network access device, a bus, and the like.
  • the so-called processor 80 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory 81 may be an internal storage unit of the optical information detection device 8 , such as a hard disk or a memory of the optical information detection device 8 .
  • the memory 81 can also be an external storage device of the optical information detection device 8, such as a plug-in hard disk, a smart media card (SMC), a Secure Digital (SD) card, or a flash card equipped on the optical information detection device 8.
  • the optical information detection device 8 may also include both an internal storage unit of the optical information detection device 8 and an external storage device.
  • the memory 81 is used to store the computer program and other programs and data required by the optical information detection apparatus.
  • the memory 81 can also be used to temporarily store data that has been output or will be output.
  • An embodiment of the present application also provides a network device, the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor executing The computer program implements the steps in any of the foregoing method embodiments.
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the foregoing method embodiments can be implemented.
  • the embodiments of the present application provide a computer program product; when the computer program product runs on a mobile terminal, the mobile terminal, when executing it, implements the steps in the foregoing method embodiments.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • all or part of the processes in the methods of the above embodiments can be implemented by a computer program to instruct the relevant hardware.
  • the computer program can be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of each of the above method embodiments can be implemented.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form, and the like.
  • the computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing device/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), electrical carrier signals, telecommunication signals, and software distribution media, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc.
  • in some jurisdictions, according to legislation and patent practice, a computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
  • the disclosed apparatus/network device and method may be implemented in other manners.
  • the apparatus/network device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division; in actual implementation, there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Optics & Photonics (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

An optical information detection method, comprising: acquiring a first image collected by a first imaging module, the first image including a target full-field projection pattern (S101); performing feature extraction on the first image to obtain first feature information (S102); acquiring second feature information and first graphic information of a preset reference full-field projection pattern (S103); calculating a first mapping relationship between the first feature information and the second feature information (S104); mapping the first graphic information to the target full-field projection pattern according to the first mapping relationship to obtain second graphic information corresponding to the target full-field projection pattern (S105); and calculating target optical information according to the second graphic information (S106). The method is compatible with both regularly and irregularly arranged projection patterns projected by the projection assembly; that is, in the speckle pattern projected by the projection assembly, the optical information can be accurately detected regardless of whether the zero-order speckle pattern is globally unique.

Description

Optical information detection method, apparatus, and device

TECHNICAL FIELD

The present application belongs to the field of optical technology, and in particular relates to an optical information detection method, apparatus, and device.

BACKGROUND

Depth measurement systems based on structured light or time-of-flight (TOF) technology include a transmitter and a receiver, and the optical information of the transmitter and the receiver needs to be detected before the system is used, for example, the relative rotation angle information between the diffractive optical element and the light source in the transmitter, the optical-axis deflection angle information between the transmitter and the receiver, and so on.

The existing optical information detection method uses grayscale similarity to calculate image features, from which the optical parameter information is calculated. When image features are calculated using grayscale similarity, the zero-order projection pattern is required to be globally unique. However, when the projection patterns are regularly arranged, the formed zero-order projection pattern is not unique, and the optical information cannot be accurately calculated in this case.

SUMMARY

Embodiments of the present application provide an optical information detection method, apparatus, and device, which can solve the problem that optical information cannot be accurately calculated when projection patterns are regularly arranged.

In a first aspect, an embodiment of the present application provides an optical information detection method, including:

acquiring a first image collected by a first imaging module, the first image including a target full-field projection pattern;

performing feature extraction on the first image to obtain first feature information;

acquiring second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information;

calculating a first mapping relationship between the first feature information and the second feature information;

mapping the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern;

calculating target optical information according to the second graphic information.
Further, the first feature information is target full-field speckle points in the target full-field projection pattern, and the second feature information is reference full-field speckle points in the reference full-field projection pattern;

the calculating a first mapping relationship between the first feature information and the second feature information includes:

calculating an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points;

determining a mapping relationship between the target full-field speckle points and the reference full-field speckle points according to a preset local speckle correspondence and the initial mapping relationship.

Further, the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.

Further, the projection assembly that projects the target full-field projection pattern includes a light source and a diffractive optical element, and the light source is an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs);

the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.

Further, the first feature information includes one or more of internal features of the target full-field speckle points, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle.

Further, the second feature information includes one or more of internal features of the reference full-field speckle points, corner point features of the reference full-field speckle, and edge curve features of the reference full-field speckle.

Further, the target optical information is the included angle between the optical axes of a second imaging module and the projection assembly;

the calculating target optical information according to the second graphic information includes:

acquiring a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module;

calculating a second mapping relationship between the first calibration pattern and the second calibration pattern;

mapping the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern;

calculating the included angle between the optical axes of the second imaging module and the projection assembly according to the third graphic information.
In a second aspect, an embodiment of the present application provides an optical information detection apparatus, including:

a first acquisition unit, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern;

an extraction unit, configured to perform feature extraction on the first image to obtain first feature information;

a second acquisition unit, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information;

a first calculation unit, configured to calculate a first mapping relationship between the first feature information and the second feature information;

a first processing unit, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern;

a second calculation unit, configured to calculate target optical information according to the second graphic information.

Further, the first feature information is target full-field speckle points in the target full-field projection pattern, and the second feature information is reference full-field speckle points in the reference full-field projection pattern;

the first calculation unit is specifically configured to:

calculate an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points;

determine a mapping relationship between the target full-field speckle points and the reference full-field speckle points according to a preset local speckle correspondence and the initial mapping relationship.

Further, the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.

Further, the projection assembly that projects the target full-field projection pattern includes a light source and a diffractive optical element, and the light source is an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs);

the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.

Further, the first feature information includes one or more of internal features of the target full-field speckle points, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle.

Further, the second feature information includes one or more of internal features of the reference full-field speckle points, corner point features of the reference full-field speckle, and edge curve features of the reference full-field speckle.

Further, the target optical information is the included angle between the optical axes of a second imaging module and the projection assembly;

the second calculation unit is specifically configured to:

acquire a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module;

calculate a second mapping relationship between the first calibration pattern and the second calibration pattern;

map the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern;

calculate the included angle between the optical axes of the second imaging module and the projection assembly according to the third graphic information.

In a third aspect, an embodiment of the present application provides an optical information detection device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the optical information detection method described in the first aspect above.

In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the optical information detection method described in the first aspect above.

In the embodiments of the present application, a first image collected by a first imaging module is acquired, the first image including a target full-field projection pattern; feature extraction is performed on the first image to obtain first feature information; second feature information and first graphic information of a preset reference full-field projection pattern are acquired, the graphic information including zero-order information and/or secondary information; a first mapping relationship between the first feature information and the second feature information is calculated; the first graphic information is mapped to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and target optical information is calculated according to the second graphic information. The above method is compatible with both regularly and irregularly arranged projection patterns projected by the projection assembly; that is, in the speckle pattern projected by the projection assembly, the optical information can be accurately detected regardless of whether the zero-order speckle pattern is globally unique.
BRIEF DESCRIPTION OF THE DRAWINGS

In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.

FIG. 1 is a schematic diagram of an optical information detection system provided by the first embodiment of the present application;

FIG. 2 is a schematic flowchart of an optical information detection method provided by the second embodiment of the present application;

FIG. 3 is a schematic diagram of a target full-field projection pattern in an optical information detection method provided by the second embodiment of the present application;

FIG. 4 is a schematic diagram of a target full-field projection pattern in an optical information detection method provided by the second embodiment of the present application;

FIG. 5 is a schematic flowchart of the refinement of S104 in an optical information detection method provided by the second embodiment of the present application;

FIG. 6 is a schematic flowchart of the refinement of S106 in an optical information detection method provided by the second embodiment of the present application;

FIG. 7 is a schematic diagram of an optical information detection apparatus provided by the third embodiment of the present application;

FIG. 8 is a schematic diagram of an optical information detection device provided by the fourth embodiment of the present application.
DETAILED DESCRIPTION

In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures and technologies are set forth in order to provide a thorough understanding of the embodiments of the present application. However, it should be clear to those skilled in the art that the present application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits, and methods are omitted so that unnecessary details do not obscure the description of the present application.

It should be understood that, when used in the specification and the appended claims of the present application, the term "comprising" indicates the presence of the described features, wholes, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components and/or collections thereof.

It should also be understood that the term "and/or" used in the specification and the appended claims of the present application refers to and includes any and all possible combinations of one or more of the associated listed items.

As used in the specification and the appended claims of the present application, the term "if" may be interpreted, depending on the context, as "when" or "once" or "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".

In addition, in the description of the specification and the appended claims of the present application, the terms "first", "second", "third", etc. are only used to distinguish the descriptions, and cannot be understood as indicating or implying relative importance.

References in the specification of the present application to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in other embodiments", etc. appearing in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "comprising", "including", "having" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
Referring to FIG. 1, FIG. 1 is a schematic diagram of an optical information detection system provided by the first embodiment of the present application. The optical information detection system includes a projection assembly, a projection screen, a first imaging module, and a device with an optical information detection function communicatively connected to the projection assembly and the first imaging module, respectively. The device with the optical information detection function may be a server, a processor, or the like. The projection assembly is configured to project a projection pattern onto the projection screen; the first imaging module is configured to collect a first image projected onto the projection screen, the first image including a target full-field projection pattern.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of an optical information detection method provided by the second embodiment of the present application. The execution body of the optical information detection method in this embodiment is a device with an optical information detection function, for example, a server, a processor, or the like. The optical information detection method shown in FIG. 2 may include:

S101: Acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern.

The device acquires the first image collected by the first imaging module, the first image including the target full-field projection pattern. The projection assembly projects the projection pattern onto the projection screen, and the first imaging module collects the projection pattern on the projection screen. The image collected by the first imaging module is the first image, and the projection pattern included in the first image is the target full-field projection pattern.

In one embodiment, the projection assembly and the first imaging module may be disposed on two sides of the projection screen, respectively, or may be disposed on the same side of the projection screen. The projection screen includes a glass plate, paper, and a polycarbonate (PC) plate arranged in sequence along the direction of the beam emitted by the projection assembly; so that the first imaging module can collect a clear projection pattern, the transmittance of the above-mentioned glass plate, paper, and PC plate should be no less than 90%. It should be noted that, in order to collect the full-field first image, the first imaging module may adopt a wide-angle imaging module, or the distance between the first imaging module and the projection screen may be appropriately lengthened, so as to prevent the field of view of the first imaging module from being too large and affecting the quality of the collected first image.

The projection assembly includes a light source and a diffractive optical element (DOE); the DOE is disposed on the light exit path of the light source, and the diffractive optical element is configured to project the light generated by the light source onto the projection screen to form the projection pattern. It can be understood that the beam emitted by the light source is diffused by the DOE and then projected onto the projection screen to form a structured light pattern. The light source may include an edge-emitting laser, a vertical-cavity surface-emitting laser (VCSEL), a VCSEL array, or an LED, and the VCSEL array may be regularly or irregularly arranged. The light generated by the light source may be visible light, infrared light, ultraviolet light, invisible light, etc., and the light source also supports coded projection schemes composed of different patterns, such as speckle, block, cross, stripe, specific-symbol and other patterns. It can be understood that the wavelength of the light that can be collected by the first imaging module should be the same as the wavelength of the light projected by the projection assembly.

In one embodiment, when the projection pattern projected by the projection assembly is an infrared speckle pattern, the first imaging module is correspondingly an infrared camera.
S102: Perform feature extraction on the first image to obtain first feature information.

The device performs feature extraction on the collected first image, extracts the feature information of the target full-field projection pattern in the first image, and obtains the first feature information. When the target full-field projection pattern is a speckle pattern, the first feature information may include one or more of internal features of the target full-field speckle points, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle. FIG. 3 and FIG. 4 are schematic diagrams of target full-field projection patterns, also illustrating the internal features of the full-field speckle points, the corner point features of the full-field speckle, and the edge curve features of the full-field speckle. It can be understood that the diagrams in FIG. 3 and FIG. 4 are only two forms of the target full-field projection pattern, and are not limitations on the target full-field projection pattern.

S103: Acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information.

The device stores the second feature information of the preset reference full-field projection pattern, where the preset reference full-field projection pattern is a pre-collected projection pattern projected by the projection assembly onto the projection screen. The second feature information may include one or more of internal features of the reference full-field speckle points, corner point features of the reference full-field speckle, and edge curve features of the reference full-field speckle. For these features, reference may be made to the related descriptions of the internal features, corner point features, and edge curve features of the target full-field speckle in S102, which are not repeated here.

The device acquires the first graphic information, the graphic information including zero-order information and/or secondary information. The zero-order information may include zero-order points, a zero-order speckle pattern, and the like; the secondary information may include coordinates of the speckle points, grayscale information, and the like.
S104: Calculate a first mapping relationship between the first feature information and the second feature information.

The device calculates the first mapping relationship between the first feature information and the second feature information according to the first feature information and the second feature information. The first mapping relationship essentially also represents the relationship between the preset reference full-field projection pattern and the target full-field projection pattern.

Further, the first feature information is the target full-field speckle points in the target full-field projection pattern, and the second feature information is the reference full-field speckle points in the reference full-field projection pattern. S104 may include S1041-S1042; as shown in FIG. 5, S1041-S1042 are as follows:

S1041: Calculate an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points.

In this embodiment, the first feature information is the target full-field speckle points in the target full-field projection pattern, and the second feature information is the reference full-field speckle points in the reference full-field projection pattern. The device calculates the initial mapping relationship between the target full-field speckle points and the reference full-field speckle points according to the target full-field speckle points and the reference full-field speckle points. For example, the initial mapping relationship may be calculated using the homography between the reference full-field speckle points and the target full-field speckle points; when calculated according to the edge curve features of the full-field speckle, the initial mapping relationship may be calculated using the homography between the edge curves in the reference full-field projection pattern and the edge curves in the target full-field projection pattern.

S1042: Determine a mapping relationship between the target full-field speckle points and the reference full-field speckle points according to a preset local speckle correspondence and the initial mapping relationship.

The device stores the preset local speckle correspondence, which may include speckle-block similarity information, speckle-point local nearest-neighbor information, and the like. The device determines the mapping relationship between the target full-field speckle points and the reference full-field speckle points according to the preset local speckle correspondence and the initial mapping relationship.
S105: Map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern.

The device maps the first graphic information to the target full-field projection pattern according to the first mapping relationship, and obtains the graphic information of the first graphic information within the target full-field projection pattern, i.e., the second graphic information. For example, the device maps the zero-order information to the target full-field projection pattern according to the first mapping relationship, and obtains the zero-order information within the target full-field projection pattern, i.e., the second graphic information.

S106: Calculate target optical information according to the second graphic information.

The device may calculate the target optical information according to the corresponding second graphic information in the target full-field projection pattern. A calculation rule may be preset in the device, the calculation rule being determined according to the type of the target optical information, and the target optical information is calculated according to the calculation rule.

The target optical information may include light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.

In one implementation, the projection assembly that projects the target full-field projection pattern includes a light source and a diffractive optical element, the light source being an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs); the target optical information may include relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array. Under the control of the processor, the VCSEL array can project different combined light dot-matrix patterns through the DOE, where the DOE performs operations such as replicating, overlapping and/or rotating the original pattern formed by the VCSEL array to form the final speckle pattern on the projection screen. Therefore, the speckle pattern and the VCSELs of the VCSEL array form a one-to-one correspondence in spatial position, and the speckle pattern can then be used to detect whether an individual VCSEL in the VCSEL array light source is missing.
In one implementation, the target optical information is the included angle between the optical axes of a second imaging module and the projection assembly. S106 may include S1061-S1064; as shown in FIG. 6, S1061-S1064 are as follows:

S1061: Acquire a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module.

In this embodiment, the optical information detection apparatus includes a second imaging module, and the target optical information finally to be calculated is the included angle between the optical axes of the second imaging module and the projection assembly.

In this embodiment, the optical information detection system further includes a second imaging module and a calibration pattern projector. The calibration pattern projector is configured to project a calibration pattern onto the projection screen. The calibration pattern is not limited to checkerboards, dots, etc.; the purpose is to provide a sufficient number of features. The imaging modules are configured to collect the above calibration pattern. It should be noted that when the above calibration pattern projector is turned on, the projection assembly needs to be turned off. For ease of distinction, the calibration pattern collected by the first imaging module is denoted as the first calibration pattern, and the calibration pattern collected by the second imaging module is denoted as the second calibration pattern.

S1062: Calculate a second mapping relationship between the first calibration pattern and the second calibration pattern.

The device calculates the second mapping relationship between the first calibration pattern and the second calibration pattern. The second mapping relationship characterizes the relationship between the first imaging module and the second imaging module.

S1063: Map the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern.

The device maps the second graphic information to the second calibration pattern according to the second mapping relationship, and obtains the graphic information of the second graphic information within the second calibration pattern, i.e., the third graphic information. For example, the device maps the zero-order information to the second calibration pattern according to the second mapping relationship, and obtains the zero-order information within the second calibration pattern, i.e., the third graphic information.

S1064: Calculate the included angle between the optical axes of the second imaging module and the projection assembly according to the third graphic information.

The device determines the coordinates of the mapped points in the first calibration pattern according to the third graphic information, and calculates the included angle between the optical axes of the projection assembly and the second imaging module according to the coordinates of the mapped points in the first calibration pattern.
In the embodiments of the present application, a first image collected by a first imaging module is acquired, the first image including a target full-field projection pattern; feature extraction is performed on the first image to obtain first feature information; second feature information and first graphic information of a preset reference full-field projection pattern are acquired, the graphic information including zero-order information and/or secondary information; a first mapping relationship between the first feature information and the second feature information is calculated; the first graphic information is mapped to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and target optical information is calculated according to the second graphic information. The above method is compatible with both regularly and irregularly arranged projection patterns projected by the projection assembly; that is, in the speckle pattern projected by the projection assembly, the optical information can be accurately detected regardless of whether the zero-order speckle pattern is globally unique.

It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Referring to FIG. 7, FIG. 7 is a schematic diagram of an optical information detection apparatus provided by the third embodiment of the present application. The units included are used to execute the steps in the embodiments corresponding to FIG. 2 and FIG. 5 to FIG. 6; for details, refer to the related descriptions in the embodiments corresponding to FIG. 2 and FIG. 5 to FIG. 6. For ease of description, only the parts related to this embodiment are shown. Referring to FIG. 7, the optical information detection apparatus 7 includes: a first acquisition unit 710, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern; an extraction unit 720, configured to perform feature extraction on the first image to obtain first feature information; a second acquisition unit 730, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information; a first calculation unit 740, configured to calculate a first mapping relationship between the first feature information and the second feature information; a first processing unit 750, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and a second calculation unit 760, configured to calculate target optical information according to the second graphic information.

Further, the first feature information is the target full-field speckle points in the target full-field projection pattern, and the second feature information is the reference full-field speckle points in the reference full-field projection pattern;

the first calculation unit 740 is specifically configured to: calculate an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points; and determine a mapping relationship between the target full-field speckle points and the reference full-field speckle points according to the preset local speckle correspondence and the initial mapping relationship.

In one embodiment, the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.

In one embodiment, the projection assembly that projects the target full-field projection pattern includes a light source and a diffractive optical element, the light source being an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs); the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.

In one embodiment, the first feature information includes one or more of internal features of the target full-field speckle points, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle. The second feature information includes one or more of internal features of the reference full-field speckle points, corner point features of the reference full-field speckle, and edge curve features of the reference full-field speckle. The target optical information is the included angle between the optical axes of the second imaging module and the projection assembly;

the second calculation unit 760 is specifically configured to: acquire the first calibration pattern collected by the first imaging module and the second calibration pattern collected by the second imaging module; calculate a second mapping relationship between the first calibration pattern and the second calibration pattern; map the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern; and calculate, according to the third graphic information, the included angle between the optical axes of the second imaging module and the projection assembly.
FIG. 8 is a schematic diagram of an optical information detection device provided by the fourth embodiment of the present application. As shown in FIG. 8, the optical information detection device 8 of this embodiment includes: a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80, for example, an optical information detection program. When the processor 80 executes the computer program 82, the steps in each of the foregoing optical information detection method embodiments are implemented, for example, steps 101 to 106 shown in FIG. 2. Alternatively, when the processor 80 executes the computer program 82, the functions of the modules/units in each of the foregoing apparatus embodiments are implemented, for example, the functions of the modules 710 to 760 shown in FIG. 7.

Exemplarily, the computer program 82 may be divided into one or more modules/units, and the one or more modules/units are stored in the memory 81 and executed by the processor 80 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 82 in the optical information detection device 8. For example, the computer program 82 may be divided into a first acquisition unit, an extraction unit, a second acquisition unit, a first calculation unit, a first processing unit, and a second calculation unit, and the specific functions of each unit are as follows:

a first acquisition unit, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern; an extraction unit, configured to perform feature extraction on the first image to obtain first feature information; a second acquisition unit, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information; a first calculation unit, configured to calculate a first mapping relationship between the first feature information and the second feature information; a first processing unit, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern; and a second calculation unit, configured to calculate target optical information according to the second graphic information.

The optical information detection device may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art can understand that FIG. 8 is only an example of the optical information detection device 8 and does not constitute a limitation on the optical information detection device 8; it may include more or fewer components than shown, or combine certain components, or include different components; for example, the optical information detection device may further include an input/output device, a network access device, a bus, and the like.

The processor 80 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.

The memory 81 may be an internal storage unit of the optical information detection device 8, such as a hard disk or memory of the optical information detection device 8. The memory 81 may also be an external storage device of the optical information detection device 8, such as a plug-in hard disk, a smart media card (SMC), a Secure Digital (SD) card, or a flash card equipped on the optical information detection device 8. Further, the optical information detection device 8 may include both an internal storage unit of the optical information detection device 8 and an external storage device. The memory 81 is used to store the computer program and other programs and data required by the optical information detection device. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It should be noted that, since the information interaction and execution processes between the above apparatuses/units are based on the same concept as the method embodiments of the present application, reference may be made to the method embodiment section for their specific functions and technical effects, which are not repeated here.

Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional units and modules is used as an example for illustration. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other, and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.

An embodiment of the present application also provides a network device, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, where the processor, when executing the computer program, implements the steps in any of the foregoing method embodiments.

An embodiment of the present application also provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps in each of the foregoing method embodiments.

An embodiment of the present application provides a computer program product, which, when run on a mobile terminal, causes the mobile terminal to implement the steps in each of the foregoing method embodiments when executed.

If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application may be implemented by a computer program instructing related hardware. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the steps of each of the above method embodiments may be implemented. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing apparatus/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In some jurisdictions, according to legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunication signals.

In the above embodiments, the description of each embodiment has its own emphasis. For parts not detailed or described in a certain embodiment, reference may be made to the related descriptions of other embodiments.

Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of the present application.

In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative; for example, the division of the modules or units is only a logical function division, and there may be other division methods in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, apparatuses or units, and may be in electrical, mechanical or other forms.

The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or equivalently replace some of the technical features therein; and these modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present application, and should all be included within the protection scope of the present application.

Claims (10)

  1. An optical information detection method, characterized by comprising:
    acquiring a first image collected by a first imaging module, the first image including a target full-field projection pattern;
    performing feature extraction on the first image to obtain first feature information;
    acquiring second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information;
    calculating a first mapping relationship between the first feature information and the second feature information;
    mapping the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern;
    calculating target optical information according to the second graphic information.
  2. The optical information detection method according to claim 1, characterized in that the first feature information is target full-field speckle points in the target full-field projection pattern, and the second feature information is reference full-field speckle points in the reference full-field projection pattern;
    the calculating a first mapping relationship between the first feature information and the second feature information comprises:
    calculating an initial mapping relationship between the target full-field speckle points and the reference full-field speckle points;
    determining a mapping relationship between the target full-field speckle points and the reference full-field speckle points according to a preset local speckle correspondence and the initial mapping relationship.
  3. The optical information detection method according to claim 1, characterized in that the target optical information includes light and dark distribution information of the target full-field projection pattern and/or field angle information of the target full-field projection pattern.
  4. The optical information detection method according to claim 1, characterized in that the projection assembly that projects the target full-field projection pattern includes a light source and a diffractive optical element, the light source being an array composed of a plurality of vertical-cavity surface-emitting lasers (VCSELs);
    the target optical information includes relative deflection information between the diffractive optical element and the light source and/or missing-VCSEL information of the array.
  5. The optical information detection method according to claim 1, characterized in that the first feature information includes one or more of internal features of the target full-field speckle points, corner point features of the target full-field speckle, and edge curve features of the target full-field speckle.
  6. The optical information detection method according to claim 1, characterized in that the second feature information includes one or more of internal features of the reference full-field speckle points, corner point features of the reference full-field speckle, and edge curve features of the reference full-field speckle.
  7. The optical information detection method according to claim 1, characterized in that the target optical information is the included angle between the optical axes of a second imaging module and the projection assembly;
    the calculating target optical information according to the second graphic information comprises:
    acquiring a first calibration pattern collected by the first imaging module and a second calibration pattern collected by the second imaging module;
    calculating a second mapping relationship between the first calibration pattern and the second calibration pattern;
    mapping the second graphic information to the second calibration pattern according to the second mapping relationship, to obtain third graphic information of the second calibration pattern;
    calculating the included angle between the optical axes of the second imaging module and the projection assembly according to the third graphic information.
  8. An optical information detection apparatus, characterized by comprising:
    a first acquisition unit, configured to acquire a first image collected by a first imaging module, the first image including a target full-field projection pattern;
    an extraction unit, configured to perform feature extraction on the first image to obtain first feature information;
    a second acquisition unit, configured to acquire second feature information and first graphic information of a preset reference full-field projection pattern, the graphic information including zero-order information and/or secondary information;
    a first calculation unit, configured to calculate a first mapping relationship between the first feature information and the second feature information;
    a first processing unit, configured to map the first graphic information to the target full-field projection pattern according to the first mapping relationship, to obtain second graphic information corresponding to the target full-field projection pattern;
    a second calculation unit, configured to calculate target optical information according to the second graphic information.
  9. An optical information detection device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method according to any one of claims 1 to 7.
  10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
PCT/CN2020/138123 2020-07-20 2020-12-21 Optical information detection method, apparatus, and device WO2022016797A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/732,773 US20220254067A1 (en) 2020-07-20 2022-04-29 Optical information detection method, device and equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010697772.0 2020-07-20
CN202010697772.0A CN111986154A (zh) 2020-07-20 Optical information detection method, apparatus, and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/732,773 Continuation US20220254067A1 (en) 2020-07-20 2022-04-29 Optical information detection method, device and equipment

Publications (1)

Publication Number Publication Date
WO2022016797A1 (zh)

Family

ID=73438789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/138123 WO2022016797A1 (zh) 2020-07-20 2020-12-21 一种光学信息检测方法、装置及设备

Country Status (3)

Country Link
US (1) US20220254067A1 (zh)
CN (1) CN111986154A (zh)
WO (1) WO2022016797A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111986154A (zh) * 2020-07-20 2020-11-24 深圳奥比中光科技有限公司 一种光学信息检测方法、装置及设备
CN112629828B (zh) * 2020-11-27 2023-07-04 奥比中光科技集团股份有限公司 一种光学信息检测方法、装置及设备
CN112556994B (zh) * 2020-11-27 2023-07-07 奥比中光科技集团股份有限公司 一种光学信息检测方法、装置及设备
CN113793339B (zh) * 2021-11-18 2022-08-26 合肥的卢深视科技有限公司 Doe脱落程度检测方法、电子设备和存储介质


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3029238B2 (ja) * 1995-08-08 2000-04-04 岸本産業株式会社 Method for measuring the movement of a measured object in the depth direction using laser light
CA2528791A1 (en) * 2005-12-01 2007-06-01 Peirong Jia Full-field three-dimensional measurement method
JP4917615B2 (ja) * 2006-02-27 2012-04-18 プライム センス リミティド Range mapping using speckle decorrelation
JP5588353B2 (ja) * 2008-01-21 2014-09-10 プライムセンス リミテッド Optical design for zero-order reduction
CN105190405B (zh) * 2013-03-15 2019-08-30 图像影院国际有限公司 Projector optimized for modulator diffraction effects
US9978135B2 (en) * 2015-02-27 2018-05-22 Cognex Corporation Detecting object presence on a target surface
CN110276838B (zh) * 2019-05-30 2023-04-28 盎锐(上海)信息科技有限公司 Model acquisition method and apparatus based on a laser projector

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018229358A1 (fr) * 2017-06-14 2018-12-20 Majo Method and device for constructing a three-dimensional image
CN110177266A (zh) * 2017-12-18 2019-08-27 西安交通大学 Self-correction method and device for a structured-light 3D depth camera
CN108230399A (zh) * 2017-12-22 2018-06-29 清华大学 Projector calibration method based on structured light technology
CN110689581A (zh) * 2018-07-06 2020-01-14 Oppo广东移动通信有限公司 Structured light module calibration method, electronic device, and computer-readable storage medium
CN109342028A (zh) * 2018-10-17 2019-02-15 深圳奥比中光科技有限公司 Diffractive optical element detection method and system
CN110189380A (zh) * 2019-05-30 2019-08-30 Oppo广东移动通信有限公司 Calibration data optimization method, structured light module, and storage medium
CN110490938A (zh) * 2019-08-05 2019-11-22 Oppo广东移动通信有限公司 Method, apparatus, and electronic device for verifying camera calibration parameters
CN110657785A (zh) * 2019-09-02 2020-01-07 清华大学 Efficient scene depth information acquisition method and system
CN111354033A (zh) * 2020-02-28 2020-06-30 西安交通大学 Digital image measurement method based on feature matching
CN111986154A (zh) * 2020-07-20 2020-11-24 深圳奥比中光科技有限公司 Optical information detection method, apparatus, and device

Also Published As

Publication number Publication date
US20220254067A1 (en) 2022-08-11
CN111986154A (zh) 2020-11-24

Similar Documents

Publication Publication Date Title
WO2022016797A1 (zh) Optical information detection method, apparatus, and device
WO2022016798A1 (zh) Optical information detection system
JP5680976B2 (ja) Electronic blackboard system and program
US8172407B2 (en) Camera-projector duality: multi-projector 3D reconstruction
US8441440B2 (en) Position information detection device, position information detection method, and position information detection program
CN112272292B (zh) Projection correction method, apparatus, and storage medium
CN101639746B (zh) Automatic calibration method for a touch screen
CN110489015B (zh) Touch point determination method and apparatus, touch screen, and display
CN112556994B (zh) Optical information detection method, apparatus, and device
US20230351635A1 (en) Optical axis calibration method and apparatus of optical axis detection system, terminal, system, and medium
CN109949306B (zh) Reflecting surface angle deviation detection method, terminal device, and storage medium
CN109116663B (zh) Parallel AA method and apparatus for a structured light module, and readable storage medium
US20230267628A1 (en) Decoding an image for active depth sensing to account for optical distortions
WO2022105277A1 (zh) Projection control method and apparatus, projection light engine, and readable storage medium
CN112629828B (zh) Optical information detection method, apparatus, and device
WO2022047839A1 (zh) Infrared touch screen multi-touch recognition method, apparatus, and device
CN108780572A (zh) Image correction method and apparatus
JP7509897B2 (ja) Depth image generation method and apparatus, reference image generation method and apparatus, electronic device, and computer program
CN114821987B (zh) Reminding method, apparatus, and terminal device
WO2022231725A1 (en) Systems and methods for determining an adaptive region of interest (roi) for image metrics calculations
CN210570528U (zh) Depth detection system, bracket thereof, and electronic apparatus
KR102551261B1 (ko) Method for generating depth information using structured light projected onto an external object, and electronic device using the same
CN114332341A (zh) Point cloud reconstruction method, apparatus, and system
CN113052884A (zh) Information processing method, information processing apparatus, storage medium, and electronic device
JP2017125764A (ja) Object detection device, and image display device including the object detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20946003

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20946003

Country of ref document: EP

Kind code of ref document: A1