CN114926519A - Depth recovery method and device, electronic equipment and computer readable storage medium


Info

Publication number
CN114926519A
Authority
CN
China
Prior art keywords
speckle
target
candidate
bit
gray value
Legal status (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Pending
Application number
CN202210456562.1A
Other languages
Chinese (zh)
Inventor
化雪诚
付贤强
刘祺昌
王海彬
李东洋
户磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Dilusense Technology Co Ltd
Original Assignee
Hefei Dilusense Technology Co Ltd
Application filed by Hefei Dilusense Technology Co Ltd
Priority to CN202210456562.1A
Publication of CN114926519A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30196 - Human being; Person
    • G06T2207/30201 - Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present application relate to the technical field of machine vision and disclose a depth recovery method and device, an electronic device, and a computer-readable storage medium. The method includes: determining a target infrared region in an acquired infrared image, and determining a target speckle region in the speckle image corresponding to the infrared image according to the target infrared region; generating a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and a preset target data bit selection strategy, where different candidate speckle patterns select different target data bits; and scoring the quality of the candidate speckle patterns with a preset quality scoring algorithm and generating the depth map corresponding to the infrared image from the highest-scoring candidate speckle pattern. The method ensures that the speckles participating in depth recovery are all high-quality speckles, which improves the accuracy of depth recovery and the stability and universality of the depth camera under complex illumination conditions.

Description

Depth recovery method, device, electronic equipment and computer readable storage medium
Technical Field
Embodiments of the present application relate to the technical field of machine vision, and in particular to a depth recovery method, a depth recovery device, an electronic device, and a computer-readable storage medium.
Background
Depth cameras on the market mainly fall into three categories: time-of-flight (TOF) cameras, binocular vision cameras, and monocular structured-light cameras. Monocular structured light is currently the mainstream scheme for consumer-grade face recognition and payment scenarios. A monocular structured-light depth camera projects an irregular speckle pattern onto the target scene through a speckle projector, captures the projection of the speckle pattern on the target scene through an infrared lens, and finally performs speckle feature matching between the captured infrared image and a pre-calibrated reference image to carry out depth recovery.
However, the inventors of the present application have found that when depth recovery is performed with monocular structured light in the industry, the server operates on the whole image, even though many background regions do not actually need depth recovery. Under limited computing resources this greatly reduces the accuracy of depth recovery and cannot meet practical requirements.
Disclosure of Invention
An object of the embodiments of the present application is to provide a depth recovery method, a depth recovery device, an electronic device, and a computer-readable storage medium that ensure that the speckles participating in depth recovery are all high-quality speckles, thereby improving the accuracy of depth recovery and improving the stability and universality of the depth camera under complex illumination conditions.
To solve the above technical problem, an embodiment of the present application provides a depth recovery method including the following steps: determining a target infrared region in an acquired infrared image, and determining a target speckle region in the speckle image corresponding to the infrared image according to the target infrared region, where the target infrared region contains a preset target object; generating a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and a preset target data bit selection strategy, where different candidate speckle patterns select different target data bits; and scoring the quality of the candidate speckle patterns with a preset quality scoring algorithm, and generating the depth map corresponding to the infrared image from the highest-scoring candidate speckle pattern.
An embodiment of the present application further provides a depth recovery device, including: a positioning module configured to determine a target infrared region in an acquired infrared image and determine a target speckle region in the speckle image corresponding to the infrared image according to the target infrared region; a grouping module configured to generate a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and a preset target data bit selection strategy, where different candidate speckle patterns select different target data bits; a scoring module configured to score the quality of the candidate speckle patterns with a preset quality scoring algorithm; and an execution module configured to generate the depth map corresponding to the infrared image from the highest-scoring candidate speckle pattern.
An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the above-described depth recovery method.
Embodiments of the present application further provide a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the depth recovery method described above.
The depth recovery method and device, electronic device, and computer-readable storage medium provided by the embodiments of the present application first detect the preset target object in the acquired infrared image to determine a target infrared region, and determine the target speckle region in the speckle image corresponding to the infrared image according to that region. A plurality of candidate speckle patterns are then generated according to the valid data bits of the target speckle region and a preset target data bit selection strategy, with different candidate speckle patterns selecting different target data bits. Finally, the candidate speckle patterns are quality-scored with a preset quality scoring algorithm, and the depth map corresponding to the infrared image is generated from the highest-scoring candidate. Because the target infrared region and the target speckle region are determined in the infrared image and the speckle image, subsequent operations act only on the target speckle region, i.e., the foreground region containing the target object, which saves computing resources. Moreover, whereas the industry practice is to normalize the data in the speckle image captured by the depth camera in order to reduce the data volume, converting data bits based on the maximum and minimum gray values of the whole image, a conversion that is strongly affected by illumination intensity so that the normalized speckles easily become too bright or too dark, the embodiments of the present application instead group the valid data bits of the target speckle region directly according to the preset target data bit selection strategy and select the best-quality candidate speckle pattern for depth recovery. This ensures that the speckles participating in depth recovery are all high-quality speckles, improves the accuracy of depth recovery, and improves the stability and universality of the depth camera under complex illumination conditions.
In addition, the target speckle region has K valid data bits, where K is an integer greater than 2; the preset target data bit selection strategy includes a high-bit selection strategy and a low-bit selection strategy; and the candidate speckle patterns include a high-bit candidate speckle pattern and a low-bit candidate speckle pattern. Generating a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and the preset target data bit selection strategy includes: obtaining the gray value of each pixel in the target speckle region, the gray value being K-bit data; according to the high-bit selection strategy, selecting the first L bits of each pixel's gray value as the first gray value of that pixel to generate the high-bit candidate speckle pattern, where L is an integer less than K; and according to the low-bit selection strategy, selecting the last L bits of each pixel's gray value as the second gray value of that pixel to generate the low-bit candidate speckle pattern. In the industry, the original K-bit gray values are compressed to L bits to reduce the data volume; this is a gray value normalization process performed with the maximum and minimum gray values of the whole image. Selecting L bits directly from the K-bit data achieves the same reduction in data volume while greatly reducing computation and saving the normalization step, thereby improving both the quality and the efficiency of depth recovery.
In addition, the target speckle region has 10 valid data bits; the preset target data bit selection strategy includes a high 8-bit selection strategy, a middle 8-bit selection strategy, and a low 8-bit selection strategy; and the candidate speckle patterns include a high 8-bit candidate speckle pattern, a middle 8-bit candidate speckle pattern, and a low 8-bit candidate speckle pattern. Generating a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and the preset target data bit selection strategy includes: obtaining the gray value of each pixel in the target speckle region, the gray value being 10-bit data; according to the high 8-bit selection strategy, selecting the first 8 bits of each pixel's gray value as the third gray value of that pixel to generate the high 8-bit candidate speckle pattern; according to the middle 8-bit selection strategy, selecting bits 2 through 9 of each pixel's gray value as the fourth gray value of that pixel to generate the middle 8-bit candidate speckle pattern; and according to the low 8-bit selection strategy, selecting the last 8 bits of each pixel's gray value as the fifth gray value of that pixel to generate the low 8-bit candidate speckle pattern. The speckle image captured by a depth camera generally has 10 valid data bits, i.e., the gray value of each pixel is generally 10-bit data. The present application abandons converting the 10-bit data to 8-bit data based on the maximum and minimum gray values of the whole image and instead generates three different groups of 8-bit data according to the preset high, middle, and low 8-bit selection strategies. In a strongly lit scene the low 8 bits of the 10-bit data are basically all 1, and using the high 8 bits does not make the normalized speckle pattern too bright; in a weakly lit scene the high 8 bits are basically all 0, and using the low 8 bits does not make the normalized speckle pattern too dark. This normalization approach adapts better to scenes of different illumination intensities while greatly reducing the amount of computation.
In addition, scoring the quality of the candidate speckle patterns with a preset quality scoring algorithm includes: traversing the candidate speckle patterns and calculating the mean gray value of the current candidate speckle pattern from the gray values of all its pixels; calculating the signal-to-noise ratio of the current candidate speckle pattern from the speckle points and non-speckle points in it; and calculating the quality score of each candidate speckle pattern from its mean gray value, its signal-to-noise ratio, and the preset quality scoring algorithm. The mean gray value of a candidate speckle pattern characterizes its average brightness, and the signal-to-noise ratio characterizes how distinct its speckle features are; scoring on both aspects allows the candidate speckle pattern with the highest quality, the most suitable brightness, and the most distinct features to be selected.
In addition, calculating the signal-to-noise ratio of the current candidate speckle pattern from the speckle points and non-speckle points in it includes: up-sampling the current candidate speckle pattern to obtain an up-sampled candidate speckle pattern; determining the speckle point regions and non-speckle regions of the current candidate speckle pattern from the up-sampled pattern and a preset speckle point region extraction method, the extraction method including Gaussian blur, edge extraction, ellipse fitting, and ellipse detection; calculating the mean gray value of the speckle point regions and the mean gray value of the non-speckle regions in the current candidate speckle pattern; and taking the ratio of the two means as the signal-to-noise ratio of the current candidate speckle pattern. Since the speckle points are scattered and each speckle point region is small, the server up-samples, i.e., enlarges, the current candidate speckle pattern so that its speckle point regions are detected more accurately, improving the accuracy and reliability of the computed signal-to-noise ratio.
In addition, calculating the signal-to-noise ratio of the current candidate speckle pattern from the speckle points and non-speckle points in it includes: judging whether the mean gray value of the current candidate speckle pattern is within a preset gray value range; if it is, continuing with the signal-to-noise ratio calculation and quality scoring; and if it is not, directly discarding the current candidate speckle pattern. This sets a trigger condition before quality scoring: the mean gray value of a candidate speckle pattern must lie within the preset gray value range, within which its brightness is reasonable, neither too bright nor too dark. Discarding over-bright or over-dark speckle patterns directly further saves computing resources and effectively improves the speed of depth recovery.
In addition, the preset target object is a human face, and determining the target infrared region in the acquired infrared image and determining the target speckle region in the speckle image corresponding to the infrared image according to the target infrared region includes: performing face detection on the acquired infrared image, determining the face contour in the infrared image, and taking the circumscribed rectangle of the face contour as the target infrared region; determining, according to the coordinates of each vertex of the target infrared region, the homonymous point of each vertex in the speckle image corresponding to the infrared image; and connecting these homonymous points to obtain the target speckle region. Since a face contour is not a regular shape, when the preset target object is a human face the circumscribed rectangle of the face contour in the infrared image is chosen as the target infrared region, which facilitates subsequent quality evaluation and depth recovery.
Drawings
One or more embodiments are illustrated by the corresponding figures in the accompanying drawings; these exemplary illustrations are not intended to limit the embodiments.
FIG. 1 is a first flowchart of a depth recovery method according to an embodiment of the present application;
FIG. 2 is a first flowchart for generating a plurality of candidate speckle patterns according to the data valid bits of the target speckle region and a predetermined target data bit selection strategy, according to an embodiment of the present disclosure;
FIG. 3 is a second flowchart for generating a plurality of candidate speckle patterns according to the data valid bits of the target speckle region and a predetermined target data bit selection strategy, according to an embodiment of the present application;
FIG. 4 is a flow chart of quality scoring a plurality of candidate speckle patterns, respectively, based on a preset quality scoring algorithm, according to an embodiment of the present application;
FIG. 5 is a flow chart for calculating a signal-to-noise ratio of a current candidate speckle pattern based on speckle and non-speckle points in the current candidate speckle pattern, according to one embodiment of the present application;
FIG. 6 is a second flowchart of a depth recovery method according to another embodiment of the present application;
FIG. 7 is a flow chart of determining a target infrared region in an acquired infrared map and determining a target speckle region in a speckle pattern corresponding to the infrared map based on the target infrared region, according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a depth recovery device according to another embodiment of the present application;
FIG. 9 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments of the present application are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate, however, that numerous technical details are set forth in the embodiments to provide a better understanding of the present application, and that the technical solutions claimed in the present application can still be implemented without these technical details and with various changes and modifications based on the following embodiments. The division into the following embodiments is for convenience of description and does not limit the specific implementation of the present application; the embodiments may be combined with and refer to each other where there is no contradiction.
To facilitate understanding of the embodiments of the present application, the relevant background on "depth recovery" mentioned in the description of the embodiments is introduced first.
Depth recovery in depth cameras in the industry is mainly divided into the TOF method, binocular vision technology, and monocular structured light technology. The TOF method obtains distance by measuring the time of flight of light: the depth camera continuously emits laser pulses toward the target scene and receives the light reflected back from the scene with an internal sensor, so the exact distance is obtained from the round-trip flight time of the light pulses. The TOF method places high demands on the depth camera hardware, especially the time measurement module; its computation and resource consumption are large, and the recovered depth values at the edges of the target object have low precision, so it cannot meet consumer-grade face recognition scenarios that require high depth recovery precision.
The depth recovery method based on binocular vision technology uses the principle of parallax: the depth camera captures several images of the measured object from different positions and obtains the depth information of the measured object by calculating the positional offset between corresponding points in the images.
Monocular structured light technology is the industry's mainstream depth recovery scheme for consumer-grade face recognition scenarios. The depth camera is fitted with a structured light projector and an infrared lens; an irregular speckle pattern is projected onto the target scene through the structured light projector, the projection of the speckle pattern on the target scene is captured through the infrared lens, and depth recovery is then performed on the basis of the captured projection image, a pre-calibrated reference image, and a preset matching algorithm to obtain the depth information of the target scene. However, depth recovery in the industry operates on the whole image, even though many background regions in the image do not need depth recovery; with limited computing resources, recovering depth over the whole image greatly reduces the accuracy of depth recovery. At the same time, illumination conditions differ between scenes, and illumination that is too strong or too dark degrades the quality of the speckles in the speckle image captured by the depth camera, so the speckles participating in depth recovery are of low quality, which further affects the accuracy of depth recovery.
To solve the technical problems that depth recovery has low accuracy and is easily affected by changes in illumination conditions, an embodiment of the present application provides a depth recovery method applied to an electronic device. The electronic device may be a terminal or a server; in this embodiment and the following embodiments, the electronic device is described using a server as an example.
A specific flow of the depth recovery method of this embodiment may be as shown in fig. 1, and includes:
step 101, determining a target infrared region in the acquired infrared image, and determining a target speckle region in a speckle pattern corresponding to the infrared image according to the target infrared region.
In a specific implementation, when performing depth recovery the server may first carry out target object detection on the acquired infrared image, i.e., the infrared image captured by the depth camera, and determine the target infrared region in it, the target infrared region containing the preset target object. The server then uses the target infrared region as a reference to determine the target speckle region in the speckle image corresponding to the infrared image. The infrared image and its corresponding speckle image are homologous data captured by the same lens of the depth camera, so the two are naturally aligned. The preset target object can be chosen and set by those skilled in the art according to the scenario to which depth recovery is applied.
It can be understood that in practical applications the depth camera needs the depth information of the target object rather than that of the background region. If depth recovery is performed on the full image, depth recovery of the background region occupies a large amount of computing resources, indirectly leaving insufficient computing resources for the target object region and thereby affecting the accuracy of depth recovery.
In one example, depth recovery is applied to scanning and inspecting a part, and the inspected part is a bolt, i.e., the preset target object in this application is a bolt. The server performs target object detection on the infrared image captured by the depth camera, detects the position of the bolt in the infrared image, and takes the bolt's contour as the target infrared region. The server then obtains the coordinates of each pixel in the target infrared region and finds the points with the same coordinates in the speckle image corresponding to the infrared image, thereby locating the bolt in the speckle image and obtaining the target speckle region.
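As an illustration of this coordinate reuse, the following Python sketch (OpenCV/NumPy) shows how a contour found in the infrared image can be carried over to the naturally aligned speckle image. The Otsu-threshold detector, the assumption of an 8-bit grayscale infrared image, and all function names are illustrative assumptions, not part of the disclosed method.

```python
import cv2
import numpy as np

def locate_target_speckle_region(ir_image, speckle_image):
    """Return a boolean mask of the target speckle region in the speckle image.

    The infrared image and the speckle image come from the same lens and are
    naturally aligned, so a contour found in the infrared image can be reused
    at the same pixel coordinates in the speckle image.
    """
    # Stand-in detector: Otsu threshold + largest contour.  A real system
    # would use a detector for the preset target object (e.g. the bolt).
    _, binary = cv2.threshold(ir_image, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    target_contour = max(contours, key=cv2.contourArea)

    # The same coordinates delimit the target speckle region.
    mask = np.zeros(speckle_image.shape[:2], dtype=np.uint8)
    cv2.drawContours(mask, [target_contour], -1, 255, thickness=-1)
    return mask.astype(bool)
```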
And 102, generating a plurality of candidate speckle patterns according to the data valid bit of the target speckle area and a preset target data bit selection strategy.
Specifically, after the server determines a target speckle area in a speckle pattern corresponding to the infrared image, a plurality of candidate speckle patterns can be generated according to a data valid bit of the target speckle area and a preset target data bit selection strategy, wherein the target data bits selected by different candidate speckle patterns are different.
In a specific implementation, the data corresponding to each pixel in the target speckle region is a gray value, and the valid data bits of the target speckle region are the valid bits of the binary gray value that the depth camera records for each pixel.
In one example, the target speckle region has 12 valid data bits, and the preset target data bit selection strategy is to select 8 consecutive bits. For instance, if the 12-bit gray value of a pixel is "001010000101", the server selects 8 consecutive bits: bits 1 through 8 give "00101000", bits 2 through 9 give "01010000", bits 3 through 10 give "10100001", bits 4 through 11 give "01000010", and bits 5 through 12 give "10000101". Under this target data bit selection strategy, the server can generate five candidate speckle patterns in total.
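The consecutive 8-bit windows in this example reduce to simple shift-and-mask operations; a minimal Python sketch, assuming the gray value is held as an ordinary integer, is shown below and reproduces the five values above.

```python
def consecutive_8bit_windows(gray_value, valid_bits=12):
    """Enumerate every window of 8 consecutive bits, highest window first."""
    return [(gray_value >> shift) & 0xFF
            for shift in range(valid_bits - 8, -1, -1)]

# 12-bit gray value "001010000101" from the example above.
windows = consecutive_8bit_windows(0b001010000101)
print([format(w, "08b") for w in windows])
# ['00101000', '01010000', '10100001', '01000010', '10000101']
```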
And 103, respectively performing quality scoring on the plurality of candidate speckle patterns based on a preset quality scoring algorithm, and generating a depth map corresponding to the infrared image according to the candidate speckle pattern with the highest score.
Specifically, after the server obtains a plurality of candidate speckle patterns, the server can respectively perform quality scoring on the plurality of candidate speckle patterns based on a preset quality scoring algorithm, and generate a depth map corresponding to the infrared map according to the candidate speckle pattern with the highest score.
In one example, the preset quality scoring algorithm may be a digital image correlation method, i.e., the server may perform quality scoring on each candidate speckle pattern by calculating a second derivative of the average gray value of each candidate speckle pattern.
In another example, the server may quality score each candidate speckle pattern by calculating a gradient of gray value gradients for each candidate speckle pattern.
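As a rough sketch of steps 101 to 103 taken together, the following Python skeleton assumes the target detector, the preset quality scoring algorithm, and the speckle-matching depth recovery are supplied as callables; the rectangular region format, the 10-bit/uint16 storage, and the three 8-bit windows are illustrative assumptions rather than the definitive implementation.

```python
import numpy as np

def recover_depth(ir_image, speckle_raw, reference_raw,
                  detect_target_region, score_quality, match_depth):
    """Illustrative skeleton of steps 101-103.

    detect_target_region, score_quality and match_depth are injected
    stand-ins for the target detector, the preset quality scoring
    algorithm and the speckle-matching depth recovery, respectively.
    """
    # Step 101: the IR and speckle images are aligned, so the rectangle
    # found in the IR image indexes the raw speckle data directly.
    y0, y1, x0, x1 = detect_target_region(ir_image)
    target = speckle_raw[y0:y1, x0:x1]          # e.g. 10-bit data in uint16

    # Step 102: candidate 8-bit speckle patterns from different bit windows
    # (high, middle and low 8 bits of the 10-bit gray values).
    candidates = [((target >> s) & 0xFF).astype(np.uint8) for s in (2, 1, 0)]

    # Step 103: score every candidate and recover depth from the best one,
    # matching against the pre-calibrated reference image.
    best = max(candidates, key=score_quality)
    return match_depth(best, reference_raw)
```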
In this embodiment, the server detects the preset target object in the acquired infrared image to determine a target infrared region, determines the target speckle region in the speckle image corresponding to the infrared image according to that region, generates a plurality of candidate speckle patterns according to the valid data bits of the target speckle region and a preset target data bit selection strategy (different candidate speckle patterns selecting different target data bits), scores the quality of the candidate speckle patterns with a preset quality scoring algorithm, and generates the depth map corresponding to the infrared image from the highest-scoring candidate. Determining the target infrared region and the target speckle region and subsequently operating only on the target speckle region, i.e., the foreground region containing the target object, saves computing resources. Whereas the industry normalizes the data in the speckle image captured by the depth camera to reduce the data volume, this embodiment directly groups the valid data bits of the target speckle region according to the preset target data bit selection strategy to obtain several candidate speckle patterns with different selected target data bits, and finally selects the best-quality candidate for depth recovery. In this way the speckles participating in depth recovery are all high-quality speckles, the accuracy of depth recovery is improved, and the stability and universality of the depth camera under complex illumination conditions are improved.
In one embodiment, the data valid bit of the target speckle area is K bits, K is an integer greater than 2, the preset target data bit selection strategy includes a high-order selection strategy and a low-order selection strategy, the candidate speckle patterns include a high-order candidate speckle pattern and a low-order candidate speckle pattern, and the server generates a plurality of candidate speckle patterns according to the data valid bit of the target speckle area and the preset target data bit selection strategy, which may be implemented through the steps shown in fig. 2, and specifically includes:
step 201, obtaining the gray value of each pixel point in the target speckle area, where the gray value of each pixel point is K-bit data.
Specifically, after the server determines the target speckle region in the speckle pattern corresponding to the infrared image, the server may traverse each pixel point in the target speckle region to obtain a gray value of each pixel point in the target speckle region, where the gray value of each pixel point is binary K-bit data.
Step 202, according to the high-bit selection strategy, select the first L bits of each pixel's gray value as the first gray value of that pixel and generate the high-bit candidate speckle pattern, where L is an integer less than K.
In a specific implementation, according to the high-bit selection strategy the server selects the first L bits, i.e., bits 1 through L, of each pixel's gray value as the first gray value of that pixel; after obtaining the first gray values, the server generates the high-bit candidate speckle pattern from them.
In one example, K is 12, L is 9, and according to the high-order selection policy, the server selects the first 9 bits of data of the gray-scale value of each pixel as the first gray-scale value of each pixel, thereby generating a high-order candidate speckle pattern.
In one example, K is 10, L is 4, and according to the high-order selection policy, the server selects the first 4 bits of data of the gray-scale value of each pixel as the first gray-scale value of each pixel, thereby generating a high-order candidate speckle pattern.
Step 203, according to the low-bit selection strategy, select the last L bits of each pixel's gray value as the second gray value of that pixel and generate the low-bit candidate speckle pattern.
In a specific implementation, according to the low-bit selection strategy the server selects the last L bits, i.e., bits K-L+1 through K, of each pixel's gray value as the second gray value of that pixel; after obtaining the second gray values, the server generates the low-bit candidate speckle pattern from them.
In one example, K is 12 and L is 9; according to the low-bit selection strategy, the server selects the last 9 bits of each pixel's gray value as its second gray value, thereby generating the low-bit candidate speckle pattern.
In another example, K is 10 and L is 4; according to the low-bit selection strategy, the server selects the last 4 bits of each pixel's gray value as its second gray value, thereby generating the low-bit candidate speckle pattern.
In this embodiment, whereas the industry compresses the original K-bit gray value data to L bits to reduce the data volume, a gray value normalization process based on the maximum and minimum gray values of the whole image, here L bits are selected directly from the K-bit data to achieve the same reduction in data volume. This greatly reduces the amount of computation and saves the normalization step, thereby improving both the quality and the efficiency of depth recovery.
In one embodiment, the target speckle region has 10 valid data bits; the preset target data bit selection strategy includes a high 8-bit selection strategy, a middle 8-bit selection strategy, and a low 8-bit selection strategy; and the candidate speckle patterns include a high 8-bit candidate speckle pattern, a middle 8-bit candidate speckle pattern, and a low 8-bit candidate speckle pattern. The server generates the plurality of candidate speckle patterns according to the valid data bits of the target speckle region and the preset target data bit selection strategy, which may be implemented through the steps shown in fig. 3 and specifically includes:
step 301, obtaining a gray value of each pixel point in the target speckle area, wherein the gray value of each pixel point is 10 bits of data.
Specifically, after the server determines the target speckle region in the speckle pattern corresponding to the infrared image, the server may traverse each pixel point in the target speckle region to obtain a gray value of each pixel point in the target speckle region, where the gray value of each pixel point is binary 10-bit data.
Step 302, according to the high 8-bit selection strategy, select the first 8 bits of each pixel's gray value as the third gray value of that pixel, and generate the high 8-bit candidate speckle pattern.
In a specific implementation, according to the high 8-bit selection strategy the server selects the first 8 bits, i.e., bits 1 through 8, of each pixel's gray value as the third gray value of that pixel; after obtaining the third gray values, the server generates the high 8-bit candidate speckle pattern from them.
In one example, the gray value of 10-bit data of a certain pixel in the target speckle region is "0100101101", and the server obtains the third gray value of 8-bit data of the pixel as "01001011" according to the high 8-bit selection strategy.
Step 303, according to the middle 8-bit selection strategy, select bits 2 through 9 of each pixel's gray value as the fourth gray value of that pixel, and generate the middle 8-bit candidate speckle pattern.
In a specific implementation, according to the middle 8-bit selection strategy the server selects bits 2 through 9 of each pixel's gray value as the fourth gray value of that pixel; after obtaining the fourth gray values, the server generates the middle 8-bit candidate speckle pattern from them.
In one example, the 10-bit gray value of a pixel in the target speckle region is "0100101101", and according to the middle 8-bit selection strategy the server obtains the 8-bit fourth gray value of that pixel as "10010110".
Step 304, according to the low 8-bit selection strategy, select the last 8 bits of each pixel's gray value as the fifth gray value of that pixel, and generate the low 8-bit candidate speckle pattern.
In a specific implementation, according to the low 8-bit selection strategy the server selects the last 8 bits, i.e., bits 3 through 10, of each pixel's gray value as the fifth gray value of that pixel; after obtaining the fifth gray values, the server generates the low 8-bit candidate speckle pattern from them.
In one example, the gray value of 10-bit data of a certain pixel point in the target speckle region is "0100101101", and the server obtains the fifth gray value of 8-bit data of the pixel point according to the low 8-bit selection strategy, wherein the fifth gray value is "00101101".
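The three 8-bit selections amount to simple bit shifts when the 10-bit gray values are stored as integers. The NumPy sketch below is illustrative only (array layout and dtype are assumptions) and reproduces the example value "0100101101".

```python
import numpy as np

def candidate_planes_10bit(speckle_region):
    """Split a 10-bit target speckle region (uint16, values 0-1023) into the
    high, middle and low 8-bit candidate speckle patterns."""
    high8 = (speckle_region >> 2) & 0xFF   # bits 1-8
    mid8 = (speckle_region >> 1) & 0xFF    # bits 2-9
    low8 = speckle_region & 0xFF           # bits 3-10
    return (high8.astype(np.uint8), mid8.astype(np.uint8),
            low8.astype(np.uint8))

pixel = np.array([[0b0100101101]], dtype=np.uint16)   # example gray value
print([format(int(p[0, 0]), "08b") for p in candidate_planes_10bit(pixel)])
# ['01001011', '10010110', '00101101']
```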
In this embodiment, the speckle image captured by the depth camera generally has 10 valid data bits, i.e., the gray value of each pixel in the speckle image is generally 10-bit data. To reduce the data volume, the industry converts the 10-bit data into 8-bit data, i.e., normalizes the gray values to 0-255: the traditional approach traverses the target speckle region, finds the maximum and minimum gray values, and normalizes the region with them. However, when the illumination of the shooting environment is too strong or too dark (the depth camera does not know whether the current scene is indoors or outdoors, or whether the illumination is strong or weak), the maximum or minimum gray value in the target speckle region becomes too large or too small, the quality of the normalized target speckle region is poor, and the accuracy of depth recovery drops sharply. The present application abandons converting the 10-bit data into 8-bit data based on the whole-image maximum and minimum gray values and instead generates three different groups of 8-bit data according to the preset high, middle, and low 8-bit selection strategies. In a strongly lit scene the low 8 bits of the 10-bit data are basically all 1, and using the high 8 bits does not make the normalized speckle pattern too bright; in a weakly lit scene the high 8 bits are basically all 0, and using the low 8 bits does not make the normalized speckle pattern too dark. This normalization approach adapts better to scenes of different illumination intensities, greatly reduces the amount of computation, and avoids adding a light-sensing module to the depth camera to detect the current illumination, saving hardware cost.
In an embodiment, the server performs quality scoring on the candidate speckle patterns based on a preset quality scoring algorithm, which may be implemented by the steps shown in fig. 4, and specifically includes:
step 401, traversing a plurality of candidate speckle patterns, and calculating a mean value of gray values of the current candidate speckle patterns according to the gray values of all pixel points in the current candidate speckle patterns.
In a specific implementation, after obtaining a plurality of candidate speckle patterns, the server may traverse the plurality of candidate speckle patterns, obtain the gray value of each pixel in the current candidate speckle pattern, and calculate the mean gray value of the current candidate speckle pattern by dividing the sum of the gray values of each pixel in the current candidate speckle pattern by the total number of pixels in the current candidate speckle pattern.
Step 402, calculating the signal-to-noise ratio of the current candidate speckle pattern according to the speckles and non-speckle points in the current candidate speckle pattern.
In one example, after calculating the mean gray value of the current candidate speckle pattern, the server may determine speckle points and non-speckle points in the current candidate speckle pattern, and calculate the signal-to-noise ratio of the current candidate speckle pattern by dividing the number of speckle points by the number of non-speckle points.
In an example, the server may perform step 401 and then step 402, may perform step 402 and then step 401, or may perform step 401 and step 402 at the same time.
And step 403, calculating the quality score of each candidate speckle pattern according to the gray value average value of each candidate speckle pattern, the signal-to-noise ratio of each candidate speckle pattern and a preset quality scoring algorithm.
Specifically, after the server calculates the mean gray value and the signal-to-noise ratio of the current candidate speckle patterns, the server may calculate the quality score of each candidate speckle pattern according to the mean gray value of each candidate speckle pattern, the signal-to-noise ratio of each candidate speckle pattern, and a preset quality scoring algorithm.
In one example, the server calculates the quality score of each candidate speckle pattern according to the mean gray value of each candidate speckle pattern, the signal-to-noise ratio of each candidate speckle pattern, and a preset quality scoring algorithm, and may be implemented by the following formula:
C = α·A + β·B, where α + β = 1
In the formula, α and β are preset coefficients, A is the mean gray value of the candidate speckle pattern, B is the signal-to-noise ratio of the candidate speckle pattern, and C is the quality score of the candidate speckle pattern; α is generally set to 0.4 and β to 0.6.
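A minimal sketch of this weighted score, assuming A and B have already been computed for a candidate and using the coefficients suggested above, could look as follows; in practice the two terms would likely need to be brought to comparable ranges before weighting, which the text does not specify.

```python
def quality_score(mean_gray, snr, alpha=0.4, beta=0.6):
    """Weighted quality score C = alpha * A + beta * B with alpha + beta = 1,
    where A is the mean gray value and B is the signal-to-noise ratio of a
    candidate speckle pattern (coefficient values as suggested in the text)."""
    assert abs(alpha + beta - 1.0) < 1e-9
    return alpha * mean_gray + beta * snr

# Example with illustrative values: mean gray 120, SNR 1.8.
print(quality_score(120.0, 1.8))   # 0.4 * 120 + 0.6 * 1.8 = 49.08
```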
In this embodiment, considering that the mean gray value of a candidate speckle pattern characterizes its average brightness and its signal-to-noise ratio characterizes how distinct its speckle features are, scoring the candidate speckle patterns on both aspects makes it possible to select the candidate speckle pattern with the highest quality, the most suitable brightness, and the most distinct features, further improving the accuracy of depth recovery.
In an embodiment, the server calculates the signal-to-noise ratio of the current candidate speckle pattern according to the speckle and the non-speckle points in the current candidate speckle pattern, which may be implemented by the steps shown in fig. 5, and specifically includes:
step 501, performing up-sampling on the current candidate speckle pattern to obtain an up-sampled current candidate speckle pattern.
In a specific implementation, in order to find speckle and non-speckle points in the current candidate speckle pattern more clearly and accurately, the server may perform up-sampling on the current candidate speckle pattern by 4 times, that is, amplify the current candidate speckle pattern by 4 times, so as to obtain the up-sampled current candidate speckle pattern.
Step 502, determining speckle point areas and non-speckle point areas of the current candidate speckle pattern according to the up-sampled current candidate speckle pattern and a preset speckle point area extraction method.
In a specific implementation, the preset speckle point region extraction method includes Gaussian blur, edge extraction, ellipse fitting, and ellipse detection: the server applies these operations to the up-sampled current candidate speckle pattern to find the elliptical speckle points in it, where the speckle point regions lie inside the ellipses and the non-speckle regions lie outside them.
And step 503, respectively calculating the gray value average value of the speckle point area and the gray value average value of the non-speckle point area in the current candidate speckle pattern.
And step 504, calculating the ratio of the gray value average value of the speckle point area to the gray value average value of the non-speckle point area, and taking the ratio as the signal-to-noise ratio of the current candidate speckle pattern.
In specific implementation, after the server determines the speckle point area and the non-speckle point area of the current candidate speckle pattern, the server can respectively calculate the gray value average value of the speckle point area and the gray value average value of the non-speckle point area in the current candidate speckle pattern, then calculate the ratio between the gray value average value of the speckle point area and the gray value average value of the non-speckle point area, and take the ratio as the signal-to-noise ratio of the current candidate speckle pattern.
In the embodiment, considering that the speckle point areas are relatively dispersed and each speckle point area is relatively small, the server can perform up-sampling on the current candidate speckle pattern, namely amplify the current candidate speckle pattern, so that the speckle point area of the current candidate speckle pattern can be detected more accurately, and the accuracy and the reliability of the calculated signal-to-noise ratio are improved.
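A hedged OpenCV/NumPy sketch of steps 501 to 504 is given below. The upsampling factor, Gaussian kernel size, and Canny thresholds are illustrative assumptions, and ellipse detection is approximated here by fitting ellipses to the extracted edge contours; a production pipeline might tune or replace any of these steps.

```python
import cv2
import numpy as np

def speckle_snr(candidate, scale=4):
    """Ratio of mean gray inside the detected speckle points to mean gray
    outside them, computed on the up-sampled candidate pattern."""
    up = cv2.resize(candidate, None, fx=scale, fy=scale,
                    interpolation=cv2.INTER_CUBIC)            # step 501
    blurred = cv2.GaussianBlur(up, (5, 5), 0)                 # Gaussian blur
    edges = cv2.Canny(blurred, 50, 150)                       # edge extraction
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    speckle_mask = np.zeros_like(up, dtype=np.uint8)
    for contour in contours:
        if len(contour) >= 5:                                 # ellipse fitting
            cv2.ellipse(speckle_mask, cv2.fitEllipse(contour), 255, -1)

    inside = up[speckle_mask > 0]                             # step 503
    outside = up[speckle_mask == 0]
    if inside.size == 0 or outside.size == 0:
        return 0.0
    return float(inside.mean() / max(outside.mean(), 1e-6))   # step 504
```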
Another embodiment of the present application relates to a depth recovery method. Implementation details of the depth recovery method of this embodiment are described below; they are provided only to facilitate understanding and are not necessary for implementing the present solution. A specific flow of the depth recovery method of this embodiment may be as shown in fig. 6 and includes:
step 601, determining a target infrared area in the obtained infrared image, and determining a target speckle area in a speckle image corresponding to the infrared image according to the target infrared area.
Step 602, generating a plurality of candidate speckle patterns according to the data valid bit of the target speckle region and a preset target data bit selection strategy.
Here, steps 601 to 602 are substantially the same as steps 101 to 102, and are not described herein again.
Step 603, traversing a plurality of candidate speckle patterns, and calculating the mean value of the gray values of the current candidate speckle patterns according to the gray values of all the pixel points in the current candidate speckle patterns.
Step 603 is substantially the same as step 401, and is not described herein again.
Step 604, determining whether the mean gray value of the current candidate speckle pattern is within a preset gray value range, if so, executing step 605, otherwise, directly executing step 608.
In a specific implementation, a trigger condition is set before quality scoring: the mean gray value of a candidate speckle pattern must lie within a preset gray value range, within which the brightness of the candidate speckle pattern is reasonable, neither too bright nor too dark. After calculating the mean gray value of the current candidate speckle pattern, the server judges whether it falls within the preset range; if so, quality scoring of the current candidate speckle pattern continues, and if not, the current candidate speckle pattern is discarded directly, which further saves computing resources and effectively improves the speed of depth recovery.
In one example, the predetermined gray-level value range may be [80, 150], that is, the candidate speckle patterns with the gray-level value average value greater than or equal to 80 and less than or equal to 150 are retained, and the candidate speckle patterns with the gray-level value average value less than 80 or greater than 150 are discarded.
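A minimal sketch of this pre-scoring gate, assuming the example range [80, 150] and 8-bit candidate patterns held as NumPy arrays, might be:

```python
def passes_brightness_gate(candidate, lo=80, hi=150):
    """Step 604 gate: keep a candidate speckle pattern only if its mean gray
    value falls within the preset range (example range [80, 150])."""
    mean_gray = float(candidate.mean())
    return lo <= mean_gray <= hi

# Usage sketch: `candidates` is an assumed list of candidate speckle patterns.
# kept = [c for c in candidates if passes_brightness_gate(c)]
```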
Step 605, calculating the signal-to-noise ratio of the current candidate speckle pattern according to the speckle and the non-speckle points in the current candidate speckle pattern.
And 606, calculating the quality score of each candidate speckle pattern according to the gray value average value of each candidate speckle pattern, the signal-to-noise ratio of each candidate speckle pattern and a preset quality scoring algorithm.
Steps 605 to 606 are substantially the same as steps 402 to 403, and are not described herein again.
And step 607, generating a depth map corresponding to the infrared map according to the candidate speckle map with the highest score.
Step 607 is substantially the same as step 103, and is not repeated herein.
At step 608, the current candidate speckle pattern is discarded directly.
In one embodiment, the preset target object is a human face, the server determines a target infrared region in the acquired infrared image, and determines a target speckle region in a speckle pattern corresponding to the infrared image according to the target infrared region, which may be implemented by the steps shown in fig. 7, and specifically includes:
and 701, performing face detection on the acquired infrared image, determining a face contour in the infrared image, and taking an external rectangle of the face contour in the infrared image as a target infrared region.
In a specific implementation, the server first performs face detection on the acquired infrared image and determines the face contour in it; since the face contour is not a regular shape and is inconvenient to operate on, its circumscribed rectangle is taken as the target infrared region.
And step 702, determining the homonymous points of all the vertexes of the target infrared region in the speckle pattern corresponding to the infrared pattern according to the coordinates of all the vertexes of the target infrared region.
And 703, connecting the homonymous points of the vertexes of the target infrared regions to obtain a target speckle region.
In a specific implementation, the target infrared region is a regular rectangle, so its position in the image is fixed once the coordinates of its four vertices are determined, without needing the coordinates of the interior points. The server determines the homonymous points of the vertices of the target infrared region in the speckle image corresponding to the infrared image according to the vertex coordinates, and connects these homonymous points, which likewise determines a rectangle in the speckle image; this rectangular region in the speckle image is the target speckle region.
In this embodiment, since the face contour is not a regular shape, when the preset target object is a face the circumscribed rectangle of the face contour in the infrared image is selected as the target infrared region, which facilitates subsequent quality evaluation and depth recovery.
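A sketch of steps 701 to 703 is given below, assuming an externally supplied face detector that returns a face contour; the detector, its output format, and the direct slicing of the aligned speckle image are assumptions made for illustration.

```python
import cv2

def face_speckle_region(ir_image, speckle_image, face_detector):
    """Steps 701-703 (sketch): detect the face contour in the infrared image,
    take its circumscribed rectangle as the target infrared region, and reuse
    the same four vertices in the aligned speckle image."""
    contour = face_detector(ir_image)            # step 701: face contour
    x, y, w, h = cv2.boundingRect(contour)       # circumscribed rectangle
    # Steps 702-703: the images are aligned, so the homonymous points of the
    # four vertices delimit the same rectangle in the speckle image.
    return speckle_image[y:y + h, x:x + w]
```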
The steps of the above methods are divided only for clarity of description; in implementation they may be combined into one step or some steps may be split into multiple steps, and all such variants fall within the protection scope of this patent as long as they include the same logical relationship. Adding insignificant modifications to an algorithm or a process, or introducing insignificant designs, without changing the core design of the algorithm and the process, is also within the protection scope of this patent.
Another embodiment of the present application relates to a depth recovery device. Implementation details of the depth recovery device of this embodiment are described below; they are provided only to facilitate understanding and are not necessary for implementing the present solution. A schematic diagram of the depth recovery device of this embodiment may be as shown in fig. 8 and includes:
the positioning module 801 is configured to determine a target infrared region in the acquired infrared image, and determine a target speckle region in a speckle pattern corresponding to the infrared image according to the target infrared region, where the target infrared region includes a preset target object;
a grouping module 802, configured to generate a plurality of candidate speckle patterns according to a data valid bit of a target speckle region and a preset target data bit selection policy, where target data bits selected by different candidate speckle patterns are different;
a scoring module 803, configured to perform quality scoring on the multiple candidate speckle patterns respectively based on a preset quality scoring algorithm;
and the execution module 804 is used for generating a depth map corresponding to the infrared map according to the candidate speckle pattern with the highest score.
It should be noted that all the modules involved in this embodiment are logical modules. In practical applications, a logical unit may be a physical unit, part of a physical unit, or a combination of multiple physical units. In addition, in order to highlight the innovative part of the present application, units that are not closely related to solving the technical problem proposed by the present application are not introduced in this embodiment, but this does not mean that no other units exist in this embodiment.
Another embodiment of the present application relates to an electronic device, as shown in fig. 9, including: at least one processor 901; and a memory 902 communicatively connected to the at least one processor 901; wherein the memory 902 stores instructions executable by the at least one processor 901, and the instructions, when executed by the at least one processor 901, enable the at least one processor 901 to perform the depth recovery method in the foregoing embodiments.
The memory and the processor are connected by a bus. The bus may comprise any number of interconnected buses and bridges that link the various circuits of the one or more processors and the memory. The bus may also connect various other circuits, such as peripherals, voltage regulators and power management circuits, which are well known in the art and therefore are not described further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be a single element or a plurality of elements, such as multiple receivers and transmitters, and provides a unit for communicating with various other apparatus over a transmission medium. Data processed by the processor is transmitted over a wireless medium via an antenna, and the antenna also receives data and forwards it to the processor.
The processor is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. While the memory may be used to store data used by the processor in performing operations.
Another embodiment of the present application relates to a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements the method of the above embodiments.
That is, as can be understood by those skilled in the art, all or part of the steps of the methods in the above embodiments may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions for causing a device (which may be a microcontroller, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples of implementations of the present application and that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (11)

1. A method of depth recovery, comprising:
determining a target infrared region in the acquired infrared image, and determining a target speckle region in a speckle pattern corresponding to the infrared image according to the target infrared region; the target infrared region comprises a preset target object;
generating a plurality of candidate speckle patterns according to a data valid bit of the target speckle region and a preset target data bit selection strategy; wherein the target data bits selected by different candidate speckle patterns are different;
and respectively carrying out quality scoring on the plurality of candidate speckle patterns based on a preset quality scoring algorithm, and generating a depth map corresponding to the infrared image according to the candidate speckle pattern with the highest score.
2. The depth recovery method according to claim 1, wherein the data valid bits of the target speckle region are K bits, K being an integer greater than 2, the preset target data bit selection strategy comprises a high-bit selection strategy and a low-bit selection strategy, and the candidate speckle patterns comprise a high-bit candidate speckle pattern and a low-bit candidate speckle pattern;
and the generating a plurality of candidate speckle patterns according to the data valid bit of the target speckle region and a preset target data bit selection strategy comprises:
acquiring the gray value of each pixel point in the target speckle region, wherein the gray value is K-bit data;
selecting, according to the high-bit selection strategy, the first L bits of the gray value of each pixel point as a first gray value of the pixel point to generate the high-bit candidate speckle pattern; wherein L is an integer less than K;
and selecting, according to the low-bit selection strategy, the last L bits of the gray value of each pixel point as a second gray value of the pixel point to generate the low-bit candidate speckle pattern.
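For illustration, a minimal numpy sketch of the high-bit/low-bit selection of claim 2, assuming the K-bit gray values are stored right-aligned in an unsigned integer array; the function name and output dtypes are assumptions.

```python
import numpy as np

def high_low_candidates(target_speckle_region: np.ndarray, k: int, l: int):
    """Generate the high-bit and low-bit candidate speckle patterns from a
    region of K-bit gray values, with L < K.

    High-bit strategy: keep the first (most significant) L bits.
    Low-bit strategy:  keep the last (least significant) L bits."""
    g = target_speckle_region.astype(np.uint32)
    mask = (1 << l) - 1
    high = (g >> (k - l)) & mask   # first L bits -> first gray value
    low = g & mask                 # last L bits  -> second gray value
    return high.astype(np.uint16), low.astype(np.uint16)
```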
3. The depth recovery method according to any one of claims 1 to 2, wherein the data valid bit of the target speckle region is 10 bits, the preset target data bit selection strategy comprises a high 8-bit selection strategy, a middle 8-bit selection strategy and a low 8-bit selection strategy, and the candidate speckle patterns comprise a high 8-bit candidate speckle pattern, a middle 8-bit candidate speckle pattern and a low 8-bit candidate speckle pattern;
and the generating a plurality of candidate speckle patterns according to the data valid bit of the target speckle region and a preset target data bit selection strategy comprises:
acquiring the gray value of each pixel point in the target speckle region, wherein the gray value is 10-bit data;
selecting, according to the high 8-bit selection strategy, the first 8 bits of the gray value of each pixel point as a third gray value of the pixel point to generate the high 8-bit candidate speckle pattern;
selecting, according to the middle 8-bit selection strategy, the 2nd to 9th bits of the gray value of each pixel point as a fourth gray value of the pixel point to generate the middle 8-bit candidate speckle pattern;
and selecting, according to the low 8-bit selection strategy, the last 8 bits of the gray value of each pixel point as a fifth gray value of the pixel point to generate the low 8-bit candidate speckle pattern.
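Concretely, the 10-bit case of claim 3 reduces to three overlapping 8-bit windows; a minimal sketch (bit positions counted from the most significant bit, as in the claim):

```python
import numpy as np

def candidates_from_10bit(target_speckle_region: np.ndarray):
    """High 8-bit, middle 8-bit and low 8-bit candidate speckle patterns
    from 10-bit gray values."""
    g = target_speckle_region.astype(np.uint16)
    high8 = (g >> 2) & 0xFF   # bits 1-8  -> third gray value
    mid8 = (g >> 1) & 0xFF    # bits 2-9  -> fourth gray value
    low8 = g & 0xFF           # bits 3-10 -> fifth gray value
    return high8.astype(np.uint8), mid8.astype(np.uint8), low8.astype(np.uint8)
```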
4. The depth recovery method according to any one of claims 1 to 2, wherein the performing quality scoring on the plurality of candidate speckle patterns based on a preset quality scoring algorithm comprises:
traversing the plurality of candidate speckle patterns, and calculating a gray value mean value of the current candidate speckle pattern according to the gray values of all pixel points in the current candidate speckle pattern;
calculating the signal-to-noise ratio of the current candidate speckle pattern according to speckle points and non-speckle points in the current candidate speckle pattern;
and calculating the quality score of each candidate speckle pattern according to the gray value mean value of each candidate speckle pattern, the signal-to-noise ratio of each candidate speckle pattern and a preset quality scoring algorithm.
5. The method of claim 4, wherein the calculating the signal-to-noise ratio of the current candidate speckle pattern based on speckle and non-speckle points in the current candidate speckle pattern comprises:
up-sampling the current candidate speckle pattern to obtain an up-sampled current candidate speckle pattern;
determining speckle point areas and non-speckle point areas of the current candidate speckle pattern according to the up-sampled current candidate speckle pattern and a preset speckle point area extraction method; the preset speckle point region extraction method comprises Gaussian blur, edge extraction, ellipse fitting and ellipse detection;
respectively calculating the gray value average value of the speckle point region and the gray value average value of the non-speckle point region in the current candidate speckle pattern;
and calculating the ratio of the gray value average value of the speckle point region to the gray value average value of the non-speckle point region, and taking the ratio as the signal-to-noise ratio of the current candidate speckle pattern.
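A hedged OpenCV sketch of the signal-to-noise ratio of claim 5 follows; the upsampling factor, the blur kernel, the Canny thresholds and the ellipse size filter standing in for the "ellipse detection" step are all illustrative assumptions.

```python
import cv2
import numpy as np

def speckle_snr(candidate: np.ndarray, scale: int = 2) -> float:
    """Signal-to-noise ratio of a candidate speckle pattern: mean gray value
    of the speckle point region divided by that of the non-speckle point
    region, with the speckle mask found on an upsampled copy via Gaussian
    blur, edge extraction, ellipse fitting and a size-based ellipse check."""
    img = candidate.astype(np.uint8)
    up = cv2.resize(img, None, fx=scale, fy=scale, interpolation=cv2.INTER_LINEAR)
    blurred = cv2.GaussianBlur(up, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    mask_up = np.zeros_like(up)
    for cnt in contours:
        if len(cnt) < 5:                    # cv2.fitEllipse needs >= 5 points
            continue
        ellipse = cv2.fitEllipse(cnt)
        (w, h) = ellipse[1]
        if 0 < w < 20 * scale and 0 < h < 20 * scale:   # keep speckle-sized blobs only
            cv2.ellipse(mask_up, ellipse, 255, thickness=-1)

    # Bring the mask back to the candidate's resolution and split the regions.
    mask = cv2.resize(mask_up, (img.shape[1], img.shape[0]),
                      interpolation=cv2.INTER_NEAREST)
    speckle = img[mask > 0]
    non_speckle = img[mask == 0]
    if speckle.size == 0 or non_speckle.size == 0 or non_speckle.mean() == 0:
        return 0.0
    return float(speckle.mean() / non_speckle.mean())
```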
6. The method of claim 4, wherein before the calculating the signal-to-noise ratio of the current candidate speckle pattern according to speckle points and non-speckle points in the current candidate speckle pattern, the method further comprises:
judging whether the gray value mean value of the current candidate speckle pattern is within a preset gray value range;
if the gray value mean value of the current candidate speckle pattern is within the preset gray value range, calculating the signal-to-noise ratio of the current candidate speckle pattern according to the speckle points and non-speckle points in the current candidate speckle pattern;
and if the gray value mean value of the current candidate speckle pattern is outside the preset gray value range, directly discarding the current candidate speckle pattern.
7. The method of claim 4, wherein the quality score of each candidate speckle pattern is calculated according to a mean gray value of each candidate speckle pattern, a signal-to-noise ratio of each candidate speckle pattern, and a preset quality scoring algorithm by the following formula:
C = α*A + β*B, where α + β = 1
wherein α and β are preset coefficients, A is the gray value mean value of the candidate speckle pattern, B is the signal-to-noise ratio of the candidate speckle pattern, and C is the quality score of the candidate speckle pattern.
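A minimal sketch combining the discard rule of claim 6 with the weighted sum of claim 7; the concrete gray value range, the value of α and the lack of any normalization between A and B are assumptions — the claims only fix C = α*A + β*B with α + β = 1.

```python
def quality_score(candidate, snr, alpha=0.5, gray_range=(20.0, 230.0)):
    """Return C = alpha * A + beta * B, or None if the candidate is discarded.

    candidate: candidate speckle pattern as a numpy array.
    A: gray value mean value of the candidate speckle pattern.
    B: signal-to-noise ratio of the candidate (e.g. from speckle_snr above)."""
    beta = 1.0 - alpha
    mean_gray = float(candidate.mean())                 # A
    if not (gray_range[0] <= mean_gray <= gray_range[1]):
        return None                                     # discard (claim 6)
    return alpha * mean_gray + beta * snr               # C (claim 7)
```

The candidate speckle pattern with the highest C is then the one used to generate the depth map corresponding to the infrared image.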
8. The depth recovery method according to any one of claims 1 to 2, wherein the preset target object is a human face, and the determining a target infrared region in the acquired infrared image and determining a target speckle region in a speckle image corresponding to the infrared image according to the target infrared region comprises:
carrying out face detection on the obtained infrared image, determining a face contour in the infrared image, and taking a circumscribed rectangle of the face contour as the target infrared region;
determining, according to the coordinates of each vertex of the target infrared region, the homonymous point of each vertex in the speckle image corresponding to the infrared image;
and connecting the homonymous points of the vertexes to obtain the target speckle region.
9. A depth recovery device, the device comprising:
the positioning module is used for determining a target infrared region in the acquired infrared image and determining a target speckle region in a speckle pattern corresponding to the infrared image according to the target infrared region, wherein the target infrared region comprises a preset target object;
the grouping module is used for generating a plurality of candidate speckle patterns according to the data valid bit of the target speckle region and a preset target data bit selection strategy, wherein the target data bits selected by different candidate speckle patterns are different;
the scoring module is used for respectively performing quality scoring on the candidate speckle patterns based on a preset quality scoring algorithm;
and the execution module is used for generating a depth map corresponding to the infrared image according to the candidate speckle pattern with the highest score.
10. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of depth recovery of any one of claims 1 to 8.
11. A computer-readable storage medium storing a computer program, wherein the computer program is configured to implement the depth recovery method according to any one of claims 1 to 8 when executed by a processor.
CN202210456562.1A 2022-04-24 2022-04-24 Depth recovery method and device, electronic equipment and computer readable storage medium Pending CN114926519A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210456562.1A CN114926519A (en) 2022-04-24 2022-04-24 Depth recovery method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210456562.1A CN114926519A (en) 2022-04-24 2022-04-24 Depth recovery method and device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114926519A true CN114926519A (en) 2022-08-19

Family

ID=82807569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210456562.1A Pending CN114926519A (en) 2022-04-24 2022-04-24 Depth recovery method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114926519A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination