CN114283089B - Jump acceleration based depth recovery method, electronic device, and storage medium - Google Patents

Jump acceleration based depth recovery method, electronic device, and storage medium

Info

Publication number
CN114283089B
CN114283089B (application CN202111603078.9A)
Authority
CN
China
Prior art keywords
parallax
value
disparity
seed point
speckle pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111603078.9A
Other languages
Chinese (zh)
Other versions
CN114283089A (en)
Inventor
李东洋
化雪诚
王海彬
刘祺昌
户磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Dilusense Technology Co Ltd
Original Assignee
Hefei Dilusense Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Dilusense Technology Co Ltd filed Critical Hefei Dilusense Technology Co Ltd
Priority to CN202111603078.9A
Publication of CN114283089A
Application granted
Publication of CN114283089B
Legal status: Active
Anticipated expiration

Abstract

The embodiment of the invention relates to the field of image processing and discloses a depth recovery method based on jump acceleration, an electronic device, and a storage medium. The method includes the following steps: for the preprocessed object speckle pattern and reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern, together with a first parallax search range corresponding to each candidate seed point; for each candidate seed point, selecting a plurality of parallax values by jumping within the corresponding first parallax search range to perform a parallax search, determining whether the candidate seed point is a seed point based on the matching cost value obtained for each parallax value, and obtaining the parallax value of the seed point; determining the parallax values between the object speckle pattern and the reference speckle pattern from the seed points and their parallax values by a region growing method; and recovering depth information based on the parallax values between the object speckle pattern and the reference speckle pattern. This scheme accelerates the depth recovery process while effectively preserving the accuracy of the recovered image.

Description

Jump acceleration based depth recovery method, electronic device, and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a depth recovery method based on jump acceleration, an electronic device, and a storage medium.
Background
At present, the most active technical branch in the field of machine vision belongs to the depth perception technology, and the speckle structure light technology is an important part in the depth perception technology. The speckle structured light technology is used as the most common active stereo vision technology and has wide application in the fields of face recognition, automatic driving, security monitoring and the like. The speckle structured light system projects pseudo-random speckles to a shot object, and then performs characteristic matching of the speckles according to a specific algorithm to obtain parallax information, so as to further obtain depth information of a scene.
However, brute-force speckle matching is computationally intensive and time consuming, and the industry has adopted a number of different techniques to address this problem. For example, image binarization with the Hamming distance as the similarity measure during matching greatly reduces computation and time consumption, but its accuracy is generally lower than that of methods based on local gray normalization (LCN) of the image and the zero-mean normalized cross-correlation (ZNCC) measure. Alternatively, the approximate depth range can be estimated in some way so as to narrow the parallax search range, which reduces time consumption but is constrained by the business scenario. Neural-network approaches are also used, but they are limited by the amount of data required for training and by memory constraints on model size, and as black boxes they are not easily interpreted.
Disclosure of Invention
An object of embodiments of the present invention is to provide a depth recovery method based on jump acceleration, an electronic device, and a storage medium, which can accelerate a depth recovery process on the basis of effectively ensuring the accuracy of a recovered image.
In order to solve the above technical problem, an embodiment of the present invention provides a depth recovery method based on jump acceleration, including:
aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point;
for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point;
determining the disparity value of the object speckle pattern and the reference speckle pattern by using the seed points and the disparity values thereof and adopting a region growing method;
recovering depth information based on disparity values of the object speckle pattern and the reference speckle pattern.
An embodiment of the present invention also provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a jump acceleration based depth restoration method as described above.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements a jump acceleration based depth recovery method as described above.
Compared with the prior art, embodiments of the present invention select, for the preprocessed object speckle pattern and reference speckle pattern, a plurality of candidate seed points from the object speckle pattern together with a first parallax search range corresponding to each candidate seed point; for each candidate seed point, a plurality of parallax values are selected by jumping within the corresponding first parallax search range to perform a parallax search in the preprocessed reference speckle pattern, whether the candidate seed point is a seed point is determined based on the matching cost value obtained by the parallax search for each parallax value, and the parallax value of the seed point is obtained; the parallax values of the object speckle pattern and the reference speckle pattern are then determined from the seed points and their parallax values by a region growing method; and the depth information is recovered based on the parallax values of the object speckle pattern and the reference speckle pattern. In this scheme, when the large-range parallax matching of a candidate seed point is performed, matching is not searched continuously over the preset parallax search range; instead, the matching search jumps within the range at a certain interval. After a seed point is successfully searched and determined, it is grown until the image region is fully grown, the parallax values between the object speckle pattern and the reference speckle pattern are determined, and the depth information of the object is recovered based on the parallax map, so that the depth recovery process is accelerated while the accuracy of the recovered image is effectively ensured.
Drawings
Fig. 1 is a specific flowchart one of a depth recovery method based on jump acceleration according to an embodiment of the present invention;
FIG. 2 is a diagram of a candidate seed point selection according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a triangulation principle to calculate depth according to an embodiment of the invention;
FIG. 4 is a specific flowchart II of a depth recovery method based on jump acceleration according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments in order to provide a better understanding of the present application; however, the technical solutions claimed in the present application can be implemented without these technical details and with various changes and modifications based on the following embodiments.
The invention relates to a depth recovery method based on jump acceleration, which is suitable for an image processing scene for recovering a depth image of a target object by using a speckle pattern of the target object. As shown in fig. 1, the depth recovery method based on jump acceleration provided by this embodiment includes the following steps.
Step 101: and aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point.
Specifically, a speckle pattern of a target object can be captured by a structured light camera (referred to simply as "camera") as an object speckle pattern; the reference speckle pattern is a planar speckle pattern of known distance. The object speckle pattern and the reference speckle pattern are preprocessed to improve the light-dark contrast and the brightness balance effect of the speckle.
In one example, pre-processing the object speckle pattern and the reference speckle pattern may include: and sequentially carrying out histogram equalization processing and local binarization processing on the object speckle pattern and the reference speckle pattern.
Specifically, histogram equalization processing is performed on the speckle pattern (gray scale pattern) to equalize the brightness value of the image, so as to avoid the situation that the image is too bright or too dark. The specific operation process is as follows:
For the speckle image, count the number of pixels m_i having each pixel value i in the image, i ∈ [0, 255]. If the speckle image resolution is R (the number of pixels included in the image), the processed pixel value i' corresponding to pixel value i is calculated by formula (1):

i' = (255 / R) × Σ_{j=0}^{i} m_j, rounded to an integer        (1)
Then, local binarization processing is carried out on the speckle image after histogram equalization processing. Let P be the pixel point at any coordinate (x, y) on the speckle image, with gray value G(x, y); take a neighborhood window (of size k × k) centered on the pixel point P, and calculate the average value avg and the standard deviation std of the gray levels in the neighborhood window (the maximum standard deviation over the image is denoted stdmax), so that the binarization threshold is given by formula (2):

Thre(x, y) = avg × (1 + Delta × (std / stdmax - 1))        (2)

wherein Thre(x, y) is the local binarization threshold of the pixel P, and Delta is a hyper-parameter with value range [-1, 1].
Binarization is then carried out through formula (3) to obtain the binarized image m(x, y):

m(x, y) = 1 if G(x, y) ≥ Thre(x, y), and m(x, y) = 0 otherwise.        (3)
The local binarization processing method can be self-adaptive to the image brightness and contrast in the speckle image, and compared with a fixed binarization threshold or an average threshold, the tolerance to the image quality of the speckle image and the robustness of an algorithm are increased.
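As an illustration of this preprocessing stage, the following is a minimal NumPy sketch of histogram equalization (formula (1)) followed by local binarization (formulas (2) and (3)). The window size k, the hyper-parameter delta, and the use of scipy's uniform_filter for the neighborhood statistics are illustrative choices, not values prescribed by the patent.

import numpy as np
from scipy.ndimage import uniform_filter

def equalize_histogram(img):
    """Formula (1): map each pixel value i of a uint8 image to 255 * CDF(i)."""
    hist = np.bincount(img.ravel(), minlength=256)   # m_i, the count of pixels with value i
    cdf = np.cumsum(hist)                            # running sum of m_j for j <= i
    lut = (255.0 * cdf / img.size).astype(np.uint8)  # R = img.size, the number of pixels
    return lut[img]

def local_binarize(img, k=11, delta=0.1):
    """Formulas (2)-(3): adaptive threshold from the k x k neighborhood of each pixel."""
    g = img.astype(np.float64)
    avg = uniform_filter(g, size=k)                                  # neighborhood mean
    var = np.maximum(uniform_filter(g * g, size=k) - avg * avg, 0.0)
    std = np.sqrt(var)                                               # neighborhood standard deviation
    std_max = std.max() if std.max() > 0 else 1.0                    # stdmax over the image
    thre = avg * (1.0 + delta * (std / std_max - 1.0))               # local threshold Thre(x, y)
    return (g >= thre).astype(np.uint8)                              # binarized image m(x, y)

def preprocess(speckle_img):
    """Histogram equalization followed by local binarization, as described above."""
    return local_binarize(equalize_histogram(speckle_img))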
For the preprocessed object speckle pattern and the reference speckle pattern, a plurality of candidate seed points can be selected from the object speckle pattern, and a first parallax search range corresponding to each candidate seed point.
Specifically, the principle of depth recovery using the region growing algorithm is to consider that the depth of the scene has a certain continuity, which is equivalent to the cost aggregation part in the depth recovery process. Therefore, a plurality of pixel points are selected from the object speckle pattern as candidate seed points (such as solid points in fig. 2) according to a certain interval grid mode, and a queue of the candidate seed points is formed. And selecting candidate seed points in the queue in sequence, and determining an initial parallax search range, namely a first parallax search range, for each candidate seed point so as to perform parallax search. The first parallax search range here may be a full range or a partially continuous range of parallax search.
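As a small sketch of this selection step, candidate seed points can be taken on a regular grid over the object speckle pattern; the grid spacing and the full-range disparity interval below are illustrative assumptions rather than values from the patent.

def select_candidate_seeds(height, width, step=16, d_min=-64, d_max=64):
    """Return ((x, y), first disparity search range) for each grid point."""
    return [((x, y), (d_min, d_max))
            for y in range(step // 2, height, step)
            for x in range(step // 2, width, step)]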
Step 102: and for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point.
Specifically, the conventional disparity search for candidate seed points is a full-range search. To alleviate the excessive amount of calculation at this stage, this embodiment performs the disparity search in a jump-matching manner: a plurality of disparity values are selected by jumping within the initially determined first disparity search range, and the disparity search is performed in the preprocessed reference speckle pattern. The jump selection may use a fixed jump step or a variable jump step, where a fixed jump step has a step value greater than 1. In other words, only part of the disparity values in the first disparity search range are selected for the disparity search. After the plurality of disparity values are selected from the corresponding first disparity search range for each candidate seed point, local image block matching can be performed between the matching points corresponding to these disparity values in the preprocessed reference speckle pattern and the corresponding candidate seed point, so as to obtain the matching cost value corresponding to each disparity value. Finally, whether a candidate seed point is a seed point is determined from the matching cost values corresponding to its disparity values, and when it is determined to be a seed point, its disparity value is obtained. When determining whether a candidate seed point is a seed point from these matching cost values, the smaller the matching cost values are, the more likely the candidate seed point is to be determined as a seed point. After a candidate seed point is determined to be a seed point, its disparity value may be chosen from the selected disparity values based on their matching cost values, or may be further calculated from those disparity values.
Step 103: and determining the parallax value of the object speckle pattern and the reference speckle pattern by using the seed points and the parallax value thereof and adopting a region growing method.
Specifically, if a candidate seed point is successfully judged to be a seed point and its disparity value is obtained, the growth stage of the seed point is entered. Growth proceeds around the seed point: for each neighborhood point around the seed point, a disparity search over [-2, 2] is performed around the same disparity value as that of the seed point, and the matching method and cost value calculation are the same as those used for the original seed point. If the matching cost value is smaller than the growth threshold, a matched disparity value is considered to be found; in this case, the disparity value corresponding to that matching cost value may be used directly as the disparity value of the current matching point, or its sub-pixel disparity may be calculated with reference to formula (5) as the final disparity value. The current neighborhood point is then taken as a new seed point and region growing is performed around it, so as to further obtain the disparity values of its own neighborhood points. If the matching cost value is not smaller than the growth threshold, processing moves on to the next neighborhood point of the seed point.
For each seed point, the disparities of its neighborhood points are searched iteratively by the region growing method, and the disparity values of the pixel points between the object speckle pattern and the reference speckle pattern are finally determined, forming a disparity map.
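The following is a sketch of this region-growing stage under simplifying assumptions: cost(x, y, d) stands in for the same matching-cost function used for the seed points, growth_threshold is a hypothetical parameter, and the [-2, 2] neighborhood search with breadth-first growth follows the description above.

from collections import deque
import numpy as np

def grow_region(seeds, cost, height, width, growth_threshold=20):
    """seeds: iterable of ((x, y), disparity) pairs for the accepted seed points."""
    disparity = np.full((height, width), np.nan, dtype=np.float32)
    queue = deque()
    for (x, y), d in seeds:
        disparity[y, x] = d
        queue.append((x, y))
    while queue:
        x, y = queue.popleft()
        d0 = int(round(disparity[y, x]))
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < width and 0 <= ny < height and np.isnan(disparity[ny, nx]):
                # search [-2, 2] around the parent's disparity value
                best_cost, best_d = min((cost(nx, ny, d0 + dd), d0 + dd) for dd in range(-2, 3))
                if best_cost < growth_threshold:
                    disparity[ny, nx] = best_d     # or refine to sub-pixel precision via formula (5)
                    queue.append((nx, ny))         # the neighborhood point becomes a new seed
    return disparity                               # disparity map; NaN where growth did not reach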
Step 104: and recovering the depth information based on the parallax values of the object speckle pattern and the reference speckle pattern.
Specifically, after the image growth is completed, the depth Z is calculated according to the triangulation principle shown in fig. 3 by using the parallax values d of all the pixel points, and the calculation formula is as follows:
Z = (f × l × z0) / (f × l + z0 × d)        (4)

wherein z0 is the reference plane distance in mm, and f and l are the calibrated camera focal length and baseline distance, respectively.
After the depth map is obtained, post-processing, such as median filtering, may be performed on the depth map to remove redundant noise and output a high-precision depth map.
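A short sketch of these last two steps is given below. The exact form of the triangulation formula used in disparity_to_depth, including the sign of the disparity term, is a reconstruction based on the reference-plane setup described above, and the median-filter post-processing with scipy is an illustrative choice.

import numpy as np
from scipy.ndimage import median_filter

def disparity_to_depth(disparity, z0_mm, focal_px, baseline_mm):
    """Depth from disparity, with z0_mm the reference plane distance in mm."""
    return (focal_px * baseline_mm * z0_mm) / (focal_px * baseline_mm + z0_mm * disparity)

def postprocess(depth_map):
    """Median-filter the recovered depth map to suppress isolated noise."""
    return median_filter(depth_map, size=3)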
Compared with the prior art, the embodiment of the invention selects, for the preprocessed object speckle pattern and reference speckle pattern, a plurality of candidate seed points from the object speckle pattern together with a first parallax search range corresponding to each candidate seed point; for each candidate seed point, a plurality of parallax values are selected by jumping within the corresponding first parallax search range to perform a parallax search in the preprocessed reference speckle pattern, whether the candidate seed point is a seed point is determined based on the matching cost value obtained by the parallax search for each parallax value, and the parallax value of the seed point is obtained; the parallax values of the object speckle pattern and the reference speckle pattern are determined from the seed points and their parallax values by a region growing method; and the depth information is recovered based on the parallax values of the object speckle pattern and the reference speckle pattern. When the large-range parallax matching of a candidate seed point is performed, matching is not searched continuously over the preset parallax search range; instead, the matching search jumps within the range at a certain interval. After a seed point is successfully searched and determined, it is grown until the growth of the image region is finished, the parallax values between the object speckle pattern and the reference speckle pattern are determined, and the depth information of the object is then recovered based on the parallax map, so that the depth recovery process is accelerated while the accuracy of the recovered image is effectively ensured.
Another embodiment of the present invention relates to a jump-acceleration-based depth recovery method. As shown in fig. 4, it refines the steps of the method shown in fig. 1 by detailing the process of determining a seed point and obtaining the disparity value of the seed point. As shown in fig. 4, step 102 may include the following sub-steps.
Substep 1021: for the current candidate seed point, jumping and selecting a plurality of parallax values from any end boundary value in the corresponding first parallax search range along the direction towards the other end boundary value; or, a plurality of disparity values are selected in a jumping manner along the direction towards the two end boundary values from the position with the disparity value of 0 in the corresponding first disparity search range.
Specifically, after the first disparity search range corresponding to the current candidate seed point, e.g. [-d_l, d_r], is determined, a plurality of disparity values can be selected from [-d_l, d_r] by jumping in different ways. For example, a plurality of disparity values may be selected by jumping from either end boundary value of the first disparity search range in the direction toward the other end boundary value: taking -d_l as the first selected disparity value and then extending toward d_r with a fixed or variable jump step, or taking d_r as the first selected disparity value and then extending toward -d_l with a fixed or variable jump step, yields the plurality of disparity values selected from [-d_l, d_r]. For another example, a plurality of disparity values may be selected by jumping from the position where the disparity value is 0 in the first disparity search range toward the two end boundary values: taking disparity value 0 as the first selected disparity value and then extending toward -d_l and d_r in both directions with a fixed or variable jump step yields the plurality of disparity values selected from [-d_l, d_r]. Here the range endpoints satisfy -d_l < 0 < d_r.
In one example, the jump step size used when the jump selects the plurality of disparity values is the radius of the speckle.
Specifically, the principle of the jump matching employed in this embodiment is that, when the reference image block slides relative to the object image block along the disparity direction for matching, once it comes within a small range of the ground-truth (GT, i.e. real) disparity, a range comparable to the speckle radius, the matching degree within that range is relatively good. Using this property, if a point with a high matching degree is found by jump searching, the position at or near that point can be considered the best disparity position. Therefore, in this embodiment the jump step is set to the radius of the speckle. For example, when the speckle diameter is 3 to 5 pixels, the jump step is set to 2.
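A sketch of generating the jumped disparity candidates for one candidate seed point is shown below; it covers both selection orders described in sub-step 1021 (starting from one end boundary of the range, or outward from disparity 0), with the step s set to the speckle radius. The function name and default values are illustrative.

def jump_candidates(d_min, d_max, s=2, from_center=True):
    """d_min < 0 < d_max is the first disparity search range, s is the jump step."""
    if not from_center:
        return list(range(d_min, d_max + 1, s))    # from one end boundary toward the other
    cands, k = [0], 1                              # start at disparity 0, extend toward both ends
    while -k * s >= d_min or k * s <= d_max:
        if -k * s >= d_min:
            cands.append(-k * s)
        if k * s <= d_max:
            cands.append(k * s)
        k += 1
    return cands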
Substep 1022: and performing parallax search on the selected multiple parallax values in the preprocessed reference speckle pattern, and determining the matching cost value corresponding to each parallax value.
Specifically, after a plurality of disparity values are selected for each candidate seed point, disparity search may be performed on the selected plurality of disparity values in the preprocessed reference speckle pattern, and a matching cost value corresponding to each disparity value is determined. In this embodiment, the matching cost algorithm in the parallax search is not limited.
In one example, a neighborhood window may be utilized to calculate a hamming distance between the object image block and the reference image block corresponding to each disparity value, and the hamming distance is used as a matching cost value corresponding to the corresponding disparity value.
Specifically, in the information theory, hamming Distance (Hamming Distance) represents the number of different characters in corresponding positions of two character strings of equal length. In this embodiment, the hamming distance between the object image block and the reference image block is defined as the number of pixel points with different gray values at the same position in the object image block and the reference image block, and the larger the hamming distance is, the more the number of pixel points with different gray values at the same position is, the worse the matching degree between the object image block and the reference image block is. The Hamming distance between the object image block and the reference image block corresponding to each parallax value is used as the matching cost value corresponding to the corresponding parallax value, so that the matching degree between the two image blocks under the parallax value can be represented, and the quality of the parallax value can be evaluated.
For example, the candidate seed points in the queue are selected in order. Denote the coordinates of the current seed point as (x_p, y_p), and let the object image block of window length w centered on this point be I_w(x, y); the coordinate at matching disparity position d is (x_p + d, y_p), and the reference image block of window length w centered on that point is J_w(x, y); the disparity value d ∈ [-d_l, d_r]. Dividing the length of the disparity range by the jump step s gives the number of jumps, and hence the disparity value selected at each jump. The Hamming distance between the object image block and the reference image block at this disparity (i.e. the number of differing points within the two windows) is calculated with the XOR operator in bit operations.
In one example, in order to reduce the amount of calculation, the preprocessed object speckle pattern and reference speckle pattern may be compressed in advance into integer images whose size is 1/32 of the original images (the locally binarized speckle images). Accordingly, in step 1022, the object image block and the reference image block corresponding to each disparity value can be mapped to their corresponding integer image blocks in the integer images, the XOR operation is performed on those integer blocks to obtain the Hamming distance, and the Hamming distance is used as the matching cost value corresponding to the disparity value.
The advantage of this processing is that a general integer is 32 bits while a binarized value is Boolean and occupies only 1 bit, so the Hamming distance can subsequently be computed through integer XOR operations with roughly 1/32 of the computation; this is also why the local binarization method enables fast depth recovery.
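The sketch below illustrates this bit-level idea: after local binarization each pixel is a single bit, so the Hamming distance of two windows reduces to XOR plus a bit count. It is a simplified per-window illustration; the patent pre-packs whole integer images of 1/32 the size, whereas here np.packbits is applied per window, and the window size w is an illustrative value.

import numpy as np

def window_hamming_cost(win_obj, win_ref):
    """Hamming distance between two equal-size 0/1 windows via XOR and popcount."""
    a = np.packbits(win_obj.astype(np.uint8).ravel())   # pack 8 binary pixels per byte
    b = np.packbits(win_ref.astype(np.uint8).ravel())
    return int(np.unpackbits(a ^ b).sum())               # count differing bits

def matching_cost(obj_bin, ref_bin, x, y, d, w=11):
    """Matching cost of candidate disparity d for object pixel (x, y), window w x w."""
    r = w // 2
    h, width = obj_bin.shape
    if (x - r < 0 or y - r < 0 or x + r >= width or y + r >= h
            or x + d - r < 0 or x + d + r >= width):
        return np.iinfo(np.int32).max                    # window falls outside the image
    win_obj = obj_bin[y - r:y + r + 1, x - r:x + r + 1]
    win_ref = ref_bin[y - r:y + r + 1, x + d - r:x + d + r + 1]
    return window_hamming_cost(win_obj, win_ref)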
Substep 1023: and determining whether the current candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value, and acquiring the parallax value of the seed point.
In one example, this sub-step 1023 can be achieved by the following steps.
The method comprises the following steps: determining a second parallax search range based on the parallax value with the minimum matching cost value in the parallax values, and performing continuous parallax search on the second parallax search range in the preprocessed reference speckle pattern to obtain a plurality of first matching cost values of the current candidate seed points; the length of the second disparity search range is smaller than the length of the first disparity search range.
Specifically, after the matching cost values corresponding to the disparity values selected in the jumping manner are determined for the current candidate seed point, the disparity value with the smallest matching cost value can be selected from these disparity values, and the second disparity search range is determined based on this disparity value. The length of the second disparity search range is smaller than the length of the first disparity search range. For example, the second disparity search range may be [-s, s] superimposed on that minimum-cost disparity value, where s is the jump step used when selecting the plurality of disparity values, such as the radius of the speckle. For the second disparity search range, a continuous disparity search is performed in the preprocessed reference speckle pattern to obtain the matching cost value corresponding to each disparity d of the current candidate seed point within the second disparity search range. To distinguish them from the matching cost values previously determined in the first disparity search range, the matching cost values determined continuously in the second disparity search range are denoted first matching cost values.
In this embodiment, the first matching cost value of the parallax in the second parallax search range is calculated by using a hamming distance method, which is the same as the method for calculating the matching cost value of the parallax in the first parallax search range.
Step two: and if the minimum value of the first matching cost values is smaller than the set threshold value, taking the current candidate seed point as a seed point, and taking the parallax value corresponding to the minimum value of the first matching cost values as the parallax value of the seed point.
Specifically, when the minimum first matching cost value obtained by performing disparity search in the reference speckle pattern is smaller than the set threshold in the second disparity search range, it is determined that a better disparity value corresponding to the current candidate seed point is searched in the current disparity search range. At this time, the current candidate seed point may be directly determined as a seed point, and the disparity value of the seed point with the minimum first matching cost value in the current disparity search range is used as the disparity value of the seed point.
Step three: if the minimum value of the plurality of first matching cost values is not less than the matching threshold, the current candidate seed point is discarded.
Specifically, when the minimum first matching cost value obtained by performing the disparity search in the reference speckle pattern is not less than the set threshold in the second disparity search range, it is determined that the superior disparity value corresponding to the current candidate seed point is not searched in the second disparity search range. At this time, the matching degree of the candidate seed point itself may not be high, and the current candidate seed point is determined to be failed, and then the determining process of other candidate seed points may be performed from sub-step 1021.
In one example, when the condition of step two is satisfied, i.e. the minimum value of the plurality of first matching cost values is smaller than the set threshold, the original disparity may be replaced by a calculated sub-pixel level disparity so as to increase the disparity accuracy. The processing includes the following steps:
Step four: for the disparity value d corresponding to the minimum value of the plurality of first matching cost values, determine the matching cost value C_{d-1} of the adjacent disparity value d-1 and the matching cost value C_{d+1} of the adjacent disparity value d+1.
Specifically, the matching cost values C_{d-1} and C_{d+1} are calculated in the same way as C_d, i.e. from the Hamming distance, and the details are not repeated here.
Step five: calculating a sub-pixel level parallax d 'of the parallax value d by adopting the following formula, and replacing the parallax value d with the sub-pixel level parallax d' as the parallax value of the seed point:
d' = d + (L - R) / (2 × (L + R))        (5)

wherein L = C_{d-1} - C_d and R = C_{d+1} - C_d.

Specifically, for a candidate seed point (x, y) determined to be a seed point, whose minimum first matching cost value C_d is obtained at disparity value d, the matching cost values C_{d-1} and C_{d+1} corresponding to the two adjacent disparity values d-1 and d+1 are calculated. Setting L = C_{d-1} - C_d and R = C_{d+1} - C_d, the sub-pixel disparity d' of the disparity value d is calculated by formula (5). Finally, the sub-pixel disparity d' replaces the disparity value d as the disparity value of the current seed point.
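The following sketch combines the narrow second search range around the best jumped disparity with the acceptance test and the sub-pixel refinement of formula (5). It reuses the matching_cost sketch from the previous example; the threshold value is a hypothetical parameter and the handling of a flat cost curve (L + R = 0) is an added guard.

def refine_seed(obj_bin, ref_bin, x, y, coarse_best_d, s=2, threshold=20):
    """Continuous search in [coarse_best_d - s, coarse_best_d + s], then sub-pixel refinement."""
    costs = {d: matching_cost(obj_bin, ref_bin, x, y, d)
             for d in range(coarse_best_d - s, coarse_best_d + s + 1)}
    d = min(costs, key=costs.get)                       # disparity with the smallest first matching cost
    if costs[d] >= threshold:
        return None                                     # candidate seed point is discarded
    c_d = costs[d]
    c_l = costs.get(d - 1, matching_cost(obj_bin, ref_bin, x, y, d - 1))
    c_r = costs.get(d + 1, matching_cost(obj_bin, ref_bin, x, y, d + 1))
    L, R = c_l - c_d, c_r - c_d                         # as defined for formula (5)
    if L + R == 0:
        return float(d)
    return d + (L - R) / (2.0 * (L + R))                # sub-pixel disparity d'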
Compared with the related art, the embodiment skips and selects a plurality of disparity values from any boundary value in the corresponding first disparity search range in the direction towards the boundary value at the other end of the corresponding first disparity search range for the current candidate seed point; or, jumping and selecting a plurality of parallax values along the direction towards the boundary values at the two ends from the position where the parallax value is 0 in the corresponding first parallax search range; performing parallax search on the selected multiple parallax values in the preprocessed reference speckle pattern, and determining the matching cost value corresponding to each parallax value; and determining whether the current candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value, and acquiring the parallax value of the seed point, so that the seed point and the parallax value of the seed point are determined quickly and accurately based on the mode of selecting the parallax value to be matched based on jumping.
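Putting the pieces together, a sketch of the complete seed-point search for one candidate seed point could look as follows, chaining the jumped coarse search and the narrow refinement. The function names refer to the earlier sketches and are illustrative, not the patent's reference implementation.

def search_seed(obj_bin, ref_bin, x, y, d_min=-64, d_max=64, s=2, threshold=20):
    """Return the (sub-pixel) disparity of the seed point, or None if the candidate is rejected."""
    coarse = [(matching_cost(obj_bin, ref_bin, x, y, d), d)
              for d in jump_candidates(d_min, d_max, s)]
    _, coarse_best_d = min(coarse)                     # disparity with the smallest jumped matching cost
    return refine_seed(obj_bin, ref_bin, x, y, coarse_best_d, s, threshold)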
Another embodiment of the invention relates to an electronic device, as shown in FIG. 5, comprising at least one processor 202; and a memory 201 communicatively coupled to the at least one processor 202; wherein the memory 201 stores instructions executable by the at least one processor 202, the instructions being executable by the at least one processor 202 to enable the at least one processor 202 to perform any of the method embodiments described above.
Where the memory 201 and the processor 202 are coupled in a bus, the bus may comprise any number of interconnected buses and bridges that couple one or more of the various circuits of the processor 202 and the memory 201 together. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor 202 is transmitted over a wireless medium through an antenna, which further receives the data and transmits the data to the processor 202.
The processor 202 is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And memory 201 may be used to store data used by processor 202 in performing operations.
Another embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes any of the above-described method embodiments when executed by a processor.
That is, as can be understood by those skilled in the art, all or part of the steps in the method for implementing the embodiments described above may be implemented by a program instructing related hardware, where the program is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (5)

1. A depth recovery method based on jump acceleration is characterized by comprising the following steps:
aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point;
for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point; the jump step length adopted when the plurality of parallax values are selected in the jumping is the radius of the scattered spots;
determining the parallax value of the object speckle pattern and the reference speckle pattern by using the seed points and the parallax values thereof and adopting a region growing method;
recovering depth information based on disparity values of the object speckle pattern and the reference speckle pattern;
pre-processing the object speckle pattern and the reference speckle pattern, comprising:
sequentially carrying out histogram equalization processing and local binarization processing on the object speckle pattern and the reference speckle pattern;
the step of, for each candidate seed point, skipping and selecting a plurality of disparity values from the corresponding first disparity search range to perform disparity search in the reference speckle pattern, and determining whether the candidate seed point is a seed point based on a matching cost value corresponding to each disparity value obtained by the disparity search, and obtaining a disparity value of the seed point, includes:
for the current candidate seed point, jumping and selecting a plurality of disparity values from any end boundary value in the corresponding first disparity search range along the direction towards the other end boundary value; or, jumping and selecting a plurality of parallax values along the direction towards the boundary values at the two ends from the position where the parallax value is 0 in the corresponding first parallax search range;
performing parallax search on the selected multiple parallax values in the preprocessed reference speckle pattern, and determining a matching cost value corresponding to each parallax value;
determining whether the current candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value, and acquiring the parallax value of the seed point;
performing disparity search on the selected multiple disparity values in the preprocessed reference speckle pattern to determine a matching cost value corresponding to each disparity value, wherein the method comprises the following steps:
calculating the Hamming distance between the object image block and the reference image block corresponding to each parallax value by using a neighborhood window, and taking the Hamming distance as the matching cost value corresponding to the corresponding parallax value;
performing image compression on the preprocessed object speckle pattern and the reference speckle pattern to obtain integer images whose size is 1/32 of the original images;
the calculating a hamming distance between the object image block and the reference image block corresponding to each parallax value by using the neighborhood window, and taking the hamming distance as a matching cost value corresponding to the parallax value, includes:
and carrying out XOR operation on the object image blocks and the reference image blocks corresponding to the parallax values in the integer image blocks corresponding to the integer image block to obtain the Hamming distance, and taking the Hamming distance as the matching cost value corresponding to the parallax values.
2. The method of claim 1, wherein the determining whether the current candidate seed point is a seed point based on the matching cost value corresponding to each of the disparity values, and obtaining the disparity value of the seed point comprises:
determining a second parallax search range based on the parallax value with the minimum matching cost value in each parallax value, and performing continuous parallax search on the second parallax search range in the preprocessed reference speckle pattern to obtain a plurality of first matching cost values of the current candidate seed points; the length of the second parallax search range is smaller than the length of the first parallax search range;
if the minimum value in the first matching cost values is smaller than a set threshold value, taking the current candidate seed point as a seed point, and taking the parallax value corresponding to the minimum value of the first matching cost values as the parallax value of the seed point;
discarding the current candidate seed point if a minimum value of the plurality of first matching cost values is not less than a matching threshold.
3. The method of claim 2, wherein when a minimum value of the plurality of first matching cost values is less than a set threshold, the method further comprises:
for the disparity value d corresponding to the minimum value of the plurality of first matching cost values, determining the matching cost value C_{d-1} of the adjacent disparity value d-1 and the matching cost value C_{d+1} of the adjacent disparity value d+1;
calculating a sub-pixel level disparity d' of the disparity value d by the following formula, and replacing the disparity value d with the sub-pixel level disparity d' as the disparity value of the seed point:

d' = d + (L - R) / (2 × (L + R))

wherein L = C_{d-1} - C_d and R = C_{d+1} - C_d.
4. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a jump acceleration based depth restoration method as claimed in any one of claims 1 to 3.
5. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the jump acceleration based depth recovery method according to any one of claims 1 to 3.
CN202111603078.9A 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium Active CN114283089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111603078.9A CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111603078.9A CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN114283089A CN114283089A (en) 2022-04-05
CN114283089B true CN114283089B (en) 2023-01-31

Family

ID=80875334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111603078.9A Active CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN114283089B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820393A (en) * 2022-06-28 2022-07-29 合肥的卢深视科技有限公司 Depth recovery method for fusion hole repair, electronic device and storage medium
CN115423808B (en) * 2022-11-04 2023-03-24 合肥的卢深视科技有限公司 Quality detection method for speckle projector, electronic device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112070819A (en) * 2020-11-11 2020-12-11 湖南极点智能科技有限公司 Face depth image construction method and device based on embedded system
CN112771573A (en) * 2019-04-12 2021-05-07 深圳市汇顶科技股份有限公司 Depth estimation method and device based on speckle images and face recognition system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102903096B (en) * 2012-07-04 2015-06-17 北京航空航天大学 Monocular video based object depth extraction method
CN104268871A (en) * 2014-09-23 2015-01-07 清华大学 Method and device for depth estimation based on near-infrared laser speckles
CN108734776B (en) * 2018-05-23 2022-03-25 四川川大智胜软件股份有限公司 Speckle-based three-dimensional face reconstruction method and equipment
US11176694B2 (en) * 2018-10-19 2021-11-16 Samsung Electronics Co., Ltd Method and apparatus for active depth sensing and calibration method thereof
CN111325782A (en) * 2020-02-18 2020-06-23 南京航空航天大学 Unsupervised monocular view depth estimation method based on multi-scale unification
CN111402313B (en) * 2020-03-13 2022-11-04 合肥的卢深视科技有限公司 Image depth recovery method and device
CN113674335B (en) * 2021-08-19 2022-05-31 合肥的卢深视科技有限公司 Depth imaging method, electronic device and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112771573A (en) * 2019-04-12 2021-05-07 深圳市汇顶科技股份有限公司 Depth estimation method and device based on speckle images and face recognition system
CN112070819A (en) * 2020-11-11 2020-12-11 湖南极点智能科技有限公司 Face depth image construction method and device based on embedded system

Also Published As

Publication number Publication date
CN114283089A (en) 2022-04-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220601

Address after: 230091 room 611-217, R & D center building, China (Hefei) international intelligent voice Industrial Park, 3333 Xiyou Road, high tech Zone, Hefei, Anhui Province

Applicant after: Hefei lushenshi Technology Co.,Ltd.

Address before: 100083 room 3032, North B, bungalow, building 2, A5 Xueyuan Road, Haidian District, Beijing

Applicant before: BEIJING DILUSENSE TECHNOLOGY CO.,LTD.

Applicant before: Hefei lushenshi Technology Co.,Ltd.

GR01 Patent grant