CN114119693A - iToF depth data acquisition method and device

Info

Publication number: CN114119693A (application CN202111230208.9A)
Authority: CN (China)
Prior art keywords: phase difference, obtaining, target pixel, depth data, target
Legal status: Pending (assumption, not a legal conclusion)
Inventors: 田指南, 燕宇
Current and original assignee: Kunshan Q Technology Co Ltd
Application filed by Kunshan Q Technology Co Ltd on 2021-10-22; priority to CN202111230208.9A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an iToF depth data acquisition method and device. The method acquires a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in a target depth frame; then obtains a grayscale image corresponding to the target depth frame according to the four phase differences; obtains a plurality of target pixel coordinates according to the brightness of the grayscale image; and finally obtains final depth data according to the target pixel coordinates and the four phase differences corresponding to each target pixel coordinate. Because the final depth data is restricted to the target pixel coordinates, noise data is filtered out, which ensures the stability and accuracy of the final depth data.

Description

iToF depth data acquisition method and device
Technical Field
The invention relates to the technical field of computer vision, in particular to an iToF depth data acquisition method and device.
Background
With the development of imaging and camera technology, depth cameras have been developed to acquire depth information. The iToF (Indirect Time of Flight) depth camera is widely used in the field of depth information acquisition; it obtains depth indirectly through a phase-shift method. For example, the iToF depth camera emits a modulated infrared light signal into a scene, a sensor receives the light signal reflected by the object under measurement, and the phase difference between the emitted and received signals is calculated from the charges accumulated during the exposure (integration) time, from which the depth of the target object is obtained. At present, depth cameras based on a Spot light source are affected by severe noise when calculating depth, so the resulting depth data has poor stability and low accuracy.
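To make the phase-shift principle above concrete, here is a minimal sketch of the commonly used four-bucket depth relation. It is an illustration only, not the method of this invention; the bucket names q0..q3, the atan2 sign convention and the 100 MHz modulation frequency are assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(q0, q1, q2, q3, f_mod=100e6):
    """Depth from four bucket samples taken at 0/90/180/270 degrees.

    q0..q3 are accumulated charges; f_mod is a hypothetical modulation
    frequency. Sign conventions vary between sensors.
    """
    phase = math.atan2(q3 - q1, q0 - q2)  # wrapped phase difference
    phase %= 2.0 * math.pi                # map into [0, 2*pi)
    return C * phase / (4.0 * math.pi * f_mod)  # meters
```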
Disclosure of Invention
In view of the above problems, the invention provides an iToF depth data acquisition method and apparatus, which can accurately denoise the iToF depth data and improve the stability and accuracy of the depth data.
In a first aspect, the present application provides the following technical solutions through an embodiment:
an iToF depth data acquisition method, comprising:
acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in a target depth frame; obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference; obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image; and obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
Optionally, obtaining the plurality of target pixel coordinates according to the brightness of the grayscale image includes:
obtaining area outlines of a plurality of bright areas according to the brightness of the grayscale image; and obtaining the target pixel coordinates of each bright area according to the area outline of each bright area.
Optionally, obtaining the area outlines of the plurality of bright areas according to the brightness of the grayscale image includes:
extracting the outlines of the bright areas from the grayscale image according to a preset contour extraction function to obtain the area outline of each bright area.
Optionally, the contour extraction function is a Canny function.
Optionally, obtaining the target pixel coordinates of each bright area according to the area outline of each bright area includes:
obtaining a circumscribed rectangle of each area outline according to the area outline of each bright area; and, for each circumscribed rectangle, obtaining the target pixel coordinate according to the maximum gray value within the circumscribed rectangle.
Optionally, obtaining the target pixel coordinate according to the maximum gray value within the circumscribed rectangle includes:
acquiring all gray values within the circumscribed rectangle; sorting all the gray values numerically to obtain the maximum gray value; and obtaining the target pixel coordinate according to the maximum gray value.
Optionally, obtaining the grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference includes:
obtaining a confidence of each pixel according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference; and converting the confidence of each pixel into a gray value to obtain the grayscale image, wherein the greater the confidence, the greater the gray value.
In a second aspect, based on the same inventive concept, the present application provides the following technical solutions through an embodiment:
an iToF depth data acquisition apparatus comprising:
the acquisition module is used for acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in the target depth frame; the first processing module is used for obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference; the filtering module is used for obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image; and the second processing module is used for obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
In a third aspect, based on the same inventive concept, the present application provides the following technical solutions through an embodiment:
an electronic device comprising a processor and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform the steps of the method of any of the first aspects above.
In a fourth aspect, based on the same inventive concept, the present application provides the following technical solutions through an embodiment:
a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of any of the first aspects.
According to the iToF depth data acquisition method and device provided by the embodiments of the invention, the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each pixel in the target depth frame are acquired; a grayscale image corresponding to the target depth frame is then obtained according to the four phase differences; a plurality of target pixel coordinates are obtained according to the brightness of the grayscale image; and finally, the final depth data is obtained according to the target pixel coordinates and the four phase differences corresponding to each target pixel coordinate. The final depth data thus undergoes filtering by the target pixel coordinates, which ensures its stability and accuracy.
The foregoing is only an overview of the technical solutions of the present invention. The embodiments of the present invention are described below so that the technical means of the invention can be understood more clearly, and so that the above and other objects, features and advantages of the invention become more readily apparent.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The following drawings show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort. In the drawings:
fig. 1 is a flowchart illustrating an iToF depth data acquisition method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram showing the light source points in the first embodiment of the present invention;
FIG. 3 is a diagram illustrating a depth data distribution before being optimized according to a first embodiment of the present invention;
FIG. 4 is a diagram illustrating a depth data distribution after being optimized according to a first embodiment of the present invention;
fig. 5 shows a functional block schematic diagram of an iToF depth data acquisition apparatus according to a second embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The iToF depth data acquisition method and device of the present invention can be used to acquire depth data from the depth cameras of devices such as mobile phones, computers, cameras and distance measuring equipment, where the depth camera is based on a Spot light source. In particular, the method can be applied to dedicated image processing chips or processors in such devices. For example, a notebook computer may carry a depth camera; when the depth camera detects whether a target user is in front of the notebook and performs depth calculation while capturing that user, the method of the present invention may be used to acquire the depth data corresponding to the target user. To make the concept of the method and device of the present invention more comprehensible, they are described in more detail below by way of specific examples.
Referring to fig. 1, a first embodiment of the present invention provides an iToF depth data acquisition method, whose flowchart is shown in fig. 1. The iToF depth data acquisition method comprises the following steps:
step S10: acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in a target depth frame;
step S20: obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference;
step S30: obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image;
step S40: obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
In the iToF depth data acquisition method of this embodiment, a grayscale image corresponding to the target depth frame is obtained from the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each pixel in the target depth frame, as shown in fig. 2. The target pixel coordinates are then found through the grayscale image; each target pixel coordinate is the coordinate point with the maximum brightness and can accurately represent a light source point. Noise data in the regions outside the target pixel coordinates is therefore filtered out, and only the depth data corresponding to the target pixel coordinates is finally obtained as the final depth data, which ensures the stability and accuracy of the final depth data.
The following respectively further describes possible implementation manners and implementation details of each step in this embodiment.
Step S10: acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in the target depth frame.
In step S10, the target depth frame contains image data captured by the depth camera at four different phases. During shooting, the transmitting end of the depth camera emits a light signal, such as an infrared light signal; the receiving end of the depth camera then receives the reflected infrared light signal, and the exposure produces photoelectric signals. The first phase difference, the second phase difference, the third phase difference and the fourth phase difference may correspond to phase delays of 0°, 90°, 180° and 270°, respectively.
Step S20: obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference.
In step S20, in some specific implementations, the process of acquiring the grayscale image may be as follows:
First, the confidence of each pixel is obtained from the first phase difference, the second phase difference, the third phase difference and the fourth phase difference. The confidence may be computed as confidence = abs(D0 - D2) + abs(D1 - D3), where confidence denotes the confidence, abs() denotes the absolute value, D0 denotes the first phase difference, D1 the second phase difference, D2 the third phase difference, and D3 the fourth phase difference. The confidence of each pixel is then converted into a gray value to obtain the grayscale image; the greater the confidence, the greater the gray value. Specifically, the confidence of each pixel may be normalized and then mapped to [0, 255] to obtain the gray value of each pixel, from which the grayscale image is constructed. For example, the confidence may be normalized to the [0, 1] interval by an existing normalization method, and the normalized value then mapped proportionally to the [0, 255] interval (for example, scaled up by a factor of 255), thereby mapping confidence to gray value.
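A minimal sketch of this conversion, assuming D0..D3 are available as floating-point arrays of shape (H, W); the min-max normalization below is one possible choice of the "existing normalization method" mentioned above.

```python
import numpy as np

def confidence_to_gray(d0, d1, d2, d3):
    """confidence = abs(D0 - D2) + abs(D1 - D3), normalized to [0, 1]
    and scaled to the gray range [0, 255]."""
    conf = np.abs(d0 - d2) + np.abs(d1 - d3)
    span = conf.max() - conf.min()
    conf = (conf - conf.min()) / span if span > 0 else np.zeros_like(conf)
    return (conf * 255.0).astype(np.uint8)  # larger confidence -> larger gray value
```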
Step S30: obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image.
In step S30, one possible implementation is as follows:
First, the area outlines of a plurality of bright areas may be obtained according to the brightness of the grayscale image. Specifically, a preset contour extraction function may be used to extract the outlines of the bright areas from the grayscale image, yielding the area outline of each bright area. A bright area is a white dot area in the grayscale image, and one grayscale image contains a plurality of such white dot areas, as shown in fig. 2. The resulting area outline may be circular, elliptical or some other irregular polygonal shape.
The contour extraction function includes, but is not limited to, the Canny, Roberts, Sobel and Prewitt functions, any of which can perform edge detection on the white dot areas to identify the area outlines. In this embodiment the Canny function may be used, as shown in the sketch below; it keeps the extraction less susceptible to noise interference, with a low error rate and good edge localization.
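A sketch of this extraction using OpenCV, whose cv2.Canny and cv2.findContours functions implement the Canny detector and contour retrieval; the thresholds (50, 150) and the retrieval mode are assumptions, not values from the patent.

```python
import cv2

def extract_bright_area_contours(gray):
    """Edge-detect the grayscale image and return one contour per bright area."""
    edges = cv2.Canny(gray, 50, 150)              # assumed thresholds
    contours, _hierarchy = cv2.findContours(
        edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours
```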
Then, the target pixel coordinates of each bright area are obtained from its area outline. Within an area outline there are still many pixels, and obtaining the final depth data from every pixel in the outline could still be affected by data from the regions surrounding the light source points, leaving the depth data inaccurate. Determining target pixel coordinates from the area outline therefore filters noise data further and improves the stability and reliability of the final depth data.
In a specific implementation, the target pixel coordinates may be obtained as follows (see the sketch after this paragraph). From the area outline of each bright area, a circumscribed rectangle of each outline can be obtained; this takes the pixels of the bright area and its immediate surroundings into account as much as possible, helps include white-point pixels that were missed during contour identification, and improves the accuracy and reliability of the final depth data. The circumscribed rectangle in this embodiment should be understood to encompass a square. Then, for each circumscribed rectangle, the target pixel coordinate is obtained from the maximum gray value within it. The maximum gray value in the circumscribed rectangle can be found by sorting: first, all gray values within the circumscribed rectangle are acquired; then they are sorted numerically, in ascending or descending order, using a method such as bubble sort, insertion sort, quick sort or selection sort, which yields the maximum gray value. In this embodiment bubble sort may be adopted, which is efficient enough for the small number of gray values involved. Finally, the target pixel coordinate is obtained from the maximum gray value: it is the coordinate of the pixel with the maximum gray value, and each bright area has at least one target pixel coordinate.
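A sketch of this step under the same OpenCV assumptions: cv2.boundingRect supplies the circumscribed rectangle, and np.argmax stands in for the explicit sorting described above (both yield the maximum gray value).

```python
import cv2
import numpy as np

def target_pixel_coordinates(gray, contours):
    """One target pixel per bright area: the pixel with the maximum gray
    value inside the circumscribed rectangle of the area outline."""
    coords = []
    for cnt in contours:
        x, y, w, h = cv2.boundingRect(cnt)            # circumscribed rectangle
        roi = gray[y:y + h, x:x + w]
        dy, dx = np.unravel_index(np.argmax(roi), roi.shape)
        coords.append((x + dx, y + dy))               # (column, row) of the maximum
    return coords
```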
Step S40: obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
Calculating depth data from the first phase difference, the second phase difference, the third phase difference and the fourth phase difference is an existing technique; for example, a 4-sampling-bucket algorithm can be used, so the detailed depth calculation is not described in this embodiment. In step S40, only the depth data corresponding to the target pixel coordinates is obtained, which constitutes the final depth data. Unstable data outside the target pixel coordinates is thus no longer contained in the final depth data, and noise data is eliminated.
In addition, in this embodiment, when step S20 is executed, the depth data corresponding to all pixels can be acquired synchronously and recorded as the initial depth data. Since no pixel coordinates have been filtered from the initial depth data, it still contains much unstable depth data inside and outside the bright areas. When step S40 is executed, the depth data corresponding to the target pixel coordinates can then be extracted directly from the initial depth data, as sketched below. In this way, no depth calculation needs to be performed during step S40 itself; the depth calculation can run during steps S20 and S30, shortening the execution time of step S40 and improving efficiency.
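A minimal sketch of this extraction, assuming initial_depth is the per-pixel depth map already computed during steps S20 and S30 (for example with a four-bucket algorithm) and target_coords is the list produced in step S30.

```python
def final_depth(initial_depth, target_coords):
    """Final depth data: the initial depth map sampled only at the
    target pixel coordinates; everything else is discarded as noise."""
    return {(x, y): float(initial_depth[y, x]) for (x, y) in target_coords}
```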
It should be noted that, in practical applications, the final depth data corresponding to a plurality of depth frames may be obtained, and their average then taken as the depth data actually used.
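A sketch of this multi-frame averaging, assuming each frame yields a dict mapping target pixel coordinates to depth values with the same set of keys.

```python
import numpy as np

def average_over_frames(per_frame_depth):
    """Average each target pixel's final depth across several depth frames."""
    keys = per_frame_depth[0].keys()
    return {k: float(np.mean([f[k] for f in per_frame_depth])) for k in keys}
```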
The technical effect of this embodiment is verified by the following example:
When depth data is acquired with the prior-art scheme, the initial depth data is calculated directly from the first phase difference, the second phase difference, the third phase difference and the fourth phase difference; no noise data is filtered, so the initial depth data contains many noise points and has poor stability, as shown in fig. 3. When the final depth data is acquired with the iToF depth data acquisition method provided by this embodiment, the data is filtered, and the retained data has high accuracy and good stability, as shown in fig. 4.
In summary, in the iToF depth data acquisition method provided by this embodiment, the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each pixel in the target depth frame are acquired; a grayscale image corresponding to the target depth frame is then obtained according to the four phase differences; a plurality of target pixel coordinates are obtained according to the brightness of the grayscale image; and finally, the final depth data is obtained according to the target pixel coordinates and the four phase differences corresponding to each target pixel coordinate. The final depth data thus undergoes filtering by the target pixel coordinates, which ensures its stability and accuracy.
Second embodiment
Referring to fig. 5, based on the same inventive concept, a second embodiment of the present invention further provides an iToF depth data obtaining apparatus 300, where the iToF depth data obtaining apparatus 300 includes:
an obtaining module 301, configured to obtain a first phase difference, a second phase difference, a third phase difference, and a fourth phase difference corresponding to each pixel in a target depth frame; a first processing module 302, configured to obtain a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference; a filtering module 303, configured to obtain a plurality of target pixel coordinates according to the brightness of the grayscale image; a second processing module 304, configured to obtain final depth data according to the multiple target pixel coordinates and the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference corresponding to each target pixel coordinate.
As an optional implementation manner, the filtering module 303 is specifically configured to:
obtaining area outlines of a plurality of bright areas according to the brightness of the grayscale image; and obtaining the target pixel coordinates of each bright area according to the area outline of each bright area.
As an optional implementation manner, the filtering module 303 is further specifically configured to:
extracting the outlines of the bright areas from the grayscale image according to a preset contour extraction function to obtain the area outline of each bright area.
As an alternative embodiment, the contour extraction function is a Canny function.
As an optional implementation manner, the filtering module 303 is further specifically configured to:
obtaining a circumscribed rectangle of each area outline according to the area outline of each bright area; and, for each circumscribed rectangle, obtaining the target pixel coordinate according to the maximum gray value within the circumscribed rectangle.
As an optional implementation manner, the filtering module 303 is further specifically configured to:
acquiring all gray values within the circumscribed rectangle; sorting all the gray values numerically to obtain the maximum gray value; and obtaining the target pixel coordinate according to the maximum gray value.
As an optional implementation manner, the first processing module 302 is specifically configured to:
obtaining a confidence of each pixel according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference; and converting the confidence of each pixel into a gray value to obtain the grayscale image, wherein the greater the confidence, the greater the gray value.
It should be noted that the implementation and technical effects of the iToF depth data acquisition apparatus 300 provided by this embodiment of the present invention are the same as those of the foregoing method embodiment; for brevity, where this apparatus embodiment omits details, reference may be made to the corresponding content of the method embodiment.
Third embodiment
Based on the same inventive concept, another embodiment of the present application provides an electronic device implementing the iToF depth data acquisition method of the foregoing embodiments. The electronic device comprises a processor and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform the steps of the method of the preceding method embodiment.
It should be noted that, in the electronic device provided by this embodiment of the present invention, when the instructions are executed by the processor, the specific implementation of each step and the resulting technical effects are the same as those of the foregoing method embodiment; for brevity, where this embodiment omits details, reference may be made to the corresponding content of the method embodiment.
Fourth embodiment
Based on the same inventive concept, another embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the method of any of the preceding method embodiments.
It should be noted that, in the computer-readable storage medium provided by this embodiment of the present invention, when the program is executed by the processor, the specific implementation of each step and the resulting technical effects are the same as those of the foregoing method embodiment; for brevity, where this embodiment omits details, reference may be made to the corresponding content of the method embodiment.
The term "and/or" appearing herein is merely one type of associative relationship that describes an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship; the word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. An iToF depth data acquisition method, comprising:
acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in a target depth frame;
obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference;
obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image;
and obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
2. The method of claim 1, wherein obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image comprises:
obtaining area outlines of a plurality of bright areas according to the brightness of the grayscale image;
and obtaining the target pixel coordinates of each bright area according to the area outline of each bright area.
3. The method according to claim 2, wherein obtaining the area outlines of the plurality of bright areas according to the brightness of the grayscale image comprises:
extracting the outlines of the bright areas from the grayscale image according to a preset contour extraction function to obtain the area outline of each bright area.
4. The method of claim 3, wherein the contour extraction function is a Canny function.
5. The method according to claim 2, wherein obtaining the target pixel coordinates of each bright area according to the area outline of each bright area comprises:
obtaining a circumscribed rectangle of each area outline according to the area outline of each bright area;
and, for each circumscribed rectangle, obtaining the target pixel coordinate according to the maximum gray value within the circumscribed rectangle.
6. The method of claim 5, wherein obtaining the target pixel coordinate according to the maximum gray value within the circumscribed rectangle comprises:
acquiring all gray values in the circumscribed rectangle;
sorting all the gray values by numerical value to obtain the maximum gray value;
and obtaining the target pixel coordinate according to the maximum gray value.
7. The method according to claim 1, wherein obtaining the grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference comprises:
obtaining a confidence of each pixel according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference;
converting the confidence of each pixel into a gray value to obtain the grayscale image; wherein the greater the confidence, the greater the gray value.
8. An iToF depth data acquisition apparatus, comprising:
the acquisition module is used for acquiring a first phase difference, a second phase difference, a third phase difference and a fourth phase difference corresponding to each pixel in the target depth frame;
the first processing module is used for obtaining a grayscale image corresponding to the target depth frame according to the first phase difference, the second phase difference, the third phase difference and the fourth phase difference;
the filtering module is used for obtaining a plurality of target pixel coordinates according to the brightness of the grayscale image;
and the second processing module is used for obtaining final depth data according to the plurality of target pixel coordinates and the first phase difference, the second phase difference, the third phase difference and the fourth phase difference corresponding to each target pixel coordinate.
9. An electronic device comprising a processor and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform the steps of the method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
Application CN202111230208.9A, priority date 2021-10-22, filed 2021-10-22: iToF depth data acquisition method and device. Status: Pending. Published as CN114119693A.

Priority Applications (1)

Application Number: CN202111230208.9A
Priority Date / Filing Date: 2021-10-22
Title: iToF depth data acquisition method and device

Publications (1)

Publication Number: CN114119693A
Publication Date: 2022-03-01

Family

ID: 80376551
Family Applications (1): CN202111230208.9A (pending)

Country Status (1)

CN: CN114119693A (en)


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination