CN114742756A - Depth map detection method and device, storage medium and electronic equipment


Info

Publication number: CN114742756A
Application number: CN202210217090.4A
Authority: CN (China)
Prior art keywords: light intensity; depth; value; determining; abnormal
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 侯烨
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210217090.4A
Publication of CN114742756A

Classifications

All classifications fall under G (Physics) / G06 (Computing; calculating or counting) / G06T (Image data processing or generation, in general):
    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/50: Image analysis; depth or shape recovery
    • G06T 2207/10024: Indexing scheme for image analysis or image enhancement; image acquisition modality; color image
    • G06T 2207/10028: Indexing scheme for image analysis or image enhancement; image acquisition modality; range image, depth image, 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A depth map detection method and apparatus, a storage medium, and an electronic device are provided. A time-of-flight depth map of a time-of-flight camera and a reflected light intensity map corresponding to the time-of-flight depth map are acquired; light intensity reference pixels are determined from the reflected light intensity map; and abnormal pixels with abnormal depth values are determined from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels.

Description

Depth map detection method and device, storage medium and electronic equipment
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a depth map detection method and apparatus, a storage medium, and an electronic device.
Background
With the rapid development of computer technology, the acquisition of depth information has not only become key to three-dimensional vision tasks but also plays an increasingly important role in traditional color-based vision tasks. Previously popular approaches, structured light and stereo vision, either have too small a working range or require scene texture; time-of-flight technology overcomes these defects and has become the most promising way to acquire depth.
A time-of-flight camera emits a modulated optical signal through its emitting end; the signal is reflected by objects in the scene and received by the receiving end, and the received optical signal is resolved into a depth map of the scene, in which the pixel values describe the depths of the different objects. However, the process by which the time-of-flight camera acquires the depth map is susceptible to interference, so there may be pixels with abnormal depth values, and such pixels need to be identified.
Disclosure of Invention
The present application provides a depth map detection method and apparatus, a storage medium, and an electronic device, which can screen out pixels with accurate depth information and so facilitate the use of pixel depth information.
In a first aspect, the present application provides a depth map detection method, including:
acquiring a time-of-flight depth map of a time-of-flight camera and a reflected light intensity map corresponding to the time-of-flight depth map;
determining light intensity reference pixels from the reflected light intensity map; and
determining abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels.
In a second aspect, the present application provides a depth map detection apparatus, including:
an image acquisition module, configured to acquire a time-of-flight depth map of a time-of-flight camera and a reflected light intensity map corresponding to the time-of-flight depth map;
a first determining module, configured to determine light intensity reference pixels from the reflected light intensity map; and
a second determining module, configured to determine abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels.
In a third aspect, the present application provides a storage medium having stored thereon a computer program which, when loaded by a processor of an electronic device, performs the steps of any of the depth map detection methods as provided herein.
In a fourth aspect, the present application further provides an electronic device, which includes a processor and a memory, where the memory stores a computer program, and the processor executes the steps in any one of the depth map detection methods provided in the present application by loading the computer program stored in the memory.
In this application, it is considered that a real scene contains objects at different depths, that objects at different depths reflect the optical signal differently, and that reflected light signals of different intensities affect how the time-of-flight camera resolves depth, producing pixels with abnormal depth values. Pixels with abnormal depth values are therefore identified on the basis of light intensity. A time-of-flight depth map generated by a time-of-flight camera and the reflected light intensity map corresponding to it are first acquired; light intensity reference pixels are then determined from the reflected light intensity map; and abnormal pixels with abnormal depth values are determined from the time-of-flight depth map on the basis of the light intensity values of the light intensity reference pixels, thereby identifying the pixels whose depth values are abnormal.
Drawings
To illustrate the technical solutions in the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a depth map detection method provided in an embodiment of the present application;
fig. 2 is an exemplary diagram of a signal light emission principle in an embodiment of the present application;
FIG. 3 is a diagram illustrating an example of depth value changes before and after a reflected light disturbance in an embodiment of the present application;
FIG. 4 is a diagram of an example of determining a reference sub-region from reflected light and a reference sub-region from a depth value in an embodiment of the present application;
fig. 5 is a block diagram of a depth map detection apparatus provided in an embodiment of the present application;
fig. 6 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It should be noted that the terms "first", "second", and "third", etc. in this application are used for distinguishing different objects, and are not used for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules recited, but rather, some embodiments include additional steps or modules not recited, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
An embodiment of the present application provides a depth map detection method, a depth map detection apparatus, a storage medium, and an electronic device. The execution subject of the depth map detection method may be the depth map detection apparatus provided in the embodiment of the present application, or an electronic device integrating the depth map detection apparatus, where the apparatus may be implemented in hardware or in software. The electronic device may be a device with data processing capability configured with a processor, such as a smartphone, a tablet computer, a palmtop computer, or a notebook computer.
Referring to Fig. 1, a schematic flowchart of a depth map detection method according to an embodiment of the present application, the flow of the method may be as follows:
101. Acquire a time-of-flight depth map of the time-of-flight camera and a reflected light intensity map corresponding to the time-of-flight depth map.
In the embodiment of the present application, the time-of-flight camera includes a light emitting end and a receiving end. The emitting end emits modulated infrared light, which is reflected by target objects in the scene and then received by the receiving end; the modulated signal light received at the receiving end is resolved by hardware circuits and software algorithms to generate a two-dimensional depth map.
In a time-of-flight depth map, each pixel value represents the distance between a point in the scene and the camera; the distance between an object and the camera can therefore be determined from the map, enabling distance measurement of objects at long range.
The reflected light intensity map records the light intensity information of each pixel of the image and describes the strength of the received signal light; it further serves as the confidence of the target distance measurement. That is, in the image corresponding to the scene each pixel corresponds to both a depth value and a light intensity value, and the light intensity value received by each pixel of the photosensitive element at the receiving end (such as an image sensor) usually represents the confidence of the depth value resolved for that pixel.
The resolution of a pixel's depth value is affected by its reflected light intensity. When the signal light received at the receiving end is weak, the reliability of the resolved depth value is generally poor; conversely, the stronger the received light, the more reliable the depth value. For example, when the reflected light intensity value lies within a certain range, the generated depth value is credible, but when it exceeds that range, overexposure occurs and affects the resolution of the depth value. Therefore, in the embodiment of the present application, whether overexposure occurs, and hence whether the depth value is resolved accurately, can be judged by whether the reflected light intensity value is greater than a light intensity threshold.
In addition, because different objects, and objects with different depths, produce different reflected light intensities in the scene image, the reflected light of a near object may cover or interfere with the imaging area of a far object. For example, the reflected light of an object in the foreground region interferes with the part of the background region near the object's edge, making the reflected light intensity information of that background region inaccurate; and since the resolution of each pixel's depth value depends on the light intensity, inaccurate reflected light intensity information in the background region also makes the corresponding depth information inaccurate.
For example, as shown in Fig. 2, a practical scene may contain target objects A and B close to the time-of-flight camera as well as a background C. By the active optical ranging principle of a time-of-flight camera, the stronger the reflected light, the higher the confidence of the resolved depth value, so stronger reflected light is needed when acquiring the depth of a distant background or of other near, low-reflectivity target objects. When the reflected light intensity reaches a certain level, or when target object A is too close to the camera, the light reflected by A becomes very strong; the pixel areas at the receiving end that should receive light from object B and background C may then also receive A's reflected light. For the pixel areas of B and C, A's reflected light is noise and affects the resolution of their depth values.
For example, suppose the time-of-flight camera is 1 m, 2 m, and 5 m away from target objects A, B, and C respectively, and the three each occupy one third of the field of view. The theoretical depth information is shown as part a in Fig. 3 and reflects the real distances. In practice, however, because objects A and B are closer to the camera, the reflected light at the edge of A interferes with B or C, and the reflected light at the edge of B interferes with C. The actually generated depth information is then as shown in part b of Fig. 3, with a transition band formed between A and B and between B and C.
In general, the depth information of a transition band is not the average shown in part b of Fig. 3; it usually lies between the depths of the two adjacent regions. For example, the depth in the transition band between A and B lies between the depths of A and B, and that between B and C lies between the depths of B and C. The transition band therefore cannot be located directly from the depth information; that is, it is difficult to directly identify the abnormal pixels whose depth information is wrong.
Because of the reflected light interference, the depth values of the pixels in the transition band are resolved inaccurately; they cannot accurately describe the distance between object and camera and cannot be used for distance measurement.
Therefore, the embodiment of the present application identifies pixels with abnormal depth information (for example, pixels located in the transition band) from the time-of-flight depth map and the reflected light intensity map captured by the time-of-flight camera, and thereby obtains the target pixels whose depth information is normal.
102. Determine light intensity reference pixels from the reflected light intensity map.
Higher reflected light intensity easily shades or interferes with the signal light of distant objects, and the stronger the reflected light, the larger its interference range on distant objects. Therefore, in the embodiment of the present application, the light intensity reference pixels can be screened out by the light intensity values of the pixels in the reflected light intensity map, so that the pixels with abnormal depth values can subsequently be determined from the light intensity values of those reference pixels.
In this embodiment, before the light intensity reference pixels are selected according to the light intensity values, a reference sub-region may first be selected according to the light intensity values and the light intensity reference pixels selected from it; that is, optionally, in some embodiments of the present application, the step "determining light intensity reference pixels from the reflected light intensity map" includes:
determining a reference sub-region of the reflected light intensity map whose light intensity values meet a preset condition; and
determining all or some of the pixels of the reference sub-region as light intensity reference pixels.
Since light intensity beyond a certain range forms interference light on distant objects, the reference sub-region capable of affecting the light intensity values of other regions can be determined according to the light intensity values.
In this embodiment, the light intensity values of the light intensity reference pixels are used to represent the light intensity information of the reference sub-region. All pixels of the reference sub-region may therefore serve as light intensity reference pixels, or a subset may be selected whose light intensity values represent the sub-region's overall light intensity information; that is, any pixels whose light intensity values can represent the overall light intensity information of the reference sub-region may be selected as light intensity reference pixels.
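As an illustration only, a minimal sketch of this selection step is given below (Python with NumPy). The threshold-style preset condition, the function name, and the subsampling strategy are assumptions, since the embodiment does not fix a concrete criterion:

```python
import numpy as np

def select_reference_pixels(intensity_map, intensity_threshold, sample_step=1):
    """Select light intensity reference pixels from a reflected intensity map.

    intensity_map: 2D array of per-pixel reflected light intensity values.
    intensity_threshold: stands in for the "preset condition"; pixels above
        it form the (near, strongly reflecting) reference sub-region.
    sample_step: 1 keeps all reference-region pixels; >1 keeps a regular
        subset whose intensities still represent the region as a whole.
    """
    # Reference sub-region: pixels whose intensity meets the preset condition.
    reference_mask = intensity_map > intensity_threshold

    ys, xs = np.nonzero(reference_mask)
    ys, xs = ys[::sample_step], xs[::sample_step]  # all or some pixels

    reference_values = intensity_map[ys, xs]
    # The mean of the reference pixels characterizes the sub-region.
    return reference_mask, reference_values, float(reference_values.mean())
```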
For example, in the embodiment of the present application, when the interference of a foreground region on a background region needs to be judged and the affected pixels determined, the foreground region may be taken as the reference sub-region, the light intensity reference pixels determined from it, and the abnormal pixels of the background region whose depth values are affected determined from the light intensity values of those reference pixels, thereby locating the transition band between the foreground and background regions.
In addition, because the reflected light of a near object may affect a far object, while the reflected light of objects at the same depth has little influence on themselves, the embodiment of the present application needs to select accurately the reference sub-region capable of affecting the light intensity values of other regions.
However, an indirect time-of-flight (iToF) camera resolves the distance value from the phase difference of the reflected light, so even a small amount of interference light can strongly affect the resolved depth of a pixel; it is therefore difficult to divide the foreground and background regions accurately according to depth values. The light intensity value of each pixel, by contrast, is the sum of the received light, so a pixel must receive a large amount of interference light before its light intensity value is noticeably affected. The fluctuation range of the light intensity values of the pixels in the scene is therefore small, which means that foreground and background regions divided according to light intensity values are closer to the actual foreground and background of the scene; that is, division by light intensity is relatively more accurate.
For example, refer to Fig. 4, which compares determining the reference sub-region from reflected light with determining it from depth values: the left side of Fig. 4 shows the reference sub-region determined from the reflected light of the pixels in the reflected light intensity map, and the right side shows the reference sub-region determined from the depth values of the pixels in the time-of-flight depth map. The region numbered 3 in both maps is the ideal foreground region (foreground region 3), and the rest is background region 5. In the same scene the regions affected by interference are the same in both maps, namely the parts of the two arc-shaped areas in Fig. 4 outside foreground region 3 (hereinafter, interference region 1). Because interference light strongly affects depth resolution, even a small amount of it makes the depth information of the pixels in interference region 1 differ markedly: the depth values of pixels in background region 5 would otherwise be close, but the interference causes the depth values resolved in interference region 1 to differ greatly from those of background region 5. When regions are divided by depth value, interference region 1 and background region 5 are therefore easily assigned to different regions (although interference region 1 is actually still part of background region 5); for example, region 4, which follows the contour of interference region 1, is divided off as foreground (i.e., region 4 is the foreground region obtained by division according to depth values). The background region divided according to depth values thus differs greatly from the actual background, and the division is inaccurate.
Because the fluctuation of each pixel's light intensity value under interference light is small, the light intensity values of the pixels in the interference region remain close to those of the background region, so division by light intensity still assigns the interference region to the background. Referring again to Fig. 4, when the foreground region is determined from the reflected light, region 2, close to foreground region 3, is divided off as foreground (i.e., region 2 is the foreground region divided according to light intensity values). Comparing foreground region 2 (divided by light intensity) with foreground region 4 (divided by depth values), the division by light intensity is closer to foreground region 3 in the real scene; that is, dividing by light intensity improves the accuracy of separating the foreground and background regions, which improves the accuracy of selecting the reference sub-region, in turn the accuracy of obtaining the light intensity reference pixels, and ultimately the accuracy of locating the pixels with abnormal depth values.
In the embodiment of the present application, the reference sub-region may also be determined by combining the reflected light intensity map with the color (RGB) image of a color camera. Compared with the reflected light intensity map, the color image has higher resolution and provides color information, so the edge regions of the reference sub-region and the target sub-region can be made clearer and more accurate, improving the accuracy of the edge region. The background information provided by a color image is also richer: the background may be a plane, an inclined surface, a randomly varying scene, a fitted background surface, and so on. Therefore, when the requirements on the accuracy and completeness of the time-of-flight depth map are high while the requirements on frame rate and computing resources are low, the reflected light intensity map in this scheme may be replaced by a color image, or the two may be used in combination. Of course, combining a color image increases the algorithm complexity, and whether to combine can be decided according to the precision required of the time-of-flight depth map.
103. Determine abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels.
Because different reflected light intensity values produce interference regions of different sizes on the pixel areas of distant objects, the size of the interference region can be determined from the reflected light intensity values, and the abnormal pixels with abnormal depth values determined accordingly.
In the embodiment of the present application, the light intensity reference pixels are screened out of the reflected light intensity map, so their light intensity values reflect the light intensity information of the pixels in that map.
In the embodiment of the present application, there may be multiple light intensity reference pixels, and a reference light intensity value may be obtained as the mean of their light intensity values. Because the light intensity reference pixels are selected from the reference sub-region, and the reference light intensity value is the mean of their light intensity values, this value characterizes the light intensity of the reference sub-region, and locating the abnormal pixels with a reference light intensity value that characterizes the sub-region improves the accuracy of the localization.
In this embodiment, the light intensity reference pixels may be selected from the edge of the reference sub-region, for example one or more pixels where the reference sub-region borders the background region. When multiple reference pixels are selected, several pixels each adjacent to the background region may be chosen, or a normal may be established at the boundary between the reference sub-region and the background region and several reference pixels selected along the normal direction. Because these pixels lie close to the background region, the effectiveness of the selection is improved, which improves the accuracy of subsequently locating the abnormal pixels.
Because reflected light of different intensities affects the interference region differently (for example, once the reflected light intensity reaches a certain range it can affect entire distant objects), the embodiment of the present application can judge the extent of the affected region according to whether the reflected light intensity reaches certain thresholds; that is, optionally, in some embodiments of the present application, the step "determining abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels" includes:
determining a target sub-region adjacent to the reference sub-region according to the reflected light intensity map and the reference sub-region;
when the light intensity value of the light intensity reference pixels is greater than a first preset threshold, determining all pixels of the target sub-region as abnormal depth pixels; and
when the light intensity value of the light intensity reference pixels is greater than a second preset threshold and less than the first preset threshold, determining abnormal pixels with abnormal depth values from the target sub-region according to the light intensity value of the light intensity reference pixels.
In the embodiment of the present application, the target sub-region is a region adjacent to the reference sub-region whose distance from the time-of-flight camera is greater than that of the reference sub-region; the reflected light of the reference sub-region therefore interferes with the target sub-region, for example forming a transition band on it, i.e., producing abnormal pixels with abnormal depth values within the target sub-region.
In this embodiment, the first and second preset thresholds may be obtained from actual requirements or from experimental data, for example by continuously adjusting the intensity of the reflected light and recording the width of the transition band at each intensity: when the transition band is as wide as the target sub-region, the intensity at that point is recorded and used as the first preset threshold; when the transition band is narrower than the target sub-region, the intensity at that point is recorded and used as the second preset threshold.
In the embodiment of the present application, the first preset threshold is greater than the second. When the light intensity value of the light intensity reference pixels exceeds the first preset threshold, the reflected light of the reference sub-region is strong enough to affect the entire target sub-region: the depth information of every pixel in the target sub-region is resolved incorrectly, so every pixel there is an abnormal pixel with an abnormal depth value. When the light intensity value lies between the second and first preset thresholds, the reflected light of the reference sub-region affects some but not all of the pixels in the target sub-region; in that case the abnormal pixels whose depth values resolve incorrectly can be determined within the target sub-region according to the light intensity value of the light intensity reference pixels, locating the part of the target sub-region affected by the interference light.
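To make the two-threshold logic concrete, the sketch below shows one possible decision flow; `first_threshold`, `second_threshold`, and the `partial_detector` callback are hypothetical stand-ins for the calibrated thresholds and the finer-grained width-distance step described next:

```python
import numpy as np

def classify_target_pixels(target_mask, reference_intensity,
                           first_threshold, second_threshold,
                           partial_detector):
    """Two-threshold decision on the target sub-region.

    first_threshold > second_threshold; both are hypothetical calibrated
    values. partial_detector is a callable that locates only the affected
    part of the target sub-region (see the width-distance sketch below).
    """
    if reference_intensity > first_threshold:
        # Interference covers the whole target sub-region: every pixel in
        # it is treated as a depth-abnormal pixel.
        return target_mask.copy()
    if reference_intensity > second_threshold:
        # Only part of the target sub-region is disturbed; delegate to the
        # finer-grained width-distance estimation.
        return partial_detector(target_mask, reference_intensity)
    # Below the second threshold the interference is negligible.
    return np.zeros_like(target_mask)
```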
In this embodiment, the range of signal light interference may be determined from the light intensity value, and the abnormal pixels with abnormal depth values determined from that interference range; that is, optionally, in some embodiments of the present application, the step "determining abnormal pixels with abnormal depth values from the target sub-region according to the light intensity value of the light intensity reference pixels" includes:
determining the edge region where the reference sub-region and the target sub-region meet;
determining a target width distance according to the light intensity value of the light intensity reference pixels;
determining an abnormal pixel region in the target sub-region according to the edge region and the target width distance; and
determining the abnormal pixels with abnormal depth values according to the abnormal pixel region.
In this embodiment, the edge region is the starting region affected by the interference light, i.e., the edge portion of the reference sub-region or of the target sub-region; for the target sub-region, therefore, the region where the reference sub-region and the target sub-region meet may be determined as the edge region.
The width distance of the interference is determined from the light intensity value of the light intensity reference pixels, the abnormal pixel region is determined from that width distance, and the abnormal pixels are then determined from the abnormal pixel region.
In the embodiment of the present application, a functional relationship between the light intensity value and the width distance of the interference region may be determined in advance from a sample time-of-flight depth map and the reflected light intensity map corresponding to it, and the width distance corresponding to the light intensity value of the light intensity reference pixels then determined from that functional relationship.
In the embodiment of the present application, the pixels contained in the abnormal pixel region may be determined as the abnormal pixels with abnormal depth values.
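One way to realize these steps, shown purely as a sketch, is to measure each pixel's distance to the reference sub-region with a Euclidean distance transform and keep the target-region pixels that lie within the target width distance; the use of `scipy.ndimage` is an implementation choice, not part of the application:

```python
import numpy as np
from scipy import ndimage

def abnormal_pixel_region(target_mask, reference_mask, width_distance):
    """Pixels of the target sub-region within `width_distance` of the edge.

    The edge region is where the reference and target sub-regions meet;
    target-region pixels closer to the reference sub-region than the target
    width distance form the abnormal pixel region (the transition band).
    """
    # Distance (in pixels) from every position to the nearest reference
    # pixel; along the shared boundary this starts at ~1 and grows inward.
    dist_to_reference = ndimage.distance_transform_edt(~reference_mask)

    return target_mask & (dist_to_reference <= width_distance)
```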
In this embodiment, the target width distance may also be determined in combination with the depth value; that is, optionally, in some embodiments of the present application, the step "determining the target width distance according to the light intensity value of the light intensity reference pixels" includes:
determining the target width distance according to the light intensity value of the light intensity reference pixels and the depth value of the light intensity reference pixels.
Because the closer an object is to the time-of-flight camera, the larger the range over which its reflected light affects distant objects, combining the depth value of the light intensity reference pixels improves the accuracy with which the width distance of the interference range is obtained.
Because objects with different reflectivities produce reflected light of different intensities, and reflected light of different intensities forms different interference regions, the width distance of the interference region is also related to the reflectivity of the object; the width distance can therefore be determined from the light intensity value and the reflectivity. That is, optionally, in some embodiments of the present application, the step "determining the target width distance according to the light intensity value of the light intensity reference pixels and the depth value of the light intensity reference pixels" includes:
determining the reflectivity of the light intensity reference pixels according to their light intensity values and depth values; and
determining the target width distance according to the light intensity value and the reflectivity of the light intensity reference pixels.
In the embodiment of the present application, the reflectivity of a pixel may be determined from its light intensity value and depth value, where the reflectivity is related to the light intensity value (confidence) and depth value (depth) of the pixel as follows:
reflectivity = α × confidence × depth²
Here α is the attenuation coefficient of the light received in the peripheral and central regions; it is related to factors such as the intensity of the emitted light, the field angle of the reflected light, and the TX light field distribution, and may be preset or obtained by calibration. For example, assuming a uniform light field distribution at the emitting end of the time-of-flight camera, the α value of each pixel on the reflected light intensity map is related to the pixel's position on the map, i.e., to the field angle at which it receives the reflected light, and the α value of each pixel on the two-dimensional plane can then be computed. In practice the optical path at the center of the field of view is shortest, so the light intensity value of the corresponding pixel is largest and the confidence and precision of its resolved depth value are high, while the confidence and precision of the depth values of the surrounding pixels are lower. Time-of-flight camera hardware is therefore usually designed with a non-uniform light source at the emitting end that strengthens the peripheral emitted light, and the calibrated α values are accordingly more meaningful.
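In code this relationship is a direct evaluation of the formula; the minimal sketch below applies it per pixel (NumPy arrays work identically), with α supplied by the preset or calibrated attenuation model:

```python
import numpy as np

def estimate_reflectivity(confidence, depth, alpha):
    """reflectivity = alpha * confidence * depth^2.

    confidence: per-pixel reflected light intensity value (scalar or array);
    depth: per-pixel resolved depth value;
    alpha: attenuation coefficient, preset or calibrated per pixel position
    (it varies with the field angle / TX light field distribution).
    """
    return alpha * confidence * np.square(depth)
```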
Calculating the width distance of the interference region from the light intensity value and the reflectivity of the pixels improves the accuracy of the width distance calculation.
In the embodiment of the present application, the width distance corresponding to the light intensity value and reflectivity of the light intensity reference pixels can be determined from a functional relationship between the interference-region width distance, the light intensity value, and the reflectivity; that is, optionally, in some embodiments of the present application, the step "determining the target width distance according to the light intensity value and the reflectivity of the light intensity reference pixels" includes:
obtaining a functional relationship, the functional relationship being fitted from the light intensity values and reflectivities of a sample reflected light intensity map and the sample width distances corresponding to that map; and
determining, according to the functional relationship, the target width distance corresponding to the light intensity value and the reflectivity of the light intensity reference pixels.
The functional relationship among light intensity value, reflectivity, and sample width distance can be fitted from the light intensity values, reflectivities, and sample width distances of the sample reflected light intensity map, for example with the light intensity value and reflectivity as independent variables and the sample width distance as the dependent variable. The fitted relationship then gives the width distance corresponding to the light intensity value and reflectivity of the light intensity reference pixels, which is the width distance of the interference region imposed by the reference sub-region on the target sub-region.
In the embodiment of the present application, the functional relationship among the light intensity value c (confidence), the reflectivity r (reflectivity), and the sample width distance dExNum may be fitted as a polynomial, specifically:
dExNum = a₀₀ + a₁₀·r + a₀₁·c + a₁₁·r·c + a₂₁·r²·c + a₁₂·r·c² + a₂₂·r²·c² + …
in theory, the number of the subsequent high-order terms is enough, any curve can be fitted, and the appropriate number of terms can be selected according to the requirements of application on accuracy and the requirements of computing resource consumption, or only one term can be selected. In addition, the method can be simplified, for example, only the light intensity value is used as an independent variable, and a formula which does not contain the reflectivity is fitted, so that the method is used in some scenes which have low requirements on the quality of the depth map but have high requirements on the complexity of the algorithm.
In the embodiment of the present application, target pixels with normal depth values may be obtained from the abnormal pixels, and an image to be processed corresponding to the time-of-flight depth map processed according to those target pixels; that is, optionally, in some embodiments of the present application, after "determining abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels", the method further includes:
determining target pixels with normal depth values according to the abnormal pixels;
acquiring an image to be processed corresponding to the time-of-flight depth map; and
performing image processing on the image to be processed according to the target pixels, wherein the image processing includes at least one of semantic segmentation, gesture recognition, gesture tracking, three-dimensional reconstruction, light scanning, or defect recognition.
By locating the abnormal pixels with abnormal depth information, the normal pixels (target pixels) with normal depth information can be obtained from them and image processing or other operations performed on the basis of those normal pixels, thereby applying their depth information. In the embodiment of the present application the image processing may be of several kinds, including but not limited to semantic segmentation, gesture recognition, gesture tracking, three-dimensional reconstruction, light scanning, and defect recognition.
In the embodiment of the present application, after the width distance of the interference region is determined, the exposure parameters of the time-of-flight camera may be adjusted according to that width distance, reducing the interference of a near object's reflected light with the pixel area of a far object; that is, optionally, in some embodiments of the present application, after "determining abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels", the method further includes:
determining a target adjustment amount according to the target width distance; and
adjusting the exposure parameters of the time-of-flight camera according to the target adjustment amount.
In this embodiment, the exposure parameters may include the exposure time, and adjusting the exposure time adjusts the number of signal light pulses: reducing the exposure time shortens the time for transmitting and receiving the light waves, which reduces the number of signal light pulses in the reflected light, and fewer signal light pulses mean less interference from a near object's reflected light on the pixel area of a far object.
In the embodiment of the present application, the target adjustment amount is the magnitude by which the exposure parameter is to be adjusted, determined according to the width distance affected by the interference light; adjusting the exposure parameter shrinks the interference region and reduces the number of pixels affected during depth resolution.
In the embodiment of the present application, once the width distance of the interference region, the adjustment amount, and the exposure parameters have been determined, they can serve together as an exposure adjustment strategy, enhancing the practicality of the time-of-flight camera.
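Purely as an illustrative sketch (the application does not specify a concrete mapping), the exposure time could be shortened in proportion to the measured width distance, with hypothetical tuning constants:

```python
def adjust_exposure(current_exposure_us, width_distance,
                    gain_us_per_pixel=1.0, min_exposure_us=50.0):
    """Shorten the exposure time in proportion to the interference width.

    gain_us_per_pixel and min_exposure_us are hypothetical tuning constants:
    a wider transition band means stronger interference, so a larger target
    adjustment amount (fewer signal light pulses), down to a safe minimum.
    """
    target_adjustment = width_distance * gain_us_per_pixel
    return max(min_exposure_us, current_exposure_us - target_adjustment)
```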
In the embodiment of the present application, after the abnormal pixels are located, their depth values may further be corrected to improve the accuracy of the depth information of the time-of-flight depth map; that is, optionally, in some embodiments of the present application, after "determining abnormal pixels with abnormal depth values from the time-of-flight depth map according to the light intensity values of the light intensity reference pixels", the method further includes:
determining depth reference pixels corresponding to the abnormal pixels from the time-of-flight depth map, and correcting the depth values of the abnormal pixels according to the depth values of the depth reference pixels to obtain corrected pixels.
Correcting the depth values of the abnormal pixels improves the accuracy of those depth values, and selecting the depth reference pixels from the time-of-flight depth map improves the accuracy of the selection and hence of the correction.
In the embodiment of the present application, the depth reference pixels may be selected within a preset distance range of the abnormal pixel; that is, optionally, in some embodiments of the present application, the step "determining depth reference pixels corresponding to the abnormal pixels from the time-of-flight depth map" includes:
for each abnormal pixel, selecting depth reference pixels, distinct from the abnormal pixel, from within a preset distance range according to the light intensity of the abnormal pixel.
In the embodiment of the present application, pixels whose light intensity values are close to that of the abnormal pixel may be selected as depth reference pixels: within a certain range, close light intensity values indicate a high probability that two pixels belong to the same object and therefore have the same or very similar distances to the time-of-flight camera, so pixels with light intensity values close to the abnormal pixel's may be selected as depth reference pixels within a certain distance of the abnormal pixel.
In this embodiment, the depth value of a depth reference pixel may be used as the depth value of the abnormal pixel, i.e., the abnormal pixel's depth value is adjusted directly according to the depth reference pixel's depth value.
In the embodiment of the present application, multiple depth reference pixels may also be selected and the mean of their depth values used as the depth value of the abnormal pixel; selecting multiple depth reference pixels improves the accuracy of correcting the abnormal pixel's depth value.
In this embodiment, the preset distance range may be set in advance or adjusted according to actual needs. For example, it may be set small to ensure that the depth reference pixels are close to the abnormal pixel, improving their reference value, or set large enough, for example to the range of the whole reflected light intensity map, with the depth reference pixels selected from the entire map.
For example, in the embodiment of the present application a nearest-neighbor interpolation method may be used: starting from the edge of the transition band adjacent to the target sub-region, a depth reference pixel whose light intensity value is closest to the abnormal pixel's is searched for within a certain neighborhood according to the abnormal pixel's light intensity value, and its depth value is assigned to the abnormal pixel. The basis of the method is that two pixels with close light intensity values within a small range have a high probability of belonging to the same object and should have the same or very close distance values. When there are multiple abnormal pixels, each can be compensated in turn; for example, the compensation result of the previous pixel can be used as input information for the next.
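A minimal sketch of this nearest-neighbor compensation follows; the neighborhood radius is a hypothetical parameter, and, as described, each compensated pixel can serve as input for later ones:

```python
import numpy as np

def fill_nearest_intensity(depth, intensity, abnormal_mask, radius=5):
    """Nearest-neighbor interpolation for abnormal pixels.

    For each abnormal pixel, search a (2*radius+1)^2 neighborhood for the
    normal pixel whose light intensity is closest to the abnormal pixel's,
    and copy that pixel's depth value; earlier fills feed later ones.
    """
    depth = depth.astype(float)
    normal = ~abnormal_mask
    h, w = depth.shape
    for y, x in zip(*np.nonzero(abnormal_mask)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        window_ok = normal[y0:y1, x0:x1]
        if not window_ok.any():
            continue  # no usable reference pixel in this neighborhood
        diff = np.abs(intensity[y0:y1, x0:x1].astype(float) - intensity[y, x])
        diff[~window_ok] = np.inf
        dy, dx = np.unravel_index(np.argmin(diff), diff.shape)
        depth[y, x] = depth[y0 + dy, x0 + dx]
        normal[y, x] = True  # compensated pixel can serve later pixels
    return depth
```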
In this embodiment, a nearest-neighbor averaging method may also be used: the mean of the depth values of all pixels in a certain neighborhood of the abnormal pixel whose light intensity values are close to the abnormal pixel's is used as the fill-in value, optionally after removing neighboring depth values that differ too much from the mean depth, and so on.
In this embodiment, a full-image background-statistics back-fill method may also be used: based on the light intensity value of an abnormal pixel, or the mean light intensity of the small area containing it, or the mean light intensity of all abnormal pixels, reference pixels whose light intensity values in the whole reflected light intensity map are close to that value or mean are found; the mean of their depth values is then assigned to the abnormal pixel, or a surface function is fitted from the positions of the reference pixels, a depth value is estimated from the fitted surface and the abnormal pixel's position, and the estimated depth value is assigned to the abnormal pixel.
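For the surface-fitting variant, a sketch that fits a plane to intensity-matched reference pixels might look as follows; the plane model and the relative intensity tolerance are assumptions (the application allows any fitted surface function):

```python
import numpy as np

def fill_by_surface_fit(depth, intensity, abnormal_mask, rel_tol=0.05):
    """Full-image background-statistics back-fill via a fitted plane.

    Reference pixels are the normal pixels whose light intensity is within
    rel_tol (relative) of the mean intensity of the abnormal pixels; a plane
    z = a*x + b*y + c is least-squares fitted to their depth values and
    evaluated at each abnormal pixel position.
    """
    target = float(intensity[abnormal_mask].mean())
    reference = ~abnormal_mask & (np.abs(intensity - target) <= rel_tol * target)
    ys, xs = np.nonzero(reference)
    A = np.stack([xs, ys, np.ones_like(xs)], axis=1).astype(float)
    (a, b, c), *_ = np.linalg.lstsq(A, depth[ys, xs].astype(float), rcond=None)

    out = depth.astype(float)
    ay, ax = np.nonzero(abnormal_mask)
    out[ay, ax] = a * ax + b * ay + c  # plane estimate at abnormal pixels
    return out
```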
Wherein, in this application embodiment, the depth reference pixel can be a plurality of, therefore, can be according to the degree of depth value of a plurality of depth reference pixel, confirm the degree of depth value trend of pixel in the reflected light intensity map, adjust the degree of depth value of unusual pixel according to the degree of depth value trend of change, namely, optionally, in some embodiments of this application, the depth reference pixel can include first depth reference pixel and second depth reference pixel, step "follow in the flight time depth map confirm correspond the depth reference pixel of unusual pixel, according to the degree of depth value of depth reference pixel revise the degree of depth value of unusual pixel, obtain revise the back pixel", include:
aiming at the abnormal pixel point, determining a first depth reference pixel point and a second depth reference pixel point according to the distance between the abnormal pixel point and the abnormal pixel point, wherein the depth value difference value between the first depth reference pixel point and the second depth reference pixel point is larger than a third preset threshold value;
establishing a connection line between the first depth reference pixel point and the second depth reference pixel point, and determining the depth information change trend of a plurality of pixel points on the connection line;
determining a target depth value according to the depth information change trend;
and adjusting the depth value of the abnormal pixel point to the target depth value to obtain the corrected pixel point.
The third preset threshold is used for reflecting that the depth difference exists between the first depth reference pixel point and the second depth reference pixel point, and determining the depth information change trend conveniently according to the depth difference. In this embodiment, the third preset threshold may be set according to actual needs, for example, when the first depth reference pixel point is required to be selected from the reference sub-region and the second depth reference pixel point is required to be selected from the target sub-region, the third preset threshold may be set as a depth difference between the pixel points in the reference sub-region and the pixel points in the target sub-region, where the depth difference may be a difference between depth values of a single pixel point in two regions, or a difference between depth values of multiple pixel points after averaging.
The method comprises the steps of establishing a connection line passing through abnormal pixel points, determining the change trend of the depth values of the pixel points on the connection line according to the depth values of all the pixel points on the connection line, and filling or complementing the depth values of the abnormal pixel points with abnormal depth values on the connection line according to the depth values of the pixel points with normal depth values on the connection line based on the change trend. For example, in the embodiment of the present application, a straight-line function is established according to the position information and the depth information of the first depth reference pixel and the second depth reference pixel, an estimated depth value is obtained according to the straight-line function and the position information of the abnormal pixel (for the abnormal pixel on the straight line), and the estimated depth value is used as the depth value of the abnormal pixel.
In this embodiment, the first depth reference pixel point may be located in the reference sub-region, and the second depth reference pixel point may be located in the target sub-region while being different from the abnormal pixel point, so that the depth value of an abnormal pixel point located between the first depth reference pixel point and the second depth reference pixel point can be determined according to the change trend of the depth values of the pixel points between them.
For example, in the embodiment of the present application, a linear interpolation method may be adopted: the depth values of the two pixel points closest to the abnormal pixel point are taken (one located in the reference sub-region, the other located in the target sub-region and not itself an abnormal pixel point), and all abnormal pixel points spanned by the connection line between these two pixel points are assigned, in order, depth values that change gradually from the background depth value to the foreground depth value according to a linear ramp, as sketched below. Of course, the selection of the pixel points in the reference sub-region and the target sub-region may also be restricted to the same row or the same column as the abnormal pixel point (that is, sharing the same horizontal-axis or vertical-axis coordinate with the abnormal pixel point), rather than directly selecting the nearest pixel points, and so on.
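As a concrete illustration of the straight-line function and the linear gradual assignment described above, the following sketch restricts the two depth reference pixel points to the same row as the abnormal pixel points; all names are illustrative, and the row-wise restriction is one of the options mentioned, not the only possibility.

```python
import numpy as np

def interpolate_along_row(depth, abnormal_mask, y, x_bg, x_fg):
    """Linear back-filling along one row (illustrative sketch).

    `x_bg` indexes a normal pixel in the reference sub-region (background)
    and `x_fg` a normal, non-abnormal pixel in the target sub-region
    (foreground), both in row `y`. Abnormal pixel points spanned by the
    connection line are assigned depths ramping linearly between the two.
    """
    d_bg, d_fg = depth[y, x_bg], depth[y, x_fg]
    lo, hi = sorted((x_bg, x_fg))
    for x in range(lo + 1, hi):
        if abnormal_mask[y, x]:
            t = (x - x_bg) / (x_fg - x_bg)          # position along the line
            depth[y, x] = d_bg + t * (d_fg - d_bg)  # straight-line estimate
    return depth
```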
In the embodiment of the present application, before the depth value of an abnormal pixel point is adjusted, the abnormal pixel point may first be marked. Marking the abnormal pixel points facilitates subsequent correction of their depth values and improves correction efficiency.
In the embodiment of the present application, the pixel points whose depth values have been corrected may be selected for image processing or ranging. That is, optionally, in some embodiments of the present application, after the step of "obtaining the corrected pixel points", the method further includes:
obtaining a corrected flight time depth map according to the corrected pixel points;
acquiring a to-be-processed image corresponding to the corrected time-of-flight depth map;
and carrying out image processing on the image to be processed according to the corrected time-of-flight depth map, wherein the image processing comprises at least one of semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing or defect recognition processing.
By performing depth value correction on the pixel points with abnormal depth values, a corrected time-of-flight depth map is obtained, and image processing and other operations can then be performed on the basis of the corrected time-of-flight depth map, thereby putting the depth information of the pixel points to use. In this embodiment, the image processing may be of multiple types, including but not limited to semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing, defect recognition processing, and the like.
In the present application, it is considered that objects at different depths exist in a real scene, that objects at different depths and distances reflect the light signal with different power, and that reflected light signals of different intensities affect the time-of-flight camera's resolution of depth, thereby producing pixel points with abnormal depth values. The light intensity is therefore used as the basis for identifying pixel points with abnormal depth values: a time-of-flight depth map generated by a time-of-flight camera and the reflected light intensity map corresponding to the time-of-flight depth map are first acquired, light intensity reference pixel points are then determined from the reflected light intensity map, and abnormal pixel points with abnormal depth values are determined from the time-of-flight depth map on the basis of the light intensity values of the light intensity reference pixel points, thereby realizing the identification of pixel points with abnormal depth values, as sketched below.
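Condensed into a short Python sketch, the identification flow summarized above might look as follows. Here `t_sat` stands in for the preset condition selecting the reference sub-region, and `t1` and `t2` for the first and second preset thresholds; all three are hypothetical values that would be tuned per camera, not values given in this application.

```python
import numpy as np

def classify_target_region(intensity, t_sat, t1, t2):
    """Sketch of the light-intensity-based identification flow.

    Returns 'all' when the whole target sub-region is treated as abnormal,
    'band' when only an edge band of the target sub-region (with a width
    set by the target width distance) is abnormal, and 'none' otherwise.
    """
    reference = intensity >= t_sat               # reference sub-region
    if not reference.any():
        return "none"
    ref_value = intensity[reference].mean()      # light intensity reference value
    if ref_value > t1:
        return "all"
    if ref_value > t2:
        return "band"
    return "none"
```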
In order to better implement the depth map detection method of the present application, the present application further provides a depth map detection apparatus based on that method. The terms used in connection with the depth map detection apparatus have the same meanings as in the depth map detection method, and for specific implementation details reference may be made to the description in the method embodiments.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a depth map detection apparatus provided in the present application, wherein the depth map detection apparatus may include:
an image obtaining module 201, configured to obtain a time-of-flight depth map of a time-of-flight camera and a reflected light intensity map corresponding to the time-of-flight depth map;
a first determining module 202, configured to determine a light intensity reference pixel point from the reflected light intensity map according to the reflected light intensity map;
and the second determining module 203 is configured to determine an abnormal pixel with an abnormal depth value from the time-of-flight depth map according to the light intensity value of the light intensity reference pixel.
Optionally, in an embodiment, the first determining module 202 is specifically configured to:
determining a reference sub-region of the reflected light intensity map in which the light intensity values meet a preset condition;
and determining all pixel points or some pixel points of the reference sub-region as light intensity reference pixel points.
Optionally, in an embodiment, the second determining module 203 is specifically configured to:
determining a target sub-region adjacent to the reference sub-region according to the reflected light intensity map and the reference sub-region;
when the light intensity value of the light intensity reference pixel point is larger than a first preset threshold value, determining the pixel point corresponding to the target sub-region as an abnormal depth pixel point;
and when the light intensity value of the light intensity reference pixel point is greater than a second preset threshold and smaller than the first preset threshold, determining an abnormal pixel point with an abnormal depth value from the target subregion according to the light intensity value of the light intensity reference pixel point.
Optionally, in an embodiment, the second determining module 203 is specifically configured to:
determining an edge region where the reference sub-region and the target sub-region are intersected;
determining a target width distance according to the light intensity value of the light intensity reference pixel point;
determining an abnormal pixel region in the target sub-region according to the edge region and the target width distance;
and determining abnormal pixel points with abnormal depth values according to the abnormal pixel areas.
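One way to realise "determining an abnormal pixel region in the target sub-region according to the edge region and the target width distance" is a distance transform over the edge mask, as in the sketch below. The use of scipy's Euclidean distance transform and all names here are assumptions of this illustration, not something specified by the application.

```python
import numpy as np
from scipy import ndimage

def abnormal_pixel_region(target_mask, edge_mask, width):
    """Illustrative sketch: the abnormal pixel region is the part of the
    target sub-region lying within `width` pixels of the edge region
    shared with the reference sub-region."""
    # Distance from every pixel to the nearest edge pixel.
    dist = ndimage.distance_transform_edt(~edge_mask)
    return target_mask & (dist <= width)
```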
Optionally, in an embodiment, the second determining module 203 is specifically configured to:
and determining the target width distance according to the light intensity value of the light intensity reference pixel point and the depth value of the light intensity reference pixel point.
Optionally, in an embodiment, the second determining module 203 is specifically configured to:
determining the reflectivity of the light intensity reference pixel points according to the light intensity values of the light intensity reference pixel points and the depth values of the light intensity reference pixel points;
and determining the target width distance according to the light intensity value and the reflectivity of the light intensity reference pixel point.
Optionally, in an embodiment, the second determining module 203 is specifically configured to:
obtaining a functional relation, wherein the functional relation is obtained by fitting the light intensity values and reflectivities of sample reflected light intensity maps with the sample width distances corresponding to the sample reflected light intensity maps;
and determining, according to the functional relation, the target width distance corresponding to the light intensity value and the reflectivity of the light intensity reference pixel point.
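The functional relation could, for instance, be obtained by a least-squares fit over the sample data. The linear model below (width ≈ a·intensity + b·reflectivity + c) is an assumed functional form chosen for illustration only, since the application does not fix the form of the fit.

```python
import numpy as np

def fit_width_relation(intensities, reflectivities, widths):
    """Fit width = a*I + b*r + c from sample reflected light intensity maps
    (illustrative sketch; the real functional form is not fixed here)."""
    I = np.asarray(intensities, dtype=float)
    r = np.asarray(reflectivities, dtype=float)
    w = np.asarray(widths, dtype=float)
    A = np.column_stack([I, r, np.ones_like(I)])
    coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
    return coeffs  # (a, b, c)

def predict_width(coeffs, intensity, reflectivity):
    a, b, c = coeffs
    return a * intensity + b * reflectivity + c
```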
Optionally, in an embodiment, the apparatus further includes a first application module, where the first application module is specifically configured to:
determining target pixel points with normal depth values according to the abnormal pixel points;
acquiring an image to be processed corresponding to the flight time depth map;
and performing image processing on the image to be processed according to the target pixel point, wherein the image processing comprises at least one of semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing or defect recognition processing.
Optionally, in an embodiment, the apparatus further includes an adjusting module, where the adjusting module is specifically configured to:
determining a target adjustment amount according to the target width distance;
and adjusting the exposure parameters of the time-of-flight camera according to the target adjustment quantity.
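The mapping from target width distance to exposure adjustment is left open here; the sketch below shows one plausible form, cutting the exposure time in proportion to the width of the abnormal band. The gain `k`, the clamp values, and the function name are all assumptions of this illustration.

```python
def adjust_exposure(exposure_us, target_width_px, k=0.02, min_exposure_us=50.0):
    """Illustrative sketch: a wider abnormal band suggests stronger
    over-exposure, so the exposure time is reduced more aggressively."""
    target_adjustment = min(k * target_width_px, 0.9)  # target adjustment amount
    return max(exposure_us * (1.0 - target_adjustment), min_exposure_us)
```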
Optionally, in an embodiment, the apparatus further includes a back-filling module, where the back-filling module is specifically configured to:
determining a depth reference pixel point corresponding to the abnormal pixel point from the time-of-flight depth map, and correcting the depth value of the abnormal pixel point according to the depth value of the depth reference pixel point to obtain a corrected pixel point.
Optionally, in an embodiment, the back-filling module is specifically configured to:
and aiming at the abnormal pixel points, selecting depth reference pixel points from a preset distance range according to the illumination intensity of the abnormal pixel points, wherein the depth reference pixel points are different from the abnormal pixel points.
Optionally, in an embodiment, the back-filling module is specifically configured to:
for the abnormal pixel point, determining a first depth reference pixel point and a second depth reference pixel point according to their distances to the abnormal pixel point, wherein the first depth reference pixel point and the second depth reference pixel point are different from the abnormal pixel point;
establishing a connection line between the first depth reference pixel point and the second depth reference pixel point, and determining the depth information change trend of a plurality of pixel points on the connection line;
and adjusting the depth value of the abnormal pixel point according to the change trend of the depth information.
Optionally, in an embodiment, the apparatus further includes a second application module, where the second application module is specifically configured to:
obtaining a corrected flight time depth map according to the corrected pixel points;
acquiring a to-be-processed image corresponding to the corrected time-of-flight depth map;
and performing image processing on the image to be processed according to the corrected time-of-flight depth map, wherein the image processing comprises at least one of semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing or defect recognition processing.
It should be noted that the depth map detection apparatus provided in the embodiment of the present application and the depth map detection method in the foregoing embodiment belong to the same concept, and specific implementation processes thereof are described in the foregoing related embodiments, and are not described herein again.
An embodiment of the present application provides a storage medium on which a computer program is stored. When the computer program stored in the storage medium is executed by a processor of an electronic device provided in the embodiments of the present application, the processor of the electronic device is caused to execute the steps of any of the above depth map detection methods applicable to the electronic device. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Referring to fig. 6, the electronic device 300 includes a processor 310 and a memory 320.
The processor 310 in the present embodiment may be a general purpose processor, such as an ARM architecture processor.
The memory 320 stores a computer program. The memory may be a high-speed random access memory, or a non-volatile memory such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 320 may also include a memory controller to provide the processor 310 with access to the memory 320. The processor 310 is configured to execute any one of the above depth map detection methods by executing the computer program in the memory 320, for example:
acquiring a flight time depth map of a flight time camera and a reflected light intensity map corresponding to the flight time depth map;
determining light intensity reference pixel points from the reflected light intensity map according to the reflected light intensity map;
and determining abnormal pixel points with abnormal depth values from the flight time depth map according to the light intensity values of the light intensity reference pixel points.
The depth map detection method and apparatus, storage medium, and electronic device provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (16)

1. A depth map detection method is characterized by comprising the following steps:
acquiring a flight time depth map of a flight time camera and a reflected light intensity map corresponding to the flight time depth map;
determining light intensity reference pixel points from the reflected light intensity map according to the reflected light intensity map;
and determining abnormal pixel points with abnormal depth values from the flight time depth map according to the light intensity values of the light intensity reference pixel points.
2. The depth map detecting method of claim 1, wherein the determining a light intensity reference pixel point from the reflected light intensity map according to the reflected light intensity map comprises:
determining a reference sub-region of the reflected light intensity map in which the light intensity values meet a preset condition;
and determining all pixel points or part of pixel points of the reference sub-region as light intensity reference pixel points.
3. The depth map detection method of claim 2, wherein the determining an abnormal pixel with an abnormal depth value from the time-of-flight depth map according to the light intensity value of the light intensity reference pixel comprises:
determining a target sub-region adjacent to the reference sub-region according to the reflected light intensity map and the reference sub-region;
when the light intensity value of the light intensity reference pixel point is larger than a first preset threshold value, determining the pixel point corresponding to the target subregion as an abnormal depth pixel point;
and when the light intensity value of the light intensity reference pixel point is greater than a second preset threshold and smaller than the first preset threshold, determining an abnormal pixel point with an abnormal depth value from the target subregion according to the light intensity value of the light intensity reference pixel point.
4. The depth map detection method of claim 3, wherein the determining an abnormal pixel with an abnormal depth value from the target sub-region according to the light intensity value of the light intensity reference pixel comprises:
determining an edge region where the reference sub-region and the target sub-region are intersected;
determining a target width distance according to the light intensity value of the light intensity reference pixel point;
determining an abnormal pixel region in the target sub-region according to the edge region and the target width distance;
and determining abnormal pixel points with abnormal depth values according to the abnormal pixel areas.
5. The depth map detecting method of claim 4, wherein the determining the target width distance according to the light intensity value of the light intensity reference pixel point comprises:
and determining the target width distance according to the light intensity value of the light intensity reference pixel point and the depth value of the light intensity reference pixel point.
6. The depth map detection method of claim 5, wherein the determining the target width distance according to the light intensity values of the light intensity reference pixels and the depth values of the light intensity reference pixels comprises:
determining the reflectivity of the light intensity reference pixel points according to the light intensity values of the light intensity reference pixel points and the depth values of the light intensity reference pixel points;
and determining the target width distance according to the light intensity value and the reflectivity of the light intensity reference pixel point.
7. The depth map detecting method of claim 6, wherein the determining the target width distance according to the light intensity value and the reflectivity of the light intensity reference pixel point comprises:
obtaining a functional relation, wherein the functional relation is obtained by fitting the light intensity values and reflectivities of sample reflected light intensity maps with the sample width distances corresponding to the sample reflected light intensity maps;
and determining, according to the functional relation, the target width distance corresponding to the light intensity value and the reflectivity of the light intensity reference pixel point.
8. The depth map detecting method according to any one of claims 1 to 7, wherein after determining an abnormal pixel having an abnormal depth value from the time-of-flight depth map according to the light intensity value of the light intensity reference pixel, the method further comprises:
determining a target pixel point with a normal depth value according to the abnormal pixel point;
acquiring an image to be processed corresponding to the flight time depth map;
and performing image processing on the image to be processed according to the target pixel point, wherein the image processing comprises at least one of semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing or defect recognition processing.
9. The depth map detecting method according to any one of claims 4 to 7, wherein after determining an abnormal pixel having an abnormal depth value from the time-of-flight depth map according to the light intensity value of the light intensity reference pixel, the method further comprises:
determining a target adjustment amount according to the target width distance;
and adjusting the exposure parameters of the flight time camera according to the target adjustment quantity.
10. The depth map detecting method according to any one of claims 1 to 7, wherein after determining an abnormal pixel having an abnormal depth value from the time-of-flight depth map according to the light intensity value of the light intensity reference pixel, the method further comprises:
and determining a depth reference pixel point corresponding to the abnormal pixel point from the flight time depth map, and correcting the depth value of the abnormal pixel point according to the depth value of the depth reference pixel point to obtain a corrected pixel point.
11. The depth map detection method of claim 10, wherein the determining depth reference pixels corresponding to the abnormal pixels from the time-of-flight depth map comprises:
and aiming at the abnormal pixel points, selecting depth reference pixel points from a preset distance range according to the illumination intensity of the abnormal pixel points, wherein the depth reference pixel points are different from the abnormal pixel points.
12. The method of claim 10, wherein the depth reference pixels include a first depth reference pixel and a second depth reference pixel, and the determining a depth reference pixel corresponding to the abnormal pixel from the time-of-flight depth map, and modifying the depth value of the abnormal pixel according to the depth value of the depth reference pixel to obtain a modified pixel includes:
for the abnormal pixel point, determining a first depth reference pixel point and a second depth reference pixel point according to their distances to the abnormal pixel point, wherein the difference between the depth values of the first depth reference pixel point and the second depth reference pixel point is greater than a third preset threshold;
establishing a connection line between the first depth reference pixel point and the second depth reference pixel point, and determining the depth information change trend of a plurality of pixel points on the connection line;
determining a target depth value according to the depth information change trend;
and adjusting the depth value of the abnormal pixel point to the target depth value to obtain the corrected pixel point.
13. The depth map detection method of claim 10, wherein after obtaining the corrected pixel points, the method further comprises:
obtaining a corrected flight time depth map according to the corrected pixel points;
acquiring an image to be processed corresponding to the corrected time-of-flight depth map;
and performing image processing on the image to be processed according to the corrected time-of-flight depth map, wherein the image processing comprises at least one of semantic segmentation processing, gesture recognition processing, gesture tracking processing, three-dimensional reconstruction processing, ray scanning processing or defect recognition processing.
14. A depth map detection apparatus, comprising:
the system comprises an image acquisition module, a data processing module and a data processing module, wherein the image acquisition module is used for acquiring a flight time depth map of a flight time camera and a reflected light intensity map corresponding to the flight time depth map;
the first determining module is used for determining light intensity reference pixel points from the reflected light intensity map according to the reflected light intensity map;
and the second determining module is used for determining an abnormal pixel point with an abnormal depth value from the flight time depth map according to the light intensity value of the light intensity reference pixel point.
15. A storage medium having stored thereon a computer program, wherein when the computer program is loaded by a processor of an electronic device, the steps of the depth map detection method according to any one of claims 1-13 are performed.
16. An electronic device comprising a processor and a memory, the memory storing a computer program, wherein the processor performs the steps in the depth map detection method according to any one of claims 1-13 by loading the computer program.
CN202210217090.4A 2022-03-07 2022-03-07 Depth map detection method and device, storage medium and electronic equipment Pending CN114742756A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210217090.4A CN114742756A (en) 2022-03-07 2022-03-07 Depth map detection method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210217090.4A CN114742756A (en) 2022-03-07 2022-03-07 Depth map detection method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN114742756A true CN114742756A (en) 2022-07-12

Family

ID=82275764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210217090.4A Pending CN114742756A (en) 2022-03-07 2022-03-07 Depth map detection method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114742756A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118379286A (en) * 2024-06-21 2024-07-23 宝鸡市聚鑫源新材料股份有限公司 Surface defect detection method and system for titanium alloy forging
CN118379286B (en) * 2024-06-21 2024-08-30 宝鸡市聚鑫源新材料股份有限公司 Surface defect detection method and system for titanium alloy forging

Similar Documents

Publication Publication Date Title
US10091491B2 (en) Depth image generating method and apparatus and depth image processing method and apparatus
US8305377B2 (en) Image processing method
TW201916679A (en) Auto white balance method performed by an image signal processor
US20240027619A1 (en) Image processing method and system for optical distance measurement
US20230325979A1 (en) Image correction method, and under-screen system
CN114742756A (en) Depth map detection method and device, storage medium and electronic equipment
WO2022250894A1 (en) Distributed depth data processing
EP4423708A1 (en) Denoising depth image data using neural networks
WO2023064148A1 (en) System and method for detecting calibration of a 3d sensor
CN110490848B (en) Infrared target detection method, device and computer storage medium
US11074708B1 (en) Dark parcel dimensioning
US11585936B2 (en) Range imaging camera and range imaging method
US20220262026A1 (en) Depth image generation method and apparatus, reference image generation method and apparatus, electronic device, and computer-readable storage medium
CN115423808B (en) Quality detection method for speckle projector, electronic device, and storage medium
CN109741384A (en) The more distance detection devices and method of depth camera
EP4071578A1 (en) Light source control method for vision machine, and vision machine
US10991112B2 (en) Multiple scale processing for received structured light
JP2022125966A (en) Ranging correction device, ranging correction method, ranging correction program, and ranging device
CN114697433A (en) Under screen phase machine
US20240098382A1 (en) Image processing device and image processing method
US20240265548A1 (en) Method and computing device for enhanced depth sensor coverage
US20230072179A1 (en) Temporal metrics for denoising depth image data
EP4386675A1 (en) External parameter determination method and image processing device
US20200191917A1 (en) Image processing device, distance detection device, image processing method, and non-transitory storage medium
CN117911478A (en) Depth camera capable of measuring extremely close distance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination