CN113099120A - Depth information acquisition method and device, readable storage medium and depth camera

Info

Publication number: CN113099120A (application CN202110394765.8A; granted as CN113099120B)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 熊斌, 郭振民
Assignee: Nanchang Virtual Reality Institute Co Ltd
Priority / filing date: 2021-04-13
Legal status: Granted, active

Classifications

    • H04N23/61: Control of cameras or camera modules based on recognised objects
    • G01B11/22: Measuring arrangements characterised by the use of optical techniques for measuring depth
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N5/33: Transforming infrared radiation
    • Y02A90/30: Assessment of water resources


Abstract

A depth information acquisition method and device, a readable storage medium, and a depth camera are provided. The method comprises: detecting the closest distance between an object and the depth camera, and judging whether that distance falls within the working distance range of the depth camera; if so, acquiring a current image of the object and determining the pixel offset, relative to a reference image, of the image of a calibration plane at the closest distance; modifying the pixel offset range of the depth camera accordingly; for each pixel point in the current image, traversing all reference pixel points within the pixel offset range in the reference image to find the reference pixel point corresponding to that pixel point; and determining the target pixel offset of each pixel point from the coordinates of the pixel point and its corresponding reference pixel point, then calculating the depth value of the pixel point. The method improves the efficiency of acquiring image depth information.

Description

Depth information acquisition method and device, readable storage medium and depth camera
Technical Field
The present invention relates to the field of depth cameras, and in particular, to a depth information acquiring method and apparatus, a readable storage medium, and a depth camera.
Background
A depth camera can be used to acquire depth information of a target object to enable 3D scanning, scene reconstruction, recognition, interaction, and the like. By operating principle, depth cameras fall mainly into structured light, binocular vision, and time-of-flight types, among which the structured light depth camera is technologically mature and widely used.
An existing structured light depth camera generally includes an infrared projector and an image sensor: the infrared projector projects an infrared beam onto an object, the beam reflected from the object reaches the image sensor, and after a series of processing steps the depth information of the object is obtained. A common problem with current structured light depth cameras is that this processing consumes considerable computing resources and time, so efficiency is low and the user experience suffers.
Disclosure of Invention
In view of the foregoing, it is necessary to provide a depth information acquisition method and device, a readable storage medium, and a depth camera that address the low efficiency with which prior-art structured light depth cameras acquire object depth information.
A depth information acquisition method, comprising:
detecting the closest distance between an object and a depth camera, and judging whether the closest distance is within the working distance range of the depth camera;
if so, acquiring a current image of the object, and determining the pixel offset of the image of the calibration plane relative to a reference image when the calibration plane is at the closest distance, wherein the reference image is an image acquired when the calibration plane is at a reference depth;
modifying the pixel offset range of the depth camera such that the upper limit value of the pixel offset range equals the determined pixel offset, the initial values of the pixel offset range being determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera;
traversing all reference pixel points in the pixel offset range in the reference image aiming at each pixel point in the current image so as to search the reference pixel point corresponding to each pixel point in the current image;
and determining the target pixel offset of the pixel point according to the coordinates of the pixel point and the coordinates of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
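Taken together, the five steps above amount to the following pipeline. The sketch below is a minimal Python illustration, not the patent's implementation: the camera object and every method on it (measure_closest_distance, capture_image, pixel_offset_at, best_matching_offset, depth_from_offset) are hypothetical names standing in for the sensor, imaging, and matching operations detailed later in this description.

```python
def acquire_depth_map(camera):
    # Step 1: detect the closest object distance and gate on the working range.
    d_min = camera.measure_closest_distance()        # hypothetical sensor call
    if not (camera.d0 <= d_min <= camera.d1):
        camera.sleep()                               # out of range: stay dormant
        return None
    # Step 2: capture the current image and compute the offset at d_min.
    current = camera.capture_image()                 # hypothetical capture call
    # Step 3: narrow the offset search range; only the upper limit moves.
    lower = camera.initial_offset_range[0]
    upper = camera.pixel_offset_at(d_min)            # triangulation relation
    # Steps 4 and 5: match each pixel against the reference pixels within
    # [lower, upper] and convert the best-match offset into a depth value.
    depth = {}
    for (i, j) in camera.pixel_coordinates(current):
        offset = camera.best_matching_offset(current, i, j, lower, upper)
        depth[(i, j)] = camera.depth_from_offset(offset)
    return depth
```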
Further, in the depth information obtaining method, the step of traversing all reference pixel points in the pixel offset range in the reference image to search for a reference pixel point corresponding to each pixel point in the current image includes:
calculating the correlation between the pixel points and each reference pixel point in the pixel offset range in the reference image;
and determining the reference pixel point with the largest correlation as the reference pixel point corresponding to the pixel point.
Further, in the depth information obtaining method, the correlation S between the pixel point and a reference pixel point in the reference image is calculated as:
$$S=\frac{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})(B_{mn}-\bar{B})}{\sqrt{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})^{2}\cdot\sum_{m=1}^{N}\sum_{n=1}^{N}(B_{mn}-\bar{B})^{2}}}$$
where $\bar{A}$ is the mean gray value of the pixels in a window of size N × N centered on the reference pixel point, $\bar{B}$ is the mean gray value of the pixels in a window of size N × N centered on the pixel point of the current image, $A_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the reference image, $B_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the current image, and N is the window size.
Further, the above depth information obtaining method, wherein after the step of determining whether the closest distance is within the working distance range of the depth camera, further includes:
and when the nearest distance exceeds the working distance range, controlling the depth camera to sleep.
Further, in the depth information acquisition method, the step of detecting the closest distance between the object and the depth camera includes:
detecting the closest distance between the object and the depth camera by a distance sensor whose beam angle is comparable to the field angle of the depth camera.
Further, in the depth information obtaining method, the depth value d of the pixel point is calculated as:
$$d=\frac{f\,b\,d_{ref}}{f\,b+d_{ref}\,\Delta p}$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $\Delta p$ is the target pixel offset of the pixel point.
Further, in the depth information obtaining method, the pixel offset $\Delta p_{min}$ of the image of the calibration plane relative to the reference image when the calibration plane is at the closest distance is calculated as:
$$\Delta p_{min}=f\,b\left(\frac{1}{d_{min}}-\frac{1}{d_{ref}}\right)$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $d_{min}$ is the closest distance.
The invention also discloses a depth information acquisition device, which comprises:
the detection module is used for detecting the closest distance between an object and the depth camera and judging whether the closest distance is within the working distance range of the depth camera;
the acquisition module is used for acquiring a current image of the object;
the determining module is used for determining the pixel offset of the image of the calibration plane relative to a reference image when the calibration plane is at the closest distance, wherein the reference image is an image of the calibration plane acquired when the calibration plane is at a reference depth;
a modification module, configured to modify the pixel offset range of the depth camera such that the upper limit value of the pixel offset range equals the determined pixel offset, the initial values of the pixel offset range having been determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera;
the searching module is used for traversing all reference pixel points in the pixel offset range in the reference image aiming at each pixel point in the current image so as to search the reference pixel point corresponding to each pixel point in the current image;
and the calculation module is used for determining the target pixel offset of the pixel point according to the coordinate of the pixel point and the coordinate of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
Further, in the depth information obtaining apparatus, the step of traversing all reference pixels in the reference image within the pixel offset range to find a reference pixel corresponding to each pixel in the current image includes:
calculating the correlation between the pixel points and each reference pixel point in the pixel offset range in the reference image;
and determining the reference pixel point with the largest correlation as the reference pixel point corresponding to the pixel point.
Further, the depth information acquiring apparatus further includes:
and the control module is used for controlling the depth camera to sleep when the nearest distance exceeds the working distance range.
The present invention also discloses a computer-readable storage medium on which a program is stored; when executed by a processor, the program implements any of the depth information acquisition methods described above.
The invention also discloses a depth camera comprising a memory, a processor, and a program stored on the memory and executable on the processor, wherein the processor implements any of the above depth information acquisition methods when executing the program.
In the invention, the depth information of the depth camera's image is calculated from the pixel offset of each pixel point in the acquired current image. The offset of each pixel point of the current image is determined from the coordinates of that pixel point and of its corresponding reference pixel point in the reference image, and the depth value of each pixel point is calculated from this offset, yielding the depth information of the current image. Moreover, by detecting the closest distance between the object and the depth camera, the pixel offset range of the depth camera is narrowed, which greatly shortens the search for the reference pixel points corresponding to the pixel points of the current image and thus improves the efficiency of acquiring the depth information of the current image.
Drawings
Fig. 1 is a flowchart of a depth information acquisition method according to a first embodiment of the present invention;
fig. 2 is a flowchart of a depth information obtaining method according to a second embodiment of the present invention;
Fig. 3 is a schematic diagram of the relationship between the field angle of the depth camera and the beam angle of the distance sensor;
Fig. 4 is a schematic diagram of a window of size N × N centered on a pixel point in the current image;
Fig. 5 is a schematic diagram of a window of size N × N;
fig. 6 is a block diagram of a depth information acquiring apparatus according to a third embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
These and other aspects of embodiments of the invention will become apparent from the following description taken together with the accompanying drawings. In the description and drawings, particular embodiments of the invention are disclosed in detail as indicative of some of the ways in which the principles of the embodiments may be practiced, but it is understood that the scope of the embodiments is not limited thereby. On the contrary, the embodiments of the invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
Referring to fig. 1, a depth information obtaining method according to a first embodiment of the invention includes steps S11-S15.
Step S11, detecting a closest distance between the object and the depth camera, and determining whether the closest distance is within a working distance range of the depth camera.
The depth information acquisition method in this embodiment is applied to a depth camera and is used to acquire depth information of a photographed object. The depth camera is specifically a structured light depth camera, whose structured light can be realized by speckle, mask, grating, line laser, or similar techniques. Compared with a conventional structured light camera, the depth camera in this embodiment has a ranging function: a distance sensor is added to the depth camera, through which the camera can detect the distance between an object and itself. The distance sensor is, for example, an ultrasonic sensor or a millimeter-wave ranging sensor. Taking the ultrasonic sensor as an example, it emits ultrasonic pulses in real time and detects echoes; the closest distance between the object and the depth camera is calculated from the arrival time of the first valid echo, and when this closest distance is within the working distance range of the depth camera, step S12 is executed.
Step S12, when the closest distance is within the working distance range, acquiring a current image of the object, and determining a pixel offset of an image of the calibration plane at the closest distance relative to a reference image, where the reference image is an image acquired by the calibration plane at a reference depth.
Step S13, modifying the pixel offset range of the depth camera so that the upper limit value of the pixel offset range equals the determined pixel offset. The initial values of the pixel offset range are determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera.
Before the depth camera leaves the factory, a reference image must be preset as the reference standard for subsequently captured images, against which pixel offsets are calculated. The reference image is an image of a calibration plane acquired when the plane is parallel to the camera at a distance equal to the reference depth. The reference depth can be set as needed and may be the middle value of the camera's working distance range; for example, with a working distance range of 50 cm to 150 cm, the reference depth is 100 cm, and the pixel offset at the reference depth is defined as 0. When the calibration plane deviates from the reference depth, the pixels of its acquired image shift by a certain amount relative to the reference image, i.e. a corresponding pixel offset is produced.
Taking a speckle structured light depth camera as an example, the speckle projector emits speckle structured light outward during operation. Before the camera leaves the factory, a grayscale image of the speckle pattern on a calibration plane parallel to the camera is captured as the reference image. When the distance between the calibration plane and the camera changes, the speckles projected by the camera move across the plane, and the distance between the calibration plane and the camera, i.e. the depth, can be calculated by observing the offset (manifested as a lateral shift) of the speckles relative to those in the reference image. When an object is close to the camera the speckles shift to one side, and when it is far from the camera they shift to the other side.
In this embodiment, the pixel offset of the calibration plane at any depth can be calculated from the offset-depth relation, where the depth is the distance between the photographed object and the depth camera. The relationship between pixel offset and depth follows from the design parameters of the depth camera and the triangulation principle. Specifically, the pixel offset $\Delta p_{min}$ of the image of the calibration plane relative to the reference image when the calibration plane is at the closest distance is:
$$\Delta p_{min}=f\,b\left(\frac{1}{d_{min}}-\frac{1}{d_{ref}}\right)$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $d_{min}$ is the closest distance.
Each depth camera has a working distance range, i.e. the effective range the camera can capture, typically from a few tens of centimeters to about two meters. When the calibration plane is placed at the closest and the farthest working distance of the camera respectively, the two resulting extreme offsets are the two endpoint values of the camera's pixel offset range; for example, a depth camera with a working distance of 50 cm to 150 cm has a corresponding pixel offset range of (-20, 59).
Once the closest distance between the object and the depth camera is determined, the pixel offset, relative to the reference image, of the calibration-plane image acquired at that closest distance is calculated, and the pixel offset range of the depth camera is then modified so that the calculated offset becomes the upper limit of the range. For example, if the closest distance between the object and the depth camera is detected to be 110 cm and the corresponding pixel offset is calculated to be -4, the modified pixel offset range is (-20, -4). If the object's position changes and an image is captured again, the pixel offset range is revised according to the newly detected closest distance.
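As a minimal Python sketch of this range-narrowing step: the constants D_REF and FB below are illustrative assumptions, chosen so that the 50 cm to 150 cm working range maps roughly onto the example range (-20, 59); they are not parameters taken from the patent.

```python
D_REF = 100.0   # reference depth in cm (assumed: middle of the working range)
FB = 6000.0     # focal length (pixels) times baseline (cm), assumed value

def pixel_offset(depth_cm):
    """Offset of the calibration-plane image at depth_cm relative to the
    reference image, from the triangulation relation f*b*(1/d - 1/d_ref)."""
    return FB * (1.0 / depth_cm - 1.0 / D_REF)

def narrowed_offset_range(d_min_cm, d_far_cm=150.0):
    """Keep the lower endpoint (farthest working distance) and replace the
    upper endpoint with the offset at the detected closest distance."""
    return pixel_offset(d_far_cm), pixel_offset(d_min_cm)

print(narrowed_offset_range(110.0))   # approximately (-20.0, -5.5)
```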
Step S14, for each pixel point in the current image, traversing all reference pixel points in the pixel offset range in the reference image to find a reference pixel point corresponding to each pixel point in the current image.
Step S15, determining the target pixel offset of the pixel point according to the coordinate of the pixel point and the coordinate of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
The current image of the object captured by the depth camera is typically a grayscale image. For each pixel point in the current image, all reference pixel points within the pixel offset range in the reference image are traversed to find the reference pixel point corresponding to that pixel point. To distinguish them from the current image, the pixel points in the reference image are called reference pixel points in this embodiment. In a specific implementation, the correspondence between a pixel point of the current image and a reference pixel point of the reference image can be measured by correlation, and the reference pixel point with the maximum correlation is taken as the corresponding reference pixel point. The target pixel offset of each pixel point of the current image relative to the reference image is then determined from the coordinate difference between the pixel point in the current image and its corresponding reference pixel point.
Specifically, for each pixel point in the current image, for example pixel point B(i, j), i.e. the pixel in row i, column j, the depth camera's algorithm traverses all reference pixel points in the reference image within the offset range of -20 to -4, i.e. 17 pixels: A(i, j-4), A(i, j-5), …, A(i, j-20), to find the reference pixel point corresponding to B(i, j). If the corresponding reference pixel point is A(i, j-p), the target pixel offset is Δp = p.
The pixel offset is then converted into depth: by triangulation, the distance d between the pixel point and the depth camera, i.e. the depth value of the pixel point, is calculated as:
$$d=\frac{f\,b\,d_{ref}}{f\,b+d_{ref}\,\Delta p}$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $\Delta p$ is the target pixel offset of the pixel point.
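A companion sketch for this depth conversion, under the same assumed constants as the range-narrowing sketch above (illustrative values, not the patent's):

```python
D_REF = 100.0   # reference depth in cm (assumed)
FB = 6000.0     # focal length (pixels) times baseline (cm), assumed

def depth_from_offset(delta_p):
    """Invert delta_p = FB*(1/d - 1/D_REF) to recover the depth d."""
    return FB * D_REF / (FB + D_REF * delta_p)

print(depth_from_offset(0.0))    # 100.0: zero offset means the reference depth
print(depth_from_offset(-4.0))   # ~107.1 cm: negative offset lies beyond it
```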
In this embodiment, the depth information of the depth camera's image is calculated from the pixel offset of each pixel point in the acquired current image. The offset of each pixel point of the current image is determined from the coordinates of that pixel point and of its corresponding reference pixel point in the reference image, and the depth value of each pixel point is calculated from this offset, yielding the depth information of the current image. Moreover, by detecting the closest distance between the object and the depth camera, the pixel offset range of the depth camera is narrowed, which greatly shortens the search for the reference pixel points corresponding to the pixel points of the current image and thus improves the efficiency of acquiring the depth information of the current image.
Referring to fig. 2, a depth information obtaining method according to a second embodiment of the present invention includes steps S21 to S27.
Step S21, detecting the closest distance between the object and the depth camera by a distance sensor whose beam angle is comparable to the field angle of the depth camera.
To make the detection range of the distance sensor cover the field of view of the depth camera, as shown in Fig. 3, the distance sensor used in this embodiment is a sensor with a large beam angle, such as an ultrasonic sensor whose beam angle is comparable to the field angle of the depth camera, about 60°, so that objects within the capture range of the depth camera can be detected by the distance sensor.
It will be appreciated that in other embodiments of the invention, a large beam angle distance sensor may also be simulated by providing a plurality of small beam angle distance sensors.
The ultrasonic sensor emits sound wave pulses in real time and detects echoes; while no echo is received, the depth camera remains in a dormant state. If a valid echo is received, the time of the first valid echo is determined: if a sound pulse is emitted at time t0 and the first valid echo is detected at time t1, the shortest distance between the object and the camera is d_min = C·(t1 - t0)/2, where C is the speed of sound, 340 m/s.
Step S22, determining whether the closest distance is within the working distance range of the depth camera.
And step S23, when the nearest distance is within the working distance range, starting the depth camera and acquiring the current image of the object.
Each depth camera has a corresponding working distance range d0 to d1, typically tens of centimeters to two meters, e.g. 50 cm to 150 cm in this embodiment. If the closest distance d_min < d0, the object is too close to the camera; for safety, the projector of the depth camera is turned off, the camera sleeps, and the steps of emitting sound pulses in real time and detecting echoes are repeated. If the closest distance d_min > d1, the object is too far from the depth camera; the depth camera likewise sleeps and the emit-and-detect steps are repeated.
If the echo distance d_min falls within the working distance range d0 to d1 of the depth camera, the depth camera is started and the image sensor of the depth camera acquires an infrared image as the current image.
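The echo timing and distance gating of steps S21 to S23 can be sketched as follows; the constants and function names are ours, and the halved round trip reflects the echo travelling out and back:

```python
SPEED_OF_SOUND = 340.0   # m/s
D0, D1 = 0.5, 1.5        # working distance range in metres (50 cm to 150 cm)

def closest_distance(t_emit, t_first_echo):
    """d_min from the first valid ultrasonic echo (round trip halved)."""
    return SPEED_OF_SOUND * (t_first_echo - t_emit) / 2.0

def gate(d_min):
    """Decide whether to wake the camera or keep it dormant."""
    if d_min < D0:
        return "sleep: object too close, turn off projector for safety"
    if d_min > D1:
        return "sleep: object beyond the working range"
    return "start camera and capture the infrared image"

d = closest_distance(0.0, 0.0065)    # 6.5 ms round trip -> about 1.1 m
print(round(d, 3), "->", gate(d))
```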
Step S24, calculating the pixel offset of the image of the calibration plane at the closest distance relative to a reference image, where the reference image is the image acquired when the calibration plane is at the reference depth.
The relationship between pixel offset and depth can be calculated from the design parameters of the depth camera and the triangulation principle. Specifically, the pixel offset $\Delta p_{min}$ of the image of the calibration plane relative to the reference image when the calibration plane is at the closest distance is:
$$\Delta p_{min}=f\,b\left(\frac{1}{d_{min}}-\frac{1}{d_{ref}}\right)$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $d_{min}$ is the closest distance.
In step S25, the pixel offset range of the depth camera is modified so that the upper limit value of the range equals the calculated pixel offset.
The initial endpoint values of the depth camera's pixel offset range are the pixel offsets measured when the calibration plane is at the two extreme working distances of the camera; for example, the pixel offset range of the depth camera is (-20, 59). After the depth camera obtains the closest distance of the currently photographed object, the pixel offset corresponding to that distance is calculated and the upper limit of the offset range is modified accordingly, narrowing the range. For example, if the closest distance between the object and the depth camera is 110 cm, offsets corresponding to depths of 50 cm to 110 cm are impossible; the offset corresponding to 110 cm is calculated to be -4, so the modified pixel offset range is (-20, -4).
It should be noted that in this embodiment the two steps of acquiring the current image of the object and modifying the pixel offset range may be performed in either order; in other embodiments of the invention, after the closest distance is determined to be within the preset distance range, the pixel offset range may be modified first and the current image acquired afterwards, i.e. the step of acquiring the current image of the object may be performed after step S24.
Step S26, for each pixel point in the current image, traversing all reference pixel points in the pixel offset range in the reference image to find a reference pixel point corresponding to each pixel point in the current image.
To determine the offset of each pixel point in the current image, the reference pixel point corresponding to each pixel point must be found in the reference image. The correspondence can be measured by correlation: the reference pixel point with the maximum correlation is taken as the corresponding point, and the offset of the pixel point is calculated from the coordinate difference between the two corresponding points.
To find the reference pixel with the maximum correlation, the correlation coefficients between the current image pixel and all reference pixels within the offset range must be computed, so narrowing the offset range improves the efficiency of this step. With the modified pixel offset range of -20 to -4, 17 pixels in the reference image fall within the range: A(i, j-4), A(i, j-5), …, A(i, j-20). Computing the correlation between each of these 17 pixels and pixel point B(i, j) one by one requires only 17 calculations to obtain 17 correlation values S and find the pixel with the maximum correlation, greatly reducing the computation compared with the initial pixel offset range, which would require 80 calculations. Assuming pixel point A(i, j-p) of the reference image is the reference pixel point corresponding to pixel point B(i, j), the target pixel offset of pixel point B(i, j) is Δp = p.
There are many ways to compute the correlation. For example, in a de-mean correlation algorithm, a window is taken around each pixel point in the reference image and the current image, i.e. a window of size N × N centered on the pixel point (as shown in Fig. 4 and Fig. 5), and the correlation S between corresponding pixel points of the current image and the reference image is calculated as:
$$S=\frac{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})(B_{mn}-\bar{B})}{\sqrt{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})^{2}\cdot\sum_{m=1}^{N}\sum_{n=1}^{N}(B_{mn}-\bar{B})^{2}}}$$
where $\bar{A}$ is the mean gray value of the pixels in the N × N window centered on the reference pixel point, $\bar{B}$ is the mean gray value of the pixels in the N × N window centered on the pixel point of the current image, $A_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the reference image, $B_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the current image, and N is the window size.
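A minimal NumPy sketch of this de-mean correlation and of the candidate search over the narrowed offset range; the window size n and the function names are our assumptions, and the caller is responsible for keeping the windows inside the image bounds:

```python
import numpy as np

def demean_correlation(win_a, win_b):
    """Correlation S between two N x N gray-value windows (de-mean form)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_offset(ref, cur, i, j, off_lo, off_hi, n=7):
    """Return the offset in [off_lo, off_hi] whose reference window around
    A(i, j+offset) best matches the window around B(i, j); with the range
    (-20, -4) this evaluates 17 candidates instead of the initial 80."""
    h = n // 2
    win_b = cur[i - h:i + h + 1, j - h:j + h + 1]
    scores = {}
    for off in range(off_lo, off_hi + 1):
        win_a = ref[i - h:i + h + 1, j + off - h:j + off + h + 1]
        scores[off] = demean_correlation(win_a, win_b)
    return max(scores, key=scores.get)
```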
Step S27, determining the target pixel offset of the pixel point according to the coordinate of the pixel point and the coordinate of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
The pixel offset is then converted into depth: by triangulation, the distance d between the pixel point and the depth camera, i.e. the depth value of the pixel point, is calculated as:
$$d=\frac{f\,b\,d_{ref}}{f\,b+d_{ref}\,\Delta p}$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $\Delta p$ is the target pixel offset of the pixel point.
Referring to fig. 6, a depth information obtaining apparatus according to a third embodiment of the present invention includes:
the detection module 10 is configured to detect a closest distance between an object and a depth camera, and determine whether the closest distance is within a working distance range of the depth camera;
an acquisition module 20 for acquiring a current image of the object;
a determining module 30, configured to determine the pixel offset of the image of the calibration plane relative to a reference image when the calibration plane is at the closest distance, where the reference image is an image of the calibration plane acquired when the calibration plane is at a reference depth;
a modification module 40, configured to modify the pixel offset range of the depth camera such that the upper limit value of the pixel offset range equals the determined pixel offset, the initial values of the pixel offset range having been determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera;
a searching module 50, configured to traverse all reference pixel points in the pixel offset range in the reference image for each pixel point in the current image, so as to search for a reference pixel point corresponding to each pixel point in the current image;
and a calculating module 60, configured to determine a target pixel offset of the pixel according to the coordinate of the pixel and the coordinate of the corresponding reference pixel, and calculate a depth value of the pixel according to the target pixel offset.
Further, in the depth information obtaining apparatus, the step of traversing all reference pixels in the reference image within the pixel offset range to find a reference pixel corresponding to each pixel in the current image includes:
calculating the correlation between the pixel points and each reference pixel point in the pixel offset range in the reference image;
and determining the reference pixel point with the largest correlation as the reference pixel point corresponding to the pixel point.
Further, the depth information acquiring apparatus further includes:
and the control module is used for controlling the depth camera to sleep when the nearest distance exceeds the working distance range.
The depth information acquisition apparatus provided in this embodiment of the present invention has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where this apparatus embodiment does not mention a point, reference may be made to the corresponding content of the foregoing method embodiments.
An embodiment of the present invention further provides a computer-readable storage medium, on which a program is stored, where the program, when executed by a processor, implements any one of the depth information obtaining methods described above.
The embodiment of the invention also provides a depth camera, which comprises a memory, a processor and a program which is stored on the memory and can be run on the processor, wherein when the processor executes the program, the depth information acquisition method can be realized.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable storage medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable storage medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A depth information acquisition method, comprising:
detecting the closest distance between an object and a depth camera, and judging whether the closest distance is within the working distance range of the depth camera;
if so, acquiring a current image of the object, and determining the pixel offset of the image of the calibration plane relative to a reference image when the calibration plane is at the closest distance, wherein the reference image is an image acquired when the calibration plane is at a reference depth;
modifying the pixel offset range of the depth camera such that the upper limit value of the pixel offset range equals the determined pixel offset, the initial values of the pixel offset range being determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera;
traversing all reference pixel points in the pixel offset range in the reference image aiming at each pixel point in the current image so as to search the reference pixel point corresponding to each pixel point in the current image;
and determining the target pixel offset of the pixel point according to the coordinates of the pixel point and the coordinates of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
2. The method according to claim 1, wherein the step of traversing all reference pixels in the reference image within the pixel offset range to find a reference pixel corresponding to each pixel in the current image comprises:
calculating the correlation between the pixel points and each reference pixel point in the pixel offset range in the reference image;
and determining the reference pixel point with the largest correlation as the reference pixel point corresponding to the pixel point.
3. The depth information acquisition method according to claim 2, wherein the correlation S between the pixel point and a reference pixel point in the reference image is calculated as:
$$S=\frac{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})(B_{mn}-\bar{B})}{\sqrt{\sum_{m=1}^{N}\sum_{n=1}^{N}(A_{mn}-\bar{A})^{2}\cdot\sum_{m=1}^{N}\sum_{n=1}^{N}(B_{mn}-\bar{B})^{2}}}$$
where $\bar{A}$ is the mean gray value of the pixels in a window of size N × N centered on the reference pixel point, $\bar{B}$ is the mean gray value of the pixels in a window of size N × N centered on the pixel point of the current image, $A_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the reference image, $B_{mn}$ is the gray value of the pixel at row m, column n of the N × N window in the current image, and N is the window size.
4. The depth information acquisition method according to claim 1, wherein the step of detecting the closest distance of the object to the depth camera includes:
detecting a closest distance of the object to a depth camera by a distance sensor having a beam angle comparable to a field angle of the depth camera.
5. The depth information obtaining method according to claim 1, wherein the depth value d of the pixel point is calculated as:
$$d=\frac{f\,b\,d_{ref}}{f\,b+d_{ref}\,\Delta p}$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $\Delta p$ is the target pixel offset of the pixel point.
6. The depth information acquisition method according to claim 1, wherein the pixel offset $\Delta p_{min}$ of the image of the calibration plane relative to the reference image when the calibration plane is at the closest distance is calculated as:
$$\Delta p_{min}=f\,b\left(\frac{1}{d_{min}}-\frac{1}{d_{ref}}\right)$$
where $d_{ref}$ is the distance of the reference image from the depth camera, $f$ is the focal length of the depth camera, $b$ is the distance of the projector from the image sensor, and $d_{min}$ is the closest distance.
7. A depth information acquisition apparatus characterized by comprising:
the detection module is used for detecting the closest distance between an object and the depth camera and judging whether the closest distance is within the working distance range of the depth camera;
the acquisition module is used for acquiring a current image of the object;
the determining module is used for determining the pixel offset of the image of the calibration plane relative to a reference image when the calibration plane is at the closest distance, wherein the reference image is an image of the calibration plane acquired when the calibration plane is at a reference depth;
a modification module, configured to modify the pixel offset range of the depth camera such that the upper limit value of the pixel offset range equals the determined pixel offset, the initial values of the pixel offset range having been determined from the pixel offsets, relative to the reference image, of images of the calibration plane acquired at the two extreme working distances of the depth camera;
the searching module is used for traversing all reference pixel points in the pixel offset range in the reference image aiming at each pixel point in the current image so as to search the reference pixel point corresponding to each pixel point in the current image;
and the calculation module is used for determining the target pixel offset of the pixel point according to the coordinate of the pixel point and the coordinate of the corresponding reference pixel point, and calculating the depth value of the pixel point according to the target pixel offset.
8. The depth information acquiring apparatus according to claim 7, wherein the step of traversing all the reference pixels in the pixel offset range in the reference image to find the reference pixel corresponding to each pixel in the current image comprises:
calculating the correlation between the pixel points and each reference pixel point in the pixel offset range in the reference image;
and determining the reference pixel point with the largest correlation as the reference pixel point corresponding to the pixel point.
9. A computer-readable storage medium on which a program is stored, the program implementing the depth information acquisition method according to any one of claims 1 to 6 when executed by a processor.
10. A depth camera comprising a memory, a processor and a program stored on the memory and executable on the processor, wherein the processor implements the depth information acquisition method of any one of claims 1 to 6 when executing the program.
Application CN202110394765.8A, filed 2021-04-13: Depth information acquisition method and device, readable storage medium and depth camera (granted as CN113099120B, active)

Priority Applications (1)

Application CN202110394765.8A (priority date 2021-04-13, filing date 2021-04-13): Depth information acquisition method and device, readable storage medium and depth camera

Publications (2)

  • CN113099120A, published 2021-07-09
  • CN113099120B, published 2023-04-18

Family ID: 76676750
Country status: CN, granted (CN113099120B)

Citations (9)

  • AU2011265379A1 (Canon Kabushiki Kaisha; priority 2011-12-20, published 2013-07-04): Single shot image based depth mapping
  • US20130266211A1 (Brigham Young University; priority 2012-04-06, published 2013-10-10): Stereo vision apparatus and method
  • CN104636709A (中国移动通信集团公司; priority 2013-11-12, published 2015-05-20): Method and device for positioning monitored target
  • WO2015163350A1 (株式会社ニコン; priority 2014-04-22, published 2015-10-29): Image processing device, imaging device and image processing program
  • CN110049305A (西安交通大学; priority 2017-12-18, published 2019-07-23): Structured light depth camera automatic correction method and device for a smartphone
  • CN110088563A (深圳市汇顶科技股份有限公司; priority 2019-03-13, published 2019-08-02): Image depth calculation method, image processing apparatus and three-dimensional measurement system
  • CN110657785A (清华大学; priority 2019-09-02, published 2020-01-07): Efficient scene depth information acquisition method and system
  • WO2020185351A1 (InterDigital VC Holdings, Inc.; priority 2019-03-08, published 2020-09-17): Depth map processing
  • WO2020188120A1 (Five AI Limited; priority 2019-03-21, published 2020-09-24): Depth extraction

Cited By (3)

  • CN113822920A (北京的卢深视科技有限公司; priority 2021-09-29, published 2021-12-21): Method for acquiring depth information by structured light camera, electronic equipment and storage medium
  • CN115019157A (武汉市聚芯微电子有限责任公司; priority 2022-07-06, published 2022-09-06): Target detection method, device, equipment and computer readable storage medium
  • CN115019157B (武汉市聚芯微电子有限责任公司; priority 2022-07-06, published 2024-03-22): Object detection method, device, equipment and computer readable storage medium


Similar Documents

  • EP3229041A1: Object detection using radar and vision defined image detection zone
  • JP6657500B2: Mobile platform control method and system
  • US9470548B2: Device, system and method for calibration of camera and laser sensor
  • CN113099120B: Depth information acquisition method and device, readable storage medium and depth camera
  • JP6450294B2: Object detection apparatus, object detection method, and program
  • CN113658241B: Monocular structured light depth recovery method, electronic device and storage medium
  • JP5540217B2: Laser scan sensor
  • Santos et al.: Underwater place recognition using forward-looking sonar images: A topological approach
  • KR20200071960A: Method and apparatus for vehicle detection using lidar sensor and camera convergence
  • JP2011145924A: Moving device and method
  • CN112771575A: Distance determination method, movable platform and computer readable storage medium
  • CN115047472B: Method, device, equipment and storage medium for determining laser radar point cloud layering
  • JP7348414B2: Method and device for recognizing blooming in lidar measurement
  • KR101238748B1: System for measuring distance of target using step-staring infrared sensor unit
  • CN102401901B: Distance measurement system and distance measurement method
  • Portugal-Zambrano et al.: Robust range finder through a laser pointer and a webcam
  • US11733362B2: Distance measuring apparatus comprising deterioration determination of polarizing filters based on a reflected polarized intensity from a reference reflector
  • CN113256483A: De-dithering of point cloud data for target identification
  • US11892569B2: Method and device for optical distance measurement
  • CN113014899B: Binocular image parallax determination method, device and system
  • KR20230158474A: Sensing system
  • Kim et al.: Imaging sonar based navigation method for backtracking of AUV
  • US10698111B2: Adaptive point cloud window selection
  • US20240077586A1: Method for generating intensity information having extended expression range by reflecting geometric characteristic of object, and lidar apparatus performing same method
  • US20230243923A1: Method for detecting intensity peaks of a specularly reflected light beam

Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination
  • GR01: Patent grant