CN111866490A - Depth image imaging system and method

Depth image imaging system and method

Info

Publication number: CN111866490A
Application number: CN202010729553.6A
Authority: CN (China)
Prior art keywords: scene, depth, image, determining, light
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 高岩
Current Assignee: Alipay Hangzhou Information Technology Co Ltd
Original Assignee: Alipay Hangzhou Information Technology Co Ltd
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202010729553.6A
Publication of CN111866490A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N 13/257: Colour aspects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The depth image imaging system and method provided by this specification use two RGB-IR sensors to acquire two RGB images and two IR images. In a strongly lit scene, the depth information of the scene is calculated from the parallax of the two RGB images. In a low-light scene, the depth information of the scene is calculated from the parallax of the two IR images. Under ordinary illumination, the depth information of the scene can be calculated from the parallax of the two RGB images, from the parallax of the two IR images, or from both, with the two depth results fused to obtain the final depth information. The depth image imaging system and method provided by this specification can therefore be applied to a wide range of illumination scenes and achieve high depth calculation accuracy.

Description

Depth image imaging system and method
Technical Field
The present disclosure relates to the field of image acquisition technologies, and in particular, to a depth image imaging system and method.
Background
In recent years, with the continuous development of science and technology, the application scenarios of computer vision have become increasingly broad, and depth camera technology has received growing attention in the industry. A depth camera measures the distance between objects in the photographed scene and the camera. Knowing this distance opens up considerable room for human-computer interaction: combined with the two-dimensional image of the scene captured by the camera, three-dimensional information about the surrounding scene can easily be obtained. Depth cameras are therefore widely applied in fields such as industrial automation, computer graphics, virtual reality, human-robot interaction, medical surgery and computer vision.
The depth vision imaging schemes currently available, such as structured-light and binocular stereo vision schemes, all require two or more cameras, and the depth of the image is calculated from the parallax between two cameras. The two cameras may be two RGB cameras, two IR cameras, or one RGB camera and one IR camera. Two RGB cameras can only capture RGB images and cannot capture IR images, so their accuracy is low in dark scenes; two IR cameras can only capture IR images and cannot capture RGB images, so their accuracy is low under strong light; with one RGB camera and one IR camera, only an RGB image is usable under strong light and only an IR image is usable in a dark scene, so in neither case is a usable stereo pair available and depth information cannot be obtained.
It is therefore desirable to provide a depth image imaging system and method that can be used in a wider range of scenes.
Disclosure of Invention
The present specification provides a depth image imaging system and method that can be used in a wider range of scenes.
In a first aspect, the present specification provides a depth image imaging system, including a support component, a first camera, a second camera, and a control device. The first camera is connected to the support component and is configured to acquire a first optical image of a photographed scene, and includes a first image sensor comprising a first photosensitive cell array that integrates visible light photosensitive cells and infrared photosensitive cells. The second camera is connected to the support component, is fixed relative to the first camera in a predetermined pose, and is configured to acquire a second optical image of the scene; it includes a second image sensor comprising a second photosensitive cell array that integrates visible light photosensitive cells and infrared photosensitive cells. The control device is in communication connection with the first camera and the second camera, determines the depth of the scene based on the first optical image and the second optical image, and generates a depth image of the scene.
In some embodiments, the first array of light-sensing units is formed by arraying red light-sensing units, green light-sensing units, blue light-sensing units and IR light-sensing units in a predetermined manner, each light-sensing unit corresponding to one pixel; and the second photosensitive unit array is formed by arraying a red photosensitive unit, a green photosensitive unit, a blue photosensitive unit and an IR photosensitive unit in a predetermined mode, wherein each photosensitive unit corresponds to one pixel.
In some embodiments, the depth image imaging system further comprises an infrared lamp coupled to the support member.
In some embodiments, the first optical image comprises a first RGB image and a first IR image; and the second optical image comprises a second RGB image and a second IR image.
In some embodiments, the control device determines the depth of the scene based on the disparity of the scene in the first RGB image and the second RGB image when the illumination intensity of the environment of the scene is greater than a first preset value.
In some embodiments, the control device determines the depth of the scene based on the disparity of the scene in the first IR image and the second IR image when the illumination intensity of the environment of the scene is less than a second preset value, wherein the first preset value is greater than the second preset value.
In some embodiments, when the illumination intensity of the environment of the scene is between the first and second preset values, the control device determines the depth of the scene based on the disparity of the scene in the first and second RGB images and/or the disparity in the first and second IR images.
In some embodiments, the depth image imaging system further comprises a light emitter coupled to the support component and operable to emit a light array encoded in a predetermined pattern, the light array illuminating objects in the scene to form a plurality of spots, the light emitter comprising at least one of an LED emitter and a laser emitter.
In some embodiments, the determining the depth of the scene based on the disparity of the scene in the first IR image and the second IR image comprises: when the distance of an object in the scene from the depth image imaging system is greater than a distance threshold, determining the depth of the scene based on the disparity of the object in the first IR image and the second IR image; and when the distance of an object in the scene from the depth image imaging system is less than the distance threshold, determining the depth of each spot of the plurality of spots in the scene based on the disparity of that spot in the first IR image and the second IR image, thereby determining the depth of the scene.
In some embodiments, the determining the depth of the scene based on the disparity of the scene in the first RGB image and the second RGB image and/or the disparity in the first IR image and the second IR image comprises: when the distance of an object in the scene from the depth image imaging system is greater than the distance threshold, determining the depth of the scene based on the disparity of the object in the first RGB image and the second RGB image and/or in the first IR image and the second IR image; and when the distance of an object in the scene from the depth image imaging system is less than the distance threshold, determining the depth of each spot of the plurality of spots in the scene based on the disparity of that spot in the first IR image and the second IR image, thereby determining the depth of the scene.
In some embodiments, the depth image imaging system further includes a fill light, and the control device drives the fill light to turn on when the ambient illumination intensity of the scene is less than an illumination intensity threshold.
In a second aspect, the present specification further provides a method of depth image imaging, which is used in the depth image imaging system provided in the first aspect of the present specification, and includes: acquiring the first optical image and the second optical image; and determining the depth of the scene based on the first optical image and the second optical image, and generating a depth image of the scene.
In some embodiments, the determining the depth of the scene based on the first optical image and the second optical image comprises one of: determining that the illumination intensity of the environment of the scene is greater than a first preset value, and determining the depth of the scene based on the disparity of the scene in the first RGB image and the second RGB image; determining that the illumination intensity of the environment of the scene is less than a second preset value, and determining the depth of the scene based on the disparity of the scene in the first IR image and the second IR image, wherein the first preset value is greater than the second preset value; and determining that the illumination intensity of the environment of the scene is between the first preset value and the second preset value, and determining the depth of the scene based on the disparity of the scene in the first RGB image and the second RGB image and/or the disparity in the first IR image and the second IR image.
In some embodiments, the determining the depth of the scene based on the disparity of the scene in the first IR image and the second IR image comprises one of: determining that the distance of an object in the scene from the depth image imaging system is greater than a distance threshold, and determining the depth of the scene based on the disparity of the object in the first IR image and the second IR image; and determining that the distance of an object in the scene from the depth image imaging system is less than the distance threshold, and determining the depth of each spot of the plurality of spots in the scene based on the disparity of that spot in the first IR image and the second IR image, thereby determining the depth of the scene.
In some embodiments, the determining the depth of the scene based on the disparity of the scene in the first RGB image and the second RGB image and/or the disparity in the first IR image and the second IR image comprises one of: determining that the distance of an object in the scene from the depth image imaging system is greater than the distance threshold, and determining the depth of the scene based on the disparity of the object in the first RGB image and the second RGB image and/or in the first IR image and the second IR image; and determining that the distance of an object in the scene from the depth image imaging system is less than the distance threshold, and determining the depth of each spot of the plurality of spots in the scene based on the disparity of that spot in the first IR image and the second IR image, thereby determining the depth of the scene.
In some embodiments, the method further comprises, by the control device: when the ambient illumination intensity of the scene is determined to be less than an illumination intensity threshold, driving the fill light to turn on.
According to the above technical solutions, the depth image imaging system and method provided by this specification can acquire two RGB images and two IR images with two RGB-IR sensors. In a strongly lit scene, the depth information of the scene is calculated from the parallax of the two RGB images. In a low-light scene, the depth information of the scene is calculated from the parallax of the two IR images. Under ordinary illumination, the depth information of the scene can be calculated from the parallax of the two RGB images, from the parallax of the two IR images, or from both, with the two depth results fused to obtain the final depth information. The depth image imaging system and method provided by this specification can therefore be applied to a wide range of illumination scenes and achieve high depth calculation accuracy.
Additional functions of the depth image imaging system and method provided by this specification will be set forth in part in the description that follows. The remainder will become apparent to those of ordinary skill in the art upon reading the description, or may be learned through practice or use of the methods, apparatus and combinations described in the detailed examples below.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of this specification, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of this specification; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1A illustrates a schematic structural diagram of a passive depth image imaging system provided in accordance with an embodiment of the present description;
fig. 1B illustrates a schematic structural diagram of an active depth image imaging system provided in accordance with an embodiment of the present description;
fig. 1C illustrates a hardware configuration diagram of an active depth image imaging system provided according to an embodiment of the present description;
FIG. 2 illustrates a schematic diagram of an array of photosensitive cells provided in accordance with an embodiment of the present description;
fig. 3 is a schematic diagram illustrating placement positions of a first camera and a second camera provided in accordance with an embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of a computing image depth information provided in accordance with an embodiment of the present description;
FIG. 5A illustrates a flow diagram of a method of passive depth image imaging provided in accordance with an embodiment of the present description; and
fig. 5B illustrates a flowchart of a method of active depth image imaging provided in accordance with an embodiment of the present description.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the present description, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present description. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, as used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, are intended to specify the presence of stated integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features of the present specification, as well as the operation and function of the elements of the structure related thereto, and the combination of parts and economies of manufacture, may be particularly improved upon in view of the following description. Reference is made to the accompanying drawings, all of which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should also be understood that the drawings are not drawn to scale.
The flow diagrams used in this specification illustrate the operation of system implementations according to some embodiments of the specification. It should be clearly understood that the operations of the flow diagrams may be performed out of order. Rather, the operations may be performed in reverse order or simultaneously. In addition, one or more other operations may be added to the flowchart. One or more operations may be removed from the flowchart.
The depth image imaging system and method provided by this specification can be used to measure the distance of objects in a photographed scene relative to the cameras, so as to obtain a three-dimensional image of the scene. The three-dimensional image may include the position of each object within the scene as well as its distance from the cameras. The system and method acquire two RGB images and two IR images with two RGB-IR sensors. In a strongly lit scene, the depth information of the scene is calculated from the parallax of the two RGB images. In a low-light scene, the depth information of the scene is calculated from the parallax of the two IR images. Under ordinary illumination, the depth information of the scene can be calculated from the parallax of the two RGB images, from the parallax of the two IR images, or from both, with the two depth results fused to obtain the final depth information. The depth image imaging system and method provided by this specification can therefore be applied to a wide range of illumination scenes and achieve high depth calculation accuracy. It should be noted that, because the system and method can acquire RGB images and IR images at the same time, they can also be used in human body recognition and similar scenarios, such as face recognition, iris recognition, sclera recognition, palm print recognition, and so on.
An RGB-IR image sensor is an image sensor that can capture an RGB image and an IR (infrared) image on a single chip. A conventional RGB image sensor comprises four basic monochromatic pixel types, R, Gr, Gb and B; these pixels acquire photo-generated charge for their respective colors, and the RGB value at each pixel position is then calculated with a linear interpolation algorithm. A conventional RGB image sensor can therefore generate only an RGB image, and a conventional IR image sensor can generate only an IR image. If an aligned RGB-IR image is to be obtained from separate sensors, the RGB image and the IR image must be transformed into a common coordinate system so that the images are aligned, which increases the computation and cost of image processing. An RGB-IR image sensor can collect both visible light and infrared light, so that both participate in imaging at the same time. It comprises basic monochromatic pixels (R, G or B) and IR pixels, which acquire the photo-generated charge of the different colors and of infrared light respectively; the RGB value and IR value at each pixel position are then calculated with a linear interpolation algorithm. An RGB-IR image sensor can therefore generate an RGB image and an IR image at the same time, and the two images are automatically aligned without coordinate conversion, saving computation.
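As an illustration of how a single RGB-IR mosaic yields an aligned RGB image and IR image, the following sketch separates the four cell positions of one possible 2x2 RGB-IR pattern and upsamples each plane back to full resolution. The pattern layout, the function name and the use of simple bilinear interpolation in place of a real demosaicing algorithm are assumptions for illustration; actual sensors use vendor-specific layouts (often 4x4) and more sophisticated interpolation.

```python
# A minimal sketch of splitting an RGB-IR mosaic into aligned RGB and IR images.
# Assumption: a 2x2 repeating pattern [[R, G], [IR, B]] (one Bayer green replaced
# by an IR cell); real sensors typically use other, larger layouts.
import numpy as np
import cv2

def demosaic_rgbir(raw: np.ndarray):
    """raw: HxW single-channel mosaic, H and W even. Returns (rgb, ir) at full size."""
    h, w = raw.shape
    # Sub-sampled planes at the four cell positions of the assumed 2x2 pattern.
    r  = raw[0::2, 0::2].astype(np.float32)
    g  = raw[0::2, 1::2].astype(np.float32)
    ir = raw[1::2, 0::2].astype(np.float32)
    b  = raw[1::2, 1::2].astype(np.float32)
    # Bilinear upsampling back to full resolution stands in for demosaicing.
    up = lambda p: cv2.resize(p, (w, h), interpolation=cv2.INTER_LINEAR)
    rgb = np.dstack([up(r), up(g), up(b)]).astype(raw.dtype)
    return rgb, up(ir).astype(raw.dtype)

# Example on a synthetic 8x8 mosaic.
raw = (np.random.rand(8, 8) * 255).astype(np.uint8)
rgb, ir = demosaic_rgbir(raw)
print(rgb.shape, ir.shape)  # (8, 8, 3) (8, 8)
```

Because both output images come from the same photosensitive cell array, no coordinate conversion between the RGB and IR views is needed, which is the alignment property described above.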
Fig. 1 shows a schematic diagram of a depth image imaging system provided in accordance with an embodiment of the present description. The depth image imaging system (hereinafter referred to as a system) may be a passive depth image imaging system 001A (hereinafter referred to as a passive system 001A) or an active depth image imaging system 001B (hereinafter referred to as an active system 001B). Fig. 1A illustrates a schematic structural diagram of a passive depth image imaging system 001A provided in accordance with an embodiment of the present description; fig. 1B illustrates a schematic structural diagram of an active depth image imaging system 001B provided in accordance with an embodiment of the present description; fig. 1C shows a hardware configuration diagram of an active depth image imaging system 001B provided according to an embodiment of the present specification. As shown in fig. 1A, the passive system 001A may include a support member 400, a first camera 100, a second camera 200, and a control device 600. In some embodiments, the passive system 001A may further include an infrared lamp 700 and a fill light 800. As shown in fig. 1B, the active system 001B may include a support member 400, a first camera 100, a second camera 200, a control device 600, and a light emitter 900. In some embodiments, the active system 001B may further include an infrared lamp 700 and a fill light 800.
The support member 400 may be used to support and fix the first and second cameras 100 and 200. In some embodiments, the support member 400 may also be used to support and secure the control device 600.
The first camera 100 may be coupled to the support member 400 for acquiring a first optical image of a scene being photographed. As shown in fig. 1C, the first camera 100 may include a first image sensor 110 and a first lens 130. The first lens 130 is an imaging device of the first camera 100. The first lens 130 may include a plurality of lenses. Light enters from the light-entering side of the first lens 130, is refracted by the first lens 130, exits from the light-exiting side, and is converged on the first image sensor 110. The first lens 130 may be a fixed focus lens, a zoom lens, a conventional lens composed of a plurality of lenses, or a Metalens adaptive zoom lens, which is not limited in this specification.
The first image sensor 110 may be an RGB-IR image sensor. The first image sensor 110 may include a first light sensing unit array 111 integrating visible light (RGB) light sensing units and Infrared (IR) light sensing units. The first photosensitive cell array 111 is formed by arraying red (R), green (G), blue (B), and IR photosensitive cells in a predetermined manner, each corresponding to one pixel. Specifically, the first photosensitive cell array 111 may be implemented by a Color Filter Array (CFA) technology. A first color filter 112 may be disposed in front of the first photosensitive cell array 111. The light is refracted by the first lens 130, filtered by the first color filter 112, and then converged on the first photosensitive cell array 111, and an array of R, G, B, and IR photosensitive cells arranged in a predetermined manner is formed. The first color filter 112 may include a filter array composed of a plurality of filter units, each of which corresponds to one light-sensing unit. The filter array is arrayed in the predetermined manner by a plurality of filters sensitive to R light, a plurality of filters sensitive to G light, a plurality of filters sensitive to B light, and a plurality of filters sensitive to IR light. The filter sensitive to the R light corresponds to the R light sensitive unit. The filter sensitive to the G light corresponds to the G light sensing unit. The filter sensitive to the B light corresponds to the B light sensing unit. The filter sensitive to the IR light corresponds to the IR sensitive unit. Accordingly, the first optical image photographed by the first camera 100 may include a first RGB image and a first IR image.
Fig. 2 shows a schematic diagram of a first photosensitive cell array 111 provided according to an embodiment of the present specification. Two array patterns of the first photosensitive cell array 111 are shown in fig. 2. It should be noted that fig. 2 is only an example, the arrangement of the first photosensitive cell array 111 is various, and all the first photosensitive cell arrays 111 formed by the R photosensitive cells, the G photosensitive cells, the B photosensitive cells, and the IR photosensitive cells are within the scope of protection of the present specification.
A second camera 200 may be attached to the support member 400 for acquiring a second optical image of the scene. As shown in fig. 1C, the second camera 200 may include a second image sensor 210 and a second lens 230. The second lens 230 is an imaging device of the second camera 200. The second lens 230 may be identical to the first lens 130, and will not be described herein. The second image sensor 210 may be an RGB-IR image sensor. The second image sensor 210 may include a second light sensing unit array 211 integrating visible light (RGB) light sensing units and Infrared (IR) light sensing units. The second light sensing unit array 211 may be formed by arraying red (R), green (G), blue (B), and IR light sensing units, each corresponding to one pixel, in a predetermined manner. Specifically, the second photosensitive cell array 211 may have a structure identical to that of the first photosensitive cell array 111, and is not described herein again. The second optical image photographed by the second camera 200 includes a second RGB image and a second IR image.
The second camera 200 is fixed relative to the first camera 100 in a predetermined pose, that is, the relative pose of the second camera 200 with respect to the first camera 100 is fixed. Fig. 3 shows a schematic diagram of the placement of the first camera 100 and the second camera 200 provided according to an embodiment of this specification. As shown in fig. 3 (a), ideally the visual axes of the first camera 100 and the second camera 200 are parallel and their image planes are parallel. In practice, because of installation and manufacturing errors, the two cameras are often not installed in this ideal way; for example, as shown in fig. 3 (b), which illustrates a non-ideal state, the first camera 100 and the second camera 200 are set at an angle to each other. The control device 600 can therefore correct the first camera 100 and the second camera 200 according to their actual installation angles and remove distortion from the first optical image and the second optical image using the intrinsic and extrinsic parameters obtained by calibrating the two cameras, so that the optical axes of the first camera 100 and the second camera 200 become parallel and their image planes coplanar. This achieves epipolar alignment of the first optical image and the second optical image captured by the two cameras, which simplifies processing of the captured images and reduces the complexity of subsequent matching.
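A minimal sketch of this correction and epipolar alignment using OpenCV is given below. It assumes the intrinsic matrices, distortion coefficients and the rotation/translation between the two cameras have already been obtained by calibration; the numeric values and variable names are illustrative only.

```python
# Epipolar rectification sketch with OpenCV (values and names are assumptions).
import numpy as np
import cv2

# Calibration results assumed known: intrinsics K1/K2, distortion d1/d2,
# rotation R and translation T of the second camera relative to the first.
K1 = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
K2 = K1.copy()
d1 = np.zeros(5)
d2 = np.zeros(5)
R = np.eye(3)                       # ideally parallel visual axes
T = np.array([[-50.], [0.], [0.]])  # assumed 50 mm baseline along X

size = (640, 480)
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)

# Per-camera remap tables: applying them removes distortion and aligns epipolar lines.
map1x, map1y = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)

def rectify_pair(img1, img2):
    return (cv2.remap(img1, map1x, map1y, cv2.INTER_LINEAR),
            cv2.remap(img2, map2x, map2y, cv2.INTER_LINEAR))
```

After this step, corresponding points in the two images lie on the same row, so the subsequent disparity search is one-dimensional.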
The control device 600 may be in communication with the first camera 100 and the second camera 200. The control device 600 may acquire the first optical image and the second optical image, determine a depth of the scene based on a parallax of the first optical image and the second optical image, and generate a depth image of the scene. The communication connection refers to any form of connection capable of receiving information directly or indirectly. For example, the control device 600 may establish a wireless connection with the first camera 100 and the second camera 200 through wireless communication to transfer data with each other; the control device 600 may also be directly connected with the first camera 100 and the second camera 200 through wires to transmit data to each other; the control device 600 may also establish an indirect connection with the first camera 100 and the second camera 200 by directly connecting wires with other circuits, so as to realize data transmission with each other.
The control device 600 may store data or instructions for performing the methods of depth image imaging described herein and may execute or be used to execute the data and/or instructions. The method of depth image imaging is described elsewhere in this specification. For example, the method of depth image imaging is described in the description of fig. 5. As shown in fig. 1C, the control apparatus 600 may include at least one storage medium 630 and at least one processor 620. In some embodiments, the control device 600 may also include an internal communication bus 610 and a communication port 650.
The storage medium 630 may include a data storage device. The data storage device may be a non-transitory storage medium or a transitory storage medium. For example, the data storage device may include one or more of a magnetic disk, a read-only memory medium (ROM), or a random access memory medium (RAM). The storage medium 630 also includes at least one set of instructions stored in the data storage device. The instructions are computer program code that may include programs, routines, objects, components, data structures, processes, modules, etc. that execute the method for depth image imaging PA100 provided herein.
The internal communication bus 610 may connect the various system components including the storage medium 630 and the processor 620.
The communication port 650 may be used for data communication of the control apparatus 600 with the first and second cameras 100 and 200. The communication port 650 may also be used for data communication between the control device 600 and the outside. For example, the control device 600 may communicate with an external storage device or a computing device through the communication port 650 via a network or bluetooth, or may communicate with an external storage device or a computing device through the communication port 650 via a wired connection. The external storage device or computing device may be a personal computer, a tablet computer, a smart phone, or a mobile storage device, etc.
The at least one processor 620 is communicatively coupled to the at least one storage medium 630 via an internal communication bus 610. The at least one processor 620 is configured to execute the at least one instruction set. When the system is running, the at least one processor 620 reads the at least one instruction set and performs the method of depth image imaging provided herein in accordance with the instructions of the at least one instruction set. Processor 620 may perform all of the steps involved in the method of depth image imaging. The processor 620 may be in the form of one or more processors, and in some embodiments, the processor 620 may include one or more hardware processors, such as microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASICs), application specific instruction set processors (ASIPs), Central Processing Units (CPUs), Graphics Processing Units (GPUs), Physical Processing Units (PPUs), microcontroller units, Digital Signal Processors (DSPs), Field Programmable Gate Arrays (FPGAs), Advanced RISC Machines (ARMs), Programmable Logic Devices (PLDs), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustrative purposes only, only one processor 620 is depicted in the control device 600 in this description. It should be noted, however, that the control device 600 may also include multiple processors, and thus, the operations and/or method steps disclosed in this specification may be performed by one processor as described in this specification, or may be performed by a combination of multiple processors. For example, if the processor 620 of the control apparatus 600 performs steps a and B in this specification, it should be understood that steps a and B may also be performed by two different processors 620 in combination or separately (e.g., a first processor performs step a, a second processor performs step B, or both a first and second processor perform steps a and B together).
As described above, the control device 600 may calculate the depth based on the parallax of the objects in the scene in the first optical image and the second optical image. Fig. 4 is a schematic diagram illustrating a principle of calculating image depth information based on parallax according to an embodiment of the present disclosure.
As shown in FIG. 4, O1 and O2 are the visual axis centers of the first camera 100 and the second camera 200, respectively. d is the center distance between the first camera 100 and the second camera 200 in the X direction; their center distances in the Y direction and the Z direction are both 0. Taking the midpoint between the first camera 100 and the second camera 200 as the origin (0, 0, 0), the coordinates of O1 are (-d/2, 0, 0) and the coordinates of O2 are (d/2, 0, 0). The focal lengths of the first camera 100 and the second camera 200 are both f. M is a point in the scene with coordinates (x, y, z), where z is the perpendicular distance of point M from the line connecting the centers of the first camera 100 and the second camera 200, i.e. the depth of point M. The imaging points of point M on the first optical image of the first camera 100 and on the second optical image of the second camera 200 are M1 and M2, respectively. M1 is at a distance x1 from the visual central axis of the first camera 100, and M2 is at a distance x2 from the visual central axis of the second camera 200. The disparity of point M between the first optical image and the second optical image can then be expressed as Δd = x1 - x2.

According to the principle of similar triangles:

    x1 / f = (x + d/2) / z,    x2 / f = (x - d/2) / z

Subtracting the two relations and substituting the disparity Δd = x1 - x2, the image depth of point M is:

    z = f · d / (x1 - x2) = f · d / Δd
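As a quick numeric illustration of this formula (all values are made up for the example): with a focal length of 1000 pixels, a baseline of 60 mm and image offsets of 210 and 185 pixels, the disparity is 25 pixels and the depth works out to 2.4 m.

```python
# Worked example of z = f*d/(x1 - x2); the numbers are illustrative assumptions.
f = 1000.0              # focal length in pixels
d = 60.0                # baseline in mm
x1, x2 = 210.0, 185.0   # image offsets of point M in the two views, in pixels
disparity = x1 - x2     # 25 pixels
z = f * d / disparity   # 2400 mm, i.e. point M is about 2.4 m from the cameras
print(z)
```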
The principle of calculating image depth information from parallax relies on the imaging models of the first camera 100 and the second camera 200. Therefore, in this specification, the first camera 100 and the second camera 200 are calibrated to obtain their intrinsic and extrinsic parameters and to determine the mapping between the coordinate system of the first camera 100, the coordinate system of the second camera 200 and the three-dimensional world coordinate system. The intrinsic and extrinsic parameters include, but are not limited to, intrinsic parameters such as the focal length and the imaging origin, and extrinsic parameters describing the relative positions of the two cameras, such as the rotation matrix and the translation vector. From the extrinsic parameters relating the cameras to the world coordinate system, i.e. the rotation matrix and the translation vector, the relative positional relationship between the first camera 100 and the second camera 200 is determined, and the mapping between the world coordinate system and the coordinate systems of the two cameras is established. The control device 600 may then determine, from the intrinsic and extrinsic parameters and this mapping, a first projection matrix for each group of pixel points of the first optical image in the coordinate system of the first camera 100 and a second projection matrix for each group of pixel points of the second optical image in the coordinate system of the second camera 200. Using the first and second projection matrices, the control device 600 determines the coordinate position of each group of pixels of the first optical image in the coordinate system of the first camera 100 and of each group of pixels of the second optical image in the coordinate system of the second camera 200, and from these positions determines the distance (in the X direction and in the Y direction) of each group of pixels from the center of the respective camera coordinate system. Finally, the control device 600 calculates the disparity of each group of pixel points between the first optical image and the second optical image from these distances. The first camera 100 and the second camera 200 may be calibrated with any existing calibration method, for example the Zhang Zhengyou calibration method; this specification is not limited in this respect.
Specifically, the control device 600 may select a plurality of points in the scene as feature points for calculating depth information, and calculate the depth of each feature point from its disparity between the first optical image and the second optical image. In the passive system 001A, the control device 600 may select a plurality of points on objects in the scene as the feature points and thereby calculate the depth of the scene.
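One way the per-point depth calculation described above could be realized on a rectified image pair is sketched below: a dense disparity map is computed with OpenCV's semi-global block matching and converted to depth with z = f·d/disparity. The matcher parameters and the function name are illustrative assumptions, not values taken from this specification.

```python
# Disparity-to-depth sketch on a rectified image pair (parameters are assumptions).
import numpy as np
import cv2

def depth_from_pair(left_gray, right_gray, f_px, baseline_mm):
    sgbm = cv2.StereoSGBM_create(minDisparity=0,
                                 numDisparities=128,  # must be divisible by 16
                                 blockSize=5)
    # compute() returns fixed-point disparities scaled by 16.
    disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disp)
    valid = disp > 0
    depth[valid] = f_px * baseline_mm / disp[valid]  # z = f*d/disparity, in mm
    return depth, valid
```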
As mentioned above, the first optical image includes the first RGB image and the first IR image, and the second optical image includes the second RGB image and the second IR image. In a dark scene there is little visible light and feature points in RGB images are not distinct, so the first RGB image and the second RGB image are suitable for depth calculation in brightly lit scenes; infrared light is suitable for assisting imaging in dark scenes, so the first IR image and the second IR image are suitable for depth calculation in dark scenes. The control device 600 may select which images to use for the depth calculation according to the ambient illumination intensity of the scene. In particular, the system may further comprise an illumination intensity sensor for detecting the illumination intensity of the environment of the scene. The illumination intensity sensor may be in communication connection with the control device 600, and the control device 600 may acquire its detection data and determine from it which images are needed for the depth calculation.
Specifically, when the ambient illumination intensity of the scene is greater than a first preset value, the control device 600 may determine the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image. In this case, the control device 600 may also use the parallax of the scene in the first IR image and the second IR image to assist in determining the depth, that is, it may compute a weighted sum of the depth determined from the parallax in the two RGB images and the depth determined from the parallax in the two IR images, where the weighting coefficient of the RGB-based depth is greater than that of the IR-based depth. The first preset value may be stored in the control device 600 in advance, and may be set or changed manually. It may be obtained from empirical values or by training on labeled sample data; for example, it may be 500 lux or 600 lux, or higher, such as 10000 lux or 50000 lux. When the ambient illumination intensity of the scene is less than a second preset value, the control device 600 may determine the depth of the scene based on the parallax of the scene in the first IR image and the second IR image, where the first preset value is greater than the second preset value. The second preset value may likewise be stored in advance, set or changed manually, and obtained from empirical values or by training on labeled sample data; for example, it may be 20 lux, 30 lux or 40 lux, or lower, such as 10 lux. When the ambient illumination intensity of the scene is between the first preset value and the second preset value, the control device 600 may determine the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image and/or the parallax in the first IR image and the second IR image. That is, the control device 600 may determine the depth of the scene from the parallax of the feature points of the scene in the first RGB image and the second RGB image, or from the parallax of the feature points in the first IR image and the second IR image, or it may determine a first depth of the scene from the RGB pair and a second depth from the IR pair and perform feature fusion of the first depth and the second depth to determine the depth of the scene. The feature fusion may be, for example, a weighted sum or average pooling.
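A sketch of the selection logic just described: which image pair drives the depth calculation depends on the measured illumination intensity, and in the middle band the two depth results can be fused, for example by a weighted sum. The thresholds, weights and function names here are illustrative assumptions.

```python
# Illumination-driven choice of depth source (thresholds and weights are assumptions).
FIRST_PRESET_LUX = 500.0   # e.g. 500 lux for the passive system
SECOND_PRESET_LUX = 20.0   # e.g. 20 lux

def scene_depth(lux, depth_from_rgb, depth_from_ir, w_rgb=0.7, w_ir=0.3):
    """depth_from_rgb / depth_from_ir: callables each returning a depth map."""
    if lux > FIRST_PRESET_LUX:
        # Bright scene: RGB parallax dominates; the IR pair may assist with a lower weight.
        return w_rgb * depth_from_rgb() + w_ir * depth_from_ir()
    if lux < SECOND_PRESET_LUX:
        # Dark scene: only the IR pair is reliable.
        return depth_from_ir()
    # Ordinary illumination: fuse the two depth results (here a simple equal-weight sum).
    return 0.5 * depth_from_rgb() + 0.5 * depth_from_ir()
```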
When the system is used for human body recognition, the control device 600 may perform human body recognition based on the first RGB image, the second RGB image, the first IR image, and the second IR image. When the first RGB image and the second RGB image are not available in the case where the illumination intensity is extremely low, the control device 600 may perform human body recognition based on the first IR image and the second IR image.
As shown in fig. 1A, 1B and 1C, the system may further include an infrared lamp 700. The infrared lamp 700 may be attached to the support component 400 and used to emit infrared light; there may be one or more infrared lamps 700. The emitted infrared light irradiates objects and is diffusely reflected by them, and the reflected light is captured by the first camera 100 and the second camera 200 to obtain the first optical image and the second optical image. The infrared lamp 700 allows the system to capture images of the scene even in dark environments. In some embodiments, the infrared lamp 700 may also be used for liveness detection: the spectral information reflected by facial skin under different illumination conditions is analyzed and classified, and heterogeneous face images are compared, so that real facial skin can be effectively distinguished from other attack materials.
As shown in fig. 1A, 1B and 1C, the system may further include a fill light 800. The fill light 800 may be a visible light fill light or an infrared fill light and may be used to supplement visible or infrared light in dim conditions. The fill light 800 may be in communication connection with the control device 600. When the illumination intensity of the environment of the scene is less than an illumination intensity threshold, the control device 600 may drive the fill light 800 to turn on. The illumination intensity threshold may be preset in the control device 600 and may be set or changed manually. It should be noted that, when the ambient illumination is greater than the illumination intensity threshold, the fill light 800 may also be turned on manually through the control device 600. For example, when the illumination intensity of the environment of the scene is less than the first preset value, the control device 600 may control the fill light 800 to turn on.
As shown in fig. 1B and 1C, the active system 001B may also include a light emitter 900. The light emitter 900 is coupled to the support component 400 and is operable to emit a light array encoded in a predetermined pattern. The light emitter 900 may include at least one of an LED emitter and a laser emitter. The light array illuminates objects in the scene to form a plurality of light spots, which may be speckles or fringes. The first optical image and the second optical image captured by the first camera 100 and the second camera 200 contain these light spots. When calculating the depth information of the scene, the control device 600 may use the light spots as feature points for the depth calculation, determine the parallax of each light spot between the first optical image and the second optical image, and thereby determine the depth at the position of each light spot. Compared with the passive system 001A, which uses objects in the scene as the feature points for depth calculation, the active system 001B uses the light spots as feature points; in scenes with ordinary or dim light, the light spots are more distinct than the objects in the scene, so the accuracy of the depth calculation is higher.
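One possible way to use the projected spots as feature points is sketched below: detect spots in the two rectified IR images, match each spot to a detection on (almost) the same image row in the other view, and take the horizontal offset as its disparity. The detector settings, the matching heuristic and the row tolerance are assumptions for illustration.

```python
# Per-spot disparity sketch on rectified IR images (settings are assumptions).
import cv2

def spot_disparities(ir_left, ir_right, row_tol=1.5):
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 255          # the projected spots appear bright in the IR images
    detector = cv2.SimpleBlobDetector_create(params)
    left_pts = [k.pt for k in detector.detect(ir_left)]
    right_pts = [k.pt for k in detector.detect(ir_right)]
    pairs = []
    for (xl, yl) in left_pts:
        # After epipolar alignment a spot and its match lie on (almost) the same row.
        candidates = [(xr, yr) for (xr, yr) in right_pts
                      if abs(yr - yl) <= row_tol and xr <= xl]
        if candidates:
            xr, _ = max(candidates, key=lambda p: p[0])  # nearest candidate to the left
            pairs.append(((xl, yl), xl - xr))             # (spot position, disparity in pixels)
    return pairs
```

Each returned disparity can then be converted to a depth with z = f·d/disparity, giving the depth at the position of that spot.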
Since the spot can be used as the feature point in the active system 001B, the first preset value can be higher in the active system 001B than in the passive system 001A. In the active system 001B, the first preset value can be selected according to the light spot. The spot is less visible when the ambient illumination intensity is stronger. For example, the first preset value in the active system 001B may be 50000lux, and may be higher, such as 60000lux, and the like, and may be lower, such as 40000lux, and the like. Since the light spot is not obvious in a strong-light scene and is not suitable for being used as a feature point, in the strong-light scene of the active system 001B, that is, when the light intensity of the environment of the scene is greater than the first preset value, the control device 600 may still determine the depth of the scene based on the parallax by using the object in the scene as the feature point for depth calculation. When the illumination intensity in the environment of the scene is smaller than the first preset value, the control device 600 may determine the depth of the scene based on the parallax using the light spot as the feature point of the depth calculation.
The farther an object in the scene is from the active system 001B, the less distinct the spots on the object become, so using the spots as feature points is not suitable for long-range depth calculation. When the distance of an object in the scene from the active system 001B is greater than a distance threshold, the accuracy obtained by using the spots as feature points is lower than that obtained by using objects in the scene as feature points. Therefore, when the distance of the object from the active system 001B is greater than the distance threshold, the control device 600 may determine the depth of the scene from parallax using objects in the scene as the feature points; when the distance of the object from the active system 001B is less than the distance threshold, the control device 600 may determine the depth of the scene from parallax using the light spots as the feature points. The distance threshold may be stored in the control device 600 in advance, or set and changed manually, and may be obtained from empirical values or by training on labeled sample data; for example, it may be 1.5 m, 1.2 m or 1.8 m.
Therefore, when the illumination intensity of the environment of the scene is smaller than the second preset value and the distance between the object in the scene and the active system 001B is greater than the distance threshold, the control device 600 may determine the depth of the scene based on the parallax of the feature point in the first IR image and the second IR image, with the object in the scene as the feature point of the depth calculation; when the illumination intensity of the environment of the scene is smaller than the second preset value and the distance between the object in the scene and the active system 001B is smaller than the distance threshold, the control device 600 may determine the depth of each of the plurality of light spots in the scene by using the plurality of light spots as the feature points of the depth calculation based on the parallax of each of the plurality of light spots in the first IR image and the second IR image, thereby determining the depth of the scene.
When the illumination intensity of the environment of the scene is between the first preset value and the second preset value and the distance of the object in the scene from the active system 001B is greater than the distance threshold, the control device 600 may use objects in the scene as the feature points for depth calculation and determine the depth of the scene based on the parallax of the feature points in the first RGB image and the second RGB image and/or in the first IR image and the second IR image. When the illumination intensity is between the first preset value and the second preset value and the distance of the object from the active system 001B is less than the distance threshold, the control device 600 may use the plurality of light spots as the feature points and determine the depth of each light spot based on its parallax in the first IR image and the second IR image, thereby determining the depth of the scene.
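Pulling the conditions of the preceding paragraphs together, the following sketch shows how the active system might choose its feature points and image pair from the ambient illumination intensity and the object distance; the thresholds and names are illustrative assumptions, not values prescribed by this specification.

```python
# Active-system decision sketch (thresholds and names are assumptions).
def choose_strategy(lux, object_distance_m,
                    first_preset=50000.0, second_preset=20.0, distance_threshold=1.5):
    if lux > first_preset:
        # Strong light: the spots wash out, so use scene objects with the RGB pair.
        return ("scene objects", "RGB pair")
    if object_distance_m < distance_threshold:
        # Near objects under non-strong light: the spots are distinct in the IR pair.
        return ("projected spots", "IR pair")
    if lux < second_preset:
        # Dark scene, far objects: scene objects in the IR pair.
        return ("scene objects", "IR pair")
    # Ordinary light, far objects: scene objects, RGB and/or IR pairs (results may be fused).
    return ("scene objects", "RGB and/or IR pairs")
```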
Since the light spot is more conspicuous in the IR image, the first IR image and the second IR image may be selected as the depth calculation images when the light spot is taken as the feature point of the depth calculation.
As described above, in the passive system 001A the control device 600 may obtain depth information from the parallax of feature points of the scene in the first IR image and the second IR image in dark scenes, from the parallax in the first RGB image and the second RGB image in sufficiently bright scenes, or from the parallax in the RGB pair and/or the IR pair in ordinarily lit scenes. The active system 001B can be applied not only to dark, ordinarily lit and brightly lit scenes, but also to both long-range and short-range depth calculation. In summary, the depth image imaging system provided by this specification has a wide application range and accurate depth calculation.
Fig. 5 shows a flowchart of a depth image imaging method provided in accordance with an embodiment of the present description. As previously mentioned, the system may be either a passive system 001A or an active system 001B. Therefore, the depth image imaging method may be a depth image imaging method corresponding to the passive system 001A, or may be a depth image imaging method corresponding to the active system 001B. Fig. 5A shows a flowchart of a passive depth image imaging method PA100 corresponding to the passive system 001A provided in accordance with an embodiment of the present description; fig. 5B shows a flowchart of an active depth image imaging method PB200 corresponding to the active system 001B provided according to an embodiment of the present disclosure. As described above, the control device 600 may execute the methods PA100 and PB200 of depth image imaging provided in the present specification. Specifically, the processor 620 in the control device 600 may read an instruction set stored in its local storage medium, and then execute the methods PA100 and PB200 of depth image imaging provided in the present specification according to the specification of the instruction set.
As shown in fig. 5A, the method PA100 may include performing, by the at least one processor 620, the following steps:
S110: when the illumination intensity of the environment of the scene is determined to be less than the illumination intensity threshold, driving the fill light 800 to turn on.
S120: acquiring the first optical image and the second optical image.
S140: determining the depth of the scene based on the first optical image and the second optical image, and generating a depth image of the scene.
Specifically, in the passive system 001A, the step S140 may include one of the following manners:
S142: determining that the illumination intensity of the environment of the scene is greater than the first preset value, and determining the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image. As described above, in a scene with sufficient light, the first RGB image and the second RGB image are used as the images for depth calculation. Specifically, the control device 600 may take the objects in the scene as feature points for depth calculation and determine the depth of the feature points based on their parallax in the first RGB image and the second RGB image, so as to determine the depth of the scene and generate a depth image of the scene. At this time, the control device 600 may also use the parallax of the scene in the first IR image and the second IR image to assist in determining the depth of the scene. That is, the control device 600 may compute a weighted sum of the depth determined based on the parallax of the scene in the first RGB image and the second RGB image and the depth determined based on the parallax of the scene in the first IR image and the second IR image, wherein the weighting coefficient of the depth determined from the first RGB image and the second RGB image is greater than the weighting coefficient of the depth determined from the first IR image and the second IR image.
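A minimal sketch of this weighted sum, assuming per-pixel depth maps and an illustrative 0.7/0.3 weight split (the actual weighting coefficients are not specified above):

import numpy as np

def fuse_depth_rgb_dominant(depth_rgb, depth_ir, w_rgb=0.7, w_ir=0.3):
    # Weighted sum of the RGB-derived and IR-derived depth maps in a
    # sufficiently bright scene; the RGB weight dominates by design.
    assert w_rgb > w_ir
    a = np.asarray(depth_rgb, dtype=float)
    b = np.asarray(depth_ir, dtype=float)
    return (w_rgb * a + w_ir * b) / (w_rgb + w_ir)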
S144: determining that an illumination intensity of an environment of the scene is less than a second preset value, and determining a depth of the scene based on a disparity of the scene in the first IR image and the second IR image. As described above, in a dark scene, the first IR image and the second IR image are used as images for depth calculation. Specifically, the control device 600 may determine the depth of the feature points in the scene based on the parallax of the feature points in the first IR image and the second IR image, using the object in the scene as the feature point for depth calculation, so as to determine the depth of the scene, and generate the depth image of the scene.
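For reference, the per-feature depth may be recovered from parallax by the standard rectified-stereo relation depth = focal length x baseline / parallax. The sketch below assumes calibrated and rectified cameras; the parameter names are illustrative, and the calibration values are not taken from the embodiments.

import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    # Standard rectified-stereo relation: depth = f * B / d.
    # Expects a disparity map in pixels; zero disparity maps to infinity.
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_length_px * baseline_m / d[valid]
    return depth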
S146: determining that the illumination intensity of the environment of the scene is between the first preset value and the second preset value, and determining the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image and/or the parallax in the first IR image and the second IR image. As described above, in a scene with general lighting, the control device 600 may use the first RGB image and the second RGB image as the images for depth calculation, take objects in the scene as feature points for depth calculation, and determine the depth of the scene based on the parallax of the feature points in the first RGB image and the second RGB image; the control device 600 may also use the first IR image and the second IR image as the images for depth calculation, take an object in the scene as a feature point for depth calculation, and determine the depth of the scene based on the parallax of the feature point in the first IR image and the second IR image; the control device 600 may further use both the first RGB image and the second RGB image and the first IR image and the second IR image as the images for depth calculation, take an object in the scene as a feature point for depth calculation, determine a first depth of the scene based on the parallax of the feature point in the first RGB image and the second RGB image, determine a second depth of the scene based on the parallax of the feature point in the first IR image and the second IR image, and perform a feature fusion calculation on the first depth and the second depth to determine the depth of the scene. The feature fusion calculation may be a weighted sum calculation, an average pooling, or the like.
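A minimal sketch of this feature fusion, covering the two options named above (weighted sum and average pooling); the default weights and the mode argument are assumptions made for illustration.

import numpy as np

def fuse_depths(first_depth, second_depth, mode="average", w_rgb=0.5, w_ir=0.5):
    # Fuse the first depth (from RGB parallax) and the second depth
    # (from IR parallax) into a single depth map.
    a = np.asarray(first_depth, dtype=float)
    b = np.asarray(second_depth, dtype=float)
    if mode == "weighted":
        return (w_rgb * a + w_ir * b) / (w_rgb + w_ir)
    # "average pooling" is read here as the per-pixel mean of the two maps
    return (a + b) / 2.0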
As shown in fig. 5B, the method PB200 may include performing, by at least one processor 620, the following steps:
S210: when it is determined that the illumination intensity of the environment of the scene is less than the illumination intensity threshold, driving the light supplement lamp 800 to turn on.
S220: acquiring the first optical image and the second optical image.
S240: determining the depth of the scene based on the first optical image and the second optical image, and generating a depth image of the scene.
Specifically, in the active system 001B, the step S240 may include one of the following manners:
S242: determining that the illumination intensity of the environment of the scene is greater than the first preset value, and determining the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image. This step is substantially the same as step S142 and is not described again here.
S244: determining that the illumination intensity of the environment of the scene is less than the second preset value, and determining the depth of the scene based on the parallax of the scene in the first IR image and the second IR image. Step S244 may include one of the following ways:
S244-2: determining that the distance between an object in the scene and the active system 001B is greater than the distance threshold, and determining the depth of the scene based on the parallax of the object in the scene in the first IR image and the second IR image. This step is substantially the same as step S144 and is not described again here.
S244-4: determining that an object in the scene is at a distance from active system 001B that is less than the distance threshold, determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images, thereby determining the depth of the scene. Specifically, the control device 600 may determine the depth of the feature points in the scene based on the parallax of the feature points in the first IR image and the second IR image, using the plurality of light spots in the scene as feature points for depth calculation, so as to determine the depth of the scene, and generate a depth image of the scene.
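A minimal sketch of this per-spot depth calculation, assuming the correspondence between the spots in the first IR image and the second IR image has already been established (for example, from the predetermined pattern encoding) and the images are rectified; all names and parameters are illustrative.

def per_spot_depths(spots_left, spots_right, focal_length_px, baseline_m):
    # spots_left / spots_right: matched (x, y) centroids of the same spot
    # in the first IR image and the second IR image, respectively.
    depths = []
    for (xl, _), (xr, _) in zip(spots_left, spots_right):
        disparity = xl - xr                     # rectified horizontal parallax
        depths.append(focal_length_px * baseline_m / disparity
                      if disparity > 0 else float("inf"))
    return depths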
S246: determining that the ambient illumination intensity of the scene is between the first preset value and the second preset value, and determining the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image and/or the parallax of the first IR image and the second IR image. Step S246 may include one of the following ways:
S246-2: determining that the distance between an object in the scene and the active system 001B is greater than the distance threshold, and determining the depth of the scene based on the parallax of the object in the scene in the first RGB image and the second RGB image and/or the first IR image and the second IR image. This step is substantially the same as step S146 and is not described again here.
S246-4: determining that an object in the scene is at a distance from active system 001B that is less than the distance threshold, determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images, thereby determining the depth of the scene. Specifically, the control device 600 may determine the depth of the feature points in the scene based on the parallax of the feature points in the first IR image and the second IR image, using the plurality of light spots in the scene as feature points for depth calculation, so as to determine the depth of the scene, and generate a depth image of the scene.
In the passive system 001A and the method PA100 provided in this specification, the control device 600 may obtain depth information based on the parallax of the feature points in the scene in the first IR image and the second IR image in a dark scene, based on the parallax of the feature points in the first RGB image and the second RGB image in a sufficiently bright scene, or based on the parallax of the feature points in the first RGB image and the second RGB image and/or in the first IR image and the second IR image in a generally bright scene. The active system 001B and the method PB200 provided in this specification are applicable not only to scenes with dark, general, and sufficient light, but also to both long-range and short-range depth calculation. In summary, the depth image imaging system and the depth image imaging method provided by the present specification are applicable to a variety of scenes, have a wide application range, and perform depth calculation accurately.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In conclusion, upon reading this detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure is presented by way of example only and is not limiting. Those skilled in the art will appreciate that the present specification contemplates various reasonable variations, enhancements, and modifications to the embodiments, even though they are not explicitly described herein. Such variations, enhancements, and modifications are intended to be suggested by this specification and are within the spirit and scope of the exemplary embodiments of this specification.
Furthermore, certain terminology has been used in this specification to describe embodiments of the specification. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the specification.
It should be appreciated that in the foregoing description of embodiments of the specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more features. This is not to be taken as requiring that those features be used in combination, and it is entirely possible for those skilled in the art, upon reading this specification, to extract some of the features as separate embodiments. That is, embodiments in this specification may also be understood as an integration of multiple sub-embodiments, and each sub-embodiment remains valid with less than all of the features of a single foregoing disclosed embodiment.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, specifications, publications, documents, and the like, cited herein is hereby incorporated by reference in its entirety, except for any prosecution file history associated with the same, any of the same that is inconsistent with or in conflict with this document, and any of the same that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any of the incorporated material and that associated with this document, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the present specification. Other modified embodiments are also within the scope of this description. Accordingly, the disclosed embodiments are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the applications in this specification in alternative configurations according to the embodiments in this specification. Therefore, the embodiments of the present description are not limited to the embodiments described precisely in the application.

Claims (18)

1. A depth image imaging system comprising:
a support member;
a first camera coupled to the support member for acquiring a first optical image of a scene being photographed, comprising:
the first image sensor comprises a first photosensitive unit array integrating a visible light photosensitive unit and an infrared photosensitive unit;
a second camera, connected to the support member, fixed relative to the first camera in a predetermined attitude, for acquiring a second optical image of the scene, including:
a second image sensor including a second photosensitive cell array in which a visible light photosensitive cell and an infrared photosensitive cell are integrated; and
a control device in communication connection with the first camera and the second camera, wherein the control device determines the depth of the scene based on the first optical image and the second optical image and generates a depth image of the scene.
2. The depth image imaging system of claim 1, wherein the first photosensitive unit array is formed by arraying a red light photosensitive unit, a green light photosensitive unit, a blue light photosensitive unit, and an IR photosensitive unit in a preset mode, each photosensitive unit corresponding to one pixel; and
the second photosensitive unit array is formed by arraying a red light photosensitive unit, a green light photosensitive unit, a blue light photosensitive unit, and an IR photosensitive unit in a preset mode, each photosensitive unit corresponding to one pixel.
3. The depth image imaging system of claim 1, further comprising:
an infrared lamp connected to the support member.
4. The depth image imaging system of claim 1, wherein the first optical image comprises a first RGB image and a first IR image; and
the second optical image includes a second RGB image and a second IR image.
5. The depth image imaging system of claim 4, wherein the control device determines the depth of the scene based on a disparity of the scene in the first RGB image and the second RGB image when an illumination intensity of an environment of the scene is greater than a first preset value.
6. The depth image imaging system of claim 5, wherein the control device determines the depth of the scene based on a disparity of the scene in the first and second IR images when an illumination intensity of an environment of the scene is less than a second preset value, wherein the first preset value is greater than the second preset value.
7. The depth image imaging system of claim 6, wherein the control device determines the depth of the scene based on the disparity of the scene in the first and second RGB images and/or the disparity in the first and second IR images when the illumination intensity of the environment of the scene is between the first and second preset values.
8. The depth image imaging system of claim 7, further comprising:
a light emitter coupled to the support member and operable to emit an array of light encoded in a predetermined pattern, the light array impinging on an object in the scene to form a plurality of spots, the light emitter comprising at least one of an LED emitter and a laser emitter.
9. The depth image imaging system of claim 8, wherein the determining the depth of the scene based on the disparity of the scene in the first and second IR images comprises:
determining a depth of the scene based on a disparity of objects in the scene in the first and second IR images when the distance of the objects in the scene from the depth image imaging system is greater than a distance threshold; and
determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images when an object in the scene is less than the distance threshold from the depth image imaging system, thereby determining the depth of the scene.
10. The depth image imaging system of claim 8, wherein the determining the depth of the scene based on the disparity of the scene in the first and second RGB images and/or the disparity in the first and second IR images comprises:
determining a depth of the scene based on a disparity of objects in the scene in the first and second RGB images and/or the first and second IR images when a distance of the objects in the scene from the depth image imaging system is greater than the distance threshold; and
determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images when an object in the scene is less than the distance threshold from the depth image imaging system, thereby determining the depth of the scene.
11. The depth image imaging system of claim 1, further comprising:
a light supplement lamp, wherein when the ambient illumination intensity of the scene is less than an illumination intensity threshold, the control device drives the light supplement lamp to turn on.
12. A method of depth image imaging for a depth image imaging system, the depth image imaging system comprising:
a support member;
a first camera coupled to the support member for acquiring a first optical image of a scene being photographed, comprising:
a first image sensor including a first photosensitive cell array integrating a visible light sensitive cell and an infrared light sensitive cell,
a second camera, connected to the support member, fixed relative to the first camera in a predetermined attitude, for acquiring a second optical image of the scene, including:
a second image sensor including a second photosensitive cell array in which a visible light photosensitive cell and an infrared photosensitive cell are integrated; and
a control device in communication connection with the first camera and the second camera;
the method comprises the steps of:
acquiring the first optical image and the second optical image; and
determining the depth of the scene based on the first optical image and the second optical image, and generating a depth image of the scene.
13. The method of claim 12, wherein,
the depth image imaging system also comprises an infrared lamp which is connected to the supporting component;
the first photosensitive unit array is formed by arraying a red light photosensitive unit, a green light photosensitive unit, a blue light photosensitive unit and an IR photosensitive unit in a preset mode, and each photosensitive unit corresponds to one pixel;
the second photosensitive unit array is formed by arraying a red light photosensitive unit, a green light photosensitive unit, a blue light photosensitive unit and an IR photosensitive unit in a preset mode, and each photosensitive unit corresponds to one pixel;
the first optical image comprises a first RGB image and a first IR image; and
the second optical image includes a second RGB image and a second IR image.
14. The method of claim 13, wherein said determining the depth of the scene based on the first optical image and the second optical image comprises one of:
determining that the illumination intensity of the environment of the scene is greater than a first preset value, and determining the depth of the scene based on the parallax of the scene in the first RGB image and the second RGB image;
determining that the illumination intensity of the environment of the scene is less than a second preset value, and determining the depth of the scene based on the parallax of the scene in the first IR image and the second IR image, wherein the first preset value is greater than the second preset value; and
determining that an illumination intensity of an environment of the scene is between the first preset value and the second preset value, determining a depth of the scene based on a disparity of the scene in the first RGB image and the second RGB image and/or a disparity in the first IR image and the second IR image.
15. The method of claim 14, wherein the depth image imaging system further comprises:
a light emitter coupled to the support member and operable to emit an array of light encoded in a predetermined pattern, the light array impinging on an object in the scene to form a plurality of spots, the light emitter comprising at least one of an LED emitter and a laser emitter.
16. The method of claim 15, wherein said determining a depth of the scene based on the disparity of the scene in the first IR image and the second IR image comprises one of:
determining that an object in the scene is at a distance from the depth image imaging system that is greater than a distance threshold, determining a depth of the scene based on a disparity of the object in the scene in the first IR image and the second IR image; and
determining that a distance of an object in the scene from the depth image imaging system is less than the distance threshold, determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images, thereby determining the depth of the scene.
17. The method of claim 15, wherein the determining the depth of the scene based on the disparity of the scene in the first and second RGB images and/or the disparity in the first and second IR images comprises one of:
determining that an object in the scene is a distance from the depth image imaging system that is greater than the distance threshold, determining a depth of the scene based on a disparity of the object in the scene in the first and second RGB images and/or the first and second IR images; and
determining that a distance of an object in the scene from the depth image imaging system is less than the distance threshold, determining a depth of each spot of the plurality of spots in the scene based on a disparity of the each spot in the first and second IR images, thereby determining the depth of the scene.
18. The method of claim 12, wherein the depth image imaging system further comprises a light supplement lamp; the method further comprises, by the control device:
when it is determined that the ambient illumination intensity of the scene is less than an illumination intensity threshold, driving the light supplement lamp to turn on.
CN202010729553.6A 2020-07-27 2020-07-27 Depth image imaging system and method Pending CN111866490A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010729553.6A CN111866490A (en) 2020-07-27 2020-07-27 Depth image imaging system and method

Publications (1)

Publication Number Publication Date
CN111866490A true CN111866490A (en) 2020-10-30

Family

ID=72947201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010729553.6A Pending CN111866490A (en) 2020-07-27 2020-07-27 Depth image imaging system and method

Country Status (1)

Country Link
CN (1) CN111866490A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150256813A1 (en) * 2014-03-07 2015-09-10 Aquifi, Inc. System and method for 3d reconstruction using multiple multi-channel cameras
US20170034499A1 (en) * 2014-04-03 2017-02-02 Heptagon Micro Optics Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
CN106576159A (en) * 2015-06-23 2017-04-19 华为技术有限公司 Photographing device and method for acquiring depth information
CN107172407A (en) * 2016-03-08 2017-09-15 聚晶半导体股份有限公司 Electronic installation and method suitable for producing depth map
CN106572340A (en) * 2016-10-27 2017-04-19 深圳奥比中光科技有限公司 Camera shooting system, mobile terminal and image processing method
CN106780589A (en) * 2016-12-09 2017-05-31 深圳奥比中光科技有限公司 A kind of method for obtaining target depth image
CN110874852A (en) * 2019-11-06 2020-03-10 Oppo广东移动通信有限公司 Method for determining depth image, image processor and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114745509A (en) * 2022-04-08 2022-07-12 深圳鹏行智能研究有限公司 Image acquisition method, image acquisition equipment, foot type robot and storage medium
CN117560480A (en) * 2024-01-09 2024-02-13 荣耀终端有限公司 Image depth estimation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201030