CN112363180A - Imaging distance measuring sensor, method, system and storage medium - Google Patents


Info

Publication number
CN112363180A
CN112363180A (application number CN202011173365.6A)
Authority
CN
China
Prior art keywords
sub
pixel
pixels
infrared
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011173365.6A
Other languages
Chinese (zh)
Inventor
张学勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011173365.6A priority Critical patent/CN112363180A/en
Publication of CN112363180A publication Critical patent/CN112363180A/en
Priority to PCT/CN2021/115371 priority patent/WO2022088913A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S 17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/483 Details of pulse systems
    • G01S 7/486 Receivers
    • G01S 7/4861 Circuits for detection, sampling, integration or read-out
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/483 Details of pulse systems
    • G01S 7/486 Receivers
    • G01S 7/4865 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S 7/4866 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak by fitting a model or function to the received signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present application, which belongs to the field of time-of-flight technology, provides an imaging ranging sensor, method, system, and storage medium. The imaging ranging sensor includes a plurality of pixels arranged in an array, each pixel including at least four sub-pixels arranged in an array. Each sub-pixel includes a single-photon photosensitive unit and a filter covering the single-photon photosensitive unit, and the at least four sub-pixels include an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel. A single imaging ranging sensor can therefore acquire depth information and RGB information simultaneously; the depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, so no alignment is needed when fusing the two images, which effectively saves computing resources and time.

Description

Imaging distance measuring sensor, method, system and storage medium
Technical Field
The present application relates to Time Of Flight (TOF) technologies, and in particular, to an imaging range finding sensor, method, system, and storage medium.
Background
The time-of-flight ranging method continuously transmits optical pulse signals to a target, receives the optical signals reflected by the target, and obtains the distance to the target by measuring the round-trip flight time of the optical pulse signal. Single-photon ranging systems based on time-of-flight technology have been widely used in consumer electronics, autonomous driving, virtual reality, augmented reality, and other fields.
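As a quick illustration of this round-trip principle (a hypothetical sketch, not code from the patent):

```python
# Illustrative sketch of pulsed time-of-flight ranging (not from the patent):
# the distance is half the round-trip path travelled at the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Distance to the target from emission and reception timestamps."""
    round_trip_s = t_receive_s - t_emit_s
    return C * round_trip_s / 2.0

# A pulse returning 10 ns after emission corresponds to roughly 1.5 m.
distance_m = tof_distance(0.0, 10e-9)
```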
Currently, most TOF devices are dedicated to distance measurement. In practical applications, the depth information measured by a TOF device often needs to be combined with the RGB (Red (R), Green (G), Blue (B)) information obtained by an RGB image sensor: the depth image obtained from the depth information and the RGB image obtained from the RGB information are fused, which first requires aligning the two images through an algorithm. This computation is complex and consumes computing resources and time.
Disclosure of Invention
In view of this, embodiments of the present application provide an imaging ranging sensor, method, system, and storage medium to solve the prior-art problems that a depth image must be aligned with an RGB image through an algorithm, which is computationally complex and consumes computing resources and time.
A first aspect of the embodiments of the present application provides an imaging ranging sensor, including a plurality of pixels arranged in an array, where each of the pixels includes at least four sub-pixels arranged in an array, each of the sub-pixels includes a single photon photosensitive unit and a filter covering the single photon photosensitive unit, and each of the pixels includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel.
A second aspect of the embodiments of the present application provides an imaging ranging method, which is implemented by an imaging ranging sensor according to the first aspect of the embodiments of the present application, and the method includes:
acquiring a first light sensing signal output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
acquiring a second light sensing signal output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
acquiring depth information of the object according to the first light sensing signal;
and acquiring RGB information of the object according to the second light sensing signal.
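The four steps above can be sketched as a small pipeline; all names and the photon-to-brightness mapping below are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the four-step imaging ranging method.
from dataclasses import dataclass

C = 299_792_458.0  # speed of light, m/s

@dataclass
class PixelReading:
    ir_arrival_s: float    # first light sensing signal: IR arrival time
    visible_photons: int   # second light sensing signal: visible photon count

def depth_from_first_signal(reading: PixelReading, t_emit_s: float) -> float:
    """Step S203: distance from the IR round-trip time (time of flight)."""
    return C * (reading.ir_arrival_s - t_emit_s) / 2.0

def brightness_from_second_signal(reading: PixelReading) -> float:
    """Step S204 (simplified): a brightness value from the photon count."""
    return reading.visible_photons / 1000.0  # arbitrary scale, for illustration

frame = [PixelReading(ir_arrival_s=6.67e-9, visible_photons=1500)]
depths = [depth_from_first_signal(p, 0.0) for p in frame]       # ~1.0 m
brightness = [brightness_from_second_signal(p) for p in frame]  # 1.5
```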
A third aspect of an embodiment of the present application provides an imaging ranging system, including:
a light emitter for emitting infrared rays and visible rays to an object;
an imaging ranging sensor as described in the first aspect of the embodiments of the present application;
and the controller is respectively connected with the light emitter and the imaging ranging sensor and is used for executing the imaging ranging method according to the second aspect of the embodiment of the application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, which when executed by a controller, implements the steps of the imaging range finding method according to the second aspect of embodiments of the present application.
The imaging ranging sensor provided by the first aspect of the embodiments of the present application includes a plurality of pixels arranged in an array, each pixel including at least four sub-pixels arranged in an array. Each sub-pixel consists of a single-photon photosensitive unit and a filter covering the single-photon photosensitive unit, and the at least four sub-pixels include an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel. A single imaging ranging sensor can therefore acquire depth information and RGB information simultaneously; the depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, so no alignment is needed when fusing the two images, which effectively saves computing resources and time.
The imaging ranging method provided by the second aspect of the embodiments of the present application is implemented with the imaging ranging sensor of the first aspect. It acquires, from each pixel, a first light sensing signal indicating the time at which infrared light reflected by an object is received and a second light sensing signal indicating the time at which visible light reflected by the object is received, then obtains the depth information of the object from the first light sensing signal and the RGB information of the object from the second light sensing signal. Because both signals come from the same imaging ranging sensor, the depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, no alignment is needed when fusing them, and computing resources and time are effectively saved.
It is to be understood that the beneficial effects of the third and fourth aspects can be found in the description of the first or second aspect and are not repeated here.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments or the prior-art description are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a first structure of a pixel provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a second structure of a pixel provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a third structure of a pixel provided in this embodiment of the present application;
FIG. 4 is a schematic flow chart of a method for imaging range finding provided in an embodiment of the present application;
FIG. 5 is a table of correspondence provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an imaging ranging system provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The embodiment of the application provides an imaging ranging sensor, which comprises a plurality of pixels arranged in an array form, wherein each pixel comprises at least four sub-pixels arranged in the array form, each sub-pixel comprises a single photon light-sensitive unit and a light filter covering the single photon light-sensitive unit, and each pixel comprises an infrared sub-pixel, a red sub-pixel, a green sub-pixel and a blue sub-pixel.
In application, the number of pixels in the imaging ranging sensor can be set according to actual needs; the more pixels, the higher the resolution. The pixels may be arranged in an array of any regular shape, for example a rectangular array, a circular array, or any other regular polygonal array. The number of sub-pixels in each pixel can also be set according to actual needs, as long as each pixel includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel, so that when the imaging ranging sensor acquires the depth information and RGB information of an object, it can acquire them for the reflective point on the object corresponding to each pixel. The depth information of each reflective point includes its distance from the imaging ranging sensor; the RGB information of each reflective point includes its R, G, and B values, which may specifically be CIE spectral tristimulus values.
In application, each sub-pixel includes a single-photon photosensitive unit and a filter covering it. The filters of the infrared, red, green, and blue sub-pixels are, respectively, an infrared filter passing only infrared light, a red filter passing only red light, a green filter passing only green light, and a blue filter passing only blue light; each filter can be a reflective or an absorptive filter, as needed.
In application, the single-photon photosensitive unit may be a Single Photon Avalanche Diode (SPAD) or a photomultiplier tube. It responds to a single incident photon and outputs a light sensing signal indicating the time at which the photon reached it; based on these signals, weak optical signals can be collected and the flight time computed using Time-Correlated Single Photon Counting (TCSPC).
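A minimal sketch of the TCSPC idea, assuming software bins in place of the hardware TDC (bin width and timestamps are made up):

```python
# TCSPC sketch: photon arrival timestamps collected over many pulse periods
# are binned into a histogram; the peak bin gives the dominant flight time.
from collections import Counter

def tcspc_peak_time(arrival_times_s, bin_width_s=1e-10):
    """Return the flight time of the most-populated histogram bin."""
    histogram = Counter(round(t / bin_width_s) for t in arrival_times_s)
    peak_bin, _count = histogram.most_common(1)[0]
    return peak_bin * bin_width_s

# Signal photons cluster near 10 ns; two stray dark/ambient counts do not.
arrivals = [10.00e-9, 10.02e-9, 9.98e-9, 10.01e-9, 3.0e-9, 17.5e-9]
t_flight_s = tcspc_peak_time(arrivals)  # ~1.0e-8 s
```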
As shown in fig. 1, a first structural diagram of an imaging distance measuring sensor is exemplarily shown;
each pixel 101 includes four infrared sub-pixels IR, three red sub-pixels R, six green sub-pixels G, and three blue sub-pixels B;
four infrared sub-pixels IR are arranged in a 2 × 2 array to form an infrared sub-pixel unit 10, and one red sub-pixel R, two green sub-pixels G and one blue sub-pixel B are arranged in a bayer array to form a color sub-pixel unit 20;
each pixel 101 comprises one infrared sub-pixel element 10 and three color sub-pixel elements 20.
In application, the arrangement of one infrared sub-pixel unit and three color sub-pixel units included in each pixel can be set according to actual needs, for example, the infrared sub-pixel units and the three color sub-pixel units are arranged in a 1 × 4, 4 × 1 or 2 × 2 array.
Fig. 1 schematically shows a structure of an imaging range sensor in which one infrared sub-pixel unit 10 and three color sub-pixel units 20 are arranged in a 2 × 2 array.
In the imaging ranging sensor structure shown in fig. 1, four infrared sub-pixels arranged in a 2 × 2 array form one infrared sub-pixel unit, so the four infrared sub-pixels can be combined (binning) into one super-pixel, for example by connecting them to an OR gate or an AND gate to form one infrared super-pixel; this effectively improves the signal-to-noise ratio of the infrared sensing signal and the ranging accuracy. By arranging one red sub-pixel, two green sub-pixels, and one blue sub-pixel into one color sub-pixel unit in the Bayer array format (i.e., RGGB), color information can be reconstructed using mature Bayer-array-based algorithms, giving high imaging accuracy.
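The OR-gate binning described above can be modelled in a few lines; the detection-probability function is an illustrative consequence of assuming independent sub-pixels, not a figure from the patent:

```python
# 2x2 binning model: the super-pixel fires if any of its four SPAD
# sub-pixels detects a photon (the OR-gate combination described above).

def superpixel_or(sub_outputs) -> int:
    """OR-combine 0/1 detections from the 2x2 infrared sub-pixels."""
    return int(any(sub_outputs))

def detection_probability(p_single: float, n: int = 4) -> float:
    """Chance that at least one of n independent sub-pixels fires."""
    return 1.0 - (1.0 - p_single) ** n

fired = superpixel_or([0, 0, 1, 0])   # -> 1
p_super = detection_probability(0.1)  # ~0.344 for p=0.1 per sub-pixel
```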
As shown in fig. 2, a second structural diagram of the imaging range sensor is exemplarily shown, wherein each pixel 102 includes one infrared sub-pixel IR, one red sub-pixel R, one green sub-pixel G and one blue sub-pixel B arranged in a 2 × 2 array.
The imaging ranging sensor structure shown in fig. 2 arranges four sub-pixels into one pixel in a 2 × 2 array, so a single pixel occupies a small area and the sensor can accommodate more pixels in the same area, effectively improving its resolution and making the fusion result of depth information and RGB information more accurate.
As shown in fig. 3, a third structural diagram of the imaging distance measuring sensor is exemplarily shown;
wherein each pixel 103 comprises M × M infrared subpixels IR, M × M red subpixels R, M × M green subpixels G, and M × M blue subpixels B;
the M × M infrared sub-pixels IR are arranged in an array form to form an infrared sub-pixel unit 1, the M × M red sub-pixels R are arranged in an array form to form a red sub-pixel unit 2, the M × M green sub-pixels G are arranged in an array form to form a green sub-pixel unit 3, and the M × M blue sub-pixels B are arranged in an array form to form a blue sub-pixel unit 4;
each pixel 103 includes an infrared sub-pixel unit 1, a red sub-pixel unit 2, a green sub-pixel unit 3, and a blue sub-pixel unit 4.
In application, M is an integer greater than or equal to 2, and the specific value of M can be set according to actual needs. The arrangement of one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit and one blue sub-pixel unit included in each pixel can be set according to actual needs, for example, the infrared sub-pixel unit, the red sub-pixel unit, the green sub-pixel unit and the blue sub-pixel unit are arranged in a 1 × 4, 4 × 1 or 2 × 2 array.
Fig. 3 exemplarily shows a schematic structural diagram of an imaging ranging sensor when M is 2 and one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit, and one blue sub-pixel unit are arranged in a 2 × 2 array.
In the imaging ranging sensor structure shown in fig. 3, arranging M × M infrared sub-pixels into one infrared sub-pixel unit allows the M × M infrared sub-pixels to be combined into one infrared super-pixel, for example by connecting them to an OR gate or an AND gate; this effectively improves the signal-to-noise ratio of the infrared sensing signal and the ranging accuracy.
Likewise, arranging M × M red sub-pixels into one red sub-pixel unit allows them to be combined into one red super-pixel, improving the signal-to-noise ratio of the red sensing signal and the accuracy of the R value;
arranging M × M green sub-pixels into one green sub-pixel unit allows them to be combined into one green super-pixel, improving the signal-to-noise ratio of the green sensing signal and the accuracy of the G value;
and arranging M × M blue sub-pixels into one blue sub-pixel unit allows them to be combined into one blue super-pixel, improving the signal-to-noise ratio of the blue sensing signal and the accuracy of the B value.
As a whole, the structure of the red, green, and blue sub-pixel units in each pixel improves the accuracy of the RGB information.
The imaging ranging sensor provided by the embodiments of the present application gives each pixel an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel, each composed of a single-photon photosensitive unit and a filter covering it, so that one imaging ranging sensor can acquire depth information and RGB information simultaneously. The depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, so no alignment is needed when fusing them, effectively saving computing resources and time.
As shown in fig. 4, based on the structure of the imaging ranging sensor in any of the embodiments, the embodiment of the present application further provides an imaging ranging method, including:
step S201, acquiring a first light sensing signal which is output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
step S202, acquiring a second light sensing signal which is output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
step S203, acquiring depth information of the object according to the first light sensing signal;
and step S204, acquiring RGB information of the object according to the second light sensing signal.
In application, the infrared sub-pixel in each pixel receives the infrared light reflected by the reflective point at the corresponding position on the object and outputs a first light sensing signal when it does. Based on the time at which the first light sensing signal is received and the time at which the infrared light was emitted toward the object, the distance between each reflective point and the imaging ranging sensor is computed with the time-of-flight method; the depth information of the object comprises the distances of all reflective points on the object from the imaging ranging sensor, and the depth image of the object is obtained from this depth information. When the imaging ranging sensor is used only to acquire depth information, the depth information can also be obtained from the second light sensing signal, that is, the distance between each reflective point and the sensor can be computed with the time-of-flight method from the time at which the second light sensing signal is received and the time at which visible light was emitted toward the object.
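A sketch of turning per-pixel IR arrival times into a depth image (the grid values are hypothetical; the patent gives only the time-of-flight relation):

```python
# Depth image sketch: each pixel's IR round-trip time becomes a distance.
C = 299_792_458.0  # speed of light, m/s

def depth_image(arrival_times_s, t_emit_s=0.0):
    """arrival_times_s: 2D grid of first-light-sensing-signal times."""
    return [[C * (t - t_emit_s) / 2.0 for t in row] for row in arrival_times_s]

times = [[6.67e-9, 6.68e-9],
         [6.67e-9, 1.33e-8]]
dmap = depth_image(times)  # ~1 m for the near points, ~2 m for the far corner
```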
It should be understood that since the depth information of the object includes the distance between each light reflection point on the object and the imaging ranging sensor, the imaging ranging sensor can be used for ranging in addition to obtaining the depth information and RGB information.
In application, the color sub-pixels in each pixel receive the visible light reflected by the reflective point at the corresponding position on the object and output a second light sensing signal when they do. The RGB value (R, G, and B values) of each reflective point on the object is obtained from the second light sensing signal; the RGB information of the object comprises the RGB values of all reflective points, and the RGB image of the object is obtained from this RGB information. Similarly, the infrared image information of the object can be obtained from the first light sensing signal, and the infrared image of the object from that information.
In one embodiment, step S203 includes:
acquiring the light intensity of infrared light reflected by the object according to the first light sensing signal;
acquiring infrared image information of the object according to the light intensity of the infrared light reflected by the object;
step S204 includes:
acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal;
acquiring RGB information of the object according to the light intensity of the visible light reflected by the object;
in application, the infrared image information of the object is obtained based on the light intensity of the infrared light reflected by each reflection point on the object, and similarly, the RGB information of the object is obtained based on the light intensity of the visible light reflected by each reflection point on the object. According to the light intensity of the infrared light reflected by each point on the object, the brightness value corresponding to each reflection point in the infrared image information of the object can be obtained, the brightness value can be embodied as a gray value in the infrared image, and the infrared image obtained based on the infrared image information of the object is a gray image only containing the gray value corresponding to each reflection point. According to the light intensity of the visible light reflected by each reflecting point on the object, the brightness value corresponding to each reflecting point in the RGB information of the object can be obtained, and according to the brightness value of each reflecting point and the color of each color pixel in the corresponding pixel, the RGB value of each reflecting point can be obtained based on the calculation formula of the tristimulus values, so that the RGB information comprising the RGB value of each reflecting point of the object is obtained, and the RGB image of the object is further obtained.
In one embodiment, the acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the second light sensing signal;
acquiring the light intensity of the visible light reflected by the object according to a first corresponding relation between preset visible light intensity and total photon number or total peak signal frequency;
similarly, the acquiring the light intensity of the infrared light reflected by the object according to the first light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the first light sensing signal;
and acquiring the light intensity of the infrared light reflected by the object according to a second corresponding relation between the preset infrared light intensity and the total photon number or the total peak signal frequency.
In application, since each single-photon photosensitive unit responds to a single incident photon and outputs a light sensing signal indicating the photon's arrival time, the total number of visible-light photons received by the imaging ranging sensor can be obtained from the light sensing signals output by all the color sub-pixels, that is, the second light sensing signal. A Time-to-Digital Converter (TDC) connected to the color sub-pixels in each pixel computes the flight time from each light sensing signal and converts it into a time code; a histogram circuit connected to the TDC stores these time codes, generates a histogram from the time codes stored over at least one emission period of the visible light, and obtains the frequency of the peak signal in the histogram. Summing the peak-signal frequencies of the histograms generated from the color sub-pixels of every pixel gives the total peak signal frequency. Similarly, the total number of infrared photons received by the imaging ranging sensor, and the total peak signal frequency of the histograms generated from the light sensing signals output by the infrared sub-pixels of each pixel, can also be obtained. The first correspondence and the second correspondence may be obtained in advance through extensive experiments using this way of obtaining the total photon number and the total peak signal frequency.
In one embodiment, before acquiring the total photon count or the total peak signal count received by the imaging ranging sensor according to the second light sensing signal, the method includes:
respectively acquiring the total photon count or the total peak signal count received by the imaging ranging sensor under different light intensities of the visible light reflected by the object;
and acquiring a first correspondence between different visible light intensities and the total photon count or the total peak signal count according to the totals acquired under each light intensity of the visible light reflected by the object.
Similarly, before acquiring the total photon count or the total peak signal count received by the imaging ranging sensor according to the first light sensing signal, the method includes:
respectively acquiring the total photon count or the total peak signal count received by the imaging ranging sensor under different light intensities of the infrared light reflected by the object;
and acquiring a second correspondence between different infrared light intensities and the total photon count or the total peak signal count according to the totals acquired under each light intensity of the infrared light reflected by the object.
In application, infrared light of different intensities and visible light of different intensities can be emitted toward an object, either separately or simultaneously, and the first correspondence between different visible light intensities and the total photon count or total peak signal count, and the second correspondence between different infrared light intensities and the total photon count or total peak signal count, are then obtained in the manner described above. Subsequently, when RGB information is needed, the light intensity of the visible light can be obtained from the first correspondence and the total photon count or total peak signal count derived from the second light sensing signal; when infrared image information is needed, the light intensity of the infrared light can be obtained from the second correspondence and the total photon count or total peak signal count derived from the first light sensing signal.
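The calibration procedure above can be sketched as follows. The helper names `emit_at_intensity` and `count_totals`, and the linear relation used by the stand-ins, are hypothetical illustrations rather than anything specified by the patent.

```python
def calibrate_correspondence(emit_at_intensity, count_totals, intensities):
    """Build a correspondence table: measured total -> emitted light intensity.

    emit_at_intensity(i): drives the light emitter at intensity i (hypothetical hook).
    count_totals(): returns the total photon count (or total peak signal count)
                    accumulated by the sensor at the current intensity.
    """
    table = []
    for i in intensities:
        emit_at_intensity(i)
        table.append((count_totals(), i))  # store (total, intensity) pairs
    table.sort()  # sorted by total, ready for later lookup
    return table

# Hypothetical stand-ins for the emitter and sensor, to show the flow:
_current = {"i": 0.0}
def fake_emit(i): _current["i"] = i
def fake_totals(): return int(round(1000 * _current["i"]))  # totals scale with intensity

table = calibrate_correspondence(fake_emit, fake_totals, [0.2, 0.4, 0.8])
print(table)  # [(200, 0.2), (400, 0.4), (800, 0.8)]
```

The same routine would be run once with visible light to produce the first correspondence and once with infrared light to produce the second.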
In application, the first correspondence and the second correspondence may be stored in the form of a correspondence table, for example a lookup table (LUT).
Fig. 5 exemplarily shows a correspondence table among light intensity, total photon count, and total peak signal count.
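Given such a correspondence table, the light intensity can be recovered from a measured total. The sketch below assumes a table of (total, intensity) pairs sorted by total and uses linear interpolation between neighbouring entries; interpolation is one plausible realization of the preset correspondence, not a method the patent requires.

```python
from bisect import bisect_left

def intensity_from_total(table, total):
    """Look up light intensity from a measured total photon (or peak signal) count.

    table: list of (total, intensity) pairs sorted by total.
    Linearly interpolates between neighbouring entries; clamps at the ends.
    """
    totals = [t for t, _ in table]
    idx = bisect_left(totals, total)
    if idx == 0:
        return table[0][1]            # at or below the smallest calibrated total
    if idx == len(table):
        return table[-1][1]           # above the largest calibrated total
    (t0, i0), (t1, i1) = table[idx - 1], table[idx]
    return i0 + (i1 - i0) * (total - t0) / (t1 - t0)

table = [(200, 0.2), (400, 0.4), (800, 0.8)]
print(intensity_from_total(table, 300))  # ≈ 0.3 (midway between entries)
print(intensity_from_total(table, 900))  # 0.8 (clamped at the table end)
```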
The imaging ranging method provided by the embodiments of the present application can directly acquire the depth information and the RGB information of an object through the first and second light sensing signals output by the imaging ranging sensor. The depth image acquired from the depth information and the RGB image acquired from the RGB information therefore have no viewing-angle difference, so no alignment is needed when the two images are fused, which effectively saves computing resources and time.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
As shown in fig. 6, an embodiment of the present application provides an imaging ranging system 1000, including:
a light emitter 200 for emitting infrared rays and visible rays toward the object 2000;
an imaging range sensor 100;
and a controller 300 connected to the light emitter 200 and the imaging range sensor 100, respectively, for performing the imaging range finding method in the above-described embodiment.
In application, the imaging ranging system comprises at least a controller, a light emitter, and an imaging ranging sensor, and may further comprise a collimating optical element and a Diffractive Optical Element (DOE) covering the light emitter, a focusing lens or a micro-lens array covering the imaging ranging sensor, and the like. The collimating optical element collimates the light emitted by the light emitter, and the diffractive optical element diffracts the light. The focusing lens or micro-lens array focuses the light reflected by the object onto the photosensitive surface of the imaging ranging sensor. The controller is used to turn the light emitter and the imaging ranging sensor on or off, and can also adjust the intensity of the light emitted by the light emitter. The object may be any object in free space that reflects light.
In application, the light emitter may be configured, according to actual needs, as a laser, a Light-Emitting Diode (LED), a Laser Diode (LD), an Edge-Emitting Laser (EEL), or the like. The laser may be a Vertical-Cavity Surface-Emitting Laser (VCSEL). The light emitter may also be a tunable device.
In application, the controller may include signal amplifiers, Time-to-Digital Converters (TDCs), Analog-to-Digital Converters (ADCs), and the like, in numbers equal to the number of pixels or single photon photosensitive units in the imaging ranging sensor. The TDC connected to each pixel is in turn connected to a histogram circuit.
In application, the controller may further include a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. The general-purpose processor may be a microcontroller or any conventional processor.
The imaging distance measuring sensor, the imaging distance measuring method and the imaging distance measuring system provided by the embodiment of the application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, Augmented Reality (AR) devices, Virtual Reality (VR) devices, notebook computers, netbooks and Personal Digital Assistants (PDAs).
The embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a controller, implements the steps in the above imaging ranging method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a terminal device, enables the terminal device to implement the steps in the above imaging ranging method embodiments.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments. It should be understood that the system/terminal device and method disclosed in the embodiments provided in the present application may be implemented in other ways. For example, the system/terminal device embodiments described above are merely illustrative.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An imaging ranging sensor, characterized by comprising a plurality of pixels arranged in an array, wherein each pixel comprises at least four sub-pixels arranged in an array, each sub-pixel comprises a single photon photosensitive unit and a light filter covering the single photon photosensitive unit, and each pixel comprises an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel.
2. The imaging ranging sensor of claim 1, wherein each said pixel comprises four said infrared sub-pixels, three said red sub-pixels, six said green sub-pixels, and three said blue sub-pixels;
four infrared sub-pixels are arranged in a 2 x 2 array form to form an infrared sub-pixel unit, and one red sub-pixel, two green sub-pixels and one blue sub-pixel are arranged in a Bayer array form to form a color sub-pixel unit;
each pixel comprises one infrared sub-pixel unit and three color sub-pixel units.
3. The imaging ranging sensor of claim 1, wherein each said pixel comprises one of said infrared sub-pixels, one of said red sub-pixels, one of said green sub-pixels, and one of said blue sub-pixels arranged in a 2 x 2 array.
4. The imaging ranging sensor of claim 1, wherein each said pixel comprises M x M of said infrared sub-pixels, M x M of said red sub-pixels, M x M of said green sub-pixels, and M x M of said blue sub-pixels;
the M multiplied by M infrared sub-pixels are arranged in an array form to form an infrared sub-pixel unit, the M multiplied by M red sub-pixels are arranged in an array form to form a red sub-pixel unit, the M multiplied by M green sub-pixels are arranged in an array form to form a green sub-pixel unit, and the M multiplied by M blue sub-pixels are arranged in an array form to form a blue sub-pixel unit;
each pixel comprises one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit and one blue sub-pixel unit, and M is an integer greater than or equal to 2.
5. An imaging ranging method implemented based on the imaging ranging sensor of any one of claims 1 to 4, the method comprising:
acquiring a first light sensing signal output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
acquiring a second light sensing signal output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
acquiring depth information of the object according to the first light sensing signal;
and acquiring RGB information of the object according to the second light sensing signal.
6. The imaging ranging method as claimed in claim 5, wherein the acquiring RGB information of the object according to the second photo sensing signal comprises:
acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal;
and acquiring the RGB information of the object according to the light intensity of the visible light reflected by the object.
7. The imaging ranging method as claimed in claim 6, wherein the acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal comprises:
acquiring the total number of photons or the total number of peak signals received by the imaging ranging sensor according to the second light sensing signal;
and acquiring the light intensity of the visible light reflected by the object according to a preset first correspondence between visible light intensity and the total number of photons or the total number of peak signals.
8. The imaging ranging method as claimed in claim 7, wherein before acquiring the total number of photons or the total number of peak signals received by the imaging ranging sensor according to the second light sensing signal, the method comprises:
respectively acquiring the total number of photons or the total number of peak signals received by the imaging ranging sensor under different light intensities of the visible light reflected by the object;
and acquiring a first correspondence between different visible light intensities and the total number of photons or the total number of peak signals according to the totals acquired under each light intensity of the visible light reflected by the object.
9. An imaging ranging system, comprising:
a light emitter for emitting infrared rays and visible rays to an object;
the imaging ranging sensor of any one of claims 1 to 4;
a controller connected to the light emitter and the imaging range sensor, respectively, for performing the imaging range finding method of any one of claims 5 to 8.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a controller, carries out the steps of the imaging range finding method according to one of claims 5 to 8.
CN202011173365.6A 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium Pending CN112363180A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011173365.6A CN112363180A (en) 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium
PCT/CN2021/115371 WO2022088913A1 (en) 2020-10-28 2021-08-30 Imaging distance measurement sensor, method, and system, and storage medium


Publications (1)

Publication Number Publication Date
CN112363180A (en) 2021-02-12

Family

ID=74511135



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1657873A (en) * 2004-02-17 2005-08-24 欧姆龙株式会社 Optical measuring device and optical measuring method
CN105049829A (en) * 2015-07-10 2015-11-11 北京唯创视界科技有限公司 Optical filter, image sensor, imaging device and three-dimensional imaging system
CN105895645B (en) * 2015-02-17 2019-09-03 豪威科技股份有限公司 Pixel array and image sensing system
CN110574367A (en) * 2019-07-31 2019-12-13 华为技术有限公司 Image sensor and image sensitization method
CN211429422U (en) * 2018-09-14 2020-09-04 深圳阜时科技有限公司 Image sensor, image acquisition device, identity recognition device and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110003696A (en) * 2009-07-06 2011-01-13 삼성전자주식회사 Optical filter array for the single chip three-dimension color image sensor and method for manufacturing same
KR102086509B1 (en) * 2012-11-23 2020-03-09 엘지전자 주식회사 Apparatus and method for obtaining 3d image
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022088913A1 (en) * 2020-10-28 2022-05-05 Oppo广东移动通信有限公司 Imaging distance measurement sensor, method, and system, and storage medium
CN113055621A (en) * 2021-03-11 2021-06-29 维沃移动通信有限公司 Camera module and electronic equipment
CN113055621B (en) * 2021-03-11 2024-04-09 维沃移动通信有限公司 Camera module and electronic equipment
CN112738385A (en) * 2021-03-30 2021-04-30 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112804438A (en) * 2021-03-30 2021-05-14 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112799097A (en) * 2021-04-14 2021-05-14 深圳阜时科技有限公司 Method for acquiring depth map and gray scale map, depth camera and electronic equipment
CN112799097B (en) * 2021-04-14 2023-11-28 深圳阜时科技有限公司 Depth map and gray map acquisition method, depth camera and electronic device
CN113938664A (en) * 2021-09-10 2022-01-14 思特威(上海)电子科技股份有限公司 Signal acquisition method of pixel array, image sensor, equipment and storage medium
WO2023226395A1 (en) * 2022-05-25 2023-11-30 Oppo广东移动通信有限公司 Image sensor, camera, and electronic device
CN115184956A (en) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronic device
CN115184956B (en) * 2022-09-09 2023-01-13 荣耀终端有限公司 TOF sensor system and electronic device

Also Published As

Publication number Publication date
WO2022088913A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
CN112363180A (en) Imaging distance measuring sensor, method, system and storage medium
US20210181317A1 (en) Time-of-flight-based distance measurement system and method
US9058081B2 (en) Application using a single photon avalanche diode (SPAD)
WO2021120403A1 (en) Depth measurement device and method
WO2021051478A1 (en) Time-of-flight-based distance measurement system and method for dual-shared tdc circuit
WO2021238212A1 (en) Depth measurement apparatus and method, and electronic device
WO2021120402A1 (en) Fused depth measurement apparatus and measurement method
US8461533B2 (en) Radiation sensor
WO2021072802A1 (en) Distance measurement system and method
US20140091206A1 (en) Proximity sensor and associated method, computer readable medium and firmware
CN105991978B (en) The method of imaging sensor and generation depth data with depth detection pixel
CN111965658B (en) Distance measurement system, method and computer readable storage medium
JPWO2018211831A1 (en) Photodetectors and portable electronics
US20210208275A1 (en) Angle of rotation determination in scanning lidar systems
US11709271B2 (en) Time of flight sensing system and image sensor used therein
CN113780349A (en) Method for acquiring training sample set, model training method and related device
WO2022241942A1 (en) Depth camera and depth calculation method
WO2019041257A1 (en) Signal processing chip, image processing system and distance measurement system
CN111983630B (en) Single photon ranging system, method, terminal equipment and storage medium
US20240127566A1 (en) Photography apparatus and method, electronic device, and storage medium
GB2486164A (en) Using a single photon avalanche diode (SPAD) as a proximity detector
CN111965659B (en) Distance measurement system, method and computer readable storage medium
WO2022088914A1 (en) Photosensitive device and time-of-flight ranging system
WO2023133939A1 (en) Image-laser fusion lidar detection system and method
WO2022088492A1 (en) Collector, distance measurement system, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210212