WO2021037141A1 - Electronic device and method for acquiring depth information - Google Patents


Info

Publication number
WO2021037141A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
lens
image
path
image sensor
Prior art date
Application number
PCT/CN2020/111750
Other languages
English (en)
Chinese (zh)
Inventor
张荣祥
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2021037141A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/50 Depth or shape recovery
    • G06T 7/514 Depth or shape recovery from specularities
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20224 Image subtraction

Definitions

  • The embodiments of the present invention relate to the field of communication technology, and in particular to an electronic device and a method for acquiring depth information.
  • With the popularization of electronic devices and the continuous development of their camera functions, the camera has become a standard module of electronic devices. Depth information plays an important role in realizing camera functions: it can be used for distance measurement and techniques such as portrait background blur, and can also be used to optimize image quality.
  • Method 1: use Time of Flight (TOF) to obtain depth information. A light pulse transmitter continuously sends light pulses to the target object, a sensor receives the light pulses returned from the target object, and a time-statistics chip measures the (round-trip) flight time of each light pulse to obtain the depth information of the target object.
  • Method 2: acquire depth information through a laser speckle camera equipped with an RGB (red, green, blue) image sensor and an infrared image sensor.
  • Method 3: use a binocular camera module to obtain depth information. Based on the binocular parallax principle, the information obtained by the two image sensors is registered to obtain disparity information, which is then converted into depth information using the relationship between disparity and depth.
  • However, the first method requires a relatively expensive time-statistics chip, the second method requires an additional RGB image sensor and an infrared image sensor, and the third method requires two image sensors. All three methods therefore suffer from high cost and low cost performance.
  • The embodiments of the present invention provide an electronic device and a method for acquiring depth information, to solve the problems of high production cost and low cost performance of electronic devices in the prior art.
  • An embodiment of the present invention provides an electronic device, including: a first light-incident component, a second light-incident component, a reflective component, an image sensor, and a processor, where the reflective component and the image sensor are arranged on the same side of the first light-incident component and the second light-incident component;
  • the reflective component is configured to reflect the first light output by the first light-incident component to the image sensor along a first path, and to reflect the second light output by the second light-incident component to the image sensor along a second path;
  • the image sensor is configured to generate a target image according to the first light and the second light, and to send the target image to the processor;
  • the processor is configured to calculate the target depth information corresponding to the target image.
  • An embodiment of the present invention further provides a method for acquiring depth information; the steps of the method are described below.
  • An embodiment of the present invention also provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the method for acquiring depth information provided by the present invention.
  • The embodiments of the present invention also provide a readable storage medium. When the instructions in the storage medium are executed by the processor of an electronic device, the electronic device can execute the steps of the method for acquiring depth information provided by the present invention.
  • An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the steps of the method for acquiring depth information provided by the embodiments of the present invention are implemented.
  • In the embodiments of the present invention, the electronic device includes: a first light-incident component, a second light-incident component, a reflective component, an image sensor, and a processor, where the reflective component and the image sensor are arranged on the same side of the first light-incident component and the second light-incident component. The reflective component reflects the first light output by the first light-incident component to the image sensor along the first path, and reflects the second light output by the second light-incident component to the image sensor along the second path; the image sensor sends the target image generated according to the first light and the second light to the processor; and the processor calculates the target depth information corresponding to the target image. By using the reflective component to reflect the two rays of light output by the first and second light-incident components, the embodiments of the present invention shorten the distance between the paths along which the two rays are incident on the image sensor, so that two sets of incident light can be received by only one image sensor.
  • FIG. 1 is a structural block diagram of an electronic device provided by an embodiment of the present invention.
  • FIG. 2 is a structural diagram of an electronic device provided by an embodiment of the present invention.
  • FIG. 3 is a structural diagram of another electronic device provided by an embodiment of the present invention.
  • FIG. 4 is a structural diagram of another electronic device provided by an embodiment of the present invention.
  • FIG. 5 is a flow chart of the steps of a method for acquiring depth information according to an embodiment of the present invention.
  • Any lower limit may be combined with any upper limit to form a range not explicitly recited; likewise, any lower limit may be combined with another lower limit, and any upper limit with another upper limit, to form ranges not explicitly recited.
  • Every point or single value between the end points of a range is included in that range. Each point or single value may therefore serve as its own lower or upper limit, and may be combined with any other point, single value, lower limit, or upper limit to form a range not explicitly recited.
  • FIG. 1 is a structural block diagram of an electronic device provided by an embodiment of the present invention.
  • The electronic device 1 may include: a first light-incident component 10, a second light-incident component 20, a reflective component 30, an image sensor 40, and a processor 50, where the reflective component 30 and the image sensor 40 are arranged on the same side of the first light-incident component 10 and the second light-incident component 20. The reflective component 30 reflects the first light AB output by the first light-incident component 10 to the image sensor 40 along the first path EF, and reflects the second light CD output by the second light-incident component 20 to the image sensor 40 along the second path GH; the image sensor 40 sends the target image generated according to the first light AB and the second light CD to the processor 50; and the processor 50 calculates the target depth information corresponding to the target image.
  • The vertical distance between the first light-incident component 10 and the second light-incident component 20 is a first preset distance a. The first path EF and the second path GH are parallel to each other, the vertical distance between them is a second preset distance b, and the second preset distance b is smaller than the first preset distance a.
  • In the embodiments of the present invention, based on the current design concept of light and thin electronic devices, the imaging device also needs to be light enough to avoid occupying too much space in the electronic device. At the same time, the imaging device needs to include two light paths that can guide incident light, so as to obtain two groups of light with a certain parallax.
  • In the embodiments of the present invention, the first light-incident component 10 and the second light-incident component 20 are used to acquire the two groups of light. To produce a certain parallax, a certain distance between the two groups of light is required; therefore, the first light-incident component 10 and the second light-incident component 20 may be separated by the first preset distance a.
  • Since the length of the light-receiving surface of the image sensor 40 is limited, and is usually smaller than the first preset distance a, in order for the two groups of light output by the first light-incident component 10 and the second light-incident component 20 to converge on the light-receiving surface of the image sensor 40, the two groups of light can be reflected by the reflective component 30 to form a set of mutually parallel reflected rays separated by the second preset distance b, which then converge on the light-receiving surface of the image sensor 40.
  • Specifically, the reflective component 30 reflects the first light AB output by the first light-incident component 10 to the image sensor 40 along the first path EF, and reflects the second light CD output by the second light-incident component 20 to the image sensor 40 along the second path GH. Owing to this reflection, the second preset distance b between the first path EF and the second path GH can be smaller than the first preset distance a, so that two sets of incident light can be received by only one image sensor 40.
  • After the image sensor 40 receives the light incident along the first path EF and the second path GH, it can generate a target image according to the optical information of the two rays, and send the target image to the processor 50.
  • The first light-incident component 10 and the second light-incident component 20 each have corresponding effective-imaging-area calibration information, which is determined by the hardware characteristics of the corresponding component. The processor 50 uses this calibration information to distinguish, in the target image, the area corresponding to the first light-incident component 10 from the area corresponding to the second light-incident component 20, so that the target image can be divided into two independent images: one can be understood as the image generated based on the first light-incident component 10, the other as the image generated based on the second light-incident component 20, and there is a certain parallax between the two images.
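  • For illustration only (this code is not part of the patent): a minimal sketch of how a processor might divide the single-sensor target image into two independent views, under the assumption that the effective-imaging-area calibration information reduces to a pair of pixel-column ranges, one per light-incident component. The function name and data layout are hypothetical.

```python
import numpy as np

def split_target_image(target, area1, area2):
    """Divide the single-sensor target image into two independent views.

    target : H x W array produced by the one image sensor
    area1, area2 : (col_start, col_end) column ranges assumed to be the
        effective-imaging-area calibration info of each light-incident
        component (hypothetical representation)
    """
    first = target[:, area1[0]:area1[1]]    # area lit via the first path
    second = target[:, area2[0]:area2[1]]   # area lit via the second path
    return first, second

# Toy example: a 4 x 8 sensor image where columns 0-3 belong to the first
# component and columns 4-7 to the second.
target = np.arange(32).reshape(4, 8)
left_view, right_view = split_target_image(target, (0, 4), (4, 8))
```

Real calibration data would come from the component's factory parameters, as the description notes below.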
  • The processor 50 performs a registration calculation on the two independent images to obtain the disparity information, and then obtains the target depth information corresponding to the target image according to a preset mapping relationship between disparity information and depth information.
  • In the embodiments of the present invention, the reflective component 30 reflects the two rays output by the first light-incident component 10 and the second light-incident component 20, shortening the distance between the paths along which the two rays are incident on the image sensor 40. Without increasing the number of image sensors 40 or the length of their light-receiving surface, two sets of incident light can be received by a single image sensor 40. Moreover, the embodiments of the present invention do not require additional functional modules such as a time-statistics chip, an RGB image sensor, or an infrared image sensor, which lowers the production cost of the electronic device and improves its cost performance.
  • In addition, since the electronic device in the embodiments of the present invention may have only one image sensor, the incidence of the two groups of light is realized on a single sensor. Because the two groups of light fall on the same image sensor, no frame-synchronization calculation is needed for them, which solves the image-frame-synchronization problem caused by the use of two image sensors in the prior art.
  • In summary, the electronic device includes: a first light-incident component, a second light-incident component, a reflective component, an image sensor, and a processor, where the reflective component and the image sensor are arranged on the same side of the first and second light-incident components. The reflective component reflects the first light output by the first light-incident component to the image sensor along the first path, and reflects the second light output by the second light-incident component to the image sensor along the second path; the image sensor sends the target image generated according to the first light and the second light to the processor; and the processor calculates the target depth information corresponding to the target image. Using the reflective component to reflect the two rays output by the first and second light-incident components shortens the distance between the paths along which the two rays are incident on the image sensor, so that two sets of incident light can be received by only one image sensor. The embodiments of the present invention do not need additional functional modules such as a time-statistics chip, an RGB image sensor, or an infrared image sensor, which lowers the production cost of the electronic device and improves its cost performance.
  • Optionally, the first light-incident component 10 includes a first lens 101 and a first mirror 102; the second light-incident component 20 includes a second lens 201 and a second mirror 202; and the reflective component 30 includes a third mirror 301 and a fourth mirror 302, the third mirror 301 and the fourth mirror 302 being perpendicular to each other. The vertical distance between the first lens 101 and the second lens 201 is the first preset distance a, and the light path of the first lens 101 and the light path of the second lens 201 are parallel to each other; the first mirror 102 and the third mirror 301 are parallel to each other, and the second mirror 202 and the fourth mirror 302 are parallel to each other. The first mirror 102 reflects the first light output by the first lens 101 to the third mirror 301, and the second mirror 202 reflects the second light output by the second lens 201 to the fourth mirror 302; the third mirror 301 reflects the first light along the first path to the image sensor 40, and the fourth mirror 302 reflects the second light along the second path to the image sensor 40.
  • The first lens 101 and the second lens 201 may be conventional camera lenses. Their light paths being parallel to each other means that the light output by the first lens 101 and the light output by the second lens 201 need to be parallel to each other.
  • The first mirror 102, the second mirror 202, the third mirror 301, and the fourth mirror 302 can all reflect light. The first mirror 102 and the third mirror 301 may be parallel to each other, the second mirror 202 and the fourth mirror 302 may be parallel to each other, and the third mirror 301 and the fourth mirror 302 are perpendicular to each other.
  • With this arrangement, the two rays output by the first lens 101 and the second lens 201, which are separated by a certain distance, are each guided along a short path to a different area of the image sensor 40, and the image sensor 40 can generate a target image reflecting the parallax information according to the optical information of the rays falling in the different areas.
  • The light-path structure of the electronic device shown in Figure 2 is similar to a periscope light-path structure: two rays with a certain parallax are obtained through two lenses separated by a relatively large distance, and the reflective component combines the two rays into a set of closely spaced parallel rays that are incident on the photosensitive surface of the same image sensor. This simplifies the light-path design, reduces the number of image sensors, and saves cost.
  • Optionally, the first light-incident component 10 further includes a first guide rail 103, and the second light-incident component 20 further includes a second guide rail 203. The first lens 101 is fixedly connected to the first mirror 102, and the second lens 201 is fixedly connected to the second mirror 202; the first guide rail 103 is arranged between the first lens 101 and the image sensor 40, and the second guide rail 203 is arranged between the second lens 201 and the image sensor 40. The first mirror 102 or the first lens 101 is arranged on the first guide rail 103 and moves along it; the second mirror 202 or the second lens 201 is arranged on the second guide rail 203 and moves along it.
  • The first guide rail 103 may be perpendicular to the light path of the first lens 101, and the second guide rail 203 may be perpendicular to the light path of the second lens 201.
  • Specifically, the first lens 101 and the first mirror 102 can be fixed together on a first base with their relative position kept fixed, and the first base can be movably arranged on the first guide rail 103 and move along it; the second lens 201 and the second mirror 202 can be fixed together on a second base with their relative position kept fixed, and the second base can be movably arranged on the second guide rail 203 and move along it.
  • In this way, the distance between the first light-incident component 10 and the second light-incident component 20 can be adjusted, thereby adjusting the disparity value of the incident light, so that the electronic device can cover scenes with different depth-information ranges.
  • Under different distance configurations, the image sensor 40 can generate target images reflecting different disparity information, and the processor 50 can obtain different target depth information from the different target images. The processor 50 may also perform fusion processing on the different target depth information and finally output a fused depth-information image. After fusion, more detailed and wider depth information can be obtained, which expands the depth range and accuracy that the electronic device can measure.
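  • The effect of adjusting the separation (baseline) can be illustrated with the standard rectified-stereo relationship Z = f * B / d. This formula is a common assumption, not one stated in the patent, and all numbers below are hypothetical:

```python
# Sketch of how the adjustable baseline B changes the measurable depth
# range, assuming an ideal rectified pair with depth Z = f * B / d
# (f: focal length in pixels, d: disparity in pixels).

def depth_range(focal_px, baseline_m, d_min, d_max):
    """Depth interval measurable for disparities in [d_min, d_max] pixels."""
    z_near = focal_px * baseline_m / d_max   # large disparity -> near objects
    z_far = focal_px * baseline_m / d_min    # small disparity -> far objects
    return z_near, z_far

f = 1000.0                                        # focal length in pixels (assumed)
near_s, far_s = depth_range(f, 0.02, 1.0, 64.0)   # short baseline: 2 cm
near_l, far_l = depth_range(f, 0.08, 1.0, 64.0)   # long baseline: 8 cm
```

A longer baseline pushes both limits outward (here 1.25 m to 80 m instead of about 0.31 m to 20 m), so fusing depth maps obtained at different rail positions widens the overall coverage, as the paragraph above describes.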
  • Optionally, the first light-incident component 10 includes a third lens 104; the second light-incident component 20 includes a fourth lens 204; and the reflective component 30 includes a fifth mirror 303 and a sixth mirror 304. The vertical distance between the third lens 104 and the fourth lens 204 is the first preset distance a, the light path of the third lens 104 and the light path of the fourth lens 204 are parallel to each other, and the fifth mirror 303 and the sixth mirror 304 are parallel to each other. The vertical distance between the third lens 104 and the image sensor 40 is greater than the vertical distance between the fourth lens 204 and the image sensor 40. The fifth mirror 303 is arranged on one side of the light-path direction of the third lens 104, and its light path overlaps with the light path of the third lens 104; the sixth mirror 304 is arranged on one side of the light-path direction of the fourth lens 204, and its light path overlaps with the light path of the fourth lens 204.
  • In this way, the light-path structure of the electronic device is further simplified: with only two lenses and their corresponding mirrors, two rays of light with a certain parallax are obtained, and the reflective component combines the two rays into a set of closely spaced parallel rays that are incident on the photosensitive surface of the same image sensor. This further simplifies the light-path design, reduces the number of image sensors, and saves cost.
  • In addition, since the electronic device in the embodiments of the present invention may have only one image sensor, the incidence of the two groups of light is realized on a single sensor. Because the two groups of light fall on the same image sensor, no frame-synchronization calculation is needed for them, which solves the image-frame-synchronization problem caused by the use of two image sensors in the prior art.
  • In summary, the electronic device includes: a first light-incident component, a second light-incident component, a reflective component, an image sensor, and a processor, where the reflective component and the image sensor are arranged on the same side of the first and second light-incident components. The reflective component reflects the first light output by the first light-incident component to the image sensor along the first path, and reflects the second light output by the second light-incident component to the image sensor along the second path; the image sensor sends the target image generated according to the first light and the second light to the processor; and the processor calculates the target depth information corresponding to the target image. Using the reflective component to reflect the two rays output by the first and second light-incident components shortens the distance between the paths along which the two rays are incident on the image sensor, so that two sets of incident light can be received by only one image sensor. The embodiments of the present invention do not need additional functional modules such as a time-statistics chip, an RGB image sensor, or an infrared image sensor, which lowers the production cost of the electronic device and improves its cost performance.
  • FIG. 5 is a flow chart of the steps of a method for acquiring depth information according to an embodiment of the present invention. The method is applied to the processor in the electronic device. As shown in FIG. 5, the method may include the following steps:
  • Step 501: Obtain a target image generated by the image sensor according to the first light and the second light.
  • In the embodiments of the present invention, after the image sensor receives the first light incident along the first path and the second light incident along the second path, it can generate a target image based on the optical information of the two rays. The target image reflects the disparity information produced by the separation between the first path and the second path.
  • Step 502: Divide the target image into a first image and a second image according to the effective-imaging-area calibration information of the first light-incident component and the effective-imaging-area calibration information of the second light-incident component.
  • The first light-incident component and the second light-incident component each have corresponding effective-imaging-area calibration information, which is determined by the hardware characteristics of the corresponding component. The processor uses this calibration information to distinguish, in the target image, the area corresponding to the first light-incident component from the area corresponding to the second light-incident component, so that the target image can be divided into two independent images: a first image and a second image.
  • The first image may be understood as the image generated based on the first light-incident component, and the second image as the image generated based on the second light-incident component.
  • The effective-imaging-area calibration information of a light-incident component is a parameter provided with the component when it leaves the factory. It can therefore be obtained from the factory manual of the light-incident component, or looked up through the official website of the electronic device.
  • Step 503: Obtain the target disparity information between the first image and the second image.
  • In the embodiments of the present invention, the registration calculation may be a binocular matching calculation. The function of binocular matching is to match the pixels corresponding to the same scene point in the left and right views (that is, the first image and the second image) in order to obtain the target disparity value. Once the target disparity value is obtained, the target depth information can be calculated.
  • Optionally, step 503 may specifically include the following sub-steps:
  • Sub-step 5031: Perform a coarse registration calculation on the first image and the second image to obtain a first disparity map and a second disparity map.
  • In the embodiments of the present invention, a pixel-by-pixel coarse registration calculation may be performed on the first image and the second image to obtain the first disparity map and the second disparity map. The coarse registration calculation may be a binocular matching calculation, including: taking the first image as the reference image and performing a coarse registration calculation between the second image and the first image to obtain a first disparity map based on the first image; and taking the second image as the reference image and performing a coarse registration calculation between the first image and the second image to obtain a second disparity map based on the second image.
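  • As an illustration of the pixel-by-pixel coarse registration above, here is a toy sum-of-absolute-differences (SAD) block matcher, one common form of binocular matching. It is a sketch under the assumption of rectified rows, not the patent's specific algorithm, and all names are hypothetical:

```python
import numpy as np

def coarse_disparity(ref, other, max_disp, direction=-1, win=1):
    """Disparity map with `ref` as the reference image (SAD block matching).

    For each ref pixel, candidate matches in `other` are sampled at
    x + direction*d for d = 0..max_disp (direction=-1 when ref is the left
    view, +1 when ref is the right view); the shift with the smallest sum
    of absolute differences over a (2*win+1)^2 window wins.
    """
    h, w = ref.shape
    disp = np.zeros((h, w), dtype=np.int32)
    pad = win
    ref_p = np.pad(ref.astype(np.float64), pad, mode='edge')
    oth_p = np.pad(other.astype(np.float64), pad, mode='edge')
    for y in range(h):
        for x in range(w):
            best, best_d = np.inf, 0
            limit = x if direction == -1 else w - 1 - x
            for d in range(min(max_disp, limit) + 1):
                a = ref_p[y:y + 2*pad + 1, x:x + 2*pad + 1]
                b = oth_p[y:y + 2*pad + 1,
                          x + direction*d:x + direction*d + 2*pad + 1]
                sad = float(np.abs(a - b).sum())
                if sad < best:
                    best, best_d = sad, d
            disp[y, x] = best_d
    return disp

# Toy example: the right view is the left view shifted 2 pixels.
left = np.tile(np.array([0, 0, 9, 1, 7, 3, 5, 2, 8, 4], dtype=float), (5, 1))
right = np.roll(left, -2, axis=1)
d_left = coarse_disparity(left, right, max_disp=4)                # first disparity map
d_right = coarse_disparity(right, left, max_disp=4, direction=1)  # second disparity map
```

A production binocular matcher would use larger windows, cost aggregation, and sub-pixel refinement; this sketch only shows the two-direction structure (one map per reference image) that the sub-step describes.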
  • Sub-step 5032: Perform a fine registration calculation on the first disparity map and the second disparity map to obtain the target disparity information.
  • In the embodiments of the present invention, the fine registration calculation includes: establishing a mutual-matching template covering the pixels of the first disparity map and the second disparity map, and then performing a mutual-matching check between the corresponding pixels of the two disparity maps. If the similarity between two corresponding pixels is greater than or equal to a preset threshold, the mutual-matching check for those pixels is considered valid, and the corresponding pixel-position parameter in the mutual-matching template is set to 1; for pixels whose mutual-matching check is invalid, the corresponding pixel-position parameter in the template is set to 0. A mutual-matching template image is thus obtained. The pixels of the first disparity map or of the second disparity map can then be corrected according to the position-parameter values of the mutual-matching template image, yielding a fine disparity map that contains the target disparity information.
  • Taking the first disparity map as an example, the values of its pixels are modified as follows: if the position parameter of a pixel in the mutual-matching template image is 0 and the value of the corresponding pixel in the first disparity map is not 0, the value of that pixel is changed from non-zero to 0; if the position parameter of a pixel in the mutual-matching template image is 1 and the value of the corresponding pixel in the first disparity map is 0, the value of that pixel is changed from 0 to 1. When all pixels of the first disparity map have been corrected, a fine disparity map containing the target disparity information is obtained.
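  • For illustration, the mutual-matching check above can be sketched as a left-right consistency test between the two disparity maps. The mapping convention (a pixel x with disparity d in the first map corresponds to pixel x - d in the second map) and the similarity measure are assumptions, not details fixed by the patent:

```python
import numpy as np

def mutual_match_template(d_first, d_second, tol=1):
    """Template image: 1 where the two disparity maps agree (valid check),
    0 otherwise. `tol` plays the role of the preset similarity threshold."""
    h, w = d_first.shape
    template = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            d = int(d_first[y, x])
            # assumed mapping: first-map pixel x corresponds to x - d
            if 0 <= x - d < w and abs(int(d_second[y, x - d]) - d) <= tol:
                template[y, x] = 1
    return template

def refine_first_map(d_first, template):
    """Correct the first disparity map using the template, as in the text:
    invalid (0) positions are forced to 0; valid (1) positions holding a
    zero disparity are set to 1."""
    refined = d_first.copy()
    refined[template == 0] = 0
    refined[(template == 1) & (refined == 0)] = 1
    return refined

d_first = np.array([[0, 0, 2, 2]])
d_second = np.array([[0, 2, 0, 0]])
template = mutual_match_template(d_first, d_second, tol=0)
fine = refine_first_map(d_first, template)
```

In the toy example, pixels 0 and 3 pass the check (template 1), pixel 0's zero disparity is promoted to 1, and the inconsistent pixels 1 and 2 are zeroed out, mirroring both correction rules described above.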
  • Step 504: Determine the target depth information corresponding to the target disparity information according to a preset correspondence between disparity information and depth information.
  • In the embodiments of the present invention, the electronic device may be calibrated in advance to obtain the preset correspondence between disparity information and depth information. After the target disparity information is obtained, the target depth information corresponding to it can be determined based on this correspondence.
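  • As a sketch of step 504 under a common (assumed) form of the preset correspondence, where a calibrated rectified pair gives Z = f * B / d, with f the focal length in pixels and B the effective baseline; the values below are hypothetical:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Map a disparity map (pixels) to a depth map (metres).

    Pixels with disparity 0 (e.g. those rejected by the mutual-matching
    check) are left at depth 0 to avoid division by zero.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

disp = np.array([[0.0, 10.0, 20.0]])
depth = disparity_to_depth(disp, focal_px=1000.0, baseline_m=0.05)
```

In practice the correspondence obtained by calibration may be a lookup table or a fitted curve rather than this closed-form relation; the inverse proportionality between disparity and depth is the part that carries over.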
  • Optionally, before step 503, the method may further include:
  • Step A1: Correct the first image to a first front view according to the internal parameters, external parameters, and distortion parameters of the first light-incident component.
  • lens distortion is actually the general term for the inherent perspective distortion of optical lenses, that is, distortion caused by perspective reasons, such as wide-angle lenses and fisheye lenses will cause large picture distortions in the photos taken. And due to the setting angle of the first light-incident component,
  • the first image obtained often has the problem of image distortion.
  • Camera calibration is the process of eliminating the distortion a camera introduces because of the characteristics of its optical lens. Through camera calibration, the internal parameters, external parameters, and distortion parameters of the first light-incident component can be obtained.
  • These internal parameters, external parameters, and distortion parameters are then used to perform distortion elimination and row alignment on the first image, yielding an undistorted first front view.
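One common realization of the distortion-elimination step is to invert a radial (Brown) distortion model on normalized image coordinates. The sketch below is generic computer-vision practice rather than code or coefficients from this disclosure; `k1` and `k2` stand in for radial distortion parameters obtained by calibration.

```python
def distort_point(x, y, k1, k2):
    """Apply a radial (Brown) distortion model to a normalized point."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def undistort_point(xd, yd, k1, k2, iterations=20):
    """Invert the radial model by fixed-point iteration: start from the
    distorted point and repeatedly divide out the distortion factor."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x = xd / factor
        y = yd / factor
    return x, y
```

Running every pixel coordinate through such an inversion (followed by row alignment between the two views) produces the undistorted front view used in the following steps.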
  • Step A2: Correct the second image into a second front view according to the internal parameters, external parameters, and distortion parameters of the second light-incident component.
  • For details of this step, refer to step A1 above; they are not repeated here.
  • Accordingly, step 503 may also be implemented by determining the target disparity information between the first front view and the second front view.
  • Because the first front view and the second front view are free of distortion, more accurate target disparity information can be calculated from them, which in turn improves the accuracy of the target depth information derived from that disparity information.
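To make the disparity computation concrete, below is a toy sum-of-absolute-differences (SAD) block matcher over a rectified image pair. It is a didactic stand-in, not the matching algorithm of the embodiment; the window size, search range, and all names are illustrative.

```python
import numpy as np

def block_match_disparity(left, right, max_disp=4, win=1):
    """Brute-force SAD block matching on rectified grayscale images.

    For each left-image pixel, search the same row of the right image
    up to max_disp pixels to the left and keep the offset whose
    (2*win+1)^2 window has the smallest sum of absolute differences.
    """
    left = np.asarray(left, dtype=np.int64)
    right = np.asarray(right, dtype=np.int64)
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int64)
    L = np.pad(left, win, mode="edge")
    R = np.pad(right, win, mode="edge")
    k = 2 * win + 1
    for y in range(h):
        for x in range(w):
            patch_l = L[y:y + k, x:x + k]
            best_d, best_cost = 0, None
            for d in range(min(max_disp, x) + 1):
                patch_r = R[y:y + k, x - d:x - d + k]
                cost = int(np.abs(patch_l - patch_r).sum())
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a pair where the left view is the right view shifted by two pixels, the matcher recovers a disparity of 2 at the textured pixels; a real implementation would add sub-pixel refinement plus the mutual-match consistency check described earlier.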
  • In summary, the method for acquiring depth information includes: acquiring a target image generated by the image sensor from the first light and the second light; dividing the target image into a first image and a second image according to the effective-imaging-area calibration information of the first light-incident component and of the second light-incident component; determining the target disparity information between the first image and the second image; and determining the target depth information corresponding to the target disparity information according to the preset correspondence between disparity information and depth information.
  • The embodiment of the present invention uses the reflective component to reflect the two light rays output by the first light-incident component and the second light-incident component, thereby shortening the distance between the paths along which the two rays enter the image sensor and allowing both sets of incident light to be received by a single image sensor. In addition, the embodiment requires no extra functional modules such as a time-statistics chip, an RGB image sensor, or an infrared image sensor, so the production cost of the electronic device is lower and its cost-effectiveness is improved.
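The division of the single-sensor target image into the two per-path views can be as simple as slicing by the calibrated effective-imaging regions. The sketch below is purely illustrative; the rectangular `(top, bottom, left, right)` region format is an assumption, not the calibration format of the disclosed embodiment.

```python
import numpy as np

def split_target_image(target, region_a, region_b):
    """Split one sensor readout into the two per-path images.

    region_a / region_b -- (top, bottom, left, right) pixel bounds of
    the effective imaging areas calibrated for the two light-incident
    components.
    """
    ta, ba, la, ra = region_a
    tb, bb, lb, rb = region_b
    return target[ta:ba, la:ra], target[tb:bb, lb:rb]
```

The two sub-images returned here play the roles of the first image and the second image in the disparity computation above.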
  • An embodiment of the present invention also provides an electronic device including a processor, a memory, and a computer program stored in the memory and executable on the processor. When the computer program is executed by the processor, the steps of the depth information acquisition method provided by the embodiments of the present invention are implemented.
  • The processor may include, but is not limited to, a general-purpose processor, a dedicated processor, an application-specific processor, or a field-programmable processor.
  • The memory may include read-only memory (ROM), random access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, or other electrical, optical, or physical/tangible memory storage devices.
  • the various component embodiments of the present invention may be implemented in hardware, or implemented in software modules running on one or more processors, or implemented in a combination thereof.
  • In practice, a microprocessor or a digital signal processor (DSP) may be used to implement some or all of the functions of some or all of the components in the embodiments of the present invention.
  • The present invention may also be implemented as an apparatus or device program (for example, a computer program or a computer program product) for executing part or all of the methods described herein.
  • Such a program for realizing the present invention may be stored on a computer-readable medium, or may have the form of one or more signals.
  • Such a signal can be downloaded from an Internet website, or provided on a carrier signal, or provided in any other form.
  • An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the depth information acquisition method provided in the embodiments of the present invention.
  • the computer-readable medium may be a non-transitory computer-readable medium, and examples thereof include ROM, RAM, magnetic disks, or optical disks.
  • The electronic device provided herein is not inherently related to any particular computer, virtual system, or other equipment. Various general-purpose systems may also be used with the teachings herein; from the above description, the structure required to construct a system embodying the solution of the present invention is apparent.
  • the present invention is not directed to any specific programming language. It should be understood that various programming languages can be used to implement the content of the present invention described herein, and the above description of a specific language is for the purpose of disclosing the best embodiment of the present invention.
  • The modules, units, or components in the embodiments may be combined into one module, unit, or component, and may furthermore be divided into multiple sub-modules, sub-units, or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all of the features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all of the processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract, and drawings) may be replaced by an alternative feature serving the same, an equivalent, or a similar purpose.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present invention relates to the technical field of communications, and provides an electronic device and a depth information acquisition method. The electronic device comprises a first light-incident component, a second light-incident component, a reflective component, an image sensor, and a processor. The reflective component reflects a first light output by the first light-incident component toward the image sensor along a first path, and reflects a second light output by the second light-incident component toward the image sensor along a second path. The image sensor sends the processor a target image generated from the first light and the second light, and the processor calculates target depth information corresponding to the target image.
PCT/CN2020/111750 2019-08-28 2020-08-27 Electronic device and depth information acquisition method WO2021037141A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910804707.0 2019-08-28
CN201910804707.0A CN110533708A (zh) Electronic device and depth information acquisition method

Publications (1)

Publication Number Publication Date
WO2021037141A1 true WO2021037141A1 (fr) 2021-03-04

Family

ID=68664842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111750 WO2021037141A1 (fr) Electronic device and depth information acquisition method

Country Status (2)

Country Link
CN (1) CN110533708A (fr)
WO (1) WO2021037141A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110533708A (zh) * 2019-08-28 2019-12-03 维沃移动通信有限公司 Electronic device and depth information acquisition method
CN113132709B (zh) * 2019-12-31 2022-11-08 中移物联网有限公司 Binocular distance measurement apparatus, binocular distance measurement method, and electronic device
CN111402313B (zh) * 2020-03-13 2022-11-04 合肥的卢深视科技有限公司 Image depth recovery method and apparatus
CN111416948A (zh) * 2020-03-25 2020-07-14 维沃移动通信有限公司 Image processing method and electronic device
CN112511731A (zh) * 2020-12-17 2021-03-16 南昌欧菲光电技术有限公司 Camera module and electronic device
CN113052898B (zh) * 2021-04-08 2022-07-12 四川大学华西医院 Real-time localization method for point clouds and strongly reflective targets based on an active binocular camera
CN114383564A (zh) * 2022-01-11 2022-04-22 平安普惠企业管理有限公司 Depth measurement method, apparatus, device, and storage medium based on a binocular camera

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080247022A1 (en) * 2007-04-03 2008-10-09 Seiko Epson Corporation Light source device and projector
CN102999939A (zh) * 2012-09-21 2013-03-27 魏益群 Coordinate acquisition apparatus, real-time three-dimensional reconstruction system and method, and stereoscopic interaction device
CN105578019A (zh) * 2014-08-15 2016-05-11 光宝科技股份有限公司 Image extraction system capable of obtaining depth information and focusing method
CN106226977A (zh) * 2016-08-24 2016-12-14 深圳奥比中光科技有限公司 Laser projection module, image acquisition system, and control method and apparatus thereof
CN107135388A (zh) * 2017-05-27 2017-09-05 东南大学 Depth extraction method for light field images
CN110533708A (zh) * 2019-08-28 2019-12-03 维沃移动通信有限公司 Electronic device and depth information acquisition method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105530431A (zh) * 2015-12-16 2016-04-27 景好 Reflective panoramic imaging system and method
CN107968902A (zh) * 2016-10-20 2018-04-27 上海富瀚微电子股份有限公司 Panoramic camera based on a single image sensor and implementation method thereof
CN106525004A (zh) * 2016-11-09 2017-03-22 人加智能机器人技术(北京)有限公司 Binocular stereo vision system and depth measurement method
CN107369172B (zh) * 2017-07-14 2021-07-09 上海肇观电子科技有限公司 Smart device and method for outputting a depth image
CN207963848U (zh) * 2018-03-12 2018-10-12 武汉大学 Telescopic distance measurement system based on binocular vision
CN109525830A (zh) * 2018-11-28 2019-03-26 浙江未来技术研究院(嘉兴) Stereoscopic video acquisition system
CN110139012A (zh) * 2019-05-28 2019-08-16 成都易瞳科技有限公司 Dual-fisheye panoramic image acquisition apparatus


Also Published As

Publication number Publication date
CN110533708A (zh) 2019-12-03

Similar Documents

Publication Publication Date Title
WO2021037141A1 (fr) Electronic device and depth information acquisition method
US11856291B2 (en) Thin multi-aperture imaging system with auto-focus and methods for using same
US10477185B2 (en) Systems and methods for multiscopic noise reduction and high-dynamic range
EP3389268B1 (fr) Depth information acquisition method and apparatus, and image collection device
US10630884B2 (en) Camera focusing method, apparatus, and device for terminal
CN109712192B (zh) Camera module calibration method and apparatus, electronic device, and computer-readable storage medium
EP3480648B1 (fr) Adaptive three-dimensional imaging system
CN102143305B (zh) Imaging method and system
TWI761684B (zh) Calibration method of an image device and related image device and computing device
US10821911B2 (en) Method and system of camera focus for advanced driver assistance system (ADAS)
US11022858B2 (en) Multiple camera apparatus and method for synchronized autofocus
US10904512B2 (en) Combined stereoscopic and phase detection depth mapping in a dual aperture camera
WO2018001252A1 (fr) Projection unit and photographing apparatus comprising said projection unit, processor, and imaging device
TWI595444B (zh) Image capture device, and method of generating depth information and method of automatic calibration thereof
JP5857712B2 (ja) Stereo image generation device, stereo image generation method, and computer program for stereo image generation
WO2018076529A1 (fr) Scene depth calculation method, device, and terminal
CN106657982B (zh) Camera module image sharpness calibration method and device
KR102031485B1 (ko) Multi-view image acquisition apparatus and method using a 360-degree camera and plane mirrors
US20140354875A1 (en) Image capturing apparatus and control method therefor
CN109923585B (zh) Method and apparatus for depth detection using stereo images
CN115314698A (zh) Stereoscopic capture and display apparatus and method
TWM516732U (zh) Focusing system covering depth of field

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20856323

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20856323

Country of ref document: EP

Kind code of ref document: A1