CN110533708A - Electronic device and depth information acquisition method - Google Patents

Electronic device and depth information acquisition method

Info

Publication number
CN110533708A
CN110533708A (application CN201910804707.0A)
Authority
CN
China
Prior art keywords
light
optical assembly
camera lens
reflective mirror
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910804707.0A
Other languages
Chinese (zh)
Inventor
Zhang Rongxiang (张荣祥)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910804707.0A priority Critical patent/CN110533708A/en
Publication of CN110533708A publication Critical patent/CN110533708A/en
Priority to PCT/CN2020/111750 priority patent/WO2021037141A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/514Depth or shape recovery from specularities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present invention provides an electronic device and a depth information acquisition method, relating to the field of communication technology. The electronic device comprises a first light-entry assembly, a second light-entry assembly, a reflection assembly, an image sensor and a processor. The reflection assembly reflects the first light output by the first light-entry assembly to the image sensor along a first path, and reflects the second light output by the second light-entry assembly to the image sensor along a second path. The image sensor sends a target image generated from the first light and the second light to the processor, and the processor calculates the target depth information corresponding to the target image. By using the reflection assembly to redirect the two lights output by the two light-entry assemblies, the embodiments of the invention receive both groups of incident light with a single image sensor. Moreover, the embodiments of the invention require no additional functional modules, so the production cost of the electronic device is lower and the cost-performance ratio is improved.

Description

Electronic device and depth information acquisition method
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to an electronic device and a depth information acquisition method.
Background art
With the popularization of electronic devices and the continuous development of their camera functions, the camera has become a standard module of electronic devices. Depth information plays an important role in realizing camera functions: it enables technologies such as ranging and background blurring of portraits, and can also be used to optimize picture quality.
In the prior art, depth information is usually obtained in one of three ways. In the first way, time-of-flight (TOF) ranging is used: a light-pulse emitter continuously sends light pulses toward the target object, a sensor then receives the pulses returned from the target object, and a timing chip measures the round-trip flight time of the pulses to obtain the depth information of the target object. In the second way, depth information is acquired with a laser-speckle camera configured with both an RGB (red, green, blue) image sensor and an infrared image sensor. In the third way, depth information is acquired with a binocular camera module: using the principle of binocular parallax, the information captured by two image sensors is registered to obtain parallax information, and the depth information is derived from the relationship between parallax and depth.
However, in current schemes, the first way requires a costly timing chip, the second way requires an additional RGB image sensor and infrared image sensor, and the third way requires two image sensors. All three ways therefore suffer from high cost and a low cost-performance ratio.
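For background, the parallax-depth relationship that the third way exploits can be written explicitly; this is the standard parallel-stereo (pinhole) relation, stated here for reference rather than quoted from the application:

```latex
% Two parallel cameras with focal length f, separated by baseline b,
% observe the same scene point; x_l and x_r are its horizontal image
% coordinates in the two views.
d = x_l - x_r, \qquad Z = \frac{f\,b}{d}
% Depth Z is inversely proportional to the parallax d, so a preset
% d -> Z correspondence can be tabulated once f and b are calibrated.
```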
Summary of the invention
Embodiments of the present invention provide an electronic device and a depth information acquisition method, to solve the problems of high production cost and low cost-performance ratio of electronic devices in the prior art.
In a first aspect, an embodiment of the present invention provides an electronic device, comprising:
a first light-entry assembly, a second light-entry assembly, a reflection assembly, an image sensor and a processor;
the reflection assembly and the image sensor are arranged on the same side of the first light-entry assembly and the second light-entry assembly;
the reflection assembly is configured to reflect the first light output by the first light-entry assembly to the image sensor along a first path, and to reflect the second light output by the second light-entry assembly to the image sensor along a second path;
the image sensor is configured to generate a target image from the first light and the second light, and to send the target image to the processor;
the processor is configured to calculate the target depth information of the target image.
In a second aspect, an embodiment of the present invention provides a depth information acquisition method, the method comprising:
obtaining a target image generated by an image sensor from a first light and a second light;
dividing the target image into a first image and a second image according to the effective-imaging-region calibration information of the first light-entry assembly and the effective-imaging-region calibration information of the second light-entry assembly;
obtaining target parallax information between the first image and the second image;
determining the target depth information corresponding to the target parallax information according to a preset correspondence between parallax information and depth information.
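The steps of the second aspect can be sketched as follows. This is a minimal illustration, assuming the target image is a plain row-major list of pixel rows and each effective imaging region is an (x_start, x_end) column range; every name below is illustrative and not taken from the application.

```python
def split_target_image(target_image, region_a, region_b):
    """Dividing step: cut the single sensor frame into the two
    sub-images defined by the effective-imaging-region calibration
    information of the two light-entry assemblies."""
    first = [row[region_a[0]:region_a[1]] for row in target_image]
    second = [row[region_b[0]:region_b[1]] for row in target_image]
    return first, second

def disparity_to_depth(disparity_px, focal_px, baseline_mm):
    """Determining step: map a parallax value to depth through the
    preset correspondence Z = f * b / d of parallel stereo geometry."""
    if disparity_px == 0:
        return float("inf")  # zero parallax: point at infinity
    return focal_px * baseline_mm / disparity_px

# Example: a 2x6 frame whose left half comes from the first assembly
# and whose right half comes from the second assembly.
frame = [[0, 1, 2, 3, 4, 5],
         [0, 1, 2, 3, 4, 5]]
left, right = split_target_image(frame, (0, 3), (3, 6))
```

Under this model, with an assumed focal length of 800 px and a baseline of 10 mm, a 4-pixel parallax would map to a depth of 2000 mm.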
In a third aspect, an embodiment of the present invention further provides an electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the depth information acquisition method provided by the present invention.
In a fourth aspect, an embodiment of the present invention further provides a readable storage medium; when the instructions in the storage medium are executed by the processor of an electronic device, the electronic device is enabled to carry out the steps of the depth information acquisition method provided by the present invention.
In embodiments of the present invention, the electronic device comprises a first light-entry assembly, a second light-entry assembly, a reflection assembly, an image sensor and a processor. The reflection assembly and the image sensor are arranged on the same side of the two light-entry assemblies. The reflection assembly reflects the first light output by the first light-entry assembly to the image sensor along a first path, and reflects the second light output by the second light-entry assembly to the image sensor along a second path. The image sensor sends a target image generated from the first light and the second light to the processor, and the processor calculates the target depth information corresponding to the target image. By using the reflection assembly to redirect the two lights output by the two light-entry assemblies, the spacing between their incident paths at the image sensor is shortened, so that both groups of incident light are received by a single image sensor. Moreover, the embodiments of the present invention require no additional functional modules such as a timing chip, an RGB image sensor or an infrared image sensor, so the production cost of the electronic device is lower and the cost-performance ratio is improved.
Brief description of the drawings
Fig. 1 is a structural block diagram of an electronic device provided by an embodiment of the present invention;
Fig. 2 is a structure diagram of an electronic device provided by an embodiment of the present invention;
Fig. 3 is a structure diagram of another electronic device provided by an embodiment of the present invention;
Fig. 4 is a structure diagram of another electronic device provided by an embodiment of the present invention;
Fig. 5 is a flow chart of the steps of a depth information acquisition method provided by an embodiment of the present invention.
Detailed description of the embodiments
Exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the present invention, it should be understood that the present invention may be realized in various forms and should not be limited by the embodiments set forth here. On the contrary, these embodiments are provided so that the present invention can be thoroughly understood and its scope fully conveyed to those skilled in the art.
Fig. 1 is a structural block diagram of an electronic device provided by an embodiment of the present invention. As shown in Fig. 1, the electronic device 1 may include: a first light-entry assembly 10, a second light-entry assembly 20, a reflection assembly 30, an image sensor 40 and a processor 50. The reflection assembly 30 and the image sensor 40 are arranged on the same side of the first light-entry assembly 10 and the second light-entry assembly 20. The reflection assembly 30 reflects the first light AB output by the first light-entry assembly 10 to the image sensor 40 along a first path EF, and reflects the second light CD output by the second light-entry assembly 20 to the image sensor 40 along a second path GH. The image sensor 40 sends a target image generated from the first light AB and the second light CD to the processor 50, and the processor 50 calculates the target depth information corresponding to the target image.
Optionally, the vertical distance between the first light-entry assembly 10 and the second light-entry assembly 20 is a first preset distance a; the first path EF and the second path GH are parallel to each other, the vertical distance between the first path EF and the second path GH is a second preset distance b, and the second preset distance b is smaller than the first preset distance a.
In embodiments of the present invention, following the thin-and-light design concept of current electronic devices, the imaging device also needs to be designed light enough to avoid occupying excessive space in the electronic device.
Specifically, to obtain depth information, the imaging device needs two light paths that can guide incident light, so as to obtain two groups of light with a certain parallax. In the embodiment of the present invention, the two groups of light are acquired through the first light-entry assembly 10 and the second light-entry assembly 20. Moreover, to meet the precision requirement of the acquired depth information, a certain vertical distance is needed between the two groups of light; therefore, the first light-entry assembly 10 and the second light-entry assembly 20 may be spaced apart by the first preset distance a.
Further, the light-receiving surface of the image sensor 40 has a limited length, usually smaller than the first preset distance a. Therefore, to converge the two groups of light obtained by the first light-entry assembly 10 and the second light-entry assembly 20 onto the light-receiving surface of the image sensor 40, the embodiment of the present invention reflects the two groups of light through the reflection assembly 30, forming a group of mutually parallel reflected lights spaced by the second preset distance b that converge onto the light-receiving surface of the image sensor 40. Specifically, the reflection assembly 30 reflects the first light AB output by the first light-entry assembly 10 to the image sensor 40 along the first path EF, and reflects the second light CD output by the second light-entry assembly 20 to the image sensor 40 along the second path GH. Because of the light reflection, the second preset distance b between the first path EF and the second path GH is smaller than the first preset distance a, which achieves the purpose of receiving the two groups of incident light with only one image sensor 40.
Further, after receiving the lights incident along the first path EF and the second path GH, the image sensor 40 can generate a target image according to the optical information of the two lights and send the target image to the processor 50.
In embodiments of the present invention, the first light-entry assembly 10 and the second light-entry assembly 20 each have corresponding effective-imaging-region calibration information, which is determined by the hardware characteristics of the corresponding assembly. Using the effective-imaging-region calibration information, the processor 50 can distinguish, in the target image, the region corresponding to the first light-entry assembly 10 from the region corresponding to the second light-entry assembly 20, and thereby divide the target image into two independent images. One of the two can be understood as the image generated through the first light-entry assembly 10 and the other as the image generated through the second light-entry assembly 20, and there is a certain parallax between them. The processor 50 performs a registration calculation on the two independent images to obtain the parallax information and, according to the preset mapping between parallax information and depth information, obtains the target depth information corresponding to the target image.
In embodiments of the present invention, the reflection operation performed by the reflection assembly 30 on the two lights output by the first light-entry assembly 10 and the second light-entry assembly 20 shortens the spacing between the incident paths of the two lights at the image sensor 40, so that both groups of incident light are received by a single image sensor 40 without increasing the number of image sensors or the length of the light-receiving surface. Moreover, the embodiment of the present invention requires no additional functional modules such as a timing chip, an RGB image sensor or an infrared image sensor, so the production cost of the electronic device is lower and the cost-performance ratio is improved. In addition, since the electronic device in the embodiment of the present invention may have only one image sensor, the incidence of the two groups of light can be realized on a single image sensor; because the two groups of light are incident on the same image sensor, no frame-synchronization calculation is needed, which solves the image frame-synchronization problem caused by using two image sensors in the prior art.
In conclusion a kind of electronic equipment provided in an embodiment of the present invention, comprising: first enters optical assembly, second enters light group Part, reflection subassembly, imaging sensor and processor;Enter optical assembly and second first and enter the same side of optical assembly, is provided with anti- Penetrate component and imaging sensor;The first light that reflection subassembly enters optical assembly output for first, reflexes to image along first path Sensor, the second light that reflection subassembly enters optical assembly output for second, along the second multipath tolerant to imaging sensor;Image passes Sensor will be sent to processor according to the target image of the first light and the generation of the second light;It is corresponding that processor calculates target image Target depth information, the embodiment of the present invention using reflection subassembly to first enter optical assembly and second enter optical assembly output two The reflective operation of light shortens the spacing between the incident path that this two light are incident in imaging sensor, reaches The purpose of two groups of incident rays is only received by imaging sensor, also, when the embodiment of the present invention does not need additionally to increase Between count chip, RGB image sensor, the functional modules such as infrared image sensor so that the production cost of electronic equipment compared with It is low, improve cost performance.
Optionally, referring to Fig. 2, which shows the structure diagram of an electronic device provided by an embodiment of the present invention: the first light-entry assembly 10 includes a first lens 101 and a first reflective mirror 102; the second light-entry assembly 20 includes a second lens 201 and a second reflective mirror 202; the reflection assembly 30 includes a third reflective mirror 301 and a fourth reflective mirror 302, which are perpendicular to each other. The vertical distance between the first lens 101 and the second lens 201 is the first preset distance a, and the light path of the first lens 101 and the light path of the second lens 201 are parallel to each other. The first reflective mirror 102 and the third reflective mirror 301 are parallel to each other, as are the second reflective mirror 202 and the fourth reflective mirror 302. The first reflective mirror 102 reflects the first light output by the first lens 101 to the third reflective mirror 301, and the second reflective mirror 202 reflects the second light output by the second lens 201 to the fourth reflective mirror 302; the third reflective mirror 301 reflects the first light to the image sensor 40 along the first path, and the fourth reflective mirror 302 reflects the second light to the image sensor 40 along the second path.
Specifically, the first lens 101 and the second lens 201 may be conventional lenses. When setting the positions of the first lens 101 and the second lens 201, their light paths need to be parallel to each other; that is, the light output by the first lens 101 and the light output by the second lens 201 need to be parallel. The first reflective mirror 102, the second reflective mirror 202, the third reflective mirror 301 and the fourth reflective mirror 302 can all reflect light. By arranging the first reflective mirror 102 parallel to the third reflective mirror 301, the second reflective mirror 202 parallel to the fourth reflective mirror 302, and the third reflective mirror 301 perpendicular to the fourth reflective mirror 302, the mirror reflections can deliver the two lights output by the first lens 101 and the second lens 201 at a larger spacing to different regions of the image sensor 40 along paths with a shorter spacing, and the image sensor 40 can generate, from the optical information of the lights falling on the different regions, a target image capable of reflecting the parallax information.
In embodiments of the present invention, the light-path structure of the electronic device shown in Fig. 2 is similar to a periscope light-path structure: two lenses spaced at a relatively large distance respectively obtain two lights with a certain parallax, and the reflection assembly merges the two lights into one group of parallel lights with a shorter spacing, which is incident on the image sensor and irradiates the photosensitive surface of the same image sensor. This simplifies the design of the light path, reduces the number of image sensors, and saves cost.
Optionally, referring to Fig. 3, which shows the structure diagram of another electronic device provided by an embodiment of the present invention: the first light-entry assembly 10 further includes a first guide rail 103, and the second light-entry assembly 20 further includes a second guide rail 203. The first lens 101 is fixedly connected to the first reflective mirror 102, and the second lens 201 is fixedly connected to the second reflective mirror 202. The first guide rail 103 is arranged between the first lens 101 and the image sensor 40, and the second guide rail 203 is arranged between the second lens 201 and the image sensor 40. The first reflective mirror 102 or the first lens 101 is arranged on, and moves along, the first guide rail 103; the second reflective mirror 202 or the second lens 201 is arranged on, and moves along, the second guide rail 203.
In embodiments of the present invention, the first guide rail 103 may be perpendicular to the light path of the first lens 101, and the second guide rail 203 may be perpendicular to the light path of the second lens 201. The first lens 101 and the first reflective mirror 102 may be fixed together on a first base, keeping their relative position fixed, and the first base may be movably arranged on, and move along, the first guide rail 103. Likewise, the second lens 201 and the second reflective mirror 202 may be fixed together on a second base, keeping their relative position fixed, and the second base may be movably arranged on, and move along, the second guide rail 203.
By adjusting the position of the first base on the first guide rail 103 and/or the position of the second base on the second guide rail 203, the spacing between the first light-entry assembly 10 and the second light-entry assembly 20 can be adjusted, which achieves the purpose of adjusting the parallax value of the incident light and enables the electronic device to cover scenes with different depth-information ranges.
In addition, by adjusting the spacing between the first light-entry assembly 10 and the second light-entry assembly 20, the image sensor 40 can generate target images reflecting different parallax information, so that the processor 50 obtains different target depth information for the different target images. The processor 50 can also perform fusion calculation on the different target depth information and finally output one fused depth-information image. After fusion, finer depth information with a wider range is available, which enlarges the depth range and precision that the electronic device can measure.
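The fusion described above can be illustrated with a minimal per-pixel merge of two depth maps captured at different lens spacings: a narrow baseline resolves close depths better and a wide baseline resolves far depths better. The merge rule below (trust the near-range map under a single crossover depth) is an assumed policy for illustration, not the fusion algorithm of the application.

```python
def fuse_depth_maps(near_map, far_map, crossover):
    """Per-pixel merge of two depth maps (lists of rows of depths):
    keep the near-range estimate where it is below `crossover`,
    otherwise fall back to the far-range estimate."""
    fused = []
    for n_row, f_row in zip(near_map, far_map):
        fused.append([n if n < crossover else f
                      for n, f in zip(n_row, f_row)])
    return fused
```

A real implementation would also weight by per-pixel confidence, but the one-threshold rule is enough to show how two baselines extend the measurable depth range.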
Optionally, referring to Fig. 4, which shows the structure diagram of another electronic device provided by an embodiment of the present invention: the first light-entry assembly 10 includes a third lens 104; the second light-entry assembly 20 includes a fourth lens 204; the reflection assembly 30 includes a fifth reflective mirror 303 and a sixth reflective mirror 304. The vertical distance between the third lens 104 and the fourth lens 204 is the first preset distance a, the light path of the third lens 104 and the light path of the fourth lens 204 are parallel to each other, and the fifth reflective mirror 303 and the sixth reflective mirror 304 are parallel to each other. The vertical distance between the third lens 104 and the image sensor 40 is greater than the vertical distance between the fourth lens 204 and the image sensor 40. The fifth reflective mirror 303 is arranged on one side of the light-path direction of the third lens 104 and overlaps the light path of the third lens 104; the sixth reflective mirror 304 is arranged on one side of the light-path direction of the fourth lens 204 and overlaps the light path of the fourth lens 204. The fifth reflective mirror 303 reflects the first light output by the third lens 104 to the image sensor 40 along the first path, and the sixth reflective mirror 304 reflects the second light output by the fourth lens 204 to the image sensor 40 along the second path.
In embodiments of the present invention, the light-path structure of the electronic device can be further simplified, so that with only two lenses and their two corresponding reflective mirrors, the electronic device obtains two lights with a certain parallax, merges them through the reflection assembly into one group of parallel lights with a shorter spacing incident on the image sensor, and irradiates them on the photosensitive surface of the same image sensor. This further simplifies the design of the light path, reduces the number of image sensors, and saves cost.
In addition, since the electronic device in the embodiment of the present invention may have only one image sensor, the incidence of the two groups of light can be realized on a single image sensor. Because the two groups of light are incident on the same image sensor, no frame-synchronization calculation is needed, which solves the image frame-synchronization problem caused by using two image sensors in the prior art.
In conclusion a kind of electronic equipment provided in an embodiment of the present invention, comprising: first enters optical assembly, second enters light group Part, reflection subassembly, imaging sensor and processor;Enter optical assembly and second first and enter the same side of optical assembly, is provided with anti- Penetrate component and imaging sensor;The first light that reflection subassembly enters optical assembly output for first, reflexes to image along first path Sensor, the second light that reflection subassembly enters optical assembly output for second, along the second multipath tolerant to imaging sensor;Image passes Sensor will be sent to processor according to the target image of the first light and the generation of the second light;It is corresponding that processor calculates target image Target depth information, the embodiment of the present invention using reflection subassembly to first enter optical assembly and second enter optical assembly output two The reflective operation of light shortens the spacing between the incident path that this two light are incident in imaging sensor, reaches The purpose of two groups of incident rays is only received by imaging sensor, also, when the embodiment of the present invention does not need additionally to increase Between count chip, RGB image sensor, the functional modules such as infrared image sensor so that the production cost of electronic equipment compared with It is low, improve cost performance.
Fig. 5 is a flow chart of the steps of a depth information acquisition method provided by an embodiment of the present invention. The method is applied to the processor in the electronic device. As shown in Fig. 5, the method may include:
Step 501: obtain the target image generated by the image sensor from the first light and the second light.
In this step, after receiving the first light incident along the first path and the second light incident along the second path, the image sensor can generate a target image according to the optical information of the two lights; the target image can reflect the parallax information generated by the spacing between the first path and the second path.
Step 502: divide the target image into a first image and a second image according to the effective-imaging-region calibration information of the first light-entry assembly and the effective-imaging-region calibration information of the second light-entry assembly.
In embodiments of the present invention, the first light-entry assembly and the second light-entry assembly each have corresponding effective-imaging-region calibration information, which is determined by the hardware characteristics of the corresponding assembly. Using the calibration information, the processor can distinguish, in the target image, the region corresponding to the first light-entry assembly from the region corresponding to the second light-entry assembly, and thereby divide the target image into a mutually independent first image and second image. The first image can be understood as the image generated through the first light-entry assembly, and the second image as the image generated through the second light-entry assembly.
It should be noted that the effective imaging region calibration information of a light-entry assembly is a parameter set when the assembly leaves the factory. It can therefore be obtained by consulting the factory instructions of the light-entry assembly, or by consulting the official website of the electronic equipment for the effective imaging region calibration information of the light-entry assemblies in that equipment.
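As a minimal sketch of the division in step 502 (the rectangular-region format is an assumption; the patent does not specify how the calibration information is encoded), the single-sensor target image can be split into the two per-assembly images like this:

```python
def crop(image, region):
    """Crop a rectangular region (x, y, width, height) out of a
    row-major image given as a list of rows."""
    x, y, w, h = region
    return [row[x:x + w] for row in image[y:y + h]]

def split_target_image(image, region_a, region_b):
    """Divide the single-sensor target image into the first and second
    images using each assembly's effective-imaging-region calibration."""
    return crop(image, region_a), crop(image, region_b)

# Toy 4x8 "image": left half exposed via assembly 1, right half via assembly 2.
target = [[c + 8 * r for c in range(8)] for r in range(4)]
first, second = split_target_image(target, (0, 0, 4, 4), (4, 0, 4, 4))
```

In a real device the two regions would come from the factory calibration described above rather than being hard-coded.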
Step 503: obtain the target parallax information between the first image and the second image.
In this step, because the vertical distance between the first light-entry assembly and the second light-entry assembly is the first preset distance, there is a certain parallax between the first image and the second image. The processor performs a registration calculation on the two independent images, from which the target parallax information between them can be computed.
Specifically, the registration calculation can be a binocular ranging calculation, whose effect is to match the pixels corresponding to the same scene point in the left and right views (i.e., the first image and the second image); the purpose of this arrangement is to obtain the target parallax value. After the target parallax value is obtained, the target depth information can be further calculated.
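A minimal pure-Python sketch of such a matching step, using window-based block matching on one scanline (real binocular ranging would operate on calibrated, rectified images, typically via a library such as OpenCV; the window and disparity limits here are illustrative):

```python
def row_disparity(left, right, window=1, max_disp=4):
    """For each pixel of a left-view scanline, find the horizontal shift
    into the right-view scanline that minimises the sum of absolute
    differences over a small window; return that shift as the disparity."""
    n = len(left)
    disp = []
    for x in range(n):
        best_d, best_cost = 0, float("inf")
        for d in range(min(max_disp, x) + 1):  # candidate shifts
            cost = sum(
                abs(left[x + k] - right[x - d + k])
                for k in range(-window, window + 1)
                if 0 <= x + k < n and 0 <= x - d + k < n
            )
            if cost < best_cost:
                best_d, best_cost = d, cost
        disp.append(best_d)
    return disp

# A step edge that appears 2 pixels further right in the left view:
# its disparity at the edge should be 2.
left = [0, 0, 0, 9, 9, 9, 9, 9]
right = [0, 9, 9, 9, 9, 9, 9, 9]
disp = row_disparity(left, right)
```

In textureless regions the match is ambiguous (the cost is zero for several shifts), which is exactly why the patent follows the coarse registration with a cross-checking fine registration.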
Optionally, step 503 may specifically include the following sub-steps.
Sub-step 5031: perform a coarse registration calculation on the first image and the second image to obtain a first disparity map and a second disparity map.
In the embodiment of the present invention, a pixel-by-pixel coarse registration calculation can be performed on the first image and the second image to obtain the first disparity map and the second disparity map. Specifically, the coarse registration calculation can be a binocular ranging calculation comprising: taking the first image as the reference view and registering the second image against it, obtaining the first disparity map based on the first image; and taking the second image as the reference view and registering the first image against it, obtaining the second disparity map based on the second image.
Sub-step 5032: perform a fine registration calculation on the first disparity map and the second disparity map to obtain the target parallax information.
In this step, the fine registration calculation comprises: establishing a mutual matching template containing the pixels of the first disparity map and the second disparity map, and then cross-checking each pixel of the first disparity map against the corresponding pixel of the second disparity map. When the similarity between two corresponding pixels is greater than or equal to a preset threshold, the cross-check between them is considered valid and the corresponding position parameter in the mutual matching template is set to 1; for pixels whose cross-check is invalid, the corresponding position parameter is set to 0. A mutual matching template map is thereby obtained.
After the mutual matching template map is obtained, the pixels of the first disparity map or of the second disparity map can be corrected according to the values of the position parameters in the template map, yielding after correction a fine disparity map that contains the target parallax information.
For example, the pixel values of the first disparity map are corrected according to the position parameters in the mutual matching template map as follows: if the position parameter of a pixel is 0 but the value of the corresponding pixel in the first disparity map is non-zero, that pixel value is changed to 0; if the position parameter is 1 but the corresponding pixel value is 0, that pixel value is changed to 1. Once all pixels of the first disparity map have been corrected, a fine disparity map containing the target parallax information is obtained.
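The cross-check above resembles a classical left-right consistency check. A sketch under the assumption that "similarity" means the two disparity maps agreeing within a threshold (only the zeroing branch of the patent's correction rule is shown):

```python
def mutual_matching_template(disp_lr, disp_rl, threshold=1):
    """Mark each pixel 1 where the left-based and right-based disparity
    maps agree within `threshold` (a valid cross-check), else 0."""
    return [
        [1 if abs(a - b) <= threshold else 0 for a, b in zip(row_l, row_r)]
        for row_l, row_r in zip(disp_lr, disp_rl)
    ]

def refine(disp, template):
    """Zero out disparities that failed the cross-check; keep the rest."""
    return [
        [d if m == 1 else 0 for d, m in zip(drow, mrow)]
        for drow, mrow in zip(disp, template)
    ]

disp_lr = [[2, 2, 5, 2]]   # one outlier (5) from a mismatch
disp_rl = [[2, 2, 2, 2]]
template = mutual_matching_template(disp_lr, disp_rl)
fine = refine(disp_lr, template)
```

The invalidated pixel is set to 0 in the fine disparity map, leaving only mutually consistent parallax values.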
Step 504: determine the target depth information corresponding to the target parallax information according to the preset correspondence between parallax information and depth information.
In the embodiment of the present invention, the electronic equipment can be calibrated in advance to obtain the preset correspondence between parallax information and depth information. After the target parallax information is obtained, the target depth information corresponding to it can be determined on the basis of this correspondence.
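For a rectified stereo pair, the classical parallax-to-depth correspondence is depth = focal length x baseline / disparity. A sketch (the numeric values are illustrative, not taken from the patent):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Classical pinhole-stereo relation Z = f * B / d.
    disparity_px: parallax in pixels; focal_px: focal length in pixels;
    baseline_mm: distance between the two light-entry apertures in mm."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

# Illustrative numbers: 1000 px focal length, 20 mm baseline, 4 px parallax.
z_mm = depth_from_disparity(4, 1000, 20)
```

In this equipment the effective baseline is the first preset distance between the two light-entry assemblies, and the calibration described above absorbs the exact constants into the stored correspondence.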
Optionally, before step 503, the method can also include the following steps.
Step A1: correct the first image into a first front view according to the intrinsic parameters, extrinsic parameters, and distortion parameters of the first light-entry assembly.
In the embodiment of the present invention, the camera lens of the first light-entry assembly is usually assembled from multiple lens elements with arc-shaped surfaces, and a certain assembly error is introduced when the lens is assembled, so lens distortion arises during shooting. Lens distortion is the general name for the perspective distortions inherent in optical lenses, i.e., distortion caused by the refraction of transmitted light; wide-angle lenses and fisheye lenses, for example, both produce noticeably distorted pictures. The mounting angle of the first light-entry assembly also contributes to this problem.

Therefore, the first image obtained usually exhibits picture distortion. Camera calibration is the process of eliminating the distortion that the characteristics of the optical lens introduce into the image; through camera calibration, the intrinsic parameters, extrinsic parameters, and distortion parameters of the first light-entry assembly can be obtained.

In this step, after the first image is acquired, distortion elimination and row alignment are applied to it using the intrinsic parameters, extrinsic parameters, and distortion parameters of the first light-entry assembly obtained by camera calibration, yielding a distortion-free first front view.
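The patent does not specify the distortion model; a common choice that such a distortion-elimination step would invert is the one-term radial (Brown-Conrady) model, sketched here with an assumed coefficient `k1` and principal point (the inverse mapping used for actual correction is usually solved iteratively):

```python
def apply_radial(x, y, cx, cy, k1):
    """Forward one-term radial distortion model: a point at radius r
    from the principal point (cx, cy) is scaled by (1 + k1 * r^2).
    Distortion elimination inverts this mapping."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2
    return cx + dx * scale, cy + dy * scale

# Principal point at the origin; a point on the x-axis at radius 10
# is pushed outward by the positive radial term.
xu, yu = apply_radial(10.0, 0.0, 0.0, 0.0, 1e-3)
```

Higher-order radial and tangential terms, and the extrinsic rotation used for row alignment, are omitted from this sketch.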
Step A2: correct the second image into a second front view according to the intrinsic parameters, extrinsic parameters, and distortion parameters of the second light-entry assembly.
For this step, refer to step A1 above; details are not repeated here.
Optionally, step 503 can also be implemented by determining the target parallax information between the first front view and the second front view.
In the embodiment of the present invention, more accurate target parallax information can be calculated from the distortion-free first front view and second front view, which further improves the precision of the target depth information obtained from the target parallax information.
In conclusion a kind of depth information acquisition method provided in an embodiment of the present invention, comprising: obtain imaging sensor root The target image generated according to the first light and the second light;The effective imaging region calibration information for entering optical assembly according to first, with And second enter optical assembly effective imaging region calibration information, target image is divided into the first image and the second image;It determines Target parallax information between first image and the second image;According to target parallax information and preset parallax information and depth Corresponding relationship between information determines the corresponding target depth information of target parallax information., the embodiment of the present invention utilize reflection group Part enters optical assembly and second to first and enters the reflective operation for two light that optical assembly exports, and shortens this two light and is incident to The spacing between incident path in imaging sensor has reached and has only received two groups of incident rays by an imaging sensor Purpose, also, the embodiment of the present invention does not need additionally to increase time statistics chip, RGB image sensor, infrared image sensor Equal functional modules, so that the lower production costs of electronic equipment, improve cost performance.
As for the apparatus embodiment described above, since it is basically similar to the method embodiment, the description is relatively simple; for relevant details, refer to the corresponding parts of the method embodiment.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments can be referred to one another.
It will readily occur to those skilled in the art that any combination of the above embodiments is feasible, so any combination of the above embodiments is an embodiment of the present invention; owing to space limitations, this specification does not detail each combination individually.
The electronic equipment provided herein is not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems can also be used with the teachings herein. As described above, the structure required by a system constructed according to the present scheme is obvious. Moreover, the present invention is not directed to any particular programming language. It should be understood that the content of the invention described herein can be realized in various programming languages, and the description of a specific language above discloses the preferred embodiment of the invention.

Numerous specific details are set forth in the description provided here. It is to be understood, however, that embodiments of the invention can be practiced without these specific details. In some instances, well-known methods, structures, and techniques are not shown in detail, so as not to obscure the understanding of this specification.

Similarly, it should be understood that, in order to simplify the present disclosure and aid in understanding one or more of the various inventive aspects, the features of the invention are sometimes grouped together into a single embodiment, figure, or description thereof in the description of exemplary embodiments above. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into that detailed description, with each claim standing on its own as a separate embodiment of the invention.

Those skilled in the art will understand that the modules in the device of an embodiment can be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units, or components of an embodiment can be combined into one module, unit, or component, and they can furthermore be divided into a plurality of sub-modules, sub-units, or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all processes or units of any method or apparatus so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract, and drawings) can be replaced by an alternative feature serving the same, an equivalent, or a similar purpose.

In addition, those skilled in the art will appreciate that, although some embodiments described herein include certain features included in other embodiments but not other features, combinations of features of different embodiments are meant to be within the scope of the invention and to form different embodiments. For example, in the claims, any one of the claimed embodiments can be used in any combination.

The various component embodiments of the invention can be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or a digital signal processor (DSP) can be used in practice to realize some or all of the functions of some or all of the components according to embodiments of the invention. The invention can also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for executing some or all of the method described herein. Such a program realizing the invention may be stored on a computer-readable medium, or may take the form of one or more signals; such a signal may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.

It should be noted that the above embodiments illustrate rather than limit the invention, and those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference sign placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be realized by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several apparatuses, several of these apparatuses can be embodied by one and the same item of hardware. The use of the words first, second, and third does not indicate any order; these words may be interpreted as names.

Claims (10)

1. An electronic equipment, characterized by comprising:
a first light-entry assembly, a second light-entry assembly, a reflection assembly, an image sensor, and a processor;
wherein the reflection assembly and the image sensor are disposed on the same side of the first light-entry assembly and the second light-entry assembly;
the reflection assembly is configured to reflect first light output by the first light-entry assembly to the image sensor along a first path, and to reflect second light output by the second light-entry assembly to the image sensor along a second path;
the image sensor is configured to generate a target image according to the first light and the second light, and to send the target image to the processor; and
the processor is configured to calculate target depth information of the target image.
2. The electronic equipment according to claim 1, characterized in that:
the vertical distance between the first light-entry assembly and the second light-entry assembly is a first preset distance; and
the first path and the second path are parallel to each other, the vertical distance between the first path and the second path is a second preset distance, and the second preset distance is less than the first preset distance.
3. The electronic equipment according to claim 2, characterized in that:
the first light-entry assembly comprises a first camera lens and a first reflective mirror;
the second light-entry assembly comprises a second camera lens and a second reflective mirror;
the reflection assembly comprises a third reflective mirror and a fourth reflective mirror, the third reflective mirror and the fourth reflective mirror being mutually perpendicular;
the vertical distance between the first camera lens and the second camera lens is the first preset distance, and the light path of the first camera lens and the light path of the second camera lens are parallel to each other;
the first reflective mirror and the third reflective mirror are parallel to each other, and the second reflective mirror and the fourth reflective mirror are parallel to each other; and
the first reflective mirror is configured to reflect the first light output by the first camera lens to the third reflective mirror, the second reflective mirror is configured to reflect the second light output by the second camera lens to the fourth reflective mirror, the third reflective mirror is configured to reflect the first light to the image sensor along the first path, and the fourth reflective mirror is configured to reflect the second light to the image sensor along the second path.
4. The electronic equipment according to claim 3, characterized in that:
the first light-entry assembly further comprises a first guide rail, and the second light-entry assembly further comprises a second guide rail;
the first camera lens is fixedly connected with the first reflective mirror, the second camera lens is fixedly connected with the second reflective mirror, the first guide rail is disposed between the first camera lens and the image sensor, and the second guide rail is disposed between the second camera lens and the image sensor; and
the first reflective mirror or the first camera lens is disposed on the first guide rail and moves along the first guide rail, and the second reflective mirror or the second camera lens is disposed on the second guide rail and moves along the second guide rail.
5. The electronic equipment according to claim 2, characterized in that:
the first light-entry assembly comprises a third camera lens;
the second light-entry assembly comprises a fourth camera lens;
the reflection assembly comprises a fifth reflective mirror and a sixth reflective mirror;
the vertical distance between the third camera lens and the fourth camera lens is the first preset distance, the light path of the third camera lens and the light path of the fourth camera lens are parallel to each other, and the fifth reflective mirror and the sixth reflective mirror are parallel to each other;
the vertical distance between the third camera lens and the image sensor is greater than the vertical distance between the fourth camera lens and the image sensor;
the fifth reflective mirror is disposed to one side of the light path direction of the third camera lens and overlaps the light path of the third camera lens, and the sixth reflective mirror is disposed to one side of the light path direction of the fourth camera lens and overlaps the light path of the fourth camera lens; and
the fifth reflective mirror is configured to reflect the first light output by the third camera lens to the image sensor along the first path, and the sixth reflective mirror is configured to reflect the second light output by the fourth camera lens to the image sensor along the second path.
6. A depth information acquisition method applied to the electronic equipment according to any one of claims 1 to 5, characterized in that the method comprises:
obtaining a target image generated by the image sensor according to the first light and the second light;
dividing the target image into a first image and a second image according to the effective imaging region calibration information of the first light-entry assembly and the effective imaging region calibration information of the second light-entry assembly;
obtaining target parallax information between the first image and the second image; and
determining target depth information corresponding to the target parallax information according to a preset correspondence between parallax information and depth information.
7. The method according to claim 6, characterized in that, before determining the target parallax information between the first image and the second image, the method further comprises:
correcting the first image into a first front view according to intrinsic parameters, extrinsic parameters, and distortion parameters of the first light-entry assembly; and
correcting the second image into a second front view according to intrinsic parameters, extrinsic parameters, and distortion parameters of the second light-entry assembly;
wherein determining the target parallax information between the first image and the second image comprises:
determining the target parallax information between the first front view and the second front view.
8. The method according to claim 6, characterized in that determining the target parallax information between the first image and the second image comprises:
performing a coarse registration calculation on the first image and the second image to obtain a first disparity map and a second disparity map; and
performing a fine registration calculation on the first disparity map and the second disparity map to obtain the target parallax information.
9. An electronic equipment, characterized by comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein, when the computer program is executed by the processor, the steps of the depth information acquisition method according to any one of claims 6 to 8 are realized.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the depth information acquisition method according to any one of claims 6 to 8 are realized.
CN201910804707.0A 2019-08-28 2019-08-28 A kind of electronic equipment and depth information acquisition method Pending CN110533708A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910804707.0A CN110533708A (en) 2019-08-28 2019-08-28 A kind of electronic equipment and depth information acquisition method
PCT/CN2020/111750 WO2021037141A1 (en) 2019-08-28 2020-08-27 Electronic device and depth information acquisition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910804707.0A CN110533708A (en) 2019-08-28 2019-08-28 A kind of electronic equipment and depth information acquisition method

Publications (1)

Publication Number Publication Date
CN110533708A true CN110533708A (en) 2019-12-03

Family

ID=68664842

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910804707.0A Pending CN110533708A (en) 2019-08-28 2019-08-28 A kind of electronic equipment and depth information acquisition method

Country Status (2)

Country Link
CN (1) CN110533708A (en)
WO (1) WO2021037141A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105530431A (en) * 2015-12-16 2016-04-27 景好 Reflective panoramic imaging system and method
CN106525004A (en) * 2016-11-09 2017-03-22 人加智能机器人技术(北京)有限公司 Binocular stereo vision system and depth measuring method
CN107369172A (en) * 2017-07-14 2017-11-21 上海肇观电子科技有限公司 A kind of method of smart machine and output depth image
CN107968902A (en) * 2016-10-20 2018-04-27 上海富瀚微电子股份有限公司 Panoramic camera and its implementation based on single image sensor
CN207963848U (en) * 2018-03-12 2018-10-12 武汉大学 A kind of range-measurement system of looking in the distance based on binocular vision
CN109525830A (en) * 2018-11-28 2019-03-26 浙江未来技术研究院(嘉兴) A kind of dimensional video collecting system
CN110139012A (en) * 2019-05-28 2019-08-16 成都易瞳科技有限公司 Pisces eye panoramic picture acquisition device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4379482B2 (en) * 2007-04-03 2009-12-09 セイコーエプソン株式会社 Light source device and projector
CN102999939B (en) * 2012-09-21 2016-02-17 魏益群 Coordinate acquiring device, real-time three-dimensional reconstructing system and method, three-dimensional interactive device
TWI538508B (en) * 2014-08-15 2016-06-11 光寶科技股份有限公司 Image capturing system obtaining scene depth information and focusing method thereof
CN106226977A (en) * 2016-08-24 2016-12-14 深圳奥比中光科技有限公司 Laser projection module, image capturing system and control method thereof and device
CN107135388A (en) * 2017-05-27 2017-09-05 东南大学 A kind of depth extraction method of light field image
CN110533708A (en) * 2019-08-28 2019-12-03 维沃移动通信有限公司 A kind of electronic equipment and depth information acquisition method


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021037141A1 (en) * 2019-08-28 2021-03-04 维沃移动通信有限公司 Electronic device and depth information acquisition method
CN113132709A (en) * 2019-12-31 2021-07-16 中移物联网有限公司 Binocular ranging device, binocular ranging method and electronic equipment
CN113132709B (en) * 2019-12-31 2022-11-08 中移物联网有限公司 Binocular distance measuring device, binocular distance measuring method and electronic equipment
CN111402313A (en) * 2020-03-13 2020-07-10 合肥的卢深视科技有限公司 Image depth recovery method and device
CN111402313B (en) * 2020-03-13 2022-11-04 合肥的卢深视科技有限公司 Image depth recovery method and device
CN111416948A (en) * 2020-03-25 2020-07-14 维沃移动通信有限公司 Image processing method and electronic equipment
CN112511731A (en) * 2020-12-17 2021-03-16 南昌欧菲光电技术有限公司 Camera module and electronic equipment
CN114004880A (en) * 2021-04-08 2022-02-01 四川大学华西医院 Point cloud and strong-reflection target real-time positioning method of binocular camera
CN114004880B (en) * 2021-04-08 2023-04-25 四川大学华西医院 Point cloud and strong reflection target real-time positioning method of binocular camera
CN114383564A (en) * 2022-01-11 2022-04-22 平安普惠企业管理有限公司 Depth measurement method, device and equipment based on binocular camera and storage medium

Also Published As

Publication number Publication date
WO2021037141A1 (en) 2021-03-04

Similar Documents

Publication Publication Date Title
CN110533708A (en) A kind of electronic equipment and depth information acquisition method
US9414045B2 (en) Stereo camera
US10291894B2 (en) Single-sensor system for extracting depth information from image blur
CN102494609B (en) Three-dimensional photographing process based on laser probe array and device utilizing same
EP3480648B1 (en) Adaptive three-dimensional imaging system
CN102143305B (en) Image pickup method and system
CN109068033B (en) Depth of field camera module
JP2014174357A (en) Imaging apparatus, imaging system, signal processor, program, and storage medium
WO2011125937A1 (en) Calibration data selection device, method of selection, selection program, and three dimensional position measuring device
US10904512B2 (en) Combined stereoscopic and phase detection depth mapping in a dual aperture camera
KR101493451B1 (en) Multi Optical Axies Arrange Inspection Device and Axies Arranging Method thereof
JP2958458B1 (en) Multi-view image sensor
CN106611430A (en) An RGB-D image generation method, apparatus and a video camera
WO2018166829A1 (en) Imaging device with an improved autofocusing performance
TW202235909A (en) High-resolution time-of-flight depth imaging
JP2013126135A (en) Stereo image generation device, stereo image generation method and computer program for stereo image generation
JP6381206B2 (en) Image processing apparatus, control method thereof, and program
CN106534704A (en) Infrared technology-based photographing method and apparatus of terminal device
EP3220185A1 (en) Device and process for the plenoptic capture of images
CN112997121B (en) Depth camera system
JP6379646B2 (en) Information processing apparatus, measurement method, and program
US20190251700A1 (en) Method and device for depth detection using stereo images
CN111277811A (en) Three-dimensional space camera and photographing method thereof
CN104198038B (en) The brightness detection method of built-in light source
JP3632315B2 (en) Distance detector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20191203)