WO2020088290A1 - Method for acquiring depth information and electronic device

Method for acquiring depth information and electronic device

Info

Publication number
WO2020088290A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
electronic device
image information
speckle
target object
Prior art date
Application number
PCT/CN2019/112254
Other languages
English (en)
French (fr)
Inventor
陈国乔
袁梦尤
袁江峰
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP19878166.8A (published as EP3869462A4, en)
Priority to US17/290,660 (published as US20220020165A1, en)
Priority to JP2021524333A (published as JP2022506753A, ja)
Publication of WO2020088290A1 (zh)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G06T7/514 Depth or shape recovery from specularities
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G06T2207/10048 Infrared image

Definitions

  • Embodiments of the present application relate to the field of image processing technology, and in particular, to a method and electronic device for acquiring depth information.
  • the electronic device may include functions such as face recognition and augmented reality (AR).
  • when performing face recognition, an electronic device can not only collect a two-dimensional image of a target object (such as a human face), but also obtain the depth information of the target object.
  • the electronic device can separately acquire image information of the target object through a binocular camera (that is, two cameras, such as camera 1 and camera 2); then it recognizes the features that are the same in the image information collected by camera 1 and the image information collected by camera 2, calculates the parallax of camera 1 and camera 2 for each same feature, and then calculates the depth of the point where the feature is located (that is, the vertical distance between that point and the line connecting the two cameras) according to the parallax and the hardware parameters of camera 1 and camera 2.
  • the depths of multiple points of the target object may constitute the depth information of the target object.
  • the depth information of the target object can represent the three-dimensional feature of the target object.
  • the more identical features the electronic device recognizes, the more points whose depth the electronic device can calculate, and the more accurate the depth information of the target object.
  • the number of features included in the image information collected by the camera is affected by at least the following two parameters: (1) whether the characteristics of the target object itself are obvious. For example, if the target object has sharp edges and corners (that is, its characteristics are obvious), the image information collected by the camera includes more features; if the target object is smooth and its characteristics are not obvious (such as a white wall), the image information collected by the camera includes fewer features. (2) The intensity of light. For example, when the light is strong, the image information collected by the camera includes more features; when the light is poor, it includes fewer features.
  • the more features in the image information collected by the binocular camera, the more accurate the depth information of the target object calculated by the electronic device; and the number of features in the image information collected by the binocular camera is affected by many factors. In other words, the accuracy of the depth information is affected by many factors, and it is difficult for an electronic device to obtain depth information with high accuracy.
  • Embodiments of the present application provide a method for acquiring depth information and an electronic device, which can improve the accuracy of the depth information of a target object acquired by the electronic device.
  • an embodiment of the present application provides a method for acquiring depth information, which may be applied to an electronic device, where the electronic device includes an infrared projector, a first camera, and a second camera.
  • the distance between the first camera and the second camera is a first length.
  • the method may include: the electronic device receives a first instruction for triggering the electronic device to obtain the depth information of the target object; in response to the first instruction, the infrared light with a spot is emitted through the infrared projector, and the first object is collected through the first camera One image information, collect the second image information of the target object through the second camera; calculate the target object according to the first image information, the second image information, the first length, the lens focal length of the first camera and the lens focal length of the second camera Depth information.
  • the more features there are, and the more obvious those features are, in the image information collected by the first camera and the image information collected by the second camera, the more identical features the electronic device can recognize in the first image information and the second image information.
  • the more identical features the electronic device recognizes, the more points whose depth the electronic device can calculate. Since the depth information of the target object is composed of the depths of multiple points of the target object, the more points whose depth is calculated, the more accurate the depth information of the target object.
  • the first image information and the second image information include features of the target object and texture features formed by irradiating the infrared light with light spots on the target object. That is, the first image information and the second image information not only include the characteristics of the target object itself, but also include texture features formed by the infrared light with light spots irradiating the target object.
  • after the features in the images of the target object collected by the first camera and the second camera are increased in this way, the electronic device can more accurately recognize the same features in the image information collected by the first camera and the image information collected by the second camera, then determine the parallax of the first camera and the second camera for each same feature, and calculate the depth of each point to obtain the depth information of the target object. This improves the accuracy of the depth information of the target object calculated by the electronic device.
  • the lens focal lengths of the first camera and the second camera are the same.
  • the method in which the electronic device calculates the depth information of the target object according to the first image information, the second image information, the first length, the lens focal length of the first camera, and the lens focal length of the second camera may include: the electronic device calculates the parallax of the first camera and the second camera for a plurality of first features in the first image information and the second image information, where a first feature is a feature that is the same in the first image information and the second image information; for each first feature, the electronic device calculates, according to the parallax of the first camera and the second camera for that first feature, the first length, and the lens focal length, the depth Z of the point where the first feature is located using the following formula (2), so as to obtain the depth information of the target object:
  • Z = (f × T) / d, where:
  • f is the focal length of the lens
  • d is the parallax of the first camera and the second camera to the first feature
  • T is the first length
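  • As an illustration of formula (2), the minimal Python sketch below computes the depth Z of a single first feature from the parallax d, the lens focal length f, and the first length T. The function name and the numeric values are assumptions for demonstration; the relationship itself is the one reconstructed in formula (2) above.

```python
# A minimal sketch of formula (2): depth of one first feature from stereo parallax.
# The function name and the numeric values are illustrative assumptions, not from the patent.

def depth_from_parallax(d_px, f_px, t_mm):
    """Return Z = f * T / d for one matched first feature.

    d_px : parallax d of the first and second camera for the feature (pixels)
    f_px : lens focal length f expressed in pixels
    t_mm : first length T, the distance between the two cameras (millimeters)
    """
    if d_px <= 0:
        raise ValueError("parallax must be positive for a point in front of both cameras")
    return f_px * t_mm / d_px

# Example: T = 25 mm (within the 20 mm to 30 mm range mentioned), f = 1000 px, d = 5 px
print(depth_from_parallax(d_px=5.0, f_px=1000.0, t_mm=25.0))  # 5000.0 mm, i.e. 5 m
```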
  • the distance between each feature on the target object and the camera may be different.
  • the distance between each feature on the target object and the camera is called the depth of the feature (or the point where the feature is located).
  • the depth of each point on the target object constitutes the depth information of the target object.
  • the depth information of the target object can represent the three-dimensional feature of the target object.
  • the foregoing multiple first features are some of the same features in the first image information and the second image information.
  • the electronic device 100 may select a plurality of first features from the first image information according to a preset feature frequency, and then search the second image information for features that are the same as the plurality of first features; finally, for each first feature, the electronic device calculates its depth to obtain the depth information of the target object.
  • the electronic device may select some first features from the first image information randomly or at intervals.
  • the above feature frequency may be the number of the same two first features appearing in the preset area.
  • the above feature frequency reflects the distance, on the image, between two adjacent first features that can be selected by the electronic device (referred to as the feature distance).
  • the electronic device may select a plurality of first features from the first image information according to the preset feature frequency.
  • specifically, the electronic device may select one first feature every one feature distance among the features in the first image information. That is to say, the electronic device does not need to calculate the depth of every feature that is the same in the first image information and the second image information; instead, it selects one feature every one feature distance and calculates the depth of the point where the selected feature is located.
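  • As a sketch of this interval-based selection, the snippet below keeps one first feature every "feature distance" pixels along a row of matched positions. The coordinate list and the distance value are invented for illustration.

```python
# Illustrative sketch of interval-based selection: keep one first feature every
# "feature distance" pixels along a row. The coordinates below are made-up values.

feature_distance = 8  # preset spacing (pixels) between two adjacent selected first features

# (x, y) positions of features that are the same in the first and second image information
matched_features = [(x, 40) for x in range(0, 120, 2)]

selected, last_x = [], None
for x, y in sorted(matched_features):
    if last_x is None or x - last_x >= feature_distance:
        selected.append((x, y))  # depth is later calculated only for these points
        last_x = x

print(f"{len(selected)} of {len(matched_features)} matched features selected")
```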
  • the above-mentioned infrared light with spots includes a plurality of spots, and the plurality of spots includes a plurality of speckle array groups.
  • one speckle array group includes one or more speckle arrays; the speckle array includes multiple speckles.
  • At least two of the plurality of speckle array groups are different.
  • at least two speckle array groups in the plurality of speckle array groups are different, which can reduce the repetitiveness of the speckle array groups in the plurality of light spots, and is beneficial for the electronic device to recognize different characteristics.
  • the first speckle array group is any one of the plurality of speckle array groups, and at least two of the plurality of speckle arrays included in the first speckle array group are different. Making at least two of the plurality of speckle arrays different can reduce the repetitiveness of the plurality of speckle arrays in the plurality of light spots, which is beneficial for the electronic device to recognize different features.
  • the first speckle array is any one of a plurality of speckle arrays.
  • Each speckle in the first speckle array has the same shape.
  • the electronic device can identify different speckles according to the position of the speckles in the speckle array.
  • At least two speckles in the first speckle array have different shapes.
  • the shapes of at least two speckles in the plurality of speckles are different, which can reduce the repetitiveness of the plurality of speckles in the speckle array, which is helpful for the electronic device to recognize different features.
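  • The nesting described above (light spots containing speckle array groups, groups containing speckle arrays, arrays containing speckles) can be pictured with the small data sketch below. The concrete patterns are invented for illustration; the patent only requires that at least two arrays (or at least two groups) differ.

```python
# Toy model of the nesting: spots -> speckle array groups -> speckle arrays -> speckles.
# The concrete 3x3 patterns are invented; the patent only requires that at least two
# arrays (or at least two groups) differ, to reduce repetitiveness of the spots.

speckle_array_a = [".o.",
                   "o.o",
                   ".o."]  # one speckle array: 'o' marks a speckle

speckle_array_b = ["o.o",
                   ".o.",
                   "o.o"]  # a second, different speckle array

# one speckle array group made of the two different arrays placed side by side
speckle_array_group = [a + " " + b for a, b in zip(speckle_array_a, speckle_array_b)]

# the emitted infrared light with spots: the group tiled across the projector field
for row in speckle_array_group:
    print((row + "  ") * 3)
```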
  • the electronic device selects multiple first features from the first image information and the second image information according to a preset feature frequency.
  • the characteristic frequency is greater than or equal to the repetition frequency of the speckle array in multiple light spots.
  • the feature frequency is characterized by the number of identical first features selected by the electronic device from an image of a preset area, and the repetition frequency is characterized by the number of identical speckle arrays appearing in the preset area.
  • the feature distance adopted by the electronic device when selecting the first features may be less than or equal to the repetition period of the speckle array in the plurality of light spots; that is, the above feature frequency is greater than or equal to the repetition frequency of the speckle array in the plurality of light spots. In this way, it can be ensured that two adjacent first features selected by the electronic device from the first image information correspond to speckles in different speckle arrays, which helps the electronic device distinguish the two adjacent first features, reduces the possibility of feature-matching errors, and improves the accuracy of the depth information calculated by the electronic device.
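  • A one-line check captures this constraint: the feature distance should not exceed the repetition period of the speckle array (equivalently, the feature frequency is at least the repetition frequency). Both numbers below are assumed values.

```python
# Constraint sketch: the feature distance used when selecting first features should not
# exceed the repetition period of the speckle array (equivalently, the feature frequency
# is at least the repetition frequency). Both numbers below are assumed values.

speckle_repetition_period = 12  # pixels after which the speckle array pattern repeats
feature_distance = 8            # spacing chosen when selecting first features

assert feature_distance <= speckle_repetition_period, (
    "adjacent first features could fall on identical speckles of repeated arrays, "
    "raising the risk of feature-matching errors")
```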
  • the two sides of each lens of the first camera and the second camera include an antireflection coating
  • the filters of the first camera and the second camera include a cut-off coating.
  • the antireflection film is used to increase the transmittance of infrared light
  • the cut-off coating is used to filter out light other than infrared light and visible light, and to increase the transmittance of infrared light.
  • RGB cameras can only sense visible light, but not infrared light.
  • antireflection coatings can be coated on both sides of each lens of the RGB camera, and a cut-off coating can be plated on the filter (double-sided or single-sided) of the RGB camera, so as to obtain the dual-pass camera described above.
  • the infrared light is infrared light from 890 nanometers (nm) to 990 nm.
  • the infrared light may specifically be infrared light at 940 nm.
  • the above-mentioned antireflection film may be an antireflection film of infrared light of 890 nm to 990 nm, such as an antireflection film of infrared light of 940 nm.
  • the first length is between 20 mm and 30 mm.
  • the foregoing electronic device further includes a third camera, and the third camera is an RGB camera.
  • the third camera is used to collect image information under visible light; the image information collected by the third camera is used to be displayed on the display screen of the electronic device.
  • an embodiment of the present application provides an electronic device. The electronic device includes: one or more processors, a memory, an infrared projector, a first camera, and a second camera, where the distance between the first camera and the second camera is the first length; the memory, the infrared projector, the first camera, and the second camera are coupled to the processor, and the memory is used to store information.
  • the foregoing processor is configured to receive a first instruction, and the first instruction is used to trigger the electronic device to obtain depth information of the target object.
  • the processor is further configured to emit infrared light with a spot through the infrared projector in response to the first instruction, collect the first image information of the target object through the first camera, and collect the second image information of the target object through the second camera.
  • the first image information and the second image information include features of the target object and texture features formed by irradiating the infrared light with light spots on the target object.
  • the processor is further used to calculate the depth information of the target object based on the first image information collected by the first camera, the second image information collected by the second camera, the first length, the lens focal length of the first camera, and the lens focal length of the second camera.
  • the lens focal lengths of the first camera and the second camera are the same.
  • the above-mentioned processor is configured to calculate the depth information of the target object using the first image information, the second image information, the first length, the lens focal length of the first camera, and the lens focal length of the second camera, which includes: the processor is configured to calculate the parallax of the first camera and the second camera for a plurality of first features in the first image information and the second image information, where a first feature is a feature that is the same in the first image information and the second image information; and, for each first feature, to calculate, according to the parallax of the first camera and the second camera for that first feature, the first length, and the lens focal length, the depth Z of the point where the first feature is located using the above formula (2), Z = (f × T) / d, so as to obtain the depth information of the target object, where:
  • f is the focal length of the lens
  • d is the parallax of the first camera and the second camera to the first feature
  • T is the first length.
  • the above processor is further used to select the plurality of first features from the first image information and the second image information before calculating the parallax of the first camera and the second camera for the plurality of first features in the first image information and the second image information.
  • the plurality of first features are some of the same features in the first image information and the second image information.
  • the above-mentioned infrared light with spots includes a plurality of spots, and the plurality of spots includes a plurality of speckle array groups.
  • one speckle array group includes one or more speckle arrays; each speckle array includes multiple speckles.
  • for the speckle array group, the speckle array, the multiple speckles in the speckle array, and the wavelength of the infrared light emitted by the infrared projector in the second aspect and its possible designs, reference may be made to the relevant descriptions in the first aspect and its possible designs, and details are not repeated here.
  • the processor is configured to select multiple first features from the first image information and the second image information according to a preset feature frequency.
  • the characteristic frequency is greater than or equal to the repetition frequency of the speckle array in multiple light spots.
  • the characteristic frequency is characterized by the number of identical first features selected by the processor from an image of a preset area, and the repetition frequency is characterized by the number of identical speckle arrays appearing in the preset area.
  • the two sides of each lens of the first camera and the second camera include an antireflection coating
  • the filters of the first camera and the second camera include a cut-off coating
  • the antireflection film is used to increase the transmittance of infrared light
  • the cut-off coating is used to filter out light other than infrared light and visible light, and to increase the transmittance of infrared light.
  • the foregoing electronic device further includes a third camera and a display screen.
  • the third camera is an RGB camera; the third camera is used to collect image information under visible light; and the image information collected by the third camera is used to display on the display screen.
  • an embodiment of the present application provides a dual-pass camera (that is, the above-mentioned first camera or second camera), and the dual-pass camera is used to receive visible light and infrared light.
  • each side of each lens of the dual-pass camera includes an antireflection coating, and the filter of the dual-pass camera includes a cut-off coating.
  • the antireflection film is used to increase the transmittance of infrared light;
  • the cut-off coating is used to filter out light other than infrared light and visible light, and to increase the transmittance of infrared light.
  • the dual-pass camera includes: an RGB camera, antireflection coatings coated on both sides of each lens of the RGB camera, and a cut-off coating plated on the filter (double-sided or single-sided) of the RGB camera.
  • an embodiment of the present application provides a computer storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to execute the method for obtaining depth information as described in the first aspect and its possible designs.
  • an embodiment of the present application provides a computer program product that, when run on a computer, causes the computer to execute the method for acquiring depth information as described in the first aspect and its possible designs.
  • FIG. 1A is a partial schematic diagram of an electronic device provided by an embodiment of the present application;
  • FIG. 1B is a schematic diagram 1 of a principle for calculating depth information provided by an embodiment of the present application;
  • FIG. 1C is a schematic diagram of an example of the same feature in two image information collected by a binocular camera provided by an embodiment of the present application;
  • FIG. 1D is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application;
  • FIG. 2 is a flowchart 1 of a method for obtaining depth information according to an embodiment of the present application;
  • FIG. 3 is a schematic diagram 1 of an example of a graphical user interface provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of an example of a speckle array provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram 1 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 6A is a schematic diagram 2 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 6B is a schematic diagram 3 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 6C is a schematic diagram 4 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 7A is a schematic diagram 5 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 7B is a schematic diagram 6 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 7C is a schematic diagram 7 of an example of a speckle array group provided by an embodiment of the present application;
  • FIG. 8 is a schematic structural diagram of a camera module provided by an embodiment of the present application;
  • FIG. 9 is a flowchart 2 of a method for obtaining depth information according to an embodiment of the present application;
  • FIG. 10A is a schematic diagram 2 of a principle for calculating depth information provided by an embodiment of the present application;
  • FIG. 10B is a schematic diagram of a principle of calculating parallax provided by an embodiment of the present application;
  • FIG. 10C is a schematic diagram of a relationship between parallax and depth provided by an embodiment of the present application;
  • FIG. 11 is a schematic diagram of an example of a target object and infrared light with a light spot provided by an embodiment of the present application;
  • FIG. 12 is a schematic diagram of an example of a characteristic frequency provided by an embodiment of the present application;
  • FIG. 13 is a schematic diagram of an example of face depth information provided by an embodiment of the present application;
  • FIG. 14 is a schematic diagram of an example of a graphical user interface provided by an embodiment of this application;
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • Embodiments of the present application provide a method for acquiring depth information, which can be applied to a process in which an electronic device acquires image information of a target object.
  • the image information may include a two-dimensional image of the target object and depth information of the target object.
  • a target object (such as a human face) is an object with a three-dimensional shape.
  • the distance between each feature on the target object (such as the tip of the person's nose and eyes) and the camera may be different.
  • the distance between each feature on the target object and the camera is called the depth of the feature (or the point where the feature is located).
  • the depth of each point on the target object constitutes the depth information of the target object.
  • the depth information of the target object can represent the three-dimensional feature of the target object.
  • the distance between each feature on the target object and the camera may be: the vertical distance between the point where each feature is located and the line connecting the two cameras.
  • P is a feature on the target object
  • the depth of the feature P is the vertical distance Z from P to the line O_LO_R.
  • O_L is the position of the first camera
  • O_R is the position of the second camera.
  • the electronic device provided by the embodiment of the present application may include an infrared projector and two dual-pass cameras (such as a first camera and a second camera).
  • the distance between the two dual-pass cameras is the first length.
  • the distance between the centers of the two cameras is called the distance between the two cameras.
  • FIG. 1A shows a partial schematic diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include an infrared projector 101, a first camera 102 and a second camera 103.
  • the distance between the first camera 102 and the second camera 103 is the first length T.
  • the infrared projector 101 is used to emit infrared light with a light spot.
  • the dual-pass camera means that the camera can receive not only visible light but also infrared light.
  • a dual-pass camera can receive visible light and infrared light at 940 nm. 940nm is the wavelength of infrared light.
  • the all-pass camera means that the camera can receive a variety of light including visible light, infrared light, and light of other wavelengths. In comparison, ordinary RGB cameras can receive visible light, but cannot receive infrared light.
  • the first camera 102 may be a dual-pass camera on the left, and the second camera 103 may be a dual-pass camera on the right; or, the first camera 102 may be a dual-pass camera on the right, the second The camera 103 may be a dual-pass camera on the left.
  • in FIG. 1A, taking the first camera 102 as the dual-pass camera on the left and the second camera 103 as the dual-pass camera on the right as an example, a partial structure of the electronic device 100 is described.
  • the first camera 102 and the second camera 103 in the embodiment of the present application can receive infrared light of the same type.
  • the infrared light of the same type refers to infrared light of the same wavelength.
  • the first camera 102 and the second camera 103 have the same ability to receive infrared light.
  • the first camera 102 and the second camera 103 may receive infrared light at 940 nm.
  • the infrared light emitted by the infrared projector 101 and the infrared light that the first camera 102 and the second camera 103 can receive are also the same type of infrared light.
  • the electronic device 100 can calculate the depth of each feature on the target object according to the parallax of the binocular camera on the same feature and the hardware parameters of the binocular camera, using the principle of triangulation to obtain the depth information of the target object.
  • O_L is the position of the first camera 102
  • O_R is the position of the second camera 103
  • the lens focal lengths of the first camera 102 and the second camera 103 are both f.
  • Feature P is a feature of the target object.
  • the vertical distance from the point where the feature P is located to the line connecting the first camera 102 and the second camera 103 is Z; that is, the depth of P is Z.
  • the first camera 102 collects image 1 of the target object, and the feature P is at point P_L in image 1.
  • the second camera 103 collects image 2 of the target object, and the feature P is at point P_R in image 2.
  • P_L and P_R are the points, in image 1 and image 2 respectively, that correspond to the feature P of the target object.
  • according to the principle of similar triangles, P_LP_R / O_LO_R = (Z - f) / Z, where P_LP_R = O_LO_R - B_LP_L - P_RB_R. Here O_LO_R = T, B_LP_L = x_L - x/2, and P_RB_R = x/2 - x_R, where x is the image width, B_L and B_R are the centers of image 1 and image 2, and x_L and x_R are the horizontal coordinates of P_L and P_R in image 1 and image 2, respectively. Therefore, P_LP_R = T - (x_L - x_R), where d = x_L - x_R is the parallax of the first camera 102 and the second camera 103 for the feature P; substituting into the similar-triangle relation gives formula (2): Z = (f × T) / d.
  • therefore, the depth Z of the feature P can be calculated from the distance T between the two cameras, the lens focal length f of the two cameras, and the parallax d.
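  • The derivation above can be checked numerically. The sketch below places a point at a known depth, projects it into the two rectified, equal-focal-length cameras, and recovers the depth from the parallax with formula (2). The pixel and millimeter values, the image width, and the variable names are invented for the check.

```python
# Numerical check of the relations above; all values are illustrative, not from the patent.

f = 1000.0            # lens focal length of both cameras, in pixels
T = 25.0              # first length: distance between the two cameras, in mm
image_width = 1280.0  # x in the derivation above, in pixels

# A point P at known depth Z_true and lateral offset X (mm), measured from the first camera.
Z_true, X = 400.0, 10.0

# Horizontal coordinates of P_L and P_R, measured from the left edge of each image.
x_L = image_width / 2 + f * X / Z_true        # projection of P in image 1
x_R = image_width / 2 + f * (X - T) / Z_true  # projection of P in image 2

d = x_L - x_R  # parallax of the two cameras for the feature P
Z = f * T / d  # formula (2)
print(f"parallax d = {d:.2f} px, recovered depth Z = {Z:.1f} mm")  # 62.50 px, 400.0 mm
```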
  • the same features in the image information collected by the first camera 102 (that is, the first image information, such as image 1 above) and the image information collected by the second camera 103 (that is, the second image information, such as image 2 above) refer to information corresponding to the same feature of the target object in the two pieces of image information.
  • the A_L point in the first image information corresponds to the A part of the object
  • the A_R point in the second image information also corresponds to the A part of the object
  • the A_L point and the A_R point are the same feature in the two image information.
  • the first image information includes the image 1 of the building shown in FIG. 1C
  • the second image information includes the image 2 of the building shown in FIG. 1C.
  • point A_L in image 1 corresponds to part A of the building, and point A_R in image 2 also corresponds to part A of the building.
  • point B_L corresponds to part B of the building
  • point B_R also corresponds to part B of the building.
  • the parallax of the binocular camera for the A part of the building is x_L1 - x_R1.
  • the parallax of the binocular camera for the B part of the building is x_L2 - x_R2.
  • the more features in the first image information and the second image information, the easier it is to obtain more identical features from the two image information, so that the depth information of the target object obtained by the electronic device 100 is more accurate.
  • the image information of the target object collected by the first camera 102 and the second camera 103 includes not only the features of the target object, but also the texture features formed by the infrared light with light spots irradiating the target object; that is, the features in the images of the target object collected by the first camera 102 and the second camera 103 are increased.
  • after the features in the images of the target object collected by the first camera 102 and the second camera 103 are increased, the electronic device 100 can more accurately recognize the same features in the image information collected by the first camera 102 and the image information collected by the second camera 103, then determine the parallax of the first camera 102 and the second camera 103 for each same feature, and calculate the depth of each point to obtain the depth information of the target object, which improves the accuracy with which the electronic device 100 calculates the depth information of the target object.
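  • One common way to recognize the same feature in the two pieces of image information is block matching: take a small patch around a feature in the first image and slide it along the same row of the second image, keeping the position with the smallest difference. The NumPy sketch below does this on synthetic images; it is an assumed, generic technique for illustration, not a matching algorithm specified by the patent.

```python
# Generic block-matching sketch (an assumed technique, not the patent's algorithm):
# find, in the second image, the column that best matches a small patch taken around
# a feature in the first image, searching along the same row.
import numpy as np

rng = np.random.default_rng(0)
h, w, half = 32, 64, 3
true_shift = 9                                 # ground-truth parallax in pixels

image1 = rng.random((h, w))                    # first image information (synthetic texture)
image2 = np.roll(image1, -true_shift, axis=1)  # second image: same texture shifted left

y, x_L = 16, 40                                # a feature position in image 1
patch = image1[y - half:y + half + 1, x_L - half:x_L + half + 1]

best_x, best_cost = None, np.inf
for x in range(half, w - half):                # slide along the same row of image 2
    candidate = image2[y - half:y + half + 1, x - half:x + half + 1]
    cost = np.sum((patch - candidate) ** 2)    # sum of squared differences
    if cost < best_cost:
        best_cost, best_x = cost, x

print(f"parallax d = x_L - x_R = {x_L - best_x} px (expected {true_shift})")
```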
  • the dual-pass camera in the embodiment of the present application can be obtained by improving the RGB camera, which can reduce the hardware cost of the electronic device 100.
  • antireflection coatings can be coated on both sides of each lens of the RGB camera to improve the lenses' ability to sense infrared light, so that the lenses receive as much infrared light as possible;
  • a cut-off coating is coated on the filter of the RGB camera to filter out light other than infrared light and visible light, and to increase the transmittance of infrared light.
  • the RGB camera including the above-mentioned antireflection coatings and cut-off coating can receive not only visible light but also infrared light.
  • the RGB camera improved in the above manner can be used as a dual-pass camera.
  • the partial schematic diagram of the electronic device 100 shown in FIG. 1A may be a partial schematic diagram of the front of the electronic device 100.
  • the infrared projector 101, the first camera 102, and the second camera 103 are provided on the front of the electronic device 100.
  • the electronic device 100 may further include one or more other cameras.
  • the one or more other cameras may include a third camera 104, which is an RGB camera.
  • the third camera is used to collect a two-dimensional image of the target object.
  • the electronic device 100 may display the two-dimensional image collected by the third camera on the display screen.
  • the third camera 104 is installed on the same surface of the electronic device 100 as the first camera 102, the second camera 103 and the infrared projector. That is, the third camera 104 may also be provided on the front of the electronic device 100.
  • the third camera 104 is a front camera of the electronic device 100.
  • the partial schematic diagram of the electronic device 100 shown in FIG. 1A may be a partial schematic diagram of the back of the electronic device 100. That is, the infrared projector 101, the first camera 102, the second camera 103, and the RGB camera 104 are provided on the back of the electronic device 100.
  • the RGB camera 104 is a rear camera of the electronic device 100.
  • the front side of the electronic device 100 refers to the side of the electronic device 100 that displays a graphical user interface (such as the main interface of the electronic device 100, that is, the desktop), namely the side where the display panel is located, which is generally called the front side; the back side of the electronic device 100 is the side opposite to the front side.
  • the front side of the electronic device refers to the side facing the user in the normal use state by the user; the side facing away from the user is called the back side.
  • the one or more other cameras may also include other RGB cameras or black-and-white cameras.
  • the other RGB camera or black-and-white camera may be a front camera or a rear camera of the electronic device 100, which is not limited in the embodiment of the present application.
  • the electronic device in the embodiments of the present application may be a portable computer (such as a mobile phone), a notebook computer, a wearable electronic device (such as a smart watch), a tablet, an augmented reality (AR) / virtual reality (VR) device, or an in-vehicle device that includes the infrared projector, the RGB camera (that is, the third camera), and the two dual-pass cameras (that is, the first camera and the second camera); the following embodiments do not specially restrict the specific form of the electronic device.
  • FIG. 1D shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the electronic device 100 may further include an infrared projector 196 (infrared projector 101 shown in FIG. 1A).
  • the infrared projector 196 is used to emit infrared light with a light spot.
  • the infrared projector 196 may emit infrared light with a wavelength of 940 nm, and the infrared light has a light spot.
  • the shape and arrangement of the light spot reference may be made to subsequent descriptions of the embodiments of the present application, and details are not described here.
  • the camera 193 may include two dual-pass cameras (the first camera 102 and the second camera 103 shown in FIG. 1A) 193B and 193C, and 1 to N other cameras 193A.
  • the 1 to N other cameras 193A may include the third camera 104 shown in FIG. 1A, that is, the RGB camera, and may also include other cameras, such as a black-and-white camera.
  • the dual-pass cameras 193B and 193C can receive visible light and infrared light.
  • the RGB camera is used to collect RGB images, such as face images (that is, two-dimensional images of faces).
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an environment A plurality of sensors such as an optical sensor 180L and a bone conduction sensor 180M.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of fetching instructions and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may respectively couple the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, to realize the function of answering the phone call through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to realize the function of answering the phone call through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193.
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI) and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through the DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured via software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface may be used to connect the processor 110 to the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present invention is only a schematic description, and does not constitute a limitation on the structure of the electronic device 100.
  • the electronic device 100 may also use different interface connection methods in the foregoing embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and / or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1 and filter, amplify, etc. the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor and convert it to electromagnetic wave radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters electromagnetic wave signals, and transmits the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic wave radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 realizes a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is mainly used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be set in the camera 193.
  • the camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 is selected at a frequency point, the digital signal processor is used to perform Fourier transform on the energy at the frequency point.
  • Video codec is used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the electronic device 100, such as image recognition, face recognition, voice recognition, and text understanding.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system and application programs required by at least one function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100 and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and so on.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and also used to convert analog audio input into digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also known as "handset" is used to convert audio electrical signals into sound signals.
  • the voice can be received by bringing the receiver 170B close to the ear.
  • the microphone 170C, also called the "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C. In addition to collecting sound signals, it may also implement a noise reduction function. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the headset interface 170D is used to connect wired headsets.
  • the earphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may be at least two parallel plates with conductive materials. When force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the strength of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
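  • As an illustration only, the pressure-dependent behaviour described above can be sketched as follows; the threshold value, pressure units, and instruction names are hypothetical and not defined by this embodiment.

```python
# Hypothetical sketch of the pressure-dependent behaviour described above.
# FIRST_PRESSURE_THRESHOLD and the instruction names are illustrative assumptions.
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed)

def handle_touch_on_message_icon(pressure: float) -> str:
    """Return the instruction executed for a touch on the short message application icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"        # lighter press: view the short message
    return "create_new_short_message"      # firmer press: create a new short message
```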
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 around three axes (ie, x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the jitter angle of the electronic device 100, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to counteract the jitter of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude using the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 may detect the opening and closing of the clamshell according to the magnetic sensor 180D, and may then set features such as automatic unlocking of the flip cover according to the detected opening or closing state.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of electronic devices, and can be used in applications such as switching between horizontal and vertical screens, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting scenes, the electronic device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to application locks, fingerprint photographing, and fingerprint answering calls.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In some other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by the low temperature. In some other embodiments, when the temperature is below still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
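  • A minimal sketch of such a threshold-driven temperature strategy is given below; the threshold values and the device method names are illustrative assumptions, not part of this embodiment.

```python
# Hypothetical sketch of the temperature processing strategy described above.
# The thresholds and the device method names are illustrative assumptions.
HIGH_TEMP_THRESHOLD_C = 45.0   # assumed value
LOW_TEMP_THRESHOLD_C = 0.0     # assumed value

def apply_thermal_policy(temperature_c: float, device) -> None:
    if temperature_c > HIGH_TEMP_THRESHOLD_C:
        device.reduce_processor_performance()    # reduce power consumption, thermal protection
    elif temperature_c < LOW_TEMP_THRESHOLD_C:
        device.heat_battery()                     # warm battery 142 to avoid abnormal shutdown
        device.boost_battery_output_voltage()     # or boost its output voltage at low temperature
```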
  • the touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 together constitute a touchscreen.
  • the touch sensor 180K is used to detect a touch operation acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation may be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive a blood pressure beating signal.
  • the bone conduction sensor 180M may also be provided in the earphone and combined into a bone conduction earphone.
  • the audio module 170 may parse out the voice signal based on the vibration signal of the vibrating bone block of the voice part acquired by the bone conduction sensor 180M to realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement the heart rate detection function.
  • the key 190 includes a power-on key, a volume key, and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration prompt.
  • the motor 191 can be used for incoming vibration notification or touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games).
  • the touch vibration feedback effect can also be customized.
  • the indicator 192 may be an indicator light, which may be used to indicate a charging state, a power change, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into or removed from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 uses eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • An embodiment of the present application provides a method for acquiring depth information, which may be applied to the electronic device 100.
  • the electronic device 100 includes an infrared projector 101, a first camera 102, and a second camera 103.
  • the lens focal lengths of the first camera 102 and the second camera 103 are both f
  • the distance between the first camera 102 and the second camera 103 is T.
  • the method for obtaining depth information may include S201-S203:
  • the electronic device 100 receives the first instruction.
  • the first instruction triggers the electronic device 100 to obtain depth information of the target object.
  • the electronic device 100 may receive a first operation of the user, where the first operation is used to trigger the electronic device 100 to execute a first event. Since the electronic device 100 needs to use the depth information of the target object to execute the first event, in response to the above first operation, the electronic device 100 may obtain a first instruction, and the first instruction may trigger the electronic device 100 to obtain the depth information of the target object.
  • the method of the embodiment of the present application may be applied to multiple scenes such as a face unlocking scene, a face payment scene, an AR scene, a 3D modeling scene, or a large aperture scene.
  • the above-mentioned first operation may be a user's operation on a relevant physical key of the electronic device 100 (such as a click operation).
  • the above-mentioned related physical keys may be a lock screen key or a home key. If the electronic device 100 receives the first operation of the physical key by the user, it indicates that the user may have the intention to unlock the electronic device 100.
  • the electronic device 100 may include one or more sensors for detecting the state in which the electronic device 100 is held by the user. It is assumed that the electronic device 100 has been turned on and its screen is off, or that the electronic device 100 displays a lock screen interface.
  • the receiving of the first operation by the electronic device 100 may specifically be: the sensor detects that the state of the electronic device 100 currently being held by the user has changed to meet the preset condition.
  • if the state in which the electronic device 100 is held by the user has changed to meet the preset condition (for example, the electronic device 100 is picked up by the user and the user holds the electronic device 100 such that the angle between the display screen of the electronic device 100 and the horizontal plane is within a certain range), it means that the user may have the intention to unlock the electronic device 100.
  • the first operation may be used to trigger the electronic device 100 to unlock (ie, execute the first event).
  • a user identity verification is performed before the electronic device 100 is unlocked.
  • face recognition is a way of user identity verification.
  • the electronic device 100 can obtain depth information of the target object.
  • one or more sensors in the embodiments of the present application can determine whether the state in which the electronic device 100 is held by the user has changed in accordance with the preset condition by detecting whether the electronic device 100 is rotated, whether it moves forward relative to the user, and whether it moves upward relative to the horizon. Specifically, the electronic device 100 may detect the motion parameters of the electronic device 100; then, according to the motion parameters, determine whether the electronic device 100 is rotated, whether it moves forward relative to the user, and whether it moves upward relative to the horizon; and finally, according to the judgment results, determine whether the state in which the electronic device 100 is held by the user has changed in accordance with the preset condition.
  • “the state in which the electronic device 100 is currently held by the user has changed in accordance with preset conditions” may specifically include: after the sensor detects that the electronic device 100 is rotated and moves from bottom to top, the The angle between the display screen of the electronic device 100 and the horizontal plane is within a preset range.
  • the electronic device 100 can count the state change parameters collected by the sensors while most or all users use the electronic device 100 to collect face images, and determine the changes in these state change parameters as changes that meet the preset conditions. In this way, when the changes in the state change parameters collected by the sensors in the electronic device 100 meet the preset conditions, the electronic device 100 may determine that there is a high possibility that the user wants to unlock the electronic device 100, and may execute S202.
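  • A minimal sketch of judging whether the held state has changed in accordance with the preset condition might look as follows; the motion-parameter fields and all numeric thresholds are assumptions for illustration only.

```python
# Hypothetical sketch: decide whether the held state changed in accordance with the
# preset condition. The motion fields and all numeric thresholds are assumed.
DISPLAY_ANGLE_RANGE_DEG = (30.0, 80.0)   # assumed preset range of display/horizontal angle

def holding_state_meets_condition(motion) -> bool:
    rotated = abs(motion.rotation_deg) > 10.0            # device was rotated (assumed threshold)
    moved_up = motion.vertical_displacement_cm > 5.0     # moved from bottom to top (assumed)
    angle_ok = DISPLAY_ANGLE_RANGE_DEG[0] <= motion.display_angle_deg <= DISPLAY_ANGLE_RANGE_DEG[1]
    return rotated and moved_up and angle_ok
```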
  • the above-mentioned first operation may be a click operation (such as a click operation) of a payment button of a user in a payment interface.
  • the payment interface 301 shown in (a) of FIG. 3 includes a payment button "Pay Now" 302, and the first operation may be a user's click operation on the button "Pay Now” 302.
  • the first operation may be a preset gesture input by the user on the payment interface, such as an S-shaped gesture.
  • the electronic device 100 can pay for the order (ie, execute the first event).
  • face recognition is a way of user identity verification.
  • the electronic device 100 can obtain depth information of the target object.
  • the electronic device 100 may display the face recognition interface 303 shown in (b) of FIG. 3.
  • the electronic device 100 emits infrared light with a light spot through the infrared projector 101, collects first image information of the target object through the first camera 102, and collects second image information of the target object through the second camera 103.
  • the electronic device 100 can turn on the infrared projector 101 and turn on the first camera 102 and the second camera 103.
  • the infrared projector 101 can emit infrared light with a light spot.
  • the first camera 102 and the second camera 103 can collect image information of the target object.
  • the infrared projector 101 emits infrared light with a light spot toward the target object; the infrared light with the light spot irradiates the target object, which can increase the features of the target object (that is, add texture features) and improve the rate at which the first camera 102 and the second camera 103 recognize each feature of the target object.
  • the infrared light with a light spot emitted by the infrared projector 101, and the principle by which the infrared projector 101 emits the infrared light with a light spot, are introduced here.
  • infrared light is not visible to the human eye.
  • for some infrared light (such as 850 nm infrared light), the human eye can still see a small amount of red light. If the infrared projector 101 emits such infrared light toward the target object, the user can see the infrared light irradiating the target object, which affects the user's visual experience.
  • the infrared light emitted by the infrared projector 101 may be infrared light that is completely invisible to the human eye.
  • the infrared light emitted by the infrared projector 101 may be infrared light of 890 nm to 990 nm, specifically, for example, infrared light of 940 nm.
  • the exposure of some cameras is performed line by line; therefore, if the exposure of the first camera 102 and the second camera 103 is performed line by line, the infrared projector 101 must remain turned on and emit the infrared light with the light spot during the entire exposure period of the first camera 102 and the second camera 103; otherwise, some parts of the image information collected by the first camera 102 and the second camera 103 during exposure may have no light spot.
  • the operating power of the infrared projector 101 cannot be too high, so as to prevent the infrared projector 101 from overheating because its operating power is too high, which would affect the efficiency with which the infrared projector 101 emits infrared light (for example, affect the brightness of the infrared light) and thereby affect the effect of the texture features formed by the spotted infrared light irradiating the object.
  • the operating power of the infrared projector 101 cannot be too low. If the working power of the infrared projector 101 is too low, it will also affect the efficiency of the infrared projector 101 to emit infrared light, thereby affecting the effect of the texture feature formed by the infrared light with light spot irradiating the object.
  • the power of the infrared projector 101 should not be too high or too low.
  • the operating current of the infrared projector 101 may be between 100mA and 200mA.
  • the operating current of the infrared projector 101 may be 150 mA.
  • an infrared projector may include three parts: (1) an infrared light source; (2) a collimating mirror; and (3) a diffractive optics (DOE).
  • the infrared light source may be a laser light source.
  • the laser light source may be a vertical cavity surface emitting laser (VCSEL).
  • the collimator lens can be a 1P lens or a 2P lens.
  • a 1P collimating lens means that the collimating lens is composed of one lens, and a 2P collimating lens means that the collimating lens is composed of two lenses.
  • Collimator mirrors are used to convert non-parallel light rays into approximately parallel light sources. This conversion can reduce the texture noise level of the infrared light emitted by the infrared projector, that is, it can reduce the light of the non-textured part. In other words, this conversion can make the bright spots in the infrared light emitted by the infrared projector brighter and the dark spots darker.
  • the texture features formed on the target object when the infrared light with the light spot is irradiated are almost invisible.
  • when the electronic device 100 recognizes the same feature in the two pieces of image information, the determination can also be made according to the features of the target object itself. Therefore, the brighter spots obtained by the collimating lens's conversion of the light have no influence on the calculation of the depth.
  • therefore, the collimating lens in the infrared projector 101 can be omitted. That is, the infrared projector 101 in the embodiment of the present application may not include the collimating lens, but only include the infrared light source and the DOE.
  • the role of the collimating lens is to make the light emitted by the VCSEL close to parallel light after it passes through the collimating lens; in this way, with the collimating lens, after the light passes through the DOE, the spot diameter is small and the contrast is high (that is, the spot area has a high brightness while the non-spot area has a low brightness).
  • the target object illuminated by the projector is black or invisible in the areas without light spots, because with a collimating lens the light of the spots is more concentrated, so the places without spots theoretically receive no light, or only weak ambient light. If the collimating lens is removed from the infrared projector, the light beam is not parallel before entering the DOE, so the contrast of the light spots is lower and some light still exists in the non-spot areas.
  • from the perspective of projector design, the light present in the non-spot areas of a projector without a collimating lens is noise; however, in the embodiments of the present application, it can indeed improve the accuracy of measuring depth information.
  • in the case of weak visible light or complete darkness, the areas of the target object illuminated by the light spots are lit by the spots; in the areas not illuminated by the spots, because of the presence of this noise light, the target object is also irradiated with relatively weak light, so the entire target object is visible. Therefore, the features of the target object can be made more obvious, which helps improve the accuracy of depth information measurement.
  • DOE is used to convert the parallel light into the required light output form by using the principle of Fourier optics.
  • the output form can be points, lines, or areas.
  • the DOE can be set to control the infrared projector to emit infrared light with a light spot.
  • the shape, number and arrangement of light spots can be achieved by setting DOE.
  • the infrared light emitted by the infrared projector 101 may include multiple light spots.
  • the plurality of light spots may include a plurality of speckle array groups.
  • a speckle pattern group includes one or more speckle patterns.
  • the speckle pattern is composed of multiple speckles.
  • FIG. 4 illustrates an example schematic diagram of a plurality of speckle arrays provided by an embodiment of the present application.
  • the speckle pattern 401 shown in FIG. 4 is a speckle pattern composed of a plurality of circular speckles.
  • the shapes of multiple speckles in a speckle array may be the same.
  • the speckles in the speckle pattern 401, the speckle pattern 402, the speckle pattern 403, the speckle pattern 404, and the speckle pattern 408 shown in FIG. 4 are all circular.
  • the speckles in the speckle array 405 shown in FIG. 4 are all rectangular.
  • At least two speckles in a speckle array have different shapes.
  • the speckle array 406 shown in FIG. 4 includes circular speckles and rectangular speckles.
  • the speckle array 407 shown in FIG. 4 includes a plurality of speckles with different shapes.
  • the shape of different speckle patterns can be different.
  • the speckle pattern 401 shown in FIG. 4 is a rectangular speckle pattern
  • the speckle pattern 402 is a triangular pattern
  • the speckle pattern 403 is an octagonal pattern
  • the speckle pattern 404 is another octagonal speckle pattern.
  • the shapes of the speckle pattern 401 and the speckle pattern 408 shown in FIG. 4 are the same, but the numbers of speckles they include are different (the speckle pattern 408 includes one more speckle than the speckle pattern 401). Therefore, the speckle pattern 401 and the speckle pattern 408 are different speckle patterns.
  • the shapes of the speckle pattern 401 and the speckle pattern 405 shown in FIG. 4 are the same, and the number of speckles included in the speckle pattern 401 and the speckle pattern 405 is the same; however, the shape (circular) of the speckles in the speckle pattern 401 is different from the shape (rectangular) of the speckles in the speckle pattern 405. Therefore, the speckle pattern 401 and the speckle pattern 405 are different speckle patterns.
  • the shapes of the speckles included in the speckle array 402 and the speckle array 404 shown in FIG. 4 are the same, and the number of speckles included in the two speckle arrays is the same; however, the shape (triangle) of the speckle array 402 is different from the shape (octagon) of the speckle array 404. Therefore, the speckle array 402 and the speckle array 404 are different speckle arrays.
  • FIG. 4 only gives an example schematic diagram of a partial speckle array by way of example.
  • the speckle array in the plurality of light spots in the infrared light includes but is not limited to the speckle array shown in FIG. 4.
  • the plurality of light spots include a plurality of identical groups of speckle arrays.
  • Each speckle pattern group includes a speckle pattern.
  • 501 shown in FIG. 5 is a plurality of light spots.
  • the plurality of light spots 501 include a plurality of identical speckle array groups 502.
  • the speckle pattern group 502 includes a speckle pattern 401.
  • the infrared light emitted by the infrared projector 101 includes a plurality of light spots 503 shown in FIG. 5.
  • the plurality of light spots 503 include a plurality of identical speckle array groups 504.
  • the speckle pattern group 504 includes a speckle pattern 407.
  • the plurality of light spots include a plurality of same speckle array groups.
  • Each speckle pattern group includes a plurality of speckle pattern arrays. At least two of the plurality of speckle patterns are different.
  • the infrared light emitted by the infrared projector 101 has a plurality of light spots 601 shown in FIG. 6A.
  • the plurality of light spots 601 include a plurality of identical speckle array groups 602.
  • the speckle pattern group 602 includes a speckle pattern 402 and a speckle pattern 404.
  • the speckle pattern 402 and the speckle pattern 404 are different.
  • the infrared light emitted by the infrared projector 101 includes a plurality of light spots 603 shown in FIG. 6B.
  • the plurality of light spots 603 include a plurality of identical speckle array groups 604.
  • the speckle pattern group 604 includes two speckle patterns 401 and two speckle patterns 403.
  • the speckle pattern 401 and the speckle pattern 403 are different.
  • the infrared light emitted by the infrared projector 101 has a plurality of light spots 605 shown in FIG. 6C.
  • the plurality of light spots 605 include a plurality of identical speckle array groups 606.
  • the speckle pattern group 606 includes a speckle pattern 404, a speckle pattern 408, a speckle pattern 406, and a speckle pattern 407.
  • the speckle pattern 404, the speckle pattern 408, the speckle pattern 406, and the speckle pattern 407 are different from each other.
  • the plurality of light spots include a plurality of speckle array groups.
  • the plurality of speckle array groups include at least two different speckle array groups.
  • the plurality of light spots 701 include a plurality of different speckle array groups, such as a speckle array group 602, a speckle array group 604, and a speckle array group 606.
  • the plurality of light spots include a plurality of speckles with different shapes.
  • the plurality of speckles with different shapes are randomly arranged.
  • the plurality of light spots 702 include a plurality of speckles with different shapes.
  • FIG. 5, FIG. 6A, FIG. 6B, FIG. 6C, or FIG. 7A only give an example schematic diagram of multiple light spots in infrared light by way of example.
  • the number of speckle arrays among the plurality of light spots in infrared light is not limited to the number of speckle arrays shown in FIG. 5, FIG. 6A, FIG. 6B, FIG. 6C, or FIG. 7A.
  • the number of speckles in the infrared light emitted by the infrared emitter 101 is about 3,000.
  • the greater the number of speckles in the infrared light emitted by the infrared projector 101, the more features the infrared light adds to the target object when it irradiates the target object, and the more this helps the electronic device 100 calculate the depth information of the target object.
  • for example, compared with the case in which the number of speckles is about 3 thousand, the depth information of the target object calculated by the electronic device 100 when the infrared light includes more speckles is more accurate.
  • the repetition frequency of the speckle array can be characterized by the number of the same speckle array in the preset area. The greater the number of the same speckle array group in the preset area, the higher the repetition frequency of the speckle array, and the smaller the repetition period of the speckle array. In this way, it is easy to cause a feature matching error when the electronic device 100 recognizes the same feature in the image information collected by the first camera 102 and the image information collected by the second camera 103.
  • the feature matching in the embodiment of the present application refers to: identifying the same feature in the image information collected by the first camera 102 and the image information collected by the second camera 103.
  • the repetitive period of the speckle pattern is reflected on the image as the shortest distance between two identical speckle patterns.
  • the distance between two speckle arrays may be the distance between the center points of the two speckle arrays.
  • the plurality of light spots 703 include a plurality of repeated speckle array groups, such as a speckle array group 704, a speckle array group 705, and a speckle array group 706.
  • the speckle array group 704, the speckle array group 705, and the speckle array group 706 are the same.
  • the point K1 is the center point of the speckle pattern 707 in the speckle pattern group 704.
  • the point K2 is the center point of the speckle array 708 in the speckle array group 705.
  • the point K3 is the center point of the speckle pattern 709 in the speckle pattern group 706.
  • the repetition period of the speckle pattern shown in FIG. 7C can be characterized by the shorter of the distance between point K1 and point K2 and the distance between point K1 and point K3. Since the distance between point K1 and point K2 is shorter than the distance between point K1 and point K3, the repetition period of the speckle pattern shown in FIG. 7C can be characterized by the distance between point K1 and point K2.
  • the smaller the repetition frequency of the speckle array, the lower the repeatability of the texture features formed by the infrared light irradiating the target object, and the more accurate the depth information of the target object calculated by the electronic device 100.
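  • A minimal sketch of characterizing the repetition period as the shortest distance between the center points of identical speckle arrays (for example, between point K1 and point K2 in FIG. 7C) is given below; the coordinates used are assumptions for illustration only.

```python
import math

# Hypothetical sketch: characterize the repetition period of the speckle array as the
# shortest distance between the center points of identical speckle arrays (for example,
# the distance between point K1 and point K2 in FIG. 7C). The coordinates are assumed.
def repetition_period(centers):
    """centers: list of (x, y) center points of identical speckle arrays."""
    shortest = float("inf")
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            dx = centers[i][0] - centers[j][0]
            dy = centers[i][1] - centers[j][1]
            shortest = min(shortest, math.hypot(dx, dy))
    return shortest

print(repetition_period([(0, 0), (3, 4), (6, 8)]))  # -> 5.0 (shortest center-to-center distance)
```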
  • the electronic device 100 calculates the depth information of the target object based on the principle of triangulation, using the formula Z = (f × T) / d. That is, the depth information of the target object is calculated based on the distance between the first camera 102 and the second camera 103 (that is, the first length T), the lens focal length f of the first camera 102 and the second camera 103, and the parallax d.
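  • A minimal sketch of this triangulation calculation is given below; the function name, units, and numeric values are illustrative assumptions, not parameters of this embodiment.

```python
def depth_from_disparity(f_mm: float, t_mm: float, d_mm: float) -> float:
    """Depth Z = (f * T) / d, following the triangulation relationship described above.
    f_mm: lens focal length; t_mm: first length T (camera baseline); d_mm: parallax.
    All quantities are assumed to be in consistent units (millimetres here)."""
    if d_mm == 0:
        raise ValueError("zero parallax corresponds to a point at infinity")
    return (f_mm * t_mm) / d_mm

# Illustrative values only: T = 29.5 mm, with an assumed focal length and parallax.
print(depth_from_disparity(f_mm=2.0, t_mm=29.5, d_mm=0.118))  # -> 500.0 (mm)
```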
  • the coverage of the field of view (FOV) of the infrared projector 101, the coverage of the FOV of the first camera 102, and the coverage of the FOV of the second camera 103 partially overlap or all overlap.
  • the size of FOV can characterize the field of view of optical instruments (such as cameras).
  • for the infrared projector 101, the first camera 102, and the second camera 103, the larger the overlapping area of the FOV coverage of the three, the more texture features the first camera 102 and the second camera 103 collect.
  • the infrared projector 101 may be disposed between the first camera 102 and the second camera 103.
  • the distance between the first camera 102 and the second camera 103 also affects the size of the overlapping area.
  • the larger the first length, the smaller the overlapping area of the FOV coverage of the first camera 102 and the second camera 103. However, if the first length is too small, the parallax of the first camera 102 and the second camera 103 for each feature on the target object will also be very small, close to zero. Therefore, the above-mentioned first length cannot be too large or too small; if it is, the accuracy of the depth information calculated by the electronic device 100 will be affected.
  • the first length T may be any length from 20 mm to 30 mm.
  • the first length T may be 29.5 mm. It should be noted that how to set the distance between the two cameras may be affected by the camera parameters. Therefore, the above-mentioned first length T of 20 mm to 30 mm is only an example.
  • since the first camera 102 and the second camera 103 are required to collect image information including the light spot, the first camera 102 and the second camera 103 are required to be able to receive infrared light.
  • the first camera 102 and the second camera 103 may be used to receive infrared light of 890 nm to 990 nm, such as infrared light of 940 nm.
  • the first camera 102 and the second camera 103 can sense infrared light (that is, receive infrared light).
  • the ordinary RGB camera can only sense visible light, but not infrared light; and the cost of configuring an infrared camera with infrared light sensing function in the electronic device 100 is high, and the use of the infrared camera will increase the power consumption of the electronic device 100 .
  • the embodiments of the present application can improve the ordinary RGB camera to obtain the first camera 102 and the second camera 103 that can sense infrared light.
  • FIG. 8 shows a schematic structural diagram of a general RGB camera module provided by an embodiment of the present application.
  • the rolling shutter camera is used as an example in the embodiment of the present application to describe the manner in which the ordinary RGB camera is improved in the embodiment of the present application to obtain the first camera 102 and the second camera 103:
  • the RGB camera module 800 may include a 3P lens, a filter 804, and a sensor 805.
  • the 3P lens means that the RGB camera includes three lenses: a lens 801, a lens 802, and a lens 803.
  • the sensor 805 may be a 2M sensor. 2M means that the highest resolution of RGB camera imaging can reach 2 million pixels.
  • the following improvements can be made to the RGB camera module 800 shown in (a) of FIG. 8: an anti-reflection coating is applied to each surface of each lens (such as lens 801, lens 802, and lens 803) of the RGB camera module 800 shown in (a) of FIG. 8 to obtain the lens 801, lens 802, and lens 803 shown in (b) of FIG. 8; and a cut-off coating is applied to the filter 804 to obtain the filter 804 shown in (b) of FIG. 8.
  • the sensor 805 in the RGB camera module 800 may not be improved.
  • the cut-off plating layer may be plated on both sides of the filter 804; or, the cut-off plating layer may be plated on one side of the filter 804.
  • the anti-reflection coating may be an anti-reflection coating corresponding to the wavelength of infrared light emitted by the infrared projector 101.
  • the antireflection film may be an antireflection film for infrared light of 890 nm to 990 nm, such as an antireflection film for infrared light of 940 nm.
  • coating both surfaces of each lens (such as lens 801, lens 802, and lens 803) with the above-mentioned anti-reflection coating can improve the ability of lens 801, lens 802, and lens 803 to sense the infrared light emitted by the infrared projector 101, so that lens 801, lens 802, and lens 803 can receive as much of the infrared light emitted by the infrared projector 101 as possible.
  • the above-mentioned cut-off coating can be used to filter out light other than the infrared light emitted by the infrared projector 101 and visible light, and to increase the transmittance of infrared light.
  • the above cut-off plating layer can be used to filter out infrared light with a wavelength of 850 nm. It can be understood that due to the obvious red light characteristics of infrared light with a wavelength of 850nm, serious red exposure may occur; therefore, filtering the infrared light with a wavelength of 850nm through the above-mentioned cut-off coating layer can reduce the possibility of red exposure .
  • the phenomenon of red exposure refers to a problem of incomplete light filtering. For example, when only infrared light is wanted for illumination, a filter can be added to the light source to filter out light other than infrared light. At this time, because the filtering is not clean, a small amount of red light may still be seen by the human eye. This phenomenon is called red exposure.
  • the general RGB cameras improved in the embodiments of the present application include but are not limited to the rolling shutter cameras described above.
  • the reason why the rolling shutter camera is improved to obtain the above-mentioned first camera 102 and second camera 103 is that the exposure of the rolling shutter camera is performed line by line and its cost is lower.
  • improving the rolling shutter camera to obtain the first camera 102 and the second camera 103 can further reduce costs.
  • the electronic device 100 calculates the depth information of the target object according to the first image information and the second image information, the first length, and the lens focal length.
  • the depth Z of each feature point of the target object is inversely proportional to the parallax d of the first camera 102 and the second camera 103 at that point; the depth Z of each feature point is proportional to the lens focal length f; and the depth Z of each feature point is proportional to the first length T.
  • the first length T and the lens focal length f are hardware parameters of the first camera 102 and the second camera 103.
  • the first length T and the lens focal length f are fixed.
  • the size of the depth Z of each point of the target object depends on the size of the parallax d of the first camera 102 and the second camera 103 with respect to the point.
  • the first camera 102 is a dual-pass camera on the left
  • the second camera 103 is a dual-pass camera on the right.
  • the smiley face, triangle, cylinder, and crescent respectively represent different features on the target object, and the distance between the location of the smiley face, triangle, cylinder, and crescent and the camera gradually becomes farther away.
  • O L is the position of the first camera 102
  • O R is the position of the second camera 103
  • the distance between O L and O R is T (that is, the first length).
  • the first camera 102 acquires the image A
  • the second camera 103 acquires the image B.
  • the coordinate of the crescent on the x-axis of the coordinate system with O_l (the upper left corner of image A) as the origin (referred to as coordinate system L) is x_L1
  • the coordinate of the cylinder on the x-axis of coordinate system L is x_L2
  • the coordinate of the triangle on the x-axis of coordinate system L is x_L3
  • the coordinate of the smiling face on the x-axis of coordinate system L is x_L4
  • the coordinate of the crescent on the x-axis of the coordinate system with O_r (the upper left corner of image B) as the origin (referred to as coordinate system R) is x_R1
  • the coordinate of the cylinder on the x-axis of coordinate system R is x_R2
  • the coordinate of the triangle on the x-axis of coordinate system R is x_R3
  • the coordinate of the smiling face on the x-axis of coordinate system R is x_R4
  • the coordinate of the smiling face collected by the first camera 102 in the coordinate system of image A is (x_L4, y), and the coordinate of the smiling face collected by the second camera 103 in the coordinate system of image B is (x_R4, y)
  • the parallax d4 of the first camera 102 and the second camera 103 for the smiling face is d4 = x_L4 − x_R4
  • the electronic device 100 may first calculate the parallax d of the first camera 102 and the second camera 103 for a feature on the target object, and then calculate, according to the parallax d of the feature, the first length T, and the lens focal length f, the depth Z of the point where the feature is located; the depth information of the target object is then obtained from the depths of multiple points.
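  • Building on the relationship Z = (f × T) / d, a minimal sketch of this per-feature procedure might look as follows; the matched-feature representation is an assumption for illustration only.

```python
# Hypothetical sketch of the per-feature procedure: compute the parallax d of each matched
# feature from its x-coordinates in the two images, then compute its depth Z = (f * T) / d.
# The matched-feature representation ((xL, y), (xR, y)) is an assumption for illustration.
def depths_of_features(matched_features, f_mm: float, t_mm: float):
    depths = []
    for (x_l, _y_l), (x_r, _y_r) in matched_features:
        d = x_l - x_r                     # parallax of the two cameras for this feature
        if d == 0:
            continue                      # zero parallax: the point is effectively at infinity
        depths.append((f_mm * t_mm) / d)  # depth of the point where the feature is located
    return depths                         # depths of multiple points -> depth information
```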
  • S203 shown in FIG. 2 may include S203a-S203b:
  • the electronic device 100 calculates the disparity of the first camera 102 and the second camera 103 with respect to a plurality of first features in the first image information and the second image information.
  • the first feature is the same feature in the first image information and the second image information.
  • the infrared light with the light spot emitted by the infrared emitter 101 is irradiated on the target object to increase the texture characteristics of the target object.
  • the target object is the human face 1101 shown in (a) in FIG. 11
  • the infrared emitter 101 emits the infrared light 1102 shown in (b) in FIG. 11 including the light spot shown in FIG. 7C.
  • a human face 1103 irradiated with light spots as shown in (c) of FIG. 11 can be obtained.
  • the difference is that the point of the same feature in the image information of the face 1103 collected by the first camera 102 (that is, the first image information) and in the image information of the face 1103 collected by the second camera 103 (that is, the second image information) is at a different position on the x-axis of the coordinate system; that is, the first camera 102 and the second camera 103 have parallax.
  • the electronic device 100 can recognize the first image information collected by the first camera 102 and the second image information collected by the second camera 103, and determine the same features in the first image information and the second image information.
  • the same features in the first image information and the second image information may include: features of the target object itself and texture features formed on the target object by irradiation of infrared light with light spots. That is to say, in the embodiment of the present application, when the electronic device 100 recognizes the same feature in the first image information and the second image information, it can not only recognize the same feature of the target object in the first image information and the second image information, It is also possible to identify the same feature among the texture features formed on the target object by the infrared light with flare in the first image information and the second image information.
  • when recognizing the same feature in the two pieces of image information, the judgment can be made according to the features of the target object itself, according to the texture features, or according to both. For example, if the judgment can be made based on the features of the target object itself alone or on the texture features alone, there is no need to combine the two. Alternatively, when it is impossible or difficult to determine whether a feature is the same feature based on the features of the target object itself, it can be determined whether it is the same feature based on the texture features combined with the features of the target object itself.
  • when the electronic device 100 recognizes the same feature in the two pieces of image information, the determination can also be made according to the features of the target object itself.
  • although the shapes of the speckles in the above speckle array may be the same (for example, the speckle array includes a plurality of dots), the position of each speckle in the speckle array is different. Therefore, the electronic device 100 can recognize the same feature represented by speckles of the same shape in the first image information and the second image information according to the speckle array and the position of a speckle in the speckle array.
  • the plurality of first features include all the same features in the first image information and the second image information.
  • the electronic device 100 may recognize all the same features in the first image information and the second image information, and then execute S203b to calculate the depth of each feature to obtain the depth information of the target object.
  • the above-mentioned multiple first features are some of the same features in the first image information and the second image information.
  • the electronic device 100 may select a plurality of first features from the first image information according to a preset feature frequency, and then search the second image information for features that are the same as the plurality of first features; finally, for each first feature, S203b is executed to calculate its depth, so as to obtain the depth information of the target object.
  • the electronic device 100 may also select part of the first features from the first image information randomly or at intervals.
  • the above feature frequency may be the number of the same two first features appearing in the preset area.
  • the above feature frequency reflects the distance between two adjacent first features that can be selected for the electronic device 100 on the image (referred to as the feature distance).
  • the electronic device 100 may select multiple first features from the first image information according to a preset feature frequency.
  • the method may include: the electronic device 100 selects one first feature at every feature distance from all the features in the first image information.
  • the electronic device 100 does not need to calculate the depth of the point where each of the same features in the first image information and the second image information is located, but selects a feature at every feature distance and calculates the depth of the point where the selected feature is located .
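  • A minimal sketch of selecting a first feature at every feature distance, instead of using every matched feature, might look as follows; the feature representation and the distance measure are assumptions for illustration only.

```python
# Hypothetical sketch of selecting a first feature at every "feature distance" instead of
# using all matched features. The feature representation and the distance measure are assumed.
def select_first_features(features, feature_distance: float):
    """features: (x, y) positions of candidate features in the first image information,
    assumed to be ordered along the image; keep roughly one feature per feature_distance."""
    selected = []
    last = None
    for x, y in features:
        if last is None or abs(x - last[0]) + abs(y - last[1]) >= feature_distance:
            selected.append((x, y))
            last = (x, y)
    return selected
```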
  • the texture features formed on the target object by the infrared light with light spots in the first image information and the second image information are taken as an example to describe the above-mentioned feature distance.
  • the above characteristic distance may be the distance between speckle 1201 and speckle 1202, or the distance between speckle 1202 and speckle 1204, or the distance between speckle 1204 and speckle 1203, or The distance between speckle 1201 and speckle 1203.
  • the embodiment of the present application marks some speckles in black to show part of the first features among the texture features formed by the infrared light with light spots irradiating the target object. That is, the speckles marked in black shown in FIG. 12 are part of the first features.
  • the shapes of the speckles in some speckle arrays may be the same (for example, the speckle array includes multiple dots).
  • the electronic device 100 can distinguish different speckles according to the position of a speckle in the speckle array; however, the electronic device 100 needs a longer time to distinguish different speckles in this way, which wastes the power consumption of the electronic device 100.
  • the feature distance adopted by the electronic device 100 when selecting the first feature may be less than or equal to the repetition period of the speckle array in the multiple light spots. That is, the above characteristic frequency is greater than or equal to the repetition frequency of the speckle array among the plurality of light spots.
  • in this way, two adjacent first features selected by the electronic device 100 from the first image information correspond to speckles in different speckle arrays, which helps the electronic device 100 distinguish the two adjacent first features, reduces the possibility of feature matching errors, and improves the accuracy of the depth information calculated by the electronic device 100.
  • for each first feature, the electronic device 100 calculates the depth of the point where the first feature is located based on the parallax d of the first camera 102 and the second camera 103 with respect to the first feature, the first length T, and the lens focal length f, so as to obtain the depth information of the target object.
  • the target object is the face 1301 shown in FIG. 13.
  • the electronic device 100 can calculate the depths of multiple points shown in FIG. 13 to obtain depth information 1302 of the target object 1301.
  • An embodiment of the present application provides a method for acquiring depth information.
  • the electronic device 100 can emit infrared light with a light spot through the infrared projector 101.
  • the images of the target object collected by the first camera 102 and the second camera 103 may include not only the characteristics of the target object, but also texture features formed by illuminating the target object with infrared light with light spots. That is, the characteristics of the image of the target object collected by the first camera 102 and the second camera 103 can be increased.
  • after the features of the image of the target object collected by the first camera 102 and the second camera 103 are increased, the electronic device 100 can more accurately recognize the same features in the image information collected by the first camera 102 and the image information collected by the second camera 103, then determine the parallax of the first camera 102 and the second camera 103 for the same features, and then calculate the depth of each feature point to obtain the depth information of the target object, which can improve the accuracy with which the electronic device 100 calculates the depth information of the target object.
  • the electronic device 100 includes the first camera 102 and the second camera 103 described above.
  • the first camera 102 and the second camera 103 can sense visible light.
  • the first camera 102 or the second camera 103 may serve as a main camera.
  • the electronic device 100 may display image information under visible light collected by the main camera on the display screen of the electronic device 100.
  • the first camera 102 and the second camera 103 can also sense infrared light to cooperate with the infrared projector 101 to calculate the depth information of the target object by the electronic device 100.
  • the image information of the target object collected by the electronic device 100 may appear reddish.
  • the image information collected by the main camera (the first camera 102 or the second camera 103) displayed on the display screen of the electronic device 100 may appear reddish, which affects the user's visual experience.
  • the electronic device 100 may include not only the first camera 102 and the second camera 103, but also a third camera 104.
  • the third camera 104 is an RGB camera (that is, an ordinary RGB camera).
  • the third camera 104 is used to collect image information under visible light.
  • the third camera 104 may be disposed between the first camera 102 and the second camera 103, or may be disposed at another location.
  • the first camera 102 and the second camera 103 can sense not only visible light but also infrared light; that is, the first camera 102 and the second camera 103 can not only collect image information under infrared light, but also use It is used to collect image information under visible light; however, the image information collected by the first camera 102 and the second camera 103 is only used to calculate the depth information of the target object.
  • the third camera 104 is used to collect image information under visible light.
  • the image information collected by the third camera 104 is used to be displayed on the display screen of the electronic device 100. In this way, the problem that the image information displayed on the display screen of the electronic device 100 may appear reddish can be avoided, ensuring that the user's visual experience of capturing images is not affected.
  • the lens focal length of the first camera and the lens focal length of the second camera are the same as an example for description. In other embodiments, the lens focal length of the first camera and the lens focal length of the second camera may also be different.
  • in this case, the aforementioned formula for calculating the depth Z is modified accordingly when calculating the depth.
  • the specific calculation formula can use the formula in the prior art.
  • the image information collected by the third camera 104 may be displayed on the electronic device 100 when the electronic device 100 enters a two-dimensional face image for face recognition.
  • the image information collected by the first camera 102 and the second camera 103 is used to calculate the depth information of the target object when the electronic device 100 performs face recognition. That is, the face recognition process of the electronic device 100 may include: recognition of two-dimensional face information and authentication of face depth.
  • the recognition of two-dimensional face information means that the electronic device 100 determines whether the two-dimensional image of the target object collected by the electronic device 100 matches the two-dimensional image of the face stored in the electronic device 100.
  • Face depth authentication refers to: the electronic device 100 determines whether the depth information of the target object has the depth characteristics of the real face; if the depth information of the target object has the depth characteristics of the real face, it means that the target object is a real face ; If the depth information of the target object does not have the depth feature of a real face, it means that the target object is not a real face.
  • the target object may be a photo including a two-dimensional image matching the two-dimensional image of the face pre-stored in the electronic device. If the depth information of the target object does not have the depth characteristics of a real face, even if the two-dimensional image of the target object matches the two-dimensional image of the face stored in the electronic device 100, the face recognition of the target object will not pass.
  • in this way, information in the electronic device 100 can be prevented from being leaked and property loss to the user can be avoided, which protects the information security of the electronic device 100 and improves the safety performance of the electronic device 100.
  • the electronic device 100 may combine the depth information of the target object, calculated based on the image information collected by the first camera 102 and the second camera 103, with the image information of the target object collected by the third camera 104 to build a real face model of the target object, thereby enhancing the realism of AR scenes and 3D modeling scenes.
  • the electronic device 100 may perform the above method to calculate the depth of each point in the image collected by the electronic device 100, blur the points whose depth is greater than a preset value, and highlight the points whose depth is less than the preset value, thereby achieving a large-aperture effect.
  • when the electronic device 100 collects an image including a face, the depth of each point on the face is smaller than the depth of each point in the background (the background is behind the face and farther from the camera). Therefore, in a large-aperture scene, the face image can be highlighted and the background image blurred.
  • the above-mentioned first operation may include a user's opening operation of the "camera” application and a user's selection operation of the "large aperture” mode in the "camera” application.
  • when the electronic device 100 receives the above-mentioned first operation, it means that the electronic device 100 is to capture an image in the large-aperture scene (i.e., execute the first event).
  • the electronic device 100 may perform the above-described method of acquiring depth information, acquire the depth information of the target object, and then realize the effect of a large aperture according to the depth information of the target object.
  • the image collected by the third camera 104 is displayed on the display screen of the electronic device 100, and the electronic device 100 can apply large-aperture processing, according to the acquired depth information, to both the displayed photo preview screen and the final captured image; that is, in both the preview picture and the captured image, a nearby target object can be highlighted while other objects are blurred.
  • the user's opening operation on the "camera” application may be a user's clicking operation on the "camera” application icon.
  • the electronic device 100 may display the shooting interface 1401 shown in FIG. 14 in response to the user's opening operation on the “camera” application.
  • the shooting interface 1401 includes a viewfinder frame 1402, a camera conversion key 1404, a shooting key 1403, and an album key 1405.
  • the viewfinder frame 1402 is used to display the preview image captured by the rear camera or the front camera of the electronic device 100; the camera conversion key 1404 is used to trigger the electronic device 100 to switch between the front camera and the rear camera for capturing images; the shooting key 1403 is used to control the electronic device 100 to save the preview image, captured by the rear camera or the front camera, that is displayed in the viewfinder frame 1402; the album key 1405 is used to view the images saved in the electronic device 100.
  • when the default camera of the electronic device 100 is the rear camera, a preview image captured by the rear camera, as shown in FIG. 14, may be displayed in the viewfinder frame 1402.
  • the preview image is collected by the third camera 104.
  • the above-mentioned shooting interface 1401 may further include options of the shooting mode of the electronic device 100.
  • the shooting modes of the electronic device 100 may include: a rear camera shooting mode, a selfie mode, a video mode, a large aperture mode, a panorama mode, and the like.
  • the shooting interface 1401 may further include a "large aperture” option 1406, a "photograph” option, a "video” option, and the like.
  • the shooting modes in the embodiments of the present application include, but are not limited to, the rear camera shooting mode, the self-timer mode, the panoramic mode, the large aperture mode, and the video mode. Other shooting modes are not described here in the embodiments of the present application.
  • the user's selection operation of the "large aperture” mode in the “camera” application may be a user's click operation on the "large aperture” option 1406 shown in FIG. 14 (such as a click operation, a double-click operation, or a long press operation, etc.).
  • alternatively, the user's selection operation of the "large aperture" mode in the "camera" application may be a preset gesture (such as an L-shaped gesture) input by the user on the shooting interface 1401 shown in FIG. 14; the preset gesture is used to trigger the electronic device 100 to capture images in the large-aperture mode.
  • the method of the embodiment of the present application may further include: in response to the first event described above, the electronic device 100 collects a two-dimensional image of the target object through the third camera 104; the electronic device 100 displays, on the display screen, the two-dimensional image collected by the third camera 104 (referred to as the RGB image of the target object); and the electronic device 100 recognizes the face image (referred to as the RGB image of the human face) in the two-dimensional image collected by the third camera 104.
  • after the electronic device 100 calculates the depth information of the target object, the method of the embodiment of the present application may further include: the electronic device 100 recognizes the depth information of the human face (referred to as the depth map of the human face) from the depth information of the target object (referred to as the depth map of the target object); the electronic device 100 maps the RGB image of the target object to the depth map of the target object according to the RGB image of the human face and the depth map of the human face, finds the coordinate area in the RGB image of the target object that corresponds to the face depth image in the depth map (that is, the coordinate area of the face region in the RGB image), and then blurs the parts of the target object's RGB image other than this coordinate area. In this way, the effect of background blur can be achieved.
  • the electronic device includes a hardware structure and / or a software module corresponding to each function.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is executed by hardware or computer software driven hardware depends on the specific application and design constraints of the technical solution. Professional technicians can use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the embodiments of the present application.
  • the above electronic devices may be divided into function modules according to the above method examples.
  • each function module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules may be implemented in the form of hardware or software function modules. It should be noted that the division of the modules in the embodiments of the present application is schematic, and is only a division of logical functions. In actual implementation, there may be another division manner.
  • FIG. 15 shows a possible structural schematic diagram of the electronic device 1500 involved in the foregoing embodiment.
  • the electronic device 1500 may include: a processing module 1501, a display module 1502, an infrared emission module 1503, a dual-pass acquisition module 1504, a dual-pass acquisition module 1505, and an RGB acquisition module 1506.
  • the electronic device 1500 may further include a communication module, and the communication module includes a Bluetooth module and a Wi-Fi module.
  • the processing module 1501 is used to control and manage the operation of the electronic device 1500.
  • the RGB collection module 1506 is used to collect images of target objects under visible light.
  • the display module 1502 is used to display the image generated by the processing module 1501 and the image collected by the RGB collection module 1506.
  • the infrared emitting module 1503 is used to emit infrared light with a light spot.
  • the dual-pass acquisition module 1504 and the dual-pass acquisition module 1505 are used to acquire images of the target object under visible light and images of the target object under infrared light.
  • the communication module is used to support communication between the electronic device 1500 and other devices.
  • the processing module 1501 is also used to calculate the depth information of the target object according to the images collected by the dual-pass collection module 1504 and the dual-pass collection module 1505.
  • the foregoing processing module 1501 may be used to support the electronic device 1500 to execute S201, S203, S203a, S203b in the foregoing method embodiments, and / or other processes used in the technology described herein.
  • the above-mentioned display module 1502 may be used to support the electronic device 1500 to perform the operation of “displaying the image collected by the RGB collection module 1506” in the above method embodiments, and / or other processes used in the technology described herein.
  • the infrared emission module 1503 may be used to support the electronic device 1500 to perform the operation of “emitting infrared light with a light spot” in S202 in the above method embodiment, and / or other processes used in the technology described herein.
  • the dual-pass acquisition module 1504 may be used to support the electronic device 1500 to perform the operation of “acquiring first image information” in S202 in the foregoing method embodiment, and / or other processes used in the technology described herein.
  • the dual-pass acquisition module 1505 may be used to support the electronic device 1500 to perform the operation of “collecting second image information” in S202 in the foregoing method embodiment, and / or other processes used in the technology described herein.
  • the RGB collection module 1506 may be used to support the electronic device 1500 to collect image information under visible light, and / or for other processes of the technology described herein.
  • the unit modules in the electronic device 1500 include but are not limited to the processing module 1501, the display module 1502, the infrared emission module 1503, the dual-pass acquisition module 1504, the dual-pass acquisition module 1505, the RGB acquisition module 1506, and the like.
  • the electronic device 1500 may further include a storage module.
  • the storage module is used to store the program code and data of the electronic device 1500.
  • the processing module 1501 may be a processor or a controller, such as a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (Application-Specific Integrated Circuit, ASIC) ), Field Programmable Gate Array (Field Programmable Gate Array, FPGA) or other programmable logic devices, transistor logic devices, hardware components or any combination thereof.
  • the processor may include an application processor and a baseband processor. It can implement or execute various exemplary logical blocks, modules, and circuits described in conjunction with the disclosure of the present application.
  • the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of DSP and microprocessor, and so on.
  • the storage module may be a memory.
  • the processing module 1501 may be one or more processors (such as the processor 110 shown in FIG. 1D), and the communication module may include a wireless communication module (such as the wireless communication module 160 shown in FIG. 1D, which includes BT (i.e., the Bluetooth module) and WLAN (such as the Wi-Fi module)).
  • the wireless communication module may be referred to as a communication interface.
  • the storage module may be a memory (internal memory 121 shown in FIG. 1D).
  • the display module 1502 may be a display screen (such as the display screen 194 shown in FIG. 1D).
  • the infrared emitting module 1503 may be an infrared projector (the infrared projector 196 shown in FIG. 1D, that is, the infrared projector 101 in the foregoing embodiment).
  • the two dual-pass acquisition modules 1504 and 1505 may be two dual-pass cameras (such as the dual-pass camera 193B shown in FIG. 1D (that is, the first camera 102 in the foregoing embodiment) and the dual-pass camera 193C (that is, the second camera 103 in the foregoing embodiment)).
  • the RGB collection module 1506 may be an RGB camera among the 1-N other cameras 193A shown in FIG. 1D, that is, the third camera 104 in the foregoing embodiment.
  • the two dual-pass acquisition modules 1504 and the RGB acquisition module 1506 are disposed on the same side of the electronic device 100, such as the front or back.
  • the electronic device 1500 provided by the embodiment of the present application may be the electronic device 100 shown in FIG. 1D.
  • the one or more processors, memory, infrared projector, first camera, second camera, display screen and third camera, etc. may be connected together, for example, via a bus.
  • An embodiment of the present application further provides a computer storage medium that stores computer program code. When the processor executes the computer program code, the electronic device 1500 performs the relevant method steps shown in FIG. 2 or FIG. 9 to implement the method in the foregoing embodiments.
  • An embodiment of the present application also provides a computer program product, which, when the computer program product runs on a computer, causes the computer to execute the relevant method steps in any of the drawings in FIG. 2 or FIG. 9 to implement the method in the above embodiment.
  • the electronic device 1500, the computer storage medium, and the computer program product provided in the embodiments of the present application are all used to perform the corresponding methods provided above; therefore, for the beneficial effects that they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the modules or units is only a division of logical functions.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or software function unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the foregoing storage media include various media that can store program codes, such as a U disk, a mobile hard disk, a ROM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of this application provide a method for acquiring depth information and an electronic device, which are applied to the field of image processing technologies and can improve the accuracy of the depth information of a target object acquired by the electronic device. The specific solution includes: the electronic device includes a first camera, a second camera and an infrared projector; the electronic device receives a first instruction used to trigger the electronic device to acquire depth information of a target object; in response to the first instruction, the electronic device emits infrared light with a light spot through the infrared projector, collects first image information of the target object through the first camera, and collects second image information of the target object through the second camera, where the first image information and the second image information include features of the target object and texture features formed by the infrared light with the light spot illuminating the target object; and the electronic device calculates the depth information of the target object according to the first image information, the second image information, a first length, the lens focal length of the first camera, and the lens focal length of the second camera.

Description

一种获取深度信息的方法及电子设备 技术领域
本申请实施例涉及图像处理技术领域,尤其涉及一种获取深度信息的方法及电子设备。
背景技术
随着电子技术的发展,电子设备的功能越来越多。例如,电子设备可以包括人脸识别、增强现实(augmented reality,AR)等功能。以人脸识别为例,电子设备在进行人脸识别时,不仅可以采集目标对象(如人脸)的二维图像,还可以获取目标对象的深度信息。
具体的,电子设备可以通过双目摄像头(即两个摄像头,如摄像头1和摄像头2)分别采集目标对象的图像信息;然后,识别出摄像头1采集的图像信息和摄像头2采集的图像信息中相同的特征,并计算摄像头1和摄像头2对相同的特征的视差,然后根据视差、摄像头1和摄像头2的硬件参数,计算该点的深度(即该点到上述两个摄像头连线的垂直距离)。其中,目标对象的多个点的深度可以组成该目标对象的深度信息。目标对象的深度信息可以表征目标对象的三维特征。
由此可见,上述摄像头1和摄像头2采集的图像信息中的特征越多,电子设备100识别到摄像头1采集的图像信息和摄像头2采集的图像信息中相同的特征则越多。电子设备识别到的相同的特征越多,该电子设备则可以计算得到越多点的深度,目标对象的深度信息则越准确。
其中,摄像头采集到的图像信息中包括的特征的多少,至少会受到以下两个参数的影响:(1)目标对象本身的特征是否明显。例如,如果目标对象棱角分明(即目标对象的特征比较明显),那么摄像头采集到目标对象的图像信息则可以包括较多的特征;如果目标对象比较平滑,即目标对象(如白墙)的特征不明显,那么摄像头采集到目标对象的图像信息中包括的特征较少。(2)光线的强弱。例如,光线较强时,摄像头采集到的图像信息中包括较多的特征;光线较差时,摄像头采集到的图像信息中包括特征则较少。
综上所述,双目摄像头采集的图像信息中的特征越多,电子设备计算得到的目标对象的深度信息越准确;而双目摄像头采集的图像信息中的特征的数量会受到多种因素的影响。换言之,深度信息准确度会受到多种因素的影响,电子设备很难获取到准确度较高的深度信息。
发明内容
本申请实施例提供一种获取深度信息的方法及电子设备,可以提高电子设备获取的目标对象的深度信息的准确度。
第一方面,本申请实施例提供一种获取深度信息的方法,可以应用于电子设备,该 电子设备包括红外投射器、第一摄像头和第二摄像头。该第一摄像头和第二摄像头之间的距离为第一长度。该方法可以包括:电子设备接收用于触发电子设备获取目标对象的深度信息的第一指令;响应于第一指令,通过红外投射器发射带光斑的红外光,通过第一摄像头采集目标对象的第一图像信息,通过第二摄像头采集目标对象的第二图像信息;根据第一图像信息、第二图像信息、第一长度、以及第一摄像头的镜头焦距和第二摄像头的镜头焦距,计算目标对象的深度信息。
其中,上述第一摄像头采集的图像信息和第二摄像头采集的图像信息中的特征越多越明显,电子设备识别到第一图像信息和第二图像信息中相同的特征则越多。电子设备识别到的相同的特征越多,该电子设备则可以计算得到越多特征所在点的深度。由于该目标对象的深度信息由目标对象的多个点的深度组成;因此,电子设备计算得到的点的深度越多,目标对象的深度信息则越准确。
本身实施例中,上述第一图像信息和第二图像信息包括目标对象的特征和带光斑的红外光照射在目标对象上形成的纹理特征。即上述第一图像信息和第二图像信息中不仅包括目标对象本身的特征,还包括带光斑的红外光照射在目标对象上形成的纹理特征。
其中,增加第一摄像头和第二摄像头采集的目标对象的图像的特征后,电子设备便可以更加准确识别出第一摄像头采集的图像信息和第二摄像头采集的图像信息中相同的特征,进而确定出第一摄像头和第二摄像头对该相同的特征的视差,从而计算出每个点的深度,得到目标对象的深度信息,可以提高电子设备计算目标对象的深度信息的准确度。
结合第一方面,在一种可能的设计方式中,上述第一摄像头与第二摄像头的镜头焦距相同。
结合第一方面,在另一种可能的设计方式中,上述电子设备根据第一图像信息、第二图像信息、第一长度、以及第一摄像头的镜头焦距和第二摄像头的镜头焦距,计算目标对象的深度信息的方法可以包括:电子设备计算第一摄像头与第二摄像头对第一图像信息和第二图像信息中的多个第一特征的视差;该第一特征是第一图像信息和第二图像信息中相同的特征;电子设备针对每个第一特征,根据第一摄像头与第二摄像头对所述第一特征的视差、第一长度和镜头焦距,采用以下公式(2)计算第一特征所在点的深度Z,得到目标对象的深度信息:
$$ Z = \frac{f \times T}{d} \qquad (2) $$
其中,f为镜头焦距,d为第一摄像头与第二摄像头对第一特征的视差,T为第一长度。
可以理解,目标对象上的各个特征(如人的鼻尖和眼睛)与摄像头之间的距离可能不同。目标对象上每个特征与摄像头之间的距离称为该特征(或者该特征所在点)的深度。目标对象上的各个点的深度组成该目标对象的深度信息。目标对象的深度信息可以表征目标对象的三维特征。
结合第一方面,在另一种可能的设计方式中,上述多个第一特征是第一图像信息和第二图像信息中相同的特征中的部分特征。电子设备100可以按照预设的特征频率,从第一图像信息中选择出多个第一特征,然后查找第二图像信息中与该多个第一特征相同 的特征;最后,针对每一个第一特征,计算其深度,得到目标对象的深度信息。或者,电子设备也可以随机或间隔从第一图像信息中选择出部分第一特征。
其中,上述特征频率可以为相同的两个第一特征在预设面积中出现的数量。上述特征频率反应在图像上可以为电子设备选择的相邻的两个第一特征之间的距离(称为特征距离)。电子设备可以按照预设的特征频率,从第一图像信息中选择出多个第一特征的方法可以包括:电子设备从第一图像信息中的所有特征中,每间隔一个特征距离选择出一个第一特征。也就是说,电子设备不需要计算第一图像信息和第二图像信息中相同的特征中每一个特征所在点的深度,而是每间隔一个特征距离选择一个特征,计算选择出的特征所在点的深度。
结合第一方面,在另一种可能的设计方式中,上述带光斑的红外光包括多个光斑,多个光斑包括多个散斑点阵组。其中,一个散斑点阵组中包括一个或多个散斑点阵;散斑点阵包括多个散斑。
结合第一方面,在另一种可能的设计方式中,上述多个散斑点阵组中的至少两个散斑点阵组不同。其中,多个散斑点阵组中的至少两个散斑点阵组不同,可以降低多个光斑中散斑点阵组的重复性,有利于电子设备识别出不同的特征。
结合第一方面,在另一种可能的设计方式中,上述第一散斑点阵组为多个散斑点阵组中的任意一个,第一散斑点阵组中包括的多个散斑点阵中的至少两个散斑点阵不同。其中,多个散斑点阵中的至少两个散斑点阵不同,可以降低多个光斑中散斑点阵的重复性,有利于电子设备识别出不同的特征。
结合第一方面,在另一种可能的设计方式中,上述第一散斑点阵为多个散斑点阵中的任意一个。第一散斑点阵中的多个散斑中每个散斑的形状相同。其中,如果多个散斑中每个散斑的形状相同,那么电子设备可以根据散斑在散斑点阵中的位置识别出不同的散斑。
第一散斑点阵中的多个散斑中至少两个散斑的形状不同。其中,多个散斑中至少两个散斑的形状不同,可以降低散斑点阵中多个散斑的重复性,有利于电子设备识别出不同的特征。
结合第一方面,在另一种可能的设计方式中,上述电子设备按照预设的特征频率从第一图像信息和所述第二图像信息中选择多个第一特征。特征频率大于或者等于多个光斑中散斑点阵的重复频率。其中,特征频率通过电子设备从预设面积的图像中选择的相同的第一特征的数量来表征,重复频率通过预设面积中出现同一个散斑点阵的数量来表征。
其中,电子设备在选择上述第一特征时所采用的特征距离可以小于或者等于上述多个光斑中散斑点阵的重复周期。即上述特征频率大于或者等于上述多个光斑中散斑点阵的重复频率。这样,可以尽量保证电子设备从第一图像信息中选择出的相邻的两个第一特征对应不同的散斑点阵中的散斑,有利于电子设备可以区分出该相邻的两个第一特征,可以降低特征匹配错误的可能性,提升电子设备计算得到的深度信息的准确度。
结合第一方面,在另一种可能的设计方式中,上述第一摄像头和第二摄像头的每个透镜的双面均包括增透膜,第一摄像头和第二摄像头的滤光片包括截止镀层。其中,增透膜用于增加红外光的透过率;截止镀层用于滤去红外光和可见光之外的其他光线,并 增加红外光的透过率。
其中,普通的红绿蓝(red green blue,RGB)摄像头只可以感知可见光,不能感知红外光。为了降低电子设备的硬件成本,可以在RGB摄像头的每个透镜的双面镀增透膜,在RGB摄像头的滤光片(双面或者单面)上镀截止镀层,得到每个透镜的双面均包括增透膜且滤光片包括截止镀层的双通摄像头(即上述第一摄像头和第二摄像头)。
结合第一方面,在另一种可能的设计方式中,上述红外光为890纳米(nm)~990nm的红外光。例如,上述红外光具体可以为940nm的红外光。相应的,上述增透膜可以为890nm~990nm的红外光的增透膜,如940nm的红外光的增透膜。
结合第一方面,在另一种可能的设计方式中,上述第一长度在20毫米~30毫米之间。
结合第一方面,在另一种可能的设计方式中,上述电子设备还包括第三摄像头,第三摄像头是RGB摄像头。第三摄像头用于采集可见光下的图像信息;第三摄像头采集的图像信息用于显示在电子设备的显示屏上。
第二方面,本申请实施例提供一种电子设备,该电子设备包括:一个或多个处理器、存储器、红外投射器、第一摄像头和第二摄像头,第一摄像头和第二摄像头之间的距离为第一长度,存储器、红外投射器、第一摄像头和第二摄像头与处理器耦合,存储器用于存储信息。上述处理器,用于接收第一指令,第一指令用于触发电子设备获取目标对象的深度信息。处理器,还用于响应于第一指令,通过红外投射器发射带光斑的红外光,通过第一摄像头采集目标对象的第一图像信息,通过第二摄像头采集目标对象的第二图像信息,第一图像信息和第二图像信息包括目标对象的特征和带光斑的红外光照射在目标对象上形成的纹理特征。处理器,还用于根据第一摄像头采集的第一图像信息、第二摄像头采集的第二图像信息、第一长度、以及第一摄像头的镜头焦距和第二摄像头的镜头焦距,计算目标对象的深度信息。
结合第二方面,在一种可能的设计方式中,上述第一摄像头与第二摄像头的镜头焦距相同。
结合第二方面,在另一种可能的设计方式中,上述处理器,用于根据第一图像信息、第二图像信息、第一长度、以及第一摄像头的镜头焦距和第二摄像头的镜头焦距,计算目标对象的深度信息,包括:处理器,用于:计算第一摄像头与第二摄像头对第一图像信息和第二图像信息中的多个第一特征的视差;第一特征是第一图像信息和第二图像信息中相同的特征;针对每个第一特征,根据第一摄像头与第二摄像头对第一特征的视差、第一长度和镜头焦距,采用上述公式(2)计算第一特征所在点的深度Z,得到目标对象的深度信息。其中,f为镜头焦距,d为第一摄像头与第二摄像头对第一特征的视差,T为第一长度。
结合第二方面,在一种可能的设计方式中,上述处理器,还用于在计算第一摄像头与第二摄像头对第一图像信息和第二图像信息中的多个第一特征的视差之前,从第一图像信息和第二图像信息中选择出多个第一特征。其中,多个第一特征是第一图像信息和第二图像信息中相同的特征中的部分特征。
结合第二方面,在一种可能的设计方式中,上述上述带光斑的红外光包括多个光斑,多个光斑包括多个散斑点阵组。其中,一个散斑点阵组中包括一个或多个散斑点阵;每个散斑点阵包括多个散斑。
需要注意的是,第二方面及其可能的设计方式中的散斑点阵组、散斑点阵、散斑点阵中的多个散斑,以及红外投射器发射的红外光的波长可以参考第一方面的可能的设计方式中的相关描述,这里不予赘述。
结合第二方面,在一种可能的设计方式中,上述处理器,用于按照预设的特征频率从第一图像信息和第二图像信息中选择多个第一特征。特征频率大于或者等于多个光斑中散斑点阵的重复频率。其中,特征频率通过处理器从预设面积的图像中选择的相同的第一特征的数量来表征,重复频率通过预设面积中出现同一个散斑点阵的数量来表征。
结合第二方面,在一种可能的设计方式中,上述第一摄像头和第二摄像头的每个透镜的双面均包括增透膜,第一摄像头和第二摄像头的滤光片包括截止镀层;其中,增透膜用于增加红外光的透过率;截止镀层用于滤去红外光和可见光之外的其他光线,并增加红外光的透过率。
结合第二方面,在一种可能的设计方式中,上述电子设备还包括第三摄像头和显示屏。第三摄像头是RGB摄像头;第三摄像头用于采集可见光下的图像信息;第三摄像头采集的图像信息用于显示在显示屏上。
第三方面,本申请实施例提供一种双通摄像头(即上述第一摄像头或者第二摄像头),该双通摄像头用于接收可见光和红外光。该双通摄像头的每个透镜的双面均包括增透膜,该双通摄像头的滤光片包括截止镀层。其中,增透膜用于增加红外光的透过率;截止镀层用于滤去红外光和可见光之外的其他光线,并增加红外光的透过率。
结合第三方面,在一种可能的设计方式中,上述双通摄像头包括:RGB摄像头、镀在RGB摄像头的每个透镜的双面的增透膜,镀在RGB摄像头的滤光片(双面或者单面)上的镀截止镀层。
第四方面,本申请实施例提供一种计算机存储介质,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如第一方面及其可能的设计方式所述的获取深度信息的方法。
第五方面,本申请实施例提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如第一方面及其可能的设计方式所述的获取深度信息的方法。
可以理解,上述提供的第二方面及其可能的设计方法所述的电子设备、第三方面所述的计算机存储介质,以及第四方面所述的计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
附图说明
图1A为本申请实施例提供的一种电子设备的局部示意图;
图1B为本申请实施例提供的一种计算深度信息的原理示意图一;
图1C为本申请实施例提供的一种双目摄像头采集的两个图像信息中相同的特征的实例示意图;
图1D为本申请实施例提供的一种电子设备的硬件结构示意图;
图2为本申请实施例提供的一种获取深度信息的方法流程图一;
图3为本申请实施例提供的一种图形用户界面实例示意图一;
图4为本申请实施例提供的一种散斑点阵的实例示意图;
图5为本申请实施例提供的一种散斑点阵组的实例示意图一;
图6A为本申请实施例提供的一种散斑点阵组的实例示意图二;
图6B为本申请实施例提供的一种散斑点阵组的实例示意图三;
图6C为本申请实施例提供的一种散斑点阵组的实例示意图四;
图7A为本申请实施例提供的一种散斑点阵组的实例示意图五;
图7B为本申请实施例提供的一种散斑点阵组的实例示意图六;
图7C为本申请实施例提供的一种散斑点阵组的实例示意图七;
图8为本申请实施例提供的一种摄像头模组的结构示意图;
图9为本申请实施例提供的一种获取深度信息的方法流程图二;
图10A为本申请实施例提供的一种计算深度信息的原理示意图二;
图10B为本申请实施例提供的一种计算视差的原理示意图;
图10C为本申请实施例提供的一种视差与深度的关系示意图;
图11为本申请实施例提供的一种目标对象与带光斑的红外光的实例示意图;
图12为本申请实施例提供的一种特征频率的实例示意图;
图13为本申请实施例提供的一种人脸深度信息实例示意图;
图14为本申请实施例提供的一种图形用户界面实例示意图;
图15为本申请实施例提供的一种电子设备的结构组成示意图。
具体实施方式
本申请实施例提供一种获取深度信息的方法,可以应用于电子设备获取目标对象的图像信息的过程中。该图像信息可以包括目标对象的二维图像和目标对象的深度信息。
一般而言,目标对象(如人脸)是具备三维立体形态的物体。电子设备的摄像头拍摄该目标对象时,该目标对象上的各个特征(如人的鼻尖和眼睛)与摄像头之间的距离可能不同。目标对象上每个特征与摄像头之间的距离称为该特征(或者该特征所在点)的深度。目标对象上的各个点的深度组成该目标对象的深度信息。目标对象的深度信息可以表征目标对象的三维特征。
其中,对于双目摄像头(即两个摄像头,如第一摄像头和第二摄像头)而言,上述目标对象上的各个特征与摄像头之间的距离(即点的深度)可以为:该目标对象上的各个特征所在点与两个摄像头连线的垂直距离。例如,如图1B所示,假设P为目标对象上的一个特征,特征P的深度为P到O LO R的垂直距离Z。其中,O L为第一摄像头的位置,O R为第二摄像头的位置。
本申请实施例提供的电子设备可以包括红外投射器和两个双通摄像头(如第一摄像头和第二摄像头)。其中,两个双通摄像头之间的距离为第一长度。一般而言,两个摄像头的中心之间的距离称为两个摄像头之间的距离。
请参考图1A,其示出本申请实施例提供的电子设备100的局部示意图。如图1A所示,电子设备100可以包括红外投射器101、第一摄像头102和第二摄像头103。第一摄像头102和第二摄像头103的之间的距离为第一长度T。
其中,红外投射器101用于发射带光斑的红外光。双通摄像头是指该摄像头不仅可以接收可见光,还可以接收红外光。例如,双通摄像头可以接收可见光以及940nm的红外光。940nm是红外光的波长。当然,上述两个双通摄像头可以替换为两个全通摄像头。全通摄像头是指该摄像头可以接收包括可见光、红外光以及其它波长的光在内的多种光线。相比而言,普通的RGB摄像头可以接收可见光,但是不能接收红外光。
在本申请实施例中,第一摄像头102可以为左侧的双通摄像头,第二摄像头103可以为右侧的双通摄像头;或者,第一摄像头102可以为右侧的双通摄像头,第二摄像头103可以为左侧的双通摄像头。图1A中以第一摄像头102为左侧的双通摄像头,第二摄像头103为右侧的双通摄像头为例,对电子设备100的局部结构进行举例说明。
需要注意的是,本申请实施例中的第一摄像头102和第二摄像头103可以接收同类型的红外光。该同类型的红外光是指波长相同的红外光。换言之,第一摄像头102和第二摄像头103接收红外光的能力相同。例如,第一摄像头102和第二摄像头103可以接收940nm的红外光。并且,本申请实施例中,红外投射器101发射的红外光与第一摄像头102和第二摄像头103可以接收的红外光也是同类型的红外光。
电子设备100可以根据双目摄像头对同一特征的视差,结合双目摄像头的硬件参数,采用三角定位原理计算目标对象上每一个特征的深度,得到目标对象的深度信息。
本申请实施例这里对电子设备100根据视差计算深度信息的方法进行举例说明:
如图1B所示,O L为第一摄像头102的位置,O R为第二摄像头103的位置,O L与O R之间的距离为第一长度T,即O LO R=T。第一摄像头102和第二摄像头103的镜头焦距均为f。
特征P为目标对象的一个特征。特征P所在点与第一摄像头102和第二摄像头103连线的垂直距离为Z。即P的深度信息为Z。第一摄像头102采集到目标对象的图像1,特征P在图像1的P L点。第二摄像头103采集到目标对象的图像2,特征P在图像2的P R点。其中,图像1中的P L点与图像2中的P R点所对应的特征都是目标对象的特征P。
如图1B所示,A LC L=A RC R=x,A LB L=B LC L=A RB R=B RC R=x/2。其中,特征P L与A L之间的距离为x L,即特征P L距离图像1的最左端的距离为x L,即A LP L=x L。特征P R与A R之间的距离为x R,即特征P R距离图像2的最左端的距离为x R,即A RP R=x R。A LP L与A RP R的差值为第一摄像头102与第二摄像头103对特征P的视差,即特征P的视差d=x L-x R
由于,P LP R平行于O LO R;因此,按照三角形原理可以得出以下公式(1):
$$ \frac{P_L P_R}{O_L O_R} = \frac{Z - f}{Z} \qquad (1) $$
其中,P LP R=O LO R-B LP L-P RB R。O LO R=T,B LP L=A LP L-A LB L=x L-x/2,P RB R=x/2-x R。P LP R=T-(x L-x/2)-(x/2-x R)=T-(x L-x R)=T-d。
将P LP R=T-d,O LO R=T代入公式(1)可以得到:
$$ \frac{T - d}{T} = \frac{Z - f}{Z} $$
$$ Z \times d = f \times T $$
$$ Z = \frac{f \times T}{d} $$
可知:特征P的深度Z可以通过两个摄像头之间的距离T、两个摄像头的镜头焦距f,以及视差d计算得到。
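As a worked illustration of the relation Z = f·T/d derived above, the sketch below computes the depth of a single feature. The numeric values are assumptions chosen only to show the arithmetic: a focal length expressed in pixel units (so that it is commensurate with a disparity measured in pixels), a baseline T of 29.5 mm, and a 40-pixel disparity.

```python
def depth_from_disparity(f_pixels, baseline_mm, disparity_pixels):
    """Z = f * T / d, with f and d in pixels and T in millimetres,
    so the returned depth Z is in millimetres."""
    return f_pixels * baseline_mm / disparity_pixels

# Assumed example values: a 2.5 mm lens over 1.12 um pixels gives roughly
# f = 2232 pixels; the baseline T is taken as 29.5 mm.
z = depth_from_disparity(f_pixels=2232.0, baseline_mm=29.5, disparity_pixels=40.0)
print(round(z), "mm")   # about 1646 mm, i.e. the feature P is ~1.6 m from the cameras
```

A larger disparity for the same baseline and focal length yields a smaller depth, which matches the inverse relationship between d and Z discussed later in this description.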
由上述描述可知:上述第一摄像头102采集的图像信息(即第一图像信息,如上述图像1)和第二摄像头103采集的图像信息(即第二图像信息,如上述图像2)中的特征越多越明显,电子设备100识别到第一图像信息和第二图像信息中相同的特征则越多。电子设备100识别到的相同的特征越多,该电子设备100则可以计算得到越多特征所在点的深度。由于该目标对象的深度信息由目标对象的多个点的深度组成;因此,电子设备100计算得到的点的深度越多,目标对象的深度信息则越准确。
其中,所述的两个图像信息中相同的特征指的是:两个图像信息中对应同一个特征的信息。例如:第一图像信息中A L点对应对象的A部位,第二图像信息中的A R点也对应对象的A部位,则A L点和A R点是这两个图像信息中相同的特征。例如,第一图像信息包括图1C所示的建筑物的图像1,第二图像信息包括图1C所示的建筑物的图像2。图像1中的A L点对应建筑物的A部位,图像2中的A R点也对应建筑物的A部位。B L点对应建筑物的B部位,B R点也对应建筑物的B部位。双目摄像头对上述建筑物的A部位的视差为x L1-x R1。双目摄像头对上述建筑物的B部位的视差为x L2-x R2
综上所述,当第一图像信息和第二图像信息中的特征越多时,越容易从这两个图像信息中获取较多的相同特征,从而使得电子设备100得到的目标对象的深度信息越准确。
本申请实施例中,红外投射器101向目标对象发射带光斑的红外光后,第一摄像头102和第二摄像头103采集的目标对象的图像信息中则不仅可以包括目标对象的特征,还包括带光斑的红外光照射在目标对象上形成的纹理特征。即可以增加第一摄像头102和第二摄像头103采集的目标对象的图像的特征。增加第一摄像头102和第二摄像头103采集的目标对象的图像的特征后,电子设备100便可以更加准确识别出第一摄像头102采集的图像信息和第二摄像头103采集的图像信息中相同的特征,进而确定出第一摄像头102和第二摄像头103对该相同的特征的视差,从而计算出每个点的深度,得到目标对象的深度信息,可以提高电子设备100计算目标对象的深度信息的准确度。
本申请实施例中的双通摄像头可以通过改进RGB摄像头得到,可以降低电子设备100的硬件成本。具体的,可以在RGB摄像头的每个透镜的双面镀增透膜,以提高透镜对红外光的感知能力,使得透镜可以尽可能的接收红外光;在RGB摄像头的滤光片上镀截止镀层,以滤去红外光和可见光之外的其他光线,并增加红外光的透过率。包括上述增透膜和截止镀层的RGB摄像头不仅可以接收可见光,还可以接收红外光。按照上述方式改进后的RGB摄像头可以作为双通摄像头被使用。
需要注意的是,在一些实施例中,图1A所示的电子设备100的局部示意图可以为电子设备100的正面的局部示意图。也就是说,上述红外投射器101、第一摄像头102、第二摄像头103设置在电子设备100的正面。
上述电子设备100中除上述第一摄像头102和第二摄像头103之外,还可以包括一个或多个其他摄像头。该一个或多个其他摄像头可以包括第三摄像头104,该第三摄像头为RGB摄像头。该第三摄像头用于采用目标对象的二维图像。电子设备100可以在显示屏上显示第三摄像头采集的二维图像。上述第三摄像头104与第一摄像头102、第二摄像头103以及红外投射器设置在电子设备100的同一面上。即第三摄像头104也可 以设置在电子设备100的正面。第三摄像头104是电子设备100的前置摄像头。
在另一些实施例中,图1A所示的电子设备100的局部示意图可以为电子设备100的背面的局部示意图。也就是说,上述红外投射器101、第一摄像头102、第二摄像头103和RGB摄像头104设置在电子设备100的背面。上述RGB摄像头104为电子设备100的后置摄像头。
其中,上述电子设备100的正面是指电子设备100显示图形用户界面(如电子设备100的主界面,即桌面)的一面,即显示面板所在的面通常称为正面;而电子设备100的背面则是与正面的朝向相反的一面。通常的,电子设备的正面指的是:在被用户正常使用状态下,朝向用户的一面;而背离用户的一面称为背面。
当然,上述一个或多个其他摄像头还可以包括其他的RGB摄像头或者黑白摄像头等。该其他的RGB摄像头或者黑白摄像头等可以为电子设备100的前置摄像头或者后置摄像头,本申请实施例对此不做限制。
示例性的,本申请实施例电子设备可以为包括上述红外投射器、RGB摄像头(即第三摄像头)和两个双通摄像头(即第一摄像头和第二摄像头)的便携式计算机(如手机)、笔记本电脑、可穿戴电子设备(如智能手表)、平板电脑、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备或车载设备等,以下实施例对该电子设备的具体形式不做特殊限制。
请参考图1D,其示出本申请实施例提供的一种电子设备100的结构示意图。该电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
其中,电子设备100还可以包括红外投射器196(如图1A所示的红外投射器101)。该红外投射器196用于发射带光斑的红外光。例如,红外投射器196可以发射波长为940nm的红外光,该红外光带光斑。光斑的形状和排布可以参考本申请实施例后续相关描述,此处不予赘述。
摄像头193可以包括两个双通摄像头(如图1A所示的第一摄像头102和第二摄像头103)193B和193C、1~N个其他摄像头193A。该1~N个其他摄像头193A可以包括图1A所示的第三摄像头104,即RGB摄像头,还可以包括其它摄像头,例如黑白摄像头。其中,双通摄像头193B和193C可以接收可见光和红外光。RGB摄像头用于采集RGB图像,如人脸图像(即人脸二维图像)。
其中,传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等多个传感器。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某 些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDL)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以 通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。 在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络),蓝牙(blue tooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),NFC,红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及 应用处理器等实现拍摄功能。
ISP主要用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应 用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振 动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
本申请实施例提供一种获取深度信息的方法,该方法可以应用于电子设备100。该电子设备100包括红外投射器101、第一摄像头102和第二摄像头103。例如,如图1A所示,第一摄像头102和第二摄像头103的镜头焦距均为f,第一摄像头102和第二摄像头103之间的距离(即第一长度)为T。如图2所示,该获取深度信息的方法可以包括S201-S203:
S201、电子设备100接收第一指令。该第一指令触发电子设备100获取目标对象的深度信息。
其中,电子设备100可以接收用户的第一操作,该第一操作用于触发电子设备100执行第一事件。由于电子设备100执行该第一事件要使用目标对象的深度信息;因此,响应于上述第一操作,电子设备100可以获取到第一指令,该第一指令可以触发电子设备100获取目标对象的深度信息。
示例性的,本申请实施例的方法可以应用于人脸解锁场景、人脸支付场景、AR场景、3D建模场景或者大光圈场景等多个场景中。
在人脸解锁场景中,假设电子设备100已经开机,且电子设备100黑屏。上述第一操作可以是用户对电子设备100中相关物理按键的点击操作(如单击操作)。例如,上述相关物理按键可以为锁屏键或者Home键等。如果电子设备100接收到用户对上述物理按键的第一操作,那么则表示用户可能有解锁电子设备100的意愿。
或者,电子设备100可以包括一个或多个用于检测电子设备100被用户手持的状态的传感器。假设电子设备100已经开机,电子设备100黑屏或者电子设备100显示锁屏界面。上述电子设备100接收到第一操作具体可以为:传感器检测到电子设备100当前被用户手持的状态发生了符合预设条件的变化。其中,当电子设备100被用户手持的状态发生了符合预设条件的变化(例如,电子设备100被用户拿起,并且用户手持电子设备100使得电子设备100的显示屏与水平面之间的夹角在一定范围内)时,则表示用户可能有解锁电子设备100的意愿。
也就是说,该第一操作可以用于触发电子设备100解锁(即执行第一事件)。电子 设备100解锁前要进行用户身份校验。其中,人脸识别是用户身份校验的一种方式。电子设备100要进行人脸识别,则可以获取目标对象的深度信息。
其中,本申请实施例中的一个或多个传感器可以通过检测电子设备100被旋转、电子设备100相对于用户向前运动、电子设备100相对于地平线向上运动,来判断电子设备100被用户手持的状态是否发生符合预设条件的变化。具体的,电子设备100可以检测该电子设备100的运动参数;然后根据该运动参数判断该电子设备100是否被旋转、是否相对于用户发生向前运动、是否相对于地平线发生向上运动;最后,根据判断结果确定该电子设备100被用户手持的状态是否发生符合预设条件的变化。
举例来说,本实施例中,“电子设备100当前被用户手持的状态发生了符合预设条件的变化”具体可以包括:上述传感器检测到电子设备100被旋转、且由下向上移动后,该电子设备100的显示屏与水平面之间的夹角在预设范围内。
需要说明的是,由于不同用户的拍摄习惯不同,因此电子设备100中可以统计大多数或者所有用户使用电子设备100采集人脸图像的过程中,传感器采集的状态变化参数,并将这些状态变化参数的变化确定为符合预设条件的变化。如此,电子设备100可以响应于该电子设备100中的传感器采集到的状态变化参数的变化符合预设条件的变化,确定用户想要解锁该电子设备100的可能性较高,则可以执行S202。
示例性的,在人脸支付场景中,上述第一操作可以是用户在支付界面中的支付按钮的点击操作(如单击操作)。例如,图3中的(a)所示的支付界面301中包括支付按钮“立即付款”302,第一操作可以是用户对按钮“立即付款”302的单击操作。或者,上述第一操作可以是用户在支付界面输入的预设手势,如S形手势。响应于该第一操作,电子设备100可以支付订单(即执行第一事件)。但是,电子设备100支付订单前要进行用户身份校验。其中,人脸识别是用户身份校验的一种方式。电子设备100要进行人脸识别,则可以获取目标对象的深度信息。响应于用户对按钮“立即付款”302的单击操作,电子设备100可以显示图3中的(b)所示的人脸识别界面303。
S202、响应于上述第一指令,电子设备100通过红外投射器101发射带光斑的红外光,通过第一摄像头102采集目标对象的第一图像信息,通过第二摄像头103采集目标对象的第二图像信息。
其中,响应于上述第一指令,电子设备100可以开启红外投射器101,并开启第一摄像头102和第二摄像头103。电子设备100开启红外投射器101后,红外投射器101便可以发射带光斑的红外光。电子设备100开启并初始化第一摄像头102和第二摄像头103之后,第一摄像头102和第二摄像头103便可以采集目标对象的图像信息。
本申请实施例中,红外投射器101向目标对象发射带光斑的红外光,带光斑的红外光照射到目标对象上,可以增加目标对象的特征(即纹理特征),提高第一摄像头102和第二摄像头103对目标对象上各个特征的识别率。
本申请实施例这里介绍红外投射器101发射的带光斑的红外光,以及红外投射器101发射带光斑的红外光的原理。
一般而言,红外光对人眼不可见。但是,部分红外光(如850nm的红外光)存在明显的红光特性,人眼还是可以看到少量的红光。如果红外投射器101向目标对象发射这类红外光,那么用户则可以看到照射在目标对象上的红外光,影响用户视觉体验。
为了避免红外投射器101向目标对象发射红外光影响用户视觉体验,红外投射器101发射的红外光可以为对人眼完全不可见的红外光。例如,红外投射器101发射的红外光可以为890nm~990nm的红外光,具体例如为940nm的红外光。
其中,部分摄像头(如卷帘快门(rolling shutter)摄像头)的曝光是逐行进行的;因此,如果上述第一摄像头102和第二摄像头103的曝光是逐行进行的,那么在第一摄像头102和第二摄像头103的整个曝光周期内,上述红外投射器101都要开启以发射上述带光斑的红外光;否则可能会出现在曝光过程中,第一摄像头102和第二摄像头103采集的图像信息中部分图像上没有光斑的问题。如此,则要求红外投射器101的工作功率不能过于高,以防止由于红外投射器101的工作功率过高而导致红外投射器101发热温度较高,影响红外投射器101发射红外光的效率(即影响红外光的亮度),从而影响带光斑的红外光照射到对象上形成的纹理特征的效果。
当然,红外投射器101的工作功率也不能过于低。如果红外投射器101的工作功率过低,也会影响红外投射器101发射红外光的效率,从而影响带光斑的红外光照射到对象上形成的纹理特征的效果。
综上所述,为了保证红外光照射到对象上形成的纹理特征的效果,红外投射器101的功率不宜过高或者过低。为了稳定红外投射器101的功率,红外投射器101的工作电流可以在100mA~200mA之间。例如,红外投射器101的工作电流可以为150mA。
一般而言,红外投射器可以包括三部分:(1)红外光源;(2)准直镜;(3)衍射光学器件(diffractive optical elements,DOE)。
其中,红外光源可以为激光光源。例如,激光光源可以为垂直腔面发射激光器(vertical cavity surface emitting laser,VCSEL)。VCSEL可以发射上述红外光。
准直镜可以为1P的透镜或者2P的透镜。其中,准直镜为1P的透镜是指准直镜由1片透镜组成。2P的透镜是指准直镜由2片透镜组成。准直镜用于将非平行光线转换为近似平行的光源。这种转化可以降低红外投射器发射的红外光的纹理的噪声水平,即可以减少非纹理部分的光线。换言之,这种转化可以使红外投射器发射的红外光中的亮点更亮,暗点更暗。
其中,白天可见光比较强时,第一摄像头102和第二摄像头103采集的图像信息中,带光斑的红外光照射在目标对象上形成的纹理特征几乎看不到。这种情况下,电子设备100在识别两个图像信息中相同的特征时,可以根据目标对象本身的特征来判断。因此,准直镜对光线的转换得到的亮点对深度的计算没有影响。
本申请实施例中可以减少红外投射器101中的准直镜。即本申请实施例中的红外投射器101可以不包括准直镜,仅包括红外光源和DOE。准直镜的作用是让VCSEL发出的光线通过准直镜之后,光线接近平行光;这样,在有准直镜的情况下,光线通过DOE之后,会造成光斑直径小,对比度高(即光斑区域的亮度高,而非光斑区域的亮度低)。由此,在可见光的强度很低时或者纯黑时(例如晚上),投射器照射的目标对象,在没有斑点或光斑的区域,目标对象发黑或者看不见;因为有准直镜的投射器的光斑光线更汇聚,所以没有光斑的地方理论上是没有光的,或者只有微弱的环境光。若在红外投射器中去掉准直镜,则由于光线入射DOE之前不是平行光,所以光斑的对比度低,非光斑区域仍然有部分的光线存在。没有准直镜的投射器,对投射器设计上来说是噪声,但 是在本申请实施例中确可以提高测量深度信息的准确性。没有准直镜的投射器,可以在微弱可见光或者纯黑的情况下,在有光斑照射的目标对象的区域,目标对象可以被光斑照亮;在没有光斑照亮的区域,由于存在这种噪声,目标对象也会被相对较弱一点的光线照射,此时目标对象整体都是可见的。从而可以使得目标对象的特征更明显,便于提高深度信息测量的准确性。
DOE用于利用傅里叶光学原理,将平行光转化成需要的光线输出形式,该输出形式可以为点,线,也可以是面。本申请实施例中,可以通过设置DOE控制红外投射器发射带光斑的红外光。其中,光斑的形状、数量和排布均可以通过设置DOE实现。
具体的,红外投射器101发射的红外光可以包括多个光斑。该多个光斑可以包括多个散斑点阵组。一个散斑点阵组中包括一个或多个散斑点阵。散斑点阵是多个散斑组成。请参考图4,其示出本申请实施例提供的多个散斑点阵的实例示意图。例如,图4所示的散斑点阵401是由多个圆形散斑组成的散斑点阵。
在一些实施例中,一个散斑点阵中的多个散斑的形状可以相同。例如,图4所示的散斑点阵401、散斑点阵402、散斑点阵403、散斑点阵404和散斑点阵408中的散斑都是圆形的。图4所示的散斑点阵405中的散斑都是矩形的。
在另一些实施例中,一个散斑点阵中的多个散斑中至少两个散斑的形状不同。例如,图4所示的散斑点阵406中包括圆形的散斑和矩形的散斑。图4所示的散斑点阵407中包括多个形状不同的散斑。
需要注意的是,不同的散斑点阵的形状可以不同。例如,图4所示的散斑点阵401是矩形的散斑点阵、散斑点阵402是三角形的散斑点阵、散斑点阵403是一种八边形的散斑点阵,散斑点阵404是另一种八边形的散斑点阵。由于散斑点阵401、散斑点阵402、散斑点阵403和散斑点阵404的形状不同,因此散斑点阵401、散斑点阵402、散斑点阵403和散斑点阵404是不同的散斑点阵。
不同的散斑点阵中,散斑数量、散斑的形状、点阵的形状中的至少一项不同。例如,虽然图4所示的散斑点阵401和散斑点阵408的形状相同;但是,散斑点阵401和散斑点阵408中包括的散斑的数量不同(散斑点阵401比散斑点阵408中多一个散斑)。因此,散斑点阵401和散斑点阵408是不同的散斑点阵。例如,虽然图4所示的散斑点阵401和散斑点阵405的形状相同,且散斑点阵401和散斑点阵405中包括的散斑的数量相同;但是,散斑点阵401中的散斑的形状(圆形)与散斑点阵405中的散斑的形状(矩形)不同。因此,散斑点阵401和散斑点阵405是不同的散斑点阵。又例如:虽然图4所示的散斑点阵402和散斑点阵404中包括的散斑的形状相同,且这两个散斑点阵包括的散斑的数量也相同;但是,散斑点阵402的形状(三角形)与散斑点阵404的形状(八边形)不同。因此,散斑点阵402和散斑点阵404是不同的散斑点阵。
其中,图4仅以举例方式给出部分散斑点阵的实例示意图。红外光中的多个光斑中的散斑点阵包括但不限于图4所示的散斑点阵。
在一些实施例中,上述多个光斑中包括多个相同的散斑点阵组。每个散斑点阵组中包括一个散斑点阵。例如,假设红外投射器101发射的红外光中带图5所示的多个光斑501。其中,图5所示的501即为多个光斑。该多个光斑501中包括多个相同的散斑点阵组502。该散斑点阵组502中包括一个散斑点阵401。又例如,假设红外投射器101 发射的红外光中带图5所示的多个光斑503。该多个光斑503中包括多个相同的散斑点阵组504。该散斑点阵组504中包括一个散斑点阵407。
在另一些实施例中,上述多个光斑中包括多个相同的散斑点阵组。每个散斑点阵组中包括多个散斑点阵。该多个散斑点阵中的至少两个散斑点阵不同。
例如,假设红外投射器101发射的红外光中带图6A所示的多个光斑601。该多个光斑601中包括多个相同的散斑点阵组602。该散斑点阵组602中包括散斑点阵402和散斑点阵404。散斑点阵402和散斑点阵404不同。
又例如,假设红外投射器101发射的红外光中带图6B所示的多个光斑603。该多个光斑603中包括多个相同的散斑点阵组604。该散斑点阵组604中包括两个散斑点阵401和两个散斑点阵403。散斑点阵401和散斑点阵403不同。
又例如,假设红外投射器101发射的红外光中带图6C所示的多个光斑605。该多个光斑605中包括多个相同的散斑点阵组606。该散斑点阵组606中包括散斑点阵404、散斑点阵408、散斑点阵406和散斑点阵407。散斑点阵404、散斑点阵408、散斑点阵406,以及散斑点阵407两两不相同。
在另一些实施例中,上述多个光斑中包括多个散斑点阵组。该多个散斑点阵组中包括至少两个不同的散斑点阵组。例如,假设红外投射器101发射的红外光中带图7A所示的多个光斑701。该多个光斑701中包括多个不同的散斑点阵组,如散斑点阵组602、散斑点阵组604和散斑点阵组606。
在另一些实施例中,上述多个光斑中包括多个形状不同的散斑。该多个形状不同的散斑随机排布。例如,如图7B所示,多个光斑702中包括多个形状不同的散斑。
需要注意的是,图5、图6A、图6B、图6C或者图7A仅以举例方式给出红外光中的多个光斑的实例示意图。红外光中的多个光斑中的散斑点阵的数量不限于图5、图6A、图6B、图6C或者图7A所示的散斑点阵的数量。
一般而言,红外发射器101发射的红外光中带的散斑的数量在3千个左右。当然,如果红外发射器101发射的红外光中带的散斑的数量越大,该红外光照射在目标对象上为目标对象增加的特征则越多,则越有利于电子设备100计算目标对象的深度信息。例如,当红外发射器101发射的红外光中带的散斑的数量在7千左右时,电子设备100计算的目标对象的深度信息相比于散斑的数量在3千左右时电子设备100计算的目标对象的深度信息更加准确。
可以理解,当红外发射器101发射的红外光中的散斑点阵组周期性重复时,红外光照射在目标对象上形成的纹理特征也是重复的。本申请实施例中,散斑点阵的重复频率可以通过预设面积中出现同一个散斑点阵的数量来表征。预设面积中出现同一个散斑点阵组的数量越多,散斑点阵的重复频率则越高,散斑点阵的重复周期则会比较小。如此,容易造成电子设备100识别第一摄像头102采集的图像信息和第二摄像头103采集的图像信息中相同的特征时的特征匹配错误。其中,本申请实施例中的特征匹配是指:将第一摄像头102采集的图像信息和第二摄像头103采集的图像信息中相同的特征识别出来。预设面积中出现同一个散斑点阵组的数量越少,散斑点阵的重复频率则越低,散斑点阵的重复周期则会比较大。
其中,散斑点阵的重复周期反应在图像上则是相同的两个散斑点阵之间的最短距 离。示例性的,两个散斑点阵之间的距离可以为这两个散斑点阵的中心点之间的距离。例如,如图7C所示,多个光斑703中包括多个重复的散斑点阵组,如散斑点阵组704、散斑点阵组705和散斑点阵组706。其中,散斑点阵组704、散斑点阵组705和散斑点阵组706相同。点K1是散斑点阵组704中的散斑点阵707的中心点。点K2是散斑点阵组705中的散斑点阵708的中心点。点K3是散斑点阵组706中的散斑点阵709的中心点。图7C所示的散斑点阵的重复周期可以用点K1与点K2之间的距离和点K1与点K3之间的距离中最短的距离来表征。由于点K1与点K2之间的距离短于点K1与点K3之间的距离;因此,图7C所示的散斑点阵的重复周期可以由点K1与点K2之间的距离来表征。
本申请实施例中,散斑点阵的重复频率越小,则红外光照射在目标对象上形成的纹理特征的重复性则越低,电子设备100计算得到的目标对象的深度信息越准确。
其中,电子设备100是根据三角定位原理,采用公式
$$ Z = \frac{f \times T}{d} $$
实现目标对象的深度信息的计算的。即目标对象的深度信息是根据第一摄像头102和第二摄像头103之间的距离(即第一长度T)、第一摄像头102和第二摄像头103的镜头焦距f,以及视差d计算得到的。
为了保证红外投射器101发射的带光斑的红外光可以照射在目标对象上,并且,照射了带光斑的红外光的目标对象可以被第一摄像头102和第二摄像头103拍摄到;本申请实施例中,红外投射器101的视场角(field of view,FOV)的覆盖范围、第一摄像头102的FOV的覆盖范围,以及第二摄像头103的FOV的覆盖范围三者部分重叠或全部重叠。FOV的大小可以表征光学仪器(如摄像头)的视野范围。
其中,红外投射器101、第一摄像头102,以及第二摄像头103,这三者的FOV的覆盖范围的重叠区域越大,第一摄像头102和第二摄像头103采集的上述纹理特征则越多。为了使得红外投射器101、第一摄像头102,以及第二摄像头103的FOV的覆盖范围的重叠区域较大,如图1A所示,上述红外投射器101可以设置在第一摄像头102和第二摄像头103之间。
可以理解,第一摄像头102和第二摄像头103之间的距离(即第一长度)也会影响上述重叠区域的大小。例如,第一长度较大,第一摄像头102和第二摄像头103的FOV的覆盖范围的重叠区域则越小。但是,第一长度如果过于小,第一摄像头102和第二摄像头103对目标对象上各个特征的视差也会很小,接近于零。因此,上述第一长度不能过大或者过小。第一长度过大或者过小,都会影响电子设备100计算得到的深度信息的准确度。
通过实验得出:当第一长度T为20mm~30mm的任一长度时,电子设备100计算得到的深度信息的准确度较高。因此,本申请实施例中,第一长度T可以为20mm~30mm的任一长度。例如,第一长度T可以为29.5mm。需要说明的是,如何设置两个摄像头之间的距离可能会受摄像头参数的影响,由此上述第一长度T为20mm~30mm仅作为一个举例。
上述红外投射器101发射带光斑的红外光后,如果要求第一摄像头102和第二摄像头103可以采集到包括光斑的图像信息,那么则要求第一摄像头102和第二摄像头103可以接收红外光。例如,第一摄像头102和第二摄像头103可以用于接收890nm~990nm 的红外光,如940nm的红外光。
由上述描述可知:本申请实施例中要求第一摄像头102和第二摄像头103可以感知红外光(即接收红外光)。但是,普通的RGB摄像头只能感知可见光,无法感知红外光;而在电子设备100中配置具备感知红外光功能的红外摄像头的成本较高,且红外摄像头的使用会增大电子设备100的功耗。
为了降低电子设备100的硬件成本,降低电子设备100的功耗,本申请实施例可以对普通的RGB摄像头进行改进,得到可以感知红外光的第一摄像头102和第二摄像头103。
请参考图8中的(a),其示出本申请实施例提供的一种普通的RGB摄像头模组的结构示意图。其中,如图8中的(a)所示,本申请实施例这里以rolling shutter摄像头为例,对本申请实施例中改进普通的RGB摄像头得到第一摄像头102和第二摄像头103的方式进行说明:
如图8中的(a)所示,RGB摄像头模组800可以包括3P镜头、滤光片(也称滤镜)804和传感器(sensor)805。其中,3P镜头是指RGB摄像头中包括3片透镜:透镜801、透镜802和透镜803。传感器805可以为2M的传感器。2M是指RGB摄像头成像的最高分辨率可以达到200万像素。
为了使得图8中的(a)所示RGB摄像头模组800既可以感知可见光,又可以感知红外光,可以对图8中的(a)所示的RGB摄像头模组800做出如下改进:对图8中的(a)所示的RGB摄像头模组800的每个透镜(如透镜801、透镜802和透镜803)的双面上镀增透膜,得到图8中的(b)所示的透镜801、透镜802和透镜803;在滤光片804上镀截止镀层,得到图8中的(b)所示的滤光片804。本申请实施例中,对RGB摄像头模组800中的传感器805可以不作改进。本申请实施例中,可以在滤光片804的双面镀截止镀层;或者,可以在滤光片804的单面镀截止镀层。
需要注意的是,为了使得第一摄像头102和第二摄像头103可以感知红外投射器101发射的红外光,该第一摄像头102和第二摄像头103要具备感知红外投射器101发射的红外光的能力。因此,上述增透膜可以为上述红外投射器101发射的红外光的波长对应的增透膜。例如,该增透膜可以为890nm~990nm的红外光的增透膜,如940nm的红外光的增透膜。
为每个透镜(如透镜801、透镜802和透镜803)的两个面镀上述增透膜,可以提高透镜801、透镜802和透镜803对红外投射器101发射的红外光的感知能力,使得透镜801、透镜802和透镜803可以尽可能的接收红外投射器101发射的红外光。
上述截止镀层可以用于滤去红外投射器101发射的红外光和可见光之外的其他光线,并增加上述红外光的透过率。
例如,上述截止镀层可以用于滤去波长为850nm的红外光。可以理解,由于波长为850nm的红外光存在明显的红光特性,可能会出现严重的红曝现象;因此,通过上述截止镀层滤去波长为850nm的红外光,可以减少出现红曝现象的可能性。其中,红曝现象是指一种滤光不干净的问题。例如,有时可能会希望只使用红外光(即红外线)照明,则可以在光源加上滤镜过滤红外线之外的其他光线。此时,可能会因为滤光不干净而导致少量红外光还是可以被人眼看到,这种现象称为红曝现象。
本申请实施例中改进的普通RGB摄像头包括但不限于上述rolling shutter摄像头。改进rolling shutter摄像头得到上述第一摄像头的原因在于:rolling shutter摄像头的曝光是逐行进行的,成本较低。本申请实施例中,改进rolling shutter摄像头得到上述第一摄像头102和第二摄像头103,可以进一步降低成本。
S203、电子设备100根据第一图像信息和第二图像信息、第一长度和镜头焦距,计算目标对象的深度信息。
其中,由上述公式
$$ Z = \frac{f \times T}{d} $$
可知:目标对象的每一个特征所在点的深度Z与第一摄像头102与第二摄像头103对该点的视差d成反比;每一个特征所在点的深度Z与镜头焦距f成正比;每一个特征所在点的深度Z与第一长度T成正比。
第一长度T和镜头焦距f是第一摄像头102和第二摄像头103的硬件参数。第一长度T和镜头焦距f是固定不变的。那么,目标对象的每一个点的深度Z的大小则取决于第一摄像头102与第二摄像头103对该点的视差d的大小。
例如,如图10A中的(a)所示,假设第一摄像头102是左侧的双通摄像头,第二摄像头13是右侧的双通摄像头。笑脸、三角形、圆柱体和月牙分别代表目标对象上的不同特征,笑脸、三角形、圆柱体和月牙所在位置与摄像头之间的距离逐渐变远。如图10A中的(b)所示,O L为第一摄像头102的位置,O R为第二摄像头103的位置,O L与O R之间的距离为T(即第一长度)。
如图10A中的(b)所示,第一摄像头102采集到图像A,第二摄像头103采集到图像B。月牙在以O l(图像A的左上角)为原点的坐标系(简称坐标系L)的x轴的坐标为x L1,圆柱体在坐标系L的x轴的坐标为x L2,三角形在坐标系L的x轴的坐标为x L3,笑脸在坐标系L的x轴的坐标为x L4。月牙在以O r(图像B的左上角)为原点的坐标系(简称坐标系R)的x轴的坐标为x R1,圆柱体在坐标系R的x轴的坐标为x R2,三角形在坐标系R的x轴的坐标为x R3,笑脸在坐标系R的x轴的坐标为x R4
其中,第一摄像头102和第二摄像头103对月牙的视差为d1,d1=x L1-x R1。第一摄像头102和第二摄像头103对圆柱体的视差为d2,d2=x L2-x R2。第一摄像头102和第二摄像头103对三角形的视差为d3,d3=x L3-x R3。第一摄像头102和第二摄像头103对笑脸的视差为d4,d4=x L4-x R4
本申请实施例这里以图10A中的(b)所示的笑脸为例,对第一摄像头102和第二摄像头103对笑脸的视差d4(d4=x L4-x R4)进行说明:
如图10B所示,第一摄像头102采集的笑脸在图像A的坐标系中的坐标为(x L4,y),第二摄像头103采集的笑脸在图像B的坐标系中的坐标为(x R4,y)。其中,如图10B所示,坐标(x L4,y)与坐标(x R4,y)在x轴上相差d4,d4=x L4-x R4,即第一摄像头102和第二摄像头103对笑脸的视差d4=x L4-x R4
其中,d1<d2<d3<d4。由于目标对象的每一个特征(即特征所在点)的深度Z与第一摄像头102与第二摄像头103对该点的视差d成反比;因此,可以得出如图10C所示的视差d与深度z的关系示意图。如图10C所示,随着视差d的逐渐增大,深度z逐渐变小。
综上所述,电子设备100可以先计算出第一摄像头102与第二摄像头103对目标对象上一个特征的视差d,然后根据该特征的视差d、第一长度T和镜头焦距f,计算该特 征所在点的深度Z;然后由多个点的深度得到目标对象的深度信息。具体的,如图9所示,图2所示的S203可以包括S203a-S203b:
S203a、电子设备100计算第一摄像头102与第二摄像头103对第一图像信息和第二图像信息中的多个第一特征的视差。第一特征是第一图像信息和第二图像信息中相同的特征。
其中,红外发射器101发射的带光斑的红外光照射在目标对象上可以增加目标对象的纹理特征。例如,假设目标对象为图11中的(a)所示的人脸1101,红外发射器101发射图11中的(b)所示的红外光1102中包括图7C所示的光斑。红外发射器101发射的红外光1102照射在人脸1101上之后,可以得到图11中的(c)所示的照射有光斑的人脸1103。相比于人脸1101,照射有光斑的人脸1103上的特征纹理较多。
不同的是,第一摄像头102采集的人脸1103的图像信息(即第一图像信息)和第二摄像头103采集的人脸1103的图像信息(即第二图像信息)中相同特征所在点在坐标系的x轴的位置不同,即第一摄像头102和第二摄像头103有视差。
电子设备100可以识别第一摄像头102采集的第一图像信息和第二摄像头103采集的第二图像信息,确定出第一图像信息和第二图像信息中相同的特征。其中,第一图像信息和第二图像信息中相同的特征可以包括:目标对象本身的特征和带光斑的红外光照射在目标对象上形成的纹理特征。也就是说,本申请实施例中,电子设备100在识别第一图像信息和第二图像信息中相同的特征时,不仅可以识别第一图像信息和第二图像信息中目标对象本身相同的特征,还可以识别第一图像信息和第二图像信息中带光斑的红外光照射在目标对象上形成的纹理特征中相同的特征。即,在识别两个图像信息中相同的特征时,可以根据目标对象本身的特征或纹理特征来判断,也可以结合这两者一起判断。例如:若单独根据目标对象本身的特征或者单独根据纹理特征就能够判断,则无需两者结合。或者,当根据目标对象本身的特征无法或者不容易判断是否为相同特征时,可以根据纹理特征结合目标对象本身的特征一起来判断是否为相同特征。
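The paragraph above describes identifying the same feature in the first image information and the second image information. A generic way to do this for rectified cameras is window matching along the same image row; the minimal sketch below uses a sum-of-absolute-differences criterion, which is a common stereo-matching technique and is not necessarily the matching method of this application. The patch size, search range, and grayscale input are assumptions made for the example.

```python
import numpy as np

def match_feature(left_img, right_img, x, y, patch=5, max_disp=128):
    """For the feature at (x, y) in the left (first) image, search along the
    same row of the right (second) image for the best-matching patch using the
    sum of absolute differences. Rectified cameras are assumed, so the match
    lies on the same row; returns the disparity d = x_left - x_right."""
    half = patch // 2
    ref = left_img[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        xr = x - d
        if xr - half < 0:           # ran off the left edge of the right image
            break
        cand = right_img[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float32)
        cost = np.abs(ref - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

The texture added by the speckled infrared light makes such patch comparisons more discriminative, which is why matching becomes more reliable on otherwise smooth surfaces.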
例如,白天可见光比较强时,第一摄像头102和第二摄像头103采集的图像信息中,带光斑的红外光照射在目标对象上形成的纹理特征几乎看不到。但是,因为可见光比较强,那么可见光照射在目标对象上使得该目标对象本身的特征比较明显。在这种情况下,电子设备100在识别两个图像信息中相同的特征时,可以根据目标对象本身的特征来判断。
需要注意的是,虽然上述散斑点阵中散斑的形状可以相同(如散斑点阵包括多个圆点),但是每个散斑在散斑点阵中的位置不同。因此,电子设备100可以根据散斑点阵和散斑在散斑点阵中的位置识别出第一图像信息和第二图像信息中相同形状的散斑所代表的相同特征。
在一些实施例中,上述多个第一特征包括第一图像信息和第二图像信息中所有相同的特征。电子设备100可以识别出第一图像信息和第二图像信息中所有相同的特征,然后针对每一个特征,执行S203b计算其深度,得到目标对象的深度信息。
在另一些实施例中,上述多个第一特征是第一图像信息和第二图像信息中相同的特征中的部分特征。在该实施例中,电子设备100可以按照预设的特征频率,从第一图像信息中选择出多个第一特征,然后查找第二图像信息中与该多个第一特征相同的特征; 最后,针对每一个第一特征,执行S203b计算其深度,得到目标对象的深度信息。或者,电子设备100也可以随机或间隔从第一图像信息中选择出部分第一特征。
其中,上述特征频率可以为相同的两个第一特征在预设面积中出现的数量。上述特征频率反应在图像上可以为电子设备100选择的相邻的两个第一特征之间的距离(称为特征距离)。电子设备100可以按照预设的特征频率,从第一图像信息中选择出多个第一特征的方法可以包括:电子设备100从第一图像信息中的所有特征中,每间隔一个特征距离选择出一个第一特征。
换言之,电子设备100不需要计算第一图像信息和第二图像信息中相同的特征中每一个特征所在点的深度,而是每间隔一个特征距离选择一个特征,计算选择出的特征所在点的深度。
示例性的,本申请实施例这里以第一图像信息和第二图像信息中带光斑的红外光照射在目标对象上形成的纹理特征为例,对上述周期特征进行说明。
如图12所示,上述特征距离可以为散斑1201与散斑1202之间的距离,或者散斑1202与散斑1204之间的距离,或者散斑1204与散斑1203之间的距离,或者散斑1201与散斑1203之间的距离。如图12所示,本申请实施例采用将散斑标黑的方式,示出带光斑的红外光照射在目标对象上形成的纹理特征中的部分第一特征。即图12所示的标黑的散斑为部分第一特征。
其中,一些散斑点阵中的散斑的形状可以相同(如散斑点阵包括多个圆点)。虽然电子设备100可以根据散斑在散斑点阵中的位置区分出不同的散斑;但是,电子设备100根据散斑在散斑点阵中的位置区分出不同的散斑需要较长时间,并且会浪费电子设备100的功耗。电子设备100在选择上述第一特征时所采用的特征距离可以小于或者等于上述多个光斑中散斑点阵的重复周期。即上述特征频率大于或者等于上述多个光斑中散斑点阵的重复频率。这样,可以尽量保证电子设备100从第一图像信息中选择出的相邻的两个第一特征对应不同的散斑点阵中的散斑,有利于电子设备100可以区分出该相邻的两个第一特征,可以降低特征匹配错误的可能性,提升电子设备100计算得到的深度信息的准确度。
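As a small illustration of selecting the first features at a feature distance no larger than the repetition period of the speckle lattice, the sketch below simply samples candidate positions on a regular grid; the grid layout is an assumption made for the example, since the application only constrains the selection frequency relative to the lattice repetition frequency.

```python
def select_first_features(height, width, feature_distance):
    """Return candidate positions sampled every `feature_distance` pixels.
    Choosing feature_distance no larger than the repetition period of the
    speckle lattice helps keep adjacent selections in different lattices."""
    return [(x, y)
            for y in range(0, height, feature_distance)
            for x in range(0, width, feature_distance)]
```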
S203b、电子设备100针对每个第一特征,根据第一摄像头102与第二摄像头103对第一特征的视差d、第一长度T和镜头焦距f,采用公式(2)计算第一特征所在点的深度,得到目标对象的深度信息。
其中,上述公式(2)为
$$ Z = \frac{f \times T}{d} \qquad (2) $$
示例性的,假设目标对象为图13所示的人脸1301。电子设备100可以计算得到图13所示多个点的深度,得到目标对象1301的深度信息1302。
本申请实施例提供一种获取深度信息的方法,电子设备100可以通过红外投射器101发射带光斑的红外光。这样,第一摄像头102和第二摄像头103采集的目标对象的图像则不仅可以包括目标对象的特征,还包括带光斑的红外光照射在目标对象上形成的纹理特征。即可以增加第一摄像头102和第二摄像头103采集的目标对象的图像的特征。增加第一摄像头102和第二摄像头103采集的目标对象的图像的特征后,电子设备100便可以更加准确识别出第一摄像头102采集的图像信息和第二摄像头103采集的图像信 息中相同的特征,进而确定出第一摄像头102和第二摄像头103对该相同的特征的视差,从而计算出每个特征所在点的深度,得到目标对象的深度信息,可以提高电子设备100计算目标对象的深度信息的准确度。
在一些实施例中,电子设备100包括上述第一摄像头102和第二摄像头103。该第一摄像头102和第二摄像头103可以感知可见光。第一摄像头102或者第二摄像头103可以作为主摄像头。电子设备100可以在该电子设备100的显示屏上显示该主摄像头采集的可见光下的图像信息。进一步的,第一摄像头102和第二摄像头103还可以感知红外光,以配合红外投射器101实现电子设备100对目标对象的深度信息的计算。
但是,由于第一摄像头102和第二摄像头103可以感知红外光;因此,电子设备100采集的目标对象的图像信息可以会出现偏红的现象。那么,电子设备100的显示屏显示的主摄像头(第一摄像头102或者第二摄像头103)采集的图像信息可能会出现偏红的现象,影响用户的视觉体验。
为了避免电子设备100的显示屏显示的图像信息会出现偏红的现象的问题,在另一些实施例中,电子设备100不仅可以包括上述第一摄像头102和第二摄像头103,还可以包括第三摄像头104。该第三摄像头104是RGB摄像头(即普通的RGB摄像头)。该第三摄像头104用于采集可见光下的图像信息。该第三摄像头104可以设置在第一摄像头102和第二摄像头103之间,也可以设置在其它位置。
在该实施例中,虽然第一摄像头102和第二摄像头103不仅可以感知可见光,还可以感知红外光;即第一摄像头102和第二摄像头103不仅可以采集红外光下的图像信息,还可以用于采集可见光下的图像信息;但是,第一摄像头102和第二摄像头103采集的图像信息,只用于计算目标对象的深度信息。上述第三摄像头104用于采集可见光下的图像信息。该第三摄像头104采集的图像信息用于显示在电子设备100的显示屏上。如此,则可以避免电子设备100的显示屏显示的图像信息会出现偏红的现象的问题,可以保证影响用户拍摄图像的视觉体验。
需要说明的是,前述各实施例中以第一摄像头的镜头焦距和第二摄像头的镜头焦距相同为例进行说明。而在其它实施方式中,第一摄像头的镜头焦距和第二摄像头的镜头焦距也可以不同。当这两个摄像头的镜头焦距不同时,修改前述计算深度Z的公式来计算计算深度。具体的计算公式可以采用现有技术中的公式。
示例性的,在人脸解锁场景或者人脸支付场景等场景中,上述第三摄像头104采集的图像信息可以在电子设备100录入用于进行人脸识别的人脸二维图像时,显示在电子设备100的显示屏上。上述第一摄像头102和第二摄像头103采集的图像信息,用于在电子设备100进行人脸识别时,计算目标对象的深度信息。即电子设备100的人脸识别过程可以包括:二维人脸信息的识别和人脸深度的认证。二维人脸信息的识别是指:电子设备100判断电子设备100采集的目标对象的二维图像是否与电子设备100中保存的人脸二维图像匹配。人脸深度的认证是指:电子设备100判断该目标对象的深度信息是否具备真实人脸的深度特征;如果目标对象的深度信息具备真实人脸的深度特征,则表示该目标对象是真实人脸;如果目标对象的深度信息不具备真实人脸的深度特征,则表示该目标对象不是真实人脸。例如,该目标对象可能是包括与电子设备中预先保存的人脸二维图像匹配的二维图像的一张照片。如果目标对象的深度信息不具备真实人脸的深 度特征,那么即使目标对象的二维图像与电子设备100中保存的人脸二维图像匹配,该目标对象的人脸识别也不会通过。即使用上述照片在电子设备中进行的人脸识别不会通过,从而可以避免电子设备100的信息被泄露或者可以避免给用户带来财产损失,可以保护电子设备100的信息安全,提升电子设备100的安全性能。
在AR场景和3D建模场景中,电子设备100可以将根据上述第一摄像头102和第二摄像头103采集的图像信息计算得到的目标对象的深度信息,以及第三摄像头104采集的目标对象的图像信息结合起来,构建目标对象的真实人脸模型,提升AR场景和3D建模场景的真实性。
在大光圈场景中,电子设备100可以执行上述方法,计算得到电子设备100采集到的图像中每个点的深度,然后对深度大于预设值的点进行虚化处理,突出显示深度小于预设值的点,以实现大光圈的效果。
示例性的,当电子设备100采集包括人脸的图像时,该人脸上的各个点的深度相比于背景(背景在人脸后,距离摄像头较远)上各个点的深度较小。因此,在大光圈场景中,可以突出显示人脸图像,虚化背景图像。
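As an illustration of the large-aperture processing described above, the following is a minimal sketch that blurs every pixel whose depth exceeds a preset value and keeps nearer pixels sharp. It is not the implementation of this application: the OpenCV call, the millimetre units, the 31x31 blur kernel, and the assumption that the depth map is already aligned to the RGB image are all choices made only for the example.

```python
import cv2

def large_aperture_effect(rgb_image, depth_map, depth_threshold_mm=800):
    """Blur every pixel whose depth exceeds depth_threshold_mm.

    rgb_image: H x W x 3 uint8 image (e.g. collected by the third camera 104).
    depth_map: H x W array of depths in millimetres, assumed to be already
               aligned pixel-for-pixel with rgb_image.
    """
    blurred = cv2.GaussianBlur(rgb_image, (31, 31), 0)   # blurred copy of the frame
    background = depth_map > depth_threshold_mm          # points with depth > preset value
    result = rgb_image.copy()
    result[background] = blurred[background]             # replace far pixels with blur
    return result
```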
示例性的,在大光圈场景中,上述第一操作可以包括用户对“照相机”应用的开启操作和用户对“照相机”应用中的“大光圈”模式的选择操作。如果电子设备100接收到上述第一操作,那么则表示电子设备100要使用大光圈场景拍摄图像(即执行第一事件)。响应于该第一事件,电子设备100可以执行上述获取深度信息的方法,获取目标对象的深度信息,然后根据目标对象的深度信息实现大光圈的效果。其中,第三摄像头104采集的图像显示在电子设备100的显示屏上,电子设备100可以根据获取的深度信息,对显示的拍照预览画面和最终拍照得到的图像都进行大光圈的处理,即预览画面和拍照得到的图像中都可以突出近的某个目标对象,而虚化其它对象。
例如,用户对“照相机”应用的开启操作可以为用户对“照相机”应用图标的单击操作。电子设备100响应于用户对“照相机”应用的开启操作,可以显示图14所示的拍摄界面1401。该拍摄界面1401中包括取景框1402、摄像头转化键1404、拍摄键1403、相册键1405。其中,取景框1402中用于显示电子设备100的后置摄像头或者前置摄像头捕获的预览图像;摄像头转化键1404用于触发电子设备100转化使用前置摄像头和后置摄像头来捕获图像;拍摄键1403用于控制电子设备100保存取景框1402中显示的后置摄像头或者前置摄像头捕获的预览图像;相册键1405用于查看电子设备100中保存的图像。其中,当电子设备100的默认摄像头为后置摄像头时,取景框1402中可以显示如图14所示的后置摄像头捕获的预览图像。该预览图像是第三摄像头104采集的。
上述拍摄界面1401还可以包括电子设备100的拍摄模式的选项。电子设备100的拍摄模式可以包括:后置摄像头拍照模式、自拍模式、视频模式、大光圈模式和全景模式等。例如,如图14所示,拍摄界面1401还可以包括“大光圈”选项1406、“照片”选项和“视频”选项等。当然,本申请实施例中的拍摄模式包括但不限于后置摄像头拍照模式、自拍模式、全景模式、大光圈模式和视频模式,其他的拍摄模式,本申请实施例这里不予赘述。
上述用户对“照相机”应用中的“大光圈”模式的选择操作可以为用户对图14所 示的“大光圈”选项1406的点击操作(如单击操作、双击操作或者长按操作等)。或者,上述用户对“照相机”应用中的“大光圈”模式的选择操作可以为用户在图14所示的拍摄界面1401输入的预设手势(如L形手势),该预设手势用于触发电子设备100采用大光圈模式拍摄图像。
在大光圈场景中,本申请实施例的方法还可以包括:响应于上述第一事件,电子设备100通过第三摄像头104采集目标对象的二维图像;电子设备100在显示屏显示第三摄像头104采集的二维图像(称为目标对象的RGB图像);电子设备100识别出第三摄像头104采集的二维图像中的人脸图像(称为人脸的RGB图像)。在电子设备100计算出目标对象的深度信息(即S203)之后,本申请实施例的方法还可以包括:电子设备100从目标对象的深度信息(称为目标对象的深度图)中识别出人脸的深度信息(称为人脸的深度图);电子设备100根据人脸的RGB图像和人脸的深度图,将目标对象的RGB图像和目标对象的深度图进行映射,找到目标对象的RGB图像中与深度图当中人脸深度图像对应的坐标区域,即找到RGB图像中的人脸区域的坐标区域,然后将目标对象的RGB图中除了这部分坐标区域以外的其他部分进行虚化。这样,便可以达到背景虚化的效果。
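For the mapping step described in the preceding paragraph, the sketch below assumes that the face region has already been located as a rectangle in the depth map and that the depth map and the RGB image cover the same field of view, so the rectangle can be transferred with per-axis scale factors; both assumptions, as well as the helper names, are illustrative and are not taken from the embodiment.

```python
import cv2

def blur_outside_face(rgb_image, face_box_in_depth, depth_shape):
    """Blur everything in rgb_image except the coordinate area of the face.

    face_box_in_depth: (x, y, w, h) rectangle of the face in the depth map.
    depth_shape: (depth_height, depth_width) of the depth map.
    """
    rgb_h, rgb_w = rgb_image.shape[:2]
    depth_h, depth_w = depth_shape
    sx, sy = rgb_w / depth_w, rgb_h / depth_h

    x, y, w, h = face_box_in_depth
    x0, y0 = int(x * sx), int(y * sy)                  # face area mapped into RGB coordinates
    x1, y1 = int((x + w) * sx), int((y + h) * sy)

    result = cv2.GaussianBlur(rgb_image, (31, 31), 0)  # blur the whole frame
    result[y0:y1, x0:x1] = rgb_image[y0:y1, x0:x1]     # restore the sharp face region
    return result
```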
可以理解的是,上述电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例可以根据上述方法示例对上述电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用集成的单元的情况下,图15示出了上述实施例中所涉及的电子设备1500的一种可能的结构示意图。该电子设备1500可以包括:处理模块1501、显示模块1502、红外发射模块1503、双通采集模块1504、双通采集模块1505和RGB采集模块1506。可选的,该电子设备1500还可以包括通信模块,该通信模块包括蓝牙模块和Wi-Fi模块等。
其中,处理模块1501用于对电子设备1500的动作进行控制管理。RGB采集模块1506用于采集可见光下目标对象的图像。显示模块1502用于显示处理模块1501生成的图像和RGB采集模块1504采集的图像。红外发射模块1503用于发射带光斑的红外光。双通采集模块1504和双通采集模块1505用于采集可见光下目标对象的图像和红外光下目标对象的图像。通信模块用于支持电子设备1500与其他设备的通信。处理模块1501还用于根据双通采集模块1504采集的图像计算目标对象的深度信息。
具体的,上述处理模块1501可以用于支持电子设备1500执行上述方法实施例中的S201,S203,S203a,S203b,和/或用于本文所描述的技术的其它过程。上述显示模块 1502可以用于支持电子设备1500执行上述方法实施例中的“显示RGB采集模块1506采集的图像”的操作,和/或用于本文所描述的技术的其它过程。红外发射模块1503可以用于支持电子设备1500执行上述方法实施例中的S202中“发射带光斑的红外光”的操作,和/或用于本文所描述的技术的其它过程。双通采集模块1504可以用于支持电子设备1500执行上述方法实施例中的S202中“采集第一图像信息”的操作,和/或用于本文所描述的技术的其它过程。双通采集模块1505可以用于支持电子设备1500执行上述方法实施例中的S202中“采集第二图像信息”的操作,和/或用于本文所描述的技术的其它过程。RGB采集模块1506可以用于支持电子设备1500采集可见光下的图像信息,和/或用于本文所描述的技术的其它过程。
Of course, the unit modules in the electronic device 1500 include but are not limited to the processing module 1501, the display module 1502, the infrared emission module 1503, the dual-pass collection modules 1504 and 1505, the RGB collection module 1506, and the like. For example, the electronic device 1500 may further include a storage module. The storage module is configured to store the program code and data of the electronic device 1500.
The processing module 1501 may be a processor or a controller, for example a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The processor may include an application processor and a baseband processor. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure of this application. The processor may also be a combination implementing a computing function, for example a combination including one or more microprocessors, or a combination of a DSP and a microprocessor. The storage module may be a memory.
For example, the processing module 1501 is one or more processors (the processor 110 shown in FIG. 1D), and the communication module includes a wireless communication module (the wireless communication module 160 shown in FIG. 1D, which includes BT (that is, a Bluetooth module) and WLAN (such as a Wi-Fi module)). The wireless communication module may be referred to as a communication interface. The storage module may be a memory (the internal memory 121 shown in FIG. 1D). The display module 1502 may be a display (the display 194 shown in FIG. 1D). The infrared emission module 1503 may be an infrared projector (the infrared projector 196 shown in FIG. 1D, that is, the infrared projector 101 in the above embodiments). The two dual-pass collection modules 1504 and 1505 may be two dual-pass cameras (the dual-pass camera 193B shown in FIG. 1D (that is, the first camera 102 in the above embodiments) and the dual-pass camera 193C (that is, the second camera 103 in the above embodiments)). The RGB collection module 1506 may be one RGB camera among the 1 to N other cameras 193A shown in FIG. 1D, that is, the third camera 104 in the above embodiments. The two dual-pass collection modules and the RGB collection module 1506 are arranged on the same side of the electronic device 100, such as the front or the back. The electronic device 1500 provided in the embodiments of this application may be the electronic device 100 shown in FIG. 1D. The one or more processors, the memory, the infrared projector, the first camera, the second camera, the display, the third camera, and the like may be connected together, for example through a bus.
An embodiment of this application further provides a computer storage medium. The computer storage medium stores computer program code. When the processor executes the computer program code, the electronic device 1500 performs the relevant method steps in FIG. 2 or FIG. 9 to implement the method in the above embodiments.
An embodiment of this application further provides a computer program product. When the computer program product runs on a computer, the computer is caused to perform the relevant method steps in FIG. 2 or FIG. 9 to implement the method in the above embodiments.
The electronic device 1500, the computer storage medium, and the computer program product provided in the embodiments of this application are all used to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above; details are not described here again.
From the description of the above implementations, a person skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In actual application, the above functions may be allocated to different functional modules as required; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative. For example, the division of the modules or units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or in other forms.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (30)

  1. A method for obtaining depth information, applied to an electronic device, wherein the electronic device comprises an infrared projector, a first camera, and a second camera; a distance between the first camera and the second camera is a first length; and the method comprises:
    receiving, by the electronic device, a first instruction, wherein the first instruction is used to trigger the electronic device to obtain depth information of a target object;
    in response to the first instruction, emitting, by the electronic device, infrared light with a light spot through the infrared projector, collecting first image information of the target object through the first camera, and collecting second image information of the target object through the second camera, wherein the first image information and the second image information comprise a feature of the target object and a texture feature formed by the infrared light with the light spot irradiating the target object; and
    calculating, by the electronic device, the depth information of the target object based on the first image information, the second image information, the first length, a lens focal length of the first camera, and a lens focal length of the second camera.
  2. The method according to claim 1, wherein the lens focal length of the first camera is the same as the lens focal length of the second camera.
  3. The method according to claim 2, wherein the calculating, by the electronic device, the depth information of the target object based on the first image information, the second image information, the first length, the lens focal length of the first camera, and the lens focal length of the second camera comprises:
    calculating, by the electronic device, disparities of the first camera and the second camera for a plurality of first features in the first image information and the second image information, wherein the first features are features that are the same in the first image information and the second image information; and
    for each first feature, calculating, by the electronic device, a depth Z of the point at which the first feature is located by using the following formula, based on the disparity of the first camera and the second camera for the first feature, the first length, and the lens focal length, to obtain the depth information of the target object:
    Z = (f × T) / d
    wherein f is the lens focal length, d is the disparity of the first camera and the second camera for the first feature, and T is the first length.
  4. The method according to claim 3, wherein before the calculating, by the electronic device, the disparities of the first camera and the second camera for the plurality of first features in the first image information and the second image information, the method further comprises:
    selecting, by the electronic device, the plurality of first features from the first image information and the second image information;
    wherein the plurality of first features are some of the features that are the same in the first image information and the second image information.
  5. The method according to any one of claims 1 to 4, wherein the infrared light with the light spot comprises a plurality of light spots, and the plurality of light spots comprise a plurality of speckle lattice groups;
    wherein one speckle lattice group comprises one or more speckle lattices, and each speckle lattice comprises a plurality of speckles.
  6. The method according to claim 5, wherein at least two of the plurality of speckle lattice groups are different.
  7. The method according to claim 5 or 6, wherein a first speckle lattice group is any one of the plurality of speckle lattice groups, and at least two of the plurality of speckle lattices comprised in the first speckle lattice group are different.
  8. The method according to any one of claims 5 to 7, wherein a first speckle lattice is any one of the plurality of speckle lattices, and
    each of the plurality of speckles in the first speckle lattice has a same shape; or at least two of the plurality of speckles in the first speckle lattice have different shapes.
  9. The method according to any one of claims 4 to 8, wherein the electronic device selects the plurality of first features from the first image information and the second image information at a preset feature frequency;
    the feature frequency is greater than or equal to a repetition frequency of the speckle lattices in the plurality of light spots;
    wherein the feature frequency is represented by the quantity of identical first features selected by the electronic device from an image of a preset area, and the repetition frequency is represented by the quantity of occurrences of a same speckle lattice in the preset area.
  10. The method according to any one of claims 1 to 9, wherein both surfaces of each lens of the first camera and the second camera comprise an anti-reflection coating, and light filters of the first camera and the second camera comprise a cut-off coating;
    wherein the anti-reflection coating is used to increase the transmittance of infrared light, and the cut-off coating is used to filter out light other than infrared light and visible light and to increase the transmittance of infrared light.
  11. The method according to any one of claims 1 to 10, wherein the infrared light emitted by the infrared projector is infrared light of 890 nanometers to 990 nanometers.
  12. The method according to any one of claims 1 to 11, wherein the infrared light emitted by the infrared projector is infrared light of 940 nanometers.
  13. The method according to any one of claims 1 to 12, wherein the first length is between 20 millimeters and 30 millimeters.
  14. The method according to any one of claims 1 to 13, wherein the electronic device further comprises a third camera, and the third camera is an RGB camera; the third camera is configured to collect image information under visible light; and the image information collected by the third camera is used for being displayed on a display of the electronic device.
  15. An electronic device, comprising one or more processors, a memory, an infrared projector, a first camera, and a second camera, wherein a distance between the first camera and the second camera is a first length; the memory, the infrared projector, the first camera, and the second camera are coupled to the processor; and the memory is configured to store information;
    the processor is configured to receive a first instruction, wherein the first instruction is used to trigger the electronic device to obtain depth information of a target object;
    the processor is further configured to: in response to the first instruction, emit infrared light with a light spot through the infrared projector, collect first image information of the target object through the first camera, and collect second image information of the target object through the second camera, wherein the first image information and the second image information comprise a feature of the target object and a texture feature formed by the infrared light with the light spot irradiating the target object; and
    the processor is further configured to calculate the depth information of the target object based on the first image information collected by the first camera, the second image information collected by the second camera, the first length, a lens focal length of the first camera, and a lens focal length of the second camera.
  16. The electronic device according to claim 15, wherein the lens focal length of the first camera is the same as the lens focal length of the second camera.
  17. The electronic device according to claim 16, wherein the processor being configured to calculate the depth information of the target object based on the first image information, the second image information, the first length, the lens focal length of the first camera, and the lens focal length of the second camera comprises:
    the processor being configured to:
    calculate disparities of the first camera and the second camera for a plurality of first features in the first image information and the second image information, wherein the first features are features that are the same in the first image information and the second image information; and
    for each first feature, calculate a depth Z of the point at which the first feature is located by using the following formula, based on the disparity of the first camera and the second camera for the first feature, the first length, and the lens focal length, to obtain the depth information of the target object:
    Z = (f × T) / d
    wherein f is the lens focal length, d is the disparity of the first camera and the second camera for the first feature, and T is the first length.
  18. The electronic device according to claim 17, wherein the processor is further configured to: before calculating the disparities of the first camera and the second camera for the plurality of first features in the first image information and the second image information, select the plurality of first features from the first image information and the second image information;
    wherein the plurality of first features are some of the features that are the same in the first image information and the second image information.
  19. The electronic device according to any one of claims 15 to 18, wherein the infrared light with the light spot comprises a plurality of light spots, and the plurality of light spots comprise a plurality of speckle lattice groups;
    wherein one speckle lattice group comprises one or more speckle lattices, and each speckle lattice comprises a plurality of speckles.
  20. The electronic device according to claim 19, wherein at least two of the plurality of speckle lattice groups are different.
  21. The electronic device according to claim 19 or 20, wherein a first speckle lattice group is any one of the plurality of speckle lattice groups, and at least two of the plurality of speckle lattices comprised in the first speckle lattice group are different.
  22. The electronic device according to any one of claims 19 to 21, wherein a first speckle lattice is any one of the plurality of speckle lattices, and
    each of the plurality of speckles in the first speckle lattice has a same shape; or at least two of the plurality of speckles in the first speckle lattice have different shapes.
  23. The electronic device according to any one of claims 19 to 22, wherein the processor is configured to select the plurality of first features from the first image information and the second image information at a preset feature frequency;
    the feature frequency is greater than or equal to a repetition frequency of the speckle lattices in the plurality of light spots;
    wherein the feature frequency is represented by the quantity of identical first features selected by the processor from an image of a preset area, and the repetition frequency is represented by the quantity of occurrences of a same speckle lattice in the preset area.
  24. The electronic device according to any one of claims 15 to 23, wherein both surfaces of each lens of the first camera and the second camera comprise an anti-reflection coating, and light filters of the first camera and the second camera comprise a cut-off coating;
    wherein the anti-reflection coating is used to increase the transmittance of infrared light, and the cut-off coating is used to filter out light other than infrared light and visible light and to increase the transmittance of infrared light.
  25. The electronic device according to any one of claims 15 to 24, wherein the infrared light emitted by the infrared projector is infrared light of 890 nanometers to 990 nanometers.
  26. The electronic device according to any one of claims 15 to 25, wherein the infrared light emitted by the infrared projector is infrared light of 940 nanometers.
  27. The electronic device according to any one of claims 15 to 26, wherein the first length is between 20 millimeters and 30 millimeters.
  28. The electronic device according to any one of claims 15 to 27, wherein the electronic device further comprises a third camera and a display;
    the third camera is a red-green-blue (RGB) camera; the third camera is configured to collect image information under visible light; and the image information collected by the third camera is used for being displayed on the display.
  29. A computer storage medium, comprising computer instructions, wherein when the computer instructions run on an electronic device, the electronic device is caused to perform the method for obtaining depth information according to any one of claims 1 to 14.
  30. A computer program product, wherein when the computer program product runs on a computer, the computer is caused to perform the method for obtaining depth information according to any one of claims 1 to 14.
PCT/CN2019/112254 2018-10-30 2019-10-21 一种获取深度信息的方法及电子设备 WO2020088290A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19878166.8A EP3869462A4 (en) 2018-10-30 2019-10-21 PROCESS FOR OBTAINING DEPTH INFORMATION AND ELECTRONIC DEVICE
US17/290,660 US20220020165A1 (en) 2018-10-30 2019-10-21 Method for Obtaining Depth Information and Electronic Device
JP2021524333A JP2022506753A (ja) 2018-10-30 2019-10-21 深度情報の取得方法および電子デバイス

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811279855.7A CN109544618B (zh) 2018-10-30 2018-10-30 一种获取深度信息的方法及电子设备
CN201811279855.7 2018-10-30

Publications (1)

Publication Number Publication Date
WO2020088290A1 true WO2020088290A1 (zh) 2020-05-07

Family

ID=65846096

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/112254 WO2020088290A1 (zh) 2018-10-30 2019-10-21 一种获取深度信息的方法及电子设备

Country Status (6)

Country Link
US (1) US20220020165A1 (zh)
EP (1) EP3869462A4 (zh)
JP (1) JP2022506753A (zh)
CN (1) CN109544618B (zh)
DE (1) DE202019006001U1 (zh)
WO (1) WO2020088290A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112669362A (zh) * 2021-01-12 2021-04-16 四川深瑞视科技有限公司 基于散斑的深度信息获取方法、装置及系统
CN115484383A (zh) * 2021-07-31 2022-12-16 华为技术有限公司 拍摄方法及相关装置
CN116067305A (zh) * 2023-02-09 2023-05-05 深圳市安思疆科技有限公司 一种结构光测量系统和测量方法
CN116843731A (zh) * 2022-03-23 2023-10-03 腾讯科技(深圳)有限公司 对象识别方法以及相关设备

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109544618B (zh) * 2018-10-30 2022-10-25 荣耀终端有限公司 一种获取深度信息的方法及电子设备
CN109635539B (zh) * 2018-10-30 2022-10-14 荣耀终端有限公司 一种人脸识别方法及电子设备
CN110009673B (zh) * 2019-04-01 2020-04-21 四川深瑞视科技有限公司 深度信息检测方法、装置及电子设备
CN110163097A (zh) * 2019-04-16 2019-08-23 深圳壹账通智能科技有限公司 三维头像真伪性的鉴别方法、装置、电子设备及存储介质
CN110308817B (zh) * 2019-06-10 2023-04-07 青岛小鸟看看科技有限公司 一种触控动作识别方法及触控投影系统
CN112540494B (zh) * 2019-09-06 2022-05-03 浙江舜宇光学有限公司 成像装置和成像方法
CN110864440B (zh) * 2019-11-20 2020-10-30 珠海格力电器股份有限公司 一种送风方法及送风装置、空调
CN111626928B (zh) * 2020-04-28 2023-08-25 Oppo广东移动通信有限公司 深度图像生成方法及装置、存储介质和电子设备
CN114073063B (zh) * 2020-05-27 2024-02-13 北京小米移动软件有限公司南京分公司 图像处理方法及装置、相机组件、电子设备、存储介质
CN111751830B (zh) * 2020-07-08 2021-02-19 北京工业大学 一种基于vcsel混合激光的空间微弱目标红外探测系统
CN112225022A (zh) * 2020-09-28 2021-01-15 海拓信息技术(佛山)有限公司 一种实体按键的控制方法和装置
CN112529766B (zh) * 2020-11-26 2023-10-03 维沃移动通信有限公司 图像的处理方法、装置及电子设备
US11778157B2 (en) * 2021-03-25 2023-10-03 Eys3D Microelectronics, Co. Image capture device and depth information calculation method thereof
CN115068109B (zh) * 2022-06-13 2023-07-28 元化智能科技(深圳)有限公司 面向医疗手术导航的红外标靶识别方法和装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104918035A (zh) * 2015-05-29 2015-09-16 深圳奥比中光科技有限公司 一种获取目标三维图像的方法及系统
US20160040978A1 (en) * 2014-08-08 2016-02-11 The Board Of Trustees Of The University Of Illinois Smart Phone Attachment for 3-D Optical Coherence Tomography Imaging
CN108195305A (zh) * 2018-02-09 2018-06-22 京东方科技集团股份有限公司 一种双目检测系统及其深度检测方法
CN108234984A (zh) * 2018-03-15 2018-06-29 百度在线网络技术(北京)有限公司 双目深度相机系统和深度图像生成方法
CN108271012A (zh) * 2017-12-29 2018-07-10 维沃移动通信有限公司 一种深度信息的获取方法、装置以及移动终端
CN108388071A (zh) * 2018-02-07 2018-08-10 深圳奥比中光科技有限公司 深度相机及其投影模组
CN109544618A (zh) * 2018-10-30 2019-03-29 华为技术有限公司 一种获取深度信息的方法及电子设备
CN109635539A (zh) * 2018-10-30 2019-04-16 华为技术有限公司 一种人脸识别方法及电子设备

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001264033A (ja) * 2000-03-17 2001-09-26 Sony Corp 三次元形状計測装置とその方法、三次元モデリング装置とその方法、およびプログラム提供媒体
WO2011013079A1 (en) * 2009-07-30 2011-02-03 Primesense Ltd. Depth mapping based on pattern matching and stereoscopic information
CN102789114B (zh) * 2011-05-18 2015-07-15 中国科学院微电子研究所 一种可见-红外双通摄像机
CN203055077U (zh) * 2013-01-11 2013-07-10 天广消防股份有限公司 一种视频火灾探测器
CN104634276B (zh) * 2015-02-12 2018-08-07 上海图漾信息科技有限公司 三维测量系统、拍摄设备和方法、深度计算方法和设备
CN204613977U (zh) * 2015-05-12 2015-09-02 郑州畅想高科股份有限公司 一种机车乘务员防作弊饮酒检测装置
US20160377414A1 (en) * 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
CN105049829B (zh) * 2015-07-10 2018-12-25 上海图漾信息科技有限公司 滤光片、图像传感器、成像装置以及三维成像系统
CN206559462U (zh) * 2016-10-28 2017-10-13 无锡豪帮高科股份有限公司 一种基于分区双通滤光片的虹膜识别及拍照二合一摄像模组
CN206541078U (zh) * 2016-12-14 2017-10-03 浙江舜宇智能光学技术有限公司 发散光式散斑投射器和三维重建系统
EP3570066A4 (en) * 2017-01-13 2019-12-25 Sony Corporation SIGNAL PROCESSING DEVICE, SIGNAL PROCESSING METHOD, AND PROGRAM
CN118447070A (zh) * 2017-08-21 2024-08-06 上海图漾信息科技有限公司 深度数据检测系统
CN108052878B (zh) * 2017-11-29 2024-02-02 上海图漾信息科技有限公司 人脸识别设备和方法
CN207530934U (zh) * 2017-12-19 2018-06-22 北京中科虹霸科技有限公司 一种具有红外成像功能的双摄像头模组
CN108107662A (zh) * 2018-01-06 2018-06-01 广东欧珀移动通信有限公司 激光发射器、光电设备和深度相机
CN108701232A (zh) * 2018-05-09 2018-10-23 深圳阜时科技有限公司 目标的三维映射的方法及装置、身份识别装置与电子设备
CN109190484A (zh) * 2018-08-06 2019-01-11 北京旷视科技有限公司 图像处理方法、装置和图像处理设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3869462A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112669362A (zh) * 2021-01-12 2021-04-16 四川深瑞视科技有限公司 基于散斑的深度信息获取方法、装置及系统
CN112669362B (zh) * 2021-01-12 2024-03-29 四川深瑞视科技有限公司 基于散斑的深度信息获取方法、装置及系统
CN115484383A (zh) * 2021-07-31 2022-12-16 华为技术有限公司 拍摄方法及相关装置
CN115484383B (zh) * 2021-07-31 2023-09-26 华为技术有限公司 拍摄方法及相关装置
CN116843731A (zh) * 2022-03-23 2023-10-03 腾讯科技(深圳)有限公司 对象识别方法以及相关设备
CN116067305A (zh) * 2023-02-09 2023-05-05 深圳市安思疆科技有限公司 一种结构光测量系统和测量方法

Also Published As

Publication number Publication date
CN109544618B (zh) 2022-10-25
US20220020165A1 (en) 2022-01-20
EP3869462A1 (en) 2021-08-25
CN109544618A (zh) 2019-03-29
EP3869462A4 (en) 2021-12-15
DE202019006001U1 (de) 2024-01-16
JP2022506753A (ja) 2022-01-17

Similar Documents

Publication Publication Date Title
WO2020088290A1 (zh) 一种获取深度信息的方法及电子设备
JP7195422B2 (ja) 顔認識方法および電子デバイス
CN110456938B (zh) 一种曲面屏的防误触方法及电子设备
WO2020192209A1 (zh) 一种基于Dual Camera+TOF的大光圈虚化方法
CN114946169A (zh) 一种图像获取方法以及装置
CN110138999B (zh) 一种用于移动终端的证件扫描方法及装置
CN111741283A (zh) 图像处理的装置和方法
CN113570617B (zh) 图像处理方法、装置和电子设备
WO2023284715A1 (zh) 一种物体重建方法以及相关设备
CN112087649B (zh) 一种设备搜寻方法以及电子设备
CN113542580B (zh) 去除眼镜光斑的方法、装置及电子设备
CN110248037B (zh) 一种身份证件扫描方法及装置
CN111103922A (zh) 摄像头、电子设备和身份验证方法
CN114090102B (zh) 启动应用程序的方法、装置、电子设备和介质
CN114283195B (zh) 生成动态图像的方法、电子设备及可读存储介质
CN113592751A (zh) 图像处理方法、装置和电子设备
CN115032640B (zh) 手势识别方法和终端设备
CN114390195B (zh) 一种自动对焦的方法、装置、设备及存储介质
WO2022033344A1 (zh) 视频防抖方法、终端设备和计算机可读存储介质
CN116055872B (zh) 图像获取方法、电子设备和计算机可读存储介质
CN115209027B (zh) 相机对焦的方法及电子设备
CN113472996B (zh) 图片传输方法及装置
US20230289199A1 (en) Method and Apparatus for Starting Application, Electronic Device, and Medium
CN116664701A (zh) 光照估计方法及其相关设备
CN115619628A (zh) 图像处理方法和终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19878166

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021524333

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019878166

Country of ref document: EP

Effective date: 20210520