WO2020038063A1 - Electronic device and control method for electronic device - Google Patents

Electronic device and control method for electronic device

Info

Publication number
WO2020038063A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
electronic device
sub
subject
captured image
Prior art date
Application number
PCT/CN2019/090077
Other languages
French (fr)
Chinese (zh)
Inventor
张学勇
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2020038063A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules for generating image signals from different wavelengths
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/71 Circuitry for evaluating the brightness variation

Definitions

  • the present application relates to the field of consumer electronic products, and more particularly, to an electronic device and a control method for an electronic device.
  • An electronic device typically captures an image of a subject through a single camera.
  • An embodiment of the present application provides an electronic device and a control method of the electronic device.
  • An embodiment of the present application provides an electronic device.
  • the electronic device includes a time-of-flight module, a main camera, a sub-camera, and a processor.
  • the time-of-flight module is used to collect a depth image of a subject.
  • the main camera is used to collect a captured image of the subject, and the sub-camera is used to collect a captured image of the subject.
  • the processor is used to switch the main camera or the sub-camera according to a predetermined condition to collect a captured image of the subject, and to construct a three-dimensional image of the subject from the depth image and the captured image.
  • An embodiment of the present application provides a control method for an electronic device. The electronic device includes a time-of-flight module, a main camera, and a sub-camera. The control method includes: collecting a depth image of a subject through the time-of-flight module; switching the main camera or the sub-camera according to a predetermined condition to acquire a captured image of the subject; and constructing a three-dimensional image of the subject from the depth image and the captured image.
  • FIG. 1 is a schematic flowchart of a method for controlling an electronic device according to some embodiments of the present application;
  • FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
  • FIGS. 3 and 4 are schematic flowcharts of methods for controlling an electronic device according to some embodiments of the present application;
  • FIGS. 5 to 7 are schematic diagrams of a method for controlling an electronic device according to some embodiments of the present application;
  • FIG. 8 is a schematic flowchart of a method for controlling an electronic device according to some embodiments of the present application;
  • FIG. 9 is a schematic perspective structural view of a time-of-flight module according to some embodiments of the present application;
  • FIG. 10 is a schematic top view of a time-of-flight module according to some embodiments of the present application;
  • FIG. 11 is a schematic bottom view of a time-of-flight module according to some embodiments of the present application;
  • FIG. 12 is a schematic side view of a time-of-flight module according to some embodiments of the present application;
  • FIG. 13 is a schematic cross-sectional view of the time-of-flight module shown in FIG. 10 along line XIII-XIII;
  • FIG. 14 is an enlarged schematic view of portion XIV of the time-of-flight module shown in FIG. 13;
  • FIG. 15 is a schematic front view of a time-of-flight module according to some embodiments of the present application with the flexible circuit board not bent;
  • FIGS. 16 to 19 are schematic structural diagrams of a light transmitter according to some embodiments of the present application.
  • an electronic device 100 includes a time-of-flight module 20, a main camera 30, a sub-camera 40, and a processor 10.
  • the time-of-flight module 20 is used to collect a depth image of a subject.
  • the main camera 30 is used to collect a captured image of the subject, and the sub-camera 40 is used to collect a captured image of the subject.
  • the processor 10 is configured to switch the main camera 30 or the sub camera 40 according to a predetermined condition to acquire a captured image of the subject, and construct a three-dimensional image of the subject based on the depth image and the captured image.
  • the main camera 30 is a wide-angle camera, and the sub camera 40 is a telephoto camera; or the main camera 30 is a color camera, and the sub camera 40 is a black and white camera.
  • the time-of-flight module 20 and the secondary camera 40 are respectively disposed on two sides of the primary camera 30.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the processor 10 is further configured to obtain the current distance between the subject and the electronic device 100 in real time. When the current distance is less than the distance threshold, the main camera 30 collects captured images; when the current distance is greater than or equal to the distance threshold, the sub camera 40 collects captured images.
  • the processor 10 is configured to obtain the current distance according to the depth image; or the electronic device 100 further includes a distance detection device 50, and the distance detection device 50 is configured to detect the current distance in real time and send it to the processor 10.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the main camera 30 or the sub camera 40 is also used to collect a preview image of the subject.
  • the processor 10 is configured to detect face information in the preview image, and switch the main camera 30 or the sub camera 40 according to the face information to capture a captured image.
  • the main camera 30 is a color camera
  • the sub camera 40 is a black and white camera.
  • the processor 10 is configured to obtain the current brightness of the ambient light in real time. When the current brightness is less than the brightness threshold, the sub-camera 40 collects captured images. When the current brightness is greater than or equal to the brightness threshold, the main camera 30 captures captured images.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • the light transmitter 23 is disposed on the cushion block 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • the light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • the light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • the housing 241 is connected with the cushion block 22 as a whole.
  • the housing 241 and the cushion block 22 are integrally formed.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232 and a housing 233.
  • the second substrate assembly 231 is disposed on the pad 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331 to receive the light source assembly 232.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • the reinforcing member 2312 is integrally formed with the cushion block 22; or the reinforcing member 2312 and the cushion block 22 are formed separately.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • the cushion block 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • the side where the cushion block 22 is combined with the first substrate 211 is provided with a receiving cavity 223.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • the cushion block 22 is provided with an avoiding through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the avoiding through hole 224.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • the reinforcing plate 213 is coupled to a side of the first substrate 211 opposite to the pad 22.
  • the cushion block 22 includes a protruding portion 225 protruding from the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively.
  • the electronic device 100 includes a time-of-flight module 20, a main camera 30 and a sub-camera 40.
  • the control method includes:
  • the time-of-flight module 20 and the secondary camera 40 are respectively disposed on two sides of the primary camera 30.
  • the control method also includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the control method further includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the control method also includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the embodiment of the present application provides a control method of the electronic device 100.
  • the electronic device 100 includes a time of flight module 20, a main camera 30 and a sub camera 40.
  • the control method includes:
  • an embodiment of the present application provides an electronic device 100.
  • the electronic device 100 includes a time of flight module 20, a main camera 30, a sub camera 40, and a processor 10.
  • the main camera 30 is used to capture a captured image of the subject
  • the sub camera 40 is also used to capture a captured image of the subject.
  • the control method of the electronic device 100 according to the embodiment of the present application may be implemented by the electronic device 100 according to the embodiment of the present application.
  • the time of flight module 20 may be used to execute the method in 01
  • the processor 10 may be used to execute the methods in 02 and 03. That is to say, the time-of-flight module 20 can be used to collect a depth image of a subject.
  • the processor 10 may be configured to switch the main camera 30 or the sub camera 40 according to a predetermined condition to acquire a captured image of the subject; and construct a three-dimensional image of the subject according to the depth image and the captured image.
  • An electronic device usually collects a captured image of a subject through a single camera, which offers limited functionality and a poor user experience.
  • the electronic device 100 and the control method of the electronic device 100 can switch the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject, and can also construct a three-dimensional image of the subject from the captured image and the depth image collected by the time-of-flight module 20, which provides a richer set of functions and helps improve the user experience.
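  • As a hedged sketch of how these three steps could be wired together in software (not an implementation prescribed by the application), the function below takes the depth acquisition, the two cameras, the predetermined condition, and the 3D construction step as abstract callables; all names are hypothetical.

```python
from typing import Any, Callable

def capture_3d(collect_depth: Callable[[], Any],
               collect_main: Callable[[], Any],
               collect_sub: Callable[[], Any],
               use_main_camera: Callable[[], bool],
               build_3d: Callable[[Any, Any], Any]) -> Any:
    """One pass of the control method: 01 depth image, 02 camera switch, 03 3D image."""
    depth_image = collect_depth()                 # 01: time-of-flight module 20
    if use_main_camera():                         # 02: the predetermined condition
        captured_image = collect_main()           #     main camera 30
    else:
        captured_image = collect_sub()            #     sub-camera 40
    return build_3d(depth_image, captured_image)  # 03: construct the 3D image
```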
  • the electronic device 100 may be a mobile phone, a tablet computer, a smart watch, a smart bracelet, a smart wearable device, etc.
  • the embodiments of the present application are described by taking a mobile phone as an example of the electronic device 100. It can be understood that the specific form of the electronic device 100 is not limited to a mobile phone.
  • the electronic device 100 may include a case 101 and a bracket 102.
  • the time-of-flight module 20, the main camera 30 and the auxiliary camera 40 are all disposed on the bracket 102.
  • the time-of-flight module 20, the main camera 30, the sub-camera 40, and the bracket 102 are all housed in the casing 101 and can extend from the casing 101.
  • when the time-of-flight module 20 is used to collect a depth image of the subject, or the main camera 30 or the sub-camera 40 is used to collect a captured image of the subject, the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move and extend out of the casing 101, so that the depth image or the captured image can be acquired.
  • the time-of-flight module 20, the main camera 30, and the sub-camera 40 may all be front-facing cameras or all rear-facing cameras.
  • the subject can be a person, object, or other subject that the user wishes to photograph.
  • the housing 101 may be provided with a light through hole (not shown).
  • the time-of-flight module 20, the main camera 30 and the sub camera 40 are immovably disposed in the housing 101 and correspond to the light through hole.
  • the display screen 103 of the electronic device 100 disposed on the casing 101 may be provided with a light through hole (not shown), and the time-of-flight module 20, the main camera 30 and the sub-camera 40 are disposed on the display screen 103.
  • In some embodiments, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) and the step in which the processor 10 constructs a three-dimensional image of the subject from the depth image and the captured image (i.e., 03) are implemented in two different application scenarios, and there is no fixed execution order between 02 and 03. That is, in one application scenario the processor 10 may switch the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject, thereby providing an optical-zoom and telephoto experience, without having to construct a three-dimensional image of the subject from the depth image and the captured image.
  • In the other application scenario, the processor 10 may construct a three-dimensional image of the subject from the captured image collected by the main camera 30 and the depth image collected by the time-of-flight module 20, or from the captured image collected by the sub-camera 40 and the depth image collected by the time-of-flight module 20, thereby enabling 3D effects and Augmented Reality (AR) applications, without needing to switch the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire the captured image of the subject.
  • In some embodiments, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) and the step in which the processor 10 constructs a three-dimensional image of the subject from the depth image and the captured image (i.e., 03) are implemented in the same application scenario, and 02 is executed before 03.
  • Specifically, the processor 10 first switches to the main camera 30 according to a predetermined condition to acquire a captured image of the subject, and then constructs a three-dimensional image of the subject from the depth image and the captured image collected by the main camera 30; or the processor 10 first switches to the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject, and then constructs a three-dimensional image of the subject from the depth image and the captured image collected by the sub-camera 40.
  • In this way, because the processor 10 first selects an appropriate camera according to the predetermined condition, the captured image of the subject has better quality, and the three-dimensional image is then constructed from this higher-quality captured image and the depth image, so an ideal 3D effect and AR experience can be obtained.
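  • The application does not spell out how the depth image and the captured image are combined into a three-dimensional image. As a hedged illustration only, the sketch below shows one common approach, assuming the depth image has been registered to the selected camera and that pinhole intrinsics (fx, fy, cx, cy) for that camera are available; these assumptions and all names in the code are illustrative, not part of the application.

```python
import numpy as np

def depth_to_colored_points(depth, color, fx, fy, cx, cy):
    """Back-project a registered depth image into a colored 3D point cloud.

    depth: (H, W) array of distances in meters (0 marks invalid pixels).
    color: (H, W, 3) array of pixel colors from the selected camera.
    fx, fy, cx, cy: assumed pinhole intrinsics of that camera.
    Returns an (N, 6) array of [x, y, z, r, g, b] rows for valid pixels.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    valid = z > 0
    points = np.stack([x[valid], y[valid], z[valid]], axis=1)
    colors = color[valid].astype(np.float64)
    return np.hstack([points, colors])
```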
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • wide-angle and telephoto are relative terms.
  • the wide-angle camera has a larger field of view relative to the telephoto camera, and the telephoto camera has a longer focal length and a longer shooting distance than the wide-angle camera.
  • When the processor 10 switches to the wide-angle camera according to the predetermined condition to acquire the captured image of the subject, the captured image is a wide-angle image, and the processor 10 constructs a three-dimensional image of the subject from the depth image and the wide-angle image; when the processor 10 switches to the telephoto camera according to the predetermined condition to acquire the captured image of the subject, the captured image is a telephoto image, and the processor 10 constructs a three-dimensional image of the subject from the depth image and the telephoto image.
  • the electronic device 100 includes a wide-angle camera and a telephoto camera at the same time.
  • the processor 10 may switch to the wide-angle camera or the telephoto camera according to the actual situation to acquire the captured image of the subject, thereby constructing two different types of three-dimensional images, wide-angle and telephoto, which helps improve the photographing experience.
  • the main camera 30 is a color camera (ie, an RGB camera), and the sub camera 40 is a black and white camera (ie, a Mono camera). Compared with the color camera, the black and white camera can improve the shooting quality of low-light / night scene images.
  • When the processor 10 switches to the main camera 30 according to the predetermined condition to acquire the captured image of the subject, the captured image is an RGB image, and the processor 10 constructs a three-dimensional image of the subject from the depth image and the RGB image; when the processor 10 switches to the sub-camera 40 according to the predetermined condition to acquire the captured image of the subject, the captured image is a Mono image, and the processor 10 constructs a three-dimensional image of the subject from the depth image and the Mono image.
  • the electronic device 100 includes both a color camera and a black-and-white camera.
  • the processor 10 may switch to the color camera or the black-and-white camera according to the actual situation to acquire the captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which helps improve the photographing experience.
  • the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30.
  • Because the main camera 30 is located between the time-of-flight module 20 and the sub-camera 40, on the one hand, when the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject, the parallax between the main camera 30 and the sub-camera 40 is small, which helps achieve smooth zooming.
  • On the other hand, the processor 10 is more likely to switch to the main camera 30 according to the predetermined condition to acquire captured images of the subject, so that when the processor 10 constructs a three-dimensional image of the subject from the depth image and the captured image collected by the main camera 30, the parallax between the time-of-flight module 20 and the main camera 30 is small, which is beneficial to constructing the three-dimensional image of the subject.
  • Therefore, disposing the time-of-flight module 20 and the sub-camera 40 on the two sides of the main camera 30 is beneficial both to achieving smooth zooming between the main camera 30 and the sub-camera 40 and to constructing a three-dimensional image of the subject.
  • Further, the centers of the time-of-flight module 20, the main camera 30, and the sub-camera 40 may be located on a straight line in sequence. On the one hand, this reduces the size of the bracket 102 in the direction from the top of the electronic device 100 (that is, the side where the bracket 102 is located) to the bottom (that is, the side of the electronic device 100 away from the bracket 102); on the other hand, when the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move out of the casing 101, they can extend from the casing 101 synchronously, which structurally ensures that the time-of-flight module 20, the main camera 30, and the sub-camera 40 can work synchronously and saves shooting time.
  • the control method also includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the processor 10 may be used to execute the method in 04.
  • the main camera 30 can be used to execute the method in 021, and the sub camera 40 can be used to execute the method in 022. That is to say, the processor 10 may be used to obtain the current distance between the subject and the electronic device 100 in real time.
  • When the current distance is less than the distance threshold, the main camera 30 collects the captured image; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects the captured image.
  • the electronic device 100 can switch the camera used to capture the captured image in real time according to the current distance.
  • For example, assume the distance threshold is d0. At a first moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d1, where d1 < d0.
  • At this time, the main camera 30 collects a captured image of the subject.
  • At a second moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d2, where d2 > d0.
  • At this time, the processor 10 switches to the sub-camera 40 to collect a captured image of the subject.
  • Finally, the processor 10 constructs a three-dimensional image of the subject from the depth image and the captured image collected by the sub-camera 40.
  • the main camera 30 may focus using a laser focusing mode.
  • the sub-camera 40 may use the passive focus mode for focusing.
  • the passive focus method includes a contrast focus mode and a phase focus mode.
  • In some embodiments, the step in which the processor 10 acquires the current distance between the subject and the electronic device 100 in real time (that is, 04) includes: the processor 10 continuously acquires multiple initial distances between the subject and the electronic device 100.
  • the processor 10 then calculates the current distance between the subject and the electronic device 100 as the average of the multiple initial distances, and switches the main camera 30 or the sub-camera 40 according to the calculated current distance to collect a captured image of the subject.
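  • As a hedged illustration of this averaging-and-switching step, the sketch below averages several readings and compares the result against a distance threshold; the function names and the example threshold value are hypothetical, not values fixed by the application.

```python
def current_distance(initial_distances):
    """Average several consecutive initial distance readings, as described above."""
    return sum(initial_distances) / len(initial_distances)

def select_camera_by_distance(initial_distances, distance_threshold):
    """Return 'main' (wide-angle) below the threshold, otherwise 'sub' (telephoto)."""
    return "main" if current_distance(initial_distances) < distance_threshold else "sub"

# Example with a hypothetical 1.5 m threshold: readings averaging 0.9 m select the main camera.
assert select_camera_by_distance([0.8, 0.9, 1.0], 1.5) == "main"
assert select_camera_by_distance([2.0, 2.2, 2.4], 1.5) == "sub"
```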
  • the processor 10 is configured to obtain the current distance according to the depth image.
  • the time-of-flight module 20 first collects a depth image of the subject, and then obtains a current distance between the subject and the electronic device 100 according to the depth image.
  • the main camera 30 collects captured images;
  • the sub camera 40 collects captured images of the subject.
  • the depth image collected by the time-of-flight module 20 can be used to construct a three-dimensional image of the subject, and can also be used to detect the current distance between the subject and the electronic device 100.
  • In this way, the electronic device 100 does not need an additional distance sensor to detect the current distance, which helps reduce the number of components in the electronic device 100 and saves cost.
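  • The text states that the current distance can be obtained from the depth image but does not specify how. The sketch below is one plausible reading, assuming the subject occupies the center of the frame and taking the median depth of a central window; the window size and the median statistic are assumptions made for illustration.

```python
import numpy as np

def distance_from_depth_image(depth, roi_fraction=0.2):
    """Estimate the current subject distance from a depth image (meters, 0 = invalid)."""
    h, w = depth.shape
    dh = max(1, int(h * roi_fraction / 2))
    dw = max(1, int(w * roi_fraction / 2))
    roi = depth[h // 2 - dh:h // 2 + dh, w // 2 - dw:w // 2 + dw]
    valid = roi[roi > 0]
    return float(np.median(valid)) if valid.size else float("inf")
```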
  • the electronic device 100 further includes a distance detection device 50.
  • the distance detecting device 50 is configured to detect the current distance in real time and send it to the processor 10.
  • the distance detection device 50 is a distance sensor, and the distance sensor directly detects the current distance between the subject and the electronic device 100.
  • the distance detection device 50 is a structured light module.
  • the structured light module collects a structured light image of a subject, and then detects a current distance between the subject and the electronic device 100 based on the structured light image.
  • the distance detection device 50 may also be another type of detection device, as long as it can detect the current distance between the subject and the electronic device 100, such as an ultrasonic rangefinder, a radar rangefinder, or a proximity sensor.
  • the distance detection device 50 can detect the current distance in advance. If the current distance is less than the distance threshold, the main camera 30 collects captured images; if the current distance is greater than or equal to the distance threshold, the sub camera 40 collects captured images.
  • the main camera 30 or the sub-camera 40 may collect a captured image of the subject while the time-of-flight module 20 collects a depth image of the subject, so as to save the time needed to construct a three-dimensional image of the subject and improve the user experience.
  • the main camera 30 or the sub-camera 40 may also collect the captured image before the time-of-flight module 20 collects the depth image, or collect the captured image after the time-of-flight module 20 collects the depth image, which is not limited herein.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the control method further includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the main camera 30 or the sub camera 40 may be used to execute the method in 05
  • the processor 10 may be used to execute the methods in 06 and 023. That is to say, the main camera 30 or the sub camera 40 can be used to collect a preview image of a subject.
  • the processor 10 may be configured to detect face information in the preview image, and switch the main camera 30 or the sub-camera 40 according to the face information to capture a captured image.
  • the face information includes the number of faces.
  • When the number of faces is greater than a predetermined number, the main camera 30 collects the captured image; when the number of faces is less than or equal to the predetermined number (as shown in FIG. 6), the sub-camera 40 collects the captured image. It can be understood that when the number of faces is large, shooting with the main camera 30, which has a larger field of view, can bring more faces into the captured image. When the number of faces is small, the sub-camera 40, which has a relatively small field of view and a longer focal length, can be used to make the subject appear closer and larger visually.
  • If the preview image of the subject is collected by the main camera 30, the processor 10 keeps the main camera 30 to collect the captured image of the subject when the number of faces is greater than the predetermined number.
  • In this case, the main camera 30 may directly use the preview image as the captured image without collecting a captured image again; when the number of faces is less than or equal to the predetermined number, the processor 10 switches to the sub-camera 40 to collect the captured image.
  • If the preview image of the subject is collected by the sub-camera 40, the processor 10 switches to the main camera 30 to collect the captured image of the subject when the number of faces is greater than the predetermined number; when the number of faces is less than or equal to the predetermined number, the processor 10 keeps the sub-camera 40 to collect the captured image.
  • In this case, the sub-camera 40 may also directly use the preview image as the captured image without collecting a captured image again.
  • In some embodiments, the face information includes the area ratio of the face region in the preview image.
  • When the area ratio of the face region in the preview image is greater than a predetermined ratio (for example, one third), the main camera 30 collects the captured image; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio (as shown in FIG. 6), the sub-camera 40 collects the captured image.
  • It can be understood that when the area ratio of the face region is small, the sub-camera 40, which has a relatively small field of view and a longer focal length, can be used to make the subject appear closer and larger visually.
  • If the step of acquiring the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is performed by the main camera 30, the processor 10 keeps the main camera 30 to collect the captured image of the subject when the area ratio of the face region in the preview image is greater than the predetermined ratio.
  • In this case, the main camera 30 may also directly use the preview image as the captured image without collecting a captured image again; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 switches to the sub-camera 40 to collect the captured image.
  • If the step of acquiring the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is performed by the sub-camera 40, the processor 10 switches to the main camera 30 to collect the captured image when the area ratio of the face region in the preview image is greater than the predetermined ratio; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 keeps the sub-camera 40 to collect the captured image.
  • In this case, the sub-camera 40 may also directly use the preview image as the captured image without collecting a captured image again.
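  • As a hedged sketch of the two face-based criteria above (face count and face-area ratio), the helpers below map detected-face information to a camera choice; the default ratio of one third follows the example in the text, while the predetermined face count is passed in, and how faces are detected is left outside the sketch.

```python
def select_camera_by_face_count(num_faces, predetermined_number):
    """More faces than the predetermined number -> main (wide-angle) camera, else sub (telephoto)."""
    return "main" if num_faces > predetermined_number else "sub"

def select_camera_by_face_area(face_area_px, preview_area_px, predetermined_ratio=1.0 / 3.0):
    """Face region above the predetermined area ratio -> main camera, else sub-camera."""
    ratio = face_area_px / preview_area_px if preview_area_px else 0.0
    return "main" if ratio > predetermined_ratio else "sub"

# Example: five faces with a predetermined number of two selects the wide-angle main camera.
assert select_camera_by_face_count(5, 2) == "main"
# Example: a face region covering about a tenth of the preview selects the telephoto sub-camera.
assert select_camera_by_face_area(100 * 100, 320 * 320) == "sub"
```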
  • the control method also includes:
  • the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (i.e., 02) includes:
  • the main camera 30 is a color camera
  • the sub camera 40 is a black and white camera.
  • the processor 10 may be used to execute the method in 07
  • the secondary camera 40 may be used to execute the method in 024
  • the main camera 30 may be used to execute the method in 025. That is to say, the processor 10 can be used to obtain the current brightness of the ambient light in real time.
  • When the current brightness is less than the brightness threshold, the sub-camera 40 collects the captured image.
  • When the current brightness is greater than or equal to the brightness threshold, the main camera 30 collects the captured image.
  • the black and white camera can improve the shooting quality of low-light / night scene images compared to the color camera. Therefore, when the current brightness is less than the brightness threshold, the captured image may be captured by the sub-camera 40; when the current brightness is greater than or equal to the brightness threshold, the captured image may be captured by the main camera 30.
  • the processor 10 switches a color camera or a black and white camera according to the current brightness of the ambient light to capture a captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which is beneficial to improving the photographing experience.
  • the electronic device 100 may further include an ambient light sensor, and the ambient light sensor is configured to detect the current brightness of the ambient light and send it to the processor 10.
  • the electronic device 100 can switch the camera used to capture the captured image in real time according to the current brightness.
  • For example, assume the brightness threshold is L0.
  • At a first moment, the processor 10 obtains the current brightness of the ambient light as L1, where L1 < L0.
  • At this time, the sub-camera 40 collects a captured image of the subject.
  • At a second moment, the processor 10 obtains the current brightness of the ambient light as L2, where L2 > L0.
  • At this time, the processor 10 switches to the main camera 30 to collect a captured image of the subject.
  • Finally, the processor 10 constructs a three-dimensional image of the subject from the depth image and the captured image collected by the main camera 30.
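  • A hedged sketch of this brightness-based switch is shown below; the threshold value used in the example is arbitrary and only illustrates the comparison described above.

```python
def select_camera_by_brightness(current_brightness, brightness_threshold):
    """Below the threshold -> sub-camera (black-and-white, better in low light); otherwise main (color) camera."""
    return "sub" if current_brightness < brightness_threshold else "main"

# Mirrors the L1/L2 example above with an arbitrary threshold L0.
L0 = 100.0
assert select_camera_by_brightness(40.0, L0) == "sub"    # L1 < L0 -> black-and-white sub-camera
assert select_camera_by_brightness(250.0, L0) == "main"  # L2 > L0 -> color main camera
```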
  • the time of flight module 20 may have the following structure.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • the light transmitter 23 is disposed on the cushion block 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • the light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • the light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • the housing 241 is connected with the cushion block 22 as a whole.
  • the pad 22 raises the mounting height of the light emitter 23, thereby raising the height of the light-emitting surface of the light emitter 23, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be fully irradiated onto the measured object.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212.
  • the first substrate 211 may be a printed circuit board or a flexible circuit board.
  • the control circuit of the time of flight module 20 may be laid on the first substrate 211.
  • One end of the flexible circuit board 212 can be connected to the first substrate 211, and the flexible circuit board 212 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 212 can be selected.
  • the pad 22 is disposed on the first substrate 211.
  • the pad 22 is in contact with the first substrate 211 and is carried on the first substrate 211.
  • the pad 22 may be combined with the first substrate 211 by means of adhesion or the like.
  • the material of the spacer 22 may be metal, plastic, or the like.
  • a surface where the pad 22 is combined with the first substrate 211 may be a flat surface, and the surface of the pad 22 opposite to this combined surface may also be a flat surface, so that the light emitter 23 has better stability when it is disposed on the pad 22.
  • the light transmitter 23 is configured to emit an optical signal outwards.
  • the light signal may be infrared light, and the light signal may be a lattice spot emitted to the object to be measured.
  • the light signal is emitted from the light transmitter 23 at a certain divergence angle.
  • the light transmitter 23 is disposed on the spacer 22. In the embodiment of the present application, the light transmitter 23 is disposed on the side of the spacer 22 opposite to the first substrate 211; in other words, the spacer 22 connects the first substrate 211 and the light emitter 23 and spaces them apart, so that a height difference is formed between the light emitter 23 and the first substrate 211.
  • the light transmitter 23 is also connected to the flexible circuit board 212.
  • the flexible circuit board 212 is bent, with one end of the flexible circuit board 212 connected to the first substrate 211 and the other end connected to the light transmitter 23, so that a control signal for the light transmitter 23 can be transmitted from the first substrate 211 to the light transmitter 23, or a feedback signal of the light transmitter 23 (for example, time information or frequency information of the light signal emitted by the light transmitter 23, or temperature information of the light transmitter 23) can be transmitted to the first substrate 211.
  • the optical receiver 24 is configured to receive the optical signal that is emitted by the optical transmitter 23 and reflected back.
  • the light receiver 24 is disposed on the first substrate 211, and the contact surface between the light receiver 24 and the first substrate 211 is substantially flush with the contact surface between the pad 22 and the first substrate 211 (that is, the mounting starting points of the two are on the same plane).
  • the light receiver 24 includes a housing 241 and an optical element 242.
  • the casing 241 is disposed on the first substrate 211, and the optical element 242 is disposed on the casing 241.
  • the casing 241 may be a lens holder and a lens barrel of the light receiver 24, and the optical element 242 may be a lens disposed in the casing 241.
  • the light receiver 24 may further include a photosensitive chip (not shown).
  • the optical signal reflected by the measured object is irradiated into the photosensitive chip through the optical element 242, and the photosensitive chip responds to the optical signal.
  • the time-of-flight module 20 calculates the time difference between the moment the light transmitter 23 emits the light signal and the moment the photosensitive chip receives the light signal reflected back by the measured object, and thereby obtains depth information of the measured object, which can be used for distance measurement, for generating a depth image, or for 3D modeling.
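  • The text does not write out the range equation, but the standard time-of-flight relation assumed here is that the one-way depth is half of the speed of light multiplied by the measured time difference, as in the minimal sketch below.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(time_difference_s: float) -> float:
    """Depth of the measured object from the emit-to-receive (round-trip) time difference."""
    return SPEED_OF_LIGHT_M_S * time_difference_s / 2.0

# Example: a round-trip time difference of about 6.67 ns corresponds to roughly 1 m.
assert abs(tof_depth_m(6.67e-9) - 1.0) < 0.01
```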
  • the housing 241 and the cushion block 22 are integrally connected. Specifically, the housing 241 and the spacer 22 may be integrally formed.
  • the materials of the housing 241 and the spacer 22 may be the same, with the two integrally formed by injection molding, cutting, or the like; or the materials of the housing 241 and the spacer 22 may be different, with the two integrally formed by two-color injection molding.
  • the housing 241 and the spacer 22 may also be separately formed, and the two form a matching structure.
  • the housing 241 and the spacer 22 may be connected into one body, and then may be disposed on the first substrate 211 together. It is also possible to firstly arrange one of the housing 241 and the pad 22 on the first substrate 211, and then arrange the other on the first substrate 211 and connect them as a whole.
  • the pad 22 raises the mounting height of the light emitter 23, thereby raising the height of the light-emitting surface of the light emitter 23, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be fully irradiated onto the measured object.
  • the exit surface of the light transmitter 23 may be flush with the entrance surface of the light receiver 24, or the exit surface of the light transmitter 23 may be slightly lower than the entrance surface of the light receiver 24, or it may be the exit surface of the light transmitter 23 Slightly higher than the incident surface of the light receiver 24.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • the reinforcing plate 213 is coupled to a side of the first substrate 211 opposite to the pad 22.
  • the reinforcing plate 213 may cover one side of the first substrate 211, and the reinforcing plate 213 may be used to increase the strength of the first substrate 211 and prevent deformation of the first substrate 211.
  • the reinforcing plate 213 may be made of a conductive material, such as a metal or an alloy.
  • the reinforcing plate 213 may be electrically connected to the casing 10 so that the reinforcing plate 213 is grounded, effectively reducing the interference of static electricity from external components on the time-of-flight module 20.
  • the cushion block 22 includes a protruding portion 225 protruding from the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225. Specifically, a part of the cushion block 22 is directly carried on the first substrate 211, and another part is not in direct contact with the first substrate 211, and protrudes from the side edge 2111 of the first substrate 211 to form a protruding portion 225.
  • the flexible circuit board 212 may be connected to the side edge 2111, and the flexible circuit board 212 is bent around the protruding portion 225; in other words, the flexible circuit board 212 is bent so that the protruding portion 225 is located inside the space enclosed by the flexible circuit board 212. In this way, when the flexible circuit board 212 is subjected to an external force, the flexible circuit board 212 will not collapse inward and bend excessively, which would damage the flexible circuit board 212.
  • In one example, the outer surface 2251 of the protruding portion 225 is a smooth curved surface (for example, the outer surface of a cylinder), that is, the outer surface 2251 of the protruding portion 225 does not form a sharp bend. Hence, even if the flexible circuit board 212 is bent over the outer surface 2251 of the protruding portion 225, the degree of bending of the flexible circuit board 212 will not be too large, which further ensures the integrity of the flexible circuit board 212.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the connector 26 may be a connection base or a connector.
  • the connector 26 may be connected to the main board of the mobile terminal 100 so that the time-of-flight module 20 is electrically connected to the main board.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • For example, the connector 26 and the flexible circuit board 212 may be connected to the left and right ends of the first substrate 211, respectively, or to the front and rear ends of the first substrate 211, respectively.
  • the light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively. It can be understood that, since the light transmitter 23 and the light receiver 24 are arranged side by side along the straight line L, the size of the time-of-flight module 20 may be relatively large in the direction of the straight line L.
  • the connector 26 and the flexible circuit board 212 are respectively disposed on opposite sides of the straight line L, which does not increase the size of the time-of-flight module 20 in the direction of the straight line L, thereby facilitating the mounting of the time-of-flight module 20 on the chassis 10 of the mobile terminal 100.
  • a receiving cavity 223 is defined on a side where the cushion block 22 is combined with the first substrate 211.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • the electronic component 25 may be an element such as a capacitor, an inductor, a transistor, a resistor, etc.
  • the electronic component 25 may be electrically connected to a control line laid on the first substrate 211 and used to drive or control the operation of the light transmitter 23 or the light receiver 24.
  • the electronic component 25 is contained in the containing cavity 223, and the space in the cushion block 22 is used reasonably.
  • the number of the receiving cavities 223 may be one or more, and the plurality of receiving cavities 223 may be spaced apart from each other.
  • when the pad 22 is disposed on the first substrate 211, the position of the receiving cavity 223 may be aligned with the position of the electronic component 25.
  • the cushion block 22 is provided with an avoiding through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the avoiding through hole 224.
  • In general, the height of the electronic component 25 should not exceed the height of the receiving cavity 223.
  • For an electronic component 25 that is higher than the receiving cavity 223, an avoiding through hole 224 communicating with that receiving cavity 223 may be provided, and the electronic component 25 may partially extend into the avoiding through hole 224, so that the electronic component 25 can be arranged without increasing the height of the spacer 22.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232 and a housing 233.
  • the second substrate assembly 231 is disposed on the pad 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331.
  • the receiving space 2331 can be used for receiving the light source module 232.
  • the flexible circuit board 212 may be detachably connected to the second substrate assembly 231.
  • the light source assembly 232 is electrically connected to the second substrate assembly 231.
  • the casing 233 may be bowl-shaped as a whole, and the opening of the casing 233 is disposed on the second substrate assembly 231 downwardly, so as to receive the light source assembly 232 in the accommodation space 2331.
  • a light outlet 2332 corresponding to the light source component 232 is provided on the housing 233.
  • the optical signal emitted from the light source component 232 passes through the light outlet 2332 and is emitted.
  • the light signal can pass directly through the light outlet 2332. It can also pass through the optical outlet 2332 after changing the optical path through other optical devices.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • a specific type of the second substrate 2311 may be a printed circuit board or a flexible circuit board, and a control circuit may be laid on the second substrate 2311.
  • the reinforcing member 2312 may be fixedly connected to the second substrate 2311 by means of gluing, riveting, or the like.
  • the reinforcing member 2312 may increase the overall strength of the second substrate assembly 231.
  • In this way, the reinforcing member 2312 can directly contact the spacer 22, the second substrate 2311 is not exposed and does not need to be in direct contact with the spacer 22, and the second substrate 2311 is not easily contaminated by dust or the like.
  • the reinforcing member 2312 and the cushion block 22 are formed separately.
  • the spacer 22 may be first mounted on the first substrate 211.
  • the two ends of the flexible circuit board 212 are connected to the first substrate 211 and the second substrate 2311, respectively, and the flexible circuit board 212 may be left unbent at first (the state shown in FIG. 15).
  • the flexible circuit board 212 is then bent, so that the reinforcing member 2312 is disposed on the cushion block 22.
  • the reinforcing member 2312 and the spacer 22 may be integrally formed, for example, integrally formed by a process such as injection molding.
  • In this case, the spacer 22 and the light emitter 23 may be mounted together on the first substrate 211.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • the cushion block 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • the relative movement between the second substrate assembly 231 and the cushion block 22 can be effectively restricted.
  • the specific types of the first positioning member 2313 and the second positioning member 222 can be selected according to needs.
  • For example, the first positioning member 2313 is a positioning hole formed in the reinforcing member 2312 and the second positioning member 222 is a positioning column, and the positioning column protrudes into the positioning hole so that the first positioning member 2313 and the second positioning member 222 cooperate with each other; or the first positioning member 2313 is a positioning column formed on the reinforcing member 2312 and the second positioning member 222 is a positioning hole, and the positioning column protrudes into the positioning hole so that the first positioning member 2313 and the second positioning member 222 cooperate with each other; or there are multiple first positioning members 2313 and multiple second positioning members 222, with some of the first positioning members 2313 being positioning holes and their corresponding second positioning members 222 being positioning columns, and other first positioning members 2313 being positioning columns and their corresponding second positioning members 222 being positioning holes, each positioning column protruding into its positioning hole so that the first positioning members 2313 and the second positioning members 222 cooperate with each other.
  • the structure of the light source component 232 will be described as an example below:
  • the light source assembly 232 includes a light source 60, a lens barrel 70, a diffuser 80 and a protective cover 90.
  • the light source 60 is connected to the second substrate assembly 231.
  • the lens barrel 70 includes a first surface 71 and a second surface 72 opposite to each other.
  • the lens barrel 70 defines a receiving cavity 75 penetrating the first surface 71 and the second surface 72.
  • the first surface 71 is recessed toward the second surface 72 to form a mounting groove 76 communicating with the receiving cavity 75.
  • the diffuser 80 is installed in the mounting groove 76.
  • the protective cover 90 is mounted on the side where the first surface 71 of the lens barrel 70 is located, and the diffuser 80 is sandwiched between the protective cover 90 and the bottom surface 77 of the mounting groove 76.
  • the protective cover 90 can be mounted on the lens barrel 70 by means of a threaded connection, a snap-fit connection, or a fastener connection.
  • Specifically, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (the protective side wall 92) may be provided with an internal thread and the lens barrel 70 with an external thread, and the internal thread of the protective cover 90 is screwed onto the external thread of the lens barrel 70 to mount the protective cover 90 on the lens barrel 70. Or, referring to FIG. 17, when the protective cover 90 includes a top wall 91, a locking hole 95 is opened in the protective cover 90 (the top wall 91) and a hook 73 is provided at an end of the lens barrel 70, and the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • Or, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (the protective side wall 92) is provided with a locking hole 95 and a hook 73 is provided on the lens barrel 70, and the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • Or, when the protective cover 90 includes the top wall 91, the end of the lens barrel 70 is provided with a first positioning hole 74, the protective cover 90 (the top wall 91) is provided with a second positioning hole 93 corresponding to the first positioning hole 74, and a fastener 94 passes through the second positioning hole 93 and the first positioning hole 74 so that the protective cover 90 is mounted on the lens barrel 70.
  • In the light source assembly 232 described above, a mounting groove 76 is provided on the lens barrel 70, the diffuser 80 is installed in the mounting groove 76, and the protective cover 90 is mounted on the lens barrel 70 to clamp the diffuser 80 between the protective cover 90 and the bottom surface 77 of the mounting groove 76, so that the diffuser 80 is firmly fixed to the lens barrel 70. This avoids using glue to fix the diffuser 80 to the lens barrel 70, which prevents the glue from volatilizing into a gaseous state that diffuses and solidifies on the surface of the diffuser 80 and affects the microstructure of the diffuser 80, and also prevents the diffuser 80 from falling off the lens barrel 70 when the adhesion of the glue connecting the diffuser 80 and the lens barrel 70 weakens due to aging.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "plurality” is at least two, for example, two, three, unless specifically defined otherwise.

Abstract

An electronic device (100) and a control method for the electronic device (100). The electronic device (100) comprises a time-of-flight module (20), a main camera (30), an auxiliary camera (40), and a processor (10). The time-of-flight module (20) is used to collect a depth image of a photographed object. The main camera (30) is used to collect a camera image of the photographed object. The auxiliary camera (40) is used to collect a camera image of the photographed object. The processor (10) is used to switch to the main camera (30) or the auxiliary camera (40) according to a predetermined condition so as to collect a camera image of the photographed object, and to construct a three-dimensional image of the photographed object on the basis of the depth image and the camera image.

Description

Electronic device and control method of electronic device
Priority information
This application claims priority to and the benefit of the Chinese patent application No. 201810963397.2 filed with the State Intellectual Property Office of China on August 22, 2018, which is hereby incorporated by reference in its entirety.
Technical field
The present application relates to the field of consumer electronic products, and more particularly, to an electronic device and a control method for an electronic device.
Background
With the rapid development of electronic technology, electronic devices such as smart phones and tablet computers have become increasingly popular. An electronic device typically captures an image of a subject through a single camera.
Summary of the Invention
Embodiments of the present application provide an electronic device and a control method of the electronic device.
An embodiment of the present application provides an electronic device. The electronic device includes a time-of-flight module, a main camera, a sub-camera, and a processor. The time-of-flight module is configured to collect a depth image of a subject; the main camera is configured to collect a captured image of the subject; the sub-camera is configured to collect a captured image of the subject; and the processor is configured to switch the main camera or the sub-camera according to a predetermined condition to collect a captured image of the subject, and to construct a three-dimensional image of the subject based on the depth image and the captured image.
An embodiment of the present application provides a control method for an electronic device. The electronic device includes a time-of-flight module, a main camera, and a sub-camera. The control method includes: collecting a depth image of a subject through the time-of-flight module; switching the main camera or the sub-camera according to a predetermined condition to collect a captured image of the subject; and constructing a three-dimensional image of the subject based on the depth image and the captured image.
Additional aspects and advantages of the embodiments of the present application will be partially given in the following description, and will partly become apparent from the following description or be learned through practice of the embodiments of the present application.
Brief description of the drawings
The above and/or additional aspects and advantages of the present application will become apparent and easily understood from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 3 and FIG. 4 are schematic flowcharts of a control method of an electronic device according to some embodiments of the present application;
FIG. 5 to FIG. 7 are schematic diagrams of scenes of a control method of an electronic device according to some embodiments of the present application;
FIG. 8 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application;
FIG. 9 is a schematic perspective view of a time-of-flight module according to some embodiments of the present application;
FIG. 10 is a schematic top view of a time-of-flight module according to some embodiments of the present application;
FIG. 11 is a schematic bottom view of a time-of-flight module according to some embodiments of the present application;
FIG. 12 is a schematic side view of a time-of-flight module according to some embodiments of the present application;
FIG. 13 is a schematic cross-sectional view of the time-of-flight module shown in FIG. 10 along the line XIII-XIII;
FIG. 14 is an enlarged schematic view of portion XIV of the time-of-flight module shown in FIG. 13;
FIG. 15 is a schematic front view of a time-of-flight module according to some embodiments of the present application when the flexible circuit board is not bent;
FIG. 16 to FIG. 19 are schematic structural diagrams of a light emitter according to some embodiments of the present application.
Detailed description
The embodiments of the present application are further described below with reference to the accompanying drawings. The same or similar reference numerals in the drawings denote the same or similar elements or elements having the same or similar functions throughout. In addition, the embodiments of the present application described below with reference to the drawings are exemplary, are only used to explain the embodiments of the present application, and should not be construed as limiting the present application.
Referring to FIG. 2, an electronic device 100 according to an embodiment of the present application includes a time-of-flight module 20, a main camera 30, a sub-camera 40, and a processor 10. The time-of-flight module 20 is configured to collect a depth image of a subject. The main camera 30 is configured to collect a captured image of the subject, and the sub-camera 40 is configured to collect a captured image of the subject. The processor 10 is configured to switch the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, and to construct a three-dimensional image of the subject based on the depth image and the captured image.
In some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera; or the main camera 30 is a color camera and the sub-camera 40 is a black-and-white camera.
In some embodiments, the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30.
In some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The processor 10 is further configured to obtain the current distance between the subject and the electronic device 100 in real time. When the current distance is less than a distance threshold, the main camera 30 collects the captured image; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects the captured image.
In some embodiments, the processor 10 is configured to obtain the current distance according to the depth image; or the electronic device 100 further includes a distance detection device 50, and the distance detection device 50 is configured to detect the current distance in real time and send it to the processor 10.
In some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The main camera 30 or the sub-camera 40 is further configured to collect a preview image of the subject. The processor 10 is configured to detect face information in the preview image, and to switch the main camera 30 or the sub-camera 40 according to the face information to collect the captured image.
In some embodiments, the main camera 30 is a color camera and the sub-camera 40 is a black-and-white camera. The processor 10 is configured to obtain the current brightness of the ambient light in real time. When the current brightness is less than a brightness threshold, the sub-camera 40 collects the captured image; when the current brightness is greater than or equal to the brightness threshold, the main camera 30 collects the captured image.
Referring to FIG. 9 to FIG. 12, in some embodiments, the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light emitter 23, and a light receiver 24. The first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other. The spacer 22 is disposed on the first substrate 211. The light emitter 23 is configured to emit a light signal outward and is disposed on the spacer 22. The flexible circuit board 212 is bent, one end of the flexible circuit board 212 is connected to the first substrate 211, and the other end is connected to the light emitter 23. The light receiver 24 is disposed on the first substrate 211 and is configured to receive the reflected light signal emitted by the light emitter 23. The light receiver 24 includes a housing 241 and an optical element 242 disposed on the housing 241, and the housing 241 is connected with the spacer 22 as a whole.
In some embodiments, the housing 241 and the spacer 22 are integrally formed.
Referring to FIG. 13, in some embodiments, the light emitter 23 includes a second substrate assembly 231, a light source assembly 232, and a casing 233. The second substrate assembly 231 is disposed on the spacer 22 and is connected to the flexible circuit board 212. The light source assembly 232 is disposed on the second substrate assembly 231 and is configured to emit a light signal. The casing 233 is disposed on the second substrate assembly 231, and the casing 233 is formed with a receiving space 2331 to receive the light source assembly 232.
In some embodiments, the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312. The second substrate 2311 is connected to the flexible circuit board 212. The light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
In some embodiments, the reinforcing member 2312 is integrally formed with the spacer 22; or the reinforcing member 2312 and the spacer 22 are formed separately.
Referring to FIG. 15, in some embodiments, a first positioning member 2313 is formed on the reinforcing member 2312. The spacer 22 includes a body 221 and a second positioning member 222 formed on the body 221. When the second substrate assembly 231 is disposed on the spacer 22, the first positioning member 2313 cooperates with the second positioning member 222.
Referring to FIG. 13 and FIG. 14, in some embodiments, the side of the spacer 22 combined with the first substrate 211 is provided with a receiving cavity 223. The time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is received in the receiving cavity 223.
Referring to FIG. 13 and FIG. 15, in some embodiments, the spacer 22 is provided with an avoidance through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the avoidance through hole 224.
Referring to FIG. 11 and FIG. 13, in some embodiments, the first substrate assembly 21 further includes a reinforcing plate 213, and the reinforcing plate 213 is combined with the side of the first substrate 211 opposite to the spacer 22.
Referring to FIG. 13 to FIG. 15, in some embodiments, the spacer 22 includes a protruding portion 225 protruding beyond the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225.
Referring to FIG. 9 to FIG. 11, in some embodiments, the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211. The connector 26 is configured to connect the first substrate assembly 21 and an external device. The connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
Referring to FIG. 10 and FIG. 11, in some embodiments, the light emitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are respectively located on opposite sides of the straight line L.
Referring to FIG. 1 and FIG. 2, the electronic device 100 according to an embodiment of the present application includes a time-of-flight module 20, a main camera 30, and a sub-camera 40. The control method includes:
01: collecting a depth image of the subject through the time-of-flight module 20;
02: switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject; and
03: constructing a three-dimensional image of the subject based on the depth image and the captured image.
In some embodiments, the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30.
Referring to FIG. 2 and FIG. 3, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The control method further includes:
04: obtaining the current distance between the subject and the electronic device 100 in real time;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
021: collecting the captured image through the main camera 30 when the current distance is less than the distance threshold; and
022: collecting the captured image through the sub-camera 40 when the current distance is greater than or equal to the distance threshold.
Referring to FIG. 2 and FIG. 4, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The control method further includes:
05: collecting a preview image of the subject through the main camera 30 or the sub-camera 40; and
06: detecting face information in the preview image;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
023: switching the main camera 30 or the sub-camera 40 according to the face information to collect the captured image.
Referring to FIG. 8, in some embodiments, the main camera 30 is a color camera and the sub-camera 40 is a black-and-white camera. The control method further includes:
07: obtaining the current brightness of the ambient light in real time;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
024: collecting the captured image through the sub-camera 40 when the current brightness is less than the brightness threshold; and
025: collecting the captured image through the main camera 30 when the current brightness is greater than or equal to the brightness threshold.
Referring to FIG. 1 and FIG. 2, an embodiment of the present application provides a control method of an electronic device 100. The electronic device 100 includes a time-of-flight module 20, a main camera 30, and a sub-camera 40. The control method includes:
01: collecting a depth image of the subject through the time-of-flight module 20;
02: switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject; and
03: constructing a three-dimensional image of the subject based on the depth image and the captured image.
Referring to FIG. 2, an embodiment of the present application provides an electronic device 100. The electronic device 100 includes a time-of-flight module 20, a main camera 30, a sub-camera 40, and a processor 10. The main camera 30 is configured to collect a captured image of the subject, and the sub-camera 40 is also configured to collect a captured image of the subject. The control method of the electronic device 100 according to the embodiments of the present application can be implemented by the electronic device 100 according to the embodiments of the present application. For example, the time-of-flight module 20 can be used to execute the method in 01, and the processor 10 can be used to execute the methods in 02 and 03. That is to say, the time-of-flight module 20 can be used to collect a depth image of the subject, and the processor 10 can be used to switch the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, and to construct a three-dimensional image of the subject based on the depth image and the captured image.
It can be understood that, with the rapid development of electronic technology, electronic devices such as smart phones and tablet computers have become increasingly popular. An electronic device usually collects a captured image of a subject through a single camera, so its functions are not diverse enough and the user experience is poor.
The control method of the electronic device 100 and the electronic device 100 according to the embodiments of the present application can switch the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, and can also construct a three-dimensional image of the subject based on the captured image and the depth image collected by the time-of-flight module 20. The functions are therefore more diverse, which helps to improve the user experience.
Referring to FIG. 2 again, the electronic device 100 may be a mobile phone, a tablet computer, a smart watch, a smart bracelet, a smart wearable device, or the like. The embodiments of the present application are described by taking the electronic device 100 being a mobile phone as an example; it can be understood that the specific form of the electronic device 100 is not limited to a mobile phone.
The electronic device 100 may include a casing 101 and a bracket 102. The time-of-flight module 20, the main camera 30, and the sub-camera 40 are all disposed on the bracket 102. The time-of-flight module 20, the main camera 30, the sub-camera 40, and the bracket 102 are all received in the casing 101 and can extend out of the casing 101. Specifically, when the time-of-flight module 20 is used to collect a depth image of the subject, or the main camera 30 is used to collect a captured image of the subject, or the sub-camera 40 is used to collect a captured image of the subject, the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move toward the outside of the casing 101 so as to extend out of the casing 101, thereby collecting the depth image or the captured image. In the embodiments of the present application, the time-of-flight module 20, the main camera 30, and the sub-camera 40 may all be front cameras or all be rear cameras. The subject may be a person, an object, or any other subject that the user wishes to photograph. In other embodiments, the casing 101 may be provided with a light through hole (not shown), and the time-of-flight module 20, the main camera 30, and the sub-camera 40 are immovably disposed in the casing 101 and correspond to the light through hole. In still another embodiment, the display screen 103 of the electronic device 100 disposed on the casing 101 may be provided with a light through hole (not shown), and the time-of-flight module 20, the main camera 30, and the sub-camera 40 are disposed below the display screen 103.
In one embodiment, during the operation of the electronic device 100, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) and the step in which the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image (i.e., 03) are implemented in two different application scenarios, and there is no execution order between 02 and 03. That is to say, in one application scenario, the processor 10 may only switch the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, thereby realizing optical zoom and a telephoto shooting experience, without constructing a three-dimensional image of the subject based on the depth image and the captured image. In another application scenario, the processor 10 may choose to construct a three-dimensional image of the subject based on the captured image collected by the main camera 30 and the depth image collected by the time-of-flight module 20, or choose to construct a three-dimensional image of the subject based on the captured image collected by the sub-camera 40 and the depth image collected by the time-of-flight module 20, thereby realizing 3D effects and augmented reality (AR) applications, without switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject.
In another embodiment, during the operation of the electronic device 100, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) and the step in which the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image (i.e., 03) are implemented in the same application scenario, and 02 is executed before 03. That is to say, the processor 10 first switches to the main camera 30 according to the predetermined condition to collect a captured image of the subject, and then constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30; or the processor 10 first switches to the sub-camera 40 according to the predetermined condition to collect a captured image of the subject, and then constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the sub-camera 40. In the embodiments of the present application, since the processor 10 first selects a suitable camera according to the predetermined condition to collect the captured image of the subject, the quality of the captured image is better; a three-dimensional image is then constructed based on the better-quality captured image and the depth image, so that an ideal 3D effect and AR experience can be obtained.
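Purely to illustrate the control flow just described, the following minimal Python sketch shows one way steps 01 to 03 could be organized in software. The object names (tof_module, main_camera, sub_camera) and the helper functions meets_condition and build_3d_image are illustrative assumptions and do not correspond to any specific implementation in the present application.
```python
# Minimal sketch of steps 01-03; class and function names are illustrative only.

def acquire_3d_image(tof_module, main_camera, sub_camera, meets_condition, build_3d_image):
    """meets_condition(depth_image) stands in for the predetermined switching condition."""
    depth_image = tof_module.collect_depth_image()           # step 01

    if meets_condition(depth_image):                          # step 02: switch by condition
        captured_image = main_camera.collect_captured_image()
    else:
        captured_image = sub_camera.collect_captured_image()

    return build_3d_image(depth_image, captured_image)        # step 03
```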
In some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. It should be noted that "wide-angle" and "telephoto" are relative terms: the wide-angle camera has a larger field of view than the telephoto camera, while the telephoto camera has a longer focal length and a longer shooting distance than the wide-angle camera.
When the processor 10 switches to the wide-angle camera according to the predetermined condition to collect the captured image of the subject, this image is a wide-angle image, and the processor 10 constructs a three-dimensional image of the subject based on the depth image and the wide-angle image. When the processor 10 switches to the telephoto camera according to the predetermined condition to collect the captured image of the subject, this image is a telephoto image, and the processor 10 constructs a three-dimensional image of the subject based on the depth image and the telephoto image.
In the embodiments of the present application, the electronic device 100 includes both a wide-angle camera and a telephoto camera, and the processor 10 can switch between the wide-angle camera and the telephoto camera according to the actual situation to collect the captured image of the subject, thereby constructing two different types of three-dimensional images, wide-angle and telephoto, which helps to improve the photographing experience.
In some embodiments, the main camera 30 is a color camera (i.e., an RGB camera) and the sub-camera 40 is a black-and-white camera (i.e., a Mono camera). Compared with a color camera, a black-and-white camera can improve the quality of low-light/night-scene images.
When the processor 10 switches to the main camera 30 according to the predetermined condition to collect the captured image of the subject, this image is an RGB image, and the processor 10 constructs a three-dimensional image of the subject based on the depth image and the RGB image. When the processor 10 switches to the sub-camera 40 according to the predetermined condition to collect the captured image of the subject, this image is a Mono image, and the processor 10 constructs a three-dimensional image of the subject based on the depth image and the Mono image.
In the embodiments of the present application, the electronic device 100 includes both a color camera and a black-and-white camera, and the processor 10 can switch between the color camera and the black-and-white camera according to the actual situation to collect the captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which helps to improve the photographing experience.
Referring to FIG. 2, in some embodiments, the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30.
Since the main camera 30 is located between the time-of-flight module 20 and the sub-camera 40, on the one hand, if the processor 10 switches the main camera 30 or the sub-camera 40 according to the predetermined condition to collect the captured image of the subject, the parallax between the main camera 30 and the sub-camera 40 is small, which helps to achieve smooth zooming. On the other hand, the camera that users generally use more often is the main camera 30; that is to say, the processor 10 is more likely to switch to the main camera 30 according to the predetermined condition to collect the captured image of the subject. Therefore, when the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30, the parallax between the time-of-flight module 20 and the main camera 30 is small, which is beneficial to constructing the three-dimensional image of the subject.
In the embodiments of the present application, the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30, which helps to achieve smooth zooming between the main camera 30 and the sub-camera 40 and at the same time is beneficial to constructing the three-dimensional image of the subject.
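The application does not specify how the depth image and the captured image are combined into a three-dimensional image. The following hypothetical sketch shows one generic computer-vision approach, back-projecting the depth pixels into a colored point cloud, assuming calibrated intrinsics K_tof and K_cam and an extrinsic transform (R, t) from the time-of-flight module 20 to the selected camera; the small parallax discussed above keeps the occlusion gaps in such a registration small.
```python
# Hypothetical sketch (not from the application): register ToF depth into the frame of the
# selected camera and sample colors from the captured image, assuming pinhole models.
import numpy as np

def build_colored_point_cloud(depth, color, K_tof, K_cam, R, t):
    """depth: HxW in metres; color: H'xW'x3; returns (Nx3 points, Nx3 colors)."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3).T   # 3xN (u, v, 1)
    rays = np.linalg.inv(K_tof) @ pix                                      # rays at unit depth
    pts_tof = rays * depth.reshape(1, -1)                                  # 3D points, ToF frame
    pts_cam = R @ pts_tof + t.reshape(3, 1)                                # 3D points, camera frame
    proj = K_cam @ pts_cam
    z = proj[2]
    uv = np.round(proj[:2] / np.maximum(z, 1e-9)).astype(int)              # projected pixel coords
    valid = ((depth.reshape(-1) > 0) & (z > 0)
             & (uv[0] >= 0) & (uv[0] < color.shape[1])
             & (uv[1] >= 0) & (uv[1] < color.shape[0]))
    return pts_cam[:, valid].T, color[uv[1, valid], uv[0, valid]]
```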
Further, the centers of the time-of-flight module 20, the main camera 30, and the sub-camera 40 may be located on a straight line in sequence. On the one hand, this reduces the length of the bracket 102 in the direction from the top of the electronic device 100 (i.e., the side of the electronic device 100 close to the bracket 102) to the bottom (i.e., the side of the electronic device 100 away from the bracket 102). On the other hand, after the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move toward the outside of the casing 101, they can extend out of the casing 101 synchronously, which structurally ensures that the time-of-flight module 20, the main camera 30, and the sub-camera 40 can work synchronously, saving shooting time.
Referring to FIG. 2 and FIG. 3, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The control method further includes:
04: obtaining the current distance between the subject and the electronic device 100 in real time;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
021: collecting the captured image through the main camera 30 when the current distance is less than the distance threshold; and
022: collecting the captured image through the sub-camera 40 when the current distance is greater than or equal to the distance threshold.
Referring to FIG. 2, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The processor 10 can be used to execute the method in 04, the main camera 30 can be used to execute the method in 021, and the sub-camera 40 can be used to execute the method in 022. That is to say, the processor 10 can be used to obtain the current distance between the subject and the electronic device 100 in real time. When the current distance is less than the distance threshold, the main camera 30 collects the captured image; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects the captured image.
Specifically, since the processor 10 obtains the current distance between the subject and the electronic device 100 in real time, the electronic device 100 can switch the camera used to collect the captured image in real time according to the current distance.
Assume that the distance threshold is d0. At a first moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d1, where d1 < d0; at this time, the main camera 30 collects a captured image of the subject. At a second moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d2, where d2 > d0; at this time, the sub-camera 40 collects a captured image of the subject again. Finally, the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the sub-camera 40.
Further, when the current distance is less than the distance threshold, the main camera 30 may focus in a laser focusing mode. When the current distance is greater than or equal to the distance threshold, the sub-camera 40 may focus in a passive focusing mode, where the passive focusing mode includes a contrast focusing mode and a phase focusing mode.
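As an illustrative sketch of the switching rule in 021/022 together with the focusing modes mentioned above, the following Python fragment is a minimal example; the threshold value and the camera methods are assumptions, not taken from the application.
```python
# Illustrative sketch of distance-based switching (021/022) with the focusing modes above.
DISTANCE_THRESHOLD_M = 1.0   # hypothetical value of the threshold d0, in metres

def capture_by_distance(current_distance_m, wide_camera, tele_camera):
    if current_distance_m < DISTANCE_THRESHOLD_M:
        wide_camera.set_focus_mode("laser")               # laser focusing for near subjects
        return wide_camera.collect_captured_image()       # 021: main (wide-angle) camera
    tele_camera.set_focus_mode("contrast_or_phase")       # passive focusing for far subjects
    return tele_camera.collect_captured_image()           # 022: sub (telephoto) camera
```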
In some embodiments, the step in which the processor 10 obtains the current distance between the subject and the electronic device 100 in real time (i.e., 04) includes: the processor 10 continuously obtains a plurality of initial distances between the subject and the electronic device 100; when the variations among the plurality of initial distances are all less than a predetermined value, the processor 10 calculates the current distance between the subject and the electronic device 100 as the average of the plurality of initial distances, and then switches the main camera 30 or the sub-camera 40 according to the calculated current distance to collect the captured image of the subject.
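A minimal sketch of this distance-stabilization step is given below, interpreting the predetermined value as a bound on the spread of the readings; the function name and the example bound are assumptions.
```python
# Sketch of the averaging step above: the readings only count as the current distance
# once they have settled within a predetermined bound (here 0.05 m, purely illustrative).
def stable_distance(initial_distances, max_variation=0.05):
    """Return the averaged distance if the readings vary by less than max_variation, else None."""
    if max(initial_distances) - min(initial_distances) < max_variation:
        return sum(initial_distances) / len(initial_distances)
    return None   # readings still fluctuating; keep sampling
```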
In one embodiment, the processor 10 is configured to obtain the current distance according to the depth image.
Specifically, the time-of-flight module 20 first collects the depth image of the subject, and the current distance between the subject and the electronic device 100 is then obtained according to the depth image. When the current distance is less than the distance threshold, the main camera 30 collects the captured image; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects the captured image of the subject.
In the embodiments of the present application, the depth image collected by the time-of-flight module 20 can be used both to construct a three-dimensional image of the subject and to detect the current distance between the subject and the electronic device 100, so the electronic device 100 does not need an additional distance sensor to detect the current distance, which helps to reduce the number of components in the electronic device 100 and save cost.
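The application does not state how the current distance is derived from the depth image; one simple possibility, shown in the hypothetical sketch below, is to take a robust statistic (here the median) over a central window of the depth image.
```python
# Hypothetical helper: estimate the subject distance as the median ToF depth of a central window.
import numpy as np

def current_distance_from_depth(depth_image, window=0.2):
    """depth_image: HxW array in metres; window is the fraction of each dimension to sample."""
    h, w = depth_image.shape
    dh, dw = max(1, int(h * window / 2)), max(1, int(w * window / 2))
    center = depth_image[h // 2 - dh:h // 2 + dh, w // 2 - dw:w // 2 + dw]
    valid = center[center > 0]                # ignore pixels with no ToF return
    return float(np.median(valid)) if valid.size else None
```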
In another embodiment, the electronic device 100 further includes a distance detection device 50. The distance detection device 50 is configured to detect the current distance in real time and send it to the processor 10.
For example, the distance detection device 50 is a distance sensor, and the distance sensor directly detects the current distance between the subject and the electronic device 100. For another example, the distance detection device 50 is a structured-light module; the structured-light module collects a structured-light image of the subject and then detects the current distance between the subject and the electronic device 100 based on the structured-light image. Of course, in other embodiments, the distance detection device 50 may also be another type of detection device capable of detecting the current distance between the subject and the electronic device 100, such as an ultrasonic rangefinder, a radar rangefinder, or a proximity sensor.
In the embodiments of the present application, the distance detection device 50 can detect the current distance in advance. If the current distance is less than the distance threshold, the main camera 30 collects the captured image; if the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects the captured image. The main camera 30 or the sub-camera 40 can collect the captured image of the subject at the same time as the time-of-flight module 20 collects the depth image of the subject, so as to save the time for constructing the three-dimensional image of the subject and improve the user experience. Of course, the main camera 30 or the sub-camera 40 may also collect the captured image before the time-of-flight module 20 collects the depth image, or collect the captured image after the time-of-flight module 20 collects the depth image, which is not limited here.
Referring to FIG. 2 and FIG. 4, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The control method further includes:
05: collecting a preview image of the subject through the main camera 30 or the sub-camera 40; and
06: detecting face information in the preview image;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
023: switching the main camera 30 or the sub-camera 40 according to the face information to collect the captured image.
Referring to FIG. 2, in some embodiments, the main camera 30 is a wide-angle camera and the sub-camera 40 is a telephoto camera. The main camera 30 or the sub-camera 40 can be used to execute the method in 05, and the processor 10 can be used to execute the methods in 06 and 023. That is to say, the main camera 30 or the sub-camera 40 can be used to collect a preview image of the subject, and the processor 10 can be used to detect face information in the preview image and to switch the main camera 30 or the sub-camera 40 according to the face information to collect the captured image.
In one embodiment, the face information includes the number of faces. When the number of faces is greater than a predetermined number (as shown in FIG. 5), the main camera 30 collects the captured image; when the number of faces is less than or equal to the predetermined number (as shown in FIG. 6), the sub-camera 40 collects the captured image. It can be understood that when the number of faces is large, shooting with the main camera 30, which has a larger field of view, can capture more faces in the captured image; when the number of faces is small, the sub-camera 40, which has a relatively small field of view but a longer focal length, is sufficient and can visually make the subject appear closer and larger.
When the step of collecting the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is collecting the preview image of the subject through the main camera 30, the processor 10 keeps the main camera 30 collecting the captured image of the subject when the number of faces is greater than the predetermined number; at this time, the main camera 30 may also directly use the preview image as the captured image without collecting a captured image again. When the number of faces is less than or equal to the predetermined number, the processor 10 switches to the sub-camera 40 to collect the captured image.
When the step of collecting the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is collecting the preview image of the subject through the sub-camera 40, the processor 10 switches to the main camera 30 to collect the captured image of the subject when the number of faces is greater than the predetermined number; when the number of faces is less than or equal to the predetermined number, the processor 10 keeps the sub-camera 40 collecting the captured image; at this time, the sub-camera 40 may also directly use the preview image as the captured image without collecting a captured image again.
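The following sketch illustrates the face-count rule in 023, including the reuse of the preview image when no switch is needed; the face detector, the camera objects, and the predetermined number are assumptions.
```python
# Illustrative sketch of switching by face count (023); the threshold value is hypothetical.
PREDETERMINED_COUNT = 2   # example "many faces" threshold

def capture_by_face_count(preview_image, preview_camera, wide_camera, tele_camera, detect_faces):
    num_faces = len(detect_faces(preview_image))          # step 06: face information
    target = wide_camera if num_faces > PREDETERMINED_COUNT else tele_camera
    if target is preview_camera:
        return preview_image                               # keep the current camera; reuse the preview
    return target.collect_captured_image()                 # switch cameras and capture again
```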
In another embodiment, the face information includes the area ratio of the face region in the preview image. When the area ratio of the face region in the preview image is greater than a predetermined ratio (for example, one third) (as shown in FIG. 7), the main camera 30 collects the captured image; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio (as shown in FIG. 6), the sub-camera 40 collects the captured image. It can be understood that when the face region occupies a large area of the preview image, shooting with the main camera 30, which has a larger field of view, can capture the whole face in the captured image and give the face region a better composition ratio; when the face region occupies a small area of the preview image, the sub-camera 40, which has a relatively small field of view but a longer focal length, is sufficient and can visually make the subject appear closer and larger.
When the step of collecting the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is collecting the preview image of the subject through the main camera 30, the processor 10 keeps the main camera 30 collecting the captured image of the subject when the area ratio of the face region in the preview image is greater than the predetermined ratio; at this time, the main camera 30 may also directly use the preview image as the captured image without collecting a captured image again. When the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 switches to the sub-camera 40 to collect the captured image.
When the step of collecting the preview image of the subject through the main camera 30 or the sub-camera 40 (i.e., 05) is collecting the preview image of the subject through the sub-camera 40, the processor 10 switches to the main camera 30 to collect the captured image when the area ratio of the face region in the preview image is greater than the predetermined ratio; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 keeps the sub-camera 40 collecting the captured image; at this time, the sub-camera 40 may also directly use the preview image as the captured image without collecting a captured image again.
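A companion sketch for the area-ratio variant is given below; the face boxes are assumed to be pixel rectangles, and the one-third ratio is the example value mentioned above.
```python
# Illustrative sketch of switching by the face-area ratio in the preview image.
PREDETERMINED_RATIO = 1.0 / 3.0   # example value from the description above

def face_area_ratio(face_boxes, image_width, image_height):
    """face_boxes: list of (x, y, w, h) pixel rectangles from a face detector."""
    face_area = sum(w * h for (_x, _y, w, h) in face_boxes)
    return face_area / float(image_width * image_height)

def choose_camera_by_face_ratio(ratio, wide_camera, tele_camera):
    return wide_camera if ratio > PREDETERMINED_RATIO else tele_camera
```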
Referring to FIG. 8, in some embodiments, the main camera 30 is a color camera and the sub-camera 40 is a black-and-white camera. The control method further includes:
07: obtaining the current brightness of the ambient light in real time;
and the step of switching the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject (i.e., 02) includes:
024: collecting the captured image through the sub-camera 40 when the current brightness is less than the brightness threshold; and
025: collecting the captured image through the main camera 30 when the current brightness is greater than or equal to the brightness threshold.
In some embodiments, the main camera 30 is a color camera and the sub-camera 40 is a black-and-white camera. The processor 10 can be used to execute the method in 07, the sub-camera 40 can be used to execute the method in 024, and the main camera 30 can be used to execute the method in 025. That is to say, the processor 10 can be used to obtain the current brightness of the ambient light in real time. When the current brightness is less than the brightness threshold, the sub-camera 40 collects the captured image; when the current brightness is greater than or equal to the brightness threshold, the main camera 30 collects the captured image.
It can be understood that, compared with a color camera, a black-and-white camera can improve the quality of low-light/night-scene images. Therefore, when the current brightness is less than the brightness threshold, the captured image can be collected by the sub-camera 40; when the current brightness is greater than or equal to the brightness threshold, the captured image can be collected by the main camera 30. The processor 10 switches between the color camera and the black-and-white camera according to the current brightness of the ambient light to collect the captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which helps to improve the photographing experience.
Specifically, the electronic device 100 may further include an ambient light sensor, and the ambient light sensor is configured to detect the current brightness of the ambient light and send it to the processor 10.
Since the processor 10 obtains the current brightness of the ambient light in real time, the electronic device 100 can switch the camera used to collect the captured image in real time according to the current brightness. Assume that the brightness threshold is L0. At a first moment, the processor 10 obtains the current brightness of the ambient light as L1, where L1 < L0; at this time, the sub-camera 40 collects a captured image of the subject. At a second moment, the processor 10 obtains the current brightness of the ambient light as L2, where L2 > L0; at this time, the main camera 30 collects a captured image of the subject again. Finally, the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30.
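The brightness rule in 024/025 can be sketched in the same style; the threshold value and the ambient-light-sensor interface are assumptions.
```python
# Illustrative sketch of brightness-based switching (024/025); values and APIs are hypothetical.
BRIGHTNESS_THRESHOLD_LUX = 50.0   # example value of the threshold L0

def capture_by_brightness(ambient_light_sensor, color_camera, mono_camera):
    current_lux = ambient_light_sensor.read_lux()
    if current_lux < BRIGHTNESS_THRESHOLD_LUX:
        return mono_camera.collect_captured_image()    # 024: low light, black-and-white camera
    return color_camera.collect_captured_image()       # 025: normal light, color camera
```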
示例性的,本申请实施方式的飞行时间模组20可具有如下结构。Exemplarily, the time of flight module 20 according to the embodiment of the present application may have the following structure.
请参阅图9至图12,飞行时间模组20包括第一基板组件21、垫块22、光发射器23及光接收器24。第一基板组件21包括互相连接的第一基板211及柔性电路板212。垫块22设置在第一基板211上。光发射器23用于向外发射光信号,光发射器23设置在垫块22上。柔性电路板212弯折且柔性电路板212的一端连接第一基板211,另一端连接光发射器23。光接收器24设置在第一基板211上,光接收器24用于接收被反射回的光发射器23发射的光信号,光接收器24包括壳体241及设置在壳体241上的光学元件242,壳体241与垫块22连接成一体。Please refer to FIGS. 9 to 12. The time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24. The first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other. The spacer 22 is disposed on the first substrate 211. The light transmitter 23 is configured to emit an optical signal outward. The light transmitter 23 is disposed on the cushion block 22. The flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23. The light receiver 24 is disposed on the first substrate 211. The light receiver 24 is configured to receive the light signal emitted by the reflected light transmitter 23. The light receiver 24 includes a casing 241 and an optical element disposed on the casing 241. 242. The housing 241 is connected with the cushion block 22 as a whole.
In the mobile terminal 100 according to the embodiments of the present application, since the light emitter 23 is disposed on the pad 22, the pad 22 raises the light emitter 23 and thereby the height of its light-exit surface, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can fully illuminate the measured object.
Specifically, the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212. The first substrate 211 may be a printed circuit board or a flexible circuit board, and control circuitry of the time-of-flight module 20 may be laid out on the first substrate 211. One end of the flexible circuit board 212 may be connected to the first substrate 211, and the flexible circuit board 212 can be bent at a certain angle, so that there are more options for the relative positions of the devices connected at its two ends.
Referring to FIG. 9 and FIG. 13, the pad 22 is disposed on the first substrate 211. In one example, the pad 22 is in contact with and carried on the first substrate 211; specifically, the pad 22 may be bonded to the first substrate 211 by adhesive or the like. The pad 22 may be made of metal, plastic, or the like. In the embodiments of the present application, the surface of the pad 22 bonded to the first substrate 211 may be flat, and the surface of the pad 22 facing away from that bonded surface may also be flat, so that the light emitter 23 is well supported and stable when disposed on the pad 22.
The light emitter 23 is configured to emit a light signal outward. Specifically, the light signal may be infrared light, and may be a dot-matrix speckle pattern projected toward the measured object; the light signal exits the light emitter 23 at a certain divergence angle. The light emitter 23 is disposed on the pad 22; in the embodiments of the present application, the light emitter 23 is disposed on the side of the pad 22 facing away from the first substrate 211, or in other words, the pad 22 spaces the first substrate 211 and the light emitter 23 apart so that a height difference is formed between the light emitter 23 and the first substrate 211. The light emitter 23 is also connected to the flexible circuit board 212. The flexible circuit board 212 is bent, with one end connected to the first substrate 211 and the other end connected to the light emitter 23, so as to transmit control signals for the light emitter 23 from the first substrate 211 to the light emitter 23, or to transmit feedback signals from the light emitter 23 (for example, timing and frequency information of the emitted light signal, temperature information of the light emitter 23, and the like) to the first substrate 211.
Referring to FIG. 9, FIG. 10 and FIG. 12, the light receiver 24 is configured to receive the light signal emitted by the light emitter 23 and reflected back. The light receiver 24 is disposed on the first substrate 211, and the contact surface between the light receiver 24 and the first substrate 211 is substantially flush with the contact surface between the pad 22 and the first substrate 211 (that is, the two start from the same mounting plane). Specifically, the light receiver 24 includes a housing 241 and an optical element 242. The housing 241 is disposed on the first substrate 211, and the optical element 242 is disposed on the housing 241; the housing 241 may be the lens holder and lens barrel of the light receiver 24, and the optical element 242 may be a lens or similar element disposed in the housing 241. Further, the light receiver 24 may also include a photosensitive chip (not shown). The light signal reflected by the measured object passes through the optical element 242 and strikes the photosensitive chip, which responds to the light signal. The time-of-flight module 20 calculates the time difference between the emission of the light signal by the light emitter 23 and the reception by the photosensitive chip of the light signal reflected by the measured object, and then obtains depth information of the measured object; this depth information can be used for ranging, for generating a depth image, or for three-dimensional modeling. In the embodiments of the present application, the housing 241 and the pad 22 are connected as one piece. Specifically, the housing 241 and the pad 22 may be integrally formed, for example made of the same material and integrally formed by injection molding, machining, or the like, or made of different materials and integrally formed by two-shot injection molding or the like. The housing 241 and the pad 22 may also be formed separately with mating structures; when assembling the time-of-flight module 20, the housing 241 and the pad 22 may first be joined into one piece and then mounted together on the first substrate 211, or one of the housing 241 and the pad 22 may first be mounted on the first substrate 211 and the other then mounted on the first substrate 211 and joined to it.
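As a rough illustration of the time-of-flight relation described above (an explanatory sketch only, not part of the disclosure), the depth follows from the measured round-trip time of the light signal as depth = c · Δt / 2. The helper below and its example input are assumptions for illustration, not the module's actual interface:

```python
# Minimal sketch of the time-of-flight depth relation (illustrative only).
# `delta_t_seconds` stands for the measured round-trip time of the emitted
# light signal for one pixel; it is an assumed input, not a real device API.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip_time(delta_t_seconds: float) -> float:
    """Depth in meters: the light travels to the object and back,
    so the one-way distance is half of c * delta_t."""
    return SPEED_OF_LIGHT_M_PER_S * delta_t_seconds / 2.0

# Example: a round trip of about 6.67 nanoseconds corresponds to roughly 1 m.
print(depth_from_round_trip_time(6.67e-9))  # ~1.0
```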
In the mobile terminal 100 according to the embodiments of the present application, since the light emitter 23 is disposed on the pad 22, the pad 22 raises the light emitter 23 and thereby the height of its light-exit surface, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can fully illuminate the measured object. The light-exit surface of the light emitter 23 may be flush with the light-entrance surface of the light receiver 24, or the light-exit surface of the light emitter 23 may be slightly lower than the light-entrance surface of the light receiver 24, or the light-exit surface of the light emitter 23 may be slightly higher than the light-entrance surface of the light receiver 24.
Referring to FIG. 11 and FIG. 13, in some embodiments the first substrate assembly 21 further includes a reinforcing plate 213, which is bonded to the side of the first substrate 211 facing away from the pad 22. The reinforcing plate 213 may cover one side face of the first substrate 211 and may be used to increase the strength of the first substrate 211 and prevent it from deforming. In addition, the reinforcing plate 213 may be made of a conductive material, such as a metal or an alloy. When the time-of-flight module 20 is installed in the mobile terminal 100, the reinforcing plate 213 may be electrically connected to the casing 10 so that the reinforcing plate 213 is grounded, effectively reducing interference with the time-of-flight module 20 from the static electricity of external components.
Referring to FIG. 13 to FIG. 15, in some embodiments the pad 22 includes a protruding portion 225 that extends beyond the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225. Specifically, one part of the pad 22 is carried directly on the first substrate 211, while another part is not in direct contact with the first substrate 211 and extends beyond its side edge 2111 to form the protruding portion 225. The flexible circuit board 212 may be connected at this side edge 2111 and is bent around the protruding portion 225; in other words, the flexible circuit board 212 is bent so that the protruding portion 225 lies within the space enclosed by the bend. When the flexible circuit board 212 is subjected to an external force, it therefore cannot collapse inward into an excessively tight bend that would damage it.
Further, as shown in FIG. 14, in some embodiments the outer side surface 2251 of the protruding portion 225 is a smooth curved surface (for example, the outer side surface of a cylinder), that is, the outer side surface 2251 of the protruding portion 225 has no abrupt change in curvature. Even if the flexible circuit board 212 is bent against the outer side surface 2251 of the protruding portion 225, the degree of bending of the flexible circuit board 212 will not be excessive, further ensuring that the flexible circuit board 212 remains intact.
Referring to FIG. 9 to FIG. 11, in some embodiments the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211. The connector 26 is used to connect the first substrate assembly 21 to an external device. The connector 26 and the flexible circuit board 212 are connected to opposite ends of the first substrate 211. The connector 26 may be a connection socket or a connection plug. When the time-of-flight module 20 is installed in the casing 10, the connector 26 may be connected to the main board of the mobile terminal 100 so that the time-of-flight module 20 is electrically connected to the main board. The connector 26 and the flexible circuit board 212 may, for example, be connected to the left and right ends of the first substrate 211, respectively, or to its front and rear ends, respectively.
Referring to FIG. 10 and FIG. 11, in some embodiments the light emitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L. It can be understood that, since the light emitter 23 and the light receiver 24 are arranged side by side, the size of the time-of-flight module 20 in the direction of the straight line L may already be relatively large. Placing the connector 26 and the flexible circuit board 212 on opposite sides of the straight line L does not further increase the size of the time-of-flight module 20 in the direction of the straight line L, which makes it easier to install the time-of-flight module 20 in the casing 10 of the mobile terminal 100.
Referring to FIG. 13 and FIG. 14, in some embodiments a receiving cavity 223 is formed in the side of the pad 22 that is bonded to the first substrate 211. The time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is received in the receiving cavity 223. The electronic component 25 may be a capacitor, an inductor, a transistor, a resistor, or the like; it may be electrically connected to the control circuitry laid out on the first substrate 211 and used to drive or control the operation of the light emitter 23 or the light receiver 24. Receiving the electronic component 25 in the receiving cavity 223 makes good use of the space inside the pad 22, so that the width of the first substrate 211 does not need to be increased to accommodate the electronic component 25, which helps reduce the overall size of the time-of-flight module 20. There may be one or more receiving cavities 223, and multiple receiving cavities 223 may be spaced apart from one another; when the pad 22 is mounted, the receiving cavities 223 can be aligned with the positions of the electronic components 25 before the pad 22 is placed on the first substrate 211.
Referring to FIG. 13 and FIG. 15, in some embodiments the pad 22 is provided with a clearance through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the clearance through hole 224. It can be understood that, for an electronic component 25 to be received in the receiving cavity 223, its height must not exceed the height of the receiving cavity 223. For an electronic component 25 taller than the receiving cavity 223, a clearance through hole 224 corresponding to that receiving cavity 223 can be provided, and the electronic component 25 can extend partially into the clearance through hole 224, so that the electronic component 25 can be accommodated without increasing the height of the pad 22.
Referring to FIG. 13, in some embodiments the light emitter 23 includes a second substrate assembly 231, a light source assembly 232, and a casing 233. The second substrate assembly 231 is disposed on the pad 22 and is connected to the flexible circuit board 212. The light source assembly 232 is disposed on the second substrate assembly 231 and is configured to emit the light signal. The casing 233 is disposed on the second substrate assembly 231 and is formed with a receiving space 2331 that can receive the light source assembly 232. The flexible circuit board 212 may be detachably connected to the second substrate assembly 231, and the light source assembly 232 is electrically connected to the second substrate assembly 231. The casing 233 may be bowl-shaped as a whole, with its opening facing downward and covering the second substrate assembly 231 so as to enclose the light source assembly 232 in the receiving space 2331. In the embodiments of the present application, the casing 233 is provided with a light outlet 2332 corresponding to the light source assembly 232; the light signal emitted from the light source assembly 232 passes through the light outlet 2332 and is emitted outward, either directly or after its optical path has been changed by other optical devices.
Continuing to refer to FIG. 13, in some embodiments the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312. The second substrate 2311 is connected to the flexible circuit board 212, and the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311. The second substrate 2311 may specifically be a printed circuit board or a flexible circuit board, and control circuitry may be laid out on it. The reinforcing member 2312 may be fixedly connected to the second substrate 2311 by gluing, riveting, or the like, and increases the overall strength of the second substrate assembly 231. When the light emitter 23 is disposed on the pad 22, the reinforcing member 2312 can be in direct contact with the pad 22, so the second substrate 2311 is not exposed to the outside and does not need to be in direct contact with the pad 22, making the second substrate 2311 less susceptible to contamination by dust and the like.
In the embodiment shown in FIG. 13, the reinforcing member 2312 and the pad 22 are formed separately. When assembling the time-of-flight module 20, the pad 22 can first be mounted on the first substrate 211; at this point the two ends of the flexible circuit board 212 are connected to the first substrate 211 and the second substrate 2311, respectively, and the flexible circuit board 212 may remain unbent (the state shown in FIG. 15). The flexible circuit board 212 is then bent so that the reinforcing member 2312 is disposed on the pad 22.
Of course, in other embodiments the reinforcing member 2312 and the pad 22 may be integrally formed, for example by injection molding or a similar process; when assembling the time-of-flight module 20, the pad 22 and the light emitter 23 can then be mounted on the first substrate 211 together.
Referring to FIG. 15, in some embodiments a first positioning member 2313 is formed on the reinforcing member 2312. The pad 22 includes a body 221 and a second positioning member 222 formed on the body 221. When the second substrate assembly 231 is disposed on the pad 22, the first positioning member 2313 mates with the second positioning member 222. Specifically, once the first positioning member 2313 and the second positioning member 222 are mated, relative movement between the second substrate assembly 231 and the pad 22 is effectively restricted. The specific types of the first positioning member 2313 and the second positioning member 222 can be selected as needed. For example, the first positioning member 2313 may be a positioning hole formed in the reinforcing member 2312 and the second positioning member 222 a positioning post that extends into the positioning hole so that the two mate with each other; or the first positioning member 2313 may be a positioning post formed on the reinforcing member 2312 and the second positioning member 222 a positioning hole into which the positioning post extends; or there may be a plurality of first positioning members 2313 and a plurality of second positioning members 222, some of the first positioning members 2313 being positioning holes with the corresponding second positioning members 222 being positioning posts, and some of the first positioning members 2313 being positioning posts with the corresponding second positioning members 222 being positioning holes, the positioning posts extending into the positioning holes so that the first positioning members 2313 and the second positioning members 222 mate with one another.
The structure of the light source assembly 232 is described below by way of example:
Referring to FIG. 16, the light source assembly 232 includes a light source 60, a lens barrel 70, a diffuser 80, and a protective cover 90. The light source 60 is connected to the second substrate assembly 231. The lens barrel 70 includes a first surface 71 and a second surface 72 opposite to each other, and defines a receiving cavity 75 passing through the first surface 71 and the second surface 72. The first surface 71 is recessed toward the second surface 72 to form a mounting groove 76 communicating with the receiving cavity 75. The diffuser 80 is installed in the mounting groove 76. The protective cover 90 is mounted on the side of the lens barrel 70 where the first surface 71 is located, and the diffuser 80 is sandwiched between the protective cover 90 and the bottom surface 77 of the mounting groove 76.
The protective cover 90 can be mounted on the lens barrel 70 by a threaded connection, a snap fit, or fasteners. For example, referring to FIG. 16, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (the protective side wall 92) is provided with an internal thread and the lens barrel 70 with an external thread, and the internal thread of the protective cover 90 is screwed onto the external thread of the lens barrel 70 to mount the protective cover 90 on the lens barrel 70. Alternatively, referring to FIG. 17, when the protective cover 90 includes a top wall 91, the protective cover 90 (the top wall 91) is provided with a locking hole 95 and the end of the lens barrel 70 with a hook 73; when the protective cover 90 is placed on the lens barrel 70, the hook 73 engages in the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70. Alternatively, referring to FIG. 18, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (the protective side wall 92) is provided with a locking hole 95 and the lens barrel 70 with a hook 73; when the protective cover 90 is placed on the lens barrel 70, the hook 73 engages in the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70. Alternatively, referring to FIG. 19, when the protective cover 90 includes a top wall 91, the end of the lens barrel 70 is provided with a first positioning hole 74 and the protective cover 90 (the top wall 91) with a second positioning hole 93 corresponding to the first positioning hole 74, and a fastener 94 passes through the second positioning hole 93 and is locked in the first positioning hole 74 to mount the protective cover 90 on the lens barrel 70. When the protective cover 90 is mounted on the lens barrel 70, the protective cover 90 presses against the diffuser 80 and the diffuser 80 against the bottom surface 77, so that the diffuser 80 is clamped between the protective cover 90 and the bottom surface 77.
In the light source assembly 232, the mounting groove 76 is formed in the lens barrel 70, the diffuser 80 is installed in the mounting groove 76, and the protective cover 90 is mounted on the lens barrel 70 to clamp the diffuser 80 between the protective cover 90 and the bottom surface 77 of the mounting groove 76, thereby fixing the diffuser 80 to the lens barrel 70. This avoids using glue to fix the diffuser 80 to the lens barrel 70, which prevents the glue, after volatilizing into a gaseous state, from diffusing and solidifying on the surface of the diffuser 80 and affecting its microstructure, and prevents the diffuser 80 from falling off the lens barrel 70 when the adhesion of the glue joining the diffuser 80 and the lens barrel 70 weakens with age.
In the description of this specification, reference to the terms "certain embodiments", "one embodiment", "some embodiments", "illustrative embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, for example two or three, unless specifically defined otherwise.
Although the embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present application. Those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present application, and the scope of the present application is defined by the claims and their equivalents.

Claims (20)

  1. An electronic device, characterized in that the electronic device comprises:
    a time-of-flight module, configured to collect a depth image of a subject;
    a main camera, configured to collect a captured image of the subject;
    a sub-camera, configured to collect a captured image of the subject; and
    a processor, configured to switch to the main camera or the sub-camera according to a predetermined condition to collect the captured image of the subject, and to construct a three-dimensional image of the subject according to the depth image and the captured image.
  2. The electronic device according to claim 1, characterized in that the main camera is a wide-angle camera and the sub-camera is a telephoto camera; or
    the main camera is a color camera and the sub-camera is a black-and-white camera.
  3. The electronic device according to claim 1, characterized in that the time-of-flight module and the sub-camera are respectively disposed on two sides of the main camera.
  4. The electronic device according to claim 1, characterized in that the main camera is a wide-angle camera, the sub-camera is a telephoto camera, and the processor is further configured to obtain in real time a current distance between the subject and the electronic device;
    when the current distance is less than a distance threshold, the main camera collects the captured image; and
    when the current distance is greater than or equal to the distance threshold, the sub-camera collects the captured image.
  5. The electronic device according to claim 4, characterized in that the processor is configured to obtain the current distance according to the depth image; or
    the electronic device further comprises a distance detection device, the distance detection device being configured to detect the current distance in real time and send it to the processor.
  6. The electronic device according to claim 1, characterized in that the main camera is a wide-angle camera, the sub-camera is a telephoto camera, and the main camera or the sub-camera is further configured to collect a preview image of the subject;
    the processor is configured to detect face information in the preview image, and to switch to the main camera or the sub-camera according to the face information to collect the captured image.
  7. The electronic device according to claim 1, characterized in that the main camera is a color camera, the sub-camera is a black-and-white camera, and the processor is further configured to obtain a current brightness of ambient light in real time;
    when the current brightness is less than a brightness threshold, the sub-camera collects the captured image; and
    when the current brightness is greater than or equal to the brightness threshold, the main camera collects the captured image.
  8. The electronic device according to claim 1, characterized in that the time-of-flight module comprises:
    a first substrate assembly, the first substrate assembly comprising a first substrate and a flexible circuit board connected to each other;
    a pad, the pad being disposed on the first substrate;
    a light emitter, the light emitter being configured to emit a light signal outward and being disposed on the pad, the flexible circuit board being bent, one end of the flexible circuit board being connected to the first substrate and the other end being connected to the light emitter; and
    a light receiver, the light receiver being disposed on the first substrate and configured to receive the light signal emitted by the light emitter and reflected back, the light receiver comprising a housing and an optical element disposed on the housing, the housing being connected to the pad as one piece.
  9. The electronic device according to claim 8, characterized in that the pad and the housing are integrally formed.
  10. The electronic device according to claim 8 or 9, characterized in that the light emitter comprises:
    a second substrate assembly, the second substrate assembly being disposed on the pad and connected to the flexible circuit board;
    a light source assembly disposed on the second substrate assembly, the light source assembly being configured to emit the light signal; and
    a casing disposed on the second substrate assembly, the casing being formed with a receiving space to receive the light source assembly.
  11. The electronic device according to claim 10, characterized in that the second substrate assembly comprises a second substrate and a reinforcing member, the second substrate is connected to the flexible circuit board, and the light source assembly and the reinforcing member are disposed on opposite sides of the second substrate.
  12. The electronic device according to claim 11, characterized in that the reinforcing member is integrally formed with the pad; or the reinforcing member and the pad are formed separately.
  13. The electronic device according to claim 11, characterized in that a first positioning member is formed on the reinforcing member, the pad comprises a body and a second positioning member formed on the body, and when the second substrate assembly is disposed on the pad, the first positioning member mates with the second positioning member.
  14. The electronic device according to claim 8, characterized in that a receiving cavity is formed in a side of the pad that is bonded to the first substrate, the time-of-flight module further comprises an electronic component disposed on the first substrate, and the electronic component is received in the receiving cavity.
  15. The electronic device according to claim 14, characterized in that the pad is provided with a clearance through hole communicating with at least one receiving cavity, and at least one electronic component extends into the clearance through hole.
  16. A control method for an electronic device, characterized in that the electronic device comprises a time-of-flight module, a main camera, and a sub-camera, and the control method comprises:
    collecting a depth image of a subject through the time-of-flight module;
    switching to the main camera or the sub-camera according to a predetermined condition to collect a captured image of the subject; and
    constructing a three-dimensional image of the subject according to the depth image and the captured image.
  17. The control method for an electronic device according to claim 16, characterized in that the time-of-flight module and the sub-camera are respectively disposed on two sides of the main camera.
  18. The control method for an electronic device according to claim 16, characterized in that the main camera is a wide-angle camera and the sub-camera is a telephoto camera, and the control method further comprises:
    obtaining in real time a current distance between the subject and the electronic device;
    the step of switching to the main camera or the sub-camera according to the predetermined condition to collect the captured image of the subject comprising:
    when the current distance is less than a distance threshold, collecting the captured image through the main camera; and
    when the current distance is greater than or equal to the distance threshold, collecting the captured image through the sub-camera.
  19. The control method for an electronic device according to claim 16, characterized in that the main camera is a wide-angle camera and the sub-camera is a telephoto camera, and the control method further comprises:
    collecting a preview image of the subject through the main camera or the sub-camera; and
    detecting face information in the preview image;
    the step of switching to the main camera or the sub-camera according to the predetermined condition to collect the captured image of the subject comprising:
    switching to the main camera or the sub-camera according to the face information to collect the captured image.
  20. The control method for an electronic device according to claim 16, characterized in that the main camera is a color camera and the sub-camera is a black-and-white camera, and the control method further comprises:
    obtaining a current brightness of ambient light in real time;
    the step of switching to the main camera or the sub-camera according to the predetermined condition to collect the captured image of the subject comprising:
    when the current brightness is less than a brightness threshold, collecting the captured image through the sub-camera; and
    when the current brightness is greater than or equal to the brightness threshold, collecting the captured image through the main camera.
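The following is an explanatory sketch, not part of the claims, of the "predetermined condition" switching recited in claims 4, 6, 7 and 18 to 20. The threshold values, camera objects, and face-detection flag are hypothetical placeholders rather than an actual device API, and the direction of the face-based switch is an assumption made purely for illustration:

```python
# Illustrative sketch of predetermined-condition camera switching (assumed values and API).

DISTANCE_THRESHOLD_M = 1.5       # assumed distance threshold for illustration
BRIGHTNESS_THRESHOLD_LUX = 50.0  # assumed brightness threshold for illustration

def select_by_distance(current_distance_m, wide_angle_main, telephoto_sub):
    # Claims 4/18: near subject -> wide-angle main camera; far subject -> telephoto sub-camera.
    return wide_angle_main if current_distance_m < DISTANCE_THRESHOLD_M else telephoto_sub

def select_by_brightness(current_brightness_lux, color_main, mono_sub):
    # Claims 7/20: dim scene -> black-and-white sub-camera; bright scene -> color main camera.
    return mono_sub if current_brightness_lux < BRIGHTNESS_THRESHOLD_LUX else color_main

def select_by_face(face_detected_in_preview, wide_angle_main, telephoto_sub):
    # Claims 6/19: switch according to face information in the preview image; the claims do not
    # fix the direction, so favoring the telephoto sub-camera when a face is detected is an
    # assumption made here only for illustration.
    return telephoto_sub if face_detected_in_preview else wide_angle_main
```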
PCT/CN2019/090077 2018-08-22 2019-06-05 Electronic device and control method for electronic device WO2020038063A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810963397.2 2018-08-22
CN201810963397.2A CN109005348A (en) 2018-08-22 2018-08-22 The control method of electronic device and electronic device

Publications (1)

Publication Number Publication Date
WO2020038063A1 true WO2020038063A1 (en) 2020-02-27

Family

ID=64593671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090077 WO2020038063A1 (en) 2018-08-22 2019-06-05 Electronic device and control method for electronic device

Country Status (2)

Country Link
CN (1) CN109005348A (en)
WO (1) WO2020038063A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114765662A (en) * 2021-01-13 2022-07-19 富士康(昆山)电脑接插件有限公司 Sensing module

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005348A (en) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 The control method of electronic device and electronic device
CN112492138A (en) 2018-12-24 2021-03-12 华为技术有限公司 Camera shooting assembly and electronic equipment
CN109639983B (en) * 2019-01-03 2020-09-04 Oppo广东移动通信有限公司 Photographing method, photographing device, terminal and computer-readable storage medium
CN112738397A (en) * 2020-12-29 2021-04-30 维沃移动通信(杭州)有限公司 Shooting method, shooting device, electronic equipment and readable storage medium
WO2023070313A1 (en) * 2021-10-26 2023-05-04 京东方科技集团股份有限公司 Time-of-flight camera module and display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105321159A (en) * 2014-07-29 2016-02-10 宏达国际电子股份有限公司 Hand-held electronic apparatus, image capturing apparatus and method for obtaining depth information
CN106454077A (en) * 2016-09-26 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Photographing method, photographing apparatus and terminal
US20180184071A1 (en) * 2015-06-23 2018-06-28 Huawei Technologies Co., Ltd. Photographing device and method for obtaining depth information
CN108989783A (en) * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 The control method of electronic device and electronic device
CN109005348A (en) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 The control method of electronic device and electronic device
CN109040556A (en) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 Imaging device and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101862199B1 (en) * 2012-02-29 2018-05-29 삼성전자주식회사 Method and Fusion system of time-of-flight camera and stereo camera for reliable wide range depth acquisition
CN102722080B (en) * 2012-06-27 2015-11-18 杭州南湾科技有限公司 A kind of multi purpose spatial image capture method based on many lens shootings
CN106657455B (en) * 2016-10-25 2023-05-05 奥比中光科技集团股份有限公司 Electronic equipment with rotatable camera
CN106851107A (en) * 2017-03-09 2017-06-13 广东欧珀移动通信有限公司 Switch control method, control device and the electronic installation of camera assisted drawing
CN107590793A (en) * 2017-09-11 2018-01-16 广东欧珀移动通信有限公司 Image processing method and device, electronic installation and computer-readable recording medium
CN107995434A (en) * 2017-11-30 2018-05-04 广东欧珀移动通信有限公司 Image acquiring method, electronic device and computer-readable recording medium

Also Published As

Publication number Publication date
CN109005348A (en) 2018-12-14

Similar Documents

Publication Publication Date Title
WO2020038063A1 (en) Electronic device and control method for electronic device
WO2020038054A1 (en) Electronic device and control method therefor
WO2020038068A1 (en) Imaging device and electronic apparatus
WO2020125388A1 (en) Time-of-flight module and electronic device
WO2020038060A1 (en) Laser projection module and control method therefor, and image acquisition device and electronic apparatus
EP3349429B1 (en) Camera module applied to terminal and terminal including same
US20060067678A1 (en) Camera head
US9986137B2 (en) Image pickup apparatus
WO2020052289A1 (en) Depth acquisition module and electronic apparatus
EP3993370A1 (en) Electronic device
TWM523106U (en) Optical device
CN111093018B (en) Imaging module and terminal
US20240053479A1 (en) Tof apparatus and electronic device
WO2020052288A1 (en) Depth collection module and mobile terminal
KR20190006689A (en) Optical apparatus
CN213069426U (en) Imaging lens, image capturing device and electronic device
US20130272692A1 (en) Photographing apparatus for recognizing type of external device, method of controlling the photographing apparatus, and the external device
US20230168500A1 (en) Smart glasses and camera device thereof
WO2020038057A1 (en) Depth collection module and electronic device
WO2020038052A1 (en) Input/output assembly and mobile device
JP2007174040A (en) Optical device and camera unit
WO2021249024A1 (en) Zoom lens group, lens assembly, camera apparatus, electronic device, and zoom method
US20190253590A1 (en) Camera Module
CN213693886U (en) Camera module and equipment
TW202113417A (en) Lens assembly module and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852566

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852566

Country of ref document: EP

Kind code of ref document: A1