WO2020038063A1 - Electronic device and control method for electronic device - Google Patents

Electronic device and control method for electronic device

Info

Publication number
WO2020038063A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
electronic device
sub
subject
captured image
Prior art date
Application number
PCT/CN2019/090077
Other languages
English (en)
Chinese (zh)
Inventor
张学勇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020038063A1

Links

Images

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 — Cameras or camera modules for generating image signals from different wavelengths
    • H04N 23/60 — Control of cameras or camera modules
    • H04N 23/67 — Focus control based on electronic image sensor signals
    • H04N 23/70 — Circuitry for compensating brightness variation in the scene
    • H04N 23/71 — Circuitry for evaluating the brightness variation

Definitions

  • the present application relates to the field of consumer electronic products, and more particularly, to an electronic device and a control method for an electronic device.
  • An electronic device typically captures a captured image of a subject through a single camera.
  • An embodiment of the present application provides an electronic device and a control method of the electronic device.
  • An embodiment of the present application provides an electronic device.
  • the electronic device includes a time-of-flight module, a main camera, a sub-camera, and a processor.
  • the time-of-flight module is used to collect a depth image of a subject.
  • the main camera is used to collect a captured image of the subject, and the sub-camera is used to collect a captured image of the subject.
  • the processor is configured to switch the main camera or the sub-camera according to a predetermined condition to acquire a captured image of the subject, and to construct a three-dimensional image of the subject based on the depth image and the captured image.
  • An embodiment of the present application provides a control method for an electronic device. The electronic device includes a time-of-flight module, a main camera, and a sub-camera, and the control method includes: acquiring a depth image of a subject through the time-of-flight module; switching the main camera or the sub-camera according to a predetermined condition to acquire a captured image of the subject; and constructing a three-dimensional image of the subject based on the depth image and the captured image.
  • FIG. 1 is a schematic flowchart of a method for controlling an electronic device according to some embodiments of the present application
  • FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application.
  • FIGS. 3 and 4 are schematic flowcharts of a method for controlling an electronic device according to some embodiments of the present application.
  • FIGS. 5 to 7 are schematic diagrams of a method for controlling an electronic device according to some embodiments of the present application.
  • FIG. 8 is a schematic flowchart of a method for controlling an electronic device according to some embodiments of the present application.
  • FIG. 9 is a perspective structural diagram of a time-of-flight module according to some embodiments of the present application.
  • FIG. 10 is a schematic top view of a time-of-flight module according to some embodiments of the present application.
  • FIG. 11 is a schematic bottom view of a time of flight module according to some embodiments of the present application.
  • FIG. 12 is a schematic side view of a time-of-flight module according to some embodiments of the present application.
  • FIG. 13 is a schematic cross-sectional view of the time-of-flight module shown in FIG. 10 along the line XIII-XIII;
  • FIG. 14 is an enlarged schematic diagram of portion XIV of the time-of-flight module shown in FIG. 13;
  • FIG. 15 is a schematic front view of a time-of-flight module according to some embodiments of the present application, with the flexible circuit board not bent;
  • FIGS. 16 to 19 are schematic structural diagrams of a light transmitter according to some embodiments of the present application.
  • an electronic device 100 includes a time-of-flight module 20, a main camera 30, a sub-camera 40, and a processor 10.
  • the time-of-flight module 20 is used to collect a depth image of a subject.
  • the main camera 30 is used to capture a captured image of the subject, and the sub camera 40 is used to capture a captured image of the subject.
  • the processor 10 is configured to switch the main camera 30 or the sub camera 40 according to a predetermined condition to acquire a captured image of the subject, and construct a three-dimensional image of the subject based on the depth image and the captured image.
  • the main camera 30 is a wide-angle camera, and the sub camera 40 is a telephoto camera; or the main camera 30 is a color camera, and the sub camera 40 is a black and white camera.
  • the time-of-flight module 20 and the secondary camera 40 are respectively disposed on two sides of the primary camera 30.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the processor 10 is further configured to obtain the current distance between the subject and the electronic device 100 in real time. When the current distance is less than the distance threshold, the main camera 30 collects captured images; when the current distance is greater than or equal to the distance threshold, the sub camera 40 collects captured images.
  • the processor 10 is configured to obtain the current distance according to the depth image; or the electronic device 100 further includes a distance detection device 50, and the distance detection device 50 is configured to detect the current distance in real time and send it to the processor 10.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the main camera 30 or the sub camera 40 is also used to collect a preview image of the subject.
  • the processor 10 is configured to detect face information in the preview image, and switch the main camera 30 or the sub camera 40 according to the face information to capture a captured image.
  • the main camera 30 is a color camera
  • the sub camera 40 is a black and white camera.
  • the processor 10 is configured to obtain the current brightness of the ambient light in real time. When the current brightness is less than the brightness threshold, the sub-camera 40 collects captured images. When the current brightness is greater than or equal to the brightness threshold, the main camera 30 captures captured images.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • the light transmitter 23 is disposed on the cushion block 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • the light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • the light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • the housing 241 is connected with the cushion block 22 as a whole.
  • the housing 241 and the cushion block 22 are integrally formed.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232 and a housing 233.
  • the second substrate assembly 231 is disposed on the pad 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331 to receive the light source assembly 232.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • the reinforcing member 2312 is integrally formed with the cushion block 22; or the reinforcing member 2312 and the cushion block 22 are formed separately.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • the cushion block 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • the side where the cushion block 22 is combined with the first substrate 211 is provided with a receiving cavity 223.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • the cushion block 22 is provided with an avoiding through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the avoiding through hole 224.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • the reinforcing plate 213 is coupled to a side of the first substrate 211 opposite to the pad 22.
  • the cushion block 22 includes a protruding portion 225 protruding from the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively.
  • the electronic device 100 includes a time-of-flight module 20, a main camera 30 and a sub-camera 40.
  • Control methods include: acquiring a depth image of the subject through the time-of-flight module 20; switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject; and constructing a three-dimensional image of the subject based on the depth image and the captured image.
  • the time-of-flight module 20 and the secondary camera 40 are respectively disposed on two sides of the primary camera 30.
  • control methods also include:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the control method further includes:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • control methods also include:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • the embodiment of the present application provides a control method of the electronic device 100.
  • the electronic device 100 includes a time of flight module 20, a main camera 30 and a sub camera 40.
  • Control methods include: acquiring a depth image of the subject through the time-of-flight module 20; switching the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject; and constructing a three-dimensional image of the subject based on the depth image and the captured image.
  • an embodiment of the present application provides an electronic device 100.
  • the electronic device 100 includes a time of flight module 20, a main camera 30, a sub camera 40, and a processor 10.
  • the main camera 30 is used to capture a captured image of the subject
  • the sub camera 40 is also used to capture a captured image of the subject.
  • the control method of the electronic device 100 according to the embodiment of the present application may be implemented by the electronic device 100 according to the embodiment of the present application.
  • the time of flight module 20 may be used to execute the method in 01
  • the processor 10 may be used to execute the methods in 02 and 03. That is to say, the time-of-flight module 20 can be used to collect a depth image of a subject.
  • the processor 10 may be configured to switch the main camera 30 or the sub camera 40 according to a predetermined condition to acquire a captured image of the subject; and construct a three-dimensional image of the subject according to the depth image and the captured image.
  • An electronic device usually collects a captured image of a subject through a single camera, which has insufficient functions and poor user experience.
  • the control method of the electronic device 100 and the electronic device 100 can switch the main camera 30 or the sub-camera 40 according to a predetermined condition to capture a captured image of the subject, and can also construct a three-dimensional image of the subject based on the captured image and the depth image collected by the time-of-flight module 20, which provides a variety of functions and helps improve the user experience.
  • the electronic device 100 may be a mobile phone, a tablet computer, a smart watch, a smart bracelet, a smart wearable device, etc.
  • the embodiment of the present application is described by taking the electronic device 100 as a mobile phone as an example; it can be understood that the specific form of the electronic device 100 is not limited to a mobile phone.
  • the electronic device 100 may include a case 101 and a bracket 102.
  • the time-of-flight module 20, the main camera 30 and the auxiliary camera 40 are all disposed on the bracket 102.
  • the time-of-flight module 20, the main camera 30, the sub-camera 40, and the bracket 102 are all housed in the casing 101 and can extend from the casing 101.
  • when the time-of-flight module 20 is used to collect a depth image of the subject, or the main camera 30 or the sub-camera 40 is used to capture a captured image of the subject, the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move out of the casing 101, so as to acquire the depth image or the captured image.
  • the time-of-flight module 20, the main camera 30, and the sub-camera 40 may be all front cameras or all rear cameras.
  • the subject can be a person, object, or other subject that the user wishes to photograph.
  • the housing 101 may be provided with a light through hole (not shown).
  • the time-of-flight module 20, the main camera 30 and the sub camera 40 are immovably disposed in the housing 101 and correspond to the light through hole.
  • the display screen 103 of the electronic device 100 disposed on the casing 101 may be provided with a light through hole (not shown), and the time-of-flight module 20, the main camera 30 and the sub-camera 40 are disposed on the display screen 103.
  • in one example, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (ie, 02) and the step in which the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image (ie, 03) are implemented in two different application scenarios, and there is no sequential relationship between 02 and 03. That is to say, in one application scenario, the processor 10 may switch the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, thereby realizing an optical zoom and telephoto experience, without having to construct a three-dimensional image of the subject from the depth image and the captured image.
  • in another application scenario, the processor 10 may choose to construct a three-dimensional image of the subject based on the captured image collected by the main camera 30 and the depth image collected by the time-of-flight module 20, or based on the captured image collected by the sub-camera 40 and the depth image collected by the time-of-flight module 20, thereby realizing 3D effects and Augmented Reality (AR) applications, without having to switch the main camera 30 or the sub-camera 40 according to a predetermined condition to capture a captured image of the subject.
  • in another example, the step in which the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to acquire a captured image of the subject (ie, 02) and the step in which the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image (ie, 03) are implemented in the same application scenario, and 02 is executed before 03.
  • for example, the processor 10 first switches to the main camera 30 to capture a captured image of the subject according to a predetermined condition, and then constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30; or, the processor 10 first switches to the sub-camera 40 to capture a captured image of the subject according to a predetermined condition, and then constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the sub-camera 40.
  • in this way, since the processor 10 first selects an appropriate camera according to the predetermined condition, the quality of the captured image is better, and the three-dimensional image is then constructed from this better captured image and the depth image, so an ideal 3D effect and AR experience can be obtained.
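  • To make the two steps concrete, the following is a minimal sketch (an illustration only, not the implementation claimed in the patent) of the scenario in which 02 precedes 03. The distance-based condition shown here is just one of the predetermined conditions described below, and build_3d_image merely bundles its two inputs where a real pipeline would perform RGB-D alignment and fusion.

```python
from enum import Enum

class Camera(Enum):
    MAIN = "main camera 30"   # wide-angle (or color) camera
    SUB = "sub camera 40"     # telephoto (or black-and-white) camera

def switch_camera(distance_m: float, threshold_m: float) -> Camera:
    """Predetermined condition (02): near subjects use the wide-angle main
    camera, far subjects use the telephoto sub camera."""
    return Camera.MAIN if distance_m < threshold_m else Camera.SUB

def build_3d_image(depth_image, captured_image):
    """Step 03: pair the ToF depth image with the captured 2D image.
    A real implementation would align and fuse them (RGB-D reconstruction);
    here they are simply bundled as a placeholder."""
    return {"depth": depth_image, "texture": captured_image}

# Scenario where 02 precedes 03: pick the camera first, then reconstruct.
camera = switch_camera(distance_m=0.8, threshold_m=1.5)       # -> Camera.MAIN
model = build_3d_image(depth_image=..., captured_image=...)   # placeholders
```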
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • wide-angle and telephoto are relative terms.
  • the wide-angle camera has a larger field of view relative to the telephoto camera, and the telephoto camera has a longer focal length and a longer shooting distance than the wide-angle camera.
  • if the processor 10 switches to the wide-angle camera to capture a captured image of the subject according to a predetermined condition, the captured image is a wide-angle image, and the processor 10 constructs a three-dimensional image of the subject according to the depth image and the wide-angle image; if the processor 10 switches to the telephoto camera to capture a captured image of the subject according to a predetermined condition, the captured image is a telephoto image, and the processor 10 constructs a three-dimensional image of the subject according to the depth image and the telephoto image.
  • the electronic device 100 includes a wide-angle camera and a telephoto camera at the same time.
  • the processor 10 may switch to the wide-angle camera or the telephoto camera according to the actual situation to collect the captured image of the subject, thereby constructing two different types of three-dimensional images, wide-angle and telephoto, which is conducive to improving the photographing experience.
  • the main camera 30 is a color camera (ie, an RGB camera), and the sub camera 40 is a black and white camera (ie, a Mono camera). Compared with the color camera, the black and white camera can improve the shooting quality of low-light / night scene images.
  • if the processor 10 switches to the main camera 30 to collect a captured image of the subject according to a predetermined condition, the captured image is an RGB image, and the processor 10 constructs a three-dimensional image of the subject according to the depth image and the RGB image; if the processor 10 switches to the sub-camera 40 to collect a captured image of the subject according to a predetermined condition, the captured image is a Mono image, and the processor 10 constructs a three-dimensional image of the subject according to the depth image and the Mono image.
  • the electronic device 100 includes both a color camera and a black-and-white camera.
  • the processor 10 may switch to the color camera or the black-and-white camera according to the actual situation to collect the captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which helps improve the photographing experience.
  • the time-of-flight module 20 and the sub-camera 40 are respectively disposed on two sides of the main camera 30.
  • since the main camera 30 is located between the time-of-flight module 20 and the sub-camera 40, on the one hand, if the processor 10 switches the main camera 30 or the sub-camera 40 according to a predetermined condition to collect a captured image of the subject, the parallax between the main camera 30 and the sub-camera 40 is small, which helps achieve smooth zooming.
  • on the other hand, the processor 10 is more likely to switch to the main camera 30 to collect captured images of the subject according to predetermined conditions, so that when the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30, the parallax between the time-of-flight module 20 and the main camera 30 is small, which is beneficial to constructing the three-dimensional image of the subject.
  • the time-of-flight module 20 and the sub-camera 40 are respectively disposed on both sides of the main camera 30, which is beneficial to achieve smooth zooming between the main camera 30 and the sub-camera 40, and is also conducive to constructing a three-dimensional object image.
  • the centers of the time-of-flight module 20, the main camera 30, and the sub-camera 40 can be located on a straight line in order. On the one hand, this reduces the size of the bracket 102 in the direction from the top of the electronic device 100 (that is, the side where the bracket 102 is disposed) to the bottom (that is, the side of the electronic device 100 away from the bracket 102); on the other hand, when the bracket 102 drives the time-of-flight module 20, the main camera 30, and the sub-camera 40 to move out of the casing 101, they can protrude from the casing 101 synchronously, which structurally ensures that the time-of-flight module 20, the main camera 30, and the sub-camera 40 can work synchronously and saves shooting time.
  • control methods also include:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the processor 10 may be used to execute the method in 04.
  • the main camera 30 can be used to execute the method in 021, and the sub camera 40 can be used to execute the method in 022. That is to say, the processor 10 may be used to obtain the current distance between the subject and the electronic device 100 in real time.
  • when the current distance is less than the distance threshold, the main camera 30 collects captured images; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects captured images.
  • the electronic device 100 can switch the camera used to capture the captured image in real time according to the current distance.
  • for example, assume the distance threshold is d0. At the first moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d1, where d1 < d0.
  • the main camera 30 therefore collects a captured image of the subject.
  • at the second moment, the processor 10 obtains the current distance between the subject and the electronic device 100 as d2, where d2 > d0.
  • the processor 10 therefore switches to the sub-camera 40 to acquire a captured image of the subject.
  • finally, the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the sub-camera 40.
  • the main camera 30 may focus using a laser focusing mode.
  • the sub-camera 40 may use the passive focus mode for focusing.
  • the passive focus method includes a contrast focus mode and a phase focus mode.
  • in some embodiments, the step (that is, 04) of the processor 10 acquiring the current distance between the subject and the electronic device 100 in real time includes: the processor 10 continuously acquires multiple initial distances between the subject and the electronic device 100.
  • the processor 10 then calculates the current distance between the subject and the electronic device 100 as the average of the multiple initial distances, and switches the main camera 30 or the sub-camera 40 according to the calculated current distance to capture a captured image of the subject.
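  • As a rough sketch of steps 04/021/022 including the averaging variant; read_initial_distance is a hypothetical callable standing in for whatever range source supplies the initial distances (the ToF depth image or the distance detection device 50 described below).

```python
def current_distance(read_initial_distance, samples: int = 5) -> float:
    """Step 04: smooth several consecutive range readings into one estimate."""
    readings = [read_initial_distance() for _ in range(samples)]
    return sum(readings) / len(readings)

def collect_captured_image(distance_m: float, threshold_m: float) -> str:
    # 021: d < d0 -> main camera 30;  022: d >= d0 -> sub camera 40
    return "main camera 30" if distance_m < threshold_m else "sub camera 40"

# e.g. with d0 = 1.5 m and noisy readings averaging to d2 = 2.1 m:
assert collect_captured_image(2.1, 1.5) == "sub camera 40"
```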
  • the processor 10 is configured to obtain the current distance according to the depth image.
  • the time-of-flight module 20 first collects a depth image of the subject, and then obtains a current distance between the subject and the electronic device 100 according to the depth image.
  • when the current distance is less than the distance threshold, the main camera 30 collects captured images; when the current distance is greater than or equal to the distance threshold, the sub-camera 40 collects captured images of the subject.
  • the depth image collected by the time-of-flight module 20 can be used to construct a three-dimensional image of the subject, and can also be used to detect the current distance between the subject and the electronic device 100.
  • in this way, the electronic device 100 does not need to be provided with an additional distance sensor to detect the current distance, which is beneficial to reducing the number of components in the electronic device 100 and saving costs.
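  • One plausible way to derive the current distance from the depth image, as this passage describes, is a robust statistic over the central region of the depth map; the central-crop and median choices below are assumptions, since the patent does not specify how the distance is computed from the depth image.

```python
import numpy as np

def distance_from_depth_image(depth: np.ndarray,
                              center_fraction: float = 0.2) -> float:
    """Estimate the subject distance from a ToF depth map by taking the
    median of a central crop, which is robust to background pixels and
    ToF noise. Units match the depth map (e.g. metres)."""
    h, w = depth.shape
    dh = max(1, int(h * center_fraction / 2))
    dw = max(1, int(w * center_fraction / 2))
    center = depth[h // 2 - dh: h // 2 + dh, w // 2 - dw: w // 2 + dw]
    valid = center[center > 0]  # zero commonly marks invalid ToF pixels
    return float(np.median(valid))
```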
  • the electronic device 100 further includes a distance detection device 50.
  • the distance detecting device 50 is configured to detect the current distance in real time and send it to the processor 10.
  • the distance detection device 50 is a distance sensor, and the distance sensor directly detects the current distance between the subject and the electronic device 100.
  • the distance detection device 50 is a structured light module.
  • the structured light module collects a structured light image of a subject, and then detects a current distance between the subject and the electronic device 100 based on the structured light image.
  • in other embodiments, the distance detection device 50 may be another type of detection device, as long as it can detect the current distance between the subject and the electronic device 100, such as an ultrasonic rangefinder, a radar rangefinder, a proximity sensor, etc.
  • the distance detection device 50 can detect the current distance in advance. If the current distance is less than the distance threshold, the main camera 30 collects captured images; if the current distance is greater than or equal to the distance threshold, the sub camera 40 collects captured images.
  • the main camera 30 or the sub-camera 40 may collect a captured image of the subject while the time-of-flight module 20 collects a depth image of the subject, so as to save the time for constructing a three-dimensional image of the subject and improve the user experience.
  • the main camera 30 or the sub-camera 40 may also capture the captured image before the time-of-flight module 20 collects the depth image, or after the time-of-flight module 20 collects the depth image, which is not limited herein.
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the control method further includes:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • the main camera 30 is a wide-angle camera
  • the sub-camera 40 is a telephoto camera.
  • the main camera 30 or the sub camera 40 may be used to execute the method in 05
  • the processor 10 may be used to execute the methods in 06 and 023. That is to say, the main camera 30 or the sub camera 40 can be used to collect a preview image of a subject.
  • the processor 10 may be configured to detect face information in the preview image, and switch the main camera 30 or the sub-camera 40 according to the face information to capture a captured image.
  • the face information includes the number of faces.
  • when the number of faces is greater than a predetermined number, the main camera 30 collects captured images; when the number of faces is less than or equal to the predetermined number (as shown in FIG. 6), the sub-camera 40 collects captured images. It can be understood that when the number of faces is large, shooting with the main camera 30, which has a larger field of view, can capture more faces in the captured image; when the number of faces is small, the sub-camera 40, which has a relatively small field of view and a long focal length, can be used to make the subject appear visually closer and larger.
  • if the step of acquiring the preview image (ie, 05) is performed by the main camera 30, the processor 10 keeps the main camera 30 collecting captured images of the subject when the number of faces is greater than the predetermined number. In this case, the main camera 30 may directly use the preview image as the captured image without collecting a captured image again; when the number of faces is less than or equal to the predetermined number, the processor 10 switches to the sub-camera 40 to capture the captured image.
  • if the step of acquiring the preview image (ie, 05) is performed by the sub-camera 40, the processor 10 switches to the main camera 30 to collect captured images of the subject when the number of faces is greater than the predetermined number; when the number of faces is less than or equal to the predetermined number, the processor 10 keeps the sub-camera 40 capturing captured images. In this case, the sub-camera 40 may also directly use the preview image as the captured image without collecting a captured image again.
  • in some embodiments, the face information includes the area ratio of the face region in the preview image.
  • when the area ratio of the face region in the preview image is greater than a predetermined ratio (for example, one third), the main camera 30 captures the captured image; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio (as shown in FIG. 6), the sub-camera 40 captures the captured image.
  • it can be understood that when the area ratio of the face region is small, the sub-camera 40, which has a relatively small field of view and a long focal length, can be used to make the subject appear visually closer and larger.
  • if the step of acquiring the preview image of the subject through the main camera 30 or the sub-camera 40 (ie, 05) is performed by the main camera 30, the processor 10 keeps the main camera 30 capturing the captured image of the subject when the area ratio of the face region in the preview image is greater than the predetermined ratio. In this case, the main camera 30 may also directly use the preview image as the captured image without collecting a captured image again; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 switches to the sub-camera 40 to capture the captured image.
  • if the step of acquiring the preview image of the subject through the main camera 30 or the sub-camera 40 (ie, 05) is performed by the sub-camera 40, the processor 10 switches to the main camera 30 to collect captured images when the area ratio of the face region in the preview image is greater than the predetermined ratio; when the area ratio of the face region in the preview image is less than or equal to the predetermined ratio, the processor 10 keeps the sub-camera 40 collecting captured images. In this case, the sub-camera 40 may also directly use the preview image as the captured image without acquiring a captured image again.
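  • The two face-information rules above, sketched as separate functions since the text presents face count and face area ratio as alternative embodiments. The FaceBox bounding boxes from an unspecified face detector are an assumption, and summing box areas is one simple reading of "area ratio of the face region".

```python
from typing import List, Tuple

FaceBox = Tuple[int, int, int, int]  # (x, y, width, height) of one face

def camera_for_face_count(faces: List[FaceBox],
                          predetermined_number: int) -> str:
    # More faces than the predetermined number -> wide-angle main camera 30.
    return ("main camera 30" if len(faces) > predetermined_number
            else "sub camera 40")

def camera_for_face_area(faces: List[FaceBox], preview_w: int, preview_h: int,
                         predetermined_ratio: float = 1 / 3) -> str:
    # Face region larger than the predetermined ratio -> main camera 30;
    # otherwise the telephoto sub camera 40 makes the subject look closer.
    face_area = sum(w * h for _, _, w, h in faces)
    ratio = face_area / (preview_w * preview_h)
    return "main camera 30" if ratio > predetermined_ratio else "sub camera 40"
```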
  • control methods also include:
  • the step (ie, 02) of switching the main camera 30 or the sub camera 40 to acquire a captured image of the subject according to a predetermined condition includes:
  • the main camera 30 is a color camera
  • the sub camera 40 is a black and white camera.
  • the processor 10 may be used to execute the method in 07
  • the secondary camera 40 may be used to execute the method in 024
  • the main camera 30 may be used to execute the method in 025. That is to say, the processor 10 can be used to obtain the current brightness of the ambient light in real time: when the current brightness is less than the brightness threshold, the sub-camera 40 collects captured images; when the current brightness is greater than or equal to the brightness threshold, the main camera 30 collects captured images.
  • the black and white camera can improve the shooting quality of low-light / night scene images compared to the color camera. Therefore, when the current brightness is less than the brightness threshold, the captured image may be captured by the sub-camera 40; when the current brightness is greater than or equal to the brightness threshold, the captured image may be captured by the main camera 30.
  • the processor 10 switches a color camera or a black and white camera according to the current brightness of the ambient light to capture a captured image of the subject, thereby constructing two different types of three-dimensional images, RGB and Mono, which is beneficial to improving the photographing experience.
  • the electronic device 100 may further include an ambient light sensor, and the ambient light sensor is configured to detect the current brightness of the ambient light and send it to the processor 10.
  • the electronic device 100 can switch the camera used to capture the captured image in real time according to the current brightness.
  • for example, assume the brightness threshold is L0. At the first moment, the processor 10 obtains the current brightness of the ambient light as L1, where L1 < L0.
  • the sub-camera 40 therefore collects the captured image of the subject. At the second moment, the processor 10 obtains the current brightness of the ambient light as L2, where L2 > L0.
  • the processor 10 therefore switches to the main camera 30 to capture a captured image of the subject.
  • finally, the processor 10 constructs a three-dimensional image of the subject based on the depth image and the captured image collected by the main camera 30.
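  • The brightness rule (07/024/025) in sketch form; the lux values are invented for the example, and the reading itself would come from the ambient light sensor mentioned above.

```python
def camera_for_brightness(current_lux: float, threshold_lux: float) -> str:
    # 024: below the threshold (low light / night scene), the black-and-white
    # sub camera 40 shoots; 025: otherwise the color main camera 30 shoots.
    return "sub camera 40" if current_lux < threshold_lux else "main camera 30"

# Worked example from the text, with L0 as the threshold:
L0 = 100.0
assert camera_for_brightness(40.0, L0) == "sub camera 40"     # L1 < L0
assert camera_for_brightness(300.0, L0) == "main camera 30"   # L2 > L0
```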
  • the time of flight module 20 may have the following structure.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • the light transmitter 23 is disposed on the cushion block 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • the light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • the light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • the housing 241 is connected with the cushion block 22 as a whole.
  • the spacer 22 can raise the height of the light emitter 23, thereby raising the height of the light emitting surface of the light emitter 23, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be completely irradiated onto the measured object.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212.
  • the first substrate 211 may be a printed circuit board or a flexible circuit board.
  • the control circuit of the time of flight module 20 may be laid on the first substrate 211.
  • One end of the flexible circuit board 212 can be connected to the first substrate 211, and the flexible circuit board 212 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 212 can be selected.
  • the pad 22 is disposed on the first substrate 211.
  • the pad 22 is in contact with the first substrate 211 and is carried on the first substrate 211.
  • the pad 22 may be combined with the first substrate 211 by means of adhesion or the like.
  • the material of the spacer 22 may be metal, plastic, or the like.
  • the surface where the spacer 22 is combined with the first substrate 211 may be a flat surface, and the surface opposite to this combining surface may also be a flat surface, so that the light emitter 23 has better stability when disposed on the spacer 22.
  • the light transmitter 23 is configured to emit an optical signal outwards.
  • the light signal may be infrared light, and the light signal may be a lattice spot emitted to the object to be measured.
  • the light signal is emitted from the light transmitter 23 at a certain divergence angle.
  • the light transmitter 23 is disposed on the spacer 22. In the embodiment of the present application, the light transmitter 23 is disposed on the side of the spacer 22 opposite to the first substrate 211; in other words, the spacer 22 spaces the light emitter 23 apart from the first substrate 211, so that a height difference is formed between the light emitter 23 and the first substrate 211.
  • the light transmitter 23 is also connected to the flexible circuit board 212.
  • the flexible circuit board 212 is bent, with one end connected to the first substrate 211 and the other end connected to the light transmitter 23, so that the control signal of the light transmitter 23 can be transmitted from the first substrate 211 to the light transmitter 23, and a feedback signal of the light transmitter 23 (for example, time information and frequency information of the light signal emitted by the light transmitter 23, or temperature information of the light transmitter 23) can be transmitted to the first substrate 211.
  • the optical receiver 24 is configured to receive an optical signal emitted by the optical transmitter 23 reflected back.
  • the light receiver 24 is disposed on the first substrate 211, and the contact surface between the light receiver 24 and the first substrate 211 is substantially flush with the contact surface between the spacer 22 and the first substrate 211 (that is, the installation starting points of the two are on the same plane).
  • the light receiver 24 includes a housing 241 and an optical element 242.
  • the casing 241 is disposed on the first substrate 211, and the optical element 242 is disposed on the casing 241.
  • the casing 241 may be a lens holder and a lens barrel of the light receiver 24, and the optical element 242 may be a lens disposed in the casing 241.
  • the light receiver 24 may further include a photosensitive chip (not shown).
  • the optical signal reflected by the measured object is irradiated into the photosensitive chip through the optical element 242, and the photosensitive chip responds to the optical signal.
  • the time-of-flight module 20 calculates the time difference between the moment the light transmitter 23 emits the light signal and the moment the photosensitive chip receives the light signal reflected by the measured object, and further obtains the depth information of the measured object, which can be used for distance measurement, for generating depth images, or for 3D modeling.
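  • The time-difference principle stated here corresponds to depth d = c·Δt/2 for a pulsed (direct) ToF system; indirect/phase-based ToF modules derive Δt from phase shift instead. The sketch below also shows a standard pinhole back-projection as one way the depth image can feed 3D modeling; the intrinsics (fx, fy, cx, cy) are assumed to be known from calibration and are not specified by the patent.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def depth_from_time_difference(dt_seconds: float) -> float:
    """Pulsed time-of-flight: light covers the distance twice (out and back),
    so depth = c * dt / 2."""
    return C * dt_seconds / 2.0

def depth_map_to_points(depth: np.ndarray, fx: float, fy: float,
                        cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map to an (H, W, 3) point cloud using a pinhole
    camera model, one common way to use the depth image for 3D modeling."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.stack([x, y, depth], axis=-1)
```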
  • the housing 241 and the cushion block 22 are integrally connected. Specifically, the housing 241 and the spacer 22 may be integrally formed.
  • the materials of the housing 241 and the spacer 22 are the same and are integrally formed by injection molding, cutting or the like; or the materials of the housing 241 and the spacer 22 are different, both Integrated molding by two-color injection molding.
  • the housing 241 and the spacer 22 may also be separately formed, and the two form a matching structure.
  • the housing 241 and the spacer 22 may be connected into one body, and then may be disposed on the first substrate 211 together. It is also possible to firstly arrange one of the housing 241 and the pad 22 on the first substrate 211, and then arrange the other on the first substrate 211 and connect them as a whole.
  • the spacer 22 can raise the height of the light emitter 23, thereby raising the height of the light emitting surface of the light emitter 23, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be completely irradiated onto the measured object.
  • the exit surface of the light transmitter 23 may be flush with the entrance surface of the light receiver 24, or the exit surface of the light transmitter 23 may be slightly lower than the entrance surface of the light receiver 24, or it may be the exit surface of the light transmitter 23 Slightly higher than the incident surface of the light receiver 24.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • the reinforcing plate 213 is coupled to a side of the first substrate 211 opposite to the pad 22.
  • the reinforcing plate 213 may cover one side of the first substrate 211, and the reinforcing plate 213 may be used to increase the strength of the first substrate 211 and prevent deformation of the first substrate 211.
  • the reinforcing plate 213 may be made of a conductive material, such as a metal or an alloy.
  • the reinforcing plate 213 may be electrically connected to the casing to ground the reinforcing plate 213 and effectively reduce the interference of static electricity from external components on the time-of-flight module 20.
  • the cushion block 22 includes a protruding portion 225 protruding from the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225. Specifically, a part of the cushion block 22 is directly carried on the first substrate 211, and another part is not in direct contact with the first substrate 211, and protrudes from the side edge 2111 of the first substrate 211 to form a protruding portion 225.
  • the flexible circuit board 212 may be connected to the side edge 2111 and bent around the protruding portion 225; in other words, the flexible circuit board 212 is bent so that the protruding portion 225 is located inside the space enclosed by the flexible circuit board 212. In this way, when the flexible circuit board 212 is subjected to an external force, it will not collapse inward and bend excessively, which would damage the flexible circuit board 212.
  • the outer surface 2251 of the protruding portion 225 is a smooth curved surface (for example, the outer surface of a cylinder), that is, the outer surface 2251 of the protruding portion 225 has no abrupt corners. Hence, even if the flexible circuit board 212 is bent over the outer surface 2251 of the protruding portion 225, the degree of bending of the flexible circuit board 212 will not be too large, which further ensures the integrity of the flexible circuit board 212.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the connector 26 may be a connection base or a connector.
  • the connector 26 may be connected to the main board of the electronic device 100 so that the time-of-flight module 20 is electrically connected to the main board.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the connectors 26 and the flexible circuit board 212 may be respectively connected to the left and right ends of the first substrate 211, or respectively connected to the front and rear ends of the first substrate 211.
  • the light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively. It can be understood that, since the light transmitter 23 and the light receiver 24 are arranged in an array, the size of the time-of-flight module 20 may be larger in the direction of the straight line L.
  • the connector 26 and the flexible circuit board 212 are respectively disposed on opposite sides of the straight line L, which does not increase the size of the time-of-flight module 20 in the direction of the straight line L, thereby facilitating the installation of the time-of-flight module 20 in the electronic device 100.
  • a receiving cavity 223 is defined on a side where the cushion block 22 is combined with the first substrate 211.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • the electronic component 25 may be an element such as a capacitor, an inductor, a transistor, a resistor, etc.
  • the electronic component 25 may be electrically connected to a control line laid on the first substrate 211 and used to drive or control the operation of the light transmitter 23 or the light receiver 24.
  • the electronic component 25 is contained in the containing cavity 223, and the space in the cushion block 22 is used reasonably.
  • the number of the receiving cavities 223 may be one or more, and the plurality of receiving cavities 223 may be spaced apart from each other.
  • in this way, the positions of the receiving cavities 223 and the electronic components 25 can be aligned, and the spacer 22 can then be disposed on the first substrate 211.
  • the cushion block 22 is provided with an avoiding through hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the avoiding through hole 224.
  • generally, the height of an electronic component 25 received in the receiving cavity 223 should not exceed the height of the receiving cavity 223.
  • for an electronic component 25 taller than the receiving cavity 223, an avoiding through hole 224 communicating with the receiving cavity 223 may be provided, and the electronic component 25 may partially extend into the avoiding through hole 224, so that the electronic component 25 can be arranged without increasing the height of the spacer 22.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232 and a housing 233.
  • the second substrate assembly 231 is disposed on the pad 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331.
  • the receiving space 2331 can be used for receiving the light source module 232.
  • the flexible circuit board 212 may be detachably connected to the second substrate assembly 231.
  • the light source assembly 232 is electrically connected to the second substrate assembly 231.
  • the casing 233 may be bowl-shaped as a whole, with its opening facing downward toward the second substrate assembly 231, so as to receive the light source assembly 232 in the receiving space 2331.
  • a light outlet 2332 corresponding to the light source component 232 is provided on the housing 233.
  • the light signal emitted by the light source assembly 232 exits through the light outlet 2332; it may pass directly through the light outlet 2332, or pass through it after its optical path has been changed by other optical devices.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • a specific type of the second substrate 2311 may be a printed circuit board or a flexible circuit board, and a control circuit may be laid on the second substrate 2311.
  • the reinforcing member 2312 may be fixedly connected to the second substrate 2311 by means of gluing, riveting, or the like.
  • the reinforcing member 2312 may increase the overall strength of the second substrate assembly 231.
  • since the reinforcing member 2312 can directly contact the spacer 22, the second substrate 2311 is not exposed to the outside and does not need to be in direct contact with the spacer 22, so the second substrate 2311 is not easily contaminated by dust and the like.
  • the reinforcing member 2312 and the cushion block 22 are formed separately.
  • in this case, the spacer 22 may first be mounted on the first substrate 211.
  • the two ends of the flexible circuit board 212 are then connected to the first substrate 211 and the second substrate 2311, respectively, with the flexible circuit board 212 left unbent at first (the state shown in FIG. 15).
  • the flexible circuit board 212 is then bent, so that the reinforcing member 2312 is disposed on the spacer 22.
  • the reinforcing member 2312 and the spacer 22 may be integrally formed, for example, integrally formed by a process such as injection molding.
  • in this case, the spacer 22 and the light emitter 23 may first be connected as a whole and then mounted together on the first substrate 211.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • the cushion block 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • the relative movement between the second substrate assembly 231 and the cushion block 22 can be effectively restricted.
  • the specific types of the first positioning member 2313 and the second positioning member 222 can be selected according to needs.
  • for example, the first positioning member 2313 is a positioning hole formed in the reinforcing member 2312, the second positioning member 222 is a positioning column, and the positioning column projects into the positioning hole so that the first positioning member 2313 and the second positioning member 222 cooperate with each other; or the first positioning member 2313 is a positioning column formed on the reinforcing member 2312, the second positioning member 222 is a positioning hole, and the positioning column projects into the positioning hole so that the first positioning member 2313 and the second positioning member 222 cooperate with each other; or there are multiple first positioning members 2313 and multiple second positioning members 222, some of the first positioning members 2313 are positioning holes and some are positioning columns, and correspondingly some of the second positioning members 222 are positioning columns and some are positioning holes.
  • each positioning column projects into the corresponding positioning hole so that the first positioning members 2313 and the second positioning members 222 cooperate with each other.
  • the structure of the light source component 232 will be described as an example below:
  • the light source assembly 232 includes a light source 60, a lens barrel 70, a diffuser 80 and a protective cover 90.
  • the light source 60 is connected to the second substrate assembly 231.
  • the lens barrel 70 includes a first surface 71 and a second surface 72 opposite to each other.
  • the lens barrel 70 defines a receiving cavity 75 penetrating the first surface 71 and the second surface 72.
  • the first surface 71 is recessed toward the second surface 72 to form a mounting groove 76 communicating with the receiving cavity 75.
  • the diffuser 80 is installed in the mounting groove 76.
  • the protective cover 90 is mounted on the side where the first surface 71 of the lens barrel 70 is located, and the diffuser 80 is sandwiched between the protective cover 90 and the bottom surface 77 of the mounting groove 76.
  • the protective cover 90 can be mounted on the lens barrel 70 by means of screw connection, snap engagement, or fastener connection.
  • for example, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (protective side wall 92) is provided with an internal thread and the lens barrel 70 is provided with an external thread, and the internal thread of the protective cover 90 is screwed onto the external thread of the lens barrel 70 to mount the protective cover 90 on the lens barrel 70.
  • or, referring to FIG. 17, when the protective cover 90 includes a top wall 91, a locking hole 95 is opened in the protective cover 90 (top wall 91), a hook 73 is provided at an end of the lens barrel 70, and the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • or, the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (protective side wall 92) is provided with a locking hole 95, a hook 73 is provided on the lens barrel 70, and the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • or, the protective cover 90 includes the top wall 91, the end of the lens barrel 70 is provided with a first positioning hole 74, the protective cover 90 (top wall 91) is provided with a second positioning hole 93 corresponding to the first positioning hole 74, and a fastener 94 passes through the second positioning hole 93 and the first positioning hole 74 so that the protective cover 90 is mounted on the lens barrel 70.
  • in the light source assembly 232, a mounting groove 76 is provided on the lens barrel 70, the diffuser 80 is installed in the mounting groove 76, and the protective cover 90 is mounted on the lens barrel 70 to clamp the diffuser 80 between the protective cover 90 and the bottom surface 77 of the mounting groove 76, so that the diffuser 80 is firmly fixed to the lens barrel 70. This avoids fixing the diffuser 80 to the lens barrel 70 with glue, which prevents the glue from diffusing and solidifying on the surface of the diffuser 80 after volatilizing into a gaseous state and affecting the microstructure of the diffuser 80, and also prevents the diffuser 80 from falling off the lens barrel 70 when the glue connecting the diffuser 80 and the lens barrel 70 weakens due to aging.
  • in addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless specifically defined otherwise.

Abstract

The invention relates to an electronic device (100) and a control method for the electronic device (100). The electronic device (100) comprises a time-of-flight module (20), a main camera (30), a sub-camera (40), and a processor (10). The time-of-flight module (20) is used to collect a depth image of a photographed subject. The main camera (30) is used to collect a captured image of the subject. The sub-camera (40) is used to collect a captured image of the subject. The processor (10) is used to switch to the main camera (30) or the sub-camera (40) according to a predetermined condition so as to collect a captured image of the subject, and to construct a three-dimensional image of the subject on the basis of the depth image and the captured image.
PCT/CN2019/090077 2018-08-22 2019-06-05 Electronic device and control method for electronic device WO2020038063A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810963397.2A CN109005348A (zh) 2018-08-22 2018-08-22 Electronic device and control method of electronic device
CN201810963397.2 2018-08-22

Publications (1)

Publication Number Publication Date
WO2020038063A1 (fr)

Family

ID=64593671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090077 WO2020038063A1 (fr) 2018-08-22 2019-06-05 Dispositif électronique et procédé de commande pour dispositif électronique

Country Status (2)

Country Link
CN (1) CN109005348A (fr)
WO (1) WO2020038063A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114765662A (zh) * 2021-01-13 2022-07-19 富士康(昆山)电脑接插件有限公司 Sensing module

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005348A (zh) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 Electronic device and control method of electronic device
CN109451228B (zh) 2018-12-24 2020-11-10 华为技术有限公司 Camera assembly and electronic device
CN109639983B (zh) * 2019-01-03 2020-09-04 Oppo广东移动通信有限公司 Photographing method and apparatus, terminal, and computer-readable storage medium
CN112738397A (zh) * 2020-12-29 2021-04-30 维沃移动通信(杭州)有限公司 Shooting method and apparatus, electronic device, and readable storage medium
WO2023070313A1 (fr) * 2021-10-26 2023-05-04 京东方科技集团股份有限公司 Time-of-flight camera module and display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105321159A (zh) * 2014-07-29 2016-02-10 宏达国际电子股份有限公司 Handheld electronic device, image extraction device, and method for obtaining depth-of-field information
CN106454077A (zh) * 2016-09-26 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Shooting method, shooting device, and terminal
US20180184071A1 (en) * 2015-06-23 2018-06-28 Huawei Technologies Co., Ltd. Photographing device and method for obtaining depth information
CN108989783A (zh) * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Electronic device and control method of electronic device
CN109005348A (zh) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 Electronic device and control method of electronic device
CN109040556A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 Imaging device and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101862199B1 (ko) * 2012-02-29 2018-05-29 삼성전자주식회사 Fusion system and method of a TOF camera and a stereo camera capable of long-range acquisition
CN102722080B (zh) * 2012-06-27 2015-11-18 杭州南湾科技有限公司 Multi-purpose stereoscopic imaging method based on multi-lens shooting
CN106657455B (zh) * 2016-10-25 2023-05-05 奥比中光科技集团股份有限公司 Electronic device with rotatable camera
CN106851107A (zh) * 2017-03-09 2017-06-13 广东欧珀移动通信有限公司 Control method, control device, and electronic device for switching cameras to assist composition
CN107590793A (zh) * 2017-09-11 2018-01-16 广东欧珀移动通信有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN107995434A (zh) * 2017-11-30 2018-05-04 广东欧珀移动通信有限公司 Image acquisition method, electronic device, and computer-readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105321159A (zh) * 2014-07-29 2016-02-10 宏达国际电子股份有限公司 Handheld electronic device, image extraction device, and method for obtaining depth-of-field information
US20180184071A1 (en) * 2015-06-23 2018-06-28 Huawei Technologies Co., Ltd. Photographing device and method for obtaining depth information
CN106454077A (zh) * 2016-09-26 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Shooting method, shooting device, and terminal
CN108989783A (zh) * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Electronic device and control method of electronic device
CN109005348A (zh) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 Electronic device and control method of electronic device
CN109040556A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 Imaging device and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114765662A (zh) * 2021-01-13 2022-07-19 富士康(昆山)电脑接插件有限公司 Sensing module

Also Published As

Publication number Publication date
CN109005348A (zh) 2018-12-14

Similar Documents

Publication Publication Date Title
WO2020038063A1 (fr) Electronic device and control method for electronic device
WO2020038054A1 (fr) Electronic device and control method therefor
WO2020038068A1 (fr) Imaging device and electronic apparatus
WO2020125388A1 (fr) Time-of-flight module and electronic device
WO2020038060A1 (fr) Laser projection module and control method therefor, image acquisition device, and electronic apparatus
EP3349429B1 (fr) Module de caméra appliqué à un terminal et un terminal comprenant celui-ci
US20060067678A1 (en) Camera head
US9986137B2 (en) Image pickup apparatus
WO2020052289A1 (fr) Depth acquisition module and electronic apparatus
TWM523106U (zh) Optical device
WO2021004248A1 (fr) Electronic device
CN111093018B (zh) Imaging module and terminal
US20240053479A1 (en) Tof apparatus and electronic device
WO2020052288A1 (fr) Depth collection module and mobile terminal
KR20190006689A (ko) Optical apparatus
US20130272692A1 (en) Photographing apparatus for recognizing type of external device, method of controlling the photographing apparatus, and the external device
US20230168500A1 (en) Smart glasses and camera device thereof
WO2020038057A1 (fr) Depth collection module and electronic device
WO2020038052A1 (fr) Input/output assembly and mobile device
JP2007174040A (ja) Optical device and camera unit
WO2021249024A1 (fr) Zoom lens group, lens assembly, camera apparatus, electronic device, and zoom method
US20190253590A1 (en) Camera Module
CN213693886U (zh) Camera module and device
TW202113417A (zh) Lens module and electronic device
CN220671792U (zh) Optical path folding element, camera module and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852566

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852566

Country of ref document: EP

Kind code of ref document: A1