WO2020038054A1 - Electronic device and control method for electronic device - Google Patents

Electronic device and control method for electronic device

Info

Publication number
WO2020038054A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
image
substrate
black
subject
Prior art date
Application number
PCT/CN2019/090017
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
张学勇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to EP19851528.0A priority Critical patent/EP3833019B1/de
Publication of WO2020038054A1 publication Critical patent/WO2020038054A1/zh
Priority to US17/175,681 priority patent/US11516455B2/en

Classifications

    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 7/4811: Constructional features, e.g. arrangements of optical elements, common to transmitter and receiver
    • G01S 7/4813: Housing arrangements
    • G01S 7/4816: Constructional features, e.g. arrangements of optical elements, of receivers alone
    • G01S 7/4912: Details of non-pulse systems; receivers
    • H04M 1/0264: Details of the structure or mounting of specific components for a camera module assembly
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 2250/52: Details of telephonic subscriber devices including functional features of a camera
    • H04N 13/218: Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H04N 13/25: Image signal generators using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics
    • H04N 13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N 13/257: Image signal generators; colour aspects
    • H04N 13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N 23/45: Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/56: Cameras or camera modules provided with illuminating means
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N 23/951: Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • the present application relates to the field of consumer electronic products, and more particularly, to an electronic device and a control method for an electronic device.
  • Electronic devices such as smart phones and tablet computers have become increasingly popular.
  • Electronic devices typically acquire two-dimensional images of a subject through a single camera.
  • An embodiment of the present application provides an electronic device and a control method of the electronic device.
  • An embodiment of the present application provides an electronic device.
  • The electronic device includes a time-of-flight module, a color camera, a black-and-white camera, and a processor. The time-of-flight module is configured to collect a depth image of a subject; the color camera is configured to collect a color image of the subject; the black-and-white camera is configured to capture a black-and-white image of the subject; and the processor is configured to obtain the current brightness of the ambient light in real time and, when the current brightness is less than a first threshold, construct a three-dimensional image of the subject according to the depth image, the color image, and the black-and-white image.
  • An embodiment of the present application provides a control method for an electronic device that includes a time-of-flight module, a color camera, and a black-and-white camera. The control method includes: acquiring a depth image of a subject through the time-of-flight module; collecting a color image of the subject through the color camera; acquiring the current brightness of the ambient light in real time; capturing a black-and-white image of the subject through the black-and-white camera when the current brightness is less than a first threshold; and constructing a three-dimensional image of the subject from the depth image, the color image, and the black-and-white image.
  • FIG. 1 is a schematic flowchart of a method for controlling an electronic device according to some embodiments of the present application
  • FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application.
  • FIGS. 3 and 4 are schematic flowcharts of methods for controlling an electronic device according to some embodiments of the present application.
  • FIG. 5 is a perspective structural diagram of a time-of-flight module according to some embodiments of the present application.
  • FIG. 6 is a schematic top view of a time-of-flight module according to some embodiments of the present application.
  • FIG. 7 is a schematic bottom view of a time-of-flight module according to some embodiments of the present application.
  • FIG. 8 is a schematic side view of a time-of-flight module according to some embodiments of the present application.
  • FIG. 9 is a schematic cross-sectional view of the time-of-flight module shown in FIG. 6 along the line IX-IX;
  • FIG. 10 is an enlarged schematic view of part X of the time-of-flight module shown in FIG. 9;
  • FIG. 11 is a schematic front structural diagram of a time-of-flight module in some embodiments of the present application when the flexible circuit board is not bent;
  • FIGS. 12 to 15 are schematic structural diagrams of a light emitter according to some embodiments of the present application.
  • an electronic device 100 includes a time-of-flight module 20, a color camera 30, a black-and-white camera 40, and a processor 10.
  • the time-of-flight module 20 is used to collect a depth image of a subject.
  • the color camera 30 is used to collect a color image of a subject.
  • the black-and-white camera 40 is used to capture a black-and-white image of a subject.
  • the processor 10 is configured to acquire the current brightness of the ambient light in real time, and construct a three-dimensional image of the subject according to the depth image, the color image, and the black and white image when the current brightness is less than the first threshold.
  • the time-of-flight module 20 and the black-and-white camera 40 are respectively disposed on two sides of the color camera 30.
  • the processor 10 is further configured to construct a fused image of the subject according to the color image and the black-and-white image when the current brightness is less than the first threshold.
  • the processor 10 is further configured to construct a three-dimensional image of the subject according to the depth image and the color image when the current brightness is greater than or equal to the first threshold.
  • the electronic device 100 further includes a flash 50.
  • the flash 50 is turned on when the current brightness is greater than or equal to the first threshold and less than the second threshold.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • The light transmitter 23 is disposed on the spacer 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • The light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • The light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • The casing 241 is integrally connected with the spacer 22.
  • In certain embodiments, the casing 241 and the spacer 22 are integrally formed.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232, and a housing 233.
  • The second substrate assembly 231 is disposed on the spacer 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331 to receive the light source assembly 232.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • The reinforcing member 2312 is integrally formed with the spacer 22; or the reinforcing member 2312 and the spacer 22 are formed separately.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • The spacer 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • The side of the spacer 22 that is combined with the first substrate 211 is provided with a receiving cavity 223.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • The spacer 22 is provided with an escape hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the escape hole 224.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • The reinforcing plate 213 is coupled to the side of the first substrate 211 opposite to the spacer 22.
  • The spacer 22 includes a protruding portion 225 protruding beyond the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • The light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively.
  • The control method is applied to the electronic device 100; the electronic device 100 includes a time-of-flight module 20, a color camera 30 and a black-and-white camera 40.
  • In some embodiments, the time-of-flight module 20 and the black-and-white camera 40 are respectively disposed on two sides of the color camera 30.
  • In some embodiments, the electronic device 100 further includes a flash 50.
  • The embodiment of the present application provides a control method of the electronic device 100. The electronic device 100 includes a time-of-flight module 20, a color camera 30 (i.e., an RGB camera), and a black-and-white camera 40 (i.e., a Mono camera).
  • an embodiment of the present application provides an electronic device 100.
  • the electronic device 100 includes a time-of-flight module 20, a color camera 30, a black-and-white camera 40, and a processor 10.
  • the control method of the electronic device 100 according to the embodiment of the present application may be implemented by the electronic device 100 according to the embodiment of the present application.
  • The time-of-flight module 20 may be used to execute the method in 01;
  • the color camera 30 may be used to execute the method in 02;
  • the processor 10 may be used to execute the methods in 03 and 05;
  • and the black-and-white camera 40 may be used to execute the method in 04. That is to say, the time-of-flight module 20 can be used to acquire a depth image of a subject.
  • the color camera 30 can be used to collect a color image of a subject.
  • the black-and-white camera 40 can be used to capture a black-and-white image of a subject.
  • the processor 10 may be configured to acquire the current brightness of the ambient light in real time, and construct a three-dimensional image of the subject according to the depth image, the color image, and the black and white image when the current brightness is less than the first threshold.
  • An electronic device usually collects a two-dimensional image of a subject through a single camera; the resulting photographing effect is poor, which degrades the user experience.
  • The electronic device 100 and the control method for the electronic device 100 according to the embodiments of the present application construct a three-dimensional image of the subject according to the depth image, the color image, and the black-and-white image, which is conducive to improving the user experience.
  • the electronic device 100 may be a mobile phone, a tablet computer, a smart watch, a smart bracelet, a smart wearable device, and the like.
  • The embodiment of the present application is described by taking the electronic device 100 as a mobile phone as an example; the specific form of the electronic device 100 is not limited to a mobile phone.
  • the electronic device 100 may include a case 101 and a bracket 102.
  • the time-of-flight module 20, the color camera 30 and the black and white camera 40 are all disposed on the bracket 102.
  • The time-of-flight module 20, the color camera 30, the black-and-white camera 40, and the bracket 102 are all housed in the casing 101 and can extend from the casing 101.
  • The bracket 102 can drive the time-of-flight module 20, the color camera 30, and the black-and-white camera 40 to move toward the outside of the casing 101 so that they extend out of the casing 101, so as to acquire a depth image, a color image, or a black-and-white image of the subject.
  • The time-of-flight module 20, the color camera 30, and the black-and-white camera 40 may be all front cameras or all rear cameras.
  • the subject can be a person, object, or other subject that the user wishes to photograph.
  • the casing 101 may be provided with a light through hole (not shown).
  • the time-of-flight module 20, the color camera 30, and the black and white camera 40 are immovably disposed in the casing 101 and correspond to the light through hole.
  • the display screen 103 of the electronic device 100 disposed on the casing 101 may be provided with a light through hole (not shown), and the time-of-flight module 20, the color camera 30, and the black and white camera 40 are disposed on the display screen 103.
  • the electronic device 100 may further include an ambient light sensor (not shown).
  • the ambient light sensor is configured to detect the current brightness of the ambient light and send the current brightness to the processor 10.
  • The processor 10 may acquire the current brightness of the ambient light before the time-of-flight module 20 acquires a depth image of the subject and the color camera 30 acquires a color image of the subject. If the current brightness is less than the first threshold, the time-of-flight module 20 captures a depth image of the subject while, at the same time, the color camera 30 captures a color image of the subject and the black-and-white camera 40 captures a black-and-white image of the subject.
  • When the time-of-flight module 20, the color camera 30, and the black-and-white camera 40 acquire images at the same time, compared with acquiring images sequentially, the time needed to construct a three-dimensional image of the subject can be effectively reduced, improving the user experience.
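  • To make the timing advantage concrete, the sketch below issues the three captures concurrently, so that the total latency is roughly that of the slowest sensor rather than the sum of all three. It is an illustration only: the sensor objects and their capture() method are hypothetical names, not an API defined by this application.

```python
import threading

def capture_all(tof, rgb, mono):
    """Trigger the three sensors at the same time (hypothetical capture() API).

    Sequential capture costs about t_tof + t_rgb + t_mono, while concurrent
    capture costs about max(t_tof, t_rgb, t_mono), which is why simultaneous
    acquisition shortens the construction of the three-dimensional image.
    """
    results = {}

    def grab(name, sensor):
        results[name] = sensor.capture()  # blocking read from one sensor

    threads = [threading.Thread(target=grab, args=(name, sensor))
               for name, sensor in (("depth", tof), ("color", rgb), ("mono", mono))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results  # {"depth": ..., "color": ..., "mono": ...}
```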
  • Alternatively, the processor 10 may acquire the current brightness of the ambient light after the time-of-flight module 20 captures the depth image of the subject and the color camera 30 captures the color image of the subject, and turn on the black-and-white camera 40 to collect a black-and-white image of the subject only when it is confirmed that the current brightness is less than the first threshold, thereby avoiding unnecessarily turning on the black-and-white camera 40 and increasing the power consumption of the electronic device 100.
  • The order in which the time-of-flight module 20 captures the depth image of the subject (i.e., 01), the color camera 30 captures the color image of the subject (i.e., 02), the processor 10 obtains the current brightness of the ambient light (i.e., 03), and the black-and-white camera 40 collects the black-and-white image of the subject (i.e., 04) can be arbitrary, as long as the three-dimensional image of the subject can be constructed from the depth image, the color image, and the black-and-white image.
  • The process of the processor 10 constructing a three-dimensional image of the subject according to the depth image, the color image, and the black-and-white image may be: obtaining depth information of each pixel in the depth image, obtaining color information of each pixel in the color image, and obtaining brightness information of each pixel in the black-and-white image; and generating the three-dimensional image according to the depth information, color information, and brightness information corresponding to each pixel.
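  • As an illustration of the per-pixel fusion just described, the sketch below back-projects every depth pixel into 3D and attaches color and brightness to it. This is a minimal example under stated assumptions, not the patented implementation: the three images are taken as already registered to one view, (fx, fy, cx, cy) are the pinhole intrinsics of that view, and all function and variable names are hypothetical.

```python
import numpy as np

def build_3d_image(depth, color, mono, fx, fy, cx, cy):
    """Fuse a depth map (time-of-flight module), a color image (color camera)
    and a black-and-white image (mono camera) into a colored point cloud.

    depth : (H, W) float array, metres
    color : (H, W, 3) uint8 array
    mono  : (H, W) uint8 array, luminance
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))

    # Back-project each pixel through the pinhole model: depth gives Z,
    # the pixel coordinates give X and Y.
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Color comes from the color camera; brightness and detail from the mono
    # camera. Here each RGB pixel is rescaled so its luma matches the mono value.
    luma = 0.299 * color[..., 0] + 0.587 * color[..., 1] + 0.114 * color[..., 2]
    gain = mono / np.maximum(luma, 1.0)
    rgb = np.clip(color * gain[..., None], 0, 255).astype(np.uint8).reshape(-1, 3)

    valid = points[:, 2] > 0          # keep only pixels with a depth reading
    return points[valid], rgb[valid]
```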
  • The time-of-flight module 20, the color camera 30, and the black-and-white camera 40 acquire images separately. In a low-light environment, the brightness, detail, and noise characteristics of the images collected by the black-and-white camera 40 are better than those of the color camera 30, which can improve the shooting quality of dark-light/night-scene images. Therefore, when generating the three-dimensional image, color is provided by the color camera 30, brightness and detail are provided by the black-and-white camera 40, and depth is provided by the time-of-flight module 20, forming a three-dimensional image whose color, brightness, detail, and noise are all satisfactory even in a low-light environment, thereby enabling better 3D and Augmented Reality (AR) effects in low-light environments.
  • In some embodiments, the control method further includes 06: constructing a fused image of the subject according to the color image and the black-and-white image when the current brightness is less than the first threshold.
  • The processor 10 may be used to execute the method in 06. That is, the processor 10 may be configured to construct a fused image of the subject according to the color image and the black-and-white image when the current brightness is less than the first threshold.
  • The process of the processor 10 constructing the fused image of the subject according to the color image and the black-and-white image may be: obtaining color information of each pixel in the color image and brightness information of each pixel in the black-and-white image; and generating the fused image according to the color information and brightness information corresponding to each pixel.
  • The color camera 30 and the black-and-white camera 40 acquire images separately. Since, in a low-light environment, the brightness, detail, and noise characteristics of the images collected by the black-and-white camera 40 are better than those of the color camera 30, the shooting quality of dark-light/night-scene images can be improved. Therefore, when generating the fused image, color is provided by the color camera 30 and brightness and detail are provided by the black-and-white camera 40, so that a two-dimensional image with satisfactory color, brightness, detail, and noise is obtained even in a low-light environment, achieving a better photographing effect in low-light environments.
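  • The color/brightness split described above can be illustrated with a short sketch that keeps the chroma of the color image and substitutes the luminance of the black-and-white image. The YCrCb-based strategy and the function name are illustrative assumptions, not the algorithm prescribed by this application, and the two inputs are assumed to be registered to the same viewpoint and size.

```python
import cv2  # OpenCV, used here only for color-space conversion

def fuse_color_and_mono(color_bgr, mono):
    """Return a fused 2D image: chroma from the color camera,
    luma from the black-and-white camera."""
    ycrcb = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb)
    ycrcb[..., 0] = mono   # brightness and detail from the mono camera
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)  # chroma kept from the color camera
```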
  • The processor 10 may determine whether to construct a three-dimensional image or a fused image of the subject according to a user input (for example, an image type selected by the user).
  • In some embodiments, the control method further includes 07: constructing a three-dimensional image of the subject according to the depth image and the color image when the current brightness is greater than or equal to the first threshold.
  • The processor 10 may be used to execute the method in 07. That is, the processor 10 may be configured to construct a three-dimensional image of the subject according to the depth image and the color image when the current brightness is greater than or equal to the first threshold.
  • When the current brightness is greater than or equal to the first threshold, the processor 10 may directly construct a three-dimensional image of the subject based on the depth image and the color image. At this time, the black-and-white camera 40 does not need to work, which is beneficial for saving power consumption of the electronic device 100.
  • Assume that the first threshold is L1 and the current brightness is l0. If l0 < L1, the electronic device 100 collects a black-and-white image of the subject through the black-and-white camera 40, and then constructs a three-dimensional image of the subject based on the depth image, the color image, and the black-and-white image, or constructs a fused image of the subject based on the color image and the black-and-white image. If l0 ≥ L1, the electronic device 100 does not need to collect a black-and-white image of the subject through the black-and-white camera 40, and directly constructs a three-dimensional image of the subject based on the depth image and the color image, so as to save power consumption of the electronic device 100.
  • The electronic device 100 can determine in real time, according to the current brightness, whether to capture a black-and-white image of the subject through the black-and-white camera 40. Assume that the first threshold is L1. At a first moment, the current brightness of the ambient light obtained by the processor 10 is l1, where l1 < L1; at this time, the electronic device 100 collects a black-and-white image of the subject through the black-and-white camera 40. At a second moment, the current brightness of the ambient light obtained by the processor 10 is l2, where l2 ≥ L1; at this time, the electronic device 100 does not need to collect a black-and-white image of the subject through the black-and-white camera 40, and the processor 10 constructs a three-dimensional image of the subject from the depth image and the color image.
  • In some embodiments, the electronic device 100 further includes a flash 50, and the control method further includes 08: turning on the flash 50 when the current brightness is greater than or equal to the first threshold and less than a second threshold.
  • The flash 50 may be used to perform the method in 08. That is, the flash 50 is turned on when the current brightness is greater than or equal to the first threshold and less than the second threshold.
  • A current brightness that is greater than or equal to the first threshold but less than the second threshold indicates that the ambient light is dim to a certain extent, though not extremely low. In this case, the electronic device 100 does not collect a black-and-white image of the subject through the black-and-white camera 40, but turns on the flash 50 to supplement light, thereby ensuring the quality of the color images collected by the color camera 30 and saving the power consumption required to turn on the black-and-white camera 40.
  • When the current brightness is less than the first threshold, the ambient light is extremely low; if the flash 50 were still used to supplement light, the quality of the color image collected by the color camera 30 might still not be guaranteed. In this case, instead of turning on the flash 50 for supplementary light, the electronic device 100 collects a black-and-white image of the subject through the black-and-white camera 40, thereby saving the power consumption required to turn on the flash 50 and ensuring imaging quality in a low-light environment.
  • the control method of the electronic device 100 according to the embodiment of the present application can achieve a balance between the power consumption of the electronic device 100 and the captured image quality.
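  • The two thresholds therefore define a simple three-way capture policy, sketched below; the function name, the threshold parameters, and the returned labels are hypothetical, and real firmware would likely also debounce the brightness reading.

```python
def choose_capture_strategy(brightness, l1, l2):
    """Brightness-dependent capture policy (assumes l1 < l2).

    brightness <  l1 : extremely dark -> mono camera on, flash off
    l1 <= b    <  l2 : somewhat dim   -> flash on, mono camera off
    brightness >= l2 : bright enough  -> depth + color cameras suffice
    """
    if brightness < l1:
        return {"mono_camera": True, "flash": False}
    if brightness < l2:
        return {"mono_camera": False, "flash": True}
    return {"mono_camera": False, "flash": False}
```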
  • the time-of-flight module 20 and the black-and-white camera 40 are respectively disposed on two sides of the color camera 30.
  • When the processor 10 constructs a three-dimensional image of the subject based on the depth image and the color image, the parallax between the time-of-flight module 20 and the color camera 30 is small, which is conducive to constructing a better three-dimensional image.
  • the processor 10 constructs a fused image of the subject based on the color image and the black-and-white image, the parallax between the color camera 30 and the black-and-white camera 40 is small, which is advantageous for constructing a fused image with a better effect.
  • the centers of the time-of-flight module 20, the color camera 30, and the black-and-white camera 40 may be located on a straight line in order.
  • On the one hand, the size of the bracket 102 can be reduced along the direction from the top of the electronic device 100 (that is, the side of the electronic device 100 near the bracket 102) to the bottom (that is, the side of the electronic device 100 away from the bracket 102); on the other hand, when the bracket 102 drives the time-of-flight module 20, the color camera 30, and the black-and-white camera 40 toward the outside of the casing 101, the three can synchronously protrude from the casing 101, which structurally ensures that the time-of-flight module 20, the color camera 30, and the black-and-white camera 40 can work synchronously, saving shooting time.
  • the time of flight module 20 may have the following structure.
  • the time-of-flight module 20 includes a first substrate assembly 21, a spacer 22, a light transmitter 23 and a light receiver 24.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212 connected to each other.
  • the spacer 22 is disposed on the first substrate 211.
  • the light transmitter 23 is configured to emit an optical signal outward.
  • The light transmitter 23 is disposed on the spacer 22.
  • the flexible circuit board 212 is bent and one end of the flexible circuit board 212 is connected to the first substrate 211 and the other end is connected to the light emitter 23.
  • the light receiver 24 is disposed on the first substrate 211.
  • The light receiver 24 is configured to receive the light signal that is emitted by the light transmitter 23 and reflected back.
  • The light receiver 24 includes a casing 241 and an optical element 242 disposed on the casing 241.
  • The casing 241 is integrally connected with the spacer 22.
  • The spacer 22 raises the height of the light emitter 23 and thus the height of its light-emitting surface, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be fully irradiated onto the measured object.
  • the first substrate assembly 21 includes a first substrate 211 and a flexible circuit board 212.
  • the first substrate 211 may be a printed circuit board or a flexible circuit board.
  • the control circuit of the time of flight module 20 may be laid on the first substrate 211.
  • One end of the flexible circuit board 212 can be connected to the first substrate 211, and the flexible circuit board 212 can be bent at a certain angle, so that the relative positions of the devices connected to its two ends can be flexibly arranged.
  • The spacer 22 is disposed on the first substrate 211.
  • The spacer 22 is in contact with the first substrate 211 and is carried on the first substrate 211.
  • The spacer 22 may be combined with the first substrate 211 by means of adhesion or the like.
  • The material of the spacer 22 may be metal, plastic, or the like.
  • The surface where the spacer 22 is combined with the first substrate 211 may be a flat surface, and the surface of the spacer 22 opposite to this combined surface may also be a flat surface, so that the light emitter 23 has better stability when disposed on the spacer 22.
  • the light transmitter 23 is configured to emit an optical signal outwards.
  • The light signal may be infrared light, and may take the form of a lattice of spots emitted toward the object to be measured.
  • The light signal is emitted from the light transmitter 23 at a certain divergence angle.
  • The light transmitter 23 is disposed on the spacer 22. In the embodiment of the present application, the light transmitter 23 is disposed on the side of the spacer 22 opposite to the first substrate 211; in other words, the spacer 22 spaces the light transmitter 23 apart from the first substrate 211, so that a height difference is formed between the light transmitter 23 and the first substrate 211.
  • the light transmitter 23 is also connected to the flexible circuit board 212.
  • The flexible circuit board 212 is bent, with one end of the flexible circuit board 212 connected to the first substrate 211 and the other end connected to the light transmitter 23, so that a control signal for the light transmitter 23 can be transmitted from the first substrate 211 to the light transmitter 23, or a feedback signal of the light transmitter 23 (for example, time information and frequency information of the emitted light signal, or temperature information of the light transmitter 23) can be transmitted to the first substrate 211.
  • The light receiver 24 is configured to receive the optical signal that is emitted by the light transmitter 23 and reflected back.
  • The light receiver 24 is disposed on the first substrate 211, and the contact surface between the light receiver 24 and the first substrate 211 is substantially flush with the contact surface between the spacer 22 and the first substrate 211 (that is, the installation starting points of the two are on the same plane).
  • the light receiver 24 includes a housing 241 and an optical element 242.
  • the casing 241 is disposed on the first substrate 211, and the optical element 242 is disposed on the casing 241.
  • The casing 241 may be the lens holder and lens barrel of the light receiver 24, and the optical element 242 may be a lens or another component disposed in the casing 241.
  • the light receiver 24 may further include a photosensitive chip (not shown).
  • the optical signal reflected by the measured object is irradiated into the photosensitive chip through the optical element 242, and the photosensitive chip responds to the optical signal.
  • The time-of-flight module 20 calculates the time difference between the moment the light transmitter 23 emits the light signal and the moment the light receiver 24 receives the light signal reflected by the measured object, and thereby obtains depth information of the measured object, which can be used for distance measurement, for generating depth images, or for 3D modeling.
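  • The underlying relation is the standard direct time-of-flight one: the light travels to the object and back, so the one-way distance is half of the speed of light multiplied by the round-trip time. A minimal sketch follows (many real modules measure a phase shift rather than raw timestamps):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth(t_emit, t_receive):
    """Depth of the measured object from a round-trip timing measurement."""
    round_trip = t_receive - t_emit   # seconds
    return C * round_trip / 2.0       # metres (half the round-trip distance)
```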
  • The housing 241 and the spacer 22 are integrally connected; specifically, the housing 241 and the spacer 22 may be integrally formed.
  • For example, the housing 241 and the spacer 22 are made of the same material and integrally formed by injection molding, cutting, or the like; or the housing 241 and the spacer 22 are made of different materials and integrally formed by two-color injection molding.
  • The housing 241 and the spacer 22 may also be formed separately, with the two forming a matching structure.
  • During assembly, the housing 241 and the spacer 22 may first be connected into one body and then jointly disposed on the first substrate 211; alternatively, one of the housing 241 and the spacer 22 may first be disposed on the first substrate 211, and the other then disposed on the first substrate 211 and connected with it as a whole.
  • In this way, the spacer 22 raises the height of the light emitter 23 and thus the height of its light-emitting surface, so that the light signal emitted by the light emitter 23 is not easily blocked by the light receiver 24 and can be fully irradiated onto the measured object.
  • The exit surface of the light transmitter 23 may be flush with the incident surface of the light receiver 24, slightly lower than it, or slightly higher than it.
  • the first substrate assembly 21 further includes a reinforcing plate 213.
  • The reinforcing plate 213 is coupled to the side of the first substrate 211 opposite to the spacer 22.
  • the reinforcing plate 213 may cover one side of the first substrate 211, and the reinforcing plate 213 may be used to increase the strength of the first substrate 211 and prevent deformation of the first substrate 211.
  • The reinforcing plate 213 may be made of a conductive material, such as a metal or an alloy.
  • The reinforcing plate 213 may be electrically connected to the casing so as to ground the reinforcing plate 213, effectively reducing the interference of static electricity from external components on the time-of-flight module 20.
  • The spacer 22 includes a protruding portion 225 protruding beyond the side edge 2111 of the first substrate 211, and the flexible circuit board 212 is bent around the protruding portion 225. Specifically, one part of the spacer 22 is directly carried on the first substrate 211, while another part is not in direct contact with the first substrate 211 and protrudes beyond the side edge 2111 of the first substrate 211 to form the protruding portion 225.
  • The flexible circuit board 212 may be connected to the side edge 2111 and bent around the protruding portion 225; in other words, the flexible circuit board 212 is bent so that the protruding portion 225 is located inside the space enclosed by the flexible circuit board 212. When the flexible circuit board 212 is subjected to an external force, it will not collapse inward and bend excessively, which would damage the flexible circuit board 212.
  • The outer surface 2251 of the protruding portion 225 is a smooth curved surface (for example, the outer surface of a cylinder); that is, the outer surface 2251 of the protruding portion 225 has no abrupt bends. Hence, even if the flexible circuit board 212 is bent against the outer surface 2251 of the protruding portion 225, the degree of bending of the flexible circuit board 212 will not be too large, which further ensures the integrity of the flexible circuit board 212.
  • the time-of-flight module 20 further includes a connector 26 connected to the first substrate 211.
  • the connector 26 is used to connect the first substrate assembly 21 and an external device.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the connector 26 may be a connection base or a connector.
  • the connector 26 may be connected to the main board of the mobile terminal 100 so that the time-of-flight module 20 is electrically connected to the main board.
  • the connector 26 and the flexible circuit board 212 are respectively connected to opposite ends of the first substrate 211.
  • the connectors 26 and the flexible circuit board 212 may be respectively connected to the left and right ends of the first substrate 211, or respectively connected to the front and rear ends of the first substrate 211.
  • The light transmitter 23 and the light receiver 24 are arranged along a straight line L, and the connector 26 and the flexible circuit board 212 are located on opposite sides of the straight line L, respectively. It can be understood that, since the light transmitter 23 and the light receiver 24 are arranged along the straight line L, the size of the time-of-flight module 20 may be larger in the direction of the straight line L.
  • Arranging the connector 26 and the flexible circuit board 212 on opposite sides of the straight line L does not increase the size of the time-of-flight module 20 in the direction of the straight line L, thereby facilitating the installation of the time-of-flight module 20 in the mobile terminal 100.
  • A receiving cavity 223 is defined in the side where the spacer 22 is combined with the first substrate 211.
  • the time-of-flight module 20 further includes an electronic component 25 disposed on the first substrate 211, and the electronic component 25 is contained in the receiving cavity 223.
  • the electronic component 25 may be an element such as a capacitor, an inductor, a transistor, a resistor, etc.
  • the electronic component 25 may be electrically connected to a control line laid on the first substrate 211 and used to drive or control the operation of the light transmitter 23 or the light receiver 24.
  • The electronic component 25 is contained in the receiving cavity 223, so that the space inside the spacer 22 is used reasonably.
  • the number of the receiving cavities 223 may be one or more, and the plurality of receiving cavities 223 may be spaced apart from each other.
  • When the spacer 22 is disposed on the first substrate 211, the position of the receiving cavity 223 may be aligned with the position of the electronic component 25.
  • The spacer 22 is provided with an escape hole 224 communicating with at least one receiving cavity 223, and at least one electronic component 25 extends into the escape hole 224.
  • When an electronic component 25 is housed entirely within the receiving cavity 223, its height must not exceed the height of the receiving cavity 223.
  • For an electronic component 25 whose height exceeds that of the receiving cavity 223, an escape hole 224 corresponding to that receiving cavity 223 may be provided, and the electronic component 25 may partially extend into the escape hole 224, so that the electronic component 25 can be arranged without increasing the height of the spacer 22.
  • the light emitter 23 includes a second substrate assembly 231, a light source assembly 232, and a housing 233.
  • The second substrate assembly 231 is disposed on the spacer 22, and the second substrate assembly 231 is connected to the flexible circuit board 212.
  • the light source assembly 232 is disposed on the second substrate assembly 231, and the light source assembly 232 is configured to emit a light signal.
  • the casing 233 is disposed on the second substrate assembly 231.
  • the casing 233 is formed with a receiving space 2331.
  • The receiving space 2331 can be used for receiving the light source assembly 232.
  • the flexible circuit board 212 may be detachably connected to the second substrate assembly 231.
  • the light source assembly 232 is electrically connected to the second substrate assembly 231.
  • The casing 233 may be bowl-shaped as a whole and is disposed on the second substrate assembly 231 with its opening facing downward, so as to receive the light source assembly 232 in the receiving space 2331.
  • A light outlet 2332 corresponding to the light source assembly 232 is provided in the housing 233.
  • The light signal emitted by the light source assembly 232 passes through the light outlet 2332 and exits.
  • The light signal may pass directly through the light outlet 2332, or may exit through the light outlet 2332 after its optical path has been changed by another optical device.
  • the second substrate assembly 231 includes a second substrate 2311 and a reinforcing member 2312.
  • the second substrate 2311 is connected to the flexible circuit board 212.
  • the light source assembly 232 and the reinforcing member 2312 are disposed on opposite sides of the second substrate 2311.
  • a specific type of the second substrate 2311 may be a printed circuit board or a flexible circuit board, and a control circuit may be laid on the second substrate 2311.
  • the reinforcing member 2312 may be fixedly connected to the second substrate 2311 by means of gluing, riveting, or the like.
  • the reinforcing member 2312 may increase the overall strength of the second substrate assembly 231.
  • The reinforcing member 2312 can directly contact the spacer 22, so the second substrate 2311 is not exposed to the outside and does not need to be in direct contact with the spacer 22; the second substrate 2311 is therefore not easily contaminated by dust or the like.
  • In one example, the reinforcing member 2312 and the spacer 22 are formed separately.
  • In this case, during assembly, the spacer 22 may first be mounted on the first substrate 211;
  • the two ends of the flexible circuit board 212 are respectively connected to the first substrate 211 and the second substrate 2311, with the flexible circuit board 212 left unbent (the state shown in FIG. 11);
  • the flexible circuit board 212 is then bent, so that the reinforcing member 2312 is disposed on the spacer 22.
  • In another example, the reinforcing member 2312 and the spacer 22 may be integrally formed, for example by a process such as injection molding.
  • In this case, the spacer 22 and the light emitter 23 may first be assembled together and then installed together on the first substrate 211.
  • a first positioning member 2313 is formed on the reinforcing member 2312.
  • The spacer 22 includes a body 221 and a second positioning member 222.
  • the second positioning member 222 is formed on the body 221.
  • the first positioning member 2313 cooperates with the second positioning member 222.
  • In this way, the relative movement between the second substrate assembly 231 and the spacer 22 can be effectively restricted.
  • the specific types of the first positioning member 2313 and the second positioning member 222 can be selected according to needs.
  • For example, the first positioning member 2313 is a positioning hole formed in the reinforcing member 2312 and the second positioning member 222 is a positioning column that projects into the positioning hole, so that the first positioning member 2313 and the second positioning member 222 cooperate with each other; or the first positioning member 2313 is a positioning column formed on the reinforcing member 2312 and the second positioning member 222 is a positioning hole into which the positioning column projects; or there are a plurality of first positioning members 2313 and second positioning members 222, some of the first positioning members 2313 being positioning holes with the corresponding second positioning members 222 being positioning columns, and others of the first positioning members 2313 being positioning columns with the corresponding second positioning members 222 being positioning holes, the positioning columns projecting into the positioning holes so that the first positioning members 2313 and the second positioning members 222 cooperate with each other.
  • the structure of the light source component 232 will be described as an example below:
  • the light source assembly 232 includes a light source 60, a lens barrel 70, a diffuser 80 and a protective cover 90.
  • the light source 60 is connected to the second substrate assembly 231.
  • the lens barrel 70 includes a first surface 71 and a second surface 72 opposite to each other.
  • The lens barrel 70 defines a receiving cavity 75 penetrating the first surface 71 and the second surface 72.
  • the first surface 71 is recessed toward the second surface 72 to form a mounting groove 76 communicating with the receiving cavity 75.
  • the diffuser 80 is installed in the mounting groove 76.
  • the protective cover 90 is mounted on the side where the first surface 71 of the lens barrel 70 is located, and the diffuser 80 is sandwiched between the protective cover 90 and the bottom surface 77 of the mounting groove 76.
  • The protective cover 90 can be mounted on the lens barrel 70 by means such as a threaded connection, a snap-fit engagement, or a fastener connection.
  • For example, when the protective cover 90 includes a top wall 91 and a protective side wall 92, the protective cover 90 (protective side wall 92) may be provided with an internal thread and the lens barrel 70 with an external thread; the internal thread of the protective cover 90 is screwed onto the external thread of the lens barrel 70 to mount the protective cover 90 on the lens barrel 70.
  • Or, referring to FIG. 13, when the protective cover 90 includes a top wall 91, a locking hole 95 is opened in the protective cover 90 (top wall 91) and a hook 73 is provided at the end of the lens barrel 70; the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • Or, referring to FIG. 14, when the protective cover 90 includes a top wall 91 and a protective side wall 92, a locking hole 95 is opened in the protective cover 90 (protective side wall 92) and a hook 73 is provided on the lens barrel 70; the hook 73 is inserted into the locking hole 95 so that the protective cover 90 is mounted on the lens barrel 70.
  • Or, referring to FIG. 15, when the protective cover 90 includes a top wall 91, the end of the lens barrel 70 is provided with a first positioning hole 74, the protective cover 90 (top wall 91) is provided with a second positioning hole 93 corresponding to the first positioning hole 74, and a fastener 94 passes through the second positioning hole 93 and locks into the first positioning hole 74, so that the protective cover 90 is mounted on the lens barrel 70.
  • When the protective cover 90 is mounted on the lens barrel 70, the protective cover 90 is in contact with the diffuser 80 and the diffuser 80 is in contact with the bottom surface 77, so that the diffuser 80 is sandwiched between the protective cover 90 and the bottom surface 77.
  • In the light source assembly 232, the mounting groove 76 is provided on the lens barrel 70, the diffuser 80 is installed in the mounting groove 76, and the protective cover 90 is mounted on the lens barrel 70 so as to clamp the diffuser 80 between the protective cover 90 and the bottom surface 77 of the mounting groove 76; in this way, the diffuser 80 is fixed to the lens barrel 70 without using glue. This avoids glue that, after volatilizing into a gaseous state, would diffuse and solidify on the surface of the diffuser 80 and affect its microstructure, and it also prevents the diffuser 80 from falling off the lens barrel 70 when the glue connecting the diffuser 80 and the lens barrel 70 weakens due to aging.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless specifically defined otherwise.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Toys (AREA)
PCT/CN2019/090017 2018-08-22 2019-06-04 Electronic device and control method for electronic device WO2020038054A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19851528.0A 2018-08-22 2019-06-04 Electronic device and control method therefor
US17/175,681 US11516455B2 (en) 2018-08-22 2021-02-14 Electronic device and method for controlling the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810963394.9A 2018-08-22 2018-08-22 Electronic device and control method for electronic device
CN201810963394.9 2018-08-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/175,681 Continuation US11516455B2 (en) 2018-08-22 2021-02-14 Electronic device and method for controlling the same

Publications (1)

Publication Number Publication Date
WO2020038054A1 true WO2020038054A1 (zh) 2020-02-27

Family

ID=64547500

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090017 WO2020038054A1 (zh) Electronic device and control method for electronic device 2018-08-22 2019-06-04

Country Status (4)

Country Link
US (1) US11516455B2 (de)
EP (1) EP3833019B1 (de)
CN (2) CN111698494B (de)
WO (1) WO2020038054A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188059A (zh) * 2020-09-30 2021-01-05 深圳市商汤科技有限公司 Wearable device, intelligent guidance method and apparatus, and guidance system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005348A (zh) * 2018-08-22 2018-12-14 Oppo广东移动通信有限公司 Electronic device and control method for electronic device
CN111698494B (zh) * 2018-08-22 2022-10-28 Oppo广东移动通信有限公司 Electronic device
JP7122463B2 (ja) * 2018-09-04 2022-08-19 寧波舜宇光電信息有限公司 TOF imaging module, electronic device, and assembly method
CN109737868A (zh) * 2018-12-21 2019-05-10 华为技术有限公司 Time-of-flight module and electronic device
CN112492139B (zh) * 2018-12-24 2021-10-15 华为技术有限公司 Camera assembly and electronic device
CN110266939B (zh) * 2019-05-27 2022-04-22 联想(上海)信息技术有限公司 Display method, electronic device, and storage medium
EP3846440B1 (de) 2019-12-30 2021-12-15 Axis AB Method, unit and system for low-light imaging
JP7043707B2 (ja) * 2020-06-30 2022-03-30 SZ DJI Technology Co., Ltd. Scene recognition device, imaging device, scene recognition method, and program
CN115184956B (zh) * 2022-09-09 2023-01-13 荣耀终端有限公司 TOF sensor system and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100207938A1 (en) * 2009-02-18 2010-08-19 International Press Of Boston, Inc. Simultaneous three-dimensional geometry and color texture acquisition using single color camera
CN106454077A * 2016-09-26 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Photographing method, photographing apparatus and terminal
CN107133914A * 2016-02-26 2017-09-05 英飞凌科技股份有限公司 Apparatus for generating a three-dimensional color image and method for generating a three-dimensional color image
CN107580209A * 2017-10-24 2018-01-12 维沃移动通信有限公司 Photographing and imaging method and apparatus for a mobile terminal
CN108989783A * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Electronic device and control method for electronic device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9424650B2 (en) * 2013-06-12 2016-08-23 Disney Enterprises, Inc. Sensor fusion for depth estimation
CN104661008B * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 Processing method and apparatus for improving color image quality under low-illumination conditions
CN104918034A * 2015-05-29 2015-09-16 深圳奥比中光科技有限公司 3D image capture apparatus, capture method, and 3D image system
CN206698308U * 2016-11-08 2017-12-01 聚晶半导体股份有限公司 Camera module and camera device
CN106791734A 2016-12-27 2017-05-31 珠海市魅族科技有限公司 Apparatus for image acquisition, electronic device, and image acquisition method
CN106772431B 2017-01-23 2019-09-20 杭州蓝芯科技有限公司 Depth information acquisition apparatus combining TOF technology with binocular vision, and method thereof
CN107147891B * 2017-05-17 2019-03-01 浙江大学 Trinocular depth acquisition camera with adjustable optical axes
CN107179592A * 2017-06-30 2017-09-19 广东欧珀移动通信有限公司 Lens module, camera module, and electronic device
CN107528946B * 2017-09-26 2020-07-17 Oppo广东移动通信有限公司 Camera module and mobile terminal
CN107819992B * 2017-11-28 2020-10-02 信利光电股份有限公司 Three-camera module and electronic device
CN107995434A * 2017-11-30 2018-05-04 广东欧珀移动通信有限公司 Image acquisition method, electronic device, and computer-readable storage medium
CN107846542A * 2017-12-08 2018-03-27 杜鑫 3D camera and 3D image generation method
CN108093242A * 2017-12-29 2018-05-29 盎锐(上海)信息科技有限公司 Photography method and photography apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100207938A1 (en) * 2009-02-18 2010-08-19 International Press Of Boston, Inc. Simultaneous three-dimensional geometry and color texture acquisition using single color camera
CN107133914A * 2016-02-26 2017-09-05 英飞凌科技股份有限公司 Apparatus for generating a three-dimensional color image and method for generating a three-dimensional color image
CN106454077A * 2016-09-26 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Photographing method, photographing apparatus and terminal
CN107580209A * 2017-10-24 2018-01-12 维沃移动通信有限公司 Photographing and imaging method and apparatus for a mobile terminal
CN108989783A * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Electronic device and control method for electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3833019A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188059A (zh) * 2020-09-30 2021-01-05 深圳市商汤科技有限公司 Wearable device, intelligent guidance method and apparatus, and guidance system
CN112188059B (zh) * 2020-09-30 2022-07-15 深圳市商汤科技有限公司 Wearable device, intelligent guidance method and apparatus, and guidance system

Also Published As

Publication number Publication date
EP3833019A1 (de) 2021-06-09
CN111698494A (zh) 2020-09-22
US11516455B2 (en) 2022-11-29
CN111698494B (zh) 2022-10-28
EP3833019A4 (de) 2021-09-08
CN108989783A (zh) 2018-12-11
EP3833019B1 (de) 2023-07-26
US20210176449A1 (en) 2021-06-10

Similar Documents

Publication Publication Date Title
WO2020038054A1 (zh) Electronic device and control method for electronic device
WO2020038068A1 (zh) Imaging device and electronic equipment
WO2020038063A1 (zh) Electronic device and control method for electronic device
CN109256047B (zh) Display panel, display device, and driving method thereof
WO2020125388A1 (zh) Time-of-flight module and electronic device
EP3349064B1 (de) Camera module applied to a terminal, and terminal therewith
WO2020134879A1 (zh) Camera assembly and electronic device
CN109428996B (zh) Camera module including a reinforcing member, and electronic device including the camera module
US9986137B2 (en) Image pickup apparatus
EP3562139B1 (de) Electronic device and camera assembly therefor
TWM523106U (zh) Optical device
US20140353501A1 (en) Night vision attachment for smart camera
WO2020052288A1 (zh) Depth acquisition module and mobile terminal
KR20190006689A (ko) Optical instrument
CN107566555B (zh) Projection mobile phone
CN107241472B (zh) Camera module and electronic device
WO2021027580A1 (zh) Terminal
CN210694195U (zh) Integrated 3D imaging device and electronic device
US20230179693A1 (en) Full-screen display device
US20190253590A1 (en) Camera Module
CN101459771A (zh) Digital photographing apparatus
CN213693886U (zh) Camera module and device
WO2018028585A1 (zh) Multi-camera module with apertures of different sizes and application thereof
TWM462378U (zh) Portable electronic device
CN217216688U (zh) Camera module and electronic device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19851528

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019851528

Country of ref document: EP

Effective date: 20210302