WO2020238482A1 - Control method of electronic device, and electronic device - Google Patents

Control method of electronic device, and electronic device

Info

Publication number
WO2020238482A1
Authority: WIPO (PCT)
Prior art keywords: image, laser, laser projector, display screen, electronic device
Application number: PCT/CN2020/085819
Other languages: English (en), French (fr)
Inventors: 张学勇, 吕向楠, 赵斌
Original Assignee: Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Priority to KR1020217039467A (KR20220004171A)
Priority to EP20813039.3A (EP3968615B1)
Priority to JP2021571488A (JP2022535521A)
Publication of WO2020238482A1
Priority to US17/537,393 (US11947045B2)

Classifications

    • G01S7/484 Transmitters (details of pulse systems of G01S17/00 lidar systems)
    • H04M1/0272 Details of the structure or mounting of a projector or beamer module assembly in portable telephone sets
    • H04M1/21 Combinations with auxiliary equipment, e.g. with clocks or memoranda pads
    • H04M1/026 Details of the structure or mounting of specific components in portable telephone sets
    • G01B11/2513 Measuring contours or curvatures by projecting a pattern with several lines in more than one direction, e.g. grids, patterns
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G03B15/05 Combinations of cameras with electronic flash apparatus; electronic flash units
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T7/55 Depth or shape recovery from multiple images
    • H04N23/56 Cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G06T2207/10028 Range image; depth image; 3D point clouds
    • H04M1/0264 Details of the structure or mounting of a camera module assembly
    • H04M2201/34 Microprocessors
    • H04M2250/12 Telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04M2250/20 Telephonic subscriber devices including a rotatable camera

Definitions

  • This application relates to the technical field of consumer electronic products, and in particular to a control method of an electronic device and an electronic device.
  • In the related art, the laser projector is generally arranged on the front shell of a mobile phone, and the laser projector is used only in the front use state, where the shooting distance is relatively short.
  • Because the laser projector is used only for depth image acquisition in the front use state, there are few scenarios in which mobile phone users can use the laser projector.
  • For this reason, the embodiments of the present application provide a method for controlling an electronic device, and an electronic device.
  • The electronic device includes a casing, a display screen, and a rotating assembly. The display screen is arranged on one side of the casing. The rotating assembly includes a base and a laser projector arranged on the base; the base is rotatably mounted on the casing so that the laser projector can selectively face the side where the display screen is located or the side opposite to the display screen.
  • The control method includes: determining the orientation of the laser projector; when the laser projector faces the side where the display screen is located, projecting laser light in a first mode; and, when the laser projector faces the side opposite to the display screen, projecting laser light in a second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  • The electronic device of the present application includes a casing, a display screen, a rotating assembly, and a processor. The display screen is arranged on one side of the casing. The rotating assembly includes a base and a laser projector arranged on the base; the base is rotatably installed on the casing so that the laser projector can selectively face the side where the display screen is located or the side opposite to the display screen. The laser projector is used to project laser light in the first mode when it faces the side where the display screen is located, and to project laser light in the second mode when it faces the side opposite to the display screen; the energy of the laser light emitted in the second mode is greater than the energy of the laser light emitted in the first mode.
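  • As a minimal illustration of this control flow (a sketch in Python; the type and function names are assumptions for illustration, not part of the disclosure), the mode selection reduces to a dispatch on the determined orientation:

```python
from enum import Enum

class Orientation(Enum):
    TOWARD_DISPLAY = "front"     # laser projector faces the display side
    AWAY_FROM_DISPLAY = "rear"   # laser projector faces the side opposite the display

def select_mode(orientation: Orientation) -> str:
    # First mode (lower laser energy) toward the user; second mode
    # (higher laser energy, longer reach) away from the user.
    return "first" if orientation is Orientation.TOWARD_DISPLAY else "second"

assert select_mode(Orientation.TOWARD_DISPLAY) == "first"
assert select_mode(Orientation.AWAY_FROM_DISPLAY) == "second"
```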
  • FIGS. 1 to 3 are schematic structural diagrams of electronic devices according to some embodiments of the present application.
  • FIG. 4 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application.
  • FIG. 5 is a schematic diagram of a three-dimensional structure of an electronic device according to some embodiments of the present application.
  • FIG. 6 is a schematic diagram of the structure of the laser light source of the laser projector of the electronic device according to some embodiments of the present application.
  • FIG. 7 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application.
  • FIGS. 8 and 9 are schematic diagrams of the three-dimensional structure of an electronic device according to other embodiments of the present application.
  • FIG. 10 is a schematic diagram of a system architecture of an electronic device according to some embodiments of the present application.
  • FIG. 11 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application.
  • FIG. 12 is a schematic diagram of the principle of a control method of an electronic device according to some embodiments of the present application.
  • FIGS. 13 to 16 are schematic flowcharts of control methods of electronic devices according to some embodiments of the present application.
  • FIG. 17 is a schematic diagram of a control method of an electronic device in some embodiments of the present application.
  • FIG. 18 is a schematic flowchart of a control method of an electronic device according to some embodiments of the present application.
  • The control method of the present application is applicable to an electronic device 100. The electronic device 100 includes a casing 11, a display screen 12, and a rotating assembly 13; the display screen 12 is disposed on one side of the casing 11. The rotating assembly 13 includes a base 131 and a laser projector 14 arranged on the base 131; the base 131 is rotatably installed on the casing 11 so that the laser projector 14 can selectively face the side where the display screen 12 is located or the side opposite to the display screen 12.
  • The control method includes: (041) determining the orientation of the laser projector 14; (042) when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode; and (043) when the laser projector 14 faces the side opposite to the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  • In certain embodiments, the electronic device 100 further includes a Hall sensor assembly 16. The Hall sensor assembly 16 includes a first sensor 161 and a second sensor 162; the first sensor 161 is disposed on the base 131, and the second sensor 162 is arranged on the casing 11 and corresponds to the first sensor 161. Determining the orientation of the laser projector 14 (step 041) then includes: determining the orientation of the laser projector 14 by the Hall sensor assembly 16.
  • Determining the orientation of the laser projector 14 by the Hall sensor assembly 16 includes: obtaining the Hall value of the Hall sensor assembly 16; determining that the laser projector 14 faces the side where the display screen 12 is located when the Hall value is less than a first preset value; and determining that the laser projector 14 faces the side opposite to the display screen 12 when the Hall value is greater than a second preset value.
  • In certain embodiments, the electronic device 100 further includes a state selection button 17, and determining the orientation of the laser projector 14 (step 041) includes: determining the orientation of the laser projector 14 by the state selection button 17.
  • In certain embodiments, the power of the laser light projected by the laser projector 14 in the first mode is less than the power of the laser light in the second mode; and/or the laser projector 14 includes multiple point light sources 141 that are independently controlled, and the number of point light sources 141 turned on in the first mode is less than the number of point light sources 141 turned on in the second mode.
  • In certain embodiments, the electronic device 100 further includes an image collector 15. The image collector 15 is arranged on the base 131 and located on the side where the laser projector 14 is located. When the laser projector 14 projects laser light, it projects laser light onto the scene at a first operating frequency. The control method further includes:
  • (0114) the image collector 15 acquires collected images at a second operating frequency, the second operating frequency being greater than the first operating frequency;
  • (0115) among the collected images, distinguish the first image collected when the laser projector 14 is not projecting laser light from the second image collected when the laser projector 14 is projecting laser light; and
  • (0116) calculate a depth image based on the first image, the second image, and a reference image.
  • The electronic device 100 of the embodiments of the present application includes a casing 11, a display screen 12, a rotating assembly 13, and a processor 20. The display screen 12 is arranged on one side of the casing 11. The rotating assembly 13 includes a base 131 and a laser projector 14 arranged on the base 131; the base 131 is rotatably installed on the casing 11 so that the laser projector 14 can selectively face the side where the display screen 12 is located or the side opposite to the display screen 12. The processor 20 is used to determine the orientation of the laser projector 14. The laser projector 14 is used to project laser light in the first mode when it faces the side where the display screen 12 is located, and to project laser light in the second mode when it faces the side opposite to the display screen 12; the energy of the laser light emitted in the second mode is greater than the energy of the laser light emitted in the first mode.
  • In certain embodiments, the electronic device 100 further includes a Hall sensor assembly 16. The Hall sensor assembly 16 includes a first sensor 161 and a second sensor 162; the first sensor 161 is disposed on the base 131, and the second sensor 162 is disposed on the casing 11 and corresponds to the first sensor 161. The processor 20 is also used to determine the orientation of the laser projector 14 through the Hall sensor assembly 16.
  • Specifically, the processor 20 is further used to: obtain the Hall value of the Hall sensor assembly 16; determine that the laser projector 14 faces the side where the display screen 12 is located when the Hall value is less than a first preset value; and determine that the laser projector 14 faces the side opposite to the display screen 12 when the Hall value is greater than a second preset value.
  • In certain embodiments, the electronic device 100 further includes a state selection button 17, and the processor 20 is configured to determine the orientation of the laser projector 14 through the state selection button 17.
  • In certain embodiments, the power of the laser light projected by the laser projector 14 in the first mode is less than the power of the laser light in the second mode; and/or the laser projector 14 includes multiple point light sources 141 that are independently controlled, and the number of point light sources 141 turned on in the first mode is less than the number turned on in the second mode.
  • In certain embodiments, the electronic device 100 further includes an image collector 15 arranged on the base 131 and located on the side where the laser projector 14 is located. When the laser projector 14 projects laser light, the laser projector 14 is used to project laser light onto the scene at a first operating frequency, and the image collector 15 is used to capture images at a second operating frequency, the second operating frequency being greater than the first operating frequency. The processor 20 is used to distinguish, among the captured images, a first image collected when the laser projector 14 is not projecting laser light from a second image collected when the laser projector 14 is projecting laser light, and to calculate a depth image based on the first image, the second image, and a reference image.
  • In certain embodiments, the casing 11 includes an end surface 113 and a rear surface 112 opposite to the display screen 12. The rear surface 112 is provided with a receiving groove 114 penetrating the end surface 113, and the rotating assembly 13 is rotatably installed in the receiving groove 114. When the laser projector 14 faces the side where the display screen 12 is located, the base 131 protrudes from the end surface 113; when the laser projector 14 faces away from the display screen 12, one side of the base 131 is flush with the end surface 113.
  • In certain embodiments, the casing 11 includes a front surface 111, a rear surface 112, and an end surface 113. The front surface 111 and the rear surface 112 are located on opposite sides of the casing 11, and the end surface 113 connects the front surface 111 and the rear surface 112. The display screen 12 is arranged on the front surface 111, and a gap 121 is formed at one end of the display screen 12 close to the end surface 113. The casing 11 is provided with an accommodating groove 115 penetrating the front surface 111, the rear surface 112, and the end surface 113; the accommodating groove 115 is in communication with the notch 121, and the rotating assembly 13 is rotatably installed in the accommodating groove 115.
  • In certain embodiments, the casing 11 includes a front surface 111, a rear surface 112, and an end surface 113. The display screen 12 is installed on the front surface 111 of the casing 11 and can cover 85% or more of the area of the front surface 111.
  • In certain embodiments, the laser projector 14 includes a laser light source 140, and the laser light source 140 includes multiple point light sources 141. The multiple point light sources 141 form multiple light-emitting arrays 142, and the multiple light-emitting arrays 142 are arranged in a ring. The light-emitting arrays 142 are turned on in such a manner that the light-emitting array 142 farther from the center of the laser light source 140 is turned on first.
  • In certain embodiments, the laser projector 14 includes a laser light source 140 and a first driver 147; the first driver 147 is used to drive the laser light source 140 to project laser light onto the object to be measured.
  • In certain embodiments, the casing 11 includes a front surface 111, a rear surface 112, and an end surface 113. The display screen 12 is disposed on the front surface 111 of the casing 11, and a gap 121 is formed at the end of the display screen 12 close to the end surface 113. The electronic device 100 further includes a floodlight 50 disposed on the base 131; the floodlight 50 and the laser projector 14 are located on the same surface of the base 131.
  • The control method includes:
  • (041) determining the orientation of the laser projector 14;
  • (042) when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode; and
  • (043) when the laser projector 14 faces the side opposite to the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  • The electronic device 100 of the present application can be used to implement the aforementioned control method. Specifically, the electronic device 100 further includes a processor 20; step 041 can be implemented by the processor 20, and step 042 and step 043 can both be implemented by the laser projector 14. In other words, the processor 20 can be used to determine the orientation of the laser projector 14, and the laser projector 14 can be used to project laser light in the first mode when it faces the side where the display screen 12 is located and to project laser light in the second mode when it faces the side opposite to the display screen 12.
  • The electronic device 100 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (a smart watch, a smart bracelet, a smart helmet, smart glasses, etc.), a virtual reality device, and the like. This application is described taking a mobile phone as an example, but the form of the electronic device 100 is not limited to a mobile phone.
  • The casing 11 includes a front surface 111, a rear surface 112, and an end surface 113. The front surface 111 and the rear surface 112 are located on opposite sides of the casing 11, and the end surface 113 connects the front surface 111 and the rear surface 112. The rear surface 112 is provided with a receiving groove 114 penetrating the end surface 113.
  • The display screen 12 is installed on the front surface 111 of the casing 11 and can cover 85% or more of the area of the front surface 111, such as 85%, 86%, 87%, 88%, 89%, 90%, 91%, 92%, 93%, 95%, or even 100%. The display screen 12 can be used to display information such as text, images, videos, and icons.
  • The rotating assembly 13 is rotatably installed in the receiving groove 114. The rotating assembly 13 includes a base 131 and a laser projector 14. The two rotating shafts 132 (shown in FIG. 2) of the base 131 are respectively mounted on two opposite side walls of the receiving groove 114, and the axis of the rotating shafts 132 is parallel to the front surface 111, the rear surface 112, and the end surface 113. The laser projector 14 is installed on the base 131 and exposed from one surface of the base 131. The base 131 can be driven to rotate so that the laser projector 14 can selectively face the side where the display screen 12 is located or the side opposite to the display screen 12. When the laser projector 14 faces the side where the display screen 12 is located, the base 131 protrudes from the end surface 113; when the laser projector 14 faces the side opposite to the display screen 12, one side of the base 131 is flush with the end surface 113, and at this time the base 131 does not protrude from the end surface 113.
  • In some embodiments, the electronic device 100 further includes an image collector 15, which is arranged on the base 131 and located on the side where the laser projector 14 is located; that is, the laser projector 14 and the image collector 15 are exposed from the same surface of the base 131. The laser projector 14 is used in conjunction with the image collector 15 to obtain depth information of the object to be measured, for use in three-dimensional modeling, three-dimensional image generation, distance measurement, and the like. The laser projector 14 and the image collector 15 may be installed on a bracket, with the bracket, the laser projector 14, and the image collector 15 then installed together in the base 131; alternatively, the base 131 itself serves as the bracket, and the laser projector 14 and the image collector 15 are both installed directly in the base 131.
  • When the display screen 12 faces the user and the laser projector 14 faces the side where the display screen 12 is located (as shown in FIG. 3), the laser projector 14 is in the front use state: the display screen 12 and the laser projector 14 both face the user of the electronic device 100, and the laser projector 14 can be used as a front laser projector. At this time, the user can watch the content displayed on the display screen 12, the laser projector 14 can project laser light toward the user, and the user can use the laser projector 14 (and the image collector 15) for face recognition, iris recognition, and the like.
  • When the laser projector 14 faces away from the display screen 12, the laser projector 14 is in the rear use state: the display screen 12 faces the user of the electronic device 100 while the laser projector 14 faces away from the user, and the laser projector 14 can be used as a rear laser projector. At this time, the user can watch the content displayed on the display screen 12 and use the laser projector 14 to project laser light toward the side away from the user, so the user can use the laser projector 14 (and the image collector 15) to obtain a depth image of the object to be measured on the side of the electronic device 100 opposite to the user.
  • The modes in which the laser projector 14 projects laser light include a first mode and a second mode, where the first mode corresponds to the front use state of the laser projector 14 and the second mode corresponds to its rear use state. The energy of the laser light projected in the second mode is greater than the energy of the laser light projected in the first mode. Specifically, the power of the laser projector 14 in the first mode may be less than its power in the second mode, so that the energy of the laser light projected in the second mode is greater than in the first mode. In this way, the maximum distance (projection distance) that the laser light projected by the laser projector 14 in the second mode can reach is greater than the maximum distance that the laser light projected in the first mode can reach.
  • Meanwhile, the rear distance range that the laser projector 14 and the image collector 15 can detect in the rear use state is greater than the front distance range they can detect in the front use state. For example, the front distance range detectable by the laser projector 14 with the image collector 15 is within 25 cm, while the detectable rear distance range is greater than 25 cm (the accuracy of rear detection within 25 cm is very poor). Alternatively, the front distance range and the rear distance range may slightly overlap: for example, the front distance range is within 25 cm, and the rear distance range detectable by the laser projector 14 in cooperation with the image collector 15 is greater than 20 cm.
  • Since the rotating assembly 13 is rotatably mounted on the casing 11 so that the laser projector 14 can selectively face the side where the display screen 12 is located or the side opposite to the display screen 12, and the maximum distance the laser projector 14 can project when facing the side opposite to the display screen 12 is greater than the maximum distance it can project when facing the side where the display screen 12 is located, the laser projector 14 can serve as both a front laser projector and a rear laser projector, which increases the scenarios in which users can use the electronic device 100. At the same time, there is no need to provide two laser projectors 14 on the electronic device 100 to serve as the front laser projector and the rear laser projector respectively, which saves the cost of the electronic device 100.
  • The laser projector 14 includes a laser light source 140, and the laser light source 140 includes a plurality of point light sources 141 that can be independently controlled; specifically, each point light source 141 can be turned on and off independently. The number of point light sources 141 turned on by the laser projector 14 in the first mode is less than the number turned on in the second mode. In this case, each point light source 141 may project laser light at the same power, so that the energy of the laser light projected in the second mode is greater than the energy of the laser light projected in the first mode, and the maximum distance that the laser light projected in the second mode can reach is greater than the maximum distance that the laser light projected in the first mode can reach.
  • In some embodiments, the multiple point light sources 141 form multiple light-emitting arrays 142, and the multiple light-emitting arrays 142 are independently controlled; specifically, the multiple point light sources 141 in each light-emitting array 142 can be turned on and off at the same time, and the power of the multiple point light sources 141 in each light-emitting array 142 may be the same. In other embodiments, the multiple point light sources 141 in each light-emitting array 142 can also be independently controlled.
  • The multiple light-emitting arrays 142 are arranged in a ring shape, which may be a square ring or a circular ring. The laser light emitted by the point light sources 141 in the ring-arranged light-emitting arrays 142 can cover a wider field of view, so more depth information of the object to be measured can be obtained.
  • The light-emitting arrays 142 are turned on in such a manner that the light-emitting array 142 farther from the center of the laser light source 140 is turned on first. For example, the total number of light-emitting arrays 142 is six, and the six light-emitting arrays 142 include five ring-shaped sub-arrays 143 and one square sub-array 144. The five ring-shaped sub-arrays 143 are arranged in sequence and numbered A, B, C, D, and E from the outside in.
  • The laser projector 14 turns on the point light sources 141 in the ring sub-arrays 143 numbered A and B when facing the side where the display screen 12 is located. When facing the side opposite to the display screen 12, the laser projector 14 may turn on the point light sources 141 in the ring sub-arrays 143 numbered A, B, and C; or those numbered A, B, C, and D; or those numbered A, B, C, D, and E; or the point light sources 141 in the ring sub-arrays 143 numbered A, B, C, D, and E together with the point light sources 141 in the square sub-array 144.
  • In this way, the number of point light sources 141 turned on in the first mode is less than the number turned on in the second mode, so the energy of the laser light projected toward the side where the display screen 12 is located is smaller than the energy of the laser light projected toward the side opposite to the display screen 12.
  • It can be understood that the diffraction ability of the diffractive optical element (not shown) of the laser projector 14 is limited; that is, part of the laser light emitted by the laser light source 140 is not diffracted but emitted directly, and the directly emitted laser light does not undergo the attenuation of passing through the diffractive optical element. The energy of the directly emitted laser light is relatively large and is very likely to harm the user's eyes. Therefore, when the laser projector 14 faces the side where the display screen 12 is located, that is, when the projection distance is small, it projects laser light in the first mode, turning on first the ring sub-arrays 143 away from the center of the laser light source 140; this prevents the laser light projected by the laser light source 140 from reaching the user's eyes directly without the diffraction attenuation of the diffractive optical element, thereby improving the safety of the laser projector 14. When the laser projector 14 faces the side opposite to the display screen 12, that is, when the projection distance is large, it projects laser light in the second mode, simultaneously turning on the ring sub-arrays 143 away from the center of the laser light source 140 and those near the center, which increases the maximum distance that the projected laser light can reach.
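  • A toy sketch of this outer-first turn-on policy (Python; the array labels and per-mode counts follow the A to E example above, everything else is an assumption for illustration):

```python
# Ring sub-arrays ordered outermost -> innermost, plus the central square sub-array.
ARRAYS = ["A", "B", "C", "D", "E", "square"]

def arrays_to_turn_on(mode: str) -> list[str]:
    # First mode: only the outer rings (e.g. A and B), keeping undiffracted
    # stray light away from the optical axis for eye safety at short range.
    # Second mode: more arrays, still enabled outer-first, for longer reach.
    count = 2 if mode == "first" else 5
    return ARRAYS[:count]

assert arrays_to_turn_on("first") == ["A", "B"]
assert arrays_to_turn_on("second") == ["A", "B", "C", "D", "E"]
```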
  • In some embodiments, the electronic device 100 further includes a Hall sensor assembly 16. The Hall sensor assembly 16 includes a first sensor 161 and a second sensor 162; the first sensor 161 is disposed on the base 131, and the second sensor 162 is arranged on the casing 11 and corresponds to the first sensor 161. Determining the orientation of the laser projector 14 can be achieved by the Hall sensor assembly 16.
  • In this case, the control method includes: (0711) obtaining the Hall value of the Hall sensor assembly 16; (0712) determining that the laser projector 14 faces the side where the display screen 12 is located when the Hall value is less than a first preset value; (0713) determining that the laser projector 14 faces the side opposite to the display screen 12 when the Hall value is greater than a second preset value; (072) when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode; and (073) when the laser projector 14 faces the side opposite to the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  • In other words, step 0711, step 0712, and step 0713 can be regarded as sub-steps of step 041 described above, step 072 is basically the same as step 042 described above, and step 073 is basically the same as step 043 described above.
  • The processor 20 is electrically connected to the Hall sensor assembly 16 and can be used to determine the orientation of the laser projector 14 through it; that is, the processor 20 can be used to implement step 0711, step 0712, and step 0713. Specifically, the processor 20 obtains the Hall value of the Hall sensor assembly 16, determines that the laser projector 14 faces the side where the display screen 12 is located when the Hall value is less than the first preset value, and determines that the laser projector 14 faces the side opposite to the display screen 12 when the Hall value is greater than the second preset value.
  • Specifically, the first sensor 161 may be a magnet 161, and the second sensor 162 may be a Hall sensor 162, such as a gaussmeter or a digital Hall sensor; in that case the Hall value is a Gauss value. In one example, the S pole of the magnet 161 is located at the end of the magnet 161 close to the Hall sensor 162, and the N pole of the magnet 161 is located at the end away from the Hall sensor 162. The closer the S pole of the magnet 161 is to the Hall sensor 162, the stronger the magnetic field at the Hall sensor 162, and the larger (and positive) the Hall value collected by the Hall sensor 162; the closer the N pole of the magnet 161 is to the Hall sensor 162, the smaller the Hall value collected by the Hall sensor 162, which is negative.
  • When the Hall value collected by the Hall sensor 162 is less than the first preset value, for example when the processor 20 obtains a Hall value of -90, which is less than the first preset value of -85, it is determined that the laser projector 14 faces the side where the display screen 12 is located; when the Hall value collected by the Hall sensor 162 is greater than the second preset value, for example when the processor 20 obtains a Hall value of 40, which is greater than the second preset value of 35, it is determined that the laser projector 14 faces the side opposite to the display screen 12. The sizes of the first preset value and the second preset value are related to the characteristics of the magnet 161 (including its material, shape, and size) and to the distance between the magnet 161 and the Hall sensor 162, among other factors; the smaller the distance between the magnet 161 and the Hall sensor 162, the larger the Hall value collected by the Hall sensor 162.
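  • The threshold comparison above can be sketched as follows (Python; the preset values -85 and 35 are taken from the example in this passage, and the function name is illustrative):

```python
FIRST_PRESET = -85   # below this the projector faces the display side
SECOND_PRESET = 35   # above this the projector faces away from the display

def orientation_from_hall(hall_value: float) -> str | None:
    if hall_value < FIRST_PRESET:       # e.g. -90 < -85
        return "toward_display"
    if hall_value > SECOND_PRESET:      # e.g. 40 > 35
        return "away_from_display"
    return None  # between the presets: orientation not (yet) determined

assert orientation_from_hall(-90) == "toward_display"
assert orientation_from_hall(40) == "away_from_display"
```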
  • In this way, the electronic device 100 and the control method of the embodiments of the present application determine the orientation of the laser projector 14 through the Hall sensor assembly 16, so that the laser projector 14 projects laser light in the corresponding mode without the user manually selecting the orientation of the laser projector 14, which improves the experience of using the electronic device 100.
  • In some embodiments, the electronic device 100 further includes a state selection button 17 electrically connected to the processor 20; determining the orientation of the laser projector 14 can be achieved by the state selection button 17. Specifically, the processor 20 can be used to determine the orientation of the laser projector 14 through the state selection button 17. The state selection button 17 may be a physical button and include a first state button 171 and a second state button 172: when the processor 20 detects that the first state button 171 is triggered, the processor 20 determines that the laser projector 14 faces the side where the display screen 12 is located; when the processor 20 detects that the second state button 172 is triggered, the processor 20 determines that the laser projector 14 faces the side opposite to the display screen 12.
  • In other embodiments, the state selection button 17 may also be a virtual button displayed on the display screen 12; for example, the state selection button 17 may be an orientation switching button displayed on the display screen 12. In this way, the electronic device 100 and the control method of the embodiments of the present application determine the orientation of the laser projector 14 through the state selection button 17, so that the user can accurately select the desired orientation as needed.
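  • For comparison with the Hall-based path, the button-based determination is a direct mapping from the triggered button to the orientation (a sketch; the event names are assumptions for illustration):

```python
def orientation_from_button(triggered: str) -> str:
    # first state button 171 -> facing the display side;
    # second state button 172 -> facing the side opposite the display.
    mapping = {
        "first_state_button": "toward_display",
        "second_state_button": "away_from_display",
    }
    return mapping[triggered]
```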
  • In some embodiments, the electronic device 100 further includes a visible light camera 40. The visible light camera 40 includes a main camera 41 and a sub camera 42, both mounted on the base 131 and located on the side where the laser projector 14 is located. The main camera 41 may be a wide-angle camera: a wide-angle camera has low sensitivity to motion, so a relatively low shutter speed can still ensure sharp images, and its large field of view can cover a wide range of a scene, emphasizing the foreground and highlighting the contrast between near and far. The sub camera 42 may be a telephoto camera, whose telephoto lens can resolve more distant objects.
  • Alternatively, the main camera 41 may be a color camera and the sub camera 42 a black-and-white camera. In one arrangement, the laser projector 14, the sub camera 42, the main camera 41, and the image collector 15 are sequentially arranged at intervals along the same straight line.
  • In some embodiments, the display screen 12 is disposed on the front surface 111, a gap 121 is formed at the end of the display screen 12 close to the end surface 113, and the casing 11 is provided with an accommodating groove 115 penetrating the front surface 111, the rear surface 112, and the end surface 113; the accommodating groove 115 corresponds to and communicates with the notch 121, and the rotating assembly 13 is rotatably installed in the accommodating groove 115. The size of the base 131 is slightly smaller than the size of the accommodating groove 115 so that the base 131 can rotate in the accommodating groove 115. Regardless of which side the laser projector 14 faces, the two opposite surfaces (the front and rear surfaces) of the base 131 are flush with the rear surface 112 and the outermost light-emitting surface of the display screen 12, respectively.
  • The laser projector 14 includes a laser light source 140 and a first driver 147; the first driver 147 can be used to drive the laser light source 140 to project laser light onto the object to be measured. Both the laser projector 14 and the image collector 15 are connected to the processor 20. The processor 20 may provide an enable signal for the laser projector 14; specifically, the processor 20 may provide an enable signal for the first driver 147. The image collector 15 is connected to the processor 20 through an I2C bus. The laser projector 14 can emit laser light, such as infrared laser light, which is reflected after reaching objects in the scene; the reflected laser light can be received by the image collector 15, and the processor 20 can calculate the depth information of an object based on the laser light emitted by the laser projector 14 and the laser light received by the image collector 15.
  • In one example, the laser projector 14 and the image collector 15 can obtain depth information through the Time of Flight (TOF) ranging method; in another example, they can obtain depth information through the structured-light ranging principle. The description of this application takes the case where the laser projector 14 and the image collector 15 obtain depth information through the structured-light ranging principle as an example.
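  • For reference, time-of-flight ranging recovers the distance from the round-trip delay of the emitted laser (a standard relation, not specific to this disclosure):

```latex
% c is the speed of light, \Delta t the measured delay between
% laser emission and reception of the reflected light.
d = \frac{c \, \Delta t}{2}
```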
  • In that case, the laser projector 14 is an infrared laser transmitter and the image collector 15 is an infrared camera. The image collector 15 can control the projection timing of the laser projector 14 through a strobe signal, where the strobe signal is generated according to the timing with which the image collector 15 acquires images; the strobe signal can be regarded as an electrical signal with alternating high and low levels, and the laser projector 14 projects laser light according to the laser projection timing indicated by the strobe signal.
  • Specifically, the processor 20 may send an image acquisition instruction through the I2C bus to activate the laser projector 14 and the image collector 15 and make them work. After receiving the image acquisition instruction, the image collector 15 controls the switching device 30 through the strobe signal: if the strobe signal is high, the switching device 30 sends the first pulse signal (pwn1) to the first driver 147, and the first driver 147 drives the laser light source 140 to project laser light into the scene according to the first pulse signal; if the strobe signal is low, the switching device 30 stops sending the first pulse signal to the first driver 147, and the laser light source 140 does not project laser light. Alternatively, the logic may be inverted: when the strobe signal is low, the switching device 30 sends the first pulse signal to the first driver 147, and the first driver 147 drives the laser light source 140 to project laser light into the scene according to the first pulse signal; when the strobe signal is high, the switching device 30 stops sending the first pulse signal to the first driver 147, and the laser light source 140 does not project laser light.
  • In other embodiments, the strobe signal may not be used when the image collector 15 and the laser projector 14 cooperate: the processor 20 sends an image acquisition instruction to the image collector 15 and simultaneously sends a laser projection instruction to the first driver 147. After receiving the image acquisition instruction, the image collector 15 starts to acquire the collected images; after receiving the laser projection instruction, the first driver 147 drives the laser light source 140 to project laser light.
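  • The strobe-gated drive path can be sketched as below (Python; the passage allows either strobe polarity to mean "project", so the polarity is a parameter, and all names are illustrative assumptions):

```python
def switching_device(strobe_high: bool, project_on_high: bool = True) -> bool:
    """Return True when the switching device 30 forwards the first pulse
    signal (pwn1) to the first driver 147."""
    return strobe_high == project_on_high

def first_driver(pwn1: bool) -> str:
    # The first driver 147 drives the laser light source 140 only while
    # the first pulse signal arrives.
    return "laser on" if pwn1 else "laser off"

assert first_driver(switching_device(strobe_high=True)) == "laser on"
assert first_driver(switching_device(strobe_high=True, project_on_high=False)) == "laser off"
```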
  • When the laser projector 14 projects laser light, the laser light forms a laser pattern with speckles that is projected onto the object to be measured in the scene. The image collector 15 collects the laser pattern reflected by the object to be measured to obtain a speckle image, and sends the speckle image to the processor 20 through the Mobile Industry Processor Interface (MIPI). The processor 20 may then calculate the depth image based on the speckle image and a reference image pre-stored in the processor 20.
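  • The patent does not spell out the speckle-to-depth computation; a common structured-light triangulation relation, given here purely as an assumed sketch, recovers depth from the pixel disparity between the captured speckle and the pre-stored reference image:

```python
def depth_from_disparity(d_px: float, z0: float, f_px: float, b: float) -> float:
    """Assumed structured-light relation: z0 = reference-plane distance,
    f_px = focal length in pixels, b = projector-camera baseline,
    d_px = speckle disparity (pixels) between captured and reference images.
    Derived from d_px = f_px * b * (1/z - 1/z0)."""
    return (z0 * f_px * b) / (f_px * b + z0 * d_px)

# Example: reference plane at 0.5 m, f = 600 px, baseline 0.04 m, disparity 12 px.
print(round(depth_from_disparity(12, 0.5, 600, 0.04), 3))  # -> 0.4 (metres)
```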
  • In some embodiments, the visible light camera 40 is also connected to the processor 20 through the I2C bus; that is, both the main camera 41 and the sub camera 42 can be connected to the processor 20 through the I2C bus. The visible light camera 40 can be used to collect visible light images; that is, the main camera 41 and the sub camera 42 can each collect visible light images separately, or cooperate to collect them; in other words, either or both of the main camera 41 and the sub camera 42 may be used to collect visible light images. Each time the visible light camera 40 (the main camera 41 and/or the sub camera 42) sends a frame of visible light image to the processor 20, the processor 20 receives one data stream. The visible light camera 40 can be used alone: when the user only wants to obtain visible light images, the processor 20 sends an image acquisition instruction through the I2C bus to the visible light camera 40 (either or both of the main camera 41 and the sub camera 42) to enable it and make it work; after receiving the image acquisition instruction, the visible light camera 40 collects a visible light image of the scene and sends it to the processor 20 through the Mobile Industry Processor Interface.
  • The visible light camera 40 (either of the main camera 41 and the sub camera 42, or both of them) can also be used in conjunction with the laser projector 14 and the image collector 15; that is, when the user wants to obtain a three-dimensional image from the visible light image and the depth image, the image collector 15 and the visible light camera 40 realize hardware synchronization through a sync signal. Specifically, the processor 20 sends an image acquisition instruction to the image collector 15 through the I2C bus. After receiving the image acquisition instruction, the image collector 15 can control the switching device 30 through the strobe signal to send the first pulse signal (pwn1) to the first driver 147, so that the first driver 147 drives the laser light source 140 to emit laser light according to the first pulse signal; at the same time, the image collector 15 and the visible light camera 40 are synchronized by the sync signal, which controls the visible light camera 40 to collect visible light images.
  • In some embodiments, the electronic device 100 may further include a floodlight 50 arranged on the base 131; the floodlight 50 and the laser projector 14 are located on the same surface of the base 131. The floodlight 50 can emit uniform surface light into the scene and includes a floodlight source 51 and a second driver 52; the second driver 52 can be used to drive the floodlight source 51 to emit the uniform surface light. The light emitted by the floodlight 50 may be infrared light or other invisible light, such as ultraviolet light; this application takes the case where the floodlight 50 emits infrared light as an example, but the form of the light is not limited to this. The floodlight 50 is connected to the processor 20, and the processor 20 can provide an enable signal for driving the floodlight 50; specifically, the processor 20 can provide an enable signal for the second driver 52.
  • The floodlight 50 can work with the image collector 15 to collect infrared images. The image collector 15 can control the emission timing of the infrared light emitted by the floodlight 50 through a strobe signal (this strobe signal and the strobe signal with which the image collector 15 controls the laser projector 14 are two independent strobe signals); the strobe signal is generated according to the timing with which the image collector 15 acquires images, and can be regarded as an electrical signal with alternating high and low levels.
  • Specifically, the processor 20 may send an image acquisition instruction to the image collector 15 through the I2C bus; after receiving the image acquisition instruction, the image collector 15 controls the switching device 30 through the strobe signal. If the strobe signal is high, the switching device 30 sends a second pulse signal (pwn2) to the second driver 52, and the second driver 52 controls the floodlight source 51 to emit infrared light according to the second pulse signal; if the strobe signal is low, the switching device 30 stops sending the second pulse signal to the second driver 52, and the floodlight source 51 does not emit infrared light. Alternatively, the logic may be inverted: when the strobe signal is low, the switching device 30 sends the second pulse signal to the second driver 52, and the second driver 52 controls the floodlight source 51 to emit infrared light according to the second pulse signal; when the strobe signal is high, the switching device 30 stops sending the second pulse signal to the second driver 52, and the floodlight source 51 does not emit infrared light.
  • When the floodlight 50 emits infrared light, the image collector 15 receives the infrared light reflected by objects in the scene to form an infrared image, and sends the infrared image to the processor 20 through the Mobile Industry Processor Interface. Each time the image collector 15 sends a frame of infrared image to the processor 20, the processor 20 receives one data stream. The infrared image is usually used for iris recognition, face recognition, and the like.
  • In some embodiments, the electronic device 100 further includes an image collector 15 installed together with the laser projector 14 on the same surface of the base 131. When the laser projector 14 projects laser light (whether in the first mode or in the second mode), the laser projector 14 projects laser light onto the scene at a first operating frequency. The control method then further includes:
  • (0114) the image collector 15 acquires collected images at a second operating frequency, the second operating frequency being greater than the first operating frequency;
  • (0115) among the collected images, distinguish the first image collected when the laser projector 14 is not projecting laser light from the second image collected when the laser projector 14 is projecting laser light; and
  • (0116) calculate the depth image based on the first image, the second image, and the reference image.
  • That is, the control method includes: (0111) determining the orientation of the laser projector 14; (0112) when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode; (0113) when the laser projector 14 faces the side opposite to the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode; (0114) the image collector 15 acquires collected images at the second operating frequency, the second operating frequency being greater than the first operating frequency; (0115) among the collected images, distinguish the first image collected when the laser projector 14 is not projecting laser light from the second image collected when the laser projector 14 is projecting laser light; and (0116) calculate the depth image based on the first image, the second image, and the reference image.
  • Step 0111 is basically the same as step 041 described above, step 0112 is basically the same as step 042 described above, and step 0113 is basically the same as step 043 described above. The image collector 15 can be used to implement step 0114, and the processor 20 can also be used to implement step 0115 and step 0116.
  • In other words, the image collector 15 is used to obtain captured images at the second operating frequency, and the processor 20 is also used to distinguish the first image captured when the laser projector 14 is not projecting laser light from the second image captured when the laser projector 14 is projecting laser light, and to calculate the depth image based on the first image, the second image, and the reference image.
  • Specifically, the processor 20 simultaneously sends an image acquisition instruction for acquiring a depth image to the image collector 15 and the first driver 147 via the I2C bus. After receiving the instruction, the first driver 147 drives the laser light source 140 to emit infrared laser light into the scene at the first operating frequency, and the image collector 15 collects the infrared laser light reflected back by objects in the scene at the second operating frequency to obtain the collected images. As shown in FIG. 12, from top to bottom, the solid line represents the timing of laser emission by the laser projector 14, the dashed line represents the timing of image acquisition by the image collector 15 and the number of frames of the collected images, and the dash-dotted line represents the number of frames of the third image obtained from the first image and the second image; in this example, the second operating frequency is twice the first operating frequency.
  • The image collector 15 first receives infrared light in the environment (hereinafter referred to as ambient infrared light) while the laser projector 14 is not projecting laser light, to obtain the Nth frame of the collected images (a first image, which can also be called a background image), and sends the Nth frame to the processor 20 through the Mobile Industry Processor Interface. Then, while the laser projector 14 projects laser light, the image collector 15 receives both the ambient infrared light and the infrared laser emitted by the laser projector 14, to obtain the (N+1)th frame of the collected images (a second image, which can also be called an interference speckle image), and sends it to the processor 20 through the Mobile Industry Processor Interface. Subsequently, the image collector 15 again receives only ambient infrared light while the laser projector 14 is not projecting laser light, to obtain the (N+2)th frame (a first image again), and sends it to the processor 20; and so on, the image collector 15 alternately obtains first images and second images.
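  • A toy timeline of this alternation for the 2x case shown in FIG. 12 (a Python sketch; the frame labels are illustrative):

```python
def classify_frames(num_frames: int, ratio: int = 2) -> list[str]:
    # The collector captures one frame per tick; with the second operating
    # frequency `ratio` times the first, the projector is on during one out
    # of every `ratio` collector frames (here: the last tick of each cycle).
    return ["second (interference speckle)" if n % ratio == ratio - 1
            else "first (background)" for n in range(num_frames)]

print(classify_frames(4))
# ['first (background)', 'second (interference speckle)',
#  'first (background)', 'second (interference speckle)']
```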
  • In another example, the processor 20 sends the acquisition instruction for acquiring a depth image only to the image collector 15 via the I2C bus. After receiving the image acquisition instruction, the image collector 15 controls the switching device 30 through the strobe signal to send the first pulse signal to the first driver 147, and the first driver 147 drives the laser light source 140 to project laser light at the first operating frequency according to the first pulse signal; that is, the laser projector 14 projects laser light at the first operating frequency, while the image collector 15 collects the infrared laser light reflected back by objects in the scene at the second operating frequency to obtain the collected images. The timing is the same as in FIG. 12: the solid line represents the timing of laser emission by the laser projector 14, the dashed line represents the timing of image acquisition by the image collector 15 and the number of frames of the collected images, and the dash-dotted line represents the number of frames of the third image obtained from the first image and the second image, the second operating frequency being twice the first operating frequency. As before, the image collector 15 first receives ambient infrared light while the laser projector 14 is not projecting, to obtain the Nth frame (a first image, or background image); then receives both the ambient infrared light and the infrared laser while the laser projector 14 projects, to obtain the (N+1)th frame (a second image, or interference speckle image); then again receives only ambient infrared light, to obtain the (N+2)th frame (a first image), sending each frame to the processor 20 through the Mobile Industry Processor Interface; and so on, the image collector 15 alternately obtains first images and second images.
It should be noted that the image collector 15 can acquire new captured images while sending earlier ones to the processor 20. The image collector 15 may also obtain a second image first and then a first image, alternating the acquisition in that order. In addition, the multiple relationship between the second operating frequency and the first operating frequency described above is only an example; in other embodiments it may also be three times, four times, five times, six times, and so on.
Each time the processor 20 receives a frame of captured image, it distinguishes whether the received frame is a first image or a second image. After receiving at least one frame of the first image and at least one frame of the second image, the processor 20 can calculate the depth image from the first image, the second image, and the reference image. Specifically, since the first image is collected while the laser projector 14 is not projecting, the light forming the first image includes only ambient infrared light, whereas the second image is collected while the laser projector 14 is projecting, so the light forming the second image includes both the ambient infrared light and the infrared laser emitted by the laser projector 14. The processor 20 can therefore use the first image to remove from the second image the portion formed by ambient infrared light, obtaining a captured image formed only by the infrared laser (that is, the speckle image formed by the infrared laser).
It can be understood that ambient light includes infrared light at the same wavelength as the infrared laser emitted by the laser projector 14 (for example, ambient infrared light at 940 nm), and when the image collector 15 acquires a captured image, this part of the infrared light is also received by the image collector 15. When the brightness of the scene is high, the proportion of ambient infrared light in the light received by the image collector 15 increases, making the laser speckles in the captured image inconspicuous and thereby affecting the calculation of the depth image.
The control method of the present application drives the laser projector 14 and the image collector 15 at different operating frequencies, so that the image collector 15 can collect both a first image formed only by ambient infrared light and a second image formed by the ambient infrared light together with the infrared laser emitted by the laser projector 14, and the portion of the second image formed by ambient infrared light can then be removed based on the first image. In this way the laser speckles can be distinguished, the depth image can be calculated from a captured image formed only by the infrared laser emitted by the laser projector 14, laser speckle matching is unaffected, and partial or complete loss of depth information is avoided, improving the accuracy of the depth image.
Referring to FIGS. 1, 2, and 13, in some embodiments the control method includes:
0131: determining the orientation of the laser projector 14;
0132: when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode;
0133: when the laser projector 14 faces the side opposite the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than that emitted in the first mode;
0134: while the laser projector 14 projects laser light onto the scene at the first operating frequency, the image collector 15 acquires captured images at the second operating frequency, the second operating frequency being greater than the first operating frequency;
01351: adding an image type to each frame of captured image;
01352: distinguishing the first image from the second image according to the image type; and
0136: calculating the depth image from the first image, the second image, and the reference image.
Referring to FIGS. 1, 2, and 14, in some embodiments the control method includes:
0141: determining the orientation of the laser projector 14;
0142: when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode;
0143: when the laser projector 14 faces the side opposite the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than that emitted in the first mode;
0144: while the laser projector 14 projects laser light onto the scene at the first operating frequency, the image collector 15 acquires captured images at the second operating frequency, the second operating frequency being greater than the first operating frequency;
014511: determining the working state of the laser projector 14 at the collection time according to the collection time of each frame of captured image;
014512: adding an image type to each frame of captured image according to the working state;
01452: distinguishing the first image from the second image according to the image type; and
0146: calculating the depth image from the first image, the second image, and the reference image.
Here, step 0131 is basically the same as step 041 described above, step 0132 as step 042, step 0133 as step 043, and step 0134 as step 0114; steps 01351 and 01352 can be sub-steps of step 0115, and step 0136 is basically the same as step 0116. Likewise, step 0141 is basically the same as step 041, step 0142 as step 042, step 0143 as step 043, and step 0144 as step 0114; steps 014511 and 014512 can be sub-steps of step 01351, step 01452 is basically the same as step 01352, and step 0146 is basically the same as step 0116. Steps 01351, 01352, 014511, 014512, and 01452 can all be implemented by the processor 20.
That is to say, the processor 20 can also be used to add an image type to each frame of captured image and to distinguish the first image from the second image according to the image type. When adding the image type, the processor 20 specifically determines the working state of the laser projector 14 at the collection time according to the collection time of each frame of captured image, and adds the image type to each frame according to that working state.
Specifically, each time the processor 20 receives a frame of captured image from the image collector 15, it adds an image type (stream_type) to it, so that the first image and the second image can be distinguished by image type in subsequent processing. While the image collector 15 is acquiring captured images, the processor 20 monitors the working state of the laser projector 14 in real time via the I2C bus. Each time it receives a frame of captured image from the image collector 15, the processor 20 first obtains the collection time of that frame, then judges from the collection time whether the working state of the laser projector 14 at that moment was projecting laser light or not projecting laser light, and adds the image type to the frame based on the judgment. The collection time of a captured image may be the start time or end time at which the image collector 15 acquires the frame, any time between the start time and the end time, and so on. In this way, each frame of captured image corresponds to the working state of the laser projector 14 (projecting laser or not projecting laser) during that frame's acquisition, and the type of the captured image is accurately distinguished.
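Below is a minimal sketch of this timestamp-based typing, assuming a hypothetical projector-state log of (start, end) on-intervals gathered by polling over the I2C bus; the interval-containment test stands in for the real-time monitoring described above:

```python
# Sketch only: assigns an image type to each captured frame by checking whether
# the frame's collection time falls inside one of the projector's on-intervals.
# `on_intervals` is a hypothetical log of (start, end) times during which the
# laser projector was projecting.

def add_image_type(frame_time: float, on_intervals: list) -> str:
    projecting = any(start <= frame_time <= end for start, end in on_intervals)
    # "001" marks a second image (ambient IR + laser), "000" a first image.
    return "001" if projecting else "000"

on_intervals = [(0.10, 0.20), (0.30, 0.40)]
print(add_image_type(0.15, on_intervals))  # 001 -> second image
print(add_image_type(0.25, on_intervals))  # 000 -> first image
```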
In one example, the structure of the image type stream_type is shown in Table 1:

Table 1
  stream  light
  0       00     (first image: ambient infrared light only)
  0       01     (second image: ambient infrared light + infrared laser)

When stream is 0 in Table 1, the data stream at that time is an image formed by infrared light and/or infrared laser. When light is 00, the data stream was acquired without any device projecting infrared light and/or infrared laser (only ambient infrared light was present), so the processor 20 can add the image type 000 to the captured image to identify it as a first image. When light is 01, the data stream was acquired while the laser projector 14 was projecting infrared laser (both ambient infrared light and infrared laser were present), so the processor 20 can add the image type 001 to the captured image to identify it as a second image. The processor 20 can then distinguish the image types of the captured images according to stream_type.
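Below is a minimal sketch of one way such a tag could be packed and unpacked, assuming stream_type is simply a stream bit concatenated with two light bits as in Tables 1 to 3; this bit layout is an assumption for illustration, not an encoding mandated by the patent:

```python
# Sketch only: packs/unpacks a stream_type tag as "<stream><light[2]>" bits.
# 000 first image, 001 second image, 010 infrared image, 011 third image,
# 1XX depth image -- matching Tables 1, 2, and 3.

def pack_stream_type(stream: int, light: int) -> str:
    return f"{stream:01b}{light:02b}"

def describe(stream_type: str) -> str:
    if stream_type[0] == "1":
        return "depth image"          # light bits unconstrained (1XX)
    return {"00": "first image (ambient IR only)",
            "01": "second image (ambient IR + laser)",
            "10": "infrared image (floodlight on)",
            "11": "third image (background subtracted)"}[stream_type[1:]]

print(pack_stream_type(0, 0b01))  # 001
print(describe("001"))            # second image (ambient IR + laser)
```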
Referring to FIGS. 1, 2, and 15, in some embodiments the control method includes:
0151: determining the orientation of the laser projector 14;
0152: when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode;
0153: when the laser projector 14 faces the side opposite the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than that emitted in the first mode;
0154: while the laser projector 14 projects laser light onto the scene at the first operating frequency, the image collector 15 acquires captured images at the second operating frequency, the second operating frequency being greater than the first operating frequency;
0155: distinguishing, among the captured images, the first image collected while the laser projector 14 was not projecting laser light and the second image collected while it was projecting laser light;
01561: calculating a third image from the first image and the second image, the difference between the collection time of the first image and that of the second image being less than a predetermined difference; and
01562: calculating the depth image from the third image and the reference image.
Here, step 0151 is basically the same as step 041 described above, step 0152 as step 042, step 0153 as step 043, step 0154 as step 0114, and step 0155 as step 0115; steps 01561 and 01562 can be sub-steps of step 0116, and both can be implemented by the processor 20.
That is to say, the processor 20 can also be used to calculate a third image from the first image and the second image, and to calculate the depth image from the third image and the reference image, where the difference between the collection time of the first image and that of the second image is less than a predetermined difference. In calculating the depth image, the processor 20 may first distinguish the first images from the second images, then select, according to collection time, any frame of second image together with the particular frame of first image whose collection time differs from that of the selected second image by less than the predetermined difference. The processor 20 then calculates the third image from that first image and that second image; the third image is the captured image formed only by the infrared laser emitted by the laser projector 14, and may also be called the actual speckle image. Specifically, the pixels of the first image correspond one-to-one with those of the second image; denoting the first image P1, the second image P2, and the third image P3, the processor 20 can subtract the pixel value of pixel P1_{i,j} from that of P2_{i,j} to obtain the pixel value of P3_{i,j}, that is, P3_{i,j} = P2_{i,j} − P1_{i,j}, i ∈ N+, j ∈ N+. The processor 20 can then calculate the depth image from the third image and the reference image, where the number of frames of the second image, the number of frames of the third image, and the number of frames of the depth image are all equal. It can be understood that, since the difference between the collection times of the first and second images is small, the intensity of ambient infrared light in the first image is closer to that in the second image, so the third image calculated from them is more accurate, which helps further reduce the influence of ambient infrared light on the acquisition of the depth image.
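Below is a minimal sketch of the P3 = P2 − P1 subtraction, assuming 8-bit frames held as NumPy arrays; the clipping to non-negative values is an assumption added to keep the result a valid image when ambient light fluctuates between the two exposures:

```python
import numpy as np

# Sketch only: removes the ambient-IR contribution from the speckle frame.
# p1: first image (ambient IR only), p2: second image (ambient IR + laser).
def third_image(p1: np.ndarray, p2: np.ndarray) -> np.ndarray:
    # Work in a signed type so the subtraction cannot wrap around,
    # then clip: P3[i, j] = max(P2[i, j] - P1[i, j], 0).
    p3 = p2.astype(np.int16) - p1.astype(np.int16)
    return np.clip(p3, 0, 255).astype(np.uint8)

p1 = np.array([[10, 20], [30, 40]], dtype=np.uint8)
p2 = np.array([[60, 25], [90, 35]], dtype=np.uint8)
print(third_image(p1, p2))  # [[50  5] [60  0]]
```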
In some embodiments, the processor 20 may also add an image type to the third image and to the depth image, so as to distinguish the various data streams obtained after processing the captured images, as shown in Table 2:

Table 2
  stream  light
  0       11     (third image: background-subtracted)
  1       XX     (depth image)

When the stream in Table 2 is 0, the data stream at that time is an image formed by infrared light and/or infrared laser; when stream is 1, the data stream is a depth image. When light is 11, it indicates background subtraction processing, that is, removal of the part of the captured image formed by ambient infrared light; the processor 20 can then add the image type 011 to the background-subtracted data stream to identify it as the third image. When light is XX, the X indicates that the value is not limited, and the processor 20 can add an image type of 1XX to the data stream obtained after the depth calculation to identify it as a depth image.
Among the first and second images participating in the depth image calculation, the collection time of the first image may be before or after that of the second image; this is not limited here. When the difference between the collection times of the first and second images is less than the predetermined difference, the first and second images may be images of adjacent frames or of non-adjacent frames. For example, when the second operating frequency is twice the first operating frequency, the first and second images whose difference is less than the predetermined difference are adjacent frames; when the multiple between the second and first operating frequencies is greater than two, for example when the second operating frequency is three times the first, the first and second images whose difference is less than the predetermined difference may be adjacent frames or non-adjacent frames (in the latter case, one more frame of first image still lies between the first image and the second image).
In some embodiments, the number of frames of first image participating in the depth image calculation may also be more than one. For example, when the second operating frequency is three times the first, two adjacent frames of first image and the frame of second image adjacent to them can be selected to calculate the third image. In that case, the processor 20 may first fuse the two frames of first image, for example by adding the pixel values of corresponding pixels of the two frames and taking the average to obtain a fused first image, and then calculate the third image using the fused first image and the adjacent frame of second image.
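Below is a minimal sketch of that two-frame fusion followed by the subtraction, again assuming 8-bit NumPy arrays; the clipping is the same illustrative assumption as in the earlier sketch:

```python
import numpy as np

# Sketch only: fuses two background frames by per-pixel averaging, then
# subtracts the fused background from the adjacent speckle frame.
def fused_third_image(p1_a: np.ndarray, p1_b: np.ndarray,
                      p2: np.ndarray) -> np.ndarray:
    fused = ((p1_a.astype(np.uint16) + p1_b.astype(np.uint16)) // 2).astype(np.uint8)
    diff = p2.astype(np.int16) - fused.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```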
In some embodiments, the processor 20 may calculate multiple frames of third image, such as the (N+1)−N frame, the (N+3)−(N+2) frame, and the (N+5)−(N+4) frame of third image in FIG. 12, and correspondingly calculate multiple frames of depth image. Of course, in other embodiments the processor 20 may also calculate only one frame of third image and one corresponding frame of depth image. The number of frames of third image can be determined according to the security level of the application scenario. Specifically, when the security level of the application scenario is high, for example for payment, the number of frames of third image should be larger; in that case the payment action is executed only after multiple frames of depth image all successfully match the user's depth template, so as to improve payment security. For application scenarios with a lower security level, for example portrait beautification based on depth information, the number of frames of third image can be smaller, for example one frame, in which case one frame of depth image is sufficient for portrait beautification. In this way, the computation and power consumption of the processor 20 can be reduced and the image processing speed increased.
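Below is a minimal sketch of the all-frames-must-match rule described for high-security scenarios, assuming a hypothetical matches_template predicate that stands in for the actual depth-template comparison:

```python
# Sketch only: a payment-style check passes only if every depth frame matches
# the user's depth template; a beautification-style flow needs just one frame.
def verify_payment(depth_frames, template, matches_template) -> bool:
    return len(depth_frames) > 0 and all(
        matches_template(frame, template) for frame in depth_frames
    )
```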
Referring to FIGS. 1, 2, and 16, in some embodiments the control method further includes:
0167: collecting visible light images at a third operating frequency, the third operating frequency being greater than or less than the second operating frequency;
0168: adding a collection time to each frame of visible light image and to each frame of captured image; and
0169: determining the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
That is to say, the control method includes:
0161: determining the orientation of the laser projector 14;
0162: when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode;
0163: when the laser projector 14 faces the side opposite the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than that emitted in the first mode;
0164: while the laser projector 14 projects laser light onto the scene at the first operating frequency, the image collector 15 acquires captured images at the second operating frequency, the second operating frequency being greater than the first operating frequency;
0165: distinguishing, among the captured images, the first image collected while the laser projector 14 was not projecting laser light and the second image collected while it was projecting laser light;
0166: calculating the depth image from the first image, the second image, and the reference image;
0167: collecting visible light images at a third operating frequency, the third operating frequency being greater than or less than the second operating frequency;
0168: adding a collection time to each frame of visible light image and to each frame of captured image; and
0169: determining the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
Here, step 0161 is basically the same as step 041 described above, step 0162 as step 042, step 0163 as step 043, step 0164 as step 0114, step 0165 as step 0115, and step 0166 as step 0116. Step 0167 can be implemented by the visible light camera 40 (either of the main camera 41 and the sub-camera 42, or the main camera 41 and the sub-camera 42 together); steps 0168 and 0169 can be implemented by the processor 20. That is to say, the visible light camera 40 can be used to collect visible light images at a third operating frequency that is greater than or less than the second operating frequency, and the processor 20 can be used to add a collection time to each frame of visible light image and each frame of captured image, and to determine the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
In some application scenarios, for example three-dimensional modeling of objects in the scene, the depth information of the objects must be obtained with the image collector 15 and their color information with the visible light camera 40. In that case the processor 20 needs to turn on the image collector 15 to obtain the depth image and simultaneously turn on the visible light camera 40 to obtain the visible light image. If the image collector 15 and the visible light camera 40 have the same operating frequency, the processor 20 can send an image acquisition instruction to the image collector 15 through the I2C bus; after the image collector 15 receives it, the image collector 15 and the visible light camera 40 are synchronized by a sync signal, which controls the visible light camera 40 to start collecting visible light images, realizing hardware synchronization between the image collector 15 and the visible light camera 40. In that case the number of frames of captured images is consistent with the number of frames of visible light images, and each frame of captured image corresponds one-to-one with each frame of visible light image. However, when the operating frequencies differ, that is, when the image collector 15 works at the second operating frequency and the visible light camera 40 at a third operating frequency not equal to the second, hardware synchronization is not possible, and the processor 20 needs to synchronize the image collector 15 and the visible light camera 40 by software. Specifically, the processor 20 sends an image acquisition instruction to the image collector 15 through the I2C bus connected to the image collector 15 and, at the same time, sends an image acquisition instruction to the visible light camera 40 through the I2C bus connected to the visible light camera 40.
Each time the processor 20 receives a frame of captured image, it adds an image type and a collection time to it; each time it receives a frame of visible light image, it adds a collection time to it. The collection time of a captured image can be the start time or end time at which the image collector 15 collects the frame, any time between the start time and the end time, and so on; likewise, the collection time of a visible light image can be the start time or end time at which the visible light camera 40 collects the frame, any time in between, and so on. Then, in subsequent processing based on the depth image and the visible light image (such as three-dimensional modeling, or portrait beautification using depth information), the processor 20 can first determine the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the type of the captured image. Frame synchronization here means that the difference between the collection time of the determined second image and that of the visible light image is less than a preset time difference; the collection time of the visible light image may be before or after that of the second image. The processor 20 then selects the first image based on the determined second image, calculates the depth image from the second image, the first image, and the reference image, and finally performs subsequent processing based on the depth image and the determined visible light image. In some embodiments, the processor 20 may instead add a collection time to each frame of depth image, then determine the frame-synchronized visible light image and depth image according to their collection times, and finally perform subsequent processing on the frame-synchronized pair; the collection time of each frame of depth image is the collection time of the second image corresponding to that frame of depth image.
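Below is a minimal sketch of the software frame matching, assuming each frame carries a timestamp in seconds; pairing by nearest timestamp within the preset difference is one straightforward reading of the rule above, not the only possible one:

```python
# Sketch only: pairs each second image with the visible-light frame whose
# timestamp is closest, accepting the pair only if the gap is under max_diff.
def match_frames(second_times, visible_times, max_diff=0.010):
    pairs = []
    for ts in second_times:
        tv = min(visible_times, key=lambda t: abs(t - ts))
        if abs(tv - ts) < max_diff:
            pairs.append((ts, tv))  # frame-synchronized pair
    return pairs

print(match_frames([0.100, 0.200], [0.098, 0.155, 0.204]))
# [(0.1, 0.098), (0.2, 0.204)]
```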
Referring to FIG. 17, in some embodiments the captured images also include an infrared image, which is an image obtained by the image collector 15 collecting the infrared light emitted by the floodlight 50. When the processor 20 adds an image type to each frame of captured image, it also adds an image type to the infrared image. In one example, the image type of the infrared image is shown in Table 3:

Table 3
  stream  light
  0       10     (infrared image: floodlight on, laser projector off)

When the stream in Table 3 is 0, the data stream at that time is an image formed by infrared light and/or infrared laser. When light is 10, the data stream was acquired while the floodlight 50 was projecting infrared light and the laser projector 14 was not projecting laser light; when the processor 20 adds the image type 010 to a captured image, it identifies that frame as an infrared image.
In some application scenarios, for example identity verification based both on matching the depth image against a depth template and on matching the infrared image against an infrared template, the image collector 15 needs to be used in conjunction with the floodlight 50 and the laser projector 14, and the image collector 15 can obtain the first image, the second image, and the infrared image in a time-sharing manner. As shown in FIG. 17, the solid line represents the timing of laser emission by the laser projector 14, the double-dot chain line represents the timing of infrared light emitted by the floodlight 50, the dashed line represents the timing at which the image collector 15 acquires the captured images and the number of frames, and the dash-dotted line represents the number of frames of the third image obtained from the first and second images; from top to bottom in FIG. 17 are the solid line, the double-dot chain line, the dashed line, and the dash-dotted line, where the second operating frequency is three times the first operating frequency and three times the fourth operating frequency.
The processor 20 can monitor the working state of the floodlight 50 in real time via the I2C bus. Each time the processor 20 receives a frame of captured image from the image collector 15, it first obtains the collection time of that frame, then judges from the collection time whether the working state of the floodlight 50 at that moment was emitting infrared light or not emitting infrared light, and adds the image type to the captured image based on the judgment. The processor 20 can subsequently determine, from the collection times of the infrared image and of the second image, an infrared image and a second image whose collection-time difference is less than a set difference; further, the processor 20 can determine the corresponding infrared image and depth image and use them for identity verification.
Referring to FIGS. 1, 2, and 18, in some embodiments the control method further includes:
0181: obtaining the brightness and the type of the scene; and
0182: determining whether the brightness is greater than a brightness threshold and the type is an outdoor scene;
if so, the method proceeds to determining the orientation of the laser projector 14 (step 0183).
That is to say, the control method includes:
0181: obtaining the brightness and the type of the scene;
0182: determining whether the brightness is greater than a brightness threshold and the type is an outdoor scene;
0183: if so, determining the orientation of the laser projector 14;
0184: when the laser projector 14 faces the side where the display screen 12 is located, the laser projector 14 projects laser light in the first mode;
0185: when the laser projector 14 faces the side opposite the display screen 12, the laser projector 14 projects laser light in the second mode, the energy of the laser light emitted in the second mode being greater than that emitted in the first mode;
0186: while the laser projector 14 projects laser light onto the scene at the first operating frequency, the image collector 15 acquires captured images at the second operating frequency, the second operating frequency being greater than the first operating frequency;
0187: distinguishing, among the captured images, the first image collected while the laser projector 14 was not projecting laser light and the second image collected while it was projecting laser light; and
0188: calculating the depth image from the first image, the second image, and the reference image.
Here, step 0183 is basically the same as step 041 described above, step 0184 as step 042, step 0185 as step 043, step 0186 as step 0114, step 0187 as step 0115, and step 0188 as step 0116. Steps 0181 and 0182 can be implemented by the processor 20. That is to say, the processor 20 can be used to obtain the brightness and type of the scene and to determine whether the brightness is greater than the brightness threshold and the type is an outdoor scene, and the laser projector 14 can be used to project laser light onto the scene at the first operating frequency when the brightness is greater than the brightness threshold and the type is an outdoor scene.
Specifically, the brightness of the scene can be obtained by analyzing the captured image acquired by the image collector 15 or the visible light image acquired by the visible light camera 40 (either of the main camera 41 and the sub-camera 42, or both together); alternatively, the brightness of the scene can be detected directly by a light sensor, with the processor 20 reading the detected signal from the light sensor to obtain the brightness. The type of the scene can be obtained by analyzing the captured image acquired by the image collector 15 or the visible light image acquired by the visible light camera 40, for example by analyzing the objects in the image to determine whether the scene is an outdoor scene or an indoor scene; the type can also be determined directly from the geographic location. Specifically, the processor 20 can obtain the positioning result of the global satellite positioning system for the scene and then judge the scene type from the positioning result: for example, if the positioning result is a certain office building, the scene is an indoor scene; if the positioning result is a certain park, the scene is an outdoor scene; if the positioning result is a certain street, the scene is an outdoor scene; and so on.
It can be understood that when the brightness of the scene is high (for example, greater than the brightness threshold), ambient infrared light takes up a larger proportion of the captured image and has a larger impact on speckle recognition, so the interference of ambient infrared light needs to be removed. When the brightness of the scene is low, however, ambient infrared light takes up a smaller proportion of the captured image and its impact on speckle recognition is small enough to be ignored; in that case the image collector 15 and the laser projector 14 can work at the same operating frequency, and the processor 20 calculates the depth image directly from the captured image acquired by the image collector 15 (that is, the second image) and the reference image. In addition, a high scene brightness may be caused by strong indoor lamp light; since lamp light does not include infrared light, it does not significantly affect speckle recognition, and in that case too the image collector 15 and the laser projector 14 work at the same operating frequency, and the processor 20 calculates the depth image directly from the captured image (that is, the second image) and the reference image. Working this way reduces the operating frequency of the image collector 15 and its power consumption.
Of course, in some embodiments the control method may also decide whether to execute step 0183 based only on the brightness of the scene. Specifically, the processor 20 obtains only the brightness of the scene and determines whether it is greater than the brightness threshold, and the laser projector 14 projects laser light onto the scene at the first operating frequency when the brightness is greater than the brightness threshold.
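Below is a minimal sketch of that gating decision, assuming hypothetical helpers get_brightness and get_scene_type stand in for the light-sensor read and the positioning-based classification described above; the threshold value is illustrative only:

```python
# Sketch only: enables the dual-frequency (background-subtraction) mode only
# when ambient IR is likely to interfere, i.e. bright outdoor scenes.
BRIGHTNESS_THRESHOLD = 600  # illustrative units; not specified by the patent

def use_dual_frequency(get_brightness, get_scene_type) -> bool:
    return get_brightness() > BRIGHTNESS_THRESHOLD and get_scene_type() == "outdoor"

print(use_dual_frequency(lambda: 800, lambda: "outdoor"))  # True
print(use_dual_frequency(lambda: 800, lambda: "indoor"))   # False -> same frequency
```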
In some embodiments, the processor 20 may also add status information (status) to each data stream. In one example, as shown in Table 4, a status of 0 indicates that the data stream has not undergone background subtraction, and a status of 1 indicates that it has. Thus 0000 denotes the first image; 0010 the second image; 0100 the infrared image acquired by the image collector 15 while the floodlight 50 is on; 0111 the third image; 1XX1 a depth image that has undergone background subtraction; and 1XX0 a depth image that has not. Adding status information to each data stream in this way lets the processor 20 tell whether a given data stream has undergone background subtraction.

Table 4
  stream  light  status
  0       00     0      (first image)
  0       01     0      (second image)
  0       10     0      (infrared image)
  0       11     1      (third image)
  1       XX     1      (depth image, background-subtracted)
  1       XX     0      (depth image, not background-subtracted)
In some embodiments, the processor 20 includes a first storage area, a second storage area, and a logical subtraction circuit connected to both the first storage area and the second storage area. The first storage area is used to store the first image, the second storage area to store the second image, and the logical subtraction circuit is used to process the first image and the second image to obtain the third image. Specifically, the logical subtraction circuit reads the first image from the first storage area and the second image from the second storage area, and, after acquiring both, performs subtraction on the first image and the second image to obtain the third image. The logical subtraction circuit is also connected to a depth calculation module in the processor 20 (which may be, for example, an integrated circuit (ASIC) dedicated to depth calculation); the logical subtraction circuit sends the third image to the depth calculation module, and the depth calculation module calculates the depth image from the third image and the reference image.
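Below is a minimal software model of that two-buffer-plus-subtractor arrangement, assuming a hypothetical depth_from function stands in for the dedicated depth-calculation ASIC; it is a sketch of the data flow, not a description of the actual hardware:

```python
import numpy as np

# Sketch only: models the first/second storage areas and the logical
# subtraction circuit feeding a depth-calculation stage.
class SubtractPipeline:
    def __init__(self, depth_from):
        self.first = None    # first storage area (background image)
        self.second = None   # second storage area (speckle image)
        self.depth_from = depth_from  # stands in for the depth ASIC

    def store(self, image, image_type):
        if image_type == "first":
            self.first = image
        else:
            self.second = image

    def run(self, reference):
        # Logical subtraction: third = second - first, clipped to >= 0.
        third = np.clip(self.second.astype(np.int16)
                        - self.first.astype(np.int16), 0, 255).astype(np.uint8)
        return self.depth_from(third, reference)
```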

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Structure And Mechanism Of Cameras (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A control method for an electronic device (100), and an electronic device (100). The control method includes: (041) determining the orientation of a laser projector (14); (042) when the laser projector (14) faces the side where a display screen (12) is located, the laser projector (14) projects laser light in a first mode; and (043) when the laser projector (14) faces the side opposite the display screen (12), the laser projector (14) projects laser light in a second mode, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.

Description

电子装置的控制方法及电子装置
优先权信息
本申请请求2019年5月31日向中国国家知识产权局提交的、专利申请号为201910472716.4的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及消费性电子产品技术领域,特别涉及一种电子装置的控制方法及电子装置。
背景技术
现有的具有激光投射器的手机中,激光投射器一般设置在手机的前壳上,并且激光投射器只用于拍摄距离较近的前置使用状态中,例如,激光投射器只用于前置深度图像获取的使用状态中,从而导致手机的使用者能够使用激光投射器的场景较少。
发明内容
本申请实施方式提供了一种电子装置的控制方法及电子装置。
本申请的电子装置的控制方法,所述电子装置包括机壳、显示屏及转动组件,所述显示屏设置在所述机壳的一侧,所述转动组件包括基体及设置在所述基体上的激光投射器,所述基体能够转动的安装在所述机壳上以使所述激光投射器能够选择性地朝向所述显示屏所在的一侧或朝向与所述显示屏相背的一侧;所述控制方法包括:确定所述激光投射器的朝向;在所述激光投射器朝向所述显示屏所在的一侧时,所述激光投射器以第一模式投射激光;及,在所述激光投射器朝向与所述显示屏相背的一侧时,所述激光投射器以第二模式投射激光,所述第二模式发射的所述激光的能量大于所述第一模式发射的所述激光的能量。
本申请的电子装置包括机壳、显示屏、转动组件及处理器,所述显示屏设置在所述机壳的一侧,所述转动组件包括基体及设置在所述基体上的激光投射器,所述基体能够转动的安装在所述机壳上以使所述激光投射器能够选择性地朝向所述显示屏所在的一侧或朝向与所述显示屏相背的一侧;所述处理器用于确定所述激光投射器的朝向;所述激光投射器用于在所述激光投射器朝向所述显示屏所在的一侧时以第一模式投射激光、以及用于在所述激光投射器朝向与所述显示屏相背的一侧时以第二模式投射激光,所述第二模式发射的所述激光的能量大于所述第一模式发射的所述激光的能量。
本申请实施方式的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
本申请的上述和/或附加的方面和优点可以从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:
图1至图3是本申请某些实施方式的电子装置的结构示意图。
图4是本申请某些实施方式的电子装置的控制方法的流程示意图。
图5是本申请某些实施方式的电子装置的立体结构示意图。
图6是本申请某些实施方式的电子装置的激光投射器的激光光源的结构示意图。
图7是本申请某些实施方式的电子装置的控制方法的流程示意图。
图8和图9是本申请另一实施方式的电子装置的立体结构示意图。
图10是本申请某些实施方式的电子装置的系统架构示意图。
图11是本申请某些实施方式的电子装置的控制方法的流程示意图。
图12是本申请某些实施方式的电子装置的控制方法的原理示意图。
图13至图16是本申请某些实施方式的电子装置的控制方法的流程示意图。
图17是本申请某些实施方式的电子装置的控制方法的原理示意图。
图18是本申请某些实施方式的电子装置的控制方法的流程示意图。
具体实施方式
下面详细描述本申请的实施方式,所述实施方式的示例在附图中示出,其中,相同或类似的标号自始至终表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施方式是示例性的,仅用于解释本申请的实施方式,而不能理解为对本申请的实施方式的限制。
请参阅图1至图4,本申请的电子装置100的控制方法适用于电子装置100,其中,电子装置100包括机壳11、显示屏12及转动组件13,显示屏12设置在机壳11的一侧,转动组件13包括基体131及设置在基体131上的激光投射器14,基体131能够转动的安装在机壳11上以使激光投射器14能够选择性地朝向显示屏12所在的一侧或朝向与显示屏12相背的一侧;请参阅图4,控制方法包括:
041,确定激光投射器14的朝向;
042,在激光投射器14朝向显示屏12所在的一侧时(如图3所示),激光投射器14以第一模式投射激光;及
043,在激光投射器14朝向与显示屏12相背的一侧时(如图2所示),激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量。
请参阅图2,在某些实施方式中,电子装置100还包括霍尔传感器组件16,霍尔传感器组件16包括第一传感器161和第二传感器162,第一传感器161设置在基体131上,第二传感器162设置在机壳11上并与第一传感器161对应;确定激光投射器14的朝向(041),包括:
通过霍尔传感器组件16确定激光投射器14的朝向。
请参阅图2及图7,在某些实施方式中,通过霍尔传感器组件16确定激光投射器14的朝向,包括:
0711,获取霍尔传感器组件16的霍尔值;
0712,在霍尔值小于第一预设值时,确定激光投射器14朝向显示屏12所在的一侧;及
0713,在霍尔值大于第二预设值时,确定激光投射器14朝向与显示屏12相背的一侧。
请参阅图5,在某些实施方式中,电子装置100还包括状态选择按键17,确定激光投射器14的朝向(041),包括:
通过状态选择按键17确定激光投射器14的朝向。
请参阅图1至图3,在某些实施方式中,激光投射器14在第一模式下投射激光的功率小于在第二模式下投射激光的功率;和/或激光投射器14包括多个点光源141,多个点光源141独立控制,激光投射器14在第一模式下开启的点光源141的数量少于在第二模式下开启的点光源141的数量。
在某些实施方式中,电子装置100还包括图像采集器15,图像采集器15设置在基体131上并位于激光投射器14所在的一面,激光投射器14在投射激光时,激光投射器14以第一工作频率向场景投射激光。控制方法还包括:
0114:图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
0115:在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;和
0116:根据第一图像、第二图像及参考图像计算深度图像。
请参阅图1至图4,本申请实施方式的电子装置100包括机壳11、显示屏12、转动组件13及处理器20。显示屏12设置在机壳11的一侧。转动组件13包括基体131及设置在基体131上的激光投射器14。基体131能够转动的安装在机壳11上以使激光投射器14能够选择性地朝向显示屏12所在的一侧或朝向与显示屏12相背的一侧。处理器20用于确定激光投射器14的朝向。激光投射器14用于在激光投射器14朝向显示屏12所在的一侧时以第一模式投射激光、以及用于在激光投射器14朝向与显示屏12相背的一侧时以第二模式投射激光。第二模式发射的激光的能量大于第一模式发射的激光的能量。
请参阅图2,在某些实施方式中,电子装置100还包括霍尔传感器组件16,霍尔传感器组件16包括第一传感器161和第二传感器162,第一传感器161设置在基体131上,第二传感器162设置在机壳11上并与第一传感器161对应;处理器20还用于通过霍尔传感器组件16确定激光投射器14的朝向。
请参阅图2及图7,在某些实施方式中,处理器20还用于:
0711,获取霍尔传感器组件16的霍尔值;
0712,在霍尔值小于第一预设值时,确定激光投射器14朝向显示屏12所在的一侧;及
0713,在霍尔值大于第二预设值时,确定激光投射器14朝向与显示屏12相背的一侧。
请参阅图5,在某些实施方式中,电子装置100还包括状态选择按键17,处理器20用于通过状态选择按键17确定激光投射器14的朝向。
请参阅图1至图3,在某些实施方式中,激光投射器14在第一模式下投射激光的功率小于在第二模式下投射激光的功率;和/或激光投射器14包括多个点光源141,多个点光源141独立控制;激光投射器14在第一模式下开启的点光源141的数量少于在第二模式下开启的点光源141的数量。
请参阅图3,在某些实施方式中,电子装置100还包括图像采集器15,图像采集器15设置在基体131上并位于激光投射器14所在的一面,激光投射器14在投射激光时,激光投射器14用于以第一工作频率向场景投射激光;图像采集器15用于以第二工作频率采集图像,第二工作频率大于第一工作频率;处理器20用于在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;和根据第一图像、第二图像及参考图像计算深度图像。
请参阅图1至图3,在某些实施方式中,机壳11包括端面113及与显示屏12相背的后表面112,后表面112开设有贯穿端面113的收容槽114,转动组件13能够转动地安装在收容槽114内;在激光投射器14朝向显示屏12的一侧时,基体131凸出于端面113;在激光投射器14背离显示屏12的一侧时,基体131的一侧面与端面113齐平。
请参阅图1至图3,在某些实施方式中,机壳11包括前表面111、后表面112及端面113,前表面111和后表面112位于机壳11的相背两侧,端面113连接前表面111和后表面112,显示屏12设置在前表面111上,显示屏12的靠近端面113的一端形成有缺口121,机壳11开设有贯穿前表面111、后表面112及端面113的容置槽115,容置槽115与缺口121连通,转动组件13能够转动地安装在容置槽115内。
请参阅图1至图3,在某些实施方式中,机壳11包括前表面111、后表面112及端面113,显示屏12安装在机壳11的前表面111上,显示屏12可以覆盖前表面111的面积的85%及以上
请参阅图6和图10,在某些实施方式中,激光投射器14包括激光光源140,激光光源140包括多个点光源141;多个点光源141形成多个发光阵列142;多个发光阵列142呈环形排布。
请参阅图6和图10,在某些实施方式中,发光阵列142的开启方式为:距离激光光源140的中心越远的发光阵列142越先开启。
请参阅图6和图10,在某些实施方式中,激光投射器14包括激光光源140和第一驱动器147,第一驱动器147用于驱动激光光源140向待测物投射激光。
请参阅图8,在某些实施方式中,机壳11包括前表面111、后表面112及端面113,显示屏12设置在机壳11的前表面111上,显示屏12的靠近端面113的一端形成有缺口121。
请参阅图5和图8，在某些实施方式中，电子装置100还包括设置在基体131上的泛光灯50，泛光灯50和激光投射器14位于基体131的同一面上。
请参阅图1至图3,本申请的电子装置100的控制方法,其中,电子装置100包括机壳11、显示屏12及转动组件13,显示屏12设置在机壳11的一侧,转动组件13包括基体131及设置在基体131上的激光投射器14,基体131能够转动的安装在机壳11上以使激光投射器14能够选择性地朝向显示屏12所在的一侧或朝向与显示屏12相背的一侧;请参阅图4,控制方法包括:
041,确定激光投射器14的朝向;
042,在激光投射器14朝向显示屏12所在的一侧时(如图3所示),激光投射器14以第一模式投射激光;及
043,在激光投射器14朝向与显示屏12相背的一侧时(如图2所示),激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量。
本申请的电子装置100能够用于实现上述控制方法,具体地,电子装置100还包括处理器20,步骤041可以由处理器20实现,步骤042和步骤043均可以由激光投射器14实现。也即是说,处理器20可用于确定激光投射器14的朝向,激光投射器14可用于在激光投射器14朝向显示屏12所在的一侧时以第一模式投射激光、以及用于在激光投射器14朝向与显示屏12相背的一侧时以第二模式投射激光。
其中,电子装置100可以是手机、平板电脑、笔记本电脑、智能穿戴设备(智能手表、智能手环、智能头盔、智能眼镜等)、虚拟现实设备等。本申请以电子装置100是手机为例进行说明,但电子装置100的形式并不限于手机。
请参阅图1及图5，机壳11包括前表面111、后表面112和端面113，前表面111和后表面112位于机壳11的相背两侧，端面113连接前表面111和后表面112。本申请实施方式中，后表面112开设有贯穿端面113的收容槽114。
显示屏12安装在机壳11的前表面111上,显示屏12可以覆盖前表面111的面积的85%及以上,例如达到85%、86%、87%、88%、89%、90%、91%、92%、93%、95%甚至是100%。显示屏12可以用于显示影像,影像可以是文字、图像、视频、图标等信息。
转动组件13能够转动地安装在收容槽114内。具体地，转动组件13包括基体131和激光投射器14，基体131的两个转轴132（如图2所示）分别安装在收容槽114的相对的两个侧壁上，经过转轴132的轴线与前表面111、后表面112和端面113均平行。激光投射器14安装在基体131上并从基体131的一表面露出。当基体131转动时能够带动激光投射器14转动以使激光投射器14能够选择性地朝向显示屏12所在的一侧或朝向与显示屏12相背的一侧，在激光投射器14朝向显示屏12所在的一侧时，基体131凸出于端面113；在激光投射器14朝向与显示屏12相背的一侧时，基体131的一侧面与端面113齐平，此时，基体131不凸出于端面113。
电子装置100还包括图像采集器15,图像采集器15设置在基体131上并位于激光投射器14所在的一面,也就是说,激光投射器14和图像采集器15从基体131的同一表面露出。激光投射器14配合图像采集器15使用以用于获取待测物的深度信息,以用于三维建模、生成三维图像、测距等。激光投射器14和图像采集器15可以安装在支架上后,再将支架、激光投射器14和图像采集器15一同安装在基体131内;或者,基体131为一个支架,激光投射器14和图像采集器15均安装在基体131内。
一般地,电子装置100的使用者使用电子装置100时,显示屏12朝向使用者。在激光投射器14朝向显示屏12所在的一侧时(如图3所示),激光投射器14处于前置使用状态,显示屏12和激光投射器14均朝向电子装置100的使用者,此时,激光投射器14可作为前置激光投射器使用,使用者能够观看到显示屏12显示的内容、并且能够使用激光投射器14朝使用者一侧投射激光,使用者能够使用激光投射器14(和图像采集器15)进行人脸识别、虹膜识别等。在激光投射器14背离显示屏12时,激光投射器14处于后置使用状态(如图2所示),显示屏12朝向电子装置100的使用者,激光投射器14背离电子装置100的使用者,此时,激光投射器14可作为后置激光投射器使用,使用者能够观看显示屏12显示的内容、并且能够使用激光投射器14朝远离使用者的一侧投射激光,例如,使用者能够使用激光投射器14(和图像采集器15)获取电子装置100的与使用者相背一侧的待测物的深度图像。
本申请实施方式中,激光投射器14投射激光的模式包括第一模式和第二模式,其中,第一模式对应激光投射器14处于前置使用状态,第二模式对应激光投射器14处于后置使用状态,第二模式投射的激光的能量大于第一模式投射的激光的能量。具体地,激光投射器14在第一模式下投射激光的功率可以小于在第二模式下投射激光的功率,以使第二模式投射的激光的能量大于第一模式投射的激光的能量,此时,激光投射器14在第二模式下投射的激光能够到达的最大距离(投射距离)大于在第一模式下投射的激光能够到达的最大距离(投射距离)。同时,后置使用状态下激光投射器14配合图像采集器15能够检测到的后置距离范围要大于前置使用状态下激光投射器14配合图像采集器15能够检测到的前置距离范围,例如:激光投射器14配合图像采集器15能够检测到的前置距离范围为25cm以内,而激光投射器14配合图像采集器15能够检测到的后置距离范围为大于25cm(25cm内的距离范围精度很差);或者,前置距离范围与后置距离范围有稍许交集,例如激光投射器14配合图像采集器15能够检测到的前置距离范围为25cm以内,而激光投射器14配合图像采集器15能够检测到的后置距离范围为大于20cm。
本申请的电子装置100及电子装置100的控制方法中,转动组件13能够转动地安装在机壳11上以使激光投射器14能够选择性地朝向显示屏12所在的一侧或朝向与显示屏12相背的一侧,激光投射器14在朝向与显示屏12相背的一侧时能够投射的最大距离大于在朝向显示屏12所在的一侧时能够投射的最大距离,从而使激光投射器14能够同时作为前置激光投射器和后置激光投射器使用,增加了使用者能够使用电子装置100的场景;同时,电子装置100上不需要设置两个激光投射器14以分别作为前置激光投射器和后置激光投射器使用,节省了电子装置100的成本。
请参阅图6,在某些实施方式中,激光投射器14包括激光光源140,激光光源140包括多个点光源141,多个点光源141能够被独立控制,具体地,每个点光源141均能够独立地开启和关闭。激光投射器14在第一模式下开启的点光源141的数量少于在第二模式下开启的点光源141的数量,此时,每个点光源141投射激光的功率可以相同,从而使得第二模式投射的激光的能量大于第一模式投射的激光的能量,激光投射器14在第二模式下投射的激光能够到达的最大距离也就大于在第一模式下投射的激光能够到达的最大距离。
请参阅图6,在某些实施方式中,多个点光源141形成多个发光阵列142,多个发光阵列142独立控制。具体地,每个发光阵列142内的多个点光源141能够同时开启和关闭,此时,每个发光阵列142内的多个点光源141的功率可以相同。在其他实施方式中,每个发光阵列142内的多个点光源141也可以独立控制。
本实施方式中,多个发光阵列142呈环形排布。环形排布的发光阵列142内的点光源141发出的激光可以覆盖更广的视场,如此,可以获得更多待测物的深度信息。其中,环形可以为方环形或圆环 形。
请参阅图6,在某些实施方式中,随着投射距离的增大,发光阵列142的开启方式为:距离激光光源140的中心越远的发光阵列142越先开启。例如,请结合图6,发光阵列142的总数为6个,6个发光阵列142包括5个环形子阵列143和1个方形子阵列144,沿靠近激光光源140的中心的方向,5个环形子阵列143依次排布,依次排布的5个环形子阵列143的编号为A、B、C、D、E。本实施方式中,激光投射器14在朝向显示屏12所在的一侧时开启编号为A和B的环形子阵列143内的点光源141;激光投射器14在朝向与显示屏12相背的一侧时可以开启编号为A、B和C的环形子阵列143内的点光源141、或者开启编号为A、B、C和D的环形子阵列143内的点光源141、或者开启编号为A、B、C、D和E的环形子阵列143内的点光源141、或者开启编号为A、B、C、D和E的环形子阵列143内的点光源141以及方形子阵列144内的点光源141。
本实施方式中,激光投射器14在第一模式下开启的点光源141的数量少于在第二模式下开启的点光源141的数量,从而使激光投射器14在朝向显示屏12所在的一侧时投射的激光的能量小于在朝向与显示屏12相背的一侧时投射的激光的能量。
可以理解,激光投射器14的衍射光学元件(图未示)的衍射能力是有限的,即激光光源140发射的部分激光不会被衍射而是直接出射,直接出射的激光不经过衍射光学元件的衍射衰减作用。而直接出射的激光的能量较大,极有可能对使用者的眼睛产生危害。因此,激光投射器14在朝向显示屏12所在的一侧时以第一模式投射激光,也就是说,激光投射器14在投射距离较小时先开启远离激光光源140的中心的环形子阵列143,可以避免激光光源140投射的激光不经衍射光学元件的衍射衰减作用而直接投射到使用者的眼睛,从而提高了激光投射器14的安全性;激光投射器14在朝向与显示屏12相背的一侧时以第二模式投射激光,也就是说,激光投射器14在投射距离较大时同时开启远离激光光源140的中心的环形子阵列143和靠近激光光源140的中心的环形子阵列143,从而增大了激光投射器14投射的激光能够到达的最大距离。
请参阅图2及图7,在某些实施方式中,电子装置100还包括霍尔传感器组件16,霍尔传感器组件16包括第一传感器161和第二传感器162,第一传感器161设置在基体131上,第二传感器162设置在机壳11上并与第一传感器161对应;所述确定激光投射器14的朝向可以通过霍尔传感器组件16实现,具体地,控制方法包括:
0711,获取霍尔传感器组件16的霍尔值;
0712,在霍尔值小于第一预设值时,确定激光投射器14朝向显示屏12所在的一侧;
0713,在霍尔值大于第二预设值时,确定激光投射器14朝向与显示屏12相背的一侧;
072,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;及
073,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量。
上述的控制方法也可以由电子装置100实现,其中,步骤0711、步骤0712和步骤0713可以为前文所述的步骤041的子步骤,步骤072与前文所述的步骤042基本相同,步骤073与前文所述的步骤043基本相同。具体地,处理器20电性连接霍尔传感器组件16,处理器20还可以用于通过霍尔传感器组件16确定激光投射器14的朝向,处理器20还可以用于实现步骤0711、步骤0712和步骤0713。也就是说,处理器20还可以用于获取霍尔传感器组件16的霍尔值、在霍尔值小于第一预设值时确定激光投射器14朝向显示屏12所在的一侧、及在霍尔值大于第二预设值时确定激光投射器14朝向与显示屏12相背的一侧。
本实施方式中,第一传感器161可以为磁铁161,第二传感器162可以为霍尔传感器162。其中,霍尔传感器162可以为高斯计或数字霍尔传感器,霍尔值即为高斯值。当激光投射器14朝向与显示屏12相背的一侧时(即转动组件13处于初始状态时),磁铁161的N极位于磁铁161的靠近霍尔传感器162的一端,磁铁161的S极位于磁铁161的远离霍尔传感器162的一端。当激光投射器14朝向显示屏12所在的一侧时,磁铁161的S极位于磁铁161的靠近霍尔传感器162的一端,磁铁161的N极位于磁铁161的远离霍尔传感器162的一端。当磁铁16的S极离霍尔传感器162越近时,霍尔传感器162所处的磁场越强,霍尔传感器162采集到的霍尔值越大并为正值;当磁铁16的N极离霍尔传感器162越近,霍尔传感器162采集到的霍尔值越小并为负值。
当霍尔传感器162采集到的霍尔值小于第一预设值,例如,当处理器20获取到的霍尔传感器162的霍尔值为-90,小于第一预设值-85时,则确定激光投射器14朝向显示屏12所在的一侧;当霍尔传感器162采集到的霍尔值大于第二预设值,例如,当处理器20获取到的霍尔传感器162的霍尔值为40,大于第二预设值35时,则确定激光投射器14朝向与显示屏12相背的一侧。当然,第一预设值和第二预设值的大小与磁铁161的特性、磁铁161与霍尔传感器162之间的距离等因素有关;磁铁161的特 性包括磁铁161的材料、形状及大小;磁铁161与霍尔传感器162之间的距离越小,霍尔传感器162采集到的霍尔值越大。
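A minimal sketch of this threshold rule follows, assuming the signed Hall values described above (for example, −90 and 40) and taking the first and second preset values as the −85 and 35 used in the example; the handling of values between the two thresholds is an assumption for illustration:

```python
# Sketch only: decides the laser projector's orientation from the Hall value.
FIRST_PRESET = -85   # below this: projector faces the display side
SECOND_PRESET = 35   # above this: projector faces away from the display

def orientation(hall_value: int) -> str:
    if hall_value < FIRST_PRESET:
        return "toward display"       # e.g. hall_value = -90
    if hall_value > SECOND_PRESET:
        return "away from display"    # e.g. hall_value = 40
    return "indeterminate"            # between thresholds: keep last known state

print(orientation(-90))  # toward display
print(orientation(40))   # away from display
```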
本申请实施方式的电子装置100及控制方法通过霍尔传感器组件16确定激光投射器14的朝向,从而不需要使用者手动选择激光投射器14的朝向就能够使用激光投射器14以对应的模式投射激光,提升了电子装置100的使用体验。
请参阅图5,在某些实施方式中,电子装置100还包括与处理器20电性连接的状态选择按键17,所述确定激光投射器14的朝向可以通过状态选择按键17实现,具体地,处理器20还可以用于通过状态选择按键17确定激光投射器14的朝向。状态选择按键17可以为实体按键并包括第一状态按键171和第二状态按键172。当处理器20检测到第一状态按键171被触发时,处理器20确定激光投射器14朝向显示屏12所在的一侧;当处理器20检测到第二状态按键172被触发时,处理器20确定激光投射器14朝向与显示屏12相背的一侧。在其他实施方式中,状态选择按键17还可以为虚拟按键,状态选择按键17可以被显示屏12显示,例如,状态选择按键17可以为显示屏12显示的朝向切换按键。
本申请实施方式的电子装置100及控制方法通过状态选择按键17确定激光投射器14的朝向,从而使用者能够根据需要准确地选择所需要的朝向。
请参阅图2,在某些实施方式中,电子装置100还包括可见光摄像头40,可见光摄像头40包括主摄像头41和副摄像头42,主摄像头41和副摄像头42均安装在基体131上,并且主摄像头41和副摄像头42位于激光投射器14所在的一面。主摄像头41可以为广角摄像头,广角摄像头对运动敏感度低,相对较低的快门速度都能保证图像拍摄的清晰度,且广角摄像头视角范围大,可以涵盖大范围景物,并且能强调前景和突出远近对比。副摄像头42可以为长焦摄像头,长焦摄像头的长焦镜头,可以识别更远的物体。或者主摄像头41为彩色摄像头,副摄像头42为黑白摄像头。此处的摄像头可以为多个,两个、三个、四个、或者更多个。本实施方式中,激光投射器14、副摄像头42、主摄像头41和图像采集器15依次间隔设置并位于同一直线上。
请参阅图8及图9,在某些实施方式中,显示屏12设置在前表面111上,显示屏12的靠近端面113的一端形成有缺口121,机壳11开设有贯穿前表面111、后表面112及端面113的容置槽115,容置槽115与缺口121对应并连通,转动组件13能够转动地安装在容置槽115内。
本实施方式中,基体131的尺寸略小于容置槽115的尺寸,以使基体131能够在容置槽115内转动。当激光投射器14朝向与显示屏12相背的一侧时,基体131的相背两个表面(前后两个表面)分别与后表面112及显示屏12的最外侧的出光面齐平;当激光投射器14朝向显示屏12所在的一侧时,基体131的相背两个表面(前后两个表面)分别与后表面112及显示屏12的最外侧的出光面齐平。
请参阅图10,在某些实施方式中,激光投射器14包括激光光源140和第一驱动器147,第一驱动器147可用于驱动激光光源140向待测物投射激光。激光投射器14和图像采集器15均与处理器20连接。处理器20可以为激光投射器14提供使能信号,具体地,处理器20可以为第一驱动器147提供使能信号。图像采集器15通过I2C总线与处理器20连接。激光投射器14可以向外发射激光,例如红外激光,激光到达场景中的物体上后被反射,被反射的激光可由图像采集器15接收,处理器20可以依据激光投射器14发射的激光及图像采集器15接收的激光计算物体的深度信息。在一个例子中,激光投射器14和图像采集器15可通过飞行时间(Time of flight,TOF)测距法获取深度信息,在另一个例子中,激光投射器14和图像采集器15可通过结构光测距原理获取深度信息。本申请说明书以激光投射器14和图像采集器15通过结构光测距原理获取深度信息为例进行说明,此时,激光投射器14为红外激光发射器,图像采集器15为红外摄像头。
图像采集器15与激光投射器14配合使用时,在一个例子中,图像采集器15可以通过选通信号(strobe信号)控制激光投射器14的投射时序,其中,strobe信号是根据图像采集器15获取采集图像的时序来生成的,strobe信号可视为高低电平交替的电信号,激光投射器14根据strobe指示的激光投射时序来投射激光。具体地,处理器20可以通过I2C总线发送图像采集指令以启用激光投射器14和图像采集器15,并使激光投射器14和图像采集器15工作。图像采集器15接收到图像采集指令后,通过strobe信号控制开关器件30,若strobe信号为高电平,则开关器件30向第一驱动器147发送第一脉冲信号(pwn1),第一驱动器147根据第一脉冲信号驱动激光光源140向场景中投射激光,若strobe信号为低电平,则开关器件30停止发送第一脉冲信号至第一驱动器147,激光光源140不投射激光;或者,也可以是在strobe信号为低电平时,开关器件30向第一驱动器147发送第一脉冲信号,第一驱动器147根据第一脉冲信号驱动激光光源140向场景中投射激光,在strobe信号为高电平时,开关器件30停止发送第一脉冲信号至第一驱动器147,激光光源140不投射激光。在另一个例子中,图像采集器15与激光投射器14配合时可以无需用到strobe信号,此时,处理器20发送图像采集指令至图像采集器15并同时发送激光投射指令至第一驱动器147,图像采集器15接收到图像采集指令后开始获取 采集图像,第一驱动器147接收到激光投射指令时驱动激光光源140投射激光。激光投射器14投射激光时,激光形成带有斑点的激光图案投射在场景中的待测物上。图像采集器15采集被待测物反射的激光图案得到散斑图像,并通过移动产业处理器接口(Mobile Industry Processor Interface,MIPI)将散斑图像发送给处理器20。图像采集器15每发送一帧散斑图像给处理器20,处理器20就接收到一个数据流。处理器20可以根据散斑图像和预存在处理器20中的参考图像做深度图像的计算。
在某些实施方式中,可见光摄像头40也通过I2C总线与处理器20连接,也就是说,主摄像头41和副摄像头42均可以通过I2C总线与处理器20连接。可见光摄像头40可用于采集可见光图像;也就是说,主摄像头41和副摄像头42均可以分别用于采集可见光图像,或者主摄像头41和副摄像头42相配合并一起用于采集可见光图像;换句话说,主摄像头41和副摄像头42中的任意一个或两个可以用于采集可见光图像。可见光摄像头40(主摄像头41和/或副摄像头42)每发送一帧可见光图像给处理器20,处理器20就接收到一个数据流。可见光摄像头40可以单独使用,即使用者仅仅想要获取可见光图像时,处理器20通过I2C总线向可见光摄像头40(主摄像头41和副摄像头42中的任意一个或两个)发送图像采集指令以启用可见光摄像头40使其工作。可见光摄像头40接收到图像采集指令后采集场景的可见光图像,并通过移动产业处理器接口向处理器20发送可见光图像。可见光摄像头40(主摄像头41和副摄像头42中的任意一个,或者主摄像头41和副摄像头42一起)也可以与激光投射器14以及图像采集器15配合使用,即使用者想要通过可见光图像与深度图像获取三维图像时,若图像采集器15与可见光摄像头40工作频率相同,则图像采集器15与可见光摄像头40通过sync信号实现硬件同步。具体地,处理器20通过I2C总线向图像采集器15发送图像采集指令。图像采集器15接收到图像采集指令后,可以通过strobe信号控制开关器件30向第一驱动器147发送第一脉冲信号(pwn1),以使第一驱动器147根据第一脉冲信号驱动激光光源140发射激光;同时,图像采集器15与可见光摄像头40之间通过sync信号同步,该sync信号控制可见光摄像头40采集可见光图像。
请参阅图2及图10,电子装置100还可包括设置在基体131上的泛光灯50,泛光灯50和激光投射器14位于基体131的同一面上。泛光灯50可以向场景中发射均匀的面光,泛光灯50包括泛光光源51及第二驱动器52,第二驱动器52可用于驱动泛光光源51发射均匀的面光。泛光灯50发出的光可以是红外光或其他不可见光,例如紫外光等。本申请以泛光灯50发射红外光为例进行说明,但泛光灯50发射的光的形式并不限于此。泛光灯50与处理器20连接,处理器20可以为驱动泛光灯50提供使能信号,具体地,处理器20可以为第二驱动器52提供使能信号。泛光灯50可以与图像采集器15配合工作以采集红外图像。图像采集器15与泛光灯50配合使用时,在一个例子中,图像采集器15可以通过选通信号(strobe信号,该strobe信号与图像采集器15控制激光投射器14的strobe信号为两个独立的strobe信号)控制泛光灯50发射红外光的发射时序,strobe信号是根据图像采集器15获取采集图像的时序来生成的,strobe信号可视为高低电平交替的电信号,泛光灯50根据strobe信号指示的红外光发射时序来发射红外光。具体地,处理器20可以通过I2C总线向图像采集器15发送图像采集指令,图像采集器15接收到图像采集指令后,通过strobe信号控制开关器件30,若strobe信号为高电平,则开关器件30向第二驱动器52发送第二脉冲信号(pwn2),第二驱动器52根据第二脉冲信号控制泛光光源51发射红外光,若strobe信号为低电平,则开关器件30停止发送第二脉冲信号至第二驱动器52,泛光光源51不发射红外光;或者,也可以是在strobe信号为低电平时,开关器件30向第二驱动器52发送第二脉冲信号,第二驱动器52根据第二脉冲信号控制泛光光源51发射红外光,在strobe信号为高电平时,开关器件30停止发送第二脉冲信号至第二驱动器52,泛光光源51不发射红外光。泛光灯50发射红外光时,图像采集器15接收被场景中的物体反射的红外光以形成红外图像,并通过移动产业处理器接口将红外图像发送给处理器20。图像采集器15每发送一帧红外图像给处理器20,处理器20就接收到一个数据流。该红外图像通常用于虹膜识别、人脸识别等。
请参阅图1、图2及图11,在某些实施方式中,电子装置100还包括图像采集器15,图像采集器15和激光投射器14一同安装在基体131的同一面,在激光投射器14投射激光时(激光投射器14以第一模式投射激光时,或者激光投射器14以第二模式投射激光时),激光投射器14以第一工作频率向场景投射激光,控制方法还包括:
0114,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
0115,在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;和
0116,根据第一图像、第二图像及参考图像计算深度图像。
也就是说,控制方法包括:
0111,确定激光投射器14的朝向;
0112,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;
0113,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量;
0114,在激光投射器14以第一工作频率向场景投射激光时,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
0115,在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;和
0116,根据第一图像、第二图像及参考图像计算深度图像。
上述的控制方法也可以由电子装置100实现,其中,步骤0111与前文所述的步骤041基本相同,步骤0112与前文所述的步骤042基本相同,步骤0113与前文所述的步骤043基本相同。图像采集器15可用于实现步骤0114,处理器20还可以用于实现步骤0115和步骤0116。也就是说,图像采集器15用于以第二工作频率获取采集图像,处理器20还用于在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像、及根据第一图像、第二图像及参考图像计算深度图像。
具体地,图像采集器15与激光投射器14工作频率不同(即第二工作频率大于第一工作频率)时,若需要获取深度图像,比如在解锁、支付、解密、三维建模等使用场景下。在一个例子中,处理器20通过I2C总线向图像采集器15和第一驱动器147同时发送获取深度图像的图像采集指令。第一驱动器147接收到图像采集指令后,驱动激光光源140以第一工作频率向场景发射红外激光;图像采集器15接收到图像采集指令后,以第二工作频率采集被场景中的物体反射回的红外激光以获取采集图像。例如图12所示,实线表示激光投射器14发射激光的时序,虚线表示图像采集器15获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图12中由上至下,依次为实线、虚线及点划线,其中,第二工作频率为第一工作频率的两倍。请参阅图12中实线与虚线部分,图像采集器15在激光投射器14未投射激光时先接收环境中的红外光(下称环境红外光)以获取第N帧采集图像(此时为第一图像,也可称作背景图像),并通过移动产业处理器接口将第N帧采集图像发送给处理器20;随后,图像采集器15可以在激光投射器14投射激光时接收环境红外光以及由激光投射器14发射的红外激光以获取第N+1帧采集图像(此时为第二图像,也可称作干扰散斑图像),并通过移动产业处理器接口将第N+1帧采集图像发送给处理器20;随后,图像采集器15再在激光投射器14未投射激光时接收环境红外光以获取第N+2帧采集图像(此时为第一图像),并通过移动产业处理器接口将第N+2帧采集图像发送给处理器20,依此类推,图像采集器15交替地获取第一图像和第二图像。
在另一个例子中,处理器20通过I2C总线向图像采集器15发送获取深度图像的采集指令。图像采集器15接收到图像采集指令后,通过strobe信号控制开关器向第一驱动器147发送第一脉冲信号,第一驱动器147根据第一脉冲信号驱动激光光源140以第一工作频率投射激光(即激光投射器14以第一工作频率投射激光),同时图像采集器15以第二工作频率采集被场景中的物体反射回的红外激光以获取采集图像。如图12所示,实线表示激光投射器14发射激光的时序,虚线表示图像采集器15获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图12中由上至下,依次为实线、虚线及点划线,其中,第二工作频率为第一工作频率的两倍。请参阅图12中实线与虚线部分,图像采集器15在激光投射器14未投射激光时先接收环境红外光以获取第N帧采集图像(此时为第一图像,也可称作背景图像),并通过移动产业处理器接口将第N帧采集图像发送给处理器20;随后,图像采集器15可以在激光投射器14投射激光时接收环境红外光以及由激光投射器14发射的红外激光以获取第N+1帧采集图像(此时为第二图像,也可称作干扰散斑图像),并通过移动产业处理器接口将第N+1帧采集图像发送给处理器20;随后,图像采集器15再在激光投射器14未投射激光时接收环境红外光以获取第N+2帧采集图像(此时为第一图像),并通过移动产业处理器接口将第N+2帧采集图像发送给处理器20,依此类推,图像采集器15交替地获取第一图像和第二图像。
需要说明的是,图像采集器15可以在发送采集图像给处理器20的过程中同时执行采集图像的获取。并且,图像采集器15也可以先获取第二图像,再获取第一图像,并根据这个顺序交替执行采集图像的获取。另外,上述的第二工作频率与第一工作频率之间的倍数关系仅为示例,在其他实施例中,第二工作频率与第一工作频率之间的倍数关系还可以是三倍、四倍、五倍、六倍等等。
处理器20每接收到一帧采集图像后,都会对接收到的采集图像进行区分,判断采集图像是第一图像还是第二图像。处理器20接收到至少一帧第一图像和至少一帧第二图像后,即可根据第一图像、第二图像以及参考图像计算深度图像。具体地,由于第一图像是在激光投射器14未投射激光时采集的,形成第一图像的光线仅包括环境红外光,而第二图像是在激光投射器14投射激光时采集的,形成第二 图像的光线同时包括环境红外光和激光投射器14发射的红外激光,因此,处理器20可以根据第一图像来去除第二图像中的由环境红外光形成的采集图像的部分,从而得到仅由红外激光形成的采集图像(即由红外激光形成的散斑图像)。
可以理解,环境光包括与激光投射器14发射的红外激光波长相同的红外光(例如,包含940nm的环境红外光),图像采集器15获取采集图像时,这部分红外光也会被图像采集器15接收。在场景的亮度较高时,图像采集器15接收的光线中环境红外光的占比会增大,导致采集图像中的激光散斑点不明显,从而影响深度图像的计算。
本申请的控制方法控制激光投射器14与图像采集器15以不同的工作频率工作,图像采集器15可以采集到仅由环境红外光形成的第一图像以及同时由环境红外光和激光投射器14发射的红外激光形成的第二图像,并基于第一图像去除掉第二图像中由环境红外光形成的图像部分,由此能够区分出激光散斑点,并能采用仅由激光投射器14发射的红外激光形成的采集图像来计算深度图像,激光散斑匹配不受影响,可以避免深度信息出现部分或全部缺失,从而提升深度图像的精确度。
请参阅图1、图2和图13,在某些实施方式中,控制方法包括:
0131,确定激光投射器14的朝向;
0132,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;
0133,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量;
0134,在激光投射器14以第一工作频率向场景投射激光时,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
01351,为每一帧采集图像添加图像类型;
01352,根据图像类型区分第一图像与第二图像;和
0136,根据第一图像、第二图像及参考图像计算深度图像。
请参阅图1、图2和图14在某些实施方式中,控制方法包括:
0141,确定激光投射器14的朝向;
0142,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;
0143,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量;
0144,在激光投射器14以第一工作频率向场景投射激光时,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
014511,根据每一帧采集图像的采集时间确定在采集时间下激光投射器14的工作状态;
014512,根据工作状态为每一帧采集图像添加图像类型;
01452,根据图像类型区分第一图像与第二图像;和
0146,根据第一图像、第二图像及参考图像计算深度图像。
上述的控制方法也可以由电子装置100实现,其中,步骤0131与前文所述的步骤041基本相同,步骤0132与前文所述的步骤042基本相同,步骤0133与前文所述的步骤043基本相同,步骤0134与前文所述的步骤0114基本相同,步骤01351和步骤01352可以为步骤0115的子步骤,步骤0136与前文所述的步骤0116基本相同;步骤0141与前文所述的步骤041基本相同,步骤0142与前文所述的步骤042基本相同,步骤0143与前文所述的步骤043基本相同,步骤0144与前文所述的步骤0114基本相同,步骤014511和步骤014512可以为步骤01351的子步骤,步骤01452与前文所述的步骤01352基本相同,步骤0146与前文所述的步骤0116基本相同。步骤01351、步骤01352、步骤014511、步骤014512及步骤01452均可以由处理器20实现。也即是说,处理器20还可用于为每一帧采集图像添加图像类型、以及根据图像类型区分第一图像与第二图像。处理器20用于为每一帧采集图像添加图像类型时,具体用于根据每一帧采集图像的采集时间确定在采集时间下激光投射器14的工作状态、以及根据工作状态为每一帧采集图像添加图像类型。
具体地,处理器20每从图像采集器15接收到一帧采集图像,都会为采集图像添加图像类型(stream_type),以便于后续处理中可以根据图像类型区分出第一图像和第二图像。具体地,在图像采集器15获取采集图像的期间,处理器20会通过I2C总线实时监测激光投射器14的工作状态。处理器20每从图像采集器15接收到一帧采集图像,会先获取采集图像的采集时间,再根据采集图像的采集时间来判断在采集图像的采集时间下激光投射器14的工作状态是投射激光还是未投射激光,并基于判断结果为采集图像添加图像类型。其中,采集图像的采集时间可以是图像采集器15获取每一帧采集图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等。如此,可以实现每一帧采集图像与激光投射器14在该帧采集图像获取期间的工作状态(投射激光或未投射激光)的对应, 准确区分出采集图像的类型。在一个例子中,图像类型stream_type的结构如表1所示:
表1
Figure PCTCN2020085819-appb-000001
表1中stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像。light为00时,表示此时的数据流是在没有任何设备投射红外光和/或红外激光(仅有环境红外光)的情形下获取的,那么处理器20可以对采集图像添加000的图像类型,以标识这一采集图像为第一图像。light为01时,表示此时的数据流是在激光投射器14投射红外激光(既有环境红外光,又有红外激光)的情形下获取的。处理器20可以对采集图像添加001的图像类型,以标识这一采集图像为第二图像。处理器20后续即可根据stream_type来区分采集图像的图像类型。
请参阅图1、图2及图15,在某些实施方式中,控制方法包括:
0151,确定激光投射器14的朝向;
0152,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;
0153,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量;
0154,在激光投射器14以第一工作频率向场景投射激光时,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
0155,在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;
01561,根据第一图像和第二图像计算第三图像,第一图像的采集时间与第二图像的采集时间的差值小于预定差值;和
01562,根据第三图像和参考图像计算深度图像。
上述的控制方法也可以由电子装置100实现,其中,步骤0151与前文所述的步骤041基本相同,步骤0152与前文所述的步骤042基本相同,步骤0153与前文所述的步骤043基本相同,步骤0154与前文所述的步骤0114基本相同,步骤0155与前文所述的步骤0115基本相同,步骤01561和步骤01562可以为步骤0116的子步骤。步骤01561及步骤01562均可以由处理器20实现。也即是说,处理器20还可以用于根据第一图像和第二图像计算第三图像、以及根据第三图像和参考图像计算深度图像,其中,第一图像的采集时间与第二图像的采集时间的差值小于预定差值。
在计算深度图像的过程中,处理器20可以先区分出第一图像与第二图像,再根据采集时间选出任意帧第二图像和与该任意帧第二图像对应的特定帧的第一图像,其中该特定帧的第一图像的采集时间与该任意帧的第二图像的采集时间的差值小于预定差值。随后,处理器20再根据该特定帧的第一图像和该任意帧的第二图像来计算第三图像,第三图像即为仅由激光投射器14发射的红外激光形成的采集图像,也可以称作实际散斑图像。具体地,第一图像中的多个像素点与第二图像中的多个像素点是一一对应的,假设第一图像为P1,第二图像为P2,第三图像为P3,处理器20可以将第二图像中的像素点P2 i,j的像素值减去第一图像中的像素点P1 i,j的像素值以得到第三图像中像素点P3 i,j的像素值,即P3 i,j=P2 i,j-P1 i,j,i∈N+,j∈N+。随后,处理器20可以根据第三图像与参考图像计算出深度图像,其中,第二图像的帧数、第三图像的帧数及深度图像的帧数均相等。可以理解,由于第一图像的采集时间和第二图像的采集时间的差值较小,那么第一图像中环境红外光的强度与第二图像中环境红外光的强度更为接近,基于第一图像和第二图像计算出的第三图像的精度更高,有利于进一步减小环境红外光对深度图像获取的影响。
在某些实施方式中,处理器20也可以为第三图像和深度图像添加图像类型,以便于对处理采集图像后得到的各个数据流进行区分。如表2所示:
表2
Figure PCTCN2020085819-appb-000002
表2中的stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像,stream为1时,表示此时的数据流为深度图像。light为11时,表示减背景处理,减背景处理即去除采集图像中由环境红外光形成的部分,那么处理器20可以对减背景处理后的数据流添加011的图像类型,以标识这一数 据流为第三图像。Light为XX时,X表示不限定取值,处理器20可对进行深度计算后得到的数据流添加1XX的图像类型,以标识这一数据流为深度图像。
在某些实施方式中,参与深度图像计算的第一图像和第二图像中,第一图像的采集时间可以位于第二图像的采集时间之前,也可以位于第二图像的采集时间之后,在此不作限制。
在某些实施方式中,第一图像的采集时间与第二图像的采集时间的差值小于预定差值时,第一图像和第二图像可以是相邻帧的图像,也可以是非相邻帧的图像。例如,在第二工作频率是第一工作频率的两倍时,差值小于预定差值的第一图像和第二图像是相邻帧的图像;在第二工作频率与第一工作频率之间的倍数大于两倍,例如第二工作频率是第一工作频率的三倍时,差值小于预定差值的第一图像和第二图像可以是相邻帧的图像,也可以是非相邻帧的图像(此时第一图像与第二图像之间还间隔有一帧第一图像)。
在某些实施方式中,参与深度图像计算的第一图像的帧数还可以为多帧。比如,在第二工作频率是第一工作频率的三倍时,可以选取两帧相邻的第一图像以及与这两帧第一图像相邻的一帧第二图像来计算第三图像。此时,处理器20可以先对两帧第一图像做融合处理,例如,将两帧第一图像对应像素点的像素值相加再取均值得到融合处理后的第一图像,再利用融合处理后的第一图像和该相邻的一帧第二图像计算第三图像。
在某些实施方式中,处理器20可以计算出多帧第三图像,如图12中的第(N+1)-N帧第三图像、第(N+3)-(N+2)帧第三图像、第(N+5)-(N+4)帧第三图像等等,并对应多帧第三图像计算出多帧深度图像。当然,在其他实施方式中,处理器20也可以仅计算出一帧第三图像,并对应一帧第三图像计算出一帧深度图像。第三图像的帧数可以根据应用场景的安全级别来确定。具体地,当应用场景的安全级别较高时,例如对于支付等安全级别较高的应用场景,第三图像的帧数应该较多,此时需要多帧深度图像与使用者的深度模板的匹配均成功才执行支付动作,以提升支付的安全性;而对于应用场景的安全级别较低,例如对于基于深度信息进行人像美颜的应用场景,第三图像的帧数可以较少,例如,为一帧,此时利用一帧深度图像即足够进行人像美颜,如此,可以减少处理器20的计算量及功耗,并可以提升图像处理的速度。
请参阅图1、图2及图16,在某些实施方式中,控制方法还包括:
0167,以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率;
0168,为每一帧可见光图像和每一帧采集图像添加采集时间;和
0169,根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。
也就是说,控制方法包括:
0161,确定激光投射器14的朝向;
0162,在激光投射器14朝向显示屏12所在的一侧时,激光投射器14以第一模式投射激光;
0163,在激光投射器14朝向与显示屏12相背的一侧时,激光投射器14以第二模式投射激光,第二模式发射的激光的能量大于第一模式发射的激光的能量;
0164,在激光投射器14以第一工作频率向场景投射激光时,图像采集器15以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
0165,在采集图像中区分出在激光投射器14未投射激光时采集的第一图像及在激光投射器14投射激光时采集的第二图像;
0166,根据第一图像、第二图像及参考图像计算深度图像;
0167,以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率;
0168,为每一帧可见光图像和每一帧采集图像添加采集时间;和
0169,根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。
上述的控制方法也可以由电子装置100实现,其中,步骤0161与前文所述的步骤041基本相同,步骤0162与前文所述的步骤042基本相同,步骤0163与前文所述的步骤043基本相同,步骤0164与前文所述的步骤0114基本相同,步骤0165与前文所述的步骤0115基本相同,步骤0166与前文所述的步骤0116基本相同。步骤0167可以由可见光摄像头40(主摄像头41和副摄像头42中的任意一个,或者主摄像头41和副摄像头42一起)实现。步骤0168和步骤0169可以由处理器20实现。也即是说,可见光摄像头40可用于以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率。处理器20可用于为每一帧可见光图像和每一帧采集图像添加采集时间、以及根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。
在一些应用场景,例如,对场景中的物体进行三维建模的应用场景下,需要借助图像采集器15获 取场景中物体的深度信息,并且借助可见光摄像头40获取场景中物体的色彩信息,才能实现三维建模。此时,处理器20需要开启图像采集器15获取深度图像并同时开启可见光摄像头40获取可见光图像。
若图像采集器15与可见光摄像头40具有相同的工作频率,即图像采集器15与可见光摄像头40均以第二工作频率工作,那么处理器20可以通过I2C总线发送图像采集指令至图像采集器15,图像采集器15接收到图像采集指令后,图像采集器15与可见光摄像头40之间通过sync信号同步,该sync信号控制可见光摄像头40开启采集可见光图像,以实现图像采集器15与可见光摄像头40的硬件同步。此时,采集图像的帧数与可见光图像的帧数一致,每一帧采集图像与每一帧可见光图像一一对应。
但在图像采集器15与可见光摄像头40的工作频率不同,即图像采集器15以第二工作频率工作,可见光摄像头40以不等于第二工作频率的第三工作频率工作时,图像采集器15与可见光摄像头40无法实现硬件同步。此时,处理器20需要通过软件同步的方式来实现图像采集器15与可见光摄像头40的同步。具体地,处理器20通过与图像采集器15连接的I2C总线发送图像采集指令至图像采集器15,同时通过与可见光摄像头40连接的I2C总线发送图像采集指令至可见光摄像头40。处理器20每接收到一帧采集图像时,会为每一帧采集图像添加图像类型,还会为每一帧采集图像添加采集时间。并且,处理器20每接收到一帧可见光图像时,会为每一帧可见光图像添加采集时间。其中,采集图像的采集时间可以是图像采集器15采集每一帧采集图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等;可见光图像的采集时间可以是可见光摄像头40采集每一帧可见光图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等。那么,在后续基于深度图像和可见光图像做进一步处理(如三维建模、借助深度信息做人像美颜等处理)时,处理器20可以先根据可见光图像的采集时间、采集图像的采集时间及采集图像的类型先确定帧同步的可见光图像和第二图像。其中,帧同步指的是确定出的第二图像的采集时间与可见光图像的采集时间的差值小于预设的时间差值,可见光图像的采集时间可以位于第二图像的采集时间之前也可以位于第二图像的采集时间之后。随后,处理器20再根据确定的第二图像选出第一图像以进一步根据第二图像、第一图像及参考图像计算深度图像。最后,处理器20基于深度图像和确定出的可见光图像进行后续处理。
在某些实施方式中,处理器20也可以为每一帧深度图像添加采集时间,再根据可见光图像的采集时间和深度图像的采集时间确定帧同步的可见光图像以及深度图像,最后对帧同步的可见光图像以及深度图像做后续处理。其中,每一帧深度图像的采集时间为与该帧深度图像对应的第二图像的采集时间。
请参阅图17,在某些实施方式中,采集图像还包括红外图像,红外图像为图像采集器15采集泛光灯50发射的红外光所得到的图像。处理器20为每一帧采集图像添加图像类型时,还会为红外图像添加图像类型。在一个例子中,红外图像的图像类型如表3所示:
表3
Figure PCTCN2020085819-appb-000003
表3中的stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像。light为10时,表示此时的数据流是在泛光灯50投射红外光且激光投射器14未投射激光的情形下获取的。那么,处理器20为采集图像添加010的图像类型时,即标识这一帧采集图像为红外图像。
在某些应用场景中,比如同时基于深度图像与深度模板的匹配以及红外图像与红外模板的匹配实现身份验证的应用场景,图像采集器15需要与泛光灯50及激光投射器14配合使用,图像采集器15可以分时获取第一图像、第二图像及红外图像。如图17所示,实线表示激光投射器14发射激光的时序,双点划线表示泛光灯50发射红外光的时序,虚线表示图像采集器15获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图17中由上至下,依次为实线、双点划线、虚线及点划线,其中,第二工作频率为第一工作频率的三倍,第二工作频率为第四工作频率的三倍。处理器20可以通过I2C总线实时监测泛光灯50的工作状态。处理器20每从图像采集器15接收到一帧采集图像,会先获取采集图像的采集时间,再根据采集图像的采集时间来判断在采集图像的采集时间下泛光灯50的工作状态是发射红外光还是未发射红外光,并基于判断结果为采集图像添加图像类型。处理器20后续可以基于红外图像的采集时间和第二图像的采集时间确定出采集时间的差值小于设定差值的红外图像和第二图像,进一步地,处理器20可以确定出红外图像和深度图像,并利用该红外图像和该深度图像进行身份验证。
请参阅图1、图2及图18,在某些实施方式中,控制方法还包括:
0181,获取场景的亮度及类型;
0182,判断亮度是否大于亮度阈值且类型为户外场景;
若是,则进入确定激光投射器14的朝向(步骤0183)。
That is, the control method includes:
0181, obtaining the brightness and the type of the scene;
0182, determining whether the brightness is greater than a brightness threshold and whether the type is an outdoor scene;
0183, if so, determining the orientation of the laser projector 14;
0184, when the laser projector 14 faces the side where the display screen 12 is located, projecting laser light in a first mode by the laser projector 14;
0185, when the laser projector 14 faces the side opposite to the display screen 12, projecting laser light in a second mode by the laser projector 14, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode;
0186, when the laser projector 14 projects laser light to the scene at a first operating frequency, obtaining collected images at a second operating frequency by the image collector 15, the second operating frequency being greater than the first operating frequency;
0187, distinguishing, among the collected images, a first image collected when the laser projector 14 is not projecting laser light from a second image collected when the laser projector 14 is projecting laser light; and
0188, calculating a depth image according to the first image, the second image and a reference image.
The above control method may also be implemented by the electronic device 100, where step 0183 is substantially the same as step 041 described above, step 0184 is substantially the same as step 042 described above, step 0185 is substantially the same as step 043 described above, step 0186 is substantially the same as step 0114 described above, step 0187 is substantially the same as step 0115 described above, and step 0188 is substantially the same as step 0116 described above. Steps 0181 and 0182 may both be implemented by the processor 20. That is, the processor 20 may be configured to obtain the brightness and the type of the scene, and to determine whether the brightness is greater than the brightness threshold and whether the type is an outdoor scene. The laser projector 14 may be configured to project laser light to the scene at the first operating frequency when the brightness is greater than the brightness threshold and the type is an outdoor scene.
Specifically, the brightness of the scene may be obtained by analyzing the collected images obtained by the image collector 15 or the visible light images obtained by the visible light camera 40 (either of the primary camera 41 and the secondary camera 42, or the primary camera 41 and the secondary camera 42 together); alternatively, the brightness of the scene may be detected directly by a light sensor, with the processor 20 reading the detected signal from the light sensor to obtain the brightness. The type of the scene may be obtained by analyzing the collected images obtained by the image collector 15 or the visible light images obtained by the visible light camera 40, for example by analyzing the objects in the collected images or visible light images to determine whether the scene is outdoor or indoor. The type of the scene may also be determined directly from the geographic location: the processor 20 obtains the positioning result of the global positioning system for the scene and determines the scene type from the positioning result; for example, if the positioning result is a certain office building, the scene is indoor; if the positioning result is a certain park, the scene is outdoor; if the positioning result is a certain street, the scene is outdoor; and so on.
It will be appreciated that when the brightness of the scene is high (for example greater than the brightness threshold), ambient infrared light accounts for a large proportion of the collected image and has a significant impact on speckle recognition, so the interference of the ambient infrared light needs to be removed. When the brightness of the scene is low, however, ambient infrared light accounts for a small proportion of the collected image, its impact on speckle recognition is negligible, and the image collector 15 and the laser projector 14 can operate at the same operating frequency, with the processor 20 calculating the depth image directly from the collected image (i.e., the second image) obtained by the image collector 15 and the reference image. In addition, a high scene brightness may be caused by strong indoor lighting; since lamp light contains no infrared light, it does not significantly affect speckle recognition, and in this case, too, the image collector 15 and the laser projector 14 operate at the same operating frequency while the processor 20 calculates the depth image directly from the collected image (i.e., the second image) and the reference image. In this way, the operating frequency of the image collector 15 can be lowered and the power consumption of the image collector 15 reduced.
Of course, in some embodiments, the control method may decide whether to execute step 0183 based solely on the brightness of the scene. Specifically, the processor 20 only obtains the brightness of the scene and determines whether the brightness is greater than the brightness threshold, and the laser projector 14 projects laser light to the scene at the first operating frequency when the brightness is greater than the brightness threshold.
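The gating logic of steps 0181 and 0182 can be sketched as follows; the sensor-reading and geolocation helpers, the threshold value, and the keyword list are all assumed placeholders for illustration:

```python
# Hypothetical sketch of steps 0181-0182: enable the dual-frequency pipeline
# only when ambient IR is likely to interfere (a bright outdoor scene).

BRIGHTNESS_THRESHOLD = 500          # assumed units from a light sensor reading
OUTDOOR_HINTS = ("park", "street")  # assumed keywords from a geolocation lookup

def needs_ambient_ir_removal(read_brightness, locate) -> bool:
    """read_brightness() -> float and locate() -> str stand in for the light
    sensor and the positioning result of the global positioning system."""
    bright = read_brightness() > BRIGHTNESS_THRESHOLD
    outdoor = any(hint in locate().lower() for hint in OUTDOOR_HINTS)
    return bright and outdoor  # if True, proceed to step 0183
```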
In some embodiments, the processor 20 may also add status information (status) to each data stream. One example is shown in Table 4:
Table 4
[Table 4 is rendered as an image in the source (Figure PCTCN2020085819-appb-000004); per the description below, it appends a status bit to the image type, where status = 0 means no background subtraction has been performed and status = 1 means it has.]
When the status information status is 0, the data stream has not undergone background subtraction; when status is 1, the data stream has undergone background subtraction. In Table 4, 0000 denotes the first image; 0010 denotes the second image; 0100 denotes the infrared image obtained by the image collector 15 while the floodlight 50 is on; 0111 denotes the third image; 1XX1 denotes a depth image that has undergone background subtraction; and 1XX0 denotes a depth image that has not undergone background subtraction. Adding status information to each data stream in this way enables the processor 20 to distinguish whether each data stream has undergone background subtraction.
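A small decoding sketch for these 4-bit codes follows; the code table is transcribed from the description above, and the function and dictionary names are illustrative:

```python
# Hypothetical sketch: decode the 4-bit [stream][light1][light0][status] codes
# described for Table 4. 'X' bits in the depth-image codes are don't-cares.

CODES = {
    0b0000: "first image",
    0b0010: "second image",
    0b0100: "infrared image (floodlight on)",
    0b0111: "third image",  # status bit 1: obtained by subtraction
}

def decode(code: int) -> str:
    if code & 0b1000:  # stream bit set -> depth image; light bits are don't-cares
        return "depth image, " + (
            "background-subtracted" if code & 0b0001 else "not background-subtracted"
        )
    return CODES.get(code, "unknown")
```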
In some embodiments, the processor 20 includes a first storage area, a second storage area and a logic subtraction circuit, the logic subtraction circuit being connected to both the first storage area and the second storage area. The first storage area is used to store the first image, the second storage area is used to store the second image, and the logic subtraction circuit is used to process the first image and the second image to obtain the third image. Specifically, the logic subtraction circuit reads the first image from the first storage area and the second image from the second storage area and, after obtaining both, performs subtraction on the first image and the second image to obtain the third image. The logic subtraction circuit is also connected to a depth calculation module in the processor 20 (for example, an integrated circuit such as an ASIC dedicated to depth calculation), to which the logic subtraction circuit sends the third image; the depth calculation module then calculates the depth image from the third image and the reference image.
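In software terms, the subtraction this circuit performs can be sketched as below; using 8-bit NumPy arrays for the frames is an assumption for illustration, and clamping the result at zero is one reasonable choice that the patent does not specify:

```python
import numpy as np

# Hypothetical sketch of the logic subtraction: remove the ambient-IR component
# (first image) from the laser-lit frame (second image) to get the third image.

def third_image(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Pixel-wise second - first on assumed uint8 frames, clamped to [0, 255]."""
    diff = second.astype(np.int16) - first.astype(np.int16)  # avoid uint8 wraparound
    return np.clip(diff, 0, 255).astype(np.uint8)
```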
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "illustrative embodiment", "one example", "example", "specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine different embodiments or examples described in this specification, and the features thereof, provided they do not contradict one another.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment or portion of code including one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
Although embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and shall not be construed as limiting the present application, and those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present application.

Claims (20)

  1. A control method for an electronic device, wherein the electronic device comprises a housing, a display screen and a rotating assembly, the display screen is disposed on one side of the housing, the rotating assembly comprises a base and a laser projector disposed on the base, and the base is rotatably mounted on the housing such that the laser projector can selectively face the side where the display screen is located or the side opposite to the display screen; the control method comprises:
    determining the orientation of the laser projector;
    when the laser projector faces the side where the display screen is located, projecting laser light in a first mode by the laser projector; and
    when the laser projector faces the side opposite to the display screen, projecting laser light in a second mode by the laser projector, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  2. The control method according to claim 1, wherein the electronic device further comprises a Hall sensor assembly, the Hall sensor assembly comprises a first sensor and a second sensor, the first sensor is disposed on the base, and the second sensor is disposed on the housing and corresponds to the first sensor; the determining the orientation of the laser projector comprises:
    determining the orientation of the laser projector by means of the Hall sensor assembly.
  3. The control method according to claim 2, wherein the determining the orientation of the laser projector by means of the Hall sensor assembly comprises:
    obtaining a Hall value of the Hall sensor assembly;
    when the Hall value is less than a first preset value, determining that the laser projector faces the side where the display screen is located; and
    when the Hall value is greater than a second preset value, determining that the laser projector faces the side opposite to the display screen.
  4. The control method according to claim 1, wherein the electronic device further comprises a state selection button, and the determining the orientation of the laser projector comprises:
    determining the orientation of the laser projector by means of the state selection button.
  5. The control method according to any one of claims 1 to 4, wherein the power at which the laser projector projects the laser light in the first mode is less than the power at which the laser projector projects the laser light in the second mode; and/or
    the laser projector comprises a plurality of point light sources which are controlled independently, and the number of point light sources turned on by the laser projector in the first mode is less than the number of point light sources turned on in the second mode.
  6. The control method according to any one of claims 1 to 3, wherein the electronic device further comprises an image collector, the image collector being disposed on the base and located on the face where the laser projector is located;
    when projecting laser light, the laser projector projects the laser light to a scene at a first operating frequency;
    the control method further comprises:
    obtaining collected images at a second operating frequency by the image collector, the second operating frequency being greater than the first operating frequency;
    distinguishing, among the collected images, a first image collected when the laser projector is not projecting laser light from a second image collected when the laser projector is projecting laser light; and
    calculating a depth image according to the first image, the second image and a reference image.
  7. An electronic device, comprising a housing, a display screen, a rotating assembly and a processor, wherein the display screen is disposed on one side of the housing, the rotating assembly comprises a base and a laser projector disposed on the base, and the base is rotatably mounted on the housing such that the laser projector can selectively face the side where the display screen is located or the side opposite to the display screen; the processor is configured to determine the orientation of the laser projector; the laser projector is configured to project laser light in a first mode when the laser projector faces the side where the display screen is located, and to project laser light in a second mode when the laser projector faces the side opposite to the display screen, the energy of the laser light emitted in the second mode being greater than the energy of the laser light emitted in the first mode.
  8. The electronic device according to claim 7, wherein the electronic device further comprises a Hall sensor assembly, the Hall sensor assembly comprises a first sensor and a second sensor, the first sensor is disposed on the base, and the second sensor is disposed on the housing and corresponds to the first sensor; the processor is further configured to determine the orientation of the laser projector by means of the Hall sensor assembly.
  9. The electronic device according to claim 8, wherein the processor is further configured to:
    obtain a Hall value of the Hall sensor assembly;
    when the Hall value is less than a first preset value, determine that the laser projector faces the side where the display screen is located; and
    when the Hall value is greater than a second preset value, determine that the laser projector faces the side opposite to the display screen.
  10. The electronic device according to claim 7, wherein the electronic device further comprises a state selection button, and the processor is configured to determine the orientation of the laser projector by means of the state selection button.
  11. The electronic device according to any one of claims 7 to 10, wherein the power at which the laser projector projects the laser light in the first mode is less than the power at which the laser projector projects the laser light in the second mode; and/or
    the laser projector comprises a plurality of point light sources which are controlled independently, and the number of point light sources turned on by the laser projector in the first mode is less than the number of point light sources turned on in the second mode.
  12. The electronic device according to any one of claims 7 to 10, wherein the electronic device further comprises an image collector, the image collector being disposed on the base and located on the face where the laser projector is located; when projecting laser light, the laser projector is configured to project the laser light to a scene at a first operating frequency; the image collector is configured to obtain collected images at a second operating frequency, the second operating frequency being greater than the first operating frequency; and the processor is configured to:
    distinguish, among the collected images, a first image collected when the laser projector is not projecting laser light from a second image collected when the laser projector is projecting laser light; and
    calculate a depth image according to the first image, the second image and a reference image.
  13. The electronic device according to claim 7, wherein the housing comprises an end face and a rear surface opposite to the display screen, the rear surface defines a receiving groove extending through the end face, and the rotating assembly is rotatably mounted in the receiving groove; when the laser projector faces the side where the display screen is located, the base protrudes from the end face; when the laser projector faces away from the display screen, one side face of the base is flush with the end face.
  14. The electronic device according to claim 7, wherein the housing comprises a front surface, a rear surface and an end face, the front surface and the rear surface are located on opposite sides of the housing, the end face connects the front surface and the rear surface, the display screen is disposed on the front surface, a notch is formed at the end of the display screen close to the end face, the housing defines an accommodating groove extending through the front surface, the rear surface and the end face, the accommodating groove communicates with the notch, and the rotating assembly is rotatably mounted in the accommodating groove.
  15. The electronic device according to any one of claims 7 to 10, wherein the housing comprises a front surface, a rear surface and an end face, the display screen is mounted on the front surface of the housing, and the display screen covers 85% or more of the area of the front surface.
  16. The electronic device according to any one of claims 7 to 10, wherein the laser projector comprises a laser light source, the laser light source comprises a plurality of point light sources, the plurality of point light sources form a plurality of light-emitting arrays, and the plurality of light-emitting arrays are arranged in a ring.
  17. The electronic device according to claim 16, wherein the light-emitting arrays are turned on such that the farther a light-emitting array is from the center of the laser light source, the earlier it is turned on.
  18. The electronic device according to any one of claims 7 to 10, wherein the laser projector comprises a laser light source and a first driver, and the first driver is configured to drive the laser light source to project laser light onto an object to be measured.
  19. The electronic device according to any one of claims 7 to 10, wherein the housing comprises a front surface, a rear surface and an end face, the display screen is disposed on the front surface of the housing, and a notch is formed at the end of the display screen close to the end face.
  20. The electronic device according to any one of claims 7 to 10, wherein the electronic device further comprises a floodlight disposed on the base, and the floodlight and the laser projector are located on the same face of the base.
PCT/CN2020/085819 2019-05-31 2020-04-21 电子装置的控制方法及电子装置 WO2020238482A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020217039467A KR20220004171A (ko) 2019-05-31 2020-04-21 전자 장치의 제어 방법 및 전자 장치
EP20813039.3A EP3968615B1 (en) 2019-05-31 2020-04-21 Control method for electronic apparatus, and electronic apparatus
JP2021571488A JP2022535521A (ja) 2019-05-31 2020-04-21 電子機器の制御方法及び電子機器
US17/537,393 US11947045B2 (en) 2019-05-31 2021-11-29 Controlling method for electronic device, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910472716.4 2019-05-31
CN201910472716.4A CN112019660B (zh) 2019-05-31 2019-05-31 电子装置的控制方法及电子装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/537,393 Continuation US11947045B2 (en) 2019-05-31 2021-11-29 Controlling method for electronic device, and electronic device

Publications (1)

Publication Number Publication Date
WO2020238482A1 true WO2020238482A1 (zh) 2020-12-03

Family

ID=73506259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085819 WO2020238482A1 (zh) 2019-05-31 2020-04-21 电子装置的控制方法及电子装置

Country Status (6)

Country Link
US (1) US11947045B2 (zh)
EP (1) EP3968615B1 (zh)
JP (1) JP2022535521A (zh)
KR (1) KR20220004171A (zh)
CN (1) CN112019660B (zh)
WO (1) WO2020238482A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213413B (zh) * 2019-05-31 2021-05-14 Oppo广东移动通信有限公司 电子装置的控制方法及电子装置
CN112019674B (zh) * 2019-05-31 2021-10-15 Oppo广东移动通信有限公司 电子装置的控制方法及电子装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160165023A1 (en) * 2014-12-09 2016-06-09 Sukjun Song Electronic device with rotating input unit and method for operating the same
CN105763803A (zh) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 控制方法、控制装置及电子装置
CN106686321A (zh) * 2017-02-27 2017-05-17 维沃移动通信有限公司 一种摄像补光装置、方法及移动终端
CN207283612U (zh) * 2017-09-22 2018-04-27 深圳鼎智通讯股份有限公司 具有翻转摄像头的手机
CN109451106A (zh) * 2018-11-16 2019-03-08 Oppo广东移动通信有限公司 电子装置
CN109495617A (zh) * 2018-12-03 2019-03-19 武汉华星光电半导体显示技术有限公司 一种移动终端
CN110213413A (zh) * 2019-05-31 2019-09-06 Oppo广东移动通信有限公司 电子装置的控制方法及电子装置
CN209676289U (zh) * 2019-06-04 2019-11-22 Oppo广东移动通信有限公司 电子设备

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006245799A (ja) * 2005-03-01 2006-09-14 Nec Saitama Ltd 電子機器、該電子機器における警報出力制御方法及び警報出力制御プログラム
JP2008537661A (ja) * 2005-03-30 2008-09-18 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 回転カメラユニットを備える携帯型電子装置
JP5317206B2 (ja) * 2006-09-21 2013-10-16 トムソン ライセンシング 3次元モデル取得のための方法及びシステム
JP5047058B2 (ja) * 2008-05-26 2012-10-10 シャープ株式会社 携帯端末装置
CN101650520A (zh) * 2008-08-15 2010-02-17 索尼爱立信移动通讯有限公司 移动电话的可视激光触摸板和方法
US20100091178A1 (en) * 2008-10-14 2010-04-15 Marko Eromaki Imaging device housing
EP2178276B1 (en) * 2008-10-20 2014-07-30 LG Electronics Inc. Adaptation of recorded or shown image according to the orientation of a mobile terminal
US8922625B2 (en) * 2009-11-19 2014-12-30 Lg Electronics Inc. Mobile terminal and controlling method thereof
GB0921461D0 (en) * 2009-12-08 2010-01-20 Qinetiq Ltd Range based sensing
US8284549B2 (en) * 2010-05-24 2012-10-09 Sunrex Technology Corp. Inverted U frame with pivotal micro-projector and camera for releasably securing to laptop or keyboard
US20110291935A1 (en) * 2010-05-26 2011-12-01 Sunrex Technology Corp. Wireless keyboard with pivotal camera and micro-projector
US9134799B2 (en) * 2010-07-16 2015-09-15 Qualcomm Incorporated Interacting with a projected user interface using orientation sensors
KR102097452B1 (ko) * 2013-03-28 2020-04-07 삼성전자주식회사 프로젝터를 포함하는 전자 장치 및 그 제어 방법
CN204031223U (zh) 2014-04-21 2014-12-17 张晗月 一种活动听筒闪光灯摄像头全屏智能手机
US20150347833A1 (en) * 2014-06-03 2015-12-03 Mark Ries Robinson Noncontact Biometrics with Small Footprint
KR20160029390A (ko) * 2014-09-05 2016-03-15 삼성전자주식회사 휴대용 단말기 및 그 제어방법
DE112015006245B4 (de) * 2015-03-30 2019-05-23 Fujifilm Corporation Abstandsbild-Erfassungsvorrichtung und Abstandsbild-Erfassungsverfahren
KR101927323B1 (ko) * 2015-04-03 2018-12-10 엘지전자 주식회사 이동단말기 및 그 제어방법
KR20170032761A (ko) * 2015-09-15 2017-03-23 엘지전자 주식회사 이동 단말기
US20190260863A1 (en) * 2016-08-30 2019-08-22 Xleap, Inc. Information processing terminal
CN107948363A (zh) 2016-11-16 2018-04-20 广东欧珀移动通信有限公司 一种移动终端的摄像头安装结构及移动终端
CN107743156B (zh) * 2017-09-29 2021-05-21 努比亚技术有限公司 移动终端、移动终端的控制方法及计算机可读存储介质
CN107896274B (zh) * 2017-10-27 2020-08-21 Oppo广东移动通信有限公司 红外发射器控制方法、终端及计算机可读存储介质
CN107862853B (zh) * 2017-10-27 2020-12-22 Oppo广东移动通信有限公司 红外发射器控制方法、终端及计算机可读存储介质
CN107968910B (zh) * 2017-12-26 2020-03-06 Oppo广东移动通信有限公司 电子装置
CN108055370A (zh) * 2017-12-26 2018-05-18 广东欧珀移动通信有限公司 电子装置
US10104210B1 (en) * 2018-01-31 2018-10-16 Yonatan Zike Zenebe Projector housing for iPhone
CN108594451B (zh) * 2018-03-12 2020-01-24 Oppo广东移动通信有限公司 控制方法、控制装置、深度相机和电子装置
CN108683795A (zh) * 2018-03-30 2018-10-19 中兴通讯股份有限公司 移动终端及其控制方法和计算机可读存储介质
WO2019201010A1 (zh) * 2018-04-16 2019-10-24 Oppo广东移动通信有限公司 激光投射器、相机模组和电子装置
CN109066288A (zh) * 2018-05-30 2018-12-21 Oppo广东移动通信有限公司 激光投射器的控制系统、终端和激光投射器的控制方法
CN208522805U (zh) * 2018-07-02 2019-02-19 Oppo广东移动通信有限公司 折叠式移动终端
US11258890B2 (en) * 2018-07-30 2022-02-22 IKIN, Inc. Portable terminal accessory device for holographic projection and user interface
CN109192153A (zh) * 2018-08-31 2019-01-11 维沃移动通信有限公司 一种终端及终端控制方法
CN109167855A (zh) * 2018-09-18 2019-01-08 杭州禾声科技有限公司 一种具有柔性折叠屏的智能移动装置
CN109391713A (zh) * 2018-12-20 2019-02-26 惠州Tcl移动通信有限公司 一种移动终端

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160165023A1 (en) * 2014-12-09 2016-06-09 Sukjun Song Electronic device with rotating input unit and method for operating the same
CN105763803A (zh) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 控制方法、控制装置及电子装置
CN106686321A (zh) * 2017-02-27 2017-05-17 维沃移动通信有限公司 一种摄像补光装置、方法及移动终端
CN207283612U (zh) * 2017-09-22 2018-04-27 深圳鼎智通讯股份有限公司 具有翻转摄像头的手机
CN109451106A (zh) * 2018-11-16 2019-03-08 Oppo广东移动通信有限公司 电子装置
CN109495617A (zh) * 2018-12-03 2019-03-19 武汉华星光电半导体显示技术有限公司 一种移动终端
CN110213413A (zh) * 2019-05-31 2019-09-06 Oppo广东移动通信有限公司 电子装置的控制方法及电子装置
CN209676289U (zh) * 2019-06-04 2019-11-22 Oppo广东移动通信有限公司 电子设备

Also Published As

Publication number Publication date
CN112019660B (zh) 2021-07-30
US20220082661A1 (en) 2022-03-17
EP3968615A1 (en) 2022-03-16
US11947045B2 (en) 2024-04-02
EP3968615A4 (en) 2022-06-29
KR20220004171A (ko) 2022-01-11
JP2022535521A (ja) 2022-08-09
CN112019660A (zh) 2020-12-01
EP3968615B1 (en) 2023-10-18

Similar Documents

Publication Publication Date Title
WO2020238506A1 (zh) 电子装置的控制方法及电子装置
CN112204961B (zh) 从动态视觉传感器立体对和脉冲散斑图案投射器进行半密集深度估计
JP2021520154A (ja) 画像処理方法、コンピュータ可読記憶媒体、および電子機器
WO2020238491A1 (zh) 电子装置的控制方法及电子装置
US9160931B2 (en) Modifying captured image based on user viewpoint
JP2015526927A (ja) カメラ・パラメータのコンテキスト駆動型調整
KR20220123268A (ko) 파노라마 3차원 이미지를 캡처 및 생성하는 시스템 및 방법
US11947045B2 (en) Controlling method for electronic device, and electronic device
CN110062145B (zh) 深度相机、电子设备及图像获取方法
WO2020248896A1 (zh) 调节方法、终端及计算机可读存储介质
US10447998B2 (en) Power efficient long range depth sensing
US20220067951A1 (en) Method for Acquiring Image, Electronic Device and Readable Storage Medium
JP2016123074A (ja) 投影装置
WO2024096853A1 (en) Image capture system including a lidar device and cameras having a rolling shutter sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20813039

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021571488

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217039467

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020813039

Country of ref document: EP

Effective date: 20211209