WO2019205887A1 - Method, apparatus, electronic device and computer-readable storage medium for controlling shooting - Google Patents


Info

Publication number
WO2019205887A1
WO2019205887A1 (application PCT/CN2019/080427)
Authority
WO
WIPO (PCT)
Prior art keywords
processing unit
camera
image
exposure time
synchronization signal
Prior art date
Application number
PCT/CN2019/080427
Other languages
English (en)
French (fr)
Inventor
谭国辉
周海涛
谭筱
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date
Filing date
Publication date
Priority claimed from CN201810401344.1A (CN108650472B)
Priority claimed from CN201810404282.XA (CN108419017B)
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to EP19791777.6A (EP3627827B1)
Publication of WO2019205887A1
Priority to US16/678,701 (US11095802B2)

Classifications

    • H04N5/33: Transforming infrared radiation
    • G06V10/143: Sensing or illuminating at different wavelengths
    • G06V10/751: Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V40/161: Human faces; detection, localisation, normalisation
    • G06V40/172: Human faces; classification, e.g. identification
    • H04N23/11: Generating image signals from visible and infrared light wavelengths
    • H04N23/20: Generating image signals from infrared radiation only
    • H04N23/45: Generating image signals from two or more image sensors of different type or operating in different modes
    • H04N23/56: Cameras or camera modules provided with illuminating means
    • H04N23/60: Control of cameras or camera modules
    • H04N23/611: Control based on recognised objects, where the recognised objects include parts of the human body
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N23/73: Compensating brightness variation in the scene by influencing the exposure time
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras
    • H04N5/067: Generation of synchronising signals; arrangements or circuits at the transmitter end
    • H04N5/2226: Determination of depth image, e.g. for foreground/background separation

Definitions

  • the present application relates to the field of computer technology, and in particular, to a method, an apparatus, an electronic device, and a computer readable storage medium for controlling shooting.
  • Embodiments of the present application provide a method, apparatus, electronic device, and computer readable storage medium for controlling shooting.
  • a method for controlling shooting includes: when a second processing unit receives a data acquisition request, controlling a second camera to acquire a second image according to the data acquisition request, and sending an image acquisition instruction to a first processing unit, the image acquisition instruction being used to instruct the first processing unit to control a first camera to acquire a first image;
  • when the second processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure time of the first camera and a second exposure time of the second camera, the synchronization signal being a signal the second camera sends at the moment it starts exposing each frame of the second image; calculating a delay duration according to the first exposure time and the second exposure time;
  • when the duration for which the second processing unit has held the received synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire the first image; and processing the first image by the first processing unit, and sending the processed first image to the second processing unit.
  • a device for controlling shooting includes a request receiving module, a signal receiving module, a calculating module, a signal forwarding module, and a processing module;
  • the request receiving module is configured to, when the second processing unit receives a data acquisition request, control the second camera to acquire the second image according to the data acquisition request and send an image acquisition instruction to the first processing unit, the image acquisition instruction being used to instruct the first processing unit to control the first camera to acquire the first image;
  • the signal receiving module is configured to, when the second processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera, the synchronization signal being the signal the second camera sends at the moment it starts exposing each frame of the second image;
  • the calculating module is configured to calculate the delay duration according to the first exposure time and the second exposure time;
  • the signal forwarding module is configured to, when the duration for which the second processing unit has held the received synchronization signal reaches the delay duration, forward the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire the first image;
  • the processing module is configured to process the first image by the first processing unit and send the processed first image to the second processing unit.
  • An electronic device includes a first processing unit, a second processing unit, and a camera module, the first processing unit being connected to the second processing unit and to the camera module;
  • the camera module includes a first camera and a second camera; the first processing unit is connected to the first camera through a control line, the second processing unit is connected to the second camera through a control line, and the first processing unit is connected to the second processing unit; the second processing unit is further connected to the first camera and the second camera through signal lines;
  • the second processing unit is configured to, when receiving a data acquisition request, control the second camera to acquire the second image according to the data acquisition request and send an image acquisition instruction to the first processing unit;
  • the first processing unit is configured to control the first camera to acquire the first image according to the image acquisition instruction; the second camera is configured to send a synchronization signal to the second processing unit at the moment it starts exposing each frame of the second image;
  • the second processing unit is further configured to, upon receiving the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera, calculate the delay duration from them, and forward the synchronization signal to the first camera when the duration for which it has held the signal reaches the delay duration.
  • a method for controlling shooting includes: when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling a first camera to acquire a first image according to the image acquisition instruction, the image acquisition instruction being sent when the second processing unit receives a data acquisition request, the data acquisition request being used to instruct the second processing unit to control a second camera to acquire a second image;
  • when the first processing unit receives the synchronization signal sent by the second camera, acquiring the first exposure time of the first camera and the second exposure time of the second camera, the synchronization signal being the signal the second camera sends at the moment it starts exposing each frame of the second image; calculating the delay duration according to the first exposure time and the second exposure time; when the duration for which the first processing unit has held the received synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire the first image; and processing the first image and sending the processed first image to the second processing unit.
  • a device for controlling shooting includes an image acquisition module, a signal receiving module, a calculating module, a signal forwarding module, and a processing module;
  • the image acquisition module is configured to, when the first processing unit receives the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction, the image acquisition instruction being sent when the second processing unit receives a data acquisition request that instructs the second processing unit to control the second camera to acquire the second image;
  • the signal receiving module is configured to acquire the first exposure time of the first camera and the second exposure time of the second camera when the first processing unit receives the synchronization signal sent by the second camera, the synchronization signal being the signal the second camera sends at the moment it starts exposing each frame of the second image;
  • the calculating module is configured to calculate the delay duration according to the first exposure time and the second exposure time;
  • the signal forwarding module is configured to, when the duration for which the first processing unit has held the received synchronization signal reaches the delay duration, forward the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire the first image;
  • the processing module is configured to process the first image and send the processed first image to the second processing unit.
  • An electronic device includes a first processing unit, a second processing unit, and a camera module, the first processing unit being connected to the second processing unit and to the camera module;
  • the camera module includes a first camera and a second camera; the first processing unit is connected to the first camera through a control line, the second processing unit is connected to the second camera through a control line, and the first processing unit is connected to the second processing unit; the first processing unit is further connected to the first camera and the second camera through signal lines;
  • the second processing unit is configured to, when receiving a data acquisition request, control the second camera to acquire the second image according to the data acquisition request and send an image acquisition instruction to the first processing unit;
  • the first processing unit is configured to, when receiving the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction;
  • the second camera is configured to send a synchronization signal to the first processing unit at the moment it starts exposing each frame of the second image.
  • a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the method as described above.
  • FIG. 1 is an application scenario diagram of a method for controlling shooting in an embodiment;
  • FIG. 2 is an application scenario diagram of a method for controlling shooting in another embodiment;
  • FIG. 3 is a block diagram of an electronic device in an embodiment;
  • FIG. 4 is a schematic flowchart of a method for controlling shooting in an embodiment;
  • FIG. 5 is a schematic flowchart of a second processing unit sending an image acquisition instruction to a first processing unit in an embodiment;
  • FIG. 6 is a schematic flowchart of a first processing unit sending a processed first image to a second processing unit in an embodiment;
  • FIG. 7 is a block diagram of an apparatus for controlling shooting in an embodiment;
  • FIG. 8 is a block diagram of a processing module in an embodiment;
  • FIG. 9 is an application scenario diagram of a method for controlling shooting in an embodiment;
  • FIG. 10 is a schematic flowchart of a method for controlling shooting in an embodiment;
  • FIG. 11 is a schematic flowchart of processing a first image in an embodiment;
  • FIG. 12 is a schematic flowchart of obtaining a reference speckle image according to a temperature of a laser in an embodiment;
  • FIG. 13 is a schematic flowchart of selecting a data transmission channel according to a security level of an application in an embodiment;
  • FIG. 14 is a block diagram of an apparatus for controlling shooting in an embodiment;
  • FIG. 15 is a block diagram of a processing module in an embodiment.
  • a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
  • FIG. 1 is an application scenario diagram of a method of controlling shooting in an embodiment.
  • the application scenario may include a first camera 110 , a second camera 120 , a first processing unit 130 , and a second processing unit 140 .
  • the first camera 110 may be a laser camera
  • the second camera 120 may be an RGB (red/green/blue) camera.
  • the first processing unit 130 may be an MCU (Microcontroller Unit) module or the like
  • the second processing unit 140 may be a CPU (Central Processing Unit) module or the like.
  • the first processing unit 130 is connected to the first camera 110 through a control line
  • the second processing unit 140 is connected to the second camera 120 through a control line.
  • the first processing unit 130 is coupled to the second processing unit 140.
  • the second processing unit 140 is also connected to the first camera 110 and the second camera 120 through signal lines.
  • when the second processing unit 140 receives a data acquisition request, it may control the second camera 120 through the control line to acquire the second image according to the request, and send an image acquisition instruction to the first processing unit 130.
  • when the first processing unit 130 receives the image acquisition instruction sent by the second processing unit 140, it may control the first camera 110 through the control line to acquire the first image according to the instruction.
  • each time the second camera 120 starts exposing a frame of the second image, it can transmit a synchronization signal to the second processing unit 140 through the signal line.
  • when the second processing unit 140 receives the synchronization signal sent by the second camera 120, it may acquire the first exposure time of the first camera 110 and the second exposure time of the second camera 120, and calculate the delay duration according to the first exposure time and the second exposure time.
  • when the duration for which the second processing unit 140 has held the synchronization signal reaches the delay duration, the signal can be forwarded to the first camera 110 through the signal line.
  • after receiving the synchronization signal, the first camera 110 can start exposure, acquire the first image, and transmit the acquired first image to the first processing unit 130.
  • the first processing unit 130 may process the first image and send the processed first image to the second processing unit 140.
  • the electronic device 200 can include a camera module 210 , a second processing unit 220 , and a first processing unit 230 .
  • the second processing unit 220 described above may be a CPU module.
  • the first processing unit 230 may be an MCU module or the like.
  • the first processing unit 230 is connected between the second processing unit 220 and the camera module 210.
  • the first processing unit 230 can control the laser camera 212, the floodlight 214 and the laser light 218 in the camera module 210.
  • the second processing unit 220 can control the RGB camera 216 in the camera module 210.
  • the camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser light 218.
  • the laser camera 212 described above may be an infrared camera for acquiring an infrared image.
  • the floodlight 214 is a surface light source capable of emitting infrared light;
  • the laser light 218 is a point light source capable of emitting laser light and is a patterned point light source.
  • when the floodlight 214 is turned on, the laser camera 212 can acquire an infrared image from the reflected light.
  • when the laser light 218 is turned on, the laser camera 212 can acquire a speckle image from the reflected light.
  • the speckle image is an image in which the pattern of the patterned laser light emitted by the laser light 218 is deformed after being reflected.
  • the second processing unit 220 can connect the RGB camera 216 and the laser camera 212 through signal lines, respectively.
  • when the RGB camera 216 starts exposing each frame, a synchronization signal can be sent to the second processing unit 220.
  • the second processing unit 220 can then acquire the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216, and calculate the delay duration from these two exposure durations.
  • when the duration for which the second processing unit 220 has held the synchronization signal reaches the delay duration, the signal can be forwarded to the laser camera 212 through the signal line.
  • upon receiving the synchronization signal, the laser camera 212 can start exposure and acquire an infrared image, a speckle image, or the like.
  • the second processing unit 220 may include a CPU core running in a TEE (Trusted Execution Environment) and a CPU core running in a REE (Rich Execution Environment).
  • both the TEE and the REE are operating modes of an ARM (Advanced RISC Machines) module.
  • the security level of the TEE is high, and only one CPU core in the second processing unit 220 can run in the TEE at a time.
  • operations with a higher security requirement in the electronic device 200 need to be performed in the CPU core running in the TEE, while operations with a lower security requirement can be performed in the CPU core running in the REE.
  • the first processing unit 230 includes a PWM (Pulse Width Modulation) module 232, an SPI/I2C (Serial Peripheral Interface / Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238.
  • the PWM module 232 can transmit a pulse to the camera module, and control the floodlight 214 or the laser light 218 to be turned on, so that the laser camera 212 can acquire an infrared image or a speckle image.
  • the SPI/I2C interface 234 is configured to receive an image acquisition instruction sent by the second processing unit 220.
  • the depth engine 238 described above can process the speckle image to obtain a depth disparity map.
  • the image acquisition instruction can be sent to the first processing unit 230 by the CPU core running in the TEE.
  • after receiving the image acquisition instruction, the first processing unit 230 can transmit pulse waves through the PWM module 232 to turn on the floodlight 214 in the camera module 210 and collect an infrared image through the laser camera 212, and to turn on the laser light 218 in the camera module 210 and collect a speckle image through the laser camera 212.
  • the camera module 210 can transmit the collected infrared image and the speckle image to the first processing unit 230.
  • the first processing unit 230 may process the received infrared image to obtain an infrared parallax map; and process the received speckle image to obtain a speckle disparity map or a depth disparity map.
  • the processing by the first processing unit 230 on the infrared image and the speckle image refers to correcting the infrared image or the speckle image, and removing the influence of the internal and external parameters of the camera module 210 on the image.
  • the first processing unit 230 can be set to different modes, and images output by different modes are different.
  • when the first processing unit 230 is set to the speckle pattern mode, it processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to the depth map mode, it processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image being an image carrying depth information.
  • the first processing unit 230 may send the infrared parallax map and the speckle disparity map to the second processing unit 220, and the first processing unit 230 may also send the infrared disparity map and the depth disparity map to the second processing unit 220.
  • the second processing unit 220 may acquire a target infrared image according to the infrared disparity map described above, and acquire a depth image according to the depth disparity map described above. Further, the second processing unit 220 may perform face recognition, face matching, living body detection, and acquiring depth information of the detected face according to the target infrared image and the depth image.
  • the communication between the first processing unit 230 and the second processing unit 220 is through a fixed security interface to ensure the security of the transmitted data.
  • the data sent by the second processing unit 220 to the first processing unit 230 passes through the SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through the SECURE MIPI (Mobile Industry Processor Interface) 250.
  • the first processing unit 230 may also acquire the target infrared image according to the infrared disparity map, calculate the acquired depth image according to the depth disparity map, and send the target infrared image and the depth image to the second processing unit 220.
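  • As a compact illustration of the mode switching described above, the sketch below maps a configured mode of the first processing unit to the image it outputs. All names are hypothetical placeholders; the patent does not define a programming interface.

```c
#include <stdio.h>

/* Hypothetical operating modes of the first processing unit (MCU). */
typedef enum { MODE_SPECKLE_PATTERN, MODE_DEPTH_MAP } mcu_mode_t;

/* Placeholder for image data; a real unit would hold pixel buffers. */
typedef struct { const char *kind; } image_t;

/* Sketch: in speckle-pattern mode the speckle image is processed into a
 * speckle disparity map (from which a target speckle image is derived);
 * in depth-map mode it is processed into a depth disparity map (from
 * which a depth image is derived). */
image_t process_speckle_image(mcu_mode_t mode, const image_t *speckle) {
    (void)speckle;  /* correction / disparity computation elided */
    if (mode == MODE_SPECKLE_PATTERN)
        return (image_t){ "speckle disparity map" };
    return (image_t){ "depth disparity map" };
}

int main(void) {
    image_t in = { "speckle image" };
    printf("%s\n", process_speckle_image(MODE_DEPTH_MAP, &in).kind);
    return 0;
}
```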
  • the electronic device 300 includes a processor 310, a memory 320, a display screen 330, and an input device 340 that are coupled by a system bus 350.
  • the memory 320 may include a non-volatile storage medium 322 and an internal memory 324.
  • the non-volatile storage medium 322 of the electronic device 300 stores an operating system 3222 and a computer program 3224.
  • the computer program 3224 is executed by the processor 310 to implement a method for controlling shooting provided in the embodiments of the present application.
  • the processor 310 is configured to provide computing and control capabilities to support operation of the entire electronic device 300.
  • the internal memory 324 in the electronic device 300 provides an environment for the operation of the computer program 3224 in the non-volatile storage medium 322.
  • the display screen 330 of the electronic device 300 may be a liquid crystal display or an electronic ink display or the like.
  • the input device 340 may be a touch layer covering the display screen 330, a button, trackball, or touchpad provided on the outer casing of the electronic device 300, or an external keyboard, trackpad, or mouse.
  • the electronic device 300 can be a cell phone, a tablet, a personal digital assistant, a wearable device, or the like. A person skilled in the art can understand that the structure shown in FIG. 3 is only a block diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the electronic device to which the solution is applied; a specific electronic device may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
  • a method of controlling shooting including the following steps:
  • Step 1410: when the second processing unit receives the data acquisition request, control the second camera to acquire the second image according to the data acquisition request, and send an image acquisition instruction to the first processing unit, where the image acquisition instruction is used to instruct the first processing unit to control the first camera to acquire the first image.
  • when an application needs to acquire face data, the first camera can be controlled to turn on and collect the first image; the face data can include, but is not limited to, data used for face verification in scenarios such as face unlocking and face payment, as well as face depth information.
  • the first camera can be a laser camera, and the laser camera can collect invisible-light images at different wavelengths.
  • the first image may include, but is not limited to, an infrared image, a speckle image, and the like, where a speckle image refers to an infrared image carrying a speckle pattern.
  • when an application needs to acquire face data, it may send a data acquisition request to the second processing unit.
  • the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module, and the second processing unit may be a CPU module.
  • the second processing unit may first detect whether a visible light image acquisition instruction is included in the data acquisition request; if so, the application needs to acquire a visible light image containing the human face at the same time as the face data. If the data acquisition request includes a visible light image acquisition instruction, the second processing unit may control the second camera to acquire the second image according to that instruction, where the second camera may be an RGB camera and the second image may be an RGB image of a human face.
  • the first camera may be controlled to acquire the first image according to the image capturing instruction, wherein the first image may include an infrared image, a speckle image, and the like.
  • the first processing unit can control the floodlight in the camera module to turn on and collect an infrared image through the laser camera, and can turn on the laser light in the camera module and collect a speckle image through the laser camera.
  • the floodlight can be a point source that uniformly illuminates in all directions.
  • the light emitted by the floodlight can be infrared light, and the laser camera can collect the face to obtain an infrared image.
  • the laser beam emitted by the laser light can be diffracted by a lens and a diffractive optical element (DOE) to produce a pattern of speckle particles, which is projected onto the target object; because the distances from the electronic device to different points of the target object differ, offsets arise in the speckle pattern, and the laser camera captures the target object to obtain a speckle image.
  • Step 1420: when the second processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is the signal the second camera sends at the moment it starts exposing each frame of the second image.
  • the first processing unit may be connected to the first camera through a control line and control the first camera to acquire the first image over it.
  • the second processing unit may be connected to the second camera through a control line and control the second camera to acquire the second image over it.
  • the first processing unit can be coupled to the second processing unit.
  • the second processing unit may also be respectively connected to the first camera and the second camera through signal lines, wherein the signal lines may be synchronous signal lines.
  • the second camera may send a synchronization signal over the signal line to the second processing unit at the moment it starts exposing, and the synchronization signal may be the start of frame (SOF) of the frame, which marks the start of exposure for each frame of the image.
  • when the second processing unit receives the synchronization signal sent by the second camera, it may acquire the first exposure time of the first camera and the second exposure time of the second camera; the exposure time refers to the photosensitive duration, and the longer the exposure time, the more light can be admitted.
  • the first exposure time of the first camera generally differs from the second exposure time of the second camera; typically the first exposure time is smaller than the second exposure time, but the reverse case is also possible.
  • Step 1430: calculate the delay duration according to the first exposure time and the second exposure time.
  • the second processing unit may calculate the delay duration according to the first exposure time of the first camera and the second exposure time of the second camera; the delay duration is the amount by which the moment the first camera starts exposing is postponed, which ensures that the first camera stays synchronized with the second camera.
  • the electronic device may preset a moment at which the first camera and the second camera are synchronized during the exposure process; synchronization here means that the ratio of the time the first camera has been exposing to the first exposure time equals the ratio of the time the second camera has been exposing to the second exposure time. For example, the two cameras may be set to coincide at half of their exposures, at 3/4 of their exposures, or at the end of their exposures.
  • the second processing unit may calculate the delay duration according to the first exposure time, the second exposure time, and the set timing of synchronization during the exposure.
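  • To make the preset synchronization moment concrete (the notation below is assumed here for illustration, not taken from the patent): suppose the second camera starts exposing at time t0 and the first camera at t0 + d, with first and second exposure times T1 and T2, and require both cameras to reach the same fraction f of their exposures at the same instant. Then t0 + f·T2 = (t0 + d) + f·T1, which gives d = f·(T2 − T1). For the mid-exposure choice f = 1/2 this reduces to d = (T2 − T1)/2, the divide-the-exposure-time-difference-by-2 rule used in the embodiments below.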
  • Step 1440: when the duration for which the second processing unit has held the received synchronization signal reaches the delay duration, forward the synchronization signal to the first camera; the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
  • after holding the received synchronization signal for the delay duration, the second processing unit forwards it to the first camera, which then starts exposure; this ensures that the two cameras reach the preset synchronization point at the same instant.
  • for example, if the electronic device presets the cameras to coincide at half of their exposures, the second processing unit calculates the delay duration accordingly and forwards the synchronization signal once it has held it for that long; when the first camera has then exposed half of the first exposure time, the second camera has likewise exposed half of the second exposure time, so the two are consistent.
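  • A minimal sketch of this hold-then-forward behavior, assuming hypothetical platform hooks (delay_us and forward_sync_to_first_camera are illustrative stubs, not APIs from the patent):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical platform hooks, stubbed so the sketch runs standalone. */
static void delay_us(uint32_t us) { printf("hold sync for %u us\n", (unsigned)us); }
static void forward_sync_to_first_camera(void) { puts("sync forwarded"); }

/* Invoked when the synchronization signal arrives, i.e. when the second
 * camera starts exposing a frame. Holding the signal for the delay
 * duration before forwarding postpones the first camera's exposure start
 * by exactly that amount. */
static void on_sync_signal(uint32_t first_exposure_us, uint32_t second_exposure_us) {
    /* Mid-exposure alignment: delay = (T2 - T1) / 2, per the derivation above. */
    uint32_t delay = (second_exposure_us - first_exposure_us) / 2;
    delay_us(delay);
    forward_sync_to_first_camera();
}

int main(void) {
    on_sync_signal(3000, 30000);  /* 3 ms and 30 ms, as in the example below */
    return 0;
}
```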
  • Step 1450 The first image is processed by the first processing unit, and the processed first image is sent to the second processing unit.
  • the first camera may transmit the acquired first image to the first processing unit, and the first processing unit may process the first image.
  • the first processing unit can be set to different modes, and different modes can collect different first images, perform different processing on the first image, and the like.
  • the first processing unit can turn on the floodlight and collect an infrared image through the first camera, and the infrared image can be processed to obtain an infrared disparity map.
  • when the first processing unit is in the speckle pattern mode, it can turn on the laser light and collect a speckle image through the first camera, and the speckle image can be processed to obtain a speckle disparity map.
  • when the first processing unit is in the depth map mode, it may process the speckle image to obtain a depth disparity map.
  • the first processing unit may perform correction processing on the first image; the correction processing corrects the image content offset caused by internal and external parameters of the first camera and the second camera, for example an offset caused by a deflection of the laser camera, and a disparity map of the first image is obtained.
  • correcting the infrared image yields an infrared disparity map, and correcting the speckle image yields a speckle disparity map or a depth disparity map.
  • performing correction processing on the first image prevents the image finally presented on the screen of the electronic device from appearing ghosted.
  • the first processing unit processes the first image, and the processed first image may be sent to the second processing unit.
  • the second processing unit may obtain a target image according to the processed first image, such as a target infrared image, a target speckle image, a target depth map, and the like.
  • the second processing unit can process the target image according to the requirements of the application.
  • the second processing unit may perform face detection according to the target image or the like, wherein the face detection may include face recognition, face matching, and living body detection.
  • Face recognition refers to detecting whether there is a face in the target image.
  • Face matching refers to matching the face in the target image with the pre-existing face.
  • the detection of the living body refers to detecting whether the face in the target image has biological activity or the like. If the application needs to obtain the depth information of the face, the generated target depth map can be uploaded to the application, and the application can perform the beauty processing, the three-dimensional modeling, and the like according to the received target depth map.
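  • These three stages can be chained so that each later, costlier stage runs only when the earlier one succeeds (the serial order detailed in the TEE discussion below). A sketch of that chaining; the three stage functions are hypothetical stubs, not APIs defined by the patent:

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical detection stages, stubbed for illustration. */
static bool recognize_face(void)  { return true; } /* is a face present?   */
static bool match_face(void)      { return true; } /* same as stored face? */
static bool detect_liveness(void) { return true; } /* biologically active? */

/* Serial face detection: each later, costlier stage runs only when the
 * earlier ones succeed, relieving the second processing unit's load. */
static bool face_detection_serial(void) {
    if (!recognize_face()) return false;  /* no face: skip match and liveness */
    if (!match_face())     return false;  /* different face: skip liveness    */
    return detect_liveness();             /* rejects 2D plane faces etc.      */
}

int main(void) {
    printf("verified: %s\n", face_detection_serial() ? "yes" : "no");
    return 0;
}
```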
  • in the method of controlling shooting above, the delay duration is calculated from the exposure times of the two cameras, and the synchronization signal is forwarded to the first camera when the duration for which the second processing unit has held it reaches the delay duration. The time point of forwarding the synchronization signal is adjusted dynamically according to the exposure times of the first camera and the second camera, so the second processing unit dynamically tunes the moment at which the two cameras are synchronized; the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the consistency of the image content collected by the two cameras can still be ensured.
  • in one embodiment, step 1430, calculating the delay duration according to the first exposure time and the second exposure time, includes: calculating the exposure time difference between the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay duration.
  • the electronic device can set the first camera and the second camera to coincide at half of the exposure time, and when the first camera is exposed to half, the second camera is also exposed to half.
  • the exposure time difference between the first exposure time and the second exposure time can be calculated, and the exposure time difference is divided by 2 to obtain the delay time length.
  • for example, if the first exposure time is 3 ms and the second exposure time is 30 ms, the exposure time difference between the first exposure time and the second exposure time is 27 ms, and dividing it by 2 gives a delay duration T3 of 13.5 ms.
  • optionally, the exposure time difference may first be compared with a time threshold; if the exposure time difference is greater than the time threshold, it is divided by 2 to obtain the delay duration.
  • when the duration for which the second processing unit has held the synchronization signal reaches the delay duration, the signal is forwarded to the first camera.
  • if the exposure time difference is not greater than the time threshold, the second processing unit may forward the synchronization signal to the first camera directly, without postponing the moment at which the first camera starts exposing.
  • the time threshold can be set according to actual requirements, for example 1 ms or 2 ms, which keeps the image content captured by the first camera and the second camera within tolerance while reducing the computational load on the second processing unit.
  • alternatively, the second processing unit may calculate the first intermediate exposure moment of the first exposure time and the second intermediate exposure moment of the second exposure time, where an intermediate exposure moment is the moment at which half of the exposure has elapsed; the difference between the first intermediate exposure moment and the second intermediate exposure moment is then used as the delay duration.
  • for example, if the first exposure time is 3 ms and the second exposure time is 30 ms, the first intermediate exposure moment falls at 1.5 ms and the second at 15 ms; the difference between them, 13.5 ms, is used as the delay duration T3. It can be understood that other algorithms can also be used to ensure synchronization between the first camera and the second camera; the approaches are not limited to the ways above.
  • the time point of forwarding the synchronization signal can thus be adjusted dynamically according to the exposure times of the first camera and the second camera, so the moment at which the two cameras are synchronized can be tuned dynamically, ensuring that the first camera and the second camera coincide at half of their exposures, with a good synchronization effect.
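  • The two equivalent calculations and the time-threshold shortcut above can be combined into one sketch (the 2 ms threshold and all names are illustrative assumptions consistent with the text):

```c
#include <stdint.h>
#include <stdio.h>

/* Below this exposure-time difference the captured contents are assumed to
 * be within tolerance and the sync signal is forwarded at once (the text
 * suggests a threshold on the order of 1-2 ms; 2 ms is assumed here). */
#define SYNC_THRESHOLD_US 2000u

/* Method 1: half the exposure-time difference; 0 means forward immediately. */
static uint32_t delay_by_difference(uint32_t t1_us, uint32_t t2_us) {
    uint32_t diff = t2_us - t1_us;
    return diff > SYNC_THRESHOLD_US ? diff / 2 : 0;
}

/* Method 2: difference of the two mid-exposure instants; equivalent, since
 * t2/2 - t1/2 == (t2 - t1)/2. */
static uint32_t delay_by_midpoints(uint32_t t1_us, uint32_t t2_us) {
    return t2_us / 2 - t1_us / 2;
}

int main(void) {
    /* The worked example: 3 ms and 30 ms both give a 13.5 ms delay. */
    printf("by difference: %u us\n", (unsigned)delay_by_difference(3000, 30000));
    printf("by midpoints:  %u us\n", (unsigned)delay_by_midpoints(3000, 30000));
    return 0;
}
```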
  • in one embodiment, sending the image acquisition instruction to the first processing unit and transmitting the processed first image to the second processing unit include the following steps:
  • Step 1502 Send an image acquisition instruction to the first processing unit by using a kernel running in the first operation mode in the second processing unit, where the first operation mode is a trusted operating environment.
  • the second processing unit in the electronic device may include two operating modes: the first operating mode may be the TEE, a trusted operating environment with a high security level, and the second operating mode may be the REE, a natural operating environment with a lower security level.
  • when the second processing unit receives a data acquisition request, the image acquisition instruction may be sent to the first processing unit through the kernel in the first operating mode.
  • when the second processing unit is single-core, that core can be switched directly from the second operating mode to the first operating mode; when the second processing unit is multi-core, one core is switched from the second operating mode to the first operating mode while the other cores remain in the second operating mode, and the image acquisition instruction is sent to the first processing unit by the kernel running in the first operating mode.
  • Step 1504 The first processing unit sends the processed first image to a kernel running in the first operating mode in the second processing unit.
  • the processed first image may be sent to the kernel running in the first operating mode, so that the first processing unit is always running in a trusted operating environment.
  • the second processing unit may obtain the target image according to the processed first image in the kernel running in the first operation mode, and process the target image according to the requirements of the application. For example, the second processing unit may perform face detection on the target image in a kernel running in the first mode of operation.
  • face recognition, face matching, and living-body detection can be performed on the target image serially.
  • the second processing unit may perform face recognition on the target image first; when a face is recognized, it matches the face contained in the target image against the pre-stored face to determine whether they are the same face; if they are the same face, living-body detection is performed on the face according to the target image, which prevents the captured face from being a two-dimensional plane face or the like. When no face is recognized, face matching and living-body detection can be skipped, which relieves the processing load on the second processing unit.
  • the image acquisition instruction is sent to the first processing unit by the highly secure kernel of the second processing unit, which ensures that the first processing unit operates in a highly secure environment and improves data security.
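  • A minimal sketch of this dispatch, assuming hypothetical helpers for switching a core into the trusted mode and for exchanging data with the MCU over the secure interfaces (none of these names are real TEE or kernel APIs):

```c
#include <stdio.h>

typedef enum { MODE_TEE, MODE_REE } cpu_mode_t;  /* first / second operating mode */
typedef struct { cpu_mode_t mode; int id; } cpu_core_t;
typedef struct { const char *payload; } image_t;

/* Hypothetical secure-interface helpers, stubbed for illustration. */
static void send_capture_cmd(const cpu_core_t *c)        /* over SECURE SPI/I2C */
{ printf("core %d (TEE) -> MCU: image acquisition instruction\n", c->id); }
static image_t recv_processed_image(const cpu_core_t *c) /* over SECURE MIPI */
{ printf("MCU -> core %d (TEE): processed first image\n", c->id);
  return (image_t){ "disparity map" }; }

/* On a multi-core CPU only one core is switched into the trusted mode;
 * it alone exchanges instructions and images with the first processing unit. */
static image_t acquire_in_tee(cpu_core_t *core) {
    core->mode = MODE_TEE;              /* switch this core to the first operating mode */
    send_capture_cmd(core);             /* step 1502 */
    return recv_processed_image(core);  /* step 1504 */
}

int main(void) {
    cpu_core_t core0 = { MODE_REE, 0 };
    printf("%s\n", acquire_in_tee(&core0).payload);
    return 0;
}
```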
  • the method for controlling shooting includes the following steps:
  • Step 1602: acquire the application type of the application that sends the data acquisition request.
  • Step 1604: determine the security level of the application according to the application type.
  • the second processing unit may acquire an application type of the application and acquire a security level corresponding to the application type.
  • Application types may include, but are not limited to, an unlock application, a payment application, a camera application, a beauty application, and the like.
  • the security level of different application types may be different. For example, the security level corresponding to the payment application and the unlock application may be high, and the security level corresponding to the camera application and the beauty application may be low, but is not limited thereto.
  • Step 1606: select the data transmission channel corresponding to the security level.
  • the second processing unit may select the data transmission channel corresponding to the security level of the application; the data transmission channels may include, but are not limited to, a secure channel and a non-secure channel, where the secure channel may correspond to applications with a higher security level and the non-secure channel to applications with a lower security level.
  • the payment application can correspond to a secure channel
  • the beauty application can correspond to a normal channel.
  • in the secure channel, the transmitted data can be encrypted to avoid data leakage or theft.
  • Step 1608 When the data transmission channel is a secure channel, the first processing unit sends the processed first image to the kernel running in the first operating mode in the second processing unit.
  • the first processing unit may send the processed first image to the kernel running in the first operating mode in the second processing unit.
  • the second processing unit may obtain the target image according to the processed first image in the kernel running in the first operation mode, including the target infrared image, the target speckle image or the target depth image, and the like.
  • the second processing unit may perform face detection on the target image in the kernel running in the first operation mode, and may perform face recognition, face matching, and living body detection, etc., by a serial manner.
  • the second processing unit can transmit the data required by the application to the application through a secure channel according to the requirements of the application.
  • the second processing unit may transmit the result of the face detection to the application through a secure channel; if the application needs to acquire the depth information of the face, the second processing unit may target the target The depth map is transmitted to the application via a secure channel.
  • Step 1610 When the data transmission channel is a non-secure channel, the first processing unit sends the processed first image to the camera driver in the second operation mode, and the second operation mode is a natural operation environment.
  • the first processing unit may send the processed first image to the camera driver, and the camera driver may run on a kernel in the second operating mode of the second processing unit.
  • the second processing unit may perform face detection on the target image by using a camera driver, wherein the target image may be obtained according to the processed first image.
  • the second processing unit can perform face detection on the target image in parallel in the REE environment, performing face recognition, face matching, and living-body detection in multiple kernels in the second operating mode respectively, which can improve data processing efficiency.
  • the camera driver can transfer the data required by the application to the application according to the needs of the application.
  • the second processing unit may acquire the security level of the application that sent the data acquisition request and determine the image accuracy corresponding to the security level. The higher the image accuracy, the clearer the corresponding image and the more information it contains.
  • the second processing unit may send image data corresponding to the image precision to the application.
  • an application with a high security level may correspond to a target depth image with high image precision, and an application with a low security level may correspond to a target depth image with low image precision.
  • the second processing unit can adjust the image precision of the image data by adjusting the image resolution: the higher the resolution, the higher the image precision; the lower the resolution, the lower the image precision.
  • the image precision can also be controlled in other ways, and is not limited to the above several ways.
  • the corresponding data channel is selected according to the security level of the application to transmit data, and the security of the data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
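  • The routing and precision policy of steps 1602 through 1610 can be summarized as below; the application types, the two-level mapping, and the resolution halving are illustrative choices consistent with the text, not values fixed by the patent:

```c
#include <stdio.h>
#include <string.h>

typedef enum { SECURITY_LOW, SECURITY_HIGH } security_level_t;
typedef enum { CHANNEL_NORMAL, CHANNEL_SECURE } channel_t;

/* Security level from application type: payment/unlock high, camera/beauty low. */
static security_level_t level_of(const char *app_type) {
    if (strcmp(app_type, "payment") == 0 || strcmp(app_type, "unlock") == 0)
        return SECURITY_HIGH;
    return SECURITY_LOW;
}

/* High security -> secure channel into the TEE kernel (data encrypted);
 * low security -> normal channel to the camera driver in the REE. */
static channel_t channel_of(security_level_t lvl) {
    return lvl == SECURITY_HIGH ? CHANNEL_SECURE : CHANNEL_NORMAL;
}

/* Image precision controlled through resolution: lower-security applications
 * receive a lower-resolution, hence lower-precision, image. */
static unsigned resolution_of(security_level_t lvl, unsigned full_res) {
    return lvl == SECURITY_HIGH ? full_res : full_res / 2;
}

int main(void) {
    const char *apps[] = { "payment", "beauty" };
    for (int i = 0; i < 2; i++) {
        security_level_t lvl = level_of(apps[i]);
        printf("%s: channel=%s, resolution=%u\n", apps[i],
               channel_of(lvl) == CHANNEL_SECURE ? "secure" : "normal",
               resolution_of(lvl, 640));
    }
    return 0;
}
```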
  • a method of controlling shooting comprising the steps of:
  • Step (1): when the second processing unit receives the data acquisition request, control the second camera to acquire the second image according to the data acquisition request, and send an image acquisition instruction to the first processing unit, where the image acquisition instruction is used to instruct the first processing unit to control the first camera to acquire the first image.
  • the first processing unit is connected to the first camera through a control line
  • the second processing unit is connected to the second camera through a control line
  • the first processing unit is connected to the second processing unit
  • the second processing unit is further connected to the first camera and the second camera through signal lines.
  • Step (2): when the second processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is the signal the second camera sends at the moment it starts exposing each frame of the second image.
  • Step (3): calculate the delay duration according to the first exposure time and the second exposure time.
  • the step (3) comprises: calculating an exposure time difference between the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain a delay time length.
  • in one embodiment, step (3) includes: respectively calculating the first intermediate exposure moment of the first exposure time and the second intermediate exposure moment of the second exposure time, determining the difference between the first intermediate exposure moment and the second intermediate exposure moment, and using the difference as the delay duration.
  • Step (4) when the duration of the synchronization signal received by the second processing unit reaches the delay duration, the synchronization signal is forwarded to the first camera, and the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
  • step (5) the first image is processed by the first processing unit, and the processed first image is sent to the second processing unit.
  • in one embodiment, step (1) includes: sending the image acquisition instruction to the first processing unit through the kernel running in the first operating mode in the second processing unit, the first operating mode being a trusted operating environment; and step (5) includes: sending, by the first processing unit, the processed first image to the kernel running in the first operating mode in the second processing unit.
  • in one embodiment, step (5) includes: acquiring the application type of the application that sends the data acquisition request; determining the security level of the application according to the application type; selecting the data transmission channel corresponding to the security level; when the data transmission channel is a secure channel, sending, by the first processing unit, the processed first image to the kernel running in the first operating mode in the second processing unit; and when the data transmission channel is a non-secure channel, sending, by the first processing unit, the processed first image to the camera driver in the second operating mode, the second operating mode being a natural operating environment.
  • the method for controlling shooting includes: acquiring a security level of an application that sends a data acquisition request; determining image accuracy corresponding to the security level; and transmitting image data corresponding to the image accuracy to the application.
  • in the method of controlling shooting above, the delay duration is calculated from the exposure times of the two cameras, and the synchronization signal is forwarded to the first camera when the duration for which the second processing unit has held it reaches the delay duration. The time point of forwarding the synchronization signal is adjusted dynamically according to the exposure times of the first camera and the second camera, so the second processing unit dynamically tunes the moment at which the two cameras are synchronized; the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the consistency of the image content collected by the two cameras can still be ensured.
  • it should be understood that, although the steps in the flow diagrams above are displayed sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flow diagrams above may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be executed at different times; their execution order is not necessarily sequential, and they may be performed in turn or alternately with at least a portion of the other steps, or of the sub-steps or stages of other steps.
  • an electronic device including a first processing unit, a second processing unit, and a camera module, the first processing unit being coupled to the second processing unit and the camera module, respectively.
  • the camera module includes a first camera and a second camera.
  • the first processing unit is connected to the first camera through a control line
  • the second processing unit is connected to the second camera through a control line.
  • the first processing unit is connected to the second processing unit, and the second processing unit is further connected to the first camera and the second camera through signal lines.
  • the second processing unit is configured to, when receiving the data acquisition request, control the second camera to acquire the second image according to the data acquisition request, and send the image collection instruction to the first processing unit.
  • the first processing unit is configured to control the first camera to acquire the first image according to the image capturing instruction.
• the second camera is configured to send a synchronization signal to the second processing unit at the moment exposure starts when acquiring each frame of the second image.
• the second processing unit is further configured to: when receiving the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera, and calculate the delay duration according to the first exposure time and the second exposure time.
• the second processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since it received the synchronization signal reaches the delay duration.
  • the first camera is configured to start exposure according to the synchronization signal and acquire the first image.
  • the first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
• in the above solution, the delay duration is calculated according to the exposure durations of the two cameras, and the synchronization signal is forwarded to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration; the time point of forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted by the second processing unit, the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the image content collected by the two cameras can still be kept consistent.
• the second processing unit is further configured to calculate the exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay duration.
• the second processing unit is further configured to respectively calculate a first intermediate exposure time of the first exposure time and a second intermediate exposure time of the second exposure time, determine the difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay duration.
• the time point of forwarding the synchronization signal can be dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, ensuring that the first camera and the second camera are consistent at half of their exposure, and the synchronization effect is good.
• the second processing unit is further configured to send the image acquisition instruction to the first processing unit through a kernel running in the first operating mode in the second processing unit, where the first operating mode is a trusted execution environment (TEE).
  • the first processing unit is further configured to send the processed first image to the kernel running in the first operating mode in the second processing unit.
• the image acquisition instruction is sent to the first processing unit by the high-security kernel of the second processing unit, which ensures that the first processing unit is in a high-security environment and improves data security.
  • the second processing unit is further configured to acquire an application type of the application that sends the data acquisition request, determine a security level of the application according to the application type, and select a data transmission channel corresponding to the security level.
  • the first processing unit is further configured to: when the data transmission channel is a secure channel, send the processed first image to the kernel running in the first operating mode in the second processing unit.
• the first processing unit is further configured to: when the data transmission channel is a non-secure channel, send the processed first image to the camera driver in the second operating mode, where the second operating mode is a rich execution environment (REE).
  • the second processing unit is further configured to acquire a security level of the application that sends the data acquisition request, determine image accuracy corresponding to the security level, and send image data corresponding to the image accuracy to the application.
  • the corresponding data channel is selected according to the security level of the application to transmit data, and the security of the data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
  • an apparatus 700 for controlling shooting including a request receiving module 710, a signal receiving module 720, a computing module 730, a signal forwarding module 740, and a processing module 750.
• the request receiving module 710 is configured to, when the second processing unit receives a data acquisition request, control the second camera to acquire the second image according to the data acquisition request and send an image acquisition instruction to the first processing unit, where the image acquisition instruction is used to instruct the first processing unit to control the first camera to acquire the first image.
  • the first processing unit is connected to the first camera through a control line
  • the second processing unit is connected to the second camera through a control line
• the first processing unit is connected to the second processing unit, and the second processing unit is further connected to the first camera and the second camera through signal lines.
• the signal receiving module 720 is configured to: when the second processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is a signal sent by the second camera at the moment exposure starts when collecting each frame of the second image.
  • the calculation module 730 is configured to calculate the delay duration according to the first exposure time and the second exposure time.
• the signal forwarding module 740 is configured to forward the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration; the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
  • the processing module 750 is configured to process the first image by using the first processing unit, and send the processed first image to the second processing unit.
• in the above apparatus, the delay duration is calculated according to the exposure durations of the two cameras, and the synchronization signal is forwarded to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration; the time point of forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted by the second processing unit, the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the image content collected by the two cameras can still be kept consistent.
• the calculation module 730 is further configured to calculate the exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay duration.
• the calculation module 730 is further configured to respectively calculate a first intermediate exposure time of the first exposure time and a second intermediate exposure time of the second exposure time, determine the difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay duration.
• the time point of forwarding the synchronization signal can be dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, ensuring that the first camera and the second camera are consistent at half of their exposure, and the synchronization effect is good.
• the request receiving module 710 is further configured to send the image acquisition instruction to the first processing unit through a kernel running in the first operating mode in the second processing unit, where the first operating mode is a trusted execution environment (TEE).
  • the processing module 750 is further configured to send, by using the first processing unit, the processed first image to the kernel running in the first operating mode in the second processing unit.
• the image acquisition instruction is sent to the first processing unit by the high-security kernel of the second processing unit, which ensures that the first processing unit is in a high-security environment and improves data security.
  • the processing module 750 includes a type obtaining unit 752, a level determining unit 754, a selecting unit 756, and a sending unit 758.
  • the type obtaining unit 752 is configured to acquire an application type of an application that transmits a data acquisition request.
  • the level determining unit 754 is configured to determine a security level of the application according to an application type.
  • the selecting unit 756 is configured to select a data transmission channel corresponding to the security level.
• the sending unit 758 is configured to, when the data transmission channel is a secure channel, send the processed first image through the first processing unit to the kernel running in the first operating mode in the second processing unit.
• the sending unit 758 is further configured to, when the data transmission channel is a non-secure channel, send the processed first image through the first processing unit to the camera driver in the second operating mode, where the second operating mode is a rich execution environment (REE).
  • the level determining unit 754 is further configured to acquire a security level of an application that sends a data acquisition request, and determine an image accuracy corresponding to the security level.
  • the sending unit 758 is further configured to send image data corresponding to the image accuracy to the application.
  • the corresponding data channel is selected according to the security level of the application to transmit data, and the security of the data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
  • FIG. 9 is an application scenario diagram of a method of controlling shooting in an embodiment.
• the application scenario may include a first camera 110, a second camera 120, a first processing unit 130, and a second processing unit 140.
  • the first camera 110 may be a laser camera
  • the second camera 120 may be an RGB (Red/Green/Blue, red/green/blue color mode) camera.
  • the first processing unit 130 may be an MCU (Microcontroller Unit) module or the like
  • the second processing unit 140 may be a CPU (Central Processing Unit) module or the like.
  • the first processing unit 130 is connected to the first camera 110 through a control line
  • the second processing unit 140 is connected to the second camera 120 through a control line.
  • the first processing unit 130 is coupled to the second processing unit 140.
  • the first processing unit 130 is also connected to the first camera 110 and the second camera 120 through signal lines.
• after receiving a data acquisition request, the second processing unit 140 may control the second camera 120 through the control line to acquire the second image according to the data acquisition request, and send an image acquisition instruction to the first processing unit 130.
• when the first processing unit 130 receives the image acquisition instruction sent by the second processing unit 140, it may control the first camera 110 through the control line to acquire the first image according to the image acquisition instruction.
• when acquiring each frame of the second image, the second camera 120 may transmit a synchronization signal to the first processing unit 130 through the signal line at the moment exposure starts.
• when the first processing unit 130 receives the synchronization signal sent by the second camera 120, it may acquire the first exposure time of the first camera 110 and the second exposure time of the second camera 120, and calculate the delay duration according to the first exposure time and the second exposure time.
• when the time elapsed since the first processing unit 130 received the synchronization signal reaches the delay duration, the synchronization signal can be forwarded to the first camera 110 through the signal line.
• after receiving the synchronization signal, the first camera 110 can start exposure and acquire the first image, and the acquired first image can be transmitted to the first processing unit 130.
  • the first processing unit 130 may process the first image and send the processed first image to the second processing unit 140.
• the electronic device 200 can include a camera module 210, a second processing unit 220, and a first processing unit 230.
  • the second processing unit 220 described above may be a CPU module.
  • the first processing unit 230 may be an MCU module or the like.
  • the first processing unit 230 is connected between the second processing unit 220 and the camera module 210.
  • the first processing unit 230 can control the laser camera 212, the floodlight 214, and the laser light 218 in the camera module 210.
  • the second processing unit 220 can control the RGB camera 216 in the camera module 210.
  • the camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser light 218.
  • the laser camera 212 described above may be an infrared camera for acquiring an infrared image.
  • the floodlight 214 is a surface light source capable of emitting infrared light;
  • the laser light 218 is a point light source capable of emitting laser light and is a patterned point light source.
• when the floodlight 214 is turned on, the laser camera 212 can acquire an infrared image according to the reflected light.
• when the laser light 218 is turned on, the laser camera 212 can acquire a speckle image based on the reflected light.
• the speckle image is an image in which the patterned laser light emitted by the laser light 218 is reflected so that the pattern is deformed.
• the first processing unit 230 can be connected to the RGB camera 216 and the laser camera 212 through signal lines, respectively.
• when the RGB camera 216 starts exposure for each frame it collects, a synchronization signal can be sent to the first processing unit 230.
• after receiving the synchronization signal, the first processing unit 230 can acquire the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216, and calculate the delay duration according to the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216.
• when the time elapsed since the first processing unit 230 received the synchronization signal reaches the delay duration, the synchronization signal can be forwarded to the laser camera 212 through the signal line.
• after the laser camera 212 receives the synchronization signal, it can start exposure based on the synchronization signal and acquire an infrared image, a speckle image, or the like.
  • the second processing unit 220 may include a CPU core running in a TEE (Trusted Execution Environment) environment and a CPU core running in a REE (Rich Execution Environment) environment.
  • the TEE environment and the REE environment are the operating modes of the ARM module (Advanced RISC Machines, advanced reduced instruction set processor).
  • the security level of the TEE environment is high, and only one CPU core in the second processing unit 220 can run in the TEE environment at the same time.
• operation behaviors with a higher security level in the electronic device 200 need to be performed in the CPU core in the TEE environment, while operation behaviors with a lower security level can be performed in the CPU core in the REE environment.
• the first processing unit 230 includes a PWM (Pulse Width Modulation) module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238.
  • the PWM module 232 can transmit a pulse to the camera module, and control the floodlight 214 or the laser light 218 to be turned on, so that the laser camera 212 can acquire an infrared image or a speckle image.
  • the SPI/I2C interface 234 is configured to receive an image acquisition instruction sent by the second processing unit 220.
  • the depth engine 238 described above can process the speckle image to obtain a depth disparity map.
• the image acquisition instruction can be sent to the first processing unit 230 by the CPU core running in the TEE environment.
• the first processing unit 230 can transmit pulse waves through the PWM module 232 to control the floodlight 214 in the camera module 210 to turn on and collect infrared images through the laser camera 212, and to control the laser light 218 in the camera module 210 to turn on and collect speckle images through the laser camera 212.
  • the camera module 210 can transmit the collected infrared image and the speckle image to the first processing unit 230.
  • the first processing unit 230 may process the received infrared image to obtain an infrared parallax map; and process the received speckle image to obtain a speckle disparity map or a depth disparity map.
  • the processing by the first processing unit 230 on the infrared image and the speckle image refers to correcting the infrared image or the speckle image, and removing the influence of the internal and external parameters of the camera module 210 on the image.
  • the first processing unit 230 can be set to different modes, and images output by different modes are different.
• when the first processing unit 230 is set to the speckle image mode, the first processing unit 230 processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to the depth map mode, the first processing unit 230 processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image being an image carrying depth information.
  • the first processing unit 230 may send the infrared parallax map and the speckle disparity map to the second processing unit 220, and the first processing unit 230 may also send the infrared disparity map and the depth disparity map to the second processing unit 220.
  • the second processing unit 220 may acquire a target infrared image according to the infrared disparity map described above, and acquire a depth image according to the depth disparity map described above. Further, the second processing unit 220 may perform face recognition, face matching, living body detection, and acquiring depth information of the detected face according to the target infrared image and the depth image.
  • the communication between the first processing unit 230 and the second processing unit 220 is through a fixed security interface to ensure the security of the transmitted data.
• the data sent by the second processing unit 220 to the first processing unit 230 passes through the SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through the SECURE MIPI (Mobile Industry Processor Interface) 250.
• the first processing unit 230 may also acquire the target infrared image according to the infrared disparity map, obtain the depth image according to the depth disparity map, and send the target infrared image and the depth image to the second processing unit 220.
  • the electronic device 300 includes a processor 310, a memory 320, a display screen 330, and an input device 340 that are coupled by a system bus 350.
  • the memory 320 may include a non-volatile storage medium 322 and an internal memory 324.
  • the non-volatile storage medium 322 of the electronic device 300 stores an operating system 3222 and a computer program 3224.
  • the computer program 3224 is executed by the processor 310 to implement a method for controlling shooting provided in the embodiments of the present application.
  • the processor 310 is configured to provide computing and control capabilities to support operation of the entire electronic device 300.
  • the internal memory 324 in the electronic device 300 provides an environment for the operation of the computer program 3224 in the non-volatile storage medium 322.
  • the display screen 330 of the electronic device 300 may be a liquid crystal display or an electronic ink display or the like.
• the input device 340 may be a touch layer covering the display screen 330, a button, trackball, or touchpad provided on the outer casing of the electronic device 300, or an external keyboard, trackpad, or mouse.
• the electronic device 300 can be a cell phone, a tablet, a personal digital assistant, a wearable device, or the like. A person skilled in the art can understand that the structure shown in FIG. 3 is only a block diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the electronic device to which the solution of the present application is applied; a specific electronic device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • a method of controlling shooting including the following steps:
• Step 2410: when the first processing unit receives the image acquisition instruction sent by the second processing unit, control the first camera to collect the first image according to the image acquisition instruction; the image acquisition instruction is sent when the second processing unit receives a data acquisition request.
  • the data acquisition request is used to instruct the second processing unit to control the second camera to acquire the second image.
• when an application needs to acquire face data, the first camera can be controlled to turn on and the first image collected; the face data may include, but is not limited to, data requiring face verification, such as for face unlocking and face payment, as well as face depth information.
  • the first camera can be a laser camera, and the laser camera can collect invisible images of different wavelengths.
• the first image may include, but is not limited to, an infrared image, a speckle image, and the like; the speckle image refers to an infrared image with a speckle pattern.
• the application may send a data acquisition request to the second processing unit.
  • the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module, and the second processing unit may be a CPU module.
• after receiving the data acquisition request, the second processing unit may first detect whether the data acquisition request includes a visible light image acquisition instruction; if it does, it indicates that the application needs to acquire a visible light image containing the human face while acquiring the face data. If the data acquisition request includes a visible light image acquisition instruction, the second processing unit may control the second camera to acquire the second image according to the visible light image acquisition instruction, where the second camera may be an RGB camera and the second image may be an RGB image of a human face.
  • the first camera may be controlled to acquire the first image according to the image capturing instruction, wherein the first image may include an infrared image, a speckle image, and the like.
  • the first processing unit can control the floodlight in the camera module to be turned on and collect the infrared image through the laser camera, and can open the laser such as the laser light in the camera module and collect the speckle image through the laser camera.
  • the floodlight can be a point source that uniformly illuminates in all directions.
  • the light emitted by the floodlight can be infrared light, and the laser camera can collect the face to obtain an infrared image.
• the laser light emitted by the laser can be diffracted by a lens and a diffractive optical element (DOE) to produce a pattern with speckle particles, which is projected onto the target object; because the distance between each point of the target object and the electronic device differs, offsets of the speckle pattern are produced, and the laser camera captures the target object to obtain a speckle image.
• Step 2420: when the first processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is a signal sent by the second camera at the moment exposure starts when collecting each frame of the second image.
  • the first processing unit may connect the first camera through the control line, and control the first camera to acquire the first image through the control line.
  • the second processing unit may connect the second camera through the control line, and control the second camera to acquire the second image through the control line.
  • the first processing unit can be coupled to the second processing unit.
  • the first processing unit may also be connected to the first camera and the second camera respectively through signal lines, wherein the signal lines may be synchronous signal lines.
• when acquiring each frame of the second image, the second camera may send a synchronization signal to the first processing unit through the signal line at the moment exposure starts; the synchronization signal may be a start of frame (SOF) signal, which may be used to indicate that exposure of each frame image begins.
• when the first processing unit receives the synchronization signal sent by the second camera, the first exposure time of the first camera and the second exposure time of the second camera may be acquired; the exposure duration refers to the photosensitive duration, and the longer the exposure duration, the more light can be admitted.
• the first exposure time of the first camera is generally different from the second exposure time of the second camera; the first exposure time of the first camera may be smaller than the second exposure time of the second camera, but this is not limiting, and the first exposure time of the first camera may also be larger than the second exposure time of the second camera.
• Step 2430: calculate the delay duration according to the first exposure time and the second exposure time.
• the first processing unit may calculate the delay duration according to the first exposure time of the first camera and the second exposure time of the second camera; the delay duration refers to the length of time by which the moment at which the first camera starts exposure is postponed, thereby ensuring that the first camera is synchronized with the second camera.
• the electronic device may preset the moment at which the first camera and the second camera are synchronized during the exposure process, where the moment of synchronization during exposure may refer to the moment when the ratio of the time the first camera has been exposed to the first exposure time is the same as the ratio of the time the second camera has been exposed to the second exposure time.
• for example, the first camera and the second camera may be set to be consistent at half of the exposure, to be consistent when the exposure reaches 3/4, or to end the exposure at the same time, and so on.
  • the first processing unit may calculate the delay duration according to the first exposure time, the second exposure time, and the set timing of synchronization during the exposure.
• Step 2440: when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, forward the synchronization signal to the first camera; the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
• after calculating the delay duration, the first processing unit may forward the synchronization signal to the first camera when the time since it received the synchronization signal reaches the delay duration; the first camera then starts exposure, thereby ensuring that the timings at which the first camera and the second camera are synchronized during the exposure process are consistent.
• for example, the electronic device may preset the two cameras to be consistent at half of the exposure; the first processing unit calculates the delay duration, and when the time since receiving the synchronization signal reaches the delay duration, forwards the synchronization signal to the first camera, so that when the first camera is exposed to half, the second camera is also exposed to half, and the two are consistent.
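• as an illustration of the forwarding step above, the following is a minimal sketch in Python; the first_camera object and its start_exposure() method are hypothetical stand-ins for driving the synchronization signal line, so this is a sketch under those assumptions rather than the patent's implementation:

    import threading

    def on_sync_signal(first_camera, first_exposure_ms, second_exposure_ms):
        # Delay duration: half of the exposure time difference, so that both
        # cameras reach half of their exposure at the same moment.
        delay_s = max((second_exposure_ms - first_exposure_ms) / 2.0, 0.0) / 1000.0
        # Forward the synchronization signal once the time elapsed since the
        # signal was received reaches the delay duration.
        threading.Timer(delay_s, first_camera.start_exposure).start()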
• Step 2450: process the first image by the first processing unit, and send the processed first image to the second processing unit.
  • the first camera may transmit the acquired first image to the first processing unit, and the first processing unit may process the first image.
  • the first processing unit can be set to different modes, and different modes can collect different first images, perform different processing on the first image, and the like.
• the first processing unit can control the floodlight to turn on and collect the infrared image through the first camera, and the infrared image can be processed to obtain an infrared disparity map.
• when the first processing unit is in the speckle image mode, the first processing unit may control the laser light to turn on and collect the speckle image through the first camera, and the speckle image may be processed to obtain a speckle disparity map.
• when the first processing unit is in the depth map mode, the first processing unit may process the speckle image to obtain a depth disparity map.
• the first processing unit may perform correction processing on the first image; the correction processing refers to correcting the image content offset of the first image caused by the internal and external parameters of the first camera and the second camera (for example, offsets due to the laser camera). After the correction processing, a disparity map of the first image can be obtained.
  • the infrared image is corrected to obtain an infrared disparity map
  • the speckle image is corrected to obtain a speckle disparity map or a depth disparity map.
• performing correction processing on the first image can prevent the image finally presented on the screen of the electronic device from appearing ghosted.
  • the first processing unit processes the first image, and the processed first image may be sent to the second processing unit.
  • the second processing unit may obtain a target image according to the processed first image, such as a target infrared image, a target speckle image, a target depth map, and the like.
  • the second processing unit can process the target image according to the requirements of the application.
  • the second processing unit may perform face detection according to the target image or the like, wherein the face detection may include face recognition, face matching, and living body detection.
• face recognition refers to detecting whether there is a face in the target image.
• face matching refers to matching the face in the target image with a pre-stored face.
• living body detection refers to detecting whether the face in the target image has biological activity. If the application needs to obtain the depth information of the face, the generated target depth map can be uploaded to the application, and the application can perform beauty processing, three-dimensional modeling, and the like according to the received target depth map.
• in the above method, the delay duration is calculated according to the exposure durations of the two cameras, and the synchronization signal is forwarded to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration; the time point of forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the image content collected by the two cameras can still be kept consistent.
• step 2430, calculating the delay duration according to the first exposure time and the second exposure time, includes: calculating the exposure time difference between the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay duration.
  • the electronic device can set the first camera and the second camera to coincide at half of the exposure time, and when the first camera is exposed to half, the second camera is also exposed to half.
• when the first processing unit receives the synchronization signal sent by the second camera, the exposure time difference between the first exposure time and the second exposure time can be calculated, and the exposure time difference divided by 2 to obtain the delay duration.
• for example, if the first exposure time is 3 ms and the second exposure time is 30 ms, the exposure time difference between the first exposure time and the second exposure time is first calculated as 27 ms, and dividing the exposure time difference by 2 yields a delay duration T3 of 13.5 ms.
• the exposure time difference may also first be compared with a time threshold to determine whether the exposure time difference is greater than the time threshold; if it is greater than the time threshold, the exposure time difference is divided by 2 to obtain the delay duration, and when the time since the first processing unit received the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera. If the exposure time difference is not greater than the time threshold, the first processing unit may directly forward the synchronization signal to the first camera without postponing the moment at which the first camera starts exposure.
• the time threshold can be set according to actual requirements, for example, 1 ms, 2 ms, etc.; this ensures that the image content captured by the first camera and the second camera stays within the allowable error while reducing the calculation pressure on the first processing unit.
• the first processing unit may also calculate the first intermediate exposure time of the first exposure time and the second intermediate exposure time of the second exposure time, respectively, where the intermediate exposure time refers to the moment at which exposure reaches half. The first processing unit may determine the difference between the first intermediate exposure time and the second intermediate exposure time and use the difference as the delay duration.
• for example, if the first exposure time is 3 ms and the second exposure time is 30 ms, the first intermediate exposure time of the first exposure time may be calculated as 1.5 ms and the second intermediate exposure time of the second exposure time as 15 ms; the difference between the first intermediate exposure time and the second intermediate exposure time is then 13.5 ms, and this difference of 13.5 ms can be used as the delay duration T3. It can be understood that other algorithms can also be used to ensure synchronization between the first camera and the second camera, and the above ways are not limiting.
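• the two calculations above can be checked against the worked example (first exposure time 3 ms, second exposure time 30 ms) with a minimal Python sketch; the function names and the ratio parameter, which generalizes the preset synchronization point (0.5 for half of the exposure, 0.75 for three quarters), are illustrative:

    def delay_by_half_difference(t1_ms, t2_ms):
        # Exposure time difference divided by 2: (30 - 3) / 2 = 13.5 ms.
        return (t2_ms - t1_ms) / 2.0

    def delay_by_midpoints(t1_ms, t2_ms, ratio=0.5):
        # Difference of the intermediate exposure times: 15 - 1.5 = 13.5 ms.
        return ratio * t2_ms - ratio * t1_ms

    assert delay_by_half_difference(3, 30) == 13.5
    assert delay_by_midpoints(3, 30) == 13.5

  Both forms agree whenever the synchronization point is half of the exposure; the second form also covers the other preset synchronization points mentioned above.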
• the time point of forwarding the synchronization signal can be dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, ensuring that the first camera and the second camera are consistent at half of their exposure, and the synchronization effect is good.
• step 2450, processing the first image by the first processing unit and sending the processed first image to the second processing unit, includes the following steps:
• Step 2502: acquire a stored reference speckle image, the reference speckle image carrying reference depth information.
• taking the line perpendicular to the imaging plane and passing through the center of the lens as the Z axis, if the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object at the camera imaging plane. If an application needs to obtain the depth information of a face, a depth map containing the face depth information needs to be collected.
• the first processing unit can control the laser light to turn on and collect the speckle image through the first camera.
• a reference speckle image may be pre-stored in the first processing unit; the reference speckle image carries reference depth information, and the depth information of each pixel point contained in the collected speckle image may be acquired according to the collected speckle image and the reference speckle image.
• Step 2504: match the reference speckle image with the speckle image to obtain a matching result.
• the first processing unit may sequentially select, centered on each pixel point contained in the collected speckle image, a pixel block of a preset size, for example 31 pixels × 31 pixels, and search the reference speckle image for a block that matches the selected pixel block.
• the first processing unit may find, in the selected pixel block of the collected speckle image and in the reference speckle image, two points located on the same laser light path; the speckle information of two points on the same laser light path is consistent, and such two points can be identified as corresponding pixel points.
• in the reference speckle image, the depth information of the points on each laser light path is known.
• the first processing unit may calculate the offset between the two corresponding pixel points on the same laser light path in the target speckle image and the reference speckle image, and calculate, according to the offset, the depth information of each pixel point contained in the collected speckle image.
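• the patent does not specify how the matching block is searched for, so the following Python sketch uses a plain sum-of-absolute-differences scan along the row as an assumed stand-in; the 31 × 31 block size follows the example above, while max_shift and the function name are illustrative:

    import numpy as np

    def match_offset(speckle, reference, x, y, block=31, max_shift=64):
        # Pixel block of preset size centered on (x, y) in the collected image.
        half = block // 2
        patch = speckle[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
        best_shift, best_cost = 0, None
        for shift in range(-max_shift, max_shift + 1):
            cx = x + shift
            if cx - half < 0 or cx + half + 1 > reference.shape[1]:
                continue  # candidate block would fall outside the reference image
            cand = reference[y - half:y + half + 1,
                             cx - half:cx + half + 1].astype(np.int32)
            cost = int(np.abs(patch - cand).sum())  # sum of absolute differences
            if best_cost is None or cost < best_cost:
                best_cost, best_shift = cost, shift
        return best_shift  # offset of the matched block, in pixels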
• the first processing unit calculates the offset between the collected speckle image and the reference speckle image, and calculates the depth information of each pixel point contained in the speckle image according to the offset; the calculation formula can be as shown in equation (1):
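• the body of equation (1) did not survive extraction from the source. A plausible reconstruction from the variable definitions below and the standard structured-light triangulation relation (an assumption, not recovered text) is, in LaTeX:

    Z_D = \frac{L \cdot f \cdot Z_0}{L \cdot f - P \cdot Z_0}    \tag{1}

  This form is at least self-consistent with the sign convention stated below: P = 0 gives Z_D = Z_0, P < 0 (target closer than the reference plane) gives Z_D < Z_0, and P > 0 gives Z_D > Z_0.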
• where Z_D represents the depth information of a pixel point, that is, the depth value of the pixel point; L is the distance between the laser camera and the laser; f is the focal length of the lens in the laser camera; and Z_0 is the depth value of the reference plane when the reference speckle image is collected.
• P is the offset between corresponding pixel points in the collected speckle image and the reference speckle image; P can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance represented by one pixel point.
• P is a negative value when the distance between the target object and the laser camera is less than the distance between the reference plane and the laser camera, and P is a positive value when the distance between the target object and the laser camera is greater than the distance between the reference plane and the laser camera.
• Step 2506: generate a depth disparity map according to the reference depth information and the matching result, send the depth disparity map to the second processing unit, and process the depth disparity map by the second processing unit to obtain a depth map.
• after the first processing unit obtains the depth information of each pixel point contained in the collected speckle image, it can perform correction processing on the collected speckle image to correct the image content offsets of the collected speckle image caused by the internal and external parameters of the first camera and the second camera.
  • the first processing unit may generate a depth disparity map according to the corrected speckle image and the depth value of each pixel in the speckle image, and send the depth disparity map to the second processing unit.
  • the second processing unit may obtain a depth map according to the depth disparity map, and the depth map may include depth information of each pixel point.
  • the second processing unit may upload the depth map to the application, and the application may perform beauty, three-dimensional modeling, and the like according to the depth information of the face in the depth map.
  • the second processing unit can also perform living body detection according to the depth information of the face in the depth map, and can prevent the collected face from being a two-dimensional plane face or the like.
• the depth information of the acquired image can be accurately obtained by the first processing unit; the data processing efficiency is high, and the accuracy of image processing is improved.
• before the stored reference speckle image is acquired in step 2502, the method further includes the following steps:
• Step 2602: collect the temperature of the laser at every acquisition time period, and acquire the reference speckle image corresponding to the temperature.
  • the electronic device may be provided with a temperature sensor beside the laser, wherein the laser refers to a laser lamp or the like, and the temperature of the laser is collected by a temperature sensor.
• the second processing unit may acquire the temperature of the laser collected by the temperature sensor at every acquisition time period, where the acquisition time period may be set according to actual needs, for example, 3 seconds, 4 seconds, etc., but is not limited thereto. As the temperature of the laser changes, the camera module may deform, affecting the internal and external parameters of the first camera and the second camera; since temperature affects the camera differently at different temperatures, different reference speckle images can be matched to different temperatures.
  • the second processing unit may acquire a reference speckle image corresponding to the temperature, and process the speckle image acquired at the temperature according to the reference speckle image corresponding to the temperature to obtain a depth map.
• the second processing unit may preset a number of different temperature intervals, such as 0°C to 30°C, 30°C to 60°C, 60°C to 90°C, etc., but is not limited thereto.
• different temperature intervals can correspond to different reference speckle images; after collecting the temperature, the second processing unit may determine the temperature interval in which the temperature is located and acquire the reference speckle image corresponding to that temperature interval.
• Step 2604: when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, write the reference speckle image acquired this time into the first processing unit.
• after the second processing unit acquires the reference speckle image corresponding to the collected temperature, it may determine whether the reference speckle image acquired this time is consistent with the reference speckle image stored in the first processing unit; the reference speckle image may carry an image identifier, which may be composed of one or more of numbers, letters, characters, and the like. The second processing unit may read the image identifier of the stored reference speckle image from the first processing unit and compare the image identifier of the currently acquired reference speckle image with the image identifier read from the first processing unit. If the two image identifiers are inconsistent, the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, and the second processing unit may write the reference speckle image acquired this time into the first processing unit.
  • the first processing unit may store the newly written reference speckle image and delete the previously stored reference speckle image.
  • the reference speckle image corresponding to the temperature is obtained according to the temperature of the laser, and the influence of the temperature on the final output depth map is reduced, so that the obtained depth information is more accurate.
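• a minimal Python sketch of the periodic update in steps 2602 to 2604 is shown below; the temperature intervals follow the examples above, while the image identifiers and the first_unit interface are illustrative assumptions:

    import bisect

    BOUNDS = [30, 60]  # interval upper bounds in deg C: 0-30, 30-60, 60-90
    REFERENCE_IDS = ["ref_0_30", "ref_30_60", "ref_60_90"]

    def reference_for(temperature_c):
        # Pick the reference speckle image for the interval containing the
        # measured laser temperature.
        i = min(bisect.bisect_left(BOUNDS, temperature_c), len(REFERENCE_IDS) - 1)
        return REFERENCE_IDS[i]

    def maybe_update(first_unit, temperature_c, load_image):
        # Step 2604: write the reference speckle image into the first
        # processing unit only when the stored image identifier differs.
        wanted = reference_for(temperature_c)
        if first_unit.stored_image_id() != wanted:
            first_unit.write_reference(wanted, load_image(wanted))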
  • the above method for controlling shooting includes the following steps:
• Step 2702: when the second processing unit receives a data acquisition request from an application, acquire the security level of the application.
• the second processing unit in the electronic device may include two operating modes: the first operating mode may be TEE, a trusted execution environment with a high security level, and the second operating mode may be REE, a rich execution environment with a lower security level.
• when the second processing unit receives a data acquisition request, the image acquisition instruction may be sent to the first processing unit through the first operating mode.
• when the second processing unit is single-core, the single core can be directly switched from the second operating mode to the first operating mode; when the second processing unit is multi-core, one core can be switched from the second operating mode to the first operating mode while the other cores remain in the second operating mode, and the image acquisition instruction is sent to the first processing unit by the kernel running in the first operating mode.
• after the first processing unit processes the first image, the processed first image may be sent to the kernel running in the first operating mode, so that the first processing unit always operates in a trusted execution environment, which improves security.
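• a minimal sketch, under assumed names, of routing the image acquisition instruction through a kernel in the first operating mode for the single-core and multi-core cases described above:

    def send_acquisition_instruction(cores, first_unit):
        # Use a core already running in the trusted mode (TEE) if one exists.
        tee_core = next((c for c in cores if c.mode == "TEE"), None)
        if tee_core is None:
            # Single core: switch it directly; multi-core: switch one core
            # while the others keep running in the second mode (REE).
            tee_core = cores[0]
            tee_core.switch_mode("TEE")
        tee_core.send(first_unit, "IMAGE_ACQUISITION")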
  • the second processing unit may acquire an application type of the application and acquire a security level corresponding to the application type.
  • Application types may include, but are not limited to, an unlock application, a payment application, a camera application, a beauty application, and the like.
  • the security level of different application types may be different. For example, the security level corresponding to the payment application and the unlock application may be high, and the security level corresponding to the camera application and the beauty application may be low, but is not limited thereto.
• Step 2704: determine the data transmission channel corresponding to the security level.
• the second processing unit may determine the data transmission channel corresponding to the security level of the application; the data transmission channel may include, but is not limited to, a secure channel and a normal channel, where the secure channel may correspond to applications with a higher security level and the normal channel may correspond to applications with a lower security level.
  • the payment application can correspond to a secure channel
  • the beauty application can correspond to a normal channel.
• in the secure channel, the transmitted data can be encrypted to avoid data leakage or theft.
• Step 2706: send the depth map to the application through the corresponding data transmission channel.
• the second processing unit may send the depth map to the application through the data transmission channel corresponding to the application's security level: the depth map may be encrypted and sent through the secure channel to an application with a higher security level, and sent through the normal channel to an application with a lower security level, which speeds up data transmission.
• other data may also be sent to the application through the data transmission channel corresponding to the application's security level, for example, the verification result of face verification, but is not limited thereto.
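• a minimal sketch of the channel selection described above; the application types, the level mapping, and the encrypt/send callables are illustrative assumptions:

    SECURITY_LEVELS = {"payment": "high", "unlock": "high",
                       "camera": "low", "beauty": "low"}

    def channel_for(app_type):
        # High-security applications get the secure channel; the rest get the
        # normal channel for faster transmission.
        level = SECURITY_LEVELS.get(app_type, "low")
        return "secure" if level == "high" else "normal"

    def send_depth_map(app_type, depth_map, encrypt, send):
        if channel_for(app_type) == "secure":
            send(encrypt(depth_map))  # encrypted to avoid leakage or theft
        else:
            send(depth_map)           # plain, lower-overhead path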
  • the second processing unit may send a depth map with an accuracy corresponding to the security level to the application according to the security level of the application.
  • Applications with a high level of security can correspond to depth maps with high precision, and applications with low security levels can correspond to depth maps with low precision.
• the second processing unit can adjust the image precision of the image data by adjusting the image resolution: the higher the resolution, the higher the image precision, and the lower the resolution, the lower the image precision. The image precision can also be controlled through the number of points diffracted by the laser light: the more diffraction points, the higher the image precision, and the fewer the diffraction points, the lower the image precision. It can be understood that the image precision can also be controlled in other ways and is not limited to the ways above.
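• a minimal sketch of adjusting image precision through resolution, as described above; the scale factor per security level is an illustrative assumption:

    def precision_factor(security_level):
        # Higher security level -> higher resolution, hence higher precision.
        return {"high": 1.0, "low": 0.5}.get(security_level, 0.5)

    def downscale(depth_map, factor):
        # Naive subsampling as a stand-in for resolution reduction.
        step = max(int(round(1.0 / factor)), 1)
        return [row[::step] for row in depth_map[::step]]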
  • the accuracy of the depth map can be adjusted according to the security level of the application to improve the security of the data.
  • the corresponding data channel is selected according to the security level of the application to transmit data, thereby improving the security of data transmission.
  • a method of controlling shooting comprising the steps of:
• Step (1): when the first processing unit receives the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction; the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used to instruct the second processing unit to control the second camera to acquire the second image.
  • the first processing unit is connected to the first camera through a control line
  • the second processing unit is connected to the second camera through a control line
• the first processing unit is connected to the second processing unit, and the first processing unit is further connected to the first camera and the second camera through signal lines.
• Step (2): when the first processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is a signal sent by the second camera at the moment exposure starts when collecting each frame of the second image.
• Step (3): calculate the delay duration according to the first exposure time and the second exposure time.
• the step (3) includes: calculating the exposure time difference between the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay duration.
• the step (3) includes: respectively calculating a first intermediate exposure time of the first exposure time and a second intermediate exposure time of the second exposure time; determining the difference between the first intermediate exposure time and the second intermediate exposure time; and using the difference as the delay duration.
• Step (4): when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera; the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
• Step (5): the first image is processed by the first processing unit, and the processed first image is sent to the second processing unit.
• the step (5) includes: acquiring a stored reference speckle image, the reference speckle image carrying reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; and generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map by the second processing unit to obtain a depth map.
• before the step of acquiring the stored reference speckle image, the method further includes: collecting the temperature of the laser at every acquisition time period and acquiring the reference speckle image corresponding to the temperature; and when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit.
• the method for controlling shooting further includes: when the second processing unit receives a data acquisition request from an application, acquiring the security level of the application; determining the data transmission channel corresponding to the security level; and sending the depth map to the application through the corresponding data transmission channel.
• in the above method, the delay duration is calculated according to the exposure durations of the two cameras, and the synchronization signal is forwarded to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration; the time point of forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, the synchronization effect is good, and even when the exposure time difference between the two cameras is large, the image content collected by the two cameras can still be kept consistent.
• although the steps in the various flow diagrams described above are displayed sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flow diagrams above may include multiple sub-steps or multiple stages, which are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with at least a portion of other steps, or of the sub-steps or stages of other steps.
  • an electronic device including a first processing unit, a second processing unit, and a camera module, and the first processing unit is connectable to the second processing unit and the camera module, respectively.
  • the camera module may include a first camera and a second camera, the first processing unit may be connected to the first camera through a control line, and the second processing unit may be connected to the second camera through a control line.
  • the first processing unit is connected to the second processing unit, and the first processing unit is further connected to the first camera and the second camera through signal lines.
  • the second processing unit is configured to, when receiving the data acquisition request, control the second camera to acquire the second image according to the data acquisition request, and send the image collection instruction to the first processing unit.
  • the first processing unit is configured to, when receiving the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction.
  • the second camera is configured to send a synchronization signal to the first processing unit at the moment exposure starts for each frame of the second image it acquires.
  • the first processing unit is further configured to acquire a first exposure time of the first camera and a second exposure time of the second camera when it receives the synchronization signal sent by the second camera.
  • the first processing unit is further configured to calculate the delay duration according to the first exposure time and the second exposure time.
  • the first processing unit is further configured to forward the synchronization signal to the first camera when the time for which it has held the synchronization signal reaches the delay duration.
  • the first camera is configured to start exposure based on the synchronization signal and acquire the first image.
  • the first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
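By way of illustration only (not part of the disclosed embodiments), the forwarding behaviour summarized in the items above can be sketched in Python under stated assumptions: software timing via time.sleep stands in for the hardware timers a real first processing unit (e.g. an MCU) would use, and all function names are invented for the sketch.

    import time

    def on_sync_signal(first_exposure_ms: float, second_exposure_ms: float,
                       forward_to_first_camera) -> None:
        # Delay is half the exposure-time difference, so that both cameras
        # reach the midpoint of their exposures at the same instant.
        delay_ms = abs(first_exposure_ms - second_exposure_ms) / 2
        time.sleep(delay_ms / 1000.0)    # hold the SOF signal for the delay duration
        forward_to_first_camera()        # the first camera starts exposing on receipt

    if __name__ == "__main__":
        start = time.monotonic()
        on_sync_signal(3.0, 30.0,
                       lambda: print(f"forwarded after "
                                     f"{(time.monotonic() - start) * 1000:.1f} ms"))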
  • the delay duration is calculated according to the exposure times of the two cameras, and when the time for which the first processing unit has held the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera; because the forwarding time point is dynamically adjusted according to the exposure times of the first camera and the second camera, the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, giving a good synchronization effect: even when the exposure times of the two cameras differ greatly, the image content collected by the two cameras can still be consistent.
  • the first processing unit is further configured to calculate an exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay duration.
  • the first processing unit is further configured to calculate a first intermediate exposure moment of the first exposure time and a second intermediate exposure moment of the second exposure time, determine the difference between the first intermediate exposure moment and the second intermediate exposure moment, and use the difference as the delay duration.
  • the time point for forwarding the synchronization signal can be dynamically adjusted according to the exposure times of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, ensuring that the two cameras are aligned at the midpoint of their exposures, with a good synchronization effect.
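The two delay calculations described in the preceding items can be illustrated with a minimal Python sketch; both align the exposures at their midpoints, and the 3 ms and 30 ms figures are the example values used in the description. The function names are assumptions for the sketch, not terms from the patent.

    def delay_by_difference(t1_ms: float, t2_ms: float) -> float:
        # Method 1: half the absolute exposure-time difference, T3 = |T1 - T2| / 2.
        return abs(t1_ms - t2_ms) / 2

    def delay_by_midpoints(t1_ms: float, t2_ms: float) -> float:
        # Method 2: difference of the mid-exposure instants, T3 = |T1/2 - T2/2|.
        return abs(t1_ms / 2 - t2_ms / 2)

    if __name__ == "__main__":
        d1 = delay_by_difference(3.0, 30.0)   # example exposures: 3 ms and 30 ms
        d2 = delay_by_midpoints(3.0, 30.0)
        assert d1 == d2 == 13.5               # both methods yield a 13.5 ms delay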
  • the first processing unit is further configured to acquire the stored reference speckle image, which carries reference depth information, and match the reference speckle image with the speckle image to obtain a matching result.
  • the first processing unit is further configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit.
  • the second processing unit is further configured to process the depth disparity map to obtain a depth map.
  • the depth information of the captured image can be obtained accurately by the first processing unit; data processing efficiency is high, and the accuracy of image processing is improved.
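A minimal sketch, under stated assumptions, of the matching and depth-recovery steps follows. The 31x31 block size matches the description; the search range, pixel pitch, and the triangulation relation Z_D = L*f*Z0 / (L*f + P*Z0) (the standard reference-plane form, consistent with the sign convention given for the offset P in the description) are assumptions rather than the patent's exact implementation, and (y, x) is assumed far enough from the image border.

    import numpy as np

    def match_offset(speckle: np.ndarray, reference: np.ndarray,
                     y: int, x: int, block: int = 31, search: int = 32) -> int:
        # Search horizontally in the reference image for the block that best
        # matches the 31x31 block centred on (y, x) in the captured speckle image.
        h = block // 2
        patch = speckle[y - h:y + h + 1, x - h:x + h + 1].astype(np.float32)
        best_p, best_cost = 0, np.inf
        for p in range(-search, search + 1):
            x0, x1 = x + p - h, x + p + h + 1
            if x0 < 0 or x1 > reference.shape[1]:
                continue                      # candidate window falls off the image
            cand = reference[y - h:y + h + 1, x0:x1].astype(np.float32)
            cost = float(np.sum((patch - cand) ** 2))   # sum of squared differences
            if cost < best_cost:
                best_cost, best_p = cost, p
        return best_p                         # offset P, in pixels

    def depth_from_offset(p_pixels: float, pixel_pitch_m: float,
                          baseline_m: float, focal_m: float, z0_m: float) -> float:
        # Assumed relation Z_D = L*f*Z0 / (L*f + P*Z0), where P < 0 when the
        # target is farther from the camera than the reference plane.
        P = p_pixels * pixel_pitch_m          # offset expressed on the sensor, in metres
        return (baseline_m * focal_m * z0_m) / (baseline_m * focal_m + P * z0_m)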
  • the second processing unit is further configured to collect the temperature of the laser at each acquisition interval and acquire a reference speckle image corresponding to the temperature.
  • the second processing unit is further configured to: when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, write the reference speckle image acquired this time to the first processing unit.
  • the reference speckle image corresponding to the temperature is obtained according to the temperature of the laser, reducing the influence of temperature on the finally output depth map, so that the obtained depth information is more accurate.
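A hedged sketch of the temperature-based selection follows. The interval boundaries mirror the example intervals given in the description (0-30 C, 30-60 C, 60-90 C); the image identifiers and the write hook are hypothetical names introduced only for illustration.

    from bisect import bisect_right

    BOUNDARIES_C = [30.0, 60.0]                         # interval edges, degrees C
    REFERENCE_IDS = ["ref_low", "ref_mid", "ref_high"]  # hypothetical image identifiers

    def reference_for_temperature(temp_c: float) -> str:
        # Map the sampled laser temperature to the reference speckle image
        # stored for the temperature interval that contains it.
        return REFERENCE_IDS[bisect_right(BOUNDARIES_C, temp_c)]

    def refresh_stored_reference(stored_id: str, temp_c: float) -> str:
        # Write the newly selected reference only when it differs from the one
        # already held by the first processing unit.
        new_id = reference_for_temperature(temp_c)
        if new_id != stored_id:
            # write_to_first_processing_unit(new_id)  # hypothetical transfer call
            stored_id = new_id
        return stored_id

    if __name__ == "__main__":
        assert reference_for_temperature(25.0) == "ref_low"
        assert reference_for_temperature(45.0) == "ref_mid"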
  • the second processing unit is further configured to acquire a security level of the application when receiving a data acquisition request of the application.
  • the second processing unit is further configured to determine a data transmission channel corresponding to the security level, and send the depth map to the application through the corresponding data transmission channel.
  • the corresponding data channel is selected according to the security level of the application to transmit data, thereby improving the security of data transmission.
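A minimal sketch of the channel selection follows; the level names, the XOR stand-in for encryption, and the return strings are illustrative assumptions, not the patent's mechanism.

    from enum import Enum

    class SecurityLevel(Enum):
        HIGH = "high"   # e.g. unlock or payment applications
        LOW = "low"     # e.g. camera or beauty applications

    def send_depth_map(depth_map: bytes, level: SecurityLevel) -> str:
        if level is SecurityLevel.HIGH:
            # Secure channel: encrypt before transmission (XOR is only a
            # placeholder for a real cipher).
            payload = bytes(b ^ 0x5A for b in depth_map)
            return f"secure channel: {len(payload)} encrypted bytes"
        # Normal channel: send as-is for lower-security applications.
        return f"normal channel: {len(depth_map)} bytes"

    if __name__ == "__main__":
        print(send_depth_map(b"\x01\x02\x03", SecurityLevel.HIGH))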
  • an apparatus 800 for controlling shooting including an image acquisition module 810, a signal receiving module 820, a calculation module 830, a signal forwarding module 840, and a processing module 850.
  • the image acquisition module 810 is configured to, when the first processing unit receives the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction; the image acquisition instruction is sent when the second processing unit receives a data acquisition request.
  • the data acquisition request is used to instruct the second processing unit to control the second camera to acquire the second image.
  • the signal receiving module 820 is configured to, when the first processing unit receives the synchronization signal sent by the second camera, acquire the first exposure time of the first camera and the second exposure time of the second camera; the synchronization signal is the signal sent by the second camera at the moment exposure starts for each frame of the second image.
  • the calculation module 830 is configured to calculate the delay duration according to the first exposure time and the second exposure time.
  • the signal forwarding module 840 is configured to forward the synchronization signal to the first camera when the time for which the first processing unit has held the synchronization signal reaches the delay duration; the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
  • the processing module 850 is configured to process the first image by using the first processing unit, and send the processed first image to the second processing unit.
  • the delay duration is calculated according to the exposure times of the two cameras, and when the time for which the first processing unit has held the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera; because the forwarding time point is dynamically adjusted according to the exposure times of the first camera and the second camera, the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, giving a good synchronization effect: even when the exposure times of the two cameras differ greatly, the image content collected by the two cameras can still be consistent.
  • the calculation module 830 is further configured to calculate an exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay duration.
  • the calculation module 830 is further configured to calculate a first intermediate exposure moment of the first exposure time and a second intermediate exposure moment of the second exposure time, determine the difference between the first intermediate exposure moment and the second intermediate exposure moment, and use the difference as the delay duration.
  • the time point for forwarding the synchronization signal can be dynamically adjusted according to the exposure times of the first camera and the second camera, so the timing at which the first camera and the second camera are synchronized can be dynamically adjusted, ensuring that the two cameras are aligned at the midpoint of their exposures, with a good synchronization effect.
  • the processing module 850 includes an image acquisition unit 852, a matching unit 854, and a generation unit 856.
  • the image acquisition unit 852 is configured to acquire a stored reference speckle image with reference depth information.
  • the matching unit 854 is configured to match the reference speckle image with the speckle image to obtain a matching result.
  • the generation unit 856 is configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit, which processes the depth disparity map to obtain a depth map.
  • the depth information of the captured image can be obtained accurately by the first processing unit; data processing efficiency is high, and the accuracy of image processing is improved.
  • the processing module 850 includes a temperature acquisition unit and a writing unit in addition to the image acquisition unit 852, the matching unit 854, and the generation unit 856.
  • the temperature acquisition unit is configured to collect the temperature of the laser at each acquisition interval and acquire a reference speckle image corresponding to the temperature.
  • the writing unit is configured to write the reference speckle image acquired this time to the first processing unit when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit.
  • the reference speckle image corresponding to the temperature is obtained according to the temperature of the laser, reducing the influence of temperature on the finally output depth map, so that the obtained depth information is more accurate.
  • in addition to the image acquisition module 810, the signal receiving module 820, the calculation module 830, the signal forwarding module 840, and the processing module 850, the apparatus 800 for controlling shooting further includes a level acquisition module, a channel determination module, and a sending module.
  • the level acquisition module is configured to acquire the security level of the application when the second processing unit receives a data acquisition request from the application.
  • the channel determination module is configured to determine a data transmission channel corresponding to the security level.
  • the sending module is configured to send the depth map to the application through the corresponding data transmission channel.
  • the corresponding data channel is selected according to the security level of the application to transmit data, thereby improving the security of data transmission.
  • a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the method of controlling shooting as described above.
  • a computer program product comprising a computer program that, when run on a computer device, causes the computer device to perform the method of controlling shooting described above.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
  • Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM), which acts as an external cache.
  • RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

A method for controlling shooting, an apparatus (700), an electronic device (200), and a computer-readable storage medium. The method includes: (1410) when the second processing unit (140) receives a data acquisition request, controlling the second camera (120) to collect a second image and sending an image acquisition instruction to the first processing unit (130); (1420) when the second processing unit (140) receives a synchronization signal sent by the second camera (120), acquiring a first exposure time of the first camera (110) and a second exposure time of the second camera (120); (1430) calculating a delay duration according to the first exposure time and the second exposure time; (1440) when the time for which the second processing unit (140) has held the synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera (110); and (1450) processing the first image by the first processing unit (130) and sending the processed first image to the second processing unit (140).

Description

控制拍摄的方法、装置、电子设备及计算机可读存储介质
优先权信息
本申请请求2018年4月28日向中国国家知识产权局提交的、专利申请号为201810401344.1和201810404282.X的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及计算机技术领域,特别是涉及一种控制拍摄的方法、装置、电子设备及计算机可读存储介质。
背景技术
随着智能终端上影像技术的快速发展,越来越多的智能终端上安装有两个及两个以上的摄像头,采用多个摄像头配合,从而采集到视觉效果更好的图像。
发明内容
本申请实施例提供一种控制拍摄的方法、装置、电子设备及计算机可读存储介质。
一种控制拍摄的方法,包括:当第二处理单元接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,所述图像采集指令用于指示所述第一处理单元控制第一摄像头采集第一图像;当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;根据所述第一曝光时间和第二曝光时间计算延时时长;当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种控制拍摄的装置,包括请求接收模块、信号接收模块、计算模块、信号转发模块和处理模块;请求接收模块用于当第二处理单元接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,所述图像采集指令用于指示所述第一处理单元控制第一摄像头采集第一图像;信号接收模块用于当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;计算模块用于根据所述第一曝光时间和第二曝光时间计算延时时长;信号转发模块用于当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;处理模块用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种电子设备,包括第一处理单元、第二处理单元和摄像头模组,所述第一处理单元分别与所述第二处理单元和摄像头模组相连;所述摄像头模组包括第一摄像头和第二摄像头,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第二处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接;所述第二处理单元用于当接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令;所述第一处理单元用于根据所述图像采集指令控制第一摄像头采集第一图像;所述第二摄像头用于采集每帧第二图像时在开始曝光的时刻向所述第二处理单元发送同步信号;所述第二处理单元还用于当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,并根据所述第一曝光时间和第二曝光时间计算延时时长;所述第二处理单元还用于当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号;所述第一摄像头用于根据所述同步信号开始曝光并采集第一图像;所述第一处理单元还用于对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种控制拍摄的方法,包括:当第一处理单元接收到第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像头采集第一图像,所述图像采集指令为所述第二处理单元接收到数据获取请求时发送,所述数据获取请求用于指示所述第二处理单元控制第二摄像头采集第二图像;当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;根据所述第一曝光时间和第二曝光时间计算延时时长;当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种控制拍摄的装置,包括图像采集模块、信号接收模块、计算模块、信号转发模块和处理模块;图像采集模块用于当第一处理单元接收到第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像头采集第一图像,所述图像采集指令为所述第二处理单元接收到数据获取请求时发送,所述数据获取请求用于指示所述第二处理单元控制第二摄像头采集第二图像;信号接收模块用于当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;计算模块用于根据所述第一曝光时间和第二曝光时间计算延时时长;信号转发模块用于当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;处理模块用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种电子设备,包括第一处理单元、第二处理单元和摄像头模组,所述第一处理单元分别与所述第二处理单元和摄像头模组相连;所述摄像头模组包括第一摄像头和第二摄像头,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第一处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接;所述第二处理单元用于当接收到数据获取请求时,根据所述数据获取请求控制所述第二摄像头采集第二图像,并向所述第一处理单元发送图像采集指令;所述第一处理单元用于当接收到所述第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像头采集第一图像;所述第二摄像头用于采集每帧第二图像时在开始曝光的时刻向第一处理单元发送同步信号;所述第一处理单元还用于当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间;所述第一处理单元还用于根据所述第一曝光时间和第二曝光时间计算延时时长;所述第一处理单元还用于当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号;所述第一摄像头用于根据同步信号开始曝光并采集第一图像;所述第一处理单元还用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现如上所述的方法。
附图说明
图1为一个实施例中控制拍摄的方法的应用场景图;
图2为另一个实施例中控制拍摄的方法的应用场景图;
图3为一个实施例中电子设备的框图;
图4为一个实施例中控制拍摄的方法的流程示意图;
图5为一个实施例中第二处理单元向第一处理单元发送图像采集指令的流程示意图;
图6为一个实施例中第一处理单元向第二处理单元发送处理后的第一图像的流程示意图;
图7为一个实施例中控制拍摄的装置的框图;
图8为一个实施例中处理模块的框图。
图9为一个实施例中控制拍摄的方法的应用场景图;
图10为一个实施例中控制拍摄的方法的流程示意图;
图11为一个实施例中对第一图像进行处理的流程示意图;
图12为一个实施例中根据激光器的温度获取参考散斑图像的流程示意图;
图13为一个实施例中根据应用程序的安全级别选择数据传输通道的流程示意图;
图14为一个实施例中控制拍摄的装置的框图;
图15为一个实施例中处理模块的框图。
具体实施方式
为了使本申请的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本申请进行进一步详细说明。应当理解,此处所描述的具体实施例仅用以解释本申请,并不用于限定本申请。
可以理解,本申请所使用的术语“第一”、“第二”等可在本文中用于描述各种元件,但这些元件不受这些术语限制。这些术语仅用于将第一个元件与另一个元件区分。举例来说,在不脱离本申请的范围的情况下,可以将第一客户端称为第二客户端,且类似地,可将第二客户端称为第一客户端。第一客户端和第二客户端两者都是客户端,但其不是同一客户端。
第一实施方式:
图1为一个实施例中控制拍摄的方法的应用场景图。如图1所示,该应用场景可包括第一摄像头110、第二摄像头120、第一处理单元130及第二处理单元140。第一摄像头110可以为激光摄像头,第二摄像头120可以为RGB(Red/Green/Blue,红/绿/蓝色彩模式)摄像头。第一处理单元130可为MCU(Microcontroller Unit,微控制单元)模块等,第二处理单元140可为CPU(Central Processing Unit,中央处理器)模块等。第一处理单元130通过控制线连接第一摄像头110,第二处理单元140通过控制线连接第二摄像头120。第一处理单元130与第二处理单元连接140。第二处理单元130还通过信号线分别与第一摄像头110及第二摄像头120连接。
当第二处理单元140接收到数据获取请求时,可根据数据获取请求通过控制线控制第二摄像头120采集第二图像,并向第一处理单元130发送图像采集指令。当第一处理单元130接收到第二处理单元140发送的图像采集指令时,可根据图像采集指令通过控制线控制第一摄像头110采集第一图像。第二摄像头120采集每帧第二图像时,可在开始曝光的时刻通过信号线向第二处理单元140发送同步信号。当第二处理单元140接收到第二摄像头120发送的同步信号时,可获取第一摄像头110的第一曝光时间和第二摄像头120的第二曝光时间,并根据第一曝光时间和第二曝光时间计算延时时长。当第二处理单元140接收到同步信号的时长达到延时时长时,可通过信号线向第一摄像头110转发同步信号。第一摄像头110接收到同步信号后,即可开始曝光并采集第一图像,可将采集的第一图像传输给第一处理单元130。第一处理单元130可对第一图像进行处理,并将处理后的第一图像发送给第二处理单元140。
图2为另一个实施例中控制拍摄的方法的应用场景图。如图2所示,电子设备200可包括摄像头模组210、第二处理单元220,第一处理单元230。上述第二处理单元220可为CPU模块。上述第一处理单元230可为MCU模块等。其中,第一处理单元230连接在第二处理单元220和摄像头模组210之间,上述第一处理单元230可控制摄像头模组210中激光摄像头212、泛光灯214和镭射灯218。上述第二处理单元220可控制摄像头模组210中RGB摄像头216。
摄像头模组210中包括激光摄像头212、泛光灯214、RGB摄像头216和镭射灯218。上述激光摄像头212可为红外摄像头,用于获取红外图像。上述泛光灯214为可发射红外光的面光源;上述镭射灯218为可发射激光的点光源且为带有图案的点光源。其中,当泛光灯214为面光源时,激光摄像头212可根据反射回的光线获取红外图像。当镭射灯218为点光源时,激光摄像头212可根据反射回的光线获取散斑图像。上述散斑图像是镭射灯218发射的带有图案的激光被反射后图案发生形变的图像。
第二处理单元220可通过信号线分别连接RGB摄像头216和激光摄像头212。当RGB摄像头216采集每帧图像时,可向第二处理单元220发送同步信号。第二处理单元220接收到RGB摄像头216发送的同步信号后,可获取激光摄像头212的曝光时长和RGB摄像头216的曝光时长,并根据激光摄像头212的曝光时长和RGB摄像头216的曝光时长计算延时时长。当第二处理单元220接收到同步信号的时长达到该延时时长时,可通过信号线向激光摄像头212转发同步信号。激光摄像头212接收到同步信号,可根据同步信号开始进行曝光并采集红外图像或散斑图像等。
第二处理单元220可包括在TEE(Trusted execution environment,可信运行环境)环境下运行的CPU内核和在REE(Rich Execution Environment,自然运行环境)环境下运行的CPU内核。其中,TEE环 境和REE环境均为ARM模块(Advanced RISC Machines,高级精简指令集处理器)的运行模式。其中,TEE环境的安全级别较高,第二处理单元220中有且仅有一个CPU内核可同时运行在TEE环境下。通常情况下,电子设备200中安全级别较高的操作行为需要在TEE环境下的CPU内核中执行,安全级别较低的操作行为可在REE环境下的CPU内核中执行。
第一处理单元230包括PWM(Pulse Width Modulation,脉冲宽度调制)模块232、SPI/I2C(Serial Peripheral Interface/Inter-Integrated Circuit,串行外设接口/双向二线制同步串行接口)接口234、RAM(Random Access Memory,随机存取存储器)模块236和深度引擎238。上述PWM模块232可向摄像头模组发射脉冲,控制泛光灯214或镭射灯218开启,使得激光摄像头212可采集到红外图像或散斑图像。上述SPI/I2C接口234用于接收第二处理单元220发送的图像采集指令。上述深度引擎238可对散斑图像进行处理得到深度视差图。
当第二处理单元220接收到应用程序的数据获取请求时,例如,当应用程序需要进行人脸解锁、人脸支付时,可通过运行在TEE环境下的CPU内核向第一处理单元230发送图像采集指令。当第一处理单元230接收到图像采集指令后,可通过PWM模块232发射脉冲波控制摄像头模组210中泛光灯214开启并通过激光摄像头212采集红外图像、控制摄像头模组210中镭射灯218开启并通过激光摄像头212采集散斑图像。摄像头模组210可将采集到的红外图像和散斑图像发送给第一处理单元230。第一处理单元230可对接收到的红外图像进行处理得到红外视差图;对接收到的散斑图像进行处理得到散斑视差图或深度视差图。其中,第一处理单元230对上述红外图像和散斑图像进行处理是指对红外图像或散斑图像进行校正,去除摄像头模组210中内外参数对图像的影响。其中,第一处理单元230可设置成不同的模式,不同模式输出的图像不同。当第一处理单元230设置为散斑图模式时,第一处理单元230对散斑图像处理得到散斑视差图,根据上述散斑视差图可得到目标散斑图像;当第一处理单元230设置为深度图模式时,第一处理单元230对散斑图像处理得到深度视差图,根据上述深度视差图可得到深度图像,上述深度图像是指带有深度信息的图像。第一处理单元230可将上述红外视差图和散斑视差图发送给第二处理单元220,第一处理单元230也可将上述红外视差图和深度视差图发送给第二处理单元220。第二处理单元220可根据上述红外视差图获取目标红外图像、根据上述深度视差图获取深度图像。进一步的,第二处理单元220可根据目标红外图像、深度图像来进行人脸识别、人脸匹配、活体检测以及获取检测到的人脸的深度信息。
第一处理单元230与第二处理单元220之间通信是通过固定的安全接口,用以确保传输数据的安全性。如图1所示,第二处理单元220发送给第一处理单元230的数据是通过SECURE SPI/I2C 240,第一处理单元230发送给第二处理单元220的数据是通过SECURE MIPI(Mobile Industry Processor Interface,移动产业处理器接口)250。
在一个实施例中,第一处理单元230也可根据上述红外视差图获取目标红外图像、上述深度视差图计算获取深度图像,再将上述目标红外图像、深度图像发送给第二处理单元220。
图3为一个实施例中电子设备的框图。如图3所示,该电子设备300包括通过系统总线350连接的处理器310、存储器320、显示屏330和输入装置340。其中,存储器320可包括非易失性存储介质322及内存储器324。电子设备300的非易失性存储介质322存储有操作系统3222及计算机程序3224,该计算机程序3224被处理器310执行时以实现本申请实施例中提供的一种控制拍摄的方法。该处理器310用于提供计算和控制能力,支撑整个电子设备300的运行。电子设备300中的内存储器324为非易失性存储介质322中的计算机程序3224的运行提供环境。电子设备300的显示屏330可以是液晶显示屏或者电子墨水显示屏等,输入装置340可以是显示屏330上覆盖的触摸层,也可以是电子设备300外壳上设置的按键、轨迹球或触控板,也可以是外接的键盘、触控板或鼠标等。该电子设备300可以是手机、平板电脑或者个人数字助理或穿戴式设备等。本领域技术人员可以理解,图3中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的电子设备的限定,具体的电子设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
如图4所示,在一个实施例中,提供一种控制拍摄的方法,包括以下步骤:
步骤1410,当第二处理单元接收到数据获取请求时,根据数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,图像采集指令用于指示第一处理单元控制第一摄像头采集第一图像。
当电子设备中应用程序需要获取人脸数据时,可控制第一摄像头开启,并采集第一图像,其中,人脸数据可包括但不限于人脸解锁、人脸支付等场景下需要进行人脸验证的数据,和人脸深度信息等。第一摄像头可以为激光摄像头,激光摄像头可以采集到不同波长的不可见光图像。第一图像可包括但不限于红外图像、散斑图像等,散斑图像指的是带有散斑图像的红外图像。
当应用程序需要获取人脸数据时,可向第二处理单元发送数据获取请求。第二处理单元接收数据获取请求后,可向第一处理单元发送图像采集指令,其中,第一处理单元可以是MCU模块,第二处理单元可以是CPU模块。可选地,第二处理单元可先检测数据获取请求中是否包含可见光图像获取指令,若包含可见光图像获取指令,则可说明应用程序在获取人脸数据的同时,需要同时获取包含人脸的可见光图像。若数据获取请求中包含可见光图像获取指令,第二处理单元可根据可见光图像获取指令控制第二摄像头采集第二图像,其中,第二摄像头可以是RGB摄像头,第二图像则可以是包含人脸的RGB图像。
当第一处理单元接收图像采集指令后,可根据图像采集指令控制第一摄像头采集第一图像,其中,第一图像可包括红外图像、散斑图像等。第一处理单元可控制开启摄像头模组中的泛光灯并通过激光摄像头采集红外图像,可开启摄像头模组中的镭射灯等激光器并通过激光摄像头采集散斑图像等。泛光灯可为一种向四面八方均匀照射的点光源,泛光灯发射的光线可为红外光,激光摄像头可采集人脸得到红外图像。激光器发出的激光可由透镜和DOE(diffractive optical elements,光学衍射元件)进行衍射产生带散斑颗粒的图案,通过带散斑颗粒的图案投射到目标物体,受目标物体各点与电子设备的距离不同产生散斑图案的偏移,激光摄像头对目标物体进行采集得到散斑图像。
步骤1420,当第二处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。
第一处理单元可通过控制线连接第一摄像头,通过控制线控制第一摄像头采集第一图像。第二处理单元可通过控制线连接第二摄像头,通过控制线控制第二摄像头采集第二图像。第一处理单元可与第二处理单元连接。第二处理单元还可通过信号线分别与第一摄像头及第二摄像头连接,其中,信号线可以是同步信号线。
第二摄像头在采集每帧图像时,可在开始曝光的时刻向连接了信号线的第二处理单元发送同步信号,该同步信号可以是帧的起始标志SOF(Start of Frame),可用于每帧图像开始曝光。当第二处理单元接收到第二摄像头发送的同步信号,可获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,曝光时长可指的是感光时长,曝光时长越长,进的光可越多。通常来说,第一摄像头的第一曝光时间与第二摄像头的第二曝光时间差异较大,第一摄像头的第一曝光时间可小于第二摄像头的第二曝光时间,但不限于此,可能也存在第一摄像头的第一曝光时间大于第二摄像头的第二曝光时间的情况等。
步骤1430,根据第一曝光时间和第二曝光时间计算延时时长。
第二处理单元可根据第一摄像头的第一曝光时间与第二摄像头的第二曝光时间计算延时时长,该延时时长指的是延长第一摄像头开始曝光的时间长度,通过延后第一摄像头开始曝光的时刻,从而可保证第一摄像头与第二摄像头同步。
在一个实施例中,电子设备可预先设置第一摄像头和第二摄像头在曝光过程中同步的时刻,其中,在曝光过程中同步的时刻可指的是第一摄像头已曝光的时长占第一曝光时间的比例与第二摄像头已曝光的时长占第二曝光时间的比例相同。比如,可设置第一摄像头和第二摄像头同时结束曝光,或是在曝光一半的时刻一致,或是在曝光到达3/4时的时刻一致等。第二处理单元可根据第一曝光时间、第二曝光时间及设置的在曝光过程中同步的时刻计算延时时长。
步骤1440,当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。
第二处理单元计算得到延时时长后,可在接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号。第一摄像头接收同步信号后,开始进行曝光,从而可保证第一摄像头和第二摄像头在曝光过程中同步的时刻保持一致。例如,电子设备可预先设备在曝光一半的时刻一致,则第二处理单元计算得到延时时长,并在接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,可在第一摄像头曝光到一半的时候,第二摄像头也曝光到一半,二者保持一致。
步骤1450,通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
第一摄像头可将采集的第一图像发送给第一处理单元,第一处理单元可对第一图像进行处理。第一处理单元可设置成不同的模式,不同模式可采集不同的第一图像,并对第一图像进行不同的处理等。当第一处理单元为红外模式时,第一处理单元可控制开启泛光灯,并通过第一摄像头采集红外图像,可对红外图像进行处理得到红外视差图。当第一处理单元为散斑图模式时,第一处理单元可控制开启镭射灯,并通过第一摄像头采集散斑图像,可对散斑图像进行处理得到散斑视差图。当第一处理单元为深度图模式时,第一处理单元可对散斑图像进行处理得到深度视差图。
在一个实施例中,第一处理单元可对第一图像进行校正处理,进行校正处理是指校正第一图像由于第一摄像头及第二摄像头的内外参数等造成的图像内容偏移,例如由于激光摄像头偏转角度、激光摄像头和RGB摄像头之间的摆放位置等引起的图像内容偏移等。对第一图像进行校正处理后,可得到第一图像的视差图,例如,对红外图像进行校正处理得到红外视差图,对散斑图像进行校正可得到散斑视差图或深度视差图等。对第一图像进行校正处理,可以防止最终在电子设备的屏幕上呈现的图像出现重影的情况。
第一处理单元对第一图像进行处理,可将处理后的第一图像发送给第二处理单元。第二处理单元可根据处理后的第一图像得到目标图像,比如目标红外图像、目标散斑图像及目标深度图等。第二处理单元可根据应用程序的需求对目标图像进行处理。
例如,应用程序需要进行人脸验证时,则第二处理单元则可根据目标图像等进行人脸检测,其中,人脸检测可包括人脸识别、人脸匹配和活体检测。人脸识别是指识别目标图像中是否存在人脸,人脸匹配是指将目标图像中人脸与预存的人脸进行匹配,活体检测是指检测目标图像中人脸是否具有生物活性等。若应用程序需要获取人脸的深度信息,则可将生成的目标深度图上传至应用程序,应用程序可根据接收到的目标深度图进行美颜处理、三维建模等。
在本实施例中,当第二处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,通过第二处理单元动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,步骤1430根据第一曝光时间和第二曝光时间计算延时时长,包括:计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
电子设备可设置第一摄像头和第二摄像头在曝光一半的时刻一致,当第一摄像头曝光到一半的时候,第二摄像头也曝光到一半。当第二处理单元接收到第二摄像头发送的同步信号后,可计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。延时时长T3=|T1-T2|/2,其中,T1表示第一曝光时间,T2表示第二曝光时间。例如,第一摄像头的第一曝光时间为3ms(毫秒),第二摄像头的第二曝光时间为30ms,则可先计算第一曝光时间和第二曝光时间的曝光时差为27ms,并将曝光时差除以2,得到延时时长为13.5ms。
可选地,第二处理单元计算第一曝光时间及第二曝光时间的曝光时差后,可先将曝光时差与时间阈值进行比较,判断曝光时差是否大于时间阈值,若大于时间阈值,则可将曝光时差除以2,得到延时时长,并在第二处理单元接收到同步信号的时长达到延时时长时,再向第一摄像头转发同步信号。若曝光时差小于或等于时间阈值,则第二处理单元可直接向第一摄像头转发同步信号,不延长第一摄像头开始曝光的时刻。时间阈值可根据实际需求进行设定,例如1ms、2ms等,保证第一摄像头和第二摄像头的采集图像内容在可容忍的区别误差之内,减轻第二处理单元的计算压力。
在一个实施例中,为了保证当第一摄像头曝光到一半的时候,第二摄像头也曝光到一半,第二处理单元还可分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,其中,中间曝光时刻指的是曝光到一半的时刻。第二处理单元可确定第一中间曝光时刻和第二中间曝光时刻的差值,并将该差值作为延时时长。延时时长T3=|T1/2-T2/2|,其中,T1表示第一曝光时间,T2表示第二曝光时间。例如,第一摄像头的第一曝光时间为3ms,第二摄像头的第二曝光时间为30ms,则可先计算第一曝光时间的第一中间曝光时刻为1.5ms,第二曝光时间的第二中间曝光时刻为15ms,则可计算第一中间曝光时刻和第二中间曝光时刻的差值为13.5ms,可将该差值13.5ms作为延时时长。可以理解地,也可采用其他算法保证第一摄像头和第二摄像头之间的同步,并不仅限于上述几种方式。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
如图5所示,在一个实施例中,步骤将处理后的第一图像发送给所述第二处理单元,包括以下步骤:
步骤1502,通过第二处理单元中运行在第一运行模式的内核向第一处理单元发送图像采集指令,第一运行模式为可信运行环境。
在一个实施例中,电子设备中第二处理单元可包括两种运行模式,其中,第一运行模式可以为TEE,TEE为可信运行环境,安全级别高;第二运行模式可以为REE,REE为自然运行环境,REE的安全级较低。当第二处理单元接收到应用程序发送的数据获取请求后,可通过第一运行模式向第一处理单元发送图像采集指令。当第二处理单元为单核的CPU时,可直接将上述单核由第二运行模式切换到第一运行模式;当第二处理单元为多核时,可将一个内核由第二运行模式切换到第一运行模式,其他内核仍运行在第二运行模式中,并通过运行在第一运行模式下的内核向第一处理单元发送图像采集指令。
步骤1504,第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
第一处理单元对采集的第一图像进行处理后,可将处理后的第一图像发送给该运行在第一运行模式下的内核,可保证第一处理单元一直在可信运行环境下运行,提高安全性。第二处理单元可在该运行在第一运行模式下的内核中,根据处理后的第一图像得到目标图像,并根据应用程序的需求对目标图像进行处理。比如,第二处理单元可在运行在第一运行模式下的内核中对目标图像进行人脸检测。
在一个实施例中,由于运行在第一运行模式的内核是唯一的,第二处理单元在TEE环境下对目标图像进行人脸检测,可采集串行的方式逐一对目标图像进行人脸识别、人脸匹配和活体检测等。第二处理单元可先对目标图像进行人脸识别,当识别到人脸时,再将目标图像中包含的人脸与预先存储的人脸进行匹配,判断是否为同一人脸。若为同一人脸再根据目标图像对人脸进行活体检测,防止采集的人脸是二维的平面人脸等。当没有识别到人脸时,可不进行人脸匹配和活体检测,可减轻第二处理单元的处理压力。
在本实施例中,通过第二处理单元安全性高的内核向第一处理单元发送图像采集指令,可保证第一处理单元处于安全性高的环境中,提高数据的安全。
如图6所示,在一个实施例中,上述控制拍摄的方法,还包括以下步骤:
步骤1602,获取发送数据获取请求的应用程序的应用类型。
步骤1604,根据应用类型确定应用程序的安全级别。
当电子设备的应用程序向第二处理单元发送数据获取请求时,第二处理单元可获取应用程序的应用类型,并获取与应用类型对应的安全级别。应用类型可包括但不限于解锁应用、支付应用、相机应用、美颜应用等。不同应用类型的安全级别可不同,例如,支付应用和解锁应用对应的安全级别可为高,相机应用、美颜应用对应的安全级别可为低等,但不限于此。
步骤1606,选取与安全级别对应的数据传输通道。
第二处理单元可选取与应用程序的安全级别对应的数据传输通道,数据传输通道可包括但不限于安全通道和非安全通道,其中,安全通道可对应安全级别较高的应用程序,非安全通道可对应安全级别较低的应用程序。例如,支付应用可对应安全通道,美颜应用可对应普通通道。在安全通道中,可对传输的数据进行加密,避免数据泄露或被窃取。
步骤1608,当数据传输通道为安全通道时,第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
当数据传输通道为安全通道时,第一处理单元可将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。第二处理单元可在运行在第一运行模式的内核中根据处理后的第一图像得到目标图像,包括目标红外图像、目标散斑图像或目标深度图像等。第二处理单元可在该运行在第一运行模式的内核中对目标图像进行人脸检测,可采集串行的方式逐一对目标图像进行人脸识别、人脸匹配和活体检测等。第二处理单元可根据应用程序的需求将应用程序所需的数据通过安全通道传输给应用程序。例如,若应用程序需要进行人脸检测,则第二处理单元可将人脸检测的结果通过安全通道传输给应用程序; 若应用程序需要获取人脸的深度信息,则第二处理单元可将目标深度图通过安全通道传输给应用程序。
步骤1610,当数据传输通道为非安全通道时,第一处理单元将处理后的第一图像发送给处于第二运行模式的摄像头驱动,第二运行模式为自然运行环境。
当数据传输通道为非安全通道时,第一处理单元可将处理后的第一图像发送给摄像头驱动,摄像头驱动可运行在第二处理单元中处于第二运行模式的内核上。第二处理单元可通过摄像头驱动对目标图像进行人脸检测,其中,该目标图像可根据处理后的第一图像得到。第二处理单元可在REE环境中并行对目标图像进行人脸检测,可分别在多个处于第二运行模式的内核中对目标图像进行人脸识别、人脸匹配及活体检测等,可以提高数据处理的效率。摄像头驱动可根据应用程序的需求将应用程序所需的数据传输给应用程序。
在一个实施例中,第二处理单元可获取发送数据获取请求的应用程序的安全级别,并确定与安全级别对应的图像精度。图像精度越高,对应的图像可越清晰,包含的信息越多。第二处理单元可向应用程序发送与图像精度对应的图像数据,例如,第二处理单元向应用程序发送目标深度图像时,安全级别高的应用程序可对应图像精度高的目标深度图像,安全级别低的应用程序可对应图像精度低的目标深度图像。可选地,第二处理单元可通过调整图像分辨率调整图像数据的图像精度,分辨率越高,图像精度越高,分辨率越低,图像精度越低。也可通过控制镭射灯衍射的点的个数,图像精度越高,衍射的点可越多,图像精度越低,衍射的点可越少。可以理解地,也可采用其他方式控制图像精度,并不限于上述几种方式。根据应用程序的安全级别调整图像精度,可以提高图像数据的安全性。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,在安全通道中可提高数据传输的安全性,在非安全通道中可提高数据处理效率。
在一个实施例中,提供一种控制拍摄的方法,包括以下步骤:
步骤(1),当第二处理单元接收到数据获取请求时,根据数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,图像采集指令用于指示第一处理单元控制第一摄像头采集第一图像。
在一个实施例中,第一处理单元通过控制线连接第一摄像头,第二处理单元通过控制线连接第二摄像头,第一处理单元与第二处理单元连接,第二处理单元还通过信号线分别与第一摄像头及第二摄像头连接。
步骤(2),当第二处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。
步骤(3),根据第一曝光时间和第二曝光时间计算延时时长。
在一个实施例中,步骤(3),包括:计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,步骤(3),包括:分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻;确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
步骤(4),当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。
步骤(5),通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在一个实施例中,步骤(1),包括:通过第二处理单元中运行在第一运行模式的内核向第一处理单元发送图像采集指令,第一运行模式为可信运行环境;步骤(5),包括:第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
在一个实施例中,步骤(5),包括:获取发送数据获取请求的应用程序的应用类型;根据应用类型确定应用程序的安全级别;选取与安全级别对应的数据传输通道;当数据传输通道为安全通道时,第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核;当数据传输通道为非安全通道时,第一处理单元将处理后的第一图像发送给处于第二运行模式的摄像头驱动,第二运行模式为自然运行环境。
在一个实施例中,上述控制拍摄的方法,还包括:获取发送数据获取请求的应用程序的安全级别;确定与安全级别对应的图像精度;向应用程序发送与图像精度对应的图像数据。
在本实施例中,当第二处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,通过第二处理单元动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
应该理解的是,虽然上述各个流程示意图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,上述各个流程示意图中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
在一个实施例中,提供一种电子设备,包括第一处理单元、第二处理单元和摄像头模组,第一处理单元分别与第二处理单元和摄像头模组相连。摄像头模组包括第一摄像头和第二摄像头,第一处理单元通过控制线连接第一摄像头,第二处理单元通过控制线连接第二摄像头。第一处理单元与第二处理单元连接,第二处理单元还通过信号线分别与第一摄像头及第二摄像头连接。
第二处理单元,用于当接收到数据获取请求时,根据数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令。
第一处理单元,用于根据图像采集指令控制第一摄像头采集第一图像。
第二摄像头,用于采集每帧第二图像时在开始曝光的时刻向第二处理单元发送同步信号。
第二处理单元,还用于当第二处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,并根据第一曝光时间和第二曝光时间计算延时时长。
第二处理单元,还用于当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号。
第一摄像头,用于根据同步信号开始曝光并采集第一图像。
第一处理单元,还用于对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在本实施例中,当第二处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,通过第二处理单元动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,第二处理单元,还用于计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,第二处理单元,还用于分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
在一个实施例中,第二处理单元,还用于通过第二处理单元中运行在第一运行模式的内核向第一处理单元发送图像采集指令,第一运行模式为可信运行环境。
第一处理单元,还用于将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
在本实施例中,通过第二处理单元安全性高的内核向第一处理单元发送图像采集指令,可保证第一处理单元处于安全性高的环境中,提高数据的安全。
在一个实施例中,第二处理单元,还用于获取发送数据获取请求的应用程序的应用类型,并根据应用类型确定应用程序的安全级别,再选取与安全级别对应的数据传输通道。
第一处理单元,还用于当数据传输通道为安全通道时,将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
第一处理单元,还用于当数据传输通道为非安全通道时,将处理后的第一图像发送给处于第二运行 模式的摄像头驱动,第二运行模式为自然运行环境。
在一个实施例中,第二处理单元,还用于获取发送数据获取请求的应用程序的安全级别,确定与安全级别对应的图像精度,并向应用程序发送与图像精度对应的图像数据。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,在安全通道中可提高数据传输的安全性,在非安全通道中可提高数据处理效率。
如图7所示,在一个实施例中,提供一种控制拍摄的装置700,包括请求接收模块710、信号接收模块720、计算模块730、信号转发模块740及处理模块750。
请求接收模块710用于当第二处理单元接收到数据获取请求时,根据数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,图像采集指令用于指示第一处理单元控制第一摄像头采集第一图像。
在一个实施例中,第一处理单元通过控制线连接第一摄像头,第二处理单元通过控制线连接第二摄像头,第一处理单元与第二处理单元连接,第二处理单元还通过信号线分别与第一摄像头及第二摄像头连接。
信号接收模块720,用于当第二处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。计算模块730用于根据第一曝光时间和第二曝光时间计算延时时长。信号转发模块740用于当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发所述同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。处理模块750用于通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在本实施例中,当第二处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第二处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,通过第二处理单元动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,计算模块730,还用于计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,计算模块730,还用于分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
在一个实施例中,请求接收模块710,还用于通过第二处理单元中运行在第一运行模式的内核向第一处理单元发送图像采集指令,第一运行模式为可信运行环境。
处理模块750,还用于通过第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。
在本实施例中,通过第二处理单元安全性高的内核向第一处理单元发送图像采集指令,可保证第一处理单元处于安全性高的环境中,提高数据的安全。
如图8所示,在一个实施例中,处理模块750,包括类型获取单元752、级别确定单元754、选取单元756及发送单元758。
类型获取单元752用于获取发送数据获取请求的应用程序的应用类型。级别确定单元754用于根据应用类型确定所述应用程序的安全级别。选取单元756用于选取与安全级别对应的数据传输通道。发送单元758当数据传输通道为安全通道时,通过第一处理单元将处理后的第一图像发送给第二处理单元中运行在第一运行模式的内核。发送单元758还用于当数据传输通道为非安全通道时,通过第一处理单元将处理后的第一图像发送给处于第二运行模式的摄像头驱动,第二运行模式为自然运行环境。
在一个实施例中,级别确定单元754,还用于获取发送数据获取请求的应用程序的安全级别,并确定与安全级别对应的图像精度。
发送单元758,还用于向应用程序发送与图像精度对应的图像数据。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,在安全通道中可提高数据传输的安全性,在非安全通道中可提高数据处理效率。
第二实施方式:
图9为一个实施例中控制拍摄的方法的应用场景图。如图9所示,该应用场景可包括第一摄像头110、第二摄像头120、第一处理单元130及第二处理单元140。第一摄像头110可以为激光摄像头,第二摄像头120可以为RGB(Red/Green/Blue,红/绿/蓝色彩模式)摄像头。第一处理单元130可为MCU(Microcontroller Unit,微控制单元)模块等,第二处理单元140可为CPU(Central Processing Unit,中央处理器)模块等。第一处理单元130通过控制线连接第一摄像头110,第二处理单元140通过控制线连接第二摄像头120。第一处理单元130与第二处理单元连接140。第一处理单元130还通过信号线分别与第一摄像头110及第二摄像头120连接。
当第二处理单元140接收到数据获取请求时,可根据数据获取请求通过控制线控制第二摄像头120采集第二图像,并向第一处理单元130发送图像采集指令。当第一处理单元130接收到第二处理单元140发送的图像采集指令时,可根据图像采集指令通过控制线控制第一摄像头110采集第一图像。第二摄像头120采集每帧第二图像时,可在开始曝光的时刻通过信号线向第一处理单元130发送同步信号。当第一处理单元130接收到第二摄像头120发送的同步信号时,可获取第一摄像头110的第一曝光时间和第二摄像头120的第二曝光时间,并根据第一曝光时间和第二曝光时间计算延时时长。当第一处理单元130接收到同步信号的时长达到延时时长时,可通过信号线向第一摄像头110转发同步信号。第一摄像头110接收到同步信号后,即可开始曝光并采集第一图像,可将采集的第一图像传输给第一处理单元130。第一处理单元130可对第一图像进行处理,并将处理后的第一图像发送给第二处理单元140。
图2为另一个实施例中控制拍摄的方法的应用场景图。如图2所示,电子设备200可包括摄像头模组210、第二处理单元220,第一处理单元230。上述第二处理单元220可为CPU模块。上述第一处理单元230可为MCU模块等。其中,第一处理单元230连接在第二处理单元220和摄像头模组210之间,上述第一处理单元230可控制摄像头模组210中激光摄像头212、泛光灯214和镭射灯218,上述第二处理单元220可控制摄像头模组210中RGB摄像头216。
摄像头模组210中包括激光摄像头212、泛光灯214、RGB摄像头216和镭射灯218。上述激光摄像头212可为红外摄像头,用于获取红外图像。上述泛光灯214为可发射红外光的面光源;上述镭射灯218为可发射激光的点光源且为带有图案的点光源。其中,当泛光灯214为面光源时,激光摄像头212可根据反射回的光线获取红外图像。当镭射灯218为点光源时,激光摄像头212可根据反射回的光线获取散斑图像。上述散斑图像是镭射灯218发射的带有图案的激光被反射后图案发生形变的图像。
第一处理单元230可通过信号线分别连接RGB摄像头216和激光摄像头212。当RGB摄像头216采集每帧图像时,可向第一处理单元230发送同步信号。第一处理单元230接收到RGB摄像头216发送的同步信号后,可获取激光摄像头212的曝光时长和RGB摄像头216的曝光时长,并根据激光摄像头212的曝光时长和RGB摄像头216的曝光时长计算延时时长。当第一处理单元230接收到同步信号的时长达到该延时时长时,可通过信号线向激光摄像头212转发同步信号。激光摄像头212接收到同步信号,可根据同步信号开始进行曝光并采集红外图像或散斑图像等。
第二处理单元220可包括在TEE(Trusted execution environment,可信运行环境)环境下运行的CPU内核和在REE(Rich Execution Environment,自然运行环境)环境下运行的CPU内核。其中,TEE环境和REE环境均为ARM模块(Advanced RISC Machines,高级精简指令集处理器)的运行模式。其中,TEE环境的安全级别较高,第二处理单元220中有且仅有一个CPU内核可同时运行在TEE环境下。通常情况下,电子设备200中安全级别较高的操作行为需要在TEE环境下的CPU内核中执行,安全级别较低的操作行为可在REE环境下的CPU内核中执行。
第一处理单元230包括PWM(Pulse Width Modulation,脉冲宽度调制)模块232、SPI/I2C(Serial Peripheral Interface/Inter-Integrated Circuit,串行外设接口/双向二线制同步串行接口)接口234、RAM(Random Access Memory,随机存取存储器)模块236和深度引擎238。上述PWM模块232可向摄像头模组发射脉冲,控制泛光灯214或镭射灯218开启,使得激光摄像头212可采集到红外图像或散斑图像。上述SPI/I2C接口234用于接收第二处理单元220发送的图像采集指令。上述深度引擎238可对散斑图像进行处理得到深度视差图。
当第二处理单元220接收到应用程序的数据获取请求时,例如,当应用程序需要进行人脸解锁、人脸支付时,可通过运行在TEE环境下的CPU内核向第一处理单元230发送图像采集指令。当第一处理单元230接收到图像采集指令后,可通过PWM模块232发射脉冲波控制摄像头模组210中泛光灯214开启并通过激光摄像头212采集红外图像、控制摄像头模组210中镭射灯218开启并通过激光摄像头212采集散斑图像。摄像头模组210可将采集到的红外图像和散斑图像发送给第一处理单元230。第一处理单元230可对接收到的红外图像进行处理得到红外视差图;对接收到的散斑图像进行处理得到散斑视差图或深度视差图。其中,第一处理单元230对上述红外图像和散斑图像进行处理是指对红外图像或散斑图像进行校正,去除摄像头模组210中内外参数对图像的影响。其中,第一处理单元230可设置成不同的模式,不同模式输出的图像不同。当第一处理单元230设置为散斑图模式时,第一处理单元230对散斑图像处理得到散斑视差图,根据上述散斑视差图可得到目标散斑图像;当第一处理单元230设置为深度图模式时,第一处理单元230对散斑图像处理得到深度视差图,根据上述深度视差图可得到深度图像,上述深度图像是指带有深度信息的图像。第一处理单元230可将上述红外视差图和散斑视差图发送给第二处理单元220,第一处理单元230也可将上述红外视差图和深度视差图发送给第二处理单元220。第二处理单元220可根据上述红外视差图获取目标红外图像、根据上述深度视差图获取深度图像。进一步的,第二处理单元220可根据目标红外图像、深度图像来进行人脸识别、人脸匹配、活体检测以及获取检测到的人脸的深度信息。
第一处理单元230与第二处理单元220之间通信是通过固定的安全接口,用以确保传输数据的安全性。如图9所示,第二处理单元220发送给第一处理单元230的数据是通过SECURE SPI/I2C 240,第一处理单元230发送给第二处理单元220的数据是通过SECURE MIPI(Mobile Industry Processor Interface,移动产业处理器接口)250。
在一个实施例中,第一处理单元230也可根据上述红外视差图获取目标红外图像、上述深度视差图计算获取深度图像,再将上述目标红外图像、深度图像发送给第二处理单元220。
图3为一个实施例中电子设备的框图。如图3所示,该电子设备300包括通过系统总线350连接的处理器310、存储器320、显示屏330和输入装置340。其中,存储器320可包括非易失性存储介质322及内存储器324。电子设备300的非易失性存储介质322存储有操作系统3222及计算机程序3224,该计算机程序3224被处理器310执行时以实现本申请实施例中提供的一种控制拍摄的方法。该处理器310用于提供计算和控制能力,支撑整个电子设备300的运行。电子设备300中的内存储器324为非易失性存储介质322中的计算机程序3224的运行提供环境。电子设备300的显示屏330可以是液晶显示屏或者电子墨水显示屏等,输入装置340可以是显示屏330上覆盖的触摸层,也可以是电子设备300外壳上设置的按键、轨迹球或触控板,也可以是外接的键盘、触控板或鼠标等。该电子设备300可以是手机、平板电脑或者个人数字助理或穿戴式设备等。本领域技术人员可以理解,图3中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的电子设备的限定,具体的电子设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
如图10所示,在一个实施例中,提供一种控制拍摄的方法,包括以下步骤:
步骤2410,当第一处理单元接收到第二处理单元发送的图像采集指令时,根据图像采集指令控制第一摄像头采集第一图像,图像采集指令为第二处理单元接收到数据获取请求时发送,数据获取请求用于指示第二处理单元控制第二摄像头采集第二图像。
当电子设备中应用程序需要获取人脸数据时,可控制第一摄像头开启,并采集第一图像,其中,人脸数据可包括但不限于人脸解锁、人脸支付等场景下需要进行人脸验证的数据,和人脸深度信息等。第一摄像头可以为激光摄像头,激光摄像头可以采集到不同波长的不可见光图像。第一图像可包括但不限于红外图像、散斑图像等,散斑图像指的是带有散斑图像的红外图像。
当应用程序需要获取人脸数据时,可向第二处理单元发送数据获取请求。第二处理单元接收数据获取请求后,可向第一处理单元发送图像采集指令,其中,第一处理单元可以是MCU模块,第二处理单元可以是CPU模块。可选地,第二处理单元可先检测数据获取请求中是否包含可见光图像获取指令,若包含可见光图像获取指令,则可说明应用程序在获取人脸数据的同时,需要同时获取包含人脸的可见光图像。若数据获取请求中包含可见光图像获取指令,第二处理单元可根据可见光图像获取指令控制第二摄像头采集第二图像,其中,第二摄像头可以是RGB摄像头,第二图像则可以是包含人脸的RGB 图像。
当第一处理单元接收图像采集指令后,可根据图像采集指令控制第一摄像头采集第一图像,其中,第一图像可包括红外图像、散斑图像等。第一处理单元可控制开启摄像头模组中的泛光灯并通过激光摄像头采集红外图像,可开启摄像头模组中的镭射灯等激光器并通过激光摄像头采集散斑图像等。泛光灯可为一种向四面八方均匀照射的点光源,泛光灯发射的光线可为红外光,激光摄像头可采集人脸得到红外图像。激光器发出的激光可由透镜和DOE(diffractive optical elements,光学衍射元件)进行衍射产生带散斑颗粒的图案,通过带散斑颗粒的图案投射到目标物体,受目标物体各点与电子设备的距离不同产生散斑图案的偏移,激光摄像头对目标物体进行采集得到散斑图像。
步骤2420,当第一处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。
第一处理单元可通过控制线连接第一摄像头,通过控制线控制第一摄像头采集第一图像。第二处理单元可通过控制线连接第二摄像头,通过控制线控制第二摄像头采集第二图像。第一处理单元可与第二处理单元连接。第一处理单元还可通过信号线分别与第一摄像头及第二摄像头连接,其中,信号线可以是同步信号线。
第二摄像头在采集每帧图像时,可在开始曝光的时刻向连接了信号线的第一处理单元发送同步信号,该同步信号可以是帧的起始标志SOF(Start of Frame),可用于每帧图像开始曝光。当第一处理单元接收到第二摄像头发送的同步信号,可获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,曝光时长可指的是感光时长,曝光时长越长,进的光可越多。通常来说,第一摄像头的第一曝光时间与第二摄像头的第二曝光时间差异较大,第一摄像头的第一曝光时间可小于第二摄像头的第二曝光时间,但不限于此,可能也存在第一摄像头的第一曝光时间大于第二摄像头的第二曝光时间的情况等。
步骤2430,根据第一曝光时间和第二曝光时间计算延时时长。
第一处理单元可根据第一摄像头的第一曝光时间与第二摄像头的第二曝光时间计算延时时长,该延时时长指的是延长第一摄像头开始曝光的时间长度,通过延后第一摄像头开始曝光的时刻,从而可保证第一摄像头与第二摄像头同步。
在一个实施例中,电子设备可预先设置第一摄像头和第二摄像头在曝光过程中同步的时刻,其中,在曝光过程中同步的时刻可指的是第一摄像头已曝光的时长占第一曝光时间的比例与第二摄像头已曝光的时长占第二曝光时间的比例相同。比如,可设置第一摄像头和第二摄像头同时结束曝光,或是在曝光一半的时刻一致,或是在曝光到达3/4时的时刻一致等。第一处理单元可根据第一曝光时间、第二曝光时间及设置的在曝光过程中同步的时刻计算延时时长。
步骤2440,当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。
第一处理单元计算得到延时时长后,可在接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号。第一摄像头接收同步信号后,开始进行曝光,从而可保证第一摄像头和第二摄像头在曝光过程中同步的时刻保持一致。例如,电子设备可预先设备在曝光一半的时刻一致,则第一处理单元计算得到延时时长,并在接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,可在第一摄像头曝光到一半的时候,第二摄像头也曝光到一半,二者保持一致。
步骤2450,通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
第一摄像头可将采集的第一图像发送给第一处理单元,第一处理单元可对第一图像进行处理。第一处理单元可设置成不同的模式,不同模式可采集不同的第一图像,并对第一图像进行不同的处理等。当第一处理单元为红外模式时,第一处理单元可控制开启泛光灯,并通过第一摄像头采集红外图像,可对红外图像进行处理得到红外视差图。当第一处理单元为散斑图模式时,第一处理单元可控制开启镭射灯,并通过第一摄像头采集散斑图像,可对散斑图像进行处理得到散斑视差图。当第一处理单元为深度图模式时,第一处理单元可对散斑图像进行处理得到深度视差图。
在一个实施例中,第一处理单元可对第一图像进行校正处理,进行校正处理是指校正第一图像由于第一摄像头及第二摄像头的内外参数等造成的图像内容偏移,例如由于激光摄像头偏转角度、激光摄像头和RGB摄像头之间的摆放位置等引起的图像内容偏移等。对第一图像进行校正处理后,可得到第一 图像的视差图,例如,对红外图像进行校正处理得到红外视差图,对散斑图像进行校正可得到散斑视差图或深度视差图等。对第一图像进行校正处理,可以防止最终在电子设备的屏幕上呈现的图像出现重影的情况。
第一处理单元对第一图像进行处理,可将处理后的第一图像发送给第二处理单元。第二处理单元可根据处理后的第一图像得到目标图像,比如目标红外图像、目标散斑图像及目标深度图等。第二处理单元可根据应用程序的需求对目标图像进行处理。
例如,应用程序需要进行人脸验证时,则第二处理单元则可根据目标图像等进行人脸检测,其中,人脸检测可包括人脸识别、人脸匹配和活体检测。人脸识别是指识别目标图像中是否存在人脸,人脸匹配是指将目标图像中人脸与预存的人脸进行匹配,活体检测是指检测目标图像中人脸是否具有生物活性等。若应用程序需要获取人脸的深度信息,则可将生成的目标深度图上传至应用程序,应用程序可根据接收到的目标深度图进行美颜处理、三维建模等。
在本实施例中,当第一处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,步骤2430根据第一曝光时间和第二曝光时间计算延时时长,包括:计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
电子设备可设置第一摄像头和第二摄像头在曝光一半的时刻一致,当第一摄像头曝光到一半的时候,第二摄像头也曝光到一半。当第一处理单元接收到第二摄像头发送的同步信号后,可计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。延时时长T3=|T1-T2|/2,其中,T1表示第一曝光时间,T2表示第二曝光时间。例如,第一摄像头的第一曝光时间为3ms(毫秒),第二摄像头的第二曝光时间为30ms,则可先计算第一曝光时间和第二曝光时间的曝光时差为27ms,并将曝光时差除以2,得到延时时长为13.5ms。
可选地,第一处理单元计算第一曝光时间及第二曝光时间的曝光时差后,可先将曝光时差与时间阈值进行比较,判断曝光时差是否大于时间阈值,若大于时间阈值,则可将曝光时差除以2,得到延时时长,并在第一处理单元接收到同步信号的时长达到延时时长时,再向第一摄像头转发同步信号。若曝光时差小于或等于时间阈值,则第一处理单元可直接向第一摄像头转发同步信号,不延长第一摄像头开始曝光的时刻。时间阈值可根据实际需求进行设定,例如1ms、2ms等,保证第一摄像头和第二摄像头的采集图像内容在可容忍的区别误差之内,减轻第一处理单元的计算压力。
在一个实施例中,为了保证当第一摄像头曝光到一半的时候,第二摄像头也曝光到一半,第一处理单元还可分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,其中,中间曝光时刻指的是曝光到一半的时刻。第一处理单元可确定第一中间曝光时刻和第二中间曝光时刻的差值,并将该差值作为延时时长。延时时长T3=|T1/2-T2/2|,其中,T1表示第一曝光时间,T2表示第二曝光时间。例如,第一摄像头的第一曝光时间为3ms,第二摄像头的第二曝光时间为30ms,则可先计算第一曝光时间的第一中间曝光时刻为1.5ms,第二曝光时间的第二中间曝光时刻为15ms,则可计算第一中间曝光时刻和第二中间曝光时刻的差值为13.5ms,可将该差值13.5ms作为延时时长。可以理解地,也可采用其他算法保证第一摄像头和第二摄像头之间的同步,并不仅限于上述几种方式。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
如图11所示,在一个实施例中,步骤2450通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元,包括以下步骤:
步骤2502,获取存储的参考散斑图像,参考散斑图像带有参考深度信息。
在摄像机坐标系中,以垂直成像平面并穿过镜面中心的直线为Z轴,若物体在摄像机坐标系的坐标为(X,Y,Z),那么其中的Z值即为物体在该摄像机成像平面的深度信息。若应用程序需要获取人脸的 深度信息,则需要采集包含人脸深度信息的深度图。第一处理单元可控制开启镭射灯,并通过第一摄像头采集散斑图像。第一处理单元中可预先存储有参考散斑图,参考散斑图可带有参考深度信息,可根据采集的散斑图像及参考散斑图像获取散斑图像中包含的各个像素点的深度信息。
步骤2504,将参考散斑图像与散斑图像进行匹配,得到匹配结果。
第一处理单元可依次以采集的散斑图像中包含的各个像素点为中心,选择一个预设大小像素块,例如31pixel(像素)*31pixel大小,在参考散斑图像上搜索与选择的像素块相匹配的块。第一处理单元可从采集的散斑图像中选择的像素块和参考散斑图像相匹配的块中,找到散斑图像及参考散斑图像中分别在同一条激光光路上的两个点,同一激光光路上的两个点的散斑信息一致,在同一条激光光路上的两个点可认定为对应的像素点。参考散斑图像中,每一条激光光路上的点的深度信息都是已知的。第一处理单元可计算目标散斑图像与参考散斑图像在同一条激光光路上的两个对应的像素点之间的偏移量,并根据偏移量计算得到采集的散斑图中包含的各个像素点的深度信息。
在一个实施例中,第一处理单元将采集的散斑图像与参考散斑图进行偏移量的计算,根据偏移量计算得到散斑图像中包含的各个像素点的深度信息,其计算公式可如式(1)所示:
Z_D = L×f×Z_0/(L×f+P×Z_0)    (1)
其中,Z_D表示像素点的深度信息,也即像素点的深度值;L为激光摄像头与激光器之间的距离;f为激光摄像头中透镜的焦距,Z_0为参考散斑图像采集时参考平面距离电子设备的激光摄像头的深度值,P为采集的散斑图像与参考散斑图像中对应像素点之间的偏移量。P可由目标散斑图与参考散斑图中像素点偏移的像素量乘以一个像素点的实际距离得到。当目标物体与激光摄像头之间的距离大于参考平面与激光摄像头之间的距离时,P为负值,当目标物体与激光摄像头之间的距离小于参考平面与第一采集器之间的距离时,P为正值。
步骤2506,根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元,通过第二处理单元对深度视差图进行处理得到深度图。
第一处理单元得到采集的散斑图像中包含的各个像素点的深度信息,可对采集的散斑图像进行校正处理,校正采集的散斑图像由于第一摄像头及第二摄像头的内外参数等造成的图像内容偏移。第一处理单元可根据校正后的散斑图像,以及散斑图像中各个像素点的深度值,生成深度视差图,并将深度视差图发送给第二处理单元。第二处理单元可根据深度视差图得到深度图,深度图中可包含各个像素点的深度信息。第二处理单元可将深度图上传至应用程序,应用程序可根据深度图中人脸的深度信息进行美颜、三维建模等。第二处理单元也可根据深度图中人脸的深度信息进行活体检测,可防止采集的人脸是二维的平面人脸等。
在本实施例中,通过第一处理单元可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
如图12所示,在一个实施例中,在步骤2502获取存储的参考散斑图像之前,还包括以下步骤:
步骤2602,每隔采集时间段采集激光器的温度,并获取与温度对应的参考散斑图像。
电子设备可在激光器旁设置有温度传感器,其中,激光器指的是镭射灯等,并通过温度传感器采集激光器的温度。第二处理单元可每隔采集时间段获取温度传感器采集的激光器的温度,其中,采集时间段可根据实际需求进行设定,例如3秒、4秒等,但不限于此。由于当激光器的温度发生变化时,可能会对摄像头模组造成形变,影响第一摄像头和第二摄像头的内外参数。不同温度下对摄像头的影响不同,因此,在不同的温度下,可对应不同的参考散斑图像
第二处理单元可获取与温度对应的参考散斑图像,并根据与温度对应的参考散斑图像对在该温度下采集的散斑图像进行处理,得到深度图。可选地,第二处理单元可预先设定多个不同的温度区间,比如0℃(摄式度)~30℃,30℃~60℃,60℃~90℃等,但不限于此,不同温度区间可对应不同的参考散斑图像。第二处理单元采集温度后,可确定该温度所处的温度区间,并获取与该温度区间对应的参考散斑图像。
步骤2604,当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将本次获取的参考散斑图写入第一处理单元。
第二处理单元获取与采集的温度对应的参考散斑图像后,可判断本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像是否一致,参考散斑图像中可携带有图像标识,图像标识可以由数字、字线及字符等中的一种或多种组成。第二处理单元可从第一处理单元中读取存储的参考散斑图像的图像标识,并将本次获取的参考散斑图像的图像标识与从第一处理单元读取的图像标识进行比较。若两个图像标识不一致,则可说明本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致,则第二处理单元可将本次获取的参考散斑图像写入第一处理单元。第一处理单元可存储新写入的参考散斑图像,并删除之前存储的参考散斑图像。
在本实施例中,可根据激光器的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
如图13所示,在一个实施例中,上述控制拍摄的方法,还包括以下步骤:
步骤2702,当第二处理单元接收到应用程序的数据获取请求,获取应用程序的安全级别。
在一个实施例中,电子设备中第二处理单元可包括两种运行模式,其中,第一运行模式可以为TEE,TEE为可信运行环境,安全级别高;第二运行模式可以为REE,REE为自然运行环境,REE的安全级较低。当第二处理单元接收到应用程序发送的数据获取请求后,可通过第一运行模式向第一处理单元发送图像采集指令。当第二处理单元为单核时,可直接将上述单核由第二运行模式切换到第一运行模式;当第二处理单元为多核时,可将一个内核由第二运行模式切换到第一运行模式,其他内核仍运行在第二运行模式中,并通过运行在第一运行模式下的内核向第一处理单元发送图像采集指令。第一处理单元对采集的第一图像进行处理后,可将处理后的第一图像发送给该运行在第一运行模式下的内核,可保证第一处理单元一直在可信运行环境下运行,提高安全性。
当电子设备的应用程序向第二处理单元发送数据获取请求时,第二处理单元可获取应用程序的应用类型,并获取与应用类型对应的安全级别。应用类型可包括但不限于解锁应用、支付应用、相机应用、美颜应用等。不同应用类型的安全级别可不同,例如,支付应用和解锁应用对应的安全级别可为高,相机应用、美颜应用对应的安全级别可为低等,但不限于此。
步骤2704,确定与安全级别对应的数据传输通道。
第二处理单元可确定与应用程序的安全级别对应的数据传输通道,数据传输通道可包括但不限于安全通道和普通通道,其中,安全通道可对应安全级别较高的应用程序,普通通道可对应安全级别较低的应用程序。例如,支付应用可对应安全通道,美颜应用可对应普通通道。在安全通道中,可对传输的数据进行加密,避免数据泄露或被窃取。
步骤2706,将深度图通过对应的数据传输通道发送给应用程序。
第二处理单元可将深度图通过与应用程序的安全级别对应的数据传输通道发送给应用程序,通过安全通道向安全级别较高的应用程序发送深度图,可对深度图进行加密,通过普通道向安全级别较低的应用程序发送深度图,可加快数据传输速度。可选地,除了向应用程序发送深度图外,还可通过与应用程序的安全级别对应的数据传输通道向应用程序发送其他数据,例如,进行人脸验证的验证结果等,但不限于此。
在一个实施例中,第二处理单元可根据应用程序的安全级别,向应用程序发送与安全级别对应精度的深度图,精度越高,对应的深度图像越清晰,包含的深度信息越多。安全级别高的应用程序可对应精度高的深度图,安全级别低的应用程序可对应精度低的深度图。可选地,第二处理单元可通过调整图像分辨率调整图像数据的图像精度,分辨率越高,图像精度越高,分辨率越低,图像精度越低。也可通过控制镭射灯衍射的点的个数,图像精度越高,衍射的点可越多,图像精度越低,衍射的点可越少。可以理解地,也可采用其他方式控制图像精度,并不限于上述几种方式。根据应用程序的安全级别调整深度图的精度,可以提高数据的安全性。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,提高数据传输的安全性。
在一个实施例中,提供一种控制拍摄的方法,包括以下步骤:
步骤(1),当第一处理单元接收到第二处理单元发送的图像采集指令时,根据图像采集指令控制第一摄像头采集第一图像,图像采集指令为第二处理单元接收到数据获取请求时发送,数据获取请求用于指示第二处理单元控制第二摄像头采集第二图像。
在一个实施例中,第一处理单元通过控制线连接第一摄像头,第二处理单元通过控制线连接第二摄 像头,第一处理单元与第二处理单元连接,第一处理单元还通过信号线分别与第一摄像头及第二摄像头连接。
步骤(2),当第一处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。
步骤(3),根据第一曝光时间和第二曝光时间计算延时时长。
在一个实施例中,步骤(3),包括:计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,步骤(3),包括:分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻;确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
步骤(4),当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。
步骤(5),通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在一个实施例中,步骤(5),包括:获取存储的参考散斑图像,参考散斑图像带有参考深度信息;将参考散斑图像与散斑图像进行匹配,得到匹配结果;根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元,通过第二处理单元对深度视差图进行处理得到深度图。
在一个实施例中,在步骤获取存储的参考散斑图像之前,还包括:每隔采集时间段采集激光器的温度,并获取与温度对应的参考散斑图像;当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将本次获取的参考散斑图像写入第一处理单元。
在一个实施例中,上述控制拍摄的方法,还包括:当第二处理单元接收到应用程序的数据获取请求,获取应用程序的安全级别;确定与安全级别对应的数据传输通道;将深度图通过对应的数据传输通道发送给应用程序。
在本实施例中,当第一处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
应该理解的是,虽然上述各个流程示意图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,上述各个流程示意图中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
在一个实施例中,提供一种电子设备,包括第一处理单元、第二处理单元和摄像头模组,第一处理单元可分别与第二处理单元和摄像头模组相连。摄像头模组可包括第一摄像头和第二摄像头,第一处理单元可通过控制线连接第一摄像头,第二处理单元可通过控制线连接第二摄像头。第一处理单元与第二处理单元连接,第一处理单元还通过信号线分别与第一摄像头及第二摄像头连接。
第二处理单元用于当接收到数据获取请求时,根据数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令。第一处理单元用于当接收到第二处理单元发送的图像采集指令时,根据图像采集指令控制第一摄像头采集第一图像。第二摄像头用于采集每帧第二图像时在开始曝光的时刻向第一处理单元发送同步信号。第一处理单元还用于当第一处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间。第一处理单元还用于根据第一曝光时间和第二曝光时间计算延时时长。第一处理单元还用于当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发所述同步信号。第一摄像头用于根据同步信号开始曝光并采集第一图像。第一处理单元还用于通过第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在本实施例中,当第一处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长 计算延时时长,当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,第一处理单元,还用于计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,第一处理单元,还用于分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
在一个实施例中,第一处理单元,还用于获取存储的参考散斑图像,并将将参考散斑图像与所述散斑图像进行匹配,得到匹配结果,参考散斑图像带有参考深度信息。
第一处理单元还用于根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元。第二处理单元还用于对深度视差图进行处理得到深度图。
在本实施例中,通过第一处理单元可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
在一个实施例中,第二处理单元,还用于每隔采集时间段采集激光器的温度,并获取与温度对应的参考散斑图像。
第二处理单元,还用于当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将本次获取的参考散斑图像写入第一处理单元。
在本实施例中,可根据激光器的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
在一个实施例中,第二处理单元,还用于当接收到应用程序的数据获取请求,获取应用程序的安全级别。
第二处理单元,还用于确定与安全级别对应的数据传输通道,并将深度图通过对应的数据传输通道发送给应用程序。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,提高数据传输的安全性。
如图14所示,在一个实施例中,提供一种控制拍摄的装置800,包括图像采集模块810、信号接收模块820、计算模块830、信号转发模块840、处理模块850。图像采集模块810用于当第一处理单元接收到第二处理单元发送的图像采集指令时,根据图像采集指令控制第一摄像头采集第一图像,图像采集指令为第二处理单元接收到数据获取请求时发送,数据获取请求用于指示第二处理单元控制第二摄像头采集第二图像。信号接收模块820用于当第一处理单元接收到第二摄像头发送的同步信号时,获取第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,同步信号为第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号。计算模块830用于根据第一曝光时间和第二曝光时间计算延时时长。信号转发模块840用于当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发所述同步信号,同步信号用于指示第一摄像头开始曝光并采集第一图像。处理模块850用于通过第一处理单元对第一图像进行处理,并将处理后的第一图像发送给第二处理单元。
在本实施例中,当第一处理单元接收到第二摄像头发送的同步信号时,根据两个摄像头的曝光时长计算延时时长,当第一处理单元接收到同步信号的时长达到延时时长时,向第一摄像头转发同步信号,根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,同步效果好,在两个摄像头之间的曝光时长差别较大时,依然可以保证两个摄像头采集的图像内容一致。
在一个实施例中,计算模块830,还用于计算第一曝光时间及第二曝光时间的曝光时差,并将曝光时差除以2,得到延时时长。
在一个实施例中,计算模块830,分别计算第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定第一中间曝光时刻和第二中间曝光时刻的差值,并将差值作为延时时长。
在本实施例中,可根据第一摄像头和第二摄像头的曝光时长动态调整转发同步信号的时间点,从而可动态调整第一摄像头和第二摄像头同步的时机,保证第一摄像头和第二摄像头在曝光一半的时刻保持一致,同步效果好。
如图15所示,在一个实施例中,处理模块850,包括图像获取单元852、匹配单元854及生成单元856。图像获取单元852用于获取存储的参考散斑图像,参考散斑图像带有参考深度信息。匹配单元854用于将参考散斑图像与散斑图像进行匹配,得到匹配结果。生成单元856用于根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元,通过第二处理单元对深度视差图进行处理得到深度图。
在本实施例中,通过第一处理单元可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
在一个实施例中,上述处理模块850,除了包括图像获取单元852、匹配单元854及生成单元856,还包括温度采集单元及写入单元。
温度采集单元用于每隔采集时间段采集激光器的温度,并获取与温度对应的参考散斑图像。写入单元用于当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将本次获取的参考散斑图像写入第一处理单元。
在本实施例中,可根据激光器的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
在一个实施例中,上述控制拍摄的装置800,除了包括图像采集模块810、信号接收模块820、计算模块830、信号转发模块840及处理模块850,还包括级别获取模块、通道确定模块及发送模块。级别获取模块用于当第二处理单元接收到应用程序的数据获取请求,获取应用程序的安全级别。通道确定模块用于确定与安全级别对应的数据传输通道。发送模块用于将深度图通过对应的数据传输通道发送给应用程序。
在本实施例中,根据应用程序的安全级别选取对应的数据通道来传输数据,提高数据传输的安全性。
在一个实施例中,提供一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时实现上述的控制拍摄的方法。
在一个实施例中,提供一种包含计算机程序的计算机程序产品,当其在计算机设备上运行时,使得计算机设备执行时实现上述的控制拍摄的方法。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一非易失性计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)等。
如此处所使用的对存储器、存储、数据库或其它介质的任何引用可包括非易失性和/或易失性存储器。合适的非易失性存储器可包括只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM),它用作外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDR SDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)。
以上所述实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。
以上所述实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对申请专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。

Claims (29)

  1. 一种控制拍摄的方法,其特征在于,包括:
    当第二处理单元接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,所述图像采集指令用于指示所述第一处理单元控制第一摄像头采集第一图像;
    当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;
    根据所述第一曝光时间和第二曝光时间计算延时时长;
    当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;
    通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  2. 根据权利要求1所述的方法,其特征在于,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第二处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接。
  3. 根据权利要求1所述的方法,其特征在于,所述根据所述第一曝光时间和第二曝光时间计算延时时长,包括:
    计算所述第一曝光时间及第二曝光时间的曝光时差,并将所述曝光时差除以2,得到延时时长。
  4. 根据权利要求1所述的方法,其特征在于,所述根据所述第一曝光时间和第二曝光时间计算延时时长,包括:
    分别计算所述第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻;
    确定所述第一中间曝光时刻和第二中间曝光时刻的差值,并将所述差值作为延时时长。
  5. 根据权利要求1至4任一所述的方法,其特征在于,所述向第一处理单元发送图像采集指令,包括:
    通过所述第二处理单元中运行在第一运行模式的内核向所述第一处理单元发送图像采集指令,所述第一运行模式为可信运行环境;
    所述将处理后的第一图像发送给所述第二处理单元,包括:
    所述第一处理单元将所述处理后的第一图像发送给所述第二处理单元中运行在第一运行模式的内核。
  6. 根据权利要求5所述的方法,其特征在于,所述将处理后的第一图像发送给所述第二处理单元,包括:
    获取发送所述数据获取请求的应用程序的应用类型;
    根据所述应用类型确定所述应用程序的安全级别;
    选取与所述安全级别对应的数据传输通道;
    当所述数据传输通道为安全通道时,所述第一处理单元将所述处理后的第一图像发送给所述第二处理单元中运行在第一运行模式的内核;
    当所述数据传输通道为非安全通道时,所述第一处理单元将所述处理后的第一图像发送给处于第二运行模式的摄像头驱动,所述第二运行模式为自然运行环境。
  7. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    获取发送所述数据获取请求的应用程序的安全级别;
    确定与所述安全级别对应的图像精度;
    向所述应用程序发送与所述图像精度对应的图像数据。
  8. 一种控制拍摄的装置,其特征在于,包括:
    请求接收模块,用于当第二处理单元接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令,所述图像采集指令用于指示所述第一处理单元控制第一摄像头采集第一图像;
    信号接收模块,用于当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;
    计算模块,用于根据所述第一曝光时间和第二曝光时间计算延时时长;
    信号转发模块,用于当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;
    处理模块,用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  9. 一种电子设备,其特征在于,包括第一处理单元、第二处理单元和摄像头模组,所述第一处理单元分别与所述第二处理单元和摄像头模组相连;所述摄像头模组包括第一摄像头和第二摄像头,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第二处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接;
    所述第二处理单元,用于当接收到数据获取请求时,根据所述数据获取请求控制第二摄像头采集第二图像,并向第一处理单元发送图像采集指令;
    所述第一处理单元,用于根据所述图像采集指令控制第一摄像头采集第一图像;
    所述第二摄像头,用于采集每帧第二图像时在开始曝光的时刻向所述第二处理单元发送同步信号;
    所述第二处理单元,还用于当所述第二处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,并根据所述第一曝光时间和第二曝光时间计算延时时长;
    所述第二处理单元,还用于当所述第二处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号;
    所述第一摄像头,用于根据所述同步信号开始曝光并采集第一图像;
    所述第一处理单元,还用于对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  10. 根据权利要求9所述的电子设备,其特征在于,所述第二处理单元,还用于计算所述第一曝光时间及第二曝光时间的曝光时差,并将所述曝光时差除以2,得到延时时长。
  11. 根据权利要求9所述的电子设备,其特征在于,所述第二处理单元,还用于分别计算所述第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定所述第一中间曝光时刻和第二中间曝光时刻的差值,并将所述差值作为延时时长。
  12. 根据权利要求9所述的电子设备,其特征在于,所述第二处理单元,还用于通过所述第二处理单元中运行在第一运行模式的内核向所述第一处理单元发送图像采集指令,所述第一运行模式为可信运行环境;
    所述第一处理单元,还用于将所述处理后的第一图像发送给所述第二处理单元中运行在第一运行模式的内核。
  13. 根据权利要求12所述的电子设备,其特征在于,所述第二处理单元,还用于获取发送所述数据获取请求的应用程序的应用类型,并根据所述应用类型确定所述应用程序的安全级别,再选取与所述安全级别对应的数据传输通道;
    所述第一处理单元,还用于当所述数据传输通道为安全通道时,将所述处理后的第一图像发送给所述第二处理单元中运行在第一运行模式的内核;
    所述第一处理单元,还用于当所述数据传输通道为非安全通道时,将所述处理后的第一图像发送给处于第二运行模式的摄像头驱动,所述第二运行模式为自然运行环境。
  14. 根据权利要求9所述的电子设备,其特征在于,所述第二处理单元,还用于获取发送所述数据获取请求的应用程序的安全级别,确定与所述安全级别对应的图像精度,并向所述应用程序发送与所述图像精度对应的图像数据。
  15. 一种控制拍摄的方法,其特征在于,包括:
    当第一处理单元接收到第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像 头采集第一图像,所述图像采集指令为所述第二处理单元接收到数据获取请求时发送,所述数据获取请求用于指示所述第二处理单元控制第二摄像头采集第二图像;
    当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;
    根据所述第一曝光时间和第二曝光时间计算延时时长;
    当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;
    通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  16. 根据权利要求15所述的方法,其特征在于,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第一处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接。
  17. 根据权利要求15所述的方法,其特征在于,所述根据所述第一曝光时间和第二曝光时间计算延时时长,包括:
    计算所述第一曝光时间及第二曝光时间的曝光时差,并将所述曝光时差除以2,得到延时时长。
  18. 根据权利要求15所述的方法,其特征在于,所述根据所述第一曝光时间和第二曝光时间计算延时时长,包括:
    分别计算所述第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻;
    确定所述第一中间曝光时刻和第二中间曝光时刻的差值,并将所述差值作为延时时长。
  19. 根据权利要求15所述的方法,其特征在于,所述第一图像包括散斑图像;
    所述通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元,包括:
    获取存储的参考散斑图像,所述参考散斑图像带有参考深度信息;
    将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果;
    根据所述参考深度信息和匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,通过所述第二处理单元对所述深度视差图进行处理得到深度图。
  20. 根据权利要求19所述的方法,其特征在于,在所述获取参考散斑图像之前,所述方法还包括:
    每隔采集时间段采集激光器的温度,并获取与所述温度对应的参考散斑图像;
    当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将所述本次获取的参考散斑图像写入所述第一处理单元。
  21. 根据权利要求19所述的方法,其特征在于,所述方法还包括:
    当第二处理单元接收到应用程序的数据获取请求,获取所述应用程序的安全级别;
    确定与所述安全级别对应的数据传输通道;
    将所述深度图通过所述对应的数据传输通道发送给所述应用程序。
  22. 一种控制拍摄的装置,其特征在于,包括:
    图像采集模块,用于当第一处理单元接收到第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像头采集第一图像,所述图像采集指令为所述第二处理单元接收到数据获取请求时发送,所述数据获取请求用于指示所述第二处理单元控制第二摄像头采集第二图像;
    信号接收模块,用于当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间,所述同步信号为所述第二摄像头采集每帧第二图像时在开始曝光的时刻发送的信号;
    计算模块,用于根据所述第一曝光时间和第二曝光时间计算延时时长;
    信号转发模块,用于当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号,所述同步信号用于指示所述第一摄像头开始曝光并采集第一图像;
    处理模块,用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  23. 一种电子设备,其特征在于,包括第一处理单元、第二处理单元和摄像头模组,所述第一处理 单元分别与所述第二处理单元和摄像头模组相连;所述摄像头模组包括第一摄像头和第二摄像头,所述第一处理单元通过控制线连接所述第一摄像头,所述第二处理单元通过控制线连接所述第二摄像头,所述第一处理单元与所述第二处理单元连接,所述第一处理单元还通过信号线分别与所述第一摄像头及第二摄像头连接;
    所述第二处理单元,用于当接收到数据获取请求时,根据所述数据获取请求控制所述第二摄像头采集第二图像,并向所述第一处理单元发送图像采集指令;
    所述第一处理单元,用于当接收到所述第二处理单元发送的图像采集指令时,根据所述图像采集指令控制第一摄像头采集第一图像;
    所述第二摄像头,用于采集每帧第二图像时在开始曝光的时刻向第一处理单元发送同步信号;
    所述第一处理单元,还用于当第一处理单元接收到所述第二摄像头发送的同步信号时,获取所述第一摄像头的第一曝光时间和第二摄像头的第二曝光时间;
    所述第一处理单元,还用于根据所述第一曝光时间和第二曝光时间计算延时时长;
    所述第一处理单元,还用于当所述第一处理单元接收到所述同步信号的时长达到所述延时时长时,向所述第一摄像头转发所述同步信号;
    所述第一摄像头,用于根据同步信号开始曝光并采集第一图像;
    所述第一处理单元,还用于通过所述第一处理单元对所述第一图像进行处理,并将处理后的第一图像发送给所述第二处理单元。
  24. 根据权利要求23所述的电子设备,其特征在于,所述第一处理单元,还用于计算所述第一曝光时间及第二曝光时间的曝光时差,并将所述曝光时差除以2,得到延时时长。
  25. 根据权利要求23所述的电子设备,其特征在于,所述第一处理单元,还用于分别计算所述第一曝光时间的第一中间曝光时刻和第二曝光时间的第二中间曝光时刻,确定所述第一中间曝光时刻和第二中间曝光时刻的差值,并将所述差值作为延时时长。
  26. 根据权利要求23所述的电子设备,其特征在于,所述第一处理单元,还用于获取存储的参考散斑图像,并将将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果,所述参考散斑图像带有参考深度信息;
    所述第一处理单元,还用于根据所述参考深度信息和匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元;
    所述第二处理单元,还用于对所述深度视差图进行处理得到深度图。
  27. 根据权利要求26所述的电子设备,其特征在于,所述第二处理单元,还用于每隔采集时间段采集激光器的温度,并获取与所述温度对应的参考散斑图像;
    所述第二处理单元,还用于当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,将所述本次获取的参考散斑图像写入所述第一处理单元。
  28. 根据权利要求26所述的电子设备,其特征在于,所述第二处理单元,还用于当接收到应用程序的数据获取请求,获取所述应用程序的安全级别;
    所述第二处理单元,还用于确定与所述安全级别对应的数据传输通道,并将所述深度图通过所述对应的数据传输通道发送给所述应用程序。
  29. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至7、及15至21任一所述的方法。
PCT/CN2019/080427 2018-04-28 2019-03-29 控制拍摄的方法、装置、电子设备及计算机可读存储介质 WO2019205887A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19791777.6A EP3627827B1 (en) 2018-04-28 2019-03-29 Method for controlling photographing, electronic device, and computer readable storage medium
US16/678,701 US11095802B2 (en) 2018-04-28 2019-11-08 Method for processing a captured image and electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201810401344.1A CN108650472B (zh) 2018-04-28 2018-04-28 控制拍摄的方法、装置、电子设备及计算机可读存储介质
CN201810404282.X 2018-04-28
CN201810401344.1 2018-04-28
CN201810404282.XA CN108419017B (zh) 2018-04-28 2018-04-28 控制拍摄的方法、装置、电子设备及计算机可读存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/678,701 Continuation US11095802B2 (en) 2018-04-28 2019-11-08 Method for processing a captured image and electronic device

Publications (1)

Publication Number Publication Date
WO2019205887A1 true WO2019205887A1 (zh) 2019-10-31

Family

ID=68294899

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/080427 WO2019205887A1 (zh) 2018-04-28 2019-03-29 控制拍摄的方法、装置、电子设备及计算机可读存储介质

Country Status (3)

Country Link
US (1) US11095802B2 (zh)
EP (1) EP3627827B1 (zh)
WO (1) WO2019205887A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111314625A (zh) * 2020-02-19 2020-06-19 北京仁光科技有限公司 多信号源安全传输方法
CN113936050A (zh) * 2021-10-21 2022-01-14 北京的卢深视科技有限公司 散斑图像生成方法、电子设备及存储介质

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3923612A1 (en) * 2020-06-09 2021-12-15 Deutsche Telekom AG Method and communication system for ensuring secure communication in a zero touch connectivity-environment
EP4075790B1 (en) * 2021-04-13 2023-04-05 Axis AB Exposure time control in a video camera
CN113325838B (zh) * 2021-04-23 2022-08-12 武汉光庭信息技术股份有限公司 一种基于相机曝光特性的多传感器时间同步方法及装置
US11659132B2 (en) * 2021-09-22 2023-05-23 Sony Group Corporation Electronic device and method for synchronization of videos
CN114354036B (zh) * 2021-12-29 2022-10-11 上海交通大学 Method and measuring device for synchronized measurement of surface pressure and three-dimensional shape of a moving model

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7046292B2 (en) 2002-01-16 2006-05-16 Hewlett-Packard Development Company, L.P. System for near-simultaneous capture of multiple camera images
JP5025526B2 (ja) 2008-02-29 2012-09-12 ブラザー工業株式会社 Image forming apparatus
CN101431603B (zh) 2008-12-17 2011-04-20 广东威创视讯科技股份有限公司 Method for synchronous shooting with multiple cameras and detection device therefor
EP2449762A4 (en) 2009-06-30 2015-07-15 Nokia Corp ADVANCED TIMER FUNCTIONALITY FOR CAMERA SYSTEMS
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
CN201608788U (zh) 2010-01-14 2010-10-13 宝山钢铁股份有限公司 Industrial image acquisition device with adjustable, synchronized camera frame rate
US8711238B2 (en) * 2011-02-01 2014-04-29 Aptina Imaging Corporation Systems and methods for synchronizing and controlling multiple image sensors
JP6047858B2 (ja) * 2013-06-07 2016-12-21 カシオ計算機株式会社 Imaging control device and imaging control method
US9626803B2 (en) 2014-12-12 2017-04-18 Qualcomm Incorporated Method and apparatus for image processing in augmented reality systems
CN104486557B (zh) 2014-12-29 2018-04-06 上海集成电路研发中心有限公司 System and method for inserting short frames during image acquisition
US9549100B2 (en) * 2015-04-23 2017-01-17 Microsoft Technology Licensing, Llc Low-latency timing control
US10846696B2 (en) * 2015-08-24 2020-11-24 Samsung Electronics Co., Ltd. Apparatus and method for trusted execution environment based secure payment transactions
US20170061210A1 (en) * 2015-08-26 2017-03-02 Intel Corporation Infrared lamp control for use with iris recognition authentication
CN106548077B (zh) * 2016-10-19 2019-03-15 沈阳微可信科技有限公司 Communication system and electronic device
CN107424187B (zh) * 2017-04-17 2023-10-24 奥比中光科技集团股份有限公司 Depth computation processor, data processing method and 3D image device
US20180309919A1 (en) * 2017-04-19 2018-10-25 Qualcomm Incorporated Methods and apparatus for controlling exposure and synchronization of image sensors
CN107040726B (zh) 2017-04-19 2020-04-07 宇龙计算机通信科技(深圳)有限公司 Dual-camera synchronous exposure method and system
CN107395998A (zh) 2017-08-24 2017-11-24 维沃移动通信有限公司 Image shooting method and mobile terminal
US10506217B2 (en) * 2017-10-09 2019-12-10 Facebook Technologies, Llc Head-mounted display tracking system
CN107730561B (zh) * 2017-10-17 2021-07-06 奥比中光科技集团股份有限公司 Depth camera temperature error correction method and system
CN107870080B (zh) 2017-10-23 2019-10-15 北京理工大学 Delay measurement device and method for an image fusion system
CN107948463B (zh) 2017-11-30 2020-06-23 北京图森智途科技有限公司 Camera synchronization method, device and system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000341719A (ja) * 1999-05-25 2000-12-08 Mitsubishi Electric Corp Stereo camera
CN1710935A (zh) * 2004-06-17 2005-12-21 株式会社日立制作所 Imaging apparatus
CN103416071A (zh) * 2011-03-08 2013-11-27 瑞萨电子株式会社 Imaging apparatus
CN102810139A (zh) * 2012-06-29 2012-12-05 宇龙计算机通信科技(深圳)有限公司 Data security operation method and communication terminal
CN103338334A (zh) * 2013-07-17 2013-10-02 中测新图(北京)遥感技术有限责任公司 Synchronous exposure control system and method for a multi-camera digital aerial camera
US20180061056A1 (en) * 2016-08-30 2018-03-01 Microsoft Technology Licensing, Llc Temperature Compensation for Structured Light Depth Imaging System
CN108419017A (zh) * 2018-04-28 2018-08-17 Oppo广东移动通信有限公司 Method and apparatus for controlling photographing, electronic device and computer-readable storage medium
CN108650472A (zh) * 2018-04-28 2018-10-12 Oppo广东移动通信有限公司 Method and apparatus for controlling photographing, electronic device and computer-readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3627827A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111314625A (zh) * 2020-02-19 2020-06-19 北京仁光科技有限公司 Secure transmission method for multiple signal sources
CN113936050A (zh) * 2021-10-21 2022-01-14 北京的卢深视科技有限公司 Speckle image generation method, electronic device and storage medium
CN113936050B (zh) * 2021-10-21 2022-08-12 合肥的卢深视科技有限公司 Speckle image generation method, electronic device and storage medium

Also Published As

Publication number Publication date
US11095802B2 (en) 2021-08-17
EP3627827A1 (en) 2020-03-25
EP3627827A4 (en) 2021-03-24
EP3627827B1 (en) 2024-05-01
US20200077003A1 (en) 2020-03-05

Similar Documents

Publication Publication Date Title
WO2019205887A1 (zh) Method and apparatus for controlling photographing, electronic device, and computer-readable storage medium
CN110248111B (zh) Method and apparatus for controlling photographing, electronic device, and computer-readable storage medium
WO2019205742A1 (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
WO2019205890A1 (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN108805024B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN108804895B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
WO2019196683A1 (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN110971836B (zh) Method and apparatus for controlling photographing, electronic device, and computer-readable storage medium
WO2019206129A1 (zh) Data processing method and apparatus, electronic device, and computer-readable storage medium
TWI709110B (zh) Camera calibration method and apparatus, electronic device
US11275927B2 (en) Method and device for processing image, computer readable storage medium and electronic device
CN110191266B (zh) Data processing method and apparatus, electronic device, and computer-readable storage medium
CN109712192A (zh) Camera module calibration method and apparatus, electronic device, and computer-readable storage medium
CN108833887B (zh) Data processing method and apparatus, electronic device, and computer-readable storage medium
CN109559353A (zh) Camera module calibration method and apparatus, electronic device, and computer-readable storage medium
TWI708192B (zh) Image processing method, electronic device, computer-readable storage medium
WO2020015403A1 (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
JP4871315B2 (ja) Compound-eye imaging apparatus, control method therefor, and program
CN108810516B (zh) Data processing method and apparatus, electronic device, and computer-readable storage medium
WO2019205889A1 (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
JP5137219B2 (ja) Compound-eye imaging apparatus, control method therefor, and program
WO2019071403A1 (zh) Image processing method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19791777

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019791777

Country of ref document: EP

Effective date: 20191220

NENP Non-entry into the national phase

Ref country code: DE