CN110248111B - Method and device for controlling shooting, electronic equipment and computer-readable storage medium


Info

Publication number
CN110248111B
CN110248111B
Authority
CN
China
Prior art keywords
processing unit
camera
image
exposure time
exposure
Prior art date
Legal status
Active
Application number
CN201910585686.8A
Other languages
Chinese (zh)
Other versions
CN110248111A (en)
Inventor
谭国辉
周海涛
谭筱
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910585686.8A
Publication of CN110248111A
Application granted
Publication of CN110248111B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the application relate to a method and a device for controlling shooting, an electronic device, and a computer-readable storage medium. The method comprises the following steps: when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling a first camera to acquire a first image according to the image acquisition instruction; when the first processing unit receives a synchronization signal sent by a second camera, acquiring a first exposure duration of the first camera and a second exposure duration of the second camera; calculating a delay duration according to the first exposure duration and the second exposure duration; when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire the first image; and processing the first image through the first processing unit. The method, the device, the electronic device, and the computer-readable storage medium for controlling shooting have a good synchronization effect and can ensure that the image content acquired by the two cameras is consistent.

Description

Method and device for controlling shooting, electronic equipment and computer-readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling shooting, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of imaging technology on intelligent terminals, more and more intelligent terminals are equipped with two or more cameras, which cooperate to acquire images with a better visual effect. To ensure that the pictures finally acquired by the two cameras are consistent, the two cameras need to be synchronized. In the conventional method, the two cameras are usually connected through a hardware signal line and synchronized by a signal that starts exposure for every frame. When the difference in exposure duration between the two cameras is large, the synchronization is poor and the content of the images collected by the two cameras differs greatly.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling shooting, an electronic device and a computer-readable storage medium, which have good synchronization effect and can ensure that the image contents acquired by two cameras are consistent.
A method of controlling photographing, comprising:
when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling a first camera to acquire a first image according to the image acquisition instruction, wherein the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for indicating the second processing unit to control a second camera to acquire a second image;
when a first processing unit receives a synchronous signal sent by a second camera, acquiring a first exposure time length of the first camera and a second exposure time length of the second camera, wherein the synchronous signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
calculating a delay time length according to the first exposure time length and the second exposure time length;
when the time elapsed since the first processing unit received the synchronous signal reaches the delay duration, the synchronous signal is forwarded to the first camera, and the synchronous signal is used for indicating the first camera to start exposure and acquire the first image;
and processing the first image through the first processing unit, and sending the processed first image to the second processing unit.
An apparatus for controlling photographing, comprising:
the image acquisition module is used for controlling the first camera to acquire a first image according to an image acquisition instruction sent by the second processing unit when the first processing unit receives the image acquisition instruction, wherein the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for indicating the second processing unit to control the second camera to acquire a second image;
the signal receiving module is used for acquiring a first exposure time length of the first camera and a second exposure time length of the second camera when the first processing unit receives a synchronous signal sent by the second camera, wherein the synchronous signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
the calculating module is used for calculating the time delay duration according to the first exposure duration and the second exposure duration;
the signal forwarding module is configured to forward the synchronization signal to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, where the synchronization signal is used to instruct the first camera to start exposure and acquire the first image;
and the processing module is used for processing the first image through the first processing unit and sending the processed first image to the second processing unit.
An electronic device comprises a first processing unit, a second processing unit and a camera module, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a first camera and a second camera, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the first processing unit is also connected with the first camera and the second camera through signal lines respectively;
the second processing unit is used for controlling the second camera to acquire a second image according to the data acquisition request and sending an image acquisition instruction to the first processing unit when the data acquisition request is received;
the first processing unit is used for controlling the first camera to acquire a first image according to the image acquisition instruction when receiving the image acquisition instruction sent by the second processing unit;
the second camera is used for sending a synchronizing signal to the first processing unit at the moment of starting exposure when each frame of second image is collected;
the first processing unit is further configured to obtain a first exposure duration of the first camera and a second exposure duration of the second camera when the first processing unit receives the synchronization signal sent by the second camera;
the first processing unit is further configured to calculate a delay time length according to the first exposure time length and the second exposure time length;
the first processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since it received the synchronization signal reaches the delay duration;
the first camera is used for starting exposure according to the synchronous signal and acquiring a first image;
the first processing unit is further configured to process the first image through the first processing unit, and send the processed first image to the second processing unit.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as set forth above.
According to the method, the device, the electronic device, and the computer-readable storage medium for controlling shooting, when the first processing unit receives the synchronization signal sent by the second camera, the delay duration is calculated according to the exposure durations of the two cameras, and when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera. The time point at which the synchronization signal is forwarded is dynamically adjusted according to the exposure durations of the first camera and the second camera, so the synchronization moment of the first camera and the second camera can be dynamically adjusted, the synchronization effect is good, and even when the difference between the exposure durations of the two cameras is large, the image content collected by the two cameras can still be guaranteed to be consistent.
Drawings
Fig. 1 is an application scenario diagram of a method of controlling photographing in one embodiment;
fig. 2 is an application scenario diagram of a method of controlling photographing in another embodiment;
FIG. 3 is a block diagram of an electronic device in one embodiment;
FIG. 4 is a flow diagram illustrating a method of controlling photography in one embodiment;
FIG. 5 is a flow diagram illustrating processing of a first image in one embodiment;
FIG. 6 is a schematic flow chart illustrating the process of acquiring a reference speckle image according to the temperature of the laser in one embodiment;
FIG. 7 is a flow diagram that illustrates the selection of a data transmission channel based on a security level of an application, according to one embodiment;
FIG. 8 is a block diagram of an apparatus for controlling photographing in one embodiment;
FIG. 9 is a block diagram of a processing module in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is an application scenario diagram of a method for controlling shooting in one embodiment. As shown in fig. 1, the application scenario may include a first camera 110, a second camera 120, a first processing unit 130, and a second processing unit 140. The first camera 110 may be a laser camera, and the second camera 120 may be an RGB (Red/Green/Blue) camera. The first processing unit 130 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 140 may be a CPU (Central Processing Unit) module or the like. The first processing unit 130 is connected to the first camera 110 through a control line, and the second processing unit 140 is connected to the second camera 120 through a control line. The first processing unit 130 is connected to the second processing unit 140. The first processing unit 130 is also connected to the first camera 110 and the second camera 120 through signal lines, respectively.
When the second processing unit 140 receives a data acquisition request, it may control the second camera 120 through the control line to acquire a second image according to the request, and send an image acquisition instruction to the first processing unit 130. When the first processing unit 130 receives the image acquisition instruction sent by the second processing unit 140, it can control the first camera through the control line to acquire a first image according to the instruction. When the second camera 120 captures each frame of the second image, it may send a synchronization signal to the first processing unit 130 through the signal line at the moment exposure starts. When the first processing unit 130 receives the synchronization signal sent by the second camera 120, it may obtain the first exposure duration of the first camera 110 and the second exposure duration of the second camera 120 and calculate the delay duration from them. When the time elapsed since the first processing unit 130 received the synchronization signal reaches the delay duration, it may forward the synchronization signal to the first camera 110 through the signal line. After receiving the synchronization signal, the first camera 110 may start exposure, collect the first image, and send the collected first image to the first processing unit 130. The first processing unit 130 may process the first image and send the processed first image to the second processing unit 140.
Fig. 2 is an application scenario diagram of a method for controlling shooting in another embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a second processing unit 220, and a first processing unit 230. The second processing unit 220 may be a CPU module. The first processing unit 230 may be an MCU module. The first processing unit 230 is connected between the second processing unit 220 and the camera module 210; the first processing unit 230 can control the laser camera 212, the floodlight 214, and the laser lamp 218 in the camera module 210, and the second processing unit 220 can control the RGB camera 216 in the camera module 210.
The camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser lamp 218. The laser camera 212 may be an infrared camera for acquiring infrared images. The floodlight 214 is a surface light source capable of emitting infrared light; the laser lamp 218 is a point light source that emits patterned laser light. When the floodlight 214 emits light, the laser camera 212 can obtain an infrared image from the reflected light. When the laser lamp 218 emits light, the laser camera 212 can obtain a speckle image from the reflected light. The speckle image is an image in which the pattern emitted by the laser lamp 218 is deformed after being reflected.
The first processing unit 230 may be connected to the RGB camera 216 and the laser camera 212 through signal lines, respectively. When the RGB camera 216 captures each frame of image, it may send a synchronization signal to the first processing unit 230. After receiving the synchronization signal sent by the RGB camera 216, the first processing unit 230 may obtain the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216 and calculate the delay duration from them. When the time elapsed since the first processing unit 230 received the synchronization signal reaches the delay duration, it may forward the synchronization signal to the laser camera 212 through the signal line. After receiving the synchronization signal, the laser camera 212 can start exposure and collect an infrared image or a speckle image.
The second processing unit 220 may include a CPU core operating in a TEE (Trusted Execution Environment) and a CPU core operating in a REE (Rich Execution Environment). Both the TEE and the REE are operating modes of an ARM (Advanced RISC Machines) module. The security level of the TEE is higher, and only one CPU core in the second processing unit 220 can operate in the TEE at a time. Generally, operations with a higher security level in the electronic device 200 need to be executed in the CPU core in the TEE, while operations with a lower security level can be executed in the CPU core in the REE.
The first processing unit 230 includes a PWM (Pulse Width Modulation) module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The PWM module 232 can transmit pulses to the camera module to turn on the floodlight 214 or the laser lamp 218, so that the laser camera 212 can collect infrared images or speckle images. The SPI/I2C interface 234 is used for receiving the image acquisition instruction sent by the second processing unit 220. The depth engine 238 can process the speckle images to obtain a depth disparity map.
When the second processing unit 220 receives a data acquisition request from an application, for example when the application needs to perform face unlocking or face payment, an image acquisition instruction may be sent to the first processing unit 230 through the CPU core operating in the TEE. After the first processing unit 230 receives the image acquisition instruction, the PWM module 232 emits a pulse wave to turn on the floodlight 214 in the camera module 210 and acquire an infrared image through the laser camera 212, and to turn on the laser lamp 218 in the camera module 210 and acquire a speckle image through the laser camera 212. The camera module 210 may send the collected infrared image and speckle image to the first processing unit 230. The first processing unit 230 may process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. Here, processing the infrared image and the speckle image means correcting the infrared image or the speckle image to remove the influence of internal and external parameters of the camera module 210 on the images. The first processing unit 230 can be set to different modes, and the images output in different modes are different. When the first processing unit 230 is set to the speckle pattern mode, it processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to the depth map mode, it processes the speckle image to obtain a depth disparity map, from which a depth image, i.e. an image with depth information, can be obtained. The first processing unit 230 may send the infrared disparity map and the speckle disparity map to the second processing unit 220, or send the infrared disparity map and the depth disparity map to the second processing unit 220. The second processing unit 220 may obtain a target infrared image according to the infrared disparity map and a depth image according to the depth disparity map. Further, the second processing unit 220 may perform face recognition, face matching, living-body detection, and depth information acquisition for the detected face according to the target infrared image and the depth image.
The first processing unit 230 communicates with the second processing unit 220 through fixed secure interfaces to ensure the security of the transmitted data. As shown in fig. 2, the data sent by the second processing unit 220 to the first processing unit 230 passes through SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through a secure Mobile Industry Processor Interface (SECURE MIPI) 250.
In an embodiment, the first processing unit 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 220.
FIG. 3 is a block diagram of an electronic device in one embodiment. As shown in fig. 3, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program that, when executed by the processor, implements the method of controlling shooting provided in the embodiments of the present application. The processor is used for providing computing and control capability and supporting the operation of the whole electronic device. The internal memory in the electronic device provides an environment for the execution of the computer program in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the electronic device, or an external keyboard, touch pad or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the architecture shown in fig. 3 is a block diagram of only a portion of the architecture relevant to the present application and does not constitute a limitation on the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
As shown in fig. 4, in one embodiment, there is provided a method of controlling photographing, including the steps of:
step 410, when the first processing unit receives an image acquisition instruction sent by the second processing unit, the first camera is controlled to acquire the first image according to the image acquisition instruction, the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for instructing the second processing unit to control the second camera to acquire the second image.
When an application in the electronic device needs to acquire face data, the first camera can be controlled to turn on and acquire a first image, where the face data may include, but is not limited to, data needed for face verification in scenarios such as face unlocking and face payment, face depth information, and the like. The first camera can be a laser camera, which can collect invisible-light images at different wavelengths. The first image may include, but is not limited to, an infrared image, a speckle image, and the like, where a speckle image is an infrared image carrying a speckle pattern.
When the application program needs to acquire the face data, a data acquisition request can be sent to the second processing unit. After receiving the data acquisition request, the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module, and the second processing unit may be a CPU module. Optionally, the second processing unit may first detect whether the data acquisition request includes a visible light image acquisition instruction, and if the data acquisition request includes the visible light image acquisition instruction, it may be stated that the application program needs to acquire a visible light image including a human face while acquiring the human face data. If the data acquisition request includes a visible light image acquisition instruction, the second processing unit may control the second camera to acquire a second image according to the visible light image acquisition instruction, where the second camera may be an RGB camera, and the second image may be an RGB image including a human face.
After the first processing unit receives the image acquisition instruction, it can control the first camera to acquire the first image according to the instruction, where the first image can include an infrared image, a speckle image, and the like. The first processing unit can turn on the floodlight in the camera module and collect an infrared image through the laser camera, or turn on a laser such as the laser lamp in the camera module and collect a speckle image through the laser camera. The floodlight can be a point light source that shines uniformly in all directions; the light emitted by the floodlight can be infrared light, and the laser camera captures the face to obtain an infrared image. The laser emitted by the laser can be diffracted by a lens and a DOE (Diffractive Optical Element) to produce a pattern of speckle particles, which is projected onto the target object. Because the distance between each point of the target object and the electronic device differs, the speckle pattern is shifted, and the laser camera captures the target object to obtain a speckle image.
Step 420, when the first processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure duration of the first camera and a second exposure duration of the second camera, where the synchronization signal is a signal sent at the time of starting exposure when the second camera collects each frame of the second image.
The first processing unit can be connected with the first camera through a control line, and the first camera is controlled to acquire a first image through the control line. The second processing unit can be connected with the second camera through a control line, and the second camera is controlled to collect a second image through the control line. The first processing unit may be connected with the second processing unit. The first processing unit can also be respectively connected with the first camera and the second camera through signal lines, wherein the signal lines can be synchronous signal lines.
When the second camera collects each frame of image, it can send a synchronization signal at the moment exposure starts to the first processing unit over the signal line; the synchronization signal can be a start-of-frame (SOF) marker of a frame and can be used to start exposure of each frame of image. When the first processing unit receives the synchronization signal sent by the second camera, it can obtain a first exposure duration of the first camera and a second exposure duration of the second camera. The exposure duration refers to the photosensitive duration; the longer the exposure duration, the more light enters the camera. Generally, the difference between the first exposure duration of the first camera and the second exposure duration of the second camera is large, and the first exposure duration of the first camera may be smaller than the second exposure duration of the second camera; but this is not limiting, and the first exposure duration of the first camera may also be larger than the second exposure duration of the second camera.
Step 430, calculating a delay time according to the first exposure time and the second exposure time.
The first processing unit can calculate the delay duration according to the first exposure duration of the first camera and the second exposure duration of the second camera, where the delay duration is the length of time by which the start of the first camera's exposure is postponed; delaying the moment the first camera starts exposing ensures that the first camera and the second camera are synchronized.
In one embodiment, the electronic device may preset the moment at which the first camera and the second camera are synchronized during exposure, where being synchronized during exposure may mean that the ratio of the time the first camera has already been exposed to the first exposure duration equals the ratio of the time the second camera has already been exposed to the second exposure duration. For example, the two cameras may be set to end exposure simultaneously, to coincide at half of their exposure durations, or to coincide at 3/4 of their exposure durations, and so on. The first processing unit can calculate the delay duration according to the first exposure duration, the second exposure duration, and the preset synchronization moment during exposure.
Step 440, when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera, where the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
After the first processing unit calculates the delay duration, it can forward the synchronization signal to the first camera when the time elapsed since it received the synchronization signal reaches the delay duration. After the first camera receives the synchronization signal, it starts exposure, so that the synchronization moments of the first camera and the second camera during exposure can be kept consistent. For example, the electronic device may be set so that the cameras coincide at half of their exposure durations: the delay duration is calculated in advance, the synchronization signal is forwarded to the first camera when the time since the signal was received reaches the delay duration, and the second camera reaches half of its exposure exactly when the first camera reaches half of its exposure, so the two moments coincide.
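The following is a minimal sketch of this delayed forwarding, not the patent's implementation: a software timer stands in for the first processing unit's hardware timer, and forward_sync is a hypothetical callback that drives the signal line to the first camera.

```python
import threading

def on_sync_signal(first_exposure_s, second_exposure_s, forward_sync):
    """Handle a sync signal from the second camera: postpone forwarding
    so both cameras reach half of their exposure at the same moment."""
    delay_s = (second_exposure_s - first_exposure_s) / 2.0
    if delay_s <= 0:
        forward_sync()  # first camera exposes at least as long: forward at once
    else:
        threading.Timer(delay_s, forward_sync).start()

# Example from the text: 3 ms vs. 30 ms exposure -> forward after 13.5 ms.
on_sync_signal(0.003, 0.030, lambda: print("sync signal forwarded to first camera"))
```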
Step 450, the first image is processed by the first processing unit, and the processed first image is sent to the second processing unit.
The first camera can send the collected first image to the first processing unit, and the first processing unit can process the first image. The first processing unit can be set to different modes; different modes can acquire different first images and process them differently. When the first processing unit is in infrared mode, it can turn on the floodlight, collect an infrared image through the first camera, and process the infrared image to obtain an infrared disparity map. When the first processing unit is in speckle pattern mode, it can turn on the laser lamp, collect a speckle image through the first camera, and process the speckle image to obtain a speckle disparity map. When the first processing unit is in depth map mode, it can process the speckle image to obtain a depth disparity map.
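As an illustration only, the mode handling described above might be organized as follows; correct_image and to_depth_disparity are hypothetical placeholders for the correction and depth-engine steps.

```python
from enum import Enum, auto

class Mode(Enum):
    INFRARED = auto()   # floodlight on, output: infrared disparity map
    SPECKLE = auto()    # laser lamp on, output: speckle disparity map
    DEPTH = auto()      # laser lamp on, output: depth disparity map

def correct_image(image):
    # Placeholder for removing the influence of internal/external camera parameters.
    return image

def to_depth_disparity(image):
    # Placeholder for the depth engine's speckle-to-depth-disparity step.
    return image

def process_first_image(mode, image):
    """Dispatch the first image according to the unit's current mode."""
    corrected = correct_image(image)
    if mode is Mode.DEPTH:
        return to_depth_disparity(corrected)
    return corrected  # infrared or speckle disparity map
```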
In one embodiment, the first processing unit may perform correction processing on the first image, where the correction is used to correct the image content offset of the first image caused by internal and external parameters of the first camera and the second camera, for example an offset caused by the deflection angle of the laser camera or by the relative placement of the laser camera and the RGB camera. After the first image is corrected, a disparity map of the first image can be obtained; for example, correcting an infrared image yields an infrared disparity map, and correcting a speckle image yields a speckle disparity map or a depth disparity map. Performing correction processing on the first image can prevent ghosting in the image finally presented on the screen of the electronic device.
The first processing unit processes the first image, and can send the processed first image to the second processing unit. The second processing unit can obtain a target image, such as a target infrared image, a target speckle image, a target depth map and the like, according to the processed first image. The second processing unit can process the target image according to the requirement of the application program.
For example, when the application program needs to perform face verification, the second processing unit may perform face detection according to the target image, and the like, where the face detection may include face recognition, face matching, and living body detection. The human face recognition means recognizing whether a human face exists in a target image, the human face matching means matching the human face in the target image with a pre-stored human face, and the living body detection means detecting whether the human face in the target image has biological activity or not. If the application program needs to acquire the depth information of the face, the generated target depth map can be uploaded to the application program, and the application program can perform beautifying processing, three-dimensional modeling and the like according to the received target depth map.
In this embodiment, when the first processing unit receives the synchronization signal sent by the second camera, the delay duration is calculated according to the exposure durations of the two cameras, and when the time elapsed since the first processing unit received the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera. The time point at which the synchronization signal is forwarded is dynamically adjusted according to the exposure durations of the first camera and the second camera, so the synchronization moment of the first camera and the second camera can be dynamically adjusted, the synchronization effect is good, and even when the difference between the exposure durations of the two cameras is large, the image content collected by the two cameras can still be guaranteed to be consistent.
In one embodiment, the step 430 of calculating the delay time length according to the first exposure time length and the second exposure time length includes: and calculating the exposure time difference of the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay time.
The electronic device can set the first camera and the second camera to coincide at half of their exposure durations, so that when the first camera has completed half of its exposure, the second camera has also completed half of its exposure. When the first processing unit receives the synchronization signal sent by the second camera, it can calculate the exposure duration difference between the first exposure duration and the second exposure duration, and divide the difference by 2 to obtain the delay duration: T3 = (T2 - T1) / 2, where T1 denotes the first exposure duration and T2 denotes the second exposure duration. For example, if the first exposure duration of the first camera is 3 ms (milliseconds) and the second exposure duration of the second camera is 30 ms, the exposure duration difference is 27 ms, and dividing it by 2 gives a delay duration of 13.5 ms.
Optionally, after calculating the exposure duration difference between the first exposure duration and the second exposure duration, the first processing unit may compare the difference with a duration threshold. If the difference is greater than the threshold, it divides the difference by 2 to obtain the delay duration and forwards the synchronization signal to the first camera when the time elapsed since it received the signal reaches the delay duration. If the difference is less than or equal to the threshold, the first processing unit can forward the synchronization signal to the first camera directly, without postponing the moment the first camera starts exposing. The threshold can be set according to actual requirements, for example 1 ms or 2 ms, so as to ensure that the image content acquired by the first camera and the second camera stays within a tolerable error while reducing the computational load of the first processing unit.
In one embodiment, to ensure that the first camera and the second camera coincide at half of their exposure durations, the first processing unit may also calculate a first intermediate exposure moment of the first exposure duration and a second intermediate exposure moment of the second exposure duration, where an intermediate exposure moment is the moment at which a camera has completed half of its exposure. The first processing unit may determine the difference between the first intermediate exposure moment and the second intermediate exposure moment and use that difference as the delay duration: T3 = T2/2 - T1/2, where T1 denotes the first exposure duration and T2 denotes the second exposure duration. For example, if the first exposure duration of the first camera is 3 ms and the second exposure duration of the second camera is 30 ms, the first intermediate exposure moment falls at 1.5 ms and the second at 15 ms, so the difference, 13.5 ms, is used as the delay duration. It is to be understood that other algorithms may also be used to keep the first camera and the second camera synchronized, and the methods are not limited to those mentioned above.
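A minimal sketch combining the two equivalent formulations and the optional threshold (the 1 ms default threshold is only an example value from the text):

```python
def delay_duration_ms(t1_ms, t2_ms, threshold_ms=1.0):
    """Delay before forwarding the sync signal to the first camera.

    (T2 - T1) / 2 and T2/2 - T1/2 are algebraically identical; the
    threshold skips the delay when the exposures are already close.
    """
    diff_ms = t2_ms - t1_ms
    if abs(diff_ms) <= threshold_ms:
        return 0.0          # forward the sync signal immediately
    return diff_ms / 2.0

assert delay_duration_ms(3.0, 30.0) == 13.5   # worked example above
assert delay_duration_ms(3.0, 3.5) == 0.0     # within threshold
```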
In this embodiment, the time point for forwarding the synchronization signal can be dynamically adjusted according to the exposure duration of the first camera and the second camera, so that the timing for synchronizing the first camera and the second camera can be dynamically adjusted, the first camera and the second camera are guaranteed to be consistent at half of the exposure time, and the synchronization effect is good.
As shown in fig. 5, in one embodiment, the step 450 of processing the first image by the first processing unit and sending the processed first image to the second processing unit includes the following steps:
step 502, a stored reference speckle image is obtained, the reference speckle image having reference depth information.
In the camera coordinate system, the straight line perpendicular to the imaging plane and passing through the center of the lens is taken as the Z axis; if the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera's imaging plane. If the application needs to acquire the depth information of a face, a depth map containing the face depth information needs to be acquired. The first processing unit can turn on the laser lamp and collect a speckle image through the first camera. A reference speckle image can be stored in the first processing unit in advance; the reference speckle image carries reference depth information, and the depth information of each pixel contained in the collected speckle image can be obtained from the collected speckle image and the reference speckle image.
And step 504, matching the reference speckle image with the speckle image to obtain a matching result.
The first processing unit may take each pixel of the collected speckle image in turn as a center, select a pixel block of a preset size, for example 31 pixels by 31 pixels, and search the reference speckle image for a block matching the selected one. From the pixel block selected in the collected speckle image and the matching block in the reference speckle image, the first processing unit can find two points, one in each image, that lie on the same laser light path; the speckle information of two points on the same laser light path is identical, and such points can be identified as corresponding pixels. The depth information of the points on each laser light path in the reference speckle image is known. The first processing unit can calculate the offset between the two corresponding pixels of the target speckle image and the reference speckle image on the same laser light path, and calculate the depth information of each pixel contained in the collected speckle image according to the offset.
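A simplified sketch of the block search, assuming the displacement lies along the image rows and using a sum-of-squared-differences score (the patent does not specify the similarity metric, and the search range is an arbitrary choice):

```python
import numpy as np

def pixel_offset(speckle, reference, y, x, block=31, search=32):
    """Offset (in pixels) of the 31x31 block centered at (y, x) in the
    collected speckle image relative to the reference speckle image.
    Assumes (y, x) lies far enough from the image border."""
    h = block // 2
    patch = speckle[y - h:y + h + 1, x - h:x + h + 1].astype(np.float64)
    best_dx, best_ssd = 0, np.inf
    for dx in range(-search, search + 1):
        cand = reference[y - h:y + h + 1, x - h + dx:x + h + 1 + dx]
        if cand.shape != patch.shape:
            continue  # candidate block falls outside the reference image
        ssd = np.sum((patch - cand.astype(np.float64)) ** 2)
        if ssd < best_ssd:
            best_ssd, best_dx = ssd, dx
    return best_dx

# Synthetic check: a 4-pixel horizontal shift is recovered as offset 4.
rng = np.random.default_rng(0)
ref = rng.integers(0, 255, size=(240, 320), dtype=np.uint8)
spk = np.roll(ref, -4, axis=1)
assert pixel_offset(spk, ref, 120, 160) == 4
```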
In one embodiment, the first processing unit calculates an offset between the collected speckle image and the reference speckle pattern, and calculates depth information of each pixel point included in the speckle image according to the offset, where a calculation formula may be as shown in formula (1):
Z_D = (L × f × Z_0) / (L × f + P × Z_0)    (1)
where Z_D represents the depth information of a pixel, i.e. the depth value of the pixel; L is the distance between the laser camera and the laser; f is the focal length of the lens in the laser camera; Z_0 is the distance between the reference plane and the electronic device when the reference speckle image was collected; and P is the offset between corresponding pixels in the collected speckle image and the reference speckle image. P can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance of one pixel. When the distance between the target object and the laser camera is larger than the distance between the reference plane and the laser camera, P is negative; when the distance between the target object and the laser camera is smaller than the distance between the reference plane and the laser camera, P is positive.
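A plain transcription of formula (1) as given above, with the pixel-to-physical conversion included; the parameter names and example numbers are illustrative assumptions, not values from the patent:

```python
def pixel_depth(offset_pixels, pixel_pitch_mm, L_mm, f_mm, z0_mm):
    """Depth Z_D of one pixel from formula (1).

    offset_pixels * pixel_pitch_mm gives the physical offset P; the
    sign convention follows the text (P > 0 when the target is closer
    to the camera than the reference plane, so the depth decreases).
    """
    P = offset_pixels * pixel_pitch_mm
    return (L_mm * f_mm * z0_mm) / (L_mm * f_mm + P * z0_mm)

# P = 0 reproduces the reference-plane distance Z_0.
assert pixel_depth(0, 0.002, 40.0, 2.0, 500.0) == 500.0
```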
And step 506, generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to a second processing unit, and processing the depth disparity map through the second processing unit to obtain the depth map.
After obtaining the depth information of each pixel contained in the collected speckle image, the first processing unit can correct the collected speckle image, removing the image content offset caused by internal and external parameters of the first camera and the second camera. The first processing unit can generate a depth disparity map according to the corrected speckle image and the depth values of the pixels in the speckle image, and send the depth disparity map to the second processing unit. The second processing unit can obtain a depth map from the depth disparity map; the depth map can contain the depth information of each pixel. The second processing unit can upload the depth map to the application, and the application can perform beautification, three-dimensional modeling, and the like according to the depth information of the face in the depth map. The second processing unit can also perform living-body detection according to the depth information of the face in the depth map, which can prevent the collected face from being, for example, a two-dimensional planar face.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
As shown in fig. 6, in one embodiment, before acquiring the stored reference speckle image in step 502, the method further includes the following steps:
step 602, collecting the temperature of the laser every other collection time period, and acquiring a reference speckle image corresponding to the temperature.
The electronic device can be provided with a temperature sensor beside the laser, where the laser refers to the laser lamp or the like, and the temperature of the laser is collected by the temperature sensor. The second processing unit may read the laser temperature collected by the temperature sensor once every collection period, where the collection period can be set according to actual requirements, for example 3 seconds or 4 seconds, but is not limited thereto. When the temperature of the laser changes, the camera module can deform, which affects the internal and external parameters of the first camera and the second camera. Since the influence on the cameras differs at different temperatures, different temperatures can correspond to different reference speckle images.
The second processing unit can acquire the reference speckle image corresponding to the temperature and process the speckle image collected at that temperature according to that reference speckle image to obtain a depth map. Optionally, the second processing unit may preset a number of different temperature intervals, such as 0°C to 30°C, 30°C to 60°C, and 60°C to 90°C, but is not limited thereto, and different temperature intervals may correspond to different reference speckle images. After the second processing unit collects the temperature, it can determine the temperature interval the temperature falls in and obtain the reference speckle image corresponding to that interval.
Step 604, when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit.
After the second processing unit acquires the reference speckle image corresponding to the collected temperature, it can judge whether this reference speckle image is consistent with the one stored in the first processing unit. The reference speckle image can carry an image identifier, which can be composed of one or more of numbers, letters, characters, and the like. The second processing unit can read the image identifier of the stored reference speckle image from the first processing unit and compare the image identifier of the reference speckle image acquired this time with the identifier read from the first processing unit. If the two image identifiers are not consistent, the reference speckle image acquired this time is not consistent with the one stored in the first processing unit, and the second processing unit can write the newly acquired reference speckle image into the first processing unit. The first processing unit can store the newly written reference speckle image and delete the previously stored one.
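A sketch of the periodic update described in steps 602 and 604; the interval bounds, identifiers, and the write_reference callback are illustrative assumptions (boundary handling at exactly 30°C or 60°C is arbitrary here):

```python
import bisect

TEMP_BOUNDS_C = [30, 60]                  # intervals: 0-30, 30-60, 60-90
REFERENCE_IDS = ["ref_cold", "ref_warm", "ref_hot"]

def update_reference_image(laser_temp_c, stored_id, write_reference):
    """Select the reference speckle image for the current laser
    temperature and write it to the first processing unit only when
    its identifier differs from the one already stored."""
    new_id = REFERENCE_IDS[bisect.bisect_left(TEMP_BOUNDS_C, laser_temp_c)]
    if new_id != stored_id:
        write_reference(new_id)           # replaces the stored image
    return new_id

# 45 degrees falls in the middle interval; only a changed ID triggers a write.
assert update_reference_image(45, "ref_cold", print) == "ref_warm"
```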
In this embodiment, the reference speckle image corresponding to the temperature can be acquired according to the temperature of the laser, and the influence of the temperature on the finally output depth map is reduced, so that the obtained depth information is more accurate.
As shown in fig. 7, in an embodiment, the method for controlling shooting further includes the following steps:
step 702, when the second processing unit receives a data obtaining request of the application program, obtaining a security level of the application program.
In one embodiment, the second processing unit in the electronic device may have two operation modes: the first operation mode may be the TEE, a trusted execution environment with a high security level, and the second operation mode may be the REE, a rich execution environment with a low security level. After receiving a data acquisition request sent by an application, the second processing unit can send an image acquisition instruction to the first processing unit through the first operation mode. When the second processing unit has a single core, that core can be switched directly from the second operation mode to the first operation mode; when the second processing unit has multiple cores, one core can be switched from the second operation mode to the first operation mode while the other cores remain in the second operation mode, and the image acquisition instruction is sent to the first processing unit through the core operating in the first operation mode. After the first processing unit processes the collected first image, it can send the processed first image to the core running in the first operation mode, which ensures that the first processing unit always operates in a trusted execution environment and improves security.
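The core switching can be pictured as below; Core and its send method are stand-ins, since the actual TEE/REE transition is a platform primitive the patent does not detail:

```python
from dataclasses import dataclass

@dataclass
class Core:
    mode: str  # "TEE" or "REE"

    def send_to_first_unit(self, message):
        print(f"core[{self.mode}] -> first processing unit: {message}")

def handle_data_request(request, cores):
    """Send the image acquisition instruction from a TEE core,
    switching one core from REE to TEE if none is trusted yet."""
    tee = next((c for c in cores if c.mode == "TEE"), None)
    if tee is None:
        tee = cores[0]
        tee.mode = "TEE"  # single-core case: switch REE -> TEE directly
    tee.send_to_first_unit({"cmd": "acquire_image", "request": request})

handle_data_request("face_unlock", [Core("REE"), Core("REE")])
```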
When the application program of the electronic device sends a data acquisition request to the second processing unit, the second processing unit may acquire an application type of the application program and acquire a security level corresponding to the application type. The application types may include, but are not limited to, unlock applications, payment applications, camera applications, beauty applications, and the like. The security level of different application types may be different, for example, but not limited to, the security level corresponding to the payment application and the unlock application may be high, the security level corresponding to the camera application, the beauty application may be low, and the like.
Step 704, determining a data transmission channel corresponding to the security level.
The second processing unit may determine a data transmission channel corresponding to the security level of the application program, and the data transmission channel may include, but is not limited to, a secure channel and a normal channel, wherein the secure channel may correspond to the application program with a higher security level, and the normal channel may correspond to the application program with a lower security level. For example, the payment application may correspond to a secure channel and the beauty application may correspond to a normal channel. In the secure channel, the transmitted data can be encrypted, so that the data is prevented from being leaked or stolen.
And step 706, sending the depth map to the application program through the corresponding data transmission channel.
The second processing unit can send the depth map to the application through the data transmission channel corresponding to the application's security level: for an application with a higher security level, the depth map is encrypted and sent through the secure channel; for an application with a lower security level, the depth map is sent through the normal channel, which increases the transmission speed. Optionally, besides the depth map, other data, such as, but not limited to, the verification result of face verification, may also be sent to the application through the data transmission channel corresponding to the application's security level.
In one embodiment, the second processing unit may send the application a depth map whose accuracy corresponds to the application's security level; the higher the accuracy, the clearer the depth image and the more depth information it contains. An application with a high security level may receive a high-accuracy depth map, and an application with a low security level a low-accuracy one. Alternatively, the second processing unit may adjust the image accuracy by adjusting the image resolution: the higher the resolution, the higher the image accuracy, and the lower the resolution, the lower the image accuracy. The number of diffraction points of the laser lamp can also be controlled: more diffraction points give higher image accuracy, and fewer diffraction points give lower accuracy. It will be appreciated that other ways of controlling the image accuracy may be used; the methods are not limited to those mentioned above. Adjusting the accuracy of the depth map according to the security level of the application can improve data security.
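An illustrative mapping from application type to channel and depth-map accuracy; the concrete types, levels, and scale factors are assumptions, not values from the patent:

```python
# (channel, resolution scale): the secure channel encrypts in transit,
# and a scale < 1.0 means the depth map is downsampled before delivery.
POLICY = {
    "payment": ("secure", 1.0),
    "unlock":  ("secure", 1.0),
    "camera":  ("normal", 0.5),
    "beauty":  ("normal", 0.5),
}

def route_depth_map(app_type):
    """Channel and accuracy for an application type (default: low)."""
    return POLICY.get(app_type, ("normal", 0.5))

assert route_depth_map("payment") == ("secure", 1.0)
assert route_depth_map("beauty") == ("normal", 0.5)
```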
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so as to improve the security of data transmission.
In one embodiment, there is provided a method of controlling photographing, including the steps of:
step (1), when the first processing unit receives an image acquisition instruction sent by the second processing unit, the first camera is controlled to acquire a first image according to the image acquisition instruction, the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for instructing the second processing unit to control the second camera to acquire a second image.
In one embodiment, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the first processing unit is further connected with the first camera and the second camera through signal lines respectively.
Step (2): when the first processing unit receives a synchronization signal sent by the second camera, a first exposure time length of the first camera and a second exposure time length of the second camera are acquired, where the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of the second image.
Step (3): a delay time length is calculated according to the first exposure time length and the second exposure time length.
In one embodiment, step (3) comprises: calculating the exposure time difference between the first exposure time length and the second exposure time length, and dividing the exposure time difference by 2 to obtain the delay time length.
In one embodiment, step (3) comprises: respectively calculating a first intermediate exposure time of the first exposure time length and a second intermediate exposure time of the second exposure time length; and determining the difference between the first intermediate exposure time and the second intermediate exposure time, and taking the difference as the delay time length. Both variants are illustrated in the sketch that follows this embodiment's summary below.
Step (4): when the time elapsed since the first processing unit received the synchronization signal reaches the delay time length, the synchronization signal is forwarded to the first camera, where the synchronization signal is used for instructing the first camera to start exposure and collect the first image.
Step (5): the first image is processed through the first processing unit, and the processed first image is sent to the second processing unit.
In one embodiment, step (5) comprises: acquiring a stored reference speckle image, where the reference speckle image carries reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; and generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain a depth map.
In one embodiment, before the step of acquiring the stored reference speckle image, the method further includes: acquiring the temperature of the laser at intervals of an acquisition time period, and obtaining a reference speckle image corresponding to that temperature; and when the reference speckle image obtained this time is inconsistent with the reference speckle image stored in the first processing unit, writing the reference speckle image obtained this time into the first processing unit.
In an embodiment, the method for controlling shooting further includes: when the second processing unit receives a data acquisition request of an application program, acquiring the security level of the application program; determining a data transmission channel corresponding to the security level; and sending the depth map to the application program through the corresponding data transmission channel.
In this embodiment, when the first processing unit receives the synchronization signal sent by the second camera, it calculates a delay time length from the exposure time lengths of the two cameras, and forwards the synchronization signal to the first camera once the time elapsed since receiving the signal reaches that delay time length. Because the forwarding time point is dynamically adjusted according to the exposure time lengths of the first camera and the second camera, the synchronization timing of the two cameras can be dynamically adjusted as well; the synchronization effect is good, and even when the exposure time lengths of the two cameras differ greatly, the image contents acquired by the two cameras remain consistent.
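A minimal sketch of the two delay calculations named in step (3) follows. The millisecond units, function names, and worked exposure values are assumptions of this sketch, not values fixed by the embodiment.

```python
def delay_by_half_difference(first_exposure_ms: float,
                             second_exposure_ms: float) -> float:
    # Variant 1: half of the exposure time difference.
    return abs(first_exposure_ms - second_exposure_ms) / 2

def delay_by_mid_exposure(first_exposure_ms: float,
                          second_exposure_ms: float) -> float:
    # Variant 2: difference between the two intermediate (mid-)
    # exposure times, each measured from its camera's exposure start.
    return second_exposure_ms / 2 - first_exposure_ms / 2

# Worked example: the second camera exposes for 30 ms and the first
# for 10 ms. Forwarding the synchronization signal (30 - 10) / 2 = 10 ms
# after it arrives makes both exposures share the same midpoint
# (15 ms after the second camera starts exposing).
assert delay_by_half_difference(10.0, 30.0) == 10.0
assert delay_by_mid_exposure(10.0, 30.0) == 10.0
```

Under these assumptions the two variants agree whenever the second exposure is the longer one, and aligning the exposure midpoints is what keeps the captured contents consistent even when the exposure time lengths differ greatly.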
It should be understood that, although the steps in the flow charts described above are displayed in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least a portion of the steps in the flow charts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times, and their execution order is not necessarily sequential; they may be executed in turn or in alternation with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, an electronic device is provided, which includes a first processing unit, a second processing unit, and a camera module, the first processing unit being connected with the second processing unit and with the camera module. The camera module may include a first camera and a second camera; the first processing unit may be connected to the first camera through a control line, and the second processing unit may be connected to the second camera through a control line. The first processing unit is connected with the second processing unit, and the first processing unit is further connected with the first camera and the second camera through signal lines, respectively.
The second processing unit is configured to, when a data acquisition request is received, control the second camera to acquire a second image according to the data acquisition request and send an image acquisition instruction to the first processing unit.
The first processing unit is configured to, when receiving the image acquisition instruction sent by the second processing unit, control the first camera to acquire the first image according to the image acquisition instruction.
The second camera is configured to send a synchronization signal to the first processing unit at the moment of starting exposure when acquiring each frame of the second image.
The first processing unit is further configured to obtain a first exposure time length of the first camera and a second exposure time length of the second camera when it receives the synchronization signal sent by the second camera.
The first processing unit is further configured to calculate a delay time length according to the first exposure time length and the second exposure time length.
The first processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since it received the synchronization signal reaches the delay time length.
The first camera is configured to start exposure and acquire the first image according to the synchronization signal.
The first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
In this embodiment, when the first processing unit receives the synchronization signal sent by the second camera, it calculates a delay time length from the exposure time lengths of the two cameras, and forwards the synchronization signal to the first camera once the time elapsed since receiving the signal reaches that delay time length. Because the forwarding time point is dynamically adjusted according to the exposure time lengths of the first camera and the second camera, the synchronization timing of the two cameras can be dynamically adjusted as well; the synchronization effect is good, and even when the exposure time lengths of the two cameras differ greatly, the image contents acquired by the two cameras remain consistent.
In an embodiment, the first processing unit is further configured to calculate the exposure time difference between the first exposure time length and the second exposure time length, and divide the exposure time difference by 2 to obtain the delay time length.
In an embodiment, the first processing unit is further configured to respectively calculate a first intermediate exposure time of the first exposure time length and a second intermediate exposure time of the second exposure time length, determine the difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay time length.
In this embodiment, the time point for forwarding the synchronization signal is dynamically adjusted according to the exposure time lengths of the first camera and the second camera, so the synchronization timing of the two cameras can be dynamically adjusted and the midpoints of their exposures are aligned, yielding a good synchronization effect.
In one embodiment, the first processing unit is further configured to acquire a stored reference speckle image and match the reference speckle image with the speckle image to obtain a matching result, where the reference speckle image carries reference depth information.
The first processing unit is further configured to generate a depth disparity map according to the reference depth information and the matching result and send the depth disparity map to the second processing unit.
The second processing unit is further configured to process the depth disparity map to obtain the depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit; the data processing efficiency is high, and the accuracy of image processing is improved.
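The embodiment does not disclose the matching algorithm itself, so the following is only a generic sum-of-squared-differences block matcher, sketched as one plausible way to compare a captured speckle image against the stored reference speckle image and produce a disparity map; converting the resulting disparities into a depth map would additionally use the reference depth information.

```python
import numpy as np

def disparity_map(speckle: np.ndarray, reference: np.ndarray,
                  block: int = 8, max_shift: int = 16) -> np.ndarray:
    """Per-block horizontal disparity between speckle and reference."""
    h, w = speckle.shape
    disp = np.zeros((h // block, w // block), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = speckle[y:y + block, x:x + block].astype(np.float32)
            best_cost, best_d = np.inf, 0
            # Structured-light disparity is assumed horizontal: search
            # shifted reference patches along the x axis only.
            for d in range(-max_shift, max_shift + 1):
                xs = x + d
                if xs < 0 or xs + block > w:
                    continue
                ref = reference[y:y + block, xs:xs + block].astype(np.float32)
                cost = np.sum((patch - ref) ** 2)  # SSD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[by, bx] = best_d
    return disp
```

In practice the block size, search range, and cost function would be tuned to the speckle pattern; the sketch only shows the structure of the computation.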
In one embodiment, the second processing unit is further configured to acquire the temperature of the laser at intervals of an acquisition time period and obtain a reference speckle image corresponding to that temperature.
The second processing unit is further configured to write the reference speckle image obtained this time into the first processing unit when it is inconsistent with the reference speckle image stored in the first processing unit.
In this embodiment, a reference speckle image matching the current laser temperature can be obtained, which reduces the influence of temperature on the final output depth map and makes the obtained depth information more accurate.
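The mapping from laser temperature to a stored reference speckle image is likewise unspecified, so this sketch assumes a small set of reference images calibrated at known temperatures and a nearest-temperature lookup; read_laser_temp, write_reference, and the 5-second polling period are all hypothetical names and values.

```python
import time

def reference_for_temperature(temp_c: float,
                              calibrated: dict[float, bytes]) -> bytes:
    # Assumed lookup: pick the reference speckle image calibrated at
    # the temperature closest to the measured laser temperature.
    nearest = min(calibrated, key=lambda t: abs(t - temp_c))
    return calibrated[nearest]

def maintain_reference(read_laser_temp, first_unit, calibrated,
                       period_s: float = 5.0) -> None:
    """Poll the laser temperature and keep the first processing unit's
    stored reference speckle image up to date."""
    current = None
    while True:
        ref = reference_for_temperature(read_laser_temp(), calibrated)
        if ref != current:
            # Only write when the looked-up reference differs from the
            # one currently stored, as the embodiment describes.
            first_unit.write_reference(ref)
            current = ref
        time.sleep(period_s)
```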
In one embodiment, the second processing unit is further configured to acquire the security level of an application program when receiving a data acquisition request from the application program.
The second processing unit is further configured to determine the data transmission channel corresponding to the security level and send the depth map to the application program through that channel.
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so as to improve the security of data transmission.
As shown in fig. 8, in one embodiment, an apparatus 800 for controlling photographing is provided, which includes an image capturing module 810, a signal receiving module 820, a calculating module 830, a signal forwarding module 840, and a processing module 850.
The image acquisition module 810 is configured to control the first camera to acquire the first image according to an image acquisition instruction when the first processing unit receives the image acquisition instruction sent by the second processing unit, where the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used to instruct the second processing unit to control the second camera to acquire the second image.
The signal receiving module 820 is configured to, when the first processing unit receives a synchronization signal sent by the second camera, obtain a first exposure duration of the first camera and a second exposure duration of the second camera, where the synchronization signal is a signal sent at a time when the second camera starts to expose when acquiring each frame of the second image.
The calculating module 830 is configured to calculate a delay time according to the first exposure time and the second exposure time.
The signal forwarding module 840 is configured to forward the synchronization signal to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay time length, where the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
The processing module 850 is configured to process the first image through the first processing unit, and send the processed first image to the second processing unit.
In this embodiment, when the first processing unit receives the synchronization signal sent by the second camera, it calculates a delay time length from the exposure time lengths of the two cameras, and forwards the synchronization signal to the first camera once the time elapsed since receiving the signal reaches that delay time length. Because the forwarding time point is dynamically adjusted according to the exposure time lengths of the first camera and the second camera, the synchronization timing of the two cameras can be dynamically adjusted as well; the synchronization effect is good, and even when the exposure time lengths of the two cameras differ greatly, the image contents acquired by the two cameras remain consistent.
In an embodiment, the calculating module 830 is further configured to calculate the exposure time difference between the first exposure time length and the second exposure time length, and divide the exposure time difference by 2 to obtain the delay time length.
In one embodiment, the calculating module 830 is further configured to respectively calculate a first intermediate exposure time of the first exposure time length and a second intermediate exposure time of the second exposure time length, determine the difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay time length.
In this embodiment, the time point for forwarding the synchronization signal is dynamically adjusted according to the exposure time lengths of the first camera and the second camera, so the synchronization timing of the two cameras can be dynamically adjusted and the midpoints of their exposures are aligned, yielding a good synchronization effect.
As shown in fig. 9, in one embodiment, the processing module 850 includes an image obtaining unit 852, a matching unit 854, and a generating unit 856.
The image obtaining unit 852 is configured to obtain the stored reference speckle image, which carries reference depth information.
The matching unit 854 is configured to match the reference speckle image with the speckle image to obtain a matching result.
The generating unit 856 is configured to generate a depth disparity map according to the reference depth information and the matching result and send the depth disparity map to the second processing unit, which processes the depth disparity map to obtain a depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit; the data processing efficiency is high, and the accuracy of image processing is improved.
In one embodiment, the processing module 850 includes a temperature acquisition unit and a writing unit in addition to the image acquisition unit 852, the matching unit 854 and the generation unit 856.
The temperature acquisition unit is configured to acquire the temperature of the laser at intervals of an acquisition time period and obtain a reference speckle image corresponding to that temperature.
The writing unit is configured to write the reference speckle image obtained this time into the first processing unit when it is inconsistent with the reference speckle image stored in the first processing unit.
In this embodiment, the reference speckle image corresponding to the temperature can be acquired according to the temperature of the laser, and the influence of the temperature on the finally output depth map is reduced, so that the obtained depth information is more accurate.
In an embodiment, the apparatus 800 for controlling photographing includes a level obtaining module, a channel determining module, and a transmitting module, in addition to the image capturing module 810, the signal receiving module 820, the calculating module 830, the signal forwarding module 840, and the processing module 850.
The level acquisition module is configured to acquire the security level of an application program when the second processing unit receives a data acquisition request from the application program.
The channel determining module is configured to determine the data transmission channel corresponding to the security level.
The sending module is configured to send the depth map to the application program through the corresponding data transmission channel.
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so as to improve the security of data transmission.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, implements the above-described method of controlling photographing.
In one embodiment, a computer program product is provided that comprises a computer program, which when run on a computer device causes the computer device to carry out the above-described method of controlling shooting.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above may be implemented by a computer program; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any combination that contains no contradiction should be considered within the scope of this specification.
The embodiments described above express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art may make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A method of controlling shooting, comprising:
when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling a first camera to acquire a first image according to the image acquisition instruction, wherein the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for indicating the second processing unit to control a second camera to acquire a second image;
when the first processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure time length of the first camera and a second exposure time length of the second camera, wherein the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
calculating a delay time length according to the first exposure time length and the second exposure time length;
when the time elapsed since the first processing unit received the synchronization signal reaches the delay time length, forwarding the synchronization signal to the first camera, the synchronization signal being used for instructing the first camera to start exposure and acquire a first image;
processing the first image through the first processing unit, and sending the processed first image to the second processing unit;
the processing the first image by the first processing unit and sending the processed first image to the second processing unit includes:
acquiring a mode of the first processing unit;
and processing the first image according to the mode to obtain a disparity map corresponding to the mode, sending the disparity map to the second processing unit, and processing the disparity map through the second processing unit to obtain a target image.
2. The method according to claim 1, wherein the first processing unit is connected to the first camera via a control line, the second processing unit is connected to the second camera via a control line, the first processing unit is connected to the second processing unit, and the first processing unit is further connected to the first camera and the second camera via signal lines, respectively.
3. The method of claim 1, wherein the calculating a delay time length according to the first exposure time length and the second exposure time length comprises:
calculating the exposure time difference between the first exposure time length and the second exposure time length, and dividing the exposure time difference by 2 to obtain the delay time length.
4. The method of claim 1, wherein the calculating a delay time length according to the first exposure time length and the second exposure time length comprises:
respectively calculating a first intermediate exposure time of the first exposure time length and a second intermediate exposure time of the second exposure time length;
and determining the difference between the first intermediate exposure time and the second intermediate exposure time, and taking the difference as the delay time length.
5. The method of claim 1, wherein the first image comprises a speckle image;
the processing the first image according to the mode to obtain a disparity map corresponding to the mode, sending the disparity map to the second processing unit, and processing the disparity map through the second processing unit to obtain a target image includes:
when the mode is a depth map mode, acquiring a stored reference speckle image, wherein the reference speckle image is provided with reference depth information;
matching the reference speckle image with the speckle image to obtain a matching result;
and generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain the depth map.
6. The method of claim 5, wherein prior to said acquiring a reference speckle image, the method further comprises:
acquiring the temperature of a laser at intervals of an acquisition time period, and acquiring a reference speckle image corresponding to the temperature;
and when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit.
7. The method of claim 5, further comprising:
when a second processing unit receives a data acquisition request of an application program, acquiring the security level of the application program;
determining a data transmission channel corresponding to the security level;
and sending the depth map to the application program through the corresponding data transmission channel.
8. An apparatus for controlling photographing, comprising:
the image acquisition module is configured to, when the first processing unit receives an image acquisition instruction sent by the second processing unit, control the first camera to acquire a first image according to the image acquisition instruction, wherein the image acquisition instruction is sent when the second processing unit receives a data acquisition request, and the data acquisition request is used for instructing the second processing unit to control the second camera to acquire a second image;
the signal receiving module is configured to acquire a first exposure time length of the first camera and a second exposure time length of the second camera when the first processing unit receives a synchronization signal sent by the second camera, wherein the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
the calculating module is configured to calculate the delay time length according to the first exposure time length and the second exposure time length;
the signal forwarding module is configured to forward the synchronization signal to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay time length, wherein the synchronization signal is used to instruct the first camera to start exposure and acquire the first image;
the processing module is used for processing the first image through the first processing unit and sending the processed first image to the second processing unit;
the processing module is specifically configured to: acquire a mode of the first processing unit;
and process the first image according to the mode to obtain a disparity map corresponding to the mode, send the disparity map to the second processing unit, and process the disparity map through the second processing unit to obtain a target image.
9. An electronic device is characterized by comprising a first processing unit, a second processing unit and a camera module, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a first camera and a second camera, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the first processing unit is also connected with the first camera and the second camera through signal lines respectively;
the second processing unit is used for controlling the second camera to acquire a second image according to the data acquisition request and sending an image acquisition instruction to the first processing unit when the data acquisition request is received;
the first processing unit is used for controlling the first camera to acquire a first image according to the image acquisition instruction when receiving the image acquisition instruction sent by the second processing unit;
the second camera is configured to send a synchronization signal to the first processing unit at the moment of starting exposure when each frame of second image is collected;
the first processing unit is further configured to obtain a first exposure duration of the first camera and a second exposure duration of the second camera when the first processing unit receives the synchronization signal sent by the second camera;
the first processing unit is further configured to calculate a delay time length according to the first exposure time length and the second exposure time length;
the first processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since the first processing unit received the synchronization signal reaches the delay time length;
the first camera is used for starting exposure according to the synchronous signal and acquiring a first image;
the first processing unit is further configured to process the first image and send the processed first image to the second processing unit;
the first processing unit is specifically configured to: acquire a mode of the first processing unit;
process the first image according to the mode to obtain a disparity map corresponding to the mode, and send the disparity map to the second processing unit;
the second processing unit is further configured to: and processing the disparity map to obtain a target image.
10. The electronic device of claim 9, wherein the first processing unit is further configured to calculate the exposure time difference between the first exposure time length and the second exposure time length, and divide the exposure time difference by 2 to obtain the delay time length.
11. The electronic device according to claim 9, wherein the first processing unit is further configured to calculate a first intermediate exposure time of the first exposure time duration and a second intermediate exposure time of the second exposure time duration, respectively, determine a difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay time duration.
12. The electronic device of claim 9, wherein the first image comprises a speckle image, and the first processing unit is further configured to, when the mode is a depth map mode, obtain a stored reference speckle image and match the reference speckle image with the speckle image to obtain a matching result, wherein the reference speckle image carries reference depth information;
the first processing unit is further configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit;
the second processing unit is further configured to process the depth disparity map to obtain a depth map.
13. The electronic device of claim 12, wherein the second processing unit is further configured to acquire a temperature of the laser at every acquisition time period and acquire a reference speckle image corresponding to the temperature;
and the second processing unit is further configured to write the reference speckle image acquired this time into the first processing unit when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit.
14. The electronic device according to claim 12, wherein the second processing unit is further configured to, when receiving a data acquisition request of an application program, acquire a security level of the application program;
and the second processing unit is further configured to determine a data transmission channel corresponding to the security level, and send the depth map to the application program through the corresponding data transmission channel.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN201910585686.8A 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium Active CN110248111B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910585686.8A CN110248111B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910585686.8A CN110248111B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN201810404282.XA CN108419017B (en) 2018-04-28 2018-04-28 Control method, apparatus, electronic equipment and the computer readable storage medium of shooting

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810404282.XA Division CN108419017B (en) 2018-04-28 2018-04-28 Control method, apparatus, electronic equipment and the computer readable storage medium of shooting

Publications (2)

Publication Number Publication Date
CN110248111A CN110248111A (en) 2019-09-17
CN110248111B true CN110248111B (en) 2020-08-28

Family

ID=63137390

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810404282.XA Active CN108419017B (en) 2018-04-28 2018-04-28 Control method, apparatus, electronic equipment and the computer readable storage medium of shooting
CN201910585686.8A Active CN110248111B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810404282.XA Active CN108419017B (en) 2018-04-28 2018-04-28 Control method, apparatus, electronic equipment and the computer readable storage medium of shooting

Country Status (1)

Country Link
CN (2) CN108419017B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019205887A1 (en) 2018-04-28 2019-10-31 Oppo广东移动通信有限公司 Method and apparatus for controlling photographing, electronic device, and computer readable storage medium
CN111107248B (en) * 2018-10-25 2022-02-08 北京图森智途科技有限公司 Multi-channel video acquisition synchronization system and method and acquisition controller
CN109635539B (en) * 2018-10-30 2022-10-14 荣耀终端有限公司 Face recognition method and electronic equipment
WO2020181494A1 (en) * 2019-03-12 2020-09-17 深圳市大疆创新科技有限公司 Parameter synchronization method, image capture apparatus, and movable platform
CN110460824B (en) * 2019-07-03 2022-10-11 青岛小鸟看看科技有限公司 Frame synchronization method of image data and camera
CN113132551B (en) * 2019-12-30 2023-08-08 浙江舜宇智能光学技术有限公司 Synchronous control method and synchronous control device for multi-camera system and electronic equipment
CN111230290B (en) * 2020-01-17 2021-07-30 北京工业大学 System and method for synchronizing ultrafast laser and ICCD camera by photoelectric signal
CN111292488A (en) * 2020-02-13 2020-06-16 展讯通信(上海)有限公司 Image data processing method, device and storage medium
CN114071120A (en) * 2020-08-03 2022-02-18 炬才微电子(深圳)有限公司 Camera testing system, method, storage medium and electronic equipment
CN118102109A (en) 2020-08-07 2024-05-28 北京图森未来科技有限公司 Control method, device and equipment of image acquisition equipment and storage medium
CN114371722B (en) * 2021-12-03 2024-05-10 深圳供电局有限公司 Data acquisition method, device, unmanned aerial vehicle and storage medium
CN114863510B (en) * 2022-03-25 2023-08-01 荣耀终端有限公司 Face recognition method and device
CN115604402A * 2022-09-22 2023-01-13 恒玄科技(上海)股份有限公司 Wireless intelligent wearable device and image acquisition method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1710935A (en) * 2004-06-17 2005-12-21 株式会社日立制作所 Imaging apparatus
CN103338334A (en) * 2013-07-17 2013-10-02 中测新图(北京)遥感技术有限责任公司 System and method for controlling multi-cameral digital aerial photographic camera synchronous exposure
CN105657243A (en) * 2015-11-08 2016-06-08 乐视移动智能信息技术(北京)有限公司 Anti-jitter delay photographing method and device
WO2016168781A1 (en) * 2015-04-17 2016-10-20 The Lightco Inc. Methods and apparatus for syncronizing readout of multiple image sensors
CN106973231A (en) * 2017-04-19 2017-07-21 宇龙计算机通信科技(深圳)有限公司 Picture synthetic method and system
CN107231533A (en) * 2017-06-12 2017-10-03 深圳市瑞立视多媒体科技有限公司 A kind of synchronous exposure method, device and terminal device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6033505B2 (en) * 2014-11-21 2016-11-30 オリンパス株式会社 Imaging system
US9626803B2 (en) * 2014-12-12 2017-04-18 Qualcomm Incorporated Method and apparatus for image processing in augmented reality systems
CN107040726B (en) * 2017-04-19 2020-04-07 宇龙计算机通信科技(深圳)有限公司 Double-camera synchronous exposure method and system
CN107395998A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image capturing method and mobile terminal

Also Published As

Publication number Publication date
CN110248111A (en) 2019-09-17
CN108419017A (en) 2018-08-17
CN108419017B (en) 2019-07-30

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant