CN110971836B - Method and device for controlling shooting, electronic equipment and computer-readable storage medium - Google Patents


Info

Publication number
CN110971836B
Authority
CN
China
Prior art keywords
processing unit
camera
image
exposure time
exposure
Prior art date
Legal status
Active
Application number
CN202010004323.3A
Other languages
Chinese (zh)
Other versions
CN110971836A
Inventor
谭国辉
周海涛
谭筱
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010004323.3A
Publication of CN110971836A
Application granted
Publication of CN110971836B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/04 Synchronising
    • H04N5/06 Generation of synchronising signals
    • H04N5/067 Arrangements or circuits at the transmitter end
    • H04N5/073 Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations
    • H04N5/0733 Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations for distributing synchronisation pulses to different TV cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present application relate to a method and an apparatus for controlling shooting, an electronic device, and a computer-readable storage medium. The method comprises the following steps: when a second processing unit receives a data acquisition request, it controls a second camera to acquire a second image and sends an image acquisition instruction to a first processing unit, the image acquisition instruction instructing the first processing unit to control a first camera to acquire a first image; when the second processing unit receives a synchronization signal sent by the second camera, it acquires a first exposure time of the first camera and a second exposure time of the second camera; a delay duration is calculated according to the first exposure time and the second exposure time; and when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, the synchronization signal is forwarded to the first camera. The method, apparatus, electronic device, and computer-readable storage medium for controlling shooting achieve a good synchronization effect and can ensure that the image content acquired by the two cameras is consistent.

Description

Method and device for controlling shooting, electronic equipment and computer-readable storage medium
The present application is a divisional application of patent application No. 201810401344.1, filed on April 28, 2018 and entitled "Method and Apparatus for Controlling Shooting, Electronic Device, and Computer-Readable Storage Medium".
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling shooting, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of imaging technology on intelligent terminals, more and more intelligent terminals are equipped with two or more cameras, which cooperate to acquire images with a better visual effect. To ensure that the pictures finally acquired by the two cameras are consistent, the two cameras need to be controlled so that they are synchronized. In the conventional method, the two cameras are usually connected through a hardware signal line and synchronized by a start-of-exposure signal for each frame. When the difference in exposure time between the two cameras is large, synchronization is poor, and the content of the images collected by the two cameras differs greatly.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling shooting, an electronic device and a computer-readable storage medium, which have good synchronization effect and can ensure that the image contents acquired by two cameras are consistent.
A method of controlling photographing, comprising:
when a second processing unit receives a data acquisition request, controlling a second camera to acquire a second image according to the data acquisition request, and sending an image acquisition instruction to a first processing unit, wherein the image acquisition instruction is used for instructing the first processing unit to control the first camera to acquire a first image;
when the second processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure time of the first camera and a second exposure time of the second camera, wherein the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
calculating a delay duration according to the first exposure time and the second exposure time;
when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, forwarding the synchronization signal to the first camera, the synchronization signal being used to instruct the first camera to start exposure and acquire a first image;
and processing the first image through the first processing unit, and sending the processed first image to the second processing unit.
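The claimed steps can be sketched as a minimal, hypothetical control routine. The function and class names below are illustrative, not from the patent, and the delay rule (align both cameras at the same fraction of their exposures) anticipates the embodiment described later; time units are arbitrary but must match between the two exposure values.

```python
import threading

def compute_delay(first_exposure, second_exposure, sync_fraction=0.5):
    """Delay before the sync signal is forwarded to the first camera,
    chosen so that both cameras reach the same fraction of their
    exposure at the same instant (0.5 = coincide at half exposure)."""
    return max(0.0, sync_fraction * (second_exposure - first_exposure))

class SecondProcessingUnit:
    """Hypothetical sketch of the second processing unit's role."""

    def __init__(self, first_camera, first_exposure, second_exposure):
        self.first_camera = first_camera
        self.first_exposure = first_exposure
        self.second_exposure = second_exposure

    def on_sync_signal(self):
        # On receiving the start-of-frame signal from the second camera:
        # wait for the computed delay, then forward it to the first
        # camera, which starts exposing on receipt.
        delay = compute_delay(self.first_exposure, self.second_exposure)
        threading.Timer(delay, self.first_camera.start_exposure).start()
```

With a 10-unit first exposure and a 30-unit second exposure, `compute_delay` returns 10.0 for mid-exposure alignment and 20.0 when the exposures should end together.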
An apparatus for controlling photographing, comprising:
the request receiving module is used for controlling the second camera to collect a second image according to the data acquisition request and sending an image collection instruction to the first processing unit when the second processing unit receives the data acquisition request, wherein the image collection instruction is used for instructing the first processing unit to control the first camera to collect a first image;
the signal receiving module is used for acquiring first exposure time of the first camera and second exposure time of the second camera when the second processing unit receives a synchronization signal sent by the second camera, wherein the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
the calculating module is used for calculating a delay duration according to the first exposure time and the second exposure time;
the signal forwarding module is configured to forward the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, where the synchronization signal is used to instruct the first camera to start exposure and acquire a first image;
and the processing module is used for processing the first image through the first processing unit and sending the processed first image to the second processing unit.
An electronic device comprises a first processing unit, a second processing unit and a camera module, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a first camera and a second camera, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the second processing unit is also connected with the first camera and the second camera through signal lines respectively;
the second processing unit is used for controlling the second camera to acquire a second image according to the data acquisition request and sending an image acquisition instruction to the first processing unit when the data acquisition request is received;
the first processing unit is used for controlling the first camera to acquire a first image according to the image acquisition instruction;
the second camera is used for sending a synchronization signal to the second processing unit at the moment of starting exposure when each frame of second image is collected;
the second processing unit is further configured to obtain a first exposure time of the first camera and a second exposure time of the second camera when the second processing unit receives the synchronization signal sent by the second camera, and calculate a delay duration according to the first exposure time and the second exposure time;
the second processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration;
the first camera is used for starting exposure according to the synchronous signal and acquiring a first image;
the first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as set forth above.
According to the method, apparatus, electronic device, and computer-readable storage medium for controlling shooting, when the second processing unit receives the synchronization signal sent by the second camera, it calculates a delay duration according to the exposure durations of the two cameras, and forwards the synchronization signal to the first camera when the time elapsed since the signal was received reaches the delay duration. The time point at which the synchronization signal is forwarded is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, so that the synchronization timing of the two cameras is dynamically adjusted by the second processing unit. The synchronization effect is good, and even when the exposure durations of the two cameras differ greatly, the image content collected by the two cameras can still be kept consistent.
Drawings
Fig. 1 is an application scenario diagram of a method of controlling photographing in one embodiment;
fig. 2 is an application scenario diagram of a method of controlling photographing in another embodiment;
FIG. 3 is a block diagram of an electronic device in one embodiment;
FIG. 4 is a flow diagram illustrating a method of controlling photography in one embodiment;
FIG. 5 is a schematic diagram illustrating a flow of a second processing unit sending an image capture instruction to a first processing unit in one embodiment;
FIG. 6 is a schematic diagram illustrating a process of a first processing unit sending a processed first image to a second processing unit in one embodiment;
FIG. 7 is a block diagram of an apparatus for controlling photographing in one embodiment;
FIG. 8 is a block diagram of a processing module in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is an application scenario diagram of a method for controlling shooting in one embodiment. As shown in fig. 1, the application scenario may include a first camera 110, a second camera 120, a first processing unit 130, and a second processing unit 140. The first camera 110 may be a laser camera and the second camera 120 may be an RGB (Red/Green/Blue) camera. The first processing unit 130 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 140 may be a CPU (Central Processing Unit) module or the like. The first processing unit 130 is connected to the first camera 110 through a control line, and the second processing unit 140 is connected to the second camera 120 through a control line. The first processing unit 130 is connected to the second processing unit 140. The second processing unit 140 is also connected to the first camera 110 and the second camera 120 through signal lines, respectively.
When the second processing unit 140 receives the data acquisition request, the second camera 120 may be controlled to acquire a second image through the control line according to the data acquisition request, and send an image acquisition instruction to the first processing unit 130. When the first processing unit 130 receives the image capturing instruction sent by the second processing unit 140, the first camera can be controlled by the control line to capture the first image according to the image capturing instruction. When the second camera 120 captures the second image of each frame, a synchronization signal may be sent to the second processing unit 140 through a signal line at the time of starting exposure. When the second processing unit 140 receives the synchronization signal sent by the second camera 120, the first exposure duration of the first camera 110 and the second exposure duration of the second camera 120 may be obtained, and the delay duration may be calculated according to the first exposure duration and the second exposure duration. When the time length for receiving the synchronization signal by the second processing unit 140 reaches the delay time length, the synchronization signal may be forwarded to the first camera 110 through the signal line. After receiving the synchronization signal, the first camera 110 may start to expose and collect a first image, and may transmit the collected first image to the first processing unit 130. The first processing unit 130 may process the first image and transmit the processed first image to the second processing unit 140.
Fig. 2 is an application scenario diagram of a method for controlling shooting in another embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a second processing unit 220, and a first processing unit 230. The second processing unit 220 may be a CPU module. The first processing unit 230 may be an MCU module. The first processing unit 230 is connected between the second processing unit 220 and the camera module 210, and the first processing unit 230 can control the laser camera 212, the floodlight 214, and the laser lamp 218 in the camera module 210. The second processing unit 220 can control the RGB camera 216 in the camera module 210.
The camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser lamp 218. The laser camera 212 may be an infrared camera for acquiring infrared images. The floodlight 214 is a surface light source capable of emitting infrared light; the laser lamp 218 is a point light source that emits a patterned laser beam. When the floodlight 214 emits its surface light, the laser camera 212 can obtain an infrared image from the reflected light. When the laser lamp 218 emits its point light, the laser camera 212 can obtain a speckle image from the reflected light. The speckle image is an image in which the pattern of the patterned point light source emitted by the laser lamp 218 has been deformed by reflection.
The second processing unit 220 may be connected to the RGB camera 216 and the laser camera 212 through signal lines, respectively. When each frame of image is captured by the RGB camera 216, a synchronization signal may be sent to the second processing unit 220. After receiving the synchronization signal sent by the RGB camera 216, the second processing unit 220 may obtain the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216, and calculate the delay duration according to the exposure duration of the laser camera 212 and the exposure duration of the RGB camera 216. When the time period in which the second processing unit 220 receives the synchronization signal reaches the delay time period, the synchronization signal may be forwarded to the laser camera 212 through the signal line. The laser camera 212 receives the synchronization signal, and can start exposure and collect infrared images or speckle images according to the synchronization signal.
The second processing unit 220 may include a CPU core operating in a TEE (Trusted Execution Environment) and a CPU core operating in a REE (Rich Execution Environment). The TEE and the REE are both operating modes of an ARM (Advanced RISC Machines) processor. The TEE has a higher security level, and only one CPU core in the second processing unit 220 can operate in the TEE at a time. Generally, operations with a higher security level in the electronic device 200 need to be executed by the CPU core in the TEE, while operations with a lower security level can be executed by the CPU core in the REE.
The first processing unit 230 includes a PWM (Pulse Width Modulation) module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The PWM module 232 can transmit pulses to the camera module to turn on the floodlight 214 or the laser lamp 218, so that the laser camera 212 can collect infrared images or speckle images. The SPI/I2C interface 234 is used for receiving the image acquisition instruction sent by the second processing unit 220. The depth engine 238 can process the speckle images to obtain a depth disparity map.
When the second processing unit 220 receives a data acquisition request from an application program, for example, when the application program needs to perform face unlocking or face payment, an image acquisition instruction may be sent to the first processing unit 230 through the CPU core operating in the TEE. After the first processing unit 230 receives the image acquisition instruction, the PWM module 232 emits pulses to turn on the floodlight 214 in the camera module 210 and acquire an infrared image through the laser camera 212, and to turn on the laser lamp 218 in the camera module 210 and acquire a speckle image through the laser camera 212. The camera module 210 may send the collected infrared image and speckle image to the first processing unit 230. The first processing unit 230 may process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. Here, processing the infrared image and the speckle image means correcting the infrared image or the speckle image to remove the influence of the internal and external parameters of the camera module 210 on the image. The first processing unit 230 can be set to different modes, and the images output in the different modes differ. When the first processing unit 230 is set to the speckle map mode, it processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to the depth map mode, it processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image being an image carrying depth information.
The first processing unit 230 may send the infrared disparity map and the speckle disparity map to the second processing unit 220, or send the infrared disparity map and the depth disparity map to the second processing unit 220. The second processing unit 220 may obtain a target infrared image according to the infrared disparity map and a depth image according to the depth disparity map. Further, the second processing unit 220 may perform face recognition, face matching, and living-body detection, and acquire depth information of the detected face, according to the target infrared image and the depth image.
The first processing unit 230 and the second processing unit 220 communicate through fixed secure interfaces to ensure the security of the transmitted data. As shown in fig. 2, the data sent by the second processing unit 220 to the first processing unit 230 passes through a SECURE SPI/I2C bus 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through a SECURE MIPI (Mobile Industry Processor Interface) bus 250.
In an embodiment, the first processing unit 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 220.
FIG. 3 is a block diagram of an electronic device in one embodiment. As shown in fig. 3, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program that, when executed by the processor, implements the method of controlling shooting provided in the embodiments of the present application. The processor is used to provide computing and control capability and to support the operation of the whole electronic device. The internal memory in the electronic device provides an environment for the execution of the computer program stored in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a key, a trackball, or a touchpad arranged on the housing of the electronic device, or an external keyboard, touchpad, or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the architecture shown in fig. 3 is a block diagram of only a portion of the architecture related to the present application and does not constitute a limitation on the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
As shown in fig. 4, in one embodiment, there is provided a method of controlling photographing, including the steps of:
step 410, when the second processing unit receives the data acquisition request, controlling the second camera to acquire the second image according to the data acquisition request, and sending an image acquisition instruction to the first processing unit, where the image acquisition instruction is used to instruct the first processing unit to control the first camera to acquire the first image.
When an application program in the electronic device needs to acquire face data, the first camera can be controlled to start and collect a first image. The face data can include, but is not limited to, data needed for face verification in scenarios such as face unlocking and face payment, face depth information, and the like. The first camera can be a laser camera, and the laser camera can collect invisible-light images of different wavelengths. The first image may include, but is not limited to, an infrared image, a speckle image, and the like, a speckle image being an infrared image that carries a speckle pattern.
When the application program needs to acquire face data, it can send a data acquisition request to the second processing unit. After receiving the data acquisition request, the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module and the second processing unit may be a CPU module. Optionally, the second processing unit may first detect whether the data acquisition request includes a visible light image acquisition instruction; if it does, the application program needs to acquire a visible light image containing the face while acquiring the face data. In that case, the second processing unit may control the second camera to acquire a second image according to the visible light image acquisition instruction, where the second camera may be an RGB camera and the second image may be an RGB image containing the face.
After the first processing unit receives the image acquisition instruction, it can control the first camera to acquire a first image according to the instruction, where the first image can include an infrared image, a speckle image, and the like. The first processing unit can turn on the floodlight in the camera module and collect an infrared image through the laser camera, and can turn on the laser lamp in the camera module and collect a speckle image through the laser camera. The floodlight can be a point light source that shines uniformly in all directions; the light it emits can be infrared light, and the laser camera can capture the face to obtain an infrared image. The laser emitted by the laser lamp can be diffracted by a lens and a DOE (Diffractive Optical Element) to produce a pattern of speckle particles, which is projected onto the target object. Because each point of the target object is at a different distance from the electronic device, the speckle pattern is shifted by different amounts, and the laser camera captures the target object to obtain a speckle image.
Step 420, when the second processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure time of the first camera and a second exposure time of the second camera, where the synchronization signal is a signal sent at a time when the second camera starts to expose when acquiring each frame of the second image.
The first processing unit can be connected with the first camera through a control line, and the first camera is controlled to acquire a first image through the control line. The second processing unit can be connected with the second camera through a control line, and the second camera is controlled to collect a second image through the control line. The first processing unit may be connected with the second processing unit. The second processing unit can also be respectively connected with the first camera and the second camera through signal lines, wherein the signal lines can be synchronous signal lines.
When the second camera collects each frame of image, it can send a synchronization signal to the second processing unit over the signal line at the moment exposure starts, where the synchronization signal can be a start-of-frame (SOF) marker used to indicate the start of exposure of each frame of image. When the second processing unit receives the synchronization signal sent by the second camera, it can obtain the first exposure duration of the first camera and the second exposure duration of the second camera. The exposure duration refers to the light-sensing time; the longer the exposure duration, the more light enters the camera. Generally, the first exposure duration of the first camera differs considerably from the second exposure duration of the second camera, and the first exposure duration is usually smaller than the second exposure duration, although this is not limiting: the first exposure duration of the first camera may also be larger than the second exposure duration of the second camera.
Step 430, calculating the delay duration according to the first exposure time and the second exposure time.
The second processing unit can calculate the time delay duration according to the first exposure duration of the first camera and the second exposure duration of the second camera, wherein the time delay duration refers to the time length for prolonging the exposure start of the first camera, and the synchronization of the first camera and the second camera can be ensured by delaying the exposure start time of the first camera.
In one embodiment, the electronic device may preset the moment at which the first camera and the second camera are synchronized in the exposure process, where being synchronized in the exposure process may mean that the ratio of the elapsed exposure time of the first camera to the first exposure duration is the same as the ratio of the elapsed exposure time of the second camera to the second exposure duration. For example, the first camera and the second camera may be set to end exposure simultaneously, to coincide at half the exposure time, to coincide at the 3/4 point of the exposure, and so on. The second processing unit can calculate the delay duration according to the first exposure duration, the second exposure duration, and the preset synchronization moment in the exposure process.
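The calculation described above can be sketched as follows (an illustrative helper, not part of the patent; it assumes the second camera's exposure is the longer one and that `ratio` is the preset synchronization point, e.g. 0.5 for mid-exposure, 1.0 for ending exposure simultaneously):

```python
def delay_for_sync_ratio(t1_ms: float, t2_ms: float, ratio: float = 0.5) -> float:
    """Delay (ms) before forwarding the sync signal to the first camera so that
    both cameras reach the same fraction `ratio` of their exposures at the same
    instant. On a shared timeline the second camera starts exposing at t = 0 and
    reaches its sync point at ratio * t2; the first camera, started after the
    delay, reaches its sync point at delay + ratio * t1. Equating the two gives
    delay = ratio * (t2 - t1), clamped at zero."""
    return max(ratio * (t2_ms - t1_ms), 0.0)
```

With the exposure durations used later in this description (3 ms and 30 ms), a mid-exposure sync point gives a delay of 13.5 ms, and ending exposure simultaneously gives 27 ms.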
Step 440, when the time length of the synchronization signal received by the second processing unit reaches the delay time length, forwarding the synchronization signal to the first camera, where the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
After the second processing unit calculates the delay duration, it can forward the synchronization signal to the first camera when the time elapsed since receiving the synchronization signal reaches the delay duration. After the first camera receives the synchronization signal, it starts exposure, so that the synchronization moments of the first camera and the second camera in the exposure process are kept consistent. For example, the electronic device may calculate in advance the delay duration for the case where the two cameras coincide at half the exposure time, and forward the synchronization signal to the first camera when the elapsed time reaches that delay duration; then, when the first camera is half exposed, the second camera is also half exposed, and the two are consistent.
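The mid-exposure alignment described above can be checked with a small timeline sketch (illustrative only; times are in milliseconds and the second camera is taken to start exposing at t = 0):

```python
def exposure_midpoints(t1_ms: float, t2_ms: float, delay_ms: float):
    """Return (first-camera midpoint, second-camera midpoint) on a shared
    timeline where the second camera starts exposing at t = 0 and the first
    camera starts after delay_ms."""
    return delay_ms + t1_ms / 2, t2_ms / 2

# With t1 = 3 ms, t2 = 30 ms and delay |t1 - t2| / 2 = 13.5 ms,
# both midpoints fall at t = 15 ms, i.e. the cameras coincide at half exposure.
```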
Step 450, the first image is processed by the first processing unit, and the processed first image is sent to the second processing unit.
The first camera can send the captured first image to the first processing unit, which can then process it. The first processing unit can be set to different modes; different modes may acquire different first images and apply different processing to them. When the first processing unit is in infrared mode, it can control the floodlight to turn on and collect an infrared image through the first camera, and the infrared image can be processed to obtain an infrared disparity map. When the first processing unit is in speckle-pattern mode, it can control the laser lamp to turn on and collect a speckle image through the first camera, and the speckle image can be processed to obtain a speckle disparity map. When the first processing unit is in depth-map mode, it can process the speckle image to obtain a depth disparity map.
In one embodiment, the first processing unit may perform correction processing on the first image. The correction processing corrects image content offsets of the first image caused by the internal and external parameters of the first camera and the second camera, for example offsets due to the deflection angle of the laser camera or the placement positions of the laser camera and the RGB camera. After the first image is corrected, a disparity map of the first image can be obtained; for example, correcting an infrared image yields an infrared disparity map, and correcting a speckle image yields a speckle disparity map or a depth disparity map. Performing correction processing on the first image can prevent ghosting in the image finally presented on the screen of the electronic device.
The first processing unit processes the first image, and can send the processed first image to the second processing unit. The second processing unit can obtain a target image, such as a target infrared image, a target speckle image, a target depth map and the like, according to the processed first image. The second processing unit can process the target image according to the requirement of the application program.
For example, when the application program needs to perform face verification, the second processing unit may perform face detection according to the target image, and the like, where the face detection may include face recognition, face matching, and living body detection. The human face recognition means recognizing whether a human face exists in a target image, the human face matching means matching the human face in the target image with a pre-stored human face, and the living body detection means detecting whether the human face in the target image has biological activity or not. If the application program needs to acquire the depth information of the face, the generated target depth map can be uploaded to the application program, and the application program can perform beautifying processing, three-dimensional modeling and the like according to the received target depth map.
In this embodiment, when the second processing unit receives the synchronization signal sent by the second camera, it calculates the delay duration according to the exposure durations of the two cameras, and forwards the synchronization signal to the first camera when the time elapsed since receiving it reaches the delay duration. The time point for forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, and the synchronization moment of the two cameras is dynamically adjusted by the second processing unit. The synchronization effect is good: even when the difference between the exposure durations of the two cameras is large, the content of the images collected by the two cameras can still be kept consistent.
In one embodiment, step 430 of calculating the delay duration according to the first exposure time and the second exposure time includes: calculating the exposure time difference between the first exposure duration and the second exposure duration, and dividing the exposure time difference by 2 to obtain the delay duration.
The electronic device can set the first camera and the second camera to coincide at half the exposure time, so that when the first camera is half exposed, the second camera is also half exposed. When the second processing unit receives the synchronization signal sent by the second camera, it can calculate the exposure time difference between the first exposure duration and the second exposure duration, and divide the exposure time difference by 2 to obtain the delay duration: T3 = |T1 - T2| / 2, where T1 denotes the first exposure duration and T2 denotes the second exposure duration. For example, if the first exposure duration of the first camera is 3 ms (milliseconds) and the second exposure duration of the second camera is 30 ms, the exposure time difference between them is 27 ms, and dividing it by 2 gives a delay duration of 13.5 ms.
Optionally, after calculating the exposure time difference between the first exposure duration and the second exposure duration, the second processing unit may compare the exposure time difference with a time threshold. If the exposure time difference is greater than the time threshold, the exposure time difference is divided by 2 to obtain the delay duration, and the synchronization signal is forwarded to the first camera when the elapsed time since receiving it reaches the delay duration. If the exposure time difference is less than or equal to the time threshold, the second processing unit can forward the synchronization signal to the first camera directly, without postponing the moment at which the first camera starts to expose. The time threshold can be set according to actual requirements, for example 1 ms or 2 ms, which keeps the image content collected by the two cameras within a tolerable difference while reducing the computational load on the second processing unit.
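This optional threshold check can be sketched as follows (an illustrative helper; the 1 ms default threshold is one of the example values above):

```python
def forwarding_delay(t1_ms: float, t2_ms: float, threshold_ms: float = 1.0) -> float:
    """Delay before the second processing unit forwards the sync signal.
    If the exposure difference is within the tolerance, forward immediately
    (delay 0) to spare the second processing unit the extra computation;
    otherwise delay by half the difference so the cameras coincide at
    half exposure."""
    diff = abs(t1_ms - t2_ms)
    if diff <= threshold_ms:
        return 0.0
    return diff / 2
```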
In one embodiment, to ensure that when the first camera is half exposed the second camera is also half exposed, the second processing unit may further calculate a first intermediate exposure moment of the first exposure duration and a second intermediate exposure moment of the second exposure duration, where the intermediate exposure moment refers to the moment at which exposure is half complete. The second processing unit may determine the difference between the first intermediate exposure moment and the second intermediate exposure moment and use this difference as the delay duration: T3 = |T1/2 - T2/2|, where T1 denotes the first exposure duration and T2 denotes the second exposure duration. For example, if the first exposure duration of the first camera is 3 ms and the second exposure duration of the second camera is 30 ms, the first intermediate exposure moment is 1.5 ms and the second intermediate exposure moment is 15 ms; their difference is 13.5 ms, which is used as the delay duration. It is to be understood that other algorithms may also be used to ensure synchronization between the first camera and the second camera, not limited to the above-mentioned methods.
In this embodiment, the time point for forwarding the synchronization signal can be dynamically adjusted according to the exposure duration of the first camera and the second camera, so that the timing for synchronizing the first camera and the second camera can be dynamically adjusted, the first camera and the second camera are guaranteed to be consistent at half of the exposure time, and the synchronization effect is good.
As shown in fig. 5, in one embodiment, the step of sending the processed first image to the second processing unit comprises the steps of:
Step 502, sending an image acquisition instruction to the first processing unit through a kernel of the second processing unit that runs in a first operation mode, where the first operation mode is a trusted execution environment.
In one embodiment, the second processing unit in the electronic device may include two operation modes. The first operation mode may be a TEE (trusted execution environment), which has a high security level; the second operation mode may be a REE (rich execution environment), which has a lower security level. After receiving a data acquisition request sent by an application program, the second processing unit can send an image acquisition instruction to the first processing unit in the first operation mode. When the second processing unit is a single-core CPU, that core can be switched directly from the second operation mode to the first operation mode; when the second processing unit has multiple cores, one core can be switched from the second operation mode to the first operation mode while the other cores remain in the second operation mode, and the image acquisition instruction is sent to the first processing unit through the core running in the first operation mode.
Step 504, the first processing unit sends the processed first image to the kernel of the second processing unit running in the first operation mode.
After the first processing unit processes the collected first image, it can send the processed first image to the kernel running in the first operation mode, so that the processing is guaranteed to remain in a trusted execution environment at all times, improving security. In the kernel running in the first operation mode, the second processing unit may obtain a target image from the processed first image and process the target image according to the requirements of the application program. For example, the second processing unit may perform face detection on the target image in the kernel running in the first operation mode.
In one embodiment, since the kernel running in the first operation mode is unique, the second processing unit performs face detection on the target image in the TEE environment serially, carrying out face recognition, face matching, living body detection and the like one by one. The second processing unit may first perform face recognition on the target image; when a face is recognized, it matches the face contained in the target image against a pre-stored face to determine whether they are the same face. If they are the same face, living body detection is then performed on the face according to the target image, preventing the collected face from being a two-dimensional planar face or the like. When no face is recognized, face matching and living body detection need not be performed, which reduces the processing load on the second processing unit.
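Since only one kernel runs in the first operation mode, the three checks run serially and later stages are skipped when an earlier one fails. A sketch of that pipeline, with the detection stages passed in as callables (all helper names are illustrative, not from the patent):

```python
def verify_face(target_image, stored_face, detect, match, is_live) -> bool:
    """Serial TEE-side pipeline: recognition -> matching -> liveness.
    `detect` returns a face object or None; `match` and `is_live` return
    booleans. Each later stage runs only if the previous one succeeded,
    reducing load on the second processing unit."""
    face = detect(target_image)        # face recognition
    if face is None:                   # no face: skip matching and liveness
        return False
    if not match(face, stored_face):   # face matching against stored face
        return False
    return is_live(target_image)       # living body detection
```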
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
As shown in fig. 6, in an embodiment, the method for controlling shooting further includes the following steps:
step 602, obtaining an application type of an application program sending a data obtaining request.
Step 604, determining the security level of the application program according to the application type.
When an application program of the electronic device sends a data acquisition request to the second processing unit, the second processing unit may obtain the application type of the application program and the security level corresponding to that application type. Application types may include, but are not limited to, unlock applications, payment applications, camera applications, beauty applications, and the like. Different application types may have different security levels; for example, the security levels corresponding to payment applications and unlock applications may be high, while those corresponding to camera applications and beauty applications may be low, but this is not limiting.
Step 606, selecting a data transmission channel corresponding to the security level.
The second processing unit may select a data transmission channel corresponding to the security level of the application program. The data transmission channels may include, but are not limited to, a secure channel and a non-secure channel, where the secure channel may correspond to applications with a higher security level and the non-secure channel to applications with a lower security level. For example, a payment application may correspond to the secure channel and a beauty application to the non-secure channel. In the secure channel, the transmitted data can be encrypted to prevent the data from being leaked or stolen.
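The mapping from application type to channel can be sketched as follows (the types and levels are the examples given above; the lookup table itself is illustrative):

```python
# Example security levels per application type (illustrative, per the text).
APP_SECURITY = {
    "payment": "high",
    "unlock": "high",
    "camera": "low",
    "beauty": "low",
}

def select_channel(app_type: str) -> str:
    """Secure channel (with encrypted transmission) for high-security
    applications, non-secure channel otherwise; unknown types default
    to the lower level here as an assumption."""
    level = APP_SECURITY.get(app_type, "low")
    return "secure" if level == "high" else "non-secure"
```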
Step 608, when the data transmission channel is a secure channel, the first processing unit sends the processed first image to the kernel of the second processing unit running in the first operation mode.
When the data transmission channel is a secure channel, the first processing unit may send the processed first image to the kernel of the second processing unit running in the first operation mode. In that kernel, the second processing unit can obtain a target image, including a target infrared image, a target speckle image or a target depth map, from the processed first image. The second processing unit can perform face detection on the target image in the kernel running in the first operation mode, carrying out face recognition, face matching, living body detection and the like one by one in a serial manner. According to the requirements of the application program, the second processing unit can transmit the required data to the application program through the secure channel. For example, if the application program needs to perform face detection, the second processing unit may transmit the face detection result to the application program through the secure channel; if the application program needs to obtain the depth information of the face, the second processing unit can transmit the target depth map to the application program through the secure channel.
Step 610, when the data transmission channel is a non-secure channel, the first processing unit sends the processed first image to a camera driver in a second operation mode, where the second operation mode is a rich execution environment.
When the data transmission channel is a non-secure channel, the first processing unit can send the processed first image to the camera driver, which can run on a kernel of the second processing unit in the second operation mode. The second processing unit can perform face detection on a target image through the camera driver, where the target image can be obtained from the processed first image. In the REE environment the second processing unit can perform face detection in parallel, carrying out face recognition, face matching, living body detection and the like on the target image in multiple kernels running in the second operation mode, which can improve data processing efficiency. The camera driver can transmit the data required by the application program to the application program according to its requirements.
In one embodiment, the second processing unit may obtain the security level of the application program that sends the data acquisition request and determine an image accuracy corresponding to that security level. The higher the image accuracy, the sharper the corresponding image and the more information it contains. The second processing unit may send image data corresponding to the image accuracy to the application program; for example, when the second processing unit sends the target depth map to an application program, an application with a high security level may receive a target depth map with high image accuracy, and an application with a low security level may receive one with low image accuracy. Alternatively, the second processing unit may adjust the image accuracy by adjusting the image resolution: the higher the resolution, the higher the image accuracy, and the lower the resolution, the lower the image accuracy. The number of diffraction points of the laser lamp can also be controlled: the higher the image accuracy, the more diffraction points, and the lower the image accuracy, the fewer diffraction points. It will be appreciated that other ways of controlling the image accuracy may be used, not limited to the above. Adjusting the image accuracy according to the security level of the application program can improve the security of the image data.
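Adjusting image accuracy via resolution can be sketched as follows (the full resolution of 640x480 and the 1/2 downscale factor are assumptions for illustration, not values from the patent):

```python
def depth_map_resolution(security_level: str,
                         full_res: tuple = (640, 480)) -> tuple:
    """High-security applications receive the full-resolution target depth
    map; low-security applications receive a downscaled, lower-accuracy one
    (here assumed to be half resolution in each dimension)."""
    if security_level == "high":
        return full_res
    return (full_res[0] // 2, full_res[1] // 2)
```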
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so that the security of data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
In one embodiment, there is provided a method of controlling photographing, including the steps of:
and (1) when the second processing unit receives the data acquisition request, controlling the second camera to acquire a second image according to the data acquisition request, and sending an image acquisition instruction to the first processing unit, wherein the image acquisition instruction is used for instructing the first processing unit to control the first camera to acquire a first image.
In one embodiment, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the second processing unit is further connected with the first camera and the second camera through signal lines respectively.
And (2) when the second processing unit receives a synchronous signal sent by the second camera, acquiring the first exposure time of the first camera and the second exposure time of the second camera, wherein the synchronous signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image.
And (3) calculating the time delay duration according to the first exposure time and the second exposure time.
In one embodiment, step (3) comprises: and calculating the exposure time difference of the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay time.
In one embodiment, step (3) comprises: respectively calculating a first middle exposure time of the first exposure duration and a second middle exposure time of the second exposure duration; and determining the difference value between the first intermediate exposure time and the second intermediate exposure time, and taking the difference value as the delay time length.
And (4) when the time length of the synchronization signal received by the second processing unit reaches the delay time length, forwarding the synchronization signal to the first camera, wherein the synchronization signal is used for indicating the first camera to start exposure and acquiring a first image.
And (5) processing the first image through the first processing unit, and sending the processed first image to the second processing unit.
In one embodiment, step (1) comprises: sending an image acquisition instruction to the first processing unit through a kernel in the second processing unit that runs in a first operation mode, the first operation mode being a trusted execution environment; and step (5) comprises: sending, by the first processing unit, the processed first image to the kernel in the second processing unit running in the first operation mode.
In one embodiment, step (5) comprises: acquiring the application type of the application program that sends the data acquisition request; determining the security level of the application program according to the application type; selecting a data transmission channel corresponding to the security level; when the data transmission channel is a secure channel, sending, by the first processing unit, the processed first image to the kernel in the second processing unit running in the first operation mode; and when the data transmission channel is a non-secure channel, sending, by the first processing unit, the processed first image to a camera driver in a second operation mode, the second operation mode being a rich execution environment.
In an embodiment, the method for controlling shooting further includes: acquiring the security level of an application program sending a data acquisition request; determining an image precision corresponding to the security level; image data corresponding to the image accuracy is sent to the application.
In this embodiment, when the second processing unit receives the synchronization signal sent by the second camera, it calculates the delay duration according to the exposure durations of the two cameras, and forwards the synchronization signal to the first camera when the time elapsed since receiving it reaches the delay duration. The time point for forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, and the synchronization moment of the two cameras is dynamically adjusted by the second processing unit. The synchronization effect is good: even when the difference between the exposure durations of the two cameras is large, the content of the images collected by the two cameras can still be kept consistent.
It should be understood that, although the steps in the flowcharts described above are shown in a sequence indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and which need not be performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, an electronic device is provided and includes a first processing unit, a second processing unit, and a camera module, wherein the first processing unit is connected to the second processing unit and the camera module, respectively. The camera module comprises a first camera and a second camera, the first processing unit is connected with the first camera through a control line, and the second processing unit is connected with the second camera through a control line. The first processing unit is connected with the second processing unit, and the second processing unit is further connected with the first camera and the second camera through signal lines respectively.
The second processing unit is configured, when receiving a data acquisition request, to control the second camera to collect a second image according to the data acquisition request and to send an image acquisition instruction to the first processing unit.
The first processing unit is configured to control the first camera to collect the first image according to the image acquisition instruction.
The second camera is configured to send a synchronization signal to the second processing unit at the moment it starts to expose when collecting each frame of the second image.
The second processing unit is further configured, when receiving the synchronization signal sent by the second camera, to acquire the first exposure duration of the first camera and the second exposure duration of the second camera, and to calculate the delay duration according to the first exposure duration and the second exposure duration.
The second processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since receiving the synchronization signal reaches the delay duration.
The first camera is configured to start exposure according to the synchronization signal and collect the first image.
The first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
In this embodiment, when the second processing unit receives the synchronization signal sent by the second camera, it calculates the delay duration according to the exposure durations of the two cameras, and forwards the synchronization signal to the first camera when the time elapsed since receiving it reaches the delay duration. The time point for forwarding the synchronization signal is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, and the synchronization moment of the two cameras is dynamically adjusted by the second processing unit. The synchronization effect is good: even when the difference between the exposure durations of the two cameras is large, the content of the images collected by the two cameras can still be kept consistent.
In an embodiment, the second processing unit is further configured to calculate an exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay time.
In an embodiment, the second processing unit is further configured to calculate a first intermediate exposure time of the first exposure time duration and a second intermediate exposure time of the second exposure time duration, respectively, determine a difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay time duration.
In this embodiment, the time point for forwarding the synchronization signal can be dynamically adjusted according to the exposure duration of the first camera and the second camera, so that the timing for synchronizing the first camera and the second camera can be dynamically adjusted, the first camera and the second camera are guaranteed to be consistent at half of the exposure time, and the synchronization effect is good.
In an embodiment, the second processing unit is further configured to send the image acquisition instruction to the first processing unit through a kernel of the second processing unit running in a first operation mode, where the first operation mode is a trusted execution environment.
The first processing unit is further configured to send the processed first image to the kernel of the second processing unit running in the first operation mode.
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
In an embodiment, the second processing unit is further configured to obtain an application type of an application program that sends the data obtaining request, determine a security level of the application program according to the application type, and select a data transmission channel corresponding to the security level.
The first processing unit is further configured to send the processed first image to the kernel of the second processing unit running in the first operation mode when the data transmission channel is a secure channel.
The first processing unit is further configured to send the processed first image to a camera driver in a second operation mode when the data transmission channel is a non-secure channel, where the second operation mode is a rich execution environment.
In one embodiment, the second processing unit is further configured to obtain a security level of an application program that sends the data obtaining request, determine an image accuracy corresponding to the security level, and send image data corresponding to the image accuracy to the application program.
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so that the security of data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
As shown in fig. 7, in one embodiment, an apparatus 700 for controlling photographing is provided, which includes a request receiving module 710, a signal receiving module 720, a calculating module 730, a signal forwarding module 740, and a processing module 750.
The request receiving module 710 is configured to, when the second processing unit receives the data obtaining request, control the second camera to collect the second image according to the data obtaining request, and send an image collecting instruction to the first processing unit, where the image collecting instruction is used to instruct the first processing unit to control the first camera to collect the first image.
In one embodiment, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the second processing unit is further connected with the first camera and the second camera through signal lines respectively.
The signal receiving module 720 is configured to, when the second processing unit receives a synchronization signal sent by the second camera, obtain a first exposure time of the first camera and a second exposure time of the second camera, where the synchronization signal is a signal sent at a time when the second camera starts to expose when acquiring each frame of the second image.
The calculating module 730 is configured to calculate a delay duration according to the first exposure time and the second exposure time.
The signal forwarding module 740 is configured to forward the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, where the synchronization signal is used to instruct the first camera to start exposure and acquire the first image.
The processing module 750 is configured to process the first image through the first processing unit, and send the processed first image to the second processing unit.
In this embodiment, when the second processing unit receives the synchronization signal sent by the second camera, the delay duration is calculated according to the exposure durations of the two cameras; when the time elapsed since the synchronization signal was received reaches the delay duration, the synchronization signal is forwarded to the first camera. The time point at which the synchronization signal is forwarded is thus dynamically adjusted according to the exposure durations of the first camera and the second camera, and the synchronization timing of the two cameras is dynamically adjusted by the second processing unit. The synchronization effect is good, and even when the exposure durations of the two cameras differ greatly, the content of the images collected by the two cameras can still be kept consistent.
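The forwarding logic described above can be sketched as follows; the class name and callback are illustrative assumptions for the sketch, not an API specified by the patent.

```python
import threading

class SyncForwarder:
    """Sketch of the second processing unit's role: on each synchronization
    signal from the second camera, wait for the computed delay duration and
    then forward the signal to the first camera (names are illustrative)."""

    def __init__(self, forward_to_first_camera, delay_seconds):
        self.forward = forward_to_first_camera
        self.delay = delay_seconds

    def on_sync_signal(self):
        # The second camera raises this at the moment it starts exposing a
        # frame; the signal is forwarded only after the delay has elapsed.
        timer = threading.Timer(self.delay, self.forward)
        timer.start()
        return timer
```

Here the delay would be the value produced by the calculation module from the two exposure durations.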
In an embodiment, the calculating module 730 is further configured to calculate the exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain the delay time.
In an embodiment, the calculating module 730 is further configured to calculate a first intermediate exposure time of the first exposure duration and a second intermediate exposure time of the second exposure duration, respectively, determine a difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay duration.
In this embodiment, the time point at which the synchronization signal is forwarded can be dynamically adjusted according to the exposure durations of the first camera and the second camera, so that the timing for synchronizing the two cameras is dynamically adjusted, the midpoints of the two cameras' exposure intervals are guaranteed to coincide, and the synchronization effect is good.
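As a concrete illustration of the two calculation methods above (assuming, per the claims, that the first exposure duration is the shorter one and that, without the delay, both cameras would start exposing at the sync moment), both methods yield the same delay; the function names and the millisecond values are assumptions for the sketch.

```python
def delay_by_half_difference(first_exposure_ms, second_exposure_ms):
    # Method 1: the exposure time difference divided by 2.
    return (second_exposure_ms - first_exposure_ms) / 2

def delay_by_mid_exposure(first_exposure_ms, second_exposure_ms, sync_ms=0):
    # Method 2: difference between the intermediate (mid-point) exposure
    # times, taking the sync moment as the common exposure start.
    first_mid = sync_ms + first_exposure_ms / 2
    second_mid = sync_ms + second_exposure_ms / 2
    return second_mid - first_mid

# With exposure durations of 10 ms and 30 ms, both methods delay the first
# camera by 10 ms, so the midpoints of the two exposure intervals coincide.
```

Delaying the shorter exposure by half the difference centers it inside the longer one, which is exactly the "consistent at half of the exposure time" condition.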
In an embodiment, the request receiving module 710 is further configured to send the image capturing instruction to the first processing unit through a kernel of the second processing unit operating in a first operation mode, where the first operation mode is a trusted operation environment.
The processing module 750 is further configured to send, by the first processing unit, the processed first image to a kernel of the second processing unit, which runs in the first running mode.
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
In one embodiment, the processing module 750 includes a type obtaining unit 752, a level determining unit 754, a selecting unit 756, and a sending unit 758.
A type obtaining unit 752, configured to obtain an application type of the application program that sends the data obtaining request.
A level determining unit 754, configured to determine a security level of the application according to the application type.
A selecting unit 756 for selecting a data transmission channel corresponding to the security level.
The sending unit 758 is configured to send, through the first processing unit, the processed first image to the kernel of the second processing unit that operates in the first operation mode when the data transmission channel is a secure channel.
The sending unit 758 is further configured to send, through the first processing unit, the processed first image to a camera driver in a second operation mode when the data transmission channel is a non-secure channel, where the second operation mode is a normal execution environment.
In one embodiment, the level determining unit 754 is further configured to obtain a security level of an application program that sends the data obtaining request, and determine an image precision corresponding to the security level.
The sending unit 758 is further configured to send image data corresponding to the image precision to the application program.
In this embodiment, the corresponding data channel is selected according to the security level of the application program to transmit data, so that the security of data transmission can be improved in the secure channel, and the data processing efficiency can be improved in the non-secure channel.
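A minimal sketch of the routing described above. The application types, level names, precision fractions, and destination labels are all illustrative assumptions; the patent specifies only that a security level is derived from the application type and that it selects the channel and the image precision.

```python
# Illustrative mappings (assumed values, not specified by the patent).
APP_TYPE_TO_LEVEL = {"unlock": "high", "payment": "high", "beauty": "low"}
LEVEL_TO_CHANNEL = {"high": "secure", "low": "non-secure"}
LEVEL_TO_PRECISION = {"high": 1.0, "low": 0.5}  # fraction of full resolution

def route_request(app_type):
    """Return (channel, precision, destination) for an application's request."""
    level = APP_TYPE_TO_LEVEL.get(app_type, "low")
    channel = LEVEL_TO_CHANNEL[level]
    precision = LEVEL_TO_PRECISION[level]
    if channel == "secure":
        # Secure channel: image goes to the trusted-mode kernel.
        destination = "trusted kernel of the second processing unit"
    else:
        # Non-secure channel: image goes to the ordinary camera driver.
        destination = "camera driver in the normal execution environment"
    return channel, precision, destination
```

For example, a face-unlock request would take the secure channel at full precision, while a beautification app would take the non-secure channel at reduced precision.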
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, implements the above-described method of controlling photographing.
In one embodiment, a computer program product is provided that comprises a computer program, which when run on a computer device causes the computer device to carry out the above-described method of controlling shooting.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link (Synchlink) DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments express only several embodiments of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A method of controlling shooting, comprising:
when a second processing unit receives a data acquisition request, controlling a second camera to acquire a second image according to the data acquisition request, and sending an image acquisition instruction to a first processing unit, wherein the image acquisition instruction is used for instructing the first processing unit to control the first camera to acquire a first image;
when the second processing unit receives a synchronization signal sent by the second camera, acquiring a first exposure time of the first camera and a second exposure time of the second camera, wherein the synchronization signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
calculating a delay duration according to the first exposure time and the second exposure time, wherein the delay duration is the length of time by which the moment at which the first camera starts exposure is delayed, and the first exposure time of the first camera is less than the second exposure time of the second camera;
forwarding the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, wherein the synchronization signal is used for instructing the first camera to start exposure and acquire a first image;
and processing the first image through the first processing unit, and sending the processed first image to the second processing unit.
2. The method according to claim 1, wherein the first processing unit is connected to the first camera via a control line, the second processing unit is connected to the second camera via a control line, the first processing unit is connected to the second processing unit, and the second processing unit is further connected to the first camera and the second camera via signal lines, respectively.
3. The method of claim 1, wherein calculating a delay time based on the first exposure time and the second exposure time comprises:
and calculating the exposure time difference of the first exposure time and the second exposure time, and dividing the exposure time difference by 2 to obtain the delay time.
4. The method of claim 1, wherein calculating a delay time based on the first exposure time and the second exposure time comprises:
respectively calculating a first middle exposure time of the first exposure duration and a second middle exposure time of the second exposure duration;
and determining the difference value between the first intermediate exposure time and the second intermediate exposure time, and taking the difference value as the delay time length.
5. The method according to any one of claims 1 to 4, wherein the sending of the image acquisition instruction to the first processing unit comprises:
sending an image acquisition instruction to the first processing unit through a kernel in the second processing unit, wherein the kernel runs in a first running mode, and the first running mode is a trusted running environment;
the sending the processed first image to the second processing unit includes:
and the first processing unit sends the processed first image to a kernel which runs in a first running mode in the second processing unit.
6. The method of claim 5, wherein sending the processed first image to the second processing unit comprises:
acquiring the application type of an application program which sends the data acquisition request;
determining the security level of the application program according to the application type;
selecting a data transmission channel corresponding to the security level;
when the data transmission channel is a secure channel, the first processing unit sends the processed first image to a kernel of the second processing unit, which runs in a first running mode;
and when the data transmission channel is a non-secure channel, the first processing unit sends the processed first image to a camera driver in a second operation mode, wherein the second operation mode is a normal execution environment.
7. The method of claim 1, further comprising:
acquiring the security level of an application program sending the data acquisition request;
determining an image precision corresponding to the security level;
and sending image data corresponding to the image precision to the application program.
8. An apparatus for controlling photographing, comprising:
the request receiving module is used for controlling the second camera to collect a second image according to the data acquisition request and sending an image collection instruction to the first processing unit when the second processing unit receives the data acquisition request, wherein the image collection instruction is used for instructing the first processing unit to control the first camera to collect a first image;
the signal receiving module is used for acquiring a first exposure time length of the first camera and a second exposure time length of the second camera when the second processing unit receives a synchronous signal sent by the second camera, wherein the synchronous signal is a signal sent at the moment of starting exposure when the second camera collects each frame of second image;
the calculation module is used for calculating a delay duration according to the first exposure duration and the second exposure duration, wherein the delay duration is the length of time by which the moment at which the first camera starts exposure is delayed, and the first exposure duration of the first camera is shorter than the second exposure duration of the second camera;
the signal forwarding module is configured to forward the synchronization signal to the first camera when the time elapsed since the second processing unit received the synchronization signal reaches the delay duration, where the synchronization signal is used to instruct the first camera to start exposure and acquire a first image;
and the processing module is used for processing the first image through the first processing unit and sending the processed first image to the second processing unit.
9. An electronic device is characterized by comprising a first processing unit, a second processing unit and a camera module, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a first camera and a second camera, the first processing unit is connected with the first camera through a control line, the second processing unit is connected with the second camera through a control line, the first processing unit is connected with the second processing unit, and the second processing unit is also connected with the first camera and the second camera through signal lines respectively;
the second processing unit is used for controlling the second camera to acquire a second image according to the data acquisition request and sending an image acquisition instruction to the first processing unit when the data acquisition request is received;
the first processing unit is used for controlling the first camera to acquire a first image according to the image acquisition instruction;
the second camera is used for sending a synchronization signal to the second processing unit at the moment of starting exposure when each frame of second image is collected;
the second processing unit is further configured to, when it receives a synchronization signal sent by the second camera, obtain a first exposure duration of the first camera and a second exposure duration of the second camera, and calculate a delay duration according to the first exposure duration and the second exposure duration, where the delay duration is the length of time by which the moment at which the first camera starts exposure is delayed, and the first exposure duration of the first camera is shorter than the second exposure duration of the second camera;
the second processing unit is further configured to forward the synchronization signal to the first camera when the time elapsed since it received the synchronization signal reaches the delay duration;
the first camera is used for starting exposure according to the synchronous signal and acquiring a first image;
the first processing unit is further configured to process the first image and send the processed first image to the second processing unit.
10. The electronic device of claim 9, wherein the second processing unit is further configured to calculate an exposure time difference between the first exposure time and the second exposure time, and divide the exposure time difference by 2 to obtain a delay time.
11. The electronic device according to claim 9, wherein the second processing unit is further configured to calculate a first intermediate exposure time of the first exposure time duration and a second intermediate exposure time of the second exposure time duration, respectively, determine a difference between the first intermediate exposure time and the second intermediate exposure time, and use the difference as the delay time duration.
12. The electronic device according to claim 9, wherein the second processing unit is further configured to send an image capture instruction to the first processing unit through a kernel of the second processing unit operating in a first operating mode, where the first operating mode is a trusted operating environment;
the first processing unit is further configured to send the processed first image to a kernel of the second processing unit, where the kernel operates in a first operation mode.
13. The electronic device according to claim 12, wherein the second processing unit is further configured to obtain an application type of an application program that sends the data obtaining request, determine a security level of the application program according to the application type, and select a data transmission channel corresponding to the security level;
the first processing unit is further configured to send the processed first image to a kernel operating in a first operating mode in the second processing unit when the data transmission channel is a secure channel;
the first processing unit is further configured to send the processed first image to a camera driver in a second operation mode when the data transmission channel is a non-secure channel, where the second operation mode is a normal execution environment.
14. The electronic device according to claim 9, wherein the second processing unit is further configured to acquire a security level of an application program that sends the data acquisition request, determine an image accuracy corresponding to the security level, and send image data corresponding to the image accuracy to the application program.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202010004323.3A 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium Active CN110971836B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010004323.3A CN110971836B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010004323.3A CN110971836B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN201810401344.1A CN108650472B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810401344.1A Division CN108650472B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN110971836A CN110971836A (en) 2020-04-07
CN110971836B true CN110971836B (en) 2021-07-09

Family

ID=63748675

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010004323.3A Active CN110971836B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN201810401344.1A Active CN108650472B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810401344.1A Active CN108650472B (en) 2018-04-28 2018-04-28 Method and device for controlling shooting, electronic equipment and computer-readable storage medium

Country Status (1)

Country Link
CN (2) CN110971836B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3627827B1 (en) 2018-04-28 2024-05-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for controlling photographing, electronic device, and computer readable storage medium
CN111107248B (en) * 2018-10-25 2022-02-08 北京图森智途科技有限公司 Multi-channel video acquisition synchronization system and method and acquisition controller
CN110312056B (en) * 2019-06-10 2021-09-14 青岛小鸟看看科技有限公司 Synchronous exposure method and image acquisition equipment
CN114143527B (en) * 2021-11-09 2023-05-26 长沙眸瑞网络科技有限公司 Sectional shooting instruction control method, device, system, electronic device and storage medium
CN114265471A (en) * 2021-11-12 2022-04-01 北京罗克维尔斯科技有限公司 Time synchronization method, device, electronic equipment, vehicle and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753816A (en) * 2008-12-12 2010-06-23 三洋电机株式会社 Image sensing apparatus and image sensing method
CN107231533A (en) * 2017-06-12 2017-10-03 深圳市瑞立视多媒体科技有限公司 A kind of synchronous exposure method, device and terminal device
CN107948463A (en) * 2017-11-30 2018-04-20 北京图森未来科技有限公司 A kind of camera synchronous method, apparatus and system
CN107948515A (en) * 2017-11-30 2018-04-20 北京图森未来科技有限公司 A kind of camera synchronous method and device, binocular camera

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7046292B2 (en) * 2002-01-16 2006-05-16 Hewlett-Packard Development Company, L.P. System for near-simultaneous capture of multiple camera images
JP5025526B2 (en) * 2008-02-29 2012-09-12 ブラザー工業株式会社 Image forming apparatus
EP2449762A4 (en) * 2009-06-30 2015-07-15 Nokia Corp Enhanced timer functionality for camera systems
CN201608788U (en) * 2010-01-14 2010-10-13 宝山钢铁股份有限公司 Industrial image collecting device with adjustable camera frame rate
JP5517668B2 (en) * 2010-02-19 2014-06-11 キヤノン株式会社 COMMUNICATION DEVICE, IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
CN103416071B (en) * 2011-03-08 2015-11-25 瑞萨电子株式会社 Camera head
CN102810139B (en) * 2012-06-29 2016-04-06 宇龙计算机通信科技(深圳)有限公司 Secure data operation method and communication terminal
CN103338334A (en) * 2013-07-17 2013-10-02 中测新图(北京)遥感技术有限责任公司 System and method for controlling multi-cameral digital aerial photographic camera synchronous exposure
CN107395998A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image capturing method and mobile terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753816A (en) * 2008-12-12 2010-06-23 三洋电机株式会社 Image sensing apparatus and image sensing method
CN107231533A (en) * 2017-06-12 2017-10-03 深圳市瑞立视多媒体科技有限公司 A kind of synchronous exposure method, device and terminal device
CN107948463A (en) * 2017-11-30 2018-04-20 北京图森未来科技有限公司 A kind of camera synchronous method, apparatus and system
CN107948515A (en) * 2017-11-30 2018-04-20 北京图森未来科技有限公司 A kind of camera synchronous method and device, binocular camera

Also Published As

Publication number Publication date
CN108650472B (en) 2020-02-04
CN110971836A (en) 2020-04-07
CN108650472A (en) 2018-10-12

Similar Documents

Publication Publication Date Title
CN110248111B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN110971836B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN110324521B (en) Method and device for controlling camera, electronic equipment and storage medium
CN108764052B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108804895B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108805024B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN110191266B (en) Data processing method and device, electronic equipment and computer readable storage medium
EP3627827B1 (en) Method for controlling photographing, electronic device, and computer readable storage medium
EP3614659B1 (en) Image processing method, electronic apparatus, and computer-readable storage medium
CN111126146A (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
WO2019196683A1 (en) Method and device for image processing, computer-readable storage medium, and electronic device
CN108573170B (en) Information processing method and device, electronic equipment and computer readable storage medium
CN111523499B (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
EP3624006A1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device
CN108833887B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN108711054B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP3672223B1 (en) Data processing method, electronic device, and computer-readable storage medium
CN108830141A (en) Image processing method, device, computer readable storage medium and electronic equipment
US11218650B2 (en) Image processing method, electronic device, and computer-readable storage medium
EP3621294B1 (en) Method and device for image capture, computer readable storage medium and electronic device
CN109120846B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108810516B (en) Data processing method and device, electronic equipment and computer readable storage medium
EP3644261B1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant