CN108833885B - Image processing method, image processing device, computer-readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN108833885B
Authority
CN
China
Prior art keywords
image
type identifier
camera
processing unit
light emitter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810786074.0A
Other languages
Chinese (zh)
Other versions
CN108833885A (en)
Inventor
欧锦荣
周海涛
郭子青
谭筱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810786074.0A priority Critical patent/CN108833885B/en
Publication of CN108833885A publication Critical patent/CN108833885A/en
Priority to PCT/CN2019/082560 priority patent/WO2020015403A1/en
Priority to EP19806080.8A priority patent/EP3621294B1/en
Priority to US16/671,840 priority patent/US20200068127A1/en
Application granted granted Critical
Publication of CN108833885B publication Critical patent/CN108833885B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to an image processing method, an image processing device, a computer-readable storage medium and an electronic device. The method comprises the following steps: an application program sends an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of the target image that the application program needs to acquire; the camera driver controls the camera to turn on according to the image acquisition instruction; the camera driver generates a control instruction according to the type identifier and sends the control instruction to a processing unit; the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction; and a target image formed when the light emitter illuminates an object is acquired through the camera. The image processing method, the image processing device, the computer-readable storage medium and the electronic device can acquire images of different types and meet the personalized requirements of users.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
A user can take photos and videos through a camera installed on an electronic device, and can also perform authentication operations such as payment and unlocking with images collected by the camera. An electronic device may be equipped with cameras of different types installed at different positions, and different cameras are controlled to collect images. For example, during payment, an image may be collected through a front camera; when taking a photo, an image may be collected through a rear camera.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer readable storage medium and electronic equipment, which can acquire images of different types and meet personalized requirements.
A method of image processing, the method comprising:
an application program sends an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of the target image that the application program needs to acquire;
the camera driver controls the camera to turn on according to the image acquisition instruction;
the camera driver generates a control instruction according to the type identifier and sends the control instruction to a processing unit;
the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction; and
a target image formed when the light emitter illuminates an object is acquired through the camera.
An image processing apparatus, the apparatus comprising:
an instruction initiating module, used for an application program to send an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of the target image that the application program needs to acquire;
a camera control module, used for the camera driver to control the camera to turn on according to the image acquisition instruction;
an instruction sending module, used for the camera driver to generate a control instruction according to the type identifier and send the control instruction to a processing unit;
a light emitter control module, used for the processing unit to turn on the corresponding light emitter according to the type identifier contained in the control instruction; and
an image acquisition module, used for acquiring, through the camera, a target image formed when the light emitter illuminates an object.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the following steps:
an application program sends an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of the target image that the application program needs to acquire;
the camera driver controls the camera to turn on according to the image acquisition instruction;
the camera driver generates a control instruction according to the type identifier and sends the control instruction to a processing unit;
the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction; and
a target image formed when the light emitter illuminates an object is acquired through the camera.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the following steps:
an application program sends an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of the target image that the application program needs to acquire;
the camera driver controls the camera to turn on according to the image acquisition instruction;
the camera driver generates a control instruction according to the type identifier and sends the control instruction to a processing unit;
the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction; and
a target image formed when the light emitter illuminates an object is acquired through the camera.
According to the image processing method, the image processing device, the computer-readable storage medium and the electronic device, after the application program initiates the image acquisition instruction, the image acquisition instruction is sent to the camera driver, and the camera driver controls the camera to turn on. The camera driver then sends a control instruction to the processing unit, the processing unit turns on the light emitter according to the control instruction, and an image formed when the light emitter illuminates an object is collected through the camera. The image acquisition instruction initiated by the application program can thus turn on the camera through the camera driver and then turn on different light emitters through the processing unit, so that different types of target images are acquired and the personalized requirements of users are met.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a diagram of an application scenario of an image processing method in one embodiment;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is a schematic structural diagram of controlling a light emitter in one embodiment;
FIG. 5 is a schematic diagram of an electronic device acquiring a target image in one embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a flowchart of an image processing method in yet another embodiment;
FIG. 8 is a schematic diagram of computing depth information in one embodiment;
FIG. 9 is a diagram of hardware components for implementing an image processing method in one embodiment;
fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram illustrating an application scenario of an image processing method according to an embodiment. As shown in fig. 1, the application scenario includes an electronic device 104, and a camera and a light emitter may be installed in the electronic device 104, and several application programs may also be installed. When the application program initiates an image acquisition instruction, the application program sends the image acquisition instruction to the camera driver, and the camera driver can control the camera to be opened according to the image acquisition instruction. The camera driver generates a control instruction according to the type identifier contained in the image acquisition instruction, and sends the control instruction to the processing unit. The processing unit opens the corresponding light emitter according to the type identifier contained in the control instruction, and then the camera collects a target image formed when the light emitter irradiates the object. The electronic device 104 may be a smart phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 210. Wherein:
step 202, the application program sends an image acquisition instruction to the camera driver, wherein the image acquisition instruction comprises a type identifier, and the type identifier is used for indicating the type of a target image required to be acquired by the application program.
A plurality of application programs may be installed on the electronic device. An application program is software written in the electronic device for a particular application purpose, and provides the service a user needs. For example, the user may play games through a game application, complete payment transactions through a payment application, play music through a music application, and so on.
When an application program needs to collect an image, it initiates an image acquisition instruction, and the electronic device obtains an image according to the image acquisition instruction. The image acquisition instruction is an instruction that triggers an image acquisition operation. For example, when a user wants to take a picture, the user clicks the photographing button; when the electronic device detects that the photographing button is pressed, it generates an image acquisition instruction and calls the camera to acquire an image. When the user needs to perform payment verification with the face, the user clicks the payment button and points the face at the camera; after the face image is acquired, the electronic device can perform payment verification.
Specifically, the image acquisition instruction initiated by the application program may include an initiation time, a type identifier, an application identifier and the like, where the initiation time is the time when the application program initiates the image acquisition instruction, the type identifier indicates the type of the target image that the application program needs to acquire, and the application identifier identifies the application program that initiates the image acquisition instruction. When the electronic device detects that an application program initiates an image acquisition instruction, it sends the image acquisition instruction to the camera driver, and the camera driver controls the switching on and off of the camera.
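For illustration, the fields above (plus the mode identifier introduced in a later embodiment) can be modeled as a small data structure. The following C sketch is not part of the patent; the names, enum values and field widths are assumptions.

```c
#include <stdint.h>
#include <time.h>

/* Illustrative type identifiers; the patent distinguishes infrared,
 * speckle and depth images but does not fix numeric values. */
typedef enum {
    IMG_TYPE_INFRARED = 1,   /* first type identifier  */
    IMG_TYPE_SPECKLE  = 2,   /* second type identifier */
    IMG_TYPE_DEPTH    = 3    /* third type identifier  */
} image_type_t;

/* Mode identifier: security level of the image acquisition instruction. */
typedef enum {
    MODE_NON_SECURE = 0,
    MODE_SECURE     = 1
} capture_mode_t;

/* Image acquisition instruction: initiation time, type identifier,
 * mode identifier and application identifier. */
typedef struct {
    time_t         initiation_time;  /* when the app issued the instruction */
    image_type_t   type_id;          /* kind of target image requested      */
    capture_mode_t mode_id;          /* secure / non-secure                 */
    uint32_t       app_id;           /* identifies the calling application  */
} image_capture_cmd_t;
```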
Step 204, the camera driver controls the camera to turn on according to the image acquisition instruction.
It should be noted that cameras may be classified into laser cameras, visible light cameras and the like according to the images they acquire: a laser camera acquires an image formed when laser light illuminates an object, and a visible light camera acquires an image formed when visible light illuminates an object. The electronic device may be equipped with several cameras, and the installation positions are not limited.
For example, one camera may be installed on a front panel of the electronic device, two cameras may be installed on a back panel of the electronic device, and the cameras may be installed in an embedded manner inside the electronic device and then opened by rotating or sliding. Specifically, a front camera and a rear camera can be mounted on the electronic device, the front camera and the rear camera can acquire images from different viewing angles, the front camera can acquire images from a front viewing angle of the electronic device, and the rear camera can acquire images from a back viewing angle of the electronic device.
After the image acquisition instruction is sent to the camera driver, the camera driver can control the camera to turn on according to the image acquisition instruction. Specifically, after receiving the image acquisition instruction, the camera driver inputs a control signal to the camera and turns the camera on through that control signal. For example, a pulse signal is input to the camera, and the camera is turned on by the input pulse signal.
And step 206, the camera driver generates a control instruction according to the type identifier and sends the control instruction to the processing unit.
And step 208, the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction.
In the embodiment provided by the application, when the camera is turned on to acquire an image, the electronic device turns on the light emitter at the same time. The light emitter emits light, and when that light illuminates an object, the image so formed is collected through the camera. For example, the camera may be a laser camera and the light emitter may be a laser emitter; the laser emitter generates laser light, and an infrared image formed when the laser light illuminates an object is collected by the laser camera.
In particular, different types of light emitters emit different types of light. For example, the light emitters may include a flash lamp, which generates visible light; a floodlight, which generates laser light; and a laser lamp, which generates laser speckle. The laser speckle is formed by diffracting laser light through a diffraction element.
When the application program generates the image acquisition instruction, the type identifier is written into the image acquisition instruction, and the type identifier indicates the type of the image to be acquired. Different image types correspond to different light emitters being turned on. Specifically, the processing unit is connected to the light emitters; the camera driver generates a control instruction according to the type identifier, and after receiving the control instruction the processing unit turns on the light emitter corresponding to the type identifier. For example, when a visible light image is acquired, the flash lamp is turned on; when an infrared image is acquired, the floodlight is turned on.
Step 210, acquiring, through the camera, a target image formed when the light emitter illuminates the object.
When the light emitter is turned on, light rays are generated, and when the light rays irradiate on an object, a target image formed when the light rays irradiate on the object can be collected through the camera. After the target image is acquired, the target image can be processed, and the processing mode is not limited. For example, the acquired target image may be sent to the application program, or may be processed by the processing unit of the electronic device, and the processing result is returned to the application program.
In one embodiment, in order to ensure the accuracy of the acquired target image, the acquisition time when the target image is acquired may be obtained and compared with the initiation time when the image acquisition instruction is initiated. If the time interval from the acquisition time to the initiation time is greater than the interval threshold, it is considered that a delay is generated in the process of acquiring the target image, that is, the acquisition of the target image is inaccurate, and the acquired target image can be directly discarded. If the time interval from the acquisition time to the initiation time is less than or equal to the interval threshold, the acquired target image is considered to be accurate, and the acquired target image can be processed.
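A minimal sketch of this freshness check, assuming a millisecond timebase and a one-second threshold (the patent only speaks of "an interval threshold"):

```c
#include <stdbool.h>

/* Assumed threshold: frames that arrive more than one second after the
 * instruction was initiated are treated as stale and discarded. */
#define INTERVAL_THRESHOLD_MS 1000L

/* Returns true if the captured target image may be processed,
 * false if it should be discarded as inaccurate. */
static bool frame_is_fresh(long initiation_ms, long acquisition_ms)
{
    return (acquisition_ms - initiation_ms) <= INTERVAL_THRESHOLD_MS;
}
```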
With the image processing method provided by the above embodiment, after the application program initiates the image acquisition instruction, the image acquisition instruction is sent to the camera driver, and the camera driver controls the camera to turn on. The camera driver then sends a control instruction to the processing unit, the processing unit turns on the light emitter according to the control instruction, and an image formed when the light emitter illuminates an object is collected through the camera. The image acquisition instruction initiated by the application program can thus turn on the camera through the camera driver and then turn on different light emitters through the processing unit, so that different types of target images are acquired and the personalized requirements of users are met.
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 320. Wherein:
step 302, the application program sends an image acquisition instruction to the camera driver, where the image acquisition instruction includes a type identifier and a mode identifier, and the type identifier is used to indicate the type of the target image that the application program needs to acquire.
It can be understood that the image acquisition instruction initiated by the application program may be initiated automatically by the electronic device when a condition is detected to be satisfied, or initiated manually by the user. For example, the electronic device automatically initiates an image acquisition instruction upon detecting that the user lifts the device. Alternatively, the user can trigger an image acquisition instruction by pressing a button. When the image acquisition instruction is generated, a type identifier and a mode identifier can be written into it; the mode identifier indicates the security level of the image acquisition instruction.
Specifically, the mode identifier may indicate a secure mode or a non-secure mode: an image acquired in the secure mode has a higher security requirement on the operating environment, while an image acquired in the non-secure mode has a lower security requirement. After the application program sends the image acquisition instruction to the camera driver, the camera driver can switch between data channels of different security according to the mode identifier, so that the camera is controlled to collect images through different data channels.
Step 304, the camera driver controls the camera to turn on according to the image acquisition instruction.
Step 306, if the mode identifier is the non-secure mode, the camera driver generates a control instruction according to the type identifier.
In embodiments provided herein, the operating environment of the electronic device may include a secure operating environment and a non-secure operating environment. For example, the non-secure operating environment may be a REE (Rich Execution Environment) and the secure operating environment may be a TEE (Trusted Execution Environment); the security of running in the REE is lower than that of running in the TEE. The camera driver runs in the non-secure operating environment, the processing unit is connected to the light emitter, and the light emitter is switched on and off through the processing unit.
The camera driver can send the control instruction to the processing unit, and the processing unit turns on the light emitter according to the control instruction. When the mode identifier in the image acquisition instruction is the non-secure mode, the camera driver sends the control instruction to the processing unit directly; when the mode identifier is the secure mode, in order to prevent other malicious programs from operating the light emitter, the camera driver sends the control instruction to a trusted application under the TEE, and the trusted application sends the control instruction to the processing unit.
Wherein the control instructions may control the processing unit to switch on the light emitter. Specifically, the type identifier may represent a type of an image to be acquired, a control instruction is generated according to the type identifier, a corresponding light emitter may be turned on according to the type identifier carried in the control instruction, and then a target image corresponding to the type identifier is acquired.
Step 308, sending the control instruction to the processing unit through the serial peripheral interface.
Specifically, after the camera driver generates the control instruction, the control instruction may be sent to the processing unit through a Serial Peripheral Interface (SPI). The processing unit may receive control instructions through either a serial peripheral interface or a secure serial peripheral interface (Secure SPI). When it is detected that the mode identifier in the image acquisition instruction is the secure mode, the electronic device switches the interface of the processing unit to the secure SPI and receives, through the secure SPI, the control instruction sent by the trusted application. When it is detected that the mode identifier in the image acquisition instruction is the non-secure mode, the interface of the processing unit is switched to the serial peripheral interface, and the control instruction sent by the camera driver is received through that interface.
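The routing of the control instruction by mode identifier might look like the sketch below. The transport functions are placeholders for the actual SPI write and TEE call, which the patent does not name; the enums repeat the values assumed in the earlier sketch.

```c
#include <stdio.h>

/* Repeated from the earlier sketch for completeness. */
typedef enum { IMG_TYPE_INFRARED = 1, IMG_TYPE_SPECKLE, IMG_TYPE_DEPTH } image_type_t;
typedef enum { MODE_NON_SECURE, MODE_SECURE } capture_mode_t;

/* Control instruction carried to the processing unit; the layout is an
 * assumption, only the type identifier is required by the description. */
typedef struct {
    image_type_t type_id;
} control_cmd_t;

/* Placeholder transports: a real implementation would write to the SPI
 * bus or invoke the trusted application in the TEE. */
static void spi_send_to_processing_unit(const control_cmd_t *cmd)
{
    printf("SPI: control instruction, type %d\n", (int)cmd->type_id);
}

static void trusted_app_forward(const control_cmd_t *cmd)
{
    printf("TEE: trusted app forwards type %d over secure SPI\n",
           (int)cmd->type_id);
}

/* Camera-driver side: non-secure instructions go straight to the
 * processing unit, secure ones go through the trusted application. */
void dispatch_control_cmd(capture_mode_t mode, const control_cmd_t *cmd)
{
    if (mode == MODE_NON_SECURE)
        spi_send_to_processing_unit(cmd);
    else
        trusted_app_forward(cmd);
}
```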
In step 310, the processing unit selects a corresponding controller according to the type identifier included in the control command, and inputs PWM to the controller.
In one embodiment, the processing unit is connected to a controller, which is connected to the light emitters. When the processing unit receives the control instruction, the corresponding controller can be selected according to the type identifier in the control instruction, then a Pulse Width Modulation (PWM) is input to the controller, and the light emitter is controlled to be turned on through the input PWM.
In step 312, the light emitter to which the controller is connected is turned on by Pulse Width Modulation (PWM).
In the embodiment of the application, the light emitter comprises the first light emitter and/or the second light emitter, and when different types of target images are acquired, the turned-on light emitters are different. Specifically, if the type identifier included in the control instruction is the first type identifier, the processing unit controls the first light emitter to be turned on; and if the type identifier contained in the control instruction is the second type identifier or the third type identifier, the processing unit controls the second light emitter to be turned on.
The camera can be a laser camera, the first light emitter can be a floodlight, and the second light emitter can be a laser lamp. Specifically, if the type identifier included in the control instruction is the first type identifier, the processing unit controls the floodlight to be turned on; and if the type identifier contained in the control instruction is the second type identifier or the third type identifier, the processing unit controls the laser lamp to be turned on.
For example, the first type identifier may be an infrared image identifier: the floodlight generates laser light, and an infrared image formed when the floodlight illuminates an object may be collected by the laser camera. The second type identifier may be a speckle image identifier: the laser lamp generates laser speckle, and a speckle image formed when the laser speckle illuminates an object may be collected by the laser camera. The third type identifier may be a depth image identifier: a speckle image formed when the laser speckle illuminates an object is collected by the laser camera, and a depth image is obtained by calculation from the speckle image.
In one embodiment, if the type identifier included in the control instruction is a first type identifier, the processing unit inputs a first Pulse Width Modulation (PWM) to the first controller, and turns on a first light emitter connected to the first controller through the first PWM; if the type identifier contained in the control command is the second type identifier or the third type identifier, the processing unit inputs a second Pulse Width Modulation (PWM) to the second controller, and the second light emitter connected with the second controller is turned on through the second PWM.
Fig. 4 is a schematic structural diagram of controlling a light emitter in one embodiment. As shown in fig. 4, the first light emitter may be a floodlight and the second light emitter may be a laser lamp. The processing unit may output two pulse width modulation signals, PWM1 and PWM2. When the processing unit outputs PWM1, the first controller is driven by PWM1 and controls the floodlight to turn on. When the processing unit outputs PWM2, the second controller is driven by PWM2 and controls the laser lamp to turn on.
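On the processing-unit side, the controller selection described above can be sketched as follows; pwm_output() is a stand-in for the MCU's PWM peripheral driver, and the type-identifier enum repeats the values assumed in the earlier sketch.

```c
#include <stdio.h>

/* Repeated from the earlier sketch for completeness. */
typedef enum { IMG_TYPE_INFRARED = 1, IMG_TYPE_SPECKLE, IMG_TYPE_DEPTH } image_type_t;

typedef enum {
    CTRL_FIRST  = 1,   /* first controller  -> floodlight */
    CTRL_SECOND = 2    /* second controller -> laser lamp */
} controller_id_t;

/* Stand-in for configuring the PWM peripheral that drives a controller. */
static void pwm_output(controller_id_t ctrl)
{
    printf("PWM%d asserted\n", (int)ctrl);
}

/* First type identifier  -> PWM1 -> first controller  -> floodlight.
 * Second or third type identifier -> PWM2 -> second controller -> laser lamp. */
void turn_on_emitter(image_type_t type_id)
{
    switch (type_id) {
    case IMG_TYPE_INFRARED:
        pwm_output(CTRL_FIRST);
        break;
    case IMG_TYPE_SPECKLE:
    case IMG_TYPE_DEPTH:
        pwm_output(CTRL_SECOND);
        break;
    }
}
```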
Step 314, acquiring, through the camera, an original image formed when the light emitter illuminates the object, and sending the original image to the processing unit.
In one embodiment, the camera cannot directly acquire the target image; the target image is obtained after certain processing of the acquired image. Specifically, an original image is collected through the camera and then sent to the processing unit, and the processing unit generates the target image from the original image. For example, when two cameras are used to capture images, the images captured by the two cameras need to be aligned to ensure consistency, so that the two captured images correspond to the same shooting scene; after alignment, the captured images can be processed normally.
In step 316, a target image is acquired from the original image in the processing unit.
After the camera collects the original image, the original image is returned to the processing unit, and the processing unit acquires the target image according to the original image. For example, an infrared image captured by a camera is subjected to alignment correction, and the corrected infrared image is taken as a target image. Or calculating a depth image according to the speckle image collected by the camera, and taking the depth image as a target image.
FIG. 5 is a schematic diagram of an electronic device acquiring a target image in one embodiment. As shown in fig. 5, the operating environment of the electronic device includes a REE and a TEE. After an application program under the REE initiates an image acquisition instruction, the image acquisition instruction is sent to the camera driver. The camera driver controls the camera to turn on and generates a control instruction according to the type identifier in the image acquisition instruction. When the camera driver determines that the mode identifier in the image acquisition instruction is the non-secure mode, it sends the control instruction directly to the processing unit, the processing unit turns on the light emitter, and the target image is acquired through the camera after the light emitter is turned on. When the camera driver determines that the mode identifier is the secure mode, it sends the control instruction to the trusted application in the TEE, and the trusted application sends the control instruction to the processing unit. The processing unit turns on the light emitter, and the target image is acquired through the camera after the light emitter is turned on.
Step 318, when the target image is determined to be of a preset image type, sending the target image to the camera driver.
After the target image is acquired, the processing unit sends the target image to the camera driver, and the target image is then processed in the camera driver. In the embodiments provided in the present application, the processing unit may include a first processing unit and a second processing unit; the first processing unit may be an MCU (Microcontroller Unit) and the second processing unit may be a CPU (Central Processing Unit). The first processing unit is connected to the light emitter, and the light emitter is switched on and off through the first processing unit.
Specifically, the image processing method may include: the application program sends an image acquisition instruction to the camera driver, wherein the image acquisition instruction comprises a type identifier; the camera driver controls the camera to turn on according to the image acquisition instruction; the camera driver generates a control instruction according to the type identifier and sends the control instruction to the first processing unit; the first processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction; and an original image formed when the light emitter illuminates the object is acquired through the camera.
After the camera collects the original image, the original image is sent to the first processing unit, and the first processing unit obtains the target image from the original image. When the mode identifier in the image acquisition instruction is the non-secure mode, the first processing unit returns the target image to the camera driver, and the camera driver processes the target image. When the mode identifier is the secure mode, the first processing unit returns the target image to the second processing unit, and the second processing unit processes the target image.
Step 320, processing the target image through the camera driver, and sending the processing result to the application program.
It will be appreciated that the images the application program needs to capture may be of one or more types; for example, an infrared image and a depth image may be captured at the same time, or only a visible light image may be captured. After the target image is collected, the camera driver can process it. For example, face recognition may be performed using the infrared image and the depth image: face detection and face authentication are performed on the infrared image, and living body detection is performed with the depth image to determine whether the detected face is a live face.
In one embodiment, when the mode identifier is the non-secure mode, the first processing unit determines whether the target image is of a preset image type before sending it to the camera driver. When the target image is of a preset image type, it is sent to the camera driver, the camera driver processes the target image, and the processing result is sent to the application program. For example, when the target image is a speckle image, the first processing unit does not send it to the camera driver; when the target image is an infrared image, a depth image or a visible light image, the first processing unit sends it to the camera driver.
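A sketch of this filtering and routing, with hypothetical helper names and the assumption that speckle images are the type withheld from the non-secure path:

```c
#include <stdbool.h>
#include <stdio.h>

/* Repeated from the earlier sketch for completeness. */
typedef enum { IMG_TYPE_INFRARED = 1, IMG_TYPE_SPECKLE, IMG_TYPE_DEPTH } image_type_t;
typedef enum { MODE_NON_SECURE, MODE_SECURE } capture_mode_t;

/* Assumed: speckle images are not among the preset image types that may
 * be returned on the non-secure path. */
static bool is_preset_image_type(image_type_t type_id)
{
    return type_id != IMG_TYPE_SPECKLE;
}

/* Placeholder transports for returning the target image. */
static void send_to_camera_driver(image_type_t t)          { printf("-> camera driver (type %d)\n", (int)t); }
static void send_to_second_processing_unit(image_type_t t) { printf("-> TEE CPU core (type %d)\n", (int)t); }

void return_target_image(capture_mode_t mode, image_type_t type_id)
{
    if (mode == MODE_NON_SECURE) {
        if (is_preset_image_type(type_id))
            send_to_camera_driver(type_id);       /* processed under REE */
        /* otherwise the image is not forwarded */
    } else {
        send_to_second_processing_unit(type_id);  /* processed under TEE */
    }
}
```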
The second processing unit may specifically be a CPU core under the TEE. The first processing unit is a processing unit independent of the CPU, and in the secure mode its input is controlled by the trusted application, so both the first processing unit and the second processing unit can be regarded as being in a secure operating environment. When the mode identifier is the secure mode, the first processing unit sends the target image to the second processing unit, since the second processing unit is also in a secure operating environment. After the second processing unit processes the target image, the processing result may be sent to the application program.
The second processing unit can also send the target image to the application program. Before doing so, the second processing unit determines the type of the target image, and sends the target image to the application program when it is of a preset image type.
Further, the second processing unit may compress the target image before sending it, and then send the compressed target image to the application program. Specifically: the second processing unit acquires the application level of the application program and obtains the corresponding compression level according to that application level; performs compression corresponding to the compression level on the target image; and sends the compressed target image to the application program. For example, applications may be classified into system security applications, system non-security applications, third-party security applications and third-party non-security applications, with compression levels correspondingly ranging from high to low.
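A possible mapping from application level to compression level; the level names follow the example above, while the numeric compression values and the function name are purely illustrative.

```c
/* Application levels named in the example: system security, system
 * non-security, third-party security, third-party non-security. */
typedef enum {
    APP_SYSTEM_SECURITY = 0,
    APP_SYSTEM_NON_SECURITY,
    APP_THIRD_PARTY_SECURITY,
    APP_THIRD_PARTY_NON_SECURITY
} app_level_t;

/* Assumed mapping: compression levels run from high to low across the
 * four application levels; the numbers are placeholders. */
static int compression_level_for(app_level_t level)
{
    static const int levels[] = { 9, 7, 5, 3 };
    return levels[level];
}
```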
Specifically, as shown in fig. 6, the step of acquiring the original image includes:
step 602, if the type identifier is the first type identifier, controlling the camera to collect an infrared image formed when the first light emitter irradiates the object, and sending the infrared image to the processing unit.
If the type identifier is the first type identifier, the image that the application program needs to acquire is an infrared image. The processing unit controls the first light emitter to turn on, and an infrared image formed when the first light emitter illuminates the object is then collected through the camera. Specifically, the first light emitter may be a floodlight, and an infrared image is formed when the laser light generated by the floodlight illuminates the object.
And step 604, if the type identifier is the second type identifier or the third type identifier, controlling the camera to acquire a speckle image formed when the second light emitter irradiates the object, and sending the speckle image to the processing unit.
If the type identifier is the second type identifier, the image that the application program needs to acquire is a speckle image; if the type identifier is the third type identifier, the image that the application program needs to acquire is a depth image. Since the depth image is also calculated from the collected speckle image, when the type identifier is the second or third type identifier, the processing unit controls the second light emitter to turn on, and a speckle image formed when the second light emitter illuminates the object is then collected through the camera. The second light emitter may be a laser lamp, and a speckle image is formed when the laser speckle generated by the laser lamp illuminates the object.
In one embodiment, the step of acquiring the target image specifically includes:
step 702, if the type identifier is the first type identifier, correcting the infrared image in the processing unit, and taking the corrected infrared image as the target image.
A plurality of cameras can be installed on the electronic equipment, and the placement positions of the cameras are different, so that images collected by the different cameras form a certain parallax. After the laser camera collects the infrared image, the infrared image is corrected to eliminate the influence of parallax, so that the images collected by different cameras correspond to the same view field range.
Specifically, after the laser camera collects the infrared image, the infrared image is sent to the processing unit. The processing unit can correct the collected infrared image according to a correction algorithm, and then the corrected infrared image is used as a target image.
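The correction can be thought of as remapping each pixel through a calibration-derived lookup table; the patent does not disclose the correction algorithm itself, so the following is only one plausible form under that assumption.

```c
#include <stdint.h>
#include <stddef.h>

/* Remap a raw infrared frame into the corrected frame using precomputed
 * per-pixel source coordinates (map_x, map_y) obtained at calibration. */
void correct_image(const uint8_t *raw, uint8_t *corrected,
                   const int *map_x, const int *map_y,
                   int width, int height)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            size_t dst = (size_t)y * width + x;
            int sx = map_x[dst];   /* source column for this output pixel */
            int sy = map_y[dst];   /* source row for this output pixel    */
            corrected[dst] = raw[(size_t)sy * width + sx];
        }
    }
}
```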
And step 704, if the type identifier is the second type identifier, correcting the speckle image in the processing unit, and taking the corrected speckle image as a target image.
The electronic device can turn on the laser lamp and the laser camera; the laser speckle generated by the laser lamp illuminates the object, and the laser camera collects the speckle image formed when the laser speckle illuminates the object. Specifically, when laser light illuminates an optically rough surface whose average height fluctuation is greater than the order of the wavelength, the wavelets scattered by the surface elements distributed on the surface superpose on one another, so that the reflected light field has a random spatial intensity distribution and presents a granular structure; this is laser speckle. The laser speckle comprises a plurality of speckle points, so the speckle image collected by the laser camera also comprises a plurality of speckle points; for example, a speckle image may contain 30000 speckle points.
The laser speckles formed are highly random, and therefore, the laser speckles generated by the laser emitted by different laser emitters are different. When the resulting laser speckle is projected onto objects of different depths and shapes, the resulting speckle images are not identical. The laser speckles formed by different laser emitters are unique, and therefore the speckle images obtained are also unique.
If the application program needs to acquire the speckle image, the laser camera collects the speckle image and sends it to the processing unit; the processing unit corrects the speckle image and takes the corrected speckle image as the target image.
Step 706, if the type identifier is the third type identifier, acquiring the reference image stored in the processing unit, calculating to obtain a depth image according to the speckle image and the reference image, correcting the depth image, and taking the corrected depth image as a target image.
If the application program needs to acquire the depth image, the processing unit calculates the depth image from the speckle image after receiving the speckle image collected by the laser camera. Parallax correction is also performed on the depth image, and the corrected depth image is taken as the target image. Specifically, a reference image is stored in the processing unit; the reference image is an image formed when the laser lamp illuminates a reference plane during camera calibration. The reference image carries reference depth information, and the depth image can be calculated from the reference image and the speckle image.
The step of calculating the depth image specifically includes: acquiring the reference image stored in the processing unit; comparing the reference image with the speckle image to obtain offset information, wherein the offset information represents the horizontal offset of each speckle point in the speckle image relative to the corresponding speckle point in the reference image; and calculating the depth image according to the offset information and the reference depth information.
FIG. 8 is a schematic diagram of computing depth information in one embodiment. As shown in fig. 8, the laser lamp 802 may generate laser speckles, which are reflected by an object and then the formed image is acquired by the laser camera 804. In the calibration process of the camera, laser speckles emitted by the laser lamp 802 are reflected by the reference plane 808, reflected light is collected by the laser camera 804, and a reference image is obtained by imaging through the imaging plane 810. The reference depth L from the reference plane 808 to the laser lamp 802 is known. In the process of actually calculating the depth information, laser speckles emitted by the laser lamp 802 are reflected by the object 806, reflected light is collected by the laser camera 804, and an actual speckle image is obtained by imaging through the imaging plane 810. The calculation formula for obtaining the actual depth information is as follows:
Dis = (CD × L × f) / (CD × f + L × AB)
wherein L is the distance between the laser lamp 802 and the reference plane 808, f is the focal length of the lens in the laser camera 804, CD is the distance between the laser lamp 802 and the laser camera 804, and AB is the offset distance between the image of the object 806 and the image of the reference plane 808. AB may be calculated as the product of the pixel offset n and the pixel pitch p. When the distance Dis between the object 806 and the laser lamp 802 is greater than the distance L between the reference plane 808 and the laser lamp 802, AB is negative; when Dis is less than L, AB is positive.
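A small, self-contained sketch that evaluates this formula with AB = n × p; all calibration numbers in main() are made up for illustration, and the two calls show how the sign of the offset places the object nearer to or farther from the reference plane.

```c
#include <stdio.h>

/* Depth of one speckle point from its pixel offset n relative to the
 * reference image. L: reference distance, f: focal length, cd: baseline
 * between laser lamp and laser camera, p: pixel pitch (all in mm). */
static double depth_from_offset(int n, double p,
                                double L, double f, double cd)
{
    double ab = n * p;                        /* offset distance AB */
    return (cd * L * f) / (cd * f + L * ab);  /* Dis                */
}

int main(void)
{
    /* Made-up calibration values, for illustration only. */
    double L = 300.0, f = 2.0, cd = 20.0, p = 0.006;

    /* Positive offset -> Dis < L: object closer than the reference plane. */
    printf("n=+10 -> Dis = %.1f mm\n", depth_from_offset(10, p, L, f, cd));
    /* Negative offset -> Dis > L: object farther than the reference plane. */
    printf("n=-10 -> Dis = %.1f mm\n", depth_from_offset(-10, p, L, f, cd));
    return 0;
}
```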
With the image processing method provided by the above embodiment, after the application program initiates the image acquisition instruction, the image acquisition instruction is sent to the camera driver, and the camera driver controls the camera to turn on. The camera driver then sends a control instruction to the processing unit, the processing unit turns on the light emitter according to the control instruction, and an image formed when the light emitter illuminates an object is collected through the camera. The image acquisition instruction initiated by the application program can thus turn on the camera through the camera driver and then turn on different light emitters through the processing unit, so that different types of target images are acquired and the personalized requirements of users are met.
It should be understood that, although the steps in the flowcharts of fig. 2, 3, 6 and 7 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2, 3, 6 and 7 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 9 is a hardware configuration diagram for implementing an image processing method in one embodiment. As shown in fig. 9, the electronic device may include a camera module 910, a central processing unit (CPU) 920 and a microcontroller unit (MCU) 930, where the camera module 910 includes a laser camera 912, a floodlight 914, an RGB (Red/Green/Blue) camera 916 and a laser lamp 918. The microcontroller unit 930 includes a PWM module 932, an SPI/I2C (Serial Peripheral Interface / Inter-Integrated Circuit) module 934, a RAM (Random Access Memory) module 936 and a Depth Engine module 938. The first processing unit is the microcontroller unit 930, and the second processing unit may be a CPU core under the TEE. It is understood that the central processor 920 may operate in a multi-core mode, and a CPU core in the central processor 920 may run under the TEE or the REE. Both the TEE and the REE are running modes of an ARM (Advanced RISC Machines) module. Generally, operations with higher security requirements in the electronic device need to be executed under the TEE, while other operations can be executed under the REE.
Fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 10, the image processing apparatus 1000 includes an instruction initiating module 1002, a camera control module 1004, an instruction sending module 1006, a light emitter control module 1008, and an image acquiring module 1010. Wherein:
the instruction initiating module 1002 is configured to send an image acquisition instruction to a camera driver by an application program, where the image acquisition instruction includes a type identifier, and the type identifier is used to indicate a type of a target image that needs to be acquired by the application program.
The camera control module 1004 is used for the camera driver to control the camera to turn on according to the image acquisition instruction.
And the instruction sending module 1006 is configured to generate a control instruction by the camera driver according to the type identifier, and send the control instruction to the processing unit.
And the light emitter control module 1008 is used for the processing unit to turn on the corresponding light emitter according to the type identifier contained in the control instruction.
The image acquisition module 1010 is used for acquiring, through the camera, a target image formed when the light emitter illuminates an object.
With the image processing apparatus provided by the above embodiment, after the application program initiates the image acquisition instruction, the image acquisition instruction is sent to the camera driver, and the camera driver controls the camera to turn on. The camera driver then sends a control instruction to the processing unit, the processing unit turns on the light emitter according to the control instruction, and an image formed when the light emitter illuminates an object is collected through the camera. The image acquisition instruction initiated by the application program can thus turn on the camera through the camera driver and then turn on different light emitters through the processing unit, so that different types of target images are acquired and the personalized requirements of users are met.
In one embodiment, the image acquisition instruction includes a mode identifier, and the instruction sending module 1006 is further configured to, if the mode identifier is a non-secure mode, generate a control instruction according to the type identifier by the camera driver; and issuing the control instruction to the processing unit through the serial peripheral interface.
In one embodiment, the optical transmitter comprises a first optical transmitter and/or a second optical transmitter; the light emitter control module 1008 is further configured to, if the type identifier included in the control instruction is a first type identifier, control the first light emitter to turn on by the processing unit; and if the type identifier contained in the control instruction is a second type identifier or a third type identifier, the processing unit controls the second light emitter to be turned on.
In one embodiment, the light emitter control module 1008 is further configured to select a corresponding controller by the processing unit according to a type identifier included in the control instruction, and input Pulse Width Modulation (PWM) to the controller; and turning on the light emitter connected with the controller through the Pulse Width Modulation (PWM).
In one embodiment, the image acquisition module 1010 is further configured to acquire, through the camera, an original image formed when the light emitter illuminates an object and send the original image to the processing unit; and to obtain the target image from the original image in the processing unit.
In one embodiment, the image capturing module 1010 is further configured to control the camera to capture an infrared image formed when the first light emitter irradiates an object and send the infrared image to the processing unit if the type identifier is a first type identifier; and if the type identifier is a second type identifier or a third type identifier, controlling the camera to acquire a speckle image formed when the second light emitter irradiates the object, and sending the speckle image to the processing unit.
In an embodiment, the image acquisition module 1010 is further configured to correct the infrared image in the processing unit if the type identifier is the first type identifier, and use the corrected infrared image as a target image; if the type identifier is a second type identifier, correcting the speckle image in the processing unit, and taking the corrected speckle image as a target image; and if the type identifier is a third type identifier, acquiring a reference image stored in the processing unit, calculating to obtain a depth image according to the speckle image and the reference image, correcting the depth image, and taking the corrected depth image as a target image.
In an embodiment, the image processing apparatus 1000 may further include an image processing module, where the image processing module is configured to send the target image to the camera driver when determining that the target image is of a preset image type; and processing the target image through the camera drive, and sending a processing result to the application program.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing methods provided by the above-described embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform an image processing method.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above-mentioned embodiments only express several embodiments of the present application, and their description is specific and detailed, but should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
an application program sends an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier and a mode identifier, the type identifier is used for indicating the type of the target image that the application program needs to acquire, the type identifier comprises a first type identifier, a second type identifier or a third type identifier, the first type identifier is an infrared image identifier, the second type identifier is a speckle image identifier, and the third type identifier is a depth image identifier; the mode identifier is used for indicating the security level of the image acquisition instruction and comprises a secure mode and a non-secure mode;
the camera driver controls the camera to be turned on according to the image acquisition instruction;
the camera driver generates a control instruction according to the type identifier and sends the control instruction to a processing unit;
the processing unit turns on a corresponding light emitter according to a type identifier contained in the control instruction, wherein the light emitter comprises a first light emitter and/or a second light emitter, the first light emitter is a floodlight, and the second light emitter is a laser light;
acquiring a target image formed when the light emitter irradiates an object through the camera;
the camera driver generates a control instruction according to the type identifier and issues the control instruction to a processing unit, and the method comprises the following steps:
if the mode identifier is a non-secure mode, the camera driver generates a control instruction according to the type identifier;
issuing the control instruction to the processing unit through a serial peripheral interface;
if the mode identifier is a secure mode, the camera driver sends the control instruction to a trusted application in a TEE (trusted execution environment), and the control instruction is sent to the processing unit through a secure serial peripheral interface by the trusted application.
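By way of illustration only, the C sketch below outlines how a camera driver might split the two paths of claim 1: a non-secure request goes straight out over the serial peripheral interface, while a secure request is handed to the trusted application in the TEE, which forwards it over the secure serial peripheral interface. The function names spi_send() and tee_forward_to_trusted_app() are hypothetical stubs, not an actual platform API.

```c
#include <stdint.h>

enum mode_id { MODE_NON_SECURE = 0, MODE_SECURE = 1 };

struct control_instruction {
    uint8_t type_id;   /* 1 = infrared, 2 = speckle, 3 = depth (hypothetical encoding) */
};

/* Placeholder transports; a real driver would call the platform's SPI and
 * TEE client APIs instead of these stubs. */
int spi_send(const struct control_instruction *ci)
{
    (void)ci;
    return 0;          /* ordinary serial peripheral interface */
}

int tee_forward_to_trusted_app(const struct control_instruction *ci)
{
    (void)ci;
    return 0;          /* trusted application relays over the secure SPI */
}

/* Build the control instruction from the type identifier and choose the
 * transport according to the mode identifier. */
int send_control_instruction(uint8_t type_id, enum mode_id mode)
{
    struct control_instruction ci = { .type_id = type_id };

    if (mode == MODE_NON_SECURE)
        return spi_send(&ci);                   /* driver -> processing unit over SPI */

    return tee_forward_to_trusted_app(&ci);     /* driver -> TEE trusted app -> secure SPI */
}
```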
2. The method of claim 1, wherein the processing unit turns on the corresponding light emitter according to the type identifier contained in the control instruction, comprising:
if the type identifier contained in the control instruction is a first type identifier, the processing unit controls the first light emitter to be turned on;
and if the type identifier contained in the control instruction is a second type identifier or a third type identifier, the processing unit controls the second light emitter to be turned on.
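A minimal C sketch of the selection rule in claim 2, assuming a hypothetical emitter_power_on() hook on the processing unit; the type-identifier values 1 to 3 are an assumed encoding.

```c
#include <stdint.h>

enum emitter { EMITTER_FLOODLIGHT = 0, EMITTER_LASER = 1 };

/* Placeholder power hook; real code would toggle a regulator or driver IC. */
void emitter_power_on(enum emitter e) { (void)e; }

/* First type identifier -> floodlight; second or third -> laser light. */
int turn_on_emitter_for_type(uint8_t type_id)
{
    switch (type_id) {
    case 1:                                     /* infrared image requested */
        emitter_power_on(EMITTER_FLOODLIGHT);
        return 0;
    case 2:                                     /* speckle image requested  */
    case 3:                                     /* depth image requested    */
        emitter_power_on(EMITTER_LASER);
        return 0;
    default:
        return -1;
    }
}
```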
3. The method of claim 1, wherein the processing unit turns on the corresponding light emitter according to a type identifier included in the control instruction, comprising:
the processing unit selects a corresponding controller according to the type identifier contained in the control instruction and inputs a pulse width modulation (PWM) signal to the controller;
and turning on the light emitter connected to the controller through the PWM signal.
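For claim 3, the following C sketch assumes one PWM controller per light emitter and a hypothetical pwm_apply() routine standing in for the actual register programming; the period and duty-cycle values are arbitrary examples, not values taken from the embodiments.

```c
#include <stdint.h>

struct pwm_controller {
    uint32_t period_ns;   /* PWM period    */
    uint32_t duty_ns;     /* PWM high time */
    int      enabled;
};

/* One hypothetical controller per emitter: index 0 drives the floodlight,
 * index 1 drives the laser light. */
static struct pwm_controller controllers[2];

/* Placeholder for the register writes that would program the PWM hardware. */
void pwm_apply(struct pwm_controller *c, uint32_t period_ns, uint32_t duty_ns)
{
    c->period_ns = period_ns;
    c->duty_ns   = duty_ns;
    c->enabled   = 1;
}

/* Select the controller from the type identifier and input a PWM signal to it. */
int turn_on_emitter_by_pwm(uint8_t type_id)
{
    struct pwm_controller *c;

    if (type_id == 1)
        c = &controllers[0];                    /* floodlight controller  */
    else if (type_id == 2 || type_id == 3)
        c = &controllers[1];                    /* laser-light controller */
    else
        return -1;

    /* Example waveform only; the claim does not fix a period or duty cycle. */
    pwm_apply(c, 1000000u /* 1 ms */, 500000u /* 50 % duty */);
    return 0;
}
```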
4. The method according to any one of claims 1 to 3, wherein the acquiring, by the camera, a target image formed when the light emitter irradiates the object comprises:
acquiring an original image formed when the light emitter irradiates an object through the camera, and sending the original image to the processing unit;
and acquiring a target image according to the original image in the processing unit.
5. The method of claim 4, wherein the acquiring, through the camera, an original image formed when the light emitter irradiates an object and sending the original image to the processing unit comprises:
if the type identifier is a first type identifier, controlling the camera to acquire an infrared image formed when the first light emitter irradiates an object, and sending the infrared image to the processing unit;
and if the type identifier is a second type identifier or a third type identifier, controlling the camera to acquire a speckle image formed when the second light emitter irradiates the object, and sending the speckle image to the processing unit.
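Claims 4 and 5 together describe a capture-and-forward step; the hedged C sketch below assumes hypothetical camera_capture() and forward_to_processing_unit() hooks and simply tags the frame with the type identifier so the processing unit can pick the right handling path.

```c
#include <stdint.h>
#include <stddef.h>

struct frame {
    const uint8_t *data;
    size_t         size;
    uint8_t        type_id;   /* tag so the processing unit knows how to handle the frame */
};

/* Placeholder hooks: a real driver would fill the frame from the sensor
 * interface and push it over the link to the processing unit. */
int camera_capture(struct frame *out)                  { (void)out; return 0; }
int forward_to_processing_unit(const struct frame *f)  { (void)f;  return 0; }

/* Capture the original image formed under the active emitter (infrared for
 * type 1, speckle for type 2 or 3) and send it to the processing unit. */
int capture_and_forward(uint8_t type_id)
{
    struct frame f = { .data = NULL, .size = 0, .type_id = type_id };

    if (camera_capture(&f) != 0)
        return -1;
    return forward_to_processing_unit(&f);
}
```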
6. The method of claim 5, wherein the acquiring a target image according to the original image in the processing unit comprises:
if the type identifier is a first type identifier, correcting the infrared image in the processing unit, and taking the corrected infrared image as a target image;
if the type identifier is a second type identifier, correcting the speckle image in the processing unit, and taking the corrected speckle image as a target image;
and if the type identifier is a third type identifier, acquiring a reference image stored in the processing unit, calculating to obtain a depth image according to the speckle image and the reference image, correcting the depth image, and taking the corrected depth image as a target image.
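Claim 6 leaves open how the depth image is calculated from the speckle image and the reference image. One commonly used structured-light model relates the per-pixel disparity d between the live speckle and the reference pattern to depth via 1/Z = 1/Z0 + d/(f·b), where Z0 is the reference-plane distance, f the focal length in pixels, and b the emitter-to-camera baseline. The C sketch below applies that model; the calibration constants and the match_disparity() stub are assumptions for illustration, not values from the embodiments.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative calibration constants; none of these values come from the claims. */
#define REF_PLANE_DIST_MM  600.0f   /* Z0: distance at which the reference image was captured */
#define FOCAL_LENGTH_PX    580.0f   /* f: focal length in pixels                              */
#define BASELINE_MM         25.0f   /* b: emitter-to-camera baseline                          */

/* Placeholder block matcher: returns the shift (in pixels) of the speckle patch
 * at (x, y) relative to the stored reference image. */
float match_disparity(const uint8_t *speckle, const uint8_t *reference,
                      size_t width, size_t height, size_t x, size_t y)
{
    (void)speckle; (void)reference; (void)width; (void)height; (void)x; (void)y;
    return 0.0f;
}

/* Reference-plane model: 1/Z = 1/Z0 + d / (f * b). */
float disparity_to_depth_mm(float disparity_px)
{
    return 1.0f / (1.0f / REF_PLANE_DIST_MM +
                   disparity_px / (FOCAL_LENGTH_PX * BASELINE_MM));
}

/* Fill a per-pixel depth map from the speckle image and the stored reference. */
void compute_depth_map(const uint8_t *speckle, const uint8_t *reference,
                       float *depth_mm, size_t width, size_t height)
{
    for (size_t y = 0; y < height; y++)
        for (size_t x = 0; x < width; x++) {
            float d = match_disparity(speckle, reference, width, height, x, y);
            depth_mm[y * width + x] = disparity_to_depth_mm(d);
        }
}
```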
7. The method of claim 4, wherein after the acquiring a target image according to the original image in the processing unit, the method further comprises:
when the target image is judged to be of a preset image type, sending the target image to the camera driver;
and processing the target image through the camera driver, and sending a processing result to the application program.
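As a hedged illustration of claim 7 (and of the image processing module described earlier), the C sketch below gates driver-side processing on a preset image type; the hook names camera_driver_process() and send_result_to_application() are hypothetical stand-ins rather than a real API.

```c
#include <stddef.h>

/* Placeholder hooks standing in for the driver-side processing path and the
 * callback that returns results to the application. */
int camera_driver_process(const void *img, int type, void *result)
{
    (void)img; (void)type; (void)result;
    return 0;
}

int send_result_to_application(const void *result)
{
    (void)result;
    return 0;
}

/* Only target images of the preset type are handed back to the camera driver;
 * the processing result is then forwarded to the application. */
int postprocess_target_image(const void *img, int type, int preset_type, void *result)
{
    if (type != preset_type)
        return 0;                       /* no driver-side processing requested */

    if (camera_driver_process(img, type, result) != 0)
        return -1;
    return send_result_to_application(result);
}
```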
8. An image processing apparatus, characterized in that the apparatus comprises:
an instruction initiating module, used for sending, by an application program, an image acquisition instruction to a camera driver, wherein the image acquisition instruction comprises a type identifier and a mode identifier, the type identifier is used for representing the type of a target image required to be acquired by the application program, the type identifier comprises a first type identifier, a second type identifier or a third type identifier, the first type identifier is an infrared image identifier, the second type identifier is a speckle image identifier, and the third type identifier is a depth image identifier; and the mode identifier is used for representing the security level of the image acquisition instruction, and comprises a secure mode and a non-secure mode;
a camera control module, used for controlling, by the camera driver, the camera to be turned on according to the image acquisition instruction;
a command sending module, used for generating, by the camera driver, a control instruction according to the type identifier and sending the control instruction to a processing unit;
a light emitter control module, used for turning on, by the processing unit, a corresponding light emitter according to the type identifier contained in the control instruction, wherein the light emitter comprises a first light emitter and/or a second light emitter, the first light emitter is a floodlight, and the second light emitter is a laser light;
an image acquisition module, used for acquiring, through the camera, a target image formed when the light emitter irradiates an object;
wherein the command sending module is further configured to: if the mode identifier is a non-secure mode, generate, by the camera driver, a control instruction according to the type identifier, and send the control instruction to the processing unit through a serial peripheral interface; and if the mode identifier is a secure mode, send, by the camera driver, the control instruction to a trusted application in a TEE (trusted execution environment), and send the control instruction to the processing unit through a secure serial peripheral interface by the trusted application.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-7.
CN201810786074.0A 2018-07-16 2018-07-17 Image processing method, image processing device, computer-readable storage medium and electronic equipment Active CN108833885B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201810786074.0A CN108833885B (en) 2018-07-17 2018-07-17 Image processing method, image processing device, computer-readable storage medium and electronic equipment
PCT/CN2019/082560 WO2020015403A1 (en) 2018-07-16 2019-04-12 Method and device for image processing, computer readable storage medium and electronic device
EP19806080.8A EP3621294B1 (en) 2018-07-16 2019-04-12 Method and device for image capture, computer readable storage medium and electronic device
US16/671,840 US20200068127A1 (en) 2018-07-16 2019-11-01 Method for Processing Image and Related Electronic Device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810786074.0A CN108833885B (en) 2018-07-17 2018-07-17 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108833885A CN108833885A (en) 2018-11-16
CN108833885B true CN108833885B (en) 2020-05-26

Family

ID=64141057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810786074.0A Active CN108833885B (en) 2018-07-16 2018-07-17 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108833885B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020015403A1 (en) * 2018-07-16 2020-01-23 Oppo广东移动通信有限公司 Method and device for image processing, computer readable storage medium and electronic device
CN111447561B (en) * 2020-03-16 2023-04-18 阿波罗智联(北京)科技有限公司 Image processing system for vehicle

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102970548B (en) * 2012-11-27 2015-01-21 西安交通大学 Image depth sensing device
CN106152937B (en) * 2015-03-31 2019-10-25 深圳超多维科技有限公司 Space positioning apparatus, system and method
CN104918035A (en) * 2015-05-29 2015-09-16 深圳奥比中光科技有限公司 Method and system for obtaining three-dimensional image of target
CN105120135B (en) * 2015-08-25 2019-05-24 努比亚技术有限公司 A kind of binocular camera
CN205336464U (en) * 2015-12-08 2016-06-22 上海图漾信息科技有限公司 Range data detecting system
CN106210568A (en) * 2016-07-15 2016-12-07 深圳奥比中光科技有限公司 Image processing method and device
CN106226977A (en) * 2016-08-24 2016-12-14 深圳奥比中光科技有限公司 Laser projection module, image capturing system and control method thereof and device
CN106331453A (en) * 2016-08-24 2017-01-11 深圳奥比中光科技有限公司 Multi-image acquisition system and image acquisition method
CN106973251B (en) * 2017-03-23 2020-11-03 移康智能科技(上海)股份有限公司 Image data transmission method and device
CN107426560A (en) * 2017-06-22 2017-12-01 维沃移动通信有限公司 A kind of filming apparatus, image processing method and mobile terminal

Also Published As

Publication number Publication date
CN108833885A (en) 2018-11-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant