CN108881712B - Image processing method, image processing device, computer-readable storage medium and electronic equipment


Info

Publication number
CN108881712B
Authority
CN
China
Prior art keywords
image
target
speckle
acquiring
application program
Prior art date
Legal status
Active
Application number
CN201810404508.6A
Other languages
Chinese (zh)
Other versions
CN108881712A (en)
Inventor
郭子青
周海涛
谭国辉
惠方方
谭筱
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810404508.6A
Publication of CN108881712A
Priority to EP19792627.2A (EP3644261B1)
Priority to PCT/CN2019/080559 (WO2019205889A1)
Priority to US16/740,925 (US11308636B2)
Application granted
Publication of CN108881712B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161: Detection; Localisation; Normalisation
    • G06V 40/166: Detection; Localisation; Normalisation using acquisition arrangements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00095: Systems or arrangements for the transmission of the picture signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The application relates to an image processing method, an image processing device, a computer-readable storage medium and an electronic device. The method comprises the following steps: if an image acquisition instruction is detected, acquiring a precision level corresponding to a target application program initiating the image acquisition instruction; adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image, the speckle image being an image, collected by a laser camera, formed by laser speckle irradiating an object; and sending the target image to the target application program. The image processing method, the image processing device, the computer-readable storage medium and the electronic device can improve the security of image processing.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
In the Internet era, data needs to be transmitted to realize communication and sharing of information, so data transmission is both frequent and important. However, data is usually transmitted through a common data transmission channel, and is therefore very easy to intercept during transmission. Once the data is leaked, a serious security risk exists.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer-readable storage medium and an electronic device, which can improve the security of image processing.
An image processing method comprising:
if an image acquisition instruction is detected, acquiring a precision level corresponding to a target application program initiating the image acquisition instruction;
adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image, wherein the speckle image is an image, collected by a laser camera, formed by laser speckle irradiating an object;
and sending the target image to the target application program.
An image processing apparatus comprising:
the precision acquisition module is used for acquiring a precision level corresponding to a target application program which initiates an image acquisition instruction if the image acquisition instruction is detected;
the image adjusting module is used for adjusting the number of scattered spots contained in the collected speckle image according to the precision level and acquiring a target image according to the adjusted speckle image, wherein the speckle image is an image, collected by a laser camera, formed by laser speckle irradiating an object;
and the image sending module is used for sending the target image to the target application program.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
if an image acquisition instruction is detected, acquiring a precision level corresponding to a target application program initiating the image acquisition instruction;
adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image, wherein the speckle image is an image, collected by a laser camera, formed by laser speckle irradiating an object;
and sending the target image to the target application program.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
if an image acquisition instruction is detected, acquiring a precision level corresponding to a target application program initiating the image acquisition instruction;
adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image, wherein the speckle image is an image, collected by a laser camera, formed by laser speckle irradiating an object;
and sending the target image to the target application program.
According to the image processing method, the image processing device, the computer-readable storage medium and the electronic device, when an image acquisition instruction is detected, the precision level corresponding to the target application program initiating the image acquisition instruction can be obtained. The number of scattered spots contained in the acquired speckle image is then adjusted according to the precision level, and a target image is acquired according to the adjusted speckle image. Finally, the target image is sent to the target application program. In this way, target images with different accuracies can be sent to different target application programs: a target application program with a lower application level obtains a target image with lower accuracy, so the security of the sent target image can be ensured.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a diagram of an application scenario of an image processing method in one embodiment;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is a schematic diagram of computing depth information in one embodiment;
FIG. 5 is a flowchart of an image processing method in yet another embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a diagram of hardware components for implementing an image processing method in one embodiment;
FIG. 8 is a diagram showing a hardware configuration for implementing an image processing method in another embodiment;
FIG. 9 is a diagram illustrating a software architecture for implementing an image processing method according to an embodiment;
FIG. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram illustrating an application scenario of an image processing method according to an embodiment. As shown in fig. 1, the application scenario includes an electronic device 104, and a camera module and a plurality of application programs may be installed in the electronic device 104. The application program may initiate an image capture instruction to obtain an image, and when the electronic device 104 detects the image capture instruction, may obtain an application level corresponding to a target application program that initiated the image capture instruction, and obtain a corresponding precision level according to the application level. The number of scattered spots included in the collected speckle image 102 is adjusted according to the accuracy level, and a target image is obtained according to the adjusted speckle image, wherein the speckle image 102 is an image formed by irradiating an object with laser speckles collected by a laser camera. And finally, sending the target image to a target application program. The electronic device 104 may be a smart phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 206. Wherein:
step 202, if the image acquisition instruction is detected, acquiring the precision level corresponding to the target application program initiating the image acquisition instruction.
The electronic device can be provided with cameras, and images are obtained through the installed cameras. The cameras can be divided into types such as a laser camera and a visible light camera according to the images they obtain: the laser camera can obtain an image formed by laser irradiating an object, and the visible light camera can obtain an image formed by visible light irradiating an object. The electronic device can be provided with a plurality of cameras, and the installation positions are not limited. For example, one camera may be installed on the front panel of the electronic device and two cameras may be installed on the back panel, or the cameras may be installed inside the electronic device in an embedded manner and then opened by rotating or sliding. Specifically, a front camera and a rear camera can be mounted on the electronic device; the front camera and the rear camera acquire images from different viewing angles, the front camera from the front of the electronic device and the rear camera from the back of the electronic device.
A plurality of application programs may be installed in the electronic device. An application program refers to software written in the electronic device for a certain application purpose, through which the electronic device can provide the services required by users. For example, the user may play games through a game application, pay for transactions through a payment application, play music through a music application, and so on. When an application program needs to collect an image, it can initiate an image acquisition instruction, and the electronic device can call the camera module according to the image acquisition instruction to collect the image. The image acquisition instruction refers to an instruction for triggering an image acquisition operation. For example, when the user wants to take a photograph, the user can click the shooting button; when the electronic device recognizes that the shooting button is pressed, it generates an image acquisition instruction, so that the camera module is called to acquire an image. When the user needs to perform payment verification through the face, the user can click the payment button and aim the face at the camera to be shot, and the electronic device can perform payment verification after acquiring the face.
When an application program initiates an image acquisition instruction, an application identifier for uniquely identifying one application program can be written in the image acquisition instruction. When the electronic equipment detects the image acquisition instruction, the corresponding application level can be searched according to the application identifier. The application program can also directly write the application level into the image acquisition instruction, and the electronic equipment can directly acquire the application level in the image acquisition instruction when detecting the image acquisition instruction. The application level refers to the importance of the application program, for example, the application program can be divided into a system security application, a system non-security application, a third-party security application and a third-party non-security application, and the corresponding application level is gradually decreased. The corresponding relation between the application level and the precision level can be stored in the electronic equipment in advance, and the corresponding precision level can be obtained according to the application level.
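By way of illustration only (the application identifiers, level values and data structures below are hypothetical assumptions, not part of the disclosed embodiments), the correspondence between application identifier, application level and precision level described above could be organized as a simple lookup:

```python
# Illustrative sketch only: the application identifiers, level values and the
# level-to-precision mapping are assumptions, not taken from the patent text.
APP_LEVELS = {
    "com.vendor.syspay": 4,     # system security application (highest level)
    "com.vendor.gallery": 3,    # system non-security application
    "com.thirdparty.bank": 2,   # third-party security application
    "com.thirdparty.game": 1,   # third-party non-security application
}

# Pre-stored correspondence between application level and precision level.
LEVEL_TO_PRECISION = {4: 3, 3: 3, 2: 2, 1: 1}

def precision_for_instruction(instruction: dict) -> int:
    """Resolve the precision level for a detected image acquisition instruction."""
    if "app_level" in instruction:
        # The application wrote its application level directly into the instruction.
        app_level = instruction["app_level"]
    else:
        # Otherwise look the application level up via the application identifier.
        app_level = APP_LEVELS.get(instruction["app_id"], 1)
    return LEVEL_TO_PRECISION[app_level]
```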
Step 204, adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image; the speckle image is an image, collected by a laser camera, formed by laser speckle irradiating an object.
When an image acquisition instruction is detected, the laser lamp and the laser camera can be turned on, the laser speckle formed by the laser lamp irradiates an object, and the speckle image formed by the laser speckle irradiating the object is then collected by the laser camera. Specifically, when laser light irradiates an optically rough surface whose average fluctuation is larger than the order of the wavelength, wavelets scattered by surface elements distributed on the surface are superposed on one another, so that the reflected light field has a random spatial light intensity distribution and presents a granular structure, namely laser speckle. The formed laser speckle comprises a plurality of laser speckle points, so that the speckle image acquired by the laser camera also comprises a plurality of speckle points; for example, a speckle image may contain 30000 speckle points. The formed laser speckle is highly random, so the laser speckle generated by lasers emitted by different laser emitters is different. When the formed laser speckle is projected onto objects of different depths and shapes, the resulting speckle images are not identical. The laser speckle formed by a given laser emitter is unique, and therefore the speckle image obtained is also unique.
Specifically, a CPU (Central Processing Unit) of the electronic device may receive an instruction from an upper application. When the CPU receives the image acquisition instruction, the camera module can be controlled to work, and the infrared image and the speckle image are acquired through the camera module. The camera module can include, but is not limited to, a laser camera, a laser lamp and a floodlight. The CPU controls the laser lamp and the floodlight to work in a time-sharing mode, and controls the laser camera to collect speckle images when the laser lamp is started; when the floodlight is turned on, the laser camera is controlled to collect infrared images.
The electronic device can calculate the depth information from the object to the camera according to the collected speckle image, and the more speckle points the speckle image contains, the more accurate the calculated depth information is. The laser lamp can emit a plurality of laser speckle points, and when the laser speckle points irradiate objects at different distances, the positions of the spots displayed on the image are different. The electronic device may capture a standard reference image in advance, the reference image being the image formed by the laser speckle irradiating a plane. The speckle points in the reference image are generally uniformly distributed, and a correspondence between each speckle point in the reference image and a reference depth is then established. When a speckle image needs to be collected, the laser lamp is controlled to emit the laser speckle; the laser speckle irradiates the object and is collected by the laser camera to obtain the speckle image. Each speckle point in the speckle image is then compared with the speckle points in the reference image to obtain the position offset of the speckle point in the speckle image relative to the corresponding speckle point in the reference image, and the actual depth information corresponding to the speckle point is obtained from this position offset and the reference depth.
The infrared image collected by the camera corresponds to the speckle image, and the speckle image can be used for calculating the depth information corresponding to each pixel point in the infrared image. In this way, the human face can be detected and identified through the infrared image, and the depth information corresponding to the human face can be calculated according to the speckle image. Specifically, in the process of calculating the depth information according to the speckle image, a relative depth is first calculated according to the position offset of the speckle image relative to the speckle points of the reference image, and the relative depth can represent the depth information from the actual photographed object to the reference plane. The actual depth information of the object is then calculated according to the acquired relative depth and the reference depth. The depth image is used for representing the depth information corresponding to the infrared image, and may represent either the relative depth from the object to the reference plane or the absolute depth from the object to the camera.
Adjusting the number of speckle points contained in the speckle image according to the precision level can be done in a software mode or in a hardware mode. When the adjustment is done in software, the speckle points in the acquired speckle pattern can be detected directly, and part of the speckle points are merged or eliminated, so that the number of speckle points contained in the adjusted speckle pattern is reduced. When the adjustment is done in hardware, the number of laser speckle points generated by diffraction of the laser lamp can be adjusted. For example, when the speckle points contained in the speckle image are adjusted from 30000 to 20000, the accuracy of the depth image obtained by the corresponding calculation is correspondingly reduced. After the electronic device adjusts the number of speckle points contained in the speckle image according to the precision level, the target image can be obtained according to the adjusted speckle image. The target image may be the adjusted speckle image itself, or a depth image calculated from the adjusted speckle image.
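As a minimal sketch of the software-mode adjustment (the elimination strategy and function names are assumptions; merging of nearby points, which the embodiment also allows, is not shown), discarding detected speckle points down to a target count could look like this:

```python
import random

def reduce_speckle_points(points, target_count, seed=0):
    """Keep only `target_count` of the detected speckle points; the remaining
    points would then be erased from the speckle image before the depth
    calculation.  Sketch only: merging of neighbouring points is not shown."""
    if len(points) <= target_count:
        return list(points)
    rng = random.Random(seed)          # deterministic selection for repeatability
    return rng.sample(list(points), target_count)

# Example: reduce 30000 detected points to 20000 for a lower precision level.
detected_points = [(i % 640, i // 640) for i in range(30000)]
kept = reduce_speckle_points(detected_points, 20000)
assert len(kept) == 20000
```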
Step 206, the target image is sent to the target application program.
After the target image is acquired, the corresponding target application program can be searched according to the application identifier, and the target image is sent to the target application program. The target application program can perform application operations such as payment, unlocking, beauty, AR (Augmented Reality technology) and the like according to the target image.
According to the image processing method provided in the foregoing embodiment, when an image acquisition instruction is detected, the application level corresponding to the target application program initiating the image acquisition instruction can be obtained, and the corresponding precision level can be obtained according to the application level. The number of scattered spots contained in the acquired speckle image is then adjusted according to the precision level, and a target image is acquired according to the adjusted speckle image. Finally, the target image is sent to the target application program. In this way, target images with different accuracies can be sent to different target application programs: a target application program with a lower application level obtains a target image with lower accuracy, so the security of the sent target image can be ensured.
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 310. Wherein:
step 302, if an image acquisition instruction is detected, acquiring an application level corresponding to a target application program initiating the image acquisition instruction, and acquiring a security level of an application operation corresponding to the image acquisition instruction.
The application operation refers to an operation that the application program needs to complete, and after the user opens the application program, different application operations can be completed through the application program. For example, the application operation may be a payment operation, a photographing operation, an unlocking operation, a game operation, and the like. The security level of an application operation refers to the level of security requirements of the application operation. For example, the security requirement of the payment operation on data processing is relatively high, so the security level corresponding to the payment operation is relatively high; the security requirement of the shooting operation on data processing is low, and the security level corresponding to the shooting operation is low.
Specifically, when the application program initiates an image acquisition instruction, an operation identifier may be written in the image acquisition instruction, where the operation identifier is used to uniquely identify the application operation. For example, the operation identifier corresponding to the payment operation is "pay", and the operation identifier corresponding to the shooting operation is "photo". The electronic device can obtain the corresponding application level according to the application identifier contained in the image acquisition instruction, and obtain the corresponding security level according to the operation identifier.
And step 304, acquiring the precision level according to the application level and the security level.
In one embodiment, the precision level is obtained according to the application level of the target application program and the security level of the application operation; the higher the application level of the target application program and the higher the security level of the application operation, the higher the corresponding precision level. Specifically, the application programs may be divided into a plurality of application levels, each application level corresponding to a first precision weight. The application operations may also be divided into a plurality of security levels, each security level corresponding to a second precision weight. After the application level and the security level of the target application program are obtained, the first precision weight corresponding to the application level and the second precision weight corresponding to the security level can be obtained, a precision weight is calculated from the first precision weight and the second precision weight, and the precision level into which the precision weight falls is then determined.
For example, the application level of the application program may be divided into five levels, and the first precision weights corresponding to the application levels from low to high are 0.2, 0.4, 0.6, 0.8 and 1, respectively. The application operation is divided into four security levels, and the second precision weights corresponding to the security levels from low to high are 0.25, 0.5, 0.75 and 1, respectively. The precision level may be divided into 3 levels, and the precision weight ranges corresponding to the precision levels from low to high are 0-0.4, 0.4-0.8 and 0.8-1, respectively. The formula for calculating the precision weight D from the first precision weight a and the second precision weight b is: D = 0.5a + 0.5b. Assuming that the application level of the target application program is level 3 and the security level of the corresponding application operation is level 2, the obtained first precision weight is 0.6 and the second precision weight is 0.5. The calculated precision weight is 0.5 × 0.6 + 0.5 × 0.5 = 0.55, and the corresponding precision level is 2.
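A short sketch of the worked example above (the weight tables repeat the example values; the handling of values exactly on a range boundary is an assumption):

```python
# Weight tables from the example above; boundary handling of the precision
# ranges (0-0.4, 0.4-0.8, 0.8-1) is an assumption.
FIRST_PRECISION_WEIGHT = {1: 0.2, 2: 0.4, 3: 0.6, 4: 0.8, 5: 1.0}   # by application level
SECOND_PRECISION_WEIGHT = {1: 0.25, 2: 0.5, 3: 0.75, 4: 1.0}        # by security level

def precision_level(app_level: int, security_level: int) -> int:
    """D = 0.5*a + 0.5*b, then determine which precision range D falls into."""
    d = (0.5 * FIRST_PRECISION_WEIGHT[app_level]
         + 0.5 * SECOND_PRECISION_WEIGHT[security_level])
    if d <= 0.4:
        return 1
    if d <= 0.8:
        return 2
    return 3

# Application level 3 and security level 2 give D = 0.55, i.e. precision level 2.
assert precision_level(3, 2) == 2
```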
And step 306, adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image.
The electronic equipment can include first processing unit, camera module and second processing unit, and first processing unit moves under first secure environment, and the second processing unit moves under the second secure environment, and first processing unit links to each other with camera module and second processing unit respectively, and the camera module links to each other with the second processing unit. The image acquisition instruction initiated by the application program can be sent to the first processing unit, and after the first processing unit detects the image acquisition instruction, the camera module can be controlled to acquire the speckle image according to the image acquisition instruction, and the acquired speckle image is sent to the second processing unit. The second processing unit calculates the parallax image according to the speckle image and sends the speckle image and the parallax image to the first processing unit. The first processing unit corrects the speckle images according to the parallax images, acquires target images according to the corrected speckle images and sends the target images to a target application program.
Specifically, the camera module can include, but is not limited to, a laser camera, a laser lamp and a floodlight. When the first processing unit receives the image acquisition instruction, the first processing unit controls the laser lamp and the floodlight to work in a time-sharing mode, and when the laser lamp is started, the speckle image is acquired through the laser camera; when the floodlight is turned on, the infrared image is collected through the laser camera. The parallax image is used for representing errors generated in an image acquisition process and can comprise an infrared parallax image and a speckle parallax image, the infrared parallax image is used for correcting the infrared image, and the speckle parallax image is used for correcting the speckle image. For example, a laser camera deflects, and the acquired speckle image needs to correct an error caused by the deflection to obtain a standard speckle image.
It can be understood that the infrared image and the speckle image collected by the camera are corresponding, and the depth information corresponding to the infrared image can be calculated according to the speckle image. If the camera collects the infrared image and the speckle image in a time-sharing manner, the time interval for collecting the infrared image and the speckle image must be ensured to be very short, and the consistency of the infrared image and the speckle image can be ensured. That is, the time interval between the first time of acquiring the infrared image and the second time of acquiring the speckle image is smaller than the first threshold. The first threshold is generally a relatively small value, and can be adjusted according to the change rule of the object to be photographed. The faster the change of the object to be photographed, the smaller the correspondingly acquired first threshold value. The first threshold value may be set to a large value on the assumption that the subject is stationary for a long period of time. Specifically, the change speed of the object to be photographed is acquired, and the corresponding first threshold is acquired according to the change speed.
Specifically, the electronic device can be provided with a floodlight controller and a laser lamp controller, and the first processing unit is connected to the floodlight controller and the laser lamp controller respectively through two PWM channels. When the first processing unit needs to turn on the floodlight or the laser lamp, it can send a pulse wave to the floodlight controller through PWM to turn on the floodlight, or send a pulse wave to the laser lamp controller to turn on the laser lamp, and the time interval between collecting the infrared image and the speckle image is controlled by the pulse waves sent to the two controllers through PWM. Keeping the time interval between the collected infrared image and speckle image below the first threshold ensures the consistency of the collected infrared image and speckle image, avoids a large error between the infrared image and the speckle image, and improves the accuracy of image processing.
For example, when the mobile phone needs to be authenticated and unlocked through a human face, the user can click an unlocking key to initiate an unlocking instruction, and the front-facing camera is aligned with the face to shoot. The mobile phone sends the unlocking instruction to the first processing unit, and the first processing unit controls the camera to work. The method comprises the steps of firstly collecting infrared images through a first camera module, controlling a second camera module to collect speckle images after 1 millisecond time interval, and carrying out authentication and unlocking through the collected infrared images and the speckle images.
In one embodiment, speckle images of different accuracies are sent to different target application programs. The method for adjusting the number of speckle points contained in the speckle image specifically comprises: adjusting, according to the precision level, the number of scattered spots contained in the laser speckle generated by diffraction of the laser lamp, and collecting, through the laser camera, a speckle image formed by the laser speckle irradiating an object; or acquiring, through the laser camera, a speckle image formed by preset laser speckle irradiating an object, and adjusting the number of scattered spots contained in the speckle image according to the precision level. Different diffractive optical elements (DOEs) can be preset in the laser lamp, where the numbers of scattered spots formed by diffraction of different DOEs are different. Different DOEs are switched according to the precision level to generate speckle images by diffraction, and depth images with different precisions are obtained according to the obtained speckle images. When the precision level is higher, the laser lamp can switch to the DOE that forms more scattered spots to emit the laser speckle, so that a speckle image with more scattered spots is obtained; when the precision level is lower, the laser lamp can switch to the DOE that forms fewer scattered spots, so that a speckle image with fewer scattered spots is obtained.
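A rough sketch of the DOE selection just described (the DOE identifiers and speckle counts are illustrative assumptions, not values given in the embodiment):

```python
# Assumed preset DOEs: each diffractive optical element yields a different
# number of diffracted speckle points.  Identifiers and counts are illustrative.
DOE_TABLE = {
    3: {"doe": "DOE_A", "speckle_points": 30000},   # higher precision level
    2: {"doe": "DOE_B", "speckle_points": 20000},
    1: {"doe": "DOE_C", "speckle_points": 10000},   # lower precision level
}

def select_doe(precision_level: int) -> str:
    """Return the preset DOE the laser lamp should switch to before emitting
    laser speckle for an acquisition at this precision level."""
    return DOE_TABLE[precision_level]["doe"]

assert select_doe(2) == "DOE_B"
```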
Specifically, if the number of the speckle points is adjusted by the laser lamp, the first processing unit acquires the precision level according to the received image acquisition instruction, and switches different DOEs according to the precision level, so that the number of scattered spots included in laser speckles generated by diffraction of the laser lamp is adjusted, and speckle images formed by irradiating the laser speckles on an object are acquired by the laser camera. If the number of the speckle points is adjusted in a software mode, the first processing unit corrects the speckle image according to the parallax image to obtain a corrected speckle image, and then adjusts the number of the speckle points contained in the corrected speckle image according to the precision level.
The method for acquiring the target image specifically comprises the following steps: calculating to obtain a depth image according to the adjusted speckle image, and taking the depth image as a target image; or, the adjusted speckle image is taken as the target image. Specifically, the electronic device calibrates the laser speckle in advance to obtain a reference image, and stores the reference image in the electronic device. Generally, a reference image is formed by irradiating laser speckle onto a reference plane, and the reference image is also an image with a plurality of reference scattered spots, each having corresponding reference depth information. When the depth information of the shot object needs to be acquired, the actually acquired speckle image can be compared with the reference image, and the actual depth information is calculated according to the offset of the scattered spots in the actually acquired speckle image. The step of calculating the depth image specifically includes: acquiring a reference image; comparing the reference image with the speckle image to obtain offset information, wherein the offset information is used for representing the horizontal offset of the speckle point in the speckle image relative to the corresponding scattered spot in the reference image; and calculating to obtain a depth image according to the offset information and the reference depth information.
FIG. 4 is a schematic diagram of computing depth information in one embodiment. As shown in fig. 4, the laser lamp 402 can generate laser speckle, which is reflected off an object and then captured by the laser camera 404 to form an image. In the calibration process of the camera, the laser speckle emitted by the laser lamp 402 is reflected by the reference plane 408, the reflected light is collected by the laser camera 404, and a reference image is obtained by imaging through the imaging plane 410. The reference depth from the reference plane 408 to the laser lamp 402 is L, which is known. In the process of actually calculating the depth information, the laser speckle emitted by the laser lamp 402 is reflected by the object 406, the reflected light is collected by the laser camera 404, and an actual speckle image is obtained by imaging through the imaging plane 410. The calculation formula for obtaining the actual depth information is as follows:
Dis = (CD × L × f) / (L × AB + CD × f)
where Dis is the actual depth of the object 406, L is the distance between the laser lamp 402 and the reference plane 408, f is the focal length of the lens in the laser camera 404, CD is the distance between the laser lamp 402 and the laser camera 404, and AB is the offset distance between the image of the object 406 and the image of the reference plane 408. AB may be the product of the pixel offset n and the actual distance p of a pixel. When the distance Dis between the object 406 and the laser lamp 402 is greater than the distance L between the reference plane 408 and the laser lamp 402, AB is a negative value; when the distance Dis between the object 406 and the laser lamp 402 is less than the distance L between the reference plane 408 and the laser lamp 402, AB is a positive value.
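A small numerical sketch of the relation above (the formula is the reconstruction given above from the definitions in this paragraph, and all calibration values are made up):

```python
def actual_depth(L, f, CD, pixel_offset_n, pixel_size_p):
    """Depth Dis of the object from the laser lamp, using the reconstructed
    triangulation relation Dis = (CD * L * f) / (L * AB + CD * f), with
    AB = n * p the signed offset between the object image and the reference
    image (negative when the object is farther away than the reference plane)."""
    AB = pixel_offset_n * pixel_size_p
    return (CD * L * f) / (L * AB + CD * f)

# Zero offset must reproduce the reference depth L; a positive offset (object
# closer than the reference plane) must give a depth smaller than L.
assert abs(actual_depth(L=500.0, f=2.0, CD=30.0, pixel_offset_n=0, pixel_size_p=0.01) - 500.0) < 1e-9
assert actual_depth(L=500.0, f=2.0, CD=30.0, pixel_offset_n=5, pixel_size_p=0.01) < 500.0
```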
And step 308, acquiring a reference image pre-stored in the electronic device, and encrypting the target image according to the reference image.
The target image may be encrypted before being sent to the target application. The reference image is a speckle image acquired by the electronic device when the camera module is calibrated, and the reference image acquired by different electronic devices is different due to the high uniqueness of the reference image. The reference image itself can be used as an encryption key for encrypting data. The electronic device can store the reference image in a secure environment, which can prevent data leakage. Specifically, the acquired reference image is formed by a two-dimensional pixel matrix, and each pixel point has a corresponding pixel value. The encryption process may be performed on the target image based on all or a part of the pixel points of the reference image. For example, the reference image may be directly superimposed with the target image to obtain an encrypted image. Or performing product operation on the pixel matrix corresponding to the target image and the pixel matrix corresponding to the reference image to obtain the encrypted image. The pixel value corresponding to one or more pixel points in the reference image may also be used as an encryption key to encrypt the target image, and the specific encryption algorithm is not limited in this embodiment.
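As one possible concrete form of the superposition scheme mentioned above (the per-pixel modular arithmetic, which keeps values in the 8-bit range and makes the step reversible for a holder of the reference image, is an assumption, not a detail given in the embodiment):

```python
import numpy as np

def encrypt_with_reference(target: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Superimpose the reference speckle image onto the target image
    (modular addition per pixel); both images must have the same shape."""
    assert target.shape == reference.shape
    return ((target.astype(np.uint16) + reference.astype(np.uint16)) % 256).astype(np.uint8)

def decrypt_with_reference(cipher: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Inverse of the superposition, available only to a holder of the reference image."""
    return ((cipher.astype(np.int16) - reference.astype(np.int16)) % 256).astype(np.uint8)

target = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
reference = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
assert np.array_equal(decrypt_with_reference(encrypt_with_reference(target, reference), reference), target)
```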
And step 310, sending the encrypted target image to the target application program.
The reference image is generated when the electronic device is calibrated, the electronic device can store the reference image in a secure environment in advance, read the reference image in the secure environment when the target image needs to be encrypted, and encrypt the target image according to the reference image. Meanwhile, the same reference image is stored in the server corresponding to the target application program, and after the electronic equipment sends the encrypted target image to the server corresponding to the target application program, the server of the target application program acquires the reference image and decrypts the encrypted target image according to the acquired reference image.
It is understood that the server of the target application may store a plurality of reference images acquired by different electronic devices, and the reference image corresponding to each electronic device is different. Therefore, the server may define a reference image identifier for each reference image, store the device identifier of the electronic device, and then establish a corresponding relationship between the reference image identifier and the device identifier. When the server receives the target image, the received target image can carry the device identifier of the electronic device at the same time. The server can search the corresponding reference image identifier according to the equipment identifier, find the corresponding reference image according to the reference image identifier, and then decrypt the target image according to the found reference image.
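The server-side bookkeeping described above (device identifier to reference image identifier to reference image) might be sketched as follows; the class and method names are assumptions:

```python
class ReferenceRegistry:
    """Illustrative server-side registry of reference images keyed by device."""

    def __init__(self):
        self.ref_id_by_device = {}   # device identifier -> reference image identifier
        self.ref_image_by_id = {}    # reference image identifier -> reference image data

    def register(self, device_id, ref_id, ref_image):
        """Store a calibrated reference image and its correspondence to a device."""
        self.ref_id_by_device[device_id] = ref_id
        self.ref_image_by_id[ref_id] = ref_image

    def reference_for(self, device_id):
        """Resolve the reference image needed to decrypt a target image that
        arrived carrying this device identifier."""
        ref_id = self.ref_id_by_device[device_id]
        return self.ref_image_by_id[ref_id]

registry = ReferenceRegistry()
registry.register("device-001", "ref-A7", b"\x12\x34")
assert registry.reference_for("device-001") == b"\x12\x34"
```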
In other embodiments provided in the present application, the method for performing encryption processing according to a reference image may specifically include: acquiring a pixel matrix corresponding to a reference image, and acquiring an encryption key according to the pixel matrix; and carrying out encryption processing on the target image according to the encryption key.
Specifically, the reference image is composed of a two-dimensional pixel matrix, and since the acquired reference image is unique, the pixel matrix corresponding to the reference image is also unique. The pixel matrix itself can be used as an encryption key to encrypt the target image, or the pixel matrix can be converted to obtain the encryption key, and then the encryption key obtained by conversion is used to encrypt the target image. For example, the pixel matrix is a two-dimensional matrix formed by a plurality of pixel values, and the position of each pixel value in the pixel matrix can be represented by a two-dimensional coordinate, so that the corresponding pixel value can be obtained by one or more position coordinates, and the obtained one or more pixel values are combined into an encryption key. After the encryption key is obtained, the target image may be encrypted according to the encryption key, and specifically, the encryption algorithm is not limited in this embodiment. For example, the encryption key may be directly superimposed or multiplied with the target image, or the encryption key may be inserted as a value into the target image to obtain the final encrypted target image.
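A minimal sketch of deriving a key from selected pixel positions of the reference image (the coordinate choice, the key length and how the key is consumed are all assumptions; any conventional symmetric cipher could use the derived key):

```python
import numpy as np

def key_from_reference(reference: np.ndarray, coords, key_length: int = 16) -> bytes:
    """Combine the pixel values found at the chosen (row, col) coordinates of the
    reference image into a fixed-length key (repeated/truncated to key_length)."""
    values = bytes(int(reference[r, c]) & 0xFF for (r, c) in coords)
    repeats = key_length // len(values) + 1
    return (values * repeats)[:key_length]

reference = np.arange(64, dtype=np.uint8).reshape(8, 8)
key = key_from_reference(reference, [(0, 1), (2, 3), (7, 7)])
assert len(key) == 16
```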
The electronic device may also employ different encryption algorithms for different applications. Specifically, the electronic device may pre-establish a correspondence between an application identifier of the application program and the encryption algorithm, and the image acquisition instruction may include a target application identifier of the target application program. After receiving the image acquisition instruction, the target application identifier contained in the image acquisition instruction can be acquired, the corresponding encryption algorithm is acquired according to the target application identifier, and the target image is encrypted according to the acquired encryption algorithm.
In one embodiment, when the target image is encrypted, encryption processing of different degrees can be performed according to how long ago the image acquisition instruction was initiated. Specifically, the method comprises the following steps:
step 502, a timestamp included in the image capturing instruction is obtained, where the timestamp is used to indicate a time when the image capturing instruction is initiated.
When the target application program initiates an image acquisition instruction, a timestamp is written into the image acquisition instruction. The timestamp can represent the moment at which the image acquisition instruction was initiated, and the length of time since the image acquisition instruction was initiated can be judged according to the timestamp. For example, if the target application program initiates an image acquisition instruction at the moment "11:23:01 12/02/2015", this moment is written into the image acquisition instruction as the timestamp.
And step 504, acquiring an encryption grade according to the interval duration from the timestamp to the current time, and performing encryption processing corresponding to the encryption grade on the target image according to the reference image.
When the first processing unit of the electronic device sends the target image, it may acquire the timestamp contained in the image acquisition instruction and acquire the current time. The encryption grade is acquired according to the interval duration from the timestamp to the current moment, and encryption processing corresponding to the encryption grade is performed on the target image according to the reference image. The longer the interval from the timestamp to the current time, the less secure the transmission is considered, and the higher the encryption grade of the corresponding encryption processing. Specifically, when the interval duration exceeds a duration threshold, the current response is considered to have timed out, the target image is directly discarded, and the target image is no longer sent to the target application program. When the interval duration is less than the duration threshold, the target image is sent to the target application program. Step 504 may specifically include: if the interval duration from the timestamp to the current moment is less than the duration threshold, acquiring the encryption grade according to the interval duration, and performing encryption processing corresponding to the encryption grade on the target image according to the reference image.
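A sketch of how the interval could be mapped to an encryption grade (the duration threshold and the grade boundaries are assumed values, not specified by the embodiment):

```python
import time

DURATION_THRESHOLD = 5.0   # seconds; assumed value of the duration threshold

def encryption_grade_for(timestamp, now=None):
    """Return the encryption grade for the target image, or None when the
    response has timed out and the target image should be discarded."""
    if now is None:
        now = time.time()
    interval = now - timestamp
    if interval >= DURATION_THRESHOLD:
        return None              # timed out: do not send the target image
    if interval < 1.0:
        return 1                 # short interval -> lighter encryption
    if interval < 3.0:
        return 2
    return 3                     # longer interval -> stronger encryption

assert encryption_grade_for(timestamp=0.0, now=2.0) == 2
assert encryption_grade_for(timestamp=0.0, now=6.0) is None
```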
In one embodiment, the step of sending the target image by the image processing method may further include:
step 602, acquiring the running state of a target application program;
and step 604, if the target application program is operated in the foreground, sending the target image to the target application program.
When the electronic device runs an application program, the running state of the application program can be divided into foreground running and background running. An application running in the foreground can interact with the user, while an application running in the background generally cannot. Before the target image is sent, the running state of the target application program can be judged; if the target application program is running in the background, the background application program is considered to be illegally calling the camera to acquire images, and the target image is not sent to it. If the target application program is running in the foreground, the target image is sent to the target application program.
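A short sketch of the dispatch decision in steps 602-604 (the running-state values and the send callback are assumptions):

```python
def dispatch_target_image(target_image, running_state: str, send_fn) -> bool:
    """Send the target image only when the target application runs in the
    foreground; a background caller is treated as an illegal camera invocation
    and the image is withheld."""
    if running_state == "foreground":
        send_fn(target_image)
        return True
    return False

sent = []
assert dispatch_target_image("img", "foreground", sent.append) is True
assert dispatch_target_image("img", "background", sent.append) is False
assert sent == ["img"]
```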
According to the image processing method provided in the foregoing embodiment, when an image acquisition instruction is detected, the application level of the target application program initiating the image acquisition instruction and the security level of the corresponding application operation can be obtained, and the corresponding precision level can be obtained according to the application level and the security level. The number of scattered spots contained in the acquired speckle image is then adjusted according to the precision level, and a target image is acquired according to the adjusted speckle image. Finally, the target image is encrypted according to the reference image, and the encrypted target image is sent to the target application program. In this way, target images with different accuracies can be sent to different target application programs: a target application program with a lower application level obtains a target image with lower accuracy, so the security of the sent target image can be ensured. Meanwhile, the target image is encrypted before being transmitted, which further improves the security of image processing.
It should be understood that, although the steps in the flowcharts of fig. 2, 3, 5 and 6 are shown in order as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated otherwise herein, the steps are not performed in a strictly limited order and may be performed in other orders. Moreover, at least some of the steps in fig. 2, 3, 5 and 6 may include multiple sub-steps or multiple stages, which are not necessarily performed at the same moment but may be performed at different moments, and the order of performing the sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 7 is a hardware configuration diagram for implementing an image processing method in one embodiment. As shown in fig. 7, the electronic device may include a camera module 710, a Central Processing Unit (CPU) 720 and a first processing unit 730, wherein the camera module 710 includes a laser camera 712, a floodlight 714, an RGB (Red/Green/Blue color mode) camera 716 and a laser lamp 718. The first processing unit 730 includes a PWM (Pulse Width Modulation) module 732, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) module 734, a RAM (Random Access Memory) module 736 and a Depth Engine module 738. The second processing unit 722 may be a CPU core in a TEE (Trusted Execution Environment), and the first processing unit 730 may be an MCU (Micro Control Unit) processor. It is understood that the central processing unit 720 may be in a multi-core operation mode, and the CPU cores in the central processing unit 720 may operate in a TEE or an REE (Rich Execution Environment). Both the TEE and the REE are running modes of an ARM module (Advanced RISC Machines). Generally, an operation behavior with higher security requirements in the electronic device needs to be executed under the TEE, while other operation behaviors can be executed under the REE. In the embodiment of the present application, when the central processing unit 720 receives an image acquisition instruction with a higher security requirement initiated by a target application program, for example when the target application program needs face information for unlocking or needs face information for payment, the CPU core running under the TEE, i.e. the second processing unit 722, sends the image acquisition instruction to the SPI/I2C module 734 in the MCU 730 through the SECURE SPI/I2C. The first processing unit 730 transmits a pulse wave through the PWM module 732 to control the floodlight 714 in the camera module 710 to turn on and acquire an infrared image, and to control the laser lamp 718 in the camera module 710 to turn on and acquire a speckle image. The camera module 710 may transmit the collected speckle image to the Depth Engine module 738 in the first processing unit 730, and the Depth Engine module 738 may calculate an infrared parallax image according to the infrared image, calculate a speckle parallax image according to the speckle image, and transmit the infrared image, the infrared parallax image, the speckle image and the speckle parallax image to the second processing unit 722. The second processing unit 722 corrects the infrared image according to the infrared parallax image to obtain a corrected infrared image, and corrects the speckle image according to the speckle parallax image to obtain a corrected speckle image. The second processing unit 722 may then calculate a depth image according to the corrected speckle image, perform face recognition according to the corrected infrared image, and detect whether a face exists in the corrected infrared image and whether the detected face matches a stored face; if the face passes the recognition, living body detection is performed according to the corrected infrared image and the depth image to detect whether the face is a living face. In one embodiment, after the corrected infrared image and the depth image are acquired, living body detection may be performed before face recognition, or face recognition and living body detection may be performed simultaneously.
After the face recognition passes and the detected face is a living face, the second processing unit 722 may send one or more of the above-described corrected infrared image, corrected speckle image, depth image, and face recognition result to the target application program.
Fig. 8 is a hardware configuration diagram for implementing an image processing method in another embodiment. As shown in fig. 8, the hardware structure includes a first processing unit 80, a camera module 82 and a second processing unit 84. The camera module 82 comprises a laser camera 820, a floodlight 822, an RGB camera 824 and a laser lamp 826. The central processing unit may include a CPU core under the TEE and a CPU core under the REE, the first processing unit 80 is a DSP processing module provided in the central processing unit, and the second processing unit 84 is the CPU core under the TEE. The second processing unit 84 and the first processing unit 80 may be connected through a secure buffer, so that security in the image transmission process can be ensured. In general, when the central processing unit processes an operation behavior with higher security requirements, it needs to switch the processor core to execute under the TEE, while an operation behavior with lower security requirements can be executed under the REE. In the embodiment of the application, the image acquisition instruction sent by the upper-layer application can be received by the second processing unit 84, and when the application operation corresponding to the image acquisition instruction received by the second processing unit 84 is a secure operation, the floodlight 822 in the camera module 82 can be controlled to turn on through the PWM module to acquire an infrared image, and the laser lamp 826 in the camera module 82 is then controlled to turn on to acquire a speckle image. The camera module 82 can transmit the collected infrared image and speckle image to the first processing unit 80; the first processing unit 80 can calculate a depth image according to the speckle image, then calculate a depth parallax image according to the depth image, and calculate an infrared parallax image according to the infrared image. The infrared parallax image and the depth parallax image are then sent to the second processing unit 84. The second processing unit 84 may perform correction according to the infrared parallax image to obtain a corrected infrared image, and perform correction according to the depth parallax image to obtain a corrected depth image. The second processing unit 84 performs face authentication according to the corrected infrared image, detecting whether a face exists in the corrected infrared image and whether the detected face matches a stored face; if the face passes the authentication, living body detection is performed according to the corrected infrared image and the corrected depth image to judge whether the face is a living face. After the second processing unit 84 performs the face authentication and the living body detection, the processing result is sent to the target application program, and the target application program performs application operations such as unlocking and payment according to the detection result.
FIG. 9 is a diagram illustrating a software architecture for implementing an image processing method according to an embodiment. As shown in fig. 9, the software architecture includes an application layer 910, an operating system 920 and a secure runtime environment 930. The modules in the secure runtime environment 930 include a first processing unit 931, a camera module 932, a second processing unit 933, an encryption module 934 and the like; the operating system 920 comprises a security management module 921, a face management module 922, a camera driver 923 and a camera frame 924; and the application layer 910 contains an application program 911. The application program 911 may initiate an image acquisition instruction and send the image acquisition instruction to the first processing unit 931 for processing. For example, when operations such as payment, unlocking, beautification and Augmented Reality (AR) are performed by acquiring a human face, the application program may initiate an image acquisition instruction for acquiring a face image. It is to be understood that the image acquisition instruction initiated by the application program 911 may first be sent to the second processing unit 933 and then sent by the second processing unit 933 to the first processing unit 931.
After the first processing unit 931 receives the image acquisition instruction, if it determines that the application operation corresponding to the instruction is a secure operation (e.g., payment or unlocking), it controls the camera module 932 to capture an infrared image and a speckle image according to the instruction, and the camera module 932 transmits the captured infrared image and speckle image back to the first processing unit 931. The first processing unit 931 calculates a depth image containing depth information from the speckle image, calculates a depth parallax image from the depth image, and calculates an infrared parallax image from the infrared image. The depth parallax image and the infrared parallax image are then transmitted to the second processing unit 933 through a secure transmission channel. The second processing unit 933 corrects the infrared parallax image to obtain a corrected infrared image, and corrects the depth parallax image to obtain a corrected depth image. It then performs face authentication according to the corrected infrared image, detecting whether a face exists in the corrected infrared image and whether the detected face matches a stored face; if the face passes authentication, living-body detection is performed according to the corrected infrared image and the corrected depth image to determine whether the face is a living face. The face recognition result obtained by the second processing unit 933 may be sent to the encryption module 934, and after encryption by the encryption module 934, the encrypted face recognition result is sent to the security management module 921. Generally, each application program 911 has a corresponding security management module 921; the security management module 921 decrypts the encrypted face recognition result and sends the decrypted result to the corresponding face management module 922. The face management module 922 sends the face recognition result to the upper-layer application 911, and the application 911 performs the corresponding operation according to the result.
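The hand-off from the secure environment to the application in Fig. 9 can be pictured as below. This is a minimal sketch: the module classes are hypothetical, and the byte-reversal "cipher" is a placeholder chosen only to keep the example runnable, since the application does not specify the encryption algorithm used by the encryption module 934.

```python
# Illustrative routing of the face recognition result through the modules of
# Fig. 9. Class names and the trivial "cipher" are placeholders only.
import json

class EncryptionModule:                      # encryption module 934
    def encrypt(self, result: dict) -> bytes:
        return json.dumps(result).encode()[::-1]   # placeholder, not real crypto

class SecurityManagementModule:              # security management module 921
    def decrypt(self, blob: bytes) -> dict:
        return json.loads(blob[::-1].decode())

class FaceManagementModule:                  # face management module 922
    def forward(self, result: dict, application) -> None:
        application.on_face_result(result)

class Application:                           # application program 911
    def on_face_result(self, result: dict) -> None:
        action = "unlock" if result["face_authenticated"] and result["living_body"] else "reject"
        print("application performs:", action)

blob = EncryptionModule().encrypt({"face_authenticated": True, "living_body": True})
result = SecurityManagementModule().decrypt(blob)
FaceManagementModule().forward(result, Application())
```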
If the application operation corresponding to the image acquisition instruction received by the first processing unit 931 is a non-secure operation (e.g., a beautification or AR operation), the first processing unit 931 may control the camera module 932 to capture a speckle image, calculate a depth image from the speckle image, and then obtain a depth parallax image from the depth image. The first processing unit 931 sends the depth parallax image to the camera driver 923 through a non-secure transmission channel; the camera driver 923 corrects the depth parallax image to obtain a corrected depth image and sends it to the camera framework 924, which then sends the corrected depth image to the face management module 922 or the application 911.
Fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 10, the image processing apparatus 1000 includes a precision obtaining module 1002, an image adjusting module 1004, and an image sending module 1006. Wherein:
the precision obtaining module 1002 is configured to, if an image acquisition instruction is detected, obtain a precision level corresponding to a target application program that initiates the image acquisition instruction.
The image adjusting module 1004 is configured to adjust the number of scattered spots contained in the collected speckle image according to the precision level and acquire a target image according to the adjusted speckle image; the speckle image is an image, collected by a laser camera, formed when laser speckles are irradiated onto an object.
The image sending module 1006 is configured to send the target image to the target application program.
The image processing apparatus provided in the foregoing embodiment may obtain, when an image acquisition instruction is detected, the precision level corresponding to the target application program that initiated the instruction, adjust the number of scattered spots contained in the collected speckle image according to that precision level, acquire a target image from the adjusted speckle image, and finally send the target image to the target application program. Target images of different accuracies can therefore be sent to different target application programs: a target application program with a lower application level receives a target image of lower accuracy, so the security of the transmitted target image can be ensured.
In one embodiment, the precision obtaining module 1002 is further configured to obtain an application level corresponding to the target application program that initiates the image acquisition instruction, obtain a security level of the application operation corresponding to the image acquisition instruction, and acquire the precision level according to the application level and the security level.
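A compact way to read "acquire the precision level according to the application level and the security level" is a lookup such as the one below. The application types, the operation-to-security mapping, the numeric scales, and the min() rule are all illustrative assumptions; the application only states that both inputs determine the precision level.

```python
# Illustrative mapping from application level and security level to precision level.
APP_LEVELS = {"system": 3, "trusted_third_party": 2, "ordinary": 1}
OPERATION_SECURITY = {"payment": 3, "unlock": 3, "ar": 2, "beauty": 1}

def precision_level(app_type: str, operation: str) -> int:
    app = APP_LEVELS.get(app_type, 1)
    sec = OPERATION_SECURITY.get(operation, 1)
    # A less trusted application or a less security-critical operation lowers
    # the precision level, i.e. fewer scattered spots in the speckle image.
    return min(app, sec)

print(precision_level("ordinary", "beauty"))   # -> 1, low-precision target image
print(precision_level("system", "payment"))    # -> 3, full-precision target image
```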
In one embodiment, the image adjusting module 1004 is further configured to adjust, according to the precision level, the number of scattered spots contained in the laser speckles generated by diffraction of the laser lamp, and collect, through the laser camera, a speckle image formed when the laser speckles are irradiated onto an object; or to collect, through the laser camera, a speckle image formed when preset laser speckles are irradiated onto an object, and adjust the number of scattered spots contained in the speckle image according to the precision level.
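For the second option, where the speckle pattern itself is left unchanged and the spot count is reduced in software after capture, a sketch might look as follows. Representing the speckle image as a list of detected (x, y) spot coordinates and the keep-ratio per precision level are assumptions made for illustration.

```python
# Illustrative software-side reduction of scattered spots per precision level.
import random

KEEP_RATIO = {1: 0.25, 2: 0.5, 3: 1.0}   # precision level -> fraction of spots kept

def adjust_speckle_spots(spots: list, level: int) -> list:
    keep = int(len(spots) * KEEP_RATIO.get(level, 1.0))
    # Keep a deterministic subset so repeated captures remain comparable.
    return sorted(spots)[:keep]

spots = [(random.randint(0, 639), random.randint(0, 479)) for _ in range(30000)]
print(len(adjust_speckle_spots(spots, level=1)))   # 7500 spots remain at the lowest level
```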
In one embodiment, the image adjusting module 1004 is further configured to calculate a depth image according to the adjusted speckle image and use the depth image as the target image, or to use the adjusted speckle image as the target image.
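Computing the depth image from the adjusted speckle image is typically done by matching speckle blocks against the calibrated reference image and converting the resulting disparity to depth. The reference-plane triangulation below is a standard structured-light formula shown as an assumption, since the application does not spell out the exact computation; the parameter values (reference distance, baseline, focal length, pixel size) are illustrative.

```python
# Illustrative disparity-to-depth conversion for a structured-light setup.
def depth_from_disparity(disparity_px: float, z0: float = 500.0,
                         baseline: float = 30.0, focal: float = 2.0,
                         pixel: float = 0.003) -> float:
    """All lengths in millimetres; z0 is the calibrated reference-plane distance."""
    return (z0 * baseline * focal) / (baseline * focal + z0 * disparity_px * pixel)

def depth_image(disparity_map: list) -> list:
    # Convert a per-pixel disparity map (pixels) into a depth image (mm).
    return [[depth_from_disparity(d) for d in row] for row in disparity_map]

print(depth_image([[0.0, 5.0], [-5.0, 10.0]]))   # zero disparity maps back to z0
```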
In one embodiment, the image sending module 1006 is further configured to obtain the running state of the target application program, and to send the target image to the target application program if the target application program is running in the foreground.
In one embodiment, the image sending module 1006 is further configured to obtain a reference image pre-stored in the electronic device and encrypt the target image according to the reference image, where the reference image is an image, obtained by calibration, that contains reference scattered spots; and to send the encrypted target image to the target application program.
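One possible reading of "encrypt the target image according to the reference image" is to derive a keystream from a hash of the calibrated reference image and combine it with the target image bytes, as sketched below. The application only requires that the reference image takes part in the encryption, so the scheme, function names, and data are assumptions, not the claimed algorithm.

```python
# Illustrative reference-image-keyed encryption of the target image.
import hashlib

def encrypt_with_reference(target: bytes, reference: bytes) -> bytes:
    seed = hashlib.sha256(reference).digest()       # key material from the reference image
    keystream = b""
    counter = 0
    while len(keystream) < len(target):
        keystream += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(t ^ k for t, k in zip(target, keystream))

# Decryption is the same operation, since XOR with the keystream is its own inverse.
reference_image = bytes(range(256)) * 4             # stand-in for the calibrated image
target_image = b"\x10\x20\x30" * 100                # stand-in for the target image
cipher = encrypt_with_reference(target_image, reference_image)
assert encrypt_with_reference(cipher, reference_image) == target_image
```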
In one embodiment, the image sending module 1006 is further configured to obtain a timestamp contained in the image acquisition instruction, where the timestamp represents the time at which the image acquisition instruction was initiated; acquire an encryption grade according to the interval between the timestamp and the current time; and perform encryption processing corresponding to that encryption grade on the target image according to the reference image.
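Mapping the interval between the instruction's timestamp and the current time to an encryption grade could be as simple as the sketch below. The thresholds, the grade names, and the choice that a longer interval yields a stronger grade are all assumptions; the application only states that the grade is obtained from the interval duration.

```python
# Illustrative mapping from instruction age to an encryption grade.
import time
from typing import Optional

def encryption_grade(instruction_timestamp: float, now: Optional[float] = None) -> str:
    now = time.time() if now is None else now
    interval = now - instruction_timestamp          # seconds since the instruction was initiated
    if interval < 1.0:
        return "low"
    if interval < 5.0:
        return "medium"
    return "high"

print(encryption_grade(time.time() - 2.5))   # -> "medium"
```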
The division of the modules in the image processing apparatus is only for illustration; in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of its functions.
Embodiments of the application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing method provided by the above-described embodiments.
A computer program product comprising instructions is also provided which, when run on a computer, causes the computer to perform the image processing method provided by the above-described embodiments.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above-mentioned embodiments express only several implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (8)

1. An image processing method, comprising:
if an image acquisition instruction is detected, acquiring a precision level corresponding to a target application program initiating the image acquisition instruction;
adjusting the number of scattered spots contained in the collected speckle image according to the precision level, and acquiring a target image according to the adjusted speckle image; wherein the speckle image is an image, collected by a laser camera, formed when laser speckles are irradiated onto an object;
sending the target image to the target application program;
wherein sending the target image to the target application program comprises:
acquiring a reference image pre-stored in an electronic device, and encrypting the target image according to the reference image, wherein the reference image is an image, obtained by calibration, that contains reference scattered spots;
sending the encrypted target image to the target application program;
wherein encrypting the target image according to the reference image comprises:
acquiring a timestamp contained in the image acquisition instruction, wherein the timestamp represents the time at which the image acquisition instruction was initiated;
and acquiring an encryption grade according to the interval between the timestamp and the current time, and performing encryption processing corresponding to the encryption grade on the target image according to the reference image.
2. The method of claim 1, wherein acquiring the precision level corresponding to the target application program that initiates the image acquisition instruction comprises:
acquiring an application level corresponding to the target application program that initiates the image acquisition instruction, and acquiring a security level of the application operation corresponding to the image acquisition instruction;
and acquiring the precision level according to the application level and the security level.
3. The method of claim 1, wherein adjusting the number of scattered spots contained in the collected speckle image according to the precision level comprises:
adjusting, according to the precision level, the number of scattered spots contained in the laser speckles generated by diffraction of the laser lamp, and collecting, through a laser camera, a speckle image formed when the laser speckles are irradiated onto an object; or,
collecting, through a laser camera, a speckle image formed when preset laser speckles are irradiated onto an object, and adjusting the number of scattered spots contained in the speckle image according to the precision level.
4. The method of claim 1, wherein acquiring the target image according to the adjusted speckle image comprises:
calculating a depth image according to the adjusted speckle image and taking the depth image as the target image; or,
taking the adjusted speckle image as the target image.
5. The method of any one of claims 1 to 4, wherein sending the target image to the target application program comprises:
acquiring the running state of the target application program;
and when the target application program runs in the foreground, sending the target image to the target application program.
6. An image processing apparatus characterized by comprising:
a precision obtaining module, configured to acquire, if an image acquisition instruction is detected, a precision level corresponding to a target application program that initiates the image acquisition instruction;
an image adjusting module, configured to adjust the number of scattered spots contained in the collected speckle image according to the precision level and acquire a target image according to the adjusted speckle image, wherein the speckle image is an image, collected by a laser camera, formed when laser speckles are irradiated onto an object; and
an image sending module, configured to send the target image to the target application program;
wherein the image sending module is further configured to acquire a reference image pre-stored in an electronic device and encrypt the target image according to the reference image, the reference image being an image, obtained by calibration, that contains reference scattered spots, and to send the encrypted target image to the target application program;
and the image sending module is further configured to acquire a timestamp contained in the image acquisition instruction, the timestamp representing the time at which the image acquisition instruction was initiated, acquire an encryption grade according to the interval between the timestamp and the current time, and perform encryption processing corresponding to the encryption grade on the target image according to the reference image.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
8. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-5.
CN201810404508.6A 2018-04-28 2018-04-28 Image processing method, image processing device, computer-readable storage medium and electronic equipment Active CN108881712B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201810404508.6A CN108881712B (en) 2018-04-28 2018-04-28 Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP19792627.2A EP3644261B1 (en) 2018-04-28 2019-03-29 Image processing method, apparatus, computer-readable storage medium, and electronic device
PCT/CN2019/080559 WO2019205889A1 (en) 2018-04-28 2019-03-29 Image processing method, apparatus, computer-readable storage medium, and electronic device
US16/740,925 US11308636B2 (en) 2018-04-28 2020-01-13 Method, apparatus, and computer-readable storage medium for obtaining a target image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810404508.6A CN108881712B (en) 2018-04-28 2018-04-28 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108881712A CN108881712A (en) 2018-11-23
CN108881712B true CN108881712B (en) 2020-02-14

Family

ID=64326877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810404508.6A Active CN108881712B (en) 2018-04-28 2018-04-28 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108881712B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019205889A1 (en) * 2018-04-28 2019-10-31 Oppo广东移动通信有限公司 Image processing method, apparatus, computer-readable storage medium, and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014143689A1 (en) * 2013-03-15 2014-09-18 Google Inc. Overlaying two-dimensional map elements over terrain geometry
CN107832677A (en) * 2017-10-19 2018-03-23 深圳奥比中光科技有限公司 Face identification method and system based on In vivo detection
CN108549867A (en) * 2018-04-12 2018-09-18 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9047688B2 (en) * 2011-10-21 2015-06-02 Here Global B.V. Depth cursor and depth measurement in images
CN105787329B (en) * 2016-03-30 2019-06-25 上海越满网络科技有限公司 A kind of encryption method and system based on comprehensive identification
CN106209823B (en) * 2016-07-08 2019-04-23 西安电子科技大学 A kind of lightweight file remote encryption method under mobile cloud computing environment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014143689A1 (en) * 2013-03-15 2014-09-18 Google Inc. Overlaying two-dimensional map elements over terrain geometry
CN107832677A (en) * 2017-10-19 2018-03-23 深圳奥比中光科技有限公司 Face identification method and system based on In vivo detection
CN108549867A (en) * 2018-04-12 2018-09-18 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment

Also Published As

Publication number Publication date
CN108881712A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN108764052B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108549867B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108805024B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108804895B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108711054B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108668078B (en) Image processing method, device, computer readable storage medium and electronic equipment
US11275927B2 (en) Method and device for processing image, computer readable storage medium and electronic device
US11256903B2 (en) Image processing method, image processing device, computer readable storage medium and electronic device
US11146735B2 (en) Image processing methods and apparatuses, computer readable storage media, and electronic devices
CN108921903B (en) Camera calibration method, device, computer readable storage medium and electronic equipment
CN108830141A (en) Image processing method, device, computer readable storage medium and electronic equipment
WO2019196684A1 (en) Data transmission method and apparatus, computer readable storage medium, electronic device, and mobile terminal
CN108573170A (en) Information processing method and device, electronic equipment, computer readable storage medium
CN108712400B (en) Data transmission method and device, computer readable storage medium and electronic equipment
EP3621294B1 (en) Method and device for image capture, computer readable storage medium and electronic device
CN108924421B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108881712B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108833885B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
US11308636B2 (en) Method, apparatus, and computer-readable storage medium for obtaining a target image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant