CN107784232B - Picture processing method and mobile terminal - Google Patents

Picture processing method and mobile terminal

Info

Publication number
CN107784232B
CN107784232B
Authority
CN
China
Prior art keywords
target area
picture
fingerprint information
user
target
Prior art date
Legal status
Active
Application number
CN201710970603.8A
Other languages
Chinese (zh)
Other versions
CN107784232A (en)
Inventor
彭作
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201710970603.8A
Publication of CN107784232A
Application granted
Publication of CN107784232B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 - Protecting data
    • G06F 21/602 - Providing cryptographic facilities or services
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a picture processing method and a mobile terminal, and relates to the technical field of mobile communication. The method comprises the following steps: detecting a touch operation of a user on a picture; if a first touch operation matched with a preset encryption operation is detected, acquiring a first operation characteristic of the first touch operation; determining a target area to be processed in the picture based on the first operation characteristic; extracting first fingerprint information from the first operation characteristic; and encrypting the target area based on the first fingerprint information. The invention solves the problems in the prior art that the original picture information cannot be viewed from a mosaic-processed picture and that the processed picture and the original picture must both be stored.

Description

Picture processing method and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of mobile communication, in particular to a picture processing method and a mobile terminal.
Background
With the rapid development of mobile communication technology, mobile terminal devices such as smart phones are widely used for photographing and picture processing. When a user shares a photo on a mobile phone with other people or with a third-party social application and does not want certain information in the photo, such as the background or a person, to be seen by others, the usual approach is to post-process the photo. The common picture processing method is to edit the original picture and smear certain areas on it, that is, to add a mosaic. After the mosaic processing is completed, the processed picture must be saved in addition to the original picture. The drawbacks of this approach are as follows: when the user wants to view the original picture information, the user can only do so from the original picture; if the original picture is lost, the corresponding information in the picture is lost. Moreover, after the picture is processed, the processed picture and the original picture are stored at the same time, which increases the consumption of the memory of the mobile terminal.
Disclosure of Invention
The invention provides a picture processing method and a mobile terminal, and aims to solve the problems in the prior art that the original picture information cannot be viewed from a mosaic-processed picture and that the processed picture and the original picture must both be stored.
In order to solve the technical problem, the invention is realized as follows: a method of picture processing, the method comprising:
detecting touch operation of a user on a picture;
if a first touch operation matched with a preset encryption operation is detected, acquiring a first operation characteristic of the first touch operation;
determining a target area to be processed in the picture based on the first operation characteristic;
extracting first fingerprint information from the first operation characteristic;
and carrying out encryption processing on the target area based on the first fingerprint information.
In a first aspect, an embodiment of the present invention further provides a mobile terminal, including:
the detection module is used for detecting the touch operation of a user on the picture;
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a first operation characteristic of a first touch operation if the first touch operation matched with a preset encryption operation is detected;
the determining module is used for determining a target area to be processed in the picture based on the first operation characteristic;
the extraction module is used for extracting first fingerprint information in the first operation characteristic;
and the encryption module is used for encrypting the target area based on the first fingerprint information.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor, and a computer program which is stored on the memory and can run on the processor, wherein the processor implements the steps of the above picture processing method when executing the computer program.
In a third aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements the steps in the above-mentioned picture processing method.
In the embodiment of the invention, a first touch operation is received, first fingerprint information of the first touch operation is extracted, and the target area that the user needs to encrypt is determined according to the first touch operation, so that the target area in the picture can be encrypted according to the fingerprint information. The operation is convenient, and only the processed picture needs to be stored, which saves the storage space of the mobile terminal while allowing the picture to be encrypted and the original picture to be restored.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 is a flowchart illustrating a picture processing method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a picture processing method according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating a first exemplary picture processing method according to an embodiment of the present invention;
fig. 4 is a flowchart illustrating a second exemplary picture processing method according to an embodiment of the present invention;
FIG. 5 is a first schematic diagram of a scenario in a second example of an embodiment of the present invention;
FIG. 6 is a second exemplary scenario diagram of the embodiment of the present invention;
FIG. 7 illustrates one of the block diagrams of a mobile terminal provided by the embodiments of the present invention;
fig. 8 is a second block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 9 is a third block diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Referring to fig. 1, an embodiment of the present invention provides an image processing method, including:
step 101, detecting a touch operation of a user on a picture.
Wherein, the touch operation is a manual touch operation.
Step 102, if a first touch operation matched with a preset encryption operation is detected, a first operation characteristic of the first touch operation is obtained.
The mobile terminal obtains a first operation characteristic of a first touch operation when receiving the first touch operation triggered by a user on a picture.
Step 103, determining a target area to be processed in the picture based on the first operation feature.
The first operation characteristic may include fingerprint information, sliding track information, and the like; and determining a target area to be processed according to the first operation characteristic. The target area to be processed, i.e., the operation area of the first touch operation, is an area including the target object to be processed.
Alternatively, by using an image edge detection technology, a dividing line between different characteristic regions in the target region is extracted by using the difference between the object and the background in the aspects of gray scale, color, texture characteristics, or the like of the image, so as to identify a subject, i.e., a target object, in the region selected by the user on the image.
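The patent does not tie this step to any particular algorithm. As a rough, non-authoritative illustration of the idea, the following Python sketch (assuming OpenCV is available; all names are chosen here for illustration) finds the dominant contour inside a user-selected rectangle and returns its bounding box as the target area.

```python
# Illustrative sketch only: isolate a target object inside a user-selected
# region with generic edge detection (OpenCV assumed; not the patented code).
import cv2

def find_target_object(image_bgr, x, y, w, h):
    """Return the bounding box, in full-image coordinates, of the largest
    contour found inside the user-selected rectangle (x, y, w, h)."""
    roi = image_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                 # dividing lines between regions
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return (x, y, w, h)                          # fall back to the whole selection
    largest = max(contours, key=cv2.contourArea)
    cx, cy, cw, ch = cv2.boundingRect(largest)
    return (x + cx, y + cy, cw, ch)
```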
And 104, extracting first fingerprint information in the first operation characteristic.
First fingerprint information of the first touch operation is extracted, namely the fingerprint information of the finger executing the first touch operation.
And 105, encrypting the target area based on the first fingerprint information.
Optionally, after the target area is identified, the user is prompted whether to encrypt the target area. If the user confirms, the target area is encrypted: the target area is hidden and the hidden part is filled in. Different types of areas can be hidden in different ways. For example, scene information can be hidden by replacing it with another scene from the picture, such as hiding a flower with a patch of grass; a person can be hidden by replacing the person with another avatar, such as a celebrity avatar, an animal avatar, or a cartoon avatar; or the area can simply be hidden and refilled with the surrounding picture information.
After the target area is hidden, the corresponding relation between the target area and the first fingerprint information is recorded, so that when the user browses the picture again, the target object can be restored and displayed and the restored picture obtained. Therefore, only the processed picture is saved, which reduces the memory consumption of the mobile terminal.
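Steps 101 to 105 can be pictured as the minimal Python sketch below, using Pillow. The fingerprint is represented simply as an identifier string assumed to come from the fingerprint sensor, and a gray cover stands in for whichever hiding manner is chosen; both are simplifications for illustration, not the terminal's actual implementation.

```python
# Simplified sketch of steps 102-105: hide a target area of a picture and
# record the (target area, fingerprint) correspondence so it can be restored.
from PIL import Image

# second correspondence: fingerprint id -> (target box, original pixels of that box)
hidden_regions = {}

def encrypt_region(picture, box, fingerprint_id):
    """box is (left, top, right, bottom); fingerprint_id comes from the sensor."""
    original = picture.crop(box)                       # keep the original pixels
    cover = Image.new("RGB", original.size, (128, 128, 128))
    picture.paste(cover, box)                          # hide the target area
    hidden_regions[fingerprint_id] = (box, original)   # record the correspondence
    return picture

if __name__ == "__main__":
    img = Image.open("photo.jpg").convert("RGB")       # example file name
    encrypt_region(img, (110, 70, 210, 190), fingerprint_id="finger-1")
    img.save("photo_processed.jpg")                    # only this file is kept
```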
In the embodiment of the invention, a first touch operation is received, first fingerprint information of the first touch operation is extracted, and the target area that the user needs to encrypt is determined according to the first touch operation, so that the target area in the picture can be encrypted according to the fingerprint information. The operation is convenient, and only the processed picture needs to be stored, which saves the storage space of the mobile terminal while allowing the picture to be encrypted and the original picture to be restored. The invention solves the problems in the prior art that the original picture information cannot be viewed from a mosaic-processed picture and that the processed picture and the original picture must both be stored.
Referring to fig. 2, an embodiment of the present invention provides an image processing method, including:
step 201, detecting a touch operation of a user on a picture;
wherein, the touch operation is a manual touch operation. The step of detecting the touch operation of the user on the picture comprises at least one of the following steps:
detecting sliding operation of a user on a picture in a closed sliding track mode;
detecting a smearing operation of a user on a picture;
a pressing operation of a user on a picture is detected.
The first touch operation is an operation for triggering selection of the target area to be processed. The first touch operation may be a sliding operation whose sliding track forms a closed shape; the finger may slide in any direction, and the target area is determined according to the range enclosed by the sliding track.
The first touch operation may also be a smearing operation on the picture, in which the finger slides back and forth repeatedly; the target area is then determined according to the range covered by the smearing.
The first touch operation may also be a pressing operation; when the pressing pressure exceeds a set threshold, an area within a preset range around the contact point of the pressing operation is set as the target area.
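For illustration only, the plain-Python sketch below reduces each of the three gestures to a rectangular target area: the bounding box of a closed sliding track or smear, or a fixed-size square around a press point whose pressure exceeds the threshold (the threshold and square size are assumed values).

```python
# Illustrative sketch: derive a rectangular target area from the three gestures
# described above (closed sliding track, smear, hard press). Names are ours.

def area_from_track(points):
    """Bounding box (left, top, right, bottom) of a closed slide or smear track,
    given as a list of (x, y) touch points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def area_from_press(point, pressure, threshold=0.8, half_size=60):
    """Square of side 2 * half_size centred on the contact point, returned only
    when the pressing pressure exceeds the set threshold."""
    if pressure <= threshold:
        return None
    x, y = point
    return (x - half_size, y - half_size, x + half_size, y + half_size)

# Example: a roughly circular closed track drawn around a face in the picture.
track = [(120, 80), (180, 70), (210, 130), (170, 190), (110, 160), (120, 80)]
print(area_from_track(track))            # -> (110, 70, 210, 190)
print(area_from_press((300, 240), 0.95)) # -> (240, 180, 360, 300)
```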
Step 202, if a first touch operation matched with a preset encryption operation is detected, a first operation characteristic of the first touch operation is obtained.
Wherein, the preset encryption operation is the pressing operation, the sliding operation in the form of closed sliding track or the smearing operation, etc.; when the above operation is detected, a first operation characteristic is acquired.
Step 203, determining a target area to be processed in the picture based on the first operation feature.
The first operation characteristic may include fingerprint information, sliding track information, and the like; and determining a target area to be processed according to the first operation characteristic.
Preferably, step 203 comprises: extracting an operation area in the first operation characteristic;
and determining the operation area as a target area, or performing edge detection on the operation area, and determining an image area surrounded by the detected outline as the target area.
The first operation feature comprises the operation area of the first touch operation. The target area to be processed is either this operation area or an area comprising the target object to be processed. Through a picture edge detection technology, dividing lines between different feature areas in the operation area can be extracted by using the differences between the object and the background in gray scale, color, or texture features of the picture, and the image area enclosed by the detected outline is determined as the target area, thereby identifying the subject, namely the target object, in the area selected by the user on the picture.
Step 204, extracting first fingerprint information in the first operation characteristic;
The first fingerprint information of the first touch operation is extracted, namely the fingerprint information of the finger executing the first touch operation.
Step 205, determining a target encryption mode corresponding to the first fingerprint information according to a first corresponding relationship between preset fingerprint information and encryption modes.
The first corresponding relation comprises the corresponding relation between fingerprint information and an encryption mode, the encryption mode comprises the step of shielding a target area through a shielding object or conducting mosaic processing on the target area, and after the first fingerprint information is determined, the target encryption mode can be determined.
And step 206, encrypting the target area according to the target encryption mode.
Optionally, after the target area is identified, the user may be prompted whether to encrypt the target area; if the user feedback is yes, the target area is encrypted.
wherein step 206 comprises: hiding the target area according to the target encryption mode;
and recording a second corresponding relation between the target area and the first fingerprint information.
In this step, the target area is hidden according to the target encryption mode, and the corresponding relation between the target area and the first fingerprint information is recorded, so that subsequent image decryption is facilitated.
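The two corresponding relations can be kept as simple lookup tables. The sketch below shows illustrative data structures only, not the terminal's actual storage format: a first correspondence from fingerprint to encryption manner, and a second correspondence from fingerprint to the hidden target area of a given picture.

```python
# Illustrative data structures for the two correspondences described above.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class HiddenRegion:
    box: Tuple[int, int, int, int]   # target area (left, top, right, bottom)
    original_bytes: bytes            # original pixels, kept so the area can be restored

@dataclass
class PictureCipherRecord:
    # first correspondence: fingerprint id -> encryption manner ("mosaic" or "cover")
    encryption_manner: Dict[str, str] = field(default_factory=lambda: {
        "finger-1": "mosaic",
        "finger-2": "cover",
    })
    # second correspondence: fingerprint id -> hidden target area of this picture
    hidden: Dict[str, HiddenRegion] = field(default_factory=dict)

record = PictureCipherRecord()
record.hidden["finger-1"] = HiddenRegion(box=(110, 70, 210, 190), original_bytes=b"...")
print(record.encryption_manner["finger-1"])   # -> mosaic
```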
Preferably, the hiding the target area according to the target encryption manner includes:
covering an image area where the target area is located by using a preset object, wherein the preset object is a preset image or a non-target area extracted from the image;
or mosaic processing is carried out on the target area.
In the first mode, a preset object is used to cover the image area where the target area is located; the preset object may be a preset image or a non-target area extracted from the picture. The target area is hidden and the hidden part is filled in. The preset object may be a preset image, such as a celebrity avatar, an animal avatar, or a cartoon avatar, which replaces the target area. It may also be a non-target area extracted from the picture, such as scene information: for example, a flower can be hidden by covering it with a patch of grass from the picture, and when a person is hidden, the avatar of another person may be used.
In the second mode, the target area is subjected to mosaic processing, which means that details of color levels in a specific area of an image are degraded and color blocks are disturbed, so that the target area can be effectively shielded and cannot be identified.
Preferably, the step of mosaic processing the target area includes:
acquiring a preset mosaic type corresponding to the first fingerprint information;
and performing mosaic processing on the target area according to the mosaic type.
The user can correspond the fingerprint information of the user to the mosaic pattern form through the editing interface. One fingerprint may correspond to one mosaic type, or several fingerprint combinations may correspond to a certain mosaic pattern form. Therefore, when the first fingerprint information is acquired, the pre-stored mosaic type of the fingerprint information is determined, and the target object is subjected to mosaic processing according to the form.
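One common way to realize such a mosaic is to downscale the target area and scale it back up with nearest-neighbour resampling, with the block size playing the role of the mosaic type bound to a fingerprint. The Pillow sketch below illustrates this; the type table is an assumed example, not data from the patent.

```python
# Illustrative mosaic: pixelate the target area with a block size selected by the
# mosaic type bound to the user's fingerprint (example table below).
from PIL import Image

MOSAIC_TYPE_FOR_FINGERPRINT = {"finger-1": 8, "finger-2": 16}   # block size in pixels

def mosaic_region(picture, box, fingerprint_id):
    block = MOSAIC_TYPE_FOR_FINGERPRINT.get(fingerprint_id, 10)
    region = picture.crop(box)
    w, h = region.size
    small = region.resize((max(1, w // block), max(1, h // block)), Image.NEAREST)
    pixelated = small.resize((w, h), Image.NEAREST)    # blocks of uniform colour
    picture.paste(pixelated, box)
    return picture
```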
Optionally, the user may apply mosaics of two or more types to the area in combination, or may smear over one mosaic with other fingers to add further mosaic patterns, thereby superimposing multiple mosaics. This increases the security of the encryption, so that the area cannot be decrypted if the fingerprint information of a single finger is misrecognized.
Preferably, after the step of encrypting the target area, the method includes:
if a second touch operation matched with the preset decryption operation is detected;
extracting second fingerprint information of the second touch operation;
and if the second fingerprint information is matched with the first fingerprint information, decrypting the target area.
In this step, the second touch operation is an operation for restoring and displaying the encrypted target area, and corresponds to the first touch operation. When a second touch operation of the user on the picture is received, fingerprint information of the second touch operation is extracted. If the recorded first fingerprint information contains first fingerprint information matched with the second fingerprint information, searching a target area corresponding to the first fingerprint information according to the second corresponding relation, namely the target area corresponding to the second fingerprint information, and decrypting the target area.
Specifically, the step of performing decryption processing on the target area includes:
determining a target area corresponding to the matched first fingerprint information according to the second corresponding relation;
and restoring and displaying the target area.
After the first fingerprint information corresponding to the second fingerprint information is determined, the target area corresponding to the first fingerprint information is searched according to the second corresponding relation, namely the target area corresponding to the second fingerprint information is found, and the target area is restored and displayed.
Therefore, after the second fingerprint information is identified, the target object can be restored and displayed and the original image before processing obtained, so that only the processed image needs to be stored and two images do not need to be saved at the same time. Furthermore, the target object can be kept hidden from other users, which improves the privacy of the picture, and the target object is restored and displayed only when the user needs to view it.
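Decryption is then the mirror image of the encryption sketch given earlier: if the second fingerprint matches a recorded first fingerprint, the stored original pixels of the corresponding target area are pasted back. The helper below assumes the same in-memory records and sensor-supplied fingerprint identifiers as before.

```python
# Illustrative decryption: if the second fingerprint matches a recorded first
# fingerprint, restore the stored original pixels of the corresponding area.
def decrypt_region(picture, second_fingerprint_id, hidden_regions):
    """hidden_regions: fingerprint id -> (box, original PIL image of that box)."""
    entry = hidden_regions.get(second_fingerprint_id)
    if entry is None:
        return False                  # fingerprint does not match; keep area hidden
    box, original = entry
    picture.paste(original, box)      # restore and display the target area
    return True
```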
Preferably, the step of restoring and displaying the target area includes:
eliminating the display of a preset object covering the target area;
or demosaicing the target area.
The restoring and displaying of the target area comprises the step of eliminating the display of the preset object covered on the target area for the picture with the preset object covering, or the step of demosaicing the picture subjected to mosaic processing.
As a first example, referring to fig. 3, the picture processing method shown in fig. 3, in which a target object is occluded by mosaic processing, mainly includes the following processes:
step 301, acquiring a mosaic type preset by a user and corresponding to the fingerprint.
The user can correspond the fingerprint information of the user to the mosaic pattern form in advance through the editing interface. One fingerprint can correspond to one mosaic type, or a plurality of fingerprint combinations can correspond to a certain mosaic pattern form; after setting, the system stores relevant configuration, and when the mobile terminal performs picture processing, the mosaic type preset by the user and corresponding to the fingerprint is obtained.
Step 302, detecting a first smearing operation of a user on the picture, and determining a target area of the first smearing operation.
That is, the user smears a certain area of the picture with a finger. For example, the user selects a picture to be edited and smears with a finger the position on the picture where a mosaic is to be added; smearing the image area clockwise adds the mosaic to that area, while smearing counterclockwise removes the mosaic information during decryption.
step 303, extracting first fingerprint information of the smearing operation, and determining a mosaic type corresponding to the fingerprint information.
The system first identifies the fingerprint information of the user, finds the stored mosaic pattern form corresponding to the fingerprint information in the background, and adds a mosaic of the corresponding form to the area smeared by the user on the picture. The user can apply mosaics of two or more types to the area in combination, and can also smear additional mosaic patterns onto an existing mosaic. The superposition of multiple mosaics behaves like a mixture of several colors; for example, when a layer of blue mosaic is added on top of a pure red mosaic, the resulting mosaic pattern is green. When removing the mosaics, the user scrapes off one layer with one finger and then the next layer with a different finger, until the original image information is finally displayed.
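The layering described above can be pictured as a stack in which each smear pushes one mosaic layer keyed to a fingerprint, and layers must be peeled off one by one with the matching fingers. The sketch below is only a schematic of that bookkeeping, not of the pattern mixing itself; apply_mosaic is any mosaic routine, for example the one sketched earlier.

```python
# Schematic bookkeeping for stacked mosaics: each layer remembers the pixels it
# covered and the fingerprint needed to peel it off again.
mosaic_stack = []   # list of (fingerprint_id, box, pixels_before_this_layer)

def push_mosaic_layer(picture, box, fingerprint_id, apply_mosaic):
    mosaic_stack.append((fingerprint_id, box, picture.crop(box)))
    apply_mosaic(picture, box)                  # e.g. a mosaic of this finger's type

def peel_top_layer(picture, second_fingerprint_id):
    if not mosaic_stack or mosaic_stack[-1][0] != second_fingerprint_id:
        return False                            # wrong finger: the top layer stays
    _, box, previous = mosaic_stack.pop()
    picture.paste(previous, box)                # reveal the layer underneath
    return True
```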
and 304, setting a mosaic pattern corresponding to the mosaic type for the target area, and recording the corresponding relation between the first fingerprint information and the target area.
A mosaic pattern corresponding to the mosaic type is set for the target area. After the user finishes processing the picture, the picture is confirmed and saved, and the system stores the mosaic information, the original image information of the picture, and the fingerprint information of the user.
and 305, detecting a second smearing operation of the user finger on the target area, and extracting second fingerprint information of the second smearing operation.
When the user wants to remove the mosaic pattern, the user smears the mosaic area with a finger, and second fingerprint information of the second smearing operation is extracted. The mosaic can be gradually removed as the finger smears, and the image is displayed again, achieving an effect similar to a scratch card.
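The scratch-card effect could be approximated by restoring only the pixels near each point of the second smearing track. The numpy/Pillow sketch below is an approximation of our own, not the patent's rendering code, and assumes the saved pre-mosaic image (or region) is available for the revealed pixels.

```python
# Rough approximation of the scratch-off reveal: restore original pixels only
# inside a small radius around each point of the second smearing track.
import numpy as np
from PIL import Image

def scratch_reveal(processed, original, track_points, radius=20):
    """processed/original: same-size PIL RGB images; track_points: [(x, y), ...]."""
    proc = np.array(processed)
    orig = np.array(original)
    h, w = proc.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    revealed = np.zeros((h, w), dtype=bool)
    for px, py in track_points:
        revealed |= (xx - px) ** 2 + (yy - py) ** 2 <= radius ** 2
    proc[revealed] = orig[revealed]               # scratched pixels show the original
    return Image.fromarray(proc)
```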
And step 306, if the second fingerprint information is matched with the first fingerprint information, decrypting the picture.
The system identifies the fingerprint information of the user, and if the second fingerprint information is matched with the first fingerprint information, the mosaic on the pattern is gradually removed along with the smearing of the finger of the user, and the original image information is displayed.
As a second example, referring to fig. 4, the picture processing flow shown in fig. 4 mainly includes the following steps:
step 401, detecting a touch operation of a user in an area where an object D on a picture is located, and extracting first fingerprint information of the touch operation.
The touch operation may be the user sliding along a closed track, smearing with a finger, or pressing a certain area of the image; the first fingerprint information is extracted from this operation.
Referring to fig. 5, the user slides a finger on the picture to select the part to be hidden. For example, if the user wants to hide the subject at position C on the picture, the user circles the area at C by sliding a finger, or directly smears or presses the area at C, and the object D in the area selected by the user is identified through an edge detection technology.
Step 402, prompting the user whether to hide the selected object D.
The system marks the extracted object D and prompts the user whether to hide this area.
step 403, if the user feedback is yes, hiding the object D, filling the area where the object D is located, and recording the corresponding relationship between the first fingerprint information and the object D.
The effect diagram after concealment is seen in fig. 6.
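Hiding object D and filling its area could be done, for example, by pasting a preset avatar scaled to the object's bounding box, or by filling the box with a blurred copy of its surroundings. The Pillow sketch below illustrates both options; the file name and the blur-based fill are our own illustrative choices.

```python
# Illustrative hiding of object D: cover it with a preset avatar, or fill the box
# with a blurred copy of the surrounding picture content.
from PIL import Image, ImageFilter

def cover_with_preset(picture, box, preset_path="cartoon_avatar.png"):
    cover = Image.open(preset_path).convert("RGB")
    cover = cover.resize((box[2] - box[0], box[3] - box[1]))
    picture.paste(cover, box)
    return picture

def fill_from_surroundings(picture, box, margin=40):
    left, top, right, bottom = box
    context = picture.crop((max(0, left - margin), max(0, top - margin),
                            min(picture.width, right + margin),
                            min(picture.height, bottom + margin)))
    blurred = context.filter(ImageFilter.GaussianBlur(radius=12))
    patch = blurred.resize((right - left, bottom - top))
    picture.paste(patch, box)
    return picture
```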
Step 404, if a second touch operation matched with a preset decryption operation is detected, extracting second fingerprint information of the second touch operation.
The user slides, presses, or smears a fingerprint again on the area where the image is hidden, and the second fingerprint information is extracted at this time.
When a user wants to view the image of the hidden area, the user can smear or press the area of the hidden image information through finger fingerprints, and the hidden image is gradually displayed in the smearing process of the user;
and step 405, judging whether the second fingerprint information is matched with the first fingerprint information.
And identifying whether the second fingerprint information is correct.
And step 406, if the second fingerprint information is matched with the first fingerprint information, decrypting the target area.
If they match, the object D is displayed.
In the embodiment of the invention, a first touch operation is received, first fingerprint information of the first touch operation is extracted, and the target area that the user needs to encrypt is determined according to the first touch operation, so that the target area in the picture can be encrypted according to the fingerprint information. The operation is convenient, and only the processed picture needs to be stored, which saves the storage space of the mobile terminal while allowing the picture to be encrypted and the original picture to be restored. In addition, the picture hiding mode can be mosaic processing or covering with a blocking object, which improves the use experience of the user.
Referring to fig. 7, an embodiment of the present invention further provides a mobile terminal 700, including:
the detecting module 701 is configured to detect a touch operation of a user on a picture.
Wherein, the touch operation is a manual touch operation.
An obtaining module 702 is configured to, if a first touch operation matching a preset encryption operation is detected, obtain a first operation characteristic of the first touch operation.
The mobile terminal obtains a first operation characteristic of a first touch operation when receiving the first touch operation triggered by a user on a picture.
A determining module 703, configured to determine, based on the first operation feature, a target area to be processed in the picture.
The first operation characteristic may include fingerprint information, sliding track information, and the like; and determining a target area to be processed according to the first operation characteristic. The target area to be processed, i.e., the operation area of the first touch operation, is an area including the target object to be processed.
Alternatively, by using an image edge detection technology, a dividing line between different characteristic regions in the target region is extracted by using the difference between the object and the background in the aspects of gray scale, color, texture characteristics, or the like of the image, so as to identify a subject, i.e., a target object, in the region selected by the user on the image.
An extracting module 704, configured to extract first fingerprint information in the first operation feature.
First fingerprint information of the first touch operation is extracted, namely the fingerprint information of the finger executing the first touch operation.
The encryption module 705 is configured to perform encryption processing on the target area based on the first fingerprint information.
Optionally, after the target area is identified, the user is prompted whether to encrypt the target area. If the user confirms, the target area is encrypted: the target area is hidden and the hidden part is filled in. Different types of areas can be hidden in different ways. For example, scene information can be hidden by replacing it with another scene from the picture, such as hiding a flower with a patch of grass; a person can be hidden by replacing the person with another avatar, such as a celebrity avatar, an animal avatar, or a cartoon avatar; or the area can simply be hidden and refilled with the surrounding picture information.
After the target area is hidden, the corresponding relation between the target area and the first fingerprint information is recorded, so that when the user browses the picture again, the target object is restored and displayed.
Optionally, the detecting module 701 is configured to perform at least one of the following:
detecting sliding operation of a user on a picture in a closed sliding track mode;
detecting a smearing operation of a user on a picture;
a pressing operation of a user on a picture is detected.
Optionally, referring to fig. 8, the determining module 703 includes:
a determining submodule 7031 for extracting an operation region in the first operation feature;
and determining the operation area as a target area, or performing edge detection on the operation area, and determining an image area surrounded by the detected outline as the target area.
Optionally, referring to fig. 8, the encryption module 705 includes:
the encryption submodule 7051 is configured to determine, according to a first correspondence between preset fingerprint information and an encryption manner, a target encryption manner corresponding to the first fingerprint information;
and encrypting the target area according to the target encryption mode.
Optionally, the encryption sub-module 7051 is configured to:
hiding the target area according to the target encryption mode;
and recording a second corresponding relation between the target area and the first fingerprint information.
Optionally, referring to fig. 8, the encryption submodule 7051 includes:
a first encrypting unit 70511, configured to cover an image area where the target area is located with a preset object, where the preset object is a preset image or a non-target area extracted from the image;
or, the second encrypting unit 70512 is configured to perform mosaic processing on the target area.
Optionally, the second encrypting unit 70512 is configured to:
acquiring a preset mosaic type corresponding to the first fingerprint information;
and performing mosaic processing on the target area according to the mosaic type.
Optionally, referring to fig. 8, the mobile terminal 700 further includes:
the decryption module 706 is configured to, if a second touch operation matched with a preset decryption operation is detected;
extracting second fingerprint information of the second touch operation;
and if the second fingerprint information is matched with the first fingerprint information, decrypting the target area.
Optionally, the decryption module 706 is configured to:
determining a target area corresponding to the matched first fingerprint information according to the second corresponding relation;
and restoring and displaying the target area.
Optionally, referring to fig. 8, the decryption module 706 includes:
a restoring submodule 7061 configured to eliminate display of a preset object covering the target area;
or demosaicing the target area.
In the embodiment of the invention, a first touch operation is received, first fingerprint information of the first touch operation is extracted, and the target area that the user needs to encrypt is determined according to the first touch operation, so that the target area in the picture can be encrypted according to the fingerprint information. The operation is convenient, and only the processed picture needs to be stored, which saves the storage space of the mobile terminal while allowing the picture to be encrypted and the original picture to be restored. The picture hiding mode can be mosaic processing or covering with a blocking object, which improves the use experience of the user.
Figure 9 is a schematic diagram of a hardware configuration of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 9 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 901 is configured to detect a touch operation of a user on a picture;
a processor 910, configured to, if a first touch operation matching a preset encryption operation is detected, obtain a first operation characteristic of the first touch operation;
determining a target area to be processed in the picture based on the first operation characteristic;
extracting first fingerprint information from the first operation characteristic;
and carrying out encryption processing on the target area based on the first fingerprint information.
In the embodiment of the invention, a first touch operation is received, first fingerprint information of the first touch operation is extracted, and the target area that the user needs to encrypt is determined according to the first touch operation, so that the target area in the picture can be encrypted according to the fingerprint information. The operation is convenient, and only the processed picture needs to be stored, which saves the storage space of the mobile terminal while allowing the picture to be encrypted and the original picture to be restored.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and sending signals during a message transmission and reception process or a call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 910; in addition, the uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 902, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output related to a specific function performed by the mobile terminal 900 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input Unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the Graphics processor 9041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphic processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sounds and can process such sounds into audio data. The processed audio data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 901 in case of the phone call mode.
The mobile terminal 900 also includes at least one sensor 905, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 9061 and/or backlight when the mobile terminal 900 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 906 is used to display information input by the user or information provided to the user. The Display unit 906 may include a Display panel 9061, and the Display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 9071 (e.g., operations by a user on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 910, receives a command from the processor 910, and executes the command. In addition, the touch panel 9071 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 907 may include other input devices 9072 in addition to the touch panel 9071. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, and the like), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 9071 may be overlaid on the display panel 9061, and when the touch panel 9071 detects a touch operation on or near the touch panel 9071, the touch panel is transmitted to the processor 910 to determine the type of the touch event, and then the processor 910 provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although in fig. 9, the touch panel 9071 and the display panel 9061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 908 is an interface through which an external device is connected to the mobile terminal 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within the mobile terminal 900 or may be used to transmit data between the mobile terminal 900 and external devices.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 910 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, thereby performing overall monitoring of the mobile terminal. Processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 910.
The mobile terminal 900 may also include a power supply 911 (e.g., a battery) for powering the various components, and preferably, the power supply 911 is logically connected to the processor 910 through a power management system that provides power management functions to manage charging, discharging, and power consumption.
In addition, the mobile terminal 900 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 910, a memory 909, and a computer program that is stored in the memory 909 and can be run on the processor 910, and when the computer program is executed by the processor 910, the processes of the above-mentioned embodiment of the image processing method are implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (14)

1. A picture processing method, comprising:
detecting touch operation of a user on a picture;
if a first touch operation matched with a preset encryption operation is detected, acquiring a first operation characteristic of the first touch operation;
determining a target area to be processed in the picture based on the first operation characteristic;
extracting first fingerprint information from the first operation characteristic;
encrypting the target area based on the first fingerprint information;
the step of encrypting the target area based on the first fingerprint information includes:
determining a target encryption mode corresponding to first fingerprint information according to a first corresponding relation between preset fingerprint information and an encryption mode;
encrypting the target area according to the target encryption mode;
the step of encrypting the target area according to the target encryption mode comprises the following steps:
hiding the target area according to the target encryption mode;
recording a second corresponding relation between the target area and the first fingerprint information;
the step of hiding the target area according to the target encryption mode comprises the following steps:
performing mosaic processing on the target area;
the step of mosaic processing the target area includes:
acquiring a preset mosaic type corresponding to the first fingerprint information;
according to the mosaic type, carrying out mosaic processing on the target area; wherein, one fingerprint corresponds to one mosaic type, or the fingerprint combination corresponds to one mosaic pattern form; the mosaic type corresponds to the mosaic pattern form;
after the step of encrypting the target area, the method includes:
if a second touch operation matched with the preset decryption operation is detected;
extracting second fingerprint information of the second touch operation;
if the second fingerprint information is matched with the first fingerprint information, decrypting the target area;
the first touch operation is a first smearing operation of a user on the picture, and the second touch operation is a second smearing operation of smearing the target area by the fingers of the user; wherein the first smearing operation is a clockwise operation, and the second smearing operation is a counterclockwise operation.
2. The method according to claim 1, wherein the step of detecting a touch operation of a user on the picture comprises at least one of:
detecting sliding operation of a user on a picture in a closed sliding track mode;
detecting a smearing operation of a user on a picture;
a pressing operation of a user on a picture is detected.
3. The method according to claim 1, wherein the step of determining the target region to be processed in the picture based on the first operation feature comprises:
extracting an operation area in the first operation characteristic;
and determining the operation area as a target area, or performing edge detection on the operation area, and determining an image area surrounded by the detected outline as the target area.
4. The method according to claim 1, wherein the step of hiding the target area according to the target encryption manner comprises:
and covering an image area where the target area is located by adopting a preset object, wherein the preset object is a preset image or a non-target area extracted from the image.
5. The method according to claim 1, wherein the step of decrypting the target area comprises:
determining a target area corresponding to the matched first fingerprint information according to the second corresponding relation;
and restoring and displaying the target area.
7. The method of claim 5, wherein the step of restoring and displaying the target area includes:
eliminating the display of a preset object covering the target area;
or demosaicing the target area.
7. A mobile terminal, comprising:
the detection module is used for detecting the touch operation of a user on the picture;
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a first operation characteristic of a first touch operation if the first touch operation matched with a preset encryption operation is detected;
the determining module is used for determining a target area to be processed in the picture based on the first operation characteristic;
the extraction module is used for extracting first fingerprint information in the first operation characteristic;
the encryption module is used for encrypting the target area based on the first fingerprint information;
the decryption module is used for detecting a second touch operation matched with a preset decryption operation, extracting second fingerprint information of the second touch operation, and decrypting the target area if the second fingerprint information matches the first fingerprint information;
the encryption module includes:
the encryption submodule is used for determining a target encryption mode corresponding to the first fingerprint information according to a first corresponding relation between preset fingerprint information and an encryption mode;
encrypting the target area according to the target encryption mode;
the encryption submodule is used for:
hiding the target area according to the target encryption mode;
recording a second corresponding relation between the target area and the first fingerprint information;
the encryption submodule includes:
a second encryption unit, configured to perform mosaic processing on the target area;
the second encryption unit is configured to:
acquiring a preset mosaic type corresponding to the first fingerprint information;
performing mosaic processing on the target area according to the mosaic type; wherein one fingerprint corresponds to one mosaic type, or one fingerprint combination corresponds to one mosaic pattern form, and the mosaic type corresponds to the mosaic pattern form;
the first touch operation is a first smearing operation performed by the user on the picture, and the second touch operation is a second smearing operation in which the user's finger smears the target area; wherein the first smearing operation is a clockwise operation and the second smearing operation is a counterclockwise operation.
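For the apparatus of claim 7, the encryption and decryption modules and the two correspondences could be wired roughly as below. Touch detection, target-area determination and fingerprint extraction are assumed to be handled by the caller, and the hide_area/restore_area callables are stand-ins, not the patent's implementation.

```python
from typing import Callable, Dict, Tuple

Box = Tuple[int, int, int, int]

class PictureProcessingTerminal:
    """Rough wiring of claim 7's encryption and decryption modules."""

    def __init__(self,
                 hide_area: Callable[[Box, str], None],
                 restore_area: Callable[[Box], None],
                 mosaic_by_fingerprint: Dict[str, str]):
        self.hide_area = hide_area
        self.restore_area = restore_area
        self.mosaic_by_fingerprint = mosaic_by_fingerprint  # first correspondence
        self.region_owner: Dict[Box, str] = {}              # second correspondence

    def encrypt(self, box: Box, fingerprint_id: str) -> None:
        self.hide_area(box, self.mosaic_by_fingerprint[fingerprint_id])
        self.region_owner[box] = fingerprint_id

    def decrypt(self, fingerprint_id: str) -> None:
        for box, owner in list(self.region_owner.items()):
            if owner == fingerprint_id:  # second fingerprint matches the first
                self.restore_area(box)
                del self.region_owner[box]

# usage with print-based stand-ins
terminal = PictureProcessingTerminal(
    hide_area=lambda box, t: print(f"hide {box} with {t} mosaic"),
    restore_area=lambda box: print(f"restore {box}"),
    mosaic_by_fingerprint={"thumb": "coarse"})
terminal.encrypt((10, 10, 120, 90), "thumb")
terminal.decrypt("thumb")
```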
8. The mobile terminal of claim 7, wherein the detection module is configured to perform at least one of:
detecting a sliding operation of the user on the picture with a closed sliding track;
detecting a smearing operation of the user on the picture;
detecting a pressing operation of the user on the picture.
9. The mobile terminal of claim 7, wherein the determining module comprises:
a determining submodule for extracting an operation region in the first operation feature;
determining the operation area as the target area; or performing edge detection on the operation area and determining the image area surrounded by the detected outline as the target area.
10. The mobile terminal of claim 7, wherein the encryption sub-module comprises:
the first encryption unit is used for covering the image area where the target area is located with a preset object, wherein the preset object is a preset image or a non-target area extracted from the picture.
11. The mobile terminal of claim 7, wherein the decryption module is configured to:
determining a target area corresponding to the matched first fingerprint information according to the second corresponding relation;
and restoring and displaying the target area.
12. The mobile terminal of claim 11, wherein the decryption module comprises:
the restoring submodule is used for eliminating the display of a preset object covering the target area;
or demosaicing the target area.
13. A mobile terminal, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the picture processing method according to any one of claims 1 to 6.
14. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the picture processing method according to any one of claims 1 to 6.
CN201710970603.8A 2017-10-18 2017-10-18 Picture processing method and mobile terminal Active CN107784232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710970603.8A CN107784232B (en) 2017-10-18 2017-10-18 Picture processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107784232A CN107784232A (en) 2018-03-09
CN107784232B true CN107784232B (en) 2020-05-05

Family

ID=61434557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710970603.8A Active CN107784232B (en) 2017-10-18 2017-10-18 Picture processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107784232B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108958598A (en) * 2018-05-31 2018-12-07 维沃移动通信有限公司 A kind of method and terminal of editing picture
CN109145552B (en) * 2018-07-09 2021-01-08 维沃移动通信有限公司 Information encryption method and terminal equipment
CN109271764B (en) * 2018-08-30 2023-10-17 北京珠穆朗玛移动通信有限公司 Private data protection method, mobile terminal and storage medium
CN109815727A (en) * 2018-12-18 2019-05-28 维沃移动通信有限公司 A kind of method for secret protection and terminal device
CN109918882B (en) * 2019-02-01 2023-11-21 维沃移动通信有限公司 Image encryption method and mobile terminal
CN110502936A (en) * 2019-07-22 2019-11-26 维沃移动通信有限公司 The display methods and terminal device of privacy information
CN114679518A (en) * 2022-03-31 2022-06-28 维沃移动通信有限公司 Image display method and device and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866773A (en) * 2015-05-11 2015-08-26 酷派软件技术(深圳)有限公司 Fingerprint search method and apparatus, and terminal
CN106027794A (en) * 2016-06-29 2016-10-12 维沃移动通信有限公司 Encryption method of photo and mobile terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090278692A1 (en) * 2008-05-11 2009-11-12 Alzaabi Saif Abdullah RFID Encrypted Paper Book
CN107133993A (en) * 2017-04-19 2017-09-05 珠海市魅族科技有限公司 A kind of image processing method and device
CN107102803B (en) * 2017-04-27 2020-06-26 北京天耀宏图科技有限公司 Picture display method and device and computer readable storage medium

Also Published As

Publication number Publication date
CN107784232A (en) 2018-03-09

Similar Documents

Publication Publication Date Title
CN107784232B (en) Picture processing method and mobile terminal
CN107145795B (en) Screenshot method and device and computer equipment
CN110446097B (en) Screen recording method and mobile terminal
CN108495029B (en) Photographing method and mobile terminal
CN110706179B (en) Image processing method and electronic equipment
CN107817939B (en) Image processing method and mobile terminal
CN107977652B (en) Method for extracting screen display content and mobile terminal
CN108492246B (en) Image processing method and device and mobile terminal
CN108712603B (en) Image processing method and mobile terminal
CN109241775B (en) Privacy protection method and terminal
CN108459788B (en) Picture display method and terminal
CN110188524B (en) Information encryption method, information decryption method and terminal
CN110149628B (en) Information processing method and terminal equipment
CN109886000B (en) Image encryption method and mobile terminal
CN109544172B (en) Display method and terminal equipment
CN110719527A (en) Video processing method, electronic equipment and mobile terminal
CN108664818B (en) Unlocking control method and device
CN110990849A (en) Encryption and decryption method for private data and terminal
CN111031178A (en) Video stream clipping method and electronic equipment
CN111125800B (en) Icon display method and electronic equipment
CN109639981B (en) Image shooting method and mobile terminal
CN111079118A (en) Icon display control method, electronic device and medium
CN108109188B (en) Image processing method and mobile terminal
CN107798662B (en) Image processing method and mobile terminal
CN107817963B (en) Image display method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant