CN108063884B - Image processing method and mobile terminal - Google Patents

Image processing method and mobile terminal

Info

Publication number
CN108063884B
Authority
CN
China
Prior art keywords
image
mobile terminal
processed
optimizable
area
Prior art date
Legal status
Active
Application number
CN201711131135.1A
Other languages
Chinese (zh)
Other versions
CN108063884A (en)
Inventor
赵彬
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201711131135.1A
Publication of CN108063884A
Application granted
Publication of CN108063884B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention provides an image processing method and a mobile terminal. The method, applied to a first mobile terminal, includes the following steps: establishing a connection with a second mobile terminal, where the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range; sending an image to be processed to the second mobile terminal; receiving at least one comparison image sent by the second mobile terminal; identifying optimizable areas in the image to be processed with reference to the comparison image; and performing image processing on each optimizable area in the image to be processed according to the comparison image, to obtain a target image. With the image processing scheme provided by the embodiment of the invention, the user does not need to manually process the optimizable areas in the image to be processed one by one, so human resources are saved, processing efficiency is improved, and the user's shooting experience is improved.

Description

Image processing method and mobile terminal
Technical Field
The present invention relates to the field of mobile terminal technologies, and in particular, to an image processing method and a mobile terminal.
Background
With the continuous increase of camera pixel counts in mobile terminals, mobile terminals have largely replaced traditional cameras for capturing images, and shooting images with a mobile terminal has become an indispensable part of users' daily life.
A user can shoot images anytime and anywhere with a mobile terminal. When shooting at popular tourist attractions, however, the large number of visitors makes it easy for passers-by to stray into the frame, so images of these unintended objects appear in the captured picture; the area occupied by the image of such an intruding object is an optimizable area. At present, the optimizable areas can be retouched manually one by one with post-processing tools, but this approach is cumbersome, time-consuming and labor-intensive.
Disclosure of Invention
The embodiment of the invention provides an image processing method and a mobile terminal, aiming to solve the problem in the prior art that manually processing the optimizable areas in an image one by one is time-consuming and labor-intensive.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an image processing method applied to a first mobile terminal is provided, including: establishing a connection with a second mobile terminal, where the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range; sending an image to be processed to the second mobile terminal; receiving at least one comparison image sent by the second mobile terminal, where the at least one comparison image is an image in the second mobile terminal that matches the background of the image to be processed; identifying optimizable areas in the image to be processed with reference to the comparison image; and performing image processing on each optimizable area in the image to be processed according to the comparison image, to obtain a target image.
In a second aspect, a mobile terminal is provided, including: a pairing module, configured to establish a connection with a second mobile terminal, where the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range; a sending module, configured to send an image to be processed to the second mobile terminal; a receiving module, configured to receive at least one comparison image sent by the second mobile terminal, where the at least one comparison image is an image in the second mobile terminal that matches the background of the image to be processed; an identification module, configured to identify optimizable areas in the image to be processed with reference to the comparison image; and a processing module, configured to perform image processing on each optimizable area in the image to be processed according to the comparison image, to obtain a target image.
In a third aspect, a mobile terminal is provided, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image processing method.
In a fourth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method.
Compared with the prior art, the invention has the following advantages:
In the embodiment of the invention, a connection is established with the second mobile terminal, a comparison image matching the background of the image to be processed is obtained from the second mobile terminal, the comparison image is used as a reference to identify the optimizable areas in the image to be processed, such as the areas containing objects to be removed, and the optimizable areas in the image to be processed are processed according to the comparison image, so that the high-quality target image desired by the user can be obtained. Because the user does not need to manually process the optimizable areas in the image to be processed one by one, human resources are saved, processing efficiency is improved, and the user's shooting experience is improved.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart illustrating steps of an image processing method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of an image processing method according to a second embodiment of the present invention;
fig. 3 is a block diagram of a mobile terminal according to a third embodiment of the present invention;
fig. 4 is a schematic diagram of a hardware structure of a mobile terminal according to a fourth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating steps of an image processing method according to a first embodiment of the present invention is shown.
The image processing method of the embodiment of the invention comprises the following steps:
step 101: and establishing connection with the second mobile terminal.
The embodiment of the present invention is described from the perspective of the first mobile terminal. The second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range.
The first mobile terminal establishes a connection with the second mobile terminal in order to set up a data transmission channel for wireless short-distance communication between the mobile terminals. The first mobile terminal sends the image to be processed to the second mobile terminal through the data transmission channel established with the second mobile terminal, and the second mobile terminal sends the comparison image back to the first mobile terminal through the same channel.
Step 102: sending the image to be processed to the second mobile terminal.
After receiving the image to be processed, each second mobile terminal performs background matching between the image to be processed and the images it stores, to obtain comparison images that match the background of the image to be processed.
Specifically, during matching, two images whose backgrounds indicate the same place may be determined as matching images; alternatively, two images whose background matching degree is larger than a preset value may be determined as matching images.
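For illustration only, the Python sketch below shows how a second mobile terminal might apply the two matching rules just described. The `place` field and the `background_similarity` helper are assumptions introduced to make the logic concrete; they are not prescribed by the embodiment.

```python
def is_matching_image(candidate, image_to_process, preset_value=0.8):
    """Rule 1: backgrounds indicating the same place match.
    Rule 2: a background matching degree above the preset value matches.
    `place` and `background_similarity` are hypothetical placeholders."""
    if candidate.place is not None and candidate.place == image_to_process.place:
        return True
    return background_similarity(candidate.pixels, image_to_process.pixels) > preset_value
```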
Step 103: receiving the comparison image sent by the second mobile terminal.
The number of the comparison images sent by the second mobile terminal may be one or more.
Step 104: identifying an optimizable area in the image to be processed with reference to the comparison image.
The comparison image and the image to be processed have the same or similar shooting background. If no object strays into the same shooting background, images captured against that background will have the same image background. Therefore, by comparing the comparison images with the image to be processed, the images of intruding objects in the image to be processed can be determined, and the circumscribed rectangular area of each such image is an optimizable area.
For an optimizable area, the image of the intruding object in that area can be removed in subsequent processing by means of image texture restoration.
Step 105: performing image processing on each optimizable area in the image to be processed according to the comparison images, to obtain a target image.
When performing image processing on the optimizable areas, the user may select the optimizable areas to be processed from the image to be processed, and the mobile terminal selects a corresponding comparison image for each selected optimizable area to carry out the processing. Alternatively, the user may select one or more comparison images from the comparison images, and the mobile terminal selects the corresponding optimizable areas in the image to be processed for the selected comparison images and performs image processing on them.
A preferred way of performing image processing on each optimizable area in the image to be processed according to the comparison images to obtain the target image is as follows: after the optimizable areas in the image to be processed have been identified with reference to the comparison images, a thumbnail of each comparison image is displayed; the user's selection instruction on the thumbnails is received; and image processing is performed on each optimizable area in the image to be processed according to the comparison image corresponding to the selected thumbnail, to obtain the target image.
In this preferred mode, the user can select the comparison images used for image processing according to his or her needs, which meets the user's personalized requirements and improves the user experience.
With the image processing method provided by the embodiment of the invention, a connection is established with the second mobile terminal, a comparison image matching the background of the image to be processed is obtained from the second mobile terminal, the comparison image is used as a reference to identify the optimizable areas in the image to be processed, such as the areas containing objects to be removed, and the optimizable areas in the image to be processed are processed according to the comparison image, so that the high-quality target image desired by the user can be obtained. Because the user does not need to manually process the optimizable areas in the image to be processed one by one, human resources are saved, processing efficiency is improved, and the user's shooting experience is improved.
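Taken together, steps 101 to 105 on the first mobile terminal can be summarized by the minimal sketch below. Every helper name in it is an assumption introduced only to make the control flow concrete; none of them is defined by the patent.

```python
def process_image_on_first_terminal(image_to_process):
    # Step 101: establish connections with nearby second mobile terminals.
    second_terminals = establish_connections()

    # Step 102: send the image to be processed to every connected second terminal.
    for terminal in second_terminals:
        terminal.send(image_to_process)

    # Step 103: receive the comparison images returned by the second terminals.
    comparison_images = [img for t in second_terminals
                         for img in t.receive_comparison_images()]

    # Step 104: identify the optimizable areas with reference to the comparison images.
    optimizable_areas = identify_optimizable_areas(image_to_process, comparison_images)

    # Step 105: process each optimizable area according to the comparison images.
    return repair_areas(image_to_process, optimizable_areas, comparison_images)
```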
Example two
Referring to fig. 2, a flowchart illustrating steps of an image processing method according to a second embodiment of the present invention is shown.
The image processing method of the embodiment of the invention specifically comprises the following steps:
step 201: and establishing connection with the second mobile terminal.
The image processing method according to the embodiment of the present invention is described from the perspective of the first mobile terminal. The first mobile terminal establishes a connection with the second mobile terminal in order to set up a data transmission channel for wireless short-distance communication between the mobile terminals.
The second mobile terminal is at least one mobile terminal, and the distance between the second mobile terminal and the first mobile terminal is within a preset range.
A preferred way for the first mobile terminal to establish a connection with the second mobile terminal is as follows:
First, the first mobile terminal sends an image matching request broadcast signal through a wireless short-distance communication module.
the wireless short-distance communication module can be a WiFi module, an infrared module or a ZigBee module, and the ZigBee technology is a two-way wireless communication technology with short distance, low complexity, low power consumption, low speed and low cost.
Second, a response signal returned by the second mobile terminal according to the image matching request broadcast signal is received, and a connection is established with the second mobile terminal that returned the response signal.
A device within the coverage of the broadcast signal that supports the same function can receive the request broadcast signal and reply an acknowledgement, namely a response signal, to the first mobile terminal. If no response signal returned by a second mobile terminal is received within a first preset time, first prompt information is output to prompt the user that there is no connectable second mobile terminal nearby and that image processing cannot be performed.
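As an illustration of the broadcast-and-response handshake, the sketch below prototypes it over a plain UDP broadcast on a Wi-Fi network. The port number, message contents and the one-second timeout standing in for the "first preset time" are all assumptions made for the example, not values taken from the patent.

```python
import socket

PAIRING_PORT = 50007              # hypothetical port for the image matching request
REQUEST = b"IMAGE_MATCH_REQUEST"  # hypothetical payload of the broadcast signal

def discover_second_terminals(first_preset_time=1.0):
    """Broadcast an image matching request and collect response signals."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(first_preset_time)
    sock.sendto(REQUEST, ("255.255.255.255", PAIRING_PORT))

    responders = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)
            if data == b"IMAGE_MATCH_RESPONSE":   # response signal from a second terminal
                responders.append(addr)
    except socket.timeout:
        pass  # stop listening once the first preset time has elapsed
    finally:
        sock.close()

    if not responders:
        print("No connectable second mobile terminal nearby; image processing cannot be performed.")
    return responders
```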
Step 202: sending the image to be processed to the second mobile terminal.
The comparison image may be determined by the second mobile terminal as follows: first, the second mobile terminal identifies the image background of the image to be processed; then it traverses the stored local images and determines the local images whose background matching degree with the image to be processed is higher than a preset value as comparison images.
The preset value may be set by a person skilled in the art according to actual requirements, and is not particularly limited in the embodiment of the present invention. For example: the preset value may be 80%, 90%, or 95%, etc.
Specifically, after receiving the image to be processed, the second mobile terminal automatically performs intelligent background model matching in the background: it extracts background scenery elements from the image to be processed using an image classification and recognition algorithm such as a bag-of-words model, traverses its local images to check whether images with a similar background exist on the device, finds the images whose similarity is higher than the preset value, and synchronizes those images to the first mobile terminal.
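The embodiment names a bag-of-words model as one possible recognition algorithm; as a lighter-weight stand-in for illustration, the OpenCV sketch below estimates background similarity from the fraction of well-matched ORB features and keeps the local images whose score exceeds the preset value. The distance cut-off and the scoring formula are assumptions, not the algorithm the patent mandates.

```python
import cv2

def background_similarity(img_a, img_b):
    """Rough background-similarity score in [0, 1] based on ORB feature matches."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY), None)
    kp_b, des_b = orb.detectAndCompute(cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY), None)
    if des_a is None or des_b is None:
        return 0.0

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    good = [m for m in matches if m.distance < 40]   # hypothetical distance cut-off
    return len(good) / max(len(kp_a), len(kp_b))

def find_comparison_images(image_to_process, local_images, preset_value=0.8):
    """Traverse the local images and keep those whose similarity exceeds the preset value."""
    return [img for img in local_images
            if background_similarity(img, image_to_process) > preset_value]
```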
Step 203: receiving the comparison image sent by the second mobile terminal.
There may be one or more second mobile terminals. After receiving the image to be processed, a second mobile terminal may or may not be able to find a local image whose matching degree with the image to be processed is higher than the preset value. Thus, some second mobile terminals may not send any comparison image to the first mobile terminal, some may send one comparison image, and some may send two or more comparison images to the first mobile terminal.
It should be noted that, if no comparison image sent by a second mobile terminal is received within a second preset time, second prompt information is output to prompt the user that no comparison image is available as a background optimization resource and that the subsequent image processing operations cannot be executed.
Step 204: identifying each object to be removed in the image to be processed, and respectively determining the circumscribed rectangular area of each object to be removed.
Each object to be removed corresponds to an image region in the image to be processed, and the circumscribed rectangular area of that region is the circumscribed rectangular area of the object to be removed. Setting a circumscribed rectangular area for each object to be removed facilitates subsequent calculation and improves processing efficiency. The image to be processed contains the circumscribed rectangular areas of one or more objects to be removed. After the circumscribed rectangular area of each object to be removed has been determined, it is necessary to determine, according to the comparison images, whether each circumscribed rectangular area is an optimizable area. Steps 205 to 207 describe the process of determining whether one circumscribed rectangular area is an optimizable area; in a specific implementation, this process is repeated to determine whether each circumscribed rectangular area is an optimizable area.
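The embodiment does not prescribe how the objects to be removed are detected. One simple assumption, sketched below, is to difference the image to be processed against an aligned comparison image and take the bounding rectangle of each connected difference blob as a circumscribed rectangular area; a dedicated person or object detector would be another option.

```python
import cv2
import numpy as np

def circumscribed_rectangles(image_to_process, aligned_comparison, min_area=500):
    """Return bounding rectangles (x, y, w, h) of candidate objects to be removed.

    Assumes `aligned_comparison` has been registered with `image_to_process`;
    thresholds are illustrative values only.
    """
    diff = cv2.absdiff(cv2.cvtColor(image_to_process, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(aligned_comparison, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))

    # OpenCV 4 return convention: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```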
Step 205: for each circumscribed rectangular area, determining the position of the circumscribed rectangular area in the image to be processed.
In the image to be processed, each circumscribed rectangular area occupies a different position. The position of a circumscribed rectangle may be represented, for example, by the coordinates of its first pixel point together with the length and width of the circumscribed rectangular area, or by the coordinates of its first and last pixel points; it may also be represented by the coordinates of the pixel point at the center of the circumscribed rectangular area together with the length and width of the rectangular area.
Step 206: traversing the comparison images, and judging whether there exists a first comparison image that contains only background at that position.
For example, when determining whether the circumscribed rectangular area at the upper left corner is an optimizable area, the upper left corner of each comparison image is traversed to judge whether it contains only background; if so, this indicates that no other object is present at the upper left corner of that comparison image. If the upper left corner of one or more comparison images contains only background, it can be concluded that the circumscribed rectangular area at the upper left corner of the image to be processed contains the image of an object to be removed. If no first comparison image whose upper left corner contains only background exists, the object at the upper left corner of the image is not an object to be removed but an inherent part of the scene.
Step 207: if such a first comparison image exists, determining the circumscribed rectangular area as an optimizable area of the image to be processed, and establishing a correspondence between the optimizable area and the first comparison image.
For example, suppose there are five comparison images A, B, C, D and E, where the upper left corner of comparison images A and B contains only background, while the upper left corners of the other three images contain images of other objects. Then images A and B are determined to be first comparison images, the circumscribed rectangular area of the object to be removed at the upper left corner of the image to be processed is an optimizable area, and a correspondence between this optimizable area and images A and B is established.
Specifically, an identifier of the optimizable area may be added to an object set, and an identifier of the first comparison image may be added to an image set.
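Steps 205 to 207 could be sketched as follows. The "only background" test here is a heuristic assumption: the background at a given position is taken to be what most comparison images agree on, so a comparison image whose patch stays close to the per-pixel median of all patches is treated as showing only background there. All comparison images are assumed to be aligned with the image to be processed and of the same size, and the containers standing in for the object set and image set are plain Python lists.

```python
import numpy as np

def find_first_comparison_images(rect, comparison_images, max_mean_diff=12.0):
    """Return the comparison images judged to contain only background at `rect`."""
    x, y, w, h = rect
    patches = [img[y:y + h, x:x + w].astype(np.float32) for img in comparison_images]
    median_patch = np.median(np.stack(patches), axis=0)

    first_images = []
    for img, patch in zip(comparison_images, patches):
        if np.mean(np.abs(patch - median_patch)) < max_mean_diff:
            first_images.append(img)
    return first_images

def build_correspondences(rects, comparison_images):
    """Steps 205 to 207: map each optimizable area to its first comparison images."""
    object_set, image_set, correspondence = [], [], {}
    for idx, rect in enumerate(rects):
        first_images = find_first_comparison_images(rect, comparison_images)
        if first_images:                         # the rectangle is an optimizable area
            object_set.append(idx)               # identifier of the optimizable area
            image_set.extend(id(img) for img in first_images)  # stand-in identifiers
            correspondence[idx] = first_images
    return correspondence, object_set, image_set
```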
Step 208: marking each optimizable area in the image to be processed.
Steps 205 to 207 are repeated to determine every optimizable area in the image to be processed, and the optimizable areas are marked in the image, so that the user can visually see which areas in the image to be processed can be optimized. The user can then select one or more optimizable areas for image texture restoration as required.
It should be noted that, if there is no optimizable area in the image to be processed, a third prompt message may be output to prompt the user that there is no optimizable area in the image to be processed.
Step 209: receiving a selection instruction of a user on the optimizable area, and determining the selected optimizable area as a target optimization area.
The user can select one or more optimizable regions as target optimization regions for processing according to requirements. The processing of the target optimization area may be image texture restoration processing.
Step 210: determining a first comparison image corresponding to each target optimization area; and performing image texture restoration on the target optimization area according to the background image area corresponding to the target optimization area in the first comparison image to obtain a target image.
The correspondence between the optimizable areas and the first comparison images was established in step 207, so the first comparison image corresponding to each target optimization area can be determined through this correspondence. After image texture restoration is performed on a target optimization area, the image of the object to be removed in that area is deleted from the image to be processed.
When performing image texture restoration, the first mobile terminal obtains the restoration texture from the area corresponding to the optimizable area in the first comparison image, using a texture-based image restoration algorithm such as the Criminisi algorithm, and completes the restoration and synthesis of the image texture in the image to be processed. The user can preview the texture restoration effect in real time and can choose to restore the original effect as required, so that the image can be processed flexibly.
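The Criminisi algorithm is an exemplar-based inpainting method. Because the background content is already available in the first comparison image, a much simpler stand-in, sketched below, is to blend the corresponding background patch into the target optimization area with Poisson blending. This is an assumption made for illustration, not the restoration algorithm the patent fixes, and it presumes the first comparison image has been registered (aligned) with the image to be processed.

```python
import cv2
import numpy as np

def repair_target_area(image_to_process, first_comparison, rect):
    """Replace a target optimization area with the corresponding background patch.

    Assumes `first_comparison` is aligned with the image to be processed; in
    practice a homography estimated from feature matches would be applied first.
    """
    x, y, w, h = rect
    patch = first_comparison[y:y + h, x:x + w]

    # Poisson blending hides the seam at the rectangle border; keeping a one-pixel
    # zero margin in the mask avoids border artifacts in seamlessClone.
    mask = np.zeros(patch.shape[:2], dtype=np.uint8)
    mask[1:-1, 1:-1] = 255
    center = (x + w // 2, y + h // 2)
    return cv2.seamlessClone(patch, image_to_process, mask, center, cv2.NORMAL_CLONE)
```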
By repeating the step 210, the texture restoration processing can be performed on each target optimization area in the image to be processed, so as to obtain an ideal image.
Steps 208 to 210 describe an example in which the user autonomously selects the target optimization areas in the image to be processed. In a specific implementation, the user may instead select target comparison images autonomously, and the first mobile terminal determines the target optimization areas in the image to be processed according to the target comparison images and performs the image optimization. A specific implementation may be as follows:
First, a thumbnail of each first comparison image is displayed.
Specifically, the thumbnails of the first comparison images may be presented on the preview operation interface in a floating sliding list or a similar control.
Second, the user's selection instruction on the thumbnails is received, and the first comparison images corresponding to the selected thumbnails are determined as target comparison images.
The user may select one or more thumbnails, each thumbnail corresponding to one first comparison image.
Third, for each target comparison image, the corresponding optimizable area is determined, and image texture restoration is performed on that optimizable area in the image to be processed according to the background image area corresponding to the optimizable area in the target comparison image, to obtain the target image.
The user may also select a target comparison image specifically for each optimizable area, and the first mobile terminal performs optimization processing on the corresponding optimizable area in the image to be processed according to the target comparison image corresponding to each optimizable area. A specific implementation may be as follows:
First, a thumbnail of the corresponding first comparison image is displayed at each optimizable area according to the correspondence between the optimizable areas and the first comparison images.
One optimizable area corresponds to one or more first comparison images, so one or more thumbnails are displayed at each optimizable area. The user can select the first comparison image corresponding to one of the thumbnails as the target comparison image used for optimizing that optimizable area.
Second, for each optimizable area, the user's selection operation on the thumbnails displayed at that area is determined, and the first comparison image corresponding to the selected thumbnail is determined as the target comparison image corresponding to that optimizable area.
Third, image texture restoration is performed on each optimizable area according to the target comparison image corresponding to that area, to obtain the target image.
In this preferred mode, the user can select, as required, the target comparison image corresponding to each optimizable area, so that the image optimization is more fine-grained, the user's personalized requirements are met, and the user experience can be improved.
The image processing method provided by the embodiment of the invention has the beneficial effects of the image processing method described in the first embodiment. In addition, the user can select target optimization areas from the optimizable areas contained in the image to be processed as required and selectively delete the objects to be removed, which meets the user's personalized requirements and improves the user experience.
Example three
Referring to fig. 3, a block diagram of a mobile terminal according to a third embodiment of the present invention is shown.
The mobile terminal of the embodiment of the invention comprises: a pairing module 301, configured to establish a connection with a second mobile terminal, where the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range; a sending module 302, configured to send an image to be processed to the second mobile terminal; a receiving module 303, configured to receive at least one comparison image sent by the second mobile terminal, where the at least one comparison image is an image of the second mobile terminal that matches a background of the image to be processed; an identification module 304, configured to identify an optimizable region in the image to be processed with reference to the comparison image; a processing module 305, configured to perform image processing on each of the optimizable regions in the image to be processed according to the comparison image to obtain a target image.
Preferably, the pairing module 301 comprises: a signal transmission sub-module 3011 for transmitting an image matching request broadcast signal through the wireless short-range communication module; and the signal receiving submodule 3012 is configured to receive a response signal returned by the second mobile terminal according to the image matching request broadcast signal, and establish a connection with the second mobile terminal returning the response signal.
Preferably, the identification module 304 comprises: a rectangular area determining submodule 3041, configured to identify each object to be removed in the image to be processed and respectively determine the circumscribed rectangular area of each object to be removed; a position determining submodule 3042, configured to determine, for each circumscribed rectangular area, the position of the circumscribed rectangular area in the image to be processed; and a traversal submodule 3043, configured to traverse the comparison images and determine whether there exists a first comparison image that contains only background at that position; if such a first comparison image exists, determine the circumscribed rectangular area as an optimizable area of the image to be processed and establish a correspondence between the optimizable area and the first comparison image.
Preferably, the processing module 305 comprises: a marking sub-module 3051, configured to mark each of the optimizable regions in the image to be processed; the first instruction receiving submodule 3052 is configured to receive a selection instruction of a user for an optimizable region, and determine the selected optimizable region as a target optimization region; the first repairing sub-module 3053 is configured to, for each target optimization region, determine a first comparison image corresponding to the target optimization region; and performing image texture restoration on the target optimization area according to a background image area corresponding to the target optimization area in the first comparison image to obtain a target image.
Preferably, the processing module 305 further comprises: a first display sub-module 3054, configured to display, at each optimizable area, a thumbnail of the corresponding first comparison image according to the correspondence between the optimizable areas and the first comparison images; a second instruction receiving sub-module 3055, configured to determine, for each optimizable area, the user's selection operation on the thumbnails displayed at that optimizable area, and determine the first comparison image corresponding to the selected thumbnail as the target comparison image corresponding to that optimizable area; and a second repairing sub-module 3056, configured to perform image texture restoration on each optimizable area according to the target comparison image corresponding to that area, to obtain the target image.
Preferably, the processing module 305 further comprises: a second display sub-module 3057, configured to display a thumbnail of each of the first comparison images; the third instruction receiving sub-module 3058 is configured to receive a selection instruction of a user for each thumbnail, and determine a first comparison image corresponding to the selected thumbnail as a target comparison image; a third repairing sub-module 3059, configured to determine, for each target comparison image, an optimizable region corresponding to the target comparison image; and according to a background image area corresponding to the optimizable area in the target comparison image, performing image texture restoration on the optimizable area in the image to be processed to obtain a target image.
Preferably, the processing module 305 further comprises: a thumbnail display sub-module 30510, configured to display thumbnails of the comparison images; the receiving sub-module 30511 is configured to receive a selection instruction of each thumbnail from a user; the processing sub-module 30512 is configured to, according to the comparison image corresponding to the selected thumbnail, perform image processing on each of the optimizable regions in the image to be processed to obtain a target image.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 2, and is not described herein again to avoid repetition.
With the mobile terminal provided by the embodiment of the invention, a connection is established with the second mobile terminal, a comparison image matching the background of the image to be processed is obtained from the second mobile terminal, the comparison image is used as a reference to identify the optimizable areas in the image to be processed, such as the circumscribed rectangular areas of objects to be removed, and image texture restoration is performed on the optimizable areas in the image to be processed according to the comparison image, so that the target image desired by the user can be obtained.
Example four
Fig. 4 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 400 includes, but is not limited to: radio frequency unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, processor 410, and power supply 411. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 4 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 410 is configured to: establish a connection with a second mobile terminal, where the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range; send the image to be processed to the second mobile terminal; receive at least one comparison image sent by the second mobile terminal, where the at least one comparison image is an image in the second mobile terminal that matches the background of the image to be processed; identify optimizable areas in the image to be processed with reference to the comparison image; and perform image processing on each optimizable area in the image to be processed according to the comparison image, to obtain a target image.
The mobile terminal provided by the embodiment of the invention acquires the comparison image matched with the background of the image to be processed from the second mobile terminal by establishing connection with the second mobile terminal, identifies the optimizable area in the image to be processed, such as the area with the object to be removed, by referring to the comparison image, and processes the optimizable area in the image to be processed according to the comparison image, so that the high-quality target image expected by a user can be obtained. Because the user does not need to manually process the optimizable regions in the images to be processed one by one, the human resources can be saved, the processing efficiency is improved, and the shooting experience of the user is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, it receives downlink data from a base station and forwards it to the processor 410 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 401 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 402, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a Graphics Processing Unit (GPU) 4041 and a microphone 4042; the graphics processor 4041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in video capturing mode or image capturing mode. The processed image frames may be displayed on the display unit 406. The image frames processed by the graphics processor 4041 may be stored in the memory 409 (or other storage medium) or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 401 for output.
The mobile terminal 400 also includes at least one sensor 405, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 4061 and/or the backlight when the mobile terminal 400 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. Touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 4071 can be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061, and when the touch panel 4071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and then the processor 410 provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 4, the touch panel 4071 and the display panel 4061 are two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 408 is an interface through which an external device is connected to the mobile terminal 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 400 or may be used to transmit data between the mobile terminal 400 and external devices.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 409 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 410 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby integrally monitoring the mobile terminal. Processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The mobile terminal 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 400 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 410, a memory 409, and a computer program stored in the memory 409 and capable of being executed on the processor 410, where the computer program, when executed by the processor 410, implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. An image processing method applied to a first mobile terminal is characterized by comprising the following steps:
establishing a connection with a second mobile terminal, wherein the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range;
sending an image to be processed to the second mobile terminal;
receiving at least one comparison image sent by the second mobile terminal, wherein the at least one comparison image is an image matched with the background of the image to be processed in the second mobile terminal;
identifying an optimizable region in the image to be processed with reference to the comparison image;
according to the comparison image, carrying out image processing on each optimizable area in the image to be processed to obtain a target image;
the step of identifying an optimizable region in the image to be processed with reference to the comparison image comprises:
identifying each object to be eliminated in the image to be processed, and respectively determining a circumscribed rectangular area of each object to be eliminated;
for each circumscribed rectangular area, determining the position of the circumscribed rectangular area in the image to be processed;
traversing the comparison images, and judging whether there exists a first comparison image that contains only background at the position;
if such a first comparison image exists, determining the circumscribed rectangular area as an optimizable area of the image to be processed, and establishing a correspondence between the optimizable area and the first comparison image;
if not, determining that the circumscribed rectangular area is not an optimizable area of the image to be processed;
the establishing of the correspondence between the optimizable region and the first comparison image includes:
adding the identification of the optimizable region to an object set, and adding the identification of the first comparison image to an image set;
if no optimizable area exists in the image to be processed, prompting the user, through third prompt information, that no area in the image to be processed can be optimized;
and the target image is obtained by selecting a corresponding optimizable area from the image to be processed according to the selected comparison image and performing image processing.
2. The method of claim 1, wherein the step of establishing a connection with the second mobile terminal comprises:
transmitting an image matching request broadcast signal through a wireless short-distance communication module;
and receiving a response signal returned by the second mobile terminal according to the image matching request broadcast signal, and establishing connection with the second mobile terminal returning the response signal.
3. The method according to claim 1, wherein the step of performing image processing on each of the optimizable regions in the image to be processed according to the comparison image to obtain a target image comprises:
displaying a thumbnail of each comparison image;
receiving a selection instruction of a user for each thumbnail;
and performing image processing on each optimizable area in the image to be processed according to the comparison image corresponding to the selected thumbnail, to obtain a target image.
4. The method according to claim 1, wherein the step of performing image processing on each of the optimizable regions in the image to be processed according to the comparison image to obtain a target image comprises:
displaying a thumbnail of the corresponding first comparison image at each optimizable area according to the correspondence of the optimizable area and the first comparison image;
for each optimizable area, determining a selection operation of a user on a thumbnail displayed at the optimizable area, and determining a first comparison image corresponding to the selected thumbnail as a target comparison image corresponding to the optimizable area;
and performing image texture restoration on each optimizable area according to the target comparison image corresponding to that optimizable area, to obtain a target image.
5. The method according to claim 1, wherein the step of performing image processing on each of the optimizable regions in the image to be processed according to the comparison image to obtain a target image comprises:
displaying a thumbnail of each of the first comparison images;
receiving a selection instruction of a user for each thumbnail, and determining a first comparison image corresponding to the selected thumbnail as a target comparison image;
aiming at each target comparison image, determining an optimizable area corresponding to the target comparison image; and according to a background image area corresponding to the optimizable area in the target comparison image, performing image texture restoration on the optimizable area in the image to be processed to obtain a target image.
6. A mobile terminal, characterized in that the mobile terminal comprises:
a pairing module, wherein the pairing module is used for establishing a connection with a second mobile terminal, and the second mobile terminal is at least one mobile terminal whose distance from the first mobile terminal is within a preset range;
the sending module is used for sending the image to be processed to the second mobile terminal;
the receiving module is used for receiving at least one comparison image sent by the second mobile terminal, wherein the at least one comparison image is an image matched with the background of the image to be processed in the second mobile terminal;
the identification module is used for identifying an optimizable area in the image to be processed by referring to the comparison image;
the processing module is used for carrying out image processing on each optimizable area in the image to be processed according to the comparison image to obtain a target image;
the identification module comprises:
the rectangular area determining submodule is used for identifying each object to be eliminated in the image to be processed and respectively determining the circumscribed rectangular area of each object to be eliminated;
the position determining submodule is used for determining the position of the circumscribed rectangular area in the image to be processed aiming at each circumscribed rectangular area;
the traversing submodule is used for traversing the comparison image and judging whether a first comparison image with the position as a background exists or not; if the image to be processed exists in the external rectangular area, determining the external rectangular area as an optimizable area of the image to be processed; establishing a corresponding relation between the optimizable area and the first comparison image; if not, determining that the circumscribed rectangular area is not the optimizable area of the image to be processed;
the traversing submodule is specifically configured to add the identifier of the optimizable area to an object set, and add the identifier of the first comparison image to an image set;
if no optimizable area exists in the image to be processed, prompting the user through third prompt information that the image to be processed contains no optimizable area;
and the target image is obtained by selecting the corresponding optimizable area from the image to be processed according to the selected comparison image and performing image processing on that area.
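The identification module recited in claim 6 can be pictured with the following non-limiting sketch; it assumes the objects to be eliminated have already been detected as circumscribed rectangles and that each comparison image carries a boolean mask marking pixels that show only background, and every name in the code (identify_optimizable_regions, rects, comparison_masks) is an assumption for illustration.

```python
import numpy as np

def identify_optimizable_regions(rects, comparison_masks):
    """rects: list of (x, y, w, h) circumscribed rectangles of objects to eliminate.
    comparison_masks: dict image_id -> HxW bool array, True where the comparison
    image shows only background.

    Returns the region-to-image correspondence plus the object set and image set
    of claim 6.
    """
    correspondence, object_set, image_set = {}, set(), set()
    for idx, (x, y, w, h) in enumerate(rects):
        match = None
        # Traverse the comparison images: does one of them show pure background
        # over this rectangle's position?
        for image_id, mask in comparison_masks.items():
            if mask[y:y + h, x:x + w].all():
                match = image_id
                break
        if match is not None:            # the rectangle is an optimizable area
            correspondence[idx] = match
            object_set.add(idx)
            image_set.add(match)
    if not object_set:                   # third prompt information
        print("No region in the image to be processed can be optimized")
    return correspondence, object_set, image_set

# Example: one object at (5, 5); comparison image "A" is pure background everywhere.
print(identify_optimizable_regions([(5, 5, 20, 20)],
                                   {"A": np.ones((100, 100), dtype=bool)}))
```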
7. The mobile terminal of claim 6, wherein the pairing module comprises:
a signal transmitting sub-module for transmitting an image matching request broadcast signal through the wireless short-distance communication module;
and the signal receiving submodule is used for receiving a response signal returned by the second mobile terminal according to the image matching request broadcast signal and establishing connection with the second mobile terminal returning the response signal.
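The pairing flow recited in claim 7 can be pictured with the following non-limiting sketch, which uses a UDP broadcast on a local network as a stand-in for the wireless short-distance communication module (a Bluetooth or Wi-Fi Direct discovery would follow the same request/response pattern); the port number and the IMG_MATCH_REQUEST/IMG_MATCH_RESPONSE messages are assumptions for illustration.

```python
import socket

def broadcast_image_match_request(port=50007, timeout=2.0):
    """Send an image matching request broadcast signal and collect the addresses
    of second mobile terminals that return a response signal."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(b"IMG_MATCH_REQUEST", ("255.255.255.255", port))

    peers = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)
            if data == b"IMG_MATCH_RESPONSE":
                peers.append(addr)   # terminals that answered the broadcast
    except socket.timeout:
        pass
    finally:
        sock.close()
    return peers                     # connections are then established with these peers
```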
8. The mobile terminal of claim 6, wherein the processing module comprises:
the thumbnail display submodule is used for displaying the thumbnail of each comparison image;
the receiving submodule is used for receiving a selection instruction of a user for each thumbnail;
and the processing submodule is used for carrying out image processing on each optimizable area in the image to be processed according to the comparison image corresponding to the selected thumbnail to obtain a target image.
9. The mobile terminal of claim 6, wherein the processing module further comprises:
the first display sub-module is used for displaying, at each optimizable area, a thumbnail of the corresponding first comparison image according to the correspondence between the optimizable areas and the first comparison images;
the second instruction receiving submodule is used for receiving, for each optimizable area, a selection operation of the user on the thumbnail displayed at the optimizable area, and determining the first comparison image corresponding to the selected thumbnail as the target comparison image corresponding to the optimizable area;
and the second restoration submodule is used for performing image texture restoration on each optimizable area according to the target comparison image corresponding to that optimizable area, respectively, to obtain the target image.
10. The mobile terminal of claim 6, wherein the processing module comprises:
the second display sub-module is used for displaying thumbnails of the first comparison images;
the third instruction receiving submodule is used for receiving a selection instruction of a user for each thumbnail and determining a first comparison image corresponding to the selected thumbnail as a target comparison image;
and the third restoration submodule is used for determining, for each target comparison image, the optimizable area corresponding to the target comparison image, and performing image texture restoration on the optimizable area in the image to be processed according to the background image area corresponding to the optimizable area in the target comparison image, to obtain the target image.
11. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 5.
CN201711131135.1A 2017-11-15 2017-11-15 Image processing method and mobile terminal Active CN108063884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711131135.1A CN108063884B (en) 2017-11-15 2017-11-15 Image processing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711131135.1A CN108063884B (en) 2017-11-15 2017-11-15 Image processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108063884A CN108063884A (en) 2018-05-22
CN108063884B CN108063884B (en) 2021-02-26

Family

ID=62134877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711131135.1A Active CN108063884B (en) 2017-11-15 2017-11-15 Image processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108063884B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110855897B (en) * 2019-12-20 2021-10-15 维沃移动通信有限公司 Image shooting method and device, electronic equipment and storage medium
CN111401463B (en) * 2020-03-25 2024-04-30 维沃移动通信有限公司 Method for outputting detection result, electronic equipment and medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266685A (en) * 2007-03-14 2008-09-17 中国科学院自动化研究所 A method for removing unrelated images based on multiple photos
JP2015162050A (en) * 2014-02-27 2015-09-07 セイコーエプソン株式会社 Image processor, image processing method, and image processing program
CN106204435A (en) * 2016-06-27 2016-12-07 北京小米移动软件有限公司 Image processing method and device
CN106454085B (en) * 2016-09-30 2019-09-27 维沃移动通信有限公司 A kind of image processing method and mobile terminal
KR101723551B1 (en) * 2016-11-21 2017-04-05 주식회사 성진하이텍 System for simulating car accident of pedestrian

Also Published As

Publication number Publication date
CN108063884A (en) 2018-05-22

Similar Documents

Publication Publication Date Title
CN108495029B (en) Photographing method and mobile terminal
CN108038825B (en) Image processing method and mobile terminal
CN107977652B (en) Method for extracting screen display content and mobile terminal
CN109409244B (en) Output method of object placement scheme and mobile terminal
CN109660723B (en) Panoramic shooting method and device
CN108174109B (en) Photographing method and mobile terminal
CN109684277B (en) Image display method and terminal
CN108460817B (en) Jigsaw puzzle method and mobile terminal
CN107749046B (en) Image processing method and mobile terminal
CN109495616B (en) Photographing method and terminal equipment
CN109727212B (en) Image processing method and mobile terminal
CN107153500B (en) Method and equipment for realizing image display
CN108174110B (en) Photographing method and flexible screen terminal
CN111401463B (en) Method for outputting detection result, electronic equipment and medium
CN110519503A (en) A kind of acquisition methods and mobile terminal of scan image
CN109639981B (en) Image shooting method and mobile terminal
CN109491964B (en) File sharing method and terminal
CN108156386B (en) Panoramic photographing method and mobile terminal
CN108063884B (en) Image processing method and mobile terminal
CN110717964A (en) Scene modeling method, terminal and readable storage medium
CN108243489B (en) Photographing control method and mobile terminal
CN111432154B (en) Video playing method, video processing method and electronic equipment
CN109739406B (en) File sending method and terminal
CN108495276B (en) Sharing method and device of digital business card
CN108304744B (en) Scanning frame position determining method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant