CN108132749B - Image editing method and mobile terminal - Google Patents

Image editing method and mobile terminal

Info

Publication number
CN108132749B
CN108132749B (application CN201711393075.0A)
Authority
CN
China
Prior art keywords
area
editing
image
target editing
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711393075.0A
Other languages
Chinese (zh)
Other versions
CN108132749A
Inventor
黄康康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201711393075.0A
Publication of CN108132749A
Application granted
Publication of CN108132749B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/04842 Selection of a displayed object
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders, dials
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques using a touch-screen or digitiser by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus

Abstract

The invention discloses an image editing method and a mobile terminal. The method comprises the following steps: acquiring a pressing operation on a target editing control in an image editing interface; if the duration of the pressing operation is longer than a first predetermined time, setting the target editing control to a slidable state; acquiring a sliding operation on the target editing control; and if the area occupied by the target editing control after the movement matches a target editing area of the image, executing a preset function operation corresponding to the target editing area. The editing interface comprises at least one editing area, each editing area corresponds to one function operation, and the target editing area is one of the at least one editing area. The user can thus slide the target editing control freely; whenever the moved control matches an editing area, the corresponding function operation is executed. The user no longer needs to click the editing control repeatedly to switch the image editing mode, so the operation is simple and convenient, can be performed with one hand, and improves the user experience.

Description

Image editing method and mobile terminal
Technical Field
The invention relates to the technical field of picture processing, in particular to an image editing method and a mobile terminal.
Background
With the continuous progress of image processing technology, the image editing functions on mobile terminals have become increasingly rich, and as a result the operations a user must perform to edit an image on a mobile terminal have become increasingly complicated.
Currently, a user typically edits an image on a mobile terminal as follows: open the photo album, select a picture, tap the 'edit' control, and enter an image editing mode, such as one-key beautification, filter, rotation, or frame. The user then repeatedly taps the 'edit' control to switch between editing modes and changes the style by sliding or tapping within the corresponding mode.
This existing image editing approach is cumbersome: the user can only satisfy an editing requirement by repeatedly tapping the 'edit' control or by sliding across the screen of the mobile terminal in the corresponding editing mode.
Disclosure of Invention
The embodiment of the invention provides an image editing method and a mobile terminal, and aims to solve the problem that an image editing mode in the prior art is relatively complex to operate.
In order to solve the above technical problem, the invention is realized as follows: an image editing method, the method comprising:
acquiring a pressing operation of a target editing control in an image editing interface;
if the pressing operation duration is longer than first preset time, setting the target editing control to be in a slidable state;
acquiring sliding operation on the target editing control;
if the area of the target editing control after the movement is matched with the target editing area of the image, executing preset function operation corresponding to the target editing area;
the editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
the first acquisition module is used for acquiring the pressing operation of a target editing control in an image editing interface;
the first setting module is used for setting the target editing control into a slidable state if the pressing operation duration is longer than first preset time;
the second acquisition module is used for acquiring the sliding operation of the target editing control;
the first execution module is used for executing preset function operation corresponding to the target editing area if the area of the target editing control after the movement is matched with the target editing area of the image;
the editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image editing method described above.
In a fourth aspect, the embodiment of the present invention further provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the image editing method described above.
In the embodiment of the invention, the target editing control is set to a slidable state, and if the area occupied by the moved control matches the target editing area (one of the at least one editing area of the image), the preset function operation corresponding to that area is executed. The user can therefore slide the target editing control freely and trigger the function operation of whichever editing area the control matches, without repeatedly clicking the control to switch the image editing mode. The operation is simple and convenient, complex image edits can be performed with simple one-handed operation, and the user experience is improved.
Drawings
FIG. 1 is a flowchart of an image editing method according to an embodiment of the present invention;
FIG. 2 is a flowchart of the image editing method in the first and second practical application scenarios according to an embodiment of the present invention;
FIG. 3 is a first schematic diagram of the image editing method in the first practical application scenario;
FIG. 4 is a second schematic diagram of the image editing method in the first practical application scenario;
FIG. 5 is a third schematic diagram of the image editing method in the first practical application scenario;
FIG. 6 is a first schematic diagram of the image editing method in the second practical application scenario;
FIG. 7 is a second schematic diagram of the image editing method in the second practical application scenario;
FIG. 8 is a third schematic diagram of the image editing method in the second practical application scenario;
FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 10 is a second schematic structural diagram of the mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. The technical solutions provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
In order to solve the problem that the image editing mode in the prior art is relatively complicated to operate, the present invention provides an image editing method, and an execution subject of the method may be, but is not limited to, a mobile terminal (e.g., a mobile phone, a tablet computer, etc.) or a device capable of being configured to execute the method provided by the embodiment of the present invention.
For convenience of description, the following description will be made on embodiments of the method, taking as an example that the execution subject of the method is a mobile terminal capable of executing the method. It is understood that the mobile terminal is used as the main body of the method and is only an exemplary illustration, and should not be construed as a limitation of the method.
Fig. 1 is a flowchart of an image editing method according to an embodiment of the present invention, where the method of fig. 1 may be performed by a mobile terminal having a pressure touch screen. As shown in fig. 1, the method may include:
step 101, obtaining a pressing operation of a target editing control in an image editing interface.
The target editing control is displayed on a pressure touch screen of the mobile terminal as a fixed button; that is, the control is initially in a fixed state.
In this step, the pressing operation on the target editing control in the image editing interface is acquired through the pressure touch screen; when the pressing operation is detected, its pressure value can also be obtained.
And 102, if the duration time of the pressing operation is longer than a first preset time, setting the target editing control to be in a slidable state.
The first predetermined time is set according to actual requirements; in practice it may be, for example, 2 s, 5 s, or 10 s.
Specifically, when the target editing control is detected to have been pressed continuously for longer than the first predetermined time, it changes from the fixed state to a slidable state and can then be dragged to any specified location.
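For illustration only (this is not part of the patented disclosure), the long-press logic of steps 101 and 102 might be sketched in Android-style Kotlin as follows; the class name, the 2 s threshold, and the posted delayed callback are assumptions, and pressure sensing on the touch screen is omitted:

```kotlin
// Hypothetical sketch of steps 101-102: detect a long press on the target
// editing control and, once it outlasts the first predetermined time,
// switch the control from its fixed state to a slidable state.
import android.os.Handler
import android.os.Looper
import android.view.MotionEvent
import android.view.View

class EditControlTouchHandler(
    private val longPressMillis: Long = 2_000L // assumed "first predetermined time" of 2 s
) : View.OnTouchListener {
    private val handler = Handler(Looper.getMainLooper())
    var slidable = false // fixed state until the long press completes
        private set

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN ->
                // If the press is held past the threshold, unlock dragging.
                handler.postDelayed({ slidable = true }, longPressMillis)
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> {
                handler.removeCallbacksAndMessages(null)
                if (!slidable) v.performClick() // short tap keeps default behavior
            }
        }
        return true
    }
}
```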
And 103, acquiring sliding operation on the target editing control.
In this step, the sliding operation on the target editing control is acquired. Specifically, it may be determined whether the sliding operation moves the target editing control from its initial position on the pressure touch screen to the target editing area on the pressure touch screen. If so, it is determined whether the area occupied by the moved target editing control matches the target editing area of the image; if it does, step 104 is executed.
The operation of sliding the target editing control from its initial position on the pressure touch screen to the target editing area may specifically take one of two forms: the sliding operation slides directly from the initial position of the target editing control to the target editing area; or the sliding operation slides from the initial position of the target editing control to a preset placement position and then from the placement position to the target editing area.
And 104, if the area of the moved target editing control is matched with the target editing area of the image, executing preset function operation corresponding to the target editing area.
The editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
Matching the moved target editing control against the target editing area of the image may be implemented as follows: if the overlapping area between the moved target editing control and the target editing area of the image is detected to exceed a threshold, the moved control is considered to match that editing area.
The function operations may include: a one-key beautification function, an image rotation function, a filter function, a frame selection function, and the like. Of course, other function operations for editing the image, such as a cropping function, may also be included; the embodiments of the present invention do not list them exhaustively.
In a specific implementation where the function operation is the one-key beautification operation: the target editing area corresponds to the one-key beautification operation, and if the area occupied by the moved target editing control matches that editing area, the one-key beautification operation is executed.
Similarly, where the function operation is the image rotation operation: the target editing area corresponds to the image rotation operation, and if the area occupied by the moved target editing control matches that editing area, the image rotation operation is executed.
In the embodiment of the invention, the target editing control is set to a slidable state, and if the area occupied by the moved control matches the target editing area (one of the at least one editing area of the image), the preset function operation corresponding to that area is executed. The user can therefore slide the target editing control freely and trigger the function operation of whichever editing area the control matches, without repeatedly clicking the control to switch the image editing mode. The operation is simple and convenient, complex image edits can be performed with simple one-handed operation, and the user experience is improved.
Optionally, as an embodiment, the step 104 may be specifically implemented as:
and if the overlapping area of the moved target editing control and the target editing area of the image exceeds a threshold value, executing preset function operation corresponding to the target editing area.
The threshold may be selected according to actual requirements, and the embodiment of the present invention is not particularly limited.
Of course, other implementations of step 104 may exist, and embodiments of the present invention are not limited in particular.
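As a minimal Kotlin sketch of the overlap-threshold rule above (the 50% default threshold, the function name, and the rectangle-based model are assumptions, not values fixed by the patent):

```kotlin
import android.graphics.Rect

// Hypothetical matching rule for step 104: the moved control matches an
// editing area when their intersection exceeds a threshold fraction of the
// control's own area.
fun matchesEditingArea(control: Rect, editingArea: Rect, threshold: Double = 0.5): Boolean {
    val overlap = Rect()
    // setIntersect() returns false when the rectangles do not intersect.
    if (!overlap.setIntersect(control, editingArea)) return false
    val overlapArea = overlap.width().toLong() * overlap.height()
    val controlArea = control.width().toLong() * control.height()
    return controlArea > 0 && overlapArea >= threshold * controlArea
}
```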
Optionally, as an embodiment, the step 103 may be specifically implemented as:
and acquiring whether the sliding operation of the target editing control is an operation of sliding the target editing control to the target editing area on the pressure touch screen from the initial position of the pressure touch screen.
The specific implementation manner may include: firstly, acquiring whether the sliding operation of the target editing control is the sliding operation from the initial position of the target editing control to a target editing area;
or, secondly, acquiring whether the sliding operation of the target editing control is an operation of sliding from the initial position of the target editing space to a preset placing position and then sliding from the preset placing position to the target editing area.
And if so, determining whether the area of the moved target editing control is matched with the target editing area of the image. If yes, go to step 104.
In the embodiment of the invention, compared with the first implementation, the second implementation lets the target editing control rest in the placement area, from which the user can slide it to any editing area. The user therefore does not need to drag the control repeatedly from its initial position to each editing area, which makes the operation more convenient.
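The two sliding modes could be distinguished, for example, as in the following hypothetical sketch; the SlidePath type and the recorded touch track are illustrative assumptions rather than the patent's own data structures:

```kotlin
import android.graphics.Point
import android.graphics.Rect

// Hypothetical classification of step 103's two slide forms: a direct slide
// into an editing area, or a slide that first passes through the placement area.
sealed interface SlidePath {
    data class Direct(val target: Rect) : SlidePath
    data class ViaPlacement(val placement: Rect, val target: Rect) : SlidePath
}

fun classifySlide(track: List<Point>, placement: Rect, editingAreas: List<Rect>): SlidePath? {
    val end = track.lastOrNull() ?: return null
    // The slide only counts if it ends inside some editing area.
    val target = editingAreas.firstOrNull { it.contains(end.x, end.y) } ?: return null
    val passedPlacement = track.any { placement.contains(it.x, it.y) }
    return if (passedPlacement) SlidePath.ViaPlacement(placement, target)
    else SlidePath.Direct(target)
}
```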
Optionally, as an embodiment, the editing interface may further include a placement area, where the placement area is a storage area of the target editing control.
After step 102 is executed, the method may further include:
and when the target editing control is detected to be dragged to the drop area, setting the editing area to be in a display state displayed with preset transparency, wherein the initial display state of the editing area is a hidden display state.
The preset transparency can be set according to actual requirements. In particular implementations, the predetermined transparency can be a transparency of 50%.
According to the embodiment of the invention, after the target editing control is dragged into the placement area, the editing areas are displayed with the predetermined transparency, so that the user can see every editing area available for selection, which makes editing more convenient.
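A minimal sketch of revealing the hidden editing areas at the predetermined transparency; the 50% alpha follows the example above, and the view-based representation is an assumption:

```kotlin
import android.view.View

// Hypothetical reveal of the editing areas once the control reaches the
// placement area; their initial display state is hidden.
fun revealEditingAreas(editingAreaViews: List<View>, presetAlpha: Float = 0.5f) {
    editingAreaViews.forEach { area ->
        area.visibility = View.VISIBLE
        area.alpha = presetAlpha // the "predetermined transparency"
    }
}
```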
Optionally, as an embodiment, after the area occupied by the moved target editing control matches the target editing area of the image, the method further includes:
and setting the editing area to be in a display state displayed with preset transparency, wherein the initial display state of the editing area is a hidden display state.
The preset transparency can be set according to actual requirements. In particular implementations, the predetermined transparency can be a transparency of 50%.
According to the embodiment of the invention, once the moved target editing control matches the target editing area of the image, the editing areas are displayed with the predetermined transparency, so that the user can see every editing area available for selection, which makes editing more convenient.
Optionally, as an embodiment, after the executing 104, the method may further include:
and setting the editing areas except the target editing area to be in a hidden display state.
According to the embodiment of the invention, after the corresponding function operation is executed on the target editing area, the editing areas other than the target editing area are set to a hidden display state. This prevents the editing areas from occluding the image and interfering with the user's view of it; at the same time, keeping the target editing area visible while hiding the others reminds the user which function operation is being executed.
Optionally, as another embodiment, after the executing 104, the method may further include:
detecting shaking state information of the mobile terminal; the shaking state information may include a shaking frequency, a shaking direction, a shaking time, and the like.
Determining a function operation instruction corresponding to the shaking state information according to a preset corresponding relation between the shaking state information and the function operation instruction;
and executing a function processing operation corresponding to the function operation instruction on the image according to the function operation instruction.
In a specific implementation, taking the rotation operation as an example: one counterclockwise shake corresponds to an instruction to rotate 10 degrees counterclockwise, and one clockwise shake corresponds to an instruction to rotate 10 degrees clockwise. When the mobile terminal is detected to shake counterclockwise once, the counterclockwise instruction is determined and the image is rotated 10 degrees counterclockwise accordingly; likewise, when a clockwise shake is detected, the image is rotated 10 degrees clockwise.
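The shake-to-instruction correspondence of the rotation example could be tabulated as in this hypothetical sketch; detecting the shake itself is omitted here (see the hardware discussion later), and the ±10-degree values follow the example above:

```kotlin
// Hypothetical mapping from a detected shake to a rotation instruction.
enum class ShakeDirection { CLOCKWISE, COUNTERCLOCKWISE }

data class RotateInstruction(val degrees: Float)

val shakeToInstruction: Map<ShakeDirection, RotateInstruction> = mapOf(
    ShakeDirection.COUNTERCLOCKWISE to RotateInstruction(-10f), // one CCW shake
    ShakeDirection.CLOCKWISE to RotateInstruction(+10f),        // one CW shake
)

fun onShake(direction: ShakeDirection, rotateImage: (Float) -> Unit) {
    shakeToInstruction[direction]?.let { rotateImage(it.degrees) }
}
```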
According to the embodiment of the invention, after the function operation corresponding to the target editing area has been executed, the shaking state information of the mobile terminal is detected, and the corresponding function processing operation is performed on the image according to the preset correspondence between shaking state information and function operation instructions. The user can thus process the image simply by shaking the mobile terminal, without sliding across the screen in the corresponding image editing mode. This reduces the clicking and sliding the user's thumb must perform, keeps the operation simple, and improves the user experience.
Optionally, as another embodiment, the method further includes:
determining whether to perform a corresponding functional processing operation on the image within a second predetermined time;
the second predetermined time may be set according to actual requirements, and in practical implementation, the second predetermined time may be 0.5s, 1s, and 3 s. Preferably, the second predetermined time is 1 s.
If so, displaying the target editing area according to a preset transparency from the moment of executing the corresponding function processing operation on the image;
that is, if it is detected that the corresponding function processing operation is performed on the image within the second predetermined time, the target editing region is displayed at a predetermined transparency from the time when the corresponding function processing operation is performed on the image.
In this way, the user can clearly view the target editing area while the image is being processed, which makes the ongoing operation on the image evident.
And if not, the target editing area is displayed in a mode that fades over time.
In this embodiment of the application, the displayed target editing area on the screen of the mobile terminal gradually fades as time passes until it disappears.
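As an illustrative sketch of this fade-out branch (the duration and the view-property animation are assumptions):

```kotlin
import android.view.View

// Hypothetical fade-out of the target editing area when no processing
// operation arrives within the second predetermined time.
fun fadeOutTargetArea(targetArea: View, fadeMillis: Long = 1_500L) {
    targetArea.animate()
        .alpha(0f)                 // grows ever more faint...
        .setDuration(fadeMillis)
        .withEndAction { targetArea.visibility = View.GONE } // ...until it disappears
        .start()
}
```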
Optionally, as another embodiment, the method further includes: the image is saved.
Specifically, saving the image may be implemented as follows: after the corresponding function processing operation is performed on the image, a prompt dialog asking whether to save the image is displayed; the touch information on the confirm control in the dialog is acquired, and the processed image is saved according to that touch information.
Alternatively: first touch information generated by the user touching the screen of the mobile terminal is acquired, and the save-prompt dialog is displayed according to it; second touch information on the confirm control in the dialog is then acquired, and the processed image is saved according to the second touch information.
Of course, the specific implementation of saving the image may also adopt other manners, and the embodiment of the present invention is not particularly limited.
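The first save flow might look like the following hypothetical sketch; the dialog text, file location, and PNG encoding are assumptions for illustration:

```kotlin
import android.app.AlertDialog
import android.content.Context
import android.graphics.Bitmap
import java.io.File

// Hypothetical save prompt shown after the processing operation completes;
// the edited bitmap is persisted only when the user confirms.
fun promptAndSave(context: Context, edited: Bitmap) {
    AlertDialog.Builder(context)
        .setMessage("Save the edited image?")
        .setPositiveButton("Save") { _, _ ->
            File(context.filesDir, "edited.png").outputStream().use { out ->
                edited.compress(Bitmap.CompressFormat.PNG, 100, out)
            }
        }
        .setNegativeButton("Cancel", null) // corresponds to the not-save branch
        .show()
}
```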
The method of the embodiments of the present invention will be further described with reference to specific embodiments.
FIG. 2 is a flowchart illustrating the image editing method according to an embodiment of the present invention in the first practical application scenario.
specifically, as shown in fig. 2:
at 210, an image to be edited is selected and an image editing interface is entered.
The image editing interface can comprise a target editing control, at least one editing area and other editing controls, wherein each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
For example, as shown in fig. 3 to 8, reference numeral 11 denotes a sharing edit control, 12 the target edit control, 13 a delete edit control, and 14 a 'more' edit control.
In this step, the user unlocks the mobile terminal, enters the album, selects an image to be edited, clicks the image to be edited, and enters an image editing interface, that is, the image to be edited is in a preview mode (as shown in fig. 3).
Before step 210 is performed, an editing area in the image editing interface is set to a hidden display state.
At 220, a target edit control in the image editing interface is launched.
Specifically: the pressing operation on the target editing control in the image editing interface is detected, and if the duration of the pressing operation is longer than the first predetermined time, the target editing control is set to a slidable state.
At 230, the corresponding functional operation is performed on the target editing region.
The specific implementation is as follows: dragging the target editing control from the initial position of the target editing control to a target editing area (such as a frame 1 in fig. 4), and if the area of the target editing control after the movement is matched with the target editing area of the image, executing corresponding functional operation on the target editing area.
As above, the moved target editing control matches the target editing area of the image if their overlapping area exceeds a threshold.
After the moved target editing control matches the target editing area of the image, the editing areas are displayed with the preset transparency. After the corresponding function operation is performed on the target editing area, the editing areas other than the target editing area are set to a hidden display state (as shown in fig. 4).
At 240, a corresponding function processing operation is performed on the image according to the shaking state information of the mobile terminal.
Specifically: shaking state information of the mobile terminal is detected; it may include a shaking frequency, a shaking direction, a shaking time, and the like. The function operation instruction corresponding to the shaking state information is determined according to the preset correspondence between shaking state information and function operation instructions, and the corresponding function processing operation is executed on the image according to that instruction.
In a specific implementation, taking the frame operation as an example: one counterclockwise shake corresponds to a frame-replacement operation instruction. As shown in fig. 5, when the mobile terminal is detected to shake counterclockwise once, the frame-replacement instruction is determined and the frame of the image is replaced accordingly.
At 250, the image is saved.
In a specific implementation, prompt information asking whether to save the image is displayed, and the touch operation information the user inputs in response is acquired. If the touch operation information is a cancel operation, step 251 is executed and the image is not saved; if it is a save operation, step 252 is executed and the image is saved.
The flow of the image editing method provided by the embodiment of the invention in the second practical application scenario is the same as that of fig. 2.
specifically, as shown in fig. 2:
at 210, an image to be edited is selected and an image editing interface is entered.
This step is as described in the foregoing embodiment and is not repeated here.
At 220, a target edit control in the image editing interface is launched.
This step is as described in the foregoing embodiment and is not repeated here.
At 230, the corresponding functional operation is performed on the target editing region.
The specific implementation is as follows: the target editing control is dragged from its initial position to a preset placement position (as shown in fig. 6), then slid from the placement position along the arrow to the target editing area (as shown in fig. 7); if the area occupied by the moved target editing control matches the target editing area of the image, the corresponding function operation is executed on the target editing area.
As above, the moved target editing control matches the target editing area of the image if their overlapping area exceeds a threshold.
After the target editing control is dragged to the preset placement position, the editing areas are displayed with the preset transparency (as shown in fig. 7). After the corresponding function operation is performed on the target editing area, the editing areas other than the target editing area are set to a hidden display state (as shown in fig. 8).
At 240, a corresponding function processing operation is performed on the image according to the shaking state information of the mobile terminal.
Specifically: shaking state information of the mobile terminal is detected; it may include a shaking frequency, a shaking direction, a shaking time, and the like. The function operation instruction corresponding to the shaking state information is determined according to the preset correspondence between shaking state information and function operation instructions, and the corresponding function processing operation is executed on the image according to that instruction.
In a specific implementation, taking the rotation operation as an example: one counterclockwise shake corresponds to an instruction to rotate 10 degrees counterclockwise. When the mobile terminal is detected to shake counterclockwise once, the instruction is determined and the image is rotated 10 degrees counterclockwise accordingly.
At 250, the image is saved.
This step is as described in the foregoing embodiment and is not repeated here.
The image editing method according to the embodiment of the present invention is described in detail above with reference to fig. 1 to 8, and the mobile terminal according to the embodiment of the present invention is described in detail below with reference to fig. 9.
Fig. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. Based on the image editing method provided above, and as shown in fig. 9, the mobile terminal may include:
a first obtaining module 901, configured to obtain a pressing operation on a target editing control in an editing interface of an image;
a first setting module 902, configured to set the target editing control to be in a slidable state if the pressing operation duration is greater than a first predetermined time;
a second obtaining module 903, configured to obtain a sliding operation on the target editing control;
a first executing module 904, configured to execute a preset function operation corresponding to the target editing region if the region where the target editing control is moved matches the target editing region of the image;
the editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
In one embodiment, the first execution module 904 comprises:
and the execution unit is used for executing preset function operation corresponding to the target editing area if the overlapping area of the moved target editing control and the target editing area of the image exceeds a threshold value.
In an embodiment, the editing interface further includes a placement area, and the placement area is a storage area of the target editing control.
The mobile terminal further includes:
a second setting module 905, configured to set the editing areas to a display state with a predetermined transparency after detecting that the target editing control has been dragged into the placement area, where the initial display state of the editing areas is hidden.
In one embodiment, the mobile terminal further comprises:
a third setting module 906, configured to set the editing region to a display state displayed with a predetermined transparency, where an initial display state of the editing region is a hidden display state.
In one embodiment, the mobile terminal further comprises:
a fourth setting module 907 configured to set the editing regions other than the target editing region to a hidden display state.
In one embodiment, the mobile terminal further comprises:
a third obtaining module 908, configured to detect shaking state information of the mobile terminal;
a first determining module 909, configured to determine a function operation instruction corresponding to shaking state information according to a preset correspondence between the shaking state information and the function operation instruction;
a second executing module 910, configured to execute, according to the function operation instruction, a function processing operation corresponding to the function operation instruction on the image.
In one embodiment, further comprising:
a second determining module 911, configured to determine whether to perform a corresponding function processing operation on the image within a second predetermined time;
a first display module 912, configured to, if so, display the target editing region with a predetermined transparency from the time the corresponding function processing operation is performed on the image;
a second display module 913, configured to, if not, display the target editing region in a mode that fades over time.
In the embodiment of the invention, the target editing control is set to a slidable state, and if the area occupied by the moved control matches the target editing area (one of the at least one editing area of the image), the preset function operation corresponding to that area is executed. The user can therefore slide the target editing control freely and trigger the function operation of whichever editing area the control matches, without repeatedly clicking the control to switch the image editing mode. The operation is simple and convenient, complex image edits can be performed with simple one-handed operation, and the user experience is improved.
Fig. 10 is a schematic diagram of the hardware architecture of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 10 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein:
the processor 1010 is used for acquiring the pressing operation of a target editing control in the image editing interface;
if the pressing operation duration is longer than first preset time, setting the target editing control to be in a slidable state;
acquiring sliding operation on the target editing control;
if the area of the target editing control after the movement is matched with the target editing area of the image, executing preset function operation corresponding to the target editing area;
the editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area.
In the embodiment of the invention, the target editing control is set to a slidable state, and if the area occupied by the moved control matches the target editing area (one of the at least one editing area of the image), the preset function operation corresponding to that area is executed. The user can therefore slide the target editing control freely and trigger the function operation of whichever editing area the control matches, without repeatedly clicking the control to switch the image editing mode. The operation is simple and convenient, complex image edits can be performed with simple one-handed operation, and the user experience is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 1010 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. The radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (e.g., a call signal reception sound, a notification event reception sound, etc.). The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive audio or video signals. It may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in video capture or image capture mode; the processed image frames may be displayed on the display unit 1006, stored in the memory 1009 (or another storage medium), or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data; in phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 1001.
The mobile terminal 1000 can also include at least one sensor 1005, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
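As an illustration of how such an accelerometer could feed the shake-driven editing described earlier, here is a hypothetical Kotlin sketch; the threshold and debounce values are assumptions, and distinguishing clockwise from counterclockwise shakes (which would draw on the gyroscope) is omitted:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Hypothetical shake detector: reports a shake when the acceleration
// magnitude exceeds a threshold, with a debounce interval between reports.
class ShakeDetector(
    private val onShake: () -> Unit,
    private val thresholdG: Float = 2.5f,  // assumed trigger level, in g
    private val debounceMillis: Long = 500L
) : SensorEventListener {
    private var lastShakeTime = 0L

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        val now = System.currentTimeMillis()
        if (gForce > thresholdG && now - lastShakeTime > debounceMillis) {
            lastShakeTime = now
            onShake()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

It would be registered against the device's accelerometer via sensorManager.registerListener(detector, accelerometer, SensorManager.SENSOR_DELAY_UI).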
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, collects touch operations by the user on or near it (e.g., operations on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. The touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 10071, the user input unit 1007 can include other input devices 10072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a track ball, a mouse, and a joystick; these are not described here again.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the mobile terminal, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 1008 is an interface through which an external device is connected to the mobile terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1008 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 1000 or may be used to transmit data between the mobile terminal 1000 and external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1010 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the mobile terminal. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The mobile terminal 1000 may also include a power supply 1011 (e.g., a battery) for powering the various components, and the power supply 1011 may be logically coupled to the processor 1010 via a power management system that may be configured to manage charging, discharging, and power consumption.
In addition, the mobile terminal 1000 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010, where the computer program is executed by the processor 1010 to implement each process of the above-mentioned embodiment of the image editing method, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned embodiment of the image editing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (13)

1. An image editing method, comprising:
acquiring a pressing operation of a target editing control in an image editing interface;
if the pressing operation duration is longer than first preset time, setting the target editing control to be in a slidable state;
acquiring sliding operation on the target editing control;
if the area of the target editing control after the movement is matched with the target editing area of the image, executing preset function operation corresponding to the target editing area on the image;
the editing interface comprises at least one editing area, each editing area corresponds to a function operation, and the target editing area is one of the at least one editing area;
acquiring shaking state information of the mobile terminal;
determining a function operation instruction corresponding to the shaking state information according to a preset corresponding relation between the shaking state information and the function operation instruction;
and executing a function processing operation corresponding to the function operation instruction on the image according to the function operation instruction.
2. The method of claim 1, wherein if the moved region of the target editing control matches the target editing region of the image, performing a preset functional operation corresponding to the target editing region on the image, includes:
and if the overlapping area of the moved target editing control and the target editing area of the image exceeds a threshold value, executing preset function operation corresponding to the target editing area.
3. The method of claim 1, further comprising a placement area in the editing interface, wherein the placement area is a storage area of the target editing control;
after the target editing control is set to the slidable state, the method further comprises the following steps:
and when the target editing control is detected to be dragged to the drop area, setting the editing area to be in a display state displayed with preset transparency, wherein the initial display state of the editing area is a hidden display state.
4. The method of claim 1, wherein, after the area where the target editing control is located after the movement matches the target editing area of the image, the method further comprises:
setting the editing area to a display state with a preset transparency, wherein the initial display state of the editing area is a hidden state.
5. The method of claim 1, wherein, after executing the preset function operation corresponding to the target editing area on the image, the method further comprises:
setting the editing areas other than the target editing area to a hidden display state.
6. The method of claim 1, further comprising:
determining whether a corresponding function processing operation is performed on the image within a second preset time;
if yes, displaying the target editing area at a preset transparency from the moment the function operation corresponding to the target editing area is executed on the image; and
if not, displaying the target editing area with a transparency that decreases over time.
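Under one reading, claim 6 is an alpha schedule for the target editing area: keep the preset transparency when the operation occurs within the second preset time; otherwise let the transparency descend (the area grows more opaque) over time. A sketch under that reading; all constants and the linear ramp are assumptions:

```kotlin
// One reading of claim 6 as an alpha schedule; all constants are assumptions.
fun targetAreaAlpha(
    operationPerformedInTime: Boolean,   // result of the second-preset-time check
    elapsedMillis: Long,                 // time elapsed since the check
    presetAlpha: Float = 0.6f,
    rampMillis: Long = 2_000L            // assumed duration of the opacity ramp
): Float =
    if (operationPerformedInTime) {
        presetAlpha                      // "if yes": keep the preset transparency
    } else {
        // "if not": transparency descends over time, i.e. alpha rises linearly,
        // clamped at fully opaque
        val t = elapsedMillis.toFloat() / rampMillis
        (presetAlpha + (1f - presetAlpha) * t).coerceAtMost(1f)
    }
```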
7. A mobile terminal, comprising:
a first acquisition module, configured to acquire a pressing operation on a target editing control in an image editing interface;
a first setting module, configured to set the target editing control to a slidable state if the duration of the pressing operation is longer than a first preset time;
a second acquisition module, configured to acquire a sliding operation on the target editing control;
a first execution module, configured to execute a preset function operation corresponding to a target editing area on the image if the area where the target editing control is located after the movement matches the target editing area of the image;
wherein the editing interface comprises at least one editing area, each editing area corresponds to one function operation, and the target editing area is one of the at least one editing area;
a third acquisition module, configured to acquire shaking state information of the mobile terminal;
a first determining module, configured to determine a function operation instruction corresponding to the shaking state information according to a preset correspondence between shaking state information and function operation instructions; and
a second execution module, configured to execute, on the image, a function processing operation corresponding to the function operation instruction.
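Claim 7 recasts the method of claim 1 as cooperating modules. A hypothetical Kotlin rendering of that decomposition; every interface and method name is an illustrative assumption:

```kotlin
// Hypothetical module split mirroring claim 7; all names are illustrative.
fun interface PressAcquirer { fun pressDurationMillis(): Long }       // first acquisition module
fun interface SlideAcquirer { fun slideMatchedTargetArea(): Boolean } // second acquisition module
fun interface ShakeAcquirer { fun shakeState(): String }              // third acquisition module

class MobileTerminal(
    private val press: PressAcquirer,
    private val slide: SlideAcquirer,
    private val shake: ShakeAcquirer
) {
    // Press/slide branch: first setting module then first execution module.
    fun editOnce(firstPresetMillis: Long = 500L): Boolean {
        val slidable = press.pressDurationMillis() > firstPresetMillis
        return slidable && slide.slideMatchedTargetArea()
    }

    // Shake branch: first determining module resolves the instruction for the state.
    fun shakeInstruction(correspondence: Map<String, String>): String? =
        correspondence[shake.shakeState()]
}
```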
8. The mobile terminal of claim 7, wherein the first execution module comprises:
an execution unit, configured to execute the preset function operation corresponding to the target editing area on the image if the overlapping area between the moved target editing control and the target editing area of the image exceeds a threshold value.
9. The mobile terminal of claim 7, wherein the editing interface further comprises a placement area, the placement area being a storage area for the target editing control; and
the mobile terminal further comprises:
a second setting module, configured to set the editing area to a display state with a preset transparency after it is detected that the target editing control is dragged to the placement area, wherein the initial display state of the editing area is a hidden state.
10. The mobile terminal of claim 7, further comprising:
a third setting module, configured to set the editing area to a display state with a preset transparency, wherein the initial display state of the editing area is a hidden state.
11. The mobile terminal of claim 7, further comprising:
a fourth setting module, configured to set the editing areas other than the target editing area to a hidden display state.
12. The mobile terminal of claim 7, further comprising:
a second determining module, configured to determine whether a corresponding function processing operation is performed on the image within a second preset time;
a first display module, configured to, if the determination is yes, display the target editing area at a preset transparency from the moment the corresponding function processing operation is executed on the image; and
a second display module, configured to, if the determination is no, display the target editing area with a transparency that decreases over time.
13. A mobile terminal, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image editing method according to any one of claims 1 to 6.
CN201711393075.0A 2017-12-21 2017-12-21 Image editing method and mobile terminal Active CN108132749B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711393075.0A CN108132749B (en) 2017-12-21 2017-12-21 Image editing method and mobile terminal


Publications (2)

Publication Number Publication Date
CN108132749A CN108132749A (en) 2018-06-08
CN108132749B true CN108132749B (en) 2020-02-11

Family

ID=62391044

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711393075.0A Active CN108132749B (en) 2017-12-21 2017-12-21 Image editing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108132749B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109034150B (en) * 2018-06-15 2021-09-21 北京小米移动软件有限公司 Image processing method and device
CN111352557B (en) * 2020-02-24 2021-09-14 北京字节跳动网络技术有限公司 Image processing method, assembly, electronic equipment and storage medium
CN111488104B (en) * 2020-04-16 2021-10-12 维沃移动通信有限公司 Font editing method and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3279556B2 (en) * 1990-07-06 2002-04-30 株式会社日立製作所 Data editing method
US20160110091A1 (en) * 2014-10-16 2016-04-21 Sony Corporation Fast and natural one-touch deletion in image editing on mobile devices
CN106354374A (en) * 2016-09-30 2017-01-25 维沃移动通信有限公司 Icon moving method and mobile terminal

Also Published As

Publication number Publication date
CN108132749A (en) 2018-06-08

Similar Documents

Publication Publication Date Title
CN108762954B (en) Object sharing method and mobile terminal
CN109343755B (en) File processing method and terminal equipment
CN108174103B (en) Shooting prompting method and mobile terminal
CN108737904B (en) Video data processing method and mobile terminal
CN108471498B (en) Shooting preview method and terminal
CN108132749B (en) Image editing method and mobile terminal
CN109683802B (en) Icon moving method and terminal
CN108900695B (en) Display processing method, terminal equipment and computer readable storage medium
CN109078319B (en) Game interface display method and terminal
CN109710130B (en) Display method and terminal
CN107908382B (en) Split screen display method and mobile terminal
CN109862172B (en) Screen parameter adjusting method and terminal
CN109508136B (en) Display method of application program and mobile terminal
CN109032445B (en) Screen display control method and terminal equipment
CN108228902B (en) File display method and mobile terminal
CN107728923B (en) Operation processing method and mobile terminal
CN108196753B (en) Interface switching method and mobile terminal
CN107885423B (en) Picture processing method and mobile terminal
CN107992342B (en) Application configuration changing method and mobile terminal
CN110855921B (en) Video recording control method and electronic equipment
CN110213437B (en) Editing method and mobile terminal
CN109407948B (en) Interface display method and mobile terminal
CN110795189A (en) Application starting method and electronic equipment
CN108021315B (en) Control method and mobile terminal
CN110647506B (en) Picture deleting method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant