CN108519846B - Image editing processing method and terminal - Google Patents

Image editing processing method and terminal

Info

Publication number
CN108519846B
Authority
CN
China
Prior art keywords
image editing
input
terminal
target control
processing
Prior art date
Legal status
Active
Application number
CN201810264652.4A
Other languages
Chinese (zh)
Other versions
CN108519846A (en)
Inventor
肇宇飞
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810264652.4A
Publication of CN108519846A
Application granted
Publication of CN108519846B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04883: GUI interaction using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: GUI interaction using a touch-screen or digitiser, partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/60: 2D image generation; editing figures and text; combining figures or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an image editing processing method and a terminal. The method, applied to the terminal, comprises the following steps: receiving a first input performed by a user on a target control displayed on a current interface; and, in response to the first input, performing a first processing operation on N performed steps of image editing operations according to operation parameters of the first input, wherein the first processing operation comprises an undo (withdrawal) operation or a redo (recovery) operation, and N is a positive integer. Compared with the prior art, the invention can effectively increase the image editing speed and thus achieves high image editing efficiency.

Description

Image editing processing method and terminal
Technical Field
Embodiments of the present invention relate to the field of communications technologies, and in particular to an image editing processing method and a terminal.
Background
With the development and popularization of terminals such as mobile phones, many users have become accustomed to sharing images in a circle of friends or in a group. At present, many terminals provide image editing functions (such as a whitening function). Before sharing an image, a user can edit the image to be shared with the terminal's image editing functions until it reaches a satisfactory state, and then share the edited image to the circle of friends or group.
It should be noted that the image editing functions of existing terminals are often powerful and complex, and a user usually performs multiple steps of image editing operations while editing an image. If the user is not satisfied with the effect of the performed operations, the user can only undo or redo them step by step by manually tapping an undo key or a redo key, which seriously slows down image editing and results in very low image editing efficiency.
Disclosure of Invention
The embodiment of the invention provides an image editing processing method and a terminal, and aims to solve the problem of low image editing efficiency in the prior art.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image editing processing method applied to a terminal, the method comprising:
receiving a first input performed by a user on a target control displayed on a current interface; and
in response to the first input, performing a first processing operation on N performed steps of image editing operations according to operation parameters of the first input;
wherein the first processing operation comprises an undo operation or a redo operation, and N is a positive integer.
In a second aspect, an embodiment of the present invention provides a terminal, the terminal comprising:
a first receiving module, configured to receive a first input performed by a user on a target control displayed on a current interface; and
a first execution module, configured to perform, in response to the first input, a first processing operation on N performed steps of image editing operations according to operation parameters of the first input;
wherein the first processing operation comprises an undo operation or a redo operation, and N is a positive integer.
In a third aspect, an embodiment of the present invention provides a terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image editing processing method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the image editing processing method described above.
In the embodiment of the invention, if, during image editing, the user is not satisfied with the effect of the performed image editing operations, the user can perform a first input with corresponding operation parameters on the target control displayed on the current interface of the terminal. On receiving the first input, the terminal responds by performing an undo operation or a redo operation on the N performed steps of image editing operations according to the operation parameters of the first input. Thus, in the embodiment of the invention, the user does not need to manually tap an undo key or a redo key to undo or redo the image editing operations step by step; the terminal undoes or redoes them automatically once the first input with the corresponding operation parameters is performed on the target control. Compared with the prior art, this effectively increases the image editing speed and the image editing efficiency.
Drawings
FIG. 1 is a flowchart of an image editing processing method according to an embodiment of the present invention;
FIG. 2 is a first schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 3 is a second schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 4 is a third schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 5 is a fourth schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 6 is a fifth schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 7 is a sixth schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 8 is a seventh schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 9 is an eighth schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 10 is a ninth schematic interface diagram of an image editing processing method according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of a hardware structure of another terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, an image editing processing method according to an embodiment of the present invention will be described.
It should be noted that the image editing processing method provided by the embodiment of the present invention is applied to a terminal. Specifically, the terminal may be any device having a communication function, such as a computer, a mobile phone, a tablet computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, and the like.
Referring to fig. 1, a flowchart of an image editing processing method according to an embodiment of the present invention is shown. As shown in fig. 1, the method is applied to a terminal, and the method includes the following steps:
Step 101: receiving a first input performed by a user on a target control displayed on a current interface.
In the embodiment of the invention, the terminal display screen may be a notched (irregularly shaped) screen. Specifically, it may be a display screen, as shown in FIGS. 2 to 10, provided with a notch area: a non-display recess at the top of the screen that houses the camera and other components. Of course, the terminal display screen may instead be a regular screen.
It should be noted that, regardless of whether the terminal display screen is notched, the first input can take various forms. The first input may include a touch input; specifically, it may include at least one of: a pull operation that pulls the target control, a press operation that presses the target control, and a drag operation that drags the target control.
In step 101, the current interface may be an application interface of an image processing application, and the target control may be a line-shaped control displayed on the application interface, which may be, but is not limited to being, named "Touch-line" or "Touch line", and may be understood as a touch line, control line, operation line, or operation control. Specifically, the target control may be a solid straight line, a dashed straight line, a solid curve, or a dashed curve; its width may range from 1 cm to 5 cm, and it may also be strip-shaped. Of course, the target control may also take other shapes (e.g., rectangular, circular) and width ranges, which are not limited by the embodiments of the present invention.
Step 102: in response to the first input, performing a first processing operation on the N performed steps of image editing operations according to the operation parameters of the first input.
The first processing operation comprises an undo operation or a redo operation, and N is a positive integer.
Specifically, the operation parameters of the first input include an operation direction, an operation distance, an operation pressure, an operation position, and the like; the performed image editing operations include a skin-smoothing operation, a whitening operation, a mosaic operation, a rotation operation, and the like, which are not exhaustively listed here.
It should be noted that, in step 102, according to the operation parameters of the first input, the terminal may specifically perform the first processing operation on those N of the performed image editing operations whose execution times are closest to the current time (i.e., the N most recently performed image editing operations).
In the embodiment of the invention, if, during image editing, the user is not satisfied with the effect of the performed image editing operations, the user can perform a first input with corresponding operation parameters on the target control displayed on the current interface of the terminal. On receiving the first input, the terminal responds by performing an undo operation or a redo operation on the N performed steps of image editing operations according to the operation parameters of the first input. It can be seen that the user does not need to manually tap an undo key or a redo key to undo or redo the image editing operations step by step; the terminal undoes or redoes them automatically once the first input with the corresponding operation parameters is performed on the target control. Compared with the prior art, this effectively increases the image editing speed and the image editing efficiency.
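The batched undo/redo bookkeeping described above can be sketched with two stacks. This is a minimal illustration under assumed names, not the patent's implementation:

```python
# Sketch of an edit history that can undo or redo its N most recent
# image editing operations in one call. All names are invented.
class EditHistory:
    def __init__(self):
        self.done = []      # operations performed, oldest first
        self.undone = []    # operations undone, most recently undone last

    def perform(self, op):
        self.done.append(op)
        self.undone.clear()  # a new edit invalidates the redo history

    def undo(self, n):
        """Undo the n most recent operations in one step."""
        for _ in range(min(n, len(self.done))):
            self.undone.append(self.done.pop())

    def redo(self, n):
        """Redo the n most recently undone operations in one step."""
        for _ in range(min(n, len(self.undone))):
            self.done.append(self.undone.pop())

history = EditHistory()
for op in ["rotate", "mosaic", "whiten"]:
    history.perform(op)
history.undo(2)                  # removes "whiten" then "mosaic"
print(history.done)              # ['rotate']
history.redo(1)                  # restores "mosaic"
print(history.done)              # ['rotate', 'mosaic']
```

A single pull with a large operation parameter thus replaces N separate taps on an undo or redo key.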
Optionally, before the first input performed by the user on the target control displayed on the current interface is received, the method further comprises:
receiving a second input from the user.
The second input may include a voice input, a touch input on the current interface, and the like. Specifically, the touch input at the current interface may include at least one of: click operation, press operation, and slide operation.
In response to the second input, a target control is displayed.
It should be noted that the target control displayed by the terminal may be a preset control, for example the touch line 20 shown in FIG. 2, where the preset control is a dashed straight line. The target control displayed by the terminal may also be a control generated according to the second input.
In the embodiment of the invention, the target control need not be displayed when there is no need to undo or redo the performed image editing operations, thereby avoiding interference with the content displayed on the screen. When a performed image editing operation does need to be undone or redone, the user can cause the target control to be displayed by performing the second input, and then undo or redo the performed operations by performing the first input, which improves image editing efficiency.
Optionally, performing the first processing operation on the N performed steps of image editing operations according to the operation parameters of the first input includes:
determining a first processing operation type according to a first-type operation parameter of the first input, wherein the first processing operation type comprises an undo type or a redo type.
The first-type operation parameter may include at least one of: operation direction, operation pressure, operation position.
Specifically, the terminal may pre-store a correspondence between operation directions and processing operation types. According to that correspondence, the terminal determines the processing operation type corresponding to the operation direction of the first input and uses it as the first processing operation type. The terminal may instead pre-store a correspondence between operation pressure ranges and processing operation types; it then determines the processing operation type corresponding to the pressure range into which the operation pressure of the first input falls, and uses it as the first processing operation type.
In this way, the terminal can conveniently determine the first processing operation type from the first-type operation parameter.
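Such a pre-stored correspondence can be as simple as a lookup table. The sketch below uses the four directions from the example correspondence given later in this description; the table itself is an illustrative assumption:

```python
# Pre-stored correspondence between operation directions and processing
# operation types (an assumed encoding, not the patent's data structure).
DIRECTION_TO_TYPE = {
    "lower-left": "redo",    # recovery type
    "lower-right": "undo",   # withdrawal type
    "upper-right": "redo",
    "upper-left": "undo",
}

def first_processing_type(direction):
    """Determine the first processing operation type from the direction."""
    return DIRECTION_TO_TYPE[direction]

print(first_processing_type("lower-right"))   # undo
```

A pressure-based correspondence would work the same way, with pressure ranges as keys instead of directions.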
Determining the step count N according to a second-type operation parameter of the first input.
The second-type operation parameter may include at least one of: operation distance, operation pressure, operation position.
Specifically, the terminal may pre-store a correspondence between operation distance ranges and step counts. According to that correspondence, the terminal determines the step count corresponding to the distance range into which the operation distance of the first input falls and uses it as N. The terminal may instead pre-store a correspondence between operation areas and step counts; it then determines the step count corresponding to the operation area in which the operation position of the first input lies and uses it as N.
In this way, the terminal can conveniently determine the step count N from the second-type operation parameter.
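For instance, the distance-to-N correspondence can be a sorted list of thresholds. The pixel values below are invented for illustration; the patent only states that such a correspondence is stored in advance:

```python
import bisect

# Pre-stored operation-distance ranges and their step counts:
# N grows by one each time the pull distance passes a threshold.
STEP_THRESHOLDS = [100, 200, 300, 400]   # assumed values, in pixels

def steps_for_distance(distance):
    """Map the first input's operation distance to the step count N."""
    return bisect.bisect_right(STEP_THRESHOLDS, distance)

print(steps_for_distance(250))   # 2
```

A correspondence between operation areas and step counts could be implemented the same way, with screen regions in place of distance ranges.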
Performing, on the N performed steps of image editing operations, the first processing operation corresponding to the first processing operation type.
Here, "the first processing operation corresponding to the first processing operation type" means: when the first processing operation type is the undo type, the first processing operation is an undo operation; when the first processing operation type is the redo type, the first processing operation is a redo operation.
It can be seen that, in the embodiment of the present invention, based on the first-type and second-type operation parameters, the terminal can very conveniently determine the first processing operation type and the step count N, and thus perform the corresponding first processing operation on the corresponding image editing operations.
Optionally, the first input comprises a pull operation that pulls the target control.
In this case, performing the first processing operation on the N performed steps of image editing operations in response to the first input, according to its operation parameters, includes:
in response to the pull operation, updating the display form of the target control according to the operation direction and operation distance of the pull operation during the pulling process.
During the pulling process, the terminal may update display parameters of the target control, such as its length and curvature, according to the operation direction and distance of the pull operation, thereby updating the control's display form.
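One possible display-form update is to derive the touch line's curvature directly from the pull distance. This is a toy sketch; the scaling factor and cap are assumptions, not values from the patent:

```python
def touch_line_sag(pull_distance, max_sag=80.0):
    """Depth (in pixels) of the touch line's curve for a given pull
    distance; the curvature grows with the pull and then saturates."""
    return min(pull_distance * 0.4, max_sag)

print(touch_line_sag(50))    # 20.0
print(touch_line_sag(500))   # 80.0
```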
Acquiring relative-position change information of the target control with respect to a target boundary line or a target sub-area, where the relative position of the target control and the boundary line or sub-area is either intersecting or separated.
The relative-position change information of the target control and the target boundary line or target sub-area refers to: information that their relative position has changed from intersecting to separated, or from separated to intersecting.
In the embodiment of the present invention, as shown in FIGS. 3 to 10, a plurality of boundary lines (which may be solid or dashed) may be arranged on the terminal display along a preset direction, and the target boundary line may be any one of them.
The preset direction may be from the top of the screen toward the bottom, and each boundary line may correspond to a performed image editing operation. Specifically, the boundary line closest to the top of the screen (boundary line 30) may correspond to the most recently performed image editing operation (assumed to be a whitening operation); the second closest (boundary line 40) may correspond to the second most recent operation (assumed to be a mosaic operation); and the third closest (boundary line 50) may correspond to the third most recent operation (assumed to be a rotation operation).
It should be noted that the boundary lines divide the terminal display screen into a plurality of sub-areas, and the target sub-area may be any one of them. The sub-area 60 between boundary lines 30 and 40 may be regarded as corresponding to the whitening operation (the operation corresponding to boundary line 30); the sub-area 70 between boundary lines 40 and 50 as corresponding to the mosaic operation (boundary line 40); and the sub-area 80 below boundary line 50 as corresponding to the rotation operation (boundary line 50).
Performing the first processing operation on the N steps of image editing operations corresponding to the relative-position change information.
Here, the N steps of image editing operations corresponding to the relative-position change information are those operations for each of which the relative position of its corresponding boundary line or sub-area and the target control has changed from intersecting to separated, or from separated to intersecting.
For ease of understanding, specific implementations of embodiments of the present invention are described below with reference to FIGS. 2 to 10.
It should be noted that, in each of the following examples, the terminal pre-stores a correspondence between operation directions and processing operation types: the lower-left direction corresponds to the redo type, the lower-right direction to the undo type, the upper-right direction to the redo type, and the upper-left direction to the undo type.
In a first example, during image editing, assume the user performs three image editing operations on the image to be shared, in order from earliest to latest: a rotation operation, a mosaic operation, and a whitening operation.
Assuming the user is not satisfied with the effect of these three operations, the user may first perform the second input; the terminal then displays, in response, the touch line 20 shown in FIG. 2. The terminal may also display boundary lines 30, 40, and 50, all of which are currently separated from the touch line 20, as are the sub-areas 60, 70, and 80.
Next, the user can pull the touch line 20 of FIG. 2 in the lower-right direction. The terminal then takes the undo operation as the first processing operation, and during the pull the display form of the touch line 20 is updated from a straight line to a curve.
After the touch line 20 has been pulled some distance in the lower-right direction, it intersects boundary line 30 while remaining separated from boundary lines 40 and 50. Since the relative position of the touch line 20 and boundary line 30 has changed from separated to intersecting, the terminal performs the undo operation on the whitening operation corresponding to boundary line 30.
After the touch line 20 is pulled a further distance in the lower-right direction, it intersects boundary line 40 in addition to boundary line 30, while remaining separated from boundary line 50. The relative position of the touch line 20 and boundary line 40 has changed from separated to intersecting, so the terminal undoes the mosaic operation corresponding to boundary line 40.
After the touch line 20 is pulled yet further in the lower-right direction, assume it takes the display form shown in FIG. 3 and intersects boundary lines 30, 40, and 50 simultaneously. The relative position of the touch line 20 and boundary line 50 has now changed from separated to intersecting, so the terminal undoes the rotation operation corresponding to boundary line 50.
It can be seen that, in the first example, as the user pulls the touch line 20 in the lower-right direction, the terminal undoes the performed image editing operations in order, and the longer the pull distance, the more steps of image editing operations are undone (i.e., the larger N is).
In a second example, building on the first, assume the touch line 20 has the display form shown in FIG. 4 after the rotation, mosaic, and whitening operations have all been undone, and the user wants to redo all three. The user may pull the touch line 20 in the upper-right direction. The terminal then takes the redo operation as the first processing operation, and during the pull the display form of the touch line 20 is updated so that it gradually straightens from a curve back into a straight line.
After the touch line 20 is pulled some distance in the upper-right direction, it is updated to the display form shown in FIG. 5, and the terminal redoes the rotation operation corresponding to boundary line 50. As the pull continues in the upper-right direction and the touch line 20 changes from intersecting to separated from boundary line 40, the terminal redoes the mosaic operation corresponding to boundary line 40. Thereafter, as the touch line 20 changes from intersecting to separated from boundary line 30, the terminal redoes the whitening operation corresponding to boundary line 30.
It can be seen that, in the second example, as the user pulls the touch line 20 in the upper-right direction, the terminal may sequentially perform the restore operation on the image editing operations that have been performed, and the longer the pulling distance, the greater the total number of steps of the restored image editing operations.
In the third example, assume that the user has performed three image editing operations (the whitening operation, the mosaic operation, and the rotation operation) and that all three have been undone. If the user needs to restore the three image editing operations, the user may perform the second input, and the terminal then displays the touch line 20 shown in fig. 2, similar to the first example.
Next, the user can pull the touch line 20 shown in fig. 2 in the lower-left direction. At this time, the terminal uses the restore operation as the first processing operation, and during the pulling process, the display form of the touch line 20 is updated from a straight state to a curved state.
After the touch line 20 is pulled a distance in the lower-left direction, the touch line 20 and the boundary line 30 change from separated to intersecting, so the terminal can perform the restore operation on the whitening operation corresponding to the boundary line 30. After the touch line 20 is pulled further in the lower-left direction for a certain distance, the touch line 20 and the boundary line 40 change from separated to intersecting, so the terminal can perform the restore operation on the mosaic operation corresponding to the boundary line 40. When the touch line 20 is pulled still further in the lower-left direction, the touch line 20 is updated to the display form shown in fig. 6 or fig. 7; the touch line 20 and the boundary line 50 likewise change from separated to intersecting, so the terminal can perform the restore operation on the rotation operation corresponding to the boundary line 50.
It can be seen that, in the third example, as the user pulls the touch line 20 in the lower-left direction, the terminal may sequentially perform the restore operation on the image editing operations that have been performed, and the longer the pulling distance, the greater the total number of steps of the restored image editing operations.
In the fourth example, on the basis of the third example, assume that the touch line 20 has the display form shown in fig. 7 after the rotation operation, the mosaic operation, and the whitening operation have all been restored, and the user needs to undo all three image editing operations. The user may then pull the touch line 20 in the upper-left direction. At this time, the terminal uses the undo operation as the first processing operation, and the display form of the touch line 20 is updated during the pulling process, so that the touch line 20 gradually returns from the curved shape to the straight shape.
After the touch line 20 is pulled in the upper-left direction for a certain distance, the touch line 20 is updated to the display form shown in fig. 8, and the terminal can perform the undo operation on the rotation operation corresponding to the boundary line 50. As the pulling continues in the upper-left direction, the touch line 20 changes from intersecting to separated from the boundary line 40, so the terminal performs the undo operation on the mosaic operation corresponding to the boundary line 40. Thereafter, as the touch line 20 changes from intersecting to separated from the boundary line 30, the terminal performs the undo operation on the whitening operation corresponding to the boundary line 30.
It can be seen that, in the fourth example, as the user pulls the touch line 20 in the upper-left direction, the terminal may perform the undo operation on the already performed image editing operations in order, and the longer the pulling distance, the greater the total number of steps of the undone image editing operations.
It can be seen that, according to the distance and direction in which the user pulls the touch line 20, the embodiment of the present invention can conveniently undo or restore a multi-step image editing operation on the basis of a single pulling operation.
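Summarizing the four examples, the pull direction alone selects between undo and restore. A minimal dispatch table can sketch this; the direction labels are assumptions made for the sketch:

```python
# Hypothetical mapping of pull direction to the first processing operation,
# following the four examples above.
DIRECTION_TO_OPERATION = {
    "lower-right": "undo",     # first example
    "upper-right": "restore",  # second example
    "lower-left": "restore",   # third example
    "upper-left": "undo",      # fourth example
}

def first_processing_operation(pull_direction):
    """Select the first processing operation from the pull direction."""
    return DIRECTION_TO_OPERATION[pull_direction]
```

The distance then only decides how many steps the chosen operation covers.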
Optionally, after receiving the first input from the user on the target control displayed on the current interface, the method further includes:
displaying operation information of each step of image editing operation in the N steps of image editing operation;
the operation information includes operation step number and/or operation name.
Specifically, the operation step number of each image editing operation may be its position in reverse chronological order among the image editing operations that have been performed, so that the operation step number of the whitening operation may be "1", the operation step number of the mosaic operation may be "2", and the operation step number of the rotation operation may be "3".
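Reverse-order numbering as described here can be sketched as follows (the helper name and history list are assumptions for illustration):

```python
def reverse_order_step_numbers(history):
    """Number the performed operations in reverse chronological order:
    the most recently performed operation gets step number 1."""
    return {op: i for i, op in enumerate(reversed(history), start=1)}

# History in the order performed: rotation first, whitening last.
numbers = reverse_order_step_numbers(["rotation", "mosaic", "whitening"])
# whitening -> 1, mosaic -> 2, rotation -> 3, matching the text above.
```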
In the embodiment of the present invention, the boundary lines in fig. 3 to 10 may be displayed, for example, when the user starts to pull the touch line 20. In addition, the terminal may also display the operation information of the corresponding image editing operation on each boundary line.
Specifically, as shown in fig. 9 and 10, the terminal may display the operation information of the whitening operation, for example the operation step number "1", on the boundary line 30; the operation information of the mosaic operation, for example the operation step number "2", on the boundary line 40; and the operation information of the rotation operation, for example the operation step number "3", on the boundary line 50. Alternatively, the operation step numbers may be displayed in a display frame 90.
Therefore, the user can pull the target control to the corresponding position according to the actual requirement, so as to change the relative position between the target control and the boundary line or sub-region corresponding to the image editing operation to be undone or restored, thereby conveniently undoing or restoring that image editing operation.
Optionally, after the operation information of each of the N image editing operations is displayed, the method further includes:
a third input of the user on operation information of a target image editing operation among the N-step image editing operations is received.
Wherein the third input may include a pressing operation, a dragging operation, and the like.
In response to a third input, a second processing operation is performed on the target image editing operation.
Wherein, in the case that the first processing operation is an undo operation, the second processing operation is a restore operation; in the case that the first processing operation is a restore operation, the second processing operation is an undo operation.
In the embodiment of the present invention, after the terminal performs the undo operation or the restore operation on the N-step image editing operations in response to the user's pulling operation, if the user wishes to individually restore or undo one of the N-step image editing operations, the user may manually perform the third input on the operation information of that image editing operation.
Wherein, the third input may be a dragging operation. Specifically, the third input may be a dragging operation that drags the operation information of the image editing operation to the inner side of the arc of the touch line 20, or to the outer side of the arc of the touch line 20, or a dragging operation in any direction with an operation pressure within a set pressure range, or a dragging operation in any direction with an operation duration within a set duration range. This may be determined according to practical situations and is not exhaustively listed herein.
In the embodiment of the present invention, after the user performs the pulling operation in the lower-right direction, pulls the touch line 20 to the position shown in fig. 9 (so that the terminal performs the undo operation on the whitening operation, the mosaic operation, and the rotation operation), and releases the finger performing the pulling operation, the interaction is not necessarily finished. If the user needs to keep the whitening operation and the rotation operation undone but individually restore the mosaic operation, the user may manually drag the operation information of the mosaic operation to the inner side of the arc of the touch line 20, for example, drag the operation step number "2" along the direction indicated by the arrow 100, to indicate that the undo of the mosaic operation is cancelled, so that the terminal performs the restore operation on the mosaic operation.
Similarly, after the user performs the pulling operation in the lower-left direction, pulls the touch line 20 to the position shown in fig. 10 (so that the terminal performs the restore operation on the whitening operation, the mosaic operation, and the rotation operation), and releases the finger performing the pulling operation, the interaction is not necessarily finished. If the user needs to keep the whitening operation and the rotation operation restored but individually undo the mosaic operation, the user may manually drag the operation information of the mosaic operation to the inner side of the arc of the touch line 20, for example, drag the operation step number "2" in the direction shown by the arrow 110, to indicate that the restoration of the mosaic operation is cancelled, so that the terminal performs the undo operation on the mosaic operation. Alternatively, the user may double-click or long-press the operation step number "2" to indicate that the restoration of the mosaic operation is cancelled, or perform other touch operations, which are not described in detail herein.
Note that, in fig. 9 and 10, after the user performs the drag operation on the operation step number "2", the user may tap the touch line 20. At this time, the terminal may dismiss the touch line 20 and the boundary lines, or dismiss the boundary lines and update the touch line 20 to the display form shown in fig. 2, so as to end the current image editing processing flow.
It can be seen that, in the embodiment of the present invention, after the executed N-step image editing operations are undone or restored as a whole, the terminal may restore or undo individual steps among them based on the third input of the user, which makes further editing convenient and further improves the image editing efficiency.
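The batch undo followed by selective restore described in this passage can be sketched as a small history manager. The class and method names are assumptions made for the sketch, not APIs from the patent:

```python
class EditHistory:
    """Hypothetical model of the undo-then-selectively-restore flow."""

    def __init__(self, performed):
        self.performed = list(performed)  # oldest operation first
        self.undone = []                  # most recently undone last

    def undo_n(self, n):
        """First input (undo type): undo the last n operations in order."""
        for _ in range(min(n, len(self.performed))):
            self.undone.append(self.performed.pop())

    def restore_one(self, op):
        """Third input: individually restore a single undone operation."""
        self.undone.remove(op)
        self.performed.append(op)

# One pull undoes all three steps; a drag on step number "2" then
# restores only the mosaic operation, as in the fig. 9 scenario.
h = EditHistory(["rotation", "mosaic", "whitening"])
h.undo_n(3)
h.restore_one("mosaic")
```

The key point is that `restore_one` operates out of order, unlike the sequential pull.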
In summary, compared with the prior art, the image editing method and the image editing device can effectively improve the image editing speed and the image editing efficiency.
The following describes a terminal provided in an embodiment of the present invention.
Referring to fig. 11, a schematic structural diagram of a terminal 1100 according to an embodiment of the present invention is shown. As shown in fig. 11, the terminal 1100 includes:
a first receiving module 1101, configured to receive a first input from a user on a target control displayed on the current interface;
a first executing module 1102, configured to, in response to a first input, execute a first processing operation on the executed N-step image editing operation according to an operation parameter of the first input;
the first processing operation comprises an undo operation or a restore operation, and N is a positive integer.
Optionally, the terminal 1100 further includes:
the second receiving module is used for receiving a second input of the user before receiving the first input of the user on the target control displayed on the current interface;
and the first display module is used for responding to the second input and displaying the target control.
Optionally, the first execution module includes:
the first determining submodule is used for determining a first processing operation type according to a first-class operation parameter of the first input, and the first processing operation type comprises an undo type or a restore type;
the second determining submodule is used for determining the step number N according to a second-class operation parameter of the first input;
and the first execution sub-module is used for executing a first processing operation corresponding to the first processing operation type on the executed N-step image editing operation.
Optionally, the first type of operating parameter comprises at least one of: operating direction, operating pressure, operating position.
Optionally, the second type of operating parameter comprises at least one of: operating distance, operating pressure, operating position.
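The two determining sub-modules can be sketched together: a first-class parameter (here, direction) picks the operation type, and a second-class parameter (here, distance) picks N. The direction labels and the distance threshold are assumptions for the sketch:

```python
# Hypothetical combined sketch of the two determining sub-modules.
UNDO_DIRECTIONS = {"lower-right", "upper-left"}

def parse_first_input(direction, distance, step_distance=80):
    """Return (first processing operation type, step count N)."""
    op_type = "undo" if direction in UNDO_DIRECTIONS else "restore"
    return op_type, int(distance // step_distance)
```

Pressure or position could replace either parameter, per the lists above, without changing the structure of the dispatch.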
Optionally, the first input comprises a pull operation of pulling the target control;
a first execution module comprising:
the updating submodule is used for responding to the pulling operation and updating the display form of the target control according to the operation direction and the operation distance of the pulling operation in the pulling process;
the acquisition sub-module is used for acquiring relative position change information of the target control and the target boundary line or the target sub-region;
the second execution sub-module is used for executing the first processing operation on the image editing operation of the N steps corresponding to the relative position change information;
wherein the relative position of the target control and the boundary line or sub-region includes intersecting or separated.
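The intersect/separate test can be approximated in one dimension: the pulled control's reach grows with the operation distance, and a boundary line counts as "intersecting" once the reach passes its offset. The offsets and names below are invented for the sketch:

```python
# Simplified 1-D model of the relative position change information.
# Boundary line offsets (assumed), in the order a pull would reach them.
BOUNDARY_OFFSETS = {"whitening": 60, "mosaic": 120, "rotation": 180}

def relative_position(pull_distance, offset):
    """'intersecting' once the control's reach passes the boundary line."""
    return "intersecting" if pull_distance >= offset else "separated"

def operations_to_process(pull_distance):
    """Operations whose boundary line changed from separated to intersecting."""
    return [op for op, off in BOUNDARY_OFFSETS.items()
            if relative_position(pull_distance, off) == "intersecting"]
```

A real implementation would test 2-D geometry between the curved control and each boundary line, but the separated-to-intersecting transition is the same event.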
Optionally, the terminal 1100 further includes:
the second display module is used for displaying operation information of each step of image editing operation in the N steps of image editing operation after receiving first input of a user to a target control displayed on the current interface;
the operation information includes operation step number and/or operation name.
Optionally, the terminal 1100 further includes:
a third receiving module, configured to receive a third input of the user on the operation information of the target image editing operation in the N-step image editing operations after displaying the operation information of each of the N-step image editing operations;
a second execution module for executing a second processing operation on the target image editing operation in response to a third input;
wherein, in the case that the first processing operation is an undo operation, the second processing operation is a restore operation; in the case that the first processing operation is a restore operation, the second processing operation is an undo operation.
Optionally, the target control is a line type control.
It should be noted that the terminal 1100 provided in the embodiment of the present invention can implement each process implemented by the terminal in the foregoing method embodiment, and details are not described here to avoid repetition. In the embodiment of the present invention, the user does not need to manually tap an undo key or a restore key to undo or restore the image editing operations step by step; the terminal 1100 can automatically undo or restore the image editing operations once the user performs the first input with the corresponding operation parameters on the target control. Therefore, compared with the prior art, the image editing method and the image editing device can effectively improve the image editing speed and the image editing efficiency.
Referring to fig. 12, a schematic diagram of a hardware structure of a terminal 1200 according to an embodiment of the present invention is shown. As shown in fig. 12, terminal 1200 includes, but is not limited to: radio frequency unit 1201, network module 1202, audio output unit 1203, input unit 1204, sensor 1205, display unit 1206, user input unit 1207, interface unit 1208, memory 1209, processor 1210, and power source 1211. Those skilled in the art will appreciate that the terminal structure illustrated in fig. 12 is not intended to be limiting of terminal 1200 and that terminal 1200 may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal 1200 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 1210 is configured to: receive a first input from a user on a target control displayed on the current interface; in response to the first input, perform a first processing operation on the performed N-step image editing operations according to the operation parameters of the first input; the first processing operation comprises an undo operation or a restore operation, and N is a positive integer.
In the embodiment of the present invention, the user does not need to manually tap an undo key or a restore key to undo or restore the image editing operations step by step; the terminal 1200 can automatically undo or restore the image editing operations once the user performs the first input with the corresponding operation parameters on the target control. Therefore, compared with the prior art, the image editing method and the image editing device can effectively improve the image editing speed and the image editing efficiency.
Optionally, the processor 1210 is further configured to: receiving a second input of the user before receiving a first input of the target control displayed by the current interface; in response to the second input, a target control is displayed.
Optionally, the processor 1210 is further configured to: determine a first processing operation type according to a first-class operation parameter of the first input, where the first processing operation type comprises an undo type or a restore type; determine the step number N according to a second-class operation parameter of the first input; and perform, on the performed N-step image editing operations, the first processing operation corresponding to the first processing operation type.
Optionally, the first type of operating parameter comprises at least one of: operating direction, operating pressure, operating position.
Optionally, the second type of operating parameter comprises at least one of: operating distance, operating pressure, operating position.
Optionally, the first input comprises a pull operation of pulling the target control; performing, in response to the first input, a first processing operation on the performed N-step image editing operations according to the operation parameters of the first input includes: in response to the pull operation, updating the display form of the target control according to the operation direction and the operation distance of the pull operation during the pulling process; acquiring relative position change information of the target control and a target boundary line or a target sub-region; and performing the first processing operation on the N-step image editing operations corresponding to the relative position change information; wherein the relative position of the target control and the boundary line or sub-region includes intersecting or separated.
Optionally, the processor 1210 is further configured to: after receiving first input of a user on a target control displayed on a current interface, displaying operation information of each image editing operation in the N image editing operations; the operation information includes operation step number and/or operation name.
Optionally, the processor 1210 is further configured to: receive a third input from the user on the operation information of a target image editing operation among the N-step image editing operations after the operation information of each of the N-step image editing operations is displayed; perform a second processing operation on the target image editing operation in response to the third input; wherein, in the case that the first processing operation is an undo operation, the second processing operation is a restore operation; in the case that the first processing operation is a restore operation, the second processing operation is an undo operation.
Optionally, the target control is a line type control.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1201 may be used for receiving and sending signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards the data to the processor 1210 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 1201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1201 can also communicate with a network and other devices through a wireless communication system.
The terminal 1200 provides wireless broadband internet access to the user, such as assisting the user in e-mailing, browsing web pages, and accessing streaming media, through the network module 1202.
The audio output unit 1203 may convert audio data received by the radio frequency unit 1201 or the network module 1202 or stored in the memory 1209 into an audio signal and output as sound. Also, the audio output unit 1203 may also provide audio output related to a specific function performed by the terminal 1200 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1203 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1204 is used to receive audio or video signals. The input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042. The graphics processor 12041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1206. The image frames processed by the graphics processor 12041 may be stored in the memory 1209 (or other storage medium) or transmitted via the radio frequency unit 1201 or the network module 1202. The microphone 12042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 1201.
The terminal 1200 also includes at least one sensor 1205, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 12061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 12061 and/or backlight when the terminal 1200 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal 1200 posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 1205 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., and will not be described further herein.
The display unit 1206 is used to display information input by the user or information provided to the user. The Display unit 1206 may include a Display panel 12061, and the Display panel 12061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1207 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal 1200. Specifically, the user input unit 1207 includes a touch panel 12071 and other input devices 12072. The touch panel 12071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 12071 (e.g., operations by a user on or near the touch panel 12071 using a finger, a stylus, or any suitable object or attachment). The touch panel 12071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1210, receives a command from the processor 1210, and executes the command. In addition, the touch panel 12071 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 1207 may include other input devices 12072 in addition to the touch panel 12071. In particular, the other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 12071 can be overlaid on the display panel 12061, and when the touch panel 12071 receives a touch operation thereon or nearby, the touch operation is transmitted to the processor 1210 to determine the type of the touch event, and then the processor 1210 provides a corresponding visual output on the display panel 12061 according to the type of the touch event. Although the touch panel 12071 and the display panel 12061 are shown as two separate components in fig. 12 to implement the input and output functions of the terminal 1200, in some embodiments, the touch panel 12071 and the display panel 12061 may be integrated to implement the input and output functions of the terminal 1200, which is not limited herein.
An interface unit 1208 is an interface for connecting an external device to the terminal 1200. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1208 may be used to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more elements within the terminal 1200 or may be used to transmit data between the terminal 1200 and the external device.
The memory 1209 may be used to store software programs as well as various data. The memory 1209 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1209 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 1210 is a control center of the terminal 1200, connects various parts of the entire terminal 1200 using various interfaces and lines, and performs various functions of the terminal 1200 and processes data by running or executing software programs and/or modules stored in the memory 1209 and calling data stored in the memory 1209, thereby monitoring the terminal 1200 as a whole. Processor 1210 may include one or more processing units; preferably, the processor 1210 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 1210.
The terminal 1200 may also include a power source 1211 (e.g., a battery) for powering the various components, and preferably, the power source 1211 is logically connected to the processor 1210 via a power management system such that the functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the terminal 1200 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 1210, a memory 1209, and a computer program stored in the memory 1209 and capable of running on the processor 1210, where the computer program, when executed by the processor 1210, implements each process of the above-mentioned image editing processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned embodiment of the image editing processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, these embodiments are illustrative rather than restrictive, and various changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (16)

1. An image editing processing method applied to a terminal is characterized by comprising the following steps:
receiving a first input of a user on a target control displayed on a current interface;
in response to the first input, performing a first processing operation on the executed N-step image editing operation according to the operation parameters of the first input;
wherein the first processing operation comprises an undo operation or a recovery operation, and N is a positive integer;
the first input comprises a pull operation of pulling the target control;
the executing, in response to the first input, a first processing operation on the executed N-step image editing operation according to the operation parameters of the first input, includes:
in response to the pulling operation, updating the display form of the target control according to the operation direction and the operation distance of the pulling operation during the pulling process;
acquiring relative position change information between the target control and a target boundary line or a target sub-region;
performing the first processing operation on the N-step image editing operation corresponding to the relative position change information;
wherein the relative position between the target control and the boundary line or sub-region comprises intersection or separation;
the N-step image editing operation corresponding to the relative position change information refers to: for each image editing operation in the N-step image editing operations, the relative position between the boundary line or sub-region corresponding to the image editing operation and the target control changes from intersection to separation, or from separation to intersection.
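The marker-crossing mechanism recited in claim 1 can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class and function names, the 1-D marker positions, and the edit labels are all hypothetical, chosen only to show how pulling the control across an edit's boundary line toggles intersection/separation and selects the N edits to undo or restore.

```python
# Hypothetical sketch of claim 1: each executed edit is assigned a marker
# (boundary line) position; pulling the target control from old_pos to
# new_pos toggles the relative position (intersecting <-> separated) of
# every marker it crosses, and those N edits receive the first processing
# operation (undo or recovery).

class EditHistory:
    def __init__(self):
        self.done = []    # executed edits, oldest first
        self.undone = []  # withdrawn edits, most recently undone last

    def undo(self, n):
        """First processing operation as an undo: withdraw the last n edits."""
        for _ in range(min(n, len(self.done))):
            self.undone.append(self.done.pop())

    def redo(self, n):
        """First processing operation as a recovery: restore the last n undone edits."""
        for _ in range(min(n, len(self.undone))):
            self.done.append(self.undone.pop())


def toggled_steps(markers, old_pos, new_pos):
    """Count markers whose relative position to the control changed
    (intersection -> separation or vice versa) during the pull."""
    lo, hi = sorted((old_pos, new_pos))
    return sum(1 for m in markers if lo < m <= hi)
```

For example, with three edits whose markers sit at positions 10, 20, and 30, pulling the control from 35 back to 15 crosses the markers at 20 and 30, so N = 2 and the two most recent edits are undone.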
2. The method of claim 1, wherein before receiving the first input of the user on the target control displayed on the current interface, the method further comprises:
receiving a second input of the user;
in response to the second input, displaying a target control.
3. The method according to claim 1, wherein the performing a first processing operation on the executed N-step image editing operation according to the operation parameters of the first input comprises:
determining a first processing operation type according to first-class operation parameters of the first input, wherein the first processing operation type comprises an undo type or a recovery type;
determining the step count N according to second-class operation parameters of the first input;
and executing a first processing operation corresponding to the first processing operation type on the executed N-step image editing operation.
4. The method of claim 3, wherein the first class of operating parameters comprises at least one of: operating direction, operating pressure, operating position.
5. The method of claim 3, wherein the second class of operating parameters comprises at least one of: operating distance, operating pressure, operating position.
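Claims 3 to 5 leave open which concrete parameter selects the operation type and which selects N. A minimal sketch under one illustrative assumption (the function name, the left/right direction convention, and the 40-pixel step threshold are hypothetical, not from the patent) could map the operation direction to undo vs. recovery and the operation distance to the step count:

```python
def interpret_pull(direction, distance, step_px=40):
    """Hypothetical mapping of the claim 3-5 parameters: the first-class
    parameter (here, operation direction) selects the first processing
    operation type, and the second-class parameter (here, operation
    distance) selects the step count N, one step per step_px pixels."""
    op = "undo" if direction == "left" else "recovery"
    n = max(1, int(distance // step_px))  # at least one step per pull
    return op, n
```

Under this assumption, a 120-pixel pull to the left would undo three steps, while a short 30-pixel pull to the right would recover one. Operation pressure or position could substitute for either parameter, as the claims allow.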
6. The method of claim 1, wherein after receiving the first input of the user on the target control displayed on the current interface, the method further comprises:
displaying operation information of each step of image editing operation in the N steps of image editing operation;
wherein the operation information comprises operation step number and/or operation name.
7. The method according to claim 6, wherein after the displaying operation information of each of the N-step image editing operations, the method further comprises:
receiving a third input of the user on operation information of a target image editing operation in the N-step image editing operations;
in response to the third input, performing a second processing operation on the target image editing operation;
wherein, in a case where the first processing operation is an undo operation, the second processing operation is a recovery operation; and in a case where the first processing operation is a recovery operation, the second processing operation is an undo operation.
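The single-edit inversion of claim 7 can be sketched as below. The function names, list representation, and edit labels are illustrative assumptions, not the patent's design; the point is only that the third input applies the inverse of the first processing operation to one targeted edit among the listed steps:

```python
def inverse_op(first_op):
    """Claim 7: the second processing operation is the inverse of the first."""
    return {"undo": "recovery", "recovery": "undo"}[first_op]

def apply_second_processing(done, undone, target, first_op):
    """Move the single targeted edit between the done/undone lists,
    applying the inverse of the first processing operation to it."""
    if inverse_op(first_op) == "recovery":
        undone.remove(target)
        done.append(target)
    else:
        done.remove(target)
        undone.append(target)
    return done, undone
```

For example, after an undo of two steps has moved "rotate" and "filter" into the undone list, a third input on the "rotate" entry recovers just that edit while "filter" stays undone.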
8. The method of claim 1, wherein the target control is a line type control.
9. A terminal, characterized in that the terminal comprises:
the first receiving module is used for receiving a first input of a user on a target control displayed on a current interface;
the first execution module is used for responding to the first input and executing a first processing operation on the executed N-step image editing operation according to the operation parameters of the first input;
wherein the first processing operation comprises an undo operation or a recovery operation, and N is a positive integer;
the first input comprises a pull operation of pulling the target control;
the first execution module includes:
the updating submodule is used for responding to the pulling operation and updating the display form of the target control according to the operation direction and the operation distance of the pulling operation in the pulling process;
the obtaining sub-module is used for obtaining the relative position change information of the target control and the target boundary line or the target sub-region;
the second execution sub-module is used for executing first processing operation on the image editing operation of the N steps corresponding to the relative position change information;
wherein the relative position between the target control and the boundary line or sub-region comprises intersection or separation;
the N-step image editing operation corresponding to the relative position change information refers to: for each image editing operation in the N-step image editing operations, the relative position between the boundary line or sub-region corresponding to the image editing operation and the target control changes from intersection to separation, or from separation to intersection.
10. The terminal of claim 9, wherein the terminal further comprises:
the second receiving module is used for receiving a second input of the user before receiving the first input of the user on the target control displayed on the current interface;
a first display module to display a target control in response to the second input.
11. The terminal of claim 9, wherein the first execution module comprises:
the first determining submodule is used for determining a first processing operation type according to first-class operation parameters of the first input, wherein the first processing operation type comprises an undo type or a recovery type;
the second determining submodule is used for determining the step count N according to second-class operation parameters of the first input;
and the first execution sub-module is used for executing a first processing operation corresponding to the first processing operation type on the executed N-step image editing operation.
12. The terminal of claim 11, wherein the first type of operating parameter comprises at least one of: operating direction, operating pressure, operating position.
13. The terminal of claim 11, wherein the second type of operating parameter comprises at least one of: operating distance, operating pressure, operating position.
14. The terminal of claim 9, wherein the terminal further comprises:
the second display module is used for displaying operation information of each image editing operation in the N image editing operations after receiving first input of a user to a target control displayed on a current interface;
wherein the operation information comprises operation step number and/or operation name.
15. The terminal of claim 14, wherein the terminal further comprises:
a third receiving module, configured to receive a third input of the user on operation information of a target image editing operation in the N-step image editing operations after displaying operation information of each image editing operation in the N-step image editing operations;
a second execution module for executing a second processing operation on the target image editing operation in response to the third input;
wherein, in a case where the first processing operation is an undo operation, the second processing operation is a recovery operation; and in a case where the first processing operation is a recovery operation, the second processing operation is an undo operation.
16. The terminal of claim 9, wherein the target control is a line type control.
CN201810264652.4A 2018-03-28 2018-03-28 Image editing processing method and terminal Active CN108519846B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810264652.4A CN108519846B (en) 2018-03-28 2018-03-28 Image editing processing method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810264652.4A CN108519846B (en) 2018-03-28 2018-03-28 Image editing processing method and terminal

Publications (2)

Publication Number Publication Date
CN108519846A CN108519846A (en) 2018-09-11
CN108519846B true CN108519846B (en) 2020-05-19

Family

ID=63430653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810264652.4A Active CN108519846B (en) 2018-03-28 2018-03-28 Image editing processing method and terminal

Country Status (1)

Country Link
CN (1) CN108519846B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111553854A (en) * 2020-04-21 2020-08-18 维沃移动通信有限公司 Image processing method and electronic equipment
CN111913631B (en) * 2020-06-30 2022-03-25 维沃移动通信有限公司 Content editing method and device and electronic equipment
CN111857518A (en) * 2020-07-30 2020-10-30 北京默契破冰科技有限公司 Method and device for canceling image editing operation, electronic equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2128823A1 (en) * 2008-05-26 2009-12-02 Lg Electronics Inc. Mobile terminal using proximity sensor and method of controlling the mobile terminal
CN103703437A (en) * 2012-03-06 2014-04-02 苹果公司 Application for viewing images
CN103795851A (en) * 2012-10-31 2014-05-14 Lg电子株式会社 Mobile terminal and method for controlling the same
CN106055534A (en) * 2016-05-27 2016-10-26 珠海市魅族科技有限公司 Operation canceling method and operation canceling device
CN107577401A (en) * 2017-08-31 2018-01-12 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107817939A (en) * 2017-10-27 2018-03-20 维沃移动通信有限公司 A kind of image processing method and mobile terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9001055B2 (en) * 2010-11-26 2015-04-07 Htc Corporation Portable device and method for operating portable device
CN104090704B (en) * 2014-07-28 2019-10-29 联想(北京)有限公司 Information processing method and electronic equipment
CN105045509B (en) * 2015-08-03 2019-01-15 努比亚技术有限公司 A kind of device and method of editing picture
CN107480255A (en) * 2017-08-14 2017-12-15 五莲金汇股权投资基金合伙企业(有限合伙) The method that commodity are carried out with the customization of 3D Photographing On-lines and displaying based on internet
CN108334371B (en) * 2017-09-07 2021-03-09 北京小米移动软件有限公司 Method and device for editing object
CN107623763B (en) * 2017-10-19 2019-10-25 Oppo广东移动通信有限公司 The method and apparatus for editing image

Also Published As

Publication number Publication date
CN108519846A (en) 2018-09-11

Similar Documents

Publication Publication Date Title
CN108536365B (en) Image sharing method and terminal
CN108469898B (en) Image processing method and flexible screen terminal
CN108495029B (en) Photographing method and mobile terminal
CN107977132B (en) Information display method and mobile terminal
CN108491129B (en) Application program management method and terminal
CN108491149B (en) Split screen display method and terminal
CN110196667B (en) Notification message processing method and terminal
CN111159983B (en) Editing method and electronic equipment
CN110007835B (en) Object management method and mobile terminal
CN107943390B (en) Character copying method and mobile terminal
CN109739407B (en) Information processing method and terminal equipment
CN109213407B (en) Screenshot method and terminal equipment
CN108898555B (en) Image processing method and terminal equipment
CN108415643B (en) Icon display method and terminal
CN107728923B (en) Operation processing method and mobile terminal
CN108228902B (en) File display method and mobile terminal
CN111142769A (en) Split screen display method and electronic equipment
CN109739423B (en) Alarm clock setting method and flexible terminal
CN110795189A (en) Application starting method and electronic equipment
CN108519846B (en) Image editing processing method and terminal
CN108600544B (en) Single-hand control method and terminal
CN108804628B (en) Picture display method and terminal
CN108170329B (en) Display control method and terminal equipment
CN110928619B (en) Wallpaper setting method and device, electronic equipment and medium
CN110536005B (en) Object display adjustment method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant