CN108845753B - Picture processing method and terminal - Google Patents

Picture processing method and terminal

Info

Publication number
CN108845753B
CN108845753B (granted publication of application CN201810690070.2A)
Authority
CN
China
Prior art keywords
display
target
picture
terminal
parameter value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810690070.2A
Other languages
Chinese (zh)
Other versions
CN108845753A (en)
Inventor
张繁
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810690070.2A
Publication of CN108845753A
Application granted
Publication of CN108845753B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a picture processing method and a terminal. The method comprises: when a picture to be edited is displayed in a display interface of the terminal, displaying, in a floating manner on the picture to be edited, an operation frame containing an element to be added; receiving a target touch operation input by a user and associated with a target object, wherein the target object is determined by the positions of at least two touch points of the target touch operation on the display interface of the terminal, and comprises at least one of the picture to be edited and the operation frame; changing the display state of the target object to a corresponding target display state according to the target touch operation; and, when the target object is in the target display state, synthesizing the element to be added onto the picture to be edited. With the picture processing method provided by the invention, while the terminal adds an element to the picture to be edited, the display state of the picture to be edited and/or the operation frame can be changed according to the user's operations, improving the display effect of the synthesized picture.

Description

Picture processing method and terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a picture processing method and a terminal.
Background
To meet users' ever-higher requirements, terminals offer more and more functions, in particular the editing of pictures such as photos, animated pictures and video frames. Functions that add elements such as text, watermarks and stickers to a picture, for example adding the shooting time, shooting location or a name to a photo as a watermark, make picture editing more engaging and are increasingly popular with users.
At present, however, the elements such as text, watermarks and stickers added to a picture to be edited are generally preset in the terminal: when the user needs to add an element, the terminal synthesizes it onto the picture according to preset display parameters. The display states of the original picture and of the element are therefore fixed, which degrades the display effect of the synthesized picture.
Thus, when an existing terminal adds elements to a picture to be edited, the preset display parameters allow only a single display state, and the display effect of the synthesized picture is poor.
Disclosure of Invention
Embodiments of the invention provide a picture processing method and a terminal, to solve the problem that, when an existing terminal adds elements to a picture to be edited, preset display parameters allow only a single display state, giving the synthesized picture a poor display effect.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides a picture processing method applied to a terminal, including:
when a picture to be edited is displayed in a display interface of the terminal, displaying, in a floating manner on the picture to be edited, an operation frame containing an element to be added;
receiving a target touch operation input by a user and associated with a target object, wherein the target object is determined by the positions of at least two touch points of the target touch operation on the display interface of the terminal, and the target object comprises at least one of the picture to be edited and the operation frame;
changing the display state of the target object to a corresponding target display state according to the target touch operation; and
when the target object is in the target display state, synthesizing the element to be added onto the picture to be edited.
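The four steps of the first aspect can be sketched as a small state machine. This is a minimal Python model of the flow only; all class, field and method names here are our assumptions for illustration, not terminology from the patent, and the per-object display-state details are elided.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class EditSession:
    """Hypothetical model of the four-step flow described above."""
    picture: str
    element: Optional[str] = None
    in_target_state: bool = False
    synthesized: bool = False

    def show_operation_frame(self, element: str) -> None:
        # Step 1: float an operation frame holding the element to add.
        self.element = element

    def apply_target_touch(self, touch_points: List[Tuple[int, int]]) -> None:
        # Steps 2-3: a touch operation with at least two touch points
        # selects the target object and changes its display state
        # (the per-object details are covered later in the description).
        if len(touch_points) >= 2:
            self.in_target_state = True

    def synthesize(self) -> None:
        # Step 4: synthesize only once the target display state is set.
        if self.in_target_state and self.element is not None:
            self.synthesized = True

session = EditSession(picture="photo_a")
session.show_operation_frame("2018/05/25")
session.apply_target_touch([(120, 300), (480, 620)])
session.synthesize()
```

The guard in `synthesize` reflects the claim's ordering: the element is composited onto the picture only after the target object has reached its target display state.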
In a second aspect, an embodiment of the present invention further provides a terminal, including:
the terminal comprises a first display module, a second display module and a display module, wherein the first display module is used for displaying an operation frame including an element to be added on a picture to be edited in a suspension manner under the condition that the picture to be edited is displayed in a display interface of the terminal;
the terminal comprises a receiving module, a display module and a display module, wherein the receiving module is used for receiving a target touch operation input by a user and associated with a target object, the target object is determined by the positions of at least two touch points of the target touch operation on a display interface of the terminal, and the target object comprises at least one of the picture to be edited and the operation frame;
the state changing module is used for changing the display state of the target object into a corresponding target display state according to the target touch operation;
and the second display module is used for synthesizing the element to be added to the picture to be edited under the condition that the target object is in the target display state.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the above-mentioned picture processing method are implemented.
In the picture processing method of the embodiments of the invention, while the terminal adds an element to the picture to be edited, the picture to be edited and/or the operation frame can be determined, from the positions of at least two touch points of the user's touch operation, as the target object whose display state is to be adjusted, and the display state of the target object is changed to the display state corresponding to that touch operation. The display parameters of the picture to be edited and/or of the added element in the operation frame can thereby be adjusted, improving the display effect of the synthesized picture.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic flowchart of a picture processing method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a display interface of a terminal according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a display interface of another terminal according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a display interface of another terminal according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a display interface of another terminal according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a display interface of another terminal according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a display interface of another terminal according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a state change module in a terminal according to an embodiment of the present invention;
Fig. 10 is a schematic structural diagram of another terminal according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of a state change module in another terminal according to an embodiment of the present invention;
Fig. 12 is a schematic structural diagram of another terminal according to an embodiment of the present invention;
Fig. 13 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flowchart of a picture processing method according to an embodiment of the present invention, which is applied to a terminal, and as shown in fig. 1, the method includes the following steps:
step 101, under the condition that a picture to be edited is displayed in a display interface of a terminal, an operation frame comprising an element to be added is displayed on the picture to be edited in a suspension mode.
In the embodiment of the present invention, the operation frame displayed in a floating manner on the picture to be edited may be displayed in a floating manner on the picture to be edited in a floating window manner when the picture to be edited is displayed by the terminal; or, in a case where the terminal displays the picture to be edited, receiving a preset operation input by the user, and displaying the operation frame in a floating window manner on the currently displayed picture, that is, the picture to be edited, in response to the preset operation.
The preset operation may include, but is not limited to, at least one of the following:
a sliding operation with a preset track, input on the picture to be edited;
a pressing operation input on the picture to be edited whose pressing force exceeds a preset force;
a pressing operation input on the picture to be edited whose pressing duration exceeds a preset duration;
a clicking operation input on the picture to be edited whose number of clicks exceeds a preset number;
a voice input of a preset voice;
an air gesture operation of a preset gesture; and so on.
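The checks in the list above can be sketched as a single dispatcher. This is a minimal Python sketch; the threshold constants and the event-dictionary keys are our assumptions, since the patent only requires that such presets exist.

```python
# Assumed threshold values; the patent leaves the concrete presets
# to the implementation.
PRESET_FORCE = 3.0       # pressing-force threshold (arbitrary units)
PRESET_DURATION = 0.8    # pressing-duration threshold, seconds
PRESET_CLICKS = 2        # click-count threshold

def triggers_operation_frame(event: dict) -> bool:
    """Return True when the input event matches one of the listed
    preset operations that should pop up the operation frame."""
    kind = event.get("type")
    if kind == "slide":
        return event.get("track") == "preset_track"
    if kind == "press":
        return (event.get("force", 0.0) > PRESET_FORCE
                or event.get("duration", 0.0) > PRESET_DURATION)
    if kind == "click":
        return event.get("count", 0) > PRESET_CLICKS
    if kind in ("voice", "air_gesture"):
        return bool(event.get("matches_preset"))
    return False
```

Any one of the branches sufficing mirrors the "at least one of the following" wording above.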
Here, the picture to be edited may be a picture acquired by the terminal in real time, such as a picture shot by the camera, a frame of a video, a real-time screenshot, or a picture shared in real time by another terminal; it may also be a picture stored in the terminal in advance. This is not limited here.
In addition, the picture to be edited may be displayed full-screen in the display interface of the terminal, or in an area occupying any proportion of the screen, for example the quarter of the screen in its upper-left corner; this is not limited here.
It should be noted that the size, shape and position of the operation frame may be fixed or variable. For example, the operation frame may be displayed as a fixed-size rectangular frame in the central area of the picture to be edited, or it may be adjusted according to the size and display position of the picture to be edited.
Of course, the element to be added is displayed in the operation frame, and may be text input by the user in the operation frame, a picture imported into the operation frame, or the like. Moreover, the terminal can adjust the element to be added in the operation frame according to the user's needs: for example, the user may delete the text displayed in the operation frame and input new text or a picture into it.
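A minimal data model for the operation frame just described might look as follows. The class, field and method names are our assumptions for illustration; the rectangle hit test anticipates the touch-point classification used in step 102.

```python
class OperationFrame:
    """Minimal sketch of the floating operation frame; names are
    assumptions, not the patent's terminology."""

    def __init__(self, x: float, y: float, width: float, height: float,
                 element: str = ""):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.element = element  # text or imported picture to be added

    def replace_element(self, new_element: str) -> None:
        # The user may delete the displayed element and enter a new one.
        self.element = new_element

    def contains(self, px: float, py: float) -> bool:
        # Rectangle hit test, used later to classify touch points.
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)
```

For instance, `OperationFrame(100, 100, 200, 80, "2018/05/25")` models frame B from the later example, holding the date element.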
Step 102: receiving a target touch operation input by the user and associated with a target object.
In this embodiment of the invention, after the operation frame is displayed on the picture to be edited in step 101, the current interface of the terminal shows both the picture to be edited and the operation frame containing the element to be added. The terminal may then receive a target touch operation input by the user and associated with a target object, where the target object includes at least one of the picture to be edited and the operation frame.
The target touch operation is an operation input by the user at least at two touch points in the display interface of the terminal, and may include, but is not limited to, a sliding operation, a pressing operation, a clicking operation and the like.
It should be noted that the target touch operation may be the same operation performed by the user at the at least two touch points, either in sequence or simultaneously. For example, the target touch operation may include:
a sliding operation starting from each of the two touch points, for example two fingers of the user sliding simultaneously from the two points, or one finger sliding from each of them in turn;
a press at each of the two touch points, for example two fingers of the user pressing the two points simultaneously, or one finger pressing them in turn within a preset duration;
at least one click at each of the two touch points, for example two fingers of the user clicking the two points at least once each, or one finger clicking both in turn within a preset duration; and so on.
Of course, the target touch operation may also consist of different operations performed at the at least two touch points, in sequence or simultaneously. For example, it may be a sliding operation at one touch point and a pressing operation at the other; a sliding operation at one point and at least one click at the other; or a pressing operation at one point and at least one click at the other; and so on.
In addition, the target object is determined by the positions of the at least two touch points of the target touch operation on the display interface of the terminal: the terminal determines the object or objects at which those touch points are located as the target object, which comprises at least one of the picture to be edited and the operation frame.
For example, when the user needs to add the date "2018/05/25" to a picture A to be edited, the terminal may display an operation frame B, containing the date to be added, on picture A. If the terminal receives a pressing operation input at two touch points as shown in Fig. 2, with touch point 2 inside operation frame B and touch point 1 in the image area of picture A outside frame B, the terminal determines both picture A and operation frame B as the target object. If the terminal receives a pressing operation input at two touch points as shown in Fig. 3, with touch point 1 and touch point 2 both inside operation frame B, it determines operation frame B as the target object. If the terminal receives a pressing operation input at two touch points as shown in Fig. 4, with both touch points in the image area of picture A outside frame B, it determines picture A as the target object.
It should be noted that each of the at least two touch points may be at any position in the display interface of the terminal; this is not limited here.
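The Fig. 2 to Fig. 4 cases reduce to classifying each touch point against the frame's bounds. A minimal, self-contained Python sketch (the `Frame` rectangle and the string labels are our assumptions):

```python
from dataclasses import dataclass
from typing import Iterable, Set, Tuple

@dataclass
class Frame:
    """Axis-aligned bounds of the operation frame (assumed shape)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.w
                and self.y <= py <= self.y + self.h)

def determine_target_object(points: Iterable[Tuple[float, float]],
                            frame: Frame) -> Set[str]:
    """Classify each touch point: inside the frame it selects the
    operation frame, outside it selects the picture. Both inside ->
    frame only (Fig. 3); both outside -> picture only (Fig. 4);
    one of each -> both (Fig. 2)."""
    targets = set()
    for px, py in points:
        targets.add("operation_frame" if frame.contains(px, py) else "picture")
    return targets
```

Returning a set lets the same function cover all three figures without special cases.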
Step 103: changing the display state of the target object to a corresponding target display state according to the target touch operation.
In this embodiment of the invention, after the target touch operation is received in step 102, the terminal may, in response, change the display state of the target object to the target display state corresponding to that operation.
Changing the display state of the target object to the target display state may mean adjusting the display state according to a preset rule when the terminal receives the target touch operation. For example, the following rule may be preset in the terminal: when the terminal receives sliding operations input by the user simultaneously at two touch points in the display interface, it rotates the target object determined by the positions of those touch points counterclockwise by an angle α. If the terminal then receives sliding operations input simultaneously at touch point 1 and touch point 2 as shown in Fig. 2, for example two fingers of the user sliding from the two touch points, the terminal rotates picture A to be edited counterclockwise by α and rotates operation frame B counterclockwise by α.
The display state of the target object may be changed to the corresponding target display state by the terminal adjusting at least one of the rotation angle, flip angle, zoom factor, font parameters and the like of the target object, or by the terminal replacing the target object with its mirror image, and so on; this is not limited here.
It should be noted that, when the target object includes the operation frame, the terminal changes the display state of the operation frame, and with it that of the element to be added inside the frame, to the target display state.
In addition, when the target object includes both the picture to be edited and the operation frame, their display states may be changed according to the same preset rule, or according to different preset rules.
For example, the following rules may be preset in the terminal: on receiving a sliding operation starting from a touch point, rotate the object at that touch point counterclockwise by an angle α; on receiving a pressing operation at a touch point, zoom the object at that point by a factor n. If the target touch operation includes simultaneous sliding operations at touch point 1 and touch point 2 shown in Fig. 2, the terminal rotates picture A to be edited counterclockwise by α and rotates operation frame B counterclockwise by α. If the target touch operation includes a sliding operation starting from touch point 1 and a pressing operation at touch point 2, the terminal rotates picture A counterclockwise by α and zooms operation frame B by a factor n, so that the element to be added, "2018/05/25", is also zoomed by n.
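The per-object rules in this example (slide rotates by α, press zooms by n) can be sketched as a pure function over a display-state dictionary. The concrete values of α and n are our assumptions; the patent leaves them to the preset.

```python
ALPHA = 30  # counterclockwise rotation step in degrees (assumed value)
N = 2       # zoom factor (assumed value)

def apply_preset_rule(state: dict, operation: str) -> dict:
    """One object's display state under the example rules: a slide
    rotates it counterclockwise by alpha, a press zooms it by n."""
    out = dict(state)
    if operation == "slide":
        out["rotation"] = (out.get("rotation", 0) + ALPHA) % 360
    elif operation == "press":
        out["zoom"] = out.get("zoom", 1) * N
    return out

# Second Fig. 2 scenario above: slide from touch point 1 (on picture A),
# press at touch point 2 (inside frame B).
picture_state = apply_preset_rule({"rotation": 0}, "slide")
frame_state = apply_preset_rule({"zoom": 1}, "press")
```

Applying the function per object is what lets the picture and the frame follow different rules when the user's operations at the two touch points differ.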
Of course, changing the display state of the target object to the target display state may also mean that the terminal adjusts the display state according to the display parameter selected by the target touch operation and the parameter value corresponding to the operation parameter value of that touch operation.
Optionally, step 103 includes:
determining a first display parameter adjusted by the target touch operation;
determining a first display parameter value corresponding to an operation parameter value of the target touch operation; and
adjusting the parameter value of the first display parameter of the target object according to the first display parameter value.
In this embodiment, the terminal may determine the adjusted first display parameter and the corresponding first display parameter value from the target touch operation, and set the first display parameter of the associated target object to that value, thereby changing the display state of the target object to the target display state. Since the parameter values of different display parameters can be adjusted, the terminal can adjust the display state of the target object more flexibly.
The first display parameter adjusted by the target touch operation may be determined by the terminal from a preset correspondence between operation attributes and display parameters: the display parameter corresponding to the operation attribute of the target touch operation is taken as the first display parameter. The operation attribute may include at least one of operation type, operation duration, operation position, operation direction, operation track and the like; this is not limited here.
Similarly, the first display parameter value corresponding to the operation parameter value of the target touch operation may be determined from a preset correspondence between operation parameter values and display parameter values, such as a mapping, a linear relation or another preset relation: the terminal takes the display parameter value corresponding to the operation parameter value as the first display parameter value.
It should be noted that adjusting the parameter value of the first display parameter of the target object according to the first display parameter value may mean setting the parameter value of the first display parameter to the first display parameter value; or, when the first display parameter value is a change amount, changing the parameter value of the first display parameter of the target object by that amount.
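The three sub-steps can be sketched end to end. The lookup tables and the linear scale factors below are our assumptions; the patent only requires that such preset correspondences exist.

```python
# Assumed preset tables; not the patent's concrete values.
ATTRIBUTE_TO_PARAMETER = {"slide": "flip_angle", "press": "rotation_angle"}
VALUE_SCALE = {"slide": 0.5, "press": 10.0}  # linear relation assumed

def adjust_first_display_parameter(obj: dict, op_type: str,
                                   op_value: float,
                                   as_delta: bool = True) -> dict:
    """The three sub-steps in order: pick the first display parameter
    from the operation attribute, derive the first display parameter
    value from the operation parameter value, then apply it either as
    the new value or as a change amount."""
    param = ATTRIBUTE_TO_PARAMETER[op_type]    # sub-step 1
    value = op_value * VALUE_SCALE[op_type]    # sub-step 2
    out = dict(obj)
    out[param] = out.get(param, 0.0) + value if as_delta else value  # sub-step 3
    return out
```

The `as_delta` flag captures the note just above: the first display parameter value may be applied as an absolute value or as a change amount.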
Of course, when the target object includes both the picture to be edited and the operation frame, the first display parameter may comprise only one display parameter and the first display parameter value only one value; that is, when the user inputs the same operation at the at least two touch points, the terminal makes the same adjustment to the same display parameter of the picture to be edited and of the operation frame.
For example, the terminal presets a correspondence between the sliding operation and the flip angle, and between a sliding-track length L1 and a flip angle β1. As shown in Fig. 2, if the target touch operation includes sliding operations starting from touch point 1 and touch point 2, that is, two fingers of the user slide simultaneously from the two touch points in the same direction with sliding-track length L1, the terminal adjusts the flip angle of both the picture to be edited and the operation frame to β1.
Alternatively, when the target object includes both the picture to be edited and the operation frame, the first display parameter may comprise a plurality of display parameters and the first display parameter value a plurality of values; for example, when the user inputs different operations (differing in at least one of operation type, operation duration, operation track, operation direction and the like) on the picture to be edited and on the operation frame, the parameter values of their display parameters are adjusted differently.
For example, the terminal presets the following correspondence between operation types and display parameters: the sliding operation corresponds to the flip angle, and the pressing operation corresponds to the rotation angle. It also presets the following correspondence between operation parameter values and display parameter values: a sliding-track length L2 corresponds to a flip-angle change β2, and a pressing force F corresponds to a rotation-angle change γ2. If the target touch operation includes a sliding operation starting from touch point 1 shown in Fig. 2, with sliding-track length L2, and a pressing operation at touch point 2, with pressing force F, the terminal changes the flip angle of the picture to be edited by β2 and changes the rotation angle of the operation frame by γ2.
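The correspondences in this example can be sketched with two small mapping functions. The linear coefficients are our assumptions; the text fixes only the pairings (track length to flip-angle change, press force to rotation-angle change), not the numeric relation.

```python
# Assumed linear coefficients for the preset correspondences.
FLIP_DEG_PER_UNIT_LENGTH = 0.5
ROTATION_DEG_PER_UNIT_FORCE = 10.0

def flip_change_from_track_length(length: float) -> float:
    """beta2: flip-angle change produced by a slide of this track length."""
    return length * FLIP_DEG_PER_UNIT_LENGTH

def rotation_change_from_press_force(force: float) -> float:
    """gamma2: rotation-angle change produced by a press of this force."""
    return force * ROTATION_DEG_PER_UNIT_FORCE

# Fig. 2 scenario: slide (length L2) at touch point 1 on the picture,
# press (force F) at touch point 2 on the frame.
picture_flip_delta = flip_change_from_track_length(60.0)
frame_rotation_delta = rotation_change_from_press_force(1.5)
```

Because the two touch points carry different operation types, the picture and the frame end up with changes to different display parameters, which is exactly the multi-parameter case described above.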
Further optionally, before step 102, the method further includes:
displaying at least one sub-control on the display interface, wherein each sub-control is associated with a display parameter, and different sub-controls are associated with different display parameters;
and the step of determining the first display parameter adjusted by the target touch operation includes:
when a touch point of the target touch operation is located on a target sub-control, determining the display parameter associated with that sub-control as the first display parameter, the target sub-control being any one of the at least one sub-control.
The terminal displays at least one sub-control on the display interface, with different sub-controls associated with different display parameters. When a target touch operation input by the user is received, the terminal can determine the first display parameter from the sub-control on which a touch point of the operation lies. This lets the terminal prompt the user to select and operate the sub-control associated with the display parameter to be adjusted, improving the information display capability of the terminal; at the same time, the terminal can quickly determine the first display parameter from the target touch operation.
It should be noted that the target sub-control may be located on the operation frame, or in the image area of the picture to be edited outside the operation frame; this is not limited here.
For example, as shown in Fig. 5, the terminal displays picture A to be edited, and on it an operation frame B bearing sub-control 1, sub-control 2, sub-control 3 and sub-control 4, where sub-control 1 is associated with the rotation angle, sub-control 2 with the flip angle, sub-control 3 with the zoom factor, and sub-control 4 with the transparency of the element to be added ("2018/05/25"). If the terminal receives a target touch operation in which the user's finger 1 slides from a touch point inside sub-control 3 while finger 2 slides from a touch point elsewhere in operation frame B, toward or away from finger 1, the terminal determines the first display parameter to be the zoom factor, and adjusts the zoom factor of picture A and operation frame B according to the distance between the touch points of finger 1 and finger 2.
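Selecting the first display parameter from the touched sub-control is a hit test over the sub-control layout. This Python sketch uses a hypothetical layout echoing Fig. 5; the rectangles and parameter names are our assumptions.

```python
# Hypothetical layout echoing Fig. 5: four sub-controls on the
# operation frame, each tied to one display parameter.
SUB_CONTROLS = [
    {"rect": (0, 0, 40, 40),   "parameter": "rotation_angle"},  # sub-control 1
    {"rect": (40, 0, 40, 40),  "parameter": "flip_angle"},      # sub-control 2
    {"rect": (80, 0, 40, 40),  "parameter": "zoom_factor"},     # sub-control 3
    {"rect": (120, 0, 40, 40), "parameter": "transparency"},    # sub-control 4
]

def first_display_parameter(touch_points):
    """Return the display parameter of the first sub-control hit by any
    touch point, or None when no point lands on a sub-control."""
    for px, py in touch_points:
        for ctl in SUB_CONTROLS:
            x, y, w, h = ctl["rect"]
            if x <= px <= x + w and y <= py <= y + h:
                return ctl["parameter"]
    return None
```

In the Fig. 5 example, a touch point inside sub-control 3 would resolve to the zoom factor even though the second finger lands elsewhere in the frame.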
It should be noted that the displaying of the at least one sub-control on the display interface may be performed while the operation frame including the element to be added is displayed in a floating manner on the picture to be edited, or after the operation frame is so displayed and before the target touch operation associated with the target object is received from the user, which is not limited herein.
The terminal can determine a first display parameter value according to an operation parameter value of the target touch operation. Optionally, in the case that the target touch operation is a sliding operation, the operation parameter value includes a sliding track length; or
Under the condition that the target touch operation is a pressing operation, the operation parameter value comprises a pressing intensity and/or a pressing duration; or
In the case that the target touch operation is a click operation, the operation parameter value includes a number of clicks.
Here, the terminal may determine the first display parameter value according to the operation parameter value of different operations, so that the terminal may receive various operations of the user to adjust the display state of the target object, and diversity of receiving operations of the terminal is improved.
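One way to read the correspondence above is as a per-operation-type conversion from the operation parameter value to the first display parameter value. A minimal sketch follows; the scaling constants are invented for illustration, since the embodiment only requires that some correspondence exists:

```python
def first_display_parameter_value(op_type: str, operation_value: float) -> float:
    """Convert an operation parameter value into a first display parameter value.

    op_type: "slide" (value = sliding track length in cm),
             "press" (value = press duration in seconds), or
             "click" (value = number of clicks).
    """
    if op_type == "slide":
        return operation_value * 20.0   # e.g. 20 degrees of rotation per cm of track
    if op_type == "press":
        return operation_value * 0.5    # e.g. half a zoom step per second pressed
    if op_type == "click":
        return operation_value * 10.0   # e.g. 10% transparency change per click
    raise ValueError(f"unsupported operation type: {op_type}")

print(first_display_parameter_value("slide", 1.0))  # 1 cm slide -> 20.0
```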
Optionally, the first display parameter includes at least one of a flip angle, a rotation angle, a zoom factor, and a font parameter of the text (e.g., at least one of a size, a color, a transparency, and a font).
Here, the terminal may implement one or more display parameter adjustments on the target object to change the display state of the target object, so that the manner in which the terminal adjusts the display state of the picture to be edited and/or the operation frame is various.
Here, the process of adjusting, by the terminal, the parameter value of the first display parameter of the target object according to the first display parameter value to change the display state of the target object is exemplified as follows:
In example one, when finger 1 of the user is located on sub-control 1 shown in fig. 5, finger 2 is located on the picture A to be edited, and finger 2 slides toward finger 1, the terminal determines that the display parameter adjusted by the sliding operation is the rotation angle. It then determines the first display parameter value from a correspondence between sliding track length and rotation angle value, and adjusts the rotation angles of the picture A to be edited and the operation frame B accordingly. For example, if the sliding track is 1 cm and the corresponding first display parameter value is a 20° counterclockwise rotation, the terminal rotates the picture A to be edited and the operation frame B counterclockwise by 20° simultaneously, as shown in fig. 6.
When the terminal rotates the picture A to be edited and the operation frame B, both may be rotated about a shared rotation center, namely the center point of the picture A to be edited or the center point of the operation frame B; alternatively, each may be rotated about its own center point, which is not limited herein.
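The rotation in example one reduces to standard 2-D rotation of each corner about a chosen center point; which center is used is exactly the choice left open above. The helper names and the 20°-per-centimeter scale are assumptions for illustration:

```python
import math

def rotate_point(px: float, py: float, cx: float, cy: float, angle_deg: float):
    """Rotate (px, py) counterclockwise by angle_deg about the center (cx, cy)."""
    a = math.radians(angle_deg)
    dx, dy = px - cx, py - cy
    return (cx + dx * math.cos(a) - dy * math.sin(a),
            cy + dx * math.sin(a) + dy * math.cos(a))

def rotate_corners(corners, center, track_length_cm, deg_per_cm=20.0):
    """Rotate every corner of a picture or operation frame about `center`;
    a 1 cm slide yields a 20 degree counterclockwise rotation, as in example one."""
    angle = track_length_cm * deg_per_cm
    cx, cy = center
    return [rotate_point(x, y, cx, cy, angle) for (x, y) in corners]
```

Rotating the picture and the frame with the same `center` argument implements the shared-center variant; passing each object's own center implements the alternative.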
In example two, when finger 1 of the user is located on sub-control 3 shown in fig. 5, finger 2 is located in the operation frame B, and finger 2 slides toward finger 1, the terminal determines that the display parameter adjusted by the sliding operation is the zoom multiple. It then calculates the first display parameter value from a linear formula relating sliding track length to zoom multiple, and adjusts the zoom multiple of the operation frame accordingly. For example, when the sliding track length is 1 cm and the calculated first display parameter value is a 2× magnification, the terminal magnifies the operation frame B by 2 times for display, as shown in fig. 7.
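Example two's linear formula can be sketched as below. The slope and intercept are assumptions chosen so that a 1 cm track reproduces the 2× magnification in the text:

```python
def zoom_multiple(track_length_cm: float, slope: float = 1.0,
                  intercept: float = 1.0) -> float:
    """Linear relation between sliding track length and zoom multiple."""
    return slope * track_length_cm + intercept

def scaled_frame(width: float, height: float, track_length_cm: float):
    """Apply the computed zoom multiple to the operation frame's dimensions."""
    k = zoom_multiple(track_length_cm)
    return width * k, height * k

print(zoom_multiple(1.0))               # 1 cm slide -> 2.0x, as in fig. 7
print(scaled_frame(100.0, 60.0, 1.0))   # frame B displayed at twice its size
```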
Of course, the terminal may also preset a correspondence between an operation attribute of the target touch operation (such as its operation type) and a display parameter, so that the adjusted first display parameter is determined according to the operation attribute of the target touch operation, and the first display parameter value is determined according to the operation parameter value of the target touch operation, thereby adjusting the parameter value of the first display parameter of the target object.
Optionally, the target touch operation is preset to be associated with a second display parameter;
the step 103 includes:
determining a second display parameter value corresponding to the operation parameter value of the target touch operation;
adjusting a value of a second display parameter of the target object to the second display parameter value.
Here, the terminal can adjust the parameter value of the second display parameter of the target object according to the preset association between the target touch operation and the second display parameter. The method is simple, and the operation is convenient, fast, and time-saving.
For example: an association between the turning angle and a sliding operation in which two fingers of the user simultaneously touch the display interface is preset in the terminal. When the target touch operation received by the terminal is that the two fingers respectively touch the operation frame B and the image area of the picture A to be edited outside the operation frame B and slide, the terminal determines the corresponding second display parameter value as an angle value β3 according to the sliding track lengths of the two fingers' sliding operations, and then adjusts the angle values of the turning angles of the picture A to be edited and the operation frame B to β3 simultaneously. Alternatively, if the target touch operation is a sliding operation in which the two fingers both touch the operation frame B or the picture A to be edited and slide, and the second display parameter value is determined to be an angle β4, the terminal adjusts the angle value of the turning angle of the operation frame B or the picture A to be edited to β4.
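The preset association in this example — a simultaneous two-finger slide bound to the turning angle — might be sketched as follows. The degrees-per-centimeter constant and the object names are illustrative assumptions:

```python
def turning_angle_from_tracks(track1_cm: float, track2_cm: float,
                              deg_per_cm: float = 15.0) -> float:
    """Derive the second display parameter value (an angle) from the
    combined sliding track lengths of the two fingers."""
    return (track1_cm + track2_cm) * deg_per_cm

def apply_turning_angle(angles: dict, beta: float, targets) -> dict:
    """Set the turning angle of each target object (picture, frame, or both),
    leaving non-target objects unchanged."""
    return {name: (beta if name in targets else angle)
            for name, angle in angles.items()}

state = {"picture_A": 0.0, "frame_B": 0.0}
beta = turning_angle_from_tracks(1.0, 1.0)          # both fingers slide 1 cm
print(apply_turning_angle(state, beta, ["picture_A", "frame_B"]))
```

Passing both objects as targets models the first case above (both turned to β3 simultaneously); passing only one models the second case (only the frame or only the picture turned to β4).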
In addition, optionally, before the step 103, the method further includes:
and displaying a display parameter value corresponding to the operation parameter value of the target touch operation in the display interface.
In this embodiment, in the process of inputting the target touch operation by the user, the terminal may display the parameter value corresponding to the operation parameter value of the target touch operation in real time, so that the user may adjust the input operation in time according to the displayed display parameter value, thereby adjusting the target object to a more appropriate display state.
For example: as shown in fig. 7, in the process of receiving the sliding operation input by the user, the terminal may display the parameter value corresponding to the current sliding track length of the sliding operation in real time in the display interface thereof, that is, when the sliding track length is 1 cm, "zoom in 2" is displayed in the display interface of the terminal.
And 104, synthesizing the element to be added to the picture to be edited under the condition that the target object is in the target display state.
In the embodiment of the present invention, after the display state of the target object is changed in step 103, that is, under the condition that the target object is in the target display state corresponding to the target touch operation, the terminal may synthesize the element to be added to the picture to be edited, so as to form a synthesized picture including the element to be added and the picture to be edited.
The above-mentioned synthesizing of the element to be added to the picture to be edited may mean that the element to be added covers the content at the corresponding position on the picture to be edited; or that the element to be added is placed in a layer beneath the content at the corresponding position of the picture to be edited; or that the element to be added is added at the corresponding position of the picture to be edited and displayed with a preset transparency, and the like, which is not limited herein.
It should be noted that, the above synthesizing the element to be added to the picture to be edited may also be that when the terminal receives a synthesis operation input by the user, the synthesizing of the element to be added to the picture to be edited is executed; or, when the terminal detects that a preset time length after the display state of the target object is changed arrives, automatically synthesizing the element to be added to the picture to be edited, which is not limited herein.
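The three synthesis options above — covering the content, placing the element in a layer beneath it, or blending with a preset transparency — can be sketched per pixel. Pixels are modeled as single grayscale values for brevity, and the mode names are invented for illustration:

```python
def composite_pixel(picture: float, element: float, mode: str,
                    alpha: float = 0.5) -> float:
    """Merge one pixel of the element to be added into the picture to be edited."""
    if mode == "cover":          # element covers the picture's content
        return element
    if mode == "underlay":       # element sits in a layer beneath the content
        return picture
    if mode == "transparent":    # element shown with a preset transparency
        return alpha * element + (1.0 - alpha) * picture
    raise ValueError(f"unknown synthesis mode: {mode}")

print(composite_pixel(100.0, 200.0, "cover"))              # 200.0
print(composite_pixel(100.0, 200.0, "transparent", 0.25))  # 125.0
```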
In this embodiment of the present invention, the terminal may be any terminal capable of executing the image processing method, for example: a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or the like.
In the picture processing method of the embodiment of the invention, in the process that the terminal adds the elements in the picture to be edited, the picture to be edited and/or the operation frame can be determined as the target object of the display state to be adjusted through the positions of at least two touch points of the touch operation of the user, and the display state of the target object is changed into the display state corresponding to the touch operation, so that the adjustment of the display parameters of the added elements in the picture and/or the operation frame to be edited can be realized, and the display effect of the synthesized picture is improved.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present invention, and as shown in fig. 8, a terminal 800 includes:
the first display module 801 is configured to, in a case that a picture to be edited is displayed in a display interface of the terminal, display an operation frame including an element to be added in a suspended manner on the picture to be edited;
a receiving module 802, configured to receive a target touch operation input by a user and associated with a target object, where the target object is determined by positions of at least two touch points of the target touch operation on a display interface of a terminal, and the target object includes at least one of a picture to be edited and an operation frame;
a state changing module 803, configured to change the display state of the target object to a corresponding target display state according to the target touch operation;
and the second display module 804 is configured to synthesize the element to be added to the picture to be edited when the target object is in a target display state.
Optionally, as shown in fig. 9, the state changing module 803 includes:
a display parameter determining unit 8031, configured to determine a first display parameter adjusted by the target touch operation;
a first parameter value determining unit 8032, configured to determine a first display parameter value corresponding to an operation parameter value of the target touch operation;
the first adjusting unit 8033 is configured to adjust a parameter value of a first display parameter of the target object according to the first display parameter value.
Optionally, as shown in fig. 10, the terminal 800 further includes:
a third display module 805, configured to display at least one sub-control on a display interface, where each sub-control in the at least one sub-control is associated with a display parameter, and the display parameters associated with different sub-controls are different;
the display parameter determining unit 8031 is specifically configured to:
and under the condition that the touch point of the target touch operation is located on the target sub-control, determining the display parameter associated with the target sub-control as a first display parameter, wherein the target sub-control is any one of at least one sub-control.
Optionally, in a case that the target touch operation is a sliding operation, the operation parameter value includes a sliding track length; or
Under the condition that the target touch operation is a pressing operation, the operation parameter value comprises a pressing intensity and/or a pressing duration; or
In the case that the target touch operation is a click operation, the operation parameter value includes a number of clicks.
Optionally, the first display parameter includes at least one of a turning angle, a rotation angle, a zoom multiple, and a font parameter of the text.
Optionally, the target touch operation is preset to be associated with a second display parameter;
as shown in fig. 11, the state change module 803 includes:
a second parameter value determining unit 8034, configured to determine a second display parameter value corresponding to an operation parameter value of the target touch operation;
a second adjusting unit 8035, configured to adjust a value of a second display parameter of the target object to the second display parameter value.
Optionally, as shown in fig. 12, the terminal 800 further includes:
a fourth display module 806, configured to display, in the display interface, a display parameter value corresponding to the operation parameter value of the target touch operation.
The terminal 800 can implement each process implemented by the terminal in the above method embodiments, and is not described here again to avoid repetition.
In the process of adding elements to the picture to be edited, the terminal 800 of the embodiment of the present invention may determine the picture to be edited and/or the operation frame as the target object with the display state to be adjusted through the positions of at least two touch points of the touch operation of the user, and change the display state of the target object into the display state corresponding to the touch operation, thereby implementing adjustment of the display parameters of the added elements in the picture to be edited and/or the operation frame, and improving the display effect of the synthesized picture.
Fig. 13 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, a power supply 1311, and the like. Those skilled in the art will appreciate that the terminal configuration shown in fig. 13 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 1310 is configured to:
under the condition that a picture to be edited is displayed in a display interface of the terminal, an operation frame comprising an element to be added is displayed on the picture to be edited in a suspending way;
receiving a target touch operation input by a user and associated with a target object, wherein the target object is determined by the positions of at least two touch points of the target touch operation on a display interface of a terminal, and the target object comprises at least one of a picture to be edited and an operation frame;
changing the display state of the target object into a corresponding target display state according to the target touch operation;
and under the condition that the target object is in a target display state, synthesizing the element to be added to the picture to be edited.
Optionally, the processor 1310 is further configured to:
determining a first display parameter adjusted by target touch operation;
determining a first display parameter value corresponding to an operation parameter value of a target touch operation;
and adjusting the parameter value of the first display parameter of the target object according to the first display parameter value.
Optionally, the processor 1310 is further configured to:
displaying at least one sub-control on a display interface, wherein each sub-control in the at least one sub-control is associated with a display parameter, and the display parameters associated with different sub-controls are different;
and under the condition that the touch point of the target touch operation is located on the target sub-control, determining the display parameter associated with the target sub-control as a first display parameter, wherein the target sub-control is any one of at least one sub-control.
Optionally, in a case that the target touch operation is a sliding operation, the operation parameter value includes a sliding track length; or
Under the condition that the target touch operation is a pressing operation, the operation parameter value comprises a pressing intensity and/or a pressing duration; or
In the case that the target touch operation is a click operation, the operation parameter value includes a number of clicks.
Optionally, the first display parameter includes at least one of a turning angle, a rotation angle, a zoom multiple, and a font parameter of the text.
Optionally, the target touch operation is associated with a second display parameter;
the processor 1310 is further configured to:
determining a second display parameter value corresponding to the operation parameter value of the target touch operation;
adjusting a value of a second display parameter of the target object to the second display parameter value.
The processor 1310 is further configured to:
and displaying a display parameter value corresponding to the operation parameter value of the target touch operation in the display interface.
In the process of adding elements to the picture to be edited, the terminal 1300 in the embodiment of the present invention may determine the picture to be edited and/or the operation frame as the target object with the display state to be adjusted through the positions of at least two touch points of the touch operation of the user, and change the display state of the target object into the display state corresponding to the touch operation, thereby implementing adjustment of the display parameters of the added elements in the picture to be edited and/or the operation frame, and improving the display effect of the synthesized picture.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1301 may be configured to receive and transmit signals during a message transmission or call process; specifically, it receives downlink data from a base station and delivers it to the processor 1310 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 1301 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1301 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 1302, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1303 can convert audio data received by the radio frequency unit 1301 or the network module 1302 or stored in the memory 1309 into an audio signal and output as sound. Also, the audio output unit 1303 may also provide audio output related to a specific function performed by the terminal 1300 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1303 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1304 is used to receive audio or video signals. The input Unit 1304 may include a Graphics Processing Unit (GPU) 13041 and a microphone 13042, and the graphics processor 13041 processes picture data of still pictures or video obtained by a picture capturing apparatus (such as a camera) in a video capture mode or a picture capture mode. The processed picture frame may be displayed on the display unit 1306. The picture frame processed by the graphics processor 13041 may be stored in the memory 1309 (or other storage medium) or transmitted via the radio frequency unit 1301 or the network module 1302. The microphone 13042 can receive sounds and can process such sounds into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1301 and output.
Terminal 1300 can also include at least one sensor 1305, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 13061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 13061 and/or backlight when the terminal 1300 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 1305 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1306 is used to display information input by a user or information provided to the user. The Display unit 1306 may include a Display panel 13061, and the Display panel 13061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1307 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 1307 includes a touch panel 13071 and other input devices 13072. Touch panel 13071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on touch panel 13071 or near touch panel 13071 using a finger, stylus, or any other suitable object or attachment). The touch panel 13071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1310, and receives and executes commands sent from the processor 1310. In addition, the touch panel 13071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 1307 may include other input devices 13072 in addition to the touch panel 13071. In particular, the other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 13071 can be overlaid on the display panel 13061, and when the touch panel 13071 detects a touch operation on or near the touch panel, the touch operation can be transmitted to the processor 1310 to determine the type of the touch event, and then the processor 1310 can provide a corresponding visual output on the display panel 13061 according to the type of the touch event. Although the touch panel 13071 and the display panel 13061 are shown in fig. 13 as two separate components to implement the input and output functions of the terminal, in some embodiments, the touch panel 13071 may be integrated with the display panel 13061 to implement the input and output functions of the terminal, which is not limited herein.
An interface unit 1308 is an interface for connecting an external device to the terminal 1300. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1308 can be used to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal 1300 or can be used to transmit data between terminal 1300 and an external device.
The memory 1309 may be used to store software programs as well as various data. The memory 1309 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program (such as a sound playing function, a picture playing function, etc.) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1309 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1310 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 1309 and calling data stored in the memory 1309, thereby monitoring the terminal as a whole. Processor 1310 may include one or more processing units; preferably, the processor 1310 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1310.
The terminal 1300 may also include a power supply 1311 (e.g., a battery) for powering the various components, and preferably, the power supply 1311 may be logically coupled to the processor 1310 via a power management system that provides functionality for managing charging, discharging, and power consumption via the power management system.
In addition, terminal 1300 includes some functional modules that are not shown, and are not described herein again.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 1310, a memory 1309, and a computer program stored in the memory 1309 and capable of running on the processor 1310, where the computer program, when executed by the processor 1310, implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A picture processing method applied to a terminal, characterized by comprising:
in a case where a picture to be edited is displayed on a display interface of the terminal, displaying an operation frame comprising an element to be added in a floating manner over the picture to be edited;
receiving a target touch operation that is input by a user and associated with a target object, wherein the target object is determined by the positions, on the display interface of the terminal, of at least two touch points of the target touch operation, and the target object comprises at least one of the picture to be edited and the operation frame;
changing a display state of the target object to a corresponding target display state according to the target touch operation; and
in a case where the target object is in the target display state, synthesizing the element to be added onto the picture to be edited;
wherein changing the display state of the target object to the corresponding target display state comprises: when the target object comprises the operation frame, changing the display state of the operation frame to the target display state and simultaneously changing the display state of the element to be added in the operation frame to the target display state;
wherein the step of changing the display state of the target object to the corresponding target display state according to the target touch operation comprises: determining a first display parameter to be adjusted by the target touch operation; determining a first display parameter value corresponding to an operation parameter value of the target touch operation; and adjusting a parameter value of the first display parameter of the target object according to the first display parameter value;
and wherein, in a case where the target object comprises the picture to be edited and the operation frame, and the first display parameter comprises one display parameter, the same adjustment is applied to both the picture to be edited and the operation frame according to the first display parameter.
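The core flow of claim 1 — determining the target object(s) from where the touch points land, then applying the same display-parameter adjustment to each targeted object — can be sketched as follows. This is an illustrative sketch only; the names `EditSession`, `DisplayState`, and `apply_touch`, and the example parameters, are hypothetical and do not appear in the patent.

```python
from dataclasses import dataclass, field


@dataclass
class DisplayState:
    """Display parameters of one on-screen object (example parameters only)."""
    params: dict = field(default_factory=lambda: {"scale": 1.0, "alpha": 1.0})


@dataclass
class EditSession:
    picture: DisplayState = field(default_factory=DisplayState)
    operation_frame: DisplayState = field(default_factory=DisplayState)

    def apply_touch(self, touched_regions, param, value):
        """Adjust one display parameter on every object the touch points target.

        When the touch points land on both the picture and the operation
        frame, both objects receive the identical adjustment, matching the
        last limitation of claim 1.
        """
        for name in set(touched_regions):
            getattr(self, name).params[param] = value


session = EditSession()
# Two touch points: one on the picture, one on the operation frame.
session.apply_touch(["picture", "operation_frame"], "scale", 1.5)
print(session.picture.params["scale"], session.operation_frame.params["scale"])
```

Because both objects share the one adjusted parameter, the floating operation frame stays visually in step with the picture underneath it.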
2. The method according to claim 1, wherein, before the step of receiving a target touch operation that is input by a user and associated with a target object, the method further comprises:
displaying at least one sub-control on the display interface, wherein each sub-control of the at least one sub-control is associated with one display parameter, and different sub-controls are associated with different display parameters; and
wherein the step of determining the first display parameter to be adjusted by the target touch operation comprises:
in a case where a touch point of the target touch operation is located on a target sub-control, determining the display parameter associated with the target sub-control as the first display parameter, wherein the target sub-control is any one of the at least one sub-control.
3. The method according to claim 1, wherein, in a case where the target touch operation is a sliding operation, the operation parameter value comprises a sliding track length; or,
in a case where the target touch operation is a pressing operation, the operation parameter value comprises a pressing intensity and/or a pressing duration; or,
in a case where the target touch operation is a click operation, the operation parameter value comprises a number of clicks.
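Claim 3 enumerates three operation types, each measured by a different operation parameter; claim 1 then converts that measurement into a display parameter value. One possible conversion is sketched below — the linear scale factors are arbitrary illustrations, since the patent leaves the mapping itself unspecified, and an actual terminal would calibrate them to the screen and the parameter's value range.

```python
def display_parameter_value(operation_type, operation_parameter_value):
    """Map a measured operation parameter value to a display parameter value.

    The three branches mirror the three cases of claim 3; the factors
    (100.0, 5.0, 0.1) are placeholders chosen for illustration.
    """
    if operation_type == "slide":   # measured value: sliding track length, px
        return operation_parameter_value / 100.0
    if operation_type == "press":   # measured value: pressing intensity or duration
        return min(1.0, operation_parameter_value / 5.0)
    if operation_type == "click":   # measured value: number of clicks
        return 0.1 * operation_parameter_value
    raise ValueError(f"unknown operation type: {operation_type}")


print(display_parameter_value("slide", 250.0))  # -> 2.5
```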
4. The method according to claim 1, wherein the target touch operation is preset to be associated with a second display parameter; and
the step of changing the display state of the target object to the corresponding target display state according to the target touch operation comprises:
determining a second display parameter value corresponding to the operation parameter value of the target touch operation; and
adjusting a value of the second display parameter of the target object to the second display parameter value.
5. The method according to claim 1, wherein, before the step of changing the display state of the target object to the corresponding target display state according to the target touch operation, the method further comprises:
displaying, on the display interface, a display parameter value corresponding to the operation parameter value of the target touch operation.
6. A terminal, comprising:
a first display module, configured to display, in a case where a picture to be edited is displayed on a display interface of the terminal, an operation frame comprising an element to be added in a floating manner over the picture to be edited;
a receiving module, configured to receive a target touch operation that is input by a user and associated with a target object, wherein the target object is determined by the positions, on the display interface of the terminal, of at least two touch points of the target touch operation, and the target object comprises at least one of the picture to be edited and the operation frame;
a state changing module, configured to change a display state of the target object to a corresponding target display state according to the target touch operation; and
a second display module, configured to synthesize the element to be added onto the picture to be edited in a case where the target object is in the target display state;
wherein the state changing module is further configured to, when the target object comprises the operation frame, change the display state of the operation frame to the target display state and simultaneously change the display state of the element to be added in the operation frame to the target display state;
wherein the state changing module comprises: a display parameter determining unit, configured to determine a first display parameter to be adjusted by the target touch operation; a first parameter value determining unit, configured to determine a first display parameter value corresponding to an operation parameter value of the target touch operation; and a first adjusting unit, configured to adjust a parameter value of the first display parameter of the target object according to the first display parameter value;
and wherein, in a case where the target object comprises the picture to be edited and the operation frame, and the first display parameter comprises one display parameter, the same adjustment is applied to both the picture to be edited and the operation frame according to the first display parameter.
7. The terminal according to claim 6, further comprising:
a third display module, configured to display at least one sub-control on the display interface, wherein each sub-control of the at least one sub-control is associated with one display parameter, and different sub-controls are associated with different display parameters;
wherein the display parameter determining unit is specifically configured to:
in a case where a touch point of the target touch operation is located on a target sub-control, determine the display parameter associated with the target sub-control as the first display parameter, wherein the target sub-control is any one of the at least one sub-control.
8. The terminal according to claim 6, wherein, in a case where the target touch operation is a sliding operation, the operation parameter value comprises a sliding track length; or,
in a case where the target touch operation is a pressing operation, the operation parameter value comprises a pressing intensity and/or a pressing duration; or,
in a case where the target touch operation is a click operation, the operation parameter value comprises a number of clicks.
9. The terminal according to claim 6, wherein the target touch operation is preset to be associated with a second display parameter; and
the state changing module comprises:
a second parameter value determining unit, configured to determine a second display parameter value corresponding to the operation parameter value of the target touch operation; and
a second adjusting unit, configured to adjust a value of the second display parameter of the target object to the second display parameter value.
10. The terminal according to claim 6, further comprising:
a fourth display module, configured to display, on the display interface, a display parameter value corresponding to the operation parameter value of the target touch operation.
11. An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the picture processing method according to any one of claims 1 to 5.
CN201810690070.2A 2018-06-28 2018-06-28 Picture processing method and terminal Active CN108845753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810690070.2A CN108845753B (en) 2018-06-28 2018-06-28 Picture processing method and terminal

Publications (2)

Publication Number Publication Date
CN108845753A CN108845753A (en) 2018-11-20
CN108845753B true CN108845753B (en) 2021-04-27

Family

ID=64200770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810690070.2A Active CN108845753B (en) 2018-06-28 2018-06-28 Picture processing method and terminal

Country Status (1)

Country Link
CN (1) CN108845753B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112084750B (en) * 2019-06-14 2023-05-23 腾讯数码(天津)有限公司 Label paper processing method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102354275A (en) * 2011-09-29 2012-02-15 深圳市万兴软件有限公司 Text input box and data processing method thereof
CN105700789A (en) * 2016-01-11 2016-06-22 广东欧珀移动通信有限公司 Image sending method and terminal device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102030754B1 (en) * 2012-03-08 2019-10-10 삼성전자주식회사 Image edting apparatus and method for selecting region of interest
CN106775293A (en) * 2016-11-22 2017-05-31 维沃移动通信有限公司 The operating method and mobile terminal of a kind of picture

Similar Documents

Publication Publication Date Title
JP7359920B2 (en) Image processing method and flexible screen terminal
CN107995429B (en) Shooting method and mobile terminal
CN107943390B (en) Character copying method and mobile terminal
CN109240577B (en) Screen capturing method and terminal
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN109213416B (en) Display information processing method and mobile terminal
CN111562896B (en) Screen projection method and electronic equipment
CN109710349B (en) Screen capturing method and mobile terminal
CN109739407B (en) Information processing method and terminal equipment
WO2019184947A1 (en) Image viewing method and mobile terminal
CN110209313B (en) Icon moving method and terminal equipment
CN108900695B (en) Display processing method, terminal equipment and computer readable storage medium
CN109189303B (en) Text editing method and mobile terminal
CN111026316A (en) Image display method and electronic equipment
CN111147919A (en) Play adjustment method, electronic equipment and computer readable storage medium
CN107734172B (en) Information display method and mobile terminal
CN108924035B (en) File sharing method and terminal
CN108804628B (en) Picture display method and terminal
CN108509141B (en) Control generation method and mobile terminal
WO2021017730A1 (en) Screenshot method and terminal device
CN111352892B (en) Operation processing method and electronic equipment
CN110795021A (en) Information display method and device and electronic equipment
CN111031253A (en) Shooting method and electronic equipment
CN110536005B (en) Object display adjustment method and terminal
CN111522613A (en) Screen capturing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant