CN117893639A - Picture processing method and related device - Google Patents

Picture processing method and related device

Info

Publication number
CN117893639A
CN117893639A (application CN202211261751.XA)
Authority
CN
China
Prior art keywords
picture
user
terminal device
filter
toolbar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211261751.XA
Other languages
Chinese (zh)
Inventor
张洁
韩笑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211261751.XA priority Critical patent/CN117893639A/en
Publication of CN117893639A publication Critical patent/CN117893639A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a picture processing method and a related device. The method includes: a terminal device displays a first picture; the terminal device adds a first-type element to the first picture; the terminal device adds a second-type element to the first picture; the terminal device receives a cropping operation; in response to the cropping operation, the terminal device displays the first picture, displays the second-type element, displays a crop region corresponding to the cropping operation, and does not display the first-type element; when the cropping operation ends, the terminal device obtains a second picture, where the second picture includes: the first-type element, the content of the first picture covered by a target crop region, and the content of the second-type element covered by the target crop region, the target crop region being the crop region at the moment the cropping operation ends. In this way, the flexibility of picture processing is improved, the processing result better matches the user's expectations, and the user experience is improved.

Description

Picture processing method and related device
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a method and an apparatus for processing a picture.
Background
With the development of terminal technology, the image processing functions supported by terminal devices have become increasingly rich, and a user can process a target picture through the editing function in the gallery application of a terminal device.
However, when a picture is processed through the gallery application of the terminal device, the processing is inflexible and inconvenient, and the processing result may not meet the user's expectations, which affects the user experience.
Disclosure of Invention
Embodiments of this application provide a picture processing method and a related device. When a user crops a picture, one part of the elements on the picture is temporarily hidden and cannot be cropped; after cropping, these elements are displayed on the cropped picture according to its size, position, and the like. The other part of the elements remains displayed on the picture, so the user can see those elements intuitively during cropping, and they can be cropped; this improves the flexibility of picture processing.
In a first aspect, an embodiment of the present application provides a method for processing a picture, where the method includes:
the terminal device displays a first picture; the terminal device adds a first-type element to the first picture; the terminal device adds a second-type element to the first picture; the terminal device receives a cropping operation, where the cropping operation acts continuously on the interface of the terminal device; in response to the cropping operation, the terminal device displays the first picture, displays the second-type element, displays a crop region corresponding to the cropping operation, and does not display the first-type element; when the cropping operation ends, the terminal device obtains a second picture, where the second picture includes: the first-type element, the content of the first picture covered by a target crop region, and the content of the second-type element covered by the target crop region, the target crop region being the crop region at the moment the cropping operation ends. In this way, the user can preview the cropping effect in real time, the flexibility of picture processing is improved, the processing result better matches the user's expectations, and the user experience is improved.
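The claimed flow above can be sketched in plain Python (a minimal sketch with hypothetical names and simple rectangle geometry; the patent does not prescribe any data structures): first-type elements are excluded from the crop and re-added to the result, while second-type elements are clipped together with the picture content.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def intersect(self, other: "Rect") -> "Rect":
        # Content "covered by" a crop region is its intersection with it.
        x1, y1 = max(self.x, other.x), max(self.y, other.y)
        x2 = min(self.x + self.w, other.x + other.w)
        y2 = min(self.y + self.h, other.y + other.h)
        return Rect(x1, y1, max(0, x2 - x1), max(0, y2 - y1))

@dataclass
class Element:
    kind: str       # e.g. "text", "watermark" (first type); "graffiti" (second type)
    bounds: Rect

FIRST_TYPE = {"text", "watermark", "sticker"}   # hidden while cropping, re-added after
SECOND_TYPE = {"graffiti", "mosaic"}            # shown while cropping, cropped with the picture

def crop_picture(picture: Rect, elements: list[Element], crop: Rect):
    """Return the second picture: cropped content plus re-added first-type elements."""
    target = picture.intersect(crop)            # crop region at the end of the operation
    kept = [replace(e, bounds=e.bounds.intersect(target))
            for e in elements if e.kind in SECOND_TYPE]          # cropped with content
    readded = [e for e in elements if e.kind in FIRST_TYPE]      # restored uncropped
    return target, kept + readded
```

A first-type element lying outside the crop region thus still appears in the second picture, which is the behaviour the claim describes.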
In a possible implementation, the first-type element includes one or more of the following: text, a watermark, or a sticker; the second-type element includes one or more of the following: graffiti or a mosaic. In this way, the contents of the first-type element and the second-type element can be adjusted according to the user's actual needs, and during cropping some elements are displayed while others are not, which improves the flexibility of picture processing and meets the user's actual needs.
In a possible implementation, the first-type element includes one or more of the following: graffiti, a watermark, or a sticker; the second-type element includes one or more of the following: text or a mosaic. In this way, the contents of the first-type element and the second-type element can be adjusted according to the user's actual needs, and during cropping some elements are displayed while others are not, which improves the flexibility of picture processing and meets the user's actual needs.
In a possible implementation, the difference between the position of the first-type element in the second picture and its position in the first picture is smaller than a preset value. In this way, the cropped picture and the added element remain visually coordinated, improving the user experience.
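One way to satisfy this constraint is to clamp the re-added element into the cropped canvas and measure how far it moved; the following is only an illustrative sketch (the function name, tuple layout, and clamping policy are assumptions, not taken from the patent):

```python
def reposition(old_pos: tuple[float, float],
               elem_size: tuple[float, float],
               new_size: tuple[float, float]) -> tuple[tuple[float, float], float]:
    """Clamp a first-type element inside the cropped picture and report its displacement.

    The caller can compare the returned displacement against the preset value
    mentioned in the claim.
    """
    x = min(max(old_pos[0], 0.0), max(0.0, new_size[0] - elem_size[0]))
    y = min(max(old_pos[1], 0.0), max(0.0, new_size[1] - elem_size[1]))
    displacement = ((x - old_pos[0]) ** 2 + (y - old_pos[1]) ** 2) ** 0.5
    return (x, y), displacement
```

If the element already lies inside the cropped picture, it does not move at all (displacement 0), which trivially satisfies any positive preset value.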
In a possible implementation, before the terminal device receives the cropping operation, the method further includes: the terminal device receives a first operation; in response to the first operation, the terminal device adds a photo frame around the first picture. While the cropping operation acts continuously on the interface of the terminal device, the terminal device does not display the photo frame; when the cropping operation ends, the periphery of the second picture includes the photo frame. In this way, when the user crops the picture, the photo frame is not affected by the cropping and is redisplayed around the cropped picture after cropping is completed, which meets the user's picture processing expectations and improves the user experience.
In a possible implementation, after the terminal device displays the first picture, the method further includes: the terminal device receives a second operation on the first picture in the gallery application; in response to the second operation, the terminal device displays a toolbar, where the toolbar includes a primary toolbar and does not include a secondary toolbar. In this way, the user can select the desired function directly in the toolbar without expanding a secondary toolbar; each function is easy to find, the number of interface jumps is reduced, and the user's operations are more convenient.
In a possible implementation, the primary toolbar includes one or more of the following functions: crop, filter, adjust, beautify, graffiti, text, mosaic, watermark, photo frame, blur, retain color, or sticker. In this way, all picture processing functions are placed in one toolbar; the user can select the desired function in the primary toolbar, each function is easy to find, the number of interface jumps is reduced, and the user's operations are more convenient.
In a possible implementation, only part of the functions are displayed in the primary toolbar, and the method further includes: when a sliding operation is received in the primary toolbar, displaying, based on the sliding direction of the sliding operation, functions of the primary toolbar that are hidden in the sliding direction. In this way, all picture processing functions are placed in one toolbar; the user can reach the desired function through a sliding operation without expanding a secondary toolbar, each function is easy to find, the number of interface jumps is reduced, and the user's operations are more convenient.
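The sliding behaviour amounts to moving a fixed-size window over the function list; a toy sketch (function names, the window size, and the function list order are invented for illustration):

```python
# Hypothetical full function list of the primary toolbar.
FUNCTIONS = ["crop", "filter", "adjust", "beautify", "graffiti", "text",
             "mosaic", "watermark", "photo frame", "blur", "retain color", "sticker"]

def visible_functions(offset: int, window: int = 5) -> list[str]:
    """Return the slice of the primary toolbar currently on screen."""
    offset = min(max(offset, 0), len(FUNCTIONS) - window)  # stop at either end
    return FUNCTIONS[offset:offset + window]

def slide(offset: int, direction: str, step: int = 1, window: int = 5) -> int:
    """Sliding left reveals functions hidden on the right, and vice versa."""
    delta = step if direction == "left" else -step
    return min(max(offset + delta, 0), len(FUNCTIONS) - window)
```

Sliding past either end of the list simply leaves the window at the edge, so the user can never scroll into empty space.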
In a possible implementation, after the terminal device obtains the second picture, the method further includes: the terminal device receives a third operation; in response to the third operation, the terminal device saves the second picture. In this way, the processed picture can be saved with one tap, making the user's operation simpler and quicker.
In a possible implementation, before the terminal device adds the second-type element to the first picture, the method further includes: the terminal device receives a fourth operation; in response to the fourth operation, the terminal device displays a filter toolbar, where the filter toolbar includes one or more filter options; when a trigger on a first filter option among the filter options is received, the terminal device adds a first filter corresponding to the first filter option to the first picture and displays the first filter option in a selected state. After the terminal device adds the second-type element to the first picture, the method further includes: the terminal device receives a fifth operation; in response to the fifth operation, the terminal device displays the filter toolbar, where the first filter option is in the selected state; when a trigger on a second filter option among the filter options is received, the terminal device switches the first filter to a second filter corresponding to the second filter option, the second filter option is in the selected state in the filter toolbar, and the first filter option is in an unselected state. In this way, picture processing is more convenient for the user, and the user experience is improved.
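The single-selection behaviour of the filter toolbar described above can be modelled as a small state holder (class, method, and filter names are illustrative assumptions):

```python
class FilterToolbar:
    """Single-selection toolbar: triggering a new filter deselects the previous one."""

    def __init__(self, options: list[str]):
        self.options = options
        self.selected = None            # no filter applied initially

    def trigger(self, option: str) -> None:
        if option not in self.options:
            raise ValueError(f"unknown filter: {option}")
        self.selected = option          # the previously selected option becomes unselected

    def state(self, option: str) -> str:
        return "selected" if option == self.selected else "unselected"
```

Reopening the toolbar (the fifth operation in the claim) only needs to read `selected` to restore the highlighted option.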
In a possible implementation, before the terminal device adds the second-type element to the first picture, the method further includes: the terminal device receives a sixth operation; in response to the sixth operation, the terminal device adds first text to the first picture. After the terminal device adds the second-type element to the first picture, the method further includes: the terminal device receives a seventh operation; in response to the seventh operation, the terminal device displays the first text in an editable state; the terminal device receives an eighth operation; in response to the eighth operation, the terminal device modifies the first text into second text. In this way, picture processing is more convenient for the user, and the user experience is improved.
In a possible implementation, before the terminal device displays the first picture, the method further includes: the terminal device displays an interface of the gallery, where the interface of the gallery includes one or more pictures; the terminal device receives a ninth operation; in response to the ninth operation, the terminal device displays the first picture. In this way, the user can process the picture through the gallery application on the terminal device without importing it into third-party picture processing software, making picture processing more convenient.
In a second aspect, an embodiment of this application provides a picture processing apparatus. The picture processing apparatus may be a terminal device, or a chip or chip system in a terminal device. The picture processing apparatus may include a processing unit and a display unit. The processing unit is configured to implement the first aspect or any processing-related method in any possible implementation of the first aspect. The display unit is configured to support the picture processing apparatus in displaying image information and the like. When the picture processing apparatus is a terminal device, the processing unit may be a processor. The picture processing apparatus may further include a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the terminal device implements the method described in the first aspect or any possible implementation of the first aspect. When the picture processing apparatus is a chip or chip system in a terminal device, the processing unit may also be a processor; it executes the instructions stored in the storage unit so that the terminal device implements the method described in the first aspect or any possible implementation of the first aspect. The storage unit may be a storage unit in the chip (for example, a register or a cache), or a storage unit in the terminal device outside the chip (for example, a read-only memory or a random access memory).
For example, the display unit is configured to display a first picture; the processing unit is configured to add a first-type element to the first picture; the processing unit is further configured to add a second-type element to the first picture; the processing unit is further configured to receive a cropping operation, where the cropping operation acts continuously on the interface of the terminal device; the display unit is configured to display the first picture, display the second-type element, display a crop region corresponding to the cropping operation, and not display the first-type element; the processing unit is further configured to obtain a second picture when the cropping operation ends, where the second picture includes: the first-type element, the content of the first picture covered by a target crop region, and the content of the second-type element covered by the target crop region, the target crop region being the crop region at the moment the cropping operation ends.
In a possible implementation, the first-type element includes one or more of the following: text, a watermark, or a sticker; the second-type element includes one or more of the following: graffiti or a mosaic.
In a possible implementation, the first-type element includes one or more of the following: graffiti, a watermark, or a sticker; the second-type element includes one or more of the following: text or a mosaic.
In a possible implementation, the difference between the position of the first-type element in the second picture and its position in the first picture is smaller than a preset value.
In a possible implementation, the processing unit is further configured to receive a first operation, and to add a photo frame around the first picture in response to the first operation; while the cropping operation acts continuously on the interface of the terminal device, the photo frame is not displayed; when the cropping operation ends, the periphery of the second picture includes the photo frame.
In a possible implementation, the processing unit is further configured to receive, in the gallery application, a second operation on the first picture; the display unit is further configured to display a toolbar in response to the second operation, where the toolbar includes a primary toolbar and does not include a secondary toolbar.
In a possible implementation, the primary toolbar includes one or more of the following functions: crop, filter, adjust, beautify, graffiti, text, mosaic, watermark, photo frame, blur, retain color, or sticker.
In a possible implementation, only part of the functions are displayed in the primary toolbar, and the display unit is further configured to, when a sliding operation is received in the primary toolbar, display, based on the sliding direction of the sliding operation, functions of the primary toolbar that are hidden in the sliding direction.
In a possible implementation, the processing unit is further configured to receive a third operation and, in response to the third operation, to save the second picture.
In a possible implementation, the processing unit is further configured to receive a fourth operation; in response to the fourth operation, the display unit is further configured to display a filter toolbar, where the filter toolbar includes one or more filter options; when a trigger on a first filter option among the filter options is received, the processing unit is further configured to add a first filter corresponding to the first filter option to the first picture, and the display unit is further configured to display the first filter option in a selected state; the processing unit is further configured to receive a fifth operation; in response to the fifth operation, the display unit is further configured to display the filter toolbar, where the first filter option is in the selected state; when a trigger on a second filter option among the filter options is received, the processing unit is further configured to switch the first filter to a second filter corresponding to the second filter option, where the second filter option is in the selected state in the filter toolbar and the first filter option is in an unselected state.
In a possible implementation, the processing unit is further configured to receive a sixth operation, and to add first text to the first picture in response to the sixth operation; after the second-type element is added to the first picture, the processing unit is further configured to receive a seventh operation; in response to the seventh operation, the display unit is further configured to display the first text in an editable state; the processing unit is further configured to receive an eighth operation and, in response to the eighth operation, to modify the first text into second text.
In a possible implementation, the display unit is further configured to display an interface of the gallery, where the interface of the gallery includes one or more pictures; the processing unit is further configured to receive a ninth operation; the display unit is further configured to display the first picture in response to the ninth operation.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory, the memory being for storing code instructions, the processor being for executing the code instructions to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
In a sixth aspect, embodiments of the present application provide a chip comprising a processor for invoking a computer program in a memory to perform a method as described in the first aspect.
It should be understood that the second to sixth aspects of this application correspond to the technical solution of the first aspect; the beneficial effects obtained by each aspect and its corresponding possible implementations are similar and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of picture processing in one possible implementation;
FIG. 2 is a schematic illustration of the addition of elements in one possible implementation;
FIG. 3 is a schematic diagram of picture cropping in one possible implementation;
FIG. 4 is a schematic illustration of filter addition in one possible implementation;
FIG. 5 is a schematic diagram of an editing flow in one possible implementation;
fig. 6 is a schematic structural diagram of a terminal device provided in an embodiment of the present application;
fig. 7 is a schematic software architecture diagram of a terminal device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a logic layer according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a physical layer according to an embodiment of the present application;
FIG. 10 is a toolbar diagram provided by an embodiment of the present application;
FIG. 11 is a diagram of a filter editing interface provided in an embodiment of the present application;
FIG. 12 is a text editing interface diagram provided in an embodiment of the present application;
fig. 13 is a picture cropping interface diagram provided in an embodiment of the present application;
fig. 14 is a schematic structural diagram of a picture processing device according to an embodiment of the present application;
fig. 15 is a schematic hardware structure of a control device according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
To facilitate a clear description of the technical solutions of the embodiments of this application, the following briefly introduces some terms and techniques involved in the embodiments of this application:
1. terminology
In the embodiments of this application, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as being preferred over or more advantageous than other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete manner.
In the embodiments of this application, "at least one" means one or more, and "a plurality of" means two or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" and similar expressions refer to any combination of these items, including a single item or any combination of plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
In the embodiments of this application, the term "when" may refer to the instant at which a situation occurs, or to a period of time after the situation occurs; this is not specifically limited in the embodiments of this application. In addition, the display interfaces provided in the embodiments of this application are merely examples, and a display interface may include more or less content.
2. Terminal equipment
The terminal device in the embodiments of this application may be an electronic device in any form; for example, the electronic device may include a handheld device with an image processing function, a vehicle-mounted device, and the like. For example, some electronic devices are: a mobile phone, a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or another processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a 5G network, a terminal device in a future evolved public land mobile network (PLMN), or the like; the embodiments of this application are not limited in this regard.
By way of example and not limitation, in the embodiments of this application the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as smartphones, for example, various smart bracelets and smart jewelry for vital-sign monitoring.
In addition, in the embodiments of this application, the electronic device may also be a terminal device in an internet of things (IoT) system. IoT is an important component of the future development of information technology; its main technical feature is connecting things to a network through communication technologies, thereby realizing an intelligent network of human-machine interconnection and interconnection of things.
The electronic device in the embodiment of the application may also be referred to as: a terminal device, a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, a user equipment, or the like.
In the embodiments of this application, the electronic device or each network device includes a hardware layer, an operating system layer running above the hardware layer, and an application layer running above the operating system layer. The hardware layer includes hardware such as a central processing unit (CPU), a memory management unit (MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement service processing through processes, for example, a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a Windows operating system. The application layer includes applications such as a browser, an address book, word processing software, and instant messaging software.
It can be understood that in the embodiments corresponding to fig. 1 to fig. 5 and fig. 10 to fig. 13, the terminal device is described by taking a mobile phone as an example, with the user processing pictures through the gallery application on the mobile phone; this example does not limit the embodiments of this application.
By way of example, fig. 1 is a schematic diagram of picture processing in one possible implementation.
The interface a shown in fig. 1 is an interface of the gallery application (APP) of the mobile phone and may include one or more pictures. After the mobile phone receives an operation of the user selecting a picture in the gallery APP, for example picture 101, the display interface of the mobile phone switches from a in fig. 1 to b in fig. 1.
As shown in the interface b in fig. 1, the interface displayed by the mobile phone includes the picture 101, a share icon, a favorite icon, an edit icon 102, a delete icon, and a more icon; when the mobile phone receives an operation of the user selecting one of these icons, the mobile phone performs the corresponding operation on the picture 101. When the mobile phone receives the operation of the user selecting the edit icon 102, the display interface of the mobile phone switches from b in fig. 1 to c in fig. 1.
As shown in the interface c in fig. 1, the interface displayed by the mobile phone includes the picture 101 and a primary toolbar 104. The primary toolbar 104 may include a crop icon, a filter icon, an adjust icon, and a more icon 103. When the mobile phone receives an operation of the user selecting one of these icons, the mobile phone processes the picture 101 accordingly. When the mobile phone receives the operation of the user selecting the more icon 103, the display interface of the mobile phone switches from c in fig. 1 to d in fig. 1.
As shown in the interface d in fig. 1, the interface displayed by the mobile phone includes the picture 101, the primary toolbar 104, and a secondary toolbar 105. As shown in d in fig. 1, the secondary toolbar 105 may include a beautify icon, a text icon 106, a graffiti icon, a mosaic icon, a watermark icon, and a photo frame icon. In possible implementations, the secondary toolbar 105 may also include a blur icon, a retain-color icon, a sticker icon, and the like.
It will be appreciated that when the screen size of the terminal device is limited, the content of the secondary toolbar 105 may not be fully displayed; when the terminal device receives a user operation of sliding the secondary toolbar 105, the secondary toolbar 105 can display the hidden content. For example, in a mobile phone interface, the content displayed by the secondary toolbar 105 includes the beautify icon, the text icon, the graffiti icon, the mosaic icon, the watermark icon, and the photo frame icon, while the hidden content includes the blur icon, the retain-color icon, and the sticker icon; the mobile phone can display the hidden content in response to the user's operation of sliding the secondary toolbar.
When the mobile phone receives the operation of selecting the text icon 106 by the user, the display interface of the mobile phone is switched from d in fig. 1 to e in fig. 1.
As shown in the interface e in fig. 1, the interface displayed by the mobile phone includes a picture 101 and a text adjustment floating layer 108. The user can input characters, letters, numbers, symbols and the like through the keyboard in the text adjustment floating layer 108, and the mobile phone can add the content input by the user to the picture in response to the user's input operation. As shown in e in fig. 1, the mobile phone adds text 107 to the picture 101 in response to the user's operation of inputting the word "text" through the keyboard in the text adjustment floating layer 108.
In a possible implementation, the user may also select a bubble style through the bubble option in the text adjustment floating layer 108 to add a bubble to the text, and may also select a style of the text through the style option in the text adjustment floating layer 108 to modify the font, color, size and the like of the text.
When the mobile phone receives the operation of the user clicking the confirm button 109, the mobile phone completes the operation of adding the text 107 to the picture, and the display interface of the mobile phone is switched from e in fig. 1 to f in fig. 1.
As shown in the interface f of fig. 1, the interface displayed by the mobile phone includes a picture 101, text 107, a primary toolbar 104, and a secondary toolbar 105.
After adding text to the picture, the mobile phone may also support continuing to add elements in the picture, and fig. 2 is an exemplary schematic diagram of element addition in one possible implementation.
As shown in fig. 2 a, the interface displayed by the handset includes a picture 101, text 107, a primary toolbar 104, and a secondary toolbar 105. When the handset receives a user selection of the graffiti function 201 in the secondary toolbar 105, the interface displayed by the handset switches from a in fig. 2 to b in fig. 2.
As shown in b in fig. 2, the interface displayed by the mobile phone may include a picture 101, a graffiti function field 206, and a graffiti style field 207. When the mobile phone receives an operation of the user drawing graffiti on the picture 101, the graffiti 210 drawn by the user can be displayed on the picture 101.
In a possible implementation, the user may select a pen shape for the graffiti via a pen shape function in graffiti function field 206; the user may select the color of the graffiti via the color function in the graffiti function field 206; the user can adjust the thickness of the graffiti line through the thickness function in the graffiti function field 206; the user may erase the graffiti through an eraser function in graffiti function field 206.
In a possible implementation, the user may select a style of graffiti through graffiti style field 207. For example, the user may select the graffiti style as a curve, a straight line with an arrow, a rectangle, or a circle. When the handset receives the user selection of the confirmation option 208, the handset may retain the added graffiti, and the interface displayed by the handset switches from b in fig. 2 to c in fig. 2.
As shown in fig. 2 c, the picture 101 displayed by the mobile phone includes text 107 and graffiti 210. When the mobile phone receives an operation of selecting the mosaic function 202 by the user and adding the mosaic to the picture 101, the mosaic added by the user can be displayed on the picture 101. The user added mosaic may be as shown by mosaic 220 of fig. 2 d.
As shown in the interface of fig. 2 d, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, and mosaic 220. When the mobile phone receives the operation that the user selects the watermark function 203 and adds the watermark on the picture 101, the watermark added by the user can be displayed on the picture 101. The user-added watermark may be as shown by watermark 230 of e in fig. 2.
As shown in the interface e in fig. 2, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, mosaic 220 and watermark 230. In response to the operation of the user sliding the secondary toolbar, the mobile phone interface displays the hidden blurring function, color retention function and sticker function of the secondary toolbar. When the mobile phone receives the operation of the user selecting the sticker function 204 and adding a sticker to the picture 101, the sticker added by the user may be displayed on the picture 101. The sticker added by the user may be as shown by sticker 240 in f in fig. 2.
As shown in the interface f of fig. 2, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, mosaic 220, watermark 230 and sticker 240. When the mobile phone receives the operation of selecting the photo frame function 205 by the user and adding the photo frame on the picture 101, the photo frame added by the user can be displayed on the picture 101. The user added photo frame may be as shown in fig. 2 g, photo frame 250.
As shown in the interface g of fig. 2, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, mosaic 220, watermark 230, sticker 240 and photo frame 250.
After elements such as graffiti, mosaic, watermark, sticker and photo frame have been added to the picture, the mobile phone can also support cropping the picture, and fig. 3 is an exemplary schematic diagram of picture cropping in one possible implementation.
As shown in the interface a in fig. 3, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, mosaic 220, watermark 230, sticker 240 and photo frame 250. When the mobile phone receives the operation of the user selecting the cropping function 301, the display interface of the mobile phone is switched from a in fig. 3 to b in fig. 3.
As shown in b in fig. 3, the interface displayed by the mobile phone may include a picture 101, a cropping secondary toolbar 302, and a cropping adjustment toolbar 303, and the text 107, graffiti 210, mosaic 220, watermark 230, sticker 240 and photo frame 250 added to the picture are temporarily not displayed. The user can crop the picture through an operation on the crop frame 304, for example, an operation of enlarging or reducing it, and the cropping continuously acts on the picture 101.
When the mobile phone detects that the user lifts his hand off the screen of the mobile phone, the display interface of the mobile phone is switched from b in fig. 3 to c in fig. 3. When the handset receives the operation of the user selecting the confirmation option 208, the display interface of the handset is switched from c in fig. 3 to d in fig. 3.
As shown in d in fig. 3, the cropped picture 306 includes the graffiti portion, the mosaic portion and the sticker portion, while the text, watermark and photo frame have all been cropped out. After the mobile phone receives the operation of the user clicking the save button 305, the mobile phone can save the cropped picture 306.
That is, during the cropping process, the elements added to the picture, such as text, graffiti, mosaic, watermark, sticker and photo frame, are temporarily not displayed, yet the cropping may act on both the picture and the elements. When the mobile phone receives the user's operation of cropping the picture, the user cannot see the elements added to the picture, so the cropped picture may not meet the user's expectations.
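The mismatch described above can be sketched as follows. The names and the rect-based data model are illustrative assumptions for the example, not structures from the application itself:

```python
# Illustrative sketch: each element carries a rect (x, y, w, h) in picture
# coordinates. During cropping the added elements are hidden from the
# preview, yet the crop is still applied to them.

def overlaps(a, b):
    """True if two rects (x, y, w, h) intersect."""
    return (a[0] < b[0] + b[2] and b[0] < a[0] + a[2] and
            a[1] < b[1] + b[3] and b[1] < a[1] + a[3])

def cropping_preview(elements):
    # The added elements are temporarily not displayed while cropping.
    return []

def apply_crop(elements, crop_rect):
    # The crop nevertheless acts on every element: anything outside the
    # crop region is discarded, even though the user could not see it.
    return [e for e in elements if overlaps(e["rect"], crop_rect)]
```

Because `cropping_preview` shows nothing, a watermark lying outside `crop_rect` is silently removed by `apply_crop`, which is exactly the mismatch with user expectations described above.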
It will be appreciated that in the interface d in fig. 3, the user may also trigger the undo button 307, and in response to the user's undo operation, the interface displayed by the mobile phone switches from d in fig. 3 to e in fig. 3. As shown in e in fig. 3, the picture displayed by the mobile phone is rolled back to the picture 101 before cropping, and the picture 101 includes the text 107, graffiti 210, mosaic 220, watermark 230, sticker 240 and photo frame 250. Further, the mobile phone may also support adding a filter to the picture, and fig. 4 is an exemplary picture processing schematic in one possible implementation.
As shown in fig. 4 a, when the mobile phone receives the operation of selecting the filter function 401 in the first-level toolbar 104 by the user, the display interface of the mobile phone is switched from a in fig. 4 to b in fig. 4.
As shown in fig. 4 b, the interface displayed by the mobile phone includes a picture 101 and a filter function field 402. When the mobile phone receives the operation of selecting the filter 3 in the filter function field 402 by the user, the display interface of the mobile phone is switched from b in fig. 4 to c in fig. 4.
As shown in c in fig. 4, the filter 3 in the filter function field is in the selected state, the picture 101 displayed on the mobile phone interface is the picture with the filter applied, and the filter acts on the picture 101 and all the elements added to the picture 101. The elements added to the picture 101 include the text, watermark, sticker, graffiti, mosaic and photo frame.
When the mobile phone receives an operation of selecting the text 107 by the user, the mobile phone does not respond to the operation, and the user cannot modify the text which has been added.
When the mobile phone receives the operation of selecting the adjustment function 403 from the user, the display interface of the mobile phone is switched from c in fig. 4 to d in fig. 4.
As shown in d in fig. 4, the interface displayed by the mobile phone includes a picture 101 and an adjustment function field 404. When the mobile phone receives the operation of the user selecting the brightness function in the adjustment function field 404, the user can adjust the brightness of the picture through the brightness parameter adjustment field 405 in the adjustment function field 404, wherein the brightness adjustment may act on the picture 101 and all the elements added to the picture 101.
As shown in fig. 4 e, the picture 101 displayed by the mobile phone is a picture with adjusted brightness, and each element included in the picture is an element with adjusted brightness. When the mobile phone receives the operation of selecting the filter function 401 by the user, the display interface of the mobile phone is switched from e in fig. 4 to f in fig. 4.
As shown in f in fig. 4, after the user adds a filter to the picture and then performs other operations on the picture, when the mobile phone receives the operation of the user selecting the filter function 401 again, the original-image option in the filter function field is in the selected state. The gallery application cannot determine that the filter added by the user was filter 3; that is, the gallery application cannot remember the filter the user added to the picture.
That is, in the picture editing process of fig. 1 to 4, there are certain pain points. Fig. 5 shows the editing flow of fig. 1 to 4 in time sequence and illustrates the pain points in the editing process.
The mobile phone enters picture editing in response to the operation of the user selecting a picture and clicking the editing function.
Pain point one: in a scenario where the mobile phone adds user-specified text to a picture in response to the user's text adding operation, the user's expectations of the text function are: rich text styles and quick addition. However, in some implementations, the pain points of adding text are: the gallery application supports few text styles; and the text function is buried deep in the menu hierarchy, which makes it inconvenient to find and operate and lowers operating efficiency.
Pain point two: in a scenario where the mobile phone adds user-specified graffiti, mosaic, watermark or sticker to a picture in response to the corresponding adding operation, the user's expectations of these elements are: rich styles for each element and quick addition. However, in some implementations, the pain points of adding these elements are: the gallery application supports few element styles; and the function corresponding to each element is buried deep in the menu hierarchy, which makes it inconvenient to find and operate and lowers operating efficiency.
Pain point three: in a scenario of adding a photo frame around the periphery of a picture, the mobile phone responds to the user's operation of adding a photo frame, and the user's expectations of the photo frame are: rich frame styles, a quick preview of the frame effect, and quick addition. In some implementations, however, the pain points of adding a photo frame are: few frame styles are available and they are unattractive; and the aspect ratio of the picture may change after the frame is added.
Pain point four: in a scenario of adjusting the length, width and the like of a picture, the mobile phone responds to the user's cropping operation, and the user expects the cropping function to crop the picture to the target size quickly. However, in some implementations, the pain points of picture cropping are that the cropping effect cannot be previewed in real time while adjusting the size of the picture, and content beyond the user's expectations may be cropped off.
Pain point five: the mobile phone responds to the user's operations of adding a filter and adjusting the picture through the adjustment function, and the user's expectations are: rich filters, a real-time preview of the picture with the filter applied, and the ability to modify a filter that has been added or an adjustment that has been completed. However, in some implementations, the pain points of the filter function are: there are many filters and they are not classified, so the user cannot easily find a specified effect; the granularity of the adjustment bar is coarse, so the adjustment effect is not fine enough; adding a filter to the picture or adjusting the picture changes parameters such as the color, brightness and saturation of elements such as the photo frame, text, sticker, graffiti, mosaic and watermark; and a filter that has been added or an adjustment that has been completed cannot be modified.
It will be appreciated that when the mobile phone receives the user's operation of modifying a mosaic, blurring or beautification added to the picture, the user expects to be able to modify and edit these elements repeatedly, but the pain point in some implementations is that these elements can no longer be modified.
Pain point six: when the mobile phone receives the user's operation of adjusting the text, the user expects to be able to modify and edit the text repeatedly, but the pain point in some implementations is that neither the style nor the content of the text can be modified any more.
It will be appreciated that when the mobile phone receives the user's operation of altering a graffiti, watermark, sticker or photo frame, the user expects to be able to modify and edit the element repeatedly, but the pain point in some implementations is that the element can no longer be altered.
Pain point seven: when the mobile phone receives the user's operation of clicking save, the processed picture can be exported. The user expects to export the edited picture quickly. However, in some implementations, the pain point is that in a second-level or third-level toolbar interface of an editing function, after editing the picture, the user must first click the "confirm" button to return to the first-level toolbar and then click the "save" button of the first-level toolbar interface before the processed picture is saved and exported.
In view of this, the embodiment of the present application provides a picture processing method:
for pain point one, pain point two and pain point seven among the above pain points, the embodiment of the application can provide a first-level toolbar containing all picture processing functions, so that the user can select the desired function in the first-level toolbar without expanding a second-level toolbar, and the processed picture can further be saved with one tap. In this way, the user can conveniently find each picture processing function, the frequency of page jumps is reduced, and the user's operations become more convenient.
For pain point four among the above pain points, when the user crops the picture, one part of the elements on the picture is temporarily not displayed and is not cropped, and after cropping, this part of the elements can be displayed on the cropped picture according to the size, position and the like of the cropped picture; the other part of the elements is displayed on the picture, so that the user can intuitively see these elements during cropping, and they are cropped together with the picture. In this way, the flexibility and convenience of picture processing can be improved, the harmony of the picture can be improved, and the user experience can be improved.
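A minimal sketch of this two-class cropping behavior, assuming an illustrative rect-based data model (the application does not prescribe a concrete implementation):

```python
def intersect(a, b):
    """Intersection of two rects (x, y, w, h), or None if they do not overlap."""
    x, y = max(a[0], b[0]), max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    return (x, y, x2 - x, y2 - y) if x2 > x and y2 > y else None

def crop_with_two_classes(elements, crop_rect):
    """First-class elements (e.g. text, watermark, sticker, photo frame) are
    hidden during cropping, survive it, and are re-laid-out on the result;
    second-class elements (e.g. graffiti, mosaic) stay visible and are
    cropped together with the picture."""
    result = []
    for e in elements:
        if e["cls"] == "first":
            result.append(e)                  # re-displayed after cropping
        else:
            clipped = intersect(e["rect"], crop_rect)
            if clipped is not None:           # fully outside -> cropped away
                result.append({**e, "rect": clipped})
    return result
```

Under this split, a graffiti stroke that the crop frame cuts through is clipped exactly as the user sees it, while text or a watermark outside the frame is never silently lost.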
For pain point five and pain point six among the above pain points, the picture processing method provided by the embodiment of the application can remember the filter added by the user; can modify a filter, adjustment, mosaic, blurring or beautification that has been added to the picture; can also modify added text, stickers, graffiti, watermarks, photo frames and the like; and after a filter is added to the picture or the picture is adjusted, the filter or adjustment does not act on added elements such as the photo frame, text, sticker, graffiti, mosaic and watermark. In this way, the user can process the picture conveniently, the user's operations become more convenient, the picture processing result better meets the user's expectations, and the user experience is improved.
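One way to obtain this "remembered", re-editable behavior is to record each edit as data instead of baking it into the pixels. The sketch below assumes such a non-destructive model; the class and method names are illustrative, not from the application:

```python
class EditSession:
    """Each edit kind holds one record; re-applying a kind replaces the
    existing record, so every edit stays individually re-editable."""

    def __init__(self):
        self.edits = {}

    def apply(self, kind, value):
        self.edits[kind] = value      # e.g. apply("filter", "filter 3")

    def remove(self, kind):
        self.edits.pop(kind, None)    # an added edit can be taken back

    def selected(self, kind):
        # Lets the filter panel re-highlight the remembered choice
        # instead of falling back to "original".
        return self.edits.get(kind)
```

With such a record, reopening the filter panel after a brightness adjustment can still report that filter 3 is selected, which is precisely the memory that pain point five says some implementations lack.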
In order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application is described below.
Fig. 6 is a schematic structural diagram of a terminal device 100 according to an embodiment of the present application.
The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the terminal device 100. In other embodiments of the present application, the terminal device 100 may include more or fewer components than illustrated, some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110.
The wireless communication function of the terminal device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The terminal device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area.
The terminal device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The software system of the terminal device 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the terminal device 100 is illustrated.
Fig. 7 is a schematic software architecture diagram of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, an Android runtime (Android runtime) and system library layer, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 7, the application package may include applications such as camera, gallery, calendar, phone call, WLAN, bluetooth, music, video, short message, etc. The image processing method provided by the embodiment of the application can be implemented through a gallery.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 7, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the terminal device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Taking the case where the picture processing method of the embodiment of the application is implemented by a gallery application as an example, the gallery application can be divided, according to its different processing functions, into the logic layers shown in fig. 8 and the physical layers shown in fig. 9.
Fig. 8 is a schematic diagram of a logic layer according to an embodiment of the present application.
As shown in fig. 8, the logic layer may include, from top to bottom, a photo frame, a color class function, an overlay class material, an effect class material, and a content layer.
The photo frame is displayed independently around the periphery of the picture, so the photo frame is not affected by other materials or functions. For example, when the mobile phone sequentially receives the user's operation of adding a photo frame to the picture and the user's operation of adding a filter to the picture, the color of the photo frame is not affected by the filter and does not change.
The color class function may include color retention, and the object on which the color class function acts is all the color content in the picture.
The overlay material may include text, watermarks, stickers, and/or graffiti, the overlay material acting on a portion of the picture.
The effect class material may include mosaic, blurring, filter, beauty, and/or adjustment. The object of the adjustment function is the picture or video itself; for example, the user can adjust the brightness, saturation and the like of the picture itself through the adjustment function. Except for the adjustment function, an upper-layer material among the effect class materials may act on a lower-layer material. For example, when the mobile phone receives the user's operation of adding a mosaic to a picture to which a filter has been added, the mosaic acts on the picture with the filter applied.
The content layer may include a picture, canvas. Wherein, the picture refers to a picture itself used for picture processing.
In the logic layers shown in fig. 8, the material or function of an upper layer may affect the material or function of a lower layer, while the material or function of a lower layer does not affect the material or function of an upper layer, so a consistent picture processing result is obtained regardless of the order of the user's operations. Taking the filter and the adjustment as an example, the mobile phone sequentially receives the user's operation of adding a filter to the picture and the user's operation of adjusting the picture through the adjustment function; the adjustment function acts on the picture layer among the logic layers rather than on the picture with the filter added, so the processing result is consistent with the effect of the user first adjusting the picture through the adjustment function and then adding the filter.
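The order-independence described above can be sketched by compositing recorded operations in a fixed bottom-up layer order rather than in the order the user performed them. The layer names loosely follow fig. 8, but the exact list and the function are illustrative assumptions:

```python
# Bottom-up layer order, loosely following fig. 8 (content at the bottom,
# photo frame at the top). The exact ordering shown here is illustrative.
LAYER_ORDER = ["content", "adjust", "filter", "mosaic", "blur",
               "graffiti", "crop", "text", "watermark", "sticker",
               "color", "frame"]

def composite(recorded_ops):
    """recorded_ops maps a layer kind to its operation, in whatever order
    the user happened to record them; compositing always walks the fixed
    layer order, so the result does not depend on the recording order."""
    return [kind for kind in LAYER_ORDER if kind in recorded_ops]
```

Whether the user adjusts first or adds the filter first, `composite` always applies the adjustment to the lower layer before the filter, matching the filter-and-adjustment example above.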
As shown in fig. 8, to match the user's actual needs, the cropping function is placed below the text, watermark, and sticker layers and above the graffiti layer. When the mobile phone receives a cropping operation from the user, the cropping applies to the graffiti layer, the effect-class materials, and the content layer below it, and does not apply to the text, watermark, and sticker layers, the color-class function, or the photo frame above it.
That is, on the one hand, the graffiti and mosaic that the user adds to a picture may be strongly related to the picture content. For example, if the user adds graffiti to circle a certain word in the picture, or adds a mosaic to cover a face in the picture, the user does not want the relative position of the graffiti or mosaic and the picture to change during cropping. The graffiti and mosaic are therefore placed below the cropping layer, so that they remain displayed on the picture throughout the cropping process with their relative positions unchanged; the user can see them intuitively and crop according to actual needs. On the other hand, the user usually does not want the text, watermark, and/or sticker added to the picture to be partially or completely cut away during cropping, so these are placed above the cropping layer: they are temporarily hidden during cropping, the cropping function does not apply to them, and they are redisplayed on the cropped picture after cropping completes.
It will be appreciated that the content included in each layer may be adapted to actual requirements. For example, if a user adds text to a picture to label the names of objects in it and wants the relative positions of the text and the picture to remain unchanged during cropping, the text can be moved below the cropping layer to meet that requirement. The embodiment of the present application does not specifically limit this.
Fig. 9 is a schematic diagram of a physical layer according to an embodiment of the present application. The physical layer is a real layer sequence corresponding to each picture processing function in the picture processing process.
Corresponding to the picture editing process of fig. 1-4, the order of the physical layers, from top to bottom, may include: the photo frame physical layer, the color retention physical layer, the overlay material physical layer, the mosaic physical layer, the blurring physical layer, the filter physical layer, the beautify physical layer, the adjustment physical layer, the picture physical layer, and the canvas physical layer.
The overlay materials may include text, graffiti, watermark, and/or sticker, etc. Corresponding to the picture editing process of fig. 1-4, the overlay-material physical layers may include, from top to bottom: a sticker layer, a watermark layer, a text layer, and a graffiti layer (the graffiti "123").
The order of the physical layers may be relatively fixed. One or more real layers may be added to each physical layer.
Taking the mosaic physical layer as an example: when a first mosaic is added to the picture in response to a user operation, a first real mosaic layer is added to the mosaic physical layer. Then, when text is added to the picture in response to a user operation, a real text layer is added to the text physical layer. After the text is added, the user returns to the mosaic function, and when a second mosaic is added to the picture in response to a user operation, a second real mosaic layer is added to the mosaic physical layer; that is, both the first mosaic and the second mosaic are located in the mosaic physical layer.
Similarly, for text, graffiti, watermarks, and stickers added at different times, a new real layer can be created in the overlay-material physical layers for each element, in the order in which the user added them. For example, if a first graffiti is added to the picture in response to a user operation, a first real graffiti layer is added to the overlay-material physical layer; then, if text is added to the picture in response to a user operation, a real text layer is added to the overlay-material physical layer; after the text is added, the user returns to the graffiti function, and if a second graffiti is added in response to a user operation, a second real graffiti layer is added to the overlay-material physical layer. That is, from bottom to top, the overlay-material physical layer contains the first graffiti layer, the text layer, and the second graffiti layer in that order.
It will be appreciated that a single real graffiti layer is created for the first graffiti, regardless of how much content it contains. For example, if the content of the first graffiti is "123", then although "1", "2", and "3" correspond to three different strokes, "123" corresponds to one real layer in the graffiti physical layer rather than three. The second graffiti is processed similarly and will not be described again.
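The stacking behaviour of real layers inside a physical layer can be sketched with a small data structure. The class and layer names below are assumptions for illustration, not the actual implementation:

```python
# Minimal sketch (structure assumed) of physical layers: each fixed
# physical layer holds an ordered stack of "real" layers, appended in
# the order the user adds elements.

from collections import defaultdict

class PhysicalLayers:
    def __init__(self):
        self.stacks = defaultdict(list)   # physical layer -> real layers

    def add(self, physical, element):
        """Create one new real layer for the element, on top of its stack."""
        self.stacks[physical].append(element)

layers = PhysicalLayers()
layers.add("overlay", "graffiti#1")   # first graffiti ("123" is one layer,
                                      # regardless of how many strokes)
layers.add("overlay", "text#1")       # user switches to the text function
layers.add("overlay", "graffiti#2")   # user returns to graffiti

# Bottom-to-top order follows the order of addition.
assert layers.stacks["overlay"] == ["graffiti#1", "text#1", "graffiti#2"]
```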
Based on the logical layers and the physical layers, the embodiment of the present application can process the elements in each layer separately, so that some elements are displayed and others are hidden during cropping, and the hidden elements are redisplayed on the cropped picture afterwards, which solves the problem of pain point four. When an effect-class material is added to the picture, it does not act on elements such as the photo frame, text, stickers, graffiti, mosaic, and watermark, and an added effect-class material can still be modified, which solves the problem of pain point five. Elements such as photo frames, text, stickers, graffiti, and watermarks that have already been added to the picture can also be edited and modified again, which solves the problem of pain point six. The specific solutions are described in detail in the following embodiments and are not repeated here.
The interfaces that may be involved in the picture processing method of the embodiments of the present application are described below with reference to the order in which the functions are used in the gallery application of fig. 1-4 and the order of the pain points.
Illustratively, as shown in fig. 10, the embodiment of the present application provides a toolbar schematic addressing the overly deep function hierarchy of pain points one and two, and the lack of one-tap saving of pain point seven.
As shown in a of fig. 10, the toolbar 1001 may include all of the picture processing functions. When the screen size of the terminal device is limited and only part of the content of the toolbar 1001 can be displayed, the toolbar displays the other, hidden functions when the terminal device receives an operation by the user to slide the toolbar 1001.
As shown in b of fig. 10, the picture processing functions included in the toolbar 1001 may include: cropping, filter, adjustment, beautify, graffiti, text, mosaic, watermark, photo frame, blurring, color retention, and sticker. The order of the picture processing functions is not fixed; for the user's convenience, the functions may be arranged from high to low by the user's frequency of use, which the embodiment of the present application does not specifically limit.
It is to be appreciated that more or fewer functions may be included in the toolbar 1001, which embodiments of the present application are not specifically limited.
It will be appreciated that, compared with the primary toolbar 104 and the secondary toolbar 105 shown in a of fig. 1, the toolbar 1001 may include all of the functions of the primary toolbar 104 and all of the functions of the secondary toolbar 105. When the mobile phone receives the user's selection in the toolbar 1001 of a function originally contained in the secondary toolbar, the displayed interface does not need to jump; after the mobile phone processes the picture according to the selected function, it can save the processed picture in response to the user selecting a save button, achieving one-tap saving.
Compared with pain points one, two, and seven, all of the picture processing functions are placed in a single toolbar, and the user can select the desired function with a sliding operation without tapping extra buttons or expanding a secondary toolbar. The user can thus find each function easily, the number of page jumps is reduced, and the processed picture can be saved with one tap, making operation more convenient.
For example, for the problem in pain point five that it is inconvenient for the user to find a specified filter effect, fig. 11 is a filter editing interface diagram provided by an embodiment of the present application.
As shown in the interface a of fig. 11, the interface displayed by the mobile phone includes a picture 101 and a toolbar 1001. When the mobile phone receives the operation of selecting the filter function 401 in the toolbar 1001 by the user, the display interface of the mobile phone is switched from a in fig. 11 to b in fig. 11.
As shown in interface b of fig. 11, the interface displayed by the mobile phone includes a title bar 601, a picture display area 602, and a filter function area 1101, and the user can preview the output picture of the picture processing in real time through the picture display area 602. The filter function area 1101 includes a content area 603 and a parameter adjustment field 604; the parameter adjustment field 604 is finer-grained than the filter parameter adjustment field in fig. 4, enabling smooth adjustment of the parameters. When the mobile phone receives the user's operation of selecting the filter 3 option in the filter function area, the display interface of the mobile phone switches from b in fig. 11 to c in fig. 11.
As shown in interface c of fig. 11, the filter 3 option is in the selected state. After the filter is applied, parameters such as the color and brightness of the picture 101 may change, while the corresponding parameters of the elements added on the picture 101 do not change. In the logical layers shown in fig. 8, the filter is above the picture and below the text, watermark, sticker, graffiti, and mosaic, so the filter adjusts only the picture and not the text, watermark, sticker, graffiti, or mosaic added on it.
When the mobile phone receives the operation of selecting the adjustment function in the toolbar 1001 by the user, the display interface of the mobile phone is switched from c in fig. 11 to d in fig. 11.
As shown in interface d of fig. 11, the interface displayed by the mobile phone includes the picture and an adjustment function field 1102. When the mobile phone receives the user's operation of selecting the brightness function in the adjustment function field 1102 and adjusting the brightness of the picture through the brightness adjustment field 1103, the display interface of the mobile phone switches from d in fig. 11 to e in fig. 11. The brightness adjustment field 1103 is finer-grained than the brightness parameter adjustment field 405 in fig. 4, enabling smooth adjustment of the parameter.
As shown in interface e of fig. 11, the interface displayed by the mobile phone includes the brightness-adjusted picture 101 and the toolbar 1001. The brightness adjustment applies only to the picture 101 itself and not to the elements added on it. In the logical layers shown in fig. 8, the adjustment is above the picture and below the text, watermark, sticker, graffiti, and mosaic, so the adjustment applies only to the picture and not to the text, watermark, sticker, graffiti, or mosaic added on it.
When the mobile phone receives the operation of selecting the filter function 401 by the user, the display interface of the mobile phone is switched from e in fig. 11 to f in fig. 11.
As shown in the interface f of fig. 11, the interface displayed by the mobile phone includes a filter function region 1101, and the option of the filter 3 in the filter function region 1101 is in a selected state.
That is, after the user adds a filter to the picture and then performs other operations on it, when the mobile phone receives the user's operation of returning to the filter function, the gallery application can remember the filter the user added to the picture, for example the filter 3, and display it in the selected state in the filter function field.
In a possible implementation, the mobile phone may adjust parameters of the filter 3 in response to a user editing the filter, or switch the filter 3 to another filter, such as the filter 4. Thus, after the user has added the filter and performed other operations, when the user selects the filter function again, the filter can be modified and edited again, that is, continuous editing of the filter can be realized.
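The remembered-selection behaviour can be sketched as a small piece of editor state. The class and filter names below are hypothetical, meant only to illustrate the "continuous editing" idea:

```python
# Hypothetical sketch of the "filter memory" behaviour: the editor keeps
# the last filter applied so re-entering the filter function restores the
# selection, which can then be re-edited or switched.

class FilterEditor:
    def __init__(self):
        self.selected = None      # e.g. "filter3"
        self.intensity = 1.0

    def apply(self, name, intensity=1.0):
        self.selected, self.intensity = name, intensity

    def reopen(self):
        """Returning to the filter function shows the remembered filter."""
        return self.selected

editor = FilterEditor()
editor.apply("filter3")
# ... the user adjusts brightness, adds text, etc. ...
assert editor.reopen() == "filter3"   # filter3 shown selected again
editor.apply("filter4", 0.8)          # continuous editing: switch filters
assert editor.reopen() == "filter4"
```

Because the selection survives the intervening operations, the user never has to compare filters by eye to rediscover which one was applied.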
Similarly, in the embodiment of the present application, the gallery application may continuously edit other effect materials shown in fig. 8, such as mosaics, blurring, beautifying, adjusting, and the like. And will not be described in detail herein.
Compared with pain point five, in the embodiment of the present application, after the user adds a filter and performs other operations, the gallery application remembers the previously added filter when the user selects the filter function again; the user can locate the previously selected filter without comparing or searching for filters, and can modify and edit it. This makes picture processing more convenient and improves the user experience.
For example, for the problem in pain point six that the style or content of added text can no longer be modified, fig. 12 is a text editing interface diagram provided by an embodiment of the present application.
As shown in interface a of fig. 12, the interface displayed by the mobile phone includes the picture 101, the toolbar 1001, and a filter function bar 402. When the mobile phone receives the user's operation of selecting the text 107, the display interface of the mobile phone switches from a in fig. 12 to b in fig. 12. The operation of selecting the text 107 may be a single tap, a double tap, a long press, or the like on the text 107, which the embodiment of the present application does not limit.
As shown in fig. 12 b, the interface displayed by the mobile phone includes a picture 101, a text selection box 1201, and a text toolbar 1202. When the mobile phone receives the operation of selecting the edit function in the text toolbar 1202 by the user, the display interface of the mobile phone is switched from b in fig. 12 to c in fig. 12.
As shown in c of fig. 12, the interface displayed by the mobile phone includes the picture 101, a text adjustment floating layer 108, a confirm button 109, text function options 1204, and a title bar 1205. In a possible implementation, the user inputs "reading" in the text adjustment floating layer 108 to change the content of the text 107 from "text" to "reading"; in response to the user selecting the confirm button 109, the display interface of the mobile phone switches from c in fig. 12 to d in fig. 12, and the picture with the modified text is shown in d of fig. 12.
That is, after the user adds text to the picture and then performs other operations on it, when the mobile phone receives the user's operation of returning to the text function, the gallery application can modify and edit the added text again; in other words, continuous editing of text is achieved.
Similarly, in the embodiment of the present application, the gallery application may continuously edit other overlapped materials and photo frames shown in fig. 8, such as a sticker, a watermark, a graffiti, and the like. And will not be described in detail herein.
Therefore, compared with pain point six, the added text, stickers, watermarks, graffiti, and photo frame can be modified and edited again in the embodiment of the present application, which makes picture processing more convenient for the user and improves the user experience.
For example, to address the problem in pain point four that content the user did not intend may be cut away when cropping a picture, fig. 13 is a cropping schematic provided by an embodiment of the present application.
As shown in the interface of fig. 13 a, the picture 101 displayed by the mobile phone includes text 107, graffiti 210, mosaic 220, watermark 230, sticker 240, and photo frame 250. When the mobile phone receives the operation of selecting the clipping function 301 by the user, the display interface of the mobile phone is switched from a in fig. 13 to b in fig. 13.
As shown in b of fig. 13, the interface displayed by the mobile phone may include the picture, a cropping secondary toolbar 302, a cropping adjustment toolbar 303, and a crop box 304, where the graffiti 210 and the mosaic 220 on the picture are displayed, and the text 107, the watermark 230, the sticker 240, and the photo frame 250 on the picture are temporarily hidden. The user can crop the picture through operations on the crop box 304, such as enlarging or reducing it. When the user completes the crop and the mobile phone detects that the user has lifted their finger off the screen, the display interface of the mobile phone switches from b in fig. 13 to c in fig. 13.
In the logical layers shown in fig. 8, the cropping function is above the graffiti and the mosaic and below the text, the watermark, and the sticker, so the cropping function can crop only the graffiti and the mosaic, and cannot crop the text, watermark, or sticker added on the picture.
As shown in the interface c in fig. 13, the interface displayed by the mobile phone may include a picture 1301 and a save button 305, where the picture 1301 is a picture cut out of the picture 101. The picture 1301 includes text 107, a portion of the content of the graffiti 210, a mosaic 220, a watermark 230, a decal 240, and a photo frame 250.
That is, after the cropping operation is completed, content that was temporarily hidden, such as the text, watermark, sticker, and photo frame, is redisplayed on the picture.
It can be understood that the text, watermark, and sticker can automatically adapt to the frame of the cropped picture, and the difference between the position of the text, watermark, or sticker on the cropped picture and its position on the picture before cropping is smaller than a preset value. Taking the watermark as an example: in the original picture the watermark is at 90% of the picture horizontally and 95% vertically, and in the cropped picture the watermark is still at 90% of the cropped picture horizontally and 95% vertically.
It can be understood that the photo frame can be automatically adjusted to the size of the cropped picture. For example, if the length and width of the original picture are 30 mm x 40 mm and the length and width of the cropped picture are 25 mm x 30 mm, the size of the photo frame is automatically adjusted to the size of the cropped picture so that the picture fills the whole photo frame.
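Both adaptation rules, keeping an element at the same fractional position and resizing the frame, can be sketched in a few lines. The border width and the exact formulas below are illustrative assumptions:

```python
# Illustrative sketch (values assumed) of post-crop adaptation: elements
# above the crop layer keep their fractional position on the new picture,
# and the photo frame is resized to the cropped dimensions.

def adapt_position(frac_x, frac_y, new_w, new_h):
    """Keep the element at the same fractional position on the cropped picture."""
    return (frac_x * new_w, frac_y * new_h)

def adapt_frame(new_w, new_h, border=2):
    """Resize the photo frame so the cropped picture fills it."""
    return (new_w + 2 * border, new_h + 2 * border)

# Watermark at 90% horizontally, 95% vertically; picture cropped
# from 30 x 40 mm down to 25 x 30 mm.
x, y = adapt_position(0.90, 0.95, 25, 30)
assert (x, y) == (22.5, 28.5)           # still 90% / 95% of the new picture
assert adapt_frame(25, 30) == (29, 34)  # frame hugs the cropped picture
```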
When the mobile phone receives an operation of selecting the save button 305 by the user, the mobile phone can save the picture 1301.
In another possible implementation, when the mobile phone receives the operation of selecting the clipping function 301 by the user, the display interface of the mobile phone may be switched from a in fig. 13 to d in fig. 13.
As shown in fig. 13 d, the interface displayed by the mobile phone may include a picture, a cropping secondary toolbar 302, a cropping adjustment toolbar 303, and a cropping frame 304, wherein the text 107 and the mosaic 220 on the picture are displayed, and the graffiti 210, the watermark 230, the sticker 240, and the photo frame 250 on the picture are not displayed temporarily.
That is, corresponding to the logical layers in fig. 8, when the cropping function is above the text and the mosaic and below the graffiti, the watermark, and the sticker, the text and the mosaic are displayed and can be cropped during cropping, while the graffiti, watermark, and sticker are temporarily hidden and cannot be cropped. The user can crop the picture through operations on the crop box 304, such as enlarging or reducing it. When the user completes the crop and the mobile phone detects that the user has lifted their finger off the screen, the display interface of the mobile phone switches from d in fig. 13 to c in fig. 13.
Therefore, compared with pain point four, during cropping the user can intuitively see elements such as the mosaic and graffiti added on the picture, which effectively prevents elements the user did not intend from being cut away; elements such as text, watermarks, stickers, and the photo frame are temporarily hidden during cropping and automatically adapt their positions to the cropped picture once cropping completes. This improves the convenience and flexibility of picture processing, lets the user preview the cropping effect in real time, makes the picture processing result better match the user's expectations, and improves the user experience.
Based on the descriptions of the foregoing embodiments, for better understanding of the embodiments of the present application, a specific explanation of a picture processing method is provided below.
The specific steps of the picture processing method provided by the embodiment of the application may include:
S1401, the terminal device displays a gallery interface, and the terminal device receives a ninth operation.
In this embodiment, the gallery interface includes one or more pictures. Wherein the gallery interface may correspond to the interface shown in fig. 1 a.
In the embodiment of the present application, the ninth operation may be used to select the first picture in the gallery interface. The ninth operation may correspond to the operation of selecting the picture 101 in the gallery interface by the user received by the terminal device in a of fig. 1.
Therefore, the user can process the picture through the gallery application in the terminal equipment without importing the picture into third-party picture processing software for processing, so that the picture processing is more convenient and quicker.
S1402, the terminal device displays the first picture.
Wherein the first picture may correspond to picture 101 in fig. 1.
In a possible implementation, the terminal device displays the first picture in response to the ninth operation. For example, the terminal device receives an operation of selecting the picture 101 in the gallery interface by the user, and in response to the operation, the terminal device displays the picture 101.
S1403, the terminal equipment receives a second operation aiming at the first picture in the gallery application; in response to the second operation, the terminal device displays a toolbar.
In this embodiment of the present application, the second operation may be used to invoke a picture editing function of the gallery application, and the second operation may correspond to the operation shown in b in fig. 1: the user received by the handset selects the edit icon 102.
In response to the second operation, the interface of the terminal device may display a toolbar, wherein the toolbar includes a primary toolbar and does not include a secondary toolbar. The toolbar may correspond to toolbar 1001 shown in fig. 10 a.
Therefore, the user can select the function to be used in the toolbar, the secondary toolbar does not need to be unfolded, the user can conveniently find each function, the interface jumping times are reduced, and the operation of the user is more convenient.
Optionally, the primary toolbar includes one or more of the following functions: cropping, filter, adjustment, beautify, graffiti, text, mosaic, watermark, photo frame, blurring, color retention, or sticker.
Wherein the primary toolbar may correspond to toolbar 1001 shown in fig. 10 b.
It may be appreciated that the primary toolbar may further include a picture processing function such as matting and background, which is not specifically limited in the embodiment of the present application.
Therefore, the picture processing functions are all placed in one toolbar, a user can select the functions to be used in the first-level toolbar, the user can conveniently find each function, the interface jumping times are reduced, and the user can operate more conveniently.
Optionally, a portion of the functions is displayed in the primary toolbar. When a sliding operation is received on the primary toolbar, the functions hidden in the sliding direction are displayed based on the sliding direction of the sliding operation.
When the interface size of the terminal device is limited, the primary toolbar may display only a portion of the functions, and the user can reveal the hidden functions by sliding the primary toolbar. For example, if the functions currently displayed by the primary toolbar include cropping, filter, adjustment, beautify, and graffiti, then when the terminal device receives the user's operation of sliding the primary toolbar to the left, the terminal device can display the mosaic, text, and other hidden functions.
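The sliding behaviour can be sketched as a visible window over the full function list. The function names, window width, and one-step slide granularity below are assumptions for illustration:

```python
# Minimal sketch (names and step size assumed) of a sliding toolbar: a
# window over the full function list shifts with the slide direction to
# reveal hidden functions.

FUNCTIONS = ["crop", "filter", "adjust", "beautify", "graffiti",
             "mosaic", "text", "watermark", "frame", "blur",
             "color_retention", "sticker"]

def visible(start, width=5):
    return FUNCTIONS[start:start + width]

def slide(start, direction, width=5):
    """Sliding left reveals later functions; sliding right, earlier ones."""
    step = 1 if direction == "left" else -1
    return max(0, min(start + step, len(FUNCTIONS) - width))

start = 0
assert visible(start) == ["crop", "filter", "adjust", "beautify", "graffiti"]
start = slide(start, "left")   # mosaic scrolls into view
assert visible(start) == ["filter", "adjust", "beautify", "graffiti", "mosaic"]
```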
Therefore, the picture processing functions are all placed in one toolbar, a user can select the functions to be used through sliding operation, the secondary toolbar does not need to be unfolded, the user can conveniently find out the functions, the interface jump times are reduced, and the user can operate more conveniently.
S1404, the terminal device adds a first type element for the first picture.
The first type of element may be understood as an element located at an upper layer of the clipping layer in the logic layer shown in fig. 8, and the clipping operation does not act on the first type of element.
S1405, the terminal device adds a second type element for the first picture.
The second type element may be understood as an element below the clipping layer in the logical layers shown in fig. 8, and the clipping operation acts on the second type element.
S1406, the terminal equipment receives a cutting operation, and the cutting operation continuously acts on the interface of the terminal equipment.
In this embodiment of the present application, the cropping operation may be used to implement cropping of a portion of the content in the picture, and the cropping operation may correspond to the cropping operation of the picture 101 shown in fig. 13. The user can adjust the clipping region by a continuous operation on the clipping frame, for example, a continuous touch of the clipping frame displayed on the mobile phone screen.
S1407, in response to the cropping operation, the terminal device displays the first picture, displays the second type element, displays the cropping area corresponding to the cropping operation, and does not display the first type element.
The cropping area corresponding to the cropping operation may be as shown in b in fig. 13 or d in fig. 13, and in fig. 13, the cropping area is an area of the picture selected by the cropping frame 304.
Optionally, the first type of element includes one or more of: text, watermarks or stickers; the second class of elements includes one or more of the following: graffiti or mosaic.
Corresponding to b in fig. 13, during cropping, the first type of element is not displayed, i.e. text, watermark and/or decal is not displayed, and the second type of element is displayed, i.e. graffiti and/or mosaic is displayed.
Optionally, the first type of element includes one or more of: graffiti, watermark or decal; the second class of elements includes one or more of the following: text or mosaic.
Corresponding to d in fig. 13, during cropping, the first type of element is not displayed, i.e. graffiti, watermark and/or decal is not displayed, and the second type of element is displayed, i.e. text and/or mosaic is displayed.
Therefore, the content contained in the first type of elements and the content contained in the second type of elements can be adjusted according to the actual demands of the users, and in the cutting process, some elements are displayed and other elements are not displayed, so that the flexibility of picture processing is improved, and the actual demands of the users are met.
S1408, when the clipping operation is finished, the terminal device obtains a second picture, where the second picture includes: the first type element, the content of the first picture covered by the target clipping region and the content of the second type element covered by the target clipping region; the target clipping region is a clipping region corresponding to the clipping operation ending time.
In this embodiment, the second picture is the cropped picture and may be understood as the picture 1301 in fig. 13. After the cropping operation ends, the first type elements are redisplayed in full because the cropping operation does not act on them. The cropping operation acts on the second type elements, so only the content of a second type element covered by the target cropping region is displayed; that content may be part or all of the second type element, and the target cropping region may also cover none of it.
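Clipping a second type element to the target cropping region can be sketched with axis-aligned rectangle intersection, a simplification of whatever geometry the actual implementation uses:

```python
# Illustrative sketch (geometry simplified to axis-aligned rectangles) of
# composing the second picture: first type elements are kept whole, while
# each second type element is clipped to the target cropping region.

def intersect(a, b):
    """Intersection of two (x1, y1, x2, y2) rectangles, or None."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

crop = (10, 10, 60, 60)                  # target cropping region

# A mosaic (second type) partly inside the region is clipped...
assert intersect((0, 0, 30, 30), crop) == (10, 10, 30, 30)
# ...and one entirely outside the region does not appear at all.
assert intersect((70, 70, 90, 90), crop) is None
```

The three cases in the paragraph above map directly onto the intersection result: all of the element (rectangle fully inside), part of it (partial overlap), or none of it (`None`).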
After the terminal device detects that the user has stopped the cropping operation, the terminal device completes the cropping of the picture. The operation of stopping the cropping may include the user lifting a finger off the touch screen, the user releasing a mouse button or other input device, and so on, which is not specifically limited.
During cropping, the first type elements are temporarily hidden while the second type elements remain displayed, so the user can intuitively see the second type elements added to the picture and is effectively prevented from cropping them away unintentionally; after cropping is completed, the first type elements, unaffected by the crop, are displayed again. In this way the user can preview the cropping effect in real time, the flexibility of picture processing is improved, the processing result better matches the user's expectations, and the user experience is improved.
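The two-layer behaviour described above can be sketched as follows. This is a hypothetical illustration only, not the patent's actual implementation: the `Rect`, `clip_to_region`, and `crop_picture` names and the rectangle-based geometry are assumptions. Second-type elements are clipped against the target cropping region, while first-type elements pass through the crop unchanged.

```python
# Hypothetical sketch of the two-layer cropping model: first-type elements
# (graffiti, watermark, sticker) sit above the crop and survive whole, while
# second-type elements (text, mosaic) are clipped to the target region.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Rect:
    x: int  # top-left corner in picture coordinates
    y: int
    w: int
    h: int

def clip_to_region(elem: Rect, region: Rect) -> Optional[Rect]:
    """Return the part of elem covered by region, or None if disjoint."""
    left, top = max(elem.x, region.x), max(elem.y, region.y)
    right = min(elem.x + elem.w, region.x + region.w)
    bottom = min(elem.y + elem.h, region.y + region.h)
    if right <= left or bottom <= top:
        return None  # the crop covers none of this element
    return Rect(left, top, right - left, bottom - top)

def crop_picture(first_type: List[Rect], second_type: List[Rect],
                 region: Rect) -> Tuple[List[Rect], List[Rect]]:
    """Compose the second picture: first-type elements are kept whole
    (the crop does not act on their layer); second-type elements keep
    only the content covered by the target cropping region."""
    kept = list(first_type)
    clipped = [c for e in second_type
               if (c := clip_to_region(e, region)) is not None]
    return kept, clipped
```

Note that, consistent with the text above, a second-type element wholly outside the target region simply disappears from the result, while first-type elements are returned untouched regardless of the region.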
Optionally, the difference between the position of the first type element in the second picture and its position in the first picture is smaller than a preset value.
After the cropping operation ends, the first type element is redisplayed, and its position in the second picture can be set according to its position in the first picture. For example, suppose the preset value is a position difference of 5%, and the first type element is located at 30% of the first picture's width horizontally and 40% of its height vertically; its position in the second picture may then be 33% horizontally and 42% vertically. The difference between the element's position before cropping and its position in the cropped picture is smaller than the preset value.
This improves the visual coordination between the cropped picture and the added elements, improving the user experience.
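A minimal numeric sketch of this constraint, under the assumption that positions are expressed as fractions of the picture's width and height; the function name and the clamping strategy are ours, not the patent's:

```python
# Hypothetical sketch: keep a re-displayed first-type element's relative
# position within the preset tolerance of its pre-crop relative position.
def reposition(old_rel: float, proposed_rel: float, tol: float = 0.05) -> float:
    """Clamp the proposed relative position (a fraction of picture size)
    so that it differs from the pre-crop relative position by at most tol."""
    lo, hi = old_rel - tol, old_rel + tol
    return max(lo, min(proposed_rel, hi))
```

With the 5% preset value from the example above, an element originally at 30% horizontally may land anywhere in [25%, 35%] of the cropped picture, so 33% is accepted as-is while a layout that would push it to 50% is pulled back to 35%.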
S1409: the terminal device receives a third operation; in response to the third operation, the terminal device saves the second picture.
In this embodiment of the present application, the third operation may be used to save the processed picture; it may correspond to the mobile phone in c of fig. 13 receiving the user's selection of the save button 305.
When the terminal device has processed the picture to obtain the second picture and then receives the third operation, it can save the second picture.
Thus, the processed picture can be saved with a single tap, making the operation simpler and quicker for the user.
Optionally, before the terminal device receives the cropping operation, the method further includes: the terminal device receives a first operation; in response to the first operation, the terminal device adds a photo frame around the first picture. While the cropping operation continuously acts on the interface of the terminal device, the terminal device does not display the photo frame; when the cropping operation ends, the periphery of the second picture includes the photo frame.
In this embodiment of the present application, the first operation may be used to add a photo frame around the picture. The first operation may correspond to what is shown in f of fig. 2: the mobile phone receives the user's selection of the photo frame function 205 and adds a photo frame to the picture 101.
According to the logical layering shown in fig. 8, the photo frame is located on a layer above the crop, so the cropping operation does not act on the photo frame. The photo frame is temporarily hidden while the terminal device crops the first picture in response to the user's operation, and when the terminal device completes the cropping operation, the periphery of the second picture includes the photo frame. The size of the photo frame can be automatically adjusted according to the size of the second picture.
Thus, when the user crops the picture, the photo frame is unaffected by the crop and is redisplayed around the cropped picture once cropping is completed, matching the user's picture-processing expectations and improving the user experience.
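The automatic resizing mentioned above can be sketched in one line; the fixed border width and the `frame_size` name are assumptions for illustration, since the patent does not specify how the frame is sized:

```python
# Hypothetical sketch: the photo frame resizes to fit the cropped picture,
# with a fixed border width added on every side.
def frame_size(pic_w: int, pic_h: int, border: int = 20) -> tuple:
    """Outer (width, height) of a frame drawn around a picture
    of the given size."""
    return (pic_w + 2 * border, pic_h + 2 * border)
```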
Optionally, before the terminal device adds the second type element to the first picture, the method further includes: the terminal device receives a fourth operation; in response to the fourth operation, the terminal device displays a filter toolbar including one or more filter options; when a trigger for a first filter option among the filter options is received, the terminal device adds a first filter corresponding to the first filter option to the first picture and displays the first filter option in a selected state. After the terminal device adds the second type element to the first picture, the method further includes: the terminal device receives a fifth operation; in response to the fifth operation, the terminal device displays the filter toolbar, in which the first filter option is in the selected state; when a trigger for a second filter option among the filter options is received, the terminal device switches the first filter to a second filter corresponding to the second filter option; in the filter toolbar, the second filter option is in the selected state and the first filter option is in an unselected state.
In this embodiment, the fourth operation may be used to invoke the filter function; it may correspond to the user selecting the filter function 401 in the toolbar 1001 shown in a of fig. 11. The fifth operation may be used to invoke the filter function again; it may correspond to the mobile phone shown in e of fig. 11 receiving the user's selection of the filter function 401. Between the fourth and fifth operations, the terminal device may receive user operations on the picture other than filter processing.
The first filter option may correspond to filter 3 in fig. 11; when the terminal device receives the user's selection of the first filter option, the option is placed in a selected state. The selected state may be implemented by highlighting the first filter option, or by lightening or darkening its background color, etc.; this is not specifically limited in the embodiments of the present application.
Optionally, after the terminal device adds the first filter corresponding to the first filter option to the first picture, the terminal device receives user operations other than filter processing and processes the picture in response. After that processing is completed, the terminal device receives the user's operation invoking the filter function and displays the filter toolbar, in which the first filter option is in the selected state. In response to the user's filter-switching operation, the terminal device switches the filter added to the first picture to the second filter, switches the second filter option to the selected state, and switches the first filter option to the unselected state.
For example, after the terminal device adds filter 3 (corresponding to the filter 3 option) to the first picture, it receives the user's operation to add text and adds text to the picture in response. After the text is added, the terminal device receives the user's operation invoking the filter function and displays the filter toolbar, in which the filter 3 option is in the selected state. In response to the user switching filters, the terminal device switches filter 3 on the first picture to filter 4, switches the filter 4 option to the selected state, and switches the filter 3 option to the unselected state.
That is, the gallery application of the terminal device can remember the filter the user selected, so that when modifying the filter added to a picture, the user can locate the previously selected filter without comparing or searching through filters again. This makes picture processing more convenient and improves the user experience.
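The filter-memory behaviour can be sketched with a small state object. The class and method names here are assumptions for illustration; the point is only that the selection persists across unrelated edits, so reopening the toolbar restores it.

```python
# Hypothetical sketch of filter memory: the toolbar remembers the selected
# option across other edits, so reopening it restores the selection.
class FilterToolbar:
    def __init__(self, options):
        self.options = list(options)
        self.selected = None            # persists across non-filter edits

    def select(self, option):
        if option not in self.options:
            raise ValueError(f"unknown filter: {option}")
        self.selected = option          # previous option becomes unselected

    def open(self):
        """Return each option's selected/unselected state on reopen."""
        return {opt: opt == self.selected for opt in self.options}
```

In the fig. 11 example, selecting filter 3, adding text, and reopening the toolbar would show filter 3 still selected; selecting filter 4 then moves the selected state to it.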
Optionally, before the terminal device adds the second type element to the first picture, the method further includes: the terminal device receives a sixth operation; in response to the sixth operation, the terminal device adds a first text to the first picture. After the terminal device adds the second type element to the first picture, the method further includes: the terminal device receives a seventh operation; in response to the seventh operation, the first text is displayed in an editable state; the terminal device receives an eighth operation; in response to the eighth operation, the terminal device modifies the first text into a second text.
In this embodiment of the present application, the sixth operation may be used to add text and may correspond to the terminal device shown in e of fig. 1 receiving the user's operation of adding text 107. The seventh operation may be used to select text already added on the picture and may correspond to the mobile phone shown in a of fig. 12 receiving the user's selection of text 107. Between the sixth and seventh operations, the terminal device may receive user operations on the picture other than text processing. The eighth operation may be used to edit the text and may correspond to the terminal device in c of fig. 12 receiving the user's modification of text 107.
The first text may correspond to text 107. When the terminal device receives the user's selection of the first text, the first text switches to an editable state, and the user can modify its content, style, and so on.
Optionally, after the terminal device adds the first text to the first picture, it receives user operations other than text processing and processes the picture in response. After that processing is completed, the terminal device receives the user's selection of the first text, and the first text switches to the editable state. In response to the user modifying the first text, the terminal device modifies it into the second text.
For example, after the terminal device adds text 107 with the content "text" to the first picture, it receives the user's cropping operation. After cropping is completed, the terminal device receives the user's selection of text 107, and text 107 switches to the editable state. In response to the user modifying the text, the terminal device changes the content of text 107 to "read".
That is, after adding text and performing other operations, when the user selects the text on the picture again, the editor switches to the editing interface for that text so it can be modified again. This makes picture processing more convenient and improves the user experience.
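The select-then-edit flow above can be sketched as a tiny state machine; the `TextElement` class and its method names are assumptions, not the patent's implementation:

```python
# Hypothetical sketch of re-editable text: selecting an existing text
# element switches it back to an editable state before it can be changed.
class TextElement:
    def __init__(self, content: str):
        self.content = content
        self.editable = False

    def select(self):                   # seventh operation: tap the text
        self.editable = True

    def edit(self, new_content: str):   # eighth operation: modify it
        if not self.editable:
            raise RuntimeError("select the text before editing it")
        self.content = new_content
        self.editable = False           # editing session ends
```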
It is to be understood that, among the above steps S1401 to S1409, steps S1401, S1403, and S1409 are optional.
The method provided by the embodiment of the present application is described above with reference to fig. 10 to 13, and the device for performing the method provided by the embodiment of the present application is described below. As shown in fig. 14, fig. 14 is a schematic structural diagram of a picture processing device provided in an embodiment of the present application, where the picture processing device may be a terminal device in the embodiment of the present application, or may be a chip or a chip system in the terminal device.
As shown in fig. 14, the picture processing apparatus 1400 includes a processing unit 1410 and may be used in a circuit, a hardware component, or a chip. The processing unit 1410 supports the steps performed by the picture processing apparatus; for example, it performs steps S1401 to S1409.
In a possible implementation manner, the picture processing apparatus may further include a storage unit 1430. The storage unit 1430 may include one or more memories, which may be devices or circuits used to store programs or data.
The storage unit 1430 may exist separately and be coupled to the processing unit 1410 through a communication bus, or may be integrated with the processing unit 1410.
Taking the case where the picture processing apparatus is a chip or chip system within the terminal device of this embodiment as an example, the storage unit 1430 may store the computer-executable instructions of the terminal device's method, so that the processing unit 1410 performs the method of the terminal device described in the embodiments above. The storage unit 1430 may be a register, a cache, a random access memory (RAM), or the like, and may be integrated with the processing unit 1410; it may also be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, independent of the processing unit 1410.
In a possible implementation manner, the image processing apparatus may further include: a communication unit 1420. Wherein the communication unit 1420 is configured to support interaction of the picture processing apparatus with other devices. For example, when the picture processing apparatus is a terminal device, the communication unit 1420 may be a communication interface or an interface circuit. When the picture processing apparatus is a chip or a chip system within a terminal device, the communication unit 1420 may be a communication interface. For example, the communication interface may be an input/output interface, pins or circuitry, etc.
The apparatus of this embodiment may be correspondingly configured to perform the steps performed in the foregoing method embodiments, and the implementation principle and technical effects are similar, which are not described herein again.
Fig. 15 is a schematic hardware structure of an electronic device according to an embodiment of the present application, as shown in fig. 15, where the electronic device includes a processor 1501, a communication line 1504 and at least one communication interface (an exemplary communication interface 1503 is illustrated in fig. 15).
The processor 1501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the programs of the present application.
Communication line 1504 may include circuitry for communicating information between the components described above.
The communication interface 1503 uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Possibly, the electronic device may also comprise a memory 1502.
The memory 1502 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 1504, or may be integrated with the processor.
The memory 1502 is used for storing computer-executable instructions for executing the embodiments of the present application, and the processor 1501 controls the execution. The processor 1501 is configured to execute computer-executable instructions stored in the memory 1502 to implement the methods provided in the embodiments of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code; this is not specifically limited in the embodiments of the present application.
In a particular implementation, the processor 1501 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 15, as an embodiment.
In a particular implementation, as one embodiment, an electronic device may include multiple processors, such as processor 1501 and processor 1505 in FIG. 15. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Fig. 16 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 1600 includes one or more (including two) processors 1620 and a communication interface 1630.
In some implementations, the memory 1640 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
In the present embodiment, memory 1640 may include read only memory and random access memory and provide instructions and data to processor 1620. A portion of the memory 1640 may also include non-volatile random access memory (non-volatile random access memory, NVRAM).
In the illustrated embodiment, memory 1640, communication interface 1630, and processor 1620 are coupled together by bus system 1610. The bus system 1610 may include a power bus, a control bus, a status signal bus, and the like in addition to a data bus. For ease of description, the various buses are labeled as bus system 1610 in FIG. 16.
The methods described in the embodiments of the present application may be applied to the processor 1620 or implemented by the processor 1620. The processor 1620 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated hardware logic circuitry in the processor 1620 or by instructions in the form of software. The processor 1620 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, transistor logic, or a discrete hardware component, and the processor 1620 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments herein.
The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in hardware, in a decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1640; the processor 1620 reads the information in the memory 1640 and performs the steps of the above method in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave), or stored on a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include compact disk read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk memory; the computer readable medium may include disk storage or other disk storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes Compact Disc (CD), laser disc, optical disc, digital versatile disc (digital versatile disc, DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing is merely an illustrative embodiment of the present invention, but the scope of the present invention is not limited thereto. Any variation or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention is subject to the protection scope of the claims.

Claims (15)

1. A picture processing method, comprising:
the terminal equipment displays a first picture;
the terminal equipment adds a first type element for the first picture;
the terminal equipment adds a second type element for the first picture;
the terminal equipment receives a cropping operation, and the cropping operation continuously acts on an interface of the terminal equipment;
in response to the cropping operation, the terminal equipment displays the first picture, displays the second type element, displays a cropping region corresponding to the cropping operation, and does not display the first type element;
when the cropping operation is finished, the terminal equipment obtains a second picture, wherein the second picture comprises: the first type element, the content of the first picture covered by a target cropping region, and the content of the second type element covered by the target cropping region; the target cropping region is the cropping region corresponding to the moment the cropping operation ends.
2. The method of claim 1, wherein the first type of element comprises one or more of: text, watermarks or stickers;
the second class of elements includes one or more of the following: graffiti or mosaic.
3. The method of claim 1, wherein the first type of element comprises one or more of: graffiti, watermark or sticker;
the second class of elements includes one or more of the following: text or mosaic.
4. A method according to claim 2 or 3, wherein the difference between the position of the first type element in the second picture and the position of the first type element in the first picture is less than a preset value.
5. The method according to any one of claims 1-4, wherein before the terminal device receives the cropping operation, the method further comprises:
the terminal equipment receives a first operation;
responding to the first operation, and adding a photo frame for the periphery of the first picture by the terminal equipment;
when the cropping operation continuously acts on the interface of the terminal equipment, the terminal equipment does not display the photo frame;
when the cropping operation is finished, the periphery of the second picture comprises the photo frame.
6. The method according to any one of claims 1-5, wherein after the terminal device displays the first picture, the method further comprises:
the terminal equipment receives a second operation aiming at the first picture in a gallery application;
in response to the second operation, the terminal device displays a toolbar, wherein the toolbar includes a primary toolbar and does not include a secondary toolbar.
7. The method of claim 6, wherein the primary toolbar includes one or more of the following functions: clipping, filtering, adjusting, beautifying, graffiti, characters, mosaics, watermarks, photo frames, blurring, and retaining colors or stickers.
8. The method of claim 7, wherein a portion of functionality is displayed in the primary toolbar, the method further comprising:
when a sliding operation is received in the primary toolbar, a function of hiding a display in the primary toolbar in a sliding direction of the sliding operation is displayed based on the sliding direction.
9. The method according to any one of claims 1-8, wherein after the terminal device obtains the second picture, the method further comprises:
The terminal equipment receives a third operation;
and responding to the third operation, and storing the second picture by the terminal equipment.
10. The method according to any of claims 1-9, wherein before the terminal device adds the second class element to the first picture, the method further comprises:
the terminal equipment receives a fourth operation;
in response to the fourth operation, the terminal device displays a filter toolbar, wherein the filter toolbar comprises one or more filter options;
when a trigger for a first filter option in the plurality of filter options is received, the terminal device adds a first filter corresponding to the first filter option for the first picture, and displays the first filter option as a selected state;
after the terminal device adds the second class element to the first picture, the method further comprises:
the terminal equipment receives a fifth operation;
responding to the fifth operation, the terminal equipment displays a filter toolbar, wherein the first filter option is in a selected state;
when a trigger for a second filter option in the plurality of filter options is received, the terminal equipment switches the first filter to a second filter corresponding to the second filter option, wherein in the filter toolbar, the second filter option is in a selected state, and the first filter option is in an unselected state.
11. The method according to any of claims 1-10, wherein before the terminal device adds the second class element to the first picture, the method further comprises:
the terminal equipment receives a sixth operation;
responding to the sixth operation, and adding a first text for the first picture by the terminal equipment;
after the terminal device adds the second class element to the first picture, the method further comprises:
the terminal equipment receives a seventh operation;
responding to the seventh operation, and displaying the first text in an editable state;
the terminal equipment receives an eighth operation;
and responding to the eighth operation, and modifying the first text into a second text by the terminal equipment.
12. The method according to any of claims 1-11, wherein before the terminal device displays the first picture, the method further comprises:
the terminal equipment displays an interface of a gallery, wherein the interface of the gallery comprises one or more pictures;
the terminal equipment receives a ninth operation;
in response to a ninth operation, the terminal device displays the first picture.
13. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the computer program is caused by the processor to perform the method of any one of claims 1-12.
14. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, implements the method according to any of claims 1-12.
15. A computer program product comprising a computer program that, when executed, causes an electronic device to perform the picture processing method of any one of claims 1-12.
CN202211261751.XA 2022-10-14 2022-10-14 Picture processing method and related device Pending CN117893639A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211261751.XA CN117893639A (en) 2022-10-14 2022-10-14 Picture processing method and related device


Publications (1)

Publication Number Publication Date
CN117893639A true CN117893639A (en) 2024-04-16

Family

ID=90638188




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination