CN112947923A - Object editing method and device and electronic equipment - Google Patents

Object editing method and device and electronic equipment

Info

Publication number
CN112947923A
CN112947923A
Authority
CN
China
Prior art keywords
target
instruction
editing
identifier
basic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110213643.4A
Other languages
Chinese (zh)
Inventor
Liu Xin (刘鑫)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110213643.4A
Publication of CN112947923A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/38: Creation or generation of source code for implementing user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an object editing method and device and an electronic device, belonging to the field of communication technology, which can solve the problem that the process of editing an object to be edited is complicated, time-consuming and inefficient. The method includes the following steps: receiving a first input from a user on a first object displayed in a first interface, where the first interface is an interface of a first application; in response to the first input, displaying the first object in an editable state on the first interface and displaying a target control, the target control including at least one editing instruction; receiving a second input from the user, the second input being used to determine a target editing instruction; and, in response to the second input, calling a second application in the background to perform a target editing operation on the first object to obtain a second object, and updating the first object to the second object. The second application is the application to which the target editing instruction belongs, and the target editing operation is the editing operation corresponding to the target editing instruction.

Description

Object editing method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an object editing method and device and electronic equipment.
Background
With the rapid development of communication technology, electronic devices are used ever more widely. For example, when a user views objects such as pictures, documents, audio and video on an electronic device, the user often needs to edit those objects to achieve a satisfactory result.
At present, when a user needs to edit an image 1 displayed in an interface 1 of an application program (hereinafter referred to as an application; here, a non-editing application), the user must save image 1 to the electronic device, open an editing application with an image editing function (such as an album application), locate image 1 in the editing application, and finally edit image 1 in the editing application to obtain an image that meets the user's requirements.
Such an editing process is therefore complicated, time-consuming and inefficient.
Disclosure of Invention
Embodiments of the present application aim to provide an object editing method, an object editing device and an electronic device that can solve the problem that the process of editing an object to be edited is complicated, time-consuming and inefficient.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an object editing method, the method including: receiving a first input from a user on a first object displayed in a first interface, where the first interface is an interface of a first application; in response to the first input, displaying the first object in an editable state on the first interface and displaying a target control, the target control including at least one editing instruction; receiving a second input from the user, the second input being used to determine a target editing instruction; and, in response to the second input, calling a second application in the background to perform a target editing operation on the first object to obtain a second object, and updating the first object to the second object. The second application is the application to which the target editing instruction belongs, and the target editing operation is the editing operation corresponding to the target editing instruction.
In a second aspect, an embodiment of the present application provides an object editing apparatus including a receiving module, a display module, an execution module and an updating module. The receiving module is used to receive a first input from a user on a first object displayed in a first interface, the first interface being an interface of a first application. The display module is used to, in response to the first input received by the receiving module, display the first object in an editable state on the first interface and display a target control, the target control including at least one editing instruction. The receiving module is further used to receive a second input from the user, the second input being used to determine a target editing instruction. The execution module is used to, in response to the second input received by the receiving module, call a second application in the background to perform a target editing operation on the first object to obtain a second object. The updating module is used to update the first object to the second object. The second application is the application to which the target editing instruction belongs, and the target editing operation is the editing operation corresponding to the target editing instruction.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, a first input from a user on a first object displayed in a first interface (an interface of a first application) can be received; in response to the first input, the first object is displayed in an editable state on the first interface together with a target control (including at least one editing instruction); a second input from the user is received, the second input being used to determine a target editing instruction; and, in response to the second input, a second application (the application to which the target editing instruction belongs) is called in the background to perform a target editing operation (the editing operation corresponding to the target editing instruction) on the first object to obtain a second object, and the first object is updated to the second object. With this scheme, the first object in an editable state and the target control (associated with the editing instructions of the second application) can be displayed on the current interface of the first application; the second input on the target control triggers the electronic device to determine the target editing instruction, call the second application in the background to perform the target editing operation corresponding to the target editing instruction on the first object to obtain the second object, and update the first object to the second object. Operations such as saving the first object in the first application, opening the second application and opening the first object in the second application are thus unnecessary, which simplifies the operation steps, saves operation time and improves operation efficiency.
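The flow of the first aspect can be sketched in code. The patent does not specify an implementation, so every class, function and string name below is an illustrative assumption; the stub callback stands in for invoking the second application in the background.

```python
from dataclasses import dataclass

@dataclass
class EditingInstruction:
    """An editing instruction exposed by an editing application (the 'second application')."""
    app: str   # the application the instruction belongs to
    name: str  # e.g. "crop", "matting"

@dataclass
class TargetControl:
    """Control shown after the first input; it carries at least one editing instruction."""
    instructions: list

class ObjectEditor:
    def __init__(self, call_in_background):
        # call_in_background(app, name, obj) -> edited object; a stand-in for
        # calling the second application without leaving the first interface.
        self.call_in_background = call_in_background
        self.editable = False
        self.control = None

    def on_first_input(self, instructions):
        # First input: display the first object in an editable state
        # and display the target control.
        self.editable = True
        self.control = TargetControl(list(instructions))
        return self.control

    def on_second_input(self, first_object, chosen_index):
        # Second input: determine the target editing instruction, call the
        # second application in the background, and obtain the second object.
        target = self.control.instructions[chosen_index]
        second_object = self.call_in_background(target.app, target.name, first_object)
        return second_object  # the first object is updated to this second object

# Example with a stub "second application" that tags the object with the operation:
editor = ObjectEditor(lambda app, name, obj: f"{obj}+{name}@{app}")
editor.on_first_input([EditingInstruction("album", "crop"),
                       EditingInstruction("album", "matting")])
result = editor.on_second_input("image1", 1)
print(result)  # image1+matting@album
```

The key point the sketch captures is that the user never leaves the first interface: the first object is handed to the second application and replaced in place.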
Drawings
Fig. 1 is a flowchart of an object editing method provided in an embodiment of the present application;
FIG. 2 is one of schematic interface diagrams of an object editing method provided in an embodiment of the present application;
fig. 3 is a second schematic interface diagram of an object editing method according to an embodiment of the present application;
fig. 4 is a third interface schematic diagram of an object editing method provided in the embodiment of the present application;
FIG. 5 is a fourth schematic interface diagram of an object editing method provided in the embodiments of the present application;
FIG. 6 is a fifth schematic interface diagram of an object editing method according to an embodiment of the present application;
FIG. 7 is a sixth schematic interface diagram of an object editing method according to an embodiment of the present application;
FIG. 8 is a mapping logic diagram of an object editing method according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an object editing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 11 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar objects, not to describe a particular sequence or chronological order. It will be appreciated that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second" and the like are usually of one type, and the number of such objects is not limited; for example, there may be one or more first objects. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the related objects before and after it are in an "or" relationship.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise specified, "a plurality" means two or more, for example, a plurality of processing units means two or more processing units; plural elements means two or more elements, and the like.
It should be noted that an identifier in the embodiments of the present application is used to indicate information by a word, a symbol, an image or the like, and a control or other container may be used as a carrier for displaying the information. Identifiers include, but are not limited to, word identifiers, symbol identifiers and image identifiers.
The following describes in detail an object editing method, an object editing apparatus, and an electronic device provided in the embodiments of the present application through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
The object editing method provided in the embodiments of the present application can be applied to scenarios in which objects such as images, documents, audio and video are edited in a non-editing application. With the scheme provided in the embodiments of the present application, a first object in an editable state and a target control (associated with the editing instructions of a second application) can be displayed on the current interface of a first application; a second input on the target control triggers the electronic device to determine a target editing instruction, call the second application in the background to perform the target editing operation corresponding to the target editing instruction on the first object to obtain a second object, and update the first object to the second object. Operations such as saving the first object in the first application, opening the second application and opening the first object in the second application are thus unnecessary, which simplifies the operation steps, saves operation time and improves operation efficiency.
Referring to fig. 1, the object editing method provided in an embodiment of the present application is described below, taking the electronic device as the execution subject. The method may include steps 201 to 204 described below.
Step 201, the electronic device receives a first input of a user to a first object displayed in a first interface.
The first interface is an interface in the first application.
It is to be understood that, in the embodiment of the present application, the first object may be an object such as an image, a document, a video, an audio, or the like, or may be another object, which may be determined specifically according to an actual situation, and the embodiment of the present application is not limited.
It can be understood that, in the embodiment of the present application, the first application may be an instant social application, a browser application, or the like, or may be another application, which may be determined specifically according to an actual situation, and the embodiment of the present application is not limited.
It can be understood that, in this embodiment of the application, the first interface may be a main interface in the first application or a function interface in the first application, which may be determined specifically according to an actual situation, and this embodiment of the application is not limited.
Optionally, the first input may be a click input of the user on the first object, may also be a slide input of the user on the first object, and may also be other feasibility inputs, which may be specifically set according to an actual situation, and the embodiment of the present application is not limited.
Illustratively, the click input may be any number of click inputs, such as a single click input, a double click input, and the like, or may be a short-press input, a long-press input, a double-finger click input, and the like; the slide input may be a slide input in any direction, such as an up slide input, a down slide input, a left slide input, a right slide input, or a two-finger slide input.
Step 202, the electronic device responds to the first input, displays the first object in an editable state on the first interface, and displays the target control.
Wherein the target control comprises at least one editing instruction.
It is understood that the first object being in an editable state means that the first object can be edited at this time.
Optionally, the first object in the editable state may be displayed in a floating manner on the first interface; the editing interface 1 may also be displayed in a floating manner on the first interface, and the first object in an editable state is displayed in the editing interface 1; the first object in the editable state may also be displayed in other forms, which may be determined according to actual use requirements, and the embodiment of the present application is not limited.
Alternatively, the target control may be a floating control; when the first object in the editable state is displayed in the editing interface 1, the target control may also be embedded in the editing interface 1. The target control may also be displayed in other forms, which may be determined according to actual use requirements, and the embodiment of the present application is not limited.
Optionally, any control related to the target control is not displayed on the first interface before the first input, and the target control is displayed on the first interface after the first input is received.
Optionally, before the first input, a first control is displayed on the first interface, the first control having an association relationship with the target control; after receiving the first input, the electronic device updates the first control to the target control. For example, a navigation hover ball control of the system (the first control) is displayed on the first interface before the first input, and after receiving the first input, the electronic device updates the navigation hover ball control to the target control (for example, a hover toolbar control, i.e., the navigation hover control automatically expands into an editing toolbar).
Illustratively, as shown in fig. 2 (a), the first interface is the interface of contact 1, on which the system's navigation hover ball control (the first control, indicated by the mark "1") is hovered, and the first object is image 1. When the user clicks image 1 on the contact 1 interface, then as shown in fig. 2 (b), image 1 is displayed in an editable state on the contact 1 interface, the navigation hover ball control automatically expands into an editing toolbar (the target control, indicated by the mark "2"), and a plurality of editing tool identifiers are displayed in the editing toolbar.
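The update from first control to target control can be sketched minimally; the state names and identifier strings below are hypothetical, since the patent describes only the behavior (the hover ball auto-expanding into an editing toolbar), not an implementation.

```python
class HoverBall:
    """Hypothetical model of the navigation hover ball (the first control)."""

    def __init__(self):
        self.state = "navigation"  # before the first input: plain hover ball
        self.toolbar = []          # no editing identifiers shown yet

    def on_first_input(self, editing_identifiers):
        # Update the first control to the target control: the hover ball
        # auto-expands into an editing toolbar showing the identifiers.
        self.state = "toolbar"
        self.toolbar = list(editing_identifiers)

ball = HoverBall()
ball.on_first_input(["matting", "crop", "filter"])
print(ball.state, ball.toolbar)
```

Because the toolbar replaces the hover ball in place, no extra screen real estate is used until the user actually starts editing.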
Optionally, the target control comprises at least one of: n first marks, M second marks.
It can be understood that, in this embodiment of the application, the target control may only include N first identifiers, may also include M second identifiers, and may also include N first identifiers and M second identifiers, which may be determined specifically according to actual use requirements, and this embodiment of the application is not limited. In addition, in the embodiment of the present application, it is not limited whether the target control further includes other identifiers when the target control includes the identifier.
It can be understood that each first identifier is used to indicate at least one editing instruction, and each second identifier is used to indicate at least one editing instruction, which may be determined according to actual usage requirements, and the embodiment of the present application is not limited.
Optionally, for N first identifiers, each first identifier is a base tool identifier or a combination tool identifier, each combination tool identifier is a combination identifier of at least two base tool identifiers, each base tool identifier includes at least one instruction tool identifier, each instruction tool identifier in one base tool identifier is used to indicate a different base editing instruction, and N is a positive integer.
It can be understood that, in this application embodiment, each of the N first identifiers may be a basic tool identifier, may also be a combination tool identifier, may also be a part of a basic tool identifier, and may also be a part of a combination tool identifier, which may specifically be determined according to actual use requirements, and this application embodiment is not limited.
It is understood that, in the case that one first identifier is one base tool identifier, the one first identifier is used to indicate at least one base editing instruction; in the case that one first identifier is one combination tool identifier, the one first identifier is used for indicating at least one combination editing instruction, and each combination editing instruction is a combination instruction of at least two different basic editing instructions.
Optionally, in a case that the target control includes the N first identifiers, each instruction tool identifier (which may be referred to as a service tool identifier at this time) in the one base tool identifier is used to indicate: basic editing instructions from different services and similar functions.
It should be noted that, in the embodiments of the present application, "similar functions" means that the degree of functional similarity is greater than a certain threshold; this can be understood as the functions being of the same type. The different services may come from different cloud services, from different application programs, or from other different services, which is not limited in the embodiments of the present application.
It is understood that the basic editing instructions indicated by the same basic tool identifier are basic editing instructions from different services but with the same function type. The basic editing instructions indicated by different basic tool identifiers are basic editing instructions of different function types (possibly from different services or from the same service). Namely, in the embodiment of the present application, the basic editing instructions are classified according to the function types.
Optionally, in a case that the target control includes the N first identifiers, each instruction tool identifier in the one base tool identifier is used to indicate: basic editing instructions from the same service and with different functions.
It is understood that the same basic tool id indicates a basic editing instruction that is from the same service but has a different function type. The basic editing instructions indicated by the different basic tool identifiers are basic editing instructions from different services (which may have the same function type or different function types). Namely, in the embodiment of the present application, classification is performed according to the service provider of the basic editing instruction.
Optionally, in a case that the target control includes the N first identifiers, each instruction tool identifier in the one base tool identifier is used to indicate: basic editing instructions from different services and with different functions.
Illustratively, as shown in fig. 3 (a), the reference "3" indicates a basic tool identifier 1, and the basic tool identifier 1 includes a plurality of instruction tool identifiers, and the basic editing instruction indicated by each instruction tool identifier is from a different service but has similar functions. For example, the base editing instruction indicated by each instruction tool identifier is a matting editing instruction from a different service.
Illustratively, as shown in fig. 3 (b), the reference "4" indicates a combination tool identifier 1, the combination tool identifier 1 includes a plurality of basic tool identifiers (e.g., a plurality of basic tool identifiers included in the combination tool identifier 1 are illustrated in a horizontal direction), each basic tool identifier includes at least one instruction tool identifier (e.g., a plurality of instruction tool identifiers included in one basic tool identifier are illustrated in a horizontal direction), and the basic editing instructions indicated by each instruction tool identifier are from different services but have similar functions. For example, the base editing instruction indicated by each instruction tool identifier is a matting editing instruction from a different service.
In the embodiment of the application, the classification of the basic editing instructions can make the target control more orderly, and not stack all the editing instructions together, so that the user can conveniently search and use the editing instructions.
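The two classification schemes above, grouping basic editing instructions by function type versus grouping them by service provider, can be expressed as a simple grouping over (function, service) records. The field names and the concrete instructions here are illustrative assumptions, not part of the patent.

```python
from collections import defaultdict

# Each basic editing instruction carries a function type and a service provider.
instructions = [
    {"function": "matting", "service": "service1"},
    {"function": "matting", "service": "service2"},
    {"function": "crop",    "service": "service1"},
    {"function": "crop",    "service": "service3"},
]

def group_by(instrs, key):
    """Group instructions into basic tool identifiers keyed by `key`."""
    groups = defaultdict(list)
    for ins in instrs:
        groups[ins[key]].append(ins)
    return dict(groups)

# Scheme 1: one basic tool identifier per function type; the instruction tool
# identifiers inside it come from different services but have similar functions.
by_function = group_by(instructions, "function")

# Scheme 2: one basic tool identifier per service; the instruction tool
# identifiers inside it come from the same service but have different functions.
by_service = group_by(instructions, "service")

print(sorted(by_function))  # ['crop', 'matting']
print(sorted(by_service))   # ['service1', 'service2', 'service3']
```

Either grouping keeps the target control orderly: the user picks a basic tool identifier first and only then sees the individual instruction tool identifiers inside it.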
Optionally, for N first identifiers, each first identifier is an instruction tool identifier, and each instruction tool identifier is used to indicate a basic editing instruction (which may be from different services, may also be from the same service, may have similar functions, and may also have different functions).
It will be appreciated that each first identifier is indicative of a base editing instruction, and that the base editing instruction is not classified at this time.
In this embodiment of the present application, the N first identifiers may also be displayed in other forms, and this embodiment of the present application is not limited.
Optionally, for the M second identifiers, each second identifier is a basic template identifier or a combined template identifier. Each basic template identifier includes at least one basic instruction template identifier, and each basic instruction template identifier in one basic template identifier is used to indicate a different basic editing instruction. Each combined template identifier includes at least one combined instruction template identifier, each combined instruction template identifier is a combined identifier of at least one basic instruction template identifier, and each basic instruction template identifier in one combined instruction template identifier is used to indicate a different basic editing instruction; M is a positive integer.
It can be understood that, in this embodiment of the application, each of the M second identifiers may be a basic template identifier, may also be a combined template identifier, may also be a part of a basic template identifier, and a part of a combined template identifier, and may specifically be determined according to actual use requirements, which is not limited in this embodiment of the application.
It is to be understood that, in the case that one second identifier is one basic template identifier, the one second identifier is used to indicate at least one basic editing instruction; and in the case that one second identifier is a combined template identifier, the one second identifier is used for indicating at least one combined editing instruction, and each combined editing instruction is a combined instruction of at least two different basic editing instructions.
Optionally, in a case that the target control includes the M second identifiers, each basic instruction template identifier in the one basic template identifier is used to indicate: basic editing instructions from different services and similar functions.
It is understood that the basic editing instructions indicated by the same basic template identifier are basic editing instructions from different services but with the same function type. The basic editing instructions indicated by different basic template identifications are basic editing instructions of different function types (possibly from different services or from the same service). Namely, in the embodiment of the present application, the basic editing instructions are classified according to the function types.
Optionally, in a case that the target control includes the M second identifiers, each basic instruction template identifier in the one basic template identifier is used to indicate: basic editing instructions from the same service and with different functions.
It is understood that the basic editing instructions indicated by the same basic template identifier are basic editing instructions from the same service but with different function types. The basic editing instructions indicated by different basic template identifications are basic editing instructions from different services (which may have the same function type or different function types). Namely, in the embodiment of the present application, classification is performed according to the service provider of the basic editing instruction.
Optionally, in a case that the target control includes the M second identifiers, each basic instruction template identifier in the one basic template identifier is used to indicate: basic editing instructions from different services and with different functions.
Optionally, in a case that the M second identifiers include second identifiers that are combined template identifiers, each combined instruction template identifier in one combined template identifier is a combined identifier of S basic instruction template identifiers with different functions, where S is a positive integer.
It can be understood that the number of the basic instruction template identifications included in each combined instruction template identification in the same combined template identification is the same, and the number of the basic instruction template identifications included in each combined instruction template identification in different combined template identifications may be the same or different.
It is understood that any two combined instruction template identifiers in the same combined template identifier include a plurality of basic instruction template identifiers of the same categories, but at least one of those basic instruction template identifiers comes from a different service.
Illustratively, suppose the combined template identifier (second identifier) is a + b, and a + b includes three combined instruction template identifiers: a1 + b2, a3 + b2 and a4 + b5 (letters represent the function type, numbers represent the service provider). Each combined instruction template identifier includes 2 basic instruction template identifiers, and all of them cover the same function types (both function type a and function type b). In any two combined instruction template identifiers, at least one basic instruction template identifier comes from a different service: comparing a1 + b2 with a3 + b2, a1 and a3 come from different services; comparing a3 + b2 with a4 + b5, a3 and a4 come from different services, and b2 and b5 come from different services.
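The two constraints on a combined template identifier (every combined instruction template identifier covers the same function types, and any two of them differ in at least one service) can be checked mechanically. In this hypothetical sketch each basic instruction template identifier is a (function_type, service_provider) pair, e.g. "a1" becomes ("a", 1); the representation is an assumption for illustration.

```python
# The combined template "a + b" from the example: a1+b2, a3+b2, a4+b5.
template_a_plus_b = [
    [("a", 1), ("b", 2)],
    [("a", 3), ("b", 2)],
    [("a", 4), ("b", 5)],
]

def valid_combined_template(template):
    """Check the two constraints described in the text."""
    # Constraint 1: every combined instruction template identifier covers the
    # same set of function types (so each has the same length S).
    function_sets = {frozenset(func for func, _svc in combo) for combo in template}
    if len(function_sets) != 1:
        return False
    # Constraint 2: any two combined instruction template identifiers must
    # include at least one basic instruction template identifier from
    # different services (i.e. no two combinations are identical).
    for i in range(len(template)):
        for j in range(i + 1, len(template)):
            a, b = dict(template[i]), dict(template[j])
            if all(a[func] == b[func] for func in a):
                return False
    return True

print(valid_combined_template(template_a_plus_b))  # True
```

For instance, a template listing a1 + b2 twice would fail constraint 2, since the two combinations come from exactly the same services.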
Illustratively, as shown in fig. 4 (a), the target control includes N first identifiers, and the user may trigger the electronic device to display the M second identifiers through a leftward sliding input on the target control (as shown in fig. 4 (b)). As shown in fig. 5, the user may also trigger the electronic device, through an input to one second identifier, to expand that second identifier and display the instruction template identifiers (basic instruction template identifiers or combined instruction template identifiers) it includes. For example, the user clicks the template a (a basic template identifier), which expands to display the instruction a1 (a basic instruction template identifier) and the instruction a2 (a basic instruction template identifier); the user clicks the template c+d (a combined template identifier), which expands to display the instructions c1+d1, c1+d2, and c3+d3 (each a combined instruction template identifier).
Optionally, for M second identifiers, each second identifier is an instruction template identifier, and each instruction template identifier is used to indicate an editing instruction (which may be from different services, may also be from the same service, may have similar functions, and may also have different functions).
It can be understood that, in this case, each second identifier indicates one basic editing instruction, and the basic editing instructions are not classified.
In this embodiment of the present application, the M second identifiers may also be displayed in other forms, and this embodiment of the present application is not limited.
Step 203, the electronic device receives a second input of the user.
Wherein the second input is used to determine a target editing instruction.
Optionally, the second input may also be used for other functions, and the embodiment of the present application is not limited.
Optionally, the second input may be a click input of the user on the target control, a slide input of the user on the target control, or other feasibility inputs, which may be determined according to actual use requirements, and the embodiment of the present application is not limited.
For example, reference may be made to the description of the click input and the slide input in the description of the first input in step 201, and details thereof are not repeated herein.
Optionally, the second input may also include other inputs, for example, an input of the first object by the user, which may be determined according to an actual use condition, and the embodiment of the present application is not limited.
Step 204, in response to the second input, the electronic device obtains a second object by calling a second application in the background to execute a target editing operation on the first object, and updates the first object to the second object.
The second application is an application to which the target editing instruction belongs, and the target editing operation is an editing operation corresponding to the target editing instruction.
It is to be understood that, in the embodiment of the present application, the second application is different from the first application.
Optionally, in this embodiment of the application, the target editing instruction may be an editing instruction embedded in the second application, or may also be an editing instruction obtained by the second application from the cloud server, which may specifically be determined according to an actual use requirement, and this embodiment of the application is not limited.
It can be understood that in the embodiment of the present application, each identifier in the target control has an association relationship, a mapping relationship, or a hyperlink relationship with an editing instruction (editing tool) in another application, and in the embodiment of the present application, it is described that each identifier in the target control has a mapping relationship with an editing instruction in another application as an example. When a user selects one identifier in the target control, the identifier is mapped to an editing instruction (target editing instruction) in a corresponding application (second application) based on the mapping relation, and then the electronic equipment calls the second application to execute target editing operation on the first object in the background to obtain a second object.
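One way to realize the mapping relationship described above is a lookup table from each identifier in the target control to an (application, editing instruction) pair, so that selecting an identifier resolves to the instruction to be called in the background. A minimal sketch with purely illustrative names:

```python
# Hypothetical mapping table: identifier in the target control -> (application, instruction).
MAPPING = {
    "A1": ("AppX", "crop"),
    "A2": ("AppY", "beautify"),
    "B1": ("AppZ", "cut"),
}

def resolve(identifier):
    """Map a selected identifier to the second application and the target editing instruction."""
    app, instruction = MAPPING[identifier]
    return app, instruction

def edit_in_background(first_object, identifier):
    # Resolve the mapping, then call the second application in the background
    # to perform the target editing operation, producing the second object.
    app, instruction = resolve(identifier)
    return f"{first_object} edited by {instruction} via {app}"

print(edit_in_background("image1", "A1"))  # image1 edited by crop via AppX
```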
Optionally, in a case that the target control includes the N first identifiers, the second input includes a user input to a first target identifier of the N first identifiers; in the case that the first target identifier is a basic tool identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the first target identifier; in the case that the first target identifier is a combination tool identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the first target identifier.
Illustratively, as shown in (a) in fig. 3, the user clicks the basic tool identifier A (the first target identifier) indicated by the mark "3"; 3 instruction tool identifiers (indicating the basic editing instruction A1, the basic editing instruction A2, and the basic editing instruction A3) are displayed in an expanded manner, and the user clicks the rightmost instruction tool identifier, so the target editing instruction is the basic editing instruction A3.
Example 1, as shown in (b) of fig. 3, the user clicks the combination tool identifier B+C+A (the first target identifier) indicated by the mark "4"; 3 basic tool identifiers are displayed in an expanded manner (the basic tool identifier B, the basic tool identifier C, and the basic tool identifier A, from left to right). The user clicks the basic tool identifier A; 3 instruction tool identifiers are displayed in an expanded manner (the instruction tool identifier A1 (indicating the basic editing instruction A1), the instruction tool identifier A2 (indicating the basic editing instruction A2), and the instruction tool identifier A3 (indicating the basic editing instruction A3), from top to bottom). The user clicks the lowermost instruction tool identifier, so the basic tool identifier A corresponds to the basic editing instruction A3. Given that the basic tool identifier B corresponds to the basic editing instruction B1 and the basic tool identifier C corresponds to the basic editing instruction C3, the target editing instruction is a combined instruction of the basic editing instruction B1, the basic editing instruction C3, and the basic editing instruction A3.
Optionally, in a case that the first target identifier is a basic tool identifier, the second input is used for updating the target instruction tool identifier from the instruction tool identifiers included in the first target identifier; in a case that the first target identifier is a combination tool identifier, the second input includes at least one of: an input used for adjusting the arrangement order of the basic tool identifiers included in the first target identifier; an input used for updating the target instruction tool identifier from the instruction tool identifiers included in a target basic tool identifier. The target basic tool identifier is a basic tool identifier included in the first target identifier, and the target instruction tool identifier is the instruction tool identifier used for determining the target editing instruction.
Optionally, in a case that the second input is used for adjusting the arrangement order of the basic tool identifiers included in the first target identifier, the second input is further used for expanding and displaying at least two basic tool identifiers included in the first target identifier; in the case that the second input is used for updating the target instruction tool identifier from the instruction tool identifiers included in the target base tool identifier, the second input is further used for expanding and displaying at least two base tool identifiers included in the first target identifier and at least one instruction tool identifier included in the target base tool identifier.
It can be understood that, in the embodiment of the present application, the arrangement order of the basic tool identifiers represents the call order of the corresponding basic editing instructions.
Example 2, in combination with example 1, the user may, through input, adjust the arrangement order of the basic tool identifier B, the basic tool identifier C, and the basic tool identifier A included in the first target identifier. The user may also, through input, update the instruction tool identifier A3 originally corresponding to the basic tool identifier A to the instruction tool identifier A1.
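Since the arrangement order of the basic tool identifiers represents the call order of the corresponding basic editing instructions, both adjustments in example 2 can be sketched as simple list and map updates (identifier names are illustrative):

```python
# Hypothetical combination tool: ordered basic tool identifiers, each mapped to
# its currently selected instruction tool identifier.
combined_tool = ["C", "B", "A"]                    # arrangement order = call order
selected = {"A": "A3", "B": "B1", "C": "C3"}

# Second input 1: adjust the arrangement order (e.g. drag to B, C, A).
combined_tool = ["B", "C", "A"]

# Second input 2: update the instruction tool identifier of basic tool A from A3 to A1.
selected["A"] = "A1"

# The resulting call sequence of basic editing instructions.
call_sequence = [selected[t] for t in combined_tool]
print(call_sequence)  # ['B1', 'C3', 'A1']
```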
Optionally, in a case that the target control includes the M second identifiers, the second input includes a user input to a second target identifier of the M second identifiers; in the case that the second target identifier is a basic template identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the second target identifier; in the case that the second target identifier is a combined template identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the second target identification.
For example, as shown in fig. 5, if the second target identifier is the template a, the user clicks the command a1 (i.e. the basic command template identifier a1), and the target editing command is the basic editing command indicated by the command a 1. If the second target identifier is the template c + d, the user clicks the command c1+ d2 (i.e., the combined command template identifier c1+ d2), and the target editing command is the combined editing command indicated by the command c1+ d 2.
Optionally, for the first target identifier or the second target identifier, the second input is further used to determine at least one region, each region being an action region, in the first object, corresponding to a target basic editing instruction, where the target basic editing instruction is a basic editing instruction included in the target editing instruction.
It can be understood that, in this embodiment of the application, the action regions corresponding to all basic editing instructions included in the target editing instruction may default to the entire area of the first object; an action region may also be set by input for at least one of those basic editing instructions; the same action region may be set for at least two of them, or a different action region may be set for each of them; other situations are also possible. The specifics may be determined according to actual use requirements, and this embodiment of the application is not limited.
Optionally, the at least one region includes a first region and a second region, the first region and the second region have an overlapping region, the first region is an action region corresponding to the first editing instruction, the second region is an action region corresponding to the second editing instruction, and the first editing instruction and the second editing instruction are two editing instructions in the target editing instruction; the electronic device in step 204 may specifically perform the target editing operation on the first object by invoking the second application in the background through the following process.
The electronic equipment calls a third application to execute the editing operation corresponding to the first editing instruction on the first non-overlapping area in the background, calls a fourth application to execute the editing operation corresponding to the second editing instruction on the second non-overlapping area in the background, and sequentially executes the following steps on the overlapping area: the editing operation corresponding to the first editing instruction is executed by calling a third application in the background, and the editing operation corresponding to the second editing instruction is executed by calling a fourth application in the background; the first non-overlapping area is an area except the overlapping area in the first area, and the second non-overlapping area is an area except the overlapping area in the second area; the third application is: in the second application, the application to which the first editing instruction belongs; the fourth application is as follows: and in the second application, the application to which the second editing instruction belongs.
Example 3, in combination with example 1, as shown in fig. 6, the action region corresponding to the basic editing instruction B1 is a region b, the action region corresponding to the basic editing instruction C3 (the first editing instruction) is a region c (the first region), and the action region corresponding to the basic editing instruction A3 (the second editing instruction) is a region a (the second region). Region c and region a have an overlapping region; the part of region c outside the overlapping region is the first non-overlapping region, and the part of region a outside the overlapping region is the second non-overlapping region. The electronic device in step 204 executes the target editing operation on the first object by calling the second application in the background specifically as follows: the electronic device executes the editing operation corresponding to the basic editing instruction C3 on the first non-overlapping region by calling the third application (the application to which the basic editing instruction C3 belongs) in the background, executes the editing operation corresponding to the basic editing instruction A3 on the second non-overlapping region by calling the fourth application (the application to which the basic editing instruction A3 belongs) in the background, and executes the following on the overlapping region in sequence: the editing operation corresponding to the basic editing instruction C3 by calling the third application in the background, and then the editing operation corresponding to the basic editing instruction A3 by calling the fourth application in the background. In addition, the editing operation corresponding to the basic editing instruction B1 is executed on the region b by calling a fifth application (the application to which the basic editing instruction B1 belongs) in the background.
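Under the simplifying assumption that regions are axis-aligned rectangles (x1, y1, x2, y2), the dispatch order for two overlapping regions — each non-overlapping part edited by its own application, and the overlapping part edited by the first instruction's application and then the second's — can be sketched as follows (application names are illustrative):

```python
def intersect(r1, r2):
    """Overlapping rectangle of two regions, or None if they do not overlap."""
    x1, y1 = max(r1[0], r2[0]), max(r1[1], r2[1])
    x2, y2 = min(r1[2], r2[2]), min(r1[3], r2[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

def plan_edits(first_region, second_region, third_app, fourth_app):
    """Return the (application, region, note) call plan for two overlapping regions."""
    overlap = intersect(first_region, second_region)
    plan = [
        (third_app, first_region, "first non-overlapping part only"),
        (fourth_app, second_region, "second non-overlapping part only"),
    ]
    if overlap:
        # On the overlap, the first editing instruction is applied before the second.
        plan.append((third_app, overlap, "overlap, step 1"))
        plan.append((fourth_app, overlap, "overlap, step 2"))
    return plan

plan = plan_edits((0, 0, 4, 4), (2, 2, 6, 6), "AppC", "AppA")
print([p[0] for p in plan])  # ['AppC', 'AppA', 'AppC', 'AppA']
```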
Optionally, in a case that the second input includes an input of a user to a second target identifier in the M second identifiers, each basic instruction template identifier is further used to indicate a template action region corresponding to the corresponding basic editing instruction, and each region in the at least one region is determined based on the template action region corresponding to the target basic editing instruction.
It is to be understood that one basic editing instruction template identifier indicates one basic editing instruction and the template action region corresponding to the one basic editing instruction, and each region in the at least one region is determined based on the template action region corresponding to the corresponding basic editing instruction in the target editing instruction.
Optionally, the target area (the action area of the target basic editing instruction in the first object) is automatically matched by the electronic device based on the template action area corresponding to the target basic editing instruction.
Illustratively, the first object is an image 1, the template action area corresponding to the target basic editing instruction is an upper left area of the image, and the target area is the upper left area of the image 1; and if the template action area corresponding to the target basic editing instruction is the face area of the image, the target area is the face area of the image 1.
Optionally, the target region is determined by the electronic device jointly based on the template action region corresponding to the target basic editing instruction and a region designated by the user.
Illustratively, the first object is an image 1, the template action region corresponding to the target basic editing instruction is an upper left region 1 of the image, and if the user selects an upper left region 2 of the image 1, the target region is an intersection of the upper left region 1 and the upper left region 2 of the image 1; the template action area corresponding to the target basic editing instruction is the face area of the image, and if the user selects the area 3 of the image 1, the target area is the face area in the area 3 of the image 1.
It can be understood that the instructions are loaded from a preset instruction template, which includes a combination of operation commands that the user can click. The instructions in the instruction template include specific instruction parameters and instruction action regions. When the instruction template (the target basic editing instruction determined by the second target identifier) is applied to a new editing object (the first object) and the template action region corresponding to the target basic editing instruction differs from the actual action region (the target region) of the target editing instruction, the action region needs to be changed so that the effect of the instruction remains consistent. Action regions can be simply divided into regular polygonal regions (such as rectangles) and irregular regions (such as a human face contour). For an irregular region, the region is labeled by applying edge detection and image recognition algorithms; the label can be a specific abstract concept (such as a human face), and a shape description based on the contour or the region is also kept.
When the instruction action region is a regular polygonal region, it can be assumed that the instruction acts on a fixed-scale area of the graphic, and the region is directly scaled from the original editing object to the new editing object. When the instruction action region is an irregular region, it can be assumed that the instruction acts on a specific object in the graphic: the shape description of the new editing object is detected, and if its abstract concept is consistent or the similarity score of the shape description reaches a certain threshold, the instruction is considered to act on that region of the new editing object (for example, a region in the first object that matches the template action region is determined as the target region).
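The two adaptation rules can be sketched as follows: a regular rectangular action region is re-scaled proportionally to the new object's size, while an irregular region is accepted only if its shape description is similar enough to the template's. The 0.8 threshold and the dot-product similarity are illustrative assumptions standing in for a real shape-matching score:

```python
def scale_region(template_region, template_size, new_size):
    """Scale a rectangular template action region (x1, y1, x2, y2) to a new object."""
    sx = new_size[0] / template_size[0]
    sy = new_size[1] / template_size[1]
    x1, y1, x2, y2 = template_region
    return (x1 * sx, y1 * sy, x2 * sx, y2 * sy)

def match_irregular(template_descriptor, candidate_descriptor, threshold=0.8):
    """Accept a candidate region if its shape description is similar enough.

    Descriptors are illustrative feature vectors; the similarity here is a
    normalized dot product, a placeholder for a real shape-matching algorithm."""
    dot = sum(a * b for a, b in zip(template_descriptor, candidate_descriptor))
    norm = (sum(a * a for a in template_descriptor) ** 0.5
            * sum(b * b for b in candidate_descriptor) ** 0.5)
    return norm > 0 and dot / norm >= threshold

# A top-left quarter region on a 100x100 template maps to the same quarter of a 200x400 object.
print(scale_region((0, 0, 50, 50), (100, 100), (200, 400)))  # (0.0, 0.0, 100.0, 200.0)
```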
In the embodiment of the application, the first input of the user to the first object displayed in the first interface (the interface in the first application) can be received; in response to the first input, displaying the first object in an editable state on the first interface, and displaying a target control (including at least one editing instruction); receiving a second input of the user, wherein the second input is used for determining a target editing instruction; and responding to the second input, executing target editing operation (editing operation corresponding to the target editing instruction) on the first object by calling a second application (namely the application to which the target editing instruction belongs) in the background to obtain a second object, and updating the first object into the second object. According to the scheme, the first object in an editable state and the target control (the target control is associated with the editing instruction of the second application) can be displayed on the current interface of the first application, the electronic equipment is triggered to determine the target editing instruction through the second input of the target control, the target editing operation corresponding to the target editing instruction can be carried out on the first object through calling the second application in the background, the second object is obtained, and the first object is updated to the second object. Therefore, operations such as saving the first object in the first application, opening the second application, opening the first object in the second application and the like are not needed, so that the operation steps can be simplified, the operation time can be saved, and the operation efficiency can be improved.
Optionally, in this embodiment of the present application, after the step 203, the object editing method provided in this embodiment of the present application may further include the following step 205.
Step 205, the electronic device saves the target editing instruction for the first object before updating the first object to the second object.
It can be understood that, in this embodiment of the application, after receiving the second input, the electronic device stores the target editing instruction, which may be kept in an instruction temporary storage area of the electronic device; after the electronic device finishes editing the first object based on the target editing instruction, the electronic device may delete the target editing instruction. In this way, if the editing is interrupted before the first object is edited based on the target editing instruction, the editing of the first object can be continued next time according to the saved target editing instruction. Also, multiple objects can be edited simultaneously.
The details can be understood from the following description: during object editing, the electronic device can record the user's editing instructions; each editing instruction can be recorded in time order (stored as a temporarily stored instruction) and saved to the instruction temporary storage module in real time, so that the original editing state can be restored when the user interrupts the current edit or has not finished editing. Since instructions are temporarily stored per editing object, multiple objects can be edited simultaneously; an unfinished editing instruction is saved, and editing of the object can continue when it is opened again.
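The instruction temporary storage module described above can be sketched as a per-object, time-ordered buffer (a minimal illustration under assumed names, not the embodiment's actual implementation):

```python
import time

class InstructionBuffer:
    """Temporary, per-object store of editing instructions, kept in time order.

    Instructions are held until editing of the object is finished, so an
    interrupted edit can resume and several objects can be edited in parallel."""

    def __init__(self):
        self._by_object = {}

    def record(self, object_id, instruction):
        # Record each instruction with its timestamp, in time order.
        self._by_object.setdefault(object_id, []).append((time.time(), instruction))

    def resume(self, object_id):
        """Instructions for an object, oldest first, to replay an interrupted edit."""
        return [ins for _, ins in sorted(self._by_object.get(object_id, []))]

    def finish(self, object_id):
        """Editing completed: delete the temporarily stored instructions."""
        self._by_object.pop(object_id, None)

buf = InstructionBuffer()
buf.record("image1", "crop")
buf.record("image2", "beautify")   # a second object, edited in parallel
buf.record("image1", "filter")
print(buf.resume("image1"))  # ['crop', 'filter']
```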
Optionally, within a preset time period after the second input is received (before the first object is updated to the second object), if a third input is received, the target editing instruction is saved. Wherein the third input is for controlling the first object to change from an editable state to a non-editable state.
It can be understood that the preset time can be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, the third input may be a click input of the user, or a slide input of the user, and the third input may also be other feasibility inputs, which may be determined according to actual use requirements, and the embodiment of the present application is not limited.
For example, reference may be made to the description of the click input and the slide input in the description of the first input in step 201, and details thereof are not repeated herein.
It is to be understood that the third input is for controlling the electronic device to stop editing the first object. For example, the third input may be an input that the user quits editing the first object, an input that the user switches the editing object, an input that the user clicks the first interface, or the like.
In the embodiment of the application, within a preset time period after the electronic device receives the second input, if a third input is received, the target editing instruction is stored, and then the first object can be edited according to the target editing instruction continuously when the first object is in an editable state next time.
Optionally, in this embodiment of the application, after step 203, the object editing method provided in this embodiment of the application may further include step 206 described below.
Step 206, after updating the first object to the second object, the electronic device generates a target instruction identifier based on the target editing instruction, and updates the target instruction identifier to the target control.
Wherein the target instruction identifier is a target tool identifier or a target template identifier.
It should be noted that, in this embodiment of the application, in the case that the target editing instruction is determined according to the input of the first target identifier by the user, the electronic device may store the target editing instruction as a target tool instruction (corresponding to the target tool identifier) or a target template instruction (corresponding to the target template identifier); in the case that the target editing instruction is determined according to the input of the second target identifier by the user, the electronic device may save the target editing instruction as a target tool instruction or a target template instruction.
In the embodiment of the application, the target tool instruction can be stored in the tool instruction library, and the execution is called from the tool instruction library when the target tool instruction is called; the target template instruction can be stored in a template instruction library, and the execution is called from the template instruction library when the target template instruction is called.
Optionally, when no instruction identifier indicating an editing instruction similar to that indicated by the target instruction identifier exists in the target control, the target instruction identifier is added to the target control; when a first instruction identifier indicating a similar editing instruction exists in the target control, the first instruction identifier is updated to the target instruction identifier.
Optionally, in this embodiment of the present application, the type of the similar instruction identifier (or the first instruction identifier) may not be limited. It will be appreciated that whether the target instruction identification is a target tool identification or a target template identification, the similar instruction identification (or first instruction identification) may be a tool identification or a template identification.
Optionally, in the case that the target instruction identifier is a target tool identifier, the similar instruction identifier (or the first instruction identifier) is a tool identifier; in the case where the target instruction identification is a target template identification, the similar instruction identification (or first instruction identification) is a template identification.
Illustratively, the target instruction identifier is a target template identifier, the target instruction identifier is added to the target control when the template identifier similar to the editing instruction indicated by the target instruction identifier does not exist in the target control, and the first template identifier in the target control is updated to the target instruction identifier when the first template identifier similar to the editing instruction indicated by the target instruction identifier exists in the target control.
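The add-or-replace behaviour for the target control can be sketched as follows; the similarity predicate here (same function letter) is a toy assumption standing in for whatever similarity judgment the electronic device applies:

```python
def update_control(control, target_id, similar):
    """Add the target instruction identifier, or replace an existing similar one.

    `control` is the list of identifiers in the target control; `similar` is a
    hypothetical predicate deciding whether two identifiers indicate similar
    editing instructions."""
    for i, existing in enumerate(control):
        if similar(existing, target_id):
            control[i] = target_id      # first similar identifier -> replaced
            return control
    control.append(target_id)           # no similar identifier -> appended
    return control

same_prefix = lambda a, b: a[0] == b[0]   # toy similarity: same function letter
print(update_control(["a1", "b2"], "a3", same_prefix))  # ['a3', 'b2']
print(update_control(["a1", "b2"], "c1", same_prefix))  # ['a1', 'b2', 'c1']
```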
In the embodiment of the application, after the first object is operated by the target editing instruction, the target editing instruction is saved as the template, and the template can be directly called when similar editing needs to be performed next time.
It should be noted that, in this embodiment of the application, when the second input is input to the second target identifier, that is, the target editing instruction is an existing template instruction, the user may adjust the template instruction according to a requirement, may set a modification parameter for each basic editing instruction in the template instruction separately, and may also drag the basic editing instruction in the template instruction to change a call sequence of the basic editing instruction.
Optionally, in this embodiment of the application, after the step 201 and before the step 202, the object editing method provided in this embodiment of the application may further include the following steps 207 to 208.
Step 207, the electronic device determines the target type to which the first object belongs.
And step 208, the electronic device determines the target control corresponding to the target type.
It is understood that the type of the first object is judged according to the content information of the first object, i.e., the target type is determined. Generally, the types include pictures, documents, audio, video and the like, each type has a corresponding target control, and editing instructions in each target control can be classified according to similar functions.
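Steps 207 and 208 amount to a lookup from object type to target control. A minimal sketch — the type table, control contents, and extension-based type inference are all illustrative assumptions (the embodiment judges type from content information):

```python
# Hypothetical table from object type to the identifiers shown in its target control.
CONTROLS = {
    "picture":  ["crop", "beautify", "filter"],
    "document": ["font", "layout"],
    "audio":    ["trim", "denoise"],
    "video":    ["cut", "subtitle"],
}

def target_control_for(first_object):
    """Determine the object's target type, then pick the corresponding target control."""
    # A real implementation would inspect content information; here the type is
    # inferred from an illustrative file extension.
    ext = first_object.rsplit(".", 1)[-1].lower()
    object_type = {"jpg": "picture", "png": "picture", "txt": "document",
                   "mp3": "audio", "mp4": "video"}.get(ext, "document")
    return CONTROLS[object_type]

print(target_control_for("photo.JPG"))  # ['crop', 'beautify', 'filter']
```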
Optionally, in this embodiment of the present application, after the step 201, the object editing method provided in this embodiment of the present application may further include the following steps 209 to 211.
Step 209, the electronic device displays at least one object in a to-be-edited state on the first interface.
Step 210, the electronic device receives a fourth input from the user.
Optionally, the fourth input may be a click input of the user on the first object, or a slide input of the user on the first object, and the fourth input may also be another feasibility input, which may be determined according to actual use requirements, and the embodiment of the present application is not limited.
For example, reference may be made to the description of the click input and the slide input in the description of the first input in step 201, and details thereof are not repeated herein.
Step 211, the electronic device responds to the fourth input, and switches the first object from the editable state to the to-be-edited state, and switches the third object from the to-be-edited state to the editable state.
And the third object is an object corresponding to the fourth input in the at least one object in the state to be edited.
Optionally, at least one object in the to-be-edited state and the first object in the editable state are displayed in an overlapping manner, or the at least one object in the to-be-edited state and the first object in the editable state form a slidable horizontal list, and the first object currently in the editable state is in the middle of the horizontal list, as shown in fig. 7.
Illustratively, as shown in fig. 2, the user clicks a first object in the first interface, opens an editing interface in the first interface, and enters an editing process. The editing interface comprises a first object in an editable state and also comprises a plurality of uncompleted editing objects (objects in a to-be-edited state), the objects are displayed in a stacked mode, different editing objects can be distinguished by frame colors (or distinguished by other modes, and the embodiment of the application is not limited), and a user can slide up and down after pressing the first object for a long time to switch the states of the editing objects.
Illustratively, the contents of the above steps 201 to 211 can be further understood through the following descriptions of the steps 101 to 104.
Step 101, mapping the hover tool bar (target control) with an editing application tool bar (an application to which the editing instruction belongs, for example, a second application).
A mapping logic diagram of the hover toolbar and the editing application toolbar is shown in fig. 8 (in which a basic editing instruction is denoted by the mark "5" and a combined editing instruction by the mark "6"). Generally, each type of editing object corresponds to a plurality of editing applications and editing services. When the mapping toolbar is called in the background, each tool can be mapped to a plurality of editing tools from the editing applications and editing services, and each tool can be independently switched to a different mapping tool. These tool mappings can be divided into basic mapping tools (basic editing instructions) and combined mapping tools (combined editing instructions).
The effect is that when the user clicks an instruction and an editing object, the background calling service directionally invokes the user-selected application and service for each tool; each tool may thus come from a different service, and after processing is completed, the instruction result is immediately returned to the editing interface visible to the user.
1. Basic mapping tool: the homogeneous services A1, A2 and A3 are classified and integrated. For example, for a picture editing object, tool A may be a crop, beautify or cut instruction, where the subscript indicates the provider of the service, binding a number of services An to A. The user long-presses tool A to edit it: the tool expands a list of the specific homogeneous services, a click switches the mapping from A to An, and a long press on the expanded toolbar adds or deletes tool mappings.
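As a minimal sketch of the base mapping tool described above (assuming a simple function-based service model; the class and all names below are illustrative, not from the patent), a tool is bound to several homogeneous services A1, A2, ..., and the active mapping can be switched independently:

```python
class BaseTool:
    """A base mapping tool bound to several homogeneous services (illustrative)."""

    def __init__(self, name, services):
        self.name = name
        self.services = list(services)  # homogeneous services A1, A2, A3, ...
        self.active = 0                 # index of the currently mapped service

    def switch(self, index):
        # click in the expanded service list: switch the mapping from A to An
        self.active = index

    def invoke(self, target):
        # directionally call the currently selected service in the background
        return self.services[self.active](target)


# Two hypothetical "crop" providers standing in for A1 and A2.
crop = BaseTool("crop", [lambda img: "A1:" + img, lambda img: "A2:" + img])
crop.switch(1)                # user switches the tool's mapping to A2
print(crop.invoke("photo"))   # A2:photo
```

The point of the sketch is only the independent switchability: the tool keeps its place in the toolbar while the backing service changes.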
2. Combined mapping tool: for a complex combined operation T with a specific user requirement, a call chain A2 -> C1 of the services of the basic operations is generated, and the whole call chain is mapped to a combined tool T. The background calling module calls the combined instructions in sequence, and the input of each instruction is the output of the previous instruction. The user long-presses the T tool to edit it: the tool expands the specific called services, dragging switches the calling order, and clicking a specific basic operation switches among the homogeneous services A1, A2 and A3. The combined mapping toolbar is shown in fig. 3(b), in which the call instruction chain composed of basic operations is expanded horizontally, and the plurality of services of basic instruction A is expanded vertically.
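The call chain of the combined mapping tool, in which each instruction's input is the previous instruction's output, can be sketched as follows (a hedged illustration; `a2` and `c1` merely stand in for the services A2 and C1 named in the text):

```python
def a2(x):
    # placeholder for service A2 of the first basic operation
    return x + "|A2"

def c1(x):
    # placeholder for service C1 of the second basic operation
    return x + "|C1"

def run_chain(chain, obj):
    """Call the combined instructions in sequence; each step consumes the
    previous step's output, as the background calling module does."""
    result = obj
    for instruction in chain:
        result = instruction(result)
    return result

print(run_chain([a2, c1], "region"))  # region|A2|C1
```

Dragging to reorder the chain corresponds to reordering the list passed to `run_chain`.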
Step 102, the floating ball is expanded into the editing toolbar, and the floating-ball editing function is started.
When the user enters the editing interface to edit, the floating ball expands the editing toolbar corresponding to the current editing object, and the user clicks to edit. The options corresponding to each tool are listed at the bottom of the interface. The editing steps are as follows.
a. Determining the editing tool (target editing instruction): the user clicks the hover toolbar to select an editing tool (the first target identifier or the second target identifier).
b. Determining the action areas: the action area corresponding to each basic editing instruction is determined (the action area can be selected by the user through an input, or automatically matched by the electronic device). A basic editing instruction selects only one action area; for a combined editing instruction, a plurality of action areas are selected in sequence according to the specific calling instructions, the plurality of action areas may overlap, and one action area may correspond to at least one basic editing instruction.
c. Editing the plurality of action areas (for example, the first region and the second region in the above-described embodiment): an action area without an overlapping area (a non-overlapping action area) is clipped normally. For the clipping of a plurality of action areas with overlapping areas, the areas are treated as a plurality of polygons, the intersection of the polygons is calculated, the overlapping block and the non-overlapping blocks of each action area are computed from the intersection, and the image is re-segmented. For non-overlapping action areas and the non-overlapping blocks of overlapping areas, only the single region-action instruction is applied; for the overlapping block C_overlapping of an overlapping area, consecutive instructions are used: the instruction C1 of the current region is appended after the instruction A2 of the overlapped previous region, and the background server receives the call instruction chain A2 -> C1 together with the overlap-area point set.
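A simplified illustration of step c, assuming axis-aligned rectangular action areas rather than the general polygon case described above (`intersect` is a hypothetical helper, not part of the patent):

```python
def intersect(r1, r2):
    """Intersection of two axis-aligned rectangles (x1, y1, x2, y2),
    or None when the action areas do not overlap."""
    x1, y1 = max(r1[0], r2[0]), max(r1[1], r2[1])
    x2, y2 = min(r1[2], r2[2]), min(r1[3], r2[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

region_a = (0, 0, 4, 4)  # action area of the earlier instruction (A2)
region_c = (2, 2, 6, 6)  # action area of the current instruction (C1)

overlap = intersect(region_a, region_c)
print(overlap)  # (2, 2, 4, 4)
# The non-overlapping blocks each get their single instruction; the overlap
# block would receive the consecutive chain A2 -> C1.
```

The real scheme computes polygon intersections and re-segments the image into overlap and non-overlap blocks; rectangles are used here only to keep the sketch self-contained.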
d. Comparing the effects: the returned result for the action area of each basic editing instruction is stacked on the original image as a layer. According to the editing effect, the user can choose to save the edited object, or withdraw the previous instruction edit and modify the editing instruction for re-editing (for example, withdraw the edit of the target editing instruction on the first object and reselect an editing instruction), so as to obtain an editing effect that satisfies the user. In particular, for a modification of region A in the overlapped part, the combined mapping tool T is called, and A2 -> C1 is executed in sequence to ensure the consistency of the action region of the subsequent instruction.
e. The layers of the plurality of areas are spliced and combined on the original image. Because different tools have different processing effects, the splicing seams need image-fusion processing to ensure a smooth transition of the spliced images; for example, the images are blended by weighted averaging and Poisson fusion, achieving an effect that satisfies the user. Other fusion techniques, or combinations of multiple fusion techniques, are also possible; for the relevant fusion means and technology, reference may be made to the related art, and the embodiments of the present application are not limited thereto.
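The weighted-average option mentioned for seam fusion can be illustrated as a one-dimensional linear blend across the seam (a sketch only, with scalar pixel values; Poisson fusion and real two-dimensional blending are omitted):

```python
def blend_seam(left_px, right_px, width):
    """Linearly interpolate across a seam of `width` pixels (width >= 2),
    so the two differently processed sides transition smoothly."""
    return [left_px + (right_px - left_px) * i / (width - 1)
            for i in range(width)]

# A seam between a region processed by one tool (value 100) and a region
# processed by another (value 200):
print(blend_seam(100.0, 200.0, 5))  # [100.0, 125.0, 150.0, 175.0, 200.0]
```

Each seam pixel is a weighted average of the two sides, with weights proportional to distance, which is the simplest instance of the weighted-average fusion the text names.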
Step 103, the background calls the target editing instruction.
The user clicks the floating-ball toolbar to edit, and the background correspondingly calls the target editing instruction; specifically, the corresponding editing instruction and the object region on which it acts are transmitted to the corresponding background application and service for processing, and the processing result is returned after processing is finished.
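The background call of step 103 can be sketched as a dispatch from the instruction to its owning service, with the instruction and its action region handed over and the result returned (all names here are illustrative assumptions):

```python
def background_call(services, instruction, region, obj):
    """Hand the editing instruction and its action region to the owning
    background application/service and return the processing result."""
    handler = services[instruction]  # the application the instruction belongs to
    return handler(obj, region)

# Hypothetical registry mapping instruction names to background services.
services = {"crop": lambda obj, region: (obj, "cropped", region)}

print(background_call(services, "crop", (0, 0, 4, 4), "photo"))
```

The result would then be rendered back onto the editing interface visible to the user, as described above.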
Step 104, the user clicks to share or exit.
The user clicks to share, and the edited object is shared to the interface of another application; or the user clicks to retract the floating ball and return to the first interface.
It should be noted that, in the object editing method provided in the embodiments of the present application, the execution subject may be an object editing apparatus, or a functional module and/or a functional entity in the object editing apparatus for executing the object editing method. In the embodiments of the present application, the object editing apparatus provided herein is described by taking the object editing apparatus executing the object editing method as an example.
Fig. 9 shows a schematic diagram of a possible structure of an object editing apparatus according to an embodiment of the present application. As shown in fig. 9, the object editing apparatus 300 may include: a receiving module 301, a display module 302, an execution module 303 and an update module 304; the receiving module 301 is configured to receive a first input of a user to a first object displayed in a first interface, where the first interface is an interface in a first application; the display module 302 is configured to display, in response to the first input received by the receiving module 301, a first object in an editable state on a first interface and display a target control, where the target control includes at least one editing instruction; the receiving module 301 is configured to receive a second input of the user, where the second input is used to determine a target editing instruction; the executing module 303, configured to, in response to the second input received by the receiving module 301, obtain a second object by invoking a second application in the background to perform a target editing operation on the first object; the updating module 304 is used for updating the first object into a second object; the second application is an application to which the target editing instruction belongs, and the target editing operation is an editing operation corresponding to the target editing instruction.
Optionally, the target control comprises at least one of: N first identifiers and M second identifiers. Each first identifier is a basic tool identifier or a combination tool identifier; each combination tool identifier is a combined identifier of at least two basic tool identifiers; each basic tool identifier comprises at least one instruction tool identifier, each instruction tool identifier in one basic tool identifier is used for indicating a different basic editing instruction, and N is a positive integer. Each second identifier is a basic template identifier or a combined template identifier; each basic template identifier comprises at least one basic instruction template identifier, and each basic instruction template identifier in one basic template identifier is used for indicating a different basic editing instruction; each combined template identifier comprises at least one combined instruction template identifier, each combined instruction template identifier is a combined identifier of at least one basic instruction template identifier, each basic instruction template identifier in one combined instruction template identifier is used for indicating a different basic editing instruction, and M is a positive integer.
Optionally, in a case that the target control includes the N first identifiers, each instruction tool identifier in one basic tool identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions. In a case that the target control includes the M second identifiers, each basic instruction template identifier in one basic template identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions; each combined instruction template identifier in one combined template identifier is: a combined identifier of S basic instruction template identifiers with different functions.
Optionally, in a case that the target control includes the N first identifiers, the second input includes a user input to a first target identifier of the N first identifiers; in the case that the first target identifier is a basic tool identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the first target identifier; in the case that the first target identifier is a combination tool identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the first target identifier.
Optionally, in a case that the first target identifier is a basic tool identifier, the second input is used for updating the target instruction tool identifier from the instruction tool identifiers included in the first target identifier. In a case that the first target identifier is a combination tool identifier, the second input includes at least one of: an input used for adjusting the arrangement order of the basic tool identifiers included in the first target identifier; and an input to a target basic tool identifier, used for updating the target instruction tool identifier from the instruction tool identifiers included in the target basic tool identifier. The target basic tool identifier is a basic tool identifier included in the first target identifier, and the target instruction tool identifier is an instruction tool identifier used for determining the target editing instruction.
Optionally, in a case that the target control includes the M second identifiers, the second input includes a user input to a second target identifier of the M second identifiers; in the case that the second target identifier is a basic template identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the second target identifier; in the case that the second target identifier is a combined template identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the second target identification.
Optionally, the second input is further used for determining at least one region, each region being: in the first object, an action region corresponding to a target basic editing instruction, where the target basic editing instruction is a basic editing instruction included in the target editing instruction.
Optionally, the at least one region includes a first region and a second region, the first region and the second region have an overlapping region, the first region is an action region corresponding to the first editing instruction, the second region is an action region corresponding to the second editing instruction, and the first editing instruction and the second editing instruction are two editing instructions in the target editing instruction; the executing module 303 is specifically configured to invoke the third application to execute the editing operation corresponding to the first editing instruction on the first non-overlapping area in the background, invoke the fourth application to execute the editing operation corresponding to the second editing instruction on the second non-overlapping area in the background, and sequentially execute the following steps on the overlapping area: the editing operation corresponding to the first editing instruction is executed by calling a third application in the background, and the editing operation corresponding to the second editing instruction is executed by calling a fourth application in the background; the first non-overlapping area is an area except the overlapping area in the first area, and the second non-overlapping area is an area except the overlapping area in the second area; the third application is: in the second application, the application to which the first editing instruction belongs; the fourth application is as follows: and in the second application, the application to which the second editing instruction belongs.
Optionally, in a case that the second input includes an input of the user to a second target identifier in the M second identifiers, each basic instruction template identifier is further used to indicate a template action region corresponding to the corresponding basic editing instruction, where each region is: a region determined based on the template action region corresponding to the target basic editing instruction.
Optionally, the object editing apparatus further includes at least one of: a saving module and a generating module. The saving module is used for saving the target editing instruction for the first object after the second input of the user is received and before the first object is updated to the second object. The generating module is used for generating a target instruction identifier based on the target editing instruction after the first object is updated to the second object, and the updating module is further used for updating the target instruction identifier to the target control, where the target instruction identifier is a target tool identifier or a target template identifier.
Optionally, the object editing apparatus further includes: a determination module; the determining module is used for determining a target type to which the first object belongs before the target control is displayed; and determining the target control corresponding to the target type.
Optionally, the object editing apparatus further includes: a switching module; the display module is further used for displaying at least one object in a state to be edited on the first interface after receiving first input of a user on the first object displayed in the first interface; the receiving module is further used for receiving a fourth input of the user; the switching module is further configured to, in response to the fourth input received by the receiving module, switch the first object from the editable state to the to-be-edited state, and switch the third object from the to-be-edited state to the editable state, where the third object is an object corresponding to the fourth input in the at least one object in the to-be-edited state.
The embodiment of the application provides an object editing apparatus, which can receive a first input of a first object displayed in a first interface (interface in a first application) from a user; in response to the first input, displaying the first object in an editable state on the first interface, and displaying a target control (including at least one editing instruction); receiving a second input of the user, wherein the second input is used for determining a target editing instruction; and responding to the second input, executing target editing operation (editing operation corresponding to the target editing instruction) on the first object by calling a second application (namely the application to which the target editing instruction belongs) in the background to obtain a second object, and updating the first object into the second object. According to the scheme, the first object in an editable state and the target control (the target control is associated with the editing instruction of the second application) can be displayed on the current interface of the first application, the electronic equipment is triggered to determine the target editing instruction through the second input of the target control, the target editing operation corresponding to the target editing instruction can be carried out on the first object through calling the second application in the background, the second object is obtained, and the first object is updated to the second object. Therefore, operations such as saving the first object in the first application, opening the second application, opening the first object in the second application and the like are not needed, so that the operation steps can be simplified, the operation time can be saved, and the operation efficiency can be improved.
The object editing apparatus in the embodiment of the present application may be an apparatus, and may also be an electronic device or a component, an integrated circuit, or a chip in the electronic device. The electronic device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The object editing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The object editing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 8, and can achieve the same technical effect, and is not described here again to avoid repetition.
Optionally, as shown in fig. 10, an electronic device 400 is further provided in this embodiment of the present application, and includes a processor 401, a memory 402, and a program or an instruction stored in the memory 402 and executable on the processor 401, where the program or the instruction is executed by the processor 401 to implement each process of the above-mentioned embodiment of the object editing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application. The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The user input unit 507 is configured to receive a first input of a user to a first object displayed in a first interface, where the first interface is an interface in a first application; a display unit 506, configured to display, in response to a first input, a first object in an editable state on a first interface, and display a target control, where the target control includes at least one editing instruction; the user input unit 507 is further configured to receive a second input of the user, where the second input is used to determine a target editing instruction; a processor 510, configured to, in response to a second input, obtain a second object by invoking a second application in the background to perform a target editing operation on the first object, and update the first object to the second object; the second application is an application to which the target editing instruction belongs, and the target editing operation is an editing operation corresponding to the target editing instruction.
Optionally, the target control comprises at least one of: N first identifiers and M second identifiers. Each first identifier is a basic tool identifier or a combination tool identifier; each combination tool identifier is a combined identifier of at least two basic tool identifiers; each basic tool identifier comprises at least one instruction tool identifier, each instruction tool identifier in one basic tool identifier is used for indicating a different basic editing instruction, and N is a positive integer. Each second identifier is a basic template identifier or a combined template identifier; each basic template identifier comprises at least one basic instruction template identifier, and each basic instruction template identifier in one basic template identifier is used for indicating a different basic editing instruction; each combined template identifier comprises at least one combined instruction template identifier, each combined instruction template identifier is a combined identifier of at least one basic instruction template identifier, each basic instruction template identifier in one combined instruction template identifier is used for indicating a different basic editing instruction, and M is a positive integer.
Optionally, in a case that the target control includes the N first identifiers, each instruction tool identifier in one basic tool identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions. In a case that the target control includes the M second identifiers, each basic instruction template identifier in one basic template identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions; each combined instruction template identifier in one combined template identifier is: a combined identifier of S basic instruction template identifiers with different functions.
Optionally, in a case that the target control includes the N first identifiers, the second input includes a user input to a first target identifier of the N first identifiers; in the case that the first target identifier is a basic tool identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the first target identifier; in the case that the first target identifier is a combination tool identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the first target identifier.
Optionally, in a case that the first target identifier is a basic tool identifier, the second input is used for updating the target instruction tool identifier from the instruction tool identifiers included in the first target identifier. In a case that the first target identifier is a combination tool identifier, the second input includes at least one of: an input used for adjusting the arrangement order of the basic tool identifiers included in the first target identifier; and an input to a target basic tool identifier, used for updating the target instruction tool identifier from the instruction tool identifiers included in the target basic tool identifier. The target basic tool identifier is a basic tool identifier included in the first target identifier, and the target instruction tool identifier is an instruction tool identifier used for determining the target editing instruction.
Optionally, in a case that the target control includes the M second identifiers, the second input includes a user input to a second target identifier of the M second identifiers; in the case that the second target identifier is a basic template identifier, the target editing instruction is: the basic editing instruction corresponding to the second input is selected from at least one basic editing instruction indicated by the second target identifier; in the case that the second target identifier is a combined template identifier, the target editing instruction is: and the combined editing instruction corresponding to the second input is selected from the at least one combined editing instruction indicated by the second target identification.
Optionally, the second input is further used for determining at least one region, each region being: in the first object, an action region corresponding to a target basic editing instruction, where the target basic editing instruction is a basic editing instruction included in the target editing instruction.
Optionally, the at least one region includes a first region and a second region, the first region and the second region have an overlapping region, the first region is an action region corresponding to the first editing instruction, the second region is an action region corresponding to the second editing instruction, and the first editing instruction and the second editing instruction are two editing instructions in the target editing instruction; the processor 510 is specifically configured to invoke a third application in the background to execute an editing operation corresponding to a first editing instruction on a first non-overlapping area, and invoke a fourth application in the background to execute an editing operation corresponding to a second editing instruction on a second non-overlapping area, and sequentially execute the following steps on the overlapping area: the editing operation corresponding to the first editing instruction is executed by calling a third application in the background, and the editing operation corresponding to the second editing instruction is executed by calling a fourth application in the background; the first non-overlapping area is an area except the overlapping area in the first area, and the second non-overlapping area is an area except the overlapping area in the second area; the third application is: in the second application, the application to which the first editing instruction belongs; the fourth application is as follows: and in the second application, the application to which the second editing instruction belongs.
Optionally, in a case that the second input includes an input of the user to a second target identifier in the M second identifiers, each basic instruction template identifier is further used to indicate a template action region corresponding to the corresponding basic editing instruction, where each region is: a region determined based on the template action region corresponding to the target basic editing instruction.
Optionally, the processor 510 is further configured to perform at least one of: after receiving the second input of the user, before updating the first object into the second object, saving the target editing instruction for the first object; after the first object is updated to the second object, a target instruction identification is generated based on the target editing instruction, and the target instruction identification is updated to the target control, wherein the target instruction identification is a target tool identification or a target template identification.
Optionally, the processor 510 is further configured to determine, before the displaying the target control, a target type to which the first object belongs; and determining the target control corresponding to the target type.
Optionally, the display unit 506 is further configured to display at least one object in a state to be edited on the first interface after the first input of the user on the first object displayed in the first interface is received; a user input unit 507, further configured to receive a fourth input by the user; the processor 510 is further configured to, in response to a fourth input, switch the first object from the editable state to the to-be-edited state, and switch a third object from the to-be-edited state to the editable state, where the third object is an object corresponding to the fourth input, in the at least one object in the to-be-edited state.
The electronic device provided by the embodiment of the application can receive a first input of a first object displayed in a first interface (interface in a first application) from a user; in response to the first input, displaying the first object in an editable state on the first interface, and displaying a target control (including at least one editing instruction); receiving a second input of the user, wherein the second input is used for determining a target editing instruction; and responding to the second input, executing target editing operation (editing operation corresponding to the target editing instruction) on the first object by calling a second application (namely the application to which the target editing instruction belongs) in the background to obtain a second object, and updating the first object into the second object. According to the scheme, the first object in an editable state and the target control (the target control is associated with the editing instruction of the second application) can be displayed on the current interface of the first application, the electronic equipment is triggered to determine the target editing instruction through the second input of the target control, the target editing operation corresponding to the target editing instruction can be carried out on the first object through calling the second application in the background, the second object is obtained, and the first object is updated to the second object. Therefore, operations such as saving the first object in the first application, opening the second application, opening the first object in the second application and the like are not needed, so that the operation steps can be simplified, the operation time can be saved, and the operation efficiency can be improved.
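The overall flow summarized above (first input displays the editable object and the target control; second input resolves a target editing instruction, invokes the second application in the background, and updates the object) can be sketched minimally. The `Editor` class, its method names, and the callable-per-instruction mapping are illustrative assumptions, not an API from the disclosure.

```python
class Editor:
    def __init__(self, first_app, instruction_to_app):
        self.first_app = first_app
        # maps each editing instruction to the (second) application it belongs to;
        # applications are modeled as callables for illustration
        self.instruction_to_app = instruction_to_app
        self.current = None

    def on_first_input(self, first_object):
        """Display the first object in an editable state plus the target
        control, which is associated with the available editing instructions."""
        self.current = first_object
        return list(self.instruction_to_app)  # the target control's instructions

    def on_second_input(self, target_instruction, payload):
        """Resolve the target editing instruction, invoke the application it
        belongs to in the background, and update the first object to the
        resulting second object."""
        second_app = self.instruction_to_app[target_instruction]
        second_object = second_app(self.current, payload)  # background call
        self.current = second_object                       # update in place
        return second_object
```

The point of the flow is that no save/open round-trip through the second application is needed: the edited result replaces the displayed object directly.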
It should be understood that, in the embodiment of the present application, the radio frequency unit 501 may be used for receiving and sending signals during a message sending/receiving process or a call process; specifically, it receives downlink data from a base station and forwards the downlink data to the processor 510 for processing, and it transmits uplink data to the base station. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system. The electronic device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media. The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. The audio output unit 503 may also provide audio output related to a specific function performed by the electronic device 500 (e.g., a call signal reception sound or a message reception sound). The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 506 may include a display panel 5061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 507 includes a touch panel 5071, also referred to as a touch screen, and other input devices 5072. The touch panel 5071 may include two parts: a touch detection device and a touch controller.
Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in further detail herein. The memory 509 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems. Processor 510 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned embodiment of the object editing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above object editing method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An object editing method, characterized in that the method comprises:
receiving a first input of a user to a first object displayed in a first interface, wherein the first interface is an interface in a first application;
in response to the first input, displaying a first object in an editable state on the first interface, and displaying a target control, the target control comprising at least one editing instruction;
receiving a second input of the user, wherein the second input is used for determining a target editing instruction;
responding to the second input, calling a second application in the background to execute target editing operation on the first object to obtain a second object, and updating the first object into the second object;
the second application is an application to which the target editing instruction belongs, and the target editing operation is an editing operation corresponding to the target editing instruction.
2. The method of claim 1, wherein the target control comprises at least one of:
n first marks and M second marks;
each first identifier is a basic tool identifier or a combined tool identifier, each combined tool identifier is a combined identifier of at least two basic tool identifiers, each basic tool identifier comprises at least one instruction tool identifier, each instruction tool identifier in one basic tool identifier is used for indicating different basic editing instructions, and N is a positive integer;
each second identifier is a basic template identifier or a combined template identifier, each basic template identifier comprises at least one basic instruction template identifier, and each basic instruction template identifier in one basic template identifier is used for indicating different basic editing instructions; each combined template identifier comprises at least one combined instruction template identifier, each combined instruction template identifier is a combined identifier of at least one basic instruction template identifier, each basic instruction template identifier in one combined instruction template identifier is used for indicating different basic editing instructions, and M is a positive integer.
3. The method of claim 2, wherein in the case that the target control includes the N first identifiers, each instruction tool identifier in the one base tool identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions;
in a case that the target control includes the M second identifiers, each basic instruction template identifier in one basic template identifier is used to indicate: basic editing instructions from different services and with similar functions, or basic editing instructions from the same service and with different functions; each combined instruction template identifier in one combined template identifier is: a combined identifier of S basic instruction template identifiers with different functions.
4. The method of claim 2, wherein, in the case that the target control includes the N first identifiers, the second input includes user input of a first target identifier of the N first identifiers;
in the case that the first target identifier is a basic tool identifier, the target editing instruction is: a basic editing instruction, corresponding to the second input, in the at least one basic editing instruction indicated by the first target identifier; the second input is used for updating the target instruction tool identifier from among the instruction tool identifiers included in the first target identifier;
in the case that the first target identifier is a combination tool identifier, the target editing instruction is: in at least one combined editing instruction indicated by the first target identifier, a combined editing instruction corresponding to the second input; the second input includes at least one of:
an input for adjusting the arrangement order of the basic tool identifiers included in the first target identifier; an input for updating the target instruction tool identifier from among the instruction tool identifiers included in the target basic tool identifier;
wherein the target basic tool identifier is a basic tool identifier included in the first target identifier, and the target instruction tool identifier is an instruction tool identifier used for determining the target editing instruction.
5. The method of claim 2, wherein, in the case that the target control includes the M second identifiers, the second input includes user input of a second target identifier of the M second identifiers;
in the case that the second target identifier is a basic template identifier, the target editing instruction is: a basic editing instruction, corresponding to the second input, in the at least one basic editing instruction indicated by the second target identifier;
in the case that the second target identifier is a combined template identifier, the target editing instruction is: in the at least one combined editing instruction indicated by the second target identifier, the combined editing instruction corresponding to the second input.
6. The method of claim 4 or 5, wherein the second input is further used to determine at least one region, each region being: an action region, in the first object, corresponding to a target basic editing instruction, and the target basic editing instruction is a basic editing instruction included in the target editing instruction.
7. The method according to claim 6, wherein the at least one region comprises a first region and a second region, the first region and the second region have an overlapping region, the first region is an action region corresponding to a first editing instruction, the second region is an action region corresponding to a second editing instruction, and the first editing instruction and the second editing instruction are two editing instructions of the target editing instructions;
the performing a target editing operation on the first object by invoking a second application in the background comprises:
invoking a third application in the background to execute the editing operation corresponding to the first editing instruction on a first non-overlapping area, invoking a fourth application in the background to execute the editing operation corresponding to the second editing instruction on a second non-overlapping area, and sequentially executing the following on the overlapping area: invoking the third application in the background to execute the editing operation corresponding to the first editing instruction, and invoking the fourth application in the background to execute the editing operation corresponding to the second editing instruction;
wherein the first non-overlapping area is the area of the first region other than the overlapping area, and the second non-overlapping area is the area of the second region other than the overlapping area; the third application is: in the second application, the application to which the first editing instruction belongs; the fourth application is: in the second application, the application to which the second editing instruction belongs.
8. The method of any one of claims 1 to 5, wherein after receiving the second input from the user, the method further comprises at least one of:
saving the target editing instructions for the first object before updating the first object to the second object;
after the first object is updated to the second object, a target instruction identifier is generated based on the target editing instruction, and the target instruction identifier is updated to the target control, wherein the target instruction identifier is a target tool identifier or a target template identifier.
9. An object editing apparatus, characterized in that the apparatus comprises: the device comprises a receiving module, a display module, an execution module and an updating module;
the receiving module is used for receiving a first input of a user to a first object displayed in a first interface, and the first interface is an interface in a first application;
the display module is used for responding to the first input received by the receiving module, displaying a first object in an editable state on the first interface, and displaying a target control, wherein the target control comprises at least one editing instruction;
the receiving module is further used for receiving a second input of the user, wherein the second input is used for determining a target editing instruction;
the execution module is used for responding to the second input received by the receiving module, and calling a second application to execute target editing operation on the first object in a background to obtain a second object;
the updating module is used for updating the first object into the second object;
the second application is an application to which the target editing instruction belongs, and the target editing operation is an editing operation corresponding to the target editing instruction.
10. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the object editing method as claimed in any one of claims 1 to 8.
CN202110213643.4A 2021-02-25 2021-02-25 Object editing method and device and electronic equipment Pending CN112947923A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110213643.4A CN112947923A (en) 2021-02-25 2021-02-25 Object editing method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN112947923A true CN112947923A (en) 2021-06-11

Family

ID=76246277

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110213643.4A Pending CN112947923A (en) 2021-02-25 2021-02-25 Object editing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112947923A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113470701A (en) * 2021-06-30 2021-10-01 深圳市斯博科技有限公司 Audio and video editing method and device, computer equipment and storage medium
CN114510166A (en) * 2022-04-01 2022-05-17 深圳传音控股股份有限公司 Operation method, intelligent terminal and storage medium
CN114610429A (en) * 2022-03-14 2022-06-10 北京达佳互联信息技术有限公司 Multimedia interface display method and device, electronic equipment and storage medium
CN115484396A (en) * 2021-06-16 2022-12-16 荣耀终端有限公司 Video processing method and electronic equipment
CN116775197A (en) * 2023-07-06 2023-09-19 北京五木恒润科技有限公司 Algorithm editing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130235067A1 (en) * 2012-03-06 2013-09-12 Apple Inc. Color adjustors for color segments
CN103870277A (en) * 2014-03-18 2014-06-18 广州市纬志电子科技有限公司 Interface editing software
CN111126301A (en) * 2019-12-26 2020-05-08 腾讯科技(深圳)有限公司 Image processing method and device, computer equipment and storage medium
CN111352557A (en) * 2020-02-24 2020-06-30 北京字节跳动网络技术有限公司 Image processing method, assembly, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
CN112947923A (en) Object editing method and device and electronic equipment
CN107426403B (en) Mobile terminal
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
CN104246678A (en) Device, method, and graphical user interface for sharing a content object in a document
CN104750450A (en) File sharing method in IM (Instant Messaging) and terminal
WO2006044834A2 (en) Desktop alert management
WO2023030306A1 (en) Method and apparatus for video editing, and electronic device
CN115357158A (en) Message processing method and device, electronic equipment and storage medium
WO2022222672A1 (en) Multi-content parallel display method and apparatus, terminal, medium, and program product
CN112148167A (en) Control setting method and device and electronic equipment
CN114518822A (en) Application icon management method and device and electronic equipment
CN113268182B (en) Application icon management method and electronic device
CN114415886A (en) Application icon management method and electronic equipment
JP2023529219A (en) Picture processing method, apparatus and electronic equipment
CN113485853A (en) Information interaction method and device and electronic equipment
WO2023179539A1 (en) Video editing method and apparatus, and electronic device
CN115617225A (en) Application interface display method and device, electronic equipment and storage medium
CN115576463A (en) Background application management method and device, electronic equipment and medium
CN113325986B (en) Program control method, program control device, electronic device and readable storage medium
CN115167721A (en) Display method and device of functional interface
CN114911389A (en) Screen display method and device, electronic equipment and readable storage medium
CN114679546A (en) Display method and device, electronic equipment and readable storage medium
CN114721565A (en) Application program starting method and device, electronic equipment and storage medium
CN114397989A (en) Parameter value setting method and device, electronic equipment and storage medium
CN114564271A (en) Chat window information input method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination