CN112395838A - Object synchronous editing method, device, equipment and readable storage medium - Google Patents


Publication number
CN112395838A
Authority
CN
China
Prior art keywords
editing
target
target object
synchronous
size
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910750040.0A
Other languages
Chinese (zh)
Other versions
CN112395838B (en)
Inventor
林晓晴
许铭洁
曾奋飞
赖志强
周梓煜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910750040.0A
Publication of CN112395838A
Application granted
Publication of CN112395838B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, a device, equipment, and a readable storage medium for synchronously editing objects. With this method, when a synchronous editing request from a user for a target object is received, the corresponding synchronous editing information can be acquired, and the association identifier and element editing parameter of the target object element to be edited are determined. The associated object element is then determined according to the association identifier, and the target object element and the associated object element are each edited based on the element editing parameter. As a result, the user can synchronously edit the objects associated with a target object simply by editing the target object.

Description

Object synchronous editing method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to an object synchronous editing method, apparatus, device, and readable storage medium.
Background
For an activity or project, there is often a need for promotion and publicity. A designer can create corresponding planar design works for the activity or project to publicize and promote it. The planar design works can be real planar designs (such as billboards, commodity packaging, covers, and the like) or virtual planar designs (such as page advertisements and the like).
However, one activity or project is generally promoted through multiple publicity entrances or exposure placements (e.g., physical advertising spaces in different locations, advertising slots on different application pages, etc.), and these entrances and placements differ in size and display form. A planar design work therefore often needs to be adapted, modified, and rearranged for each publicity entrance or exposure placement. Such work requires no design originality, yet consumes a large amount of a designer's communication effort and time; for an activity or project with large publicity and promotion demands, it brings high time and labor costs. Moreover, once the promotion requirements change during the course of the activity or project, the corresponding modification cost is also very high.
Disclosure of Invention
An object of the present invention is to provide a new technical solution for synchronously editing objects.
According to a first aspect of the present invention, there is provided an object synchronous editing method, including:
responding to an object synchronous editing request implemented on a target object, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an association identifier and an element editing parameter of a target object element to be edited in the target object;
determining an associated object element having the same association identifier as the target object element in an associated object associated with the target object;
and respectively editing the target object element and the associated object element according to the element editing parameters, so as to realize the synchronous editing of the target object and the associated object.
According to a second aspect of the present invention, there is provided a display method for object synchronous editing, including:
displaying, by a user device, a canvas;
displaying a target object and the associated object associated with the target object through the canvas, so that a user can implement object synchronous editing on the target object and the associated object through the object synchronous editing method according to any item in the first aspect of the invention;
and displaying the target object and the associated object after the object synchronous editing is implemented through the canvas.
According to a third aspect of the present invention, there is provided a video synchronous editing method, including:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an association identifier and an element editing parameter of a target video frame element to be edited in the target video frame;
determining, in an associated video frame associated with the target video frame, an associated video frame element having the same association identifier as the target video frame element;
and respectively editing the target video frame element and the associated video frame element according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame.
According to a fourth aspect of the present invention, there is provided an object synchronization editing apparatus comprising:
an information acquisition unit, configured to respond to an object synchronous editing request applied to a target object and acquire corresponding synchronous editing information; the synchronous editing information at least comprises an association identifier and an element editing parameter of a target object element to be edited in the target object;
an association determining unit, configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element;
and the synchronous editing unit is used for respectively editing the target object element and the associated object element according to the element editing parameters to realize synchronous editing of the target object and the associated object.
According to a fifth aspect of the present invention, there is provided an object synchronization editing apparatus comprising:
a memory for storing executable instructions;
and the processor is used for operating the object synchronous editing equipment according to the control of the executable instruction and executing any one object synchronous editing method in the first aspect of the invention.
According to a sixth aspect of the present invention, there is provided a readable storage medium storing a computer program that is readable and executable by a computer, wherein, when read and executed by the computer, the computer program performs any one of the object synchronous editing methods according to the first aspect of the present invention.
According to one embodiment of the present disclosure, each object element is given a unique association identifier, and object elements across a plurality of mutually associated objects are linked through these identifiers. When a synchronous editing request from a user for a target object is received, the corresponding synchronous editing information is acquired, the association identifier and element editing parameter of the target object element to be edited are determined, the associated object element in the object associated with the target object is found according to the association identifier, and the target object element and the associated object element are each edited based on the element editing parameter. A user can thus synchronously edit the objects associated with a target object simply by editing the target object, completing synchronous editing of a plurality of objects without operating on each object separately. This greatly saves the manpower and time required for editing objects, reduces the time and labor costs of synchronous editing, and improves the user's design efficiency. The method is particularly suitable for scenarios that require frequent synchronous editing of a plurality of objects.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic apparatus 1000 that can be used to implement an embodiment of the present invention.
Fig. 2 shows a flowchart of an object synchronous editing method according to an embodiment of the present invention.
Fig. 3 is a flowchart showing an example of an object synchronization editing method according to an embodiment of the present invention.
Fig. 4 is a schematic diagram showing a source object of an example of the object synchronous editing method of the embodiment of the present invention.
Fig. 5 is a schematic diagram illustrating batch generation of new objects in an example of the object synchronous editing method according to the embodiment of the present invention.
Fig. 6 is a further schematic diagram illustrating batch generation of new objects in the example of the object synchronous editing method according to the embodiment of the present invention.
Fig. 7 is a diagram illustrating an example of a synchronous editing object in the object synchronous editing method according to the embodiment of the present invention.
Fig. 8 shows a block diagram of the object synchronization editing apparatus 3000 of the embodiment of the present invention.
Fig. 9 shows a block diagram of an object synchronization editing apparatus 4000 of an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic apparatus 1000 that can implement an embodiment of the present invention.
The electronic device 1000 may be a laptop, desktop, cell phone, tablet, etc. The electronic device 1000 may also be a cloud server, a blade server, a server cluster, and the like.
For example, as shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and so forth. The processor 1100 may be a central processing unit CPU, a microprocessor MCU, or the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, for example, and may specifically include Wifi communication, bluetooth communication, 2G/3G/4G/5G communication, and the like. The display device 1500 is, for example, a liquid crystal display panel, a touch panel, or the like. The input device 1600 may include, for example, a touch screen, a keyboard, a somatosensory input, and the like. A user can input/output voice information through the speaker 1700 and the microphone 1800.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 is configured to store instructions for controlling the processor 1100 to execute any one of the object synchronous editing methods provided by the embodiments of the present invention. It will be appreciated by those skilled in the art that, although a plurality of devices are shown for the electronic device 1000 in fig. 1, the present invention may involve only some of them; for example, the electronic device 1000 may involve only the processor 1100 and the memory 1200. The skilled person can design the instructions according to the disclosed solution. How instructions control the operation of a processor is well known in the art and will not be described in detail here.
< example >
In the present embodiment, an object synchronous editing method is provided. The object can be anything that can be edited; for example, it can be a planar design work, which may be a real planar design (such as a billboard, commodity packaging, a cover, a poster, and the like) or a virtual planar design (such as a page advertisement, an application interface, and the like). Alternatively, the object may be any editable digital image, for example, each frame of video image in a video file. The objects may also be materials used in design, promotional content designed for a marketing campaign, dynamic entries for presenting content, and the like.
The object synchronous editing method, as shown in fig. 2, includes: steps S2100-S2300.
In step S2100, in response to the object synchronous editing request applied to the target object, corresponding synchronous editing information is acquired.
In this embodiment, the target object may be an object selected by the user for editing, or an object the system edits by default. The object synchronous editing request may be generated from an operation performed by the user: for example, an object editing interface may display the target object and receive a human-computer interaction operation on it that embodies an editing requirement (such as changing a color or font size, zooming, rotating, adding content, etc.), and the object synchronous editing request is generated in response to that operation. Alternatively, the request may be generated from the user's configuration: for example, a configuration interface may let the user enter editing information for the target object (using the target object as a template, say), and a corresponding object synchronous editing request is generated from that editing information.
In one example, the method provided in this embodiment further includes:
and receiving the object synchronous editing operation implemented by the user and generating an object synchronous editing request.
The object synchronous editing operation can be a man-machine interaction operation implemented by a user on an application interface for showing the target object. In this example, the object synchronization editing operation includes at least one of a zoom-in operation and a zoom-out operation. A user can intuitively and quickly trigger a synchronous editing request to a target object by implementing the synchronous editing operation of the object, so that the synchronous editing efficiency of the object is improved.
For example, a user may perform a whole zoom-in operation on a certain target object or perform a local zoom-in operation on a certain object element, and intuitively and quickly trigger a corresponding synchronous editing request. By analyzing the object synchronous editing request or extracting the information in the object synchronous editing request, the corresponding synchronous editing information can be obtained.
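As an illustrative sketch only (the `SyncEditRequest` class, its field names, and `make_zoom_request` are assumptions for illustration, not structures defined by the patent), a user's zoom operation on an object element might be packaged into a synchronous editing request like this:

```python
from dataclasses import dataclass


@dataclass
class SyncEditRequest:
    """A synchronous editing request carrying the synchronous editing information."""
    association_id: str  # association identifier of the target object element
    edit_params: dict    # element editing parameters (e.g. scale factors)


def make_zoom_request(element_association_id: str, scale: float) -> SyncEditRequest:
    """Build a request from a user's zoom-in/zoom-out operation on an element."""
    return SyncEditRequest(
        association_id=element_association_id,
        edit_params={"scale_w": scale, "scale_h": scale},
    )


# A user zooms the element tagged "layer-42" to 150%:
req = make_zoom_request("layer-42", 1.5)
```

The synchronous editing information can then be obtained simply by reading the request's fields, which corresponds to "extracting the information in the object synchronous editing request" above.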
The synchronous editing information is information related to the user's demand to perform synchronous editing based on the target object. In this embodiment, the synchronous editing information at least includes the association identifier and the element editing parameter of the target object element to be edited in the target object. The association identifier of the target object element is an identifier that associates the target object element with object elements in other objects. Each object element carries a unique association identifier, which may be set according to the unique identifier of the object element or according to a preset rule.
By setting an association identifier for each object element, the method, combined with the other steps of this embodiment, can synchronously edit multiple object elements based on the association identifier and thus automatically edit multiple objects in sync. This avoids having to invest time and manpower in editing each object separately, greatly saving the time and labor cost of synchronous editing.
The element editing parameter is a parameter related to editing the target object element. Editing the target object element may include modifying its element style (e.g., color, font, etc.), rotating or scaling it, adding content to it (e.g., pictures, text, multimedia resources, etc.), and so on; accordingly, the specific content of the element editing parameter is determined by the specific edit applied to the target object.
In one example, the element editing parameters of the target object element include at least one of a target coordinate relationship and a target size relationship of the edited target object element.
The target coordinate relationship is the relative coordinate offset between the target object element and an object reference point of the target object. The object reference point may be the center point of the target object or one of its vertices; multiple vertices of the target object may also be selected for cross-reference, or the target object may be pre-divided into multiple fixed regions, in which case the object reference point is, for example, the center point of the region containing the target object element, or a vertex of one or more of the regions. Once the object reference point is determined, the coordinates of the reference point and of the target object element can be obtained in the same coordinate system, and the relative coordinate offset between them is used as the target coordinate relationship.
The target size relationship is the relative size ratio between the target object element and the target object. The relative size ratio is obtained from the element size of the target object element and the object size of the target object; it may specifically be the ratio of the element's aspect ratio to the object's aspect ratio, the ratio of the element's width to the object's width, and the ratio of the element's height to the object's height. It should be understood that every reference to "size" in this embodiment includes a corresponding height and width, or other parameters describing the dimensions of a shape.
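The two relationships described above can be computed directly from coordinates and sizes. The following is a minimal sketch (the function names and the tuple representation are illustrative assumptions, not part of the patent):

```python
def target_coordinate_relationship(element_xy, reference_xy):
    """Relative coordinate offset between an element and the object reference point."""
    ex, ey = element_xy
    rx, ry = reference_xy
    return (ex - rx, ey - ry)


def target_size_relationship(element_wh, object_wh):
    """Relative size ratio between an element and its object (width and height)."""
    ew, eh = element_wh
    ow, oh = object_wh
    return (ew / ow, eh / oh)


# An 80x40 element positioned at (120, 90) inside a 400x300 object whose
# reference point is chosen as the object's center (200, 150):
offset = target_coordinate_relationship((120, 90), (200, 150))  # (-80, -60)
ratios = target_size_relationship((80, 40), (400, 300))         # (0.2, 40/300)
```

Because the offset is relative to the reference point and the ratio is relative to the object size, the same two values describe the element consistently across objects of different sizes.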
By letting the element editing parameters of the target object element include at least one of the target coordinate relationship and the target size relationship of the edited element, any change to the edited element (whether zooming, moving, rotating, etc.) can be represented simply and accurately, which improves the efficiency of performing synchronous editing from the element editing parameters.
The element editing parameters may also include an object style (including color, font, etc.) of the edited target object, and changes in object content (including changes in pictures, texts, multimedia resources, etc.), to name but a few.
In step S2200, an associated object element having the same association identifier as the target object element is determined in the associated object associated with the target object.
In the present embodiment, the related object associated with the target object is an object that the user desires to edit in synchronization with the target object.
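A minimal sketch of step S2200, finding the elements that share the target element's association identifier across the associated objects (the dictionary layout and function name here are illustrative assumptions, not the patent's data model):

```python
def find_associated_elements(associated_objects, association_id):
    """For each associated object, return the element whose association
    identifier matches that of the target object element (step S2200)."""
    matches = []
    for obj in associated_objects:
        for element in obj["elements"]:
            if element["association_id"] == association_id:
                matches.append(element)
    return matches


# Two associated objects that both contain a "title" element:
banner = {"elements": [{"association_id": "title", "text": "Sale"}]}
poster = {"elements": [{"association_id": "title", "text": "Sale"},
                       {"association_id": "logo"}]}
found = find_associated_elements([banner, poster], "title")
```

Editing every element in `found` with the same element editing parameters then yields the synchronous edit of step S2300.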
In one example, the method in this embodiment further includes:
and determining the associated object associated with the target object according to the acquired association relation.
And the association relation is used for describing a plurality of objects with association.
In this example, the association relationship is set by the user in a customized manner; for example, the user specifies which objects are associated through an external configuration operation, or provides a set of associated objects by uploading or checking them. After the user sets the association relationship, it can be saved in local storage so that it can be read when determining the associated object of a target object.
Alternatively, the association relationship may be saved in a default record of the device implementing the embodiment, for example, the target object and the association object are generated from the same source object, and after a plurality of objects are generated from the source object, the association relationship between the objects may be recorded by the default record of the device implementing the embodiment.
The associated object associated with the target object is determined through the acquired association relation, and the associated object associated with the target object can be determined without analyzing the target object and then matching and comparing with other objects, so that the processing efficiency is improved.
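Such an association relationship can be as simple as stored groups of object identifiers, looked up directly instead of matching object content. A sketch under that assumption (the `ASSOCIATION_GROUPS` structure and function name are illustrative, not the patent's storage format):

```python
# Association relations stored as groups of mutually associated object IDs,
# e.g. as saved in local storage or in a default record of the device.
ASSOCIATION_GROUPS = [
    {"banner-a", "poster-a", "cover-a"},
    {"banner-b", "poster-b"},
]


def associated_objects(target_id):
    """Return the IDs of objects associated with the target, read from the
    stored association relation rather than by analyzing object content."""
    for group in ASSOCIATION_GROUPS:
        if target_id in group:
            return sorted(group - {target_id})
    return []


print(associated_objects("poster-a"))  # ['banner-a', 'cover-a']
```

The lookup is a constant-size scan over stored groups, which is why no per-object analysis or comparison is needed.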
In another example, the method in this embodiment further includes:
acquiring associated data, wherein the associated data at least comprises object content and an object element position relationship; and generating the associated object associated with the target object according to the associated data.
The association data is data related to the association between the target object and the associated object. It includes at least the object content and an object element position relationship. The object content is content shared by the target object and the associated object, and may be text content, image content, or the like. The object element position relationship may include the relationship between the position of any object element in the target object and the position of the corresponding object element in the associated object.
A data setting interface can be provided on the user equipment used by the user, so that the user can set the associated data according to personal requirements.
The associated data can be set according to the personalized requirements of the user, and the associated object meeting the personalized requirements of the user can be correspondingly generated for the user to implement object synchronous editing.
In another example, the method in this embodiment further includes:
and acquiring the size of the user equipment, and generating an associated object which is adaptive to the size of the user equipment and is associated with the target object.
The user equipment is a device used by a user. The associated object may be presented or applied by the user device. In this example, the user device size may be obtained through a device interface supported by the user device. It should be understood that when there are a plurality of different types of user equipment, a plurality of different user equipment sizes may be obtained correspondingly.
According to the obtained user equipment size, the associated object which is associated with the target object and is adaptive to the user equipment size can be automatically generated, and the object application requirement of the user can be met more accurately.
In another example, the target object and the associated object are generated from the same source object. The method further comprises the following steps: steps S2010-S2020.
Step S2010, analyzing the acquired source object, extracting an element attribute of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element.
In this example, the source object may be provided by the user, through uploading or by checking it among a plurality of displayed objects. For example, the source object may be a planar design work uploaded by the user, or a planar design work created in design software and output directly through an interface provided by that software.
Parsing the source object determines each source object element it contains. A source object element is an object element in the source object. Parsing can be performed in various ways; for example, the source object may be read and parsed by design software extensions such as Adobe CEP.
In this example, the element attributes of the source object elements may be extracted once the source object elements contained in the source object have been determined. The element attributes may include the element size, element style, element coordinates, etc. of an object element. For example, if the source object is composed of multiple layers, each layer being an object element, the element attributes include the original canvas size, the coordinate position of the element layer, the layer type, the size of the element layer, and so on.
The association identifier of each source object element may be set according to a preset rule when the source object elements are determined; for example, it may be a unique number or ID. If the source object consists of multiple layers, with each layer being an object element, each layer can be parsed by Adobe CEP techniques to obtain its unique layer ID, which is used as the association identifier.
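Step S2010 can be sketched as follows, assuming layers are already available as simple records (the field names and the counter-based ID scheme are illustrative assumptions; a real implementation might reuse the layer IDs exposed by design software):

```python
import itertools

_id_counter = itertools.count(1)


def parse_source_object(layers):
    """Extract element attributes from each layer of a source object and
    assign each element a unique association identifier (step S2010)."""
    elements = []
    for layer in layers:
        elements.append({
            "association_id": f"elem-{next(_id_counter)}",  # unique per element
            "coords": layer["coords"],
            "size": layer["size"],
            "type": layer["type"],
        })
    return elements


source_layers = [
    {"coords": (10, 10), "size": (100, 50), "type": "text"},
    {"coords": (0, 80), "size": (200, 120), "type": "image"},
]
elements = parse_source_object(source_layers)
```

Every new object generated from this source later copies these association identifiers, which is what allows edits to propagate across the generated objects.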
Step S2020, respectively for different preset object sizes, generating a corresponding new object according to the element attribute and the association identifier of each source object element, so as to obtain a target object and an associated object with different object sizes.
In this example, each new object includes an object element having the same association identifier and corresponding to each source object element.
The preset object size may be set by default or specified by a user, for example, multiple object sizes may be provided for the user to select multiple object sizes meeting requirements as the preset object size when acquiring the source object.
In this example, an information package bundling the element attribute and the association identifier of each source object element may be generated, and a source template generated from that package. Based on the source template, the corresponding object size scaling, element size scaling, source object element displacement, and the like are performed for each different preset object size to generate the corresponding new object. When any one of the new objects is taken as the target object, the other new objects are the associated objects associated with it.
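The template-and-batch-generation idea can be sketched as below. This is a hedged Python illustration, not the patent's implementation: `build_source_template`, `generate_new_objects`, and the pluggable `scale_element` callback are invented names, and the trivial scaling used in the usage example simply copies each element:

```python
def build_source_template(source_size, elements):
    """Bundle the element attributes and association identifiers of all
    source object elements into one information package: the source template."""
    return {"source_size": source_size, "elements": elements}

def generate_new_objects(template, preset_sizes, scale_element):
    """Generate one new object per preset object size. Every new object
    element keeps its source element's association identifier, so elements
    across the generated objects stay associated."""
    return [
        {"size": size,
         "elements": [scale_element(e, template["source_size"], size)
                      for e in template["elements"]]}
        for size in preset_sizes
    ]

template = build_source_template((600, 240), [
    {"association_id": "layer-1", "offset": (100, 40), "size": (300, 80)},
])
# Trivial element scaling (a plain copy) just to show the batch structure:
new_objects = generate_new_objects(
    template, [(600, 240), (400, 400), (1200, 180)],
    lambda e, src, dst: dict(e),
)
```

Selecting any one of the three generated objects as the target leaves the other two as its associated objects.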
According to this embodiment, new objects of different sizes can be generated automatically and in batches from the source object, for a plurality of different preset object sizes, to serve as the target object and associated objects on which the user performs synchronous editing. The user need not design and produce each object manually, which saves labor and time costs and improves processing efficiency.
In a more specific example, step S2020 may include: steps S2021-S2022.
Step S2021, for each preset object size, setting an element attribute of a new object element corresponding to the source object element according to the element attribute of each source object element, the object size of the source object, and the preset object size, and setting an association identifier of the new object element to be the same as the association identifier of the source object element, so as to obtain all the new object elements corresponding to the preset object size.
The element attributes of an object element may include the element size (element height and element width), the coordinate position of the element, and so on. Taking the coordinate position of the element as an example: assume the aspect ratio obtained from the object size of the source object is A, the aspect ratio obtained from the preset object size is B, the center point of the object is selected as the object reference point in both the source object and the new object to be generated, the coordinate offset between the source object element and the object reference point of the source object is X (X may be a linear offset distance, or may comprise a horizontal offset X1 and a vertical offset X2), and the coordinate offset between the corresponding new object element and the object reference point of the new object is Y. Then:

Y = X × (B / A)

According to the calculated Y, and based on the preset object reference point coordinates of the new object, the element coordinates of the new object element corresponding to the source object element can be determined;
similarly, the size (including width and height) of the new object element, and the like, can be obtained according to the above method.
Where the element attributes of the object element include others such as element color, element font, and element content, the element color, element font, and element content of the new object element can be set directly the same as those of the source object element.
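The attribute scaling above can be made concrete with a small Python sketch. It assumes the relation Y = X × (B / A) reconstructed from the worked description (with A and B the source and preset aspect ratios), applied to both the center offset and the element size, and copies non-geometric attributes unchanged; `scale_element` and the dict keys are illustrative names, not the patent's:

```python
def aspect(size):
    w, h = size
    return w / h

def scale_element(element, source_size, preset_size):
    """Set the new object element's attributes from the source element's:
    offsets (from the object center) and sizes scale by k = B / A, while
    the association identifier and style are carried over unchanged."""
    k = aspect(preset_size) / aspect(source_size)
    return {
        "association_id": element["association_id"],
        "offset": tuple(round(v * k, 2) for v in element["offset"]),
        "size": tuple(round(v * k, 2) for v in element["size"]),
        "style": element.get("style"),
    }

scaled = scale_element(
    {"association_id": "layer-1", "offset": (100, 40), "size": (300, 80)},
    source_size=(600, 240),   # A = 2.5
    preset_size=(1200, 180),  # B = 6.67 (approx.)
)
```

For the 600 × 240 source scaled to 1200 × 180, k = (1200/180) / (600/240) ≈ 2.67, so an offset of (100, 40) becomes roughly (266.67, 106.67).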
Step S2022: for each preset object size, generate the new object conforming to that preset object size from all the new object elements corresponding to it.
By setting the element attribute and association identifier of each new object element based on those of the corresponding source object element, for each different preset object size, the object elements in different new objects are associated through the association identifiers, realizing the association of the objects; at the same time, the corresponding new objects can be generated automatically and in batches from the new object elements, improving processing efficiency.
Among the associated objects associated with the target object, the associated object elements with the same association identifier are determined through the association identifiers of the target object elements. Combined with the other steps, this enables synchronous editing of a plurality of object elements based on the association identifiers and automatic synchronous editing of a plurality of objects, avoiding the large amount of time and labor that editing each object separately would require, thereby greatly saving the time and labor cost of synchronously editing the objects.
And step S2300, editing the target object element and the associated object element respectively according to the element editing parameters, and realizing synchronous editing of the target object and the associated object.
The related content of the element editing parameter has been described above, and is not described in detail here.
The element editing parameters correspond to the object synchronous editing request applied to the target object. The target object elements are edited according to the specific content of the element editing parameters, so that the target object is edited according to the user's requirements; the editing may include zooming, moving, rotating, and content changes of the target object elements, and the like.
The associated object elements are edited according to the element editing parameters: while the target object elements are edited, the associated object elements linked to them through the association identifiers are edited synchronously, so the associated objects of the target object are also edited synchronously according to the user's requirements.
The target object element is an object element in the target object, and the associated object element is the corresponding object element, with the same association identifier, in an associated object of the target object. Editing the target object element and the associated object element respectively according to the element editing parameters synchronously edits the corresponding elements in the target object and its associated objects, i.e. synchronously edits the target object and the associated objects. This realizes synchronous editing of a plurality of object elements based on the association identifiers and automatic synchronous editing of a plurality of objects, avoids investing a large amount of time and manpower in editing a plurality of objects separately, and greatly saves the time and labor cost of synchronously editing objects.
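The dispatch-by-identifier mechanism just described can be sketched in a few lines. This is an illustrative Python simplification (the list-of-dicts object model and the `edit` callback standing in for the element editing parameters are assumptions, not the patent's data structures):

```python
def synchronized_edit(objects, association_id, edit):
    """Apply `edit` to every object element, in the target object and in all
    associated objects alike, whose association identifier matches; a single
    edit of the target object element thus propagates everywhere."""
    for obj in objects:
        for element in obj["elements"]:
            if element["association_id"] == association_id:
                edit(element)

objects = [
    {"size": (600, 240),
     "elements": [{"association_id": "layer-1", "content": "I am the material headline"}]},
    {"size": (400, 400),
     "elements": [{"association_id": "layer-1", "content": "I am the material headline"}]},
    {"size": (1200, 180),
     "elements": [{"association_id": "layer-1", "content": "I am the material headline"}]},
]

def change_content(element):
    element["content"] = "The headline is now modified"

synchronized_edit(objects, "layer-1", change_content)
```

The user only edits the element once; the loop finds the same identifier in every associated object.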
In one example, editing the associated object element according to the element editing parameters may include: steps S2310-S2320.
Step S2310, acquiring a corresponding associated editing parameter according to the element editing parameter.
The associated editing parameter is the parameter used for editing the associated object element, and corresponds to the element editing parameter used for editing the target object element.
For example, based on the above example, the element editing parameters of the target object element at least include one of the target coordinate relationship and the target size relationship of the edited target object element; correspondingly, the associated editing parameters at least comprise one of the associated coordinate relationship and the associated size relationship of the edited associated object elements.
The associated coordinate relationship is the relative coordinate offset between the associated object element and the object reference point of the associated object. The object reference point may be the center point of the associated object or a vertex of the associated object; multiple vertices of the associated object may also be used for cross reference, or the associated object may be pre-divided into a plurality of fixed regions, with the object reference point being the center point of the region where the associated object element is located, a vertex of one or more of the regions, and so on. Once the object reference point is determined, the coordinates of the object reference point and of the associated object element can be obtained in the same coordinate system, and the relative coordinate offset between them taken as the associated coordinate relationship.
The association size relationship is the relative size ratio between the association object element and the association object. The relative size ratio may be obtained according to the element size of the associated object element and the object size of the associated object, and the relative size ratio may specifically be a ratio of an aspect ratio of the associated object element to an aspect ratio of the associated object, or may also be a ratio of a width of the associated object element to a width of the associated object, and a ratio of a height of the associated object element to a height of the associated object.
Since the associated editing parameters, like the element editing parameters, comprise at least one of the associated coordinate relationship and the associated size relationship of the edited associated object element, the change to the edited associated object element (whether zooming, moving, rotating, or the like) can be represented simply and accurately through these relationships, improving the efficiency of synchronous editing according to the associated editing parameters.
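The two relationships can be computed directly. A minimal Python sketch, assuming the object's center point as the reference point (one of the options the text allows) and the width/height-ratio form of the size relationship; the function names are illustrative:

```python
def coordinate_relation(element_pos, reference_point):
    """Relative coordinate offset between an object element and the
    object reference point (here the object's center point)."""
    return (element_pos[0] - reference_point[0],
            element_pos[1] - reference_point[1])

def size_relation(element_size, object_size):
    """Relative size ratio: width-to-width and height-to-height ratios
    of the element against its object."""
    return (element_size[0] / object_size[0],
            element_size[1] / object_size[1])

# An element at (350, 160) in a 600 x 240 object whose center is (300, 120):
offset = coordinate_relation((350, 160), (300, 120))
ratio = size_relation((300, 80), (600, 240))
```

A move changes only the offset, a zoom only the ratio, so either edit is captured by one of the two relationships.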
The element editing parameters may further include a style (including color, font, etc.) of the edited target object element, a content change (including a change of a picture, a text, a multimedia resource, etc.), and correspondingly, the associated editing parameters also include a style, a content change, etc. of the edited associated object element.
In a more specific example, obtaining the corresponding associated editing parameter according to the element editing parameter includes: steps S2311-S2312.
Step S2311, acquiring a parameter relative relation according to the object size and the element editing parameters of the target object.
The parameter relative relationship is used for describing a relative relationship between an element editing parameter for editing the element of the target object and the object size of the target object.
For example, the corresponding aspect ratio of the object may be obtained according to the object size of the target object, and the parameter relative relationship may be a ratio between the element editing parameter and the aspect ratio of the object.
Step S2312, obtaining an associated editing parameter according to the object size of the associated object and the parameter relative relationship.
In this example, the relative relationship between the associated editing parameter and the object size of the associated object should be the same as the parameter relative relationship; this ensures that editing the associated object element based on the associated editing parameter is truly synchronized with editing the target object element based on the element editing parameter.
For example, assume the element editing parameter is the relative coordinate relationship X' of the edited target object element and the aspect ratio of the target object is A'. The parameter relative relationship is the ratio between the relative coordinate relationship of the target object element and the aspect ratio of the target object:

C = X' / A'

Obtaining the corresponding aspect ratio B' from the object size of the associated object, the associated coordinate relationship of the associated object element is, correspondingly:

Y' = C × B' = X' × B' / A'
Similarly, the associated size relationship of the associated object, and the like, can be obtained according to the above method.
Through the parameter relative relationship, the associated editing parameters for editing the associated object elements can be quickly converted from the element editing parameters for editing the target object elements, realizing synchronous editing among the object elements associated through the association identifiers and improving processing efficiency.
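The conversion above is a two-step computation. A short Python sketch of the reconstructed relations C = X' / A' and Y' = C × B' (function names assumed for illustration):

```python
def parameter_relative_relation(element_param, target_size):
    """C = X' / A': ratio of the element editing parameter to the
    target object's aspect ratio A' = width / height."""
    return element_param / (target_size[0] / target_size[1])

def associated_editing_parameter(element_param, target_size, associated_size):
    """Y' = C * B' = X' * B' / A': the same parameter relative relationship
    applied to the associated object's aspect ratio B'."""
    c = parameter_relative_relation(element_param, target_size)
    return c * (associated_size[0] / associated_size[1])

# X' = 50 in a 600 x 240 target (A' = 2.5), converted for a 400 x 400
# associated object (B' = 1): C = 20, so Y' = 20.
y = associated_editing_parameter(50, (600, 240), (400, 400))
```

Because C is computed once from the target object, the same relative relationship can be reused to derive the editing parameter for every associated object size.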
After acquiring the corresponding associated editing parameters, entering:
step S2320, editing the associated object elements according to the associated editing parameters.
The associated editing parameters correspond to the element editing parameters used for editing the target object element.
In this example, the associated object element is edited according to the associated editing parameter, so that synchronous editing with the target object element and synchronous editing between the target object and the associated object can be realized.
< example >
The object synchronous editing method in the present embodiment will be further illustrated in conjunction with fig. 3 to 7. In this example, the object is a flat design product, such as a poster.
As shown in fig. 3, the object synchronization editing method includes: S201-S208.
S201, providing a source object uploading entrance and receiving a source object uploaded by a user.
In this example, the source object is a poster source file designed by the user, as shown in FIG. 4.
The source object upload entrance may interface with a work transmission interface provided by the design software used by the user: after the user designs a source object in the design software, it is uploaded directly to the source object upload entrance through the work transmission interface. Alternatively, the source object upload entrance may be a provided interface upload entrance through which the user submits the source object.
S202, analyzing the source object, extracting the element attribute of each source object element included in the source object, and setting the association identifier of each source object element.
In this example, each layer of the source object may be analyzed by design software such as Adobe CEP, so as to obtain an element attribute of each layer as a source object element, and set the unique ID of the layer as the association identifier of the corresponding source object element.
S203, generating a source template according to the element attribute and the associated identifier of each source object element.
And S204, providing a plurality of object sizes for the user to select the preset object size which meets the requirement and is used for generating the objects in batches.
S205, generating a plurality of new objects with different object sizes in batch based on the source template according to the plurality of different preset object sizes.
In this example, assuming that the user selects three different preset object sizes 600 × 240, 400 × 400, and 1200 × 180, the three new objects generated in the batch may be as shown in fig. 5: object 1 size 600 × 240, object 2 size 400 × 400, object 3 size 1200 × 180.
Alternatively, in this example, steps S204-S205 need not be executed; instead, the user device sizes to which the user will apply the associated objects are obtained directly, and the corresponding preset object sizes are automatically adapted and set according to those device sizes, so as to generate a plurality of new objects with different object sizes in batch from the source template. For example, as shown in fig. 6, assuming the user devices are a desktop computer, a mobile phone, and a tablet computer, their device sizes are obtained, and the preset object size adapted to the desktop computer is 600 × 240, that adapted to the mobile phone is 400 × 400, and that adapted to the tablet computer is 1200 × 180.
S206, responding to the synchronous editing operation that the user selects a certain new object as the target object, and acquiring the associated identification and the element editing parameters of the target object element to be edited.
For example, the user selects object 1 shown in fig. 7 as the target object and performs a modification operation on the target object element "I am the material headline" in object 1, changing its content to "The headline is now modified".
And S207, editing the target object element according to the element editing parameter, and finishing editing the target object.
For example, after the user performs the editing operation on the target object, the edited object 1 is as shown in fig. 7.
And S208, acquiring corresponding associated editing parameters according to the element editing parameters, editing the associated object elements with the same associated identification according to the associated editing parameters, and finishing synchronous editing of the associated object corresponding to the target object.
The manner of obtaining the corresponding associated editing parameter is as described above, and is not described herein again.
In this example, objects 1, 2, and 3 are generated from the same source template, that is, from the same source object. After object 1 is selected as the target object, objects 2 and 3 are the associated objects associated with it; accordingly, objects 2 and 3 are edited according to the associated editing parameters, with the result shown in fig. 7. The user thus edits objects 2 and 3 synchronously with object 1 without operating on objects 2 and 3 separately.
As described above with reference to the drawings, in the present example a user obtains automatically batch-generated new objects of different object sizes merely by uploading a source object; the user selects any new object as the target object to perform a synchronous editing operation, and the other new objects are edited synchronously and automatically. This greatly saves the manpower and time the user would need to generate and edit the objects, reduces the time and labor cost of batch generation and synchronous editing, and improves the user's design efficiency. The method is particularly suitable for scenes in which a plurality of objects must frequently be generated and edited synchronously.
In this embodiment, a method for displaying object synchronous editing is further provided, including:
displaying, by a user device, a canvas;
displaying a target object and an associated object associated with the target object through a canvas, so that a user can implement object synchronous editing on the target object and the associated object through any one of the object synchronous editing methods provided in the embodiment;
and displaying the target object and the associated object after the object synchronous editing is implemented through the canvas.
The user equipment may be any electronic device that has a display screen and can display a human-computer interaction interface, such as a mobile phone, desktop computer, tablet computer, or notebook computer. The canvas is the interface for displaying the target object and the associated objects. By displaying the canvas through the user equipment, and the target object and associated objects through the canvas, the whole process of synchronously editing them via the object synchronous editing method can be presented visually, so that the user grasps the whole editing process more intuitively and quickly, improving the user experience.
In this embodiment, a video synchronous editing method is further provided, including:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier and an element editing parameter of a target video frame element to be edited in the target video frame;
determining, in an associated video frame associated with the target video frame, an associated video frame element having the same association identifier as the target video frame element;
and respectively editing the target video frame element and the associated video frame element according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame.
The target video frame is any one frame of video picture in the target video file, and the associated video frame is a corresponding video frame in the associated video file which is associated with the target video file. The video frame element is an element in a corresponding video frame, and may be a picture element such as a scene, a person, and the like in a certain video area or a video frame image. The target video frame element is a video frame element in the target video frame. The associated video frame element is a video frame element in an associated video frame.
The specific implementation of the video synchronous editing method may refer to any one of the object synchronous editing methods provided in this embodiment, and is implemented by using a target video frame as a target object, an associated video frame as an associated object, a target video frame element as a target object element, and an associated video frame element as an associated object element, which are not described herein again.
By the video synchronous editing method, the target video file and the associated video file associated with the target video file can be automatically and synchronously edited, so that the video editing efficiency is improved.
For example, a user records an original short video and publishes it on a plurality of short video platforms such as Douyin and Kuaishou. Since each platform supports different video specifications, a plurality of associated short videos must be generated from the original and published on the corresponding platforms, and when a certain picture in the original short video needs to be edited, the associated videos would otherwise have to be edited manually one by one. With the video synchronous editing method in this embodiment, the original short video recorded by the user serves as the target video, each short video published on a platform serves as an associated video, and the user synchronously edits the short videos published on all the platforms merely by editing the target video, greatly improving the user's video editing efficiency.
< object synchronization editing apparatus >
In this embodiment, there is further provided an object synchronization editing apparatus 3000, as shown in fig. 8, comprising: an information obtaining unit 3100, an association determining unit 3200, and a synchronous editing unit 3300, which implement the object synchronous editing method in this embodiment and are not described in detail again here.
An information obtaining unit 3100, configured to obtain corresponding synchronous editing information in response to an object synchronous editing request applied to a target object; the synchronous editing information at least comprises the associated identification of the target object element to be edited in the target object and the element editing parameter.
The association determining unit 3200 is configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element.
And the synchronous editing unit 3300 is configured to edit the target object element and the associated object element respectively according to the element editing parameter, so as to implement synchronous editing on the target object and the associated object.
Optionally, the object synchronization editing apparatus 3000 is further configured to:
and determining the associated object associated with the target object according to the acquired association relation.
Optionally, the target object and the associated object are generated from the same source object;
the object synchronization editing apparatus 3000 is further configured to:
analyzing the acquired source object, extracting element attributes of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element;
generating corresponding new objects according to the element attribute of each source object element and the association identifier aiming at different preset object sizes respectively so as to obtain the target objects and the association objects with different object sizes; each new object comprises the object elements which respectively correspond to each source object element and have the same associated identification.
Further optionally, the generating, for different preset object sizes, a corresponding new object according to the element attribute of each source object element and the association identifier includes:
for each preset object size, setting the element attribute of a new object element corresponding to the source object element according to the element attribute of each source object element, the object size of the source object and the preset object size, and setting the association identifier of the new object element to be the same as the association identifier of the source object element, so as to obtain all new object elements corresponding to the preset object size;
and generating the new object according with the preset object size according to all the new object elements corresponding to each preset object size.
Optionally, the synchronization editing unit 3300 is further configured to:
acquiring corresponding associated editing parameters according to the element editing parameters;
and editing the associated object elements according to the associated editing parameters.
Optionally, the obtaining, according to the element editing parameter, a corresponding associated editing parameter includes:
acquiring a parameter relative relation according to the object size of the target object and the element editing parameters;
and acquiring the associated editing parameters according to the object size of the associated object and the relative relation of the parameters.
Optionally, the element editing parameter at least includes one of a target coordinate relationship and a target size relationship of the edited target object element; the target coordinate relationship is a relative coordinate offset between the target object element and an object reference point of the target object; the target size relationship is a relative size ratio between the target object element and the target object;
the associated editing parameters at least comprise one of an associated coordinate relationship and an associated size relationship of the edited associated object elements; the associated coordinate relationship is a relative coordinate offset between the associated object element and an object reference point of the associated object; the association size relationship is a relative size ratio between the association object element and the association object.
It will be understood by those skilled in the art that the object synchronization editing apparatus 3000 can be implemented in various ways. For example, the object synchronization editing apparatus 3000 may be realized by an instruction configuration processor. For example, the object synchronization editing apparatus 3000 may be implemented by storing instructions in a ROM and reading the instructions from the ROM into a programmable device when the device is started. For example, the object synchronization editing apparatus 3000 may be solidified into a dedicated device (e.g., ASIC). The object synchronization editing apparatus 3000 may be divided into units independent of each other, or may be implemented by combining them together. The object synchronization editing apparatus 3000 may be implemented by one of the various implementations described above, or may be implemented by a combination of two or more of the various implementations described above.
In this embodiment, the object synchronization editing apparatus 3000 may have a plurality of implementation forms, for example, the object synchronization editing apparatus 3000 is a packaged Web application, and provides a corresponding website for a user to access and call; the object synchronous editing apparatus 3000 may also be packaged in the form of a software tool development kit (e.g. SDK) and provided to other software or application calls having object synchronous editing requirements; the object synchronization editing apparatus 3000 may also be a functional module provided in object editing software, or the like.
< object synchronous editing apparatus >
In this embodiment, there is further provided an object synchronization editing apparatus 4000, as shown in fig. 9, including:
a memory 4100 for storing executable instructions;
a processor 4200, configured to run the object synchronization editing apparatus according to the control of the executable instruction, and execute any one of the object synchronization editing methods according to this embodiment.
In this embodiment, the object synchronization editing apparatus 4000 may be an electronic apparatus such as a mobile phone, a handheld computer, a tablet computer, a notebook computer, and a desktop computer, for example, the object synchronization editing apparatus 4000 may be a computer installed with software for implementing the object synchronization editing method of this embodiment. Alternatively, the object synchronization editing apparatus 4000 may also be a server such as a blade server or a cloud server, for example, the object synchronization editing apparatus is a network server that implements the object synchronization editing method according to this embodiment. Alternatively, the object synchronization editing apparatus 4000 may be configured by a plurality of entity apparatuses, for example, a front-end apparatus for a user and a back-end apparatus for performing processing.
The object synchronization editing apparatus 4000 may further include other devices, for example, a display device, an input device, a communication device, or the like, such as the electronic apparatus 1000 shown in fig. 1.
< readable storage Medium >
In this embodiment, a readable storage medium is further provided, where a computer program that can be read and executed by a computer is stored, and the computer program is configured to, when being read and executed by the computer, execute the object synchronization editing method according to this embodiment.
The readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. A readable storage medium as used herein is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through an electrical wire.
The embodiments of the present invention have been described above with reference to the accompanying drawings. According to these embodiments, an object synchronous editing method, apparatus, device, and readable storage medium are provided. Each object element is given a unique association identifier, and the corresponding object elements in a plurality of associated objects are linked through that identifier. When a synchronous editing request from a user for a target object is received, the corresponding synchronous editing information is obtained, and the association identifier and element editing parameter of the target object element to be edited in the target object are determined. The associated object element in each associated object associated with the target object is then located through the association identifier, and the target object element and the associated object elements are each edited based on the element editing parameter. The user therefore only needs to edit the target object to have the associated objects edited synchronously with it: a plurality of objects no longer need to be operated on separately, the manpower and time required for editing are greatly reduced, the time and labor cost of synchronous editing are lowered, and the user's design efficiency is improved. The method is particularly suitable for scenarios that require frequent synchronous editing of a plurality of objects.
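As a non-limiting illustration (not part of the patent disclosure), the flow summarized above might be sketched as follows. All names here — `Element`, `EditableObject`, `apply_sync_edit` — are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """An object element carrying the unique association identifier."""
    assoc_id: str
    x: float = 0.0
    y: float = 0.0
    width: float = 0.0
    height: float = 0.0

@dataclass
class EditableObject:
    """One object (e.g. one size variant of a design); elements keyed by association identifier."""
    name: str
    elements: dict = field(default_factory=dict)

def apply_sync_edit(target, associated_objects, assoc_id, edit_params):
    """Apply the element editing parameters to the target object element and to
    every associated object element sharing the same association identifier."""
    edited = []
    for obj in [target, *associated_objects]:
        element = obj.elements.get(assoc_id)
        if element is None:
            continue  # this object has no element with that identifier
        for attr, value in edit_params.items():
            setattr(element, attr, value)
        edited.append(obj.name)
    return edited
```

In this sketch, a single edit request against the target object ("banner", say) is propagated in one pass to every associated object whose element carries the same identifier, which is the key point of the claimed method.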
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (13)

1. An object synchronous editing method is characterized by comprising the following steps:
responding to an object synchronous editing request implemented on a target object, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier and an element editing parameter of a target object element to be edited in the target object;
determining an associated object element having the same association identifier as the target object element in an associated object associated with the target object;
and respectively editing the target object element and the associated object element according to the element editing parameters, so as to realize the synchronous editing of the target object and the associated object.
2. The method of claim 1, wherein editing the associated object element according to the element editing parameters comprises:
acquiring corresponding associated editing parameters according to the element editing parameters;
and editing the associated object elements according to the associated editing parameters.
3. The method according to claim 2, wherein obtaining the corresponding associated editing parameters according to the element editing parameters comprises:
acquiring a parameter relative relation according to the object size of the target object and the element editing parameters;
and acquiring the associated editing parameters according to the object size of the associated object and the relative relation of the parameters.
4. The method of claim 2, wherein:
the element editing parameters comprise at least one of a target coordinate relationship and a target size relationship of the edited target object element; the target coordinate relationship is a relative coordinate offset between the target object element and an object reference point of the target object; the target size relationship is a relative size ratio between the target object element and the target object;
the associated editing parameters comprise at least one of an associated coordinate relationship and an associated size relationship of the edited associated object element; the associated coordinate relationship is a relative coordinate offset between the associated object element and an object reference point of the associated object; the associated size relationship is a relative size ratio between the associated object element and the associated object.
5. The method of claim 1, wherein:
the target object and the associated object are generated from the same source object;
the method further comprises the following steps:
analyzing the acquired source object, extracting element attributes of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element;
for each of a plurality of different preset object sizes, generating a corresponding new object according to the element attribute and the association identifier of each source object element, so as to obtain the target object and the associated objects with different object sizes; each new object comprises object elements respectively corresponding to the source object elements and having the same association identifiers.
6. The method according to claim 5, wherein the generating corresponding new objects according to the element attribute of each source object element and the association identifier for different preset object sizes respectively comprises:
for each preset object size, setting the element attribute of a new object element corresponding to the source object element according to the element attribute of each source object element, the object size of the source object and the preset object size, and setting the association identifier of the new object element to be the same as the association identifier of the source object element, so as to obtain all new object elements corresponding to the preset object size;
and generating the new object according with the preset object size according to all the new object elements corresponding to each preset object size.
7. The method of claim 1, further comprising:
determining the associated object associated with the target object according to the acquired association relation;
and/or,
acquiring associated data, wherein the associated data at least comprises the position relation between object content and object elements;
and generating the associated object associated with the target object according to the associated data.
8. The method of claim 1, further comprising:
acquiring a size of a user equipment, and generating the associated object that is associated with the target object and adapted to the size of the user equipment;
and/or,
receiving an object synchronous editing operation performed by a user and generating the object synchronous editing request; the object synchronous editing operation comprises at least one of a zooming-in operation and a zooming-out operation.
9. A display method for synchronously editing objects is characterized by comprising the following steps:
displaying, by a user device, a canvas;
displaying, through the canvas, a target object and an associated object associated with the target object, so that a user can perform object synchronous editing on the target object and the associated object through the object synchronous editing method according to any one of claims 1 to 8;
and displaying, through the canvas, the target object and the associated object after the object synchronous editing has been performed.
10. A method for video synchronous editing, comprising:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier and an element editing parameter of a target video frame element to be edited in the target video frame;
determining, in an associated video frame associated with the target video frame, an associated video frame element having the same association identifier as the target video frame element;
and respectively editing the target video frame element and the associated video frame element according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame.
11. An object synchronization editing apparatus, comprising:
an information acquisition unit, configured to respond to an object synchronous editing request applied to a target object, and acquire corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier and an element editing parameter of a target object element to be edited in the target object;
an association determining unit, configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element;
and the synchronous editing unit is used for respectively editing the target object element and the associated object element according to the element editing parameters to realize synchronous editing of the target object and the associated object.
12. An object synchronization editing apparatus, comprising:
a memory for storing executable instructions;
a processor, configured to operate the object synchronous editing device according to the control of the executable instruction, and execute the object synchronous editing method according to any one of claims 1 to 8.
13. A readable storage medium, wherein:
the readable storage medium stores a computer program readable and executable by a computer, and the computer program is used for executing the object synchronous editing method according to any one of claims 1 to 8 when being read and executed by the computer.
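As a non-limiting sketch (not part of the claims), the relative-parameter computation of claims 3 and 4 — expressing an edit as coordinate offsets and size ratios relative to the object size, then resolving that relation against each associated object's size — together with the size-variant generation of claims 5 and 6, might look as follows. All function and key names are illustrative assumptions:

```python
def to_relative(edit, object_size):
    """Claims 3-4, step 1: express absolute element coordinates and size as
    fractions of the object's size (the 'parameter relative relation')."""
    ow, oh = object_size
    return {
        "dx": edit["x"] / ow, "dy": edit["y"] / oh,           # relative coordinate offset
        "sw": edit["width"] / ow, "sh": edit["height"] / oh,  # relative size ratio
    }

def from_relative(rel, object_size):
    """Claims 3-4, step 2: resolve the relative relation against an associated
    object's size to obtain its associated editing parameters."""
    ow, oh = object_size
    return {
        "x": rel["dx"] * ow, "y": rel["dy"] * oh,
        "width": rel["sw"] * ow, "height": rel["sh"] * oh,
    }

def generate_variants(source_elements, source_size, preset_sizes):
    """Claims 5-6: generate a new object per preset size from the same source
    object; each new element keeps the source element's association identifier
    (here, the dict key) so later edits can be propagated across variants."""
    variants = {}
    for preset in preset_sizes:
        rels = {eid: to_relative(e, source_size) for eid, e in source_elements.items()}
        variants[preset] = {eid: from_relative(r, preset) for eid, r in rels.items()}
    return variants
```

For instance, moving an element to x = 100 in a 1000x500 target corresponds, under this relation, to x = 80 in an 800x400 associated object, so the same edit lands at the visually equivalent position in every size variant.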
CN201910750040.0A 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium Active CN112395838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910750040.0A CN112395838B (en) 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium

Publications (2)

Publication Number Publication Date
CN112395838A true CN112395838A (en) 2021-02-23
CN112395838B CN112395838B (en) 2023-12-05

Family

ID=74601434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910750040.0A Active CN112395838B (en) 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium

Country Status (1)

Country Link
CN (1) CN112395838B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1567381A (en) * 2003-06-20 2005-01-19 北京北佳信息系统有限公司 Multimedia material synchronous editing device
CN105393246A (en) * 2013-06-28 2016-03-09 微软技术许可有限责任公司 Selecting and editing visual elements with attribute groups
US20160189404A1 (en) * 2013-06-28 2016-06-30 Microsoft Corporation Selecting and Editing Visual Elements with Attribute Groups
US20180189255A1 (en) * 2016-12-30 2018-07-05 Dropbox, Inc. Image annotations in collaborative content items
CN109558448A (en) * 2018-10-10 2019-04-02 北京海数宝科技有限公司 Data processing method, device, computer equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518187A (en) * 2021-07-13 2021-10-19 北京达佳互联信息技术有限公司 Video editing method and device
CN113518187B (en) * 2021-07-13 2024-01-09 北京达佳互联信息技术有限公司 Video editing method and device
CN115619905A (en) * 2022-10-24 2023-01-17 北京力控元通科技有限公司 Primitive editing method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant