CN114035877A - Interface editing method and device, storage medium and electronic equipment - Google Patents

Interface editing method and device, storage medium and electronic equipment

Info

Publication number
CN114035877A
CN114035877A
Authority
CN
China
Prior art keywords
user interface
terminal
operation instruction
editing
component
Prior art date
Legal status
Pending
Application number
CN202111329034.1A
Other languages
Chinese (zh)
Inventor
何维
康伟雄
王欢
牛袖风
庞晓智
胡震宇
Current Assignee
Shenzhen Nut Software Co.,Ltd.
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202111329034.1A
Publication of CN114035877A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/23: Updating

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to an interface editing method and apparatus, a storage medium, and an electronic device. The method includes: receiving an interactive operation instruction, input by a user, for a first user interface displayed by a first terminal, and editing the first user interface according to that instruction; converting the interactive operation instruction into an editing operation instruction for a component or container corresponding to the first user interface; and synchronizing the editing operation instruction to a second terminal so that the second terminal synchronously edits a second user interface displayed on the second terminal according to the editing operation instruction. In this way, cross-terminal synchronous editing of a user interface is achieved: multiple terminals edit the same user interface at the same time, which improves editing efficiency and user experience. In addition, the terminals that synchronously edit the same user interface may include a large screen, so a user can view the editing result directly on the large screen and adjust the user interface accordingly, further improving editing efficiency and user experience.

Description

Interface editing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an interface editing method and apparatus, a storage medium, and an electronic device.
Background
Current user interface (UI) synchronous-editing schemes are basically implemented with screen-mirroring technology. That is, the screen of the main editing device (such as a mobile phone), or the picture inside an application, is captured frame by frame at high frequency, and the captured frames are streamed as a sequence of picture frames to the other terminals that need to display them, over Wi-Fi (Wireless Fidelity) or similar communication links. This is therefore in essence a screen-projection technique: the other terminals merely display the main editing device's screen during the editing process, and cross-terminal synchronous editing of the UI is not achieved.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an interface editing method, apparatus, storage medium, and electronic device.
In order to achieve the above object, in a first aspect, the present disclosure provides an interface editing method applied to a first terminal, including: receiving an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and editing the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to a second terminal so that the second terminal can synchronously edit a second user interface displayed on the second terminal according to the editing operation instruction.
Optionally, the synchronizing the editing operation instruction to the second terminal includes: directly sending the editing operation instruction to a second terminal; or sending the editing operation instruction to a server in communication connection with the first terminal, so that the server sends the editing operation instruction to a second terminal.
Optionally, the operation type of the editing operation instruction is any one of: changing a property of a component, changing a property of a container, adding a component, adding a container, deleting a component, and deleting a container.
Optionally, the method further comprises: receiving an update instruction input by a user, wherein the update instruction is an update instruction for target content stored by a server, and the target content is one of a component, a container and a material; and sending the updating instruction to the server so as to update the target content by the server.
Optionally, the method further comprises: receiving a synchronization instruction which is sent by the server and contains the updated target content; if the first terminal currently displays the first user interface, updating the target content in the currently displayed first user interface into the updated target content; and if the first terminal does not display the first user interface currently, updating the target content in the first user interface to the updated target content in the background.
Optionally, the first user interface includes a display area and an editing area, and the display area of the first user interface is the same as the content displayed by the second user interface.
In a second aspect, the present disclosure provides an interface editing method applied to a second terminal, including: receiving an editing operation instruction, where the editing operation instruction is generated by a first terminal according to an interactive operation instruction, input by a user, for a first user interface displayed by the first terminal; parsing the editing operation instruction to obtain an operation instruction for a component or container corresponding to a second user interface; and editing the second user interface according to the operation instruction for the component or container corresponding to the second user interface, so as to achieve synchronous editing of the first user interface and the second user interface.
Optionally, when the operation instruction for the component or container corresponding to the second user interface is to add a component or add a container, the editing of the second user interface according to that operation instruction includes: determining, from the operation instruction, the identifier and the target addition position of the target element to be added, where the target element is a component or a container; acquiring the target element corresponding to the identifier from a server; and adding the target element at the target addition position in the second user interface.
Optionally, before the step of adding the target element to the target addition position in the second user interface, the editing the second user interface according to the operation instruction on the component or the container corresponding to the second user interface further includes: acquiring screen parameters of the second terminal; adjusting the appearance of the target element according to the screen parameters; the adding the target element at the target add location in the second user interface includes: and adding the adjusted target element at the target adding position in the second user interface.
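The add-element flow in the two paragraphs above can be sketched as follows. This is a minimal illustration, not the patent's implementation: all function and field names are assumptions, the "server" is a local dictionary standing in for a real fetch, and a simple proportional rescale against an assumed 1080-px source width stands in for the unspecified screen-parameter adjustment.

```python
def adapt_to_screen(element: dict, src_width: int, dst_width: int) -> dict:
    """Scale an element's geometry from the source to the target screen width."""
    scale = dst_width / src_width
    adapted = dict(element)
    for key in ("x", "y", "width", "height"):
        adapted[key] = round(adapted[key] * scale)
    return adapted

def add_element(ui: dict, instruction: dict, fetch, screen_width: int) -> None:
    """Fetch the new element by identifier, place it, adapt it, then insert it."""
    element = dict(fetch(instruction["element_id"]))  # obtained from the server
    element.update(instruction["position"])           # target addition position
    ui["elements"].append(adapt_to_screen(element, 1080, screen_width))

# Example: an element authored for a 1080-px-wide phone, added on a 3840-px screen.
ui = {"elements": []}
fake_server = {"c7": {"x": 0, "y": 0, "width": 200, "height": 80}}
add_element(ui, {"element_id": "c7", "position": {"x": 100, "y": 40}},
            fake_server.get, screen_width=3840)
```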
Optionally, when the operation instruction for the component or the container corresponding to the second user interface is a newly added component and the component is a special effect component, the editing of the second user interface according to the operation instruction for the component or the container corresponding to the second user interface further includes: acquiring a rendering engine of the second terminal; rendering, with the rendering engine, the target element added to the second user interface.
Optionally, the method further comprises: receiving a synchronization instruction which is sent by a server and contains the updated target content; if the second terminal currently displays the second user interface, updating the target content in the currently displayed second user interface into the updated target content; and if the second user interface is not displayed at present by the second terminal, updating the target content in the second user interface to the updated target content in the background.
In a third aspect, the present disclosure provides an interface editing method applied to a server, including: receiving an update instruction which is uploaded by a terminal and aims at target content stored by the server, wherein the target content is one of a component, a container and a material; updating the target content corresponding to the updating instruction; and synchronizing the updated target content to other terminals using the target content.
In a fourth aspect, the present disclosure provides an interface editing method applied to a server, including: receiving an editing operation instruction uploaded by a first terminal, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at a first user interface; and sending the editing operation instruction to a second terminal so that the second terminal edits a component or a container corresponding to a second user interface according to the editing operation instruction, and the first user interface and the second user interface are synchronously edited.
Optionally, when the operation type of the editing operation instruction is a newly added component or a newly added container, before the step of sending the editing operation instruction to the second terminal, the method further includes: determining an identifier of a target element to be newly added according to the editing operation instruction, wherein the target element is a component or a container; acquiring a target element corresponding to the identifier; the sending the editing operation instruction to the second terminal includes: and sending the editing operation instruction and the target element to a second terminal.
Optionally, before the step of sending the editing operation instruction and the target element to the second terminal, the method further includes: acquiring screen parameters of the second terminal; adjusting the appearance of the target element according to the screen parameters; the sending the editing operation instruction and the target element to a second terminal includes: and sending the editing operation instruction and the adjusted target element to a second terminal.
Optionally, the server stores a container and a component management rule; the method further comprises the following steps: and synchronizing the management rule to the first terminal and the second terminal so that the first terminal and the second terminal display user interfaces according to the management rule.
In a fifth aspect, the present disclosure provides an interface editing apparatus applied to a first terminal, including: the first receiving module is used for receiving an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and editing the first user interface according to the interactive operation instruction; the conversion module is used for converting the interactive operation instruction received by the first receiving module into an editing operation instruction of a component or a container corresponding to the first user interface; and the first synchronization module is used for synchronizing the editing operation instruction obtained by the conversion module to a second terminal so that the second terminal can synchronously edit a second user interface displayed on the second terminal according to the editing operation instruction.
In a sixth aspect, the present disclosure provides an interface editing apparatus, applied to a second terminal, including: the second receiving module is used for receiving an editing operation instruction, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at the first user interface; the analysis module is used for analyzing the editing operation instruction received by the second receiving module to obtain an operation instruction of a component or a container corresponding to a second user interface; and the editing module is used for editing the second user interface according to the operation instruction of the component or the container corresponding to the second user interface, which is obtained by the analyzing module, so as to realize the synchronous editing of the first user interface and the second user interface.
In a seventh aspect, the present disclosure provides an interface editing apparatus, applied to a server, including: a third receiving module, configured to receive an update instruction, where the update instruction is an update instruction that is uploaded by a terminal and is for target content stored by the server, where the target content is one of a component, a container, and a material; the first updating module is used for updating the target content corresponding to the updating instruction received by the third receiving module; and the second synchronization module is used for synchronizing the target content obtained after the first updating module updates to other terminals using the target content.
In an eighth aspect, the present disclosure provides an interface editing apparatus, applied to a server, including: the fourth receiving module is used for receiving an editing operation instruction uploaded by the first terminal, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at the first user interface; the first sending module is configured to send the editing operation instruction received by the fourth receiving module to the second terminal, so that the second terminal edits a component or a container corresponding to the second user interface according to the editing operation instruction, and the first user interface and the second user interface are synchronously edited.
In a ninth aspect, the present disclosure provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method provided by the first aspect of the present disclosure, the steps of the method provided by the second aspect of the present disclosure, the steps of the method provided by the third aspect of the present disclosure, or the steps of the method provided by the fourth aspect of the present disclosure.
In a tenth aspect, the present disclosure provides an electronic device comprising: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the method provided by the first aspect of the present disclosure, the steps of the method provided by the second aspect of the present disclosure, the steps of the method provided by the third aspect of the present disclosure, or the steps of the method provided by the fourth aspect of the present disclosure.
The disclosure provides an interface editing method and apparatus, a storage medium, and an electronic device. In the interface editing method, a first terminal receives an interactive operation instruction, input by a user, for a first user interface displayed by the first terminal, and edits the first user interface according to that instruction; converts the interactive operation instruction into an editing operation instruction for a component or container corresponding to the first user interface; and synchronizes the editing operation instruction to a second terminal so that the second terminal synchronously edits a second user interface displayed on the second terminal according to the editing operation instruction. In this way, cross-terminal synchronous editing of the user interface is achieved: multiple terminals can edit the same user interface at the same time, which improves editing efficiency and user experience. In addition, the terminals that synchronously edit the same user interface may include a large screen, so the user can view the editing result directly on the large screen and adjust the user interface accordingly, further improving editing efficiency and user experience.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
FIG. 1A is a schematic diagram illustrating an interface synchronization editing system, according to an example embodiment.
Fig. 1B is a schematic diagram illustrating an interface synchronization editing system according to another exemplary embodiment.
FIG. 2 is a flow diagram illustrating a method of interface editing in accordance with an exemplary embodiment.
FIG. 3 is a flow diagram illustrating a method of interface editing in accordance with an exemplary embodiment.
FIG. 4 is a flow diagram illustrating a method of interface editing in accordance with an exemplary embodiment.
FIG. 5 is a flowchart illustrating a method of interface editing according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating an interface editing apparatus according to an example embodiment.
Fig. 7 is a block diagram illustrating an interface editing apparatus according to an example embodiment.
Fig. 8 is a block diagram illustrating an interface editing apparatus according to an example embodiment.
Fig. 9 is a block diagram illustrating an interface editing apparatus according to an example embodiment.
FIG. 10 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 11 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 12 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
The present disclosure provides an interface synchronous editing system, wherein the interface synchronous editing system may include a server and a plurality of terminals that synchronously edit the same user interface.
As shown in Fig. 1A and 1B, the interface synchronous editing system includes a server 20 and terminals 100, 110, …, 120, and 130 that synchronously edit the same user interface.
The terminals may be local or remote terminals, for example smart phones, tablet computers, notebook computers, desktop computers, projection devices, and the like, which this disclosure does not specifically limit. All terminals are communicatively connected to the server, and the terminals can synchronously edit the same user interface. The operating system installed on a terminal may be iOS, Android, Windows, or the like.
The user interface may include elements such as components and containers. A component may also reference multimedia data (e.g., pictures, videos) or data from internet services. For example, a weather component may obtain weather information online (e.g., sunny, snowstorm) and, through a preset instruction set, convert it into a visual UI expression (e.g., a sun graphic for sunny weather, a snowflake graphic for a snowstorm). A container is a virtual frame that carries components; it may be a layer, or another carrying frame with overall management rules for a group of components. A component is a granular object with relatively independent functionality and UI expression that can be added modularly within a container. The user can add different components to a container and adjust the properties of the container and its components to achieve customized editing of the UI.
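The container-component model described above can be pictured with a minimal sketch. All class, field, and method names here are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Component:
    """A granular UI object with relatively independent function and expression."""
    component_id: str
    kind: str                                   # e.g. "image", "video", "weather"
    properties: Dict[str, object] = field(default_factory=dict)

@dataclass
class Container:
    """A virtual frame (e.g. a layer) that carries components."""
    container_id: str
    properties: Dict[str, object] = field(default_factory=dict)
    components: List[Component] = field(default_factory=list)

    def add_component(self, component: Component) -> None:
        self.components.append(component)

# A user interface is then a collection of containers holding components:
ui = [Container("layer-1", {"position": (0, 0)})]
ui[0].add_component(Component("weather-1", "weather", {"city": "Shenzhen"}))
```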
In addition, to reduce the local storage consumed on each terminal by resources such as components, containers, and multimedia data, these resources may be stored on the server 20 and shared through it. In this way, when content such as a component, container, or multimedia data needs to be added to the user interface being synchronously edited, each terminal can obtain the corresponding content from the server 20.
As shown in fig. 1A, each terminal is connected to the server 20, and at this time, each terminal implements user interface synchronous editing using the server 20 as a medium.
As shown in Fig. 1B, each terminal is connected to the server 20, and the terminals may also be connected to each other; in this case, the terminals can implement synchronous user-interface editing through direct communication. The terminals may connect wirelessly through Wi-Fi, Bluetooth, hotspots, a wide area network, and the like, or through a wired connection via a router.
To implement cross-terminal synchronous editing of the user interface, a container-component UI presentation framework must first be designed, and every terminal must be compatible with it, so that each terminal implements synchronous editing of the user interface on the basis of this framework. Specifically, the container-component UI presentation framework needs to include:
(i) Rules by which the entire UI framework governs containers as a whole. For example, an upper limit on the number of containers, rules on where containers may be placed, mutual-exclusion rules for overlapping containers, 3D spatial perspective rules, and the like.
(ii) Description and editing capabilities for the overall UI framework. For example, changing the relative position of containers in space, adding, deleting containers, etc.
(iii) Description and editing capabilities for individual containers. For example, container edit area definitions, component type definitions that may be added within a container, adding, deleting components within a container, changing the location of components within a container, and the like.
(iv) Description and editing capabilities for individual components. For example, zooming in, zooming out, rotating a single component, changing the appearance of a component according to different component types, changing the content of a component, and so forth.
(v) A self-adaptation mechanism for the whole UI presentation framework across different terminals and software platforms. For example, on devices with different resolutions and screen sizes, the layout of containers and components, aspect ratios, and color reproduction all adapt automatically.
(vi) Customized representations and functions of certain components for different terminals, software platforms, and rendering engines. For example, a video component may use different video plug-in technologies on platforms such as iOS, Android, and Windows; a special-effects component may have different effect expressions on engines such as Unity and OpenGL.
In addition, in order to implement cross-terminal synchronous editing of a user interface, a cross-platform, cross-terminal communication protocol needs to be constructed in advance, and the devices (i.e., the terminals) participating in cross-terminal synchronous UI editing need to adapt to this protocol. Specifically, the communication protocol includes:
(1) Account authentication and communication authorization (i.e., synchronous-editing permission authentication) for the devices participating in cross-terminal synchronous editing, and designation of which device terminals must be synchronized for each change operation during the synchronous editing process;
(2) A description format for a change to a given container/component attribute, based on the editable attributes of each container and component defined in the container-component UI presentation framework.
For example, in a container's attributes, the addition of an attribute field caused by adding a component can be defined; in a component's attributes, description formats for changes such as coordinate changes, color-value field changes, and changes to the address of referenced content can be defined.
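Such attribute-change descriptions might look like the following sketch. The field names and JSON shape are assumptions for illustration; the patent does not fix a concrete wire format:

```python
import json

# A container gains an attribute field because a component was added:
container_change = {
    "target": {"type": "container", "id": "layer-1"},
    "op": "add_field",
    "field": "components",
    "value": ["component-7"],
}

# A component's coordinates, a color-value field, and the address of its
# referenced content change:
component_change = {
    "target": {"type": "component", "id": "component-7"},
    "op": "change_property",
    "changes": {"x": 120, "y": 48, "color": "#ff8800",
                "content_url": "https://example.com/a.png"},
}

payload = json.dumps(component_change)  # serialized for cross-terminal transport
```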
(3) Operation instruction sets established in advance for different platforms and terminals, built around each platform's and terminal's user-interaction modes: they map the different user behavior events of each platform and terminal to the "add" events (i.e., newly added containers/components) or the attribute-change events defined in (2). Each operation instruction set contains the editing operation instructions corresponding to the different interactive operation instructions (where an editing operation instruction is either a change to a container/component attribute or the addition of a container/component).
For example, a finger drag-move plus a two-finger zoom gesture on a mobile terminal, and a mouse-down drag plus a single-point stretch on a PC (Personal Computer), both correspond to changes to a component's coordinates and aspect ratio as defined in (2).
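The per-platform mapping in this example can be sketched as a lookup table: different platform events converge on the same platform-neutral edit operation. The event and operation names are illustrative assumptions:

```python
# Map (platform, interaction event) to a platform-neutral edit operation.
EDIT_OPS = {
    ("mobile", "drag_move"): "change_position",
    ("mobile", "two_finger_pinch"): "change_size",
    ("pc", "mouse_drag"): "change_position",
    ("pc", "single_point_stretch"): "change_size",
}

def to_edit_instruction(platform: str, event: str, target_id: str) -> dict:
    """Convert a platform-specific interaction event into an edit instruction."""
    return {"target": target_id, "op": EDIT_OPS[(platform, event)]}

# A pinch on mobile and a single-point stretch on a PC yield the same edit op:
mobile_op = to_edit_instruction("mobile", "two_finger_pinch", "c1")
pc_op = to_edit_instruction("pc", "single_point_stretch", "c1")
```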
Fig. 2 is a flowchart illustrating an interface editing method according to an exemplary embodiment, in which the method is applied to a first terminal, for example, the method may be applied to any one of the terminal 100, the terminal 110, the terminal 120, and the terminal 130 shown in fig. 1A or 1B, and the first terminal may be a smart phone, a tablet computer, a notebook computer, a desktop computer, a projection device, and the like. As shown in fig. 2, the method includes the following S201 to S203.
In S201, an interactive operation instruction, which is input by a user and is specific to a first user interface displayed by a first terminal, is received, and the first user interface is edited according to the interactive operation instruction.
In the disclosure, a user may interact with the first terminal through touch, keyboard input, voice input, and the like.
After the first user interface is edited according to the interactive operation instruction, the first user interface can be re-rendered. Editing the first user interface includes: and editing the interface layout and the interface content of the first user interface.
In S202, the interactive operation instruction is converted into an editing operation instruction for the component or container corresponding to the first user interface.
In the present disclosure, an interactive operation instruction may be converted into an editing operation instruction for a component or a container corresponding to a first user interface according to a pre-established operation instruction set, where the operation instruction set includes editing operation instructions corresponding to different interactive operation instructions.
Illustratively, the first terminal is a smart phone, the interactive operation instruction input by the user and specific to the first user interface displayed by the first terminal is to drag and move the component 1, and the editing operation instruction corresponding to the interactive operation instruction is to change the position of the component 1 in the first user interface.
As another example, the first terminal is a PC, the interactive operation instruction input by the user for the first user interface displayed by the first terminal is a single-point stretch on the component 2, and the editing operation instruction corresponding to that interactive operation instruction is to change the size (i.e., width and height) of the component 2 in the first user interface.
In addition, an editing operation instruction may include a target operation object (a component or a container) and a target operation type. The target operation type may be any one of: changing a property of a component (e.g., its position, color, size, frame style, lighting effect, or referenced content), changing a property of a container (e.g., its position, size), adding a component, adding a container, deleting a component, and deleting a container. The component may be any one of an image component, a video component, an audio component, a text component, an online component, and a special-effects component. An online component obtains internet content through an online data service; for example, it may be a weather component or a time component. The weather component, for instance, obtains weather information provided by the internet through an online data service and displays it on the user interface, updating in real time.
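The enumerated operation types and component kinds above can be written out as a sketch; the enum names and the use of Python's `Enum` are illustrative choices, not part of the patent:

```python
from enum import Enum

class OperationType(Enum):
    CHANGE_COMPONENT_PROPERTY = "change_component_property"
    CHANGE_CONTAINER_PROPERTY = "change_container_property"
    ADD_COMPONENT = "add_component"
    ADD_CONTAINER = "add_container"
    DELETE_COMPONENT = "delete_component"
    DELETE_CONTAINER = "delete_container"

class ComponentKind(Enum):
    IMAGE = "image"
    VIDEO = "video"
    AUDIO = "audio"
    TEXT = "text"
    ONLINE = "online"                 # e.g. a weather or time component
    SPECIAL_EFFECTS = "special_effects"

# An editing operation instruction pairs a target object with an operation type:
instruction = {"target": "component-2",
               "type": OperationType.CHANGE_COMPONENT_PROPERTY}
```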
In S203, the editing operation instruction is synchronized to the second terminal, so that the second terminal performs synchronous editing on the second user interface displayed on the second terminal according to the editing operation instruction.
In one embodiment, the second user interface is the same interface as the first user interface; that is, the second user interface has the same interface layout and interface content (e.g., components, containers) as the first user interface. After the first terminal performs an editing operation on the first user interface, the second terminal performs the same editing operation on the second user interface, ensuring that the content displayed on the second user interface remains consistent with the first user interface; that is, synchronous editing is realized.
In another embodiment, the first user interface includes a display area and an editing area, and the display area of the first user interface shows the same content as the second user interface; that is, the layout and content (e.g., components, containers) of the second user interface are the same as those of the display area of the first user interface. The display area is an area composed of the containers and components, and the editing area is an area for editing the display area, for example, a keyboard area. After the first terminal performs an editing operation on the display area through the editing area, the second terminal performs the same editing operation on the second user interface, ensuring that the content displayed on the second user interface remains consistent with the display area of the first user interface; that is, synchronous editing is realized.
In addition, the number of the second terminals may be one or more, and is not particularly limited in the present disclosure.
In one embodiment, as shown in fig. 1A, there is no communication connection between the first terminal and the second terminal of the synchronous editing user interface, and at this time, the first terminal may send the editing operation instruction to a server in communication connection with the first terminal, so that the server sends the editing operation instruction to the second terminal of the synchronous editing user interface.
In another embodiment, as shown in fig. 1B, the first terminal is in direct communication connection with the second terminal of the synchronous editing user interface, and in this case, the first terminal may directly send the editing operation instruction to the second terminal of the synchronous editing user interface.
After receiving the editing operation instruction sent by the first terminal, the second terminal can synchronously edit the second user interface displayed by the second terminal according to the editing operation instruction, and re-render the synchronously edited second user interface.
In the technical scheme, the first terminal receives an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and edits the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to the second terminal so that the second terminal can synchronously edit the second user interface displayed on the second terminal according to the editing operation instruction. Therefore, cross-terminal synchronous editing of the user interface can be achieved, namely a plurality of terminals can synchronously edit the same user interface, so that the editing efficiency of the user interface is improved, and the user experience is improved. In addition, the terminal for synchronously editing the same user interface can comprise a large screen, so that a user can directly view the editing effect through the large screen, and further can adjust the user interface according to the editing effect, thereby further improving the editing efficiency and the user experience of the user interface.
In addition, the user can modify resources such as components, containers and materials in the server through the first terminal, and specifically, the method further includes the following steps 1) and 2):
1) Receiving an update instruction input by the user.
2) Sending the update instruction to the server so that the server updates the target content.
In the present disclosure, the update instruction is an update instruction for target content stored by the server, where the target content may be any one of a component, a container, and a material (e.g., multimedia content). The update instruction may include an identifier (e.g., an ID) of the target content and an update operation; the update operation may be changing attributes of a component (e.g., position, color, size, border style, lighting effect, referenced content), changing attributes of a container (e.g., position, size, etc.), or changing a material (e.g., changing the hue of a picture, changing the duration of a piece of music, etc.). In this way, after receiving the update instruction sent by the first terminal, the server can update the target content corresponding to the identifier according to the update operation in the update instruction.
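A minimal sketch of how a server might apply such an update instruction, assuming the stored content is keyed by its identifier; the field names (`id`, `update_operation`) are hypothetical:

```python
def apply_update(store: dict, update_instruction: dict) -> None:
    """Apply an update instruction {id, update_operation} to the server's content store."""
    content_id = update_instruction["id"]
    # The update operation is a mapping of attribute names to new values,
    # e.g. {"color": "#ff0000"} for a component or {"hue": 0.4} for a material.
    store[content_id].update(update_instruction["update_operation"])

# Example: a terminal asks the server to recolor a stored component.
store = {"component-1": {"type": "image", "color": "#000000", "size": (200, 100)}}
apply_update(store, {"id": "component-1", "update_operation": {"color": "#ff0000"}})
```

Only the attributes named in the update operation change; the rest of the stored content is left intact.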
Further, the method includes the following steps [1] to [4]:
[1] Receiving a synchronization instruction, sent by the server, that contains the updated target content.
In the present disclosure, after updating the target content, the server synchronizes the updated target content to each of the other terminals (e.g., the first terminal) that use the target content. Specifically, a synchronization instruction containing the updated target content may be sent to those terminals.
[2] Judging whether the first terminal currently displays the first user interface.
In this disclosure, when the first terminal receives the synchronization instruction sent by the server, the currently displayed screen may be the first user interface or some other screen; therefore, it is necessary to determine whether the first terminal currently displays the first user interface. If it does, the following step [3] is executed; if it does not, the following step [4] is executed.
[3] Updating the target content in the currently displayed first user interface to the updated target content.
[4] Updating, in the background, the target content in the first user interface to the updated target content.
In the present disclosure, the background refers to a data-update background: when the first terminal is not currently showing the first user interface, the target content in the first user interface can still be updated to the updated target content in the background. Thus, when the first terminal displays the first user interface again, the interface shown is already the latest version, guaranteeing real-time synchronization of the first user interface.
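The foreground-versus-background branch of steps [2] to [4] can be sketched as follows; the class and attribute names are illustrative assumptions, not part of the disclosure:

```python
class FirstTerminal:
    """Illustrative terminal holding a UI model plus a flag for what is on screen."""
    def __init__(self, showing_first_ui: bool):
        self.showing_first_ui = showing_first_ui
        self.ui_model = {}       # target content referenced by the first user interface
        self.redrawn = False     # whether the visible screen was re-rendered

    def handle_sync(self, content_id: str, updated_content: dict) -> None:
        # Steps [2]-[4]: update on screen if the first UI is shown, else in the background.
        self.ui_model[content_id] = updated_content
        if self.showing_first_ui:
            self.redrawn = True  # step [3]: refresh the displayed first user interface
        # step [4]: otherwise the model is updated silently; the next time the first
        # user interface is shown, it already reflects the latest target content.
```

In both branches the UI model ends up current; only whether an immediate re-render happens differs.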
In the present disclosure, in order to keep the content referenced by the user interface consistent with the content in the server, after the server updates the target content, it may determine whether the first user interface of the first terminal includes the target content. If so, a synchronization instruction containing the updated target content is sent to the first terminal and to each second terminal synchronously editing the user interface, so that the updated target content is synchronized to both. After receiving the synchronization instruction, the first terminal and the second terminal each update the target content in their corresponding user interface to the updated target content.
Fig. 3 is a flowchart illustrating an interface editing method according to an exemplary embodiment, in which the method is applied to a second terminal, for example, the method may be applied to any one of the terminals 100, 110, 120, and 130 shown in fig. 1A or 1B, and the second terminal may be a smart phone, a tablet computer, a notebook computer, a desktop computer, a projection device, and the like. As shown in fig. 3, the method includes the following S301 to S303.
In S301, an editing operation instruction is received, where the editing operation instruction is generated by the first terminal according to an interactive operation instruction for the first user interface input by the user.
In S302, the editing operation instruction is analyzed to obtain an operation instruction for a component or container corresponding to the second user interface.
In this disclosure, the editing operation instruction includes a target operation object (the component or container to be edited) and a target operation type. By parsing the editing operation instruction, the target operation object and the target operation type of the editing operation are obtained; that is, the operation instruction for the component or container corresponding to the second user interface is obtained, where the component or container corresponding to the second user interface is the target operation object, and the operation instruction carries the target operation type.
In S303, the second user interface is edited according to an operation instruction for the component or the container corresponding to the second user interface, so as to implement synchronous editing of the first user interface and the second user interface.
In this disclosure, the container or component corresponding to the second user interface may be edited, according to the target operation type, based on the operation instruction for that component or container.
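S302 and S303 together amount to parsing the instruction into its target operation object and operation type and then dispatching on that type; a minimal sketch, with hypothetical field names (`target_object`, `operation_type`, `payload`):

```python
def parse_and_apply(second_ui: dict, instruction: dict) -> None:
    """Parse an editing operation instruction into its target operation object and
    target operation type, then edit the second user interface accordingly."""
    target = instruction["target_object"]
    op = instruction["operation_type"]
    if op == "change_component_property":
        second_ui["components"][target].update(instruction["payload"])
    elif op == "add_component":
        second_ui["components"][target] = instruction["payload"]
    elif op == "delete_component":
        del second_ui["components"][target]
    # container operations would follow the same pattern

# Example: the second terminal mirrors a position change made on the first terminal.
second_ui = {"components": {"component-1": {"position": (0, 0)}}}
parse_and_apply(second_ui, {"target_object": "component-1",
                            "operation_type": "change_component_property",
                            "payload": {"position": (120, 40)}})
```

After applying the instruction, the second terminal would re-render the edited second user interface.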
In the technical scheme, the first terminal receives an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and edits the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to the second terminal so that the second terminal can synchronously edit the second user interface displayed on the second terminal according to the editing operation instruction. Therefore, cross-terminal synchronous editing of the user interface can be achieved, namely a plurality of terminals can synchronously edit the same user interface, so that the editing efficiency of the user interface is improved, and the user experience is improved. In addition, the terminal for synchronously editing the same user interface can comprise a large screen, so that a user can directly view the editing effect through the large screen, and further can adjust the user interface according to the editing effect, thereby further improving the editing efficiency and the user experience of the user interface.
Optionally, the operation type of the editing operation instruction is any one of changing a property of a component, changing a property of a container, adding a component, adding a container, deleting a component, and deleting a container.
Optionally, the component is any one of an image component, a video component, an audio component, a text component, an online component and a special effect component, wherein the online component is a component for acquiring internet content through an online data service.
In addition, when the operation instruction for the component or container corresponding to the second user interface is to add a component or add a container, the second user interface may be edited through the following steps 1] to 3]:
1] Determining, according to the operation instruction for the component or container corresponding to the second user interface, the identifier and the target adding position of the target element to be added, where the target element is a component or a container.
In the present disclosure, the target adding position is a position in the second user interface corresponding to the adding position of the corresponding component or container in the first user interface.
2] Acquiring the target element corresponding to the identifier from the server.
In the disclosure, the second terminal may send an acquisition request including the identifier to the server; and after receiving the acquisition request, the server locally acquires the target element corresponding to the identifier and then feeds the target element back to the second terminal.
3] Adding the target element at the target adding position in the second user interface.
Further, before step 3], S303 may further include the following steps 4] and 5]:
4] Acquiring the screen parameters of the second terminal.
In the present disclosure, the screen parameters may include a screen resolution and a screen size.
5] Adjusting the appearance of the target element according to the screen parameters.
Specifically, the appearance of the target element may be adjusted, such as its layout, size (e.g., width and height), and color (e.g., color reproduction).
At this time, the above step 3] may add the adjusted target element at the target adding position in the second user interface, ensuring that the target element in the second user interface adapts to the screen of the second terminal.
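One way to realize step 5] is to scale the element's layout and size by the ratio between the two terminals' screen resolutions; this is a sketch under that assumption, not the only possible adjustment:

```python
def adapt_element(element: dict, source_screen: tuple, target_screen: tuple) -> dict:
    """Scale an element's position and size from the first terminal's screen
    resolution to the second terminal's, both given as (width, height) pixels."""
    sx = target_screen[0] / source_screen[0]
    sy = target_screen[1] / source_screen[1]
    adapted = dict(element)
    adapted["x"] = round(element["x"] * sx)
    adapted["y"] = round(element["y"] * sy)
    adapted["width"] = round(element["width"] * sx)
    adapted["height"] = round(element["height"] * sy)
    return adapted

# Example: a 1920x1080 layout adapted for a 3840x2160 large screen.
adapted = adapt_element({"x": 100, "y": 50, "width": 300, "height": 200},
                        (1920, 1080), (3840, 2160))
```

The adjusted element, rather than the original, is then added at the target adding position in step 3].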
When the operation instruction for the component or container corresponding to the second user interface is to add a component, and the component is a special-effects component, the method may further include the following steps 6] and 7]:
6] Acquiring the rendering engine of the second terminal.
7] Rendering the target element added to the second user interface with the rendering engine.
In the present disclosure, the rendering engine of the second terminal may be, for example, Unity or OpenGL (Open Graphics Library). A special-effects component may render differently on different engines such as Unity and OpenGL, so the special-effects component can provide customized appearances and customized functions on terminals with different rendering engines.
Optionally, the method further includes:
receiving a synchronization instruction which is sent by a server and contains the updated target content;
if the second terminal currently displays the second user interface, updating the target content in the currently displayed second user interface into the updated target content;
and if the second user interface is not displayed at present by the second terminal, updating the target content in the second user interface into the updated target content in the background.
With regard to the method applied to the embodiment of the second terminal side, the specific manner in which each step performs an operation has been described in detail in the embodiment of the interface editing method of the first terminal side, and will not be elaborated herein.
In addition, it should be noted that the number of first terminals and second terminals synchronously editing the user interface may each be one or more, which is not specifically limited in the present disclosure. Moreover, the permissions of the terminals synchronously editing the user interface may differ from one another.
Fig. 4 is a flowchart illustrating an interface editing method according to an exemplary embodiment, in which the method is applied to a server, and the method may be applied to the server 20 illustrated in fig. 1A or 1B, for example. As shown in fig. 4, the method includes the following S401 to S403.
In S401, an update instruction is received, where the update instruction is an update instruction uploaded by a terminal and directed to target content stored by a server.
Wherein the target content is one of a component, a container, and a material.
In S402, the target content corresponding to the update instruction is updated.
In S403, the updated target content is synchronized with another terminal using the target content.
In this embodiment, the user can update content such as components, containers, and materials in the server at any time through a terminal, and the modified content is synchronized to each terminal, thereby ensuring that the content referenced by the currently edited user interface remains consistent with the server.
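S401 to S403 can be sketched as a server that tracks which terminals use each piece of content and pushes updates to them; the class and attribute names are illustrative assumptions:

```python
class Server:
    """Illustrative server performing S401-S403: receive an update instruction,
    apply it, and push the updated target content to the terminals that use it."""
    def __init__(self):
        self.store = {}   # content_id -> content (component / container / material)
        self.usage = {}   # content_id -> list of terminal inboxes using that content

    def handle_update(self, content_id: str, update_op: dict) -> None:
        self.store[content_id].update(update_op)         # S402: update target content
        for inbox in self.usage.get(content_id, []):     # S403: synchronize to users
            inbox.append({"id": content_id, "content": dict(self.store[content_id])})

# Example: a material is updated and the change is pushed to a subscribed terminal.
server = Server()
server.store["material-1"] = {"hue": 0.1}
terminal_inbox = []
server.usage["material-1"] = [terminal_inbox]
server.handle_update("material-1", {"hue": 0.4})
```

Each subscribed terminal then applies the received content in the foreground or background, as described for the terminal-side embodiments.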
Fig. 5 is a flowchart illustrating an interface editing method according to an exemplary embodiment, in which the method is applied to a server, and for example, the method may be applied to the server 20 illustrated in fig. 1A or 1B. As shown in fig. 5, the method includes the following S501 and S502.
In S501, an editing operation instruction uploaded by the first terminal is received, where the editing operation instruction is generated by the first terminal according to an interactive operation instruction for the first user interface and input by a user.
In S502, the editing operation instruction is sent to the second terminal, so that the second terminal edits the component or the container corresponding to the second user interface according to the editing operation instruction, and the first user interface and the second user interface are edited synchronously.
In the technical scheme, the first terminal receives an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and edits the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to the second terminal through the server so that the second terminal can synchronously edit the second user interface displayed on the second terminal according to the editing operation instruction. Therefore, cross-terminal synchronous editing of the user interface can be achieved, namely a plurality of terminals can synchronously edit the same user interface, so that the editing efficiency of the user interface is improved, and the user experience is improved. In addition, the terminal for synchronously editing the same user interface can comprise a large screen, so that a user can directly view the editing effect through the large screen, and further can adjust the user interface according to the editing effect, thereby further improving the editing efficiency and the user experience of the user interface.
In addition, when the operation type of the editing operation instruction is a newly added component or a newly added container, before the step S502, the method further includes the following two steps:
determining an identifier of a target element to be newly added according to the editing operation instruction, wherein the target element is a component or a container;
and acquiring a target element corresponding to the identifier.
At this time, the above S502 may transmit the editing operation instruction and the target element to the second terminal. After receiving the editing operation instruction, the second terminal analyzes the editing operation instruction to obtain an operation instruction of a component or a container corresponding to the second user interface; then, determining a target adding position of a target element to be newly added according to an operation instruction of a component or a container corresponding to the second user interface; then, the target element sent by the server is added at the target adding position in the second user interface.
In addition, before the step of sending the editing operation instruction and the target element to the second terminal, the method may further include the following two steps:
acquiring screen parameters of a second terminal;
and adjusting the appearance of the target element according to the screen parameters.
At this time, the sending the editing operation instruction and the target element to the second terminal may include: and sending the editing operation instruction and the adjusted target element to the second terminal.
Then, after receiving the editing operation instruction, the second terminal analyzes the editing operation instruction to obtain an operation instruction of a component or a container corresponding to the second user interface; then, determining a target adding position of a target element to be newly added according to an operation instruction of a component or a container corresponding to the second user interface; and then adding the adjusted target element sent by the server at the target adding position in the second user interface. Therefore, the target elements displayed by the second user interface can be ensured to be self-adaptive to the screen of the second terminal.
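The server-side relay of an "add" instruction described above (look up the target element by its identifier, adjust it for the second terminal's screen parameters, then forward it together with the editing operation instruction) might look like this sketch; all field names are hypothetical:

```python
def relay_add_instruction(server_store: dict, instruction: dict,
                          source_screen: tuple, target_screen: tuple) -> dict:
    """Look up the target element to be added, scale its size for the second
    terminal's screen, and bundle it with the editing operation instruction."""
    element = dict(server_store[instruction["target_element_id"]])
    sx = target_screen[0] / source_screen[0]
    sy = target_screen[1] / source_screen[1]
    element["width"] = round(element["width"] * sx)
    element["height"] = round(element["height"] * sy)
    return {"instruction": instruction, "target_element": element}

# Example: relay an add-component instruction from a 1920x1080 first terminal
# to a 960x540 second terminal; the element is scaled down by half.
message = relay_add_instruction(
    {"component-9": {"type": "text", "width": 300, "height": 120}},
    {"operation_type": "add_component", "target_element_id": "component-9"},
    (1920, 1080), (960, 540))
```

The second terminal then only has to parse the instruction, determine the target adding position, and insert the already-adjusted element.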
In addition, the server stores container and component management rules, i.e., a container-component UI display framework; the above method may further include the following step:
Synchronizing the management rules to the first terminal and the second terminal so that the first terminal and the second terminal display the user interface according to the management rules.
With regard to the method applied to the server-side embodiment, the specific manner in which each step performs operations has been described in detail in the embodiment of the interface editing method on the first terminal side, and will not be elaborated here.
Based on the same inventive concept, the disclosure also provides an interface editing device applied to the first terminal. As shown in fig. 6, the apparatus 600 includes: a first receiving module 601, configured to receive an interactive operation instruction, which is input by a user and is specific to a first user interface displayed by the first terminal, and edit the first user interface according to the interactive operation instruction; a conversion module 602, configured to convert the interactive operation instruction received by the first receiving module 601 into an editing operation instruction for a component or a container corresponding to the first user interface; a first synchronization module 603, configured to synchronize the editing operation instruction obtained by the conversion module 602 to a second terminal, so that the second terminal performs synchronous editing on a second user interface displayed on the second terminal according to the editing operation instruction.
In the technical scheme, the first terminal receives an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and edits the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to the second terminal so that the second terminal can synchronously edit the second user interface displayed on the second terminal according to the editing operation instruction. Therefore, cross-terminal synchronous editing of the user interface can be achieved, namely a plurality of terminals can synchronously edit the same user interface, so that the editing efficiency of the user interface is improved, and the user experience is improved. In addition, the terminal for synchronously editing the same user interface can comprise a large screen, so that a user can directly view the editing effect through the large screen, and further can adjust the user interface according to the editing effect, thereby further improving the editing efficiency and the user experience of the user interface.
Optionally, the first synchronization module 603 is configured to: directly sending the editing operation instruction to a second terminal; or sending the editing operation instruction to a server in communication connection with the first terminal, so that the server sends the editing operation instruction to a second terminal.
Optionally, the operation type of the editing operation instruction is any one of changing a property of a component, changing a property of a container, adding a component, adding a container, deleting a component, and deleting a container.
Optionally, the first receiving module 601 is further configured to receive an update instruction input by a user, where the update instruction is an update instruction for target content stored by a server, where the target content is one of a component, a container, and a material; the apparatus 600 further comprises: and the second sending module is used for sending the updating instruction to the server so as to update the target content by the server.
Optionally, the apparatus 600 further comprises: a fifth receiving module, configured to receive a synchronization instruction that is sent by the server and includes the updated target content; a second updating module, configured to update a target content in the currently displayed first user interface to the updated target content if the first terminal currently displays the first user interface; and the third updating module is used for updating the target content in the first user interface to the updated target content in the background if the first user interface is not displayed at present by the first terminal.
Optionally, the first user interface includes a display area and an editing area, and the display area of the first user interface is the same as the content displayed by the second user interface.
The disclosure also provides an interface editing device applied to the second terminal. As shown in fig. 7, the apparatus 700 includes: a second receiving module 701, configured to receive an editing operation instruction, where the editing operation instruction is generated by the first terminal according to an interactive operation instruction, which is input by a user and is for the first user interface; the analyzing module 702 is configured to analyze the editing operation instruction received by the second receiving module 701 to obtain an operation instruction for a component or a container corresponding to a second user interface; an editing module 703 is configured to edit the second user interface according to the operation instruction of the component or the container corresponding to the second user interface, which is obtained by the analyzing module 702, so as to implement synchronous editing of the first user interface and the second user interface.
In the technical scheme, the first terminal receives an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and edits the first user interface according to the interactive operation instruction; converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface; and synchronizing the editing operation instruction to the second terminal so that the second terminal can synchronously edit the second user interface displayed on the second terminal according to the editing operation instruction. Therefore, cross-terminal synchronous editing of the user interface can be achieved, namely a plurality of terminals can synchronously edit the same user interface, so that the editing efficiency of the user interface is improved, and the user experience is improved. In addition, the terminal for synchronously editing the same user interface can comprise a large screen, so that a user can directly view the editing effect through the large screen, and further can adjust the user interface according to the editing effect, thereby further improving the editing efficiency and the user experience of the user interface.
Optionally, when the operation instruction for the component or the container corresponding to the second user interface is a newly added component or a newly added container, the editing module 703 includes: the determining submodule is used for determining an identifier and a target adding position of a target element to be newly added according to the operating instruction of the component or the container corresponding to the second user interface, wherein the target element is the component or the container; the first obtaining submodule is used for obtaining the target element corresponding to the identifier from the server; and the adding submodule is used for adding the target element at the target adding position in the second user interface.
Optionally, the editing module 703 further includes: the second obtaining submodule is used for obtaining the screen parameters of the second terminal before the adding submodule adds the target element at the target adding position in the second user interface; the adjusting submodule is used for adjusting the appearance of the target element according to the screen parameters; and the adding submodule is used for adding the adjusted target element at the target adding position in the second user interface.
Optionally, when the operation instruction for the component or the container corresponding to the second user interface is a newly added component and the component is a special effect component, the editing module 703 further includes: a third obtaining submodule, configured to obtain a rendering engine of the second terminal; a rendering submodule for rendering the target element added to the second user interface using the rendering engine.
Optionally, the apparatus 700 further comprises: a sixth receiving module, configured to receive a synchronization instruction that is sent by the server and includes the updated target content; a fourth updating module, configured to update a target content in the currently displayed second user interface to the updated target content if the second user interface is currently displayed by the second terminal; and the fifth updating module is used for updating the target content in the second user interface to the updated target content in the background if the second user interface is not currently displayed by the second terminal.
The disclosure also provides an interface editing device applied to the server. As shown in fig. 8, the apparatus 800 includes: a third receiving module 801, configured to receive an update instruction, where the update instruction is an update instruction that is uploaded by a terminal and is for target content stored by the server, where the target content is one of a component, a container, and a material; a first updating module 802, configured to update the target content corresponding to the update instruction received by the third receiving module 801; a second synchronization module 803, configured to synchronize the target content updated by the first update module 802 to other terminals using the target content.
In this embodiment, the user can update content such as components, containers, and materials in the server at any time through a terminal, and the modified content is synchronized to each terminal, thereby ensuring that the content referenced by the currently edited user interface remains consistent with the server.
The disclosure also provides an interface editing device applied to the server. As shown in fig. 9, the apparatus 900 includes: a fourth receiving module 901, configured to receive an editing operation instruction uploaded by a first terminal, where the editing operation instruction is generated by the first terminal according to an interactive operation instruction input by a user and specific to a first user interface; the first sending module 902 is configured to send the editing operation instruction received by the fourth receiving module 901 to the second terminal, so that the second terminal edits a component or a container corresponding to the second user interface according to the editing operation instruction, and implements synchronous editing of the first user interface and the second user interface.
In this technical solution, the first terminal receives an interactive operation instruction, input by a user, for a first user interface displayed on the first terminal, and edits the first user interface according to the interactive operation instruction; converts the interactive operation instruction into an editing operation instruction on a component or a container corresponding to the first user interface; and synchronizes the editing operation instruction to the second terminal through the server, so that the second terminal synchronously edits a second user interface displayed on the second terminal according to the editing operation instruction. In this way, cross-terminal synchronous editing of a user interface is achieved, that is, multiple terminals can edit the same user interface at the same time, which improves both the editing efficiency of the user interface and the user experience. In addition, the terminals synchronously editing the same user interface may include a large screen, so that a user can view the editing effect directly on the large screen and adjust the user interface accordingly, further improving the editing efficiency and the user experience.
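The convert-then-relay flow described above can be sketched as follows: a raw interactive operation is turned into a terminal-independent editing instruction, serialized, and re-applied by each second terminal. Every function and field name (`to_edit_instruction`, `relay`, `op`, `target`, `props`) is an illustrative assumption, not a name from the disclosure.

```python
import json

def to_edit_instruction(interactive_op):
    """Convert a raw interactive operation on the first UI (e.g. a drag)
    into an editing instruction on a component or container."""
    return {
        "op": interactive_op["type"],            # e.g. "move", "add", "delete"
        "target": interactive_op["element_id"],  # component or container id
        "props": interactive_op.get("props", {}),
    }

class Terminal:
    """A second terminal holding its own copy of the user interface."""
    def __init__(self):
        self.ui = {}

    def apply(self, instruction):
        # Parse the instruction and edit the matching local component.
        self.ui[instruction["target"]] = instruction["props"]
        return instruction["target"]

def relay(edit_instruction, second_terminals):
    """Server-side relay: forward the serialized instruction so each second
    terminal edits its own copy of the user interface."""
    payload = json.dumps(edit_instruction)  # instruction crosses the network as JSON
    return [t.apply(json.loads(payload)) for t in second_terminals]
```

Relaying the structured instruction rather than raw touch events is what makes the scheme cross-terminal: each terminal re-applies the edit against its own component tree instead of replaying screen coordinates.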
Optionally, when the operation type of the editing operation instruction is adding a component or adding a container, the apparatus 900 further includes: a determining module, configured to determine, according to the editing operation instruction, an identifier of a target element to be added before the first sending module 902 sends the editing operation instruction to the second terminal, where the target element is a component or a container; and a first obtaining module, configured to obtain the target element corresponding to the identifier; the first sending module 902 is configured to send the editing operation instruction and the target element to the second terminal.
Optionally, the apparatus 900 further comprises: a second obtaining module, configured to obtain a screen parameter of the second terminal before the first sending module 902 sends the editing operation instruction and the target element to the second terminal; and an adjusting module, configured to adjust the appearance of the target element according to the screen parameter; the first sending module 902 is configured to send the editing operation instruction and the adjusted target element to the second terminal.
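The adjusting module's screen-parameter adaptation can be sketched as a simple geometric rescale before the element is sent on. The function name, the element field names, and the 1920x1080 design resolution are all assumptions for illustration; the disclosure does not specify them.

```python
def adjust_for_screen(element, screen_width, screen_height,
                      design_width=1920, design_height=1080):
    """Scale an element's geometry from an assumed design resolution to the
    second terminal's actual screen, returning a new element dict."""
    sx = screen_width / design_width
    sy = screen_height / design_height
    return {
        **element,  # keep non-geometric properties unchanged
        "x": round(element["x"] * sx),
        "y": round(element["y"] * sy),
        "w": round(element["w"] * sx),
        "h": round(element["h"] * sy),
    }
```

Returning a new dict rather than mutating the input matters here: the server keeps the canonical element untouched while sending each terminal a copy adjusted to its own screen.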
Optionally, the server stores container and component management rules; the apparatus 900 further comprises: a third synchronization module, configured to synchronize the management rules to the first terminal and the second terminal, so that the first terminal and the second terminal display user interfaces according to the management rules.
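Rule synchronization can be sketched as pushing one shared rule set to every terminal, which each terminal then consults before rendering. The rule key `max_nesting` and both function names are illustrative assumptions; the disclosure does not name specific rules.

```python
def sync_rules(server_rules, terminals):
    """Push the server's container/component management rules to each
    terminal so all terminals display the UI under the same constraints."""
    for terminal in terminals:
        terminal["rules"] = dict(server_rules)  # each terminal keeps its own copy
    return terminals

def allows_nesting(terminal, depth):
    """Example check a terminal might run before rendering a nested container."""
    return depth <= terminal["rules"]["max_nesting"]
```

Because every terminal evaluates the same rules, a container rejected on the first terminal is also rejected on the second, keeping the synchronized views consistent.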
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a non-transitory computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements the steps of the above-described interface editing method on the first terminal side, the steps of the interface editing method on the second terminal side, or the steps of the interface editing method on the server side.
Fig. 10 is a block diagram illustrating an electronic device 1000 in accordance with an example embodiment. As shown in fig. 10, the electronic device 1000 may include: a processor 1001 and a memory 1002. The electronic device 1000 may also include one or more of a multimedia component 1003, an input/output (I/O) interface 1004, and a communications component 1005.
The processor 1001 is configured to control the overall operation of the electronic device 1000, so as to complete all or part of the steps in the above-mentioned interface editing method on the first terminal side. The memory 1002 is used to store various types of data to support operation of the electronic device 1000, such as instructions for any application or method operating on the electronic device 1000 and application-related data, such as contact data, messages, pictures, audio, video, and so forth. The memory 1002 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk. The multimedia component 1003 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signals may further be stored in the memory 1002 or transmitted through the communication component 1005. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 1004 provides an interface between the processor 1001 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons. The communication component 1005 is used for wired or wireless communication between the electronic device 1000 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein.
The corresponding communication component 1005 may therefore include: a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic Device 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for executing the above-mentioned interface editing method on the first terminal side.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions, which when executed by a processor, implement the steps of the above-described interface editing method at the first terminal side. For example, the computer readable storage medium may be the memory 1002 including the program instructions, which are executable by the processor 1001 of the electronic device 1000 to perform the interface editing method on the first terminal side.
Fig. 11 is a block diagram illustrating an electronic device 1100 in accordance with an example embodiment. As shown in fig. 11, the electronic device 1100 may include: a processor 1101 and a memory 1102. The electronic device 1100 may also include one or more of a multimedia component 1103, an input/output (I/O) interface 1104, and a communication component 1105.
The processor 1101 is configured to control the overall operation of the electronic device 1100, so as to complete all or part of the steps in the above-mentioned interface editing method on the second terminal side. The memory 1102 is used to store various types of data to support operation of the electronic device 1100, such as instructions for any application or method operating on the electronic device 1100 and application-related data, such as contact data, messages, pictures, audio, video, and so forth. The memory 1102 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk. The multimedia component 1103 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signals may further be stored in the memory 1102 or transmitted through the communication component 1105. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 1104 provides an interface between the processor 1101 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons. The communication component 1105 is used for wired or wireless communication between the electronic device 1100 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein.
The corresponding communication component 1105 may therefore include: a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic device 1100 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for executing the above-mentioned interface editing method on the second terminal side.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions, which when executed by a processor, implement the steps of the above-described interface editing method at the second terminal side. For example, the computer readable storage medium may be the memory 1102 including the program instructions, which are executable by the processor 1101 of the electronic device 1100 to perform the interface editing method on the second terminal side described above.
Fig. 12 is a block diagram illustrating an electronic device 1900 according to an example embodiment. For example, the electronic device 1900 may be provided as a server. Referring to fig. 12, an electronic device 1900 includes a processor 1922, which may be one or more in number, and a memory 1932 for storing computer programs executable by the processor 1922. The computer program stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processor 1922 may be configured to execute the computer program to perform the above-described server-side interface editing method.
Additionally, the electronic device 1900 may also include a power component 1926 and a communication component 1950. The power component 1926 may be configured to perform power management of the electronic device 1900, and the communication component 1950 may be configured to enable communication, e.g., wired or wireless communication, of the electronic device 1900. In addition, the electronic device 1900 may also include an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions which, when executed by a processor, implement the steps of the server-side interface editing method described above. For example, the non-transitory computer readable storage medium may be the memory 1932 including program instructions executable by the processor 1922 of the electronic device 1900 to perform the server-side interface editing method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned server-side interface editing method when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the present disclosure. In order to avoid unnecessary repetition, various possible combinations will not be separately described in this disclosure.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (21)

1. An interface editing method is applied to a first terminal and is characterized by comprising the following steps:
receiving an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and editing the first user interface according to the interactive operation instruction;
converting the interactive operation instruction into an editing operation instruction of a component or a container corresponding to the first user interface;
and synchronizing the editing operation instruction to a second terminal so that the second terminal can synchronously edit a second user interface displayed on the second terminal according to the editing operation instruction.
2. The method according to claim 1, wherein the synchronizing the editing operation instruction to the second terminal comprises:
directly sending the editing operation instruction to a second terminal; or
sending the editing operation instruction to a server in communication connection with the first terminal, so that the server sends the editing operation instruction to the second terminal.
3. The method according to claim 1, wherein the operation type of the edit operation instruction is any one of a change of a property of a component, a change of a property of a container, an addition of a component, an addition of a container, a deletion of a component, and a deletion of a container.
4. The method according to any one of claims 1-3, further comprising:
receiving an update instruction input by a user, wherein the update instruction is an update instruction for target content stored by a server, and the target content is one of a component, a container and a material;
and sending the updating instruction to the server so as to update the target content by the server.
5. The method according to any of claims 1-3, wherein the first user interface comprises a presentation area and an editing area, and wherein the presentation area of the first user interface is the same as what is presented by the second user interface.
6. An interface editing method is applied to a second terminal and is characterized by comprising the following steps:
receiving an editing operation instruction, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at the first user interface;
analyzing the editing operation instruction to obtain an operation instruction of a component or a container corresponding to the second user interface;
and editing the second user interface according to the operation instruction of the component or the container corresponding to the second user interface so as to realize the synchronous editing of the first user interface and the second user interface.
7. The method according to claim 6, wherein when the operation instruction for the component or the container corresponding to the second user interface is a newly added component or a newly added container, the editing the second user interface according to the operation instruction for the component or the container corresponding to the second user interface includes:
determining an identifier and a target adding position of a target element to be newly added according to the operating instruction of the component or the container corresponding to the second user interface, wherein the target element is the component or the container;
acquiring a target element corresponding to the identifier from a server;
adding the target element at the target adding position in the second user interface.
8. The method according to claim 7, wherein before the step of adding the target element at the target adding position in the second user interface, the editing the second user interface according to the operation instruction on the component or container corresponding to the second user interface further comprises:
acquiring screen parameters of the second terminal;
adjusting the appearance of the target element according to the screen parameters;
the adding the target element at the target add location in the second user interface includes:
and adding the adjusted target element at the target adding position in the second user interface.
9. The method according to claim 7, wherein when the operation instruction for the component or the container corresponding to the second user interface is a newly added component and the component is a special effect component, the editing of the second user interface is performed according to the operation instruction for the component or the container corresponding to the second user interface, and further comprising:
acquiring a rendering engine of the second terminal;
rendering, with the rendering engine, the target element added to the second user interface.
10. The method according to any one of claims 6-9, further comprising:
receiving a synchronization instruction which is sent by a server and contains the updated target content;
if the second terminal currently displays the second user interface, updating the target content in the currently displayed second user interface into the updated target content;
and if the second user interface is not displayed at present by the second terminal, updating the target content in the second user interface to the updated target content in the background.
11. An interface editing method is applied to a server and is characterized by comprising the following steps:
receiving an update instruction which is uploaded by a terminal and aims at target content stored by the server, wherein the target content is one of a component, a container and a material;
updating the target content corresponding to the updating instruction;
and synchronizing the updated target content to other terminals using the target content.
12. An interface editing method is applied to a server and is characterized by comprising the following steps:
receiving an editing operation instruction uploaded by a first terminal, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at a first user interface;
and sending the editing operation instruction to a second terminal so that the second terminal edits a component or a container corresponding to a second user interface according to the editing operation instruction, and the first user interface and the second user interface are synchronously edited.
13. The method according to claim 12, wherein when the operation type of the editing operation instruction is a newly added component or a newly added container, before the step of sending the editing operation instruction to the second terminal, the method further comprises:
determining an identifier of a target element to be newly added according to the editing operation instruction, wherein the target element is a component or a container;
acquiring a target element corresponding to the identifier;
the sending the editing operation instruction to the second terminal includes:
and sending the editing operation instruction and the target element to a second terminal.
14. The method according to claim 13, wherein before the step of transmitting the editing operation instruction and the target element to the second terminal, the method further comprises:
acquiring screen parameters of the second terminal;
adjusting the appearance of the target element according to the screen parameters;
the sending the editing operation instruction and the target element to a second terminal includes:
and sending the editing operation instruction and the adjusted target element to a second terminal.
15. The method according to any one of claims 12-14, wherein the server stores container and component management rules;
the method further comprises the following steps:
and synchronizing the management rule to the first terminal and the second terminal so that the first terminal and the second terminal display user interfaces according to the management rule.
16. An interface editing apparatus applied to a first terminal, comprising:
the first receiving module is used for receiving an interactive operation instruction which is input by a user and aims at a first user interface displayed by the first terminal, and editing the first user interface according to the interactive operation instruction;
the conversion module is used for converting the interactive operation instruction received by the first receiving module into an editing operation instruction of a component or a container corresponding to the first user interface;
and the first synchronization module is used for synchronizing the editing operation instruction obtained by the conversion module to a second terminal so that the second terminal can synchronously edit a second user interface displayed on the second terminal according to the editing operation instruction.
17. An interface editing apparatus applied to a second terminal, comprising:
the second receiving module is used for receiving an editing operation instruction, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at the first user interface;
the analysis module is used for analyzing the editing operation instruction received by the second receiving module to obtain an operation instruction of a component or a container corresponding to a second user interface;
and the editing module is used for editing the second user interface according to the operation instruction of the component or the container corresponding to the second user interface, which is obtained by the analyzing module, so as to realize the synchronous editing of the first user interface and the second user interface.
18. An interface editing apparatus applied to a server, comprising:
a third receiving module, configured to receive an update instruction, where the update instruction is an update instruction that is uploaded by a terminal and is for target content stored by the server, where the target content is one of a component, a container, and a material;
the first updating module is used for updating the target content corresponding to the updating instruction received by the third receiving module;
and the second synchronization module is used for synchronizing the target content obtained after the first updating module updates to other terminals using the target content.
19. An interface editing apparatus applied to a server, comprising:
the fourth receiving module is used for receiving an editing operation instruction uploaded by the first terminal, wherein the editing operation instruction is generated by the first terminal according to an interactive operation instruction which is input by a user and aims at the first user interface;
the first sending module is configured to send the editing operation instruction received by the fourth receiving module to the second terminal, so that the second terminal edits a component or a container corresponding to the second user interface according to the editing operation instruction, and the first user interface and the second user interface are synchronously edited.
20. A non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 15.
21. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 15.
CN202111329034.1A 2021-11-10 2021-11-10 Interface editing method and device, storage medium and electronic equipment Pending CN114035877A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111329034.1A CN114035877A (en) 2021-11-10 2021-11-10 Interface editing method and device, storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN114035877A true CN114035877A (en) 2022-02-11

Family

ID=80143883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111329034.1A Pending CN114035877A (en) 2021-11-10 2021-11-10 Interface editing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114035877A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002358A (en) * 2022-03-22 2022-09-02 北京优酷科技有限公司 Control method and system in digital background shooting

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102081946A (en) * 2010-11-30 2011-06-01 上海交通大学 On-line collaborative nolinear editing system
CN112905174A (en) * 2021-01-27 2021-06-04 长沙市到家悠享网络科技有限公司 Information processing method, device, system and storage medium
CN113591439A (en) * 2020-04-30 2021-11-02 北京字节跳动网络技术有限公司 Information interaction method and device, electronic equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102081946A (en) * 2010-11-30 2011-06-01 上海交通大学 On-line collaborative nolinear editing system
CN113591439A (en) * 2020-04-30 2021-11-02 北京字节跳动网络技术有限公司 Information interaction method and device, electronic equipment and storage medium
CN112905174A (en) * 2021-01-27 2021-06-04 长沙市到家悠享网络科技有限公司 Information processing method, device, system and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002358A (en) * 2022-03-22 2022-09-02 北京优酷科技有限公司 Control method and system in digital background shooting
CN115002358B (en) * 2022-03-22 2023-10-10 神力视界(深圳)文化科技有限公司 Control method and system in digital background shooting

Similar Documents

Publication Publication Date Title
US11460996B2 (en) Modifying style layer properties of a digital map
JP6868705B2 (en) Message processing methods, storage media, and computer equipment
CN103077239B (en) Based on the iFrame embedded Web 3D system that cloud is played up
US10572101B2 (en) Cross-platform multi-modal virtual collaboration and holographic maps
CN103095828B (en) The Web3D synchronous conferencing system played up based on cloud and realize synchronous method
CN109344352B (en) Page loading method and device and electronic equipment
WO2016150386A1 (en) Interface processing method, apparatus, and system
US20150012831A1 (en) Systems and methods for sharing graphical user interfaces between multiple computers
CN104168417A (en) Picture processing method and device
CN106610826B (en) Method and device for manufacturing online scene application
CN104331243A (en) Mobile terminal and large screen display interaction control method based on thumbnail reconstruction
WO2020220773A1 (en) Method and apparatus for displaying picture preview information, electronic device and computer-readable storage medium
AU2020202901B2 (en) Enriching collaboration using visual comments in a shared review
CN106162353A (en) Interface processing method, Apparatus and system
CN112053370A (en) Augmented reality-based display method, device and storage medium
JP2024504053A (en) Two-dimensional code display method, apparatus, device and medium
JP2023522266A (en) Method, apparatus, device and medium for multimedia data delivery
JP2019537397A (en) Effect sharing method and system for video
CN110971974B (en) Configuration parameter creating method, device, terminal and storage medium
CN110865863B (en) Interface display method and device for fast application and storage medium
CN114035877A (en) Interface editing method and device, storage medium and electronic equipment
KR20150079387A (en) Illuminating a Virtual Environment With Camera Light Data
KR20160125322A (en) Apparatus and method for generating and managing an advertizing contents
CN110647273B (en) Method, device, equipment and medium for self-defined typesetting and synthesizing long chart in application
CN116610394A (en) Template and module-based data visualization page configuration method, system and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221212

Address after: 518055 1202, Building 1, Chongwen Park, Nanshan Zhiyuan, No. 3370, Liuxian Avenue, Fuguang Community, Taoyuan Street, Nanshan District, Shenzhen, Guangdong

Applicant after: Shenzhen Nut Software Co.,Ltd.

Address before: No.01, 10 / F, unit 4, building B, Kexing Science Park, No.15 Keyuan Road, Central District, Nanshan District, Shenzhen, Guangdong 518000

Applicant before: SHENZHEN HOLATEK Co.,Ltd.
