CN116506682A - Distributed editing method and device - Google Patents

Distributed editing method and device

Info

Publication number
CN116506682A
CN116506682A (application CN202310467502.4A)
Authority
CN
China
Prior art keywords
editing
area
slave device
target
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310467502.4A
Other languages
Chinese (zh)
Inventor
李斌
吕璐
谢俊
尹寿臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN202310467502.4A
Publication of CN116506682A
Legal status: Pending


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4318Generation of visual interfaces for content selection or interaction; Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/458Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules ; time-related management operations
    • H04N21/4586Content update operation triggered locally, e.g. by comparing the version of software modules in a DVB carousel to the version stored locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a distributed editing method and device, relating to the technical field of screen sharing. One embodiment of the method comprises the following steps: in response to receiving a distributed editing instruction, and when an initial slave device exists, opening the distributed editing authority of the initial slave device; in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, synchronizing the current display content of the display control device to the target slave device; and updating the current display content of the display control device based on the editing result of the target slave device on the current display content. This implementation effectively avoids the problem that shared screen content cannot be edited and realizes the fused display of distributed editing.

Description

Distributed editing method and device
Technical Field
The present disclosure relates to the field of computer technologies, in particular to the field of screen sharing technologies, and more particularly to a distributed editing method and apparatus.
Background
With the development of science and technology, mobile terminals have become increasingly popular and increasingly powerful, evolving from single-purpose communication tools into integrated, multifunctional intelligent portable devices; screen sharing has accordingly become an integral part of multi-screen interaction. Screen sharing technology is used not only in daily entertainment but is also widely applied in conferences. The most common scenario is that a presenter shares the screen of a mobile terminal device to a large-screen receiving end, such as a television, so that the presenter can express ideas conveniently and clearly.
However, conference members who need to comment may not be able to express their views through language alone. If they need to present their own views, they may have to write on a whiteboard or share their own devices, so the conference may require additional time for connecting, discussing, switching the shared device, and recording the conference content.
Disclosure of Invention
The embodiments of the present application provide a distributed editing method, apparatus, device, and storage medium.
According to a first aspect, an embodiment of the present application provides a distributed editing method, including: in response to receiving a distributed editing instruction, and when an initial slave device exists, opening the distributed editing authority of the initial slave device; in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, synchronizing the current display content of the display control device to the target slave device; and updating the current display content of the display control device based on the editing result of the target slave device on the current display content.
According to a second aspect, an embodiment of the present application provides a distributed editing apparatus, including: a receiving module configured to open the distributed editing authority of the initial slave device in response to receiving a distributed editing instruction and the initial slave device existing; a synchronization module configured to synchronize the current display content of the display control device to the target slave device in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices; and an updating module configured to update the current display content of the display control device based on the editing result of the target slave device on the current display content.
According to a third aspect, an embodiment of the present application provides a distributed editing system, including: the display control device, which is used to display the current display content and the updated current display content of the display control device; the target slave device, which is used to receive the current display content synchronized by the display control device, edit the current display content, and send the editing result to the control device; and the control device, which is used to perform the distributed editing method of any embodiment of the first aspect.
According to a fourth aspect, embodiments of the present application provide an electronic device comprising one or more processors; and a storage device having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to implement the distributed editing method as in any of the embodiments of the first aspect.
According to a fifth aspect, embodiments of the present application provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a distributed editing method as in any of the embodiments of the first aspect.
In response to receiving a distributed editing instruction, and when an initial slave device exists, the distributed editing authority of the initial slave device is opened; in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, the current display content of the display control device is synchronized to the target slave device; and, based on the editing result of the target slave device on the current display content, the current display content of the display control device is updated. This effectively avoids the problem that shared screen content cannot be edited and realizes the fused display of distributed editing.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a distributed editing method according to the present application;
FIG. 3 is a schematic illustration of one application scenario of a distributed editing method according to the present application;
FIG. 4a is a flow chart of yet another embodiment of a distributed editing method according to the present application;
FIG. 4b is a schematic diagram of yet another embodiment of a distributed editing method according to the present application;
FIG. 4c is a schematic diagram of another embodiment of a distributed editing method according to the present application;
FIG. 4d is a schematic diagram of another embodiment of a distributed editing method according to the present application;
FIG. 4e is a schematic diagram of another embodiment of a distributed editing method according to the present application;
FIG. 5 is a schematic diagram of one embodiment of a distributed editing apparatus according to the present application;
FIG. 6 is a schematic diagram of one embodiment of a distributed editing system according to the present application;
FIG. 7 is a schematic diagram of a computer system suitable for use in implementing embodiments of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 in which embodiments of the distributed editing methods of the present application may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. The terminal devices 101, 102, 103 may have client application software installed on them, such as media playback applications, communication applications, and the like.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices, including but not limited to smartphones, tablets, desktop computers, projectors, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module, which is not specifically limited here.
The server 105 may be a server providing various services, for example, a server that, in response to receiving a distributed editing instruction and an initial slave device existing, opens the distributed editing authority of the initial slave device; in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, synchronizes the current display content of the display control device to the target slave device; and, based on the editing result of the target slave device on the current display content, updates the current display content of the display control device.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster formed by a plurality of servers, or as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed editing services), or as a single piece of software or software module, which is not specifically limited here.
It should be noted that the distributed editing method provided by the embodiments of the present disclosure may be performed by the server 105, may be performed by the terminal devices 101, 102, 103, or may be performed by the server 105 and the terminal devices 101, 102, 103 in cooperation with each other. Accordingly, the respective parts (e.g., respective units, sub-units, modules, sub-modules) included in the distributed editing apparatus may be all provided in the server 105, may be all provided in the terminal devices 101, 102, 103, or may be provided in the server 105 and the terminal devices 101, 102, 103, respectively.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
FIG. 2 illustrates a flow 200 of an embodiment of a distributed editing method that may be applied to the present application. The distributed editing method comprises the following steps:
In step 201, in response to receiving a distributed editing instruction and an initial slave device existing, the distributed editing authority of the initial slave device is opened.
In this embodiment, the executing body (for example, the terminal devices 101, 102, 103 or the server 105 in FIG. 1) may monitor, in real time or periodically, a distributed editing instruction sent by the master device or entered by the user. In response to receiving the distributed editing instruction, the executing body may sense, in a wired or wireless manner, the distance information between the surrounding slave devices and the display control device, and, in response to an initial slave device existing, open the distributed editing authority of the initial slave device.
Here, the initial slave device refers to at least one slave device whose distance from the display control device meets a preset condition.
Here, the wireless means may include, but are not limited to, 3G/4G, Wi-Fi, Bluetooth, WiMAX, ZigBee, UWB (Ultra-Wideband), and other wireless technologies now known or developed in the future.
Specifically, in response to receiving the distributed editing instruction, the executing body may sense and range the slave devices around the display control device through, for example, Bluetooth Low Energy in the wireless protocol stack, thereby establishing ranging sessions with those slave devices; it then sends a broadcast notification to at least one slave device that meets the preset condition (for example, whose distance is smaller than a set threshold), i.e., the initial slave device, and establishes a long-connection communication session with it, so as to open the distributed editing authority of the initial slave device.
Here, in response to receiving the connection request, the initial slave device may establish a long-connection session with the display control device and maintain the validity of the long-connection session through a heartbeat keep-alive mechanism.
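As an illustration of this permission-opening flow, the following is a minimal Python sketch. It is only a sketch: the SlaveDevice fields, the notify callback, the 5 m distance threshold, and the 30 s heartbeat timeout are assumptions introduced for the example, not values or interfaces specified by this application.

```python
import time
from dataclasses import dataclass

@dataclass
class SlaveDevice:
    device_id: str
    distance_m: float          # distance measured via the ranging session
    editing_enabled: bool = False
    last_heartbeat: float = 0.0

def select_initial_slaves(ranged_devices, max_distance_m=5.0):
    """Slaves whose distance to the display control device meets the preset condition."""
    return [d for d in ranged_devices if d.distance_m <= max_distance_m]

def open_distributed_editing(ranged_devices, notify, max_distance_m=5.0):
    """On receiving a distributed editing instruction: select the initial slave
    devices, send each a broadcast notification (here a hypothetical `notify`
    callback standing in for the long-connection setup), and open the
    distributed editing authority for them."""
    initial_slaves = select_initial_slaves(ranged_devices, max_distance_m)
    for slave in initial_slaves:
        notify(slave.device_id)              # broadcast + long-connection setup
        slave.editing_enabled = True
        slave.last_heartbeat = time.time()   # heartbeat keep-alive starts here
    return initial_slaves

def long_connection_alive(slave, timeout_s=30.0):
    """The long connection is considered valid while heartbeats keep arriving."""
    return (time.time() - slave.last_heartbeat) <= timeout_s

# Example: two devices ranged at 2.3 m and 7.9 m; only the first qualifies
devices = [SlaveDevice("tablet-1", 2.3), SlaveDevice("phone-2", 7.9)]
opened = open_distributed_editing(devices, notify=lambda dev: print("notify", dev))
print([d.device_id for d in opened])         # ['tablet-1']
```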
In some alternatives, the initial slave device is determined by: determining the position information of each slave device relative to the display control device based on a time-difference-of-arrival positioning method; and determining the initial slave device among the slave devices based on the position information.
In this implementation, the executing body may determine, according to the time-difference-of-arrival (TDOA) positioning method, the position information of each slave device around the display control device relative to the display control device, and determine the initial slave device among those slave devices according to the position information.
Specifically, the executing body can calculate the position information of each slave device relative to the display control device through the TDOA positioning method over the UWB wireless protocol, and determine, according to the position information, at least one slave device whose distance from the display control device meets the preset condition, i.e., the initial slave device.
In this implementation, the position information of each slave device relative to the display control device is determined by the time-difference-of-arrival positioning method, and the initial slave device is determined among the slave devices based on the position information, which improves the flexibility and accuracy of determining the initial slave device.
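To make the TDOA idea concrete, the sketch below estimates a device's 2-D position from arrival-time differences at known anchor positions by a brute-force residual search; devices within a distance threshold of the display control device would then be taken as initial slave devices. This is a simplified illustration: the anchor layout, the search window, and the grid search (a production system would use an iterative least-squares solver) are assumptions, not the positioning procedure specified by this application.

```python
import numpy as np

C = 299_792_458.0  # propagation speed of the radio signal (m/s)

def tdoa_locate(anchors, tdoas, half_width=5.0, grid=0.1):
    """Estimate a 2-D position from TDOA measurements.

    anchors : list of (x, y) anchor positions; anchor 0 is the reference.
    tdoas   : arrival-time differences t_i - t_0 (seconds), i = 1..N-1.
    Returns the grid point minimising the hyperbolic range-difference residuals.
    """
    anchors = np.asarray(anchors, dtype=float)
    range_diffs = np.asarray(tdoas, dtype=float) * C
    best, best_err = None, np.inf
    for x in np.arange(-half_width, half_width, grid):
        for y in np.arange(-half_width, half_width, grid):
            p = np.array([x, y])
            d = np.linalg.norm(anchors - p, axis=1)
            err = np.sum((d[1:] - d[0] - range_diffs) ** 2)
            if err < best_err:
                best, best_err = p, err
    return best

# Toy check: a slave device at (1.0, 2.0) observed by four anchors
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
true_pos = np.array([1.0, 2.0])
dists = [np.hypot(*(true_pos - np.array(a))) for a in anchors]
tdoas = [(di - dists[0]) / C for di in dists[1:]]
print(tdoa_locate(anchors, tdoas))           # approximately [1.0, 2.0]
```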
Step 202, in response to a target slave device that activates the distributed editing authority existing among the initial slave devices, synchronizing the current display content of the display control device to the target slave device.
In this embodiment, after determining the initial slave devices, the executing body may monitor, in real time or periodically, whether any initial slave device activates the distributed editing authority. In response to detecting that a target slave device that has activated the distributed editing authority exists among the initial slave devices, that is, a target slave device with an editing requirement, the executing body may synchronize the current display content of the display control device, i.e., the currently displayed image, to the target slave device, so that the user can edit the synchronized display content on the target slave device.
The number of target slave devices may be one or more, which is not limited in this application.
Here, the current display content of the display control device may be content that the display control device itself displays as the master device, or content that the master device synchronizes to it through screen sharing, content projection, file sharing, or the like, which is not limited in this application.
The master device may be an existing or future intelligent mobile terminal, for example, a smartphone, a smart tablet, or a foldable-screen device; the display control device may be an existing or future intelligent display terminal, for example, a smart projector, a smart television, or a smart electronic whiteboard, which is not limited in this application.
In addition, if the current display content of the display control device is content shared by the master device, the executing body may, upon receiving the distributed editing instruction, notify the master device to pause the content sharing currently in progress and preserve the current sharing progress, so that sharing can continue from that progress when content sharing is resumed.
In some alternatives, the method further comprises: switching the master device to a slave device in response to receiving the distributed editing instruction, so that the master device can start distributed editing as required.
In this implementation, the current display content of the display control device is content shared by the master device. In response to receiving the distributed editing instruction, the executing body may notify the master device to pause the ongoing content sharing and switch from master-device mode to slave-device mode, i.e., from master device to slave device, so that the master device can start distributed editing as required.
In this implementation, switching the master device to a slave device in response to receiving the distributed editing instruction allows the master device to start distributed editing as required, which facilitates distributed editing by the master device and improves the flexibility and richness of distributed editing.
Step 203, updating the current display content of the display control device based on the editing result of the target slave device on the current display content.
In this embodiment, the execution subject may obtain the editing result of the target slave device on the current display content and update the current display content of the display control device accordingly.
Here, the execution body may splice the editing result with the current display content to update the current display content of the display control device, or may first determine a display area and then update part of that display area according to the editing result, which is not limited in this application.
Specifically, for example, in a conference scenario, the display control device is a large-screen terminal device such as a TV or projector. Participants can view the conference content displayed on the display control device through their own intelligent mobile terminals, i.e., the target slave devices, and edit and annotate it in real time, and the display control device can synchronously display the editing results fed back by the target slave devices.
For another example, in a teaching scenario, the display control device is a large-screen terminal device such as a TV or projector. Students can view the lesson content displayed on the display control device and answer questions in real time through their own intelligent mobile terminals, i.e., the target slave devices, and the display control device can synchronously display the editing results fed back by the target slave devices.
In some alternatives, the method further comprises: in response to receiving an instruction to end the distributed editing mode, notifying the master device to resume content sharing.
In this implementation, the current display content of the display control device is content shared by the master device. The execution body may monitor, in real time or periodically, an instruction to end the distributed editing mode and, in response to receiving such an instruction, notify the master device to resume content sharing.
After receiving the instruction to resume content sharing, the master device can restore the previous sharing progress and continue sharing the related content to the display control device.
In this implementation, notifying the master device to resume content sharing in response to the instruction to end the distributed editing mode helps the master device resume content sharing promptly after the distributed editing mode ends.
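The pause/resume and master-to-slave switching described in this flow might be tracked on the master device along the following lines; this is only a sketch with assumed names, and a real implementation would drive the actual sharing and editing sessions rather than bookkeeping flags.

```python
from enum import Enum, auto

class Role(Enum):
    MASTER = auto()
    SLAVE = auto()

class SharingSession:
    """Sketch of the master device's state around distributed editing."""
    def __init__(self):
        self.role = Role.MASTER
        self.sharing_paused = False
        self.progress = None                 # e.g. current slide or playback position

    def on_distributed_editing_start(self, current_progress):
        # Pause the ongoing sharing, keep the progress, drop to slave mode
        self.sharing_paused = True
        self.progress = current_progress
        self.role = Role.SLAVE               # master may now edit like any slave

    def on_distributed_editing_end(self):
        # Switch back to master mode and resume sharing from the saved progress
        self.role = Role.MASTER
        self.sharing_paused = False
        return self.progress

session = SharingSession()
session.on_distributed_editing_start(current_progress="slide 12")
print(session.role, session.sharing_paused)          # Role.SLAVE True
print(session.on_distributed_editing_end())          # slide 12
```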
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the distributed editing method according to the present embodiment.
In the multi-person creation application scenario of FIG. 3, the execution subject 301 opens the distributed editing authority of the initial slave devices in response to receiving a distributed editing instruction and the initial slave devices existing, where an initial slave device is a slave device whose distance from the display control device 302 meets a preset condition. In response to determining that target slave devices 303, 304, 305 that have activated the distributed editing authority exist among the initial slave devices, the current display content of the display control device 302, such as an initial image, is synchronized to the target slave devices 303, 304, 305. Based on the editing results of the target slave devices 303, 304, 305 on the current display content, the current display content of the display control device 302 is updated to obtain a final image.
The distributed editing method provided by the embodiments of the present disclosure opens the distributed editing authority of the initial slave device in response to receiving a distributed editing instruction and the initial slave device existing; synchronizes the current display content of the display control device to the target slave device in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices; and updates the current display content of the display control device based on the editing result of the target slave device on the current display content. This effectively avoids the problem that shared screen content cannot be edited and realizes the fused display of distributed editing.
With further reference to fig. 4a, a flow 400 of yet another embodiment of a distributed editing method is shown. In this embodiment, the flow 400 of the distributed editing method may include the following steps:
In step 401, in response to receiving a distributed editing instruction and an initial slave device existing, the distributed editing authority of the initial slave device is opened.
In this embodiment, for the implementation details and technical effects of step 401, reference may be made to the description of step 201, which is not repeated here.
Step 402, in response to a target slave device that has activated the distributed editing authority existing among the initial slave devices, synchronizing the current display content of the display control device to the target slave device.
In this embodiment, for the implementation details and technical effects of step 402, reference may be made to the description of step 202, which is not repeated here.
Step 403, determining a target display area based on the editing mode.
In this embodiment, the editing result may include an editing mode, which may include a split editing mode, representing split editing of sub-regions of the display content, and an extended editing mode, representing extension editing of the display content area.
Here, the display content of the display control device may be divided into a plurality of sub-regions, and the display content synchronized to the target slave device may likewise be divided into a plurality of sub-regions; the target slave device and the display control device may divide the sub-regions in the same way or in different ways, which is not limited in this application.
Specifically, after the target slave device obtains the display content synchronized by the display control device, it may divide the display content into one or more sub-regions. It can perform split editing on a sub-region, for example, dividing the sub-region into several node regions and editing each node region, where each node region carries its corresponding coordinate information; it can also perform extension editing on the display content area, for example, adding a new sub-region on the basis of the display content.
The sub-regions may be represented by tiles and the node regions by sub-tiles, and a tile may include information such as a spatial range frame, a geometric error, and the tile content.
If the editing mode is the split editing mode, the execution subject can directly determine the area of the current display content of the display control device, i.e., the initial area, as the target display area. If the editing mode is the extended editing mode, the execution subject can expand the range of the preset area of the current display content, i.e., the initial area, and determine the expanded area of the current display content as the target display area; alternatively, the target display area may be determined directly based on the initial area and the editing area, which is not limited in this application.
The target display area represents the minimum display area that can cover both the initial area and the editing area, and the side-length ratio of its sides is the same as that of the initial area.
Here, the editing area may include editing contents.
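The editing result, editing modes, tiles, and node regions described above could be modeled with data structures along the following lines. The field names and types are assumptions made for illustration, not definitions given by this application.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional

class EditMode(Enum):
    SPLIT = auto()       # split editing of existing sub-regions
    EXTEND = auto()      # extension editing: new area added around the content

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

@dataclass
class NodeRegion:                # "sub-tile": an editable node inside a tile
    bounds: Rect                 # coordinate information of the node region
    content: Optional[bytes] = None

@dataclass
class Tile:                      # one sub-region of the display content
    spatial_range: Rect          # spatial range frame
    geometric_error: float = 0.0
    nodes: List[NodeRegion] = field(default_factory=list)   # tile content

@dataclass
class EditResult:                # what a target slave device reports back
    mode: EditMode
    edit_region: Rect            # position information and size of the edit area
    payload: Optional[bytes] = None    # the editing contents
```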
In some alternatives, in response to determining that the edit mode includes only a split edit mode, the initial region is directly determined to be the target display region.
In this implementation, if the editing mode includes only the split editing mode, i.e., the target slave device performs only split editing on the display content, the execution subject may directly determine the initial area, i.e., the area of the content currently displayed by the display control device, as the target display area.
By directly determining the initial area as the target display area in response to determining that the editing mode includes only the split editing mode, this implementation enables the target display area to be determined in the split editing mode.
In some alternatives, responsive to determining that the edit mode includes an extended edit mode, determining a second edit region based on the location information and the size of the edit region; and determining a target display area based on the second editing area and the initial area.
In this implementation, the editing mode may include the extended editing mode, and the editing result may include an editing area. The execution subject may determine a second editing area according to the position information and size of the editing area, and then determine the target display area based on the area of the content currently displayed by the display control device, i.e., the initial area, and the second editing area.
Here, depending on the position information and size of the editing area, the second editing area may fall into several categories, for example, a category where the editing area appears on the left side of the display content, a category where it appears on the right side of and above the display content, and so on.
Specifically, as shown in FIG. 4b, the target display area may take various forms for the different categories of second editing area, such as a target display area (Comb1) determined when editing content is added on the right side of the display content, a target display area (Comb2) when editing content is added above the display content, a target display area (Comb3) when editing content is added on the right side of and below the display content, a target display area (Comb4) when editing content is added on the left side of and above the display content, and so on.
Further, as shown in FIG. 4c, suppose there are two target slave devices, target slave device 1 and target slave device 2, and the current display content of the display control device is an image "A". The two target slave devices edit the display content synchronized by the display control device simultaneously: for example, target slave device 1 adds an image "B" below the display content, and target slave device 2 adds an image "C" on its right side. The execution subject can determine a second editing area according to the positions and sizes of the editing areas of target slave device 1 and target slave device 2, and determine the target display area (for example, Comb3 above) according to the second editing area and the initial area.
In this implementation, in response to determining that the editing mode includes the extended editing mode, the second editing area is determined according to the position information and size of the editing area, and the target display area is then determined based on the second editing area and the initial area, thereby determining the target display area in the extended editing mode.
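A small sketch of how the target display area could be computed in the extended editing mode, as described above: take the minimum rectangle covering the initial area and the second editing area, then stretch it so its side-length ratio matches the initial area. The Rect tuple and the sample coordinates (mirroring the "A"/"B"/"C" arrangement of Fig. 4c) are assumptions for illustration.

```python
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")

def bounding_rect(a, b):
    """Smallest rectangle covering both a and b."""
    x0, y0 = min(a.x, b.x), min(a.y, b.y)
    x1 = max(a.x + a.w, b.x + b.w)
    y1 = max(a.y + a.h, b.y + b.h)
    return Rect(x0, y0, x1 - x0, y1 - y0)

def target_display_area(initial, second_edit):
    """Minimum area covering the initial area and the second editing area,
    expanded so its side-length ratio matches that of the initial area."""
    cover = bounding_rect(initial, second_edit)
    aspect = initial.w / initial.h
    if cover.w / cover.h < aspect:       # too narrow for the ratio -> widen
        return Rect(cover.x, cover.y, cover.h * aspect, cover.h)
    return Rect(cover.x, cover.y, cover.w, cover.w / aspect)   # else heighten

# e.g. image "A" with "C" added on its right and "B" added below (cf. Fig. 4c)
a = Rect(0, 0, 1920, 1080)
c = Rect(1920, 0, 640, 1080)
b = Rect(0, 1080, 1920, 360)
print(target_display_area(target_display_area(a, c), b))   # Comb3-like area
```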
In some alternatives, determining the target display area based on the initial area, the second edit area, includes: and determining a target display area based on the subarea and the corresponding weight included in the initial area and the subarea and the corresponding weight included in the second editing area.
In this implementation manner, the execution body may determine the target display area according to the sub-area and the corresponding weight included in the initial area, and the sub-area and the corresponding weight included in the second editing area.
The weights of the sub-areas included in the initial area and the weights of the sub-areas included in the second editing area may be determined according to the size, the position, etc. of the sub-areas, which are not limited in this application.
Specifically, the minimum display region Comb′_{t,x,y} is determined as a weighted combination over the sub-regions, where Comb denotes the initial area and the second editing area, Tile_{t,i} denotes a sub-region, and W_{t,i} denotes the corresponding weight.
In this implementation, determining the target display area based on the sub-regions and corresponding weights included in the initial area and the sub-regions and corresponding weights included in the second editing area improves the accuracy of the determined target display area.
In step 404, in the target display area, an area to be updated is determined.
In this embodiment, the execution subject may first convert the editing region into a region under the coordinate system of the target display region, that is, the first editing region, and then determine the region to be updated in the target display region.
The area to be updated is the area in the target display area that can cover the first editing area while occupying the fewest sub-regions, and it corresponds to an integer number of sub-regions.
Specifically, as shown in FIG. 4d, the coordinates P1(x_p1, y_p1) and P2(x_p2, y_p2) are determined from R0(x_0, y_0), M(x_1, y_1), N(x_2, y_2), R_w, and R_h, where R0(x_0, y_0) is the upper-left-corner coordinate (or the center-point coordinate) of the target display area of the display control device, M(x_1, y_1) and N(x_2, y_2) are the upper-left-corner and lower-right-corner coordinates of the first editing area respectively, R_w is the sub-region width, and R_h is the sub-region height.
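The formula referred to here is reproduced only as an image in the original publication, so the sketch below shows just one plausible reading consistent with the surrounding description: the corners P1 and P2 are the corners M and N of the first editing area snapped outward to the sub-region grid anchored at R0, so that the area to be updated covers the first editing area with an integer number of sub-regions. This is an illustrative assumption, not the application's exact formula.

```python
import math

def to_update_region(r0, m, n, rw, rh):
    """Assumed reading: snap the first editing area outward to the tile grid.

    r0 = (x0, y0)  reference corner of the target display area
    m  = (x1, y1)  upper-left corner of the first editing area
    n  = (x2, y2)  lower-right corner of the first editing area
    rw, rh         sub-region width and height
    """
    x0, y0 = r0
    p1 = (x0 + math.floor((m[0] - x0) / rw) * rw,
          y0 + math.floor((m[1] - y0) / rh) * rh)
    p2 = (x0 + math.ceil((n[0] - x0) / rw) * rw,
          y0 + math.ceil((n[1] - y0) / rh) * rh)
    return p1, p2

# Example: 100x100 sub-regions, first editing area from (130, 40) to (340, 260)
print(to_update_region((0, 0), (130, 40), (340, 260), 100, 100))
# -> ((100, 0), (400, 300)): covers the edit with a whole number of sub-regions
```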
And step 405, updating the current display content of the display control device based on the first editing area and the area to be updated.
In this embodiment, the execution subject may directly replace the area to be updated with the first editing area, i.e., splice the first editing area with the target display area, to obtain the updated current display content of the display control device; alternatively, it may update the content of the area to be updated according to the content of the first editing area to obtain the updated current display content of the display control device.
In some optional manners, updating the current display content of the display control device based on the first editing area and the area to be updated includes: updating the area to be updated according to the first editing area to obtain an updated area; and carrying out smooth fusion processing based on the updated region and the target display region to obtain the current display content of the updated display control equipment.
In this implementation, after the execution subject updates the area to be updated according to the first editing area to obtain the updated area, directly splicing the updated area with the target display area would produce obvious stitching artifacts at the edge area due to pixel differences. Therefore, a smooth fusion process based on convolution and distance weighting can be applied to the updated area and the target display area to obtain the updated current display content of the display control device.
Specifically, as shown in FIG. 4e, the element values of the transition region are first calculated as:
C_t = ω_b × C_b + ω_u × C_u
where ω_b and ω_u are the blending weights, C_b is the element value of the initial area, C_u is the element value of the updated area, d is the distance of an image point from the left edge of the transition region (the larger d is, the closer the image point is to the updated area and the larger ω_u becomes; the smaller d is, the closer the image point is to the initial area and the larger ω_b becomes), and W is the width of the transition region.
Since the frequency components of image edges and of noise interference lie in the higher part of the spatial-frequency domain, a low-pass filtering method is used to remove the noise. The frequency-domain filtering is implemented by spatial-domain convolution, so designing the unit impulse response matrix of the spatial system achieves the noise-filtering effect.
Then, for each element value of the transition region, a distance-weighted calculation based on the spatial-domain low-pass filter is applied to complete the fusion processing operation.
the implementation mode is to update the area to be updated according to the first editing area to obtain an updated area; and based on the updated region and the target display region, performing smooth fusion processing to obtain the current display content of the updated display control equipment, thereby effectively improving the accuracy of the display content and the display effect.
Compared with the embodiment corresponding to FIG. 2, the flow 400 of the distributed editing method in this embodiment highlights determining the target display area based on the editing mode, determining the area to be updated within the target display area, and updating the area to be updated based on the first editing area to obtain the updated current display content of the display control device, which improves the accuracy of the display content while realizing fused content display between the master device and the slave devices.
With further reference to fig. 5, as an implementation of the method shown in the foregoing figures, the present application provides an embodiment of a distributed editing apparatus, where an embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 1, and the apparatus may be specifically applied to various electronic devices.
As shown in fig. 5, the distributed editing apparatus 500 of the present embodiment includes: a receiving module 501, a synchronizing module 502 and an updating module 503.
Wherein, the receiving module 501 may be configured to open the distributed editing rights of the initial slave device in response to receiving the distributed editing instruction and the initial slave device being present.
The synchronization module 502 may be configured to synchronize the current display content of the display control device to the target slave device in response to determining that the target slave device that activates the distributed editing rights exists in the initial slave device.
The updating module 503 may be configured to update the current display content of the display control device based on the editing result of the current display content by the target slave device.
In some alternatives of this embodiment, the determining module is further configured to determine the distributed editing area based on the distributed editing device and the pointing vector in response to determining that the distributed editing device is rotatable and that the distributed editing vector of the distributed editing device does not match the pointing vector.
In some alternatives of this embodiment, the update module further includes: a first determination unit configured to determine a target display area based on the editing mode; a second determining unit configured to determine, in the target display area, an area to be updated; and the sub-updating unit is configured to update the current display content of the display control device based on the first editing area and the area to be updated.
In some alternatives of this embodiment, the first determining unit is further configured to: in response to determining that the edit mode includes only the split edit mode, the initial region is directly determined as the target display region.
In some alternatives of this embodiment, the first determining unit is further configured to: determining a second editing region according to the position information and the size of the editing region in response to determining that the editing mode includes an extended editing mode; and determining a target display area based on the second editing area and the initial area.
In some optional manners of this embodiment, determining the target display area based on the second editing area, the initial area includes: and determining a target display area based on the subarea and the corresponding weight included in the initial area and the subarea and the corresponding weight included in the second editing area.
In some optional manners of this embodiment, the sub-updating unit is further configured to update the area to be updated according to the first editing area, to obtain an updated area; and carrying out smooth fusion processing based on the updated region and the target display region to obtain the current display content of the updated display control equipment.
In some alternatives of this embodiment, the initial slave device is determined by: determining the position information of each slave device relative to the display control device based on an arrival time difference positioning method; based on the location information, an initial slave device is determined among the slave devices.
In some alternatives of this embodiment, the apparatus further includes a switching module configured to switch the master device to the slave device in response to receiving the distributed editing instruction, such that the master device initiates distributed editing as desired.
In some optional manners of this embodiment, the apparatus further includes a sharing module configured to notify the master device to continue content sharing in response to receiving an instruction to end the distributed editing mode.
With further reference to FIG. 6, one embodiment of a distributed editing system is provided.
In this embodiment, the system comprises a control device 601, a display control device 602 and at least one target slave device 603.
Wherein a control device 601 is used to perform the distributed editing method as described in embodiment 2 above.
Here, the control device may be a terminal device, such as a display control device like a projector or a television, or may be a server, which is not limited in this application.
And the display control device 602 is used for displaying the current display content and the updated current display content.
The target slave devices 603 and 604 are used for receiving the current display content synchronized by the display control device, editing the current display content, and sending the editing result to the control device.
In some alternatives, the system further comprises: and the main device 605 is used for synchronizing the display content to the display control device for display.
According to embodiments of the present application, an electronic device and a readable storage medium are also provided.
FIG. 7 is a block diagram of an electronic device for the distributed editing method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit the implementations of the application described and/or claimed herein.
As shown in FIG. 7, the electronic device includes: one or more processors 701, a memory 702, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 701 is taken as an example in FIG. 7.
Memory 702 is a non-transitory computer-readable storage medium provided herein. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the distributed editing methods provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the distributed editing method provided by the present application.
The memory 702 is used as a non-transitory computer readable storage medium for storing non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the receiving module 501, the synchronizing module 502, and the updating module 503 shown in fig. 5) corresponding to the distributed editing methods in the embodiments of the present application. The processor 701 executes various functional applications of the server and data processing by running non-transitory software programs, instructions, and modules stored in the memory 702, that is, implements the distributed editing method in the above-described method embodiments.
Memory 702 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created by use of the distributed edited electronic device, and the like. In addition, the memory 702 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 702 optionally includes memory remotely located relative to processor 701, which may be connected to the distributed editing electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the distributed editing method may further include: an input device 703 and an output device 704. The processor 701, the memory 702, the input device 703 and the output device 704 may be connected by a bus or otherwise, in fig. 7 by way of example.
The input device 703 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for distributed editing, such as a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointing stick, one or more mouse buttons, a trackball, or a joystick. The output device 704 may include a display apparatus, auxiliary lighting devices (e.g., LEDs), haptic feedback devices (e.g., vibration motors), and the like. The display apparatus may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display apparatus may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic disks, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solutions of the embodiments of the present application, the problem that shared screen content cannot be edited is effectively avoided, and fused display of distributed editing results is realized.
It should be understood that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (14)

1. A distributed editing method, the method comprising:
in response to receiving a distributed editing instruction and determining that an initial slave device exists, opening a distributed editing authority of the initial slave device, wherein the initial slave device indicates at least one slave device whose distance from a display control device meets a preset condition;
in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, synchronizing current display content of the display control device to the target slave device;
and updating the current display content of the display control device based on an editing result of the target slave device on the current display content.
2. The method of claim 1, wherein the editing result comprises an editing mode and an editing area, and the updating the current display content of the display control device based on the editing result of the target slave device on the current display content comprises:
determining a target display area based on the editing mode, wherein the target display area represents a minimum display area capable of covering an initial area and the editing area, the initial area is the area of the current display content of the display control device, and the side-length proportions of the target display area are the same as those of the initial area;
determining an area to be updated in the target display area, wherein the area to be updated is an area that can cover a first editing area while occupying the fewest sub-areas of the target display area, and the first editing area is the editing area converted into the coordinate system of the target display area;
and updating the current display content of the display control device based on the first editing area and the area to be updated.
3. The method of claim 2, wherein the determining the target display area based on the editing mode comprises:
in response to determining that the editing mode includes only a split editing mode, determining the initial area as the target display area.
4. The method of claim 2, wherein the determining the target display area based on the editing mode comprises:
in response to determining that the editing mode includes an extended editing mode, determining a second editing area according to position information and a size of the editing area;
and determining the target display area based on the second editing area and the initial area.
5. The method of claim 4, wherein the determining the target display area based on the second editing area and the initial area comprises:
determining the target display area based on sub-areas and corresponding weights included in the initial area and sub-areas and corresponding weights included in the second editing area.
6. The method of claim 2, wherein updating the current display content of the display control device based on the first editing area and the area to be updated comprises:
updating the area to be updated according to the first editing area to obtain an updated area;
and performing smooth fusion processing based on the updated area and the target display area to obtain updated current display content of the display control device.
7. The method of claim 1, wherein the initial slave device is determined by:
determining position information of each slave device relative to the display control device based on a time-difference-of-arrival positioning method;
and determining the initial slave device among the slave devices based on the position information.
8. The method of claim 1, wherein the current display content of the display control device is content shared by a master device, and the method further comprises:
switching the master device to a slave device in response to receiving the distributed editing instruction, so that the master device can start distributed editing as required.
9. The method of claim 1, wherein the current display content of the display control device is content shared by a master device, and the method further comprises:
in response to receiving an instruction to end the distributed editing mode, notifying the master device to continue content sharing.
10. A distributed editing apparatus, the apparatus comprising:
a receiving module configured to, in response to receiving a distributed editing instruction and determining that an initial slave device exists, open a distributed editing authority of the initial slave device, wherein the initial slave device indicates at least one slave device whose distance from a display control device meets a preset condition;
a synchronizing module configured to, in response to determining that a target slave device that has activated the distributed editing authority exists among the initial slave devices, synchronize current display content of the display control device to the target slave device;
and an updating module configured to update the current display content of the display control device based on an editing result of the target slave device on the current display content.
11. A distributed editing system, the system comprising: a control device, a display control device, and at least one target slave device,
the control device is used for performing the method according to any one of claims 1-9;
the display control device is used for displaying the current display content and displaying the updated current display content;
the target slave device is used for receiving the current display content synchronized by the display control device, editing the current display content and sending the editing result to the control device.
12. The system of claim 11, the system further comprising: a master device,
wherein the master device is used for synchronizing display content to the display control device for display.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
14. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-9.
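For concreteness only, the following Python sketch shows one way the area computation of claims 2 and 3 could be read: the target display area is taken as the smallest rectangle that covers both the initial area and the editing area while keeping the initial area's side-length proportions, and the area to be updated is the smallest block of grid sub-areas covering the editing area after conversion into the target area's coordinate system. The rectangle model, the fixed 4x4 grid subdivision, and all names are assumptions introduced for illustration and do not appear in the application.

```python
# Hypothetical sketch of the target-display-area / area-to-be-updated computation
# read from claims 2-3; the rectangle model and grid subdivision are assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

    @property
    def right(self) -> float:
        return self.x + self.w

    @property
    def bottom(self) -> float:
        return self.y + self.h


def bounding_rect(a: Rect, b: Rect) -> Rect:
    """Smallest axis-aligned rectangle covering both inputs."""
    x = min(a.x, b.x)
    y = min(a.y, b.y)
    return Rect(x, y, max(a.right, b.right) - x, max(a.bottom, b.bottom) - y)


def target_display_area(initial: Rect, editing: Rect) -> Rect:
    """Smallest area covering both rectangles while keeping the initial aspect ratio."""
    bbox = bounding_rect(initial, editing)
    ratio = initial.w / initial.h
    # Grow the bounding box along one axis until its side proportions match.
    if bbox.w / bbox.h < ratio:
        return Rect(bbox.x, bbox.y, bbox.h * ratio, bbox.h)
    return Rect(bbox.x, bbox.y, bbox.w, bbox.w / ratio)


def area_to_be_updated(target: Rect, first_edit: Rect, grid: int = 4) -> Rect:
    """Smallest block of grid sub-areas of `target` covering the converted edit area.

    Assumes the converted editing area lies inside the target display area.
    """
    cell_w, cell_h = target.w / grid, target.h / grid
    col0 = int((first_edit.x - target.x) // cell_w)
    row0 = int((first_edit.y - target.y) // cell_h)
    col1 = int(-(-(first_edit.right - target.x) // cell_w)) - 1   # ceil(...) - 1
    row1 = int(-(-(first_edit.bottom - target.y) // cell_h)) - 1
    return Rect(target.x + col0 * cell_w,
                target.y + row0 * cell_h,
                (col1 - col0 + 1) * cell_w,
                (row1 - row0 + 1) * cell_h)
```

The smooth fusion of claim 6 would then blend the updated block back into the target display area, for example by feathering its borders; that step is omitted from the sketch.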
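Similarly, the time-difference-of-arrival positioning of claim 7 might, under strong simplifying assumptions, be approximated as below; the 2-D anchor model, the least-squares linearization, the propagation speed, and the 5-metre threshold are illustrative choices, not details disclosed in the application.

```python
# Hypothetical TDOA sketch for claim 7: estimate a slave device's position from
# arrival-time differences at known anchor points, then treat it as an "initial
# slave device" if it is close enough to the display control device.
import numpy as np

PROPAGATION_SPEED = 343.0  # m/s for an acoustic signal; an RF system would use the speed of light


def locate_by_tdoa(anchors: np.ndarray, toa: np.ndarray) -> np.ndarray:
    """Least-squares 2-D position from times of arrival at >= 4 known anchors.

    anchors: (N, 2) anchor coordinates in metres; toa: (N,) arrival times in seconds.
    """
    d = PROPAGATION_SPEED * (toa[1:] - toa[0])        # range differences vs. anchor 0
    p0, pi = anchors[0], anchors[1:]
    # Linearised TDOA equations with unknowns (x, y, r0), where r0 = distance to anchor 0.
    A = np.hstack([-2.0 * (pi - p0), -2.0 * d[:, None]])
    b = d**2 - (np.sum(pi**2, axis=1) - np.sum(p0**2))
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:2]                                    # estimated (x, y)


def is_initial_slave(anchors: np.ndarray, toa: np.ndarray,
                     display_pos: np.ndarray, max_distance_m: float = 5.0) -> bool:
    """True if the located slave is within the preset distance of the display control device."""
    return float(np.linalg.norm(locate_by_tdoa(anchors, toa) - display_pos)) <= max_distance_m
```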
CN202310467502.4A 2023-04-27 2023-04-27 Distributed editing method and device Pending CN116506682A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310467502.4A CN116506682A (en) 2023-04-27 2023-04-27 Distributed editing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310467502.4A CN116506682A (en) 2023-04-27 2023-04-27 Distributed editing method and device

Publications (1)

Publication Number Publication Date
CN116506682A true CN116506682A (en) 2023-07-28

Family

ID=87329808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310467502.4A Pending CN116506682A (en) 2023-04-27 2023-04-27 Distributed editing method and device

Country Status (1)

Country Link
CN (1) CN116506682A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination