CN116701779A - Collaborative annotation data screening method and device, conference system, terminal and medium - Google Patents

Collaborative annotation data screening method and device, conference system, terminal and medium Download PDF

Info

Publication number
CN116701779A
CN116701779A (application CN202310612709.6A)
Authority
CN
China
Prior art keywords
labeling
annotation
interface
collaborative
main body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310612709.6A
Other languages
Chinese (zh)
Inventor
强树树
赵延鹏
宗靖国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixelhue Technology Ltd
Original Assignee
Pixelhue Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixelhue Technology Ltd filed Critical Pixelhue Technology Ltd
Priority to CN202310612709.6A priority Critical patent/CN116701779A/en
Publication of CN116701779A publication Critical patent/CN116701779A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/435Filtering based on additional data, e.g. user or group profiles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/45Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9536Search customisation based on social or collaborative filtering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application relates to the technical field of data processing and provides a collaborative annotation data screening method and apparatus, a conference system, a terminal, and a medium. In the data screening method, a terminal device generates an annotation interface in response to a user triggering an annotation button in the conference main interface, then generates a collaborative annotation interface in response to the user triggering a collaborative operation button in the annotation interface and sends a collaboration instruction to the other terminal devices in the conference system. If a terminal device performs data annotation on its collaborative annotation interface, an annotation sub-object and at least one corresponding annotation subject can be obtained based on that device's data annotation action and displayed in the collaborative annotation interface. Finally, the annotation sub-objects corresponding to a target annotation subject selected by the user are screened out and displayed in the collaborative annotation interface, so that collaborative annotation data in the conference are screened effectively and conference efficiency and collaboration capability are improved.

Description

Collaborative annotation data screening method and device, conference system, terminal and medium
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a collaborative annotation data screening method, a collaborative annotation data screening device, a conference system, a terminal and a medium.
Background
In the present digital age, paperless conferences are becoming increasingly popular. Paperless conferences not only reduce environmental pollution but can also improve conference efficiency and collaboration capability. However, when multiple participants join a conference with multiple devices, collaboratively annotating the objects discussed in the conference and screening the annotation data becomes difficult. Existing paperless conference systems typically employ digital annotation, i.e., converting conference content into digital form and annotating it.
However, this way of digital annotation also has some problems. For example, the objects discussed in the conference (pictures, videos, etc.) cannot be annotated directly and intuitively, and multiple participants may annotate on different devices, so that the format and structure of the data are inconsistent and collaborative processing is difficult. In addition, a large amount of redundant and repeated data may be generated during annotation, which increases the difficulty and complexity of data screening. Therefore, a method for effective collaborative annotation and effective screening of annotation data is needed to improve conference efficiency and collaboration capability.
Disclosure of Invention
The embodiment of the application provides a collaborative annotation data screening method, a collaborative annotation data screening device, a conference system, a terminal and a medium, which can effectively screen collaborative annotation data in a conference so as to improve conference efficiency and collaborative capability.
In a first aspect, an embodiment of the present application provides a collaborative annotation data screening method applied to any terminal device in a conference system, where the conference system includes a plurality of terminal devices, each terminal device provides at least a conference main interface, the conference main interface is configured to provide a plurality of conference function buttons, and the plurality of conference function buttons includes at least an annotation button. The data screening method comprises the following steps:
generating an annotation interface in response to a triggering operation by a user on the annotation button in the conference main interface, where the annotation interface is used for data annotation of any annotation object in the terminal device and includes at least a collaborative operation button;
generating a collaborative annotation interface in response to a triggering operation by the user on the collaborative operation button, and sending a collaboration instruction to the other terminal devices in the conference system;
if a terminal device performs data annotation on its collaborative annotation interface, acquiring an annotation sub-object and at least one corresponding annotation subject in the conference system based on the data annotation action of that terminal device, and synchronously displaying the annotation sub-object and the corresponding at least one annotation subject in the collaborative annotation interface of each terminal device;
and screening the annotation sub-objects corresponding to a selected target annotation subject and displaying them in the collaborative annotation interface.
Optionally, the annotation subject includes an annotation type, an annotation type attribute, and an annotating user.
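The patent gives no data model, but the composition described above (an annotation subject combining a type, a type attribute, and the annotating user, with each annotation sub-object referencing at least one subject) can be sketched as plain data classes. All class and field names below are illustrative assumptions, not taken from the application:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class AnnotationSubject:
    annotation_type: str   # e.g. "highlight" or "text" (illustrative values)
    type_attribute: str    # e.g. the color or stroke width of that type
    user: str              # the participant who made the annotation

@dataclass
class AnnotationSubObject:
    content: str                                  # the marked fragment, e.g. a text span
    subjects: list = field(default_factory=list)  # at least one AnnotationSubject
```

Making the subject frozen (hashable) anticipates the later grouping step, where sub-objects sharing the same subject are collected into one annotation data set.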
Optionally, the terminal devices in the conference system are connected to the same server, the plurality of terminal devices connected to the same server hold the same annotation object resources, and the storage paths of the annotation object resources on the terminal devices may be the same or different. Obtaining an annotation sub-object and at least one corresponding annotation subject in the conference system based on the data annotation action of the terminal device, and synchronously displaying them in the collaborative annotation interface of each terminal device, includes:
obtaining, forwarded through the server, an annotation instruction sent by the terminal device performing data annotation, where the annotation instruction includes an annotation action for the annotation sub-object;
obtaining the annotation action for the annotation sub-object by parsing the annotation instruction;
and annotating the annotation sub-object according to the annotation action, determining the annotation sub-object and the at least one annotation subject corresponding to it in the conference system, and synchronously displaying the annotation sub-object and the annotation subject in the collaborative annotation interface of each terminal device.
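The parse-then-apply steps above can be sketched as follows, assuming a JSON-encoded annotation instruction; the message fields (`sub_object`, `action`, `subjects`) are hypothetical names chosen for illustration, since the application does not specify a wire format:

```python
import json

def parse_annotation_instruction(raw):
    """Parse a server-forwarded annotation instruction into its parts."""
    msg = json.loads(raw)
    return msg["sub_object"], msg["action"], msg.get("subjects", [])

def apply_annotation(interface_state, raw):
    """Apply the parsed action so the sub-object and its subjects are displayed
    in this terminal's collaborative annotation interface (modeled as a list)."""
    sub_object, action, subjects = parse_annotation_instruction(raw)
    interface_state.append({"sub_object": sub_object,
                            "action": action,
                            "subjects": subjects})
    return interface_state
```

Each terminal would run `apply_annotation` on every instruction the server forwards, keeping the collaborative annotation interfaces in sync.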
Optionally, after obtaining the annotation sub-object and the corresponding at least one annotation subject in the conference system, the method further includes:
dividing the annotation sub-objects corresponding to the same annotation subject into one group to obtain a plurality of annotation data sets;
screening the annotation sub-objects corresponding to the target annotation subject through the selected target annotation subject and displaying them in the collaborative annotation interface then includes:
extracting the annotation data set corresponding to the selected target annotation subject, and displaying the target annotation subject and the corresponding annotation data set in the collaborative annotation interface.
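The grouping and extraction steps above amount to a group-by over (subject, sub-object) pairs. A minimal sketch, with assumed plain-string subjects for brevity:

```python
from collections import defaultdict

def group_by_subject(annotations):
    """Group (subject, sub_object) pairs into one annotation data set per subject."""
    groups = defaultdict(list)
    for subject, sub_object in annotations:
        groups[subject].append(sub_object)
    return dict(groups)

def extract_data_set(groups, target_subject):
    """Extract the annotation data set for the selected target subject."""
    return groups.get(target_subject, [])
```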
Optionally, extracting the annotation data set corresponding to the selected target annotation subject, and displaying the target annotation subject and the corresponding annotation data set in the collaborative annotation interface, includes:
in response to a selection operation by a user on a target annotation subject, extracting the annotation data set corresponding to the target annotation subject and displaying it in the collaborative annotation interface of the user's own terminal device; and/or
in response to a selection operation by a user on the target annotation subject, generating a screening instruction, where the screening instruction includes the target annotation subject;
and sending the screening instruction to the other terminal devices, so that a terminal device receiving the screening instruction parses it to obtain the target annotation subject, and extracts and displays the corresponding annotation data set accordingly.
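The screening-instruction round trip described above can be sketched as a generate/parse pair. The JSON field names are assumptions for illustration only:

```python
import json

def make_screening_instruction(target_subject):
    """Generate the screening instruction sent to the other terminal devices."""
    return json.dumps({"kind": "screen", "target_subject": target_subject})

def handle_screening_instruction(raw, groups):
    """Parse a received screening instruction and extract the matching data set."""
    msg = json.loads(raw)
    if msg.get("kind") != "screen":
        raise ValueError("not a screening instruction")
    return groups.get(msg["target_subject"], [])
```

A receiving terminal would call `handle_screening_instruction` against its own grouped annotations, so every device ends up displaying the same filtered set.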
Optionally, the data screening method further comprises:
determining the plurality of annotation subjects corresponding to all annotation sub-objects as an annotation subject list, and displaying it in each collaborative annotation interface;
and determining the target annotation subject in response to a selection operation by the user on at least one annotation subject in the annotation subject list.
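Building the subject list above is a deduplication over the subjects of all sub-objects; a sketch under the same assumed (subject, sub-object) pair representation:

```python
def subject_list(annotations):
    """Collect the distinct subjects of all sub-objects, preserving first-seen order."""
    seen = []
    for subject, _ in annotations:
        if subject not in seen:
            seen.append(subject)
    return seen
```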
Optionally, generating a collaborative annotation interface in response to the triggering operation by the user on the collaborative operation button, and sending a collaboration instruction to the other terminal devices in the conference system, includes:
generating the collaborative annotation interface in response to the triggering operation by the user on the collaborative operation button, and sending a collaboration request to the other terminal devices in the conference system;
obtaining a responding-device list based on the response information of the other terminal devices;
and determining target terminal devices based on the responding-device list, and sending the collaboration instruction to the target terminal devices.
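The request/response handshake above can be sketched with the transport abstracted away as a callable; device identifiers and the message shape are illustrative assumptions:

```python
def responding_devices(responses):
    """Build the responding-device list from each device's response information,
    keeping only devices that accepted the collaboration request."""
    return [dev for dev, accepted in responses if accepted]

def send_collaboration(targets, send):
    """Send the collaboration instruction to each selected target device."""
    for dev in targets:
        send(dev, {"kind": "collaborate"})
    return list(targets)
```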
Optionally, generating a collaborative annotation interface in response to the triggering operation by the user on the collaborative operation button, and sending a collaboration instruction to the other terminal devices in the conference system, further includes:
generating the collaborative annotation interface in response to the triggering operation by the user on the collaborative operation button, and broadcasting the collaboration instruction to the other terminal devices in the conference system.
Optionally, the collaboration instruction includes at least annotation object information and an operation action for the annotation object; sending the collaboration instruction to the other terminal devices in the conference system further includes:
sending the collaboration instruction to the other terminal devices in the conference system, so that a terminal device receiving the collaboration instruction parses it to obtain the annotation object information and the operation action for the annotation object, and executes the operation action on the annotation object according to that information.
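The receiving side described above parses the instruction and executes the action on the named object. A sketch, again with an assumed JSON encoding and hypothetical field names (`object`, `action`):

```python
import json

def execute_collaboration_instruction(raw, perform):
    """Parse annotation object information and the operation action from a
    collaboration instruction, then execute the action via `perform`."""
    msg = json.loads(raw)
    return perform(msg["object"], msg["action"])
```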
In a second aspect, an embodiment of the present application provides a collaborative annotation data screening apparatus applied to any terminal device in a conference system, where the conference system includes a plurality of terminal devices, each terminal device provides at least a conference main interface, the conference main interface is configured to provide a plurality of conference function buttons, and the plurality of conference function buttons includes at least an annotation button. The data screening apparatus comprises:
a first trigger response module, configured to generate an annotation interface in response to a triggering operation by a user on the annotation button in the conference main interface, where the annotation interface is used for data annotation of any annotation object in the terminal device and includes at least a collaborative operation button;
a second trigger response module, configured to generate a collaborative annotation interface in response to a triggering operation by the user on the collaborative operation button, and to send a collaboration instruction to the other terminal devices in the conference system;
an annotation data acquisition module, configured to, if a terminal device performs data annotation on its collaborative annotation interface, acquire an annotation sub-object and at least one corresponding annotation subject in the conference system based on the data annotation action of that terminal device, and synchronously display them in the collaborative annotation interface of each terminal device;
and an annotation data screening module, configured to screen the annotation sub-objects corresponding to a selected target annotation subject and display them in the collaborative annotation interface.
In a third aspect, an embodiment of the present application provides a conference system, where the conference system includes a server and a plurality of terminal devices, each of the terminal devices being connected to the server;
the server is configured to receive the annotation sub-object and the corresponding at least one annotation subject from any terminal device and to send them to the other terminal devices;
the terminal device is configured to generate an annotation sub-object and at least one corresponding annotation subject in response to a data annotation action by the user on the collaborative annotation interface, and to send them to the server, so that the server sends them to the other terminal devices in the conference system and the plurality of terminal devices screen the corresponding annotation sub-objects through the annotation subject.
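The server's role above is a fan-out relay. A minimal sketch of that forwarding step, with device identifiers and the payload shape assumed for illustration:

```python
def relay(sender, payload, terminals):
    """Forward a sub-object/subject payload from one terminal to every other
    terminal in the conference, so all collaborative interfaces stay in sync."""
    return {t: payload for t in terminals if t != sender}
```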
In a fourth aspect, an embodiment of the present application provides a terminal device, including: the system comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor realizes the collaborative annotation data screening method of the first aspect when executing the computer program.
In a fifth aspect, an embodiment of the present application provides a computer readable storage medium, where a computer program is stored, where the computer program, when executed by a processor, implements the collaborative annotation data screening method of the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the collaborative annotation data screening method of the first aspect.
Compared with the prior art, the embodiments of the present application have the following beneficial effects:
The method provided by the embodiments of the present application can be applied to any terminal device in a conference system, where the conference system includes a plurality of terminal devices, each terminal device provides at least a conference main interface, the conference main interface provides a plurality of conference function buttons, and the conference function buttons include at least an annotation button. The terminal device generates an annotation interface in response to the user triggering the annotation button in the conference main interface, then generates a collaborative annotation interface in response to the user triggering the collaborative operation button in the annotation interface and sends a collaboration instruction to the other terminal devices in the conference system. If a terminal device performs data annotation on its collaborative annotation interface, an annotation sub-object and at least one corresponding annotation subject can be obtained based on that device's data annotation action and displayed in the collaborative annotation interface. Finally, the annotation sub-objects corresponding to a target annotation subject selected by the user are screened out and displayed in the collaborative annotation interface, so that collaborative annotation data in the conference are screened effectively and conference efficiency and collaboration capability are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a conference system according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a collaborative annotation data screening method according to an embodiment of the present application;
fig. 3 is a schematic flow chart of an interface operation in a conference according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a step of screening annotation data according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating a cooperative instruction sending step according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a collaborative annotation data screening apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to a determination", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims are used to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
To address the problems described above, the present application provides a collaborative annotation data screening method applied to any terminal device in a conference system. The terminal device generates an annotation interface in response to the user triggering the annotation button in the conference main interface, then generates a collaborative annotation interface in response to the user triggering the collaborative operation button in the annotation interface and sends a collaboration instruction to the other terminal devices in the conference system. If a terminal device performs data annotation on its collaborative annotation interface, an annotation sub-object and at least one corresponding annotation subject can be obtained based on that device's data annotation action and displayed in the collaborative annotation interface. Finally, the annotation sub-objects corresponding to a target annotation subject selected by the user are screened out and displayed in the collaborative annotation interface, thereby effectively screening collaborative annotation data in the conference and improving conference efficiency and collaboration capability.
It should be understood that the sequence numbers of the steps in the embodiments of the present application do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application.
In order to illustrate the technical method of the present application, the following description is made by way of specific examples.
Referring to fig. 1, a schematic structural diagram of a conference system according to an embodiment of the present application is shown. As shown in fig. 1, the conference system includes a server and a plurality of terminal devices, which may be connected through a network, that is, the terminal devices and the server are located in the same network. This may mean that they are in the same Ethernet or wireless network, or in the same network address space such as a local area network. Communication between the devices is based on a network protocol such as IP (Internet Protocol), i.e., identification and addressing are performed through IP addresses, and data forwarding and communication are performed through network devices such as routers and switches. The application is not limited to a particular network form.
In the embodiment of the present application, to realize collaborative annotation data screening among a plurality of terminal devices, when a user performs a data annotation action on the collaborative annotation interface of any terminal device, that terminal device generates an annotation sub-object and at least one annotation subject in response to the action and sends them to the server. The server receives the annotation sub-object and the corresponding at least one annotation subject from that terminal and sends them to the other terminal devices, so that each terminal device can screen the corresponding annotation sub-objects through the annotation subject.
The collaborative annotation interface is generated as follows: first, the terminal device generates the annotation interface in response to the user triggering the annotation button in the conference main interface; second, the terminal device generates the collaborative annotation interface in response to the user triggering the collaborative operation button in the annotation interface, and the data annotation actions performed on any terminal device in this interface are shared with the other terminal devices through the server.
It should be understood that, before any terminal device in the conference system has made an annotation, the collaborative annotation interface displays the same picture content as the annotation interface.
The conference main interface is used for providing a plurality of conference function buttons.
As shown in fig. 1, the terminal device may enter the conference main interface through conference terminal software, perform subsequent collaborative labeling and data screening operations, and may also enter the conference main interface through a conference web address, and perform subsequent collaborative labeling and data screening operations.
Referring to fig. 2, a flow chart of a collaborative annotation data screening method provided by an embodiment of the present application is shown, where the data screening method may be applied to any terminal device in the conference system of fig. 1.
Specifically, the data screening method may include the steps of:
and step 201, responding to the triggering operation of a user on a labeling button in a main conference interface, and generating a labeling interface.
In the embodiment of the present application, the terminal device may enter the conference through conference terminal software or a conference web address; accordingly, the terminal device needs to have the conference terminal software installed or to support web access.
The conference main interface refers to the terminal display interface during a conference. Its picture may be the shared conference picture, i.e., all terminal devices can see the shared picture. The conference main interface may further include a plurality of conference function buttons, i.e., operation buttons for various conference functions, for example an invite button, a participants button, a conference recording button, and an annotation button. If an annotation function is to be executed in the conference, the plurality of conference function buttons needs to include at least an annotation button to trigger the terminal device to enter the annotation interface.
In the embodiment of the application, when the user triggers the annotation button in the conference main interface, the terminal device generates an annotation interface to provide an interactive environment in which the user performs data annotation. Data annotation may include marking, annotating, highlighting, and the like on an annotation sub-object, where an annotation sub-object may be a part of the annotation object, such as a passage of text in a document or a person or object in an image. As shown in the interface operation flowchart of fig. 3, when the user clicks the annotation button in the conference main interface, an annotation interface is generated. The annotation interface is used for data annotation of any annotation object in the terminal device. In order to allow multiple users to annotate the conference content at the same time and to share the annotation results, thereby promoting collaboration and information sharing among teams, the annotation interface should include at least a collaborative operation button.
It should be understood that, in addition to the collaborative operation button, the annotation interface may include several other components and functions:
First, the annotation interface typically displays relevant conference content, such as the conference agenda, discussion records, and presentations, so that the user can view the conference content directly on the annotation interface for better understanding and annotation.
Second, to enable the user to annotate the conference content, the annotation interface may provide tools such as text labels, scoring, annotating, and highlighting. The user may use these tools to mark important information and key points in the conference content, or to add personal notes.
Third, the annotation interface may also provide a selection of annotation types to meet different annotation needs. For example, the user may mark content as important, to-do, question, or discussion, in order to better organize and categorize the annotation content.
Fourth, the annotation interface typically also provides annotation management functionality so that a user can view, edit, delete, or export annotations that have already been made. In this way, the user can conveniently manage the annotations and further process or share them when needed.
It should be noted that the above examples are only exemplary and should not be construed as limiting the application. In the actual use process, besides the functions, the labeling interface can be further expanded to meet the specific requirements of users. The embodiment of the present application is not limited thereto.
Step 202, generating a collaborative annotation interface in response to the triggering operation of the user on the collaborative operation button, and sending a collaborative instruction to other terminal devices in the conference system.
In the embodiment of the application, when the user triggers the collaborative operation button on the annotation interface, the terminal device generates the collaborative annotation interface according to a preset interface template and, while generating it, sends a collaborative instruction to the other terminal devices in the conference system to invite them to perform collaborative operations.
The other terminal devices in the conference system may be the computers, smartphones, tablet computers, and the like of other users participating in the conference. Upon receiving the collaborative instruction, each of these devices can display the same collaborative annotation interface as the terminal device that sent the instruction. Once the collaborative annotation interface is synchronously displayed on the terminal devices, the users can perform annotation operations at the same time, and an annotation operation on any terminal device is updated on the other devices in real time, so that all conference participants see the latest annotation data.
In one possible implementation manner, the steps of generating the collaborative annotation interface and sending the collaborative instruction to other terminal devices in the conference system in response to the triggering operation of the collaborative operation button by the user may be implemented in the following steps 401 to 403.
And step 401, responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface and sending a cooperative request to other terminal devices in the conference system.
In the embodiment of the application, after the user triggers the collaborative operation button, a collaborative annotation interface is generated. While generating the collaborative annotation interface, the terminal device sends a collaboration request to the other terminal devices in the conference system, where the collaboration request is used to instruct the other terminal devices to return response information. If another terminal device receives the collaboration request, it returns response information to the requesting terminal, where the response information may include an identity of the terminal device, such as an IP address.
The collaborative annotation interface may refer to the collaborative annotation interface in fig. 3, where the collaborative annotation interface includes a plurality of annotation bodies, each annotation body may correspond to a plurality of annotation sub-objects, and the corresponding annotation sub-objects may be screened out by screening the annotation bodies.
Step 402, acquiring a response terminal device list based on response information of other terminal devices.
In the embodiment of the application, the terminal equipment sending the collaboration request can determine a response terminal equipment list based on the received response information, wherein the list records the identity of other terminal equipment which can participate in the collaborative labeling operation.
Step 403, determining a target terminal device based on the response terminal device list, and sending a collaboration instruction to the target terminal device.
In the embodiment of the application, the terminal device may select one or more target terminal devices based on the response terminal device list, and send a collaborative instruction to the selected target terminal devices, so that the target terminal devices can participate in the data annotation of the same annotation object.
In one possible implementation manner, responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface, and sending a cooperative instruction to other terminal devices in the conference system, and may further include:
and responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface, and broadcasting a cooperative instruction to other terminal devices in the conference system.
In the embodiment of the application, after the user triggers the collaborative operation button, the system generates a collaborative annotation interface. The collaborative annotation interface may refer to a collaborative annotation interface schematic diagram of fig. 3, where the collaborative annotation interface includes a plurality of annotation bodies, each annotation body may correspond to a plurality of annotation sub-objects, and the corresponding annotation sub-objects may be screened out by screening the annotation bodies.
While generating the collaborative annotation interface, the collaborative instruction can be broadcast to all the other terminal devices in the conference system. The broadcast may establish a connection through WebSocket, UDP, or TCP and then transmit the collaborative instruction in real time over that connection; alternatively, the collaborative instruction may be transmitted through the HTTP protocol.
In one possible implementation manner, the collaborative instruction at least comprises information of a labeling object and an operation action aiming at the labeling object; sending a collaboration instruction to other terminal devices in the conference system, and further comprising:
and sending a cooperative instruction to other terminal equipment in the conference system, so that the terminal equipment receiving the cooperative instruction analyzes the cooperative instruction to obtain the information of the marked object and the operation action for the marked object, and executing the operation action for the marked object according to the information of the marked object.
In the embodiment of the application, the terminal device sending the collaborative instruction can transmit Json strings or binary data in two ways: over WebSocket (which can carry strings or binary data) or over HTTP (strings, binary data, or files); that is, the collaborative instruction may be represented as a Json string or as binary data. The terminal device sends the collaborative instruction to the other terminal devices in the conference system; alternatively, the collaborative instruction may first be sent to a server and then forwarded by the server to the other responding terminals.
The collaborative instruction at least includes annotation object information (for example, an object address, an object unique identifier, or an object file name) and an operation action on the annotation object (for example, turning to page X, or jumping to a given time in a video stream). The terminal device receiving the collaborative instruction can obtain the annotation object information and the operation action by parsing the instruction, so that the device can locate the annotation object according to the annotation object information and execute the operation action on it, for example, turning file A to page 10.
It should be understood that the plurality of terminal devices hold the same annotation object resources, but the storage paths of a resource in the corresponding terminal devices may be the same or different. For example, the storage position of file A in terminal device 1 is position A, while the storage position of file A in terminal device 2 is position B. After receiving the collaborative instruction, both terminal devices can retrieve file A from their respective storage positions according to the annotation object information (for example, the name identifier of file A), and then execute the operation action on file A.
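The round trip above can be sketched as follows. This is a minimal illustration only: the application fixes no wire format, so the Json field names (`object_id`, `action`, `params`) and the per-device path table are assumptions.

```python
import json

def build_collab_instruction(object_id, action, **params):
    """Serialize a collaborative instruction (object info + operation action) as Json."""
    return json.dumps({"object_id": object_id, "action": action, "params": params})

def apply_collab_instruction(payload, local_paths):
    """Parse the instruction and resolve the receiver's own copy of the
    annotation object; storage paths may differ between devices."""
    msg = json.loads(payload)
    local_file = local_paths[msg["object_id"]]   # e.g. file A at position A or B
    return local_file, msg["action"], msg["params"]

# Device 1 turns file A to page 10 and sends the instruction;
# device 2 resolves its own storage path for the same resource.
wire = build_collab_instruction("file-A", "goto_page", page=10)
path, action, params = apply_collab_instruction(wire, {"file-A": "/device2/docs/a.pdf"})
```

Receivers of the same instruction thus execute the same action on their own local copy, regardless of where that copy is stored.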
Step 203, if the terminal device performs data annotation on the corresponding collaborative annotation interface, acquiring an annotation sub-object and at least one corresponding annotation main body in the conference system based on the data annotation action of the terminal device, and synchronously displaying the annotation sub-object and the corresponding at least one annotation main body in the collaborative annotation interface of each terminal device.
In the embodiment of the application, each terminal device in the conference system that responds to the collaborative instruction can execute the collaborative operation, namely collaborative annotation, and the annotation sub-objects of the plurality of terminal devices participating in collaborative annotation, together with the corresponding at least one annotation body, are obtained.
The annotation body includes an annotation type, an annotation type attribute, and an annotating user. For example, the annotation type may include a brush, a line, a graphic (triangle, circle, or rectangle), text, and the like; the annotation type attributes may include color, brush thickness, text size, and the like; and the annotating user may include personal information (e.g., user ID, user nickname) of each user participating in the annotation, or of all users participating in the conference.
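The annotation body described above might be modeled as follows. This is an illustrative sketch only; all field names are assumptions rather than anything fixed by the application.

```python
from dataclasses import dataclass

# Hypothetical model of an annotation body: an annotation type, that type's
# attributes, and the annotating user. Field names are illustrative only.
@dataclass(frozen=True)
class AnnotationBody:
    annotation_type: str   # e.g. "brush", "line", "triangle", "circle", "text"
    attributes: tuple      # e.g. (("color", "red"), ("thickness", 3))
    user_id: str           # personal information of the annotating user

body = AnnotationBody("brush", (("color", "red"), ("thickness", 3)), "user-A")
```

Making the structure hashable (frozen, tuple attributes) would let it serve directly as a grouping key when annotation sub-objects are later screened by body.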
In the embodiment of the application, if any of the plurality of terminal devices participating in collaborative annotation performs data annotation on its collaborative annotation interface, that terminal device transmits an annotation instruction containing its own data annotation action to a server through the network, and the server distributes the annotation instruction to the other participating terminal devices. After receiving the annotation instruction, the other participating terminal devices parse it to obtain the annotation action for the annotation sub-object within the annotation object, and then perform that annotation action on their own interfaces, that is, draw the annotation sub-object on their own collaborative annotation interfaces. In this way, the annotation sub-objects of the plurality of participating terminal devices and the corresponding at least one annotation body are obtained.
Specifically, the step 203 may include:
acquiring, through server forwarding, an annotation instruction sent by the terminal device performing data annotation, where the annotation instruction includes an annotation action for an annotation sub-object;
obtaining the annotation action for the annotation sub-object by parsing the annotation instruction;
and annotating the annotation sub-object according to the annotation action, determining the annotation sub-object and the corresponding at least one annotation body in the conference system, and synchronously displaying them in the collaborative annotation interface of each terminal device. For example, suppose the terminal device performing data annotation circles person A in picture 1 with a brush. The server then forwards that device's annotation instruction, which includes the action of circling person A in picture 1 with a brush. After parsing the instruction, the other terminal devices perform the annotation action in their own collaborative annotation interfaces; the annotation sub-object in the conference system is determined to be person A in picture 1, and the corresponding at least one annotation body is the brush and the circle.
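The server-forwarding step can be sketched as a minimal in-memory relay. The class and field names (`AnnotationServer`, `sub_object`, `bodies`) are assumptions for illustration; a real deployment would use a network transport such as WebSocket rather than in-process inboxes.

```python
import json

class AnnotationServer:
    """Toy relay: forwards an annotation instruction from the sending
    device to every other participating device's inbox."""
    def __init__(self):
        self.devices = {}                      # device_id -> list of received payloads

    def register(self, device_id):
        self.devices[device_id] = []

    def forward(self, sender_id, instruction):
        for device_id, inbox in self.devices.items():
            if device_id != sender_id:         # the sender already drew locally
                inbox.append(instruction)

def parse_annotation(instruction):
    """Recover the annotation sub-object and its annotation bodies."""
    msg = json.loads(instruction)
    return msg["sub_object"], msg["bodies"]

server = AnnotationServer()
for dev in ("device-1", "device-2", "device-3"):
    server.register(dev)

# Device 1 circles person A in picture 1 with a brush; the server forwards it.
server.forward("device-1", json.dumps(
    {"sub_object": "person A in picture 1", "bodies": ["brush", "circle"]}))
sub, bodies = parse_annotation(server.devices["device-2"][0])
```

Each receiver would then redraw `sub` on its own collaborative annotation interface so that all participants see the same annotated picture.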
It should be understood that the terminal devices included in the conference system are connected to the same server, the plurality of terminal devices connected to the same server hold the same annotation object resources, and the storage paths of those resources in the terminal devices may be the same or different.
As a possible implementation manner, the step of, if there is a terminal device performing data annotation on its collaborative annotation interface, acquiring the annotation sub-object and the corresponding at least one annotation body in the conference system based on that device's data annotation action, and synchronously displaying them in the collaborative annotation interface of each terminal device, may further include:
If a terminal device performs data annotation on its collaborative annotation interface, the annotation sub-object and the corresponding at least one annotation body are determined based on that device's data annotation action, and an annotation instruction is generated, where the annotation instruction includes the annotation sub-object and the corresponding at least one annotation body. The annotation instruction is transmitted to a server, and the server forwards it to the other terminal devices participating in collaborative annotation. Each terminal device that receives the instruction draws and displays the annotation on its own collaborative annotation interface based on the received annotation sub-object and the at least one annotation body, so that all terminal devices participating in collaborative annotation see the same annotated picture.
As a possible implementation manner, in order to facilitate the user to filter the labeling data, the data filtering method further includes:
determining a plurality of labeling subjects corresponding to all labeling sub-objects as a labeling subject list, and displaying the labeling subjects in each collaborative labeling interface;
and determining the target labeling subject in response to the selection operation of at least one labeling subject in the labeling subject list by the user.
In the embodiment of the application, after the annotation sub-objects and the corresponding at least one annotation body in the conference system are determined, all the annotation bodies in the conference system can be obtained, and the plurality of annotation bodies corresponding to all the annotation sub-objects are determined as an annotation body list. Alternatively, the annotation bodies may be preset, and all the preset annotation bodies determined as the annotation body list. The annotation body list is displayed in each collaborative annotation interface, as shown by the list of annotation bodies 1, 2, and 3 in the collaborative annotation interface of fig. 3.
It should be understood that the list of annotation bodies 1, 2, and 3 in fig. 3 is merely exemplary; the user may extend the list according to the hierarchical division and the number of annotation bodies, and the embodiment of the present application does not limit the specific annotation bodies included in the list.
In the embodiment of the application, when the user triggers at least one annotation body in the annotation body list, the terminal device can determine the target annotation body, so as to screen data according to it. The user may select one or more annotation bodies as target annotation bodies.
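Building the body list from the existing annotations might look like the sketch below, where annotations are assumed to be `(sub_object, bodies)` pairs; the representation is an illustrative assumption, not part of the application.

```python
def body_list(annotations):
    """Collect the distinct annotation bodies of all sub-objects,
    preserving first-seen order for display in the interface list."""
    seen = []
    for _sub_object, bodies in annotations:
        for body in bodies:
            if body not in seen:
                seen.append(body)
    return seen

annotations = [("text span 1", ["user-A", "brush"]),
               ("region 2",    ["user-B", "circle"]),
               ("region 3",    ["user-A", "circle"])]
subjects = body_list(annotations)
# The user's selection of one or more entries yields the target bodies.
targets = [s for s in subjects if s in ("user-A", "circle")]
```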
And 204, screening the annotation data corresponding to the target annotation body through the selected target annotation body, and displaying the annotation data in a collaborative annotation interface of the terminal equipment.
In the embodiment of the application, each terminal device participating in collaborative annotation stores the annotation sub-objects and the corresponding at least one annotation body, where a mapping relationship exists between an annotation sub-object and its corresponding annotation bodies. Therefore, by selecting a target annotation body, the annotation sub-objects corresponding to it, such as annotated characters or annotated areas, can be extracted from the data set or database stored in the terminal device.
It should be appreciated that once the annotation sub-objects corresponding to the target annotation body are screened out, the data are displayed in the collaborative annotation interface. For example, if the target annotation bodies selected by user A on terminal device 1 are annotation bodies 1 and 2, the annotation data corresponding to annotation bodies 1 and 2 are displayed in that collaborative annotation interface; if the target annotation bodies selected by user B on terminal device 2 are annotation bodies 3 and 2, the annotation data corresponding to annotation bodies 3 and 2 are displayed in that collaborative annotation interface.
In the embodiment of the application, the collaborative annotation interface can be a graphical interface for a user to view and operate annotation data therein. The user can perform marking, annotating, modifying or deleting and other operations on the annotation data, and can communicate and cooperate with other personnel participating in collaborative annotation in real time.
As a possible implementation manner, the step of filtering the labeling data may be specifically implemented through the following steps 501 and 502.
In step 501, labeling sub-objects corresponding to the same labeling main body are divided into a group, so as to obtain a plurality of labeling data sets.
Step 502, extracting a labeling data set corresponding to the target labeling main body through the selected target labeling main body, and displaying the target labeling main body and the corresponding labeling data set in a collaborative labeling interface of the terminal equipment.
In the embodiment of the application, the labeling sub-objects corresponding to the same labeling main body are divided into a group and stored in the terminal equipment, so that a plurality of labeling data sets can be obtained. For example, the labeling sub-objects corresponding to user A are divided into a group, the labeling sub-objects corresponding to user B are divided into a group, the labeling sub-objects corresponding to pencil labeling are divided into a group, the labeling sub-objects corresponding to circle labeling are divided into a group, and so on.
When a user selects a target annotation body on the terminal device being used, the annotation data set corresponding to the target annotation body can be directly extracted from the plurality of stored data sets. For example, if the selected target annotation body is user A, the annotation data set corresponding to user A is extracted; after extraction, the personal information of user A and the corresponding annotation data set are displayed together in the collaborative annotation interface, so that the user can view them intuitively.
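Steps 501 and 502 can be sketched as a group-then-extract operation. Annotations are again assumed to be `(sub_object, bodies)` pairs; the names are illustrative only.

```python
from collections import defaultdict

def group_by_body(annotations):
    """Step 501: divide sub-objects that share an annotation body into one
    group, yielding one annotation data set per body."""
    groups = defaultdict(list)
    for sub_object, bodies in annotations:
        for body in bodies:
            groups[body].append(sub_object)
    return dict(groups)

annotations = [("note 1",   ["user-A", "pencil"]),
               ("circle 2", ["user-A", "circle"]),
               ("note 3",   ["user-B", "pencil"])]
data_sets = group_by_body(annotations)

# Step 502: selecting target body "user-A" extracts that data set for display.
selected = data_sets["user-A"]
```

Note that a sub-object belonging to several bodies (here, a user and a tool) appears in each of those groups, which is what allows screening by either criterion.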
In one possible implementation manner, extracting, by the selected target labeling body, a labeling data set corresponding to the target labeling body, and displaying the target labeling body and the corresponding labeling data set in the collaborative labeling interface, including:
responding to the selection operation of a user on a target labeling main body, extracting a labeling data set corresponding to the target labeling main body, and displaying the labeling data set in a collaborative labeling interface of self terminal equipment; and/or
Responding to the selection operation of a user on the target labeling main body, generating a screening instruction, wherein the screening instruction comprises the target labeling main body;
and sending the screening instruction to other terminal equipment so that the terminal equipment receiving the screening instruction analyzes the screening instruction to obtain a target labeling main body, and extracting and displaying a labeling data set corresponding to the target labeling main body according to the target labeling main body.
In the embodiment of the application, after the annotation data set corresponding to the target annotation body is extracted, it can be displayed only on the collaborative annotation interface of the local terminal device, without affecting the screening operations of other users.
When shared display is adopted, the selected annotation data sets can be displayed in sequence according to the time at which each terminal device sent its screening instruction.
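A screening instruction and the send-time ordering above can be sketched as follows. The Json fields (`targets`, `sent_at`) are assumed names; the application does not fix a format.

```python
import json

def build_filter_instruction(target_bodies, sent_at):
    """Serialize a screening instruction carrying the target bodies
    and the time it was sent."""
    return json.dumps({"targets": list(target_bodies), "sent_at": sent_at})

def apply_filter_instructions(payloads, data_sets):
    """Parse the received instructions and list the selected data sets
    in the order the instructions were sent."""
    parsed = sorted((json.loads(p) for p in payloads), key=lambda m: m["sent_at"])
    shown = []
    for msg in parsed:
        for target in msg["targets"]:
            shown.append((target, data_sets.get(target, [])))
    return shown

data_sets = {"user-A": ["note 1"], "circle": ["circle 2"]}
payloads = [build_filter_instruction(["circle"], sent_at=2.0),
            build_filter_instruction(["user-A"], sent_at=1.0)]
shown = apply_filter_instructions(payloads, data_sets)
```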
In a possible implementation manner, picture sharing may further be adopted to share the picture of the screened annotation data set displayed at the local end with the other terminal devices, so that when a certain annotation object is explained individually in the conference, all users can see the screened picture, which facilitates explaining the annotation sub-objects.
In the embodiment of the application, the terminal device generates the annotation interface in response to the user triggering the annotation button in the conference main interface, and then, in response to the user triggering the collaborative operation button in the annotation interface, generates the collaborative annotation interface and sends collaborative instructions to the other terminal devices in the conference system. If a terminal device performs data annotation on its collaborative annotation interface, the annotation sub-object and the corresponding at least one annotation body can be acquired based on that device's data annotation action and displayed in the collaborative annotation interface. Finally, the annotation sub-objects corresponding to the target annotation body selected by the user are screened out and displayed in the collaborative annotation interface, so that collaborative annotation data in the conference can be screened effectively, improving conference efficiency and collaboration capability.
Referring to fig. 6, a schematic structural diagram of a collaborative annotation data screening apparatus provided by an embodiment of the present application is shown, and for convenience of explanation, only a portion relevant to the embodiment of the present application is shown.
The collaborative annotation data screening apparatus 6 may specifically include the following modules:
the first trigger response module 601 is configured to generate a labeling interface in response to a triggering operation of a user on a labeling button in a conference main interface, where the labeling interface is used for performing data labeling on any labeling object in the terminal device, and the labeling interface at least includes a collaborative operation button;
The second trigger response module 602 is configured to generate a collaborative annotation interface in response to a trigger operation of the collaborative operation button by a user, and send a collaborative instruction to other terminal devices in the conference system;
the annotation data obtaining module 603 is configured to obtain, if there is a terminal device that performs data annotation on a corresponding collaborative annotation interface, an annotation sub-object and a corresponding at least one annotation main body in the conference system based on a data annotation action of the terminal device, and synchronously display the annotation sub-object and the corresponding at least one annotation main body in the collaborative annotation interface of each terminal device;
the annotation data filtering module 604 is configured to filter, through the selected target annotation body, the annotation sub-objects corresponding to the target annotation body, and display the annotation sub-objects in the collaborative annotation interface.
In the embodiment of the application, the annotation main body comprises an annotation type, an annotation type attribute and an annotation user.
In the embodiment of the application, the terminal devices included in the conference system are connected to the same server, the plurality of terminal devices connected to the same server hold the same labeling object resources, and the storage paths of those resources in the terminal devices may be the same or different; the labeling data obtaining module 603 may specifically include the following sub-modules:
The marking instruction acquisition sub-module is used for forwarding and acquiring a marking instruction sent by the terminal equipment for marking data through the server, wherein the marking instruction comprises marking actions for marking sub-objects;
the labeling instruction analysis sub-module is used for obtaining the labeling action for the labeling sub-object by analyzing the labeling instruction;
the labeling action execution sub-module is used for labeling the labeling sub-objects according to the labeling action, determining the labeling sub-objects and at least one corresponding labeling main body in the conference system, and synchronously displaying the labeling sub-objects and the corresponding at least one labeling main body in a collaborative labeling interface of each terminal device.
In the embodiment of the present application, the collaborative annotation data screening apparatus 6 may further include:
the data group dividing module is used for dividing the labeling sub-objects corresponding to the same labeling main body into a group to obtain a plurality of labeling data groups;
correspondingly, the labeling data filtering module 604 may specifically include the following sub-modules:
the data set extraction sub-module is used for extracting the labeling data set corresponding to the target labeling main body through the selected target labeling main body, and displaying the target labeling main body and the corresponding labeling data set in the collaborative labeling interface.
In the embodiment of the present application, the data set extraction submodule may specifically further include the following units:
The first display unit is used for responding to the selection operation of the user on the target annotation main body, extracting the annotation data set corresponding to the target annotation main body and displaying the annotation data set in the collaborative annotation interface of the terminal equipment. And/or
The second display unit is used for responding to the selection operation of the user on the target labeling main body and generating a screening instruction, wherein the screening instruction comprises the target labeling main body; and sending the screening instruction to other terminal equipment so that the terminal equipment receiving the screening instruction analyzes the screening instruction to obtain a target labeling main body, and extracting and displaying a labeling data set corresponding to the target labeling main body according to the target labeling main body.
In the embodiment of the present application, the collaborative annotation data screening apparatus 6 may further include:
the list determining module is used for determining a plurality of labeling subjects corresponding to all labeling sub-objects as a labeling subject list and displaying the labeling subjects in each collaborative labeling interface;
the main body selection module is used for responding to the selection operation of at least one labeling main body in the labeling main body list by a user and determining a target labeling main body.
In an embodiment of the present application, the second trigger-response module 602 may specifically include the following sub-modules:
The collaboration request sending sub-module is used for responding to the triggering operation of the user on the collaboration operation button, generating a collaboration annotation interface and sending collaboration requests to other terminal devices in the conference system;
the response list acquisition sub-module is used for acquiring a response terminal equipment list based on response information of other terminal equipment;
and the cooperative instruction sending sub-module is used for determining target terminal equipment based on the response terminal equipment list and sending a cooperative instruction to the target terminal equipment.
In the embodiment of the present application, the second trigger-response module 602 may specifically further include the following sub-modules:
and the collaborative instruction broadcasting sub-module is used for responding to the triggering operation of the user on the collaborative operation button, generating a collaborative annotation interface and broadcasting a collaborative instruction to other terminal equipment in the conference system.
In the embodiment of the application, the collaborative instruction at least comprises the information of the marked object and the operation action for the marked object; the second trigger-response module 602 may specifically further include the following sub-modules:
and the operation action transmitting sub-module is used for transmitting a cooperative instruction to other terminal equipment in the conference system, so that the terminal equipment receiving the cooperative instruction parses the cooperative instruction to obtain the information of the marked object and the operation action for the marked object, and executes the operation action on the marked object according to the information of the marked object.
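On the receiving side, parsing a cooperative instruction that carries the marked-object information and the operation action could look like the following sketch; the message layout, object ids, and action names are all hypothetical:

```python
import json

# Local copy of the marked objects held by the receiving terminal.
local_objects = {"doc-1": {"strokes": []}}

def handle_cooperative_instruction(raw):
    """Parse the instruction and replay the operation action locally."""
    inst = json.loads(raw)
    obj = local_objects[inst["object_id"]]   # locate the marked object
    if inst["action"] == "add_stroke":       # replay the operation action
        obj["strokes"].append(inst["payload"])
    elif inst["action"] == "clear":
        obj["strokes"].clear()

msg = json.dumps({"object_id": "doc-1",
                  "action": "add_stroke",
                  "payload": {"points": [[0, 0], [5, 5]], "color": "red"}})
handle_cooperative_instruction(msg)
```

Because every terminal replays the same action on its own copy, the annotations stay synchronized without shipping rendered pixels between terminals.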
The collaborative annotation data screening apparatus 6 provided in the embodiments of the present application may be applied to the foregoing method embodiments; for details, refer to the description of the foregoing method embodiments, which is not repeated herein.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device 700 of this embodiment includes: at least one processor 710 (only one shown in fig. 7), a memory 720, and a computer program 721 stored in the memory 720 and executable on the at least one processor 710, the processor 710 implementing the steps in the co-noted data screening method embodiments described above when the computer program 721 is executed.
The terminal device 700 may be a computing device such as a desktop computer, a notebook computer, a palm computer, or a cloud server. The terminal device may include, but is not limited to, a processor 710 and a memory 720. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 700 and is not limiting of it; the terminal device may include more or fewer components than shown, may combine certain components, or may have different components, and may, for example, also include input-output devices, network access devices, etc.
The processor 710 may be a central processing unit (Central Processing Unit, CPU), and the processor 710 may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 720 may in some embodiments be an internal storage unit of the terminal device 700, such as a hard disk or a memory of the terminal device 700. The memory 720 may also, in other embodiments, be an external storage device of the terminal device 700, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash memory Card (Flash Card) equipped on the terminal device 700. Further, the memory 720 may also include both an internal storage unit and an external storage device of the terminal device 700. The memory 720 is used to store an operating system, application programs, a boot loader (Boot Loader), data, and other programs, such as program codes of the computer program. The memory 720 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated; in practical application, the above functions may be distributed to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing them from each other, and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, where the computer program may be stored in a computer readable storage medium, and when executed by a processor, may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be appropriately adjusted according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
The present application may also be implemented by a computer program product for implementing all or part of the steps of the above embodiments of the method, when the computer program product is run on a terminal device, for enabling the terminal device to execute the steps of the above embodiments of the method.
The above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical schemes described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (13)

1. The collaborative annotation data screening method is characterized by being applied to any terminal equipment in a conference system, wherein the conference system comprises a plurality of terminal equipment, each terminal equipment at least comprises a conference main interface, the conference main interface is used for providing a plurality of conference function buttons, and the conference function buttons at least comprise annotation buttons; the data screening method comprises the following steps:
Responding to the triggering operation of a user on a labeling button in the conference main interface, generating a labeling interface, wherein the labeling interface is used for carrying out data labeling on any labeling object in terminal equipment, and at least comprises a cooperative operation button;
responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface and sending a cooperative instruction to other terminal equipment in the conference system;
if the terminal equipment performs data annotation on the corresponding collaborative annotation interface, acquiring an annotation sub-object and at least one corresponding annotation main body in the conference system based on the data annotation action of the terminal equipment, and synchronously displaying the annotation sub-object and the corresponding at least one annotation main body in the collaborative annotation interface of each terminal equipment;
and screening the labeling sub-objects corresponding to the target labeling main body through the selected target labeling main body, and displaying the labeling sub-objects in the collaborative labeling interface.
2. The data screening method according to claim 1, wherein the annotation body comprises an annotation type, an annotation type attribute, and an annotation user.
3. The data screening method according to claim 2, wherein the terminal devices included in the conference system are connected to the same server, the plurality of terminal devices connected to the same server have the same labeling object resources, and the storage paths of the labeling object resources in the terminal devices are the same or different; the data labeling action based on the terminal equipment obtains a labeling sub-object and at least one corresponding labeling main body in the conference system, and synchronously displays the labeling sub-object and the corresponding at least one labeling main body in the collaborative labeling interface of each terminal equipment, wherein the collaborative labeling interface comprises the following steps:
The method comprises the steps of forwarding and obtaining a labeling instruction sent by terminal equipment for data labeling through a server, wherein the labeling instruction comprises labeling actions for the labeling sub-objects;
obtaining the labeling action of the labeling sub-object by analyzing the labeling instruction;
and marking the marking sub-object according to the marking action, determining the marking sub-object and at least one marking main body corresponding to the marking sub-object in the conference system, and synchronously displaying the marking sub-object and the marking main body in the collaborative marking interface of each terminal device.
4. The data screening method according to claim 2, wherein after obtaining the labeling sub-object and the corresponding at least one labeling main body in the conference system, the method further comprises:
dividing the labeling sub-objects corresponding to the same labeling main body into a group to obtain a plurality of labeling data groups;
screening the labeling sub-objects corresponding to the target labeling main body through the selected target labeling main body, and displaying the labeling sub-objects in the collaborative labeling interface, wherein the method comprises the following steps:
extracting a labeling data set corresponding to the target labeling main body through the selected target labeling main body, and displaying the target labeling main body and the corresponding labeling data set in the collaborative labeling interface.
5. The data screening method according to claim 4, wherein the extracting, by the selected target labeling body, the labeling data set corresponding to the target labeling body, and displaying the target labeling body and the corresponding labeling data set in the collaborative labeling interface includes:
responding to the selection operation of a user on a target labeling main body, extracting a labeling data set corresponding to the target labeling main body, and displaying the labeling data set in the collaborative labeling interface of the terminal equipment; and/or
Responding to the selection operation of a user on a target labeling main body, generating a screening instruction, wherein the screening instruction comprises the target labeling main body;
and sending the screening instruction to other terminal equipment, so that the terminal equipment receiving the screening instruction parses the screening instruction to obtain the target labeling main body, and extracts and displays the labeling data set corresponding to the target labeling main body.
6. The data screening method of claim 1, wherein the data screening method further comprises:
determining a plurality of labeling subjects corresponding to all labeling sub-objects as a labeling subject list, and displaying the labeling subjects in each collaborative labeling interface;
And determining a target labeling subject in response to a user selection operation of at least one labeling subject in the labeling subject list.
7. The data screening method according to claim 1, wherein the generating a collaborative annotation interface in response to a triggering operation of the collaborative operation button by a user and sending a collaborative instruction to other terminal devices in the conference system includes:
responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface and sending a cooperative request to other terminal equipment in the conference system;
acquiring a response terminal equipment list based on response information of the other terminal equipment;
and determining target terminal equipment based on the response terminal equipment list, and sending a cooperative instruction to the target terminal equipment.
8. The data screening method according to claim 1, wherein the generating a collaborative annotation interface in response to a triggering operation of the collaborative operation button by a user and sending a collaborative instruction to other terminal devices in the conference system further comprises:
and responding to the triggering operation of the user on the cooperative operation button, generating a cooperative annotation interface, and broadcasting the cooperative instruction to other terminal devices in the conference system.
9. The data screening method according to claim 7 or 8, wherein the collaborative instruction includes at least annotation object information and an operation action for the annotation object; the sending of the collaboration instruction to other terminal devices in the conference system further includes:
sending a cooperative instruction to other terminal equipment in the conference system, so that the terminal equipment receiving the cooperative instruction parses the cooperative instruction to obtain the labeling object information and the operation action for the labeling object, and executes the operation action on the labeling object according to the labeling object information.
10. The collaborative annotation data screening device is characterized by being applied to any terminal equipment in a conference system, wherein the conference system comprises a plurality of terminal equipment, each terminal equipment at least comprises a conference main interface, the conference main interface is used for providing a plurality of conference function buttons, and the conference function buttons at least comprise annotation buttons; the data screening device comprises:
the first trigger response module is used for responding to the trigger operation of a user on the marking button in the conference main interface and generating a marking interface, wherein the marking interface is used for marking data of any marking object in the terminal equipment, and at least comprises a cooperative operation button;
The second trigger response module is used for responding to the trigger operation of the user on the cooperative operation button, generating a cooperative annotation interface and sending a cooperative instruction to other terminal devices in the conference system;
the annotation data acquisition module is used for acquiring an annotation sub-object and at least one corresponding annotation main body in the conference system based on the data annotation action of the terminal equipment if the terminal equipment performs data annotation on the corresponding collaborative annotation interface, and synchronously displaying the annotation sub-object and the corresponding at least one annotation main body in the collaborative annotation interface of each terminal equipment;
and the annotation data screening module is used for screening the annotation sub-objects corresponding to the target annotation main body through the selected target annotation main body and displaying the annotation sub-objects in the collaborative annotation interface.
11. A conference system, characterized in that the conference system comprises a server and a plurality of terminal devices, wherein the terminal devices are respectively connected with the server;
the server is used for receiving the labeling sub-object and at least one corresponding labeling main body of any terminal equipment and sending the labeling sub-object and the corresponding at least one labeling main body to other terminal equipment;
the terminal equipment is used for responding to the data labeling action of the user on the collaborative labeling interface, generating a labeling sub-object and at least one corresponding labeling main body, and sending the labeling sub-object and the at least one corresponding labeling main body to the server, so that the server sends the labeling sub-object and the at least one corresponding labeling main body to other terminal equipment in the conference system, and a plurality of terminal equipment screen out the corresponding labeling sub-object through the labeling main body.
12. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the computer program.
13. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 8.
CN202310612709.6A 2023-05-26 2023-05-26 Collaborative annotation data screening method and device, conference system, terminal and medium Pending CN116701779A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310612709.6A CN116701779A (en) 2023-05-26 2023-05-26 Collaborative annotation data screening method and device, conference system, terminal and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310612709.6A CN116701779A (en) 2023-05-26 2023-05-26 Collaborative annotation data screening method and device, conference system, terminal and medium

Publications (1)

Publication Number Publication Date
CN116701779A true CN116701779A (en) 2023-09-05

Family

ID=87830436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310612709.6A Pending CN116701779A (en) 2023-05-26 2023-05-26 Collaborative annotation data screening method and device, conference system, terminal and medium

Country Status (1)

Country Link
CN (1) CN116701779A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117408655A * 2023-12-13 2024-01-16 国网浙江省电力有限公司金华供电公司 Financial tax data management method and platform based on full-service view angle
CN117408655B * 2023-12-13 2024-03-05 国网浙江省电力有限公司金华供电公司 Financial tax data management method and platform based on full-service view angle

Similar Documents

Publication Publication Date Title
CN109104354B (en) Grouping and establishing method and equipment thereof
US10496354B2 (en) Terminal device, screen sharing method, and screen sharing system
CN112350923A (en) Session message display method and device, computer equipment and storage medium
CN107102786B (en) Information processing method and client
CN104166514A (en) Information processing apparatus, information processing system, and information display method
CN112231463A (en) Session display method and device, computer equipment and storage medium
CN112333082B (en) Message display method and device
CN114124861A (en) Message group sending method and device, computer equipment and storage medium
CN111767396A (en) Data processing method, device, equipment and computer readable storage medium
WO2023016536A1 (en) Interaction method, apparatus and device, and storage medium
CN116701779A (en) Collaborative annotation data screening method and device, conference system, terminal and medium
CN112947807A (en) Display method and device and electronic equipment
CN110647827A (en) Comment information processing method and device, electronic equipment and storage medium
CN113891105A (en) Picture display method and device, storage medium and electronic equipment
CN112230821A (en) Session display method and device, computer equipment and storage medium
CN110109594B (en) Drawing data sharing method and device, storage medium and equipment
CN109697129A (en) A kind of information sharing method, equipment and computer readable storage medium
CN112269504B (en) Information display method and device and electronic equipment
CN113037925A (en) Information processing method, information processing apparatus, electronic device, and readable storage medium
WO2023179549A1 (en) Document block sharing method, apparatus and system, and storage medium
CN111178846A (en) Workflow file generation method, device, equipment and storage medium
CN109120783A (en) Information acquisition method and device, mobile terminal and computer readable storage medium
CN114221923B (en) Message processing method and device and electronic equipment
CN110765296A (en) Image searching method, terminal device and storage medium
CN113852540B (en) Information transmission method, information transmission device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 01A, 12th Floor, No. 8 Caihefang Road, Haidian District, Beijing, 100000

Applicant after: PIXELHUE TECHNOLOGY Ltd.

Address before: 100000 612, floor 5, No. 15, West Fourth Ring North Road, Haidian District, Beijing

Applicant before: PIXELHUE TECHNOLOGY Ltd.
