WO2021218680A1 - Interaction information processing method and apparatus, electronic device, and storage medium

Interaction information processing method and apparatus, electronic device, and storage medium

Info

Publication number
WO2021218680A1
Authority
WO
WIPO (PCT)
Prior art keywords
interactive
target
interactive information
user
interaction
Application number
PCT/CN2021/088044
Other languages
English (en)
Chinese (zh)
Inventor
陈可蓉 (Chen Kerong)
韩晓 (Han Xiao)
赵立 (Zhao Li)
杨晶生 (Yang Jingsheng)
史寅 (Shi Yin)
Original Assignee
北京字节跳动网络技术有限公司 (Beijing ByteDance Network Technology Co., Ltd.)
Application filed by 北京字节跳动网络技术有限公司 (Beijing ByteDance Network Technology Co., Ltd.)
Priority to EP21796877.5A (EP4124025A4)
Priority to JP2022564172A (JP7462070B2)
Publication of WO2021218680A1
Priority to US17/883,877 (US20220391058A1)

Classifications

    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 40/30 - Handling natural language data; Semantic analysis
    • G06F 3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G10L 15/005 - Speech recognition; Language recognition
    • G10L 15/26 - Speech recognition; Speech to text systems
    • G10L 15/30 - Speech recognition; Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
    • G10L 17/00 - Speaker identification or verification techniques
    • G10L 17/06 - Speaker identification or verification; Decision making techniques; Pattern matching strategies
    • G10L 17/22 - Speaker identification or verification; Interactive procedures; Man-machine interfaces
    • H04N 21/2187 - Selective content distribution, e.g. interactive television or video on demand [VOD]; Live feed
    • H04N 21/233 - Selective content distribution; Processing of audio elementary streams
    • H04N 7/15 - Television systems; Two-way working; Conference systems

Definitions

  • the embodiments of the present disclosure relate to the field of computer data processing technology, and in particular to an interactive information processing method, device, electronic equipment, and storage medium.
  • the server can obtain the voice information of some users and the text information published by all users, and play and display the voice information and text information after processing.
  • the embodiments of the present disclosure provide an interactive information processing method, device, electronic equipment, and storage medium, so as to realize effective processing and display of information in an interactive scene, so as to improve communication efficiency.
  • embodiments of the present disclosure provide an interactive information processing method, including:
  • interactive information records are generated based on the user's interactive behavior data
  • the target interactive information and other interactive information elements in the interactive information record are displayed in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • embodiments of the present disclosure also provide an interactive information processing device, including:
  • the interactive information recording module is used to generate interactive information records based on the user's interactive behavior data during the user's interaction based on the real-time interactive interface;
  • the interactive information screening module is used to screen out the target interactive information that meets the target screening conditions from the interactive information record;
  • the interactive information distinguishing display module is used to display the target interactive information and other interactive information elements in the interactive information record in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • embodiments of the present disclosure also provide an electronic device, the electronic device including:
  • one or more processors;
  • a storage device for storing one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the interactive information processing method provided by any embodiment of the present disclosure.
  • embodiments of the present disclosure also provide a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform the interactive information processing method provided by any embodiment of the present disclosure.
  • the technical solutions of the embodiments of the present disclosure can, by setting screening conditions, effectively screen the interactive information records produced during the user's real-time interaction and display them in a differentiated manner, so that interacting users can selectively obtain interactive information, thereby improving interaction efficiency.
  • FIG. 1 is a schematic flowchart of an interactive information processing method provided by Embodiment 1 of the present disclosure
  • FIG. 2 is a schematic flowchart of an interactive information processing method provided by Embodiment 2 of the present disclosure;
  • FIG. 3 is a schematic flowchart of an interactive information processing method provided by Embodiment 3 of the present disclosure;
  • FIG. 4 is a schematic flowchart of an interactive information processing method provided by Embodiment 4 of the present disclosure;
  • FIG. 5 is a schematic structural diagram of an interactive information processing device provided by Embodiment 5 of the present disclosure.
  • FIG. 6 is a schematic structural diagram of an interactive information processing device provided by Embodiment 6 of the present disclosure.
  • FIG. 7 is a schematic structural diagram of an electronic device provided by Embodiment 7 of the present disclosure.
  • FIG. 1 is a schematic flow chart of an interactive information processing method provided by Embodiment 1 of the present disclosure.
  • the embodiment of the present disclosure is suitable for processing and displaying information interacted by users in a real-time interactive application scenario supported by the Internet.
  • the method can be implemented by an interactive information processing device, which can be implemented in the form of software and/or hardware and, optionally, by an electronic device, which can be a mobile terminal, a PC (personal computer), a server, or the like.
  • Real-time interactive application scenarios can usually be implemented by the client and the server.
  • the method provided in this embodiment can be executed by the client, the server, or both.
  • the method includes:
  • S110. During the user's interaction based on the real-time interactive interface, an interactive information record is generated based on the user's interactive behavior data.
  • the real-time interactive interface is any interactive interface in the real-time interactive application scenario.
  • Real-time interactive application scenarios can be implemented through the Internet and computer technology, such as interactive applications implemented through native programs or web (web page) programs.
  • multiple users may be allowed to interact in various forms of interactive behaviors, such as inputting text, voice, video, or sharing content objects.
  • an interactive information record can be generated based on the user's interactive behavior data.
  • the interactive behavior data may include various data involved in the user's interactive behavior, such as the type of interactive behavior and the specific content involved in the interactive behavior.
  • the interactive information record can be processed and generated by each client itself, or it can be generated by the server based on the interactive behavior data of each user and pushed to the client to obtain it.
  • S120 Filter out the target interactive information that meets the target screening condition from the interactive information record.
  • the target screening conditions can be manually input by the user or generated automatically.
  • the target filter condition is any condition that can distinguish the interactive information records. For example, at least one of the type, content, time, and user of the interactive behavior can be used as the target filter condition.
  • the target filtering conditions of each client can be different.
  • Each client can filter interactive information records based on local target filter conditions.
  • the server performs different screening processing on the interactive information records, and then pushes the screened target interactive information to the corresponding client.
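
As an illustrative aside, the sketch below models S110/S120 in Python: an interactive information record as a list of elements, and screening as a predicate over those elements. The data model (`InteractiveElement`, `filter_record`) and all field names are hypothetical; the disclosure does not prescribe any concrete structure.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class InteractiveElement:
    """One interactive information element, e.g. a caption line or a share event."""
    kind: str         # type of interactive behavior: "speech", "share", ...
    user: str         # the user who produced the element
    content: str      # recognized text or a description of shared content
    timestamp: float  # seconds since the interaction started

def filter_record(record: List[InteractiveElement],
                  condition: Callable[[InteractiveElement], bool]) -> List[InteractiveElement]:
    """Screen out the target interactive information that meets the screening condition."""
    return [element for element in record if condition(element)]

record = [
    InteractiveElement("speech", "moderator", "agenda item one", 3.2),
    InteractiveElement("share", "alice", "shared document: roadmap.docx", 8.0),
    InteractiveElement("speech", "bob", "the agenda looks fine", 9.5),
]
# Example target screening condition: only the moderator's speech.
targets = filter_record(record, lambda e: e.kind == "speech" and e.user == "moderator")
```
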
  • S130. Display the target interactive information and other interactive information elements in the interactive information record in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • the interactive information record can be displayed in the target area of the real-time interactive interface.
  • the target area may be, for example, an area around the main interaction area, and may be the top, bottom, or side edges.
  • for example, if the video interaction window is the main interaction area and occupies 2/3 of the screen area, then the area where the interactive information record is displayed may be a certain sub-area within the remaining 1/3 area at the side.
  • the interactive information record can include a variety of interactive information elements.
  • the interactive information elements correspond to the user’s interactive behavior data.
  • for a voice interaction behavior, the speaker's name and the caption text are different interactive information elements;
  • for a sharing behavior, the sharer, the type of shared content, and the specific content being shared are all different interactive information elements.
  • when the target interactive information is obtained by screening, the target interactive information itself is also one or more interactive information elements in the interactive information record. When displayed, it can be displayed differently from other interactive information elements, thereby highlighting the screened-out target interactive information and allowing users to find it more intuitively and conveniently.
  • the differentiated display may show only the target interactive information without displaying other interactive information elements, or it may use a distinguishing display format such as different colors, fonts, or background patterns.
  • the technical solution of this embodiment can effectively filter the interactive information records in the user's real-time interaction process and display them separately by setting the filter conditions, so that the interactive user can selectively obtain the interactive information, thereby improving the interaction efficiency.
  • the technical solutions of the embodiments of the present disclosure are applicable to various real-time interactive application scenarios, especially multimedia real-time interactive scenarios.
  • the real-time interactive interface may be, for example, a video conference interactive interface, a live video interactive interface, or a group chat interactive interface.
  • the interactive behavior of these real-time interactive scenes can be in one or more forms, for example, including text, voice, video, whiteboard, and shared content.
  • in such scenarios the display speed is fast and the display dwell time is short, and users may also be unfamiliar with the display mode, so it is preferable to convert the interactive behavior data into a static interactive information record as an auxiliary display.
  • a static interactive information record can record multimedia interactive data through text, pictures, and the like.
  • the interactive behavior data includes user voice data, and the content of the interactive information record includes text data recognized from the user voice data; or, the interactive behavior data includes user operation behavior data, and the content of the interactive information record includes text data recognized from the user operation behavior data; or, the interactive behavior data includes user voice data and user operation behavior data, and the content of the interactive information record includes text data recognized from the user voice data and text data recognized from the user operation behavior data.
  • the text data recognized from the user voice data is equivalent to the subtitle text of the user's speech.
  • typical user operation behavior data may include sharing behaviors and shared content.
  • the sharing behavior is a type of operation behavior that presents shared content to each user.
  • the shared content includes shared documents, shared screens, and/or web page links. Both the types of operation behaviors and the shared content can be converted into interactive information records.
  • user operation behaviors are not limited to shared content, and may include, for example, behaviors such as writing on a whiteboard.
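
A minimal sketch of how the two kinds of interactive behavior data described above could be converted into record entries. `recognize_speech` is a stand-in for a real speech-to-text engine, and the dictionary keys are assumptions, not part of the disclosure.

```python
from typing import Dict, List

def recognize_speech(audio: bytes) -> str:
    """Placeholder for a speech-to-text service; a real system would call an ASR engine."""
    return "<recognized caption text>"

def to_record_entries(behavior: Dict) -> List[Dict]:
    """Convert one piece of interactive behavior data into interactive-information-record entries."""
    if behavior["type"] == "voice":
        # Voice data becomes subtitle-like text attributed to the speaking user.
        return [{"kind": "speech", "user": behavior["user"],
                 "content": recognize_speech(behavior["audio"])}]
    if behavior["type"] == "share":
        # An operation behavior is recorded as its type plus the shared content object.
        return [{"kind": "share", "user": behavior["user"],
                 "content": f"{behavior['share_type']}: {behavior['object']}"}]
    return []  # other behaviors (e.g. whiteboard writing) would be handled analogously

entries = to_record_entries({"type": "share", "user": "alice",
                             "share_type": "shared document", "object": "roadmap.docx"})
```
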
  • FIG. 2 is a schematic flowchart of an interactive information processing method provided by Embodiment 2 of the present disclosure. This embodiment is based on the foregoing embodiment, and provides specific options for the method and content of determining target screening conditions. The method of this embodiment specifically includes:
  • S210. During the user's interaction based on the real-time interactive interface, an interactive information record is generated based on the user's interactive behavior data.
  • S220. Acquire the target screening condition input by the user through a filter control or search control of the real-time interactive interface, wherein the filter control includes at least one of a filter list, a condition input box, and an option label.
  • the target screening conditions include at least one of the following conditions: content keywords, voice data speaking users, speaking user activity levels, operation behavior types, and operation behavior content objects.
  • the content keywords may be specific to the content of various interactive information elements, for example, the name of the speaker, the text of the content of the speaker's speech, or the text of the shared content.
  • the speaking user of the voice data is a screening condition determined from the user dimension, allowing attention to be focused only on the content of one or a few speakers, for example, only on the moderator's speech.
  • the activity level of speaking users is also a filtering condition determined from the user dimension.
  • the activity of speaking users generally refers to the activity of users in implementing interactive behaviors, which can be the frequency, quantity, or quality of one interactive behavior or several interactive behaviors.
  • the evaluation indices of activity may include: the number of speeches per unit time, the number of shared content items, the length of speech content, and the validity of substantial meaning.
  • the activity level of each speaking user can be determined according to the quantitative index, so that the speaking users with a high activity level can be selectively paid attention to.
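
To make the activity indices above concrete, here is one possible way to combine them into a single activity value; the weights and the threshold are purely illustrative, since the disclosure names the indices but no formula.

```python
from typing import Dict

def activity_score(stats: Dict[str, float], duration_min: float) -> float:
    """Combine the quantitative indices named above into a single activity value."""
    speeches_per_min = stats["speech_count"] / max(duration_min, 1.0)
    # Illustrative weights; any monotone combination of the indices would do.
    return (0.5 * speeches_per_min
            + 0.3 * stats["share_count"]
            + 0.2 * stats["speech_chars"] / 1000.0)

stats_by_user = {
    "alice": {"speech_count": 24, "share_count": 2, "speech_chars": 5400},
    "bob":   {"speech_count": 3,  "share_count": 0, "speech_chars": 400},
}
scores = {user: activity_score(s, duration_min=30.0) for user, s in stats_by_user.items()}
# Target screening condition: users above a preset activity threshold.
active_users = [user for user, score in scores.items() if score > 1.0]
```
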
  • the operation behavior type is a filtering condition determined from the type dimension of the interaction behavior, so that the user can selectively obtain the information records of a certain type or several types of operation behavior, for example, only the interaction information of the shared content can be obtained.
  • the content object of the operation behavior is a filtering condition determined from the dimension of the interactive behavior object.
  • for example, when the content object of a sharing behavior is a document, that document can be filtered out and displayed separately.
  • the target screening condition can be determined through the user's manual input; specifically, it can be obtained through a filter control or a search control set on the real-time interactive interface.
  • taking voice data speaking users as an example, all speaking users can be displayed in the form of a filter list for users to click and filter.
  • search controls such as a search input bar, can be provided for users to enter content keywords to determine target filtering conditions.
  • taking the activity level of speaking users as an example, it can be presented in the form of multiple activity-level option labels or a condition input box for the user to choose from.
  • the content of the filter control can be dynamically generated based on the real-time interaction process or the content recorded by the interaction information. For example, with the increase or decrease of speaking users, the filtering list of speaking users can be increased or decreased correspondingly; according to the increase of interactive behavior types, label options of the types of operation behaviors that can be filtered can be added.
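
A small sketch of such dynamic generation: recomputing the filter-control options from the current record so that the speaking-user list and the behavior-type labels grow and shrink with the interaction. The field names are hypothetical.

```python
from typing import Dict, List

def filter_control_options(record: List[Dict]) -> Dict[str, List[str]]:
    """Rebuild the filter-control option lists from the current interactive information record."""
    return {
        "speaking_users": sorted({e["user"] for e in record if e["kind"] == "speech"}),
        "behavior_types": sorted({e["kind"] for e in record}),
    }

record = [
    {"kind": "speech", "user": "moderator", "content": "welcome"},
    {"kind": "share",  "user": "alice",     "content": "shared screen"},
]
options = filter_control_options(record)
# {'speaking_users': ['moderator'], 'behavior_types': ['share', 'speech']}
```
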
  • S230 Filter out the target interaction information that meets the target screening condition from the interaction information record.
  • S240. Display the target interactive information and other interactive information elements in the interactive information record in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • the technical solution of this embodiment can provide a wealth of controls in a real-time interactive interface for users to conveniently input target screening conditions.
  • FIG. 3 is a schematic flowchart of an interactive information processing method provided by Embodiment 3 of the present disclosure. This embodiment is based on the foregoing embodiment and provides another specific implementation manner for determining target screening conditions. The method includes:
  • S310. During the user's interaction based on the real-time interactive interface, an interactive information record is generated based on the user's interactive behavior data.
  • S320. When a set trigger condition is detected, determine the target screening condition.
  • Optionally, a trigger condition can be set so that the target screening condition is determined automatically, thereby further reducing the degree of the user's manual participation and providing the user with intelligent services.
  • the trigger condition can be set from multiple dimensions according to requirements; for example, it can include at least one of the following: a preset time point in the interaction process is reached, the user's voice data is detected to include indicative voice data, and the language type of the voice data used by a speaking user is different from the language type of the current client.
  • the preset time point may be an absolute time point or a relative time point.
  • the absolute time point may refer to an exact time; for example, reaching a set minute of the interaction triggers the determination of the target screening condition.
  • the relative time point may be determined according to an interval duration; for example, the determination of the target screening condition may be triggered once every set interval.
  • the user's voice data includes indicative voice data, which means that the determination of the target screening condition can be triggered by the user's voice instruction.
  • the indicative voice data can be an explicit instruction with set wording, or the user's screening intention can be determined through intention recognition.
  • the explicit instruction with set wording can be an instruction with a set sentence structure, for example, "I want to see the text of user XX's speech". Intention recognition can determine the user's screening intent more intelligently; for example, "I can't hear user XX" indicates a screening intent to view the content of user XX's speech and can then trigger the determination of the target screening condition.
  • the client may be used by one user, or may be used by multiple users at the same time, and the type of language used by the speaking user corresponding to the client may be one or more.
  • when the language of a non-local speaking user is different from the language type of the local speaking user, the trigger condition can be considered met, and the local user may need to filter the speech content in the other, different languages for focused viewing.
  • a different language refers to a language that differs from the language type of the local speaking user. For example, if the language type of the local speaking user is Chinese, then any language other than Chinese can be called a different language.
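
The three trigger conditions discussed above could be checked as follows; this is a sketch in which a toy regular expression stands in for real intention recognition, and all field names (`text`, `language`) are assumptions.

```python
import re
from typing import Dict, Optional

# Toy intention patterns; a real system would use an NLU/intention-recognition model.
INDICATIVE = re.compile(r"(?:can't hear|i want to see)\s+(?P<who>\w+)", re.IGNORECASE)

def detect_trigger(event: Dict, local_language: str, elapsed_s: float,
                   interval_s: float = 300.0) -> Optional[Dict]:
    """Return a target screening condition once a set trigger condition is detected."""
    # 1. Relative preset time point: trigger once every interval_s seconds.
    if elapsed_s > 0 and elapsed_s % interval_s < 1.0:
        return {"type": "time", "at": elapsed_s}
    # 2. Indicative voice data: the recognized speech expresses a screening intent.
    match = INDICATIVE.search(event.get("text", ""))
    if match:
        return {"type": "speaking_user", "user": match.group("who")}
    # 3. A speaking user's language differs from the current client's language type.
    if event.get("language") and event["language"] != local_language:
        return {"type": "language", "different_from": local_language}
    return None

condition = detect_trigger({"text": "I can't hear alice"}, local_language="zh", elapsed_s=42.0)
# {'type': 'speaking_user', 'user': 'alice'}
```
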
  • S330 Filter out the target interaction information that meets the target screening condition from the interaction information record.
  • S340. Display the target interactive information and other interactive information elements in the interactive information record in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • determining the target screening condition may specifically include: in the user's real-time interaction process, when the set trigger condition is detected based on the collected interactive behavior data, determining the target screening condition. That is, the interactive behavior data is used to determine whether the trigger condition is met; the user's interactive behavior data usually carries the user's likely needs or intentions as to which information deserves attention. Therefore, determining the trigger condition based on the interactive behavior data can more accurately determine the target screening condition for the user.
  • determining the target screening conditions may specifically include: determining the target screening conditions based on the collected interactive behavior data. That is, the interactive behavior data can not only determine whether there is a trigger condition to determine the target filter condition, but also can be used to determine the content of the target filter condition.
  • determining the trigger condition and/or determining the target screening condition based on the interactive behavior data may include multiple situations.
  • determining the target screening condition includes at least one of the following:
  • the user's current activity value can be determined based on the interactive behavior data; then, when users' current activity values reach a preset standard, users whose current activity value is higher than the preset activity value can be used as the target screening condition, so as to achieve the effect of displaying only the interactive content of active users.
  • the preset standard may be, for example, that there are N to M users with a high activity level, where N and M are both preset positive integers.
  • differences in language type may cause communication barriers between users; therefore, the target screening condition can be determined based on the language type, which may specifically include:
  • the target language types preset by each client are collected, and other language types that are different from the target language types are used as target screening conditions.
  • each client can identify and filter the languages of other users for the speaking users on the local end.
  • the method may further include: performing voiceprint recognition on the voice data of each user to determine the speaking user to which the voice data belongs.
  • the client has a corresponding client account or client ID to distinguish different clients.
  • voiceprint recognition can further be performed on each user's voice data; since each person's voice has a unique voiceprint, different users can be distinguished accordingly and marked as, for example, client ID-user A and client ID-user B, so as to distinguish different speaking users under the same client.
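
A minimal sketch of the "client ID-user" labeling just described, assuming voiceprint embeddings have already been extracted by some upstream model (the embedding extraction itself is out of scope here); the cosine threshold is illustrative.

```python
import math
from typing import Dict, List

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def label_speaker(embedding: List[float], client_id: str,
                  known: Dict[str, List[float]], threshold: float = 0.8) -> str:
    """Map a voiceprint embedding to a 'clientID-user' label for one client.

    A matching voiceprint reuses its label; an unseen voiceprint registers a new
    user under the same client ID, so two people sharing one client stay distinct.
    """
    for label, reference in known.items():
        if cosine(embedding, reference) >= threshold:
            return label
    label = f"{client_id}-user{chr(ord('A') + len(known))}"
    known[label] = list(embedding)
    return label

known: Dict[str, List[float]] = {}
label_speaker([0.9, 0.1], "client42", known)  # 'client42-userA'
label_speaker([0.1, 0.9], "client42", known)  # 'client42-userB' (a different voiceprint)
```
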
  • the substantial meaning of interactive behavior data can be embodied in semantics or the meaning of behavior results.
  • for information input by voice or text, its natural semantics can be recognized.
  • information without effective natural semantics can be regarded as invalid in terms of the substantial meaning of the data, while information with valid semantics can be regarded as valid. If the substantial meaning of the data is detected to be valid, the determination of the target screening condition can be triggered, and the valid information is retained as the screening object; that is, valid interactive information is what the screening retains.
  • the basis for judging validity can be set or determined according to specific interaction scenarios and actual needs during the interaction process.
  • for example, if the voice is blurred and cannot be effectively recognized, such a message can be determined to be invalid information; accordingly, information other than such messages is regarded as valid information.
  • the meaning of behavior results can be reflected in whether certain user operations achieve actual behavior results. For example, if the user quickly withdraws after sharing the wrong content, this kind of interactive behavior does not constitute a valid behavior result, and the corresponding interactive information can be regarded as invalid. If the user performs a document-sharing operation and provides a specific explanation or further elaboration of the shared content, this kind of interactive behavior constitutes an effective behavior result, and this kind of operation can be valid as interactive information. At this time, the determination of the target screening condition can be triggered, that is, interactive behavior data with substantial meaning is retained.
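
One possible heuristic rendering of the two validity cases just described (speech with no recognizable semantics, and a share withdrawn almost immediately); the rules and thresholds are assumptions for illustration only.

```python
import re
from typing import Dict

def has_substantial_meaning(entry: Dict, min_share_lifetime_s: float = 5.0) -> bool:
    """Heuristic validity check for the two cases described above."""
    if entry["kind"] == "speech":
        # Blurred audio often yields no recognizable words; require some real text
        # (Latin letters or CJK characters) in the recognized content.
        return bool(re.search(r"[A-Za-z\u4e00-\u9fff]{2,}", entry.get("content", "")))
    if entry["kind"] == "share":
        # A share withdrawn almost immediately has no valid behavior result.
        lifetime = entry.get("withdrawn_at", float("inf")) - entry["timestamp"]
        return lifetime > min_share_lifetime_s
    return True

valid = has_substantial_meaning({"kind": "share", "timestamp": 10.0, "withdrawn_at": 11.5})
# False: withdrawn after 1.5 s, below the illustrative 5 s threshold
```
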
  • the technical solution of this embodiment of the present disclosure can automatically determine the timing and content of the target screening condition through the user's interactive behavior data, making the screening operation more intelligent, and can display the interactive information of interest in a differentiated manner while interfering with user interaction as little as possible.
  • FIG. 4 is a schematic flowchart of an interactive information processing method provided by Embodiment 4 of the present disclosure. This embodiment is based on the foregoing embodiment, and further provides a way of distinguishing and displaying target interaction information.
  • the method includes:
  • S410. During the user's interaction based on the real-time interactive interface, an interactive information record is generated based on the user's interactive behavior data.
  • S420 Filter out the target interaction information that meets the target screening condition from the interaction information record.
  • S430. Display the target interactive information and other interactive information elements in the interactive record in a corresponding display manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record, and the display manner corresponds to the screening type in the target screening condition.
  • a display manner corresponding to the screening type of the target screening condition is optionally adopted for the target interactive information. The display manner can thus further highlight the screening type of the target interactive information and enhance the intuitiveness of the information display. Examples of several display manners corresponding to different screening types are as follows:
  • optionally, displaying the target interactive information and other interactive information elements in the interactive record in a corresponding display manner in the target area of the real-time interactive interface includes: displaying the target interactive information, together with the content of the interactive information record, in the target area of the real-time interactive interface in a display manner different from the content of other interactive information records.
  • for target interactive information screened out through the search method, the user may pay attention to the information content itself, or to the position and context of that interactive information within all the information.
  • therefore, the target interactive information can be displayed together with the other information, but in a special display manner.
  • the display mode may include at least one of the following:
  • the target interactive information is covered with a semi-transparent shading mask;
  • the target interactive information is displayed in a preset font.
  • optionally, displaying the target interactive information and other interactive information elements in the interactive record in a corresponding display manner in the target area of the real-time interactive interface may specifically include: displaying the target interactive information and the content of the other interactive information records respectively in different target areas of the real-time interactive interface.
  • the target interactive information and other interactive information elements are placed in different areas for display.
  • the overall interactive information record display area can be divided into two or more sub-areas: one sub-area can display the filtered shared content, and another sub-area can display the text of the voice data.
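
A sketch of choosing between the two differentiated display layouts described above, keyed by the screening type: inline styled display for search-style conditions, and separate sub-areas for filter-style conditions. The return convention and field names are hypothetical.

```python
from typing import Dict, List, Tuple

def plan_display(targets: List[Dict], others: List[Dict],
                 screening_type: str) -> Tuple[List[Dict], List[Dict]]:
    """Pick the differentiated display corresponding to the screening type.

    Returns (primary_area, secondary_area): a search-style condition keeps the
    targets inline with their context but styled; a filter-style condition
    moves them into a sub-area of their own.
    """
    if screening_type == "search":
        # One area: targets styled in place, interleaved with the rest in time order.
        styled = [dict(element, style="highlight") for element in targets]
        merged = sorted(styled + others, key=lambda e: e["timestamp"])
        return merged, []
    # Filter control: two sub-areas, e.g. shared content vs. speech captions.
    return targets, others
```
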
  • the display of the target interactive information is retained, while the display of other interactive information elements is shielded, so that the target interactive information is displayed differently from other interactive information elements.
  • FIG. 5 is a schematic structural diagram of an interactive information processing device provided by Embodiment 5 of the present disclosure.
  • the interactive information processing device provided in this embodiment can be implemented in hardware and/or software, and is used to screen and display interactive information in an application scenario where a user performs real-time interaction based on application software.
  • the interactive information processing device can be integrated in the client, or can also be integrated in the server to provide business services for each client.
  • the interactive information processing device includes: an interactive information recording module 510, an interactive information screening module 520, and an interactive information distinguishing display module 530.
  • the interactive information recording module 510 is used to generate interactive information records based on the user's interactive behavior data during the user's interaction based on the real-time interactive interface;
  • the interactive information screening module 520 is used to screen out, from the interactive information record, the target interactive information that meets the target screening condition;
  • the interactive information distinguishing display module 530 is used to distinguish the target interactive information from other interactive information elements in the interactive information record and display them in the target area of the real-time interactive interface;
  • the target interaction information is one or more interaction information elements in the interaction information record.
  • the technical solutions of the embodiments of the present disclosure can, by setting screening conditions, effectively screen the interactive information records produced during the user's real-time interaction and display them in a differentiated manner, so that interacting users can selectively obtain interactive information, thereby improving interaction efficiency.
  • the method is suitable for user real-time interactive application scenarios, particularly suitable for multimedia real-time interactive application scenarios where users interact in various multimedia forms, and the application scenarios are implemented by application software based on Internet technology.
  • the real-time interactive interface is any interactive interface in the multimedia real-time interactive application scenario, for example, it may be a video conference interactive interface, a live video interactive interface, or a group chat interactive interface.
  • the interactive behavior data may include user voice data and/or user operation behavior data; the content of the interactive information record includes text data recognized by the user voice data and/or the type and content of the user operation behavior.
  • the user operation behavior data may include sharing behavior and shared content.
  • the sharing behavior is an operation behavior type of presenting the shared content to each user, and the shared content includes a shared document, a shared screen, and/or a web page link.
  • the target screening condition is any dimensional condition for screening interactive information records.
  • the target screening condition may include at least one of the following conditions: content keywords, voice data speaking users, speaking user activity level, operation behavior type, and operation behavior content object.
  • the device may further include:
  • the control acquisition module is configured to acquire the target filter condition input by the user through the filter control or search control of the real-time interactive interface; wherein, the filter control includes at least one of a filter list, a condition input box, and an option label.
  • the user can input the target filtering conditions independently, which directly reflects the user's filtering intention.
  • FIG. 6 is a schematic structural diagram of an interactive information processing device provided by Embodiment 6 of the present disclosure. On the basis of the foregoing embodiments, this embodiment further provides a specific implementation manner for determining the screening conditions.
  • the device includes: an interactive information recording module 610, an interactive information screening module 620, and an interactive information distinguishing display module 630, and further includes a condition triggering module 640, which is used to determine the target screening condition when a set trigger condition is detected, before the target interactive information that meets the target screening condition is screened out from the interactive information record.
  • This embodiment can automatically trigger the acquisition of the target filter condition based on the setting of the trigger condition, or it can further automatically determine the content of the target filter condition.
  • the trigger condition includes at least one of the following: a preset time point in the interaction process, detecting that the user's voice data includes indicative voice data, and a difference in the type of language used by the speaking user, and the like.
  • the condition triggering module can be specifically used to determine the target screening condition when the set trigger condition is detected based on the interactive behavior data collected during the user's real-time interaction.
  • the condition triggering module may specifically include at least one of the following functional units:
  • the activity determination unit is configured to determine the current activity value of each user based on the collected interactive behavior data, and determine the target screening condition based on the current activity value;
  • the language determination unit is used to determine the language type of each user based on the collected interactive behavior data, and determine the target screening condition based on the language type;
  • the semantic determination unit is used to determine the substantial meaning of the interactive behavior data based on the collected interactive behavior data, and determine the target screening condition based on the validity or invalidity of the substantial meaning of the interactive behavior data.
  • the activity determination unit may be specifically configured to use users whose current activity value is higher than the preset activity value as the target screening condition.
  • the language determination unit may be specifically configured to: determine the current language type corresponding to each user based on the voice data in the interactive behavior data, and use other language types different from the current language type as the target screening condition; or, collect the target language types preset by each client, and use other language types different from the target language types as the target screening condition.
  • the device may also include a voiceprint recognition module, which is used to perform voiceprint recognition on the voice data of each user before the language type of each user is determined based on the collected interactive behavior data, so as to determine the speaking user to which the voice data belongs.
  • when the target interactive information is determined, it can be displayed in a variety of different ways.
  • the interactive information distinguishing display module may be specifically used to: display the target interactive information and other interactive information elements in the interactive record in a corresponding display manner in the target area of the real-time interactive interface, wherein the display manner corresponds to the screening type in the target screening condition.
  • the interactive information distinguishing display module can also be more specifically used to: display the target interactive information, together with the content of the interactive information record, in the target area of the real-time interactive interface in a display manner different from the content of other interactive information records.
  • the interactive information differentiated display module may be more specifically used to display the target interactive information and other interactive information recorded content in different target areas of the real-time interactive interface, respectively.
  • the display mode includes at least one of the following:
  • the target interactive information is covered with a semi-transparent shading mask;
  • the target interactive information is displayed in a preset font.
  • the interactive information processing device provided by the embodiment of the present disclosure can execute the interactive information processing method provided by any embodiment of the present disclosure, and has the corresponding functional modules and beneficial effects for the execution method.
  • FIG. 7 shows a schematic structural diagram of an electronic device (for example, the terminal device or the server in FIG. 7) 700 suitable for implementing the embodiments of the present disclosure.
  • the terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and in-vehicle terminals (for example, in-vehicle navigation terminals), as well as fixed terminals such as digital TVs (televisions) and desktop computers.
  • the electronic device shown in FIG. 7 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present disclosure.
  • the electronic device 700 may include a processing device (such as a central processing unit, a graphics processor, etc.) 701, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage device 708 into a random access memory (RAM) 703.
  • in the RAM 703, various programs and data required for the operation of the electronic device 700 are also stored.
  • the processing device 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704.
  • An input/output (I/O) interface 705 is also connected to the bus 704.
  • the following devices can be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; output devices 707 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; storage devices 708 including, for example, a magnetic tape and a hard disk; and a communication device 709.
  • the communication device 709 may allow the electronic device 700 to perform wireless or wired communication with other devices to exchange data.
  • although FIG. 7 shows an electronic device 700 having various devices, it should be understood that it is not required to implement or have all of the illustrated devices; more or fewer devices may alternatively be implemented or provided.
  • an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a non-transitory computer readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from a network through the communication device 709, or installed from the storage device 708, or installed from the ROM 702.
  • when the computer program is executed by the processing device 701, the above-mentioned functions defined in the method of the embodiments of the present disclosure are executed.
  • the eighth embodiment of the present disclosure provides a computer storage medium on which a computer program is stored, and when the program is executed by a processor, the interactive information processing method provided in the foregoing embodiment is implemented.
  • the above-mentioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable signal medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • the client and server can communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (for example, a communication network).
  • examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (for example, the Internet), and peer-to-peer networks (for example, ad hoc peer-to-peer networks), as well as any currently known or future developed network.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being assembled into the electronic device.
  • the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, the electronic device is caused to:
  • interactive information records are generated based on the user's interactive behavior data
  • the target interactive information and other interactive information elements in the interactive information record are displayed in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • the computer program code used to perform the operations of the present disclosure can be written in one or more programming languages or a combination thereof.
  • the above-mentioned programming languages include, but are not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram can represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks can occur in an order different from the order marked in the drawings; for example, two blocks shown one after another can actually be executed substantially in parallel, and they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or can be realized by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments described in the present disclosure can be implemented in software or hardware, where the name of a unit/module does not, under certain circumstances, constitute a limitation on the unit itself.
  • for example, the to-be-sent user determination module can also be described as a "user determination module".
  • exemplary types of hardware logic components that can be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and so on.
  • a machine-readable medium may be a tangible medium, which may contain or store a program for use by the instruction execution system, apparatus, or device or in combination with the instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • the machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any suitable combination of the foregoing.
  • more specific examples of the machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • Example 1 provides an interactive information processing method, which includes:
  • interactive information records are generated based on the user's interactive behavior data
  • the target interactive information and other interactive information elements in the interactive information record are displayed in a differentiated manner in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record.
  • Example 2 provides an interactive information processing method, which further includes:
  • the user operation behavior data includes sharing behavior and shared content
  • the sharing behavior is an operation behavior type of presenting the shared content to each user
  • the shared content includes a shared document, a shared screen, and/or a web page link.
  • Example 3 provides an interactive information processing method, which further includes:
  • the interactive behavior data includes user voice data
  • the content of the interactive information record includes text data recognized by the user voice data
  • the interactive behavior data includes user operation behavior data, and the content of the interactive information record includes text data recognized by the user operation behavior data;
  • the interactive behavior data includes user voice data and user operation behavior data
  • the content of the interactive information record includes text data recognized by the user voice data and text data recognized by the user operation behavior data.
  • Example 4 provides an interactive information processing method, which further includes:
  • the target screening conditions include at least one of the following conditions: content keywords, voice data speaking users, speaking user activity levels, operation behavior types, and operation behavior content objects.
  • Example 5 provides an interactive information processing method, which further includes:
  • the target screening conditions are determined in the following manner:
  • the target screening condition input by the user is acquired through a filter control or a search control of the real-time interactive interface, wherein the filter control includes at least one of a filter list, a condition input box, and an option label.
  • Example 6 provides an interactive information processing method, which further includes:
  • before the target interactive information that meets the target screening condition is screened out from the interactive information record, the method further includes:
  • when a set trigger condition is detected, the target screening condition is determined.
  • Example 7 provides an interactive information processing method, which further includes:
  • the trigger condition includes at least one of the following: a preset time point in the interaction process is reached, the user's voice data is detected to include indicative voice data, and the language type used by a speaking user differs from that of the current client.
  • Example 8 provides an interactive information processing method, which further includes:
  • determining the target screening condition includes:
  • in the user's real-time interaction process, when the set trigger condition is detected based on the collected interactive behavior data, the target screening condition is determined.
  • Example 9 provides an interactive information processing method, which further includes:
  • determining the target screening condition includes at least one of the following:
  • the substantial meaning of the interactive behavior data is determined based on the collected interactive behavior data, and the target screening condition is determined based on the validity or invalidity of that substantial meaning.
  • Example 10 provides an interactive information processing method, which further includes:
  • the determining target screening conditions based on the current activity value includes:
  • users whose current activity value is higher than the preset activity value are used as the target screening condition.
  • Example 11 provides an interactive information processing method, which further includes:
  • determining the language type of each user based on the collected interactive behavior data, and determining the target screening condition based on the language type, includes:
  • collecting the target language type preset by each client, and taking the other language types that differ from the target language type as the target screening condition (a sketch follows).
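A sketch of this per-client language screening; the client identifiers, language tags, and entry indexing are assumptions of the sketch.

```python
from typing import Dict, Set


def foreign_language_entries(client_target_lang: Dict[str, str],
                             entry_lang: Dict[int, str]) -> Dict[str, Set[int]]:
    """For each client, select the record entries whose language type
    differs from that client's preset target language type."""
    return {
        client: {i for i, lang in entry_lang.items() if lang != target}
        for client, target in client_target_lang.items()
    }


# A client whose preset target language is Chinese screens the English entry.
selected = foreign_language_entries({"client-1": "zh"}, {0: "zh", 1: "en"})
assert selected == {"client-1": {1}}
```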
  • Example 12 provides an interactive information processing method, which further includes:
  • before determining the language type of each user based on the collected interactive behavior data, the method further includes:
  • Example 13 provides an interactive information processing method, which further includes:
  • displaying the target interaction information and the other interaction information elements in the interaction information record distinctively in the target area of the real-time interaction interface includes:
  • displaying the target interactive information in the target area of the real-time interactive interface in a display manner that distinguishes it from the other interactive information elements in the interaction record, where the display manner corresponds to the filter type in the target screening condition.
  • Example 14 provides an interactive information processing method, which further includes:
  • when the filter type in the target screening condition includes a filter condition entered in a search control, displaying the target interactive information in a display manner distinguishing it from the other interactive information elements in the interaction record in the target area of the real-time interactive interface includes:
  • displaying the target interactive information, together with the content of the interactive information record, in the target area of the real-time interactive interface in a display manner that differs from the other interactive information record content.
  • Example 15 provides an interactive information processing method, which further includes:
  • when the filter type in the target screening condition includes a filter condition entered in a filter control, displaying the target interactive information in a display manner distinguishing it from the other interactive information elements in the interaction record in the target area of the real-time interactive interface includes:
  • displaying the target interactive information and the other interactive information record content in different target areas of the real-time interactive interface.
  • Example 16 provides an interactive information processing method, which further includes:
  • the display manner includes at least one of the following:
  • a mask layer presented in a semi-transparent state;
  • the target interactive information displayed in a preset font (a style sketch follows).
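A sketch tying Examples 13-16 together: the filter type selects a display manner. The style keys and control names are invented for illustration and do not come from the disclosure.

```python
def display_style(filter_type: str) -> dict:
    """Map the filter type in the target screening condition to a
    display manner that distinguishes the target interactive information."""
    if filter_type == "search_control":
        # Example 14: show targets inline with the record but distinctly,
        # e.g. a preset font for targets and a semi-transparent mask
        # over the remaining record content.
        return {"target": {"font": "preset-bold"},
                "others": {"mask": {"opacity": 0.5}}}
    if filter_type == "filter_control":
        # Example 15: show targets and the remaining record content
        # in different target areas of the interface.
        return {"target": {"area": "focus-panel"},
                "others": {"area": "record-panel"}}
    return {"target": {}, "others": {}}


# Usage: a search-control filter highlights matches in place.
style = display_style("search_control")
```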
  • Example 17 provides an interactive information processing method, which further includes:
  • the real-time interactive interface is a video conference interactive interface, a live video interactive interface, or a group chat interactive interface.
  • Example 18 provides an interactive information processing device, which includes:
  • the interactive information recording module is configured to generate an interactive information record based on the user's interactive behavior data while the user interacts via the real-time interactive interface;
  • the interactive information screening module is configured to screen out, from the interactive information record, the target interactive information that meets the target screening condition;
  • the interactive information distinguishing display module is configured to display the target interactive information and the other interactive information elements in the interactive information record distinctively in the target area of the real-time interactive interface; the target interactive information is one or more interactive information elements in the interactive information record (a structural sketch follows).
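A structural sketch of Example 18's apparatus as three cooperating modules; the module interfaces are assumptions, and only the division of labor follows the text.

```python
class InteractionInfoProcessor:
    """Composes the recording, screening, and distinguishing-display
    modules described in Example 18."""

    def __init__(self, recorder, screener, display):
        self.recorder = recorder   # generates the interaction information record
        self.screener = screener   # screens out target interaction information
        self.display = display     # displays targets distinctively

    def process(self, behavior_data, condition, interface):
        record = self.recorder.generate(behavior_data)
        targets = self.screener.screen(record, condition)
        self.display.render(interface, targets, record)
        return targets
```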
  • a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the interaction information processing method provided in any embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Game Theory and Decision Science (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Disclosed are an interaction information processing method and apparatus, an electronic device and a storage medium. The method comprises the following steps: while a user interacts on the basis of a real-time interaction interface, generating an interaction information record based on interaction behavior data of the user; screening out, from the interaction information record, target interaction information that satisfies a target screening condition; and displaying, distinctively, the target interaction information and the other interaction information elements of the interaction information record in a target area of the real-time interaction interface. According to the technical solution of the embodiments of the present disclosure, by setting a screening condition, interaction information records can be effectively screened during a user's real-time interaction and displayed distinctively, so that an interacting user can selectively acquire interaction information, thereby improving interaction efficiency.
PCT/CN2021/088044 2020-04-30 2021-04-19 Interaction information processing method and apparatus, electronic device and storage medium WO2021218680A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21796877.5A EP4124025A4 (fr) 2020-04-30 2021-04-19 Interaction information processing method and apparatus, electronic device and storage medium
JP2022564172A JP7462070B2 (ja) 2020-04-30 2021-04-19 Interaction information processing method, apparatus, electronic device and storage medium
US17/883,877 US20220391058A1 (en) 2020-04-30 2022-08-09 Interaction information processing method and apparatus, electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010366928.7 2020-04-30
CN202010366928.7A CN113014853B (zh) 2020-04-30 2020-04-30 Interaction information processing method and apparatus, electronic device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/883,877 Continuation US20220391058A1 (en) 2020-04-30 2022-08-09 Interaction information processing method and apparatus, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2021218680A1 true WO2021218680A1 (fr) 2021-11-04

Family

ID=76383556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/088044 WO2021218680A1 (fr) 2020-04-30 2021-04-19 Interaction information processing method and apparatus, electronic device and storage medium

Country Status (5)

Country Link
US (1) US20220391058A1 (fr)
EP (1) EP4124025A4 (fr)
JP (1) JP7462070B2 (fr)
CN (1) CN113014853B (fr)
WO (1) WO2021218680A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979754A (zh) * 2022-04-11 2022-08-30 北京高途云集教育科技有限公司 Information display method, apparatus, device and storage medium
CN116149520B (zh) * 2023-04-23 2023-07-21 深圳市微克科技有限公司 Intelligent processing method, system and medium for a smartwatch interactive interface


Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000284876A (ja) 1999-03-31 2000-10-13 Sony Corp Display control device, display control method, and medium
JP2009194857A (ja) 2008-02-18 2009-08-27 Sharp Corp Communication conference system, communication device, communication conference method, and computer program
JP5014449B2 (ja) * 2010-02-26 2012-08-29 シャープ株式会社 Conference system, information processing device, conference support method, information processing method, and computer program
US20110246172A1 (en) 2010-03-30 2011-10-06 Polycom, Inc. Method and System for Adding Translation in a Videoconference
JP5751627B2 (ja) 2011-07-28 2015-07-22 国立研究開発法人産業技術総合研究所 Website system for transcribing voice data
US9007448B2 (en) * 2012-02-03 2015-04-14 Bank Of America Corporation Video-assisted customer experience
US9590929B2 (en) * 2013-04-11 2017-03-07 International Business Machines Corporation Directed message notification in chat sessions
JP6233798B2 (ja) * 2013-09-11 2017-11-22 International Business Machines Corporation Apparatus and method for converting data
US20160094866A1 (en) * 2014-09-29 2016-03-31 Amazon Technologies, Inc. User interaction analysis module
CN105632498A (zh) * 2014-10-31 2016-06-01 株式会社东芝 Method, apparatus and system for generating meeting minutes
CN105488116A (zh) * 2015-11-20 2016-04-13 珠海多玩信息技术有限公司 Message display method and client based on online live streaming
CN107911646B (zh) * 2016-09-30 2020-09-18 阿里巴巴集团控股有限公司 Method and apparatus for conference sharing and generating conference records
US11412012B2 (en) * 2017-08-24 2022-08-09 Re Mago Holding Ltd Method, apparatus, and computer-readable medium for desktop sharing over a web socket connection in a networked collaboration workspace
JP6914154B2 (ja) 2017-09-15 2021-08-04 シャープ株式会社 Display control device, display control method, and program
CN107645686A (zh) * 2017-09-22 2018-01-30 广东欧珀移动通信有限公司 Information processing method, apparatus, terminal device and storage medium
CN108881789B (zh) * 2017-10-10 2019-07-05 视联动力信息技术股份有限公司 Data interaction method and apparatus based on video conference
US20200167699A1 (en) * 2018-11-26 2020-05-28 Tickitin Experiences LLC Event management and coordination platform
US11269591B2 (en) * 2019-06-19 2022-03-08 International Business Machines Corporation Artificial intelligence based response to a user based on engagement level
CN110851745B (zh) * 2019-10-28 2023-11-03 腾讯科技(深圳)有限公司 Information processing method and apparatus, storage medium and electronic device
US11146700B2 (en) * 2020-01-27 2021-10-12 Kyocera Document Solutions Inc. Image forming apparatus and communication system that utilize group chat function
US10819532B1 (en) * 2020-03-27 2020-10-27 Ringcentral, Inc. System and method for determining a source and topic of content for posting in a chat group

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102929917A (zh) * 2011-09-20 2013-02-13 微软公司 Dynamic content feed filtering
CN104615593A (zh) * 2013-11-01 2015-05-13 北大方正集团有限公司 Method and apparatus for automatic detection of hot topics in microblogs
CN104679405A (zh) * 2015-02-06 2015-06-03 深圳市金立通信设备有限公司 A terminal
CN106375865A (zh) * 2016-09-20 2017-02-01 腾讯科技(深圳)有限公司 Bullet-screen comment interaction method, system and terminal based on social information
US20190182062A1 (en) * 2017-11-29 2019-06-13 Palantir Technologies Inc. Systems and methods for providing category-sensitive chat channels
CN110392312A (zh) * 2019-06-14 2019-10-29 北京字节跳动网络技术有限公司 Group chat construction method, system, medium and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4124025A4

Also Published As

Publication number Publication date
EP4124025A1 (fr) 2023-01-25
US20220391058A1 (en) 2022-12-08
JP2023523230A (ja) 2023-06-02
CN113014853B (zh) 2022-11-11
CN113014853A (zh) 2021-06-22
JP7462070B2 (ja) 2024-04-04
EP4124025A4 (fr) 2023-09-20

Similar Documents

Publication Publication Date Title
WO2022068533A1 (fr) 2022-04-07 Interactive information processing method and apparatus, device and medium
CN108847214B (zh) 2021-06-04 Voice processing method, client, apparatus, terminal, server and storage medium
WO2021218981A1 (fr) 2021-11-04 Method and apparatus for generating interaction record, device and medium
US20220391058A1 (en) Interaction information processing method and apparatus, electronic device and storage medium
CN113259740A (zh) 2021-08-13 Multimedia processing method, apparatus, device and medium
WO2021218556A1 (fr) 2021-11-04 Information display method and apparatus, and electronic device
US20240007718A1 (en) 2024-01-04 Multimedia browsing method and apparatus, device and medium
CN115079884B (zh) 2024-02-27 Session message display method, apparatus, device and storage medium
WO2019214132A1 (fr) 2019-11-14 Information processing method, apparatus and device
CN112667118A (zh) 2021-04-16 Method, device and computer-readable medium for displaying historical chat messages
CN112291614A (zh) 2021-01-29 Video generation method and apparatus
CN110379406B (zh) 2021-06-08 Voice comment conversion method, system, medium and electronic device
CN112203151A (zh) 2021-01-08 Video processing method and apparatus
CN116048337A (zh) 2023-05-02 Page display method, apparatus, device and storage medium
WO2024165010A1 (fr) 2024-08-15 Information generation method and apparatus, information display method and apparatus, device and storage medium
WO2022042251A1 (fr) 2022-03-03 Function entry display method, electronic device and computer-readable storage medium
WO2024093443A1 (fr) 2024-05-10 Voice interaction-based information display method and apparatus, and electronic device
CN113552984A (zh) 2021-10-26 Text extraction method, apparatus, device and medium
CN112000251A (zh) 2020-11-27 Method, apparatus, electronic device and computer-readable medium for playing video
CN110377842A (zh) 2019-10-25 Voice comment display method, system, medium and electronic device
WO2021218631A1 (fr) 2021-11-04 Interaction information processing method and apparatus, device and medium
CN113194279A (zh) 2021-07-30 Network conference recording method, computer-readable storage medium and electronic device
WO2023088044A1 (fr) 2023-05-25 Data processing method and apparatus, electronic device and storage medium
CN114816599A (zh) 2022-07-29 Image display method, apparatus, device and medium
CN114567700A (zh) 2022-05-31 Interaction method and apparatus, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21796877

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022564172

Country of ref document: JP

Kind code of ref document: A

Ref document number: 2021796877

Country of ref document: EP

Effective date: 20221018

NENP Non-entry into the national phase

Ref country code: DE