CN112148182B - Interaction control method, terminal and storage medium

Info

Publication number: CN112148182B
Authority: CN (China)
Prior art keywords: interactive, terminal, identifier, interactive object, target
Legal status: Active
Application number: CN201910579764.3A
Other languages: Chinese (zh)
Other versions: CN112148182A
Inventors: 李建平, 郭建伟
Original and current assignee: Huawei Technical Service Co Ltd
Application CN201910579764.3A filed by Huawei Technical Service Co Ltd; publication of application CN112148182A; application granted; publication of CN112148182B.

Classifications

    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0486: Drag-and-drop

Abstract

The application discloses an interaction control method, a terminal and a storage medium, applied to a multi-party interaction system such as a multi-party conference system, a live broadcast system or a virtual reality system. The method includes: displaying an interactive interface, where the interactive interface includes identifiers of a plurality of objects, the plurality of objects include a first object and an interactive object, the first object is the object using the terminal, and the interactive object is any one of the objects other than the first object; detecting an operation event, where the operation event adjusts the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface; adjusting a target output mode of the output information of the interactive object according to the operation event; and outputting the output information of the interactive object according to the adjusted target output mode. In this way, the output information of each interactive object displayed by the terminal can be adjusted independently.

Description

Interaction control method, terminal and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to an interaction control method, a terminal, and a storage medium.
Background
With the development of communication technology, multi-party interaction is applied more and more widely; typical multi-party interaction scenarios include audio and video conferences, live broadcasts, online education, online games, and the like.
In multi-party interaction, a user interacts with interactive objects through the terminal the user uses. An interactive object may be another participant in an audio and video conference, a non-player character (NPC) in an online game, and the like. The terminal used by the user receives a data packet sent by a server, decodes the data packet, and outputs the decoded data to realize multi-party interaction; for example, the terminal plays the audio information of an interactive object, or displays the icon information of the multiple interactive objects.
A data packet received by the terminal generally carries the output information of a plurality of interactive objects, and the server has already encoded the output information of every interactive object uniformly when generating the data packet: for example, the audio information of the interactive objects in the data packet has the same volume, and the icon information of the interactive objects has the same brightness value. As a result, all the output information carried in the data packet is output in one uniform output mode, and a user cannot independently adjust the output information of different interactive objects.
Disclosure of Invention
The embodiments of the present invention provide an interaction control method, a terminal and a storage medium, which can independently adjust the target output modes corresponding to different interactive objects.
A first aspect of an embodiment of the present invention provides an interaction control method, where the method is applied to a first terminal, and the method includes:
the first terminal displays an interactive interface, where the interactive interface includes identifiers of a plurality of objects, the plurality of objects include a first object and an interactive object, the first object is the object using the first terminal, the interactive object is any one of the objects other than the first object, and an identifier is one or more of an account name, an avatar or a nickname of the object. The first terminal detects an operation event, where the operation event is used to adjust the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface. The first terminal adjusts a target output mode of the output information of the interactive object according to the operation event, and outputs the output information of the interactive object according to the adjusted target output mode.
With this method, the identifiers of the interactive objects are displayed on the interactive interface of the first terminal. When the target output mode corresponding to an interactive object needs to be adjusted, an operation event can be input for the identifier of that interactive object to adjust the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface, thereby adjusting the target output mode of the output information of the interactive object. The output information of the interactive object is then output on the first terminal in the adjusted target output mode, so the first terminal can independently adjust the output information of each interactive object displayed on the interactive interface.
Based on the first aspect of the embodiment of the present invention, in an optional implementation manner of the first aspect of the embodiment of the present invention, a relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface.
By adopting the method of the invention, the first terminal adjusts the target output mode of the interactive object by adjusting the relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface, thereby effectively improving the efficiency of adjusting the target output mode of the interactive object.
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect, the first terminal adjusting the target output mode of the output information of the interactive object according to the operation event specifically includes: the first terminal adjusts the target output mode according to the change of the relative distance. Optionally, the target output mode of the output information of the interactive object is inversely correlated with the relative distance, that is, the larger the relative distance, the weaker the target output mode. For example, if the output information is audio information, a larger relative distance means a lower decibel level of the audio information; if the output information is audio/video information, a larger relative distance means a lower decibel level and a lower resolution; if the output information is the brightness value of the identifier, a larger relative distance means a lower brightness value; and if the output information is sensory information, a larger relative distance means weaker sensory information.
Optionally, the target output mode of the output information of the interactive object is positively correlated with the relative distance, that is, the larger the relative distance, the stronger the target output mode. For example, if the output information is audio information, a larger relative distance means a higher decibel level; if the output information is audio/video information, a larger relative distance means a higher decibel level and a higher resolution; if the output information is the brightness value of the identifier, a larger relative distance means a higher brightness value; and if the output information is sensory information, a larger relative distance means stronger sensory information.
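For illustration only, the following sketch shows one way such an inverse or positive correlation between the relative distance and the output parameter could be realized; the function names, the linear form of the mapping and the max_distance value are assumptions made for this sketch and are not taken from the patent.
```python
# Illustrative sketch: map the relative distance between two identifiers on the
# interactive interface to a playback volume (inverse correlation) or to the
# brightness of the identifier (positive correlation variant).
import math

def relative_distance(pos_a, pos_b):
    """Euclidean distance between two identifier positions (x, y) on the interface."""
    return math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])

def volume_from_distance(distance, max_distance=1000.0):
    """Inverse correlation: the larger the relative distance, the lower the volume (0.0-1.0)."""
    d = min(max(distance, 0.0), max_distance)
    return 1.0 - d / max_distance

def brightness_from_distance(distance, max_distance=1000.0):
    """Positive correlation variant: the larger the relative distance, the brighter the identifier."""
    d = min(max(distance, 0.0), max_distance)
    return int(255 * d / max_distance)

d1 = relative_distance((0, 0), (200, 0))   # identifier close to the first object
d2 = relative_distance((0, 0), (800, 0))   # identifier dragged far away
print(round(volume_from_distance(d1), 2), round(volume_from_distance(d2), 2))   # 0.8 0.2
```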
With this method, the first terminal adjusts the target output mode of the interactive object according to the relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface. This makes the adjustment of the target output mode intuitive, and precise adjustment can be achieved by fine-tuning the relative distance.
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect, the first terminal adjusting the target output mode of the output information of the interactive object according to the operation event specifically includes: the first terminal sorts the identifiers of M interactive objects in ascending order of the relative distance between the identifier of each of the M interactive objects and the identifier of the first object; the first terminal determines that the target output mode corresponding to the identifiers of the interactive objects ranked after the first L positions among the M interactive objects is a mute mode, where M and L are both positive integers greater than 1 and M is greater than L.
With this method, the first terminal can quickly select, on the interactive interface, the interactive objects whose audio information is not to be output: the first terminal sorts the identifiers of the M interactive objects in ascending order of relative distance, and the target output mode corresponding to the identifiers ranked after the first L positions is a mute mode, that is, the first terminal does not output the audio information of the interactive objects ranked after the first L positions, which improves the efficiency of multi-party interaction.
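A minimal sketch of the "keep the nearest L, mute the rest" rule described above; the object structure and the helper name are assumptions made for illustration.
```python
# Illustrative sketch: rank M interactive objects by relative distance to the first
# object's identifier and mute every object ranked after the first L positions.

def apply_mute_after_first_l(first_pos, interactive_objects, l):
    def distance(obj):
        dx = obj["pos"][0] - first_pos[0]
        dy = obj["pos"][1] - first_pos[1]
        return (dx * dx + dy * dy) ** 0.5

    ranked = sorted(interactive_objects, key=distance)   # ascending relative distance
    for rank, obj in enumerate(ranked):
        obj["muted"] = rank >= l                         # ranked after the first L positions
    return ranked

objs = [{"name": "a", "pos": (100, 0)}, {"name": "b", "pos": (400, 0)}, {"name": "c", "pos": (250, 0)}]
apply_mute_after_first_l((0, 0), objs, l=2)
print([(o["name"], o["muted"]) for o in objs])  # a and c stay audible, b is muted
```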
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect, the output information of the interactive object is audio information, the target output mode is the decibel level of the audio information of the interactive object, and the first terminal adjusting the target output mode of the output information of the interactive object according to the operation event specifically includes: if the identifier of at least one other object is located between the identifier of the first object and the identifier of the interactive object, the first terminal determines that the target output mode is a mute mode; or, if no identifier of another object exists between the identifier of the first object and the identifier of the interactive object, the first terminal determines that the target output mode is a sound output mode.
With this method, the first terminal can quickly distinguish, on the interactive interface, the interactive objects whose audio information is to be output from those whose audio information is not to be output, which effectively improves the efficiency of multi-party interaction.
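The patent does not define "located between" precisely; the following sketch assumes one possible geometric reading, namely that another identifier lies within a tolerance of the straight segment joining the two identifiers. The tolerance value and the segment test are assumptions for illustration.
```python
# Illustrative sketch: mute an interactive object if another identifier sits
# between it and the first object's identifier on the interactive interface.
import math

def point_near_segment(p, a, b, tolerance=30.0):
    """True if point p is within `tolerance` pixels of segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    seg_len_sq = (bx - ax) ** 2 + (by - ay) ** 2
    if seg_len_sq == 0:
        return math.hypot(px - ax, py - ay) <= tolerance
    t = max(0.0, min(1.0, ((px - ax) * (bx - ax) + (py - ay) * (by - ay)) / seg_len_sq))
    cx, cy = ax + t * (bx - ax), ay + t * (by - ay)
    return math.hypot(px - cx, py - cy) <= tolerance

def target_output_mode(first_pos, interactive_pos, other_positions):
    blocked = any(point_near_segment(p, first_pos, interactive_pos) for p in other_positions)
    return "mute" if blocked else "sound_output"

print(target_output_mode((0, 0), (400, 0), [(200, 10)]))   # mute: an identifier sits in between
print(target_output_mode((0, 0), (400, 0), [(200, 300)]))  # sound_output
```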
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect, the method is applied to a virtual reality system, where the first terminal is a virtual reality device, the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a target included angle, the target included angle is the angle between the identifier of the interactive object and the optical axis of the target field-of-view angle, the target field-of-view angle is the field-of-view angle of the virtual reality device, the optical axis of the target field-of-view angle is the symmetry axis of the target field-of-view angle, and the identifier of the first object is the vertex of the target field-of-view angle.
By adopting the method of the invention, the identification of the plurality of interactive objects is displayed on the interactive interface displayed by the virtual reality equipment, and the virtual reality equipment can adjust the corresponding target output mode according to the target included angle between the identification of the first object and the identification of the interactive object, so that the output information of the interactive object is output in the target output mode on the virtual reality equipment, thereby realizing the purpose of independently adjusting the target output mode of the output information of each interactive object on the virtual reality equipment in the virtual reality system.
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect, the first terminal adjusting the target output mode of the output information of the interactive object according to the operation event specifically includes: the first terminal adjusts the target output mode according to the change of the target included angle. Optionally, the target output mode is positively correlated with the target included angle: the larger the target included angle, the higher the decibel level of the interactive object's audio information, the higher the decibel level and video resolution of its audio/video information, the higher the brightness value of its identifier, and the stronger its sensory information. Optionally, the target output mode is inversely correlated with the target included angle: the larger the target included angle, the lower the decibel level of the interactive object's audio information, the lower the decibel level and video resolution of its audio/video information, the lower the brightness value of its identifier, and the weaker its sensory information.
In the virtual reality system, the identifiers of the plurality of interactive objects are displayed on the interactive interface presented by the virtual reality device, and the corresponding target output mode can be adjusted according to the size of the target included angle between an interactive object's identifier and the optical axis of the target field-of-view angle. When the first object wears the VR device and moves, the target included angle changes, which adjusts the target output mode without any manual operation by the first object and thus improves the efficiency of adjusting the target output mode.
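A minimal sketch of the virtual reality case under the inverse-correlation option: the included angle between the direction toward the interactive object's identifier and the optical axis of the field of view is computed and mapped to an audio gain. The vector math, the half_fov_deg value and the linear fade are illustrative assumptions, not the patent's formulas.
```python
# Illustrative sketch: derive an audio gain from the target included angle.
import math

def included_angle_deg(optical_axis, to_object):
    """Angle in degrees between the optical-axis vector and the vector pointing
    from the first object's identifier (the apex) to the interactive object's identifier."""
    dot = sum(a * b for a, b in zip(optical_axis, to_object))
    norm = math.hypot(*optical_axis) * math.hypot(*to_object)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def gain_from_angle(angle_deg, half_fov_deg=55.0):
    """Inverse correlation: the larger the included angle, the lower the audio gain."""
    return max(0.0, 1.0 - angle_deg / half_fov_deg)

# As the wearer turns and the identifier drifts from about 10 to 40 degrees off axis,
# its audio fades without any manual operation.
print(gain_from_angle(included_angle_deg((0.0, 1.0), (0.18, 1.0))))  # ~0.81
print(gain_from_angle(included_angle_deg((0.0, 1.0), (0.84, 1.0))))  # ~0.27
```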
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect of the embodiments of the present invention, the outputting, by the first terminal, the output information of the interactive object according to the adjusted target output mode specifically includes: the first terminal sends the corresponding relation between the interactive object and the adjusted target output mode to the server; the first terminal receives the processed output information from the server, wherein the processed output information is the output information processed by the server according to the adjusted target output mode; the first terminal outputs the processed output information.
With the method of this aspect, the first terminal sends the correspondence between the interactive object and the adjusted target output mode to the server, the server adjusts the output information of the interactive object according to this correspondence to generate the processed output information, and after the server sends the processed output information to the first terminal, the first terminal outputs it directly without further adjustment, so the output information of the interactive object is output in the target output mode. In this aspect, the server processes the output information of the interactive object to generate the processed output information, which reduces the processing load on the first terminal.
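The following sketch illustrates what the correspondence reported to the server might look like; the JSON field names and values are assumptions for illustration and do not describe a real API.
```python
# Illustrative sketch: the first terminal reports the correspondence between each
# interactive object and its adjusted target output mode to the server.
import json

correspondence = {
    "conference_group": "group-42",
    "reporting_terminal": "first-terminal-id",
    "adjustments": [
        {"object_id": "participant-7", "audio_gain": 0.3},   # quieter after being dragged away
        {"object_id": "participant-9", "audio_gain": 0.0},   # muted
    ],
}
payload = json.dumps(correspondence)
# The server would then mix or re-encode each participant's stream with these gains
# and return the processed output information to the first terminal.
```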
Based on the first aspect of the embodiments of the present invention, in an optional implementation manner of the first aspect of the embodiments of the present invention, the method further includes: the first terminal receives output information of the interactive object from the server; the step of outputting, by the first terminal, the output information of the interactive object according to the adjusted target output mode specifically includes: the first terminal processes the output information of the interactive object according to the adjusted target output mode to obtain the processed output information, wherein the output mode of the processed output information is the adjusted target output mode; the first terminal outputs the processed output information.
By adopting the method of the aspect, the first terminal can acquire the output information of the interactive object from the server, and the first terminal processes the output information according to the corresponding relation to generate the processed output information, so that the load of the server for processing the output information to generate the processed output information is reduced.
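A minimal sketch of the client-side alternative, assuming the output information is decoded 16-bit PCM audio and the adjusted target output mode reduces to a gain factor; the helper name and sample values are illustrative.
```python
# Illustrative sketch: the first terminal receives the raw output information from
# the server and applies the adjusted target output mode locally.

def apply_audio_gain(samples, gain):
    """Scale decoded PCM samples by the gain derived from the target output mode."""
    return [max(-32768, min(32767, int(s * gain))) for s in samples]

decoded = [1200, -800, 15000, -20000]     # samples decoded from the server's packet
print(apply_audio_gain(decoded, 0.3))      # [360, -240, 4500, -6000]
```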
A second aspect of the embodiments of the present invention provides a terminal, where the terminal is a first terminal used by a first object, and the terminal includes:
the display unit is configured to display an interactive interface, where the interactive interface includes identifiers of a plurality of objects, the plurality of objects include a first object and an interactive object, and the interactive object is any one of the objects other than the first object;
the detection unit is used for detecting an operation event, wherein the operation event is the adjustment of the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface;
the adjusting unit is used for adjusting a target output mode of the output information of the interactive object according to the operation event;
and the output unit is used for outputting the output information of the interactive object according to the adjusted target output mode.
For a description of the beneficial effects of the terminal for executing the interactive control method and executing the interactive control method, please refer to the description in the first aspect, which is not repeated herein.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, a relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the adjusting unit is specifically configured to adjust the target output mode of the output information of the interactive object according to the change of the relative distance, where the target output mode and the relative distance have an inverse correlation or a positive correlation.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the interactive interface includes the identifiers of M interactive objects, and the adjusting unit is further configured to:
sort the identifiers of the M interactive objects in ascending order of the relative distance between the identifier of each of the M interactive objects and the identifier of the first object;
and determine that the target output mode corresponding to the identifiers of the interactive objects ranked after the first L positions among the M interactive objects is a mute mode, where M and L are both positive integers greater than 1 and M is greater than L.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the output information of the interactive object is audio information, the target output mode is the decibel level of the audio information of the interactive object, and the adjusting unit is further configured to:
if the identifier of at least one other object is located between the identifier of the first object and the identifier of the interactive object, determine that the target output mode is a mute mode;
or,
if no identifier of another object exists between the identifier of the first object and the identifier of the interactive object, determine that the target output mode is a sound output mode.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a target included angle, the target included angle is the angle between the identifier of the interactive object and the optical axis of the target field-of-view angle, the target field-of-view angle is the field-of-view angle of the virtual reality device, the optical axis of the target field-of-view angle is the symmetry axis of the target field-of-view angle, and the identifier of the first object is the vertex of the target field-of-view angle.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the adjusting unit is specifically configured to adjust the target output mode of the output information of the interactive object according to the change of the target included angle, where the target output mode and the target included angle have an inverse correlation or a positive correlation.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the output unit is specifically configured to:
sending the corresponding relation between the interactive object and the adjusted target output mode to a server;
receiving the processed output information from the server, wherein the processed output information is the output information obtained by processing the output information of the interactive object by the server according to the adjusted target output mode;
and outputting the processed output information.
Based on the second aspect of the embodiment of the present invention, in an optional implementation manner of the second aspect of the embodiment of the present invention, the terminal further includes: a receiving unit for receiving output information of the interactive object from the server; the output unit is specifically configured to:
processing the output information of the interactive object according to the adjusted target output mode to obtain processed output information, wherein the output mode of the processed output information is the adjusted target output mode;
and outputting the processed output information.
A third aspect of an embodiment of the present invention provides a terminal, where the terminal is a first terminal used by a first object, and the terminal includes a processor and a memory, where the memory stores a computer program; the processor is configured to execute the interactive control method according to the first aspect of the embodiment of the present invention by executing the computer program in the memory.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, where a computer program is stored, where the computer program is used to execute the interaction control method shown in the first aspect of the embodiments of the present invention.
A fifth aspect of the embodiments of the present invention provides a computer program product, which is configured to, when executed, perform the interaction control method shown in the first aspect of the embodiments of the present invention.
Drawings
Fig. 1 is a schematic structural diagram of an embodiment of a terminal provided in the present invention;
FIG. 2 is a flowchart illustrating steps of an interactive control method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an embodiment of a multi-party conferencing system provided in the present invention;
FIG. 4 is a schematic structural diagram of a server according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating an embodiment of an interactive interface displayed on the first terminal according to the present invention;
FIG. 6 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
FIG. 8 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
FIG. 10 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 11 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 12 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
fig. 13 is a schematic structural diagram of a live broadcasting system according to an embodiment of the present invention;
FIG. 14 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 15 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
FIG. 16 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
fig. 17 is a schematic structural diagram of an embodiment of a virtual reality system provided in the present invention;
FIG. 18 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 19 is a schematic diagram illustrating an interactive interface displayed on the first terminal according to another embodiment of the present invention;
FIG. 20 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 21 is a schematic view of a scenario of an interaction control method according to an embodiment of the present invention;
FIG. 22 is a schematic diagram illustrating a scenario of an interaction control method according to another embodiment of the present invention;
FIG. 23 is a flowchart illustrating steps of an interactive control method according to another embodiment of the present invention;
FIG. 24 is a schematic diagram illustrating a scenario of an interaction control method according to another embodiment of the present invention;
fig. 25 is a schematic structural diagram of an embodiment of a first terminal provided in the present invention;
fig. 26 is a schematic structural diagram of another embodiment of the first terminal provided in the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
The term "and/or" appearing in the present application may be an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in this application generally indicates that the former and latter related objects are in an "or" relationship.
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be implemented in other sequences than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules explicitly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
The interaction control method provided by the application is applied to a communication system, the communication system comprises a plurality of nodes, one node in the plurality of nodes can interact with all other nodes, and the node can be a processing unit or a physical device, such as a terminal or a server. When the system types of the communication systems are different, the device types of the nodes may be different, for example, if the communication system is a multi-party conference system or a live broadcast system, the nodes may be smart phones, desktop computers, intelligent wearable devices, desktop computers or notebook computers, etc.; if the communication system is a virtual reality system, the device type of the node may be a virtual reality device. The types of devices between a plurality of nodes included in the same communication system may be the same or different. Alternatively, the plurality of nodes may also be a plurality of virtual machines or a plurality of containers, etc. running on the same or different physical devices.
Referring to fig. 1, in the embodiment shown in fig. 1, a node is taken as an example for exemplary description, and a specific structure of a terminal is optionally described below with reference to fig. 1: fig. 1 is a schematic structural diagram of an embodiment of a terminal provided by the present invention, where fig. 1 illustrates an example in which the terminal is a smart phone, and it should be clear that the present embodiment does not limit a specific hardware structure of the terminal.
The terminal 100 includes at least components of an input unit 105, a processor 103, an output unit 101, a communication unit 107, a memory 104, and the like. These components communicate over one or more buses. Those skilled in the art will appreciate that the structure of the terminal 100 shown in fig. 1 is not intended to limit the present invention, and may be a bus structure, a star structure, a structure including more or less components than those shown, a combination of components, or a different arrangement of components, and that the terminal 100 specifically includes:
the input unit 105 may be a touch panel, also referred to as a touch screen, on which operation events of a user may be collected, such as the user operating on the touch panel using a finger, a stylus, or any suitable object or accessory. Alternatively, the touch panel may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of an operation event input by a user, detects a signal brought by the operation event and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 103, and the processor 103 determines the specific position of the operation event input by the user on the touch panel according to the touch point coordinates. The touch panel can also receive and execute commands sent by the processor 103. In addition, the touch panel may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel, the input unit 105 may also include other input devices, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a microphone, and the like.
The terminal 100 further includes a display unit 110, where the display unit 110 is configured to display information input by the user or information provided to the user and various menu interfaces of the terminal 100, and in this embodiment, the display unit 110 may be configured to display an interactive interface shown in this embodiment, where the interactive interface displays identifiers of all terminals located in the same conference group. Specifically, the display unit 110 may be a display panel, and optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED).
This embodiment is described by taking the example in which the touch panel covers the display panel to form a touch display screen; after the touch display screen detects an operation event on or near it, the operation event is transmitted to the processor 103 to determine the type of the operation event, and the processor 103 then provides a corresponding visual output on the touch display screen according to the type of the operation event. Optionally, the operation event may be a drag operation event: when an identifier is displayed on the touch display screen, a drag operation event executed by the user for the identifier includes a start position, a drag direction and an end position, and under the action of the drag operation event the identifier moves on the touch display screen along the drag direction, starting from the start position, until it reaches the end position.
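A minimal sketch of how such a drag operation event could update an identifier's position on the interactive interface; the event structure and the handler name are assumptions for illustration, not part of the patent.
```python
# Illustrative sketch: a drag reported by the touch controller moves an identifier
# from its start position to the end position of the drag.

identifiers = {"participant-7": (120, 300)}   # identifier positions on the touch display

def on_drag_event(identifier_id, start_pos, end_pos):
    """Move the identifier to the end position of the drag."""
    if identifiers.get(identifier_id) == start_pos:
        identifiers[identifier_id] = end_pos   # the processor updates the visual output
    return identifiers[identifier_id]

on_drag_event("participant-7", (120, 300), (480, 150))
print(identifiers["participant-7"])            # (480, 150)
```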
The memory 104 is configured to store a first computer program 1041, where the first computer program 1041 is configured to execute a function that needs to be executed by the terminal side in the interaction control method described in this application. The processor 103 is configured to execute the first computer program 1041 stored in the memory 104, thereby implementing logical functions when the terminal executes the interaction control method shown in this application.
The processor 103 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present invention.
In some other embodiments of the present invention, the input unit 105 may also be various sensing devices, such as hall devices, for detecting physical quantities of the terminal 100, such as force, moment, pressure, stress, position, displacement, speed, acceleration, angle, angular velocity, number of rotations, rotation speed, and time of change of operating state, and converting the physical quantities into electric quantities for detection and control. Other sensing devices may include gravity sensors, three-axis accelerometers, gyroscopes, electronic compasses, ambient light sensors, proximity sensors, temperature sensors, humidity sensors, pressure sensors, heart rate sensors, fingerprint identifiers, and the like.
A communication unit 107 for establishing a communication channel for the terminal 100 to connect to a remote server through the communication channel and downloading media data from the remote server. The communication unit 107 may include a wireless local area network (WLAN) module, a Bluetooth module, a baseband module, and other communication modules, and radio frequency (RF) circuits corresponding to the communication modules, and is configured to perform WLAN communication, Bluetooth communication, infrared communication, and/or cellular communication system communication, such as wideband code division multiple access (W-CDMA) and/or high speed downlink packet access (HSDPA). The communication module is used to control communication of each component in the terminal 100 and may support direct memory access.
A power supply 109 for supplying power to the various components of the terminal 100 to maintain its operation. As a general understanding, the power source 109 may be a built-in battery, such as a common lithium ion battery, nickel metal hydride battery, etc., and also include an external power source that directly supplies power to the terminal 100.
Based on the hardware structure of the terminal shown in fig. 1, an embodiment of the interaction control method provided in the present application is exemplarily described below with reference to fig. 2:
Step 201, the first terminal displays an interactive interface.
The interactive interface displayed by the first terminal includes identifiers of a plurality of objects; the plurality of objects displayed on the interactive interface include a first object and an interactive object, and the interactive object is any one of the objects other than the first object. The interactive interface shown in this embodiment is used to implement interaction between the first object and a plurality of interactive objects, where the first object is the object using the first terminal.
Optionally, the identifier of the first object in this embodiment may be displayed on the interactive interface explicitly; alternatively, the identifier of the first object may be set on the interactive interface implicitly, for example, preset at a specific position of the interactive interface, such as the central position or an edge position of the interactive interface, which is not limited in this embodiment.
In this embodiment, the first terminal is one node of a plurality of nodes that need to interact in a communication system, and when the interaction control method shown in the present application is applied to different communication systems, types of interaction objects that interact with the first object may be different, for example, if the communication system is a multi-party conference system, the interaction object is an object other than the first object in a conference group, that is, the interaction object may be a user in the conference group; for another example, if the communication system is a live broadcast system, the interactive object is an object other than the first object in the live broadcast channel, that is, the interactive object may be a user located in the live broadcast channel; for another example, if the communication system is a virtual reality system, the interactive object may be another user or an NPC that needs to interact with the first object, and the specific implementation is not limited in this embodiment.
Step 202, the first terminal detects an operation event.
In the case that a target output mode of output information of an interactive object needs to be adjusted, a first object may execute an operation event on a touch display screen of a first terminal, where the operation event is used to adjust a relative position relationship between an identifier of the first object and an identifier of the interactive object on an interactive interface, and this embodiment does not limit a specific operation type of the operation event, and this embodiment exemplarily describes the operation event as a drag operation event, and please refer to fig. 1 for specific description of the drag operation event, which is not specifically described in this embodiment.
The following illustrates how the operation event adjusts the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface:
optionally, in the interactive interface displayed by the first terminal, the position of the identifier of each interactive object is adjustable while the position of the identifier of the first object is fixed in advance. The position of the identifier of each interactive object can move in the interactive interface under the action of the operation event, so the relative position relationship between the identifier of the first object and the identifier of the interactive object is adjusted by changing the position of the identifier of the interactive object in the interactive interface.
Optionally, in the interactive interface displayed by the first terminal, both the position of the identifier of the first object and the position of the identifier of the interactive object may be adjustable. In this case, the positions of both identifiers move in the interactive interface under the action of the operation event, and the relative position relationship between the identifier of the first object and the identifier of the interactive object can likewise be adjusted by changing the positions of both identifiers in the interactive interface.
Step 203, the first terminal adjusts a target output mode of the output information of the interactive object according to the operation event.
In this embodiment, the first terminal may create, in advance, the correspondence between different relative position relationships and different target output modes. Therefore, if the first object needs to adjust the target output mode of the output information of an interactive object displayed on the interactive interface, an operation event can be executed for that interactive object to adjust the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface, and the first terminal adjusts the target output mode according to the adjusted relative position relationship.
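For illustration, such a pre-created correspondence between relative-distance ranges and target output modes could be a simple lookup table like the following; the distance bands and the mode values are assumptions chosen for this sketch.
```python
# Illustrative sketch: a correspondence between relative-position relationships
# (here, distance bands in pixels) and target output modes created in advance.

DISTANCE_TO_OUTPUT_MODE = [
    (0,   200, {"audio_gain": 1.0, "icon_brightness": 255}),   # very close: full volume
    (200, 500, {"audio_gain": 0.6, "icon_brightness": 180}),
    (500, 900, {"audio_gain": 0.2, "icon_brightness": 100}),
    (900, float("inf"), {"audio_gain": 0.0, "icon_brightness": 60}),  # far away: muted
]

def lookup_output_mode(relative_distance):
    for low, high, mode in DISTANCE_TO_OUTPUT_MODE:
        if low <= relative_distance < high:
            return mode

print(lookup_output_mode(650))   # {'audio_gain': 0.2, 'icon_brightness': 100}
```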
Step 204, the first terminal outputs the output information of the interactive object according to the adjusted target output mode.
Optionally, after the first terminal adjusts the target output mode of the output information of the interactive object according to the adjusted relative position relationship, the first terminal may send the adjusted target output mode of the interactive object to the server. The server adjusts the output information of the interactive object according to the target output mode to generate the adjusted output information and sends it to the first terminal, and the first terminal can output the adjusted output information directly, so the output information is output on the first terminal side in the adjusted target output mode.
Optionally, after the first terminal adjusts the target output mode of the output information of the interactive object according to the adjusted relative position relationship, the first terminal may obtain the output information of the interactive object, adjust it according to the target output mode to obtain the adjusted output information, and output the output information of the interactive object in the adjusted target output mode.
The target output mode of the output information is not limited in this embodiment, and may be, for example, the decibel level of audio information, the decibel level and resolution of audio/video information, or the brightness value of an identifier.
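As an illustrative sketch of the target output mode parameters just mentioned (decibel level, resolution, identifier brightness), one could group them in a small structure; the field names are assumptions, not terms defined by the patent.
```python
# Illustrative sketch of possible "target output mode" parameters.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TargetOutputMode:
    audio_gain: Optional[float] = None            # 0.0 (mute) .. 1.0, controls the decibel level
    video_resolution: Optional[Tuple[int, int]] = None  # e.g. (1280, 720) for audio/video information
    icon_brightness: Optional[int] = None          # 0 .. 255 brightness value of the identifier

mute = TargetOutputMode(audio_gain=0.0)
```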
By adopting the method shown in this embodiment, the interactive interface of the first terminal displays the identifiers of the multiple interactive objects, and the first terminal adjusts the target output mode in which the output information of the interactive objects is output on the first terminal by adjusting the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface, so that the output information of the interactive objects is output in the target output mode on the first terminal, thereby realizing the independent adjustment of the target output mode in which the output information of each interactive object interacting with the first object is output on the first terminal.
In order to better understand the interaction control method provided by the present application, the following description is made by way of example in conjunction with architectures of different communication systems to which the interaction control method provided by the present application is applied. Referring to fig. 3, a communication system to which the interaction control method provided by the present application is applied may be a multi-party conferencing system shown in fig. 3. Fig. 3 is a diagram illustrating a structure of a multi-party conference system provided by the present invention, and a specific structure of the multi-party conference system is described below;
through the multi-party conference system 300, interaction of output information of all participants in one conference group is achieved, specifically, as shown in fig. 3, one conference group 301 may include terminals owned by different participants, and the present embodiment does not limit the specific number of terminals included in one conference group 301. Within the same conference group 301, any terminal may transmit its output information to all other terminals within the conference group 301, so that geographically dispersed participants can communicate the output information through the multi-party conferencing system 300. In this embodiment, the type of the output information interacted with the same conference group 301 is not limited, for example, the output information interacted with the same conference group 301 may be audio information, audio and video information, or screen information displayed on a terminal screen, and specifically in this embodiment, the type is not limited, where please refer to fig. 1 for specific description of a hardware structure of the terminal, and details are not repeated.
The multi-party conferencing system 300 includes a server 303, the server 303 shown in this embodiment may be an independent computer server or a server cluster, and is not limited in this embodiment specifically, and this embodiment exemplifies the number of servers 303 included in the multi-party conferencing system 300 as one example, and it should be clear that the number of servers 303 and a specific device type are not limited in this embodiment.
The server 303 is connected to the terminal through a network 304, where the network 304 may be any network(s), such as the internet, a local area network, and an internet of things, and the present embodiment is not limited in this embodiment.
The terminal shown in this embodiment may be a personal computer 3011, a notebook computer 3012, a smart phone 3013, and a smart watch 3014 as shown in fig. 3, and the present embodiment does not limit the description of the specific device type of the terminal, for example, the device type of the terminal may also be any other computing device capable of implementing the interactive control method shown in this application, such as a palm computer, a personal digital assistant, a game console, a video player, a DVD recorder/player, and a television.
To better understand the hardware structure of the server 303 included in the multi-party conference system 300 shown in fig. 3, a specific structure of computer hardware of a server capable of implementing the interaction control method shown in this application is exemplarily described below with reference to fig. 4, where fig. 4 is a schematic structural diagram of an embodiment of the server provided in this invention.
The server 400 includes at least one processor 401, a communication bus 402, a memory 403, and at least one control interface 404.
For a detailed description of the processor 401, reference may be made to fig. 1, which is not described in detail. Communication bus 402 may include a path that transfers information between the above components. The control interface 404 is a control interface between the server 400 and the terminal, and in the case that the output information interacted between the plurality of terminals is audio information or audio/video information as shown in this embodiment, the control interface 404 may use a Session Initiation Protocol (SIP), and the output information interacted between the server and the terminal may use an SIP signaling, and it is to be clear that the protocol type adopted for information interaction between the server and the terminal is not limited in this embodiment.
The memory 403 is configured to store at least a second computer program 4031, and the second computer program 4031 is configured to execute a function required to be executed by the server side in the interaction control method described in this application. The processor 401 is configured to execute the second computer program 4031 stored in the memory 403, thereby implementing logical functions when the server executes the interactive control method shown in the present application.
In particular implementations, processor 401 may include one or more CPUs, such as CPU0 and CPU1 in fig. 4, as one embodiment.
In particular implementations, server 400 may include multiple processors, such as processor 401 and processor 408 in FIG. 4, for example, as an embodiment. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Based on the structures of the multi-party conferencing system shown in fig. 3, the terminal shown in fig. 1, and the server shown in fig. 4, the following describes an exemplary specific implementation flow of an embodiment of the interaction control method provided by the present application:
fig. 5 may be a schematic diagram of the interactive interface displayed by the terminal shown in fig. 1 or fig. 3, where the interactive interface 500 is displayed on the first terminal 501, the interactive interface 500 may display the identifications of all objects in a conference group of the multi-party conference, and all objects include an identification 502 of a first object and identifications 503 of a plurality of interactive objects, where the first object is a user using the first terminal 501, the first object may be any one of all objects in the group, the interactive objects are other objects in the conference group except the first object, and any one of the plurality of interactive objects may output information and send the information to the first object. By adopting the interaction control method shown in this embodiment, the first terminal 501 can individually adjust the output information of any interaction object to meet different requirements of the first object on the output modes of the output information of different interaction objects, and the following description is provided with reference to fig. 6 for describing how the first terminal independently adjusts the output information of each interaction object:
Step 601, the first terminal displays an interactive interface.
In this embodiment, the server may create a conference group, and all terminals located in the conference group may have the interactive interface displayed thereon. Optionally, if the first terminal shown in this embodiment acts as the initiator of the conference group, the first terminal may send a conference request to the server, where the conference request may include an identifier of the conference group, an identifier of the first terminal, or address information of the first terminal; the server creates the conference group based on the conference request and adds the first terminal that sent the conference request to the conference group. Optionally, after the server has successfully created the conference group, any second terminal that requests to join the conference group may send a conference joining request to the first terminal, the first terminal sends, to the server according to the conference joining request, a response message used for indicating agreement to join the conference group, and the server adds the second terminal to the created conference group based on the response message.
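For a better understanding only, the following is a minimal sketch, in Python, of the group creation and joining flow described above, using a simplified in-memory model; the class names, message fields, and the "agree" value are assumptions made for this sketch and are not prescribed by this embodiment.

```python
# Illustrative sketch only: a simplified in-memory model of creating a
# conference group and joining a second terminal to it.

class ConferenceServer:
    def __init__(self):
        self.groups = {}  # group identifier -> list of terminal identifiers

    def create_group(self, conference_request):
        # The conference request carries the group identifier and the
        # initiator's identifier (or address information).
        group_id = conference_request["group_id"]
        initiator = conference_request["terminal_id"]
        self.groups[group_id] = [initiator]   # the initiator joins on creation
        return group_id

    def join_group(self, group_id, second_terminal_id, response_from_initiator):
        # The second terminal is added only after the first terminal (the
        # initiator) returns a response message indicating agreement.
        if response_from_initiator == "agree":
            self.groups[group_id].append(second_terminal_id)


server = ConferenceServer()
gid = server.create_group({"group_id": "g1", "terminal_id": "first_terminal"})
server.join_group(gid, "second_terminal_1", "agree")
print(server.groups)   # {'g1': ['first_terminal', 'second_terminal_1']}
```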
It should be clear that this embodiment is described based on the multi-party conference system shown in fig. 3, in which a server is responsible for creating the conference group, and the plurality of objects that need to participate in the conference group are added to the conference group through the terminals that they respectively use. In other examples, the plurality of objects participate in the conference group through the nodes that they respectively use, and each node may be a virtual machine running on the same computer device or on different computer devices; in this scenario, the computer device implements the functions of the server shown in this embodiment, which is not described in detail.
The interactive interface displayed by the first terminal includes the identifiers of all the second terminals in the conference group in which the first terminal is located, where the second terminals are the terminals used by the interactive objects. The specific type of the identifier is not limited in this embodiment; for example, the identifier may be one or more of an account name, an avatar, or a nickname, in the multi-party conference system, of the interactive object that uses the second terminal. This embodiment is illustrated by taking the identifier being the avatar of the interactive object using the second terminal as an example. Optionally, the interactive interface may further display an identifier corresponding to the first object, and for a description of the identifier of the first object, reference may also be made to the description of the identifier of the interactive object, which is not repeated.
Step 602, the first terminal adjusts the relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface according to the operation event.
The operation event shown in this embodiment is used to adjust the relative distance between the identifier of the first object and the identifier of the interactive object in the interactive interface, so as to adjust the target output mode of each interactive object.
Optionally, the server may send an application update package to each terminal in the conference group in advance, where the application update package includes a target computer program; after the processor of the terminal runs the target computer program, the position of the identifier of each interactive object in the interactive interface displayed by the first terminal can be freely adjusted. In one implementation, the first object may perform an operation event on the identifier of the interactive object on the interactive interface, thereby adjusting the relative distance between the identifier of the first object and the identifier of the interactive object in the interactive interface. In another implementation, the first object may adjust the relative distance between the identifier of the first object and the identifier of the interactive object in the interactive interface by performing a mouse click event on the identifier of the interactive object, which is not limited in this embodiment.
It can be seen that, to implement independent adjustment of the output information of each interactive object in the conference group, the first object may determine, in the interactive interface, the identifier of the interactive object whose output information needs to be adjusted. The first object executes an operation event on the identifier of the interactive object on the interactive interface, so that the position of the identifier of the interactive object in the interactive interface moves under the action of the operation event, thereby adjusting the target output mode of the interactive object. The target output mode refers to a mode of outputting information, and optionally, the target output mode may be a numerical parameter of the output information, where the output information may be audio information, audio/video information, or the like. Optionally, if the output information is audio information, the target output mode is the decibel value at which the audio information of the interactive object is output at the first terminal. Optionally, if the output information is audio/video information, the target output mode is the resolution at which the video information of the interactive object is output at the first terminal and the decibel value at which the audio information is output at the first terminal. Optionally, if the output information is the brightness value of the identifier of the interactive object, the target output mode is the brightness value at which the identifier of the interactive object is displayed on the interactive interface; optionally, the target output mode may also be the color value of the identifier of the interactive object, and so on. The specific information type of the output information of the interactive object is not limited in this embodiment, and this embodiment is exemplarily illustrated by taking the output information being audio information as an example.
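For a better understanding only, the following is a minimal sketch, in Python, of one possible way to represent the target output mode as a numerical parameter whose meaning depends on the type of the output information enumerated above; the field names and example values are assumptions made for this sketch.

```python
# Illustrative sketch only: a container for the "target output mode"
# parameters named above (decibel value, resolution, brightness, color).

from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetOutputMode:
    decibel: Optional[float] = None      # audio information: output decibel value
    resolution: Optional[str] = None     # audio/video information: video resolution
    brightness: Optional[float] = None   # identifier: brightness value on the interface
    color: Optional[str] = None          # optionally, the identifier's color value

audio_mode = TargetOutputMode(decibel=60.0)
av_mode = TargetOutputMode(decibel=60.0, resolution="1280x720")
identifier_mode = TargetOutputMode(brightness=0.8)
```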
Step 603, the first terminal adjusts a target output mode of the output information of the interactive object according to the change of the relative distance.
The following describes an exemplary process of how the first terminal adjusts the target output mode of the interactive object according to the change of the relative distance, with reference to the accompanying drawings:
Optionally, the target coordinate system may be displayed on the interactive interface, or may not be displayed on the interactive interface, which is not limited in this embodiment. The target coordinate system in the interactive interface is used to calculate the position of the identifier of each interactive object in the interactive interface, and the target coordinate system shown in this embodiment is a one-dimensional coordinate system or a two-dimensional coordinate system. The following, with reference to fig. 7, takes as an example the case in which the first terminal displays a one-dimensional target coordinate system in the interactive interface 700:
First, the first terminal determines a first position in the interactive interface, where the first position is the position of the identifier of the first object in the interactive interface. Taking the example shown in fig. 7, the origin of the target coordinate system is the position of the identifier of the first object in the interactive interface.
The identifier 701 of the first object displayed in the interactive interface 700 is an avatar. The coordinates of the position of the identifier 701 of the first object in the target coordinate system may refer to the coordinates of a target pixel in the interactive interface 700, where the target pixel is any one of all the pixels included in the identifier 701 of the first object. Optionally, the target pixel is the pixel located at the center of all the pixels included in the identifier 701 of the first object.
Secondly, the first terminal determines the coordinates of the second position in the interactive interface.
There is an initial relative distance 702 between the coordinates of the first position of the identifier 701 of the first object (the origin of the target coordinate system) and the coordinates of the position of the identifier 703 of the interactive object, where the initial relative distance 702 is the absolute value of the difference between the coordinates of the position of the identifier 703 of the interactive object and the coordinates of the first position in the target coordinate system. For a specific description of determining the coordinates of the position of the identifier 703 of the interactive object, refer to the description of determining the coordinates of the first position of the identifier 701 of the first object, which is not repeated here.
The first object may execute an operation event on the identifier 703 of the interactive object in the interactive interface 700 displayed on a touch display screen of the first terminal. Specifically, the operation event is a drag operation event that can select the identifier 703 of the interactive object and drive the identifier 703 of the interactive object to move in the interactive interface; for a specific description of the touch display screen and of how the touch display screen determines coordinates according to the drag operation event, refer to the embodiment shown in fig. 1, and details are not described in this embodiment. Optionally, taking the example shown in fig. 7, the first object may input the drag operation event on the identifier 703 of the interactive object with a finger.
As shown in fig. 8 as an example, after the first object finishes dragging the identifier 703 of the interactive object through the drag operation event, the first terminal determines the coordinates of a second position, where the coordinates of the second position are the coordinates of the position of the identifier 703 of the interactive object in the interactive interface 700 after the drag operation event. As shown in fig. 8, the coordinates of the position of the identifier 803 of the interactive object are closer to the coordinate origin of the target coordinate system than before the drag operation event.
Next, the first terminal determines a target relative distance 802, where the target relative distance 802 is the distance between the coordinates of the first position and the coordinates of the second position in the target coordinate system. Continuing as shown in fig. 8, the target relative distance 802 is less than the initial relative distance 702.
Finally, the first terminal determines the target output mode of the interactive object corresponding to the identifier 703 according to the target relative distance. As described above, this embodiment takes as an example that the target output mode is the decibel value at which the audio information is output. In this case, the first terminal only needs to set the decibel value at which the audio information of the interactive object is output at the first terminal to be in an inverse correlation with the target relative distance, that is, the closer the identifier of the interactive object is to the identifier of the first object, the greater the decibel value at which the audio information of the interactive object is output at the first terminal, and the farther the identifier of the interactive object is from the identifier of the first object, the smaller the decibel value at which the audio information of the interactive object is output at the first terminal. In the above example, the inverse correlation between the size of the target output mode and the size of the target relative distance is taken as an example for illustration; in other examples, the size of the target output mode and the size of the target relative distance may also be in a positive correlation, a functional relationship, or the like, which is not specifically limited in this embodiment.
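For a better understanding only, the following is a minimal sketch, in Python, of mapping the target relative distance to a playback decibel value under the inverse correlation described above; the linear formula, the constants, and the coordinate values are assumptions made for this sketch, since this embodiment only requires that a smaller distance yields a larger decibel value.

```python
# Illustrative sketch only: one-dimensional target coordinate system, with the
# identifier of the first object at the origin.

def relative_distance(first_pos: float, object_pos: float) -> float:
    # The distance is the absolute value of the coordinate difference.
    return abs(object_pos - first_pos)

def target_decibel(distance: float, max_db: float = 80.0, min_db: float = 20.0,
                   max_distance: float = 500.0) -> float:
    # Linear inverse mapping clamped to [min_db, max_db]:
    # smaller distance -> larger decibel value.
    ratio = min(distance, max_distance) / max_distance
    return max_db - ratio * (max_db - min_db)

initial = relative_distance(0.0, 400.0)   # before the drag operation event
target = relative_distance(0.0, 100.0)    # after the drag operation event
print(target_decibel(initial), target_decibel(target))  # 32.0 68.0
```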
In a specific application, as shown in fig. 9, fig. 9 shows that the identifiers of a plurality of interactive objects are displayed on an interactive interface 900 displayed by the first terminal, and the first object has completed a drag operation event on the identifier of each interactive object on the interactive interface. In the specific application process, taking fig. 9 as an example, the first object drags the identifier 901 of an interactive object, by executing a drag operation event on the identifier 901, to the position whose relative distance to the identifier 902 of the first object is the shortest, so that the decibel value of the audio information output by the interactive object corresponding to the identifier 901 is the largest among all the interactive objects displayed on the interactive interface 900. The first object drags the identifier 903 of an interactive object, by executing a drag operation event on the identifier 903, to a position whose relative distance to the identifier 902 of the first object is also the shortest, and the relative distance 904 between the identifier 901 and the identifier 902 is equal to the relative distance 905 between the identifier 903 and the identifier 902, so that the audio information of the interactive object corresponding to the identifier 903 and the audio information of the interactive object corresponding to the identifier 901 are output at the first terminal at an equal volume.
Similarly, the first object drags the identifier 910 of an interactive object, by inputting a drag operation event on the identifier 910, to the position whose relative distance to the identifier 902 of the first object is the farthest. As shown in fig. 9, the relative distance 906 between the identifier 910 and the identifier 902 is greater than the relative distance 904 between the identifier 901 and the identifier 902, so that the decibel value of the audio information output by the interactive object corresponding to the identifier 910 is the smallest among all the interactive objects displayed on the interactive interface 900. The identifier 907 of an interactive object is located between the identifier 901 and the identifier 910; the decibel value at which the audio information of the interactive object corresponding to the identifier 907 is output at the first terminal is greater than that of the interactive object corresponding to the identifier 910, and smaller than that of the interactive object corresponding to the identifier 901.
In this embodiment, the identifiers of the interactive objects located in the interactive interface may be freely moved under the action of the drag operation event, the specific positions of the identifiers of the interactive objects in the interactive interface are not limited, and the shapes formed by the identifiers of all the interactive objects in the interactive interface are not limited, for example, the identifiers of all the interactive objects in the interactive interface may be in any shape such as a linear structure, an arc structure, a circular structure, a square structure, and the like.
Step 604, the first terminal outputs the output information of the interactive object according to the adjusted target output mode.
For the specific execution process of step 604 shown in this embodiment, refer to step 204 shown in fig. 2, and details are not described in this embodiment.
By adopting the method shown in this embodiment, in a conference group of the multi-party conference system, the interactive interface of the first terminal displays the identifiers of the multiple interactive objects, and the first terminal adjusts the target output mode of the output information of the interactive objects by adjusting the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface, so that the output information of the interactive objects is output in the target output mode on the first terminal, thereby realizing the independent adjustment of the target output mode of the output information of each interactive object interacting with the first terminal in the conference group.
In practical applications, because of the needs of a specific conference, the first object may need to create a mini-conference group within the conference group, where the mini-conference group includes part of the interactive objects in the conference group. For example, if there are four interactive objects A, B, C, and D in the conference group, but because of the needs of the conference the first object only needs to acquire the output information of the interactive objects A and B, then, to create the mini-conference group in the conventional manner, the first object needs to disband the currently established conference group and re-create a mini-conference group for the first object and the interactive objects A and B. Obviously, such a process of creating the mini-conference group is cumbersome and reduces the efficiency of creating the mini-conference group. The following description, with reference to fig. 10, exemplifies the process in which the first terminal shown in this embodiment creates the mini-conference group within the conference group:
With the embodiment shown in fig. 10, the first terminal may directly create the mini-conference group within the conference group without disbanding the conference group. That is, on the first terminal side, the target output modes corresponding to all the interactive objects located in the mini-conference group are the sound output mode, and the target output mode corresponding to an interactive object located outside the mini-conference group is the mute mode; in other words, for the first object, if an interactive object is located in the conference group but not in the mini-conference group, the output information of that interactive object is in the mute mode on the first terminal.
The following description, with reference to fig. 10, illustrates the process in which the first terminal directly creates the mini-conference group within the conference group without exiting the conference group:
step 1001, the first terminal displays an interactive interface.
Step 1002, the first terminal adjusts the relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface according to the operation event.
For details of the specific execution process from step 1001 to step 1002 shown in this embodiment, please refer to step 601 to step 602 shown in fig. 6, which is not described in detail in this embodiment.
Step 1003, the first terminal determines the identifier of the first interactive object and the identifier of the second interactive object according to the relative distance.
In this embodiment, the interactive interface displayed by the first terminal includes the identifiers of a plurality of interactive objects, and the first terminal may determine, according to the relative distance between the identifier of each interactive object and the identifier of the first object, a first interactive object and a second interactive object among the plurality of interactive objects. In this embodiment, taking the output information of each interactive object being audio information as an example, the target output mode of the first interactive object is the sound output mode, that is, the first terminal outputs the audio information of the first interactive object in the sound output mode, and the target output mode of the second interactive object is the mute mode, that is, the first terminal outputs the audio information of the second interactive object in the mute mode.
The following describes how the first terminal determines the first interactive object and the second interactive object among the plurality of interactive objects:
the first object may execute a drag operation event on each identifier displayed on the interactive interface according to a requirement for creating the mini-conference group, and please refer to the above description for the description of the drag operation event, which is not described in detail in this embodiment.
The first terminal may receive a target number L input by the first object, where the target number L is used to indicate the number of first interactive objects joining the mini-conference group. In this example, there are M interactive objects in the conference group; as can be seen from the above description of creating the mini-conference group within the conference group, M and L in this example are positive integers greater than 1, and M is greater than L. For example, if M is equal to 15 and L is equal to 5, it indicates that the interactive interface of the first terminal displays 15 interactive objects and the first object needs to create a mini-conference group including 5 first interactive objects, so that the audio information of the 5 first interactive objects in the mini-conference group is in the sound output mode at the first terminal, and the audio information of the other 10 second interactive objects is in the mute mode at the first terminal.
In a case that the first object has executed a drag operation event on the identifier of each interactive object in the interactive interface and the first terminal has received the target number L, the first terminal detects the relative distance between the identifier of the first object and the identifier of each interactive object on the interactive interface; for the specific process of detecting the target relative distance, refer to the foregoing description, and details are not described in this embodiment.
After the first terminal detects the target relative distances respectively corresponding to all the interactive objects in the interactive interface, the first terminal may sort all the target relative distances in ascending order to form a target list, and the first terminal may determine the interactive objects ranked in the first L positions of the target list as the first interactive objects added to the mini-conference group.
Continuing with the example shown in fig. 9, and taking the target number L as 4, the mini-conference group created by the first object includes the first object and 4 first interactive objects. The first terminal determines that, of all the interactive objects included in the interactive interface 900, the identifiers sorted in ascending order of target relative distance are the identifier 901, the identifier 903, the identifier 907, the identifier 908, the identifier 910, and the identifier 909. If the target number L is 4, the mini-conference group created by the first terminal includes the identifier 902, the identifier 901, the identifier 903, the identifier 907, and the identifier 908, so that the audio information of the first interactive objects corresponding to the identifier 901, the identifier 903, the identifier 907, and the identifier 908 is output in the sound output mode at the first terminal; the audio information of the second interactive object corresponding to either of the identifiers 910 and 909 is in the mute mode at the first terminal, so that the first object cannot acquire, through the first terminal, the audio information of the interactive object corresponding to either of the identifiers 910 and 909.
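For a better understanding only, the following is a minimal sketch, in Python, of selecting the first interactive objects of the mini-conference group by sorting the target relative distances in ascending order and taking the first L entries; the numeric distance values are assumptions made for this sketch, since fig. 9 does not give concrete values.

```python
# Illustrative sketch only: split interactive objects into first interactive
# objects (sound output mode) and second interactive objects (mute mode).

def split_mini_group(distances, target_number_l):
    # distances: identifier of interactive object -> target relative distance
    ordered = sorted(distances, key=distances.get)   # ascending ("from small to large")
    first = ordered[:target_number_l]                # first interactive objects: sound output mode
    second = ordered[target_number_l:]               # second interactive objects: mute mode
    return first, second

distances = {"901": 1.0, "903": 1.0, "907": 2.0, "908": 2.5, "910": 3.0, "909": 3.5}
first_objects, second_objects = split_mini_group(distances, target_number_l=4)
print(first_objects)   # ['901', '903', '907', '908']
print(second_objects)  # ['910', '909']
```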
Step 1004, the first terminal determines a first target output mode and a second target output mode.
When the first terminal has determined the first interactive object and the second interactive object in the interactive interface, the first terminal may determine a first target output mode of the first interactive object, where the first target output mode is the sound output mode, and the first terminal may also determine a second target output mode of the second interactive object, where the second target output mode is the mute mode.
Step 1005, the first terminal outputs the output information of the first interactive object according to the first target output mode.
As can be seen from the above description, since the mini-conference group created by the first terminal in the interactive interface includes the first interactive object, the first terminal outputs the audio information of the first interactive object in the sound output mode, so that the first object can acquire the audio information of the first interactive object.
Optionally, within the mini-conference group, the first terminal may further adjust the target output mode of the first interactive object according to the relative distance between the identifier of the first object and the identifier of the first interactive object; for a specific description of adjusting the target output mode of the first interactive object according to the relative distance, refer to the embodiment shown in fig. 6, and details are not described in this embodiment.
Step 1006, the first terminal outputs the output information of the second interactive object according to the second target output mode.
If the mini-conference group created by the first terminal in the interactive interface does not include the second interactive object, the first terminal outputs the audio information of the second interactive object in the mute mode, so that the first object cannot acquire the audio information of the second interactive object.
By adopting the method shown in this embodiment, the first object using the first terminal can directly create the mini-conference group within the conference group. In the process of creating the mini-conference group, without disbanding the conference group, the target output mode of the first interactive object located in the mini-conference group can be the sound output mode, and the target output mode of the second interactive object located outside the mini-conference group is the mute mode, so that the first terminal outputs only the audio information of the first interactive object located in the mini-conference group in the sound output mode, while the audio information of the second interactive object outside the mini-conference group is output in the mute mode. With the method shown in this embodiment, the first object can directly create the mini-conference group within the conference group without quitting the conference group and re-creating a group, so that the number of times a conference group needs to be created is reduced and the conference interaction efficiency is improved.
The embodiment shown in fig. 10 illustrates how to create the mini-conference group based on the relative distance between the identifier of the first object and the identifier of the interactive object in the interactive interface; another implementation of creating the mini-conference group is illustrated below in conjunction with fig. 11:
Step 1101, the first terminal displays an interactive interface.
Step 1102, the first terminal detects an operation event.
For details of the specific execution process of step 1101 to step 1102 shown in this embodiment, please refer to step 201 to step 202 shown in fig. 2, and the specific execution process is not described in detail in this embodiment.
Step 1103, the first terminal determines the identifier of the first interactive object and the identifier of the second interactive object according to the operation event.
In this embodiment, if a mini-conference group needs to be created within the conference group, the first object may move the identifiers in the interactive interface by executing a drag operation event on the identifier of each interactive object, and this example does not limit the positions to which the identifiers are moved. Taking the example shown in fig. 12, after the first object finishes inputting the drag operation events on the identifiers in the interactive interface, the first terminal may detect the specific position of the identifier of each interactive object displayed in the interactive interface 1200.
Specifically, the first terminal detects, one by one, the positions of the identifiers of all the interactive objects displayed on the interactive interface 1200, and the first terminal detects whether the identifier of at least one other interactive object exists between the identifier 1201 of an interactive object and the identifier 1202 of the first object. As shown in the example of fig. 12, there is no identifier of another interactive object between the identifier 1201 of the interactive object and the identifier 1202 of the first object, and therefore the first terminal determines that the interactive object corresponding to the identifier 1201 is a first interactive object, whose target output mode is the sound output mode; this embodiment does not limit the specific decibel value at which the audio information of the first interactive object corresponding to the identifier 1201 is output at the first terminal.
Similarly, since no identifier of another interactive object exists between each of the identifiers 1203, 1204, and 1205 in the interactive interface 1200 and the identifier 1202 of the first object, the first terminal determines that the target output mode of the audio information, at the first terminal, of the first interactive object corresponding to each of the identifiers 1203, 1204, and 1205 is the sound output mode.
Continuing with fig. 12, the first terminal detects whether the identifier of at least one other interactive object exists between the identifier 1206 of an interactive object and the identifier 1202 of the first object. As shown in the example of fig. 12, the identifier 1201 is located between the identifier 1206 of the interactive object and the identifier 1202 of the first object, and therefore the first terminal determines that the target output mode of the audio information, at the first terminal, of the second interactive object corresponding to the identifier 1206 is the mute mode. Similarly, since the identifier of another interactive object is located between the identifier 1207 on the interactive interface 1200 and the identifier 1202 of the first object, the first terminal determines that the target output mode of the audio information, at the first terminal, of the second interactive object corresponding to the identifier 1207 is the mute mode.
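For a better understanding only, the following is a minimal sketch, in Python, of this check under a one-dimensional simplification: an interactive object is treated as a first interactive object (sound output mode) when no identifier of another interactive object lies between its identifier and the identifier of the first object, and as a second interactive object (mute mode) otherwise. The coordinate values are assumptions made for this sketch.

```python
# Illustrative sketch only: the identifier of the first object sits at
# coordinate first_pos; each interactive object has one coordinate.

def classify(first_pos, positions):
    modes = {}
    for obj, pos in positions.items():
        lo, hi = sorted((first_pos, pos))
        # Is some other identifier strictly between this one and the first object?
        blocked = any(lo < other_pos < hi
                      for other, other_pos in positions.items() if other != obj)
        modes[obj] = "mute" if blocked else "sound"
    return modes

positions = {"1201": 1.0, "1203": -1.0, "1206": 2.0, "1207": -2.5}
print(classify(0.0, positions))
# {'1201': 'sound', '1203': 'sound', '1206': 'mute', '1207': 'mute'}
```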
Optionally, when the first object creates the mini-conference group through the drag operation events input on the identifiers in the interactive interface, the decibel value at which the first terminal outputs the audio information of the interactive object corresponding to each identifier in the mini-conference group may refer to the description shown in fig. 6, that is, the decibel value of the audio information of each first interactive object output by the first terminal is determined according to the size of the target relative distance; for details, refer to fig. 6, which is not described in detail in this example.
Step 1104, the first terminal determines a first target output mode and a second target output mode.
Step 1105, the first terminal outputs the output information of the first interactive object according to the first target output mode.
Step 1106, the first terminal outputs the output information of the second interactive object according to the second target output mode.
For details of the specific execution process between step 1104 and step 1106 shown in this embodiment, please refer to step 1001 to step 1006 shown in fig. 10, and the detailed execution process is not repeated in this embodiment.
By adopting the method shown in this embodiment, the first object using the first terminal can directly create the mini-conference group within the conference group. In the process of creating the mini-conference group, without disbanding the conference group, the target output mode of the first interactive object located in the mini-conference group can be the sound output mode, and the target output mode of the second interactive object located outside the mini-conference group is the mute mode, so that the first terminal outputs only the audio information of the first interactive object located in the mini-conference group in the sound output mode, while the audio information of the second interactive object outside the mini-conference group is output in the mute mode. With the method shown in this embodiment, the first object can directly create the mini-conference group within the conference group without quitting the conference group and re-creating a group, so that the number of times a conference group needs to be created is reduced and the conference interaction efficiency is improved.
The above describes the implementation process of the interaction control method in the scene of the multi-party conference system; the implementation process of the interaction control method in a live broadcast scene is described as follows:
With the development of multimedia technology, enjoying colorful multimedia services on various terminals has become a current trend. A user can interact with other users in a live broadcast manner to exchange output information, where the output information may be audio information, audio/video information, image-text information, and the like, so live broadcast applications are increasing, for example, social chat room applications, game applications, online singing applications, and the like. The following description, in combination with fig. 13, exemplarily illustrates the structure of a live broadcast system. As shown in fig. 13, the live broadcast system shown in this embodiment includes: a first terminal 1301, a live broadcast management server 1303, and a live broadcast server 1304.
The live broadcast system shown in this embodiment includes a plurality of first terminals 1301, and this embodiment does not limit the specific number of first terminals 1301, as long as the plurality of first terminals 1301 located in the live broadcast system are used to interact output information; for a description of the specific type of the output information, refer to the above, and details are not described in this embodiment. A live broadcast client runs in the first terminal 1301; for the description of the specific device type of the first terminal 1301 shown in this embodiment, refer to the description above, which is not repeated. The live broadcast client shown in this embodiment is an application for implementing live broadcast on the first terminal 1301, such as a social application, a game application, an online singing application, and the like, which is not specifically limited in this embodiment. The first terminal 1301 running the live broadcast client can acquire and push output information for live broadcast, and the live broadcast management server 1303 is used for scheduling the live broadcast server 1304 and creating a live broadcast channel.
specifically, the first terminal 1301 may initiate a push streaming request to the live broadcast management server 1303 through the internet 1305, and the live broadcast management server 1303 schedules a live broadcast server 1304, and creates a live broadcast channel on the live broadcast server 1304; the live broadcast management server 1303 records the correspondence between the created live broadcast channel identifier and the Internet (Internet) address (IP address) of the scheduled live broadcast server 1304, and the live broadcast management server 1303 responds to the address of the stream pushing url to the first terminal 1301. The push stream uniform resource locator address comprises a created live broadcast channel identifier and an IP address of a scheduled live broadcast server; after receiving the address of the stream pushing url, the first terminal 1301 pushes output information for live broadcasting to the scheduled live broadcasting server 1304, the scheduled live broadcasting server 1304 receives the output information and stores the output information in a storage location associated with a live broadcasting channel identifier, and then all terminals connected to the live broadcasting channel can share the output information, for example, if a chat room is created on the live broadcasting channel, all terminals located in the chat room can share the output information, and if a game is created on the live broadcasting channel, online commanding of a game team can be realized based on the live broadcasting system, which takes the creation of the live broadcasting channel as an example of the chat room:
based on the architecture of the live broadcast system shown in fig. 13, an implementation process of the interaction control method shown in this embodiment is exemplarily described below with reference to fig. 14:
Step 1401, the first terminal displays an interactive interface.
An exemplary description of the interactive interface is given below with reference to fig. 15. A chat room created on the live broadcast channel includes a first object A, an interactive object B, an interactive object C, an interactive object D, and an interactive object E; for a specific description of the interactive objects, refer to the above description, which is not repeated. The first object A is the object that has joined the chat room using the first terminal 1301; the first terminal 1301 displays an interactive interface 1501, and an identifier 1502 of the first object A, an identifier 1503 of the interactive object B, an identifier 1504 of the interactive object C, an identifier 1505 of the interactive object D, and an identifier 1506 of the interactive object E are displayed on the interactive interface 1501. Optionally, for a specific description of the identifier shown in this embodiment, refer to the embodiment shown above. Optionally, in the scene of the live broadcast system, the identifier may also be a live broadcast window; in a case that audio/video information needs to be shared among the objects in the chat room, the live broadcast window is configured to output the audio/video information corresponding to the identifier, for example, the audio/video information of the first object A is output in the identifier 1502, the audio/video information of the interactive object B is output in the identifier 1503, the audio/video information of the interactive object C is output in the identifier 1504, the audio/video information of the interactive object D is output in the identifier 1505, and the audio/video information of the interactive object E is output in the identifier 1506.
Step 1402, the first terminal adjusts a target output mode according to the relative position of the interactive object identifier on the interactive interface.
Optionally, the relative position shown in this embodiment may refer to a relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface, and for a specific description, please refer to fig. 6 in detail, which is not limited in this embodiment.
Optionally, the relative position shown in this embodiment may also refer to the position of the identifier of the interactive object itself in the interactive interface. How the first terminal determines the target output mode according to the relative position of the identifier of the interactive object is described as follows:
First, the first terminal receives an operation event input by the first object on the identifier of the interactive object.
The operation event shown in this embodiment is a zoom operation event, where the zoom operation event specifically includes a zoom-in operation event and a zoom-out operation event: the zoom-in operation event is used to enlarge the area occupied by the identifier of the interactive object in the interactive interface, and the zoom-out operation event is used to reduce the area occupied by the identifier of the interactive object in the interactive interface.
When the first terminal detects that the identifier of the interactive object has received a zoom operation event, the first terminal adjusts the target output mode according to the zoom operation event. For example, if the first terminal detects that the identifier of the interactive object has received a zoom-in operation event, the size of the target output mode is increased; for example, when the output information of the interactive object is audio/video information, increasing the size of the target output mode means increasing the volume of the audio/video information of the interactive object and/or increasing the resolution of the audio/video file. For another example, if the first terminal detects that the identifier of the interactive object has received a zoom-out operation event, the size of the target output mode is reduced; for example, in a case that the output information of the interactive object is audio/video information, reducing the size of the target output mode means reducing the volume of the audio/video information of the interactive object and/or reducing the resolution of the audio/video file.
For a better understanding, the following description is made in conjunction with fig. 16. Fig. 16 includes a diagram A and a diagram B, where diagram A is a schematic diagram of the identifier 1503 of the interactive object before receiving the zoom-in operation event, and diagram B is a schematic diagram of the identifier 1503 of the interactive object after receiving the zoom-in operation event. As shown in diagram A of fig. 16, if the first object needs to increase the target output mode in which the audio/video information of the interactive object B is output at the first terminal 1301, the first object may execute the zoom-in operation event on the identifier 1503 of the interactive object B. After the identifier 1503 of the interactive object B undergoes the zoom-in operation event, as shown in diagram B of fig. 16, the area of the identifier 1503 of the interactive object B is increased, that is, the area of the identifier 1503 of the interactive object B in diagram B is larger than the area of the identifier 1503 of the interactive object B in diagram A, and the first terminal may increase the size of the target output mode of the audio/video information of the interactive object B.
In the above example, the positive correlation between the area of the identifier of the interactive object and the size of the target output mode is taken as an example; in other examples, the area of the identifier of the interactive object and the size of the target output mode may also be in an inverse correlation, a functional relationship, or the like, which is not specifically limited in this embodiment, as long as the terminal can adjust the size of the target output mode according to the change of the area of the identifier of the interactive object in the interactive interface.
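For a better understanding only, the following is a minimal sketch, in Python, of adjusting the target output mode according to the change of the area occupied by the identifier in the interactive interface, using the positive correlation described above; the scaling formula, the clamping range, and the pixel values are assumptions made for this sketch.

```python
# Illustrative sketch only: a zoom-in operation event enlarges the area and
# raises the volume; a zoom-out operation event shrinks the area and lowers it.

def adjust_by_area(old_area: float, new_area: float, current_volume: float) -> float:
    scale = new_area / old_area                  # >1 after zoom-in, <1 after zoom-out
    return max(0.0, min(100.0, current_volume * scale))

# Identifier 1503 of interactive object B is enlarged from 100x100 to 150x150 pixels.
print(adjust_by_area(100 * 100, 150 * 150, current_volume=40.0))  # 90.0
```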
Step 1403, the first terminal outputs the output information of the interactive object according to the adjusted target output mode.
For details of the specific execution process of step 1403 shown in this embodiment, please refer to step 204 shown in fig. 2, and the specific execution process is not described in detail in this embodiment.
By adopting the method shown in this embodiment, in a chat room of a live broadcast system, a plurality of identifiers of interactive objects are displayed on an interactive interface of a first terminal used by a first object, and the first terminal adjusts a target output mode of the interactive objects by adjusting the relative positions of the identifiers of the interactive objects on the interactive interface, so that output information of the interactive objects is output in the target output mode on the first terminal, and thus, independent adjustment of the target output mode output by the first terminal of the output information of each interactive object interacting with the first object is realized.
The above describes the implementation process of the interaction control method in the scene of the multi-party conference system and the scene of the live broadcast system; the implementation process of the interaction control method in a virtual reality (VR) scene is described as follows:
first, referring to fig. 17, an architecture of virtual reality will be described exemplarily, and fig. 17 is an exemplary diagram of an architecture of a VR system including a server 1702 and a display device 1703 coupled via a network 1701, and the display device 1703 is further coupled with a VR device 1704.
The present embodiment exemplifies an application of the VR system to a game scene, where the server 1702 may be, for example, an independent computer server or a server cluster, the server 1702 is used for storing game data, and the server 1702 is further used for responding to various requests of the VR device 1704 and the display device 1703. The network 1701 may be any network(s) such as the internet, a local area network, and the internet of things. The display device 1703 is, for example, a computer device with a separate display screen and certain processing capability, and may be: a personal computer, laptop computer, computer workstation, server, mainframe computer, palmtop computer, personal digital assistant, smart phone, smart appliance, game console, DVD recorder/player, television, home entertainment system, or any other suitable computing device. The display device 1703 may store VR player software, and when the VR player is started, the player requests and receives various game data from the server 1702, and renders and plays the game data downloaded to the display device side in the player.
In this example, the VR device 1704 is an external head-mounted display device, and may perform various interactions with the display device 1703 and the server 1702, and transmit some information related to the current situation of the user, such as the angle of the user's field of view, the change of the eye sight line, etc., to the display device 1703 and/or the server 1702 through signal transmission, and the display device 1703 may flexibly process the currently played video data according to the information.
In the above embodiment, the VR device 1704 is an external head-mounted display device, but those skilled in the art will appreciate that the VR device 1704 is not so limited, and the VR device 1704 may also be an integral head-mounted display device. The integral head display device itself has the function of a display screen, so that an external display device may not be required, for example, in this case, if the integral head display device is used, the display device 1703 may be omitted. At this time, the integrated head display device undertakes the work of acquiring video data from the server 1702 and playing, and at the same time, the integrated head display device detects some current viewing angles of the user and adjusts the playing work.
Based on the architecture of the VR system shown in fig. 17, the following describes an exemplary implementation process of the interaction control method with reference to fig. 18:
Step 1801, the VR device displays an interactive interface.
When the interaction control method shown in this embodiment is applied to a game scene, the interactive interface displayed by the VR device may be a game interface, and the game interface may display any scene interface of the game, which is not specifically limited in this embodiment. Specifically, the interactive interface shown in this embodiment displays the identifiers of a plurality of interactive objects; the specific type of the interactive object is not limited in this embodiment, for example, the interactive object may be another player participating in the game, or the interactive object may be a non-player character (NPC) of the game.
Step 1802, the VR device adjusts the target output mode according to the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface.
In the scene of the VR system, the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface may be a target included angle. Specifically, the VR device may determine the target output mode according to the target included angle, where the target included angle is the included angle between the identifier of the interactive object and the optical axis of a target view field angle, the target view field angle is the view field angle of the VR device and is used to measure the maximum view field range that a user wearing the VR device can "see", and the identifier of the first object is the vertex of the target view field angle.
For better understanding, the present embodiment is illustrated with the interactive interface shown in fig. 19:
As shown in fig. 19, the interactive interface 1900 is what the first object wearing the VR device can see through the VR device. Specifically, in the interactive interface, the vertex 1901 of the target view field angle is the position of the lens of the VR device; that is, the target view field angle is the angle formed by the two edges 1902 of the maximum range of the view field with the vertex 1901 as the starting point. The target view field angle determines the view field range of the VR device: the larger the target view field angle is, the larger the view field of the first object wearing the VR device is. The size of the target view field angle may be determined by the configuration of the VR device itself, or may be set by the first object according to its own needs, which is not limited in this embodiment.
Continuing with fig. 19, the optical axis of the target view field angle is the symmetry axis 1903 of the target view field angle, and the specific process in which the VR device adjusts the target output mode according to the target included angle may be that the VR device determines that the size of the target output mode of the interactive object and the size of the target included angle are in an inverse correlation relationship.
In this embodiment, the target output mode may be the volume of the audio information of the interactive object, the resolution of the video information, the brightness value of the identifier of the interactive object, or the intensity of other sensory information of the interactive object such as heat, smell, and pressure, which is not specifically limited in this embodiment.
Continuing with the example shown in fig. 19, the identifier 1904 of an interactive object is located on the optical axis of the target view field angle, so its target included angle is 0 degrees, and the target included angle 1906 between the identifier 1905 of an interactive object and the optical axis 1903 is 30 degrees. It can be seen that the target included angle of the identifier 1905 of the interactive object is greater than the target included angle of the identifier 1904 of the interactive object, and therefore the target output mode of the identifier 1905 of the interactive object is smaller than the target output mode of the identifier 1904 of the interactive object; for example, the brightness value of the identifier 1904 of the interactive object in the interactive interface 1900 is greater than the brightness value of the identifier 1905 of the interactive object.
Optionally, if the identifier 1907 of an interactive object is located outside the target view field angle, the VR device may determine to close the output information of the interactive object corresponding to the identifier 1907. For example, if the output information is audio/video information, the VR device may not display the audio/video information of the interactive object in the interactive interface; if the output information is the brightness value of the identifier, the VR device may not display the identifier of the interactive object in the interactive interface, which is not specifically described in this embodiment.
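For a better understanding only, the following is a minimal sketch, in Python, of deriving the target output mode from the target included angle between the identifier of an interactive object and the optical axis of the target view field angle; the linear fall-off and the 90-degree view field value are assumptions made for this sketch, since this embodiment only requires an inverse correlation and allows output to be closed for identifiers outside the target view field angle.

```python
# Illustrative sketch only: scale factor for the target output mode (volume,
# resolution, brightness, ...) as a function of the target included angle.

def output_scale(target_angle_deg: float, field_of_view_deg: float = 90.0) -> float:
    half_fov = field_of_view_deg / 2.0
    if target_angle_deg > half_fov:           # outside the target view field angle
        return 0.0                             # output information is closed
    return 1.0 - target_angle_deg / half_fov   # inverse correlation with the angle

print(output_scale(0.0))    # 1.0   -> identifier 1904, on the optical axis
print(output_scale(30.0))   # ~0.33 -> identifier 1905
print(output_scale(60.0))   # 0.0   -> identifier 1907, outside the view field angle
```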
Optionally, the above exemplary illustration takes the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface as the target included angle; the relative position relationship shown in this embodiment may also be the relative distance between the gaze point of the line of sight of the first object wearing the VR device and the identifier of the interactive object.
The VR device shown in this embodiment has an eyeball tracking function, which means that the VR device can capture the rotation of the eyeballs of the first object and thereby calculate the position of the gaze point of the eyes. Optionally, the VR device may capture an infrared image of the eyeballs of the first object through a near-infrared camera, perform image processing, and then solve for the gaze point through a pre-established mathematical model of the human eye.
In this embodiment, the relative distance between the gaze point of the line of sight of the first object and the identifier of the interactive object and the size of the target output mode of the interactive object may be in an inverse correlation, that is, the closer the identifier of the interactive object is to the gaze point of the first object's line of sight, the larger the target output mode of the interactive object is, and the farther the identifier of the interactive object is from the gaze point of the first object's line of sight, the smaller the target output mode of the interactive object is. Optionally, the size of the relative distance between the gaze point of the first object's line of sight and the identifier of the interactive object and the size of the target output mode of the interactive object shown in this embodiment may also be in a positive correlation, a functional relationship, or the like, which is not limited in this embodiment.
Step 1803, the VR device outputs the output information of the interactive object according to the adjusted target output mode.
The specific implementation process of step 1803 shown in this embodiment may be shown in step 203 shown in fig. 2, and the specific implementation process is not described in detail.
Optionally, this embodiment exemplarily illustrates adjusting the target output mode according to the size of the included angle between the identifier of each interactive object and the optical axis of the target view field angle. In other optional examples, if an identifier included in the interactive interface displayed by the VR device corresponds to another player wearing a VR device, each player participating in the VR system may further wear a sensor for collecting a physiological parameter of the player. This embodiment does not limit the specific type of the sensor or of the physiological parameter; for example, the physiological parameter collected by the sensor may be skin temperature, heartbeat, blood pressure, sweat metabolic quantity, and the like. Optionally, the size of the target output mode for outputting the output information of the interactive object and the size of the physiological parameter are in a positive correlation; for example, the faster the heartbeat of a player is, the larger the brightness value displayed in the interactive interface by the identifier corresponding to that player is, and the higher the skin temperature of a player is, the greater the decibel value of the audio information of that player is, and the like, which is not specifically limited in this embodiment.
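For a better understanding only, the following is a minimal sketch, in Python, of the positive correlation between a player's physiological parameter and the target output mode of that player's identifier described above; the normalisation range and the heart-rate values are assumptions made for this sketch.

```python
# Illustrative sketch only: faster heartbeat -> larger brightness value of the
# identifier corresponding to that player.

def brightness_from_heart_rate(heart_rate_bpm: float,
                               low: float = 60.0, high: float = 180.0) -> float:
    ratio = (heart_rate_bpm - low) / (high - low)
    return max(0.0, min(1.0, ratio))   # clamp to [0, 1]

print(brightness_from_heart_rate(75.0))   # 0.125
print(brightness_from_heart_rate(150.0))  # 0.75
```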
By adopting the method shown in this embodiment, in the VR system the identifiers of multiple interactive objects are displayed on the interactive interface of the VR device, and the target output mode corresponding to each interactive object can be adjusted according to the size of the included angle between the identifier of that interactive object and the optical axis of the target field angle, so that the output information of each interactive object is output on the VR device in its target output mode. This achieves the purpose of independently adjusting, within the VR system, the output mode in which the output information of each interactive object is output on the VR device.
The following description will be made with reference to fig. 20 to illustrate a process of how the first terminal outputs the output information of each interactive object in the target output mode:
step 2001, the first terminal sends the correspondence to the server.
In this embodiment, the correspondence includes a correspondence between the identifier of an interactive object and the adjusted target output mode; for a specific description of how the first terminal obtains the adjusted target output mode, see the foregoing embodiments, which is not described in detail. Optionally, if the method shown in this embodiment is applied to the multi-party conferencing system shown in fig. 3, the server shown in this embodiment may be the server 303 shown in fig. 3; if the method is applied to the live broadcast system shown in fig. 13, the server may be the live broadcast server 1304 shown in fig. 13; if the method is applied to the VR system shown in fig. 17, the server may be the server 1702 shown in fig. 17. The following description takes the multi-party conference system shown in fig. 3 as an example of the communication system to which this embodiment is applied.
For a better understanding, the following is illustrated with reference to fig. 21. In the example shown in fig. 21, the objects that have joined the conference group 2100 include a first object 2101, an interactive object 2102, an interactive object 2103, and an interactive object 2104. During the conference, the first object 2101 hears the output information (taking audio information as an example) of the interactive object 2102, the interactive object 2103, and the interactive object 2104 through the first terminal 2105 used by the first object 2101. The interactive object 2102, the interactive object 2103, and the interactive object 2104 send their respective audio information to a server 2109 through the second terminal 2106 used by the interactive object 2102, the second terminal 2107 used by the interactive object 2103, and the second terminal 2108 used by the interactive object 2104. The server 2109 processes the audio information sent by the second terminal 2106, the second terminal 2107, and the second terminal 2108 respectively, and sends the processed audio information to the first terminal 2105. The first terminal 2105 can then output the processed audio information sent by the server 2109, which includes the audio information of the interactive object 2102, the interactive object 2103, and the interactive object 2104, so that the first object 2101 can hear the other audio information in the conference group 2100. Similarly, taking the second terminal 2106 as an example, the second terminal 2106 may also obtain the audio information of the first terminal 2105, the second terminal 2107, and the second terminal 2108 through the server; for the specific obtaining process, refer to the description for the first terminal 2105, which is not repeated herein.
In this embodiment, the first object 2101 sends three correspondences to the server 2109 through the first terminal 2105 used by the first object, where each correspondence includes a correspondence between an identity identifier (ID) of an interactive object and an adjusted target output mode. As can be seen from the above description, the first object 2101 may adjust, through the first terminal 2105, the output modes of all the interactive objects in the conference group 2100. After the adjustment is completed, the first terminal 2105 generates three correspondences: a first correspondence that includes the ID of the interactive object 2102 and the first output mode, a second correspondence that includes the ID of the interactive object 2103 and the second output mode, and a third correspondence that includes the ID of the interactive object 2104 and the third output mode. The three correspondences sent by the first terminal 2105 to the server 2109 are illustrated in table 1:
TABLE 1
Identity number of interactive object | Output mode
ID of interactive object 2102 | First output mode
ID of interactive object 2103 | Second output mode
ID of interactive object 2104 | Third output mode
For example, with reference to the above description, through the adjustment process of the target output mode described above, the numerical parameter indicated by the first output mode, the numerical parameter indicated by the second output mode, and the numerical parameter indicated by the third output mode sent by the first terminal 2105 to the server 2109 decrease successively; for example, the numerical parameter indicated by the first output mode is 50 decibels, the numerical parameter indicated by the second output mode is 40 decibels, and the numerical parameter indicated by the third output mode is 35 decibels. After the first terminal 2105 sends the three correspondences to the server 2109, the server 2109 learns from the correspondences that the audio information corresponding to the identity number of the interactive object 2102 is to be output at 50 decibels, the audio information corresponding to the identity number of the interactive object 2103 at 40 decibels, and the audio information corresponding to the identity number of the interactive object 2104 at 35 decibels.
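A minimal sketch of such a correspondence message is shown below; it is not part of the patent, and the object IDs, field names, and JSON layout are assumptions rather than the actual wire format:

```python
import json

# Hypothetical IDs for the three interactive objects in conference group 2100.
correspondences = [
    {"interactive_object_id": "obj-2102", "output_mode_db": 50},
    {"interactive_object_id": "obj-2103", "output_mode_db": 40},
    {"interactive_object_id": "obj-2104", "output_mode_db": 35},
]

message = json.dumps({"sender_id": "first-terminal-2105",
                      "correspondences": correspondences})
# The first terminal would send `message` to the server, which keeps one
# target output mode per interactive-object ID for this listener.
print(message)
```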
Step 2002, the server receives the correspondence.
Step 2003, the server receives audio information from the second terminal.
In this embodiment, in order to enable the first terminal 2105 to output the audio information of all the second terminals in the conference group 2100, each second terminal needs to send its audio information to the server. Taking fig. 21 as an example, the second terminal 2106 needs to report the audio information of the interactive object 2102 to the server 2109, the second terminal 2107 needs to report the audio information of the interactive object 2103, and the second terminal 2108 needs to report the audio information of the interactive object 2104; the audio information is then processed by the server 2109.
Specifically, taking the second terminal 2106 as an example, the second terminal 2106 may collect, by using a microphone, the audio signal of the speech of the interactive object 2102 in the conference group 2100, and after collecting the audio signal, encode it to generate encoded audio information. The specific encoding manner is not limited in this embodiment, as long as the second terminal 2106 can convert the analog, continuous audio signal into digital audio information; for example, encoding manners such as internet low bitrate codec (iLBC), G.711, G.723, G.726, G.729, Qualcomm code excited linear prediction (QCELP), or enhanced variable rate coding (EVRC) may be used. In this embodiment, the second terminal 2106 may implement the encoding of the audio signal through an independent encoder, or through a separate computer program, which is not limited in this embodiment.
Similarly, as shown in fig. 21, the second terminal 2107 and the second terminal 2108 may each send their encoded audio information to the server 2109, so that the server 2109 can process the encoded audio information sent by the second terminal 2106, the second terminal 2107, and the second terminal 2108.
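The sketch below illustrates, outside the patent text, how a second terminal might tag each encoded audio frame with its own ID before reporting it; the frame structure, the placeholder `encode` step, and the example ID are assumptions:

```python
from dataclasses import dataclass
import struct
import time

@dataclass
class AudioFrame:
    sender_id: str      # ID of the interactive object / second terminal
    timestamp: float    # capture time, used later to align frames for mixing
    payload: bytes      # encoded audio (iLBC, G.711, ... in the patent's examples)

def encode(pcm_samples):
    # Placeholder for the codec step; here 16-bit samples are simply packed
    # so the sketch stays self-contained and runnable.
    return struct.pack("<%dh" % len(pcm_samples), *pcm_samples)

def report_frame(pcm_samples, sender_id="obj-2102"):
    # Wrap the encoded audio together with the sender's ID so the server
    # (or the first terminal) can match it to a correspondence later.
    return AudioFrame(sender_id, time.time(), encode(pcm_samples))

frame = report_frame([0, 120, -340, 512])
print(frame.sender_id, len(frame.payload), "bytes")
```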
Step 2004, the server sends the first processed output information to the first terminal.
Specifically, after receiving the encoded audio information sent by all the second terminals, the server processes the encoded audio information according to the received correspondence to generate first processed output information, and sends the generated first processed output information to the first terminal.
The following describes how the server processes the encoded audio information sent by each second terminal according to the correspondence relationship:
Optionally, the server shown in this embodiment may configure a processing module for each terminal in the conference group, where the processing module is configured to perform a decoding function, an adjusting function, a mixing function, and an encoding function; the specific implementation process is described below. Optionally, the processing module may be composed of physical hardware, that is, the processing module includes a decoder for performing the decoding function, a processor for performing the adjusting function, a mixer for performing the mixing function, and an encoder for performing the encoding function. Optionally, the processing module may instead be an independent computer program configured separately for each terminal in the conference group, which performs these functions when it runs. Optionally, the processing module may also combine physical hardware and a computer program; for example, the processing module includes a computer program for performing the encoding function, the adjusting function, and the decoding function, together with a physical mixer for performing the mixing function. The specific form of the processing module is not limited in this embodiment. In order to save cost and not limit the number of terminals participating in the conference group, the following exemplarily takes the processing module as a computer program configured for a terminal:
As shown in fig. 21, for example, the server may configure an independent first processing module 2110 for the first terminal 2105, an independent second processing module 2111 for the second terminal 2106, an independent third processing module 2112 for the second terminal 2107, and an independent fourth processing module 2113 for the second terminal 2108. The following exemplarily describes the specific functions of the first processing module 2110 corresponding to the first terminal 2105; for specific descriptions of the other processing modules, refer to the description of the first processing module 2110, which is not described in detail:
First, for the first terminal 2105 to receive the audio information of all the second terminals in the conference group, the first processing module 2110 needs to acquire the encoded audio information sent by the second terminal 2106, the encoded audio information sent by the second terminal 2107, and the encoded audio information sent by the second terminal 2108, and decode each piece of received encoded audio information. The specific decoding manner is not limited in this embodiment, as long as the first processing module 2110 can restore the digitized encoded audio information to an analog audio signal. Continuing with the example shown in fig. 21, after performing the decoding function, the first processing module 2110 can obtain a first audio signal corresponding to the second terminal 2106, a second audio signal corresponding to the second terminal 2107, and a third audio signal corresponding to the second terminal 2108.
Next, the first processing module 2110 processes the first audio signal based on the first correspondence. Specifically, the encoded audio information sent by each second terminal includes the ID of the second terminal sending the audio information, and, as can be seen from the above description, each correspondence sent by the first terminal 2105 and received by the server also includes the ID of a second terminal, so the first processing module 2110 can match each output mode with the corresponding audio signal according to the ID of the second terminal. As shown in fig. 21, the first processing module 2110 determines, based on the ID of the second terminal 2106, that the first audio signal and the first output mode correspond to each other, and adjusts the first audio signal according to the first output mode to generate first adjusted information, where the first audio signal included in the first adjusted information is output in the first output mode; in this example, if the numerical parameter indicated by the first output mode is 50 decibels, the first processing module 2110 adjusts the first audio signal so that its decibel value is 50 decibels. Similarly, the first processing module 2110 adjusts the second audio signal according to the second output mode to generate second adjusted information, and adjusts the third audio signal according to the third output mode to generate third adjusted information.
Third, the first processing module 2110 performs mixing processing on the multiple channels of adjusted information to generate one channel of target audio signal. Specifically, because the output information shown in this embodiment is audio information, the mixing processing is sound mixing processing; more specifically, the sound mixing processing means that the first processing module 2110 linearly superimposes the multiple channels of adjusted information to generate one channel of target audio signal, where the target audio signal includes the first adjusted information, the second adjusted information, and the third adjusted information. More specifically, the first processing module 2110 mixes the multiple channels of adjusted audio signals that share the same timestamp to generate the target audio signal, so that the audio signals included in the mixed target audio signal were collected by the second terminals in the conference group at the same time.
The first processing module 2110 then encodes the target audio signal to generate the first processed output information; the encoding of an audio signal is described in detail above and is not repeated.
Finally, the first processing module 2110 transmits the output information after the first processing to the first terminal 2105. The above description of the process of generating the output information after the first processing by the server shown in fig. 21 is an optional example and is not limited.
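The decode, adjust, mix, and encode flow just described can be pictured with the following sketch; it is not the patent's implementation, and the dB-to-gain conversion (relative to an assumed 60 dB reference), the IDs, and the sample values are illustrative assumptions:

```python
# Minimal sketch of the per-listener processing module in fig. 21, assuming
# decoded audio arrives as lists of float samples grouped by timestamp.
REFERENCE_DB = 60.0

def gain_from_db(target_db):
    # Convert the target playback level into a linear gain factor.
    return 10 ** ((target_db - REFERENCE_DB) / 20.0)

def adjust(samples, target_db):
    # "Adjusting function": scale one decoded channel to its target level.
    g = gain_from_db(target_db)
    return [s * g for s in samples]

def mix(channels):
    # "Mixing function": linear superposition of channels that share the
    # same timestamp, producing one channel of target audio signal.
    return [sum(frame) for frame in zip(*channels)]

correspondences = {"obj-2102": 50, "obj-2103": 40, "obj-2104": 35}
decoded = {                      # decoded frames sharing the same timestamp
    "obj-2102": [0.10, 0.20, -0.05],
    "obj-2103": [0.02, -0.07, 0.04],
    "obj-2104": [0.00, 0.01, 0.03],
}

adjusted = [adjust(decoded[obj_id], db) for obj_id, db in correspondences.items()]
target_signal = mix(adjusted)    # would then be encoded and sent to the first terminal
print(target_signal)
```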
Another process by which the server generates the first processed output information is illustrated below with reference to fig. 22. Fig. 22 shows a conference group 2200; for detailed descriptions of each terminal in the conference group, refer to fig. 21, which is not repeated herein.
The following first explains the differences between the server 2200 shown in fig. 22 and the server 2109 shown in fig. 21:
as can be seen from the above description, each terminal in the conference group shown in fig. 21 is configured with a processing module, and the decoding, adjusting, mixing and encoding of different processing modules included in the server 2109 are independent, while in the server shown in fig. 22, the implementation of the decoding function, the mixing function and the encoding function is unified, specifically, only one processing module is provided in the server 2200, and the processing module specifically includes a decoding module 2205 for implementing the decoding function, a mixing module 2206 for implementing the mixing function and an encoding module 2207 for implementing the encoding function, where the decoding module 2205, the mixing module 2206 and the encoding module 2207 shown in this embodiment may be physical hardware or may be implemented by independent computer programs, and this embodiment is exemplarily described by taking as an example that the decoding module 2205 is a decoder in physical hardware, the mixing module 2206 is a mixer in physical hardware, and the encoding module 2207 is an encoder in physical hardware; the server 2200 is configured with different adjusting modules for different terminals in the conference group 2200, for example, if a first adjusting module 2201 is configured for the first terminal 2205, a second adjusting module 2202 is configured for the second terminal 2206, a third adjusting module 2203 is configured for the second terminal 2207, and a fourth adjusting module 2204 is configured for the second terminal 2208, the first adjusting module 2201 is configured to adjust the audio signal output by the decoding module 2205, wherein each adjusting module may be a processor for implementing an adjusting function, or may be an independent computer program for implementing an adjusting function, and this embodiment takes each adjusting module as an independent computer program for example to exemplify the embodiment. In this example, different terminals in the conference group should have different adjustment modules for exemplary illustration, in other examples, only one adjustment module may be provided in the processing module, and the specific number of the adjustment modules is not limited in this embodiment.
The following describes how the server 2200 generates the first processed output information that needs to be sent to the first terminal 2205; for the generation process of the processed information that needs to be sent to the other second terminals, refer to the process of generating the first processed output information sent to the first terminal 2205 shown in this embodiment, which is not specifically described again:
First, to share audio information among the terminals in the conference group 2200, the decoding module 2205 respectively decodes the encoded audio information sent by the first terminal 2205, the second terminal 2206, the second terminal 2207, and the second terminal 2208, obtaining multiple channels of audio signals sent by the different terminals, that is, a local audio signal corresponding to the ID of the first terminal 2205, a first audio signal corresponding to the ID of the second terminal 2206, a second audio signal corresponding to the ID of the second terminal 2207, and a third audio signal corresponding to the ID of the second terminal 2208.
Second, after the first adjusting module 2201 of the server 2200 receives the correspondences sent by the first terminal 2205, the first adjusting module 2201 may obtain the audio signals that need to be adjusted from the decoding module 2205 based on the correspondences. This example continues with the first terminal 2205 sending the first correspondence, the second correspondence, and the third correspondence to the first adjusting module 2201; for specific descriptions of the first correspondence, the second correspondence, and the third correspondence, see the foregoing description, which is not repeated.
Specifically, the first adjusting module 2201 may obtain the first audio signal from the decoding module 2205 based on the ID of the second terminal 2206 included in the first correspondence, and the first adjusting module 2201 is further configured to adjust the first audio signal according to the first output mode to generate the first adjusted information; for a specific description of generating the first adjusted information, see the example corresponding to fig. 21, which is not repeated here. In this embodiment, if the first adjusting module 2201 receives the first correspondence, the second correspondence, and the third correspondence, the first adjusting module 2201 needs to adjust the first audio signal, the second audio signal, and the third audio signal respectively, and send the first adjusted information, the second adjusted information, and the third adjusted information to the mixing module 2206.
Next, the mixing module 2206 performs mixing processing on all the adjusted information sent by the first adjusting module 2201 to generate a target audio signal; for a description of the specific mixing process, refer to the example corresponding to fig. 21, which is not described in detail herein.
Next, the mixing module 2206 sends the mixed target audio signal to the encoding module 2207; the encoding module 2207 encodes the target audio signal to generate the first processed output information and sends it to the first terminal 2205.
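The key difference from fig. 21 is that each uplink is decoded once and only the adjusting step is duplicated per recipient. The sketch below illustrates that arrangement; it is not part of the patent, and the IDs, gains, and the `decode` placeholder are assumptions:

```python
# Sketch of the fig. 22 arrangement: shared decoder, per-recipient adjuster.
def decode(payload):
    return payload            # placeholder: payload is already a list of samples

uplinks = {                   # one decoded channel per terminal in the group
    "first-terminal": decode([0.05, 0.00, 0.02]),
    "obj-2102": decode([0.10, 0.20, -0.05]),
    "obj-2103": decode([0.02, -0.07, 0.04]),
}

def adjust_for_recipient(uplinks, recipient_id, gains):
    # Per-recipient adjusting module: take every channel except the
    # recipient's own, apply that recipient's target gains, and hand the
    # result to the shared mixing/encoding modules.
    out = []
    for sender_id, samples in uplinks.items():
        if sender_id == recipient_id:
            continue
        gain = gains.get(sender_id, 1.0)
        out.append([s * gain for s in samples])
    return out

adjusted = adjust_for_recipient(uplinks, "first-terminal",
                                {"obj-2102": 0.8, "obj-2103": 0.5})
print([sum(frame) for frame in zip(*adjusted)])   # shared mixer output
```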
Step 2005, the first terminal receives the output information after the first processing.
In this embodiment, the first processed output information is audio information; in order for the first terminal to output the first processed output information, the first terminal may first decode it. For a specific description of the decoding, see the foregoing description, which is not repeated.
Step 2006: the first terminal outputs the output information after the first processing.
In this embodiment, after decoding the first processed output information, the first terminal may output it through the speaker of the first terminal, so that the first object receives the first processed output information through the first terminal. As shown above, on the server side each audio signal included in the first processed output information has already been adjusted according to its corresponding target output mode, so when the first terminal directly outputs the first processed output information, each audio signal is output in its corresponding target output mode. For example, if on the server side the output decibel of the audio signal of the interactive object 2102 is adjusted to 50 decibels, that of the interactive object 2103 to 40 decibels, and that of the interactive object 2104 to 35 decibels, then when the first terminal directly outputs the first processed output information through the speaker, the audio signal of the interactive object 2102 is output at 50 decibels, the audio signal of the interactive object 2103 at 40 decibels, and the audio signal of the interactive object 2104 at 35 decibels.
By adopting the method shown in this embodiment, the first terminal may send the correspondences between the interactive objects and the adjusted target output modes to the server, and the server processes the output information of the interactive objects based on the correspondences to generate the first processed output information, in which the output mode of the output information of each interactive object is the target output mode adjusted by the first object. After receiving the first processed output information, the first terminal may output it directly, and in that output process the output information of each interactive object is output in its target output mode, thereby achieving independent adjustment of the target output mode of each interactive object.
Fig. 20 above describes an example in which the output information is processed on the server side to generate the first processed output information; with reference to fig. 23, the following describes an example in which the output information is processed on the first terminal side to generate second processed output information:
step 2301, the first terminal obtains the corresponding relationship.
When the first terminal has determined the target output mode, the first terminal may create a correspondence, where the correspondence includes a correspondence between the identifier of the interactive object and the target output mode; for a specific description of the correspondence, see the embodiment corresponding to fig. 20, which is not specifically described in this embodiment.
Taking the application of the method shown in this embodiment to a multi-party conferencing system as an example, in order for the first terminal used by the first object to output the audio information of all the interactive objects in the conference group, the first terminal needs to perform a decoding operation, an adjusting operation, a mixing operation, and the like on the audio information of all the interactive objects.
step 2302, the first terminal receives audio information of the interactive object from the server.
In this embodiment, as shown in fig. 24, when the first object A, the interactive object B, the interactive object C, the interactive object D, and the interactive object E are located in the same conference group, the interactive object B, the interactive object C, the interactive object D, and the interactive object E need to report their audio information to the server through the second terminals they respectively use.
Specifically, the second terminal of each interactive object first receives the audio signal of the interactive object, encodes the audio signal to generate audio information, and sends the audio information to the server, and please refer to the embodiment shown in fig. 20 for a detailed description of a specific process of how the second terminal encodes the audio signal to generate the audio information, which is not described in detail in this embodiment.
For a better understanding, as shown in fig. 24, the server 2410 receives the audio information sent by the interactive object B through the second terminal 2406, the audio information sent by the interactive object C through the second terminal 2407, the audio information sent by the interactive object D through the second terminal 2408, and the audio information sent by the interactive object E through the second terminal 2409. In this embodiment, the server 2410 does not process each piece of audio information, but only forwards it. For example, if the server 2410 determines that the first object A, the interactive object B, the interactive object C, the interactive object D, and the interactive object E belong to the same conference group, it may determine that the audio information sent by the interactive object B, the interactive object C, the interactive object D, and the interactive object E needs to be sent to the first terminal 2405 of the first object A. As can be seen from the above example, the audio information sent by each interactive object to the server includes the ID of that interactive object, so the server 2410 may send, to the first terminal 2405, the audio information corresponding to the IDs of the interactive object B, the interactive object C, the interactive object D, and the interactive object E.
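A forward-only relay of this kind can be pictured with the sketch below; it is not the patent's implementation, and the group contents, IDs, and the `send_to` stub are assumptions:

```python
# The server neither decodes nor mixes; it only fans each ID-tagged frame
# out to the other members of the same conference group.
conference_groups = {
    "group-2400": ["A", "B", "C", "D", "E"],
}

def send_to(member_id, frame):
    print(f"forward frame from {frame['sender_id']} to terminal of {member_id}")

def relay(frame, group_id="group-2400"):
    for member_id in conference_groups[group_id]:
        if member_id != frame["sender_id"]:
            send_to(member_id, frame)    # the frame still carries the sender's ID

relay({"sender_id": "B", "payload": b"\x00\x01"})
```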
Step 2303, the first terminal processes the audio information of the interactive object according to the corresponding relationship to obtain second processed output information.
In this embodiment, for the specific process in which the first terminal used by the first object processes the audio information of the interactive objects according to the correspondences to generate the second processed output information, refer to the process shown in fig. 20 in which the server processes the audio of the interactive objects according to the correspondences to generate the first processed output information, which is not specifically described in this embodiment.
Step 2304, the first terminal outputs the second processed output information.
In this embodiment, the output mode of the second processed output information is the adjusted target output mode, and after the first terminal used by the first object acquires the second processed output information, the first terminal can directly output the second processed output information.
By adopting the method shown in this embodiment, the interactive interface of the first terminal used by the first object displays the identifiers of multiple interactive objects, the first terminal maintains the correspondence between the identifier of each interactive object and its target output mode, and the first terminal processes the output information of each interactive object according to the target output mode, so that the output information of the interactive object is output on the first terminal in the target output mode.
Referring to fig. 25, the following exemplarily describes the specific structure of a first terminal from the perspective of functional modules. The first terminal 2500 shown in this embodiment may be used to execute the interaction control method shown in fig. 2; for specific descriptions of the execution process and beneficial effects, see fig. 2, which is not specifically described in this embodiment.
the first terminal 2500 includes:
a display unit 2501, configured to display an interactive interface, where the interactive interface includes identifiers of a plurality of objects, the plurality of objects include a first object and an interactive object, and the interactive object is any one of other objects except the first object in the plurality of objects;
a detecting unit 2502, configured to detect an operation event, where the operation event is to adjust a relative position relationship between an identifier of the first object and an identifier of the interactive object on the interactive interface;
an adjusting unit 2503, configured to adjust a target output mode of output information of the interactive object according to the operation event;
an output unit 2504, configured to output information of the interactive object according to the adjusted target output mode.
Optionally, if the first terminal 2500 shown in this embodiment is configured to execute the interaction control method shown in fig. 6, the relative position relationship between the identifier of the first object and the identifier of the interaction object on the interaction interface is a relative distance between the identifier of the first object and the identifier of the interaction object on the interaction interface, and the adjusting unit 2503 is specifically configured to adjust a target output mode of the output information of the interaction object according to a change of the relative distance, where the target output mode is in an inverse-correlation or positive-correlation relationship with the relative distance. For a detailed description, please refer to the embodiment shown in fig. 6, which is not described in detail in this embodiment.
Optionally, if the first terminal 2500 shown in this embodiment is used to execute the interaction control method shown in fig. 10, and the interactive interface includes M interactive objects, the adjusting unit 2503 is further configured to: sort the identifiers of the M interactive objects in ascending order of the relative distance between the identifier of each of the M interactive objects and the identifier of the first object; and determine that the target output mode corresponding to the identifiers of the interactive objects ranked after the first L positions among the M interactive objects is the mute mode, where both M and L are positive integers greater than 1, and M is greater than L. For a detailed description, please refer to the embodiment shown in fig. 10, which is not described in detail in this embodiment.
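This ranking rule can be sketched as follows; the sketch is not part of the patent, and the object IDs, distances, and the value of L (here 3) are illustrative assumptions:

```python
def mute_far_objects(distances, keep_audible=3):
    # distances: mapping from interactive-object ID to the relative distance
    # between its identifier and the identifier of the first object on the
    # interactive interface (units are arbitrary here).
    ranked = sorted(distances, key=distances.get)        # ascending distance
    audible = set(ranked[:keep_audible])                  # the first L identifiers
    return {obj_id: ("audible" if obj_id in audible else "mute")
            for obj_id in ranked}

print(mute_far_objects({"B": 40, "C": 120, "D": 15, "E": 300, "F": 90},
                       keep_audible=3))
# D, B, F stay audible; C and E are set to the mute mode.
```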
Optionally, if the first terminal 2500 shown in this embodiment is configured to execute the interaction control method shown in fig. 11, the output information of the interactive object is audio information, and the target output mode is the decibel size of the audio information of the interactive object, the adjusting unit 2503 is further configured to: determine that the target output mode is the mute mode if the identifier of at least one other object is located between the identifier of the first object and the identifier of the interactive object; or determine that the target output mode is the sound output mode if no identifier of another object exists between the identifier of the first object and the identifier of the interactive object. For a detailed description, please refer to the embodiment shown in fig. 11, which is not described in detail in this embodiment.
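One way to test whether another identifier lies "between" the two identifiers is to check whether it sits close to the line segment joining them; the sketch below is not the patent's method, and the geometry and tolerance are assumptions:

```python
import math

def blocks_line(p, a, b, tolerance=10.0):
    # True if identifier p lies roughly on the segment from a (identifier of
    # the first object) to b (identifier of the interactive object).
    ax, ay = a
    bx, by = b
    px, py = p
    seg_len = math.hypot(bx - ax, by - ay)
    if seg_len == 0:
        return False
    t = ((px - ax) * (bx - ax) + (py - ay) * (by - ay)) / seg_len ** 2
    if not 0.0 < t < 1.0:                       # projection falls outside the segment
        return False
    cx, cy = ax + t * (bx - ax), ay + t * (by - ay)
    return math.hypot(px - cx, py - cy) <= tolerance

def output_mode(first_pos, target_pos, other_identifiers):
    blocked = any(blocks_line(p, first_pos, target_pos) for p in other_identifiers)
    return "mute" if blocked else "sound output"

print(output_mode((0, 0), (100, 0), [(50, 4), (30, 60)]))   # "mute"
```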
In this embodiment, the first terminal 2500 may also be configured to execute the interaction control method shown in fig. 14; for the specific execution process and the description of the beneficial effects, please refer to the embodiment shown in fig. 14, which is not described in detail in this embodiment.
Optionally, if the first terminal 2500 shown in this embodiment is configured to execute the interaction control method shown in fig. 18, the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a target included angle, the target included angle is the included angle between the identifier of the interactive object and the optical axis of the target field angle, the target field angle is the field angle of the virtual reality device, the optical axis of the target field angle is the symmetry axis of the target field angle, and the identifier of the first object is the vertex of the target field angle. The adjusting unit 2503 is specifically configured to adjust the target output mode of the output information of the interactive object according to the change of the target included angle, where the target output mode is in an inverse correlation or a positive correlation with the target included angle. For a detailed description, please refer to the embodiment shown in fig. 18, which is not described in detail in this embodiment.
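A sketch of the angle-based variant is shown below; it is not part of the patent, and the half field angle, the reference level, and the anti-correlation mapping are illustrative assumptions:

```python
import math

def included_angle(identifier_dir, optical_axis):
    # Angle (degrees) between the direction from the first object (the vertex
    # of the target field angle) toward the identifier and the optical axis
    # of the target field angle.
    dot = sum(i * o for i, o in zip(identifier_dir, optical_axis))
    ni = math.sqrt(sum(i * i for i in identifier_dir))
    no = math.sqrt(sum(o * o for o in optical_axis))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (ni * no)))))

def volume_for_angle(angle_deg, half_fov=55.0, max_db=50.0):
    # Anti-correlation variant: identifiers near the optical axis play
    # loudest, identifiers outside the field angle are silent.
    if angle_deg >= half_fov:
        return 0.0
    return max_db * (1.0 - angle_deg / half_fov)

axis = (0.0, 0.0, 1.0)
print(volume_for_angle(included_angle((0.3, 0.0, 1.0), axis)))   # roughly 35 dB
```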
Optionally, if the first terminal 2500 shown in this embodiment is used to execute the interaction control method shown in fig. 20, the output unit 2504 is specifically configured to: sending the corresponding relation between the interactive object and the adjusted target output mode to a server; receiving the processed output information from the server, wherein the processed output information is the output information obtained by processing the output information of the interactive object by the server according to the adjusted target output mode; and outputting the processed output information. For a detailed description, please refer to the embodiment shown in fig. 20, which is not described in detail in this embodiment.
Referring to fig. 26, a specific structure, a specific implementation process, and a description of beneficial effects of a first terminal for implementing the interaction control method shown in fig. 23 are exemplarily described below, and please refer to fig. 23 for details, which are not described in detail in this embodiment.
The first terminal 2600 includes:
a display unit 2601, configured to display an interactive interface, where the interactive interface includes identifiers of a plurality of objects, the plurality of objects includes a first object and an interactive object, and the interactive object is any one of other objects except the first object in the plurality of objects;
a detecting unit 2602, configured to detect an operation event, where the operation event is to adjust a relative position relationship between an identifier of the first object and an identifier of the interactive object on the interactive interface;
an adjusting unit 2603, configured to adjust a target output mode of the output information of the interactive object according to the operation event;
a receiving unit 2604, configured to receive output information of the interactive object from the server;
an output unit 2605, configured to output the output information of the interactive object according to the adjusted target output mode.
The output unit 2605 is specifically configured to: processing the output information of the interactive object according to the adjusted target output mode to obtain processed output information, wherein the output mode of the processed output information is the adjusted target output mode; and outputting the processed output information.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, which is substantially or partly contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (19)

1. An interaction control method, applied to a first terminal, includes:
displaying an interactive interface, wherein the interactive interface comprises identifiers of a plurality of objects, the plurality of objects comprise a first object and an interactive object, the first object is an object using a first terminal, and the interactive object is any one of other objects except the first object in the plurality of objects;
detecting an operation event, wherein the operation event is to adjust the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface;
adjusting a target output mode of the output information of the interactive object according to the operation event;
outputting the output information of the interactive object according to the adjusted target output mode;
if the first terminal is a virtual reality device, the relative position relation between the identification of the first object and the identification of the interactive object on the interactive interface is a target included angle, the target included angle is an included angle between the identification of the interactive object and an optical axis of a target view field angle, the target view field angle is a view field angle of the virtual reality device, the optical axis of the target view field angle is a symmetry axis of the target view field angle, and the identification of the first object is a vertex of the target view field angle.
2. The method of claim 1, wherein the adjusting the target output mode of the output information of the interactive object according to the operation event comprises:
and adjusting a target output mode of the output information of the interactive object according to the change of the target included angle, wherein the target output mode and the target included angle are in an anti-correlation or positive correlation relationship.
3. The method according to claim 1 or 2, wherein the outputting the output information of the interactive object according to the adjusted target output mode comprises:
sending the corresponding relation between the interactive object and the adjusted target output mode to a server;
receiving processed output information from the server, wherein the processed output information is output information obtained by processing the output information of the interactive object by the server according to the adjusted target output mode;
and outputting the processed output information.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
receiving output information of the interactive object from a server;
the outputting the output information of the interactive object according to the adjusted target output mode comprises:
processing the output information of the interactive object according to the adjusted target output mode to obtain processed output information, wherein the output mode of the processed output information is the adjusted target output mode;
and outputting the processed output information.
5. The method of claim 1, wherein the target output mode of the output information is a strength of the sensory information.
6. An interaction control method, applied to a first terminal, includes:
displaying an interactive interface, wherein the interactive interface comprises identifiers of a plurality of objects, the plurality of objects comprise a first object and an interactive object, the first object is an object using a first terminal, and the interactive object is any one of other objects except the first object in the plurality of objects;
detecting an operation event, wherein the operation event is to adjust the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface;
adjusting a target output mode of the output information of the interactive object according to the operation event;
outputting the output information of the interactive object according to the adjusted target output mode;
the output information of the interactive object is audio information, the target output mode is decibel size of the audio information of the interactive object, and the adjusting the target output mode of the output information of the interactive object according to the operation event comprises:
if the mark of at least one object is positioned between the mark of the first object and the mark of the interactive object, determining that the target output mode is a mute mode;
or,
and if no identifier of other objects exists between the first object and the interactive object, determining that the target output mode is the sound output mode.
7. The method of claim 6, wherein the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface.
8. The method of claim 7, wherein the adjusting the target output mode of the output information of the interactive object according to the operation event comprises:
and adjusting a target output mode of the output information of the interactive object according to the change of the relative distance, wherein the target output mode is in an inversely or positively correlated relation with the relative distance.
9. The method according to claim 7 or 8, wherein the interactive interface includes M interactive objects, and the adjusting the target output mode of the output information of the interactive objects according to the operation event includes:
according to the relative distance between the identifier of each of the M interactive objects and the identifier of the first object, sorting the identifiers of the M interactive objects in ascending order of the relative distance;
determining that a target output mode corresponding to the identifier of the interactive object sequenced after the first L bits in the M interactive objects is a mute mode, wherein both M and L are positive integers greater than 1, and M is greater than L.
10. A terminal, wherein the terminal is a first terminal used by a first object, comprising:
the display unit is used for displaying an interactive interface, the interactive interface comprises a plurality of identifiers of objects, the objects comprise the first object and an interactive object, and the interactive object is any one of other objects except the first object in the objects;
the detection unit is used for detecting an operation event, wherein the operation event is the adjustment of the relative position relation between the identifier of the first object and the identifier of the interactive object on the interactive interface;
the adjusting unit is used for adjusting a target output mode of the output information of the interactive object according to the operation event;
the output unit is used for outputting the output information of the interactive object according to the adjusted target output mode;
if the terminal is a virtual reality device, the relative position relation between the identification of the first object and the identification of the interactive object on the interactive interface is a target included angle, the target included angle is an included angle between the identification of the interactive object and an optical axis of a target view field angle, the target view field angle is a view field angle of the virtual reality device, the optical axis of the target view field angle is a symmetry axis of the target view field angle, and the identification of the first object is a vertex of the target view field angle.
11. The terminal according to claim 10, wherein the adjusting unit is specifically configured to adjust a target output mode of the output information of the interactive object according to the change of the target included angle, and the target output mode and the target included angle have an inverse correlation or a positive correlation.
12. The terminal according to claim 10 or 11, wherein the output unit is specifically configured to: sending the corresponding relation between the interactive object and the adjusted target output mode to a server;
receiving processed output information from the server, wherein the processed output information is output information obtained by processing the output information of the interactive object by the server according to the adjusted target output mode;
and outputting the processed output information.
13. The terminal according to claim 10 or 11, characterized in that the terminal further comprises:
a receiving unit, configured to receive output information of the interactive object from a server;
the output unit is specifically configured to:
processing the output information of the interactive object according to the adjusted target output mode to obtain processed output information, wherein the output mode of the processed output information is the adjusted target output mode;
and outputting the processed output information.
14. A terminal, wherein the terminal is a first terminal used by a first object, comprising:
a display unit, configured to display an interactive interface, where the interactive interface includes identifiers of multiple objects, where the multiple objects include the first object and an interactive object, and the interactive object is any one of other objects except the first object in the multiple objects;
the detection unit is used for detecting an operation event, wherein the operation event is the adjustment of the relative position relation between the identifier of the first object and the identifier of the interactive object on the interactive interface;
the adjusting unit is used for adjusting a target output mode of the output information of the interactive object according to the operation event;
the output unit is used for outputting the output information of the interactive object according to the adjusted target output mode;
the output information of the interactive object is audio information, the target output mode is a decibel size of the audio information of the interactive object, and the adjusting unit is further configured to:
if the identifier of at least one object is positioned between the identifier of the first object and the identifier of the interactive object, determining that the target output mode is a mute mode;
or,
and if no identifier of other objects exists between the first object and the interactive object, determining that the target output mode is the sound output mode.
15. The terminal according to claim 14, wherein the relative position relationship between the identifier of the first object and the identifier of the interactive object on the interactive interface is a relative distance between the identifier of the first object and the identifier of the interactive object on the interactive interface.
16. The terminal according to claim 15, wherein the adjusting unit is specifically configured to adjust a target output mode of the output information of the interactive object according to the change of the relative distance, and the target output mode is in an inverse or positive correlation with the relative distance.
17. The terminal according to claim 15 or 16, wherein the interactive interface comprises M interactive objects, and the adjusting unit is further configured to:
sorting the identifiers of the M interactive objects in ascending order of the relative distance between the identifier of each of the M interactive objects and the identifier of the first object;
and determining that a target output mode corresponding to the identifier of the interactive object sequenced after the first L bits in the M interactive objects is a mute mode, wherein M and L are positive integers greater than 1, and M is greater than L.
18. A terminal, characterized in that the terminal is a first terminal for use by a first object, the terminal comprising a processor and a memory, wherein,
the memory stores a computer program;
the processor executes the computer program to perform the method of any of the preceding claims 1 to 9.
19. A computer-readable storage medium, in which a computer program is stored, the computer program being adapted to perform the method of any of the preceding claims 1 to 9.
CN201910579764.3A 2019-06-28 2019-06-28 Interaction control method, terminal and storage medium Active CN112148182B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910579764.3A CN112148182B (en) 2019-06-28 2019-06-28 Interaction control method, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910579764.3A CN112148182B (en) 2019-06-28 2019-06-28 Interaction control method, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN112148182A CN112148182A (en) 2020-12-29
CN112148182B true CN112148182B (en) 2022-10-04

Family

ID=73891560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910579764.3A Active CN112148182B (en) 2019-06-28 2019-06-28 Interaction control method, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112148182B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240053879A1 (en) * 2020-04-24 2024-02-15 Huawei Technologies Co., Ltd. Object Drag Method and Device
CN114816294A (en) 2020-09-02 2022-07-29 华为技术有限公司 Display method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101557444A (en) * 2009-05-08 2009-10-14 宇龙计算机通信科技(深圳)有限公司 Co-operation control method, system and mobile communication terminal
CN103390410A (en) * 2012-05-10 2013-11-13 宏碁股份有限公司 System and method for long-distance telephone conference
CN104025538A (en) * 2011-11-03 2014-09-03 Glowbl公司 A communications interface and a communications method, a corresponding computer program, and a corresponding registration medium
CN106791238A (en) * 2016-11-28 2017-05-31 努比亚技术有限公司 The call control method and device of MPTY conference system
CN109582273A (en) * 2018-11-26 2019-04-05 联想(北京)有限公司 Audio-frequency inputting method, electronic equipment and audio output device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10712927B2 (en) * 2015-06-12 2020-07-14 Avaya Inc. System and method for call management in single window communication endpoints

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101557444A (en) * 2009-05-08 2009-10-14 宇龙计算机通信科技(深圳)有限公司 Co-operation control method, system and mobile communication terminal
CN104025538A (en) * 2011-11-03 2014-09-03 Glowbl公司 A communications interface and a communications method, a corresponding computer program, and a corresponding registration medium
CN103390410A (en) * 2012-05-10 2013-11-13 宏碁股份有限公司 System and method for long-distance telephone conference
CN106791238A (en) * 2016-11-28 2017-05-31 努比亚技术有限公司 The call control method and device of MPTY conference system
CN109582273A (en) * 2018-11-26 2019-04-05 联想(北京)有限公司 Audio-frequency inputting method, electronic equipment and audio output device

Also Published As

Publication number Publication date
CN112148182A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
US10363479B2 (en) Methods for controller pairing with assigned differentiating color/player
CN109874021B (en) Live broadcast interaction method, device and system
EP3180911B1 (en) Immersive video
US9253440B2 (en) Augmenting a video conference
US20170038829A1 (en) Social interaction for remote communication
US8463182B2 (en) Wireless device pairing and grouping methods
JP2024050721A (en) Information processing device, information processing method, and computer program
CN111527525A (en) Mixed reality service providing method and system
US20030037101A1 (en) Virtual reality systems and methods
US9167071B2 (en) Wireless device multimedia feed switching
EP2516027A1 (en) Wireless device pairing and grouping methods
CN111580661A (en) Interaction method and augmented reality device
Müller et al. PanoVC: Pervasive telepresence using mobile phones
CN112148182B (en) Interaction control method, terminal and storage medium
JP7228338B2 (en) System, method and program for distributing videos
WO2010120304A2 (en) Communicating visual representations in virtual collaboration systems
CN113244616B (en) Interaction method, device and equipment based on virtual scene and readable storage medium
Steptoe et al. Acting rehearsal in collaborative multimodal mixed reality environments
CN106686463A (en) Video role replacing method and apparatus
CN111405310B (en) Live broadcast interaction method and device, electronic equipment and storage medium
JP2023524930A (en) CONFERENCE PROCESSING METHOD AND SYSTEM USING AVATARS
US20230195403A1 (en) Information processing method and electronic device
Tamai et al. View control interface for 3d tele-immersive environments
CN109213307A (en) A kind of gesture identification method and device, system
Guo et al. Tangible video communication over the Internet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant