CN112256181B - Interaction processing method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN112256181B
CN112256181B
Authority
CN
China
Prior art keywords
interaction
target
mode
interface
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011157026.9A
Other languages
Chinese (zh)
Other versions
CN112256181A (en)
Inventor
汤晓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202011157026.9A
Publication of CN112256181A
Priority to PCT/CN2021/106497 (WO2022088765A1)
Application granted
Publication of CN112256181B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus

Abstract

The disclosure relates to an interaction processing method and apparatus, a computer device, and a storage medium, and belongs to the field of computer technologies. The method includes: acquiring at least one attribute information of an account, where the attribute information is used for representing an attribute of the user to whom the account belongs; determining, from a plurality of interaction modes, a target interaction mode matched with the at least one attribute information, where each interaction mode is used for indicating a response mode for the interaction operation corresponding to that interaction mode; and if an interaction operation is detected on a target interface, responding to the interaction operation based on the target interaction mode corresponding to the interaction operation. The at least one attribute information of different accounts reflects the interaction habits of different users, so a target interaction mode that conforms to the user's interaction habits can be determined from the plurality of interaction modes, and interaction operations on the target interface are responded to according to the target interaction mode. This meets the interaction requirements of users and improves the usability of human-computer interaction.

Description

Interaction processing method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interaction processing method and apparatus, a computer device, and a storage medium.
Background
Currently, a terminal displays an application interface of an application program and feeds back interaction operations on the application interface through interface changes, thereby realizing interface interaction. An application program usually has a preset interaction mode; for example, the recommendation interface of a short-video application plays a single short video, and the terminal responds to an upward sliding operation by playing the next short video in the recommendation interface.
In the above scheme, one application program provides a single interaction mode for all of its users. However, different users have different interaction habits and different degrees of acceptance of an interaction mode; a single interaction mode cannot meet the interaction requirements of different users, and the usability of human-computer interaction is therefore insufficient.
Disclosure of Invention
The embodiment of the disclosure provides an interaction processing method, an interaction processing device, computer equipment and a storage medium, so as to meet interaction requirements of users and improve usability of man-machine interaction. The technical scheme of the disclosure is as follows:
in one aspect, an interactive processing method is provided, where the method includes:
acquiring at least one attribute information of an account, wherein the attribute information is used for representing the attribute of a user to which the account belongs;
determining a target interaction mode matched with the at least one attribute information from a plurality of interaction modes, wherein the interaction mode is used for indicating a response mode of an interaction operation corresponding to the interaction mode;
and if the interactive operation is detected on the target interface, responding to the interactive operation based on a target interactive mode corresponding to the interactive operation.
In an optional implementation manner, the determining, from the multiple interaction manners, a target interaction manner matched with the at least one attribute information includes:
determining the matching degree of each interactive mode and the at least one attribute information;
and determining the interactive mode with the highest matching degree as the target interactive mode.
In another optional implementation manner, the determining a matching degree between each of the interaction manners and the at least one attribute information includes:
acquiring the weight of each attribute information in the at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining the target interaction mode;
determining an interaction mode corresponding to each attribute information;
and adding the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and the at least one attribute information.
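The weight-summation matching in the steps above can be sketched as follows. This is an illustrative sketch only: the attribute names, weight values, mode labels, and the attribute-to-mode mapping are assumptions for demonstration, not values specified in the disclosure.

```python
# Sketch of the matching-degree computation: sum the weights of all attribute
# information that corresponds to the same interaction mode, then pick the
# mode with the highest total as the target interaction mode.

def select_target_mode(attributes, weights, mode_for_attribute):
    """Return the interaction mode with the highest matching degree,
    or None if no attribute maps to any mode."""
    scores = {}
    for attr, value in attributes.items():
        mode = mode_for_attribute.get((attr, value))
        if mode is not None:
            # Add this attribute's weight to its corresponding mode's score.
            scores[mode] = scores.get(mode, 0.0) + weights.get(attr, 0.0)
    return max(scores, key=scores.get) if scores else None

# Hypothetical example data.
attributes = {"region": "Beijing", "age": "18-25", "os": "Android"}
weights = {"region": 0.2, "age": 0.5, "os": 0.3}
mode_for_attribute = {
    ("region", "Beijing"): "swipe_to_switch",
    ("age", "18-25"): "swipe_to_switch",
    ("os", "Android"): "tap_in_list",
}
print(select_target_mode(attributes, weights, mode_for_attribute))
```

With these assumed values, "swipe_to_switch" accumulates 0.2 + 0.5 = 0.7 against 0.3 for "tap_in_list", so it is selected as the target interaction mode.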
In another optional implementation manner, the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information;
the system comprises a target interface, a computer device and channel information, wherein the region information is used for representing a region where the computer device displaying the target interface is located, the channel information is used for representing a downloading channel of an application program to which the target interface belongs, and the device information is used for representing the category of an operating system of the computer device displaying the target interface.
In another optional implementation manner, before the obtaining of the at least one attribute information of the account, the interaction processing method further includes:
and responding to the account which is registered for the first time, and executing the step of acquiring at least one attribute information of the account.
In another optional implementation manner, before the obtaining of the at least one attribute information of the account, the interaction processing method further includes:
displaying a prompt interface, wherein the prompt interface comprises selection controls of the plurality of interaction modes;
responding to the selection operation of a selection control of any interactive mode, and determining the selected interactive mode as the target interactive mode;
and in response to not detecting the selection operation of the selection controls of the plurality of interactive modes, executing the step of acquiring at least one attribute information of the account.
In another optional implementation manner, the prompt interface further includes presentation animations of the multiple interaction manners, and the presentation animation of any interaction manner is used to show a process of responding to an interaction operation corresponding to the interaction manner based on the interaction manner.
In another optional implementation manner, after the target interaction manner matched with the at least one attribute information is determined from the plurality of interaction manners, the interaction processing method further includes:
storing the corresponding relation between the account and the target interaction mode;
responding to the re-login of the account, and determining the target interaction mode corresponding to the account based on the stored corresponding relation;
and if the interactive operation is detected on the target interface, responding to the interactive operation based on a target interactive mode corresponding to the interactive operation.
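The correspondence-storing step above can be sketched as follows; the in-memory dictionary and the names used here are illustrative stand-ins for whatever persistent store an implementation would actually use.

```python
# Sketch of caching the account -> target-interaction-mode correspondence so
# that on re-login the stored mode is reused instead of re-running the match.

class ModeStore:
    def __init__(self):
        self._modes = {}  # account id -> target interaction mode

    def save(self, account_id, mode):
        """Store the correspondence between the account and its target mode."""
        self._modes[account_id] = mode

    def mode_on_login(self, account_id, recompute):
        """On login, return the stored mode if one exists; otherwise fall
        back to recomputing the match from the account's attribute info."""
        if account_id in self._modes:
            return self._modes[account_id]
        mode = recompute(account_id)
        self._modes[account_id] = mode
        return mode

store = ModeStore()
store.save("user_1", "swipe_to_switch")
# On re-login the stored mode wins; recompute is not consulted.
print(store.mode_on_login("user_1", recompute=lambda acc: "tap_in_list"))
```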
In another optional implementation manner, the target interface is a video playing interface, and the target interaction manner is used for instructing to switch videos in the video playing interface in response to a sliding operation on the video playing interface;
the interaction processing method further comprises the following steps:
replacing the target interaction mode with a first interaction mode in response to the occurrence frequency of the target playing event being greater than a frequency threshold;
the target playing event is used for indicating that the playing duration of a single video in the video playing interface is less than a duration threshold, and the first interaction mode is used for indicating that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
In another optional implementation manner, the target interface is a video list interface, and the target interaction manner is used for indicating that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played;
the interaction processing method further comprises the following steps:
replacing the target interaction mode with a second interaction mode in response to the fact that the proportion of the clicked video in the video list interface is larger than a proportion threshold value;
the specific gravity of the clicked videos in the video list interface is the ratio of the number of videos acted by the clicking operation in the video list interface to the total number of videos in the video list interface, and the second interaction mode is used for indicating that the videos in the video playing interface are switched in response to the sliding operation on the video playing interface.
In one aspect, an interaction processing apparatus is provided, the apparatus including:
the account management system comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is configured to acquire at least one attribute information of an account, and the attribute information is used for representing the attribute of a user to which the account belongs;
a determining unit configured to determine a target interaction mode matched with the at least one attribute information from a plurality of interaction modes, where the interaction mode is used to indicate a response mode of an interaction operation corresponding to the interaction mode;
and the response unit is configured to respond to the interactive operation based on a target interactive mode corresponding to the interactive operation if the interactive operation is detected on the target interface.
In an optional implementation manner, the determining unit includes:
a first determining subunit, configured to perform determining a matching degree of each of the interaction modes with the at least one attribute information;
and the second determining subunit is configured to determine the interaction mode with the highest matching degree as the target interaction mode.
In another optional implementation manner, the first determining subunit is configured to perform:
acquiring the weight of each attribute information in the at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining the target interaction mode;
determining an interaction mode corresponding to each attribute information;
and adding the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and the at least one attribute information.
In another optional implementation manner, the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information;
the system comprises a target interface, a computer device and channel information, wherein the region information is used for representing a region where the computer device displaying the target interface is located, the channel information is used for representing a downloading channel of an application program to which the target interface belongs, and the device information is used for representing the category of an operating system of the computer device displaying the target interface.
In another optional implementation manner, the obtaining unit is configured to perform, in response to the account being a first registered account, obtaining at least one attribute information of the account.
In another optional implementation manner, the interaction processing apparatus further includes:
a display unit configured to execute displaying a prompt interface, the prompt interface including selection controls of the plurality of interaction modes;
the determining unit is further configured to execute a selection operation of a selection control responding to any interactive mode, and determine the selected interactive mode as the target interactive mode;
the obtaining unit is configured to execute, in response to not detecting the selection operation of the selection controls of the multiple interaction modes, obtaining at least one attribute information of the account.
In another optional implementation manner, the prompt interface further includes presentation animations of the multiple interaction manners, and the presentation animation of any interaction manner is used to show a process of responding to an interaction operation corresponding to the interaction manner based on the interaction manner.
In another optional implementation manner, the interaction processing apparatus further includes:
the storage unit is configured to store the corresponding relation between the account and the target interaction mode;
the determining unit is further configured to, in response to the account logging in again, determine the target interaction mode corresponding to the account based on the stored correspondence;
the response unit is further configured to execute, if the interactive operation is detected on the target interface, a response to the interactive operation based on a target interactive mode corresponding to the interactive operation.
In another optional implementation manner, the target interface is a video playing interface, and the target interaction manner is used to instruct to switch videos in the video playing interface in response to a sliding operation on the video playing interface;
the interaction processing apparatus further includes:
a first replacing unit configured to replace the target interactive mode with a first interactive mode in response to the occurrence frequency of the target playing event being greater than a frequency threshold;
the target playing event is used for indicating that the playing duration of a single video in the video playing interface is less than a duration threshold, and the first interaction mode is used for indicating that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
In another optional implementation manner, the target interface is a video list interface, and the target interaction manner is used for indicating that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played;
the interaction processing apparatus further includes:
the second replacement unit is configured to replace the target interaction mode with a second interaction mode in response to the fact that the proportion of the clicked video in the video list interface is larger than a proportion threshold value;
the proportion of clicked videos in the video list interface is the ratio of the number of videos acted on by click operations in the video list interface to the total number of videos in the video list interface, and the second interaction mode is used for indicating that videos in the video playing interface are switched in response to a sliding operation on the video playing interface.
In one aspect, a computer device is provided, the computer device comprising: one or more processors; a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the interaction processing method according to any one of the above-mentioned optional implementation manners.
In one aspect, a storage medium is provided, and instructions in the storage medium, when executed by a processor of a computer device, enable the computer device to perform the interaction processing method according to any one of the above-mentioned alternative implementations.
In one aspect, a computer program product is provided, and when the instructions in the computer program product are executed by a processor of a computer device, the computer device is enabled to execute the interaction processing method according to any one of the above-mentioned optional implementation manners.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the method comprises the steps of reflecting the interaction habits of different users through at least one attribute information of different accounts, determining a target interaction mode which accords with the interaction habits of the users from a plurality of provided interaction modes, responding to the interaction operation executed by the users on a target interface according to the target interaction mode, meeting the interaction requirements of the users, improving the usability of man-machine interaction, further improving the interaction experience of the users, reducing the user loss caused by the unaccustomed interaction mode, and improving the retention rate of the users.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a schematic diagram of an implementation environment shown in accordance with an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of interaction processing in accordance with an illustrative embodiment;
FIG. 3 is a flow diagram illustrating a method of interaction processing in accordance with an illustrative embodiment;
FIG. 4 is a flow diagram illustrating a method of interaction processing in accordance with an illustrative embodiment;
FIG. 5 is a block diagram illustrating an interaction processing device in accordance with an exemplary embodiment;
FIG. 6 is a block diagram illustrating a terminal in accordance with an exemplary embodiment;
FIG. 7 is a block diagram illustrating a server in accordance with an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims. "plurality" in the embodiments of the present disclosure means two or more.
The user information referred to in the present disclosure is information authorized by the user or sufficiently authorized by each party.
FIG. 1 is a schematic diagram illustrating an implementation environment according to an exemplary embodiment. Referring to FIG. 1, in an optional implementation manner, the implementation environment includes a terminal 110, and the interaction processing method provided by the embodiments of the present disclosure is executed by the terminal 110. Optionally, the terminal 110 is a smartphone, a tablet computer, a smart watch, a laptop computer, or the like. Optionally, various applications, such as a video viewing application and an audio playing application, are installed and run on the terminal 110. For example, the video viewing applications include a short-video application, and the user can perform corresponding interaction operations on the application interface of the short-video application to view the next short video, watch a live video, view other short videos by the same author, and the like.
The terminal 110 may refer to one of a plurality of terminals, and the embodiment is only illustrated by the terminal 110. Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only a few, or the number of the terminals may be several tens or hundreds, or more, and the number of the terminals and the type of the device are not limited in the embodiments of the present disclosure.
In another optional implementation manner, the implementation environment includes a terminal 110 and a server 120, and the interaction processing method provided by the embodiments of the present disclosure is implemented through interaction between the terminal 110 and the server 120. Optionally, the server 120 is a background server of the application program on the terminal and provides background services for the application program. Optionally, the server 120 and the terminal 110 are connected directly or indirectly through wired or wireless communication. Optionally, the server 120 is a single server, a cluster of multiple servers, a cloud server, a cloud computing platform, or a virtualization center. The number of servers may be greater or fewer, which is not limited by the embodiments of the present disclosure. Of course, the server 120 may also include other functional servers in order to provide more comprehensive and diverse services.
The interaction processing method provided by the embodiments of the present disclosure is executed by a computer device. The above implementation environment is described taking the case where the computer device includes a terminal as an example; optionally, the computer device may also be an interactive robot, a self-service ordering machine, or another interactive device, which is not limited by the embodiments of the present disclosure.
FIG. 2 is a flow diagram illustrating an interaction processing method in accordance with an exemplary embodiment. Referring to fig. 2, the interactive processing method is applied to a terminal and includes the following steps.
In step S201, the terminal acquires at least one attribute information of the account, where the attribute information is used to indicate an attribute of a user to which the account belongs.
The account is used to represent the user who uses the terminal. The user can log in to their account on the terminal so that the terminal provides more complete services for the user. The user's related information can be stored with the account as its identifier, facilitating query and use. The terminal acquires attribute information representing the user's attributes and, based on analysis of the attribute information, recommends for the user an interaction mode that conforms to the user's interaction requirements.
In step S202, the terminal determines a target interaction manner matched with at least one attribute information from a plurality of interaction manners, where the interaction manner is used to indicate a response manner of an interaction operation corresponding to the interaction manner.
The terminal is provided with a plurality of interactive modes. For example, for a short video application, one interactive mode is used to instruct to switch videos in a video playing interface in response to a sliding operation on the video playing interface; and another interactive mode is used for indicating that the video acted by the clicking operation is played in response to the clicking operation on the video in the video list interface. For another example, for a song playing application, an interactive mode is used to indicate that lyrics are displayed in response to a leftward swipe operation on a song playing interface; another interaction mode is used for indicating that the lyrics are displayed in response to clicking operation on the song playing interface.
The terminal infers the interaction mode the user tends toward from the user's attribute information, and automatically sets for the user a target interaction mode that conforms to the user's interaction requirements.
In step S203, if the terminal detects an interactive operation on the target interface, the terminal responds to the interactive operation based on the target interactive mode corresponding to the interactive operation.
The terminal takes the target interaction mode matched with the at least one attribute information of the user as the interaction mode suitable for that user, and responds to interaction operations performed by the user based on the target interaction mode. For example, for an interaction mode that responds to a click operation on a video in a video list interface by playing the video acted on by the click operation, the target interface is the video list interface and the interaction operation is the click operation; if the terminal detects a click operation on the video list interface, it plays the video acted on by the click operation based on the target interaction mode corresponding to the click operation.
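The response step above amounts to dispatching a detected operation through the account's target interaction mode. A minimal sketch, in which the mode labels, interface names, and handler names are all illustrative assumptions:

```python
# Sketch of responding to a detected operation only when it matches the
# account's target interaction mode; unmatched operations are ignored.

RESPONSES = {
    # (target mode, interface, operation) -> response action
    ("tap_in_list", "video_list", "click"): "play_clicked_video",
    ("swipe_to_switch", "video_player", "swipe_up"): "switch_to_next_video",
}

def respond(target_mode, interface, operation):
    """Look up the response for this operation under the target mode."""
    return RESPONSES.get((target_mode, interface, operation), "ignore")

print(respond("tap_in_list", "video_list", "click"))
```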
According to the technical solution provided by the embodiments of the present disclosure, at least one attribute information of different accounts reflects the interaction habits of different users; a target interaction mode that conforms to the user's interaction habits is then determined from the plurality of provided interaction modes, and interaction operations performed by the user on the target interface are responded to according to the target interaction mode. This meets the interaction requirements of users, improves the usability of human-computer interaction, improves the user's interaction experience, reduces user churn caused by unfamiliar interaction modes, and improves user retention.
FIG. 3 is a flow diagram illustrating an interaction processing method in accordance with an exemplary embodiment. The process is described by taking the interaction between the terminal and the server as an example, and referring to fig. 3, the interaction processing method includes the following steps.
In step S301, the terminal obtains at least one attribute information of the account, where the attribute information is used to indicate an attribute of a user to which the account belongs.
Optionally, the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information. The region information is used to indicate the region where the terminal is located, for example, Beijing, Wuhan, or Shanghai. Optionally, the interaction mode in the embodiments of the present disclosure refers to an interaction mode of the terminal's operating system; alternatively, it refers to an interaction mode of an application program on the terminal. In the embodiments of the present disclosure, the interaction mode refers to an interaction mode of an application program on the terminal. The channel information is used to indicate the download channel of the application program, for example, an official application download platform provided by the terminal's manufacturer or an application download platform provided by a third party. The device information is used to indicate the type of the terminal's operating system; for example, the operating system of the terminal is Android (an open-source mobile operating system), iOS (iPhone Operating System), or the like.
Because attribute information such as the region information, age information, gender information, channel information, and device information can reflect the interaction habits of the user to a certain extent, an interaction mode meeting the user's interaction requirements can be determined based on the attribute information. This provides a personalized interaction mode for the user, improves the usability of man-machine interaction, further improves the user's interaction experience, reduces user loss caused by an unaccustomed interaction mode, and improves the user retention rate.
It should be noted that the attribute information according to the embodiments of the present disclosure is collected and subsequently processed with the authorization of the user; the terminal has the permission to acquire and process the attribute information.
Optionally, when the user uses the application program for the first time, the terminal prompts the user to complete attribute information such as age information, gender information, and location information. The user can fill in attribute information such as age information, gender information, and location information in an attribute editing interface, and the terminal acquires and stores the attribute information entered in the attribute editing interface. Further, the terminal can acquire the gender information and the age information based on the stored attribute information. For example, if the stored attribute information includes the gender information "female", the terminal acquires that gender information from the stored attribute information. Optionally, the attribute information that the terminal prompts the user to complete includes birthday information but not age information. The terminal can acquire and store the birthday information entered by the user; if the stored attribute information includes the birthday January 1, 2000, and the current time is January 1, 2020, the terminal determines that the age information is 20 based on the birthday information.
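The derivation of age information from stored birthday information described above can be sketched as follows; the function name and date handling are illustrative, not part of the disclosure:

```python
from datetime import date

def age_from_birthday(birthday: date, today: date) -> int:
    """Derive age information from stored birthday information."""
    age = today.year - birthday.year
    # Subtract one year if the birthday has not yet occurred this year.
    if (today.month, today.day) < (birthday.month, birthday.day):
        age -= 1
    return age

# Birthday January 1, 2000; current time January 1, 2020 -> age 20.
print(age_from_birthday(date(2000, 1, 1), date(2020, 1, 1)))
```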
Optionally, the terminal includes a positioning component; the terminal can locate its current geographic position through the positioning component and store the position in a location information buffer. When the terminal runs the application program, if the application program has the location acquisition permission, the terminal acquires the region information from the location information buffer. In other embodiments, if the stored attribute information includes location information, the terminal may use that location information as the region information.
Optionally, the program file generated after the application program is installed includes the channel information of the application program, and the terminal acquires the channel information from the program file of the application program. Optionally, the terminal locally stores the device information; when the terminal runs the application program, if the application program has the permission to acquire the device information, the terminal acquires the locally stored device information.
It should be noted that, optionally, when a user uses the application program for the first time, the terminal provides the user with an interaction mode meeting the user's interaction requirements through the interaction processing method provided by the embodiments of the present disclosure. When a user opens an application program for the first time, the user usually needs to register an account and enter attribute information through the attribute editing interface, so that the background server of the application program can provide a more complete service for the user based on the user's account. Correspondingly, in response to the account being a first-registered account, the terminal executes the step of acquiring at least one piece of attribute information of the account.
According to the technical scheme, the personalized recommendation of the interaction style is performed for the new user based on the attribute information of the new user so as to meet the interaction requirement of the new user, so that the interaction mode conforms to the interaction habit of the new user, the usability of man-machine interaction is improved, the loss of the new user is reduced, and the retention rate of the user is improved.
Optionally, if the account logged in on the terminal is not a first-registered account, the terminal acquires the historical interaction mode corresponding to the account, responds to detected interaction operations based on the historical interaction mode, and no longer performs steps S301 to S305.
In step S302, the terminal sends an acquisition request to the server, where the acquisition request is used to request to acquire a target interaction mode matched with at least one piece of attribute information.
Wherein the acquisition request carries at least one attribute information. The terminal sends an acquisition request to the server to acquire a target interaction mode meeting the interaction requirements of the user.
In step S303, the server receives the acquisition request, and determines a target interaction manner matching with the at least one attribute information from the plurality of interaction manners.
The interaction mode is used to indicate a response mode for the interaction operation corresponding to that interaction mode. For example, one interaction mode is used to instruct switching of the video in the video playing interface in response to a sliding operation on the video playing interface; the interaction operations corresponding to this mode include an upward sliding operation on the video playing interface and a downward sliding operation on the video playing interface, and the response mode indicated by the mode includes: playing the next video in response to the upward sliding operation on the video playing interface, and playing the previous video in response to the downward sliding operation on the video playing interface.
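The mapping from interaction operations to responses that an interaction mode indicates could be sketched as a dispatch table; the interface and operation names below are hypothetical placeholders, not identifiers from the disclosure:

```python
# Hypothetical dispatch table: an interaction mode maps an interaction
# operation on a given interface to a response action.
SWIPE_MODE = {
    ("video_play_interface", "swipe_up"): "play_next_video",
    ("video_play_interface", "swipe_down"): "play_previous_video",
}

def respond(mode: dict, interface: str, operation: str) -> str:
    """Return the response indicated by the interaction mode, if any."""
    return mode.get((interface, operation), "ignore")

print(respond(SWIPE_MODE, "video_play_interface", "swipe_up"))  # play_next_video
```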
In a first optional implementation manner, the server determines a matching degree of each interaction manner and at least one attribute information; and determining the interactive mode with the highest matching degree as a target interactive mode.
According to the technical scheme, the interaction mode with the highest matching degree with the attribute information of the user is determined as the target interaction mode, the interaction mode which conforms to the interaction habit of the user is provided for the user, the interaction requirement of the user is met as far as possible, the usability of man-machine interaction is improved, the interaction experience of the user is further improved, the user loss caused by the unaccustomed interaction mode is reduced, and the user retention rate is improved.
The step of determining the matching degree of each interactive mode and at least one attribute information by the server comprises the following steps: the server acquires the weight of each attribute information in at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining the target interaction mode; the server determines an interaction mode corresponding to each attribute information; and the server adds the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and at least one attribute information.
For example, the at least one piece of attribute information includes region information, age information, gender information, channel information, and device information. The region information is city A; the age information is 50; the gender information is male; the channel information is channel a; the device information is Android. The weight of the region information is 0.2; the weight of the age information is 0.3; the weight of the gender information is 0.2; the weight of the channel information is 0.2; the weight of the device information is 0.1. The interaction mode corresponding to city A is the second interaction mode; the interaction mode corresponding to age information between 40 and 60 is the second interaction mode; the interaction mode corresponding to the gender information male is the first interaction mode; the interaction mode corresponding to channel a is the second interaction mode; the interaction mode corresponding to the Android device information is the first interaction mode. Correspondingly, the matching degree of the first interaction mode is 0.3 and the matching degree of the second interaction mode is 0.7, so the server determines the second interaction mode as the target interaction mode.
According to the technical scheme, each piece of attribute information is given a weight, so that attribute information that reflects the user's interaction habits more accurately carries a higher weight; adding the weights of the attribute information corresponding to the same interaction mode yields the matching degree between that interaction mode and the at least one piece of attribute information. This improves the accuracy of the matching degree, so that the interaction mode determined based on the matching degree can meet the user's interaction requirements, improving the usability of man-machine interaction, improving the user's interaction experience, reducing user loss caused by an unaccustomed interaction mode, and improving the user retention rate.
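The weighted-matching computation of the worked example above can be sketched as follows; the dictionary keys, mode labels, and concrete weights mirror the example and are illustrative only:

```python
# Weights per attribute and the attribute-value -> mode mapping from the
# worked example; the concrete values are illustrative, not normative.
WEIGHTS = {"region": 0.2, "age": 0.3, "gender": 0.2, "channel": 0.2, "device": 0.1}

ATTRIBUTE_TO_MODE = {
    ("region", "city_a"): "second",
    ("age", "40-60"): "second",
    ("gender", "male"): "first",
    ("channel", "channel_a"): "second",
    ("device", "android"): "first",
}

def target_mode(attributes: dict) -> str:
    """Sum the weights of attributes mapped to each mode; pick the highest."""
    scores: dict = {}
    for name, value in attributes.items():
        mode = ATTRIBUTE_TO_MODE.get((name, value))
        if mode is not None:
            scores[mode] = scores.get(mode, 0.0) + WEIGHTS[name]
    return max(scores, key=scores.get)

account = {"region": "city_a", "age": "40-60", "gender": "male",
           "channel": "channel_a", "device": "android"}
print(target_mode(account))  # second (matching degree 0.7 vs 0.3)
```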
It should be noted that, optionally, the server stores a corresponding relationship between attribute information and interaction modes, and can determine the interaction mode corresponding to each piece of attribute information according to that stored relationship. Before storing the corresponding relationship, the server determines the interaction mode corresponding to each piece of attribute information based on historical interaction data of multiple accounts. This step includes: the server determines, based on the historical interaction data of multiple accounts having the same attribute information, the number of accounts applying each interaction mode, and determines the interaction mode with the largest number of corresponding accounts as the interaction mode corresponding to that attribute information.
For example, for the region information city A, the number of accounts is 10 million, and the terminal provides a first interaction mode and a second interaction mode. Among the 10 million accounts, 2 million accounts apply the first interaction mode and 8 million accounts apply the second interaction mode; the interaction mode corresponding to the region information city A is therefore the second interaction mode.
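The per-attribute majority vote described above can be sketched as follows (account counts are scaled down for illustration; the function name is hypothetical):

```python
from collections import Counter

def mode_for_attribute(historical_modes: list) -> str:
    """Pick the interaction mode applied by the largest number of accounts
    that share the same attribute information."""
    return Counter(historical_modes).most_common(1)[0][0]

# 2 million accounts in city A applied the first mode, 8 million the second
# (scaled down 1,000,000x here).
modes = ["first"] * 2 + ["second"] * 8
print(mode_for_attribute(modes))  # second
```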
It should be noted that the server may also determine the interaction mode corresponding to the attribute information in other ways based on historical interaction data of multiple accounts, which is not limited in the embodiment of the present disclosure.
In a second optional implementation manner, the server predicts a target interaction manner matched with the at least one piece of attribute information through an interaction manner prediction model. Correspondingly, the step of determining the target interaction mode matched with the at least one attribute information from the plurality of interaction modes by the server comprises the following steps: and the server inputs at least one attribute information into the interactive mode prediction model to obtain a target interactive mode matched with the at least one attribute information.
The interactive mode prediction model is obtained by training based on a machine learning method. And the server takes at least one attribute information of the account and the interactive mode applied by the account as training samples, and trains the interactive mode prediction model to enable the interactive mode prediction model to have the capability of predicting the interactive mode based on the attribute information.
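The disclosure only states that the prediction model is trained with a machine-learning method, without naming one; as a stand-in, a simple nearest-neighbour sketch over attribute tuples illustrates the idea of predicting a mode from attribute information (all names and sample data are hypothetical):

```python
def predict_mode(attributes: tuple, training_samples: list) -> str:
    """1-nearest-neighbour stand-in for the interaction mode prediction
    model: return the mode applied by the training account whose attribute
    information overlaps most with the query account's."""
    def overlap(sample):
        return sum(a == b for a, b in zip(attributes, sample[0]))
    return max(training_samples, key=overlap)[1]

# Training samples: (attribute information of account, mode the account applied).
training = [
    (("city_a", "40-60", "male", "channel_a", "android"), "second"),
    (("city_b", "20-40", "female", "channel_b", "ios"), "first"),
]
query = ("city_a", "40-60", "female", "channel_a", "android")
print(predict_mode(query, training))  # second
```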
It should be noted that, after the server determines the target interaction mode matching the at least one piece of attribute information from the plurality of interaction modes, the server also stores the corresponding relationship between the account and the target interaction mode, so that when the account logs in again, the target interaction mode can be determined directly based on the corresponding relationship. That is, in response to the account logging in again, the server determines the target interaction mode corresponding to the account based on the stored corresponding relationship.
According to the technical scheme, after the target interaction mode matched with the at least one piece of attribute information is determined, the account to which the at least one piece of attribute information belongs and the target interaction mode are stored in correspondence, so that when the account logs in again, the target interaction mode corresponding to the account can be determined directly based on the stored corresponding relationship. This improves the determination efficiency of the target interaction mode, allows the interface interaction service to be quickly provided for the user based on the target interaction mode, further improves the user's interaction experience, and improves the user retention rate. In addition, the computing resources consumed by re-determining the target interaction mode are reduced, and the resource utilization rate is improved.
In step S304, the server returns the target interaction mode to the terminal.
And the server returns the determined target interaction mode to the terminal so as to feed back the acquisition request of the terminal.
It should be noted that, in the embodiments of the present disclosure, steps S302 to S304 are described by taking as an example the terminal obtaining the target interaction mode through interaction with the server. In some embodiments, the terminal may also locally perform the step of determining the target interaction mode matched with the at least one piece of attribute information from the multiple interaction modes; the process by which the terminal does so is the same as the process by which the server does so, and is not described here again.
In step S305, the terminal receives the target interaction mode, and if the terminal detects an interaction operation on the target interface, the terminal responds to the interaction operation based on the target interaction mode corresponding to the interaction operation.
The target interface can support implementation of a target interaction mode. For example, for a short video application, the target interaction mode is used to instruct to switch videos in the video playing interface in response to a sliding operation on the video playing interface. The target interface is a video playing interface, and the video playing interface is used for playing a single video. If the terminal detects a sliding operation on the video playing interface, responding to the sliding operation based on a target interaction mode corresponding to the sliding operation, that is, if the terminal detects the sliding operation on the target interface, switching the video played on the video playing interface to other videos.
For another example, for a short video application, the target interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, a video acted on by the click operation is played. And the target interface is a video list interface. Optionally, the video list interface is used for double-row display of video covers. And if the terminal detects the click operation on the cover of the video on the video list interface, the video acted by the click operation is played in a full screen mode.
According to the technical scheme, the interaction habits of different users are reflected by at least one piece of attribute information of different accounts; a target interaction mode conforming to the user's interaction habits is then determined from the multiple provided interaction modes, and the interaction operation performed by the user on the target interface is responded to according to the target interaction mode. This meets the user's interaction requirements, improves the usability of man-machine interaction, improves the user's interaction experience, reduces user loss caused by an unaccustomed interaction mode, and improves the user retention rate.
It should be noted that, optionally, in the process that the terminal provides the interactive service for the user in the target interactive mode, the target interactive mode can be adjusted based on the interactive data of the user, so as to provide the interactive mode suitable for the user.
In one example, the target interface is a video playing interface, and the target interaction mode is used to instruct switching of videos in the video playing interface in response to a sliding operation on the video playing interface. Correspondingly, the step of adjusting the target interaction mode by the terminal includes: the terminal replaces the target interaction mode with a first interaction mode in response to the occurrence frequency of a target playing event being greater than a frequency threshold. The target playing event is used to indicate that the playing duration of a single video in the video playing interface is less than a duration threshold, and the first interaction mode is used to indicate playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation.
Wherein the duration threshold is set based on the total duration of a single video. For example, the total duration of a single video is 11 seconds, the duration threshold may be set to 2 seconds or 3 seconds, etc. The frequency threshold is a preset value between 0 and 1, for example, the frequency threshold is 0.8 or 0.9.
In this technical scheme, the terminal switches to the first interaction mode to provide the interaction service for the user when the user frequently switches videos through sliding operations. For example, the total duration of a single video is 11 seconds, the duration threshold is 2 seconds, and the frequency threshold is 0.8; if the playing duration of more than 80 of the 100 videos browsed by the user is less than 2 seconds, this indicates that the user is accustomed to quickly switching videos in the video playing interface, and the interaction mode of quickly learning video information by viewing video covers in the video list interface is more suitable for this user.
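The adjustment rule above — counting target playing events against the frequency threshold — can be sketched as follows; the function name and threshold values are illustrative:

```python
def should_switch_to_list_mode(play_durations, duration_threshold=2.0,
                               frequency_threshold=0.8):
    """Switch to the video-list interaction mode when the fraction of
    target playing events (plays shorter than the duration threshold)
    exceeds the frequency threshold."""
    short_plays = sum(1 for d in play_durations if d < duration_threshold)
    return short_plays / len(play_durations) > frequency_threshold

# 85 of 100 browsed videos were watched for under 2 seconds: 0.85 > 0.8.
durations = [1.0] * 85 + [11.0] * 15
print(should_switch_to_list_mode(durations))  # True
```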
In this technical scheme, if the terminal provides the user with the interaction mode of switching videos in the video playing interface in response to a sliding operation, but the user switches videos without effectively browsing them, the terminal switches to the interaction mode of playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation. The user can then quickly learn video information from the video covers in the video list interface and decide whether to click a cover to watch the video, without frequent sliding operations merely to switch videos and learn their general information. This improves man-machine interaction efficiency, and thus further improves the usability of man-machine interaction and the user's interaction experience; providing the interaction service through an interaction mode suitable for the user also improves the user retention rate.
In another example, the target interface is a video list interface, and the target interaction mode is used to indicate playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation. Correspondingly, the step of adjusting the target interaction mode by the terminal includes: the terminal replaces the target interaction mode with a second interaction mode in response to the proportion of clicked videos in the video list interface being greater than a proportion threshold. The proportion of clicked videos in the video list interface is the ratio of the number of videos acted on by click operations in the video list interface to the total number of videos in the video list interface, and the second interaction mode is used to indicate switching of videos in the video playing interface in response to a sliding operation on the video playing interface. The proportion threshold is a preset value between 0 and 1, for example, 0.8 or 0.9.
According to the technical scheme, the terminal switches to the second interaction mode to provide the interaction service for the user when the user clicks and watches many of the videos in the video list interface. For example, if the proportion threshold is 0.8, the terminal recommends 100 videos for the user through the video list interface, and the user watches more than 80 of the 100 videos through click operations, this indicates that the user is accustomed to watching most videos in the video list interface, and the mode of switching videos through sliding operations is more suitable for this user.
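The proportion check described above can be sketched as follows; the function name and values are illustrative:

```python
def should_switch_to_swipe_mode(clicked_videos: int, total_videos: int,
                                proportion_threshold: float = 0.8) -> bool:
    """Switch to the swipe interaction mode when the proportion of clicked
    videos in the video list interface exceeds the proportion threshold."""
    return clicked_videos / total_videos > proportion_threshold

# 85 of 100 recommended videos were clicked and watched: 0.85 > 0.8.
print(should_switch_to_swipe_mode(85, 100))  # True
```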
According to the technical scheme, if the user clicks and watches most of the videos in the video list interface, the terminal switches to the interaction mode of switching videos in the video playing interface in response to a sliding operation on the video playing interface. The user can then switch to the next video with a simple sliding operation, without having to click to open a video, return to the video list interface after watching it, and then click the next video. This improves man-machine interaction efficiency, and thus further improves the usability of man-machine interaction and the user's interaction experience; providing the interaction service through an interaction mode suitable for the user also improves the user retention rate.
Another point to be explained is that the above embodiments are described by taking as an example the terminal automatically determining, based on the user's attribute information, the target interaction mode meeting the user's interaction requirements. Optionally, before determining the target interaction mode through steps S301 to S304, the terminal also supports the user autonomously selecting one of the multiple interaction modes as the target interaction mode. If the user selects an interaction mode, the terminal responds to the user's interaction operation based on the user-selected target interaction mode through a process similar to step S305. If the user does not select an interaction mode, the terminal determines the target interaction mode through steps S301 to S304; that is, the terminal performs the step of acquiring at least one piece of attribute information of the account in response to not detecting a selection operation on the selection controls of the multiple interaction modes.
Referring to fig. 4, a process of the terminal supporting a user to autonomously select one of a plurality of interactive manners as a target interactive manner includes the following steps S401 to S402.
In step S401, the terminal displays a prompt interface, where the prompt interface includes a plurality of interactive mode selection controls.
Each interaction mode corresponds to a selection control. The prompt interface includes an interaction profile of each interaction mode and a selection control for each interaction mode. For example, the interaction profile of the interaction mode of playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation is "double-column cover display"; the interaction profile of the interaction mode of switching videos in the video playing interface in response to a sliding operation is "single-column up-and-down sliding display". The interaction profile of each interaction mode is displayed alongside that mode's selection control to facilitate viewing and selection by the user. Optionally, the selection control is a control in the form of a button.
Optionally, the prompt interface further includes presentation animations of the multiple interaction modes; the presentation animation of any interaction mode is used to show the process of responding, based on that interaction mode, to the interaction operation corresponding to it. By displaying the presentation animations in the prompt interface, the terminal vividly and intuitively shows the interaction process of each interaction mode, which is easy for the user to understand, improves the efficiency with which the user selects an interaction mode, and further improves the user experience.
In step S402, the terminal determines the selected interactive mode as the target interactive mode in response to the selection operation of the selection control of any interactive mode.
And the terminal determines the interaction mode selected by the user as a target interaction mode, and responds to the interaction operation detected on the target interface based on the target interaction mode.
According to the technical scheme, the interaction mode selected by the user independently is used as the target interaction mode, so that the interaction service is provided for the user based on the interaction mode selected by the user independently, the interaction mode can meet the interaction requirement of the user, the usability of man-machine interaction is improved, the interaction experience of the user is further improved, the user loss caused by the unaccustomed interaction mode is reduced, and the user retention rate is improved.
Optionally, after determining the target interaction mode, the terminal further sends a mode storage request to the server, where the mode storage request is used to request storage of a corresponding relationship between an account and the target interaction mode, and the mode storage request carries the account and the target interaction mode, where the account is an account logged in by the terminal. The server receives the mode storage request of the terminal, correspondingly stores the corresponding relation between the account and the target interaction mode, so that when the account logs in again, the target interaction mode corresponding to the account can be determined directly based on the stored corresponding relation, the determination efficiency of the target interaction mode is improved, interface interaction service can be provided for the user based on the target interaction mode quickly, the interaction experience of the user is further improved, and the retention rate of the user is improved. In addition, the computing resources consumed by re-determining the target interaction mode are reduced, and the resource utilization rate is improved.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present disclosure, and are not described in detail herein.
FIG. 5 is a block diagram illustrating an interaction processing device according to an example embodiment. Referring to fig. 5, the apparatus includes an acquisition unit 501, a determination unit 502, and a response unit 503.
An obtaining unit 501, configured to perform obtaining at least one attribute information of an account, where the attribute information is used to indicate an attribute of a user to which the account belongs;
a determining unit 502 configured to determine, from the plurality of interaction manners, a target interaction manner matched with the at least one attribute information, where the interaction manner is used to indicate a response manner of an interaction operation corresponding to the interaction manner;
the response unit 503 is configured to execute, if the interactive operation is detected on the target interface, a response to the interactive operation based on a target interactive mode corresponding to the interactive operation.
According to the technical scheme, the interaction habits of different users are reflected by at least one piece of attribute information of different accounts; a target interaction mode conforming to the user's interaction habits is then determined from the multiple provided interaction modes, and the interaction operation performed by the user on the target interface is responded to according to the target interaction mode. This meets the user's interaction requirements, improves the usability of man-machine interaction, improves the user's interaction experience, reduces user loss caused by an unaccustomed interaction mode, and improves the user retention rate.
In an alternative implementation, the determining unit 502 includes:
the first determining subunit is configured to determine the matching degree of each interactive mode and at least one attribute information;
and the second determining subunit is configured to determine the interaction mode with the highest matching degree as the target interaction mode.
In another optional implementation manner, the first determining subunit is configured to perform:
acquiring the weight of each attribute information in at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining a target interaction mode;
determining an interaction mode corresponding to each attribute information;
and adding the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and at least one attribute information.
In another optional implementation manner, the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information;
the system comprises a target interface display unit, a channel information display unit and a device information display unit, wherein the channel information display unit is used for displaying the channel information of the application program of the computer equipment of the target interface, the channel information display unit is used for displaying the download channel of the application program of the target interface, and the device information display unit is used for displaying the type of the operating system of the computer equipment of the target interface.
In another optional implementation manner, the obtaining unit 501 is configured to, in response to the account being registered for the first time, acquire the at least one attribute information of the account.
In another optional implementation manner, the interaction processing apparatus further includes:
the display unit is configured to execute display of a prompt interface, and the prompt interface comprises a plurality of interactive mode selection controls;
the determining unit 502 is further configured to, in response to a selection operation on the selection control of any interaction mode, determine the selected interaction mode as the target interaction mode;
the obtaining unit 501 is configured to execute, in response to not detecting the selection operation of the selection controls of the multiple interaction modes, obtaining at least one attribute information of the account.
In another optional implementation manner, the prompt interface further includes a plurality of interactive mode presentation animations, and the presentation animation of any interactive mode is used for showing a process of responding to the interactive operation corresponding to the interactive mode based on the interactive mode.
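The prompt-interface flow above amounts to a simple fallback: honour an explicit user selection if one was made, otherwise infer a mode from the account attributes. A hypothetical sketch (function and mode names are illustrative, not from the embodiment):

```python
# Hypothetical sketch of the prompt-interface flow described above.

def infer_mode_from_attributes(attributes):
    # Stand-in for the weighted attribute-matching step.
    return "swipe_to_switch"

def choose_target_mode(user_selection, attributes):
    if user_selection is not None:
        # The user tapped one of the selection controls on the prompt
        # interface, so that choice takes priority.
        return user_selection
    # No selection operation detected: fall back to inferring the
    # target interaction mode from the account's attribute information.
    return infer_mode_from_attributes(attributes)

print(choose_target_mode("click_to_play", {}))        # explicit choice wins
print(choose_target_mode(None, {"age": "young"}))     # falls back to inference
```

This keeps the attribute-based matching as a silent default while still letting the user override it on the prompt interface.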
In another optional implementation manner, the interaction processing apparatus further includes:
the storage unit is configured to store the correspondence between the account and the target interaction mode;
the determining unit 502 is further configured to perform, in response to the re-login of the account, determining a target interaction mode corresponding to the account based on the stored correspondence;
the response unit 503 is further configured to, if an interactive operation is detected on the target interface, respond to the interactive operation based on the target interaction mode corresponding to the interactive operation.
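The stored account-to-mode correspondence can be sketched as a small lookup, consulted on re-login so the interface keeps behaving the same across sessions. An in-memory dictionary stands in for whatever persistent store a real implementation would use; all names are illustrative.

```python
# Minimal in-memory account -> target-mode store. A real application
# would persist this mapping; the names here are assumptions only.

mode_store = {}

def save_correspondence(account_id, target_mode):
    # Store the correspondence between the account and its target mode.
    mode_store[account_id] = target_mode

def mode_on_relogin(account_id, default_mode="click_to_play"):
    # On re-login, reuse the previously determined mode; fall back to
    # a default for accounts with no stored correspondence.
    return mode_store.get(account_id, default_mode)

save_correspondence("user_42", "swipe_to_switch")
print(mode_on_relogin("user_42"))  # swipe_to_switch
print(mode_on_relogin("user_99"))  # click_to_play (default)
```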
In another optional implementation manner, the target interface is a video playing interface, and the target interaction manner is used for instructing to switch videos in the video playing interface in response to a sliding operation on the video playing interface;
the interaction processing apparatus further includes:
a first replacing unit configured to replace the target interactive mode with a first interactive mode in response to the occurrence frequency of the target play event being greater than a frequency threshold;
the target playing event is used for indicating that the playing duration of a single video in the video playing interface is smaller than a duration threshold, and the first interaction mode is used for indicating that the video acted by the click operation is played in response to the click operation on the video in the video list interface.
In another optional implementation manner, the target interface is a video list interface, and the target interaction manner is used for indicating that a video acted by a click operation is played in response to the click operation on the video in the video list interface;
the interaction processing apparatus further includes:
the second replacement unit is configured to replace the target interaction mode with a second interaction mode in response to the fact that the proportion of the clicked video in the video list interface is larger than the proportion threshold value;
the specific gravity of the clicked videos in the video list interface is the ratio of the number of the videos acted by the clicking operation in the video list interface to the total number of the videos in the video list interface, and the second interaction mode is used for indicating that the videos in the video playing interface are switched in response to the sliding operation on the video playing interface.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In the embodiments of the present disclosure, the computer device may be configured as a terminal or as a server. If the computer device is configured as a terminal, the terminal serves as the execution subject that implements the technical solution provided by the embodiments of the present disclosure. If the computer device is configured as a server, the server serves as the execution subject. Alternatively, the technical solution provided by the present disclosure may be implemented through interaction between a terminal and a server, which is not limited in the embodiments of the present disclosure.
If the computer device is configured as a terminal, FIG. 6 is a block diagram illustrating a terminal according to an example embodiment. The terminal 600 may be a smart phone, a tablet computer, a smart watch, a laptop computer, or the like. The terminal 600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, the terminal 600 includes: a processor 601 and a memory 602.
The processor 601 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 601 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 601 may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 602 is used to store at least one instruction for execution by the processor 601 to implement the interaction processing method provided by the method embodiments of the present disclosure.
In some embodiments, the terminal 600 may further optionally include: a peripheral interface 603 and at least one peripheral. The processor 601, memory 602, and peripheral interface 603 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 603 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 604, a display 605, a camera assembly 606, an audio circuit 607, a positioning component 608, and a power supply 609.
The peripheral interface 603 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 601 and the memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral interface 603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 604 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 604 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, successive generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 604 may further include NFC (Near Field Communication) related circuits, which is not limited by the present disclosure.
The display 605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 605 is a touch display, it can also capture touch signals on or over its surface; such a touch signal may be input to the processor 601 as a control signal for processing. In that case, the display 605 may also provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 605, disposed on the front panel of the terminal 600; in other embodiments, there may be at least two displays 605, disposed on different surfaces of the terminal 600 or in a folded design; in still other embodiments, the display 605 may be a flexible display disposed on a curved or folded surface of the terminal 600. The display 605 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly shaped screen. The display 605 may use an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) panel, and the like.
The camera assembly 606 is used to capture images or video. Optionally, the camera assembly 606 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal and the rear camera on its rear surface. In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, or a telephoto camera. The main camera and the depth-of-field camera may be fused to realize a background blurring function, and the main camera and the wide-angle camera may be fused to realize panoramic and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 606 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuitry 607 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs them to the processor 601 for processing or to the radio frequency circuit 604 for voice communication. Multiple microphones may be provided at different portions of the terminal 600 for stereo collection or noise reduction; the microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 601 or the radio frequency circuit 604 into sound waves. The speaker may be a conventional membrane speaker or a piezoelectric ceramic speaker; a piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 607 may also include a headphone jack.
The positioning component 608 is used for positioning the current geographic location of the terminal 600 to implement navigation or LBS (Location Based Service). The positioning component 608 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 609 is used to supply power to the various components in the terminal 600. The power supply 609 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 609 includes a rechargeable battery, the battery may be charged through a wired line (a wired rechargeable battery) or through a wireless coil (a wireless rechargeable battery), and may also support fast-charging technology.
In some embodiments, the terminal 600 also includes one or more sensors 610. The one or more sensors 610 include, but are not limited to: acceleration sensor 611, gyro sensor 612, pressure sensor 613, fingerprint sensor 614, optical sensor 615, and proximity sensor 616.
The acceleration sensor 611 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 600. For example, the acceleration sensor 611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 601 may control the display screen 605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 611. The acceleration sensor 611 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 612 may detect a body direction and a rotation angle of the terminal 600, and the gyro sensor 612 and the acceleration sensor 611 may cooperate to acquire a 3D motion of the user on the terminal 600. The processor 601 may implement the following functions according to the data collected by the gyro sensor 612: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 613 may be disposed on the side bezel of terminal 600 and/or underneath display screen 605. When the pressure sensor 613 is disposed on the side frame of the terminal 600, a holding signal of the user to the terminal 600 can be detected, and the processor 601 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 613. When the pressure sensor 613 is disposed at the lower layer of the display screen 605, the processor 601 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 605. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 614 is used for collecting a fingerprint of a user, and the processor 601 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 614, or the fingerprint sensor 614 identifies the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 601 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 614 may be disposed on the front, back, or side of the terminal 600. When a physical button or vendor Logo is provided on the terminal 600, the fingerprint sensor 614 may be integrated with the physical button or vendor Logo.
The optical sensor 615 is used to collect the ambient light intensity. In one embodiment, processor 601 may control the display brightness of display screen 605 based on the ambient light intensity collected by optical sensor 615. Specifically, when the ambient light intensity is high, the display brightness of the display screen 605 is increased; when the ambient light intensity is low, the display brightness of the display screen 605 is adjusted down. In another embodiment, the processor 601 may also dynamically adjust the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615.
The proximity sensor 616, also known as a distance sensor, is typically disposed on the front panel of the terminal 600 and collects the distance between the user and the front surface of the terminal 600. In one embodiment, when the proximity sensor 616 detects that this distance is gradually decreasing, the processor 601 controls the display 605 to switch from the screen-on state to the screen-off state; when it detects that the distance is gradually increasing, the processor 601 controls the display 605 to switch from the screen-off state back to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 6 is not intended to be limiting of terminal 600 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
If the computer device is configured as a server, FIG. 7 is a block diagram of a server according to an exemplary embodiment. The server 700 may vary considerably depending on configuration and performance, and may include one or more processors (CPUs) 701 and one or more memories 702, where the memories 702 store executable instructions and the processors 701 are configured to execute those instructions to implement the interaction processing method provided by the above method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for performing input and output, as well as other components for implementing device functions, which are not described here again.
In an exemplary embodiment, there is also provided a storage medium comprising instructions, such as a memory 702 comprising instructions, executable by the processor 701 of the server 700 to perform the above-described interaction processing method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, for example, the non-transitory computer readable storage medium may be a ROM (Read-Only Memory), a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, in which instructions, when executed by a processor of a computer device, enable the computer device to perform the interaction processing method in the above-described method embodiments.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (22)

1. An interaction processing method, characterized in that the method comprises:
acquiring at least one attribute information of an account, wherein the attribute information is used for representing the attribute of a user to which the account belongs;
determining a target interaction mode matched with the at least one attribute information from a plurality of interaction modes, wherein the interaction mode is used for indicating a response mode of an interaction operation corresponding to the interaction mode;
if the interactive operation is detected on the target interface, responding to the interactive operation based on a target interactive mode corresponding to the interactive operation;
wherein the target interface is an interface in a video application program, and the plurality of interaction modes include:
responding to the sliding operation on the target interface, and switching the mode of the video in the target interface;
responding to the click operation of the video in the target interface, and playing the mode of the video acted by the click operation;
or, the target interface is an interface in a song playing application program, and the plurality of interaction modes include:
responding to the operation of sliding leftwards on the target interface, and displaying the lyric mode;
and responding to the click operation on the target interface, and displaying the lyric mode.
2. The interaction processing method according to claim 1, wherein the determining a target interaction manner that matches the at least one attribute information from among the plurality of interaction manners includes:
determining the matching degree of each interactive mode and the at least one attribute information;
and determining the interactive mode with the highest matching degree as the target interactive mode.
3. The interaction processing method according to claim 2, wherein the determining a matching degree between each of the interaction means and the at least one attribute information includes:
acquiring the weight of each attribute information in the at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining the target interaction mode;
determining an interaction mode corresponding to each attribute information;
and adding the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and the at least one attribute information.
4. The interactive processing method according to claim 1, wherein the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information;
the system comprises a target interface, a computer device and channel information, wherein the region information is used for representing a region where the computer device displaying the target interface is located, the channel information is used for representing a downloading channel of an application program to which the target interface belongs, and the device information is used for representing the category of an operating system of the computer device displaying the target interface.
5. The interaction processing method according to claim 1, wherein before the obtaining of the at least one attribute information of the account, the interaction processing method further comprises:
and responding to the account number which is registered for the first time, and executing the step of acquiring at least one attribute information of the account number.
6. The interaction processing method according to claim 1, wherein before the obtaining of the at least one attribute information of the account, the interaction processing method further comprises:
displaying a prompt interface, wherein the prompt interface comprises selection controls of the plurality of interaction modes;
responding to the selection operation of a selection control of any interactive mode, and determining the selected interactive mode as the target interactive mode;
and in response to not detecting the selection operation of the selection controls of the plurality of interactive modes, executing the step of acquiring at least one attribute information of the account.
7. The interaction processing method according to claim 6, wherein the prompt interface further includes presentation animations of the plurality of interaction modes, and the presentation animation of any interaction mode is used to show a process of responding to an interaction operation corresponding to the interaction mode based on the interaction mode.
8. The interaction processing method according to claim 1, wherein after the target interaction means matched with the at least one attribute information is determined from the plurality of interaction means, the interaction processing method further comprises:
storing the corresponding relation between the account and the target interaction mode;
responding to the re-login of the account, and determining the target interaction mode corresponding to the account based on the stored corresponding relation;
and if the interactive operation is detected on the target interface, responding to the interactive operation based on a target interactive mode corresponding to the interactive operation.
9. The interaction processing method according to claim 1, wherein the target interface is a video playing interface, and the target interaction manner is used to instruct to switch videos in the video playing interface in response to a sliding operation on the video playing interface;
the interaction processing method further comprises the following steps:
replacing the target interaction mode with a first interaction mode in response to the occurrence frequency of the target playing event being greater than a frequency threshold;
the target playing event is used for indicating that the playing duration of a single video in the video playing interface is smaller than a duration threshold, and the first interaction mode is used for indicating that the video acted by the click operation is played in response to the click operation on the video in the video list interface.
10. The interaction processing method according to claim 1, wherein the target interface is a video list interface, and the target interaction mode is used to instruct, in response to a click operation on a video in the video list interface, to play a video acted by the click operation;
the interaction processing method further comprises the following steps:
in response to the fact that the proportion of the clicked video in the video list interface is larger than a proportion threshold value, replacing the target interaction mode with a second interaction mode;
the specific gravity of the clicked videos in the video list interface is the ratio of the number of videos acted by the clicking operation in the video list interface to the total number of videos in the video list interface, and the second interaction mode is used for indicating that the videos in the video playing interface are switched in response to the sliding operation on the video playing interface.
11. An interaction processing apparatus, comprising:
the account management system comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is configured to acquire at least one attribute information of an account, and the attribute information is used for representing the attribute of a user to which the account belongs;
a determining unit configured to determine a target interaction mode matched with the at least one attribute information from a plurality of interaction modes, where the interaction mode is used to indicate a response mode of an interaction operation corresponding to the interaction mode;
the response unit is configured to execute a response to the interactive operation based on a target interactive mode corresponding to the interactive operation if the interactive operation is detected on a target interface;
wherein the target interface is an interface in a video application program, and the plurality of interaction modes include:
responding to the sliding operation on the target interface, and switching the mode of the video in the target interface;
responding to the click operation of the video in the target interface, and playing the mode of the video acted by the click operation;
or, the target interface is an interface in a song playing application program, and the plurality of interaction modes include:
responding to the operation of sliding leftwards on the target interface, and displaying the lyric mode;
and responding to the click operation on the target interface, and displaying the lyric mode.
12. The interaction processing apparatus according to claim 11, wherein the determining unit includes:
a first determining subunit, configured to perform determining a matching degree of each of the interaction modes with the at least one attribute information;
and the second determining subunit is configured to determine the interaction mode with the highest matching degree as the target interaction mode.
13. The interaction processing apparatus according to claim 12, wherein the first determining subunit is configured to perform:
acquiring the weight of each attribute information in the at least one attribute information, wherein the weight is used for representing the importance degree of the attribute information for determining the target interaction mode;
determining an interaction mode corresponding to each attribute information;
and adding the weights of the attribute information corresponding to the same interactive mode to obtain the matching degree of the interactive mode and the at least one attribute information.
14. The interaction processing apparatus according to claim 11, wherein the at least one attribute information includes at least one of region information, age information, gender information, channel information, and device information;
the system comprises a target interface, a computer device and channel information, wherein the region information is used for representing a region where the computer device displaying the target interface is located, the channel information is used for representing a downloading channel of an application program to which the target interface belongs, and the device information is used for representing the category of an operating system of the computer device displaying the target interface.
15. The interaction processing apparatus according to claim 11, wherein the obtaining unit is configured to, in response to the account being registered for the first time, acquire the at least one attribute information of the account.
16. The interaction processing apparatus according to claim 11, further comprising:
a display unit configured to execute displaying a prompt interface, the prompt interface including selection controls of the plurality of interaction modes;
the determining unit is further configured to, in response to a selection operation on the selection control of any interaction mode, determine the selected interaction mode as the target interaction mode;
the obtaining unit is configured to execute, in response to not detecting the selection operation of the selection controls of the multiple interaction modes, obtaining at least one attribute information of the account.
17. The interaction processing apparatus of claim 16, wherein the prompt interface further includes demonstration animations of the plurality of interaction modes, and the demonstration animation of any interaction mode shows a process of responding, based on that interaction mode, to an interaction operation corresponding to that interaction mode.
18. The interaction processing apparatus according to claim 11, further comprising:
a storage unit configured to store a correspondence between the account and the target interaction mode;
the determining unit is further configured to, in response to the account logging in again, determine the target interaction mode corresponding to the account based on the stored correspondence;
the response unit is further configured to, in a case that an interaction operation is detected on the target interface, respond to the interaction operation based on the target interaction mode corresponding to the interaction operation.
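The store-and-reuse behavior of claim 18 can be illustrated with a minimal sketch. All class and method names here are assumptions for illustration; the claim does not prescribe a particular storage structure.

```python
class InteractionModeStore:
    """Persist the account -> target interaction mode correspondence."""

    def __init__(self):
        self._modes = {}  # account identifier -> stored target interaction mode

    def save(self, account, mode):
        # Store the correspondence between the account and its target mode.
        self._modes[account] = mode

    def on_relogin(self, account, fallback):
        # On re-login, reuse the stored mode if present; otherwise fall back
        # to determining the mode (e.g., from attribute information).
        stored = self._modes.get(account)
        return stored if stored is not None else fallback(account)

store = InteractionModeStore()
store.save("user_1", "swipe")
print(store.on_relogin("user_1", fallback=lambda a: "click"))  # swipe
print(store.on_relogin("user_2", fallback=lambda a: "click"))  # click
```

The point of the stored correspondence is that the mode determination from attribute information runs once; subsequent logins hit the lookup path.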
19. The interaction processing apparatus according to claim 11, wherein the target interface is a video playing interface, and the target interaction mode indicates switching videos in the video playing interface in response to a sliding operation on the video playing interface;
the interaction processing apparatus further includes:
a first replacing unit configured to replace the target interaction mode with a first interaction mode in response to an occurrence frequency of a target playing event being greater than a frequency threshold;
wherein the target playing event indicates that a playing duration of a single video in the video playing interface is less than a duration threshold, and the first interaction mode indicates playing, in response to a click operation on a video in a video list interface, the video acted on by the click operation.
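The switching condition of claim 19 amounts to counting short-play events against two thresholds. A minimal sketch, with assumed names and example threshold values:

```python
def should_switch_to_list_mode(play_durations, duration_threshold, frequency_threshold):
    """Decide whether to replace the sliding (swipe) mode with the click mode.

    A "target playing event" occurs when a single video's playing duration
    falls below duration_threshold; if such events occur more often than
    frequency_threshold, the swipe mode is replaced.
    """
    short_plays = sum(1 for d in play_durations if d < duration_threshold)
    return short_plays > frequency_threshold

# Hypothetical session: per-video playing durations in seconds.
durations = [2.0, 1.5, 30.0, 0.8, 45.0]
print(should_switch_to_list_mode(durations, duration_threshold=3.0, frequency_threshold=2))  # True
```

Three of the five videos played for under three seconds, exceeding the frequency threshold of two, so the apparatus would switch the user to the list-and-click mode.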
20. The interaction processing apparatus according to claim 11, wherein the target interface is a video list interface, and the target interaction mode indicates playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation;
the interaction processing apparatus further includes:
a second replacing unit configured to replace the target interaction mode with a second interaction mode in response to a proportion of clicked videos in the video list interface being greater than a proportion threshold;
wherein the proportion of clicked videos in the video list interface is the ratio of the number of videos acted on by click operations in the video list interface to the total number of videos in the video list interface, and the second interaction mode indicates switching videos in the video playing interface in response to a sliding operation on the video playing interface.
21. A computer device, characterized in that the computer device comprises:
one or more processors;
a memory for storing instructions executable by the one or more processors;
wherein the one or more processors are configured to execute the instructions to implement the interaction processing method of any one of claims 1 to 10.
22. A storage medium, wherein instructions in the storage medium, when executed by a processor of a computer device, enable the computer device to perform the interaction processing method of any one of claims 1 to 10.
CN202011157026.9A 2020-10-26 2020-10-26 Interaction processing method and device, computer equipment and storage medium Active CN112256181B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011157026.9A CN112256181B (en) 2020-10-26 2020-10-26 Interaction processing method and device, computer equipment and storage medium
PCT/CN2021/106497 WO2022088765A1 (en) 2020-10-26 2021-07-15 Interaction processing method and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011157026.9A CN112256181B (en) 2020-10-26 2020-10-26 Interaction processing method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112256181A CN112256181A (en) 2021-01-22
CN112256181B true CN112256181B (en) 2022-06-03

Family

ID=74261128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011157026.9A Active CN112256181B (en) 2020-10-26 2020-10-26 Interaction processing method and device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112256181B (en)
WO (1) WO2022088765A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256181B (en) * 2020-10-26 2022-06-03 北京达佳互联信息技术有限公司 Interaction processing method and device, computer equipment and storage medium
CN114816181A (en) * 2022-03-08 2022-07-29 平安科技(深圳)有限公司 Human-computer interaction mode processing method and device based on machine learning and related equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9298337B2 (en) * 2013-02-07 2016-03-29 Google Inc. Mechanism to reduce accidental clicks on online content
CN106651357B (en) * 2016-11-16 2021-06-22 网易乐得科技有限公司 Payment mode recommendation method and device
CN108319485A (en) * 2018-01-29 2018-07-24 出门问问信息科技有限公司 Information interacting method, device, equipment and storage medium
CN109151548A (en) * 2018-08-31 2019-01-04 北京优酷科技有限公司 Interface alternation method and device
CN110109596B (en) * 2019-05-08 2021-11-16 芋头科技(杭州)有限公司 Recommendation method and device of interaction mode, controller and medium
CN110730387B (en) * 2019-11-13 2022-12-06 腾讯科技(深圳)有限公司 Video playing control method and device, storage medium and electronic device
CN112256181B (en) * 2020-10-26 2022-06-03 北京达佳互联信息技术有限公司 Interaction processing method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
WO2022088765A1 (en) 2022-05-05
CN112256181A (en) 2021-01-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant