WO2022088765A1 - Interaction processing method and computer device - Google Patents

Interaction processing method and computer device

Info

Publication number
WO2022088765A1
WO2022088765A1 · PCT/CN2021/106497 · CN2021106497W
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
target
interface
interaction mode
video
Prior art date
Application number
PCT/CN2021/106497
Other languages
English (en)
Chinese (zh)
Inventor
汤晓
Original Assignee
北京达佳互联信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京达佳互联信息技术有限公司
Publication of WO2022088765A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Definitions

  • the present disclosure relates to the field of computer technologies, and in particular, to an interaction processing method and computer device.
  • the terminal displays the application interface of the application program, and provides feedback on interactive operations performed on the application interface through interface changes, so as to realize interface interaction.
  • the application is usually preset with an interaction mode.
  • the recommendation interface of the short video application is used to play a single short video, and the terminal responds to a slide-up operation by playing the next short video in the recommendation interface.
  • an application program provides an interaction mode for all users of the application program.
  • Embodiments of the present disclosure provide an interaction processing method and a computer device, so as to improve the usability of human-computer interaction.
  • the technical solutions of the present disclosure are as follows:
  • an interaction processing method comprising:
  • determining a target interaction mode from a plurality of interaction modes, the target interaction mode being matched with the at least one attribute information and used to indicate a response mode of a target interaction operation, the target interaction operation corresponding to the target interaction mode; and
  • determining the target interaction mode from multiple interaction modes includes:
  • the interaction mode with the highest matching degree is determined as the target interaction mode.
  • the determining the degree of matching between each of the interaction modes and the at least one attribute information includes:
  • the weights of the attribute information corresponding to the same interaction mode are added to obtain the matching degree of the interaction mode and the at least one attribute information.
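The selection procedure described above — sum, per interaction mode, the weights of the account's matching attribute values, then pick the mode with the highest total — can be sketched as follows. All attribute names, mode names, and weight values here are illustrative assumptions, not values from the disclosure.

```python
def choose_target_mode(attributes, mode_weights):
    """attributes: dict mapping attribute name -> value for the login account.
    mode_weights: dict mapping mode name -> {(attr, value): weight}."""
    scores = {}
    for mode, table in mode_weights.items():
        # Add up the weights of the attribute values that this mode matches.
        scores[mode] = sum(table.get((attr, value), 0)
                           for attr, value in attributes.items())
    # The interaction mode with the highest matching degree is the target mode.
    return max(scores, key=scores.get)

# Hypothetical weight tables for two interaction modes.
MODE_WEIGHTS = {
    "swipe_to_switch": {("age", "18-25"): 3, ("device", "android"): 1},
    "click_to_play":   {("age", "40+"): 3, ("device", "ios"): 1},
}

account = {"age": "18-25", "device": "android", "gender": "f"}
print(choose_target_mode(account, MODE_WEIGHTS))  # swipe_to_switch
```

Attributes without a weight entry simply contribute zero, so any subset of the at least one attribute information can be scored with the same code.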
  • the acquiring at least one attribute information of the current login account includes:
  • acquiring the at least one attribute information from at least one of region information, age information, gender information, channel information and device information,
  • where the region information is used to indicate the region where the computer device displaying the target interface is located, the channel information is used to indicate the download channel of the application to which the target interface belongs, and the device information is used to indicate the category of the operating system of the computer device displaying the target interface.
  • the interaction processing method further includes:
  • the step of acquiring at least one attribute information of the current login account is performed in response to the current login account being the account for the first login.
  • the interaction processing method further includes:
  • the prompt interface including selection controls for the plurality of interaction modes
  • the step of acquiring at least one attribute information of the currently logged-in account is performed in response to not detecting a selection operation on the selection controls of the plurality of interaction modes.
  • the interaction processing method further includes:
  • the selected interaction mode is determined as the target interaction mode.
  • the displaying a prompt interface includes:
  • a demonstration animation of each interaction mode in the multiple interaction modes is also displayed in the prompt interface,
  • the demonstration animation is used to show the process of responding to the interaction operation corresponding to the interaction mode based on the interaction mode.
  • the interaction processing method further includes:
  • the target interface is a video playback interface
  • the target interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface
  • the interaction processing method further includes:
  • the target interaction mode is replaced with the first interaction mode
  • the target playback event is used to indicate that the playback duration of a single video in the video playback interface is less than the duration threshold
  • the first interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
  • the target interface is a video list interface
  • the target interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played;
  • the interaction processing method further includes:
  • the target interaction mode is replaced by the second interaction mode
  • the proportion is the ratio of the first number to the second number
  • the first number is the number of videos acted on by the click operation in the video list interface
  • the second number is the number of videos in the video list interface
  • the second interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface.
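The two replacement rules above — switching away from slide-to-switch when short playbacks occur too frequently, and switching away from click-to-play when the proportion of clicked videos is high — can be sketched as follows. The mode names, thresholds, and statistics layout are assumptions for illustration, not values from the disclosure.

```python
def maybe_replace_mode(mode, stats,
                       frequency_threshold=0.5,
                       duration_threshold=5.0,
                       proportion_threshold=0.5):
    """Return the interaction mode to use after applying the replacement rules."""
    if mode == "swipe_to_switch":
        # Target playback event: a single video's playback duration is below
        # the duration threshold. Replace the mode when such events occur
        # more frequently than the frequency threshold.
        durations = stats["play_durations"]
        short_plays = sum(1 for d in durations if d < duration_threshold)
        if durations and short_plays / len(durations) > frequency_threshold:
            return "click_to_play"
    elif mode == "click_to_play":
        # Proportion = first number (clicked videos) / second number (videos listed).
        proportion = stats["clicked_count"] / stats["listed_count"]
        if proportion > proportion_threshold:
            return "swipe_to_switch"
    return mode

stats = {"play_durations": [2.0, 3.5, 12.0, 1.0],
         "clicked_count": 0, "listed_count": 10}
print(maybe_replace_mode("swipe_to_switch", stats))  # click_to_play
```

In the sample statistics, three of four playbacks fall below the duration threshold, so the rule replaces the slide-to-switch mode with the click-to-play mode.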
  • an interaction processing apparatus comprising:
  • an acquisition unit configured to perform acquisition of at least one attribute information of the current login account, where the attribute information is used to represent the attribute of the user to which the current login account belongs;
  • the determining unit is configured to determine a target interaction mode from a plurality of interaction modes, the target interaction mode being matched with the at least one attribute information and used to indicate the response mode of the target interaction operation, the target interaction operation corresponding to the target interaction mode;
  • a response unit configured to perform, in response to detecting the target interaction operation on the target interface, responding to the target interaction operation based on the target interaction manner.
  • the determining unit includes:
  • a first determining subunit configured to perform determining the degree of matching of each of the interaction modes with the at least one attribute information
  • the second determining subunit is configured to perform determining the interaction mode with the highest matching degree as the target interaction mode.
  • the first determining subunit is configured to perform:
  • the weights of the attribute information corresponding to the same interaction mode are added to obtain the matching degree of the interaction mode and the at least one attribute information.
  • the acquiring unit is configured to acquire at least one of region information, age information, gender information, channel information and device information of the currently logged-in account, and use the acquired at least one of the region information, age information, gender information, channel information and device information as the at least one attribute information,
  • where the region information is used to indicate the region where the computer device displaying the target interface is located, the channel information is used to indicate the download channel of the application to which the target interface belongs, and the device information is used to indicate the category of the operating system of the computer device displaying the target interface.
  • the obtaining unit is configured to obtain at least one attribute information of the current login account in response to the current login account being an account for the first login.
  • the interaction processing apparatus further includes:
  • a display unit configured to display a prompt interface, the prompt interface including the selection controls for the plurality of interaction modes
  • the obtaining unit is configured to obtain at least one attribute information of the currently logged-in account in response to not detecting a selection operation on the selection controls of the plurality of interaction modes.
  • the determining unit is further configured to, in response to detecting a selection operation on an interaction mode selection control in the prompt interface, determine the selected interaction mode as the target interaction mode.
  • the display unit is configured to display, in addition to the selection controls of the plurality of interaction modes, a demonstration animation of each of the plurality of interaction modes in the prompt interface, where the demonstration animation of each interaction mode is used to show the process of responding, based on that interaction mode, to the interaction operation corresponding to that interaction mode.
  • the interaction processing apparatus further includes:
  • a storage unit configured to store the correspondence between the current login account and the target interaction mode
  • the determining unit is further configured to perform, in response to a re-login operation based on the current login account, determining the target interaction mode based on the stored correspondence;
  • the response unit is further configured to perform, in response to detecting the target interaction operation on the target interface, responding to the target interaction operation based on the target interaction manner.
  • the target interface is a video playback interface
  • the target interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface
  • the interaction processing device further includes:
  • a first replacement unit configured to replace the target interaction mode with the first interaction mode when the occurrence frequency of the target playback event is greater than the frequency threshold
  • the target playback event is used to indicate that the playback duration of a single video in the video playback interface is less than the duration threshold
  • the first interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
  • the target interface is a video list interface
  • the target interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played;
  • the interaction processing device further includes:
  • the second replacement unit is configured to replace the target interaction mode with the second interaction mode when the proportion of the clicked video in the video list interface is greater than the proportion threshold
  • the proportion is the ratio of the first number to the second number
  • the first number is the number of videos acted on by the click operation in the video list interface
  • the second number is the number of videos in the video list interface
  • the second interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface.
  • a computer device comprising: one or more processors; and a memory for storing instructions executable by the one or more processors; wherein the one or more processors are configured to execute the instructions to implement the following steps:
  • determining a target interaction mode from a plurality of interaction modes, the target interaction mode being matched with the at least one attribute information and used to indicate a response mode of a target interaction operation, the target interaction operation corresponding to the target interaction mode; and
  • the processor is configured to execute the instructions to implement the steps of:
  • the interaction mode with the highest matching degree is determined as the target interaction mode.
  • the processor is configured to execute the instructions to implement the steps of:
  • the weights of the attribute information corresponding to the same interaction mode are added to obtain the matching degree of the interaction mode and the at least one attribute information.
  • the processor is configured to execute the instructions to implement the steps of:
  • acquiring the at least one attribute information from at least one of region information, age information, gender information, channel information and device information,
  • where the region information is used to indicate the region where the computer device displaying the target interface is located, the channel information is used to indicate the download channel of the application to which the target interface belongs, and the device information is used to indicate the category of the operating system of the computer device displaying the target interface.
  • the processor is configured to execute the instructions to implement the steps of:
  • At least one attribute information of the current login account is acquired.
  • the processor is configured to execute the instructions to implement the steps of:
  • the prompt interface including selection controls for the plurality of interaction modes
  • the processor is configured to execute the instructions to implement the steps of:
  • the selected interaction mode is determined as the target interaction mode.
  • the processor is configured to execute the instructions to implement the steps of:
  • a demonstration animation of each interaction mode in the multiple interaction modes is also displayed in the prompt interface,
  • the demonstration animation is used to show the process of responding to the interaction operation corresponding to the interaction mode based on the interaction mode.
  • the processor is configured to execute the instructions to implement the steps of:
  • the target interface is a video playback interface
  • the target interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface
  • the processor is configured to execute the instructions to implement the steps of:
  • the target interaction mode is replaced with the first interaction mode
  • the target playback event is used to indicate that the playback duration of a single video in the video playback interface is less than the duration threshold
  • the first interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
  • the processor is configured to execute the instructions to implement the steps of:
  • the target interaction mode is replaced by the second interaction mode
  • the proportion is the ratio of the first number to the second number
  • the first number is the number of videos acted on by the click operation in the video list interface
  • the second number is the number of videos in the video list interface
  • the second interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface.
  • a storage medium, wherein when the instructions in the storage medium are executed by a processor of a computer device, the computer device is caused to perform the following steps:
  • determining a target interaction mode from a plurality of interaction modes, the target interaction mode being matched with the at least one attribute information and used to indicate a response mode of a target interaction operation, the target interaction operation corresponding to the target interaction mode; and
  • a computer program product comprising a computer program/instruction, the computer program/instruction, when executed by a processor, implements the following steps:
  • determining a target interaction mode from a plurality of interaction modes, the target interaction mode being matched with the at least one attribute information and used to indicate a response mode of a target interaction operation, the target interaction operation corresponding to the target interaction mode; and
  • At least one attribute information of different accounts is used to reflect the interaction habits of different users, and a target interaction mode that conforms to the user's interaction habits is then determined from the multiple interaction modes provided; interaction operations performed by the user are responded to based on that mode, so as to meet the user's interaction needs and improve the usability of human-computer interaction.
  • FIG. 1 is a schematic diagram of an implementation environment according to an exemplary embodiment
  • FIG. 2 is a flowchart of an interaction processing method according to an exemplary embodiment
  • FIG. 3 is a flowchart of an interaction processing method according to an exemplary embodiment
  • FIG. 4 is a flowchart of an interaction processing method according to an exemplary embodiment
  • FIG. 5 is a block diagram of an interaction processing apparatus according to an exemplary embodiment
  • FIG. 6 is a block diagram of a terminal according to an exemplary embodiment.
  • Fig. 7 is a block diagram of a server according to an exemplary embodiment.
  • the terms "at least one", "plurality" and "each" are used as follows: "at least one" includes one, two or more; "plurality" includes two or more; and "each" refers to each one of the corresponding plurality.
  • the user information involved in this disclosure is the information authorized by the user or fully authorized by all parties.
  • FIG. 1 is a schematic diagram illustrating an implementation environment according to an exemplary embodiment.
  • the implementation environment includes a terminal 110, and the interaction processing method provided by the embodiment of the present disclosure is executed by the terminal 110.
  • the terminal 110 is a smartphone, a tablet computer, a smart watch, a laptop computer, or the like.
  • various types of applications such as video viewing applications, audio playing applications, etc., are installed and run on the terminal 110 .
  • video viewing applications include short video applications, and users can perform corresponding interactive operations on the application interface of the short video application to watch the next short video, watch live broadcasts, or view other short videos by the author of the short video.
  • the terminal 110 generally refers to one of multiple terminals; the embodiments of the present disclosure use the terminal 110 only as an example for illustration. Those skilled in the art will appreciate that the number of terminals may be greater or smaller — a few, dozens, hundreds, or more — and the embodiments of the present disclosure do not limit the number or type of terminals.
  • the implementation environment includes terminal 110 and server 120 .
  • the interaction processing method provided by the embodiment of the present disclosure is implemented through interaction between the terminal 110 and the server 120 .
  • the server 120 is a background server of an application program on the terminal, and provides background services for the application program.
  • the server 120 and the terminal 110 are directly or indirectly connected through wired or wireless communication.
  • the server 120 is a single server, multiple servers, a cloud server, a cloud computing platform, or a virtualization center. The number of servers may be greater or smaller, which is not limited in the embodiments of the present disclosure.
  • the server 120 also includes other functional servers in order to provide more comprehensive and diversified services.
  • the interaction processing method provided by the embodiments of the present disclosure is executed by a computer device.
  • the above implementation environment is described by taking the computer device including a terminal as an example.
  • the computer device is an interactive robot, a self-service ordering machine, or other interactive devices. The embodiment does not limit this.
  • Fig. 2 is a flowchart of an interaction processing method according to an exemplary embodiment. Referring to FIG. 2 , the interaction processing method is applied to a terminal, and includes the following steps.
  • step S201 the terminal acquires at least one attribute information of the current login account, where the attribute information is used to indicate the attribute of the user to which the current login account belongs.
  • the current login account is used to represent the user using the terminal.
  • the user can log in the user's account on the terminal, so that the terminal can provide the user with a more complete service.
  • the relevant information of the user can be saved with the account as an identifier for easy query and application.
  • the terminal acquires attribute information used to represent user attributes, and based on the analysis of the attribute information, recommends an interaction mode for the user that meets the interaction requirements of the user.
  • step S202 the terminal determines a target interaction mode from a plurality of interaction modes, the target interaction mode matches at least one attribute information and is used to indicate a response mode of the target interaction operation, and the target interaction operation corresponds to the target interaction mode.
  • Terminals offer several interaction modes. For example, for a short video application, the terminal provides two interaction modes: one is used to indicate that the video in the video playback interface is switched in response to a sliding operation on the video playback interface; the other is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played.
  • For an audio playing application, the terminal provides two interaction modes: one is used to indicate that the lyrics are displayed in response to a leftward sliding operation on the song playing interface; the other is used to indicate that the lyrics are displayed in response to a click operation on a button in the song playing interface.
  • the terminal infers the user's preferred interaction mode, and automatically sets a target interaction mode for the user that meets the user's interaction requirements.
  • step S203 in response to detecting the target interaction operation on the target interface, the terminal responds to the target interaction operation based on the target interaction mode.
  • the target interface is an interface that supports the target interaction method.
  • For example, for the interaction mode of playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation, the target interface is the video list interface and the target interaction operation is the click operation.
  • In response to detecting a click operation on the video list interface, the terminal plays, based on the target interaction mode corresponding to the click operation, the video acted on by the click operation.
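Step S203 amounts to dispatching a detected operation only when it matches the target interaction mode set for the account. A minimal sketch, with hypothetical mode names and handlers:

```python
# Hypothetical handlers keyed by (target interaction mode, operation type).
HANDLERS = {
    ("click_to_play", "click"): lambda video: f"playing {video}",
    ("swipe_to_switch", "swipe_up"): lambda _: "switching to next video",
}

def on_interaction(target_mode, operation, video=None):
    """Respond to an operation on the target interface based on the target mode."""
    handler = HANDLERS.get((target_mode, operation))
    # Operations that do not correspond to the target interaction mode
    # produce no response.
    return handler(video) if handler else None

print(on_interaction("click_to_play", "click", "video_42"))  # playing video_42
```

Under the click-to-play mode a swipe produces no response, which mirrors the idea that each target interaction operation corresponds to its target interaction mode.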
  • The technical solutions provided by the embodiments of the present disclosure reflect the interaction habits of different users through at least one attribute information of different accounts, determine a target interaction mode that conforms to the user's interaction habits from the provided multiple interaction modes, and respond, based on that interaction mode, to the interaction operations performed by the user on the target interface, thereby meeting the user's interaction needs and improving the usability of human-computer interaction.
  • Fig. 3 is a flowchart of an interaction processing method according to an exemplary embodiment. The flow is described by taking the interaction between the terminal and the server as an example. Referring to FIG. 3 , the interaction processing method includes the following steps.
  • step S301 the terminal acquires at least one attribute information of the current login account, and the attribute information is used to indicate the attribute of the user to which the current login account belongs.
  • the at least one attribute information includes at least one of location information, age information, gender information, channel information, and device information.
  • the region information is used to indicate the region where the terminal is located, for example, the region information is Beijing, Wuhan, Shanghai, and the like.
  • the interaction mode in the embodiment of the present disclosure refers to the interaction mode of the operating system of the terminal; or, the interaction mode in the embodiment of the present disclosure refers to the interaction mode of the application program on the terminal. In the embodiments of the present disclosure, the interaction mode refers to the interaction mode of the application on the terminal as an example for description.
  • the above channel information is used to indicate the download channel of the application, for example, the download channel is an official application download platform provided by the terminal manufacturer or an application download platform provided by a third party.
  • the above device information is used to indicate the type of the operating system of the terminal; for example, the operating system of the terminal is Android (an open-source mobile operating system) or iOS (iPhone Operating System, a mobile operating system).
  • Since attribute information such as region information, age information, gender information, channel information, and device information can reflect the user's interaction habits to a certain extent, an interaction mode that meets the user's interaction needs can be determined based on the at least one attribute information, providing the user with a personalized interaction mode and improving the usability of human-computer interaction.
  • the attribute information involved in the embodiments of the present disclosure is collected and processed after being authorized by the user.
  • the terminal has the acquisition and processing authority of attribute information, and the acquisition and processing authority of attribute information is authorized by the user.
  • the terminal when the user uses the application for the first time, the terminal prompts the user to complete attribute information such as age information, gender information, and location information.
  • the user can fill in attribute information such as age information, gender information, and location information in the attribute editing interface, and the terminal acquires and saves the attribute information input in the attribute editing interface.
  • the terminal can acquire gender information and age information based on the stored attribute information. For example, in the case where the gender information included in the stored attribute information is female, the terminal acquires from the stored attribute information that the gender information is female.
  • the terminal prompts the user to complete the attribute information including birthday information but not age information.
  • the terminal can acquire and save the birthday information input by the user, and determine the age information based on the birthday information and the current time. For example, when the birthday information included in the stored attribute information is January 1, 2000 and the current time is January 1, 2020, the terminal determines that the age information is 20 based on the birthday information and the current time.
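The age derivation described above (birthday January 1, 2000 plus current time January 1, 2020 yields age 20) can be sketched as follows; the function name is an assumption for illustration.

```python
from datetime import date

def age_from_birthday(birthday: date, today: date) -> int:
    """Determine age information from stored birthday information and the current time."""
    years = today.year - birthday.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birthday.month, birthday.day):
        years -= 1
    return years

print(age_from_birthday(date(2000, 1, 1), date(2020, 1, 1)))  # 20
```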
  • the terminal includes a positioning component, and the terminal can locate the region information of the terminal through the positioning component, and store it in the location information buffer.
  • the terminal runs an application program, and in the case that the application program has the location acquisition authority, the terminal acquires regional information from the location information buffer.
  • the terminal in the case that the stored attribute information includes regional information, the terminal directly acquires the regional information in the attribute information.
  • the program file generated after the application program is installed includes the channel information of the application program, and the terminal obtains the channel information of the application program from the program file of the application program.
  • the terminal locally stores device information. The terminal runs an application, and in the case that the application has the right to acquire device information, the terminal acquires the locally stored device information.
  • when the user uses the application program for the first time, the terminal provides the user with an interaction mode that meets the user's interaction requirements by using the interaction processing method provided by the embodiments of the present disclosure.
  • when a user opens an application for the first time, it is usually necessary to register an account and input attribute information through the attribute editing interface, so that the background server of the application can provide the user with a more complete service based on the user's account.
  • the terminal performs the step of acquiring at least one attribute information of the current login account in response to the current login account being the account for the first login.
  • based on the attribute information of the new user, the above technical solution makes a personalized recommendation of the interaction mode for the new user, so that the interaction mode meets the new user's interaction requirements and conforms to the new user's interaction habits, which improves the usability of human-computer interaction, reduces new-user churn, and increases the user retention rate.
  • in the case that the current login account is not the account for the first login, the terminal obtains the historical interaction mode corresponding to the current login account and responds to the detected interaction operation based on the historical interaction mode; that is, instead of performing step S301, the process goes to step S305.
  • step S302 the terminal sends an acquisition request to the server, where the acquisition request is used to request acquisition of a target interaction mode matching at least one attribute information.
  • the get request carries at least one attribute information.
  • the terminal sends an acquisition request to the server to acquire the target interaction mode that meets the interaction requirements of the user.
  • step S303 the server receives the acquisition request, and determines a target interaction mode from a plurality of interaction modes, and the target interaction mode matches at least one attribute information.
  • the target interaction mode is used to indicate the response mode of the target interaction operation, and the target interaction operation corresponds to the interaction mode.
  • the target interaction mode is used to indicate that the video in the video playback interface is switched in response to a sliding operation on the video playback interface, and the target interaction operation includes an upward sliding operation on the video playback interface and a downward sliding operation on the video playback interface
  • the response mode indicated by the target interaction mode includes: playing the next video in response to the upward sliding operation on the video playing interface; playing the previous video in response to the downward sliding operation on the video playing interface.
  • the server determines the degree of matching between each interaction mode and at least one attribute information; and determines the interaction mode with the highest matching degree as the target interaction mode.
  • by determining the interaction mode with the highest degree of matching with the user's attribute information as the target interaction mode, the user is provided with an interaction mode that conforms to the user's interaction habits, so as to satisfy the user's interaction needs as much as possible and improve the usability of human-computer interaction.
  • the step of the server determining the degree of matching between each interaction mode and the at least one attribute information includes: the server determines the interaction mode corresponding to each attribute information; the server acquires the weight of each attribute information in the at least one attribute information, where the weight represents the importance of the attribute information for determining the corresponding interaction mode; and the server adds the weights of the attribute information corresponding to the same interaction mode to obtain the matching degree between that interaction mode and the at least one attribute information.
  • the at least one attribute information includes region information, age information, gender information, channel information and device information.
  • the region information is city A; the age information is 50; the gender information is male; the channel information is channel a; and the device information is Android.
  • the weight of the region information is 0.2; the weight of the age information is 0.3; the weight of the gender information is 0.2; the weight of the channel information is 0.2; and the weight of the device information is 0.1.
  • the interaction mode corresponding to city A is the second interaction mode; the interaction mode corresponding to age information between 40 and 60 is the second interaction mode; the interaction mode corresponding to the gender information male is the first interaction mode; the interaction mode corresponding to channel a is the second interaction mode; and the interaction mode corresponding to the device information Android is the first interaction mode.
  • the matching degree of the first interaction mode is 0.3 and the matching degree of the second interaction mode is 0.7, so the second interaction mode is determined as the target interaction mode.
  • each attribute information is given a certain weight, so that attribute information reflecting the user's interaction habits with higher accuracy has a higher weight; the weights of the attribute information corresponding to the same interaction mode are then added to obtain the matching degree between the interaction mode and the at least one attribute information. This improves the accuracy of the matching degree, so that the interaction mode determined based on the matching degree meets the user's interaction requirements and improves the usability of human-computer interaction.
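The weighted matching described above can be sketched as follows, using the values from the example (an illustrative Python sketch; the function names, the dictionary representation of attributes, and the attribute-to-mode correspondence table are assumptions, not part of the disclosure):

```python
def matching_degrees(attributes, weights, mode_of):
    """Sum, per interaction mode, the weights of the attributes that map to it."""
    degrees = {}
    for name, value in attributes.items():
        mode = mode_of(name, value)
        degrees[mode] = degrees.get(mode, 0.0) + weights[name]
    return degrees

# Values taken from the example above.
attributes = {"region": "city A", "age": 50, "gender": "male",
              "channel": "channel a", "device": "Android"}
weights = {"region": 0.2, "age": 0.3, "gender": 0.2, "channel": 0.2, "device": 0.1}

def mode_of(name, value):
    # Correspondence from the example: gender=male and device=Android map to
    # the first interaction mode; the other attribute values map to the second.
    if (name, value) in {("gender", "male"), ("device", "Android")}:
        return "first"
    return "second"

degrees = matching_degrees(attributes, weights, mode_of)
target = max(degrees, key=degrees.get)  # second: 0.2 + 0.3 + 0.2 = 0.7 vs 0.3
```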
  • the server stores the corresponding relationship between the attribute information and the interaction mode, and the server can determine the interaction mode corresponding to each attribute information according to the stored corresponding relationship between the attribute information and the interaction mode.
  • before storing the corresponding relationship between the attribute information and the interaction mode, the server also determines the interaction mode corresponding to each attribute information based on the historical interaction data of multiple accounts.
  • the step in which the server determines the interaction mode corresponding to each attribute information based on the historical interaction data of the multiple accounts includes: the server determines, based on the historical interaction data of multiple accounts sharing any one same attribute information, the number of accounts to which each interaction mode is applied, and determines the interaction mode with the largest number of corresponding accounts as the interaction mode corresponding to that attribute information.
  • for example, the number of accounts whose region information is city A is 10 million, and the terminal provides a first interaction mode and a second interaction mode. If the number of accounts applying the first interaction mode is 2 million and the number of accounts applying the second interaction mode is 8 million, the interaction mode corresponding to the region information city A is the second interaction mode.
  • the server can also determine the interaction mode corresponding to the attribute information in other ways based on the historical interaction data of multiple accounts, which is not limited in this embodiment of the present disclosure.
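The majority-count rule above can be sketched as follows (an illustrative Python sketch; the function name and the list representation of historical interaction data are assumptions):

```python
from collections import Counter

def mode_for_attribute_value(applied_modes):
    """Given the interaction modes applied to the accounts sharing one
    attribute value, return the mode applied by the largest number of accounts."""
    return Counter(applied_modes).most_common(1)[0][0]

# City A example, scaled down by a factor of one million:
# 2 million accounts apply the first mode, 8 million apply the second.
city_a_modes = ["first"] * 2 + ["second"] * 8
print(mode_for_attribute_value(city_a_modes))  # second
```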
  • the server predicts the target interaction mode matching the at least one attribute information through the interaction mode prediction model.
  • the step of the server determining the target interaction mode from the plurality of interaction modes includes: the server inputs at least one attribute information into the interaction mode prediction model to obtain the target interaction mode.
  • the interactive mode prediction model is obtained by training based on the machine learning method.
  • the server uses at least one attribute information of the sample account and the interaction mode applied to the sample account as training samples, and trains the interaction mode prediction model, so that the interaction mode prediction model has the ability to predict the interaction mode based on the attribute information.
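The disclosure only states that the prediction model is trained by machine learning and consumes (attribute information, applied interaction mode) training samples; it does not specify the model. As a toy stand-in, the sketch below scores each training sample by attribute overlap with the query account (all names and the nearest-sample rule are assumptions, not the disclosed model):

```python
def predict_mode(attributes, samples):
    """Toy stand-in for the interaction mode prediction model: score each
    training sample by how many attribute values it shares with the query
    account, and return the interaction mode of the best-matching sample."""
    def overlap(sample_attributes):
        return sum(1 for k, v in sample_attributes.items()
                   if attributes.get(k) == v)
    best_attributes, best_mode = max(samples, key=lambda s: overlap(s[0]))
    return best_mode

# Hypothetical training samples: (attribute information of a sample account,
# interaction mode applied to that account).
samples = [
    ({"gender": "male", "device": "Android"}, "first"),
    ({"region": "city A", "channel": "channel a"}, "second"),
]
print(predict_mode({"region": "city A", "channel": "channel a"}, samples))  # second
```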
  • after the server determines the target interaction mode from the multiple interaction modes, the server also stores the corresponding relationship between the current login account and the target interaction mode, so that when the current login account logs in again, the target interaction mode can be determined directly based on the stored correspondence. That is, in response to a re-login operation based on the current login account, the server determines the target interaction mode corresponding to the current login account based on the stored correspondence.
  • the current login account and the target interaction mode are also stored correspondingly, so that when the account logs in again, the target interaction mode corresponding to the account can be determined directly based on the stored correspondence.
  • the method improves the determination efficiency of the target interaction mode, reduces the computing resources consumed by re-determining the target interaction mode, and improves the resource utilization rate.
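The account-to-mode store described above amounts to a cache in front of the matching step; a minimal sketch (the class and method names are assumptions):

```python
class ModeStore:
    """Server-side cache of account -> target interaction mode: on re-login
    the stored result is reused instead of re-running the matching step."""
    def __init__(self, compute_mode):
        self._compute_mode = compute_mode  # e.g. the matching-degree step
        self._by_account = {}

    def mode_for(self, account, attributes):
        if account not in self._by_account:
            self._by_account[account] = self._compute_mode(attributes)
        return self._by_account[account]

calls = []
store = ModeStore(lambda attrs: (calls.append(attrs) or "second"))
store.mode_for("account-1", {"region": "city A"})  # computed and stored
store.mode_for("account-1", {"region": "city A"})  # served from the store
print(len(calls))  # 1: the matching step ran only once
```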
  • step S304 the server returns the target interaction mode to the terminal.
  • the server returns the determined target interaction mode to the terminal.
  • the above steps S302 to S304 are described by taking as an example the case in which the terminal acquires the target interaction mode through interaction with the server.
  • the terminal can also locally execute the step of determining the target interaction mode from the multiple interaction modes; the process by which the terminal determines the target interaction mode from the multiple interaction modes is the same as the process by which the server does so, and is not repeated here.
  • step S305 the terminal receives the target interaction mode, and responds to the target interaction operation based on the target interaction mode in response to the terminal detecting the target interaction operation on the target interface.
  • the target interface can support the implementation of the target interaction mode.
  • the target interaction mode is used to instruct to switch the video in the video playback interface in response to a sliding operation on the video playback interface.
  • the target interface is a video playing interface, and the video playing interface is used to play a single video.
  • the sliding operation is responded to based on the target interaction mode corresponding to the sliding operation; that is, in response to the terminal detecting the sliding operation on the target interface, the video played in the video playback interface is switched to another video.
  • the target interaction mode is used to instruct, in response to a click operation on a video in the video list interface, to play the video acted on by the click operation.
  • the target interface is a video list interface.
  • the video list interface is used to display the cover of the video in two columns.
  • the video affected by the click operation is played in full screen.
  • the technical solutions provided by the embodiments of the present disclosure reflect the interaction habits of different users through the at least one attribute information of different accounts, and then determine, from the provided multiple interaction modes, a target interaction mode that conforms to the user's interaction habits; the interaction operations performed by the user on the target interface are responded to based on this interaction mode, which meets the user's interaction needs and improves the usability of human-computer interaction.
  • in the process of providing the user with interactive services in the target interaction mode, the terminal can also adjust the target interaction mode based on the user's interaction data, so as to provide the user with a more suitable interaction mode.
  • the target interface is a video playback interface
  • the target interaction mode is used to instruct to switch videos in the video playback interface in response to a sliding operation on the video playback interface.
  • the step of adjusting the target interaction mode by the terminal includes: when the occurrence frequency of the target playback event is greater than the frequency threshold, the terminal replaces the target interaction mode with the first interaction mode; wherein the target playback event is used to indicate that the playback duration of a single video in the video playback interface is less than the duration threshold, and the first interaction mode is used to instruct, in response to a click operation on a video in the video list interface, to play the video acted on by the click operation.
  • the duration threshold is set based on the total duration of a single video. For example, the total duration of a single video is 11 seconds, and the duration threshold can be set to 2 seconds or 3 seconds.
  • the frequency threshold is a preset value between 0 and 1, for example, the frequency threshold is 0.8 or 0.9.
  • the terminal switches to the first interactive mode to provide interactive services for the user.
  • for example, the total duration of a single video is 11 seconds, the duration threshold is 2 seconds, and the frequency threshold is 0.8. If, among the 100 videos viewed by the user, the playback duration of 80 videos is less than 2 seconds, it indicates that the user is used to quickly switching videos in the video playback interface, and an interaction mode that lets the user quickly understand video information by viewing video covers in the video list interface is more suitable for this user.
  • the interaction mode is therefore changed to the interaction mode of playing, in response to a click operation on a video in the video list interface, the video acted on by the click operation, so that the user can quickly understand video information through the video covers in the video list interface and then decide whether to click a cover to watch it.
  • it is no longer necessary to switch videos through frequent sliding operations merely to understand the general information of a video, which improves the efficiency of human-computer interaction and thereby further improves its usability.
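The frequency check above can be sketched as follows (an illustrative Python sketch; the function name and the per-video duration list are assumptions, and a strict comparison is used as stated in the text, although the 80-of-100 example sits exactly on the 0.8 boundary):

```python
def should_switch_to_list_mode(playback_durations, duration_threshold=2.0,
                               frequency_threshold=0.8):
    """True when the fraction of viewed videos whose playback duration fell
    below the duration threshold exceeds the frequency threshold."""
    if not playback_durations:
        return False
    short = sum(1 for d in playback_durations if d < duration_threshold)
    return short / len(playback_durations) > frequency_threshold

# 85 of 100 videos skipped within 2 seconds -> frequency 0.85 > 0.8.
print(should_switch_to_list_mode([1.0] * 85 + [11.0] * 15))  # True
```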
  • the target interface is a video list interface
  • the target interaction mode is used to indicate that, in response to a click operation on a video in the video list interface, the video acted on by the click operation is played. Accordingly, the step of adjusting the target interaction mode by the terminal includes: the terminal, in response to the proportion of clicked videos in the video list interface being greater than the proportion threshold, replaces the target interaction mode with the second interaction mode; wherein the proportion is the ratio of a first number to a second number, the first number is the number of videos acted on by the click operation in the video list interface, the second number is the total number of videos in the video list interface, and the second interaction mode is used to indicate that the video in the video playback interface is switched in response to a sliding operation on the video playback interface.
  • the proportion threshold is a preset value between 0 and 1, for example, the proportion threshold is 0.8 or 0.9.
  • the terminal switches to the second interaction mode to provide the user with interactive services. For example, the proportion threshold is 0.8. If the terminal recommends 100 videos for the user through the video list interface and the user has watched at least 80 of those 100 videos through click operations, it indicates that the user is accustomed to watching most of the videos in the video list interface, and switching videos by a sliding operation is more suitable for this user.
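The proportion check can be sketched symmetrically to the frequency check (again an illustrative Python sketch with assumed names; the strict comparison follows the "greater than the proportion threshold" wording):

```python
def should_switch_to_swipe_mode(clicked, total, proportion_threshold=0.8):
    """True when the proportion of clicked videos (first number / second
    number) in the video list interface exceeds the proportion threshold."""
    if total == 0:
        return False
    return clicked / total > proportion_threshold

print(should_switch_to_swipe_mode(85, 100))  # True: 0.85 > 0.8
print(should_switch_to_swipe_mode(40, 100))  # False
```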
  • before determining the target interaction mode through the above steps S301 to S304, the terminal also supports the user in independently selecting one of the multiple interaction modes as the target interaction mode. In the case where the user selects an interaction mode, the terminal responds to the user's interaction operation based on the user-selected target interaction mode through the same process as step S305.
  • in the case where no selection is made, the terminal determines the target interaction mode through the above steps S301 to S304; that is, the terminal performs the step of acquiring the at least one attribute information of the current login account in response to not detecting a selection operation on the selection controls of the multiple interaction modes.
  • the process in which the terminal supports the user to independently select one of the multiple interaction modes as the target interaction mode includes the following steps S401 to S402 .
  • step S401 the terminal displays a prompt interface, where the prompt interface includes a plurality of interactive mode selection controls.
  • each interaction mode corresponds to a selection control.
  • the prompt interface includes an interaction introduction for each interaction mode and selection controls for each interaction mode.
  • one interaction mode is to play, in response to a click operation on a video in the video list interface, the video acted on by the click operation, and the interaction introduction of this interaction mode is "double-column cover display"; another interaction mode is to switch the video in the video playback interface in response to a sliding operation on the video playback interface, and the interaction introduction of this interaction mode is "single-column up and down display". The interaction introduction of each interaction mode is displayed alongside the selection control of that interaction mode, so that users can view and select.
  • the selection control is a control in the form of a button.
  • the prompt interface further includes demonstration animations of multiple interaction modes.
  • step S401 includes: in addition to displaying the selection controls of the multiple interaction modes in the prompt interface, displaying the demonstration animations of the multiple interaction modes in the prompt interface, where the demonstration animation of each interaction mode is used to show the process of responding, based on the interaction mode, to the interaction operation corresponding to the interaction mode.
  • the terminal displays the demonstration animations in the prompt interface to vividly and intuitively show the interaction process of each interaction mode, which is convenient for users to understand and can improve human-computer interaction efficiency when the user selects an interaction mode, thereby further improving the user experience.
  • step S402 the terminal determines the selected interaction mode as the target interaction mode in response to the selection operation on the selection control of the interaction mode in the prompt interface.
  • the terminal determines the interaction mode selected by the user as the target interaction mode, and responds to the interaction operation detected on the target interface based on the target interaction mode.
  • the interaction mode independently selected by the user is used as the target interaction mode, so as to provide interactive services for the user based on the interaction mode independently selected by the user, so that the interaction mode can meet the interaction needs of the user, thereby improving the usability of human-computer interaction.
  • after determining the target interaction mode, the terminal also sends a mode storage request to the server, where the mode storage request is used to request storage of the corresponding relationship between the account and the target interaction mode; the mode storage request carries the current login account and the target interaction mode, and the current login account is the account logged in on the terminal.
  • the server receives the mode storage request of the terminal and stores the corresponding relationship between the current login account and the target interaction mode, so that when the account logs in again, the target interaction mode corresponding to the account can be determined directly based on the stored correspondence, which improves the efficiency of determining the target interaction mode, reduces the computing resources consumed by re-determining it, and improves the resource utilization rate.
  • FIG. 5 is a block diagram of an interaction processing apparatus 500 according to an exemplary embodiment.
  • the apparatus 500 includes an acquisition unit 501 , a determination unit 502 and a response unit 503 .
  • the obtaining unit 501 is configured to obtain at least one attribute information of the current login account, where the attribute information is used to indicate the attribute of the user to which the current login account belongs.
  • the determining unit 502 is configured to determine a target interaction mode from a plurality of interaction modes, where the target interaction mode matches the at least one attribute information and is used to indicate a response mode of the target interaction operation, the target interaction operation being the interaction operation corresponding to the target interaction mode.
  • the response unit 503 is configured to, in response to detecting the target interaction operation on the target interface, respond to the target interaction operation based on the target interaction manner.
  • the technical solutions provided by the embodiments of the present disclosure reflect the interaction habits of different users through the at least one attribute information of different accounts, and then determine, from the provided multiple interaction modes, a target interaction mode that conforms to the user's interaction habits; the user's interaction operations on the target interface are responded to based on this interaction mode, which meets the user's interaction needs and improves the usability of human-computer interaction.
  • the determining unit 502 includes: a first determining subunit configured to determine the degree of matching between each interaction mode and the at least one attribute information, and a second determining subunit configured to determine the interaction mode with the highest matching degree as the target interaction mode.
  • the first determining subunit is configured to: determine the interaction mode corresponding to each attribute information; obtain the weight of each attribute information in the at least one attribute information, where the weight is used to indicate the importance of the attribute information for determining the corresponding interaction mode; and add the weights of the attribute information corresponding to the same interaction mode to obtain the matching degree between the interaction mode and the at least one attribute information.
  • the obtaining unit 501 is configured to obtain at least one of the region information, age information, gender information, channel information and device information of the currently logged-in account, the acquired information constituting the at least one attribute information.
  • the region information is used to indicate the region where the computer device displaying the target interface is located, the channel information is used to indicate the download channel of the application program to which the target interface belongs, and the device information is used to indicate the type of the operating system of the computer device displaying the target interface.
  • the obtaining unit 501 is configured to obtain at least one attribute information of the current login account in response to the current login account being an account for the first login.
  • the interaction processing apparatus 500 further includes: a display unit configured to display a prompt interface, where the prompt interface includes a plurality of interactive mode selection controls.
  • the obtaining unit 501 is further configured to obtain at least one attribute information of the current account in response to not detecting a selection operation on the selection controls of the plurality of interaction modes.
  • the determining unit 502 is further configured to determine the selected interaction mode as the target interaction mode in response to a selection operation on the selection control of the interaction mode in the prompt interface.
  • the display unit is further configured to, in addition to displaying the selection controls of the plurality of interaction modes in the prompt interface, display the demonstration animations of the plurality of interaction modes in the prompt interface, where the demonstration animation of each interaction mode is used to show the process of responding, based on the interaction mode, to the interaction operation corresponding to the interaction mode.
  • the interaction processing apparatus 500 further includes: a storage unit configured to store the corresponding relationship between the current login account and the target interaction mode.
  • the determining unit 502 is further configured to determine the target interaction mode based on the stored correspondence in response to a re-login operation based on the current login account; and the response unit 503 is further configured to respond to the target interaction operation based on the target interaction mode in response to detecting the target interaction operation on the target interface.
  • the target interface is a video playing interface
  • the target interaction mode is used to instruct to switch the video in the video playing interface in response to a sliding operation on the video playing interface.
  • the interaction processing apparatus 500 further includes: a first replacement unit, configured to replace the target interaction mode with the first interaction mode when the occurrence frequency of the target playback event is greater than the frequency threshold; wherein, the target playback event is used to represent a video The playback duration of a single video in the playback interface is less than the duration threshold, and the first interaction mode is used to instruct to play the video acted by the click operation in response to the click operation on the video in the video list interface.
  • the target interface is a video list interface
  • the target interaction mode is used to instruct, in response to a click operation on a video in the video list interface, to play the video acted by the click operation.
  • the interaction processing apparatus 500 further includes: a second replacement unit, configured to replace the target interaction mode with the second interaction mode when the proportion of clicked videos in the video list interface is greater than the proportion threshold; wherein the proportion is the ratio of a first number to a second number, the first number is the number of videos acted on by the click operation in the video list interface, the second number is the total number of videos in the video list interface, and the second interaction mode is used to indicate that the video in the video playback interface is switched in response to a sliding operation on the video playback interface.
  • the computer device may be configured as at least one of a terminal and a server.
  • the terminal is used as the main body to implement the technical solutions provided by the embodiments of the present disclosure.
  • the server is used as the main body to implement the technical solutions provided by the embodiments of the present disclosure.
  • the technical solution provided by the present disclosure is implemented through the interaction between the terminal and the server, which is not limited in this embodiment of the present disclosure.
  • the steps of acquiring the attribute information, displaying the target interface, detecting the target interaction operation based on the target interface, and responding to the target interaction operation based on the target interaction mode are performed by the terminal, while the step of determining the target interaction mode based on the attribute information is performed by the server.
  • FIG. 6 is a block diagram of a terminal 600 according to an exemplary embodiment.
  • the terminal 600 may be a smart phone, a tablet computer, a smart watch, or a laptop computer.
  • the terminal 600 may also be called a user equipment, a portable terminal, a laptop terminal, a desktop terminal, and the like by other names.
  • the terminal 600 includes: a processor 601 and a memory 602 .
  • the processor 601 includes one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 601 is implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 601 includes a main processor and a coprocessor; the main processor is a processor for processing data in the wake-up state, also referred to as a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 601 is integrated with a GPU (Graphics Processing Unit), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 601 includes an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
  • Memory 602 includes one or more non-transitory computer-readable storage media. Memory 602 also includes high-speed random access memory, as well as non-volatile memory, such as one or more disk storage devices and flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 602 is used to store at least one instruction, and the at least one instruction is executed by the processor 601 to implement the interaction processing method provided by the method embodiments of the present disclosure.
  • the terminal 600 may further include: a peripheral device interface 603 and at least one peripheral device.
  • the processor 601, the memory 602 and the peripheral device interface 603 are connected through a bus or a signal line.
  • Each peripheral device is connected to the peripheral device interface 603 through a bus, a signal line or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 604 , a display screen 605 , a camera assembly 606 , an audio circuit 607 , a positioning assembly 608 and a power supply 609 .
  • the peripheral device interface 603 can be used to connect at least one peripheral device related to I/O (Input/Output, input/output) to the processor 601 and the memory 602 .
  • in some embodiments, the processor 601, the memory 602, and the peripheral device interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral device interface 603 may be implemented on a separate chip or circuit board, which is not limited by the embodiments of the present disclosure.
  • the radio frequency circuit 604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 604 communicates with the communication network and other communication devices via electromagnetic signals.
  • the radio frequency circuit 604 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • the radio frequency circuit 604 communicates with other terminals through at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 604 further includes a circuit related to NFC (Near Field Communication), which is not limited in the present disclosure.
  • the display screen 605 is used for displaying a UI (User Interface).
  • the UI includes graphics, text, icons, video, and any combination thereof.
  • in some embodiments, the display screen 605 is a touch display screen, which also has the ability to acquire touch signals on or above its surface.
  • the touch signal is input to the processor 601 as a control signal for processing.
  • the display screen 605 is also used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments, there is one display screen 605, arranged on the front panel of the terminal 600; in other embodiments, there are at least two display screens 605, respectively arranged on different surfaces of the terminal 600 or in a folded design; in still other embodiments, the display screen 605 is a flexible display screen disposed on a curved or folded surface of the terminal 600. The display screen 605 may even be set as a non-rectangular irregular figure, that is, a special-shaped screen.
  • the display screen 605 is made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
  • the camera assembly 606 is used to capture images or video.
  • camera assembly 606 includes a front-facing camera and a rear-facing camera.
  • the front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal.
  • in some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize the background-blur function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions.
  • camera assembly 606 includes a flash.
  • the flash is a single color temperature flash or a dual color temperature flash. Dual color temperature flash refers to the combination of warm light flash and cold light flash, which is used for light compensation under different color temperatures.
  • Audio circuit 607 includes a microphone and a speaker.
  • the microphone is used to collect the sound waves of the user and the environment, convert the sound waves into electrical signals and input them to the processor 601 for processing, or to the radio frequency circuit 604 to realize voice communication.
  • the microphones are array microphones or omnidirectional collection microphones.
  • the speaker is used to convert the electrical signal from the processor 601 or the radio frequency circuit 604 into sound waves.
  • the loudspeaker is a traditional thin-film loudspeaker or a piezoelectric ceramic loudspeaker.
  • when the loudspeaker is a piezoelectric ceramic loudspeaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement.
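As a rough illustration of such acoustic ranging (an aside, not part of the disclosure): the speaker emits an inaudible pulse, the microphone picks up its echo, and the distance follows from the round-trip time.

```python
# Minimal time-of-flight sketch: distance = speed_of_sound * round_trip / 2,
# since the pulse travels to the obstacle and back.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed constant here)

def distance_from_echo(round_trip_seconds):
    """Estimate the distance to a reflecting surface from the echo delay."""
    if round_trip_seconds < 0:
        raise ValueError("round-trip time must be non-negative")
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

print(distance_from_echo(0.01))  # a 10 ms round trip -> 1.715 m
```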
  • the audio circuit 607 also includes a headphone jack.
  • the positioning component 608 is used to locate the current geographic position of the terminal 600 to implement navigation or LBS (Location Based Service).
  • the positioning component 608 is a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
  • the power supply 609 is used to power various components in the terminal 600 .
  • the power source 609 is alternating current, direct current, a primary battery or a rechargeable battery.
  • the rechargeable battery is a wired rechargeable battery or a wireless rechargeable battery.
  • Wired rechargeable batteries are batteries that are charged through wired lines
  • wireless rechargeable batteries are batteries that are charged through wireless coils.
  • the rechargeable battery is also used to support fast charging technology.
  • terminal 600 also includes one or more sensors 610 .
  • the one or more sensors 610 include, but are not limited to, an acceleration sensor 611 , a gyro sensor 612 , a pressure sensor 613 , a fingerprint sensor 614 , an optical sensor 615 and a proximity sensor 616 .
  • the acceleration sensor 611 is used to detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 600 .
  • the acceleration sensor 611 is used to detect the components of the gravitational acceleration on the three coordinate axes.
  • the processor 601 controls the display screen 605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 611 .
  • the acceleration sensor 611 is also used for game or user movement data collection.
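The landscape/portrait decision described above can be sketched by comparing the gravity components on the device's axes. The axis convention and threshold rule here are assumptions for illustration; the disclosure does not specify how the processor 601 interprets the accelerometer signal.

```python
# Illustrative sketch: choose landscape vs. portrait from the gravity
# components reported on the device's x and y axes (in m/s^2).

def choose_view(gx, gy):
    """Portrait when gravity acts mainly along the y axis (device upright),
    landscape when it acts mainly along the x axis (device on its side)."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(choose_view(0.5, 9.7))   # device held upright -> portrait
print(choose_view(9.7, 0.5))   # device on its side  -> landscape
```

A production implementation would additionally debounce near-diagonal readings so the UI does not flip while the device is tilted around 45 degrees.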
  • the gyroscope sensor 612 detects the body direction and rotation angle of the terminal 600 , and the gyroscope sensor 612 cooperates with the acceleration sensor 611 to collect 3D actions of the user on the terminal 600 .
  • the processor 601 implements the following functions according to the data collected by the gyroscope sensor 612 : motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 613 is disposed on the side frame of the terminal 600 and/or the lower layer of the display screen 605 .
  • the pressure sensor 613 detects the user's holding signal of the terminal 600 , and the processor 601 performs left and right hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 613 .
  • the processor 601 controls the operability controls on the UI according to the user's pressure operation on the display screen 605.
  • the operability controls include at least one of button controls, scroll bar controls, icon controls, and menu controls.
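One way the holding-signal-based left/right-hand recognition mentioned above could work is by comparing pressure on the two side frames. The sensor layout, the threshold, and which edge maps to which hand are all assumptions made for this sketch, not disclosed details.

```python
# Hypothetical sketch of handedness detection from side-frame pressure
# readings (arbitrary units); thresholds are illustrative assumptions.

def detect_holding_hand(left_edge_pressure, right_edge_pressure, threshold=0.2):
    """Guess the holding hand from which edge carries more pressure; when
    the two edges read nearly the same, report 'unknown' rather than guess.
    (The palm-vs-fingertips mapping assumed here is one plausible choice.)"""
    if abs(left_edge_pressure - right_edge_pressure) < threshold:
        return "unknown"
    return "left" if left_edge_pressure > right_edge_pressure else "right"

print(detect_holding_hand(0.9, 0.3))  # left edge dominant -> left
```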
  • the fingerprint sensor 614 is used to collect the user's fingerprint, and the processor 601 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 614, or the fingerprint sensor 614 itself identifies the user's identity according to the collected fingerprint. When the user's identity is identified as a trusted identity, the processor 601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 614 is provided on the front, back or side of the terminal 600 . In the case where the terminal 600 is provided with a physical button or a manufacturer's logo, the fingerprint sensor 614 is integrated with the physical button or the manufacturer's logo.
  • Optical sensor 615 is used to collect ambient light intensity.
  • the processor 601 controls the display brightness of the display screen 605 according to the ambient light intensity collected by the optical sensor 615 . In some embodiments, when the ambient light intensity is high, the display brightness of the display screen 605 is increased; when the ambient light intensity is low, the display brightness of the display screen 605 is decreased. In another embodiment, the processor 601 dynamically adjusts the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615 .
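The brightness control just described amounts to a mapping from ambient illuminance to display brightness. The breakpoints and the linear interpolation below are assumptions chosen for the sketch; the disclosure only states that brightness rises with ambient light.

```python
# Illustrative mapping from ambient illuminance (lux) to display
# brightness (0.0-1.0): clamp at the extremes, interpolate in between.

def brightness_for_lux(lux, low=10.0, high=1000.0, min_b=0.1, max_b=1.0):
    """Return a brightness level for the given ambient light reading."""
    if lux <= low:
        return min_b          # dim surroundings: keep the screen dim
    if lux >= high:
        return max_b          # bright surroundings: full brightness
    return min_b + (max_b - min_b) * (lux - low) / (high - low)

print(brightness_for_lux(5))     # dim room  -> 0.1
print(brightness_for_lux(1500))  # daylight  -> 1.0
```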
  • a proximity sensor 616 also called a distance sensor, is usually provided on the front panel of the terminal 600 .
  • the proximity sensor 616 is used to collect the distance between the user and the front of the terminal 600 .
  • when the proximity sensor 616 detects that the distance between the user and the front of the terminal 600 gradually decreases, the processor 601 controls the display screen 605 to switch from the screen-on state to the screen-off state; when the proximity sensor 616 detects that the distance gradually increases, the processor 601 controls the display screen 605 to switch from the screen-off state to the screen-on state.
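The proximity-driven switching above can be sketched as a small state machine. The centimeter thresholds are illustrative assumptions; the gap between the two thresholds is a standard hysteresis trick (not stated in the disclosure) that keeps the screen from flickering at the boundary.

```python
# Sketch of proximity-driven screen switching with hysteresis.

def next_screen_state(current_state, distance_cm, near=3.0, far=6.0):
    """Turn the screen off when the user comes close (e.g. phone at the
    ear) and back on when they move away; readings between `near` and
    `far` leave the state unchanged."""
    if current_state == "on" and distance_cm <= near:
        return "off"
    if current_state == "off" and distance_cm >= far:
        return "on"
    return current_state

print(next_screen_state("on", 2.0))   # user approaching -> off
print(next_screen_state("off", 8.0))  # user moving away -> on
```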
  • the structure shown in FIG. 6 does not constitute a limitation on the terminal 600; for example, the terminal may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
  • FIG. 7 is a block diagram of a server according to an exemplary embodiment.
  • the server 700 may vary greatly due to different configurations or performance, and includes one or more processors (Central Processing Units, CPU) 701 and one or more memories 702, wherein the memory 702 is used to store executable instructions, and the processor 701 is configured to execute the executable instructions to implement the interaction processing method of the above method embodiments.
  • the server also has components such as a wired or wireless network interface, a keyboard, and an input/output interface, and the server also includes other components for implementing device functions, which are not described here.
  • a storage medium including instructions such as a memory 702 including instructions, is also provided, and the above-mentioned instructions can be executed by the processor 701 of the server 700 to complete the above-mentioned interactive processing method.
  • the storage medium is a non-transitory computer-readable storage medium; for example, the non-transitory computer-readable storage medium is a ROM (Read-Only Memory), a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • a computer program product is also provided, when the instructions in the computer program product are executed by the processor of the computer device, the computer device can execute the interactive processing method in each of the above method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to an interaction processing method and a computer device, in the technical field of computer technologies. The method comprises the steps of: acquiring at least one piece of attribute information of a currently logged-in account, the attribute information representing an attribute of the user to whom the currently logged-in account belongs; determining a target interaction mode among multiple interaction modes, the target interaction mode corresponding to the at least one piece of attribute information and indicating a response mode of a target interaction operation, the target interaction operation corresponding to the target interaction mode; and in response to detecting the target interaction operation on a target interface, responding to the target interaction operation based on the target interaction mode.
PCT/CN2021/106497 2020-10-26 2021-07-15 Interaction processing method and computer device WO2022088765A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011157026.9 2020-10-26
CN202011157026.9A CN112256181B (zh) 2020-10-26 2020-10-26 交互处理方法、装置、计算机设备及存储介质

Publications (1)

Publication Number Publication Date
WO2022088765A1 true WO2022088765A1 (fr) 2022-05-05

Family

ID=74261128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/106497 WO2022088765A1 (fr) Interaction processing method and computer device

Country Status (2)

Country Link
CN (1) CN112256181B (fr)
WO (1) WO2022088765A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256181B (zh) * 2020-10-26 2022-06-03 北京达佳互联信息技术有限公司 交互处理方法、装置、计算机设备及存储介质
CN114489441A (zh) * 2022-01-21 2022-05-13 珠海格力电器股份有限公司 食谱展示方法、装置、电子设备及存储介质
CN114816181A (zh) * 2022-03-08 2022-07-29 平安科技(深圳)有限公司 基于机器学习的人机交互方式处理方法、装置及相关设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140223361A1 (en) * 2013-02-07 2014-08-07 Google Inc. Mechanism to reduce accidental clicks on online content
CN108319485A (zh) * 2018-01-29 2018-07-24 出门问问信息科技有限公司 信息交互方法、装置、设备及存储介质
CN109151548A (zh) * 2018-08-31 2019-01-04 北京优酷科技有限公司 界面交互方法及装置
CN110109596A (zh) * 2019-05-08 2019-08-09 芋头科技(杭州)有限公司 交互方式的推荐方法、装置以及控制器和介质
CN112256181A (zh) * 2020-10-26 2021-01-22 北京达佳互联信息技术有限公司 交互处理方法、装置、计算机设备及存储介质

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651357B (zh) * 2016-11-16 2021-06-22 网易乐得科技有限公司 一种支付方式推荐方法和设备
CN110730387B (zh) * 2019-11-13 2022-12-06 腾讯科技(深圳)有限公司 视频播放控制方法和装置、存储介质及电子装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140223361A1 (en) * 2013-02-07 2014-08-07 Google Inc. Mechanism to reduce accidental clicks on online content
CN108319485A (zh) * 2018-01-29 2018-07-24 出门问问信息科技有限公司 信息交互方法、装置、设备及存储介质
CN109151548A (zh) * 2018-08-31 2019-01-04 北京优酷科技有限公司 界面交互方法及装置
CN110109596A (zh) * 2019-05-08 2019-08-09 芋头科技(杭州)有限公司 交互方式的推荐方法、装置以及控制器和介质
CN112256181A (zh) * 2020-10-26 2021-01-22 北京达佳互联信息技术有限公司 交互处理方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
CN112256181A (zh) 2021-01-22
CN112256181B (zh) 2022-06-03

Similar Documents

Publication Publication Date Title
CN112162671B (zh) 直播数据处理方法、装置、电子设备及存储介质
US20220159323A1 (en) Method for pre-loading content data, and electronic device
CN109874312B (zh) 播放音频数据的方法和装置
WO2022088765A1 (fr) Procédé de traitement d'interaction et dispositif informatique
CN109327608B (zh) 歌曲分享的方法、终端、服务器和系统
CN110149557B (zh) 视频播放方法、装置、终端和存储介质
CN110248236B (zh) 视频播放方法、装置、终端及存储介质
CN113411680B (zh) 多媒体资源播放方法、装置、终端及存储介质
WO2023050737A1 (fr) Procédé de présentation de ressources basé sur une salle de diffusion continue en direct, et terminal
CN111836069A (zh) 虚拟礼物赠送方法、装置、终端、服务器及存储介质
CN111368114B (zh) 信息展示方法、装置、设备及存储介质
CN111026992A (zh) 多媒体资源预览方法、装置、终端、服务器及存储介质
WO2020253129A1 (fr) Procédé, appareil et dispositif d'affichage de chansons et support de stockage
CN112616082A (zh) 视频预览方法、装置、终端及存储介质
CN111796990A (zh) 资源显示方法、装置、终端及存储介质
CN112100528A (zh) 对搜索结果评分模型进行训练的方法、装置、设备、介质
CN111563201A (zh) 内容推送方法、装置、服务器及存储介质
EP4125274A1 (fr) Procédé et appareil de lecture de vidéos
CN112860046A (zh) 选择运行模式的方法、装置、电子设备及介质
WO2022127200A1 (fr) Procédé et appareil d'affichage de contenu
CN111522483B (zh) 多媒体数据分享方法、装置、电子设备及存储介质
CN113064537B (zh) 媒体资源播放方法、装置、设备、介质及产品
CN114115660B (zh) 媒体资源处理方法、装置、终端及存储介质
CN111241334B (zh) 显示歌曲信息页面的方法、装置、系统、设备及存储介质
CN111526221B (zh) 域名质量确定方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21884501

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18.08.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21884501

Country of ref document: EP

Kind code of ref document: A1