CN109800037B - Interface display method, interface data processing method, client and server
- Publication number
- CN109800037B (application CN201711136046.6A)
- Authority
- CN
- China
- Prior art keywords
- state
- client
- interface
- state information
- interface data
- Prior art date
- Legal status
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present specification disclose an interface display method, an interface data processing method, a client, and a server. The interface display method includes: collecting state information related to the client; generating a state characterizing parameter according to the state information, where the state characterizing parameter represents the emotion of the user of the client; formulating interface data according to the state characterizing parameter, where the interface data has display elements corresponding to the emotion; and displaying an interface according to the display elements. The technical solution provided in this specification can display personalized content more intelligently.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interface display method, an interface data processing method, a client, and a server.
Background
Users generally prefer to set personalized content on their terminal devices. For example, a user can change the ring tone, the screen saver pattern, and the incoming-call display style of a mobile phone as desired.
When a terminal device displays personalized content today, on the one hand it passively replaces the corresponding content in response to an input instruction from the user; for example, a user may manually change the screen saver pattern of a mobile phone. On the other hand, the terminal device may select one or more resources from a set of candidate resources and display them in sequence, according to a control instruction input by the user and at a fixed time interval, so that personalized content is displayed dynamically. For example, a user may set up a picture library in a mobile phone and configure the phone to randomly select a picture from the library every minute as the screen saver pattern, so that the phone changes its screen saver automatically.
As users' demands on intelligent devices keep growing, a more convenient and intelligent method for displaying personalized content is urgently needed.
Disclosure of Invention
An object of the embodiments of the present specification is to provide an interface display method, an interface data processing method, a client, and a server that can display personalized content more intelligently.
In order to achieve the above object, an embodiment of the present specification provides an interface display method applied to a client, the method including: collecting state information related to the client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; formulating interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; and displaying an interface according to the display element.
An embodiment of the present specification further provides a client including a memory and a processor, the memory storing a computer program that, when executed by the processor, implements the following steps: collecting state information related to the client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; formulating interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; and displaying an interface according to the display element.
An embodiment of the present specification further provides an interface display method applied to a client, the method including: collecting state information related to the client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; sending the state characterizing parameter to a server, for the server to formulate interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; receiving the interface data fed back by the server; and displaying an interface according to the display element.
An embodiment of the present specification further provides a client including a memory and a processor, the memory storing a computer program that, when executed by the processor, implements the following steps: collecting state information related to the client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; sending the state characterizing parameter to a server, for the server to formulate interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; receiving the interface data fed back by the server; and displaying an interface according to the display element.
An embodiment of the present specification further provides an interface display method applied to a client, the method including: providing first state information related to the client to a server, for the server to generate a first state characterizing parameter according to the first state information and to formulate first interface data according to the first state characterizing parameter, where the first state characterizing parameter is used to represent the emotion of the user of the client and the first interface data has a display element corresponding to the emotion; receiving the first interface data fed back by the server; and displaying an interface according to the display element in the first interface data.
An embodiment of the present specification further provides a client including a memory and a processor, the memory storing a computer program that, when executed by the processor, implements the following steps: providing first state information related to the client to a server, for the server to generate a first state characterizing parameter according to the first state information and to formulate first interface data according to the first state characterizing parameter, where the first state characterizing parameter is used to represent the emotion of the user of the client and the first interface data has a display element corresponding to the emotion; receiving the first interface data fed back by the server; and displaying an interface according to the display element in the first interface data.
An embodiment of the present specification further provides an interface data processing method, including: receiving state information sent by a client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; formulating interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; and sending the interface data to the client.
An embodiment of the present specification further provides a server including a memory and a processor, the memory storing a computer program that, when executed by the processor, implements the following steps: receiving state information sent by a client; generating a state characterizing parameter according to the state information, where the state characterizing parameter is used to represent the emotion of the user of the client; formulating interface data according to the state characterizing parameter, where the interface data has a display element corresponding to the emotion; and sending the interface data to the client.
As can be seen from the technical solutions provided in the foregoing embodiments, when an interface is to be presented, the state information of the client may be collected; this state information can characterize the environment the user is currently in and/or the activity the user is currently engaged in. A corresponding state characterizing parameter may then be obtained from the collected state information, and this parameter can represent the emotion of the user using the client. Interface data that matches the user's current emotion can therefore be formulated according to the state characterizing parameter, and the current interface can be displayed according to the formulated interface data. Because the displayed interface is derived from an analysis of the user's current emotion, the content of the displayed interface can match that emotion, so that different personalized content can be displayed for different users and personalized content is displayed more intelligently.
Drawings
To explain the embodiments of the present specification or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below are obviously only some of the embodiments in this specification; those skilled in the art can obtain other drawings from them without inventive labor.
FIG. 1 is a schematic diagram of a client according to an embodiment of the present disclosure;
FIG. 2 is a diagram illustrating the assignment of weight values in an embodiment of the present disclosure;
FIG. 3 is a first diagram of a client and a server in an embodiment of the present disclosure;
FIG. 4 is a second diagram of a client and a server in an embodiment of the present disclosure;
FIG. 5 is a flowchart of an interface display method according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating an internal structure of a client according to an embodiment of the present disclosure;
FIG. 7 is a flowchart of a method for processing interface data according to an embodiment of the present disclosure;
FIG. 8 is a diagram illustrating correspondence between state characterizing parameters and specified features in an embodiment of the present disclosure.
Detailed Description
To make the technical solutions in the present specification better understood, the technical solutions in the embodiments will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present specification. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without inventive work shall fall within the protection scope of the present specification.
The specification provides an interface display method, which can be applied to a client. The client may be an electronic device having a display function and a data processing function. The client can be an electronic device such as a desktop computer, a tablet computer, a notebook computer, a smart phone, a digital assistant, an intelligent wearable device, a shopping guide terminal and an intelligent television.
Referring to FIG. 1, in the present embodiment, the client may include an information acquisition system, an information processing system, an interface formulation system, and an interface display system. The information acquisition system can collect status information related to the client. Specifically, the information acquisition system may obtain the status information through an application program running in the client. The application program may be, for example, a screen saver APP (application program), a music playing APP, a video playing APP, a weather forecast APP, or a navigation APP installed in a smartphone. In this embodiment, the status information may be at least one of audio information, video information, location information, weather information, and traffic information. The information acquisition system can call the access interface of each application program to acquire current real-time data from that application. For example, when a music playing program plays music, the information acquisition system may acquire information about the currently played music; for another example, the information acquisition system may obtain current weather information while a weather forecast program is running in the background.
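To make the data flow concrete, here is a minimal Python sketch of this collection step. All source names and accessor lambdas are hypothetical stand-ins for the APP access interfaces described above.

```python
from typing import Any, Callable, Dict

def collect_state_info(sources: Dict[str, Callable[[], Any]]) -> Dict[str, Any]:
    """Poll each registered application interface for its current real-time data."""
    state_info = {}
    for name, fetch in sources.items():
        try:
            state_info[name] = fetch()  # e.g. current track, weather, road conditions
        except Exception:
            pass  # an application that is not running contributes no state
    return state_info

# All sources below are hypothetical stand-ins for real APP access interfaces:
state = collect_state_info({
    "music":   lambda: {"tempo": 3, "intensity": 10},
    "weather": lambda: {"temperature": 1, "humidity": 6},
})
```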
In this embodiment, the information processing system may generate a state characterizing parameter from the state information. The state characterizing parameter can be used to represent the emotion of the user of the client. Specifically, the status information may reflect at least one characteristic, and different kinds of state information may reflect different characteristics. For example, when the status information is audio information, the reflected features may include rhythm, melody, tone, and sound intensity. For another example, when the status information is weather information, the reflected features may include temperature, humidity, weather conditions, and wind power. In the present embodiment, big data of state information may be analyzed in advance to quantify each feature, thereby obtaining a feature value corresponding to each feature. For example, sound intensity may be divided into 10 levels according to its strength in the audio information, with each level corresponding to a numerical value; then, according to the level of the sound intensity in the actually collected audio information, the numerical value corresponding to that level can be used as the feature value of the sound intensity feature. In this embodiment, the features reflected by the state information may thus have corresponding feature values. In an actual application scenario, the feature values of the respective features may be included in the state information. For example, the audio information may include a tempo feature having a feature value of 3 and a sound intensity feature having a feature value of 10. Thus, the feature value of a specified feature can be extracted from the state information, where the specified feature may be one of the characteristics reflected by the state information.
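A small sketch of the quantization step described above, assuming a linear division of a raw measurement into discrete grades (the dBFS loudness scale in the example is an assumption):

```python
def quantize(value: float, lo: float, hi: float, levels: int = 10) -> int:
    """Map a raw measurement onto one of `levels` discrete grades, as in the
    text's example of dividing sound intensity into 10 levels."""
    value = min(max(value, lo), hi)
    return 1 + int((value - lo) / (hi - lo) * (levels - 1))

# Assumed scale: loudness in dBFS from -60 (quietest) to 0 (loudest).
assert quantize(-60.0, -60.0, 0.0) == 1   # quietest grade
assert quantize(0.0, -60.0, 0.0) == 10    # loudest grade
```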
In this embodiment, the interface data presented in the interface may be represented by a plurality of dimensions. Specifically, the interface data may have multiple dimensions such as showing time, color, and showing rhythm. In this way, after the feature value of a specified feature is extracted from the state information, the influence of that feature value on the interface data can be further determined. For example, if the client is currently playing music with a cheerful rhythm, the showing time of the interface data presented in the client's interface may be short, the color may be in a warm color system, and the showing rhythm may be fast. For another example, if the navigation software in the client indicates that the current road conditions are relatively congested, the showing time of the interface data may be longer, the color may adopt a cold color system, and the showing rhythm may be slower. That is, the feature values of different features all affect the interface data displayed on the final interface. Accordingly, in the present embodiment, the feature value of the specified feature may be mapped to a specified dimension, so as to obtain a dimension value corresponding to that dimension. The specified dimension may be one of the dimensions of the interface data described above. For example, when the specified feature is temperature and its feature value is 1 (indicating that the weather is currently cold), the feature value can be mapped into the color dimension, giving a color dimension value of 2 (indicating a cold color system). In an actual application scenario, the mapping relationship between feature values and dimension values may be predetermined. In this way, after the feature value of the specified feature is extracted from the state information, the dimension value of the dimension corresponding to that feature value can be determined according to the mapping relationship. It should be noted that the mapping relationship may map one feature value to multiple dimension values. For example, the feature value of the temperature feature may correspond to dimension values in the three dimensions of showing time, color, and showing rhythm, so that one feature value corresponds to three dimension values. In the present embodiment, the state characterizing parameter may be constituted by a plurality of dimension values obtained from the feature value of the specified feature; each numerical value in the state characterizing parameter may be a dimension value mapped from that feature value. For example, if the feature value of the sound intensity is 10 and it is mapped into the three dimensions of showing time, color, and showing rhythm, three dimension values of 2, 20, and 10 may be obtained, so that the state characterizing parameter may be (2, 20, 10).
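The predetermined feature-value-to-dimension-value mapping could be as simple as a lookup table. The sketch below reproduces the sound-intensity example from the text; the temperature entry is an illustrative assumption.

```python
# (feature name, feature value) -> (showing time, color, showing rhythm).
FEATURE_TO_DIMENSIONS = {
    ("sound_intensity", 10): (2, 20, 10),  # the example given in the text
    ("temperature", 1):      (6, 2, 3),    # cold -> cool color system (assumed)
}

def to_state_characterizing_parameter(feature: str, value: int) -> tuple:
    """Look up the dimension values a feature value maps to."""
    return FEATURE_TO_DIMENSIONS[(feature, value)]

assert to_state_characterizing_parameter("sound_intensity", 10) == (2, 20, 10)
```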
In the present embodiment, the feature values corresponding to different features use different units and therefore differ greatly in magnitude. For example, in the rhythm feature a feature value of 10 may indicate a slow rhythm, while in the sound intensity feature a feature value of 10 may indicate a high intensity. To represent different features within a uniform numerical range, the feature values may be converted into a specified metric domain, yielding a degree value for each feature value within that domain. The specified metric domain may be a value interval having a minimum value and a maximum value; feature values from different value ranges can thus be mapped into the specified metric domain, ensuring uniform numerical units. For example, suppose the specified metric domain is a dimensionless value interval with a minimum of 0 and a maximum of 10. A feature value whose original range is 0 to 100 can then be mapped into this interval: the feature value 0 corresponds to the degree value 0, the feature value 100 to the degree value 10, and the feature value 50 to the degree value 5. By mapping the feature values of different features into one unified metric domain, the degree of each feature can be expressed uniformly by the magnitude of its degree value. In an actual application scenario, the feature values may be normalized, so that they are uniformly mapped into a value interval with a minimum of 0 and a maximum of 1.
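The conversion into a specified metric domain is plain linear rescaling. This sketch reproduces the 0-to-100 into 0-to-10 example from the text:

```python
def to_metric_domain(value: float, lo: float, hi: float,
                     out_lo: float = 0.0, out_hi: float = 1.0) -> float:
    """Linearly map a feature value from its native range [lo, hi] into the
    specified metric domain [out_lo, out_hi]."""
    return out_lo + (value - lo) / (hi - lo) * (out_hi - out_lo)

# The example from the text: a 0..100 feature value mapped into a 0..10 domain.
assert to_metric_domain(0, 0, 100, 0, 10) == 0.0
assert to_metric_domain(50, 0, 100, 0, 10) == 5.0
assert to_metric_domain(100, 0, 100, 0, 10) == 10.0
```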
Specifically, referring to FIG. 8, in practical applications, state information collected from a large number of clients may be analyzed to summarize the correspondence between specified features and state characterizing parameters. In FIG. 8, the abscissa may be the showing time in the state characterizing parameter and the ordinate may be the showing rhythm. The coordinate system can contain different emotion, weather, road condition, temperature, and music features. For example, it may include emotion features such as anger, joy, and surprise; weather features such as rain, thunder, and fog; road condition features such as smooth traffic and congestion; temperature features such as hot and warm; and music features such as heavy metal, hip-hop, and ballad. In FIG. 8, each specified feature corresponds to values in the two dimensions of showing time and showing rhythm. Therefore, once the specified feature in the state information is determined, the state characterizing parameter corresponding to it can be determined according to FIG. 8. In practical applications, a specified feature may also correspond to a range of values rather than a single value. For example, the happy emotion feature may correspond to a showing time range of 0.3 to 0.4 and a showing rhythm range of 0.5 to 0.6. When the state characterizing parameter corresponding to the specified feature is generated, a value can then be selected from that range according to a certain rule: the rule may be to pick a value at random within the range, or to pick the center value, i.e., the average of the start and end values of the range. It should be noted that the state characterizing parameter may also have more dimensions, in which case a specified feature corresponds to more values than just showing time and showing rhythm.
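A sketch of selecting a state-parameter value from the range a specified feature maps to in the FIG. 8 correspondence, implementing the two selection rules just mentioned (random point or center value):

```python
import random

def pick_from_range(lo: float, hi: float, rule: str = "center") -> float:
    """Select a parameter value from the range a specified feature maps to:
    the center of the range, or a random point inside it."""
    if rule == "center":
        return (lo + hi) / 2       # average of the start and end values
    return random.uniform(lo, hi)  # rule == "random"

# The happy-emotion example: showing time in 0.3..0.4, showing rhythm in 0.5..0.6.
showing_time = pick_from_range(0.3, 0.4)    # 0.35
showing_rhythm = pick_from_range(0.5, 0.6)  # 0.55
```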
Referring to FIG. 2, in the present embodiment, more than one feature value may be extracted from the state information. For example, two feature values, a tempo feature value and a sound intensity feature value, can be extracted from audio information. These feature values may all affect the same dimension of the interface data, but to different degrees. Based on this, corresponding weight values can be assigned to the different features for a specified dimension of the interface data. For example, for the color dimension, a weight value of 0.8 may be assigned to the tempo feature and a weight value of 0.2 to the sound intensity feature. In this way, after each feature value is converted into the specified metric domain to obtain a corresponding degree value, the degree values may be weighted and summed per specified dimension to obtain the dimension value. For example, for the color dimension, if the degree value of the tempo feature is 0.6 and the degree value of the sound intensity feature is 0.4, the weighted sum gives a color dimension value of 0.6 × 0.8 + 0.4 × 0.2 = 0.56. Thus, when the state information reflects at least two features, the corresponding dimension value can be obtained for each dimension by weighted summation, and the state characterizing parameter can finally be formed.
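The weighted summation can be written directly; this sketch reproduces the color-dimension example above:

```python
def dimension_value(degrees: dict, weights: dict) -> float:
    """Weighted sum of per-feature degree values for one interface dimension."""
    return sum(degrees[f] * weights[f] for f in degrees)

# The color-dimension example from the text: tempo degree 0.6 with weight 0.8,
# intensity degree 0.4 with weight 0.2 -> 0.6 x 0.8 + 0.4 x 0.2 = 0.56.
color = dimension_value({"tempo": 0.6, "intensity": 0.4},
                        {"tempo": 0.8, "intensity": 0.2})
assert abs(color - 0.56) < 1e-9
```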
In this embodiment, after the state characterizing parameter is obtained, the interface formulation system may formulate interface data according to it. The interface data has a display element corresponding to the emotion. Specifically, the display element corresponding to the emotion may be a display mode capable of reflecting that emotion. For example, if the emotion is happy, the display element may be an object that moves at a relatively high frequency, such as a small monkey jumping up and down. For another example, if the emotion is sadness, the display element may be a gloomy, drizzling rain. The display elements may be pre-stored in the client and may carry corresponding emotion tags, so that the appropriate display element can be determined from the emotion represented by the state characterizing parameter.
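Selecting a display element by emotion tag can then be a simple lookup. The element names below are the illustrative ones from the text:

```python
# Real display elements would be animation resources pre-stored on the client.
DISPLAY_ELEMENTS = {
    "happy": "monkey_jumping_up_and_down",
    "sad":   "drizzling_rain",
}

def pick_display_element(emotion: str) -> str:
    """Look up a pre-stored display element by its emotion tag."""
    return DISPLAY_ELEMENTS[emotion]
```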
In this embodiment, the interface data may have a plurality of attributes. An attribute may refer to a visualization element that can be adjusted, for example the rhythm at which the interface data is played, the color presented when it is played, or the duration of its playback. Specifically, an attribute may be provided with an attribute value, which embodies the state in which the attribute should be presented, so that the display interface represented by the interface data presents a display effect corresponding to the attribute's value. For example, if the attribute value of the playing rhythm is 10, the playing rhythm of the animation in the display interface is fast. For another example, if the attribute value of the color is 1, the color system of the animation is relatively cold. In this embodiment, when the interface data is formulated according to the state characterizing parameter, the attribute values of the attributes may be specified according to the dimension values of the state characterizing parameter. Specifically, there may be a mapping relationship between dimension values and attribute values, so that the attribute value of an attribute can be obtained from the determined dimension value. For example, if the state characterizing parameter is (0.1, 0.6, 1), where the three dimension values correspond to the playing time, the color, and the playing rhythm of the interface data, then the three dimension values can be converted into the value ranges of the corresponding attribute values: 0.1 may be mapped to an attribute value of 3, indicating a playing time of 3 seconds; 0.6 may be mapped to an attribute value of 20, indicating that the color played is color system No. 20; and 1 may be mapped to an attribute value of 3, indicating that the playback rhythm is 3x speed.
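A sketch of converting dimension values into attribute values; the linear scales are assumptions chosen only to reproduce the (0.1, 0.6, 1) example above:

```python
def to_attribute_values(state_param: tuple) -> dict:
    """Convert the three dimension values of a state characterizing parameter
    into concrete attribute values (scales are assumptions)."""
    showing_time, color, rhythm = state_param
    return {
        "play_seconds": round(showing_time * 30),  # 0.1 -> 3 seconds
        "color_no":     round(color * 33.3),       # 0.6 -> color system No. 20
        "speed":        rhythm * 3,                # 1   -> 3x playback speed
    }

assert to_attribute_values((0.1, 0.6, 1)) == \
    {"play_seconds": 3, "color_no": 20, "speed": 3}
```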
In this embodiment, the interface displayed from the interface data may include animation, and the interface data may therefore include animation data. The animation data may be used to present a dynamic interface; for example, the dynamic interface may show a small figure running in from a distance and kicking a ball. Animation playback can thus have a duration, a color, and a deformation. The duration may be the length of time for which the animation data is played; the color may represent the overall tone of the animation data, or the color of at least a partial region within it; the deformation may represent the playback rhythm of the animation data. In this embodiment, the dimensions of the state characterizing parameter may correspond one-to-one to the duration, color, and deformation of the animation data. Thus the dimensions of the state characterizing parameter can be converted into attribute values, and the duration, color, and deformation can be controlled through those attribute values.
In this embodiment, after obtaining the interface data, the client may display it as a corresponding interface through the interface display system. The interface can include a display element representing the user's emotion, and the display mode of that element can be controlled by the attribute values. Therefore, as the collected state information differs, the dynamic interface displayed on each user's client can differ, and a personalized interface can be presented to each user intelligently.
Referring to FIG. 3, this specification also provides an interface display method that can be applied to a system architecture consisting of a client and a server. The server may be an electronic device with strong computing power and may be communicatively coupled to the client. In this architecture, the client may include only an information acquisition system, an information processing system, and an interface display system. Through the information acquisition system, the state information related to the client can be acquired; the information processing system then generates a state characterizing parameter from that state information, the parameter representing the emotion of the user of the client. After obtaining the state characterizing parameter, the client may send it to the server. The server may include an interface formulation system, so that interface data can be formulated according to the received state characterizing parameter and fed back to the client. The client can then display the interface according to the display element through its interface display system.
In addition, referring to FIG. 4, in an embodiment, the client may include only an information acquisition system and an interface display system, while the information processing system and the interface formulation system reside in the server. In this way, the client can acquire first state information through its information acquisition system and provide that first state information to the server. After receiving it, the server can generate a first state characterizing parameter through its information processing system and formulate first interface data according to that parameter through its interface formulation system. The first state characterizing parameter can be used to represent the emotion of the user of the client, and the first interface data has a display element corresponding to that emotion. After generating the first interface data, the server can feed it back to the client, and the client can display an interface according to the first interface data through its interface display system.
In this embodiment, the information acquisition system of the client may continuously collect the corresponding state information at fixed time intervals. After second state information is collected, the client may send it to the server in the manner described above, so that the server generates a second state characterizing parameter according to the second state information and formulates second interface data according to that parameter. In an actual application scenario, the state information of the client may change at any time; for example, the client may previously have played a sadder song but currently plays a cheerful one. The second state information collected by the client then differs from the first state information, which results in the second interface data differing from the first interface data. That is, as the client's state information changes, the interface data presented in the client's interface may also change.
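The fixed-interval collection loop might look like the following sketch, where collect() and send() are hypothetical stand-ins for the information acquisition system and the network layer:

```python
import time

def poll_loop(interval_s: float, collect, send) -> None:
    """Continuously collect state information at a fixed time interval and
    send it to the server."""
    while True:
        send(collect())
        time.sleep(interval_s)
```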
In an actual application scenario, an APP for displaying dynamic wallpaper can be installed on a user's mobile phone, and with the user's authorization this APP can acquire access rights to multiple other application programs. For example, the APP may obtain access rights to the navigation software in the phone, so that real-time data can be obtained from the navigation software.
In this application scenario, when the user is driving, navigation can be performed through the navigation software on the mobile phone. Suppose the current road conditions are relatively congested; the real-time data in the navigation software can then represent road congestion and may include a congestion severity value. The APP may obtain from the navigation software state information containing a congestion severity value of 10, extract this value, and normalize it to obtain a normalized severity value of 0.8. Based on the mapping relationship between the normalized congestion value and the duration, color, and deformation of the interface animation, the APP obtains a duration attribute value of 5, indicating that playing the whole animation takes 5 seconds; a color attribute value of 20, indicating that color system No. 20 (reddish) is used when the animation is played; and a deformation attribute value of 0.5, indicating that the animation is played at 0.5x speed. Suppose the animation shows a small bug crawling forward along a road: under the current conditions, the bug crawls slowly, appears red, and swings its head slowly.
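A worked sketch of this scenario: only the values quoted above (severity 10, normalized 0.8, duration 5, color 20, deformation 0.5) come from the text; the scaling constants are assumptions chosen to reproduce them.

```python
def congestion_to_animation(severity: float, max_severity: float = 12.5) -> dict:
    """Map a congestion severity value to the bug animation's attributes."""
    norm = severity / max_severity                 # 10 -> 0.8, as in the text
    return {
        "duration": round(norm * 6.25, 2),         # 0.8 -> 5: playback takes 5 s
        "color":    round(norm * 25),              # 0.8 -> 20: color No. 20 (reddish)
        "deform":   round(1.0 - norm * 0.625, 3),  # 0.8 -> 0.5x: slow crawl
    }

congested = congestion_to_animation(10)  # {'duration': 5.0, 'color': 20, 'deform': 0.5}
smooth = congestion_to_animation(0)      # short, low color number, full speed
```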
In this application scenario, when the user drives onto a smooth road section, the APP can acquire the navigation software's data again, obtaining state information with a congestion severity value of 0. According to this state information, the resulting bug-crawling animation can be the opposite: the bug crawls quickly, appears green, and swings its head quickly.
In another application scenario, the user plays music through a player on a mobile phone. The APP displaying the dynamic wallpaper may collect the currently played audio data and, after acquiring it, send it to the server. The server may analyze the audio data to obtain its tempo and sound intensity, and may then determine the feature values corresponding to the tempo and sound intensity. These feature values can be input into a Hevner emotion model to obtain a state feature vector corresponding to the audio data; the state feature vector may characterize the emotion embodied by the audio data. Of course, in practical applications the Hevner emotion model can be replaced by another emotion model, such as a Thayer emotion model or a Tellegen-Watson-Clark emotion model.
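The emotion-model step is sketched below as a toy stand-in; a real Hevner (or Thayer, or Tellegen-Watson-Clark) model is far richer, so this only illustrates the shape of the data flow:

```python
def emotion_vector(tempo: float, intensity: float) -> dict:
    """Toy stand-in for the emotion-model step: normalized tempo and sound
    intensity go in, a valence/arousal-style state feature vector comes out.
    This rule is an assumption, not the actual Hevner formulation."""
    return {"arousal": intensity, "valence": tempo}

vec = emotion_vector(tempo=0.9, intensity=0.8)  # cheerful, energetic music
```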
In this application scenario, after the state feature vector is obtained through the emotion model, the server may derive the attribute values of each attribute in the interface data from it. For example, suppose the interface data shows a child running in from a distance and kicking a ball, with attributes such as the child's running speed, the ball's flight speed, the curvature of the ball's flight trajectory, and the ball's color. If the player is currently playing a happy song, the attribute values in the interface data may be set to 0.8, 0.8, 0.1, and 1, where the first two values of 0.8 indicate that the child's running speed and the ball's flight speed are each 80% of their maximum, 0.1 indicates that the ball's flight trajectory is nearly a straight line, and 1 indicates that the ball is red. After setting these attribute values, the server may send them to the user's mobile phone. Once the phone receives them, the APP displaying the dynamic wallpaper can process the interface data and then display a dynamic interface that moves according to the attribute values.
Referring to FIG. 5, the present specification provides an interface display method, which can be applied to a client. The method includes the following steps.
S11: collecting state information related to the client.
In this embodiment, the state information may be obtained by an application running in the client. The application program can be, for example, a screen saver APP, a music playing APP, a video playing APP, a weather forecast APP, a navigation APP, and the like installed in a smart phone. In this embodiment, the status information may be at least one of audio information, video information, location information, weather information, and traffic information. The access interface of each application may be invoked to obtain current real-time data from the application. For example, when the music playing program is playing music, the currently played music information may be acquired; for another example, current weather information may be obtained while a weather forecast program is running in the background.
S13: generating a state characterizing parameter according to the state information; the state characterizing parameter is used for representing the emotion of the user of the client.
In this embodiment, the state characterizing parameter may be used to represent the emotion of the user of the client. Specifically, the status information may reflect at least one characteristic, and different kinds of state information may reflect different characteristics. For example, when the status information is audio information, the reflected features may include rhythm, melody, tone, and sound intensity. For another example, when the status information is weather information, the reflected features may include temperature, humidity, weather conditions, and wind power. In the present embodiment, big data of state information may be analyzed in advance to quantify each feature, thereby obtaining a feature value corresponding to each feature, so that the features reflected by the state information have corresponding feature values. In an actual application scenario, the feature value of each feature may be included in the state information. For example, the audio information may include a tempo feature having a feature value of 3 and a sound intensity feature having a feature value of 10. Thus, the feature value of a specified feature can be extracted from the state information, where the specified feature may be one of the characteristics reflected by the state information.
In this embodiment, the interface data presented in the interface may be represented by a plurality of dimensions. Specifically, the interface data may have multiple dimensions such as showing time, color, and showing rhythm. In this way, after the feature value of a specified feature is extracted from the state information, the influence of that feature value on the interface data can be further determined. For example, if the client is currently playing music with a cheerful rhythm, the showing time of the interface data presented in the client's interface may be short, the color may be in a warm color system, and the showing rhythm may be fast. For another example, if the navigation software in the client indicates that the current road conditions are relatively congested, the showing time of the interface data may be longer, the color may adopt a cold color system, and the showing rhythm may be slower. That is, the feature values of different features all affect the interface data displayed on the final interface. Accordingly, in the present embodiment, the feature value of the specified feature may be mapped to a specified dimension, so as to obtain a dimension value corresponding to that dimension. The specified dimension may be one of the dimensions of the interface data described above. For example, when the specified feature is temperature and its feature value is 1 (indicating that the weather is currently cold), the feature value can be mapped into the color dimension, giving a color dimension value of 2 (indicating a cold color system). In an actual application scenario, the mapping relationship between feature values and dimension values may be predetermined, so that after the feature value of the specified feature is extracted from the state information, the dimension value of the dimension corresponding to that feature value can be determined according to the mapping relationship. It should be noted that the mapping relationship may map one feature value to multiple dimension values. For example, the feature value of the temperature feature may correspond to dimension values in the three dimensions of showing time, color, and showing rhythm, so that one feature value corresponds to three dimension values. In the present embodiment, the state characterizing parameter may be constituted by a plurality of dimension values obtained from the feature value of the specified feature; each value in the state characterizing parameter may be a dimension value mapped from that feature value. For example, if a feature value of 10 is extracted from the audio information and mapped into the three dimensions of showing time, color, and showing rhythm, three dimension values of 2, 20, and 10 are obtained, so that the state characterizing parameter may be (2, 20, 10).
In the present embodiment, the feature values corresponding to different features use different units and therefore differ greatly in magnitude. For example, in the rhythm feature a feature value of 10 may indicate a slow rhythm, while in the sound intensity feature a feature value of 10 may indicate a high intensity. To represent different features within a uniform numerical range, the feature values may be converted into a specified metric domain, yielding a degree value for each feature value within that domain. The specified metric domain may be a value interval having a minimum value and a maximum value; feature values from different value ranges can thus be mapped into the specified metric domain, ensuring uniform numerical units. For example, suppose the specified metric domain is a dimensionless value interval with a minimum of 0 and a maximum of 10. A feature value whose original range is 0 to 100 can then be mapped into this interval: the feature value 0 corresponds to the degree value 0, the feature value 100 to the degree value 10, and the feature value 50 to the degree value 5. By mapping the feature values of different features into one unified metric domain, the degree of each feature can be expressed uniformly by the magnitude of its degree value. In an actual application scenario, the feature values may be normalized, so that they are uniformly mapped into a value interval with a minimum of 0 and a maximum of 1.
In the present embodiment, more than one feature value may be extracted from the state information. For example, two feature values, a tempo feature value and a sound intensity feature value, can be extracted from audio information. These feature values may all affect the same dimension of the interface data, but to different degrees. Based on this, corresponding weight values can be assigned to the different features for a specified dimension of the interface data. For example, for the color dimension, a weight value of 0.8 may be assigned to the tempo feature and a weight value of 0.2 to the sound intensity feature. In this way, after each feature value is converted into the specified metric domain to obtain a corresponding degree value, the degree values may be weighted and summed per specified dimension to obtain the dimension value. For example, for the color dimension, if the degree value of the tempo feature is 0.6 and the degree value of the sound intensity feature is 0.4, the weighted sum gives a color dimension value of 0.6 × 0.8 + 0.4 × 0.2 = 0.56. Thus, when the state information reflects at least two features, the corresponding dimension value can be obtained for each dimension by weighted summation, and the state characterizing parameter can finally be formed.
S15: formulating interface data according to the state characterizing parameter; wherein the interface data has a display element corresponding to the emotion.
In this embodiment, after the state characterizing parameter is obtained, interface data may be formulated according to it. The interface data has a display element corresponding to the emotion. Specifically, the display element corresponding to the emotion may be a display mode capable of reflecting that emotion. For example, if the emotion is happy, the display element may be an object that moves at a relatively high frequency, such as a small monkey jumping up and down. For another example, if the emotion is sadness, the display element may be a gloomy, drizzling rain. The display elements may be pre-stored in the client and may carry corresponding emotion tags, so that the appropriate display element can be determined from the emotion represented by the state characterizing parameter.
In this embodiment, the interface data may have a plurality of attributes. An attribute may refer to a visualization element that can be adjusted, for example the rhythm at which the interface data is played, the color presented when it is played, or the duration of its playback. Specifically, an attribute may be provided with an attribute value, which embodies the state in which the attribute should be presented, so that the display interface represented by the interface data presents a display effect corresponding to the attribute's value. For example, if the attribute value of the playing rhythm is 10, the playing rhythm of the animation in the display interface is fast. For another example, if the attribute value of the color is 1, the color system of the animation is relatively cold. In this embodiment, when the interface data is formulated according to the state characterizing parameter, the attribute values of the attributes may be specified according to the dimension values of the state characterizing parameter. Specifically, there may be a mapping relationship between dimension values and attribute values, so that the attribute value of an attribute can be obtained from the determined dimension value. For example, if the state characterizing parameter is (0.1, 0.6, 1), where the three dimension values correspond to the playing time, the color, and the playing rhythm of the interface data, then the three dimension values can be converted into the value ranges of the corresponding attribute values: 0.1 may be mapped to an attribute value of 3, indicating a playing time of 3 seconds; 0.6 may be mapped to an attribute value of 20, indicating that the color played is color system No. 20; and 1 may be mapped to an attribute value of 3, indicating that the playback rhythm is 3x speed.
In this embodiment, the interface displayed from the interface data may include animation, and the interface data may therefore include animation data. The animation data may be used to present a dynamic interface; for example, the dynamic interface may show a small figure running in from a distance and kicking a ball. Animation playback can thus have a duration, a color, and a deformation. The duration may be the length of time for which the animation data is played; the color may represent the overall tone of the animation data, or the color of at least a partial region within it; the deformation may represent the playback rhythm of the animation data. In this embodiment, the dimensions of the state characterizing parameter may correspond one-to-one to the duration, color, and deformation of the animation data. Thus the dimensions of the state characterizing parameter can be converted into attribute values, and the duration, color, and deformation can be controlled through those attribute values.
S17: and displaying the interface according to the display element.
In this embodiment, after obtaining the interface data, the client may display it as a corresponding interface. The interface can include a display element representing the user's emotion, and the display mode of that element can be controlled by the attribute values. Therefore, as the collected state information differs, the dynamic interface displayed on each user's client can differ, and a personalized interface can be presented to each user intelligently.
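Putting steps S11 through S17 together, a single on-device display cycle might look like the following sketch; collect() and render() are hypothetical stand-ins for the information acquisition and interface display systems, and the weights and scales are assumptions:

```python
def display_cycle(collect, render) -> None:
    """One pass through S11-S17 on the client."""
    raw = collect()                                  # S11: e.g. {"tempo": 6, "intensity": 9}
    degrees = {f: v / 10.0 for f, v in raw.items()}  # S13: map into the 0..1 metric domain
    weights = [                                      # one weight set per dimension (assumed)
        {"tempo": 0.5, "intensity": 0.5},            # showing time
        {"tempo": 0.8, "intensity": 0.2},            # color
        {"tempo": 0.3, "intensity": 0.7},            # showing rhythm
    ]
    param = tuple(sum(degrees[f] * w[f] for f in degrees) for w in weights)
    attrs = {                                        # S15: dimension -> attribute values
        "play_seconds": round(param[0] * 30),
        "color_no":     round(param[1] * 33.3),
        "speed":        round(param[2] * 3, 1),
    }
    render(attrs)                                    # S17: show the interface

display_cycle(lambda: {"tempo": 6, "intensity": 9}, print)
```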
Referring to FIG. 6, the present specification further provides a client, where the client includes a memory and a processor, the memory storing a computer program that, when executed by the processor, implements the following steps.
S11: collecting state information related to the client.
S13: generating a state characterizing parameter according to the state information; the state characterizing parameter is used for representing the emotion of the user of the client.
S15: formulating interface data according to the state characterizing parameter; wherein the interface data has a display element corresponding to the emotion.
S17: and displaying the interface according to the display element.
In this embodiment, the memory may include a physical device for storing information; typically, the information is digitized and then stored in a medium using an electrical, magnetic, or optical method. The memory according to this embodiment may further include: devices that store information using electrical energy, such as RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, tapes, core memories, bubble memories, and USB disks; and devices that store information optically, such as CDs or DVDs. Of course, the memory may also take other forms, such as quantum memory or graphene memory.
In this embodiment, the processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth.
The specific functions implemented by the client and by its processor and memory as provided in the embodiments of the present specification may be explained with reference to the foregoing embodiments of this specification.
The present specification also provides an interface display method, which can be applied to a client, and the method includes the following steps.
S21: collecting state information related to the client.
S23: generating a state characterizing parameter according to the state information; the state characterizing parameter is used for representing the emotion of the user of the client.
S25: sending the state characterizing parameter to a server, for the server to formulate interface data according to the state characterizing parameter; wherein the interface data has a display element corresponding to the emotion.
S27: and receiving interface data fed back by the server.
S29: and displaying the interface according to the display element.
In this embodiment, the client may collect the state information related to the client and then generate a state characterizing parameter from it. After obtaining the state characterizing parameter, the client may send it to the server, where interface data can be formulated according to the received parameter and fed back to the client. The client can then display the interface according to the display element.
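The client side of this exchange might look like the following sketch, assuming the server exposes an HTTP endpoint; the URL and JSON shape are illustrative assumptions, not part of the patent:

```python
import json
import urllib.request

def fetch_interface_data(state_param, url="http://example.com/interface"):
    """Send the state characterizing parameter to the server (S25) and
    receive the formulated interface data it feeds back (S27)."""
    body = json.dumps({"state_param": state_param}).encode("utf-8")
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# interface = fetch_interface_data([0.1, 0.6, 1.0])
# show(interface["display_element"], interface["attributes"])  # S29 (hypothetical show())
```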
In this embodiment, for more detailed explanation of each step, reference may be made to the description in the foregoing embodiment, and details are not repeated here.
The present specification also provides a client comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, performs the following steps.
S21: collecting state information related to the client.
S23: generating a state characterizing parameter according to the state information; the state characterizing parameter is used for representing the emotion of the user of the client.
S25: sending the state characterizing parameter to a server, for the server to formulate interface data according to the state characterizing parameter; wherein the interface data has a display element corresponding to the emotion.
S27: and receiving interface data fed back by the server.
S29: and displaying the interface according to the display element.
In this embodiment, the memory may include a physical device for storing information; typically, the information is digitized and then stored in a medium using an electrical, magnetic, or optical method. The memory according to this embodiment may further include: devices that store information using electrical energy, such as RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, tapes, core memories, bubble memories, and USB disks; and devices that store information optically, such as CDs or DVDs. Of course, the memory may also take other forms, such as quantum memory or graphene memory.
In this embodiment, the processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, or the form of logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth.
The specific functions implemented by the client and by its processor and memory in this embodiment may be understood with reference to the foregoing embodiments of the present specification.
The present specification further provides an interface display method. The method is applied to a client and includes the following steps.
S31: providing first state information related to the client to a server, so that the server generates a first state representation parameter according to the first state information and formulates first interface data according to the first state representation parameter; the first state representation parameter is used for representing the emotion of the user of the client; the first interface data has a display element corresponding to the emotion.
S33: receiving first interface data fed back by the server.
S35: displaying the interface according to the display elements in the first interface data.
In this embodiment, the client may collect first state information related to the client and provide it to the server. After receiving the first state information, the server may generate a first state representation parameter according to the first state information and may formulate first interface data according to that parameter. Once generated, the first interface data can be fed back to the client, and the client can display an interface according to the first interface data.
In this embodiment, the client may continuously collect the corresponding state information at a fixed time interval. After second state information is collected, the client may send it to the server in the manner described above, so that the server generates a second state representation parameter according to the second state information and formulates second interface data according to that parameter. In a practical application scenario, the state information of the client may change at any time. For example, the client may previously have played a melancholy song but currently plays a cheerful one. In that case, the second state information collected by the client differs from the first state information, so the second interface data differs from the first interface data. That is, as the client's state information changes, the content presented in the client's interface may change accordingly.
There is also provided in this specification a client comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, carries out the following steps.
S31: providing first state information related to the client to a server, so that the server generates a first state representation parameter according to the first state information and formulates first interface data according to the first state representation parameter; the first state representation parameter is used for representing the emotion of the user of the client; the first interface data has a display element corresponding to the emotion.
S33: receiving first interface data fed back by the server.
S35: displaying the interface according to the display elements in the first interface data.
In this embodiment, the memory may include a physical device for storing information; typically, information is digitized and then stored in a medium using an electrical, magnetic, or optical method. The memory according to this embodiment may include: devices that store information using electrical energy, such as RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic-core memories, magnetic-bubble memories, and USB flash drives; and devices that store information optically, such as CDs and DVDs. Of course, there are also other types of memory, such as quantum memory and graphene memory.
In this embodiment, the processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, or the form of logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth.
The specific functions implemented by the client and by its processor and memory in this embodiment may be understood with reference to the foregoing embodiments of the present specification.
Referring to fig. 7, the present specification further provides an interface data processing method, including the following steps.
S41: receiving the state information sent by the client.
S43: generating a state representation parameter according to the state information, wherein the state representation parameter is used for representing the emotion of the user of the client.
S45: formulating interface data according to the state representation parameter; wherein the interface data has display elements corresponding to the emotion.
S47: sending the interface data to the client.
In this embodiment, the client may collect the current state information and then send it to the server. After receiving the state information, the server can generate a state representation parameter according to the state information and formulate interface data according to that parameter. Once generated, the interface data can be fed back to the client, so that the client can display an interface according to the interface data.
In this embodiment, for more detailed explanation of each step, reference may be made to the description in the foregoing embodiment, and details are not repeated here.
The present specification also provides a server comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, carries out the following steps.
S41: receiving the state information sent by the client.
S43: generating a state representation parameter according to the state information, wherein the state representation parameter is used for representing the emotion of the user of the client.
S45: formulating interface data according to the state representation parameter; wherein the interface data has display elements corresponding to the emotion.
S47: sending the interface data to the client.
In this embodiment, the memory may include a physical device for storing information; typically, information is digitized and then stored in a medium using an electrical, magnetic, or optical method. The memory according to this embodiment may include: devices that store information using electrical energy, such as RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic-core memories, magnetic-bubble memories, and USB flash drives; and devices that store information optically, such as CDs and DVDs. Of course, there are also other types of memory, such as quantum memory and graphene memory.
In this embodiment, the processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, or the form of logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth.
The specific functions implemented by the server and by its processor and memory in this embodiment may be understood with reference to the foregoing embodiments of the present specification.
As can be seen from the technical solutions provided by the above embodiments, when an interface is to be presented, state information of the client may be collected; this state information can characterize the environment the user is currently in and/or the activity the user is currently engaged in. Corresponding state representation parameters can then be derived from the collected state information, and these parameters can represent the emotion of the user of the client. Interface data matching the user's current emotion can therefore be formulated according to the state representation parameters, and the current interface can be displayed according to the formulated interface data. Because the displayed interface results from an analysis of the user's current emotion, the content of the interface can match that emotion. Different personalized content can thus be displayed for different users, making the presentation of personalized content more intelligent.
The embodiments in the present specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others.
The server in the embodiments of the present specification may be an electronic device with a certain arithmetic processing capability, having a network communication terminal, a processor, a memory, and the like. The server may also refer to software running on such an electronic device. The server may further be a distributed server, that is, a system of multiple processors, memories, network communication modules, and the like that cooperate with one another.
In the 1990s, an improvement in a technology could clearly be distinguished as an improvement in hardware (for example, an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a method flow). As technology has developed, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures: designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be implemented with hardware entity modules. For example, a programmable logic device (PLD), such as a field-programmable gate array (FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device. Designers "integrate" a digital system onto a single PLD by their own programming, without needing a chip manufacturer to design and fabricate a dedicated integrated-circuit chip. Moreover, instead of manually making integrated-circuit chips, this kind of programming is now mostly implemented with "logic compiler" software, which is similar to the software compilers used in program development, except that the original code to be compiled must be written in a particular programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a logical method flow can easily be obtained merely by slightly logically programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
Those skilled in the art will also appreciate that, in addition to implementing a controller as pure computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Indeed, means for performing various functions may even be regarded both as software modules implementing the method and as structures within the hardware component.
From the above description of the embodiments, it is clear to those skilled in the art that the present specification can be implemented by software plus a necessary general-purpose hardware platform. Based on such an understanding, the technical solutions of the present specification may be embodied in the form of a software product. The software product may be stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments, or in parts of the embodiments, of the present specification.
While the present specification has been described with reference to embodiments, those skilled in the art will appreciate that numerous variations and permutations are possible without departing from the spirit of the specification, and it is intended that the appended claims cover such variations and modifications.
Claims (22)
1. An interface display method, applied to a client, the method comprising:
collecting state information related to the client;
generating state representation parameters according to the state information; the state representation parameters are used for representing the emotion of the user of the client;
formulating interface data according to the state representation parameters; wherein the interface data has display elements corresponding to the emotion;
displaying the interface according to the display element;
wherein collecting state information related to the client comprises: acquiring real-time data from an application program running on the client;
the step of generating the state representation parameters comprises:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameters.
2. The method of claim 1, wherein the state information comprises at least one of: played audio information, played video information, position information, weather information, and traffic information.
3. The method of claim 1, wherein the step of mapping the feature value to the specified dimension comprises:
converting the feature value into a specified metric domain to obtain a degree value of the feature value in the metric domain;
and performing weighted summation on the degree values according to different specified dimensions, respectively, to obtain the dimension values.
4. The method of claim 1, wherein the interface data has a plurality of attributes, and a corresponding display effect is displayed on a display interface represented by the interface data according to the values of the attributes;
the step of formulating the interface data comprises:
specifying the attribute values of the attributes of the interface data according to the dimension values of the state representation parameters.
5. The method of claim 1, wherein the interface data includes animation data for rendering a dynamic interface; the dimensions of the state representation parameters include duration, color, and deformation; wherein the duration is the length of time for which the animation data is played; the color represents the overall tone of the animation data or the color of at least a partial area of the animation data; and the deformation represents the playing rhythm of the animation data.
6. A client, comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, performs the steps of:
collecting state information related to the client;
generating state representation parameters according to the state information; the state representation parameters are used for representing the emotion of the user of the client;
formulating interface data according to the state representation parameters; wherein the interface data has display elements corresponding to the emotion;
displaying the interface according to the display element;
wherein collecting state information related to the client comprises: acquiring real-time data from an application program running on the client;
wherein generating the state representation parameters according to the state information includes:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameters.
7. An interface display method, applied to a client, the method comprising:
collecting state information related to the client;
generating state representation parameters according to the state information; the state representation parameters are used for representing the emotion of the user of the client;
sending the state representation parameters to a server for the server to formulate interface data according to the state representation parameters; wherein the interface data has display elements corresponding to the emotion;
receiving interface data fed back by the server;
displaying the interface according to the display element;
wherein collecting state information related to the client comprises: acquiring real-time data from an application program running on the client;
wherein generating the state representation parameters according to the state information includes:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameters.
8. The method of claim 7, wherein the state information comprises at least one of: played audio information, played video information, position information, weather information, and traffic information.
9. The method of claim 7, wherein the step of mapping the feature value to the specified dimension comprises:
converting the feature value into a specified metric domain to obtain a degree value of the feature value in the metric domain;
and performing weighted summation on the degree values according to different specified dimensions, respectively, to obtain the dimension values.
10. The method of claim 7, wherein the interface data has a plurality of attributes, and a corresponding display effect is displayed on a display interface represented by the interface data according to the values of the attributes.
11. The method of claim 7, wherein the interface data includes animation data for rendering a dynamic interface; the dimensions of the state representation parameters include duration, color, and deformation; wherein the duration is the length of time for which the animation data is played; the color represents the overall tone of the animation data or the color of at least a partial area of the animation data; and the deformation represents the playing rhythm of the animation data.
12. A client, comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, performs the steps of:
collecting state information related to the client;
generating state representation parameters according to the state information; the state representation parameters are used for representing the emotion of the user of the client;
sending the state representation parameters to a server for the server to formulate interface data according to the state representation parameters; wherein the interface data has a display element corresponding to the emotion;
receiving interface data fed back by the server;
displaying the interface according to the display element;
wherein collecting state information related to the client comprises: acquiring real-time data from an application program running on the client;
wherein generating the state representation parameters according to the state information includes:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameters.
13. An interface display method, applied to a client, the method comprising:
providing first state information related to the client to a server, so that the server generates a first state representation parameter according to the first state information and formulates first interface data according to the first state representation parameter; the first state representation parameter is used for representing the emotion of the user of the client; the first interface data has a display element corresponding to the emotion;
receiving first interface data fed back by the server;
displaying the interface according to display elements in the first interface data;
wherein the first state information is real-time data obtained from an application running on the client;
wherein generating the first state representation parameter according to the first state information includes:
extracting a feature value of a specified feature from the first state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the first state representation parameter.
14. The method of claim 13, further comprising:
sending second state information related to the client to the server, so that the server generates a second state representation parameter according to the second state information and formulates second interface data according to the second state representation parameter; the second state information is different from the first state information, and the second interface data is different from the first interface data.
15. The method of claim 13, wherein the first state information comprises at least one of: played audio information, played video information, position information, weather information, and traffic information.
16. The method of claim 13, wherein the first interface data includes animation data for rendering a dynamic interface; the dimensions of the first state representation parameter include duration, color, and deformation; wherein the duration is the length of time for which the animation data is played; the color represents the overall tone of the animation data or the color of at least a partial area of the animation data; and the deformation represents the playing rhythm of the animation data.
17. A client, comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, performs the steps of:
providing first state information related to the client to a server, so that the server generates a first state representation parameter according to the first state information and formulates first interface data according to the first state representation parameter; the first state representation parameter is used for representing the emotion of the user of the client; the first interface data has a display element corresponding to the emotion;
receiving first interface data fed back by the server;
displaying the interface according to display elements in the first interface data;
wherein the first state information is real-time data obtained from an application running on the client;
wherein generating the first state representation parameter according to the first state information includes:
extracting a feature value of a specified feature from the first state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the first state representation parameter.
18. An interface data processing method, characterized in that the method comprises:
receiving state information sent by a client; wherein the state information is real-time data obtained from an application running on the client;
generating a state representation parameter according to the state information, wherein the state representation parameter is used for representing the emotion of the user of the client;
formulating interface data according to the state representation parameter; wherein the interface data has display elements corresponding to the emotion;
sending the interface data to the client;
wherein generating the state representation parameter according to the state information includes:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameter.
19. The method of claim 18, wherein the state information comprises at least one of: played audio information, played video information, position information, weather information, and traffic information.
20. The method of claim 18, wherein the step of mapping the feature value to the specified dimension comprises:
converting the feature value into a specified metric domain to obtain a degree value of the feature value in the metric domain;
and performing weighted summation on the degree values according to different specified dimensions, respectively, to obtain the dimension values.
21. The method of claim 18, wherein the interface data includes animation data for rendering a dynamic interface; the dimensions of the state representation parameter include duration, color, and deformation; wherein the duration is the length of time for which the animation data is played; the color represents the overall tone of the animation data or the color of at least a partial area of the animation data; and the deformation represents the playing rhythm of the animation data.
22. A server, characterized in that the server comprises a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, performs the steps of:
receiving state information sent by a client; wherein the state information is real-time data obtained from an application running on the client;
generating a state representation parameter according to the state information, wherein the state representation parameter is used for representing the emotion of the user of the client;
formulating interface data according to the state representation parameter; wherein the interface data has a display element corresponding to the emotion;
sending the interface data to the client;
wherein generating the state representation parameter according to the state information includes:
extracting a feature value of a specified feature from the state information;
mapping the feature value to a specified dimension to obtain a dimension value; a plurality of the dimension values form the state representation parameter.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711136046.6A CN109800037B (en) | 2017-11-16 | 2017-11-16 | Interface display method, interface data processing method, client and server |
PCT/CN2018/114352 WO2019096047A1 (en) | 2017-11-16 | 2018-11-07 | Interface display method, interface data processing method, client and server |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711136046.6A CN109800037B (en) | 2017-11-16 | 2017-11-16 | Interface display method, interface data processing method, client and server |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109800037A CN109800037A (en) | 2019-05-24 |
CN109800037B (en) | 2022-11-11 |
Family
ID=66538497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711136046.6A Active CN109800037B (en) | 2017-11-16 | 2017-11-16 | Interface display method, interface data processing method, client and server |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109800037B (en) |
WO (1) | WO2019096047A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012058877A (en) * | 2010-09-07 | 2012-03-22 | Clarion Co Ltd | Play list creation device |
CN104461256A (en) * | 2014-12-30 | 2015-03-25 | 广州视源电子科技股份有限公司 | interface element display method and system |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7874983B2 (en) * | 2003-01-27 | 2011-01-25 | Motorola Mobility, Inc. | Determination of emotional and physiological states of a recipient of a communication |
CN101989295A (en) * | 2009-08-06 | 2011-03-23 | 冯俊 | Resource management and information publish system based on main interface of operating system (OS) |
CN101901595B (en) * | 2010-05-05 | 2014-10-29 | 北京中星微电子有限公司 | Method and system for generating animation according to audio music |
CN102436845A (en) * | 2011-11-28 | 2012-05-02 | 康佳集团股份有限公司 | Music player and method for processing twinkling of light emitting diode (LED) lamp of music player along with music |
CN102801657A (en) * | 2012-09-03 | 2012-11-28 | 鲁赤兵 | Composite microblog system and method |
CN106020618B (en) * | 2013-11-27 | 2019-06-28 | 青岛海信电器股份有限公司 | The interface creating method and device of terminal |
CN105099861A (en) * | 2014-05-19 | 2015-11-25 | 阿里巴巴集团控股有限公司 | User emotion-based display control method and display control device |
CN104486331A (en) * | 2014-12-11 | 2015-04-01 | 上海元趣信息技术有限公司 | Multimedia file processing method, client terminals and interaction system |
CN106569763B (en) * | 2016-10-19 | 2020-03-20 | 华为机器有限公司 | Image display method and terminal |
CN107066442A (en) * | 2017-02-15 | 2017-08-18 | 阿里巴巴集团控股有限公司 | Detection method, device and the electronic equipment of mood value |
Application events:
- 2017-11-16: application CN201711136046.6A filed in CN; published as CN109800037B (legal status: active)
- 2018-11-07: application PCT/CN2018/114352 filed; published as WO2019096047A1 (status: application filing)
Also Published As
Publication number | Publication date |
---|---|
CN109800037A (en) | 2019-05-24 |
WO2019096047A1 (en) | 2019-05-23 |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant