CN106155703B - Emotional state display method and device


Info

Publication number
CN106155703B
CN106155703B
Authority
CN
China
Prior art keywords
target
emotional state
information
state information
app
Prior art date
Legal status
Active
Application number
CN201610630277.1A
Other languages
Chinese (zh)
Other versions
CN106155703A (en)
Inventor
祁连山
王柯
张亮
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201610630277.1A
Publication of CN106155703A
Application granted
Publication of CN106155703B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/20 Software design
    • G06F 8/22 Procedural
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a method and an apparatus for displaying emotional states. The method includes: receiving target emotional state information of a user; and displaying the target emotional state information in a target application (App). With the present disclosure, in addition to the content displayable in the related art, the terminal can also display the user's emotional state in the target App, which makes the terminal more intelligent and improves the user experience.

Description

Emotional state display method and device
Technical Field
The present disclosure relates to the field of communications, and in particular, to a method and an apparatus for displaying emotional states.
Background
Currently, social applications (Apps) such as WeChat, QQ, and Michat are becoming increasingly common. In the related art, a user can publish various content through a social App, and other users can then interact with the user based on the published content.
However, the content that social Apps can currently display is limited, for example, to geographical locations and the like. That is, the content that social Apps can currently display cannot truly meet users' needs.
Disclosure of Invention
In view of the above, the present disclosure provides a method and an apparatus for displaying emotional states to address the deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided a method for displaying an emotional state, the method including:
receiving target emotional state information of a user;
and displaying the target emotional state information in a target application program (App).
Optionally, the receiving target emotional state information of the user includes:
outputting an emotional state option according to the received first instruction;
receiving a target emotional state option selected among the emotional state options;
and taking the emotional state information indicated by the target emotional state option as the target emotional state information.
Optionally, the receiving target emotional state information of the user includes:
outputting an emotional state adjustment axis according to the received second instruction;
when at least one first touch instruction for the emotional state adjustment axis is received, acquiring first target position information corresponding to the last touch instruction in the at least one first touch instruction;
and determining the target emotional state information corresponding to the first target position information according to a first corresponding relation between preset position information and emotional state information.
Optionally, after the outputting the emotional state adjustment axis, the method further comprises:
when at least one second touch instruction for the emotional state adjustment axis is received, acquiring second target position information corresponding to the last touch instruction in the at least one second touch instruction;
determining target mood degree information corresponding to the second target position information according to a second correspondence between preset position information and mood degree information;
and displaying the target mood degree information in the target App.
Optionally, the method further comprises:
receiving input target information, wherein the target information comprises at least one of video information, picture information and text information;
and when input of the target information is confirmed to be complete, triggering the step of receiving the target emotional state information of the user.
Optionally, the displaying the target emotional state information in the target application App includes:
and in the target App, displaying the text content indicated by the target emotional state information through text.
Optionally, the displaying the target emotional state information in the target application App includes:
determining a target preset expression corresponding to the target emotion state information according to a third corresponding relation between pre-stored emotion state information and preset expressions;
and in the target App, displaying the target emotional state information through the target preset expression.
Optionally, the displaying the target emotional state information in the target application App includes:
outputting a preset image in the target App, wherein the preset image is formed by a closed curve and can be internally filled;
determining a filling area in the preset image after receiving a filling instruction;
filling in the preset image according to the filling area;
and in the target App, displaying the target emotional state information through the filled preset image.
According to a second aspect of embodiments of the present disclosure, there is provided a display device of emotional states, the device comprising:
a first receiving module configured to receive target emotional state information of a user;
an emotional state display module configured to display the target emotional state information in a target application App.
Optionally, the first receiving module includes:
a first output sub-module configured to output an emotional state option according to the received first instruction;
a receiving sub-module configured to receive a target emotional state option selected from the emotional state options;
a first determination sub-module configured to take the emotional state information indicated by the target emotional state option as the target emotional state information.
Optionally, the first receiving module includes:
a second output submodule configured to output an emotional state adjustment axis according to the received second instruction;
the obtaining sub-module is configured to, when at least one first touch instruction for the emotional state adjustment axis is received, obtain first target position information corresponding to the last touch instruction in the at least one first touch instruction;
a second determining sub-module configured to determine the target emotional state information corresponding to the first target location information according to a first corresponding relationship between preset location information and emotional state information.
Optionally, the apparatus further comprises:
the obtaining module is configured to, when at least one second touch instruction for the emotional state adjustment axis is received, obtain second target position information corresponding to the last touch instruction in the at least one second touch instruction;
the determining module is configured to determine target mood degree information corresponding to the second target position information according to a second correspondence between preset position information and mood degree information;
a mood degree display module configured to display the target mood degree information in the target App.
Optionally, the apparatus further comprises:
a second receiving module configured to receive input target information, the target information including at least one of video information, picture information, and text information;
the triggering module is configured to trigger the first receiving module to receive the target emotional state information of the user when the target information is confirmed to be input completely.
Optionally, the emotional state display module includes:
a first display sub-module configured to display, in the target App, the text content indicated by the target emotional state information by text.
Optionally, the emotional state display module includes:
the expression determination submodule is configured to determine a target preset expression corresponding to the target emotion state information according to a third corresponding relation between the prestored emotion state information and the preset expression;
the second display sub-module is configured to display the target emotion state information through the target preset expression in the target App.
Optionally, the emotional state display module includes:
the image output sub-module is configured to output a preset image in the target App, wherein the preset image is an image which is formed by a closed curve and can be filled inside;
an area determination submodule configured to determine a filled area in the preset image upon receiving a filling instruction;
a filling sub-module configured to fill in the preset image according to the filling area;
a third display sub-module configured to display the target emotional state information through the filled preset image in the target App.
According to a third aspect of embodiments of the present disclosure, there is provided a display device of emotional states, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving target emotional state information of a user;
and displaying the target emotional state information in a target application program (App).
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the embodiments of the present disclosure, the terminal can automatically receive the target emotional state information of the user and then display it in the target App. Through this process, in addition to the content available in the related art, the terminal can also display the user's target emotional state information in the target App, which makes the terminal more intelligent and improves the user experience.
In the embodiments of the present disclosure, the terminal may output emotional state options and receive a target emotional state option selected by the user from among them, so that the emotional state information indicated by the target emotional state option is taken as the target emotional state information. Operation is simple and convenient for the user, and usability is high.
In the embodiments of the present disclosure, the terminal may optionally output the emotional state adjustment axis together with the emotional state options, or output the emotional state adjustment axis on its own. The user adjusts the emotional state adjustment axis to determine the target emotional state information. Operation is simple and convenient for the user, and usability is high.
In the embodiments of the present disclosure, the user's degree of mood may be represented by the emotional state adjustment axis. When the target emotional state information is displayed in the target App, the target mood degree information can be displayed as well, further increasing the displayable content and making the terminal more intelligent.
In the embodiments of the present disclosure, receiving the target emotional state information of the user can be triggered when input of the target information is determined to be complete, where the target information includes at least one of video information, picture information, and text information; this increases the displayable content and improves the user experience.
In the embodiments of the present disclosure, when the target emotional state information is displayed in the target App, it can optionally be displayed as text, that is, the displayed text content is the text content indicated by the target emotional state information; or through a preset expression, that is, by displaying the target preset expression corresponding to the target emotional state information; or through the filled area of a preset image, that is, by displaying the target emotional state information through the filled preset image. Through this process, the user's target emotional state information can be displayed in the target App, which adds displayable content, makes the terminal more intelligent, and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a method of displaying an emotional state shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 2 is a flow diagram of another method of displaying an emotional state shown in accordance with an example embodiment;
FIGS. 3A-3B are display scenario diagrams of emotional states shown in the present disclosure according to an example embodiment;
FIG. 4 is a flow diagram of another method of displaying an emotional state shown in accordance with an example embodiment;
FIGS. 5A-5D are display scene diagrams of emotional states shown in the present disclosure according to an example embodiment;
FIG. 6 is a flow diagram of another method of displaying an emotional state shown in accordance with an example embodiment;
FIG. 7 is a display scenario diagram illustrating another emotional state according to an example embodiment of the present disclosure;
FIG. 8 is a display scenario diagram illustrating another emotional state according to an example embodiment of the present disclosure;
FIG. 9 is a flow diagram of another method of displaying an emotional state shown in accordance with an example embodiment;
FIG. 10 is a display scenario diagram illustrating another emotional state according to an example embodiment of the present disclosure;
FIG. 11 is a flow diagram of another method of displaying an emotional state shown in accordance with an example embodiment;
FIG. 12 is a display scenario diagram illustrating another emotional state according to an example embodiment of the present disclosure;
FIG. 13 is a flow chart of another method of displaying an emotional state shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 14 is a block diagram of a display device showing an emotional state according to an example embodiment of the present disclosure;
FIG. 15 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 16 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 17 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 18 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 19 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 20 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
FIG. 21 is a block diagram of a display device showing another emotional state according to an example embodiment of the disclosure;
fig. 22 is a schematic structural diagram illustrating a display device for emotional states according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining," depending on the context.
The method for displaying emotional states provided by the embodiments of the present disclosure can be applied to terminals, such as smart phones, tablet computers, personal digital assistants (PDAs), and the like. As shown in FIG. 1, which is a flowchart of a method of displaying an emotional state according to an exemplary embodiment, the method includes the following steps:
in step 101, target emotional state information of a user is received.
Alternatively, the terminal may receive the target emotional state information in any one of the following manners.
First, targeted emotional state information is received via an emotional state option.
Accordingly, step 101 may be implemented as shown in FIG. 2, which illustrates another method of displaying an emotional state based on the foregoing embodiment of FIG. 1 and includes the following steps:
in step 101-1, an emotional state option is output based on the received first instruction.
In this step, the terminal may output a virtual key that corresponds to outputting the emotional state options. If the user triggers the virtual key, the terminal confirms that the first instruction is received and may then output a plurality of preset emotional state options, for example in a drop-down box, as shown in FIG. 3A.
The emotional state options include, but are not limited to, options such as happy, thrilled, excited, angry, sad, and bored.
In step 101-2, a target emotional state option selected from the emotional state options is received.
In this step, after the user selects a target emotional state option from among the emotional state options, the terminal may obtain the information associated with that option. The target emotional state option may also be selected by voice: for example, the user may turn on a sound collection device on the terminal, and if the voice input is "happy," the target emotional state option received by the terminal is happy.
Alternatively, the target emotional state option may be determined by the terminal through analysis of the environment. For example, the terminal collects image information or video information of the surrounding environment through an image collection device; if the collected image or video shows several users having a meal together, the target emotional state option can be determined to be happy.
Alternatively, the terminal may directly determine a designated target emotional state option from the output emotional state options. For example, if the emotional state options include happy and angry, and the user has preset the designated option for the current time period to be angry, the terminal directly determines that the target emotional state option is angry.
In step 101-3, the emotional state information indicated by the target emotional state option is taken as the target emotional state information.
In this step, the terminal directly uses the emotional state information indicated by the target emotional state option as the target emotional state information.
For example, if the user selects the target emotional state option shown in FIG. 3B from the emotional state options, the emotional state information indicated by that option is happy state information, and the target emotional state information is therefore happy state information.
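The option-based flow of steps 101-1 to 101-3 can be pictured with a short sketch. The following Kotlin fragment is a minimal, hypothetical illustration only: the enum values, class name, and callback are invented for this example and are not prescribed by the disclosure.

```kotlin
// Minimal sketch of Mode 1 (steps 101-1 to 101-3). All identifiers here are
// illustrative assumptions; the disclosure does not prescribe concrete APIs.
enum class EmotionalState { HAPPY, THRILLED, EXCITED, ANGRY, SAD, BORED }

class EmotionOptionPicker(private val onTargetState: (EmotionalState) -> Unit) {

    // Step 101-1: on the first instruction (e.g. the user triggers the
    // virtual key), output the preset emotional state options.
    fun onFirstInstruction(): List<EmotionalState> = EmotionalState.values().toList()

    // Steps 101-2 and 101-3: the selected target option directly indicates
    // the target emotional state information, whether it was chosen by touch,
    // by voice input, or by the terminal's analysis of the environment.
    fun onOptionSelected(option: EmotionalState) = onTargetState(option)
}
```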
In a second mode, target emotional state information is received through an emotional state adjustment axis.
Accordingly, step 101 may be implemented as shown in FIG. 4, which illustrates another method of displaying an emotional state based on the foregoing embodiment of FIG. 1 and includes the following steps:
in step 101-1', an emotional state adjustment axis is output according to the received second instruction.
In this step, the terminal may output another virtual key, which corresponds to outputting the emotional state adjustment axis. If the user triggers this virtual key, the terminal confirms that the second instruction is received; the terminal may then generate the emotional state adjustment axis according to the related art and output it. As shown in FIG. 5A, the corresponding emotional states may change gradually from angry to excited, in left-to-right order.
The second instruction may be the same as the first instruction, that is, after the user triggers the same virtual key, the terminal confirms that the first instruction and the second instruction are received, and at this time, the emotional state option and the emotional state adjustment axis are synchronously output, as shown in fig. 5B.
The second instruction may also be different from the first instruction; that is, after the user triggers a virtual key, the terminal confirms that the first instruction is received and outputs the emotional state options. The user may then trigger the same virtual key again or trigger another virtual key, whereupon the terminal confirms that the second instruction is received and outputs the emotional state adjustment axis.
In step 101-2', when at least one first touch instruction for the emotional state adjustment axis is received, first target position information corresponding to a last touch instruction in the at least one first touch instruction is obtained.
In this step, the user may determine the emotional state of the user by dragging the emotional state adjustment axis, and the terminal may determine the first target position information corresponding to the last touch instruction when detecting that at least one first touch instruction exists on the emotional state adjustment axis.
In step 101-3', the target emotional state information corresponding to the first target location information is determined according to a first corresponding relationship between preset location information and emotional state information.
In this step, different positions on the emotional state adjustment axis correspond to different emotional states; that is, the first correspondence between position information and emotional state information is preset, so that the terminal can determine the target emotional state information corresponding to the first target position information.
For example, the emotional state adjustment axis after being dragged by the user is shown in fig. 5C, and the target emotional state information corresponding to the target position information is happy state information.
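The first correspondence between position information and emotional state information can be pictured as a lookup over preset ranges along the axis. The sketch below reuses the EmotionalState enum from the earlier fragment; the normalization of positions to [0, 1] and the specific range boundaries are assumptions made for illustration, following the angry-to-excited ordering of FIG. 5A.

```kotlin
// Hypothetical first correspondence (step 101-3'): axis positions, normalized
// to [0, 1], are bucketed into preset emotional states. The boundaries below
// are invented for illustration.
val firstCorrespondence: List<Pair<ClosedFloatingPointRange<Double>, EmotionalState>> = listOf(
    0.00..0.25 to EmotionalState.ANGRY,
    0.25..0.50 to EmotionalState.SAD,
    0.50..0.75 to EmotionalState.HAPPY,
    0.75..1.00 to EmotionalState.EXCITED,
)

// Step 101-2' yields the first target position information (the position of
// the last touch); step 101-3' resolves it to the target emotional state.
fun stateForPosition(position: Double): EmotionalState {
    val p = position.coerceIn(0.0, 1.0)
    return firstCorrespondence.first { (range, _) -> p in range }.second
}
```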
In another embodiment, the emotional state option may include a plurality of emotional options, such as a happy option, an excited option, a relaxed option, an angry option, a fear option, and the like, which are not limited by the embodiments of the disclosure. The user may indicate the emotional state of the user by selecting one or more of the emotional state options and displaying the proportion of each emotion through the emotional state adjustment axis, for example, as shown in fig. 5D.
To further increase the content that the terminal can display in the target App, FIG. 6 illustrates another method of displaying an emotional state based on the foregoing embodiment of FIG. 4; after the emotional state adjustment axis is output, the method further includes the following steps:
in step 102', when at least one second touch instruction for the emotional state adjustment axis is received, second target position information corresponding to a last touch instruction in the at least one second touch instruction is obtained.
In this step, the terminal may determine the second target position information corresponding to the last touch instruction when detecting that at least one second touch instruction exists on the emotional state adjustment axis.
In step 103', target mood degree information corresponding to the second target position information is determined according to a second correspondence between preset position information and mood degree information.
In this step, different positions on the emotional state adjustment axis may correspond to different emotional states and different mood degrees; that is, the second correspondence between position information and mood degree information is preset, so that the terminal may determine the target mood degree information corresponding to the second target position information according to the related art.
The mood degrees may include high, medium, low, and the like, as shown, for example, in FIG. 7.
In step 104', the target mood degree information is displayed in the target App.
In this step, the terminal may display the target mood degree information directly as text, for example, displaying that the current mood is good. In another embodiment, building on the foregoing embodiments, the output emotional state adjustment axis may be used to adjust the mood degree, while the emotional state option indicates which emotion it is. For example, if the mood degree is high and the emotional state is sad, this indicates that the user is very sad.
Alternatively, only the emotional state adjustment axis may be output and used solely to indicate the mood degree, with the user's emotion category obtained automatically from the content the user publishes; for example, if the user publishes text content such as "in a bad mood today," the emotion category obtained may be sad. The emotion category may also be selected automatically according to whether the current date is a holiday; for example, if the current date is a national holiday, the emotion category may be obtained as happy.
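A sketch of the second correspondence of step 103', mapping the same axis position to a mood degree; the three-way split below is an assumption chosen to match the high, medium, and low degrees mentioned above.

```kotlin
// Hypothetical second correspondence (step 103'): the last touch position on
// the adjustment axis, normalized to [0, 1], maps to a mood degree. The
// thresholds are illustrative only.
enum class MoodDegree { LOW, MEDIUM, HIGH }

fun moodDegreeForPosition(position: Double): MoodDegree = when {
    position < 1.0 / 3 -> MoodDegree.LOW
    position < 2.0 / 3 -> MoodDegree.MEDIUM
    else -> MoodDegree.HIGH
}
```

In the combined variant described above, the axis would supply the degree while the emotion category comes from the selected option, the published content, or the date.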
In step 102, the target emotional state information is displayed in the target application App.
In the embodiment of the disclosure, the target App may be a social App, such as WeChat, QQ, Michat, and the like.
Optionally, the terminal may display the target emotional state information in the target App in any one of the following manners.
In the first mode, the target emotional state information is displayed through characters.
Accordingly, the terminal may directly display the text content indicated by the target emotional state information in the target App according to the related art, for example, as shown in fig. 8.
And in the second mode, the target emotion state information is displayed through a preset expression.
Accordingly, step 102 may be implemented as shown in FIG. 9, which illustrates another method of displaying an emotional state based on the embodiment shown in FIG. 4 and includes the following steps:
in step 102-1, a target preset expression corresponding to the target emotional state information is determined according to a third corresponding relationship between the pre-stored emotional state information and the preset expression.
In the embodiment of the present disclosure, the terminal has prestored the third corresponding relationship, for example, as shown in table 1.
TABLE 1
(Table 1 appears as an image in the original document; it maps pre-stored emotional state information, e.g., very happy, to preset expressions, e.g., a laughing expression.)
The terminal may determine a target preset expression corresponding to the target emotional state information in table 1 according to the third correspondence.
In step 102-2, in the target App, the target emotional state information is displayed through the target preset expression.
In this step, the terminal may directly display the target preset expression in the target App according to a related technology.
For example, if the target emotional state information is very happy, the corresponding target preset expression is a laughter expression, and the terminal directly displays the laughter expression in the target App as shown in fig. 10.
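The third correspondence of step 102-1 behaves like a pre-stored lookup table in the spirit of Table 1. In the sketch below the map contents are placeholders, since the actual table is not reproduced in this text.

```kotlin
// Hypothetical third correspondence (step 102-1): pre-stored emotional state
// information mapped to preset expressions. Entries are placeholders standing
// in for Table 1.
val thirdCorrespondence: Map<EmotionalState, String> = mapOf(
    EmotionalState.HAPPY to "laughing expression",
    EmotionalState.SAD to "crying expression",
    EmotionalState.ANGRY to "angry expression",
)

// Step 102-2: the target preset expression is then displayed in the target App.
fun expressionFor(state: EmotionalState): String? = thirdCorrespondence[state]
```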
And in a third mode, target emotional state information is displayed through the filling area of the preset image.
Accordingly, step 102 may be implemented as shown in FIG. 11, which illustrates another method of displaying an emotional state based on the embodiment shown in FIG. 4 and includes the following steps:
in step 102-1', in the target App, a preset image is output, where the preset image is an image formed by a closed curve and can be filled inside.
In this step, the terminal may output the preset image, and the preset image may be a closed circle, a triangle, a trapezoid, or the like. The preset image may represent a preset mood, such as happy.
In step 102-2', upon receiving a fill instruction, a fill area in the preset image is determined.
In this step, as the user slides within the preset image, the terminal receives at least one third touch instruction, and this at least one third touch instruction constitutes the fill instruction. The terminal may determine the filling area according to the fill instruction. For example, a filling area of 15% of the image area of the preset image indicates that the user is not very happy.
In step 102-3', the preset image is filled according to the filling area.
In this step, the terminal may fill the preset image according to the previously determined filling area, for example, as shown in fig. 12.
In step 102-4', in the target App, the target emotional state information is displayed through the filled preset image.
In this step, the terminal directly displays the filled preset image in the target App.
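Mode 3 amounts to turning the filled fraction of the closed shape into an intensity to display. Below is a minimal sketch, assuming the terminal can measure the filled region in pixels; the types and the 15% example follow the description above and are otherwise invented.

```kotlin
// Hypothetical sketch of Mode 3 (steps 102-1' to 102-4'): the fill instruction
// determines a filled region inside the closed preset image; the filled
// fraction of the image area is what gets displayed.
data class PresetImage(val totalAreaPx: Int)

fun fillRatio(image: PresetImage, filledAreaPx: Int): Double =
    filledAreaPx.toDouble() / image.totalAreaPx

// Example: a 15% fill of a "happy" shape, as in the description above.
// fillRatio(PresetImage(totalAreaPx = 10_000), filledAreaPx = 1_500) == 0.15
```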
In the above embodiments, the terminal may automatically receive the user's target emotional state information and then display it in the target App. Through this process, in addition to the content available in the related art, the terminal can also display the target emotional state information in the target App, which makes the terminal more intelligent and improves the user experience.
In another embodiment, as shown in FIG. 13, which illustrates another method of displaying an emotional state based on the foregoing embodiments of FIG. 1 or FIG. 2, the method further includes the following steps:
in step 100-1, input target information is received, the target information including at least one of video information, picture information, and text information.
In this step, the user may take a video or select a pre-stored video, take or select a picture, input a text, and the like, and the terminal receives the input information.
In step 100-2, when the target information input is confirmed to be completed, step 101 is triggered.
In this step, the terminal may trigger execution of step 101 when the terminal confirms that the target information is input completely, for example, when the video shooting or selection is completed, the picture shooting or selection is completed, or when a text input completion instruction is received.
Through this process, receiving the target emotional state information can be triggered once the target information to be output in the target App is confirmed complete, so that the user's target emotional state information is displayed together with the target information, improving the user experience.
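The trigger relationship between steps 100-1/100-2 and step 101 can be sketched as follows; the TargetInfo types and the completion callback are assumptions for illustration only.

```kotlin
// Hypothetical sketch of steps 100-1 and 100-2: target information (video,
// picture, or text) is received, and confirming that its input is complete
// triggers step 101 (receiving the target emotional state information).
sealed interface TargetInfo
data class VideoInfo(val uri: String) : TargetInfo
data class PictureInfo(val uri: String) : TargetInfo
data class TextInfo(val text: String) : TargetInfo

class PostComposer(private val triggerStep101: () -> Unit) {
    private val received = mutableListOf<TargetInfo>()

    // Step 100-1: receive input target information.
    fun receive(info: TargetInfo) { received += info }

    // Step 100-2: called when shooting/selection finishes or a text input
    // completion instruction arrives; triggers receiving the emotional state.
    fun confirmInputComplete() {
        if (received.isNotEmpty()) triggerStep101()
    }
}
```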
Corresponding to the foregoing method embodiments, the present disclosure also provides embodiments of an apparatus.
As shown in fig. 14, fig. 14 is a block diagram of a display device of an emotional state shown in the present disclosure according to an exemplary embodiment, including:
a first receiving module 210 configured to receive target emotional state information of a user;
an emotional state display module 220 configured to display the target emotional state information in the target application App.
As shown in fig. 15, fig. 15 is a block diagram of a display device of another emotional state shown in the present disclosure according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 14, where the first receiving module 210 includes:
a first output sub-module 211 configured to output an emotional state option according to the received first instruction;
a receiving sub-module 212 configured to receive a target emotional state option selected from the emotional state options;
a first determination submodule 213 configured to take the emotional state information indicated by the target emotional state option as the target emotional state information.
As shown in fig. 16, fig. 16 is a block diagram of a display device of another emotional state shown in the present disclosure according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 14 or fig. 15, where the first receiving module 210 includes:
a second output submodule 214 configured to output an emotional state adjustment axis according to the received second instruction;
the obtaining submodule 215 is configured to, when at least one first touch instruction for the emotional state adjustment axis is received, obtain first target position information corresponding to the last touch instruction in the at least one first touch instruction;
a second determining sub-module 216 configured to determine the target emotional state information corresponding to the first target location information according to a first corresponding relationship between preset location information and emotional state information.
As shown in fig. 17, fig. 17 is a block diagram of another display device of emotional states according to an exemplary embodiment of the disclosure, which is based on the foregoing embodiment shown in fig. 15, and the device further includes:
the obtaining module 230 is configured to, when at least one second touch instruction for the emotional state adjustment axis is received, obtain second target position information corresponding to a last touch instruction in the at least one second touch instruction;
a determining module 240 configured to determine target mood degree information corresponding to the second target position information according to a second correspondence between preset position information and mood degree information;
a mood degree display module 250 configured to display the target mood degree information in the target App.
As shown in fig. 18, fig. 18 is a block diagram of another display device of emotional states according to an exemplary embodiment of the present disclosure, which is based on the foregoing embodiment shown in fig. 14 or fig. 15, and the device further includes:
a second receiving module 260 configured to receive input target information, the target information including at least one of video information, picture information, and text information;
a triggering module 270 configured to trigger the first receiving module 210 to receive the target emotional state information of the user when the target information input is confirmed to be completed.
As shown in fig. 19, fig. 19 is a block diagram of another emotional state display device shown in the present disclosure according to an exemplary embodiment, based on the foregoing embodiment shown in fig. 16, where the emotional state display module 220 includes:
a first display sub-module 221 configured to display, in the target App, text content indicated by the target emotional state information by text.
As shown in fig. 20, fig. 20 is a block diagram of another emotional state display device according to an exemplary embodiment of the present disclosure, where on the basis of the foregoing embodiment shown in fig. 16, the emotional state display module 220 includes:
an expression determination submodule 222 configured to determine a target preset expression corresponding to the target emotional state information according to a third correspondence between the pre-stored emotional state information and the preset expressions;
a second display sub-module 223 configured to display the target emotional state information through the target preset expression in the target App.
As shown in fig. 21, fig. 21 is a block diagram of another emotional state display device shown in the present disclosure according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 16, where the emotional state display module 220 includes:
the image output sub-module 224 is configured to output a preset image in the target App, wherein the preset image is an image which is formed by a closed curve and can be filled inside;
an area determination submodule 225 configured to determine a filled area in the preset image upon receiving a filling instruction;
a filling sub-module 226 configured to fill in the preset image according to the filling area;
a third display sub-module 227 configured to display the target emotional state information through the filled preset image in the target App.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure further provides an apparatus for displaying emotional states, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving target emotional state information of a user;
and displaying the target emotional state information in a target application program (App).
FIG. 22 is a schematic diagram of an apparatus for displaying emotional states according to an example embodiment. As shown in FIG. 22, the apparatus 2200 may be a computer, a mobile phone, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 22, the apparatus 2200 may include one or more of the following components: processing components 2201, memory 2202, power components 2203, multimedia components 2204, audio components 2205, input/output (I/O) interfaces 2206, sensor components 2207, and communication components 2208.
The processing component 2201 generally controls the overall operation of the apparatus 2200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 2201 may include one or more processors 2209 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 2201 may include one or more modules that facilitate interaction between the processing component 2201 and other components. For example, the processing component 2201 can include a multimedia module to facilitate interaction between the multimedia component 2204 and the processing component 2201.
Memory 2202 is configured to store various types of data to support operations at device 2200. Examples of such data include instructions for any application or method operating on device 2200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 2202 may be implemented by any type or combination of volatile or non-volatile storage devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 2203 provides power to the various components of the device 2200. The power components 2203 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 2200.
The multimedia component 2204 comprises a screen providing an output interface between the device 2200 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 2204 comprises a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 2200 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 2205 is configured to output and/or input audio signals. For example, the audio component 2205 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 2200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 2202 or transmitted via the communication component 2208. In some embodiments, the audio component 2205 further comprises a speaker for outputting audio signals.
The I/O interface 2206 provides an interface between the processing component 2201 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 2207 includes one or more sensors for providing status assessments of various aspects of the device 2200. For example, the sensor component 2207 can detect an open/closed state of the apparatus 2200 and the relative positioning of components, such as the display and keypad of the apparatus 2200. The sensor component 2207 can also detect a change in position of the apparatus 2200 or of a component of the apparatus 2200, the presence or absence of user contact with the apparatus 2200, the orientation or acceleration/deceleration of the apparatus 2200, and a change in temperature of the apparatus 2200. The sensor assembly 2207 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 2207 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 2207 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 2208 is configured to facilitate wired or wireless communication between the apparatus 2200 and other devices. The apparatus 2200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 2208 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 2208 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 2200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 2202 comprising instructions, executable by the processor 2209 of the apparatus 2200 to perform the method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Wherein the instructions in the storage medium, when executed by the processor, enable apparatus 2200 to perform a method of displaying an emotional state, comprising:
receiving target emotional state information of a user;
and displaying the target emotional state information in a target application program (App).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (9)

1. A method of displaying an emotional state, the method comprising:
receiving target emotional state information of a user;
displaying the target emotional state information in a target application App; the target App is a social class App; the receiving of the target emotional state information of the user comprises:
outputting an emotional state adjustment axis according to the received second instruction; the second instruction is the same as or different from the first instruction for outputting the emotional state option;
when at least one first touch instruction for the emotional state adjustment axis is received, acquiring first target position information corresponding to the last touch instruction in the at least one first touch instruction;
determining the target emotional state information corresponding to the first target position information according to a first corresponding relation between preset position information and emotional state information;
further comprising:
receiving input target information, wherein the target information comprises at least one of video information, picture information and text information;
when input of the target information is confirmed to be complete, triggering the receiving of the target emotional state information of the user;
after the outputting of the emotional state adjustment axis, the method further comprises:
when at least one second touch instruction for the emotional state adjustment axis is received, acquiring second target position information corresponding to the last touch instruction in the at least one second touch instruction;
determining target mood degree information corresponding to the second target position information according to a second correspondence between preset position information and mood degree information;
displaying the target mood degree information in the target App;
the displaying the target emotional state information in a target application App includes:
outputting a preset image in the target App, wherein the preset image is formed by a closed curve and can be internally filled;
determining a filling area in the preset image after receiving a filling instruction;
filling in the preset image according to the filling area;
and in the target App, displaying the target emotional state information through the filled preset image.
2. The method of claim 1, wherein receiving the target emotional state information of the user comprises:
outputting an emotional state option according to the received first instruction;
receiving a target emotional state option selected among the emotional state options;
and taking the emotional state information indicated by the target emotional state option as the target emotional state information.
3. The method according to claim 1, wherein the displaying the target emotional state information in a target application (App) comprises:
and in the target App, displaying the text content indicated by the target emotional state information through text.
4. The method according to claim 1, wherein the displaying the target emotional state information in a target application (App) comprises:
determining a target preset expression corresponding to the target emotion state information according to a third corresponding relation between pre-stored emotion state information and preset expressions;
and in the target App, displaying the target emotional state information through the target preset expression.
5. An emotional state display device, the device comprising:
a first receiving module configured to receive target emotional state information of a user;
an emotional state display module configured to display the target emotional state information in a target application App; the target App is a social class App;
the first receiving module includes:
a second output submodule configured to output an emotional state adjustment axis according to the received second instruction; the second instruction is the same as or different from the first instruction for outputting the emotional state option;
the obtaining sub-module is configured to, when at least one first touch instruction for the emotional state adjustment axis is received, obtain first target position information corresponding to the last touch instruction in the at least one first touch instruction;
a second determining sub-module configured to determine the target emotional state information corresponding to the first target location information according to a first corresponding relationship between preset location information and emotional state information;
the device further comprises:
a second receiving module configured to receive input target information, the target information including at least one of video information, picture information, and text information;
the triggering module is configured to trigger the first receiving module to receive the target emotional state information of the user when the target information is confirmed to be input completely;
the device further comprises:
the obtaining module is configured to, when at least one second touch instruction for the emotional state adjustment axis is received, obtain second target position information corresponding to the last touch instruction in the at least one second touch instruction;
the determining module is configured to determine target mood degree information corresponding to the second target position information according to a second correspondence between preset position information and mood degree information;
a mood degree display module configured to display the target mood degree information in the target App;
the emotional state display module includes:
the image output sub-module is configured to output a preset image in the target App, wherein the preset image is an image which is formed by a closed curve and can be filled inside;
an area determination submodule configured to determine a filled area in the preset image upon receiving a filling instruction;
a filling sub-module configured to fill in the preset image according to the filling area;
a third display sub-module configured to display the target emotional state information through the filled preset image in the target App.
6. The apparatus of claim 5, wherein the first receiving module comprises:
a first output sub-module configured to output emotional state options according to a received first instruction;
a receiving sub-module configured to receive a target emotional state option selected from the emotional state options;
a first determination sub-module configured to take the emotional state information indicated by the target emotional state option as the target emotional state information.
7. The apparatus of claim 5, wherein the emotional state display module comprises:
a first display sub-module configured to display, in the target App, the text content indicated by the target emotional state information as text.
8. The apparatus of claim 5, wherein the emotional state display module comprises:
an expression determination sub-module configured to determine a target preset expression corresponding to the target emotional state information according to a third corresponding relation between pre-stored emotional state information and preset expressions;
a second display sub-module configured to display, in the target App, the target emotional state information through the target preset expression.
9. A display device of emotional states, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving target emotional state information of a user;
displaying the target emotional state information in a target application (App), wherein the target App is a social App;
the receiving of the target emotional state information of the user comprises:
outputting an emotional state adjustment axis according to a received second instruction, wherein the second instruction is the same as, or different from, the first instruction for outputting emotional state options;
when at least one first touch instruction for the emotional state adjustment axis is received, acquiring first target position information corresponding to the last touch instruction in the at least one first touch instruction;
determining the target emotional state information corresponding to the first target position information according to a first corresponding relation between preset position information and emotional state information;
the processor is further configured to:
receiving input target information, wherein the target information comprises at least one of video information, picture information and text information;
when it is confirmed that input of the target information is complete, triggering execution of the receiving of the target emotional state information of the user;
after the outputting an emotional state adjustment axis, the processor is further configured to:
when at least one second touch instruction for the emotional state adjustment axis is received, acquiring second target position information corresponding to the last touch instruction in the at least one second touch instruction;
determining target mood degree information corresponding to the second target position information according to a second corresponding relation between preset position information and mood degree information;
displaying the target mood degree information in the target App;
the displaying the target emotional state information in a target application (App) comprises:
outputting a preset image in the target App, wherein the preset image is an image bounded by a closed curve whose interior can be filled;
determining a filling area in the preset image after receiving a filling instruction;
filling the preset image according to the filling area;
and in the target App, displaying the target emotional state information through the filled preset image.
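A minimal Kotlin sketch of the filled-image display in claims 5 and 9, assuming the preset image is a circle (a closed curve) on a character grid filled from the bottom up; the shape, grid size, and fill rule are assumptions made only for this sketch:

const val SIZE = 21 // side length of the character grid

// The closed curve: a circle centred on the grid.
fun insideCurve(x: Int, y: Int): Boolean {
    val c = SIZE / 2
    val r = SIZE / 2 - 1
    val dx = x - c
    val dy = y - c
    return dx * dx + dy * dy <= r * r
}

// Render the preset image with the lowest fillFraction of its rows filled,
// so the filled area conveys the target emotional state.
fun render(fillFraction: Double): String = buildString {
    val firstFilledRow = (SIZE * (1 - fillFraction)).toInt()
    for (y in 0 until SIZE) {
        for (x in 0 until SIZE) {
            append(
                when {
                    !insideCurve(x, y) -> ' '   // outside the closed curve
                    y >= firstFilledRow -> '#'  // filled interior
                    else -> '.'                 // unfilled interior
                }
            )
        }
        append('\n')
    }
}

fun main() = print(render(fillFraction = 0.4)) // a 40% filled preset image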
CN201610630277.1A 2016-08-03 2016-08-03 Emotional state display method and device Active CN106155703B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610630277.1A CN106155703B (en) 2016-08-03 2016-08-03 Emotional state display method and device

Publications (2)

Publication Number Publication Date
CN106155703A (en) 2016-11-23
CN106155703B (en) 2021-03-16

Family

ID=57329059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610630277.1A Active CN106155703B (en) 2016-08-03 2016-08-03 Emotional state display method and device

Country Status (1)

Country Link
CN (1) CN106155703B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107480614A * 2017-07-31 2017-12-15 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Motion management method, apparatus and terminal device
CN111970402A * 2019-05-20 2020-11-20 Beijing ByteDance Network Technology Co., Ltd. Information processing method and device and terminal equipment
CN110855554B * 2019-11-08 2021-07-13 Tencent Technology (Shenzhen) Co., Ltd. Content aggregation method and device, computer equipment and storage medium
CN113079242A * 2021-03-29 2021-07-06 Shenzhen Aiku Communication Software Co., Ltd. User state setting method and electronic equipment
CN113572893B * 2021-07-13 2023-03-14 Qingdao Hisense Mobile Communications Technology Co., Ltd. Terminal device, emotion feedback method and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101146056A * 2007-09-24 2008-03-19 Tencent Technology (Shenzhen) Co., Ltd. A display method and system for emotion icons
US20120136219A1 (en) * 2010-11-30 2012-05-31 International Business Machines Corporation Emotion script generating, experiencing, and emotion interaction
US20130154980A1 (en) * 2011-12-20 2013-06-20 Iconicast, LLC Method and system for emotion tracking, tagging, and rating and communication
CN104239515A * 2014-09-16 2014-12-24 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Mood information implementation method and system
CN105138222A * 2015-08-26 2015-12-09 Zhangying Information Technology (USA) Co., Ltd. Method for selecting expression icon and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080214903A1 (en) * 2005-02-22 2008-09-04 Tuvi Orbach Methods and Systems for Physiological and Psycho-Physiological Monitoring and Uses Thereof
WO2011153318A2 (en) * 2010-06-02 2011-12-08 Q-Tec Systems Llc Method and apparatus for monitoring emotion in an interactive network
CN103826160A * 2014-01-09 2014-05-28 Guangzhou Samsung Communication Technology Research Co., Ltd. Method and device for obtaining video information, and method and device for playing video
CN104786911A * 2015-04-15 2015-07-22 Shanghai Dianji University Vehicle-mounted expression display system and method based on voice control

Also Published As

Publication number Publication date
CN106155703A (en) 2016-11-23

Similar Documents

Publication Publication Date Title
EP3454192B1 (en) Method and device for displaying page
US10152207B2 (en) Method and device for changing emoticons in a chat interface
US11086482B2 (en) Method and device for displaying history pages in application program and computer-readable medium
CN107908351B (en) Application interface display method and device and storage medium
US20180046336A1 (en) Instant Message Processing Method and Apparatus, and Storage Medium
EP3035738A1 (en) Method for connecting appliance to network and device for the same
CN106155703B (en) Emotional state display method and device
US20160352891A1 (en) Methods and devices for sending virtual information card
US20170123644A1 (en) Interface display method and device
EP3333690A2 (en) Object starting method and device
US9661132B2 (en) Method, apparatus, and storage medium for displaying a conversation interface
CN107992257B (en) Screen splitting method and device
US20190235745A1 (en) Method and device for displaying descriptive information
CN106354504B (en) Message display method and device
CN109324846B (en) Application display method and device and storage medium
US10042328B2 (en) Alarm setting method and apparatus, and storage medium
EP3015965A1 (en) Method and apparatus for prompting device connection
EP3147802A1 (en) Method and apparatus for processing information
US20220391446A1 (en) Method and device for data sharing
US20180365038A1 (en) Display method and device of application interface
CN104850643B (en) Picture comparison method and device
CN112051949A (en) Content sharing method and device and electronic equipment
CN106775210B (en) Wallpaper changing method and device
CN106919302B (en) Operation control method and device of mobile terminal
CN106447747B (en) Image processing method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant