CN113572893A - Terminal device, emotion feedback method and storage medium - Google Patents

Terminal device, emotion feedback method and storage medium

Info

Publication number
CN113572893A
Authority
CN
China
Prior art keywords
emotion
information
target
user
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110790161.5A
Other languages
Chinese (zh)
Other versions
CN113572893B (en)
Inventor
东芳
王倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd filed Critical Hisense Mobile Communications Technology Co Ltd
Priority to CN202110790161.5A priority Critical patent/CN113572893B/en
Publication of CN113572893A publication Critical patent/CN113572893A/en
Application granted granted Critical
Publication of CN113572893B publication Critical patent/CN113572893B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72451User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to schedules, e.g. using calendar applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a terminal device, an emotion feedback method and a storage medium, and relates to the field of communications technologies. The terminal device can receive first emotion label information of a user corresponding to a target date in a calendar application; determine target emotion information of the user on the target date according to the first emotion label information, the target emotion information being the emotion type and emotion level corresponding to the current emotion label information; and display target emotion management information in the calendar application according to the target emotion information, the target emotion management information being feedback information corresponding to the target emotion information. The terminal device analyzes the received emotion label of the user, determines the emotion type and emotion level, refines the emotion data, and gives the user intelligent feedback matching his or her emotional state, so that emotion feedback based on the calendar is realized, a basic function is added to the calendar, and the functions of the calendar are diversified.

Description

Terminal device, emotion feedback method and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a terminal device, an emotion feedback method, and a storage medium.
Background
The calendar is a familiar type of application software, but the functions of current calendar applications are relatively limited and can only present basic items such as trips and activities. Because the calendar's functionality is limited and not easy to extend, its wider application is restricted.
How to provide a method that adds a basic function to the calendar and diversifies the calendar's functions therefore has important practical significance.
Disclosure of Invention
The embodiments of the present application provide a terminal device, an emotion feedback method and a storage medium, in which emotion feedback is performed based on the calendar, so that a basic function is added to the calendar and the functions of the calendar are diversified.
In a first aspect, an embodiment of the present application provides a terminal device, including:
a display unit for displaying an image and a user interface; the user interface comprises a calendar application interface;
a processor to: receiving first emotion label information of a user corresponding to a target date in a calendar application;
determining target emotion information of the user on the target date according to the first emotion label information; the target emotion information is an emotion type and an emotion level corresponding to the current emotion label information;
displaying target emotion management information in the calendar application according to the target emotion information; the target emotion management information is feedback information corresponding to the target emotion information.
The terminal device provided by the embodiments of the present application can receive first emotion label information of a user corresponding to a target date in a calendar application; determine target emotion information of the user on the target date according to the first emotion label information, the target emotion information being the emotion type and emotion level corresponding to the current emotion label information; and display target emotion management information in the calendar application according to the target emotion information, the target emotion management information being feedback information corresponding to the target emotion information. The terminal device analyzes the received emotion label of the user, determines the emotion type and emotion level, refines the emotion data, and gives the user intelligent feedback matching his or her emotional state, so that emotion feedback based on the calendar is realized, a basic function is added to the calendar, and the functions of the calendar are diversified.
In some embodiments, the processor is specifically configured to:
displaying an emotion label option in response to the received user selection operation for the target date;
and taking the target emotion label option selected by the user in the emotion label options as the first emotion label information of the target date.
In some embodiments, the processor is specifically configured to:
and determining the target emotion information of the user on the target date corresponding to the first emotion label information according to a first corresponding relation between a preset emotion label and emotion information.
In some embodiments, the processor is specifically configured to:
determining target emotion management information corresponding to the target emotion information according to a second corresponding relation between preset emotion information and emotion management information;
displaying the target emotion management information in the calendar application.
In some embodiments, the processor is further configured to:
and in response to a user's viewing operation for the target date, displaying the target emotion information corresponding to the target date in the calendar application.
In some embodiments, the processor is further configured to:
generating an emotion fluctuation curve of a preset time interval according to a preset time frequency; the emotion fluctuation curve comprises the target emotion information corresponding to each date in the preset time interval;
according to the emotion fluctuation curve, determining the stage psychological characteristics of the user in a preset time interval; the stage psychological characteristics are types of which the occurrence frequency meets preset conditions in the emotion types of the characteristic emotion information; the characteristic emotion information is target emotion information of which the emotion level exceeds a preset emotion level threshold;
determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between preset psychological characteristics and emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics;
displaying the target emotional guidance information in the calendar application.
In a second aspect, an embodiment of the present application provides an emotional feedback method, including:
receiving first emotion label information of a user corresponding to a target date in a calendar application;
determining target emotion information of the user on the target date according to the first emotion label information; the target emotion information is an emotion type and an emotion level corresponding to the current emotion label information;
displaying target emotion management information in the calendar application according to the target emotion information; the target emotion management information is feedback information corresponding to the target emotion information.
In some embodiments, the receiving the first emotion tag information of the user corresponding to the target date in the calendar application includes:
displaying an emotion label option in response to the received user selection operation for the target date;
and taking the target emotion label option selected by the user in the emotion label options as the first emotion label information of the target date.
In some embodiments, determining target emotion information for the user on the target date based on the first emotion tag information comprises:
and determining the target emotion information of the user on the target date corresponding to the first emotion label information according to a first corresponding relation between a preset emotion label and emotion information.
In some embodiments, said displaying target emotion management information in said calendar application based on said target emotion information comprises:
determining target emotion management information corresponding to the target emotion information according to a second corresponding relation between preset emotion information and emotion management information;
displaying the target emotion management information in the calendar application.
In certain embodiments, the method further comprises:
and in response to a user's viewing operation for the target date, displaying the target emotion information corresponding to the target date in the calendar application.
In certain embodiments, the method further comprises:
generating an emotion fluctuation curve of a preset time interval according to a preset time frequency; the emotion fluctuation curve comprises the target emotion information corresponding to each date in the preset time interval;
according to the emotion fluctuation curve, determining the stage psychological characteristics of the user in a preset time interval; the stage psychological characteristics are types of which the occurrence frequency meets preset conditions in the emotion types of the characteristic emotion information; the characteristic emotion information is target emotion information of which the emotion level exceeds a preset emotion level threshold;
determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between preset psychological characteristics and emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics;
displaying the target emotional guidance information in the calendar application.
In a third aspect, the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the method for emotional feedback according to the second aspect is implemented.
The technical effect brought by any one implementation manner of the second aspect to the third aspect may be referred to the technical effect brought by the implementation manner of the first aspect, and is not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic illustration of an emotion label option in an embodiment of the present application;
fig. 2 is a block diagram of a hardware configuration of a terminal device according to an embodiment of the present disclosure;
fig. 3 is a block diagram of a software structure of a terminal device according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of an emotional feedback method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interface for displaying an emotional tag option provided by an embodiment of the application;
FIG. 6 is a schematic diagram of an interface for setting emotion label options according to an embodiment of the present application;
fig. 7 is a schematic diagram of an interface for displaying emotion management information according to an embodiment of the present application;
FIG. 8 is a schematic flow chart of another emotional feedback method provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a mood swing curve provided in an embodiment of the present application;
fig. 10 is a schematic diagram of an interface for displaying emotion guidance information according to an embodiment of the present application;
fig. 11 is a schematic flowchart of another emotional feedback method according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the present application will be described in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Some terms in the embodiments of the present application are explained below to facilitate understanding by those skilled in the art.
(1) Terminal device: the term "terminal device" in the embodiments of the present application refers to a device that can install various types of application programs, including applications carried by the terminal device itself and third-party applications, and that can display the objects provided by the installed application programs; the terminal device may be mobile or fixed. Examples include a mobile phone, a tablet computer, various wearable devices, a vehicle-mounted device, a personal digital assistant (PDA), a point-of-sale (POS) terminal, or another terminal device capable of implementing the above functions.
(2) Target emotion management information: the target emotion management information refers to the relief scheme that the terminal device pushes as an immediate response, through data analysis, after the user selects a certain level of a certain emotion and saves it in the calendar of the terminal device; the response corresponds to the emotion type and emotion level selected by the user and includes illustrations, animations, music, jokes, psychological state analysis and the like. For example, in the embodiments of the present application, when the user sets the "fear" identifier, a hugging poster or animation is popped up; when the "joy" identifier is selected, the terminal pushes a piece of soothing music or displays an encouraging picture; and when the "sadness" identifier is selected, the terminal pushes a piece of accepting and understanding psychological analysis, or suggestions and methods for emotional relief, and the like.
(3) Target emotion guidance information: the target emotion guidance information refers to the forward-looking guidance that the terminal device gives the user after determining a stage psychological characteristic of the user, such as the emotion of a certain level that occurs most frequently and whose level exceeds a preset emotion level threshold, from an emotion fluctuation curve over a preset time interval, for example a weekly or monthly emotion fluctuation curve. It includes guidance advice, support and encouragement, and treatment methods. For example, in the embodiments of the present application, if "sad" emotions at level "5" appear most frequently in a given month, the terminal device may remind the user of the harm that sadness does to the body, guide the user in handling the relationship with the "sad" emotion and in how to cope when falling into sadness again, and so on, so that the user obtains forward-looking guidance information.
The calendar is a familiar type of application software, but the functions of current calendar applications are relatively limited and can only present basic items such as trips and activities. Because the calendar's functionality is limited and not easy to extend, its wider application is restricted.
How to provide a method that adds a basic function to the calendar and diversifies the calendar's functions therefore has important practical significance.
Based on this, the embodiments of the present application provide a terminal device, an emotion feedback method and a storage medium. The terminal device can receive first emotion label information of a user corresponding to a target date in a calendar application; determine target emotion information of the user on the target date according to the first emotion label information, the target emotion information being the emotion type and emotion level corresponding to the current emotion label information; and display target emotion management information in the calendar application according to the target emotion information, the target emotion management information being feedback information corresponding to the target emotion information. The terminal device analyzes the received emotion label of the user, determines the emotion type and emotion level, refines the emotion data, and gives the user intelligent feedback matching his or her emotional state, so that emotion feedback based on the calendar is realized, a basic function is added to the calendar, and the functions of the calendar are diversified.
In order to make the objects, technical solutions and advantages of the embodiments of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of the emotion tag options provided by an embodiment of the present application. As shown in fig. 1, the options include 35 tags, where the emotion type of each tag is one of the following: happiness, anger, sadness, joy, fright, fear and pensiveness, and corresponds to one of the following 5 emotion levels: 1, 2, 3, 4 and 5. It is understood that the emotion tag options including 35 kinds of tags is merely an exemplary illustration of the technical solution of the present application and does not constitute a specific limitation of the technical solution of the present application.
The calendar of the terminal device of the present application can provide tags representing emotions. The emotions fall into seven categories: happiness, anger, sadness, joy, fright, fear and pensiveness, and each emotion type includes tags of the following 5 emotion levels: 1, 2, 3, 4 and 5. A higher emotion level value indicates a stronger emotion.
In fig. 1, the expression picture of each emotion type is a circle, with a character indicating the emotion type and a number indicating the emotion level arranged in the middle of the circle. The following embodiments of the present application are all illustrated with the emotion tag options shown in fig. 1. For example, if the emotion type of an emotion tag is "happiness", its expression picture includes a circle with the word "happiness" inside, and when the emotion level is 5 the circle also carries the number "5", as shown in fig. 1; if the emotion type of an emotion tag is "anger", the corresponding expression picture includes a circle with the word "anger" inside, and when the emotion level is 3 the circle also carries the number "3".
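To make the structure of these tag options concrete, the following is a minimal sketch, not taken from the patent, of how the 35 options (seven emotion types, each at levels 1 to 5) could be modeled; the class, enum and method names are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

public class EmotionTagOptions {

    // The seven emotion types described above (names are illustrative).
    enum EmotionType { HAPPINESS, ANGER, SADNESS, JOY, FRIGHT, FEAR, PENSIVENESS }

    // One selectable tag: an emotion type plus a level from 1 (mild) to 5 (strong).
    record EmotionTag(EmotionType type, int level) { }

    // Builds the full set of 7 x 5 = 35 tag options shown in the selection interface.
    static List<EmotionTag> buildOptions() {
        List<EmotionTag> options = new ArrayList<>();
        for (EmotionType type : EmotionType.values()) {
            for (int level = 1; level <= 5; level++) {
                options.add(new EmotionTag(type, level));
            }
        }
        return options;
    }

    public static void main(String[] args) {
        System.out.println(buildOptions().size()); // prints 35
    }
}
```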
Fig. 2 shows a block diagram of a hardware configuration of a terminal device according to an embodiment of the present application. It should be understood that the terminal device 100 shown in fig. 2 is only an example, and the terminal device 100 may have more or less components than those shown in fig. 2, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Fig. 2 is a block diagram schematically showing a hardware configuration of a terminal device according to an exemplary embodiment. As shown in fig. 2, the terminal device 100 includes: a Radio Frequency (RF) circuit 210, a memory 220, a display unit 230, a camera 240, a sensor 250, an audio circuit 260, a Wireless Fidelity (Wi-Fi) module 270, a processor 280, a bluetooth module 281, and a power supply 290.
The RF circuit 210 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 280 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 220 is used for storing data or program codes used when the terminal device 100 operates. The processor 280 performs various functions of the terminal device 100 and data processing by executing software programs or data stored in the memory 220. The memory 220 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. The memory 220 stores an operating system that enables the terminal device 100 to operate. The memory 220 may store an operating system and various application programs, and may also store codes for performing the methods according to the embodiments of the present application.
The display unit 230 may be used to receive input numeric or character information and generate signal input related to user settings and function control of the terminal device 100, and particularly, the display unit 230 may include a touch screen 231 disposed on the front surface of the terminal device 100 and may collect touch operations of a user thereon or nearby, such as clicking a button, dragging a scroll box, and the like.
The display unit 230 may also be used to display information input by the user or information provided to the user and a Graphical User Interface (GUI) of various menus of the terminal apparatus 100. For example, images and user interfaces are displayed in embodiments of the present application, where the user interfaces include a calendar application interface. Specifically, the display unit 230 may include a display screen 232 disposed on the front surface of the terminal device 100. The display screen 232 may be configured in the form of a liquid crystal display, a light emitting diode, or the like.
The touch screen 231 may be covered on the display screen 232, or the touch screen 231 and the display screen 232 may be integrated to implement the input and output functions of the terminal device 100, and after the integration, the touch screen may be referred to as a touch display screen for short. The display unit 230 in the present invention can display the application programs and the corresponding operation steps.
The camera 240 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to a processor 280 for conversion into digital image signals.
The terminal device 100 may further comprise at least one sensor 250, such as an acceleration sensor 251, a distance sensor 252, a fingerprint sensor 253, a temperature sensor 254. The terminal device 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, light sensor, motion sensor, and the like.
The audio circuitry 260, speaker 261, and microphone 262 may provide an audio interface between the user and the terminal device 100. The audio circuit 260 may transmit the electrical signal converted from the received audio data to the speaker 261, and convert the electrical signal into a sound signal by the speaker 261 and output the sound signal. The terminal device 100 may also be provided with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 262 converts the collected sound signals into electrical signals, which are received by the audio circuit 260 and converted into audio data, which are then output to the RF circuit 210 for transmission to, for example, another terminal, or output to the memory 220 for further processing.
Wi-Fi belongs to a short-distance wireless transmission technology, and the terminal device 100 can help a user to send and receive e-mails, browse webpages, access streaming media and the like through the Wi-Fi module 270, and provides wireless broadband internet access for the user.
The processor 280 is a control center of the terminal device 100, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs stored in the memory 220 and calling data stored in the memory 220. In one possible implementation, processor 280 may include one or more processing units; the processor 280 may also integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a baseband processor, which primarily handles wireless communications. It will be appreciated that the baseband processor described above may not be integrated into the processor 280. In the present application, the processor 280 may run an operating system, an application program, a user interface display, a touch response, and the emotion feedback method according to the embodiments of the present application. Further, the processor 280 is coupled with the display unit 230.
And the bluetooth module 281 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the terminal device 100 may establish a bluetooth connection with another terminal device 100 through the bluetooth module 281, thereby performing data interaction.
Terminal device 100 also includes a power supply 290 (e.g., a battery) for powering the various components. The power supply may be logically coupled to the processor 280 through a power management system to manage charging, discharging, and power consumption through the power management system. The terminal device 100 may further be configured with a power button for powering on and off the terminal, and locking the screen.
In one embodiment of the present application, the processor 280 is configured to:
receiving first emotion label information of a user corresponding to a target date in a calendar application;
determining target emotion information of the user on a target date according to the first emotion label information; the target emotion information is the emotion type and the emotion level corresponding to the current emotion label information;
displaying the target emotion management information in the calendar application according to the target emotion information; the target emotion management information is feedback information corresponding to the target emotion information.
In one possible implementation, the processor 280 is specifically configured to:
displaying an emotion label option in response to the received user selection operation for the target date;
and taking the target emotion label option selected by the user in the emotion label options as the first emotion label information of the target date.
In one possible implementation, the processor 280 is specifically configured to:
and determining target emotion information of the user on a target date corresponding to the first emotion label information according to a first corresponding relation between a preset emotion label and the emotion information.
In one possible implementation, the processor 280 is specifically configured to:
determining target emotion management information corresponding to the target emotion information according to a second corresponding relation between the preset emotion information and the emotion management information;
the target emotion management information is displayed in the calendar application.
In one possible implementation, the processor 280 is further configured to:
and in response to the user's viewing operation for the target date, displaying target emotion information corresponding to the target date in the calendar application.
In one possible implementation, the processor 280 is further configured to:
generating an emotion fluctuation curve of a preset time interval according to a preset time frequency; the emotion fluctuation curve comprises target emotion information corresponding to each date in a preset time interval;
determining the stage psychological characteristics of the user in a preset time interval according to the emotional fluctuation curve; the stage psychological characteristics are types of which the frequency of occurrence meets preset conditions in the emotion types of the characteristic emotion information; the characteristic emotion information is target emotion information of which the emotion level exceeds a preset emotion level threshold;
determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between the preset psychological characteristics and the emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics;
the target emotion guidance information is displayed in the calendar application.
Fig. 3 is a block diagram of a software configuration of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In one possible implementation, the Android system is divided into four layers, which are an application layer, an application framework layer, an Android Runtime (Android Runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. The video application program can be used for playing a common video and can also be used for playing a free viewpoint video.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video (e.g., free viewpoint video), images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is used for providing a communication function of the terminal equipment. Such as management of call status (including on, off, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the communication terminal vibrates, and an indicator light flashes.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system.
The core library comprises two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in a virtual machine. And executing java files of the application program layer and the application program framework layer into a binary file by the virtual machine. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Fig. 4 shows a flowchart of an emotion feedback method provided in an embodiment of the present application. The method is applicable to a terminal device, and the terminal device may have the hardware structure shown in fig. 2.
The emotion feedback method may be executed by the terminal device 100. Referring to fig. 4, the method may include the following steps:
step S401, receiving first emotion label information of a user corresponding to a target date in a calendar application.
Specifically, the user sets the first emotion tag information corresponding to the target date in the calendar application. The user may select a target emotion tag option from the displayed emotion tags after clicking the target date in the calendar application of the terminal device; or the user may select physical characteristics after clicking the target date in the calendar application of the terminal device, and the calendar application determines the corresponding target emotion tag option according to the received physical characteristics input by the user.
In one possible implementation, receiving first emotion tag information of a user corresponding to a target date in a calendar application may be implemented by:
step A01, in response to receiving a user selection for a target date, displays an emotional tag option.
Illustratively, assuming that the target date is May 2, 2021, the terminal device displays the interface shown in fig. 5 in response to the received user selection operation for the target date, where the emotion tag options are displayed so that the user can select any one of the tags shown in fig. 1.
Step A02, using the target emotion tag option selected by the user from the emotion tag options as the first emotion tag information of the target date.
Illustratively, assume that the target emotion tag option selected by the user from the emotion tag options on the interface shown in fig. 5 is the tag shown in fig. 1 corresponding to "emotion type: happiness; emotion level: 5". The terminal device then uses this tag as the first emotion tag information for May 2, 2021 and displays the interface for setting the emotion tag option shown in fig. 6.
In one possible implementation manner, when the user selects the target emotion tag option in the emotion tag options, the terminal device may display prompt information for assisting the user in determining the emotion level. For example, physical features corresponding to various emotional levels are displayed, such as: rapid heartbeat, extreme collapse, confused thinking, etc.
In some embodiments, the user may set the first emotion tag information corresponding to the target date in the calendar application by selecting physical characteristics after clicking the target date in the calendar application of the terminal device; the calendar application then determines the corresponding target emotion tag option based on the received physical characteristics input by the user. Illustratively, if the input physical characteristics are some or all of the following: extreme heartburn, extreme headache, extreme confusion, extremely rapid heartbeat and extreme collapse, the calendar application determines the corresponding target emotion tag option according to the received physical characteristics, such as the tag corresponding to "emotion type: anger; emotion level: 5".
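As an illustration of this alternative input path, the sketch below shows one way a calendar application could map the physical characteristics selected by the user to an emotion tag option. The preset characteristic sets and the simple overlap-count matching rule are assumptions; the patent does not specify how this mapping is performed.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

public class PhysicalCharacteristicMapper {

    record EmotionTag(String type, int level) { }

    // Hypothetical preset mapping from tag options to typical physical characteristics.
    static final Map<EmotionTag, Set<String>> PRESETS = Map.of(
            new EmotionTag("anger", 5),
            Set.of("extreme heartburn", "extreme headache", "extreme confusion",
                   "extremely rapid heartbeat", "extreme collapse"),
            new EmotionTag("fear", 4),
            Set.of("rapid heartbeat", "trembling", "confused thinking"));

    // Picks the preset tag whose characteristic set overlaps most with the user's input.
    static EmotionTag inferTag(List<String> userInput) {
        EmotionTag best = null;
        int bestOverlap = 0;
        for (Map.Entry<EmotionTag, Set<String>> entry : PRESETS.entrySet()) {
            int overlap = (int) userInput.stream().filter(entry.getValue()::contains).count();
            if (overlap > bestOverlap) {
                bestOverlap = overlap;
                best = entry.getKey();
            }
        }
        return best; // null if nothing matches
    }

    public static void main(String[] args) {
        System.out.println(inferTag(List.of("extreme headache", "extreme collapse")));
        // prints EmotionTag[type=anger, level=5]
    }
}
```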
Step S402, determining target emotion information of the user on a target date according to the first emotion label information.
And the target emotion information is the emotion type and the emotion level corresponding to the current emotion label information.
Specifically, the emotion types include, but are not limited to, some or all of the following: happiness, anger, sadness, joy, fright, fear and pensiveness; the emotion levels include, but are not limited to, the following: 1, 2, 3, 4 and 5.
In a possible implementation manner, determining the target emotion information of the user on the target date according to the first emotion tag information may be implemented by determining the target emotion information of the user on the target date corresponding to the first emotion tag information according to a first correspondence between preset emotion tags and emotion information.
For example, the first correspondence between preset emotion tags and emotion information may be the correspondence between the 35 kinds of tags shown in the emotion tag options of fig. 1 and the (emotion type, emotion level) pairs. The terminal device can thereby determine that the target emotion information of the user on May 2, 2021, corresponding to the first emotion tag information set through the interface shown in fig. 6, is "emotion type: happiness; emotion level: 5".
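A minimal sketch of such a first correspondence, assuming it is stored as a preset lookup table keyed by an identifier of the selected tag option; the key format and the sample entries are illustrative, not taken from the patent.

```java
import java.util.Map;

public class FirstCorrespondence {

    record EmotionInfo(String type, int level) { }

    // Hypothetical preset table: tag option id -> (emotion type, emotion level).
    // In practice it would cover all 35 tag options shown in fig. 1.
    static final Map<String, EmotionInfo> FIRST_CORRESPONDENCE = Map.of(
            "tag_happiness_5", new EmotionInfo("happiness", 5),
            "tag_anger_3", new EmotionInfo("anger", 3),
            "tag_sadness_2", new EmotionInfo("sadness", 2));

    public static void main(String[] args) {
        // The tag selected on May 2, 2021 in the example above.
        System.out.println(FIRST_CORRESPONDENCE.get("tag_happiness_5"));
        // prints EmotionInfo[type=happiness, level=5]
    }
}
```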
Step S403, displaying the target emotion management information in the calendar application according to the target emotion information.
Wherein the target emotion management information is feedback information corresponding to the target emotion information.
The target emotion management information represents a short-term control mode which is required to be applied to the emotion corresponding to the target emotion information.
Specifically, in the embodiment of the present application, the target emotion management information may be feedback information corresponding to the target emotion information one to one; or may be an optional one of a plurality of preset feedback information corresponding to the target emotion information, which is not specifically limited in the present application.
In one possible implementation manner, the displaying of the target emotion management information in the calendar application according to the target emotion information can be implemented by the following steps:
Step B01, determining target emotion management information corresponding to the target emotion information according to the second correspondence between preset emotion information and emotion management information.
In some embodiments of the present application, the second correspondence is a correspondence between the emotion type in the preset emotion information and the emotion management information; in other embodiments, the second correspondence is a correspondence between the (emotion type, emotion level) pair in the preset emotion information and the emotion management information.
In specific implementation, the terminal device has prestored the second corresponding relationship, for example, as shown in table 1:
TABLE 1
Emotion information | Emotion management information
The emotion type is "joy" | Play a piece of preset soothing music
The emotion type is "fear" | Pop up a hugging animation
The emotion type is "sadness" | Push a piece of accepting and understanding psychological analysis
The emotion type is "happiness" | Pop up a piece of preset celebration animation
…… | ……
Illustratively, assume that the target emotion information of the user on May 2, 2021 is "emotion type: happiness; emotion level: 5". Since the emotion type contained in the target emotion information is "happiness", the terminal device determines that the corresponding target emotion management information is "pop up a piece of preset celebration animation", as shown in fig. 7.
Step B02, displaying the target emotion management information in the calendar application.
Specifically, displaying the target emotion management information in the calendar application means performing the operation corresponding to the target emotion management information. For example, when the emotion management information is "play a piece of preset soothing music", a playback window for the music is displayed in the calendar application; when the emotion management information is "push a piece of accepting and understanding psychological analysis", a text presentation interface for that analysis is displayed in the calendar application.
Illustratively, if the target emotion management information corresponding to the target emotion information is "pop up a piece of preset celebration animation", the terminal device displays the corresponding celebration animation in the calendar application.
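The sketch below illustrates how the second correspondence of Table 1 could be stored and queried to obtain the emotion management information to display; the class and method names are assumptions, the entries mirror Table 1, and the additional rows of Table 1 are omitted.

```java
import java.util.Map;

public class EmotionManagement {

    // Second correspondence, keyed by emotion type as in Table 1.
    static final Map<String, String> SECOND_CORRESPONDENCE = Map.of(
            "joy", "Play a piece of preset soothing music",
            "fear", "Pop up a hugging animation",
            "sadness", "Push a piece of accepting and understanding psychological analysis",
            "happiness", "Pop up a piece of preset celebration animation");

    // Step B01: look up the management information for the target emotion information.
    static String managementInfoFor(String emotionType) {
        return SECOND_CORRESPONDENCE.get(emotionType);
    }

    public static void main(String[] args) {
        // Step B02 (simplified): "displaying" here just prints the action that the
        // calendar application would perform, e.g. popping up the celebration animation.
        System.out.println(managementInfoFor("happiness"));
    }
}
```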
According to the method, the received emotion tags of the user are analyzed, the emotion type and emotion level are determined, the emotion data are refined, and the user is given intelligent feedback matching his or her emotional state, so that emotion feedback based on the calendar is realized, a basic function is added to the calendar, and the functions of the calendar are diversified.
In one possible implementation manner, the terminal device further displays target emotion information corresponding to the target date in the calendar application in response to a viewing operation of the user for the target date.
Specifically, the viewing operation includes, but is not limited to, one of the following: clicking, sliding and long-pressing for a preset time length.
Illustratively, if the user performs a viewing operation for May 2, 2021, the target emotion information corresponding to May 2, 2021 is displayed in the calendar application.
Displaying the target emotion information corresponding to the target date in the calendar application may specifically mean displaying the target emotion information (emotion type, emotion level) near the target date in the calendar application, or displaying the tag corresponding to the target emotion information, as shown in fig. 1.
In some embodiments of the present application, the terminal device may further display, in the calendar application, target emotion information corresponding to respective dates of the target month in response to a viewing operation of the user for the target month.
In one possible implementation manner, as shown in fig. 8, the terminal device further performs the following steps:
step S801, generating an emotion fluctuation curve in a preset time interval according to a preset time frequency.
The emotion fluctuation curve comprises target emotion information corresponding to each date in a preset time interval.
Specifically, the preset time frequency may be weekly or monthly. In the embodiments of the present application, when the preset time frequency is weekly, the terminal device generates an emotion fluctuation curve of a preset time interval of 1 week at a preset first time point, as shown in fig. 9; when the preset time frequency is monthly, the terminal device generates an emotion fluctuation curve of a preset time interval of 1 month at a preset second time point. Fig. 9 shows an emotion fluctuation curve containing the target emotion information corresponding to each date within the preset time interval; the emotion corresponding to each date can be read from the row of emotion tags at the top of fig. 9. As can be seen from fig. 9, the emotion tags from Monday to Sunday of that week are, in order, (happiness, level 5), (anger, level 3), (sadness, level 2), (joy, level 3), (fright, level 2), (fear, level 1) and (pensiveness, level 4). It is understood that fig. 9 is only an example of an emotion fluctuation curve provided in the embodiments of the present application; in other embodiments, the emotion type and emotion level corresponding to each date's emotion tag may also be displayed directly on the emotion fluctuation curve.
For example, assuming that the preset time frequency is weekly, the terminal device may generate the emotion fluctuation curve of the current week at a preset first time point, for example, at 18:00 on a given day of the week. It will be appreciated that, in some embodiments, when the preset time frequency is weekly, the terminal device may instead generate the emotion fluctuation curve of the previous week at a preset first time point, for example, at 10:00 once a week.
For another example, assuming that the preset time frequency is monthly, the terminal device may generate the emotion fluctuation curve of the previous month at a preset second time point, for example, at 10:00 on the 1st day of each month.
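A minimal sketch, under assumed data structures, of how the data behind such a weekly emotion fluctuation curve could be assembled from the per-date target emotion information stored by the calendar; the class names, the map-based storage and the Monday-to-Sunday window are illustrative assumptions rather than details taken from the patent.

```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.TreeMap;

public class EmotionCurve {

    record EmotionInfo(String type, int level) { }

    // One point per recorded date in the preset time interval (here: one ISO week).
    static Map<LocalDate, EmotionInfo> weeklyCurve(Map<LocalDate, EmotionInfo> diary,
                                                   LocalDate anyDayInWeek) {
        LocalDate monday = anyDayInWeek.with(DayOfWeek.MONDAY);
        Map<LocalDate, EmotionInfo> curve = new LinkedHashMap<>();
        for (int i = 0; i < 7; i++) {
            LocalDate day = monday.plusDays(i);
            if (diary.containsKey(day)) {
                curve.put(day, diary.get(day));
            }
        }
        return curve;
    }

    public static void main(String[] args) {
        Map<LocalDate, EmotionInfo> diary = new TreeMap<>();
        diary.put(LocalDate.of(2021, 5, 3), new EmotionInfo("happiness", 5)); // Monday
        diary.put(LocalDate.of(2021, 5, 4), new EmotionInfo("anger", 3));     // Tuesday
        weeklyCurve(diary, LocalDate.of(2021, 5, 5)).forEach(
                (date, info) -> System.out.println(date + " -> " + info));
    }
}
```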
Step S802, determining the stage psychological characteristic of the user in the preset time interval according to the emotion fluctuation curve.
The stage psychological characteristic is the emotion type whose occurrence frequency meets a preset condition among the emotion types of the characteristic emotion information; the characteristic emotion information is target emotion information whose emotion level exceeds a preset emotion level threshold.
Specifically, the occurrence frequency satisfying the preset condition may mean that the occurrence frequency is the highest, or that the occurrence frequency exceeds a preset frequency threshold; the present application does not specifically limit this.
Illustratively, if the emotion fluctuation curve shows that "sad" emotions at level "5" appear most frequently in a given week or month, the stage psychological characteristic of the user in that preset time interval is determined to be "sad".
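The sketch below shows one straightforward way to compute such a stage psychological characteristic from the curve data: keep only the entries whose emotion level exceeds the preset threshold and return the most frequent emotion type among them. The threshold value, the names and the use of "most frequent" as the preset condition are illustrative assumptions.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StageCharacteristic {

    record EmotionInfo(String type, int level) { }

    // Returns the emotion type that occurs most often among entries whose
    // emotion level exceeds the preset threshold, or null if there is none.
    static String stageCharacteristic(List<EmotionInfo> curve, int levelThreshold) {
        return curve.stream()
                .filter(info -> info.level() > levelThreshold)
                .collect(Collectors.groupingBy(EmotionInfo::type, Collectors.counting()))
                .entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse(null);
    }

    public static void main(String[] args) {
        List<EmotionInfo> week = List.of(
                new EmotionInfo("sadness", 5), new EmotionInfo("sadness", 5),
                new EmotionInfo("joy", 3), new EmotionInfo("fear", 5));
        System.out.println(stageCharacteristic(week, 4)); // prints "sadness"
    }
}
```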
Step S803, determining target emotion guidance information corresponding to the stage psychological characteristics according to a third correspondence between the preset psychological characteristics and the emotion guidance information.
Wherein the target emotion guidance information is feedback information corresponding to the stage psychological characteristics.
In the embodiment of the present application, the third corresponding relationship may be a corresponding relationship between an emotion type in the preset psychological characteristic and the emotion guidance information, or may be a corresponding relationship between (an emotion type, an emotion level) in the preset psychological characteristic and the emotion guidance information.
In specific implementation, the terminal device has prestored the third corresponding relationship, for example, as shown in table 2:
TABLE 2
(The content of Table 2 is provided as an image in the original publication and is not reproduced here.)
Illustratively, assuming that the "sad" emotion at level "5" appears most frequently for the user in April, the terminal device may remind the user of the following information at a predetermined time on May 1 by displaying an interface as shown in fig. 10, so that the user obtains forward-looking guidance information: sadness easily causes dysfunction of the nervous system, reduces the secretion of digestive juices, and leads to loss of appetite, a haggard appearance, shortness of breath, mental fatigue and low spirits; the sadness therefore needs to be consciously managed, for example by deciding on a safe and comfortable place in advance and, when falling into sadness again, going to that place to release one's emotions.
Step S804 displays the target emotion guidance information in the calendar application.
The target emotion guidance information represents a follow-up, longer-term management approach to be applied to the emotion corresponding to the stage psychological characteristic.
Illustratively, the terminal device may display, in the calendar application, the following target emotion guidance information for the stage psychological characteristic "sad": sadness easily causes dysfunction of the nervous system, reduces the secretion of digestive juices, and leads to loss of appetite, a haggard appearance, shortness of breath, listlessness and low spirits; the sadness therefore needs to be consciously managed, for example by deciding on a safe and comfortable place in advance and going to that place to release one's emotions when falling into sadness again.
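By analogy with Table 1, the third correspondence can also be held in a preset lookup table that is queried with the stage psychological characteristic; the sketch below prints the guidance text that the calendar application would display. The keys and the abbreviated guidance strings are assumptions, since Table 2 is only reproduced as an image in the publication.

```java
import java.util.Map;

public class EmotionGuidance {

    // Third correspondence: stage psychological characteristic -> emotion guidance information.
    // The guidance text is abbreviated; real entries would resemble the sadness advice above.
    static final Map<String, String> THIRD_CORRESPONDENCE = Map.of(
            "sadness", "Sadness strains the nervous system; choose a safe, comfortable place "
                    + "in advance and go there to release your emotions when sadness returns.",
            "anger", "Frequent intense anger raises stress; try breathing exercises and "
                    + "postpone difficult conversations until you have calmed down.");

    public static void main(String[] args) {
        String stageCharacteristic = "sadness";
        // Step S804 (simplified): the calendar application would render this text;
        // here it is just printed.
        System.out.println(THIRD_CORRESPONDENCE.get(stageCharacteristic));
    }
}
```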
According to the method of the embodiment, the emotion fluctuation curve of the preset time interval is generated according to the preset time frequency; determining the stage psychological characteristics of the user in a preset time interval according to the emotional fluctuation curve; determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between the preset psychological characteristics and the emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics; the target emotion guidance information is displayed in the calendar application. By the method, the user can regularly know the stage psychological characteristics of the preset time interval and obtain the target emotion guidance information corresponding to the stage psychological characteristics, so that emotion feedback based on the calendar is realized, basic functions can be added to the calendar, and the calendar has diversified functions.
Fig. 11 is a schematic flowchart of another emotion feedback method provided in an embodiment of the present application. The method is applicable to a terminal device, and the terminal device may have the hardware structure shown in fig. 2.
This emotional feedback method may be performed by the terminal device 100, see fig. 11, and may be implemented by:
step S1101, in response to the received user selection operation for the target date, displays an emotion tag option.
Step S1102 is to use the target emotion tag option selected by the user from the emotion tag options as the first emotion tag information of the target date.
Step S1103, determining target emotion information of the user on the target date corresponding to the first emotion tag information according to a first corresponding relationship between the preset emotion tag and the emotion information.
Step S1104, determining target emotion management information corresponding to the target emotion information according to a second correspondence between the preset emotion information and the emotion management information.
Step S1105 displays the target emotion management information in the calendar application.
Step S1106, generating an emotion fluctuation curve in a preset time interval according to the preset time frequency.
Step S1107, determine the stage psychological characteristics of the user in a preset time interval according to the mood swing curve.
Step S1108, determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relationship between the preset psychological characteristics and the emotion guidance information.
Step S1109 displays the target emotion guidance information in the calendar application.
For the specific implementation of steps S1101 to S1109 of the emotion feedback method, reference may be made to the corresponding processes in the foregoing embodiments, which are not repeated here.
According to this method, a feedback strategy based on emotion recognition is realized in the calendar: the emotion type and the emotion level are determined by analyzing the emotion label received from the user, the emotion data are thereby refined, and the user is given consistent, intelligent feedback. Calendar-based emotion feedback is thus realized, a further basic function is added to the calendar, and the calendar offers diversified functions.
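For the per-date branch of this flow (steps S1101 to S1105), a minimal sketch is given below. The TAG_TO_EMOTION and EMOTION_TO_MANAGEMENT tables are hypothetical stand-ins for the first and second corresponding relationships, and the tag names and feedback texts are invented for illustration only.

```python
from typing import Dict, Optional, Tuple

# Hypothetical first corresponding relationship: emotion tag -> (emotion type, emotion level).
TAG_TO_EMOTION: Dict[str, Tuple[str, int]] = {
    "slightly down": ("sad", 2),
    "very sad":      ("sad", 5),
    "cheerful":      ("happy", 3),
}

# Hypothetical second corresponding relationship: (emotion type, emotion level) -> management text.
EMOTION_TO_MANAGEMENT: Dict[Tuple[str, int], str] = {
    ("sad", 2):   "A short walk or some light music may help lift the mood.",
    ("sad", 5):   "Take a break, talk to someone you trust, and rest early tonight.",
    ("happy", 3): "Note down what went well today so it can be repeated.",
}

def feedback_for_tag(tag: str) -> str:
    """Map the tag chosen for the target date to emotion information (type and
    level), then to the management information shown in the calendar."""
    emotion: Optional[Tuple[str, int]] = TAG_TO_EMOTION.get(tag)
    if emotion is None:
        return "Unknown emotion tag."
    return EMOTION_TO_MANAGEMENT.get(emotion, "No management information configured.")

# The user selects the "very sad" tag option on the target date.
print(feedback_for_tag("very sad"))
```

Keeping the two lookups as separate tables mirrors the two-stage mapping described above: the first corresponding relationship refines a coarse tag into an emotion type and level, and the second turns that refined information into the feedback text the calendar displays.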
An embodiment of the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any one of the emotion feedback methods provided in the above embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A terminal device, comprising:
a display unit for displaying an image and a user interface; the user interface comprises a calendar application interface;
a processor configured to: receive first emotion label information of a user corresponding to a target date in the calendar application;
determine target emotion information of the user on the target date according to the first emotion label information; the target emotion information is an emotion type and an emotion level corresponding to the current emotion label information; and
display target emotion management information in the calendar application according to the target emotion information; the target emotion management information is feedback information corresponding to the target emotion information.
2. The terminal device of claim 1, wherein the processor is specifically configured to:
display an emotion label option in response to the received user selection operation for the target date; and
take the target emotion label option selected by the user from the emotion label options as the first emotion label information of the target date.
3. The terminal device of claim 1, wherein the processor is specifically configured to:
and determining the target emotion information of the user on the target date corresponding to the first emotion label information according to a first corresponding relation between a preset emotion label and emotion information.
4. The terminal device of claim 1, wherein the processor is specifically configured to:
determine target emotion management information corresponding to the target emotion information according to a second corresponding relation between preset emotion information and emotion management information; and
display the target emotion management information in the calendar application.
5. The terminal device of claim 1, wherein the processor is further configured to:
and in response to a user's viewing operation for the target date, displaying the target emotion information corresponding to the target date in the calendar application.
6. The terminal device of any of claims 1-5, wherein the processor is further configured to:
generate an emotion fluctuation curve of a preset time interval according to a preset time frequency; the emotion fluctuation curve comprises the target emotion information corresponding to each date in the preset time interval;
determine, according to the emotion fluctuation curve, the stage psychological characteristics of the user in the preset time interval; the stage psychological characteristics are the emotion types, among the emotion types of the characteristic emotion information, whose occurrence frequency meets a preset condition; the characteristic emotion information is target emotion information whose emotion level exceeds a preset emotion level threshold;
determine target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between preset psychological characteristics and emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics; and
display the target emotion guidance information in the calendar application.
7. An emotional feedback method, comprising:
receiving first emotion label information of a user corresponding to a target date in a calendar application;
determining target emotion information of the user on the target date according to the first emotion label information; the target emotion information is an emotion type and an emotion level corresponding to the current emotion label information;
displaying target emotion management information in the calendar application according to the target emotion information; the target emotion management information is feedback information corresponding to the target emotion information.
8. The method of claim 7, wherein displaying targeted emotion management information in the calendar application based on the targeted emotion information comprises:
determining target emotion management information corresponding to the target emotion information according to a second corresponding relation between preset emotion information and emotion management information;
displaying the target emotion management information in the calendar application.
9. The method according to any one of claims 7-8, further comprising:
generating an emotion fluctuation curve of a preset time interval according to a preset time frequency; the emotion fluctuation curve comprises the target emotion information corresponding to each date in the preset time interval;
determining, according to the emotion fluctuation curve, the stage psychological characteristics of the user in the preset time interval; the stage psychological characteristics are the emotion types, among the emotion types of the characteristic emotion information, whose occurrence frequency meets a preset condition; the characteristic emotion information is target emotion information whose emotion level exceeds a preset emotion level threshold;
determining target emotion guidance information corresponding to the stage psychological characteristics according to a third corresponding relation between preset psychological characteristics and emotion guidance information; the target emotion guidance information is feedback information corresponding to the stage psychological characteristics;
displaying the target emotional guidance information in the calendar application.
10. A computer-readable storage medium having a computer program stored therein, wherein the computer program, when executed by a processor, implements the method of any one of claims 7 to 9.
CN202110790161.5A 2021-07-13 2021-07-13 Terminal device, emotion feedback method and storage medium Active CN113572893B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110790161.5A CN113572893B (en) 2021-07-13 2021-07-13 Terminal device, emotion feedback method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110790161.5A CN113572893B (en) 2021-07-13 2021-07-13 Terminal device, emotion feedback method and storage medium

Publications (2)

Publication Number Publication Date
CN113572893A true CN113572893A (en) 2021-10-29
CN113572893B CN113572893B (en) 2023-03-14

Family

ID=78164613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110790161.5A Active CN113572893B (en) 2021-07-13 2021-07-13 Terminal device, emotion feedback method and storage medium

Country Status (1)

Country Link
CN (1) CN113572893B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130143185A1 (en) * 2011-12-02 2013-06-06 Eric Liu Determining user emotional state
CN105574478A (en) * 2015-05-28 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Information processing method and apparatus
CN109977101A (en) * 2016-05-24 2019-07-05 甘肃百合物联科技信息有限公司 A kind of method and system for reinforcing memory
CN106155703A (en) * 2016-08-03 2016-11-23 北京小米移动软件有限公司 The display packing of emotional state and device
CN108804609A (en) * 2018-05-30 2018-11-13 平安科技(深圳)有限公司 Song recommendation method and device
CN109033163A (en) * 2018-06-19 2018-12-18 珠海格力电器股份有限公司 Method and device for adding diary in calendar
CN109460752A (en) * 2019-01-10 2019-03-12 广东乐心医疗电子股份有限公司 Emotion analysis method and device, electronic equipment and storage medium
WO2020221103A1 (en) * 2019-04-30 2020-11-05 上海掌门科技有限公司 Method for displaying user emotion, and device
CN110459296A (en) * 2019-06-26 2019-11-15 深圳市天彦通信股份有限公司 Information-pushing method and Related product
CN112464025A (en) * 2020-12-17 2021-03-09 当趣网络科技(杭州)有限公司 Video recommendation method and device, electronic equipment and medium
CN112860995A (en) * 2021-02-04 2021-05-28 北京百度网讯科技有限公司 Interaction method, device, client, server and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MA XIAOYUE; MA HAO: "Research on a personalized recommendation method for book resources considering tag emotion information" *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114969554A (en) * 2022-07-27 2022-08-30 杭州网易云音乐科技有限公司 User emotion adjusting method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN113572893B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
KR102346043B1 (en) Digital assistant alarm system
CN113094135B (en) Page display control method, device, equipment and storage medium
CN111225108A (en) Communication terminal and card display method of negative screen interface
KR20150017015A (en) Method and device for sharing a image card
CN111367456A (en) Communication terminal and display method in multi-window mode
CN112835472B (en) Communication terminal and display method
US20140068519A1 (en) Phonebook provision method and apparatus
CN112153218B (en) Page display method and device, wearable device and storage medium
CN111857531A (en) Mobile terminal and file display method thereof
CN112612386A (en) Mobile terminal and display method of application card thereof
US20130219309A1 (en) Task performing method, system and computer-readable recording medium
CN113572893B (en) Terminal device, emotion feedback method and storage medium
CN113709026B (en) Method, device, storage medium and program product for processing instant communication message
CN112000408B (en) Mobile terminal and display method thereof
CN114035870A (en) Terminal device, application resource control method and storage medium
US11670340B2 (en) Prerecorded video experience container
CN114371895B (en) Terminal equipment, mail marking method and storage medium
CN111324255B (en) Application processing method based on double-screen terminal and communication terminal
CN114594894A (en) Interface element marking method, terminal device and storage medium
CN113760164A (en) Display device and response method of control operation thereof
CN113254132A (en) Application display method and related device
CN112363653A (en) Ink screen display method and terminal
CN114063459A (en) Terminal and intelligent household control method
CN114998067B (en) Study plan recommending method and electronic equipment
CN112000409B (en) Mobile terminal and display method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 266071 Shandong city of Qingdao province Jiangxi City Road No. 11

Patentee after: Qingdao Hisense Mobile Communication Technology Co.,Ltd.

Address before: 266071 Shandong city of Qingdao province Jiangxi City Road No. 11

Patentee before: HISENSE MOBILE COMMUNICATIONS TECHNOLOGY Co.,Ltd.