WO2020192761A1 - Method for recording user emotion and related apparatus - Google Patents

Method for recording user emotion and related apparatus

Info

Publication number
WO2020192761A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
emotional
image
user interface
Prior art date
Application number
PCT/CN2020/081666
Other languages
English (en)
Chinese (zh)
Inventor
相超
龚阿世
张宇涵
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2020192761A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G06F16/53 Querying
    • G06F16/538 Presentation of query results
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using manually generated metadata, e.g. tags, keywords, comments, manually generated location and time information

Definitions

  • This application relates to the technical field of emotion recognition and image processing, and particularly to a method for recording user emotions and related devices.
  • At present, when a user uses a terminal, such as a mobile phone, to capture an image, the terminal only records the visual result of imaging the object and pays no attention to the photographer's emotion.
  • This application provides a method and related devices for recording user emotions.
  • Electronic equipment can obtain the user's current emotional state, and store the image viewed by the user in association with the user's current emotional state.
  • the present application provides a method for recording user emotions.
  • The method includes: an electronic device displays a first user interface, and a first image is displayed in the first user interface; the electronic device recognizes that the user is currently in a first emotional state; in response to a detected first operation, the electronic device stores a second image in association with the first emotional state, or stores the second image in association with a first emotional element; the first emotional element includes one or more emotional elements corresponding to the first emotional state and reflects the first emotional state; the second image is the same as the first image.
  • In a shooting scene, the first user interface may be a user interface provided by a camera application installed on the electronic device, the first image may be a preview image obtained by a camera of the electronic device, and the second image may be a captured image.
  • the method of the first aspect can also be applied to the scene of viewing pictures.
  • In the picture-viewing scene, the picture may come from a server or from the local end of the electronic device.
  • In the method of the first aspect, the electronic device can recognize the user's emotional state in the shooting or picture-viewing scene and, according to the user's needs, store the captured/viewed image in association with the recognized current emotional state, or store it in association with the emotional element corresponding to that emotional state (see the sketch below).
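  • The following is a minimal Kotlin sketch of such associated storage; the type and method names are hypothetical, since the application does not prescribe any particular implementation.

```kotlin
import java.io.File

// Emotional states named in this application (illustrative subset).
enum class EmotionalState { CALM, ANGER, DISGUST, SADNESS, PLEASURE, ROMANCE, HAPPINESS, FEAR }

// A record pairing a saved image with the emotion recognized when it was acquired.
data class EmotionRecord(val imagePath: String, val state: EmotionalState)

// Minimal "associated storage": the image file is written out and the emotion
// is kept in an index keyed by the image path (a real device might use a
// database or image metadata instead).
class EmotionStore {
    private val index = mutableMapOf<String, EmotionalState>()

    fun saveWithEmotion(image: ByteArray, path: String, state: EmotionalState): EmotionRecord {
        File(path).writeBytes(image)   // persist the captured (second) image
        index[path] = state            // persist the image-emotion association
        return EmotionRecord(path, state)
    }

    fun emotionOf(path: String): EmotionalState? = index[path]
}
```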
  • the present application provides a method for recording user emotions.
  • The method includes: an electronic device displays a first user interface, and a first image is displayed in the first user interface; the electronic device recognizes that the user is currently in a first emotional state; in response to a detected first operation, the electronic device stores a second image in association with the first emotional state, or stores the second image in association with a first emotional element; the first emotional element includes one or more emotional elements corresponding to the first emotional state and reflects the first emotional state; the first image is a preview image before shooting, and the second image is a captured image.
  • In this shooting scene, the first user interface may be a user interface provided by a camera application installed on the electronic device, the first image may be a preview image obtained by a camera of the electronic device, and the second image may be a captured image. Since previewing and capturing may not happen at the same moment, the preview image the user sees in the viewfinder and the final captured image may be the same or different.
  • In the method of the second aspect, the electronic device can recognize the user's emotional state in the shooting scene and, according to the user's needs, store the captured image in association with the recognized current emotional state, or store it in association with the emotional element corresponding to that emotional state.
  • In the embodiments of this application, the user's emotional state refers to the user's mood, emotion, or mental state, such as calm, anger, disgust, sadness, pleasure, romance, happiness, and fear. The emotional state is not limited to those listed above; in specific implementations of this application there may also be other emotional states, such as joy, anger, sorrow, and delight, which this application does not limit.
  • In some embodiments, the one or more emotional elements corresponding to the first emotional state include: text information, pictures, music, image composition, image light and shadow effects, image saturation, image hue, or image color.
  • In some embodiments, the first user interface includes a first interactive element used to monitor the operation of saving the second image; the first operation includes an operation, detected by the electronic device, that acts on the first interactive element.
  • the first interactive element may be a shooting control in the user interface provided by the camera application.
  • In some embodiments, the electronic device may store the second image and the first emotional state in association in a local terminal or on a cloud server, or the electronic device may store the second image and the first emotional element in association in a local terminal or on a cloud server.
  • In some embodiments, the electronic device may store the second image in different regions according to the recognized emotional state of the user, as in the sketch below. For example, when the recognized first emotional state is "pleasure", the electronic device may store the second image and the emotional state "pleasure" in association in a first area; when the recognized first emotional state is "romance", the electronic device may store the second image and the emotional state "romance" in association in a second area; when the recognized first emotional state is "calm", the electronic device may store the second image and the emotional state "calm" in association in a third area.
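  • A sketch of such region-based storage, reusing the EmotionalState type from the sketch above (the directory layout is hypothetical):

```kotlin
// Map each recognized emotional state to its own album directory ("region").
fun albumDirFor(state: EmotionalState): String = when (state) {
    EmotionalState.PLEASURE -> "albums/pleasure"   // first area
    EmotionalState.ROMANCE  -> "albums/romance"    // second area
    EmotionalState.CALM     -> "albums/calm"       // third area
    else                    -> "albums/" + state.name.lowercase()
}
```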
  • In some embodiments, the electronic device can also reproduce user emotions according to user needs. Specifically, after the electronic device stores the second image in association with the first emotional state, or in association with the first emotional element, the method may further include: the electronic device displays a second user interface in which one or more third images are displayed, the third images including the second image; in response to a detected second operation of viewing the second image in the second user interface, the electronic device displays a third user interface and presents a second emotional element; the third user interface includes the second image; the second emotional element includes one or more emotional elements corresponding to the first emotional state stored in association with the second image, or the second emotional element includes the first emotional element.
  • The presented second emotional element may be any one of the one or more emotional elements corresponding to the emotional state stored in association with the second image, or one of them chosen by default by the electronic device; alternatively, the second emotional element may be any one of the one or more emotional elements stored in association with the second image, or one of them chosen by default by the electronic device. A sketch follows.
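  • A minimal sketch of this reproduction step, reusing EmotionStore from the first sketch (elementsFor is a hypothetical lookup of the elements corresponding to a state):

```kotlin
// When the user views a stored image again, look up the associated emotional
// state and pick one of its emotional elements to present alongside the image.
fun reproduceEmotion(
    store: EmotionStore,
    path: String,
    elementsFor: (EmotionalState) -> List<String>
): String? {
    val state = store.emotionOf(path) ?: return null   // no stored association
    return elementsFor(state).randomOrNull()           // any one element (or none)
}
```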
  • the electronic device can reproduce the user's emotions when needed, so that the user can resonate emotionally and improve the user experience.
  • In some embodiments, the electronic device can display the second user interface in either of the following two situations; that is, before the electronic device displays the second user interface, the method further includes either of the following:
  • Situation 1: the electronic device may display a fourth user interface in which one or more album entries are displayed, the one or more album entries including a first album entry that corresponds to the first emotional state; the album corresponding to the first album entry includes images saved by the electronic device when the user was in the first emotional state; in response to a detected third operation acting on the first album entry, the electronic device displays the second user interface.
  • Situation 2: the electronic device may display a fourth user interface in which one or more album entries are displayed, the one or more album entries including a first album entry that corresponds to the first emotional state; the album corresponding to the first album entry includes images saved by the electronic device when the user was in the first emotional state; the electronic device recognizes a second emotional state that the user is currently in, and displays a fifth user interface; the one or more album entries displayed in the fifth user interface are determined by the electronic device, in accordance with a preset recommendation strategy and based on the second emotional state, from the album entries displayed in the fourth user interface, and include the first album entry; in response to a detected third operation acting on the first album entry in the fifth user interface, the electronic device displays the second user interface.
  • In some embodiments, the preset recommendation strategy may include: 1. When the user's current emotional state is "sadness", push to the user images obtained when the user's emotional state was "pleasure" or "romance"; such a push strategy can help dispel the user's bad mood. 2. When the user's current emotional state is "calm", push images preset by the user or by the electronic device, for example images acquired when the user's emotional state was "pleasure" or "happiness". A sketch follows.
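  • A sketch of this preset recommendation strategy (the two rules mirror the examples above; the fallback is a hypothetical default):

```kotlin
// Given the user's current emotional state, choose which emotions' albums to push.
fun recommendedAlbums(current: EmotionalState): List<EmotionalState> = when (current) {
    EmotionalState.SADNESS -> listOf(EmotionalState.PLEASURE, EmotionalState.ROMANCE)
    EmotionalState.CALM    -> listOf(EmotionalState.PLEASURE, EmotionalState.HAPPINESS)
    else                   -> listOf(current)   // hypothetical default: matching album
}
```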
  • the one or more album entries displayed in the fourth user interface include: one or more album entries respectively corresponding to different emotional states.
  • In combination with the first or second aspect, in some embodiments the electronic device may present to the user the emotional elements corresponding to the user's emotional state before saving the second image, so that the user can preview the emotional elements corresponding to the first emotional state before the second image is saved.
  • Specifically, the method may further include: the electronic device presents one or more emotional elements corresponding to the first emotional state.
  • one or more emotional elements corresponding to the first emotional state presented by the electronic device are selected by the user, or selected by the electronic device by default, or randomly selected by the electronic device.
  • one or more emotional elements corresponding to the first emotional state are set by the electronic device by default, or set independently by the user.
  • In combination with the first or second aspect, in some embodiments, before the electronic device recognizes that the user is currently in the first emotional state, the method further includes: the electronic device detects a fourth operation for starting the emotion recognition service and, in response to the fourth operation, identifies the user's current emotional state; the emotion recognition service is used by the electronic device to collect the user's behavior data through its own configured hardware devices or by starting hardware devices of other equipment connected to the electronic device, and to identify the user's current emotional state according to that behavior data.
  • the fourth operation includes an operation detected by the electronic device that acts on a second interaction element in the first user interface, and the second interaction element is used to monitor the operation of starting the emotion recognition service.
  • the present application provides an electronic device that includes one or more processors, a memory, and a display screen; the memory is coupled with the one or more processors, and the memory is used to store computer program codes,
  • the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute:
  • A first user interface is displayed on the display screen, and a first image is displayed in the first user interface; it is recognized that the user is currently in a first emotional state; in response to a detected first operation, a second image is stored in association with the first emotional state, or the second image is stored in association with a first emotional element; the first emotional element includes one or more emotional elements corresponding to the first emotional state and reflects the first emotional state; the second image is the same as the first image.
  • the electronic device of the third aspect can be applied to shooting scenes.
  • In such a shooting scene, the first user interface may be a user interface provided by a camera application installed on the electronic device, the first image may be a preview image obtained by a camera of the electronic device, and the second image may be a captured image.
  • the electronic device of the third aspect can also be applied to the scene of viewing pictures.
  • In the picture-viewing scene, the picture may come from a server or from the local end of the electronic device.
  • The electronic device of the third aspect can recognize the user's emotional state in the shooting or picture-viewing scene and, according to the user's needs, store the captured/viewed image in association with the recognized current emotional state, or store it in association with the emotional element corresponding to that emotional state.
  • the present application provides an electronic device that includes one or more processors, a memory, and a display screen; the memory is coupled with the one or more processors, and the memory is used to store computer program codes,
  • the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute:
  • A first user interface is displayed on the display screen, and a first image is displayed in the first user interface; it is recognized that the user is currently in a first emotional state; in response to a detected first operation, a second image is stored in association with the first emotional state, or the second image is stored in association with a first emotional element; the first emotional element includes one or more emotional elements corresponding to the first emotional state and reflects the first emotional state; the first image is a preview image before shooting, and the second image is a captured image.
  • the electronic device of the fourth aspect can be applied to shooting scenes.
  • In this shooting scene, the first user interface may be a user interface provided by a camera application installed on the electronic device, the first image may be a preview image obtained by a camera of the electronic device, and the second image may be a captured image. Since previewing and capturing may not happen at the same moment, the preview image the user sees in the viewfinder and the final captured image may be the same or different.
  • The electronic device of the fourth aspect can recognize the user's emotional state in the shooting scene and, according to the user's needs, store the captured image in association with the recognized current emotional state, or store it in association with the emotional element corresponding to the current emotional state.
  • In the embodiments of this application, the user's emotional state refers to the user's mood, emotion, or mental state, such as calm, anger, disgust, sadness, pleasure, romance, happiness, and fear. The emotional state is not limited to those listed above; in specific implementations of this application there may also be other emotional states, such as joy, anger, sorrow, and delight, which this application does not limit.
  • In some embodiments, the one or more emotional elements corresponding to the first emotional state include: text information, pictures, music, image composition, image light and shadow effects, image saturation, image hue, or image color.
  • In some embodiments, the first user interface includes a first interactive element used to monitor the operation of saving the second image; the first operation includes an operation, detected by the electronic device, that acts on the first interactive element.
  • the first interactive element may be a shooting control in the user interface provided by the camera application.
  • In some embodiments, the electronic device may store the second image and the first emotional state in association in a local terminal or on a cloud server, or the electronic device may store the second image and the first emotional element in association in a local terminal or on a cloud server.
  • In some embodiments, the electronic device may store the second image in different regions according to the recognized emotional state of the user. For example, when the recognized first emotional state is "pleasure", the electronic device may store the second image and the emotional state "pleasure" in association in a first area; when the recognized first emotional state is "romance", the electronic device may store the second image and the emotional state "romance" in association in a second area; when the recognized first emotional state is "calm", the electronic device may store the second image and the emotional state "calm" in association in a third area.
  • the electronic device can also reproduce user emotions according to user needs.
  • the one or more processors are also used to call the computer instructions to make the electronic device execute:
  • A second user interface is displayed on the display screen, and one or more third images are displayed in the second user interface, the third images including the second image;
  • in response to a detected second operation of viewing the second image in the second user interface, a third user interface is displayed on the display screen and a second emotional element is presented; the third user interface includes the second image; the second emotional element includes one or more emotional elements corresponding to the first emotional state stored in association with the second image, or the second emotional element includes the first emotional element.
  • The presented second emotional element may be any one of the one or more emotional elements corresponding to the emotional state stored in association with the second image, or one of them chosen by default by the electronic device; alternatively, the second emotional element may be any one of the one or more emotional elements stored in association with the second image, or one of them chosen by default by the electronic device.
  • the electronic device can reproduce the user's emotions when needed, so that the user can resonate emotionally and improve the user experience.
  • In some embodiments, the electronic device can display the second user interface in either of the following two situations; that is, the one or more processors are also used to invoke the computer instructions to cause the electronic device to perform either of the following:
  • Situation 1: a fourth user interface is displayed on the display screen, in which one or more album entries are displayed, the one or more album entries including a first album entry that corresponds to the first emotional state; the album corresponding to the first album entry includes images saved by the electronic device when the user was in the first emotional state; in response to a detected third operation acting on the first album entry, the second user interface is displayed on the display screen.
  • Situation 2: a fourth user interface is displayed on the display screen, in which one or more album entries are displayed, the one or more album entries including a first album entry that corresponds to the first emotional state; the album corresponding to the first album entry includes images saved by the electronic device when the user was in the first emotional state; a second emotional state that the user is currently in is recognized, and a fifth user interface is displayed on the display screen; the one or more album entries displayed in the fifth user interface are determined by the electronic device, in accordance with a preset recommendation strategy and based on the second emotional state, from the album entries displayed in the fourth user interface, and include the first album entry; in response to a detected third operation acting on the first album entry in the fifth user interface, the electronic device displays the second user interface.
  • In some embodiments, the preset recommendation strategy may include: 1. When the user's current emotional state is "sadness", push to the user images obtained when the user's emotional state was "pleasure" or "romance"; such a push strategy can help dispel the user's bad mood. 2. When the user's current emotional state is "calm", push images preset by the user or by the electronic device, for example images acquired when the user's emotional state was "pleasure" or "happiness".
  • the one or more album entries displayed in the fourth user interface include: one or more album entries respectively corresponding to different emotional states.
  • In combination with the third or fourth aspect, in some embodiments the electronic device may present to the user the emotional elements corresponding to the user's emotional state before saving the second image, so that the user can preview the emotional elements corresponding to the first emotional state before the second image is saved.
  • Specifically, the one or more processors are further used to invoke the computer instructions to cause the electronic device to present, before the second image is saved, one or more emotional elements corresponding to the first emotional state.
  • the one or more emotional elements corresponding to the first emotional state presented by the electronic device are selected by the user, or selected by the electronic device by default, or randomly selected by the electronic device of.
  • the one or more processors are further used to invoke the computer instructions to cause the electronic device to execute:
  • A fourth operation for starting the emotion recognition service is detected and, in response to the fourth operation, the user's current emotional state is recognized; the emotion recognition service is used by the electronic device to collect the user's behavior data through its own configured hardware devices or by starting hardware devices of other equipment connected to the electronic device, and to identify the user's current emotional state according to that behavior data.
  • the fourth operation includes an operation detected by the electronic device that acts on a second interaction element in the first user interface, and the second interaction element is used to monitor the operation of starting the emotion recognition service.
  • the present application provides a computer program product containing instructions.
  • When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in the first aspect and any possible implementation of the first aspect.
  • the present application provides a computer program product containing instructions.
  • When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in the second aspect and any possible implementation of the second aspect.
  • The present application provides a computer-readable storage medium, including instructions that, when run on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation of the first aspect.
  • The present application provides a computer-readable storage medium, including instructions that, when run on an electronic device, cause the electronic device to perform the method described in the second aspect and any possible implementation of the second aspect.
  • The electronic device can record or save the user's emotions when acquiring images, achieving the goal of paying attention to the user's emotions, reproducing the user's emotions when needed, and improving the user's experience of using the electronic device.
  • FIG. 1A is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 1B is a software structure block diagram of an electronic device provided by an embodiment of the present application.
  • FIGS. 2A-2B are schematic diagrams of human-computer interaction with "emotion recognition” enabled according to an embodiment of this application;
  • FIGS. 3A-3H are schematic diagrams of human-computer interaction for recording user emotions in a shooting scene provided by an embodiment of the application;
  • FIGS. 4A-4H are schematic diagrams of human-computer interaction for recording user emotions in a picture-viewing scene provided by an embodiment of the application;
  • FIGS. 5A-5K and 6A-6B are schematic diagrams of human-computer interaction for reproducing user emotions provided by an embodiment of this application;
  • FIGS. 7A-7D are schematic diagrams of human-computer interaction for the user to set emotional elements according to embodiments of the application.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, "plurality" means two or more.
  • The following embodiments of the application provide a method for recording user emotions and related devices, so that electronic equipment can record or save user emotions when acquiring images, achieving the purpose of paying attention to user emotions and reproducing user emotions when needed, and improving the user experience of electronic equipment.
  • In the embodiments of this application, with "emotion recognition" turned on, when the electronic device recognizes a scene in which the user views an image, the electronic device can obtain the user's current emotional state and store the image in association with that emotional state. If the user views the image again, the electronic device can present the image while presenting the emotional elements corresponding to the emotional state stored in association with the image; that is, the electronic device can restore the user's emotions, resonate with the user, and give the user a better experience.
  • For emotional states and emotional elements, refer to the related descriptions in subsequent embodiments.
  • In the embodiments of this application, the image viewed by the user may include: an image obtained by the camera of the electronic device (for example, the image shown in the viewfinder of a shooting interface provided by a camera application), an image that the electronic device obtains from a network or another device and stores locally, an image stored on a cloud device accessed by the electronic device, and so on.
  • the image can be a static picture, a dynamic picture, a video, etc.
  • emotional recognition may be a service or function provided by an electronic device, which may support the electronic device to obtain the user's emotional state.
  • The emotional state of the user refers to the user's mood, emotion, or mental state, such as calm, anger, disgust, sadness, pleasure, romance, happiness, and fear. It is not limited to the emotional states listed above; in specific implementations of this application there may also be other emotional states, such as joy, anger, sorrow, and delight, which this application does not limit.
  • "emotion recognition” can support the electronic device to analyze the user's behavior data through artificial intelligence (AI) technology, so as to recognize the user's emotional state.
  • Enabling "emotion recognition" may include enabling a hardware device of the electronic device and/or a hardware device of other equipment connected to the electronic device to collect the user's behavior data, and identifying the user's emotional state based on that behavior data.
  • The user behavior data collected by these hardware devices may include: 1. Non-physiological data, for example the face, facial expressions, and actions collected by the camera 193, the voice collected by the microphone 170C, the ambient light collected by the ambient light sensor 180L, and the user's typing speed and grammar collected by the touch screen 194. 2. Physiological data, such as the heart rate, breathing, and skin impedance mentioned in the example below.
  • The electronic device can analyze the user's emotional state through AI technology based on the collected user behavior data, as in the sketch below. For example, when the user's mouth corners turn up, the corners of the eyes tilt slightly, the heart rate is steady, breathing is relaxed, and skin impedance is high, the user is in a happy state; when the user shows a panicked expression, an accelerated heart rate, shortness of breath, and low skin impedance, the user is in a state of panic and fear; when the user's pupils dilate and the hands clench into fists, the user is in a state of anger.
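  • A rule-of-thumb sketch of the mapping just described (feature names and thresholds are hypothetical; a real implementation would use a trained model, as discussed below):

```kotlin
// Map behavior features (normalized 0.0-1.0, except heartRate in bpm) to an
// emotional state, following the three examples in the text.
fun classifyBehavior(f: Map<String, Double>): EmotionalState {
    val mouthUpturn = f["mouthUpturn"] ?: 0.0
    val skinImpedance = f["skinImpedance"] ?: 0.5
    val heartRate = f["heartRate"] ?: 60.0
    val pupilDilation = f["pupilDilation"] ?: 0.0
    val fistClench = f["fistClench"] ?: 0.0
    return when {
        mouthUpturn > 0.5 && skinImpedance > 0.7 -> EmotionalState.HAPPINESS
        heartRate > 110 && skinImpedance < 0.3   -> EmotionalState.FEAR
        pupilDilation > 0.6 && fistClench > 0.5  -> EmotionalState.ANGER
        else                                     -> EmotionalState.CALM   // fallback
    }
}
```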
  • In some embodiments, "emotion recognition" can support the electronic device in analyzing the user's behavior data through AI technology and matching, among the user's possible emotional states, the emotional state that is closest to the user's actual current state, that is, recognizing the user's current emotional state.
  • the possible emotional state of the user refers to the emotional state that matches the personality of the user.
  • the possible emotional states of an optimistic and cheerful user may include pleasure, romance, and happiness
  • the possible emotional states of a cautious and melancholic user may include calm, sadness, etc.
  • the user's personality can be obtained by analyzing the user's past behavior data by the electronic device through AI technology.
  • the possible emotional states of users with different personalities can be pre-stored in an electronic device or a cloud server.
  • the AI technology mentioned in the embodiment of the present application may be a machine learning algorithm.
  • The machine learning algorithm may be a deep learning algorithm, and may include one or more of the following: a convolutional neural network (CNN), a k-nearest-neighbor (KNN) classification algorithm, a recurrent neural network (RNN), statistical algorithms, and so on. A toy example of the KNN family follows.
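  • A toy k-nearest-neighbor classifier over behavior feature vectors, as one of the algorithm families named above (the features, labels, and k are hypothetical):

```kotlin
import kotlin.math.sqrt

// Classify a query feature vector by majority vote among its k nearest
// labeled training vectors (Euclidean distance). Assumes train is non-empty.
fun knnClassify(
    train: List<Pair<DoubleArray, EmotionalState>>,
    query: DoubleArray,
    k: Int = 3
): EmotionalState {
    fun dist(a: DoubleArray, b: DoubleArray): Double =
        sqrt(a.indices.sumOf { val d = a[it] - b[it]; d * d })
    return train.sortedBy { dist(it.first, query) }
        .take(k)
        .groupingBy { it.second }
        .eachCount()
        .maxByOrNull { it.value }!!.key
}
```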
  • In other embodiments, "emotion recognition" can support the electronic device in receiving indication information of an emotional state that the user inputs or selects. Enabling "emotion recognition" may include the electronic device receiving the indication information of the emotional state directly input or selected by the user, so as to recognize the user's current emotional state according to that indication information.
  • For example, the user may input or select the indication information of the emotional state on the touch screen 194, by voice through the microphone 170C, or by shaking or gestures.
  • the indication information of the emotional state may be an icon, text, or voice.
  • an emotional element refers to an element that can reflect the emotional state of the user.
  • Emotional elements may include, but are not limited to: text information, pictures (such as emoticons), music, image composition, light and shadow effects, saturation, hue, and color.
  • the composition method of the image may include nine square grid composition method, guide line composition method, diagonal composition method, three-part composition method, blank composition method, foreground/background blur composition method, etc.
  • the light and shadow effects of the image can include strong light and shadow, grayed light and shadow, and so on.
  • the saturation may include high saturation, low saturation, and so on.
  • Hues may include cool colors, warm colors, neutral colors, and so on.
  • the color can include bright, dim, gray, normal, etc.
  • Different emotional elements can give people different feelings, that is, different emotional elements can correspond to different emotional states.
  • For example, when the composition of an image is a three-part (rule-of-thirds) composition (that is, the picture is divided by the golden section) and the image is colorful, it can enhance visual pleasure and vividness, which can correspond to the emotional state of "happiness".
  • When the composition of an image is a blank-space composition with grayed light and shadow and dim colors, it can give people a sense of loneliness, which can correspond to the emotional state of "sadness" (see the sketch below). Understandably, when the user sees these emotional elements, he or she can intuitively feel the emotional state corresponding to them.
  • one emotional state may correspond to multiple emotional elements.
  • the determination method or the setting method of the emotional elements corresponding to different emotional states can refer to the related descriptions in the subsequent embodiments, which will not be repeated here.
  • the emotional elements corresponding to different emotional states may be stored locally on the electronic device, or may also be stored in a cloud server, which is not limited in this application.
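  • As an illustration of such a stored correspondence between states and elements, here is a hypothetical Kotlin sketch (the element values are examples, not values defined by this application):

```kotlin
// Presentation elements that reflect an emotional state: composition, light and
// shadow, hue, and optional music or text.
data class EmotionalElements(
    val composition: String,
    val lightAndShadow: String,
    val hue: String,
    val musicUri: String? = null,
    val text: String? = null
)

// One emotional state may correspond to multiple elements; this table keeps a
// representative set per state.
val elementTable: Map<EmotionalState, EmotionalElements> = mapOf(
    EmotionalState.HAPPINESS to EmotionalElements(
        composition = "rule of thirds", lightAndShadow = "strong", hue = "warm",
        musicUri = "music/upbeat.mp3"),
    EmotionalState.SADNESS to EmotionalElements(
        composition = "blank space", lightAndShadow = "grayed", hue = "cool",
        text = "a quiet, lonely scene")
)
```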
  • the electronic devices can be mobile phones, tablets, personal digital assistants (personal digital assistants, PDAs), wearable devices, laptops and other portable electronic devices.
  • Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices equipped with iOS, Android, Microsoft, or other operating systems.
  • the aforementioned portable electronic device may also be other portable electronic devices, such as a laptop computer with a touch-sensitive surface (such as a touch panel).
  • the electronic device may not be a portable electronic device, but a desktop computer with a touch-sensitive surface (such as a touch panel), or a smart TV.
  • FIG. 1A shows a schematic structural diagram of an exemplary electronic device 100 provided by an embodiment of the present application.
  • FIG. 1A shows a schematic diagram of the structure of an electronic device 100.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the processor 110 may recognize the user's emotional state according to the acquired user behavior data.
  • the specific identification method can refer to the previous description.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
  • the I2C interface is a two-way synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to realize communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and so on.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 100 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • the camera 193 can be used to collect the user's face, facial expressions, and actions for the electronic device to recognize the user's emotional state.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent cognition of the electronic device 100, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • The storage program area can store an operating system and an application program required by at least one function (such as a sound playback function or an image playback function).
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the internal memory 121 may be used to store one or more emotional elements corresponding to different emotional states.
  • the internal memory 121 may also be used to store the image acquired by the electronic device 100 and the emotional state of the user when the image is acquired.
  • the internal memory 121 may also be used to associate and store the image obtained by the electronic device 100 and the emotional element corresponding to the user's emotional state when the image is obtained.
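  • As a rough, non-limiting illustration of this kind of association storage, the following minimal Java sketch keeps a record linking an image to the recognized emotional state and any emotional elements; all class, field, and method names here are hypothetical and not part of the original disclosure:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical record linking a stored image to the user's emotional state
// and the emotional elements presented when the image was acquired.
class EmotionRecord {
    final String imagePath;      // e.g. "/sdcard/DCIM/IMG_0001.jpg"
    final String emotionalState; // e.g. "happy", "romantic", "calm"
    final List<String> emotionalElements = new ArrayList<>();

    EmotionRecord(String imagePath, String emotionalState) {
        this.imagePath = imagePath;
        this.emotionalState = emotionalState;
    }
}

// Hypothetical in-memory store standing in for the internal memory 121.
class EmotionStore {
    private final Map<String, EmotionRecord> records = new HashMap<>();

    void save(EmotionRecord record) {
        records.put(record.imagePath, record);
    }

    EmotionRecord lookup(String imagePath) {
        return records.get(imagePath);
    }
}
```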
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the speaker 170A can be used to output audio-like emotional elements (such as background music).
  • the speaker 170A may also be used to output voice prompt information, for example, a voice prompting the user to turn on "emotion recognition".
  • the receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or receives a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
  • the microphone 170C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • when making a sound, the user can bring the mouth close to the microphone 170C to input the sound signal into it.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch location but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
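  • A minimal sketch of such pressure-dependent dispatch might look as follows; the threshold value and handler names are assumptions for illustration only:

```java
// Hypothetical dispatcher: the same touch location triggers different
// instructions depending on the measured touch pressure.
class ShortMessageIconHandler {
    private static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // assumed value and units

    void onTouch(float pressure) {
        if (pressure < FIRST_PRESSURE_THRESHOLD) {
            viewShortMessages();   // light touch: view the short message
        } else {
            createShortMessage();  // firm touch: create a new short message
        }
    }

    private void viewShortMessages() { /* open the message list */ }
    private void createShortMessage() { /* open the compose screen */ }
}
```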
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • when the electronic device 100 is a flip phone, it can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and accordingly set features such as automatic unlocking of the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather-case mode and pocket mode to automatically unlock and lock the screen.
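  • A minimal sketch of the proximity decision described above, assuming an illustrative reflection threshold (not a value from this application):

```java
// Hypothetical proximity decision: enough reflected infrared light means an
// object is near; too little means no object is near.
class ProximityDetector {
    private static final float REFLECTION_THRESHOLD = 0.3f; // assumed normalized threshold

    boolean objectNearby(float reflectedLight) {
        return reflectedLight >= REFLECTION_THRESHOLD;
    }
}
```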
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, etc.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 executes to reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in another example, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature.
  • when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
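  • The temperature processing strategy could be sketched roughly as follows; the temperature thresholds and method names are illustrative assumptions, not values from this application:

```java
// Hypothetical thermal-policy sketch: reduce performance when hot,
// protect the battery when cold.
class ThermalPolicy {
    private static final float HIGH_TEMP_C = 45.0f; // assumed threshold
    private static final float LOW_TEMP_C  = 0.0f;  // assumed threshold

    void onTemperature(float celsius) {
        if (celsius > HIGH_TEMP_C) {
            throttleNearbyProcessor();   // lower performance to reduce power and heat
        } else if (celsius < LOW_TEMP_C) {
            heatBattery();               // avoid low-temperature abnormal shutdown
            boostBatteryOutputVoltage(); // likewise, per the behavior described above
        }
    }

    private void throttleNearbyProcessor() { }
    private void heatBattery() { }
    private void boostBatteryOutputVoltage() { }
}
```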
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is called a "touch screen".
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone of the human vocal part.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power button, a volume button, and so on.
  • the button 190 may be a mechanical button or a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 100.
  • FIG. 1B is a software structure block diagram of an electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text and controls that display pictures.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scrolling text (such as notifications of applications running in the background), or notifications that appear on the screen in the form of a dialog window. For example, text messages are prompted in the status bar, a prompt sound is played, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • when the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, time stamps of touch operations, etc.).
  • the original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation, and the control corresponding to the click operation is the control of the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • the camera 193 captures still images or videos.
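  • A simplified sketch of this event path, with hypothetical class names standing in for the kernel-layer raw input event and the framework-layer dispatch (not the actual Android implementation):

```java
// Hypothetical raw input event produced by the kernel layer.
class RawInputEvent {
    final float x, y;     // touch coordinates
    final long timestamp; // time stamp of the touch operation

    RawInputEvent(float x, float y, long timestamp) {
        this.x = x;
        this.y = y;
        this.timestamp = timestamp;
    }
}

// Hypothetical framework-layer dispatcher: identifies the control hit by the
// event and, if it is the camera application icon, starts the camera app.
class InputDispatcher {
    void dispatch(RawInputEvent event) {
        if (hitsCameraIcon(event.x, event.y)) {
            startCameraApplication(); // the framework then asks the kernel layer to start the camera driver
        }
    }

    private boolean hitsCameraIcon(float x, float y) {
        return true; // placeholder hit test against the icon's bounds
    }

    private void startCameraApplication() { /* launch the camera application */ }
}
```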
  • the following describes an exemplary graphical user interface on the electronic device 100 for displaying the application programs installed by the electronic device 100.
  • FIG. 2A exemplarily shows an exemplary user interface 21 on the electronic device 100 for displaying application programs installed by the electronic device 100.
  • the user interface 21 may include: a status bar 201, a calendar indicator 202, a weather indicator 203, a tray 204 with commonly used application icons, a navigation bar 205, and icons of other applications. Among them:
  • the status bar 201 may include: one or more signal strength indicators 201A of mobile communication signals (also called cellular signals), the name of the operator (for example, "China Mobile") 201B, one or more signal strength indicators 201C of wireless fidelity (Wi-Fi) signals, a battery status indicator 201D, and a time indicator 201E.
  • the calendar indicator 202 can be used to indicate the current time, such as date, day of the week, hour and minute information, etc.
  • the weather indicator 203 can be used to indicate the type of weather, such as cloudy to clear, light rain, etc., and can also be used to indicate information such as temperature.
  • the tray 204 with icons of commonly used application programs can display: a phone icon 204A, a contact icon 204B, a short message icon 204C, and a camera icon 204D.
  • the navigation bar 205 may include system navigation keys such as a return key 205A, a home screen key 205B, and a multi-task key 205C.
  • when it is detected that the user clicks the return key 205A, the electronic device 100 may display the previous page of the current page.
  • when it is detected that the user clicks the home screen key 205B, the electronic device 100 may display the home interface.
  • when it is detected that the user clicks the multi-task key 205C, the electronic device 100 may display the tasks recently opened by the user.
  • each navigation key may also have another name, which is not limited in this application. The navigation keys are not limited to virtual keys; each navigation key in the navigation bar 205 may also be implemented as a physical key.
  • other application icons may be, for example: the WeChat icon 206, the QQ icon 207, the Twitter icon 208, the Facebook icon 209, the mailbox icon 210, the cloud sharing icon 211, the memo icon 212, the Alipay icon 213, the gallery icon 214, and the settings icon 215.
  • the user interface 21 may also include a page indicator 216.
  • Other application program icons may be distributed on multiple pages, and the page indicator 216 may be used to indicate which application program in which page the user is currently browsing. The user can slide the area of other application icons left and right to browse application icons in other pages.
  • the user interface 21 exemplarily shown in FIG. 2A may be a home screen.
  • the electronic device 100 may also include a physical home screen key.
  • the home screen key can be used to receive instructions from the user and return the currently displayed UI to the main interface, so that the user can view the home screen at any time.
  • the above instruction may be an operation in which the user presses the home screen key once, presses it twice within a short period of time, or presses and holds it for a predetermined time.
  • the home screen key can also be integrated with a fingerprint recognizer, so that when the home screen key is pressed, fingerprints are collected and recognized.
  • FIG. 2A only exemplarily shows the user interface on the electronic device 100, and should not constitute a limitation to the embodiment of the present application.
  • FIGS. 2A and 2B exemplarily show an operation of turning on "emotion recognition" on the electronic device 100.
  • the electronic device 100 may display a window 217 on the user interface 21.
  • the switch control 217A of "emotion recognition” can be displayed in the window 217, and the switch control of other functions (such as Wi-Fi, Bluetooth, flashlight, etc.) can also be displayed.
  • in response to a detected user operation on the switch control 217A, the electronic device 100 can turn on "emotion recognition".
  • the user can make a downward sliding gesture on the status bar 201 to open the window 217, and can click the switch control 217A of "emotion recognition” in the window 217 to conveniently open the "emotion recognition".
  • the expression form of the switch control 217A of "emotion recognition” can be text information or icons.
  • the user can also turn on "emotion recognition" when viewing an image (such as the image in the viewfinder provided by the camera application, an image stored on the electronic device, or an image on a cloud server accessed by the electronic device 100), which will be described in detail in the subsequent embodiments and will not be repeated here.
  • the electronic device 100 may also automatically turn on "emotion recognition", for example, automatically turn on "emotion recognition” when recognizing a scene in which a user views an image.
  • the user can turn on the "emotion recognition” function from the setting options of the electronic device.
  • the electronic device may also display a prompt message that "emotion recognition” has been turned on in the status bar 201.
  • for example, the icon of "emotion recognition" is displayed in the status bar 201, or the text "emotion recognition" is directly displayed there.
  • the user can preview the image collected by the camera from the viewfinder when taking a photo, and can choose to capture the previewed image in the current viewfinder, that is, save the image.
  • the electronic device can recognize the emotional state of the user when viewing the image, and store the emotional state in association when storing the image. It is understandable that the image captured by the camera previewed by the user in the viewfinder may be the same or different from the image finally selected by the user.
  • the "photographing interface" provided by the UI embodiment exemplarily shown in FIGS. 3A to 3H is described below.
  • the "photographing interface” can be used to display the image collected by the camera and one or more related controls when the image is taken.
  • the image collected by the camera can be collected by the front camera, or can be collected by the rear camera.
  • Related controls when shooting images can be used to receive user operations (such as touch operations).
  • in response to the user operation, the electronic device can perform one or more of the following: change the scale of the displayed image, turn on the flash, turn on the corresponding shooting mode, switch the camera, etc.
  • the user interface 31 exemplarily shown in FIGS. 3A-3C may be an implementation of the "photographing interface".
  • the user interface 31 may be provided by the "camera” application.
  • “Camera” is an application program installed on electronic devices such as smart phones, tablet computers, etc. for capturing images, and the name of the application program is not limited in the embodiment of the present application.
  • This application can support displaying various images collected by the camera 193. That is, the object displayed by the "camera” is the image collected by the camera 193.
  • the user interface 31 may be a user interface opened by the user clicking on the camera icon 204D in FIG. 2A, and is not limited to this.
  • the user may also open the user interface 31 for capturing images in other applications; for example, the user may tap the shooting control in the "WeChat" application to open the user interface 31.
  • the user interface 31 may include: a captured image display control 301, a shooting control 302, a control 303 for switching cameras, a shooting mode menu 304, a viewfinder 305, and other controls such as a control 306A for adjusting the size of the image displayed in the viewfinder 305, a control 306B for displaying introduction information of the shooting mode, a control 306C for setting various parameters during shooting, and a control 306D for turning on/off the flash.
  • the captured image display control 301 can be used to display the last image captured by the electronic device 100, and can also monitor user operations for opening an album.
  • the electronic device 100 can detect a user operation (for example, a click operation) acting on the control 301, and in response to the operation, the electronic device displays the most recently saved image.
  • the shooting control 302 can monitor user operations for shooting images.
  • the electronic device can detect a user operation (for example, a click operation) acting on the shooting control 302, and shoot in response to the user operation, save the captured image, and display the image in the control 301.
  • that is, the user can click the shooting control 302 to shoot an image.
  • the control 303 for switching cameras can monitor user operations for switching cameras.
  • the electronic device 100 can detect a user operation (such as a click operation) acting on the control 303, and switch the camera in response to the user operation, for example, switch the rear camera to the front camera, or switch the front camera to the rear camera.
  • the viewfinder frame 305 can be used to display the image acquired by the camera. In other words, the viewfinder frame 305 can be used to display a preview image.
  • the electronic device 100 can refresh the display content therein in real time.
  • the camera used to obtain the image may be a rear camera or a front camera.
  • the shooting mode menu 304 may display one or more shooting mode options.
  • the shooting mode options can be implemented as icons, text or other forms.
  • the shooting mode options in the shooting mode menu 304 may include: a large aperture mode option 304A, a night scene mode option 304B, a portrait mode option 304C, a photo mode option 304D, a video mode option 304E, a professional mode option 304F, etc.
  • the electronic device can detect a sliding operation (such as a left or right sliding operation) acting on the shooting mode menu 304, and switch the shooting mode options displayed in the menu in response to the operation, so that the user can browse more shooting mode options.
  • the electronic device 100 can also detect a user operation acting on the shooting mode option, and activate the corresponding shooting mode in response to the user operation.
  • the electronic device 100 can shoot images with different effects in different shooting modes. For example, in the large-aperture mode, the captured image can be processed to show a depth-of-field effect, and in the portrait mode, the portrait in the captured image can be beautified.
  • the switch control 307 of "emotion recognition” may be displayed in the user interface 31.
  • the switch control 307 may be an icon.
  • the user interface 31 may also display prompt information for prompting the user to turn on "emotion recognition".
  • the prompt information may be text information such as "click here to turn on emotion recognition", a picture, a link, or another form. This embodiment does not impose any limitation on this.
  • the electronic device can detect an operation acting on the control 307 (such as a user's light touch, heavy press, or long press on the control 307), and in response to the operation, the electronic device can turn on "emotion recognition".
  • after "emotion recognition" is turned on, the display mode of the switch control 307 may be updated, and the updated control 307 may be as shown in FIG. 3B. That is to say, after the electronic device turns on "emotion recognition", the control 307 can change from the shaded state to the unshaded state. The prompt is not limited to changing the display mode of the control 307: the electronic device may also prompt the user in other ways that "emotion recognition" has been turned on, for example, by playing a voice or generating a vibration, which is not restricted in this application.
  • the electronic device may also turn on “emotion recognition” in other ways.
  • for example, the user can also input the voice "enable emotion recognition"; the electronic device can collect the user's voice through the microphone 170C and enable "emotion recognition" according to the voice.
  • the user can also enable "emotion recognition” in the window 217 shown in FIG. 2B, which is not limited in this application.
  • prompt information 308 may be displayed in the view frame 305.
  • the prompt information 308 may be used to prompt the user that the electronic device is recognizing the emotional state of the user.
  • the prompt information 308 may be text information "recognizing user emotion”.
  • the prompt information 308 may also be in other forms such as pictures, links, etc., which is not limited in this application.
  • the prompt information 308 in the user interface 31 may not be an interactive element displayed on the touch screen, but may be audio played through the speaker 170A.
  • when the electronic device has not yet recognized the user's emotional state, the electronic device may also present no other content in the viewfinder 305, that is, leave it blank, which can indicate that the user's emotional state has not yet been recognized.
  • after a preset time, the electronic device can update the content displayed in the viewfinder 305, and the updated viewfinder 305 can display prompt information 309.
  • the preset time may be determined by the performance of the electronic device, or may be independently set by the user, which is not limited in the embodiment of the present application.
  • the prompt information 309 may be used to prompt the user of the emotional state of the user that the electronic device has recognized.
  • the prompt information 309 may be the text message "Identified that the user is in a happy state, click to view". It is not limited to text information.
  • the prompt information 309 may also be in other forms such as pictures and links, which are not limited in this application.
  • the prompt information 309 in the user interface 31 as shown in FIG. 3C may also be updated accordingly.
  • the electronic device may also use other forms of interactive elements to indicate that it has recognized the user's emotional state; for example, the electronic device can also play a voice to remind the user that the emotional state has been recognized.
  • the specific implementation manner for the electronic device to recognize the user's emotional state can refer to the relevant description above, which will not be repeated here.
  • the electronic device can directly display the emotional element corresponding to the user's emotional state in the viewfinder 305.
  • some or all of the prompt information 309 in the user interface 31 shown in FIG. 3C can be used to receive an operation that triggers the electronic device to present the emotional element corresponding to the user's emotional state. For example, the text "click to view" in the prompt message 309 "recognizing that the user is in a happy state, click to view" can be used to receive such an operation.
  • the operation can be light touch, heavy press or long press.
  • the emotional elements corresponding to the user's emotional state can be stored in the local end of the electronic device, or can be stored in the cloud server. That is, after the electronic device recognizes the emotional state of the user, it can obtain the emotional element corresponding to the emotional state from the local end or from the cloud server.
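  • One possible shape of this lookup, local first with a cloud fallback, is sketched below in Java; the interface and class names are hypothetical and only illustrate the described behavior:

```java
import java.util.List;

// Hypothetical source of emotional elements (stickers, text bubbles, music, etc.)
// keyed by emotional state, implemented by local storage or a cloud server.
interface ElementSource {
    List<String> elementsFor(String emotionalState);
}

// Hypothetical provider: after the emotional state is recognized, ask the
// local end first and fall back to the cloud server when nothing is found.
class EmotionalElementProvider {
    private final ElementSource local;
    private final ElementSource cloud;

    EmotionalElementProvider(ElementSource local, ElementSource cloud) {
        this.local = local;
        this.cloud = cloud;
    }

    List<String> fetch(String emotionalState) {
        List<String> elements = local.elementsFor(emotionalState);
        return elements.isEmpty() ? cloud.elementsFor(emotionalState) : elements;
    }
}
```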
  • the user interface 31 exemplarily shown in FIGS. 3D-3H illustrates several ways in which the electronic device may present emotional elements corresponding to the user's current emotional state.
  • Fig. 3D shows a way for an electronic device to display an emotional element corresponding to the emotional state of the user.
  • an emotional element corresponding to the emotional state "happy” is displayed in the user interface 31, and the emotional element is an emoticon package 310A.
  • the emoticon package 310A may be any one of the one or more emotional elements corresponding to the emotional state "happy” or a default one of the electronic device.
  • FIG. 3E shows a way for an electronic device to display multiple emotional elements corresponding to the emotional state of the user.
  • multiple emotional elements corresponding to the emotional state "happy" are displayed in the user interface 31, including the emoticon package 310A, text bubble 310B, text bubble 310C, and text information 310D, as well as the high saturation presented by the image in the viewfinder 305.
  • the multiple emotional elements may be part or all of the multiple emotional elements corresponding to the emotional state "happy".
  • the electronic device may also present other emotional elements.
  • the electronic device can also adjust the composition, light and shadow effects, saturation, hue, and color of the image in the viewfinder 305, and can also play background music through the speaker 170A to reflect the current emotional state of the user.
  • Figures 3F-3H show yet another way for an electronic device to present emotional elements corresponding to the user's emotional state.
  • a small window 311 is displayed in the user interface 31.
  • the small window 311 can be used for the user to select corresponding emotional elements for presentation.
  • the small window 311 may include the icon 311A of the emotional element, the text information "emotional element" 311B, and multiple emotional element options, such as "original image" 311C, high saturation 311D, the text bubble "Happy life!" 311E, the emoticon package 311F, the text message "Singing in the daytime must indulge in wine, youth is a good company to return home" 311G, the background music (the song "Today is a good day") 311H, etc.
  • the emotional elements corresponding to the multiple emotional element options may be part or all of the one or more emotional elements corresponding to the emotional state "happy". The options are not limited to those included in the small window 311 shown in FIG. 3F; in other embodiments, the small window 311 may also include other emotional element options, such as composition options, lighting effect options, saturation options, hue options, and color options.
  • Each emotional element option in the small window 311 can be used to monitor the operation of triggering the electronic device to present the emotional element corresponding to the option.
  • the electronic device may present the emotional element corresponding to the emotional element option.
  • for example, in response to a detected touch operation on the "original image" option 311C, the electronic device may not present any emotional element; in response to a detected touch operation on the option 311E, the electronic device may display the text bubble "Happy life!" on the user interface 31, as shown in FIG. 3G; in response to a detected touch operation on the option 311H, the electronic device activates the speaker 170A to play the song "Today is a good day".
  • when an emotional element option is selected, its display mode in the small window 311 can be changed; for example, the border of the option 311E shown in FIG. 3G is thickened.
  • only one emotional element can be presented in the user interface 31 as shown in FIG. 3G.
  • when the electronic device detects an operation acting on the next emotional element option in the small window 311, the electronic device stops presenting the previous emotional element and begins to present the next one.
  • multiple emotional elements may be presented in the user interface 31 as shown in FIG. 3G.
  • when the electronic device detects an operation (such as a touch operation) acting on an emotional element option, the electronic device presents the emotional element corresponding to that option; when the electronic device detects an operation (such as a touch operation) acting on the same option again, the electronic device may stop presenting the corresponding emotional element.
  • the small window 311 may also display a control 311I and/or a control 311J.
  • the control 311I or the control 311J can be used for the user to switch or update the emotional element options displayed in the small window 311, so that more emotional elements corresponding to the current user's emotional state can be viewed.
  • other interactive elements can also be used for the user to switch the emotional element options displayed in the small window 311.
  • the user can also make a left or right sliding gesture in the small window 311 to switch or update the emotional element options displayed in the small window 311.
  • the small window 311 may further include a control 311K and a control 311L.
  • the control 311L can be used to monitor the operation of triggering to stop displaying the small window 311.
  • the user can input a user operation (such as a touch operation on the text "OK") on the control 311L; in response, the electronic device stops displaying the small window 311 and continues to present the emotional element selected by the user (for example, the text bubble "Happy life!" shown in FIG. 3H continues to be displayed).
  • the control 311K can be used to monitor the operation of triggering to stop displaying the small window 311 and stop presenting any emotional elements.
  • in response to such an operation, the electronic device stops displaying the small window 311 and stops presenting any emotional element.
  • the electronic device may associate and store the image obtained by the electronic device with the user's current emotional state.
  • the electronic device can store the image obtained by the camera and the current emotional state of the user locally, or store the image obtained by the camera and the current emotional state of the user in a cloud server, which is not limited in this application.
  • the user can input a user operation (such as a touch operation) on the shooting control 302, and in response to the detected user operation, the electronic device can associate and store the portrait in the viewfinder 305 as shown in FIG. 3C with the recognized current emotional state of the user, "happy".
  • the electronic device can also store the acquired image and the recognized current emotional state of the user in response to other user operations.
  • for example, the user can also trigger the electronic device to store the acquired image and the recognized current emotional state by inputting the voice "take a photo", blinking at the camera, and other methods.
  • the electronic device can store images in JPEG, TIFF, RAW, PNG, GIF, AVI, 3GP, MOV, or other image formats, and can store the user's current emotional state as text (such as the text "happy"), as an indicator (such as the indicator "1", indicating that the corresponding emotional state is "happy"), or in other ways, which is not limited in this application.
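  • A minimal sketch of storing the emotional state as an indicator, assuming the illustrative mapping 1 = "happy" mentioned above (the other indicator values here are invented for the example):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical mapping between a stored integer indicator and an emotional
// state, as in "indicator 1 indicates the emotional state 'happy'".
class EmotionIndicator {
    private static final Map<Integer, String> STATES = new HashMap<>();
    static {
        STATES.put(1, "happy");    // from the example in the text
        STATES.put(2, "romantic"); // invented for illustration
        STATES.put(3, "calm");     // invented for illustration
    }

    static int encode(String state) {
        for (Map.Entry<Integer, String> e : STATES.entrySet()) {
            if (e.getValue().equals(state)) return e.getKey();
        }
        return 0; // unknown state
    }

    static String decode(int indicator) {
        return STATES.getOrDefault(indicator, "unknown");
    }
}
```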
  • the image obtained by the electronic device may be associated and stored with the emotional element corresponding to the recognized emotional state of the user.
  • the emotional element may be one or more.
  • the electronic device can store the image obtained by the camera and the emotional element corresponding to the user's emotional state locally, or store them in a cloud server; this application does not restrict this.
  • when the electronic device recognizes the user's emotional state and displays the emoticon package 310A corresponding to the emotional state "happy" on the user interface 31, the user can input a user operation (for example, a touch operation) on the shooting control 302; in response to the detected user operation, the electronic device may associate and store the portrait in the viewfinder 305 as shown in FIG. 3D with the emoticon package 310A.
  • when the electronic device recognizes the user's emotional state and displays the text bubble "Happy life!" 310B corresponding to the emotional state "happy" on the user interface 31, the user can input a user operation (such as a touch operation) on the shooting control 302; in response to the detected user operation, the electronic device can associate and store the portrait in the viewfinder 305 as shown in FIG. 3H with the text bubble 310B.
  • when the electronic device recognizes the user's emotional state and presents multiple emotional elements corresponding to the emotional state "happy" on the user interface 31, such as the emoticon package 310A, text bubble 310B, text bubble 310C, and text information 310D, while also playing the song "Today is a good day" and displaying the image in the viewfinder 305 with high saturation, the user can input a user operation (such as a touch operation) on the shooting control 302. In response to the detected user operation, the electronic device may associate and store the portrait in the viewfinder 305 as shown in FIG. 3E with the emoticon package 310A, text bubble 310B, text bubble 310C, text information 310D, the song "Today is a good day", and the high-saturation indicator information.
  • the electronic device may also store the acquired images and emotional elements in association with other user operations.
  • the user can also trigger the electronic device to associate and store the acquired images and emotional elements by inputting a voice "photograph", blinking at the camera, etc., which is not limited in the embodiment of the present application.
  • after the electronic device recognizes the user's emotional state, it associates and stores the image obtained by the electronic device with the emotional element corresponding to the recognized emotional state, or associates and stores the image with the recognized emotional state itself.
  • the acquired image stored by the electronic device is the captured image. Since previewing and capturing may not occur at the same time, the preview image viewed by the user in the viewfinder and the finally captured image (i.e., the saved image) may be the same or different.
  • the above UI embodiments shown in FIGS. 3C-3H are described taking the case where the preview image viewed by the user in the viewfinder is the same as the finally captured image; in these UI embodiments, both are the portrait in the viewfinder 305.
  • the preview image viewed by the user in the viewfinder and the final captured image may also be different.
  • in some embodiments, the time between the user viewing the preview image in the viewfinder and the final capture does not exceed a preset duration, which can be set in advance, for example, to 1 minute, 2 minutes, etc.
  • the preset duration can ensure that the emotional state stored in association with the captured image is the emotional state when the user sees the captured image, or the preset duration can ensure that the emotional element stored in association with the captured image reflects the user viewing the captured image Emotional state at the time.
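  • The preset-duration check could be realized roughly as follows; the 1-minute value is just one of the examples given above, and the class and method names are hypothetical:

```java
// Hypothetical check: only associate the recognized emotional state with the
// captured image if the capture happens within the preset duration after
// the user viewed the preview image.
class EmotionValidityCheck {
    private static final long PRESET_DURATION_MS = 60_000; // e.g. 1 minute

    static boolean stateStillValid(long previewTimeMs, long captureTimeMs) {
        return captureTimeMs - previewTimeMs <= PRESET_DURATION_MS;
    }
}
```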
  • if the electronic device acquires a video through a camera and the user's emotional state changes during the acquisition, that is, when the electronic device recognizes multiple emotional states of the user in the process of acquiring the video, the electronic device can associate and store each segment of the video with the emotional state or emotional element corresponding to that segment.
  • for example, the electronic device can store minutes 1-10 of the video in association with the emotional state "romantic", and store minutes 11-20 of the video in association with the emotional state "happy".
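  • A minimal sketch of this per-segment association for video, with hypothetical class names (not part of the original disclosure):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical segment: a time range of the recording tied to the emotional
// state recognized during that range.
class VideoEmotionSegment {
    final long startMs, endMs;
    final String emotionalState;

    VideoEmotionSegment(long startMs, long endMs, String emotionalState) {
        this.startMs = startMs;
        this.endMs = endMs;
        this.emotionalState = emotionalState;
    }
}

// Hypothetical track collecting all segments for one video,
// e.g. minutes 1-10 "romantic", minutes 11-20 "happy".
class VideoEmotionTrack {
    final List<VideoEmotionSegment> segments = new ArrayList<>();

    void add(long startMs, long endMs, String state) {
        segments.add(new VideoEmotionSegment(startMs, endMs, state));
    }
}
```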
  • the electronic device may store the acquired images in different areas according to different emotional states.
  • for example, the electronic device can associate and store all images acquired when the user's emotional state is "happy" with the emotional state "happy" in a first area, associate and store all images acquired when the user's emotional state is "romantic" with the emotional state "romantic" in a second area, and associate and store all images acquired when the user's emotional state is "calm" with the emotional state "calm" in a third area.
  • for another example, the electronic device can associate and store all images acquired when the user's emotional state is "happy" with the emotional elements corresponding to "happy" in a fourth area, associate and store all images acquired when the user's emotional state is "romantic" with the emotional elements corresponding to "romantic" in a fifth area, and associate and store all images acquired when the user's emotional state is "calm" with the emotional elements corresponding to "calm" in a sixth area.
  • the above-mentioned first area to sixth area may be located at the local end of the electronic device, or may be located at the cloud server.
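  • As one hedged interpretation of this area-based storage, each emotional state could map to its own storage area, sketched here as a directory per state (all names are illustrative):

```java
import java.io.File;

// Hypothetical area-per-state layout: images captured under each emotional
// state are grouped into their own storage area, modeled as a directory.
class EmotionAreaStore {
    private final File root;

    EmotionAreaStore(File root) {
        this.root = root;
    }

    File areaFor(String emotionalState) {
        File area = new File(root, emotionalState); // e.g. <root>/happy
        area.mkdirs(); // create the area on first use
        return area;
    }
}
```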
  • the electronic device can recognize the user's current emotional state when capturing an image and, according to the user's needs, associate and store the captured image with the recognized current emotional state, or associate and store the captured image with the emotional elements corresponding to that state. In this way, the user's emotion can be recorded and attended to, the emotion can easily be reproduced when needed, and the user's experience of using the electronic device can be improved.
  • the “photographing interface” may be referred to as the first user interface
  • the preview image may be referred to as the first image
  • the captured image may be referred to as the second image.
  • the first image and the second image may be the same or different.
  • for the "photographing interface", refer to the user interface 31 shown in FIGS. 3A to 3H.
  • the emotional state of the user recognized by the electronic device when the user views the preview image may be referred to as the first emotional state.
  • the electronic device associates and stores the second image and the emotional element corresponding to the recognized first emotional state
  • the emotional element corresponding to the recognized first emotional state may be referred to as the first emotional element.
  • the control included in the first user interface for monitoring the operation of saving the second image may be referred to as the first interactive element, such as the shooting control 302 in FIGS. 3C-3E and 3H;
  • the operation acting on the first interaction element is called a first operation, for example, a touch operation acting on the shooting control 302.
  • when the user views a picture, the electronic device can recognize the user's current emotional state, and store the emotional state in association when storing the picture.
  • the "view picture interface” may be used to display a picture and one or more related controls for processing the picture.
  • the picture may be a picture stored on a cloud device accessed by the electronic device, a picture stored on the local end of the electronic device, and so on.
  • exemplary embodiments of the pictures stored on the cloud device accessed by the electronic device may include the Moments pictures viewed by the user in the "WeChat" application, the pictures posted by friends viewed by the user in the "Facebook" application, etc.
  • Exemplary embodiments of pictures stored on the local end of the electronic device may include pictures viewed by the user in the "album” application, etc.
  • Related controls for processing the picture can be used to monitor user operations (such as touch operations), and in response to the user operation, the electronic device can perform one or more of the following: share pictures, save pictures, bookmark pictures, and so on.
  • the user interface 41 exemplarily shown in FIGS. 4A-4D may be an implementation of the "view picture interface".
  • the user interface 41 may be provided by a “WeChat” application, a “Facebook” application, a “photo album” application or other applications.
  • the user interface 41 may be a user interface opened when the user clicks the WeChat icon 206, the Facebook icon 209, or the gallery icon 214 in FIG. 2A to view a certain picture.
  • the user interface 41 may include: a status bar 401 and a picture 402.
  • the picture 402 is a picture currently viewed by the user.
  • the picture may be a picture stored on a cloud device accessed by the electronic device, or a picture stored on the local end of the electronic device, which is not limited in this application.
  • the user interface 41 may further include a concealable navigation bar (not shown in the figure).
  • For the navigation bar refer to the navigation bar 205 in the user interface 21 shown in FIG. 2A, which is not repeated here.
  • the electronic device can detect an operation acting on the picture 402 (such as a user's long press, heavy press, or double-click on the picture 402), and in response to the operation, the electronic device can display the menu 403 shown in FIG. 4B.
  • the menu 403 may include multiple controls for triggering the electronic device to process the picture, such as a sharing control 403A, a saving control 403B, a favorite control 403C, and an emotion recognition control 403D.
  • the controls in the menu 403 can be used to monitor user operations (such as touch operations), and in response to the detected user operations, the electronic device can perform corresponding processing on the picture, such as sharing, saving, collecting, or emotion recognition.
  • the menu 403 may also include more controls 403E. More controls 403E can be used to monitor user operations (such as touch operations). In response to the detected user operations, the electronic device can display more controls for triggering the electronic device to process the picture, such as delete controls, move controls, and so on.
  • the manifestation of the controls in the menu 403 may be icons and/or text information.
  • the emotion recognition control 403D in the menu 403 can be used to monitor the operation that triggers enabling "emotion recognition".
  • in response to a detected user operation on the emotion recognition control 403D, the electronic device turns on "emotion recognition".
  • the electronic device can also turn on “emotion recognition” in other ways.
  • the user can also enable “emotion recognition” in the window 217 shown in FIG. 2B, which is not limited in this application.
  • a prompt message 404 may be displayed on the picture 402.
  • the prompt information 404 may be used to prompt the user that the electronic device is recognizing the emotional state of the user.
  • the prompt information 404 reference may be made to the prompt information 308 shown in FIG. 3B, which will not be repeated here.
  • when the electronic device has not yet recognized the user's emotional state, the electronic device may present no other content on the picture 402, that is, leave it blank, which can indicate that the user's emotional state has not yet been recognized.
  • the electronic device may update the content displayed on the picture 402, and the updated picture 402 may display prompt information 405.
  • the prompt information 405 may be used to prompt the user's emotional state that has been recognized by the electronic device.
  • the prompt information 405 reference may be made to the prompt information 309 shown in FIG. 3C, which is not repeated here.
  • the specific implementation manner for the electronic device to recognize the user's emotional state can refer to the relevant description above, which will not be repeated here.
  • the electronic device may directly display the emotional element corresponding to the user's emotional state on the picture 402.
  • part or all of the prompt information 405 in the user interface 41 shown in FIG. 4D can be used to receive an operation that triggers the electronic device to present the emotional element corresponding to the user's emotional state. For example, the text "click to view" in the prompt message 405 "identifies that the user is in a happy state, click to view" can be used to receive such an operation.
  • the operation can be light touch, heavy press or long press.
  • the emotional elements corresponding to the user's emotional state can be stored in the local end of the electronic device, or can be stored in the cloud server. That is, after the electronic device recognizes the emotional state of the user, it can obtain the emotional element corresponding to the emotional state from the local end or from the cloud server.
  • the user interface 41 exemplarily shown in FIGS. 4E-4H illustrates several ways in which the electronic device may present emotional elements corresponding to the user's emotional state.
  • FIG. 4E shows a way for an electronic device to display an emotional element corresponding to the user's emotional state.
  • an emotional element corresponding to the emotional state "happy” is displayed in the user interface 41, and the emotional element is an emoticon package 406A.
  • the emoticon package 406A may be any one of the one or more emotional elements corresponding to the emotional state "happy” or a default one of the electronic device.
  • FIG. 4F shows a way for an electronic device to display multiple emotional elements corresponding to the emotional state of the user.
  • the user interface 41 displays multiple emotional elements corresponding to the emotional state "happy", including the emoticon package 406A, text bubble 406B, text bubble 406C, and text information 406D, as well as the high saturation presented by the picture 402.
  • the multiple emotional elements may be part or all of the multiple emotional elements corresponding to the emotional state "happy".
  • the electronic device may also present other emotional elements.
  • the electronic device can also adjust the composition, light and shadow effects, saturation, hue, and color of the image in the viewfinder 305, and can also play background music through the speaker 170A to reflect the current emotional state of the user.
  • the user interface 41 shown in FIG. 4E and FIG. 4F may further include a control 407.
  • the control 407 can be used to monitor the operation that triggers the electronic device to stop presenting any emotional element. In response to the user operation, the electronic device stops presenting any emotional elements.
  • Figures 4G-4H show yet another way for an electronic device to present emotional elements corresponding to the user's emotional state.
  • a small window 409 is displayed in the user interface 41.
  • the small window 409 may include the icon 409A of the emotional element, the text information "emotional element" 409B, and multiple emotional element options such as "original image" 409C, high saturation 409D, the text bubble "Happy life!" 409E, the emoticon package 409F, the text message "Singing in the daytime must be indulging in wine, youth is a good company to return home" 409G, the background music (the song "Today is a good day") 409H, and so on.
  • Each emotional element option in the small window 409 can be used to monitor the operation of triggering the electronic device to present the emotional element corresponding to the option.
  • the various emotional element options in the small window 409 can refer to the various emotional element options in the small window 311 in the user interface 31 shown in FIG. 3F, which will not be repeated here.
  • only one emotional element can be presented in the user interface 41 as shown in FIG. 4H.
  • when the electronic device detects an operation acting on the next emotional element option in the small window 409, the electronic device stops presenting the previous emotional element and begins to present the next one.
  • multiple emotional elements may be presented in the user interface 41 as shown in FIG. 4H.
  • when the electronic device detects an operation (such as a touch operation) acting on an emotional element option, the electronic device presents the emotional element corresponding to that option; when the electronic device detects an operation (such as a touch operation) acting on the same option again, the electronic device may stop presenting the corresponding emotional element.
  • the small window 409 may also display a control 409I and/or a control 409J.
  • the control 409I or the control 409J can be used for the user to switch or update the emotional element options displayed in the small window 409, so that more emotional elements corresponding to the current user's emotional state can be viewed.
  • other interactive elements may also be used for the user to switch the emotional element options displayed in the small window 409.
  • the user may also make a left or right sliding gesture in the small window 409 to switch or update the emotional element options displayed in the small window 409.
  • the small window 409 may also include a control 409K.
  • the control 409K can be used to monitor the operation of triggering to stop displaying the small window 409 and stop presenting any emotional elements.
  • the electronic device stops displaying the small window 409 and stops presenting any emotional elements.
  • the electronic device may store the picture viewed by the user in association with the user's current emotional state.
  • the electronic device may store the pictures viewed by the user and the current emotional state of the user locally, or store the pictures viewed by the user and the current emotional state of the user in a cloud server.
  • the user can input a user operation on the picture 402 (such as a long press, heavy press, or double-click on the picture 402); in response to the detected user operation, the electronic device can display the small window 403 as shown in FIG.
  • the electronic device can then store the picture 402 as shown in FIG. 4D in association with the identified current emotional state of the user, namely "happy".
  • the electronic device can also store pictures viewed by the user and the recognized current emotional state of the user in response to other user operations.
  • the user can also trigger the electronic device to store the picture viewed by the user and the identified current emotional state of the user by other means, for example, by inputting the voice command "save".
  • the manner in which the electronic device stores the picture and the user's current emotional state can refer to the related description in the application scenario 1, which will not be repeated here.
  • after the electronic device recognizes the user's emotional state, it can store the pictures viewed by the user in association with the emotional elements corresponding to that emotional state.
  • when the electronic device associates and stores the picture viewed by the user with the emotional elements corresponding to the user's emotional state, there may be one or more such emotional elements.
  • the electronic device may store the pictures viewed by the user and the corresponding emotional elements in association locally, or may store them in association on the cloud server; this application does not restrict this.
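  • To make the association concrete, the following is a minimal sketch of one possible record format for storing an image together with a recognized emotional state and/or its emotional elements; the shape of the record is an assumption, not something this application specifies.

```kotlin
// Hypothetical record format for the association described above; the names
// EmotionRecord and EmotionRecordStore are illustrative, not part of this application.
data class EmotionRecord(
    val imageUri: String,               // the picture viewed by the user, e.g. picture 402
    val emotionalState: String?,        // e.g. "happy", or null if only elements are stored
    val emotionalElements: List<String> // one or more elements, e.g. emoticon, text bubble
)

// The backing store may be local or on a cloud server; the application leaves this open.
interface EmotionRecordStore {
    fun save(record: EmotionRecord)
}

fun storeViewedPicture(store: EmotionRecordStore) {
    store.save(
        EmotionRecord(
            imageUri = "content://gallery/picture_402",
            emotionalState = "happy",
            emotionalElements = listOf("emoticon_406A", "text_bubble_406B")
        )
    )
}
```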
  • the electronic device recognizes the user's emotional state and displays an emotional element emoticon package 406A corresponding to the emotional state "happy" on the user interface 41.
  • the user interface 41 shown in FIG. 4E may further include a control 408, which can be used to monitor the operation that triggers the electronic device to associate and store the picture viewed by the user and the emotional element corresponding to the emotional state of the user.
  • the electronic device may store the picture 402 and the emotional element emoticon package 406A as shown in FIG. 4E in association.
  • the expression form of the control 408 may be text or icon, etc., which is not limited in this application.
  • the electronic device recognizes the user's emotional state and displays an emotional element, the text bubble "Happy Life!" 406B, corresponding to the emotional state "happy" on the user interface 41.
  • the small window 409 in the user interface 41 shown in FIG. 4H may also include a control 409L, which can be used to monitor the operation that triggers the electronic device to associate and store the picture viewed by the user and the emotional element corresponding to the emotional state of the user.
  • the electronic device may store the picture 402 and the emotional element emoticon package 406A as shown in FIG. 4H in association.
  • the expression form of the control 409L may be text or icon, etc., which is not limited in this application.
  • the electronic device recognizes the user's emotional state and displays multiple emotional elements corresponding to the emotional state "happy" on the user interface 41, such as the emoticon package 406A, text bubble 406B, text bubble 406C, and text information 406D; it also plays the emotional element song "Today is a Good Day", and the picture 402 is displayed with high saturation.
  • the user interface 41 shown in FIG. 4F may further include a control 408, which can be used to monitor the operation that triggers the electronic device to associate and store the picture viewed by the user and the emotional element corresponding to the emotional state of the user.
  • the electronic device may associate and store the picture 402 shown in FIG. 4F with the emotional element emoticon package 406A, text bubble 406B, text bubble 406C, and text information 406D.
  • the expression form of the control 408 may be text or icon, etc., which is not limited in this application.
  • the electronic device may also store the acquired images and emotional elements in association with other user operations.
  • the user can also trigger the electronic device to associate and store the acquired image and emotional element by inputting a voice "save", etc., which is not limited in the embodiment of the present application.
  • the electronic device may store the acquired images in different areas according to different emotional states.
  • the electronic device can store all the images acquired when the user's emotional state is "happy" in association with the emotional state "happy" in the first area, can store all the images acquired when the user's emotional state is "romantic" in association with the emotional state "romantic" in the second area, and can store all the images acquired when the user's emotional state is "calm" in association with the emotional state "calm" in the third area.
  • alternatively, the electronic device can store all the images acquired when the user's emotional state is "happy" in association with the emotional elements corresponding to the emotional state "happy" in the fourth area, can store all the images acquired when the user's emotional state is "romantic" in association with the emotional elements corresponding to the emotional state "romantic" in the fifth area, and can store all the images acquired when the user's emotional state is "calm" in association with the emotional elements corresponding to the emotional state "calm" in the sixth area.
  • the above-mentioned first area to sixth area may be located at the local end of the electronic device, or may be located at the cloud server; a sketch of such per-state partitioning follows.
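  • A minimal sketch of this per-state partitioning, assuming a simple directory-per-area layout; the area names and the PartitionedImageStore helper are hypothetical.

```kotlin
import java.io.File

// Hypothetical sketch: route each acquired image into a storage area keyed by
// the emotional state recognized when the image was acquired.
class PartitionedImageStore(private val root: File) {

    private val areaByState = mapOf(
        "happy" to "area1",    // first area
        "romantic" to "area2", // second area
        "calm" to "area3"      // third area
    )

    fun store(imageBytes: ByteArray, fileName: String, emotionalState: String) {
        val area = areaByState[emotionalState] ?: "area_other"
        val dir = File(root, area).apply { mkdirs() }
        File(dir, fileName).writeBytes(imageBytes)
    }
}
```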
  • the page layout of the "view picture interface” may also present other forms, which is not limited in this embodiment.
  • the electronic device can recognize the user's current emotional state when the user views a picture, and store the picture viewed by the user in association with the recognized current emotional state according to user needs, or store the picture viewed by the user in association with the emotional elements corresponding to the current emotional state according to user needs.
  • in this way, the user's emotion can be recorded, the purpose of paying attention to the user's emotion can be achieved, the user's emotion can easily be reproduced when needed, and the user's experience of using the electronic device can be improved.
  • the electronic device can also recognize the user's emotional state when the user views a video or another type of image, and can store the image viewed by the user in association with the recognized current emotional state according to user needs, or store the image viewed by the user in association with the emotional elements corresponding to the recognized current emotional state.
  • the "picture viewing interface” may be referred to as the first user interface
  • the viewed picture may be referred to as the first image
  • the saved picture may be referred to as the second image
  • the first image and the second image may be the same.
  • the first image and the second image may be pictures 402 in the user interface shown in FIGS. 4A-4H, for example.
  • the emotional state of the user recognized by the electronic device when the user views the picture may be referred to as the first emotional state.
  • the electronic device stores the second image in association with one or more emotional elements corresponding to the recognized first emotional state; the one or more emotional elements corresponding to the recognized first emotional state may be referred to as the first emotional element.
  • the control included in the first user interface for monitoring the operation of saving pictures may be referred to as the first interactive element, for example, the control 408 in FIGS. 4E-4F and the control 409L in FIG. 4H;
  • the operation acting on the first interactive element can be referred to as the first operation, for example, a touch operation acting on the control 408 or 409L.
  • the operation for turning on “emotion recognition” may be referred to as the fourth operation.
  • the electronic device can collect the user's behavior data through its configured hardware devices, or by starting the hardware devices of other devices connected to the electronic device, and identify the current emotional state of the user from this data. For the specific implementation of the electronic device recognizing the user's emotional state, refer to the previous description.
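  • Purely as an illustration of this collect-then-classify flow (the application does not prescribe a particular algorithm), the sketch below models behavior data flowing from local and connected-device hardware into a classifier; every interface and name here is an assumption.

```kotlin
// Illustrative collect-then-classify flow; every interface and name here is an
// assumption, since the application does not prescribe a recognition algorithm.
data class BehaviorSample(val source: String, val features: FloatArray)

interface BehaviorSensor {
    fun sample(): BehaviorSample
}

interface EmotionClassifier {
    fun classify(samples: List<BehaviorSample>): String // e.g. "happy"
}

fun recognizeEmotion(
    localSensors: List<BehaviorSensor>,           // hardware configured on the device
    connectedDeviceSensors: List<BehaviorSensor>, // hardware on connected devices
    classifier: EmotionClassifier
): String {
    val samples = (localSensors + connectedDeviceSensors).map { it.sample() }
    return classifier.classify(samples)
}
```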
  • the first user interface may include a second interaction element, and the second interaction element is used to monitor the operation of starting the emotion recognition service.
  • the fourth operation may be an operation (for example, a touch operation) acting on the second interaction element in the first user interface.
  • An exemplary implementation of the second interactive element may include the control 307 in FIG. 3A or the control 403D in FIG. 4B.
  • Application scenario 3: a scenario where the electronic device reproduces the user's emotional state
  • after the electronic device stores the image in association with the recognized current emotional state of the user, or after the electronic device stores the image in association with the emotional element corresponding to the recognized current emotional state, the electronic device can present the corresponding emotional element when the user views the image again.
  • the emotional element is an emotional element corresponding to an emotional state stored in association with the image, or the emotional element is an emotional element stored in association with the image.
  • after seeing these emotional elements, the user can experience the corresponding emotional state and generate emotional resonance, that is, reproduce the emotional state felt when the image was acquired.
  • the image can be a static picture, a dynamic picture, a video, and so on.
  • after the electronic device associates and stores the image and the emotional state/emotional element, the corresponding emotional element can be presented according to the user's needs when the user views the image again.
  • FIGS. 5A-5C exemplarily show related user interfaces for the user to select an image for viewing.
  • the user interface 51 exemplarily shown in FIG. 5A may be used to display multiple photo albums.
  • the user interface 51 may be provided by the "Gallery” application.
  • "Gallery” is an application program for displaying images installed on electronic devices such as smart phones, tablet computers, and the embodiment of the application does not limit the name of the application program.
  • the application can support displaying images stored on the local end of the electronic device or on a cloud server.
  • the user interface 51 may be a user interface opened by the user clicking on the gallery icon 214 in FIG. 2A.
  • the user interface 51 may include: a status bar 501, an application title bar, one or more album entries, a menu 504, and a navigation bar 505.
  • the status bar 501 can refer to the status bar 201 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the application title bar may include a current page indicator 502.
  • the current page indicator 502 may be used to indicate the current page, for example, the text information "album”. Not limited to text information, the current page indicator 502 may also be an icon.
  • Album entries can include: an all-photos album entry, an all-videos album entry, a camera album entry, a screenshots album entry, a WeChat album entry, a Weibo album entry, and one or more album entries classified by the user's emotional state at the time the electronic device acquired the image, for example, a happy album entry 503A, a romantic album entry 503B, an angry album entry 503C, and a sad album entry 503D.
  • the images under the happy album entry 503A were obtained when the user's emotional state was "happy", the images under the romantic album entry 503B were obtained when the user's emotional state was "romantic", the images under the angry album entry 503C were obtained when the user's emotional state was "angry", and the images under the sad album entry 503D were obtained when the user's emotional state was "sorrow".
  • Each album entry can be used to monitor an operation (such as a touch operation) that triggers the electronic device to display the image under the corresponding album. In response to this operation, the electronic device can open a user interface for displaying the image under the corresponding album.
  • the user can enter an upward or downward swipe gesture on the album entry to view more album entries.
  • the electronic device may also display more or fewer album entries in the user interface 51, which is not limited in the embodiment of the present application.
  • the name of each album entry in the user interface 51 may be set independently by the user, or may be set by default by the electronic device, which is not limited in the embodiment of the present application.
  • the menu 504 includes one or more related controls for processing the album, such as a control 504A, a control 504B, and a control 504C.
  • the control 504A can be used to monitor user operations (such as touch operations) that trigger searching for albums
  • the control 504B can be used to monitor user operations (such as touch operations) that trigger creating a new album
  • the control 504C can be used to monitor user operations (such as touch operations) that trigger displaying more options for processing albums.
  • the expression form of the related controls used to process the album may be an icon or text information.
  • the navigation bar 505 can refer to the navigation bar 205 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the user interface 52 exemplarily shown in FIG. 5B can be used to display the images in the happy album.
  • the user interface 52 may be a user interface opened by the user clicking on the happy album entry 503A in FIG. 5A.
  • the user interface 52 may include: a status bar 506, an application title bar, and an image area 510.
  • the status bar 506 can refer to the status bar 201 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the application title bar may include: a return key 509, current page indicators 507 and 508.
  • the return key 509 is an APP-level return key, which can be used to return to the upper level of the menu.
  • the upper level page of the user interface 52 is the user interface 51 as shown in FIG. 5A.
  • the current page indicators 507 and 508 can be used to indicate the current page. For example, the text information "album" and "happy" can be used to indicate that the current page displays images in the happy album. Not limited to text information, the current page indicators 507 and 508 may also be icons.
  • One or more images may be displayed in the image area 510, and the images may include pictures or videos, such as pictures 510A, 510B, 510D-510H, 510J-510L and videos 510C, 510I.
  • the image displayed in the image area 510 may be a thumbnail.
  • the original image corresponding to the thumbnail in the image area 510 may be stored on the electronic device, or may be stored on the cloud server.
  • prompt information may also be displayed on the image in the image area 510, and the prompt information may be used to remind the user that the image is associated with the user's emotional state or emotional elements.
  • the prompt information may be the icon 510M in FIG. 5B.
  • the image area 510 may further include a control 510N, and the control 510N may be used for the user to switch the images displayed in the image area 510, so that more images in the current album can be viewed.
  • the user can also make an upward or downward sliding gesture in the image area 510 to browse more images.
  • Each image in the image area 510 can be used to monitor an operation (such as a touch operation) that triggers the display of the corresponding image.
  • the image 510E may receive an input user operation (such as a touch operation), and in response to the detected user operation, the electronic device may display a user interface 53 as shown in FIG. 5C.
  • the user interface 53 may include: a status bar 511, an application title bar, an image area 514, and one or more controls for processing images.
  • the status bar 511 can refer to the status bar 201 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the application title bar may include: a return key 512 and an image information indicator 513.
  • the return key 512 can be used to return to the upper level of the menu.
  • the upper level page of the user interface 53 is the user interface 52 as shown in FIG. 5B.
  • the image information indicator 513 can be used to indicate the location and time when the electronic device acquires the image.
  • the image information indicator 513 can be the text information "Shenzhen-Longgang District, February 9th, 9:00".
  • the image area 514 can be used to display an image selected by the user.
  • the image may be an original image stored on the local end of the electronic device or on a cloud server.
  • the control for processing the image can be used to monitor an operation (such as a touch operation) that triggers the electronic device to perform corresponding processing on the image displayed in the image area 514.
  • the controls for processing images may include, for example, a control 515A, a control 515B, a control 515C, and a control 515D.
  • the control 515A can be used to monitor operations (such as touch operations) that trigger the electronic device to share the image displayed in the image area 514 to other user equipment (such as a smart phone) or a cloud server.
  • the electronic device can share the image to other user devices or cloud servers.
  • the electronic device may also share the image and the emotional state/emotional element stored in association with the image to other user devices or cloud servers.
  • the control 515B can be used to monitor an operation (such as a touch operation) that triggers the electronic device to collect (favorite) the image displayed in the image area 514.
  • in response to the detected user operation, the electronic device can collect the image.
  • the electronic device may also collect the image and the emotional state/emotional element stored in association with the image.
  • the control 515C can be used to monitor an operation (such as a touch operation) that triggers the electronic device to delete the image displayed in the image area 514.
  • the electronic device may delete the image.
  • in response to the detected user operation, the electronic device may also delete the image together with the emotional state/emotional element stored in association with it.
  • the electronic device can perform the deletion operation on the local end or the cloud server.
  • the control 515D can be used to monitor operations that trigger the electronic device to present emotional elements (such as touch operations).
  • in response to the detected operation, the electronic device may present a corresponding emotional element, which is either an emotional element corresponding to the emotional state stored in association with the image, or an emotional element stored in association with the image.
  • the electronic device may also present corresponding emotional elements when the user views the image through other interactive elements.
  • the electronic device may also present corresponding emotional elements when detecting an operation (such as a touch operation, a double-click operation, a long press operation, etc.) acting on the picture 510E as shown in FIG. 5B.
  • when the electronic device stores the acquired image in association with the recognized emotional state of the user, the electronic device can learn the emotional state stored in association with the image the current user is viewing, find one or more emotional elements corresponding to that emotional state on the local end or the cloud server, and present the one or more emotional elements according to a certain presentation strategy when the user views the image.
  • when the electronic device stores the acquired image in association with one or more emotional elements corresponding to the recognized emotional state of the user, the electronic device can present the one or more emotional elements according to a certain presentation strategy when the user views the image.
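  • A minimal sketch of the two presentation paths just described: whether the image carries a stored state (resolved to elements at view time) or carries elements directly, the viewer ends up with a list of elements to hand to a presentation strategy. All names are hypothetical.

```kotlin
// Hypothetical sketch: resolve which emotional elements to present when the
// user views a stored image, covering both association variants above.
sealed class Association {
    data class WithState(val state: String) : Association()             // image stored with a state
    data class WithElements(val elements: List<String>) : Association() // image stored with elements
}

fun elementsToPresent(
    association: Association,
    elementsForState: (String) -> List<String> // local or cloud lookup
): List<String> = when (association) {
    is Association.WithState -> elementsForState(association.state)
    is Association.WithElements -> association.elements
}

fun demo() {
    // An image stored with the state "happy" is resolved to elements at view time;
    // a presentation strategy then decides which of these to actually show.
    val elements = elementsToPresent(Association.WithState("happy")) { state ->
        if (state == "happy") listOf("emoticon_516A", "text_bubble_516B") else emptyList()
    }
    println(elements)
}
```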
  • the user interface 53 exemplarily shown in FIGS. 5D-5K may be several implementations for the electronic device to present the corresponding emotional element when the user views the image again after the image and the emotional state/emotional element are associated and stored.
  • the electronic device can present an emotional element when the user views the image.
  • the one emotional element is any one of the one or more emotional elements corresponding to the emotional state stored in association with the image, or the default one of the electronic device; alternatively, the one emotional element is any one of the one or more emotional elements stored in association with the image, or the default one of the electronic device.
  • Figure 5D shows a way for an electronic device to display an emotional element.
  • an emotional element is displayed in the user interface 53, and the emotional element is an emoticon package 516A.
  • the emoticon package 516A is any one of the one or more emotional elements corresponding to the emotional state "happy" stored in association with the picture 514, or the default one of the electronic device; alternatively, the emoticon package 516A is any one of the one or more emotional elements stored in association with the picture 514, or the default one of the electronic device.
  • the electronic device can present multiple emotional elements when the user views the image.
  • the multiple emotional elements are part or all of the one or more emotional elements corresponding to the emotional state stored in association with the image, or part or all of the one or more emotional elements stored in association with the image.
  • FIG. 5E shows a way for an electronic device to display multiple emotional elements.
  • a plurality of emotional elements are displayed in the user interface 53, and the plurality of emotional elements include emoticons 516A, text bubbles 516B, text bubbles 516C, text information 516D, and high saturation of the image.
  • the emoticon package 516A, text bubble 516B, text bubble 516C, text information 516D, and the high saturation of the image are part or all of the one or more emotional elements corresponding to the emotional state "happy" stored in association with the picture 514; alternatively, the emoticon package 516A, text bubble 516B, text bubble 516C, and text information 516D are part or all of the one or more emotional elements stored in association with the picture 514.
  • the electronic device may also present other emotional elements.
  • the electronic device can also adjust the composition, light and shadow effects, saturation, hue, and color of the image in the viewfinder 305, and can also play background music through the speaker 170A to reflect the current emotional state of the user.
  • the user interface 53 shown in FIG. 5D or FIG. 5E may further include a control 517.
  • the control 517 can be used to monitor operations (such as touch operations) that trigger the electronic device to stop presenting emotional elements.
  • the electronic device may stop presenting all emotional elements.
  • in response to the detected operation, the electronic device may jump to the user interface 53 as shown in FIG. 5C. This method makes it convenient for the user to switch the image area 514 between displaying with and without emotional elements.
  • not limited to the aforementioned control 517, other interactive elements may also be used for the user to switch the image area 514 between displaying with and without emotional elements.
  • the user can also make gestures such as a long press, a double-click, or drawing a circle on the image area 514 shown in FIG. 5D or FIG. 5E to switch the image area 514 between displaying with and without emotion elements.
  • the electronic device can present one or more emotional elements according to user needs.
  • the one or more emotional elements are one or more emotional elements corresponding to the emotional state stored in association with the image, or the one or more emotional elements are one or more emotional elements stored in association with the image.
  • Figures 5F-5H show yet another way for an electronic device to present corresponding emotional elements.
  • a small window 518 is displayed in the user interface 53.
  • the small window 518 can be used for the user to select corresponding emotional elements for presentation.
  • the small window 518 may include the icon 518A of the emotional element, the text information "emotional element" 518B, and multiple emotional element options such as "original image" 518C, high saturation 518D, the text bubble "Happy life!" 518E, the emoticon package 518F, the text message "Singing in the daytime must indulge in wine, youth is a good company to return home" 518G, the background music (the song "Today is a good day") 518H, and so on.
  • for the icon 518A, the text information "emotional element" 518B, and the multiple emotional element options 518C-518H, refer to the icon 311A, the text information "emotional element" 311B, and the multiple emotional element options 311C-311H in the small window 311 of FIGS. 3F-3H, which will not be repeated here.
  • the small window 518 may also include a control 518I and/or a control 518J.
  • for the control 518I and the control 518J, refer to the control 311I and the control 311J in the small window 311 in FIGS. 3F-3H, which will not be repeated here.
  • the small window 518 may also include a control 518K and a control 518L.
  • for the control 518K and the control 518L, refer to the control 311K and the control 311L in the small window 311 in FIGS. 3F-3H, which will not be repeated here.
  • the electronic device can simultaneously display images with and without emotional elements.
  • the emotional element may be one or more emotional elements corresponding to the emotional state stored in association with the image, or the emotional element is one or more emotional elements stored in association with the image.
  • the user interface 53 exemplarily shown in FIGS. 5I to 5K may be several ways for the electronic device to display images with and without emotion elements at the same time.
  • a picture 519 without emotional elements can be displayed floating above the picture 520 with emotional elements, that is, the picture 519 covers a part of the display area of the picture 520.
  • the picture 520 with emotional elements can be displayed floating on the picture 519 without emotional elements, that is, the picture 520 covers a part of the display area of the picture 519.
  • the picture 519 without emotional elements and the picture 520 with emotional elements may be displayed in different areas of the user interface 53 respectively.
  • the emotional elements on the picture 520 with emotional elements in the user interface 53 shown in FIGS. 5I-5K may be one or more emotional elements corresponding to the emotional state stored in association with the picture in the image area 514, or one or more emotional elements stored in association with that picture.
  • the picture 520 with an emotional element in the user interface 53 shown in FIGS. 5I to 5K may further include prompt information for prompting the user that the picture 520 presents an emotional element.
  • the prompt information can be represented as an icon, text information, or other forms.
  • the user interface 53 shown in FIGS. 5I-5K may also receive user operations (such as touch operations, double-click operations, etc.), and in response to such a user operation, the electronic device may change the display positions of the picture 519 and the picture 520. This user operation can act on the picture 519 or the picture 520.
  • the manners shown in FIGS. 5I-5K enable users to conveniently compare and view images with and without emotion elements.
  • the user interface exemplarily shown in FIGS. 5D-5K can present emotional elements to the user, and reproduce the emotional state recognized by the electronic device when the picture is acquired, so that the user can have emotional resonance and improve the user experience.
  • when the electronic device recognizes a scene in which the user wants to view images, it can recognize the emotional state of the user in the current scene and push specific images to the user according to that emotional state with a certain push strategy.
  • the user interface exemplarily shown in FIGS. 6A-6B may be a way for the electronic device to recognize the user's emotional state and push a specific image to the user when the electronic device recognizes the scene in which the user wants to view the image.
  • the user interface 51 exemplarily shown in FIG. 6A may be used to display multiple photo albums.
  • the user interface 51 is the same as the user interface 51 shown in FIG. 5A, and reference may be made to related descriptions.
  • when the electronic device displays the user interface 51 as shown in FIG. 6A, the electronic device can confirm that the user is currently in a scene where the user wants to view pictures.
  • the prompt manner may include, but is not limited to, displaying prompt information, playing prompt sound, etc. on the user interface 51 as shown in FIG. 6A.
  • the prompt information can be implemented as an icon, text, etc.
  • the user interface 51 as shown in FIG. 6A may display an "emotion recognition" switch control, and the user can input an operation (such as a touch operation) on the "emotion recognition" switch control to turn on "emotion recognition".
  • the user can also turn on "emotion recognition” in the manner shown in FIGS. 2A-2B.
  • after the electronic device recognizes the user's emotional state, it can push specific images to the user according to that emotional state with a certain push strategy.
  • the push strategy may include but is not limited to: 1. when the current user's emotional state is "sorrow", push to the user images obtained when the user's emotional state was "happy" or "romantic"; such a push strategy can help dispel the user's negative emotions. 2. when the current user's emotional state is "calm", push images preset by the user or the electronic device, for example, images acquired when the user's emotional state was "happy".
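  • The two rules above amount to a mapping from the current emotional state to the album states that should be recommended; the sketch below encodes them directly, with all names being illustrative assumptions.

```kotlin
// Hypothetical encoding of the push strategy above: map the current emotional
// state to the album states whose images should be recommended.
fun albumsToPush(currentState: String, presetAlbums: List<String>): List<String> =
    when (currentState) {
        "sorrow" -> listOf("happy", "romantic") // rule 1: counter negative emotions
        "calm" -> presetAlbums                  // rule 2: fall back to preset albums
        else -> emptyList()
    }

fun demo() {
    // e.g. recommending the happy album entry 605A and romantic album entry 605B
    println(albumsToPush("sorrow", presetAlbums = listOf("happy")))
}
```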
  • the user interface 61 exemplarily shown in FIG. 6B may be a way when the electronic device pushes a specific image for the user.
  • the user interface 61 may include: a status bar 601, a return button, a current page indicator, and one or more album entries.
  • the status bar 601 can refer to the status bar 201 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the return key 604 is an APP-level return key, which can be used to return to the upper level of the menu.
  • the upper level page of the user interface 61 may be the user interface 51 as shown in FIG. 6A.
  • the current page indicators 602 and 603 can be used to indicate the current page.
  • the text information "album” and "recommended” can be used to indicate that the current page is used to display recommended albums.
  • the current page indicators 602 and 603 may also be icons.
  • the album entries may include: one or more album entries recommended by the electronic device according to the current user's emotional state and classified according to the user's emotional state when the electronic device acquires the image, such as happy album entry 605A and romantic album entry 605B.
  • the happy album entry 605A and the romantic album entry 605B may refer to the happy album entry 503A and the romantic album entry 503B in the user interface 51 as shown in FIG. 5A.
  • the happy album entry 605A and the romantic album entry 605B may receive a user operation (such as a touch operation), and in response to the user operation, the electronic device may display the images under the corresponding album.
  • the user interface used to display the images of a certain album may be called the second user interface
  • the images in the album may be called the third image
  • the third image may include the second image saved by the electronic device.
  • the exemplary implementation of the second user interface may be the user interface 52 shown in FIG. 5B
  • the exemplary implementation of the third image may include the pictures 510A, 510B, 510D-510H, 510J-510L and the videos 510C, 510I in the user interface 52 shown in FIG. 5B; an exemplary implementation of the second image may be the picture 510E.
  • the operation for viewing the second image may be referred to as the second operation
  • the user interface for displaying the second image displayed in response to the second operation may be referred to as the third user interface.
  • the emotional element presented in response to the second operation is called the second emotional element.
  • the second emotional element includes one or more emotional elements corresponding to the first emotional state stored in association with the second image, or the second emotional element includes the first emotional element.
  • An exemplary implementation of the second operation is an operation (for example, touch, double-click, or long-press, etc.) acting on the picture 510E in the user interface 52 shown in FIG. 5B.
  • An exemplary implementation of the third user interface may include the user interface 53 shown in FIGS. 5D-5K.
  • the exemplary implementation of the second emotional element may refer to the emotional element presented by the electronic device when displaying the user interface 53 shown in FIGS. 5D to 5K, and may refer to the related description of the foregoing embodiment.
  • the user interface for displaying one or more album entries is called the fourth user interface, and the one or more album entries include the first album entry; the operation that acts on the first album entry to open the second user interface is called the third operation; the entry corresponding to the photo album displayed on the second user interface is the first photo album entry.
  • An exemplary implementation of the fourth user interface may include a user interface 51 as shown in FIG. 5A.
  • An exemplary implementation of the first album entry may be the "happy" album entry 503A in FIG. 5A.
  • An exemplary implementation of the third operation may include an operation (for example, touch, long press operation) acting on the "happy" album entry 503A.
  • alternatively, the user interface for displaying one or more album entries may be referred to as the fourth user interface, and the one or more album entries include the first album entry; the user interface used to display the album entries that the electronic device determines, according to its preset recommendation strategy and the recognized emotional state, from the one or more album entries in the fourth user interface is called the fifth user interface, and the first album entry is displayed in the fifth user interface; the operation that acts on the first album entry on the fifth interface to open the second user interface is called the third operation; the entry corresponding to the album displayed on the second user interface is the first album entry.
  • An exemplary implementation of the fourth user interface may include a user interface 51 as shown in FIG. 6A.
  • An exemplary implementation of the fifth user interface may be the user interface 61 shown in FIG. 6B.
  • An exemplary implementation of the first album entry may be the "happy" album entry 605A in FIG. 6B.
  • An exemplary implementation of the third operation may include an operation (for example, touch, long press operation) acting on the "happy" album entry 605A.
  • the emotional elements corresponding to different emotional states are set by the electronic device by default and stored in the local end of the electronic device or the cloud server in advance, or are set by the cloud server by default and stored in the local end of the electronic device or the cloud server.
  • the electronic device can obtain the emotional element corresponding to a certain emotional state from the local end or the cloud server.
  • the emotional elements corresponding to different emotional states are independently set by the user.
  • FIGS. 7A-7D exemplarily show an implementation manner of user interfaces through which the user can set the emotional elements corresponding to different emotional states.
  • the user interface 71 exemplarily shown in FIG. 7A may be an implementation of the "setting interface".
  • the user interface 71 may be provided by the "Settings” application.
  • the "settings” application is an application installed on electronic devices such as smart phones, tablet computers and the like for setting various functions of the electronic device, and the name of the application is not limited in the embodiment of the application.
  • the user interface 71 may be a user interface opened by the user clicking the setting icon 215 in FIG. 2A.
  • the user interface 71 may include: a status bar 701, a current page indicator 702, a search box 703, an icon 704, and one or more setting items.
  • the status bar 701 can refer to the status bar 201 in the user interface 21 shown in FIG. 2A, which will not be repeated here.
  • the current page indicator 702 may be used to indicate the current page, for example, the text information "settings" may be used to indicate that the current page is used to display one or more setting items. Not limited to text information, the current page indicator 702 may also be an icon.
  • the search box 703 can be used to monitor operations (such as touch operations) of searching for setting items through text.
  • in response to the detected operation, the electronic device may display a text input box, so that the user can enter the setting item to be searched for in the input box.
  • the icon 704 can be used to monitor an operation (such as a touch operation) of searching a setting item by voice.
  • the electronic device may display a voice input interface, so that the user inputs voice in the voice input interface, thereby searching for setting items.
  • Setting items can include: a login Huawei account setting item, wireless and network setting items, device connection setting items, application and notification setting items, battery setting items, display setting items, sound setting items, an emotion recognition setting item 705, security and privacy setting items, user and account setting items, and so on.
  • the expression form of each setting item may include icons and/or text, which is not limited in this application.
  • Each setting item can be used to monitor an operation (such as a touch operation) that triggers the display of the setting content of the corresponding setting item, and in response to the operation, the electronic device can open a user interface for displaying the setting content of the corresponding setting item.
  • the electronic device may display a user interface 72 as shown in FIG. 7B.
  • the user interface 72 is used to display the corresponding content of the emotion recognition setting item.
  • the user interface 72 may include: a status bar, a return button 706, current page indicators 707 and 708, and an emotional state display area 709.
  • the return key 706 is an APP-level return key, which can be used to return to the upper level of the menu.
  • the upper level page of the user interface 72 may be the user interface 71 as shown in FIG. 7A.
  • the current page indicators 707 and 708 can be used to indicate the current page.
  • for example, the text information "settings" and "emotion recognition" can be used to indicate that the current page displays the corresponding content of the emotion recognition setting item. Not limited to text information, the current page indicators 707 and 708 may also be icons.
  • the emotional state display area 709 is used to display one or more emotional state options.
  • the one or more emotional state options may be set by default by the electronic device, for example, set by default when the electronic device leaves the factory, or may be options corresponding to emotional states that match the user's personality as obtained through analysis by the electronic device.
  • the one or more emotional state options may be independently set by the user.
  • the one or more emotional state options include: a "happy" option 709A, a "romantic” option 709D, and a "sorrowful” option 709G.
  • the expression form of the one or more emotional state options may be text information (for example, the text information "happy") or an icon, which is not limited in the embodiment of the present application.
  • the area 709 may also include a delete control corresponding to each emotional state option, such as the delete control 709B corresponding to the "happy" option 709A, the delete control 709E corresponding to the "romantic" option 709D, and the delete control corresponding to the "sorrowful" option 709G.
  • each delete control can monitor an operation (such as a touch operation) used to delete the corresponding emotional state option, and in response to the operation, the electronic device can delete the corresponding emotional state option.
  • the expression form of the delete control can be text information (for example, the text information "Delete") or an icon, which is not limited in the embodiment of the present application.
  • the user interface 72 may also include a control 710.
  • the control 710 can monitor an operation (such as a touch operation) for adding an emotional state option.
  • in response to the detected operation, the electronic device can display an interface for adding an emotional state option, and the user can select or customize the emotional state option to be added on that interface. After the user adds an emotional state option, the option can be added to the area 709 in the user interface 72.
  • the expression form of the control 710 may be text information or an icon, which is not limited in the embodiment of the present application.
  • the area 709 may also include controls for setting the emotional elements corresponding to each emotional state option, such as the setting control 709C corresponding to the "happy" option 709A, the setting control 709F corresponding to the "romantic" option 709D, and the setting control 709I corresponding to the "sorrowful" option 709G.
  • the expression form of a setting control may be text information (for example, the text information "set emotional element") or an icon, which is not limited in the embodiment of the present application.
  • the setting control can monitor an operation (such as a touch operation) used to set the corresponding emotional element, and in response to the operation, the electronic device can open a user interface for setting the corresponding emotional element.
  • the electronic device may display a user interface 73 as shown in FIG. 7C.
  • the user interface 73 is used to display the emotional element setting items corresponding to the emotional state "happy".
  • the user interface 73 may include: a status bar, a return button 711, current page indicators 712 and 713, and one or more emotional element classification options.
  • the return key 711 is an APP-level return key, which can be used to return to the upper level of the menu.
  • the upper level page of the user interface 73 may be the user interface 72 as shown in FIG. 7B.
  • the current page indicators 712 and 713 can be used to indicate the current page. For example, the text information "set emotional element" and "happy" can be used to indicate that the current page is used to display the emotional element setting items corresponding to the emotional state "happy". Not limited to text information, the page indicators 712 and 713 may also be icons.
  • the emotional element classification options may include: light and shadow options 714A, composition options 714B, color options 714C, picture options 714D, text options 714E, and background music options 714F. It is not limited to the emotional element classification options shown in FIG. 7C, and may also include more or less or other emotional element classification options, such as saturation options, hue options, etc., which are not limited in the embodiment of the present application.
  • Each emotional element classification option can monitor the operation (such as touch operation) used to open the user interface for setting the emotional element of the corresponding category, and the user interface can be used for the user to select or customize one or more corresponding emotional elements.
  • in response to an operation (such as a touch operation) detected on the light and shadow option 714A, the electronic device may open a user interface for setting the light and shadow effect corresponding to the emotional state "happy"; in response to an operation (such as a touch operation) detected on the composition option 714B, the electronic device can open the user interface for setting the composition effect corresponding to the emotional state "happy"; in response to an operation (such as a touch operation) detected on the color option 714C, the electronic device can open the user interface for setting the color effect corresponding to the emotional state "happy"; in response to an operation (such as a touch operation) detected on the picture option 714D, the electronic device can open the user interface for setting the picture corresponding to the emotional state "happy"; in response to an operation (such as a touch operation) detected on the text option 714E, the electronic device can open the user interface for setting the text corresponding to the emotional state "happy"; and similarly for the background music option 714F.
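  • This per-category routing is essentially a dispatch on the tapped option; a minimal sketch under that assumption follows (the option identifiers and the openSettingPage helper are hypothetical, not part of this application).

```kotlin
// Hypothetical dispatch of a tapped category option to its setting page;
// the enum values and openSettingPage helper are illustrative assumptions.
enum class ElementCategory {
    LIGHT_AND_SHADOW, COMPOSITION, COLOR, PICTURE, TEXT, BACKGROUND_MUSIC
}

fun openSettingPage(category: ElementCategory, state: String) {
    // A real implementation would launch the corresponding setting UI here.
    println("Opening $category settings for emotional state \"$state\"")
}

// e.g. tapping the color option 714C while editing the "happy" state:
fun onOptionTapped(category: ElementCategory) = openSettingPage(category, "happy")
```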
  • FIG. 7D shows the user interface 74 opened by the electronic device for setting the color corresponding to the emotional state "happy".
  • the user interface 74 may include: a status bar, a return key 716, current page indicators 717 and 718, and one or more color options.
  • the return key 716 is an APP-level return key, which can be used to return to the upper level of the menu.
  • the upper level page of the user interface 74 may be the user interface 73 as shown in FIG. 7C.
  • the current page indicators 717 and 718 can be used to indicate the current page. For example, the text information "set emotional element" and "color" can be used to indicate that the current page is used to set emotional elements of the category "color". Not limited to text information, the current page indicators 717 and 718 may also be icons.
  • Color options can include: Vivid 719, Gray 720, Dim 721, Normal 722. It is not limited to the several color options as shown in FIG. 7D, and may also include more or less or other color options, which is not limited in the embodiment of the present application.
  • Each color option can receive a user operation (such as a touch operation), and in response to the user operation, the electronic device can set the color corresponding to the color option as an emotional element corresponding to the emotional state "happy".
  • prompt information may be displayed on the user interface 74 to prompt the user that the emotional element is set as an emotional element corresponding to the emotional state "happy".
  • the prompt information may be, for example, an icon 723 as shown in FIG. 7D, and the icon 723 may prompt the user that the color "bright" is set as an emotional element corresponding to the emotional state "happy".
  • the user can also set other types of emotional elements on other user interfaces similar to the user interface 74.
  • the emotional elements set by the user can be stored in the local end of the electronic device, or can be stored in the cloud server.
  • the user can select background music on the local end of the electronic device, or select background music on the cloud server, which is not limited in the embodiment of the present application.
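  • As an illustration of persisting such a user choice (for example, setting the color "vivid" as an emotional element for "happy"), the sketch below keeps per-state selections in a simple key-value structure; the storage shape is an assumption, not specified by the application.

```kotlin
// Hypothetical sketch: persist the user's per-state element selections as a
// simple nested map; a real device might back this with local preferences
// or a cloud profile, as described above.
class EmotionElementSettings {
    private val selections = mutableMapOf<String, MutableMap<String, String>>()

    fun setElement(state: String, category: String, value: String) {
        selections.getOrPut(state) { mutableMapOf() }[category] = value
    }

    fun elementsFor(state: String): Map<String, String> = selections[state].orEmpty()
}

fun demo() {
    val settings = EmotionElementSettings()
    settings.setElement("happy", category = "color", value = "vivid") // e.g. option 719
    settings.setElement("happy", category = "music", value = "today_is_a_good_day")
    println(settings.elementsFor("happy")) // {color=vivid, music=today_is_a_good_day}
}
```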
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method for recording a user emotion and to a related apparatus. In the method for recording a user emotion, an electronic device can recognize an emotional state of a user while an image is being previewed in a photographing scenario, and associatively store a photographed image and the recognized emotional state of the user at the time the image is photographed, or associatively store the photographed image and an emotion element corresponding to the recognized emotional state of the user at the time the image is photographed. In this way, a user emotion can be recorded, the focus is placed on the user's emotion, and the usage experience of the user is improved.
PCT/CN2020/081666 2019-03-28 2020-03-27 Method for recording a user emotion and related apparatus WO2020192761A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910245272.0 2019-03-28
CN201910245272.0A CN110059211B (zh) Method for recording user emotion and related device

Publications (1)

Publication Number Publication Date
WO2020192761A1 true WO2020192761A1 (fr) 2020-10-01

Family

ID=67317852

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/081666 WO2020192761A1 (fr) Method for recording user emotion and related apparatus

Country Status (2)

Country Link
CN (1) CN110059211B (fr)
WO (1) WO2020192761A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114710553A (zh) * 2020-12-30 2022-07-05 本田技研工业(中国)投资有限公司 Information acquisition method, information pushing method, and terminal device

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110059211B (zh) * 2019-03-28 2024-03-01 华为技术有限公司 Method for recording user emotion and related apparatus
CN110570844B (zh) * 2019-08-15 2023-05-05 平安科技(深圳)有限公司 Speech emotion recognition method and apparatus, and computer-readable storage medium
CN110825968B (zh) * 2019-11-04 2024-02-13 腾讯科技(深圳)有限公司 Information pushing method and apparatus, storage medium, and computer device
CN111669462B (zh) * 2020-05-30 2022-09-02 华为技术有限公司 Image display method and related apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101997969A (zh) * 2009-08-13 2011-03-30 索尼爱立信移动通讯有限公司 Method and apparatus for adding sound annotation to a picture, and mobile terminal including the apparatus
CN103888696A (zh) * 2014-03-20 2014-06-25 深圳市中兴移动通信有限公司 Photo shooting and viewing method, and shooting apparatus
CN108989662A (zh) * 2013-09-30 2018-12-11 北京三星通信技术研究有限公司 Shooting control method and terminal device
CN109191370A (zh) * 2018-08-06 2019-01-11 光锐恒宇(北京)科技有限公司 Image processing method and apparatus, intelligent terminal, and computer-readable storage medium
CN110059211A (zh) * 2019-03-28 2019-07-26 华为技术有限公司 Method for recording user emotion and related apparatus

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117651A1 (en) * 2001-12-26 2003-06-26 Eastman Kodak Company Method for using affective information recorded with digital images for producing an album page
CN102780651A (zh) * 2012-07-21 2012-11-14 上海量明科技发展有限公司 Method, client, and system for adding emotion data to instant messaging messages
CN103257796B (zh) * 2013-05-15 2016-08-24 华为终端有限公司 Mobile communication device and method for switching its theme color
US20150178915A1 (en) * 2013-12-19 2015-06-25 Microsoft Corporation Tagging Images With Emotional State Information
CN104063683B (zh) * 2014-06-06 2017-05-17 北京搜狗科技发展有限公司 Expression input method and apparatus based on face recognition
CN105100435A (zh) * 2015-06-15 2015-11-25 百度在线网络技术(北京)有限公司 Mobile communication application method and apparatus
CN107391997A (zh) * 2017-08-25 2017-11-24 突维科技有限公司 Digital photo frame apparatus and control method thereof
CN108470188B (zh) * 2018-02-26 2022-04-22 北京物灵智能科技有限公司 Interaction method based on image analysis, and electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101997969A (zh) * 2009-08-13 2011-03-30 索尼爱立信移动通讯有限公司 Method and apparatus for adding sound annotation to a picture, and mobile terminal including the apparatus
CN108989662A (zh) * 2013-09-30 2018-12-11 北京三星通信技术研究有限公司 Shooting control method and terminal device
CN103888696A (zh) * 2014-03-20 2014-06-25 深圳市中兴移动通信有限公司 Photo shooting and viewing method, and shooting apparatus
CN109191370A (zh) * 2018-08-06 2019-01-11 光锐恒宇(北京)科技有限公司 Image processing method and apparatus, intelligent terminal, and computer-readable storage medium
CN110059211A (zh) * 2019-03-28 2019-07-26 华为技术有限公司 Method for recording user emotion and related apparatus

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114710553A (zh) * 2020-12-30 2022-07-05 本田技研工业(中国)投资有限公司 Information acquisition method, information pushing method, and terminal device

Also Published As

Publication number Publication date
CN110059211B (zh) 2024-03-01
CN110059211A (zh) 2019-07-26

Similar Documents

Publication Publication Date Title
JP7142783B2 (ja) Voice control method and electronic device
CN110114747B (zh) Notification processing method and electronic device
WO2021213164A1 (fr) Method for interaction between application interfaces, electronic device, and computer-readable storage medium
WO2020078299A1 (fr) Method for processing a video file and electronic device
CN112148400B (zh) Display method and apparatus in a locked state
WO2020192761A1 (fr) Method for recording user emotion and related apparatus
CN114706664A (zh) Interaction method for cross-device task processing, electronic device, and storage medium
WO2021082835A1 (fr) Function enabling method and electronic device
WO2021036770A1 (fr) Split-screen processing method and terminal device
CN111078091A (zh) Split-screen display processing method and apparatus, and electronic device
CN114390139B (zh) Method for an electronic device to present a video during an incoming call, electronic device, and storage medium
WO2021013132A1 (fr) Input method and electronic device
US20220343648A1 (en) Image selection method and electronic device
WO2021082815A1 (fr) Display element display method and electronic device
WO2022068819A1 (fr) Interface display method and related apparatus
WO2022037726A1 (fr) Split-screen display method and electronic device
WO2022042766A1 (fr) Information display method, terminal device, and computer-readable storage medium
WO2024045801A1 (fr) Screenshot method, electronic device, medium, and program product
CN112068907A (zh) Interface display method and electronic device
CN112150499A (zh) Image processing method and related apparatus
CN113949803A (zh) Photographing method and electronic device
CN112449101A (zh) Shooting method and electronic device
WO2023207799A1 (fr) Message processing method and electronic device
CN114244951B (zh) Method for opening a page by an application program, and medium and electronic device thereof
WO2024114212A1 (fr) Cross-device focus switching method, electronic device, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20778242

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20778242

Country of ref document: EP

Kind code of ref document: A1