CN113144374A - Method and device for adjusting user state based on intelligent wearable device - Google Patents


Info

Publication number
CN113144374A
CN113144374A (application number CN202110384332.4A)
Authority
CN
China
Prior art keywords
user
current
state
wearable device
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110384332.4A
Other languages
Chinese (zh)
Inventor
李志业
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Search Information Technology Co ltd
Original Assignee
Shanghai Search Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Search Information Technology Co ltd filed Critical Shanghai Search Information Technology Co ltd
Priority to CN202110384332.4A priority Critical patent/CN113144374A/en
Publication of CN113144374A publication Critical patent/CN113144374A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M21/02 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/681 Wristwatch-type devices
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G04G21/02 Detectors of external physical values, e.g. temperature
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G04G21/02 Detectors of external physical values, e.g. temperature
    • G04G21/025 Detectors of external physical values, e.g. temperature for measuring physiological data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/165 Detection; Localisation; Normalisation using facial parts and geometric relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0027 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the hearing sense

Abstract

The method comprises: obtaining a user's current heartbeat value and motion state collected by a smart wearable device; determining, based on the current heartbeat value and motion state, whether to start the camera of the wearable device, and if so, identifying the user's current emotional state with the camera; and making an adjustment for the identified current emotional state. The user's emotional state is thus detected in time and adjusted accordingly, helping the user leave a poor emotional atmosphere and improving the user experience.

Description

Method and device for adjusting user state based on intelligent wearable device
Technical Field
The present application relates to the field of computers, and in particular to a method and device for adjusting a user's state based on a smart wearable device.
Background
With the development of electronic technology, the miniaturization and weight reduction of smartphones have reached a bottleneck, and wearable devices have become a field of wide attention. The smart watch, one representative wearable device, has gradually become the main way for parents and children in a family to stay in contact. Existing smart watches can display incoming-call information, news, weather and the like, and can synchronize calls, short messages, e-mails and music from a mobile phone, but they cannot intelligently detect the user's emotional state and make a corresponding adjustment.
Disclosure of Invention
An object of the present application is to provide a method and device for adjusting a user's state based on a smart wearable device, solving the problem in the prior art that smart wearable devices cannot intelligently detect the user's state and adjust it accordingly.
According to one aspect of the application, a method for adjusting a user state based on a smart wearable device is provided, and the method comprises the following steps:
acquiring a user's current heartbeat value and motion state collected by the smart wearable device;
determining whether to start a camera of the wearable device based on the user's current heartbeat value and motion state, and if so, identifying the user's current emotional state based on the camera;
making an adjustment for the identified current emotional state of the user.
Optionally, before obtaining the current heartbeat value and the motion state of the user collected by the smart wearable device, the method includes:
acquiring a normal heartbeat data interval of the user through a heart rate sensor in the intelligent wearable device;
and setting a state detection time period of the wearable device.
Optionally, obtaining the current heartbeat value and the motion state of the user collected by the intelligent wearable device includes:
simultaneously starting a heart rate sensor and a gravity sensor in the intelligent wearable device within the state detection time period;
and acquiring the current heartbeat value of the user through the heart rate sensor and acquiring the current motion state of the user through the gravity sensor.
Optionally, determining whether to start the camera of the wearable device based on the user's current heartbeat value and motion state includes:
if the user's current motion state is in motion, waiting for a preset time and then detecting again, through the gravity sensor, whether the user is now in a non-motion state;
if so, determining whether to start the camera of the wearable device based on the user's current heartbeat value and the normal heartbeat data interval.
Optionally, identifying the user's current emotional state based on the camera includes:
taking photographs at timed intervals with the camera to obtain a plurality of images, and performing emotion recognition on the obtained images to determine the user's current emotional state.
Optionally, performing emotion recognition on the obtained plurality of images to determine a current emotional state of the user, including:
and identifying a plurality of target key points on the face part of the obtained images, and determining the current emotional state of the user according to the target key points.
Optionally, adjusting the identified current emotional state of the user comprises:
determining whether the identified current emotional state of the user belongs to a type of emotional state to be adjusted, and if so, automatically playing target music to adjust the user's current emotional state.
Optionally, the method comprises:
reporting the user data detected by the smart wearable device and the identified current emotional state of the user to a server, and sending them to a mobile terminal through an application program, wherein the detected user data includes the user's heartbeat value and motion state and the image information collected by the camera.
According to another aspect of the present application, there is also provided an apparatus for adjusting a user status based on a smart wearable device, the apparatus including:
an acquisition device, used for obtaining the user's current heartbeat value and motion state collected by the smart wearable device;
an identification device, used for determining whether to start a camera of the wearable device based on the user's current heartbeat value and motion state, and if so, identifying the user's current emotional state based on the camera;
and an adjusting device, used for adjusting the identified current emotional state of the user.
According to another aspect of the present application, there is also provided a device for adjusting a user status based on a smart wearable device, the device including:
one or more processors; and
a memory storing computer readable instructions that, when executed, cause the processor to perform the operations of the method as previously described.
According to yet another aspect of the present application, there is also provided a computer readable medium having computer readable instructions stored thereon, the computer readable instructions being executable by a processor to implement the method as described above.
Compared with the prior art, the present application obtains a user's current heartbeat value and motion state collected by a smart wearable device; determines, based on them, whether to start the camera of the wearable device, and if so, identifies the user's current emotional state based on the camera; and makes an adjustment for the identified current emotional state. The user's emotional state is thus detected in time and adjusted accordingly, helping the user leave a poor emotional atmosphere and improving the user experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 illustrates a flow diagram of a method for adjusting a user state based on a smart wearable device according to an aspect of the present application;
fig. 2 is a flowchart illustrating a method for adjusting a user's mood based on a smart watch according to an embodiment of the present application;
fig. 3 shows a schematic structural diagram of a device for adjusting a user state based on a smart wearable device according to still another aspect of the present application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (e.g., Central Processing Units (CPUs)), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-Change RAM (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, magnetic cassette tape, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer readable media do not include transitory media, such as modulated data signals and carrier waves.
Fig. 1 shows a flowchart of a method for adjusting a user state based on a smart wearable device according to an aspect of the present application. The method includes steps S11 to S13. In step S11, the user's current heartbeat value and motion state collected by the smart wearable device are obtained; in step S12, whether to start the camera of the wearable device is determined based on the current heartbeat value and motion state, and if so, the user's current emotional state is identified based on the camera; in step S13, the identified current emotional state of the user is adjusted. The user's emotional state is thus detected in time and adjusted accordingly, helping the user leave a poor emotional atmosphere and improving the user experience.
Specifically, in step S11, the user's current heartbeat value and motion state collected by the smart wearable device are obtained. Here, the smart wearable device is preferably a smart watch; after the user puts it on, the watch collects the user's current heartbeat value and motion state. The motion state is classified as either in motion or not in motion, for use in the subsequent judgment of the user's emotional state.
Specifically, in step S12, whether to start the camera of the wearable device is determined based on the user's current heartbeat value and motion state, and if so, the user's current emotional state is identified based on the camera. Here, from the collected current heartbeat value and motion state it can be judged whether the camera needs to be started for further photographic analysis, through which the user's current emotional state is then confirmed.
Specifically, in step S13, the identified current emotional state of the user is adjusted. The current emotional state is, for example, happiness, anger, fear, sadness or disgust. After the user's current emotional state has been confirmed through the camera, the smart wearable device automatically plays corresponding music to adjust the user's mood, for example playing relaxing music when the mood is poor, so that the user can leave the bad emotional atmosphere as soon as possible.
In an embodiment of the application, before the user's current heartbeat value and motion state collected by the smart wearable device are acquired, the user's normal heartbeat data interval can be obtained through a heart rate sensor in the device, and a state detection time period of the wearable device can be set. The user's information is collected by the sensors built into the smart wearable device, which include a heart rate sensor, a camera and a gravity sensor. The heart rate sensor is used to obtain the user's normal heartbeat range, for example 60 to 80 beats per minute. The smart wearable device then sets the time period during which state (emotion) detection is enabled, for example the daytime or a certain period of the day; to detect a child's mood after school, the period might be set from 4 pm to 8 pm.
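As an illustrative sketch (not part of the patent), the normal heartbeat interval and the detection window described above could be represented as a small configuration object; all names and the concrete values are assumptions taken from the examples in the text:

```python
from dataclasses import dataclass
import datetime


@dataclass
class DetectionConfig:
    # Normal resting heartbeat interval measured beforehand by the
    # heart rate sensor, e.g. 60-80 beats per minute as in the text.
    normal_hr_min: int = 60
    normal_hr_max: int = 80
    # State detection window, e.g. 4 pm to 8 pm for an after-school check.
    window_start: datetime.time = datetime.time(16, 0)
    window_end: datetime.time = datetime.time(20, 0)

    def in_window(self, now: datetime.time) -> bool:
        # The heart rate and gravity sensors are only switched on
        # inside this window.
        return self.window_start <= now <= self.window_end


cfg = DetectionConfig()
print(cfg.in_window(datetime.time(17, 30)))  # True
print(cfg.in_window(datetime.time(9, 0)))    # False
```

A real device would load these values from user settings rather than hard-code them; the sketch only fixes the two pieces of state the method needs before detection starts.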
Next, in step S11, the heart rate sensor and the gravity sensor in the smart wearable device are turned on simultaneously within the state detection time period; the heart rate sensor collects the user's current heartbeat value, and the gravity sensor collects the user's current motion state.
In an embodiment of the present application, in step S12, if the user's current motion state is in motion, the device waits for a preset time (for example, 5 minutes) and then detects again, through the gravity sensor, whether the user is now in a non-motion state; if so, it proceeds to judge whether to start the camera. When the user is in a non-motion state, the detected heartbeat value is compared with the normal heartbeat data interval; for example, if the detected value is more than 10 beats per minute above the maximum of the normal interval, the camera is started to photograph the user at timed intervals, so that emotion recognition can be performed on the obtained images.
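The camera-start rule above can be sketched as a single predicate. The 10 beats-per-minute margin comes from the example in the text, while the function name and signature are assumptions for illustration:

```python
def should_start_camera(current_hr: int, in_motion: bool,
                        normal_hr_max: int, margin: int = 10) -> bool:
    """Decide whether to start the camera for emotion analysis."""
    if in_motion:
        # Still moving: the device waits the preset time and
        # re-checks the gravity sensor instead of photographing.
        return False
    # Photograph only when the heartbeat is more than `margin`
    # beats per minute above the top of the normal interval.
    return current_hr > normal_hr_max + margin


print(should_start_camera(95, False, 80))  # True: 95 > 80 + 10
print(should_start_camera(85, False, 80))  # False: within the margin
print(should_start_camera(95, True, 80))   # False: user still in motion
```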
In step S12, the camera takes photographs at timed intervals to obtain a plurality of images, and emotion recognition is performed on them to determine the user's current emotional state. After the camera has been started, it is further confirmed which emotional state the user is in by recognizing the captured images. Specifically, a number of target key points on the face in the obtained images are identified, and the user's current emotional state, such as happiness, sadness, anger or fear, is determined from them. The target key points include the corners of the mouth, the corners of the eyes, the eyebrows and the like. For example, the user is judged to be happy when the corners of the mouth turn up and ring-shaped wrinkles appear around the eyes; sad when tears or knitted eyebrows are detected; and angry when the user frowns with wide-open eyes.
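The keypoint rules just described might be encoded as a simple rule table. The boolean features below are hypothetical stand-ins for whatever cues a real keypoint detector would extract from the mouth corners, eye corners and eyebrows; the patent does not specify a detector:

```python
def classify_emotion(features: dict) -> str:
    # `features` holds boolean cues derived from the target key points
    # (mouth corners, eye corners, eyebrows) found in each image.
    if features.get("mouth_corners_up") and features.get("eye_ring_wrinkles"):
        return "happy"
    if features.get("tears") or features.get("brows_knitted"):
        return "sad"
    if features.get("frown") and features.get("eyes_wide_open"):
        return "angry"
    return "unknown"


print(classify_emotion({"mouth_corners_up": True,
                        "eye_ring_wrinkles": True}))  # happy
print(classify_emotion({"tears": True}))              # sad
print(classify_emotion({"frown": True,
                        "eyes_wide_open": True}))     # angry
```

In practice the decision over several timed photographs would be aggregated (e.g. by majority vote) rather than taken from a single frame.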
In an embodiment of the present application, in step S13, it is determined whether the identified current emotional state of the user belongs to a type of emotional state to be adjusted, and if so, target music is automatically played to adjust the user's current emotional state. Here, the types of emotional state to be adjusted include sadness, anger, fear and disgust; when the identified current emotional state belongs to any of these, the smart wearable device automatically plays soothing music for adjustment.
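A minimal sketch of this adjustment decision: the set of states to adjust is taken from the text, while the action name returned is purely illustrative:

```python
# Emotional states that trigger an adjustment, per the embodiment above.
STATES_TO_ADJUST = {"sad", "angry", "fear", "disgust"}


def adjustment_for(emotion: str):
    # Returns the action to take, or None when no adjustment is
    # needed (e.g. for "happy" or "excited").
    if emotion in STATES_TO_ADJUST:
        return "play_soothing_music"
    return None


print(adjustment_for("sad"))    # play_soothing_music
print(adjustment_for("happy"))  # None
```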
In an embodiment of the present application, the method includes: reporting the user data detected by the smart wearable device and the identified current emotional state of the user to a server, and sending them to a mobile terminal through an application program, the detected user data including the user's heartbeat value and motion state and the image information collected by the camera. The heart rate value and motion state detected each time, the captured images and the recognized current emotional state are recorded and reported to the server, which facilitates subsequent analysis and the giving of guidance suggestions; they are also sent through the application program (APP) to the mobile terminal for the parents, so that the parents can keep track of the situation.
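The report described above could be serialized as a small JSON payload before upload; the field names below are not specified by the patent and are chosen here only for illustration:

```python
import json


def build_report(heart_rate: int, motion_state: str,
                 emotion: str, image_ids: list) -> str:
    # JSON payload uploaded to the server and forwarded to the
    # parent's mobile terminal via the APP; field names are
    # illustrative assumptions, not taken from the patent.
    return json.dumps({
        "heart_rate": heart_rate,
        "motion_state": motion_state,
        "emotion": emotion,
        "images": image_ids,
    })


report = build_report(95, "non-motion", "sad", ["img_001.jpg"])
print(report)
```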
In a specific embodiment of the present application, as shown in fig. 2, the user's motion state data is acquired through the gravity sensor in the smart watch. If the user is in motion, the gravity sensor acquires new motion state data again after a preset time; if the user is in a non-motion state, test data is acquired through the heart rate sensor and compared against the maximum of the normal heartbeat interval. If the test data does not exceed the maximum, it is reported to the server; if it does exceed the maximum, the camera takes photographs and emotion recognition is performed. When one of sadness, anger, fear or disgust is recognized, soothing music is automatically played and the recognized emotion is reported to the server; if happiness or excitement is recognized, the emotional state is directly reported to the server.
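Pulling the fig. 2 steps together, one detection cycle might look like the following sketch. The sensor and actuator callables are injected so the control flow can be read on its own; every name, and the use of the 10 beats-per-minute margin from the earlier embodiment, is an assumption:

```python
def detection_cycle(read_motion, read_heart_rate, capture_and_recognize,
                    play_music, report, normal_hr_max=80, margin=10,
                    wait=lambda: None):
    # One pass of the fig. 2 flow.
    if read_motion() == "in_motion":
        wait()  # preset delay, e.g. 5 minutes
        if read_motion() == "in_motion":
            return "deferred"  # still moving, try again later
    hr = read_heart_rate()
    if hr <= normal_hr_max + margin:
        report({"heart_rate": hr})  # normal reading: just report it
        return "normal"
    emotion = capture_and_recognize()  # timed photos + keypoint analysis
    if emotion in {"sad", "angry", "fear", "disgust"}:
        play_music("soothing")
    report({"heart_rate": hr, "emotion": emotion})
    return emotion


# Example run with stubbed sensors: resting user, elevated heartbeat, sad face.
result = detection_cycle(
    read_motion=lambda: "non_motion",
    read_heart_rate=lambda: 95,
    capture_and_recognize=lambda: "sad",
    play_music=lambda kind: None,
    report=lambda payload: None,
)
print(result)  # sad
```

On a real watch each injected callable would wrap the corresponding sensor driver, music player, and network client; injecting them also makes the branch logic easy to unit-test.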
In addition, a computer readable medium is provided in the embodiments of the present application, and computer readable instructions stored thereon are executable by a processor to implement the foregoing method for adjusting the user status based on the smart wearable device.
In correspondence with the method described above, the present application also provides a terminal, which includes modules or units capable of executing the method steps described in fig. 1 or fig. 2 or various embodiments, and these modules or units can be implemented by hardware, software or a combination of hardware and software, and the present application is not limited thereto. For example, in an embodiment of the present application, there is also provided an apparatus for adjusting a user state based on a smart wearable apparatus, the apparatus including:
one or more processors; and
a memory storing computer readable instructions that, when executed, cause the processor to perform the operations of the method as previously described.
For example, the computer readable instructions, when executed, cause the one or more processors to:
acquiring a user's current heartbeat value and motion state collected by the smart wearable device;
determining whether to start a camera of the wearable device based on the user's current heartbeat value and motion state, and if so, identifying the user's current emotional state based on the camera;
making an adjustment for the identified current emotional state of the user.
Fig. 3 shows a schematic structural diagram of an apparatus for adjusting a user state based on a smart wearable device according to another aspect of the present application, the apparatus including: the device comprises an acquisition device 11, an identification device 12 and an adjustment device 13, wherein the acquisition device 11 is used for acquiring the current heartbeat value and the motion state of the user, which are acquired by the intelligent wearable device; the identification device 12 is configured to determine whether to start a camera of the wearable device based on the current heartbeat value and the motion state of the user, and if so, identify the current emotional state of the user based on the camera; the adjusting means 13 is used to adjust the identified current emotional state of the user.
It should be noted that the content executed by the obtaining device 11, the identifying device 12 and the adjusting device 13 is the same as or corresponding to the content in the above steps S11, S12 and S13, and for brevity, will not be described again.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Program instructions which invoke the methods of the present application may be stored on a fixed or removable recording medium and/or transmitted via a data stream on a broadcast or other signal-bearing medium and/or stored within a working memory of a computer device operating in accordance with the program instructions. An embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or a solution according to the aforementioned embodiments of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (10)

1. A method for adjusting a user state based on a smart wearable device, characterized by comprising:
acquiring a current heartbeat value and a current motion state of a user collected by the smart wearable device;
determining whether to activate a camera of the wearable device based on the current heartbeat value and the current motion state of the user, and if so, identifying a current emotional state of the user based on the camera;
and adjusting the identified current emotional state of the user.
2. The method of claim 1, wherein acquiring the current heartbeat value and the current motion state of the user collected by the smart wearable device comprises:
acquiring the user's normal heartbeat interval through a heart rate sensor in the smart wearable device;
and setting a state-detection time period for the wearable device.
3. The method of claim 2, wherein acquiring the current heartbeat value and the current motion state of the user collected by the smart wearable device comprises:
simultaneously activating a heart rate sensor and a gravity sensor in the smart wearable device during the state-detection time period;
and acquiring the user's current heartbeat value through the heart rate sensor and the user's current motion state through the gravity sensor.
4. The method of claim 3, wherein determining whether to activate a camera of the wearable device based on the current heartbeat value and the current motion state of the user comprises:
if the user's current motion state is "in motion", waiting a preset time and then re-detecting, through the gravity sensor, whether the user is in a non-motion state;
and if so, determining whether to activate the camera of the wearable device based on the user's current heartbeat value and the normal heartbeat interval.
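The camera-activation decision described in claims 3 and 4 can be sketched as follows. This is an illustrative sketch only: the interval bounds, function names, and re-check mechanism are assumptions for the example, not values or APIs taken from the patent.

```python
# Hypothetical sketch of the camera-activation decision in claims 3-4.
def should_activate_camera(heartbeat, normal_interval, in_motion,
                           recheck_motion=lambda: False):
    """Decide whether the wearable's camera should be turned on.

    heartbeat:        current heartbeat value from the heart rate sensor
    normal_interval:  (low, high) bounds of the user's normal heartbeat interval
    in_motion:        current motion state from the gravity sensor
    recheck_motion:   callable that re-samples the motion state after a
                      preset wait (claim 4); returns True if still moving
    """
    if in_motion:
        # Claim 4: wait a preset time, then re-detect the motion state.
        if recheck_motion():
            # Still exercising; an elevated heart rate is expected,
            # so do not activate the camera.
            return False
    low, high = normal_interval
    # Activate the camera only when the heartbeat falls outside the
    # user's normal heartbeat interval.
    return heartbeat < low or heartbeat > high
```

For example, with a hypothetical normal interval of (55, 95), a resting heartbeat of 110 would trigger camera activation, while the same reading during continued motion would not.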
5. The method of any one of claims 1 to 4, wherein identifying the current emotional state of the user based on the camera comprises:
capturing a plurality of images at timed intervals via the camera, and performing emotion recognition on the captured images to determine the current emotional state of the user.
6. The method of claim 5, wherein performing emotion recognition on the captured images to determine the current emotional state of the user comprises:
identifying a plurality of target key points in the facial region of the captured images, and determining the current emotional state of the user from the target key points.
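As a toy illustration of claim 6, an emotional state can be derived from facial key points. The key-point layout, thresholds, and geometric heuristic below are invented for the sketch; the patent does not specify them, and a practical implementation would use a trained facial-landmark and emotion-classification model.

```python
# Toy sketch: infer an emotional state from mouth key points.
# Coordinates are (x, y) image pixels with y increasing downward.
def classify_emotion(keypoints):
    """keypoints: dict mapping key-point names to (x, y) coordinates."""
    left = keypoints["mouth_left"]
    right = keypoints["mouth_right"]
    center = keypoints["mouth_center"]
    # Mouth corners above the mouth center suggest a smile; below, a frown.
    avg_corner_y = (left[1] + right[1]) / 2
    if avg_corner_y < center[1] - 2:
        return "happy"
    if avg_corner_y > center[1] + 2:
        return "sad"
    return "neutral"
```

With several timed captures (claim 5), the per-image labels could then be aggregated, e.g. by majority vote, into the user's current emotional state.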
7. The method of claim 1, wherein adjusting the identified current emotional state of the user comprises:
determining whether the identified current emotional state of the user belongs to a type of emotional state to be adjusted, and if so, automatically playing target music to adjust the user's current emotional state.
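The adjustment step of claim 7 reduces to a membership test plus a lookup. The particular set of states to adjust and the playlist mapping below are illustrative assumptions; the patent leaves them unspecified.

```python
# Hypothetical sketch of claim 7: auto-play target music when the
# recognized emotion belongs to the set of states to be adjusted.
STATES_TO_ADJUST = {"sad", "angry", "anxious"}
TARGET_MUSIC = {
    "sad": "uplifting_playlist",
    "angry": "calming_playlist",
    "anxious": "calming_playlist",
}

def select_target_music(current_state):
    """Return the playlist to auto-play, or None if no adjustment is needed."""
    if current_state in STATES_TO_ADJUST:
        return TARGET_MUSIC[current_state]
    return None
```

A "happy" or "neutral" state would thus leave playback untouched, while a state in the to-adjust set triggers the corresponding target music.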
8. The method of claim 1, further comprising:
reporting the user data detected by the smart wearable device, together with the identified current emotional state of the user, to a server, and sending the user data to a mobile terminal through an application, wherein the detected user data comprises the user's heartbeat value and motion state and the image information collected by the camera.
9. A device for adjusting a user state based on a smart wearable device, characterized in that the device comprises:
an acquisition device configured to acquire a current heartbeat value and a current motion state of a user collected by the smart wearable device;
an identification device configured to determine whether to activate a camera of the wearable device based on the current heartbeat value and the current motion state of the user, and if so, to identify a current emotional state of the user based on the camera;
and an adjustment device configured to adjust the identified current emotional state of the user.
10. A computer readable medium having computer readable instructions stored thereon which are executable by a processor to implement the method of any one of claims 1 to 8.
CN202110384332.4A 2021-04-09 2021-04-09 Method and device for adjusting user state based on intelligent wearable device Pending CN113144374A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110384332.4A CN113144374A (en) 2021-04-09 2021-04-09 Method and device for adjusting user state based on intelligent wearable device

Publications (1)

Publication Number Publication Date
CN113144374A true CN113144374A (en) 2021-07-23

Family

ID=76889723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110384332.4A Pending CN113144374A (en) 2021-04-09 2021-04-09 Method and device for adjusting user state based on intelligent wearable device

Country Status (1)

Country Link
CN (1) CN113144374A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104434142A (en) * 2014-11-14 2015-03-25 惠州Tcl移动通信有限公司 Wearable device and emotion reminding method
CN104699972A (en) * 2015-03-18 2015-06-10 小米科技有限责任公司 Emotion recognition reminding method and device
CN105536118A (en) * 2016-02-19 2016-05-04 京东方光科技有限公司 Emotion regulation device, wearable equipment and cap with function of relieving emotion
CN105726045A (en) * 2016-01-28 2016-07-06 惠州Tcl移动通信有限公司 Emotion monitoring method and mobile terminal thereof
CN106599057A (en) * 2016-11-18 2017-04-26 上海斐讯数据通信技术有限公司 Music rhythm control terminal and method adaptive to user emotion
CN106725473A (en) * 2016-12-29 2017-05-31 杭州联络互动信息科技股份有限公司 A kind of method and device that emotional state is adjusted based on intelligent wearable device
CN107666539A (en) * 2017-10-12 2018-02-06 广东小天才科技有限公司 The information processing method and wearable device of a kind of wearable device
CN107714056A (en) * 2017-09-06 2018-02-23 上海斐讯数据通信技术有限公司 A kind of wearable device of intellectual analysis mood and the method for intellectual analysis mood
CN108594991A (en) * 2018-03-28 2018-09-28 努比亚技术有限公司 A kind of method, apparatus and computer storage media that help user to adjust mood
CN108604246A (en) * 2016-12-29 2018-09-28 华为技术有限公司 A kind of method and device adjusting user emotion
CN108765869A (en) * 2018-05-31 2018-11-06 深圳市零度智控科技有限公司 Children's safety wrist-watch based on recognition of face
CN109460752A (en) * 2019-01-10 2019-03-12 广东乐心医疗电子股份有限公司 Emotion analysis method and device, electronic equipment and storage medium
CN109871807A (en) * 2019-02-21 2019-06-11 百度在线网络技术(北京)有限公司 Face image processing process and device
CN110399836A (en) * 2019-07-25 2019-11-01 深圳智慧林网络科技有限公司 User emotion recognition methods, device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210723