CN116211247A - Health detection method, device and storage medium - Google Patents

Health detection method, device and storage medium

Info

Publication number
CN116211247A
Authority
CN
China
Prior art keywords
index
emotion
detected
target
recognition result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211558582.6A
Other languages
Chinese (zh)
Inventor
杨小海
贾志强
秦吉波
韩晓玉
王亮
赵君
朱创
于伟
陈二运
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
China Unicom Online Information Technology Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
China Unicom Online Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd, China Unicom Online Information Technology Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202211558582.6A priority Critical patent/CN116211247A/en
Publication of CN116211247A publication Critical patent/CN116211247A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/40: Scenes; Scene-specific elements in video content
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174: Facial expression recognition
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00: Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48: Speech or voice analysis techniques specially adapted for particular use
    • G10L25/51: Speech or voice analysis techniques specially adapted for comparison or discrimination
    • G10L25/63: Speech or voice analysis techniques for estimating an emotional state

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Computational Linguistics (AREA)
  • Signal Processing (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The application provides a health detection method, a health detection device, and a storage medium, relates to the technical field of computers, and is used for solving the technical problem that a typical wearable device cannot comprehensively detect the health state of an object to be detected. The health detection method comprises the following steps: acquiring motion data, audio data, and video data of an object to be detected; determining a motion index of the object to be detected according to the motion data; determining an emotion index of the object to be detected according to the audio data and the video data; and determining a health index of the object to be detected according to the motion index and the emotion index.

Description

Health detection method, device and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a health detection method, apparatus, and storage medium.
Background
At present, most health-detection hardware devices and products on the market are wearable devices that collect motion data, such as smart watches, smart bracelets, and heart rate belts. They mainly analyze the exercise condition of the object to be detected through indexes such as electrocardiogram, heart rate, blood oxygen, and body temperature.
However, a typical wearable device cannot comprehensively detect the health state of the object to be detected.
Disclosure of Invention
The application provides a health detection method, a health detection device, and a storage medium, which are used to solve the technical problem that existing technology cannot comprehensively detect the health state of an object to be detected.
To achieve the above purpose, the present application adopts the following technical solutions:
In a first aspect, a health detection method is provided, comprising: acquiring motion data, audio data, and video data of an object to be detected; determining a motion index of the object to be detected according to the motion data; determining an emotion index of the object to be detected according to the audio data and the video data; and determining a health index of the object to be detected according to the motion index and the emotion index.
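As an illustrative, non-limiting sketch of the final step of the first aspect, the combination of the two indices could look as follows. The patent does not specify how the motion index and emotion index are combined, so the weighted-average rule, the 0-100 scale, and the equal default weights below are assumptions.

```python
def health_index(motion_idx: float, emotion_idx: float,
                 w_motion: float = 0.5, w_emotion: float = 0.5) -> float:
    """Combine a motion index and an emotion index (both assumed to be on a
    0-100 scale) into a single health index. The weighted-average rule and
    the equal default weights are assumptions; the patent leaves the
    combination rule unspecified."""
    if not (0.0 <= motion_idx <= 100.0 and 0.0 <= emotion_idx <= 100.0):
        raise ValueError("indices are expected on a 0-100 scale")
    return w_motion * motion_idx + w_emotion * emotion_idx
```

For example, `health_index(80.0, 60.0)` yields `70.0` with the default equal weights; other weightings would emphasize either the physical or the emotional component.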
Optionally, after determining the health index of the object to be detected according to the motion index and the emotion index, the method further comprises: determining a detection result of the object to be detected according to a target index and a preset rule curve; the target index comprises at least one of the motion index, the emotion index, and the health index; the preset rule curve comprises a biological three-rhythm (biorhythm) curve.
Optionally, the motion data comprises a step count and an exercise duration, and determining the motion index of the object to be detected according to the motion data comprises: determining the ratio of the step count to the exercise duration as the exercise intensity of the object to be detected; and determining the motion index according to the step count, the exercise intensity, a preset step count, and a preset exercise intensity.
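The optional motion-index computation above (intensity as the ratio of step count to exercise duration, then a comparison against preset values) can be sketched as follows. The patent only names the inputs, so the scoring formula, the preset target values, and the 50/50 weighting between the two ratios are assumptions.

```python
def motion_index(steps: int, minutes: float,
                 preset_steps: int = 8000, preset_intensity: float = 100.0) -> float:
    """Score motion on a 0-100 scale from a step count and an exercise
    duration in minutes. The exercise intensity is the ratio of steps to
    duration, as in the patent; the preset targets and the equal weighting
    of the two capped ratios are illustrative assumptions."""
    intensity = steps / minutes if minutes > 0 else 0.0  # steps per minute
    step_score = min(steps / preset_steps, 1.0)          # capped at the preset
    intensity_score = min(intensity / preset_intensity, 1.0)
    return 100.0 * (0.5 * step_score + 0.5 * intensity_score)
```

With the assumed presets, 8000 steps in 80 minutes (100 steps per minute) scores 100.0, and halving the step count at the same pace band halves the score.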
Optionally, determining the emotion index of the object to be detected according to the audio data and the video data comprises: inputting the audio data into a speech emotion recognition model to obtain a first emotion recognition result; inputting the video data into a face attribute analysis model to obtain a second emotion recognition result; and determining the emotion index according to the first emotion recognition result and the second emotion recognition result.
Optionally, the first emotion recognition result comprises at least one first emotion category and a first confidence level corresponding to each first emotion category, and the second emotion recognition result comprises at least one second emotion category and a second confidence level corresponding to each second emotion category. Determining the emotion index according to the first emotion recognition result and the second emotion recognition result comprises: combining the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result, the target emotion recognition result comprising a plurality of target emotion categories and a target confidence level corresponding to each target emotion category; selecting, from the target confidence levels, at least one target confidence level greater than a preset confidence level; and determining the emotion index according to the number of times the corresponding at least one target emotion category occurs within a preset time period.
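A minimal sketch of this optional merging step: each recognition result is modeled as a mapping from emotion category to confidence, the two results are combined, categories whose confidence clears the preset threshold are kept, and their occurrence counts over the observation window are mapped to an index. The max-merge rule, the threshold value, the positive-emotion set, and the count-to-index mapping are all assumptions that the patent does not fix.

```python
from collections import Counter

def high_confidence_emotions(speech_result: dict, face_result: dict,
                             threshold: float = 0.6) -> dict:
    """Combine one speech-based and one face-based recognition result
    (each {emotion: confidence}) into a target result, keeping only
    categories whose merged confidence exceeds the preset threshold.
    Taking the max confidence per category is an assumed merge rule."""
    merged = dict(speech_result)
    for emotion, conf in face_result.items():
        merged[emotion] = max(merged.get(emotion, 0.0), conf)
    return {e: c for e, c in merged.items() if c > threshold}

def emotion_index(window_results: list,
                  positive=frozenset({"happy", "calm"})) -> float:
    """Map occurrence counts of high-confidence emotions over a preset
    time window to a 0-100 index as the share of positive detections.
    The positive set and the mapping are illustrative assumptions."""
    counts = Counter(e for frame in window_results for e in frame)
    total = sum(counts.values())
    if total == 0:
        return 50.0  # assumed neutral default when nothing was detected
    positives = sum(c for e, c in counts.items() if e in positive)
    return 100.0 * positives / total
```

One positive detection out of three high-confidence detections in the window would yield an index of roughly 33, under the assumed mapping.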
Optionally, the health detection method further comprises: when the health index is smaller than a preset threshold and/or the detection result indicates an abnormality, sending a prompt message to the electronic device corresponding to the object to be detected.
In a second aspect, a health detection device is provided, comprising an acquisition unit and a processing unit. The acquisition unit is configured to acquire motion data, audio data, and video data of an object to be detected. The processing unit is configured to: determine a motion index of the object to be detected according to the motion data; determine an emotion index of the object to be detected according to the audio data and the video data; and determine a health index of the object to be detected according to the motion index and the emotion index.
Optionally, the processing unit is further configured to determine a detection result of the object to be detected according to a target index and a preset rule curve; the target index comprises at least one of the motion index, the emotion index, and the health index; the preset rule curve comprises a biological three-rhythm (biorhythm) curve.
Optionally, the motion data comprises a step count and an exercise duration, and the processing unit is specifically configured to: determine the ratio of the step count to the exercise duration as the exercise intensity of the object to be detected; and determine the motion index according to the step count, the exercise intensity, a preset step count, and a preset exercise intensity.
Optionally, the processing unit is specifically configured to: input the audio data into a speech emotion recognition model to obtain a first emotion recognition result; input the video data into a face attribute analysis model to obtain a second emotion recognition result; and determine the emotion index according to the first emotion recognition result and the second emotion recognition result.
Optionally, the first emotion recognition result comprises at least one first emotion category and a first confidence level corresponding to each first emotion category, and the second emotion recognition result comprises at least one second emotion category and a second confidence level corresponding to each second emotion category. The processing unit is specifically configured to: combine the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result, the target emotion recognition result comprising a plurality of target emotion categories and a target confidence level corresponding to each target emotion category; select, from the target confidence levels, at least one target confidence level greater than a preset confidence level; and determine the emotion index according to the number of times the corresponding at least one target emotion category occurs within a preset time period.
Optionally, the health detection device further comprises a sending unit, configured to send a prompt message to the electronic device corresponding to the object to be detected when the health index is smaller than a preset threshold and/or the detection result indicates an abnormality.
In a third aspect, a health detection device is provided, comprising a memory and a processor. The memory is configured to store computer-executable instructions, and the processor is connected to the memory through a bus. When the health detection device runs, the processor executes the computer-executable instructions stored in the memory, to cause the health detection device to perform the health detection method of the first aspect.
The health detection device may be a network device or a part of a network device, for example, a system-on-chip in the network device. The system-on-chip is configured to support the network device in implementing the functions involved in the first aspect and any one of its possible implementations, for example, obtaining, determining, and sending the data and/or information involved in the above health detection method. The system-on-chip includes a chip and may also include other discrete devices or circuit structures.
In a fourth aspect, there is provided a computer readable storage medium comprising computer executable instructions which, when run on a computer, cause the computer to perform the health detection method of the first aspect.
In a fifth aspect, there is also provided a computer program product comprising computer instructions which, when run on a health detection device, cause the health detection device to perform the health detection method according to the first aspect described above.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the health detection device, or may be packaged separately from the processor of the health detection device, which is not limited in this embodiment of the present application.
For descriptions of the second, third, fourth, and fifth aspects, refer to the detailed description of the first aspect.
In the embodiments of the present application, the names of the above health detection devices do not limit the devices or functional modules themselves; in actual implementation, these devices or functional modules may appear under other names. For example, the receiving unit may also be referred to as a receiving module, a receiver, and the like. As long as the function of each device or functional module is similar to that in the present application, it falls within the scope of the claims of the present application and their equivalents.
The technical scheme provided by the application at least brings the following beneficial effects:
Based on any one of the above aspects, the present application provides a health detection method that can obtain motion data, audio data, and video data of an object to be detected, determine a motion index of the object to be detected according to the motion data, and determine an emotion index of the object to be detected according to the audio data and the video data. Subsequently, a health index of the object to be detected can be determined according to the motion index and the emotion index. In this way, the health state of the object to be detected can be comprehensively detected based on both the motion index and the emotion index, which solves the technical problem that a typical wearable device cannot comprehensively detect the health state of the object to be detected.
For an analysis of the beneficial effects of the second, third, fourth, and fifth aspects, refer to the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic diagram of an application scenario of a health detection method according to an embodiment of the present application;
Fig. 2 is a first schematic diagram of a hardware structure of a health detection device according to an embodiment of the present application;
Fig. 3 is a second schematic diagram of a hardware structure of a health detection device according to an embodiment of the present application;
Fig. 4 is a first schematic flowchart of a health detection method according to an embodiment of the present application;
Fig. 5 is a second schematic flowchart of a health detection method according to an embodiment of the present application;
Fig. 6 is a third schematic flowchart of a health detection method according to an embodiment of the present application;
Fig. 7 is a fourth schematic flowchart of a health detection method according to an embodiment of the present application;
Fig. 8 is a fifth schematic flowchart of a health detection method according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a health detection device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the present disclosure without creative effort fall within the protection scope of the present disclosure.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In order to clearly describe the technical solutions of the embodiments of the present application, in the embodiments of the present application, the terms "first", "second", and the like are used to distinguish the same item or similar items having substantially the same function and effect, and those skilled in the art will understand that the terms "first", "second", and the like are not limited in number and execution order.
As described in the background, most health-detection hardware devices and products on the market are wearable devices that collect motion data, such as smart watches, smart bracelets, and heart rate belts, and mainly analyze the exercise condition of the object to be detected by detecting indexes such as electrocardiogram, heart rate, blood oxygen, and body temperature. However, a typical wearable device cannot comprehensively detect the health state of the object to be detected.
Studies show that emotional state is causally related to physical health through various biological, psychological, and social pathways. Meanwhile, physical strength, emotion, and intelligence are periodic (the biorhythm theory): from birth to the end of life, each person exhibits periodic fluctuations with cycles of 23 days, 28 days, and 33 days, respectively. Therefore, emotion detection and prediction are of great significance for guiding people's work, study, and physical exercise.
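The biological three-rhythm curve referenced above is conventionally modeled as three sinusoids over the number of days lived. The sine formulation below is the classic biorhythm definition, not a formula given in this application, which only names the curve.

```python
import math
from datetime import date

def biorhythm(birth: date, day: date) -> dict:
    """Classic three-rhythm model: physical, emotional, and intellectual
    cycles with periods of 23, 28, and 33 days, each a sine of the number
    of days lived. Values lie in [-1, 1]."""
    t = (day - birth).days  # days since birth
    return {
        "physical": math.sin(2 * math.pi * t / 23),
        "emotional": math.sin(2 * math.pi * t / 28),
        "intellectual": math.sin(2 * math.pi * t / 33),
    }
```

On the day of birth (t = 0) all three rhythms start at zero; the physical rhythm returns to zero every 23 days, and similarly for the other two periods.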
At present, there are various emotion detection methods and devices. Dedicated medical devices perform feature extraction and analysis by collecting physiological signals of the user such as electroencephalogram, electrocardiogram, and skin conductance. In recent years, with the development of artificial intelligence (AI) technology, methods for analyzing and processing human voice and facial images using AI technology have been increasing.
With the rapid development of the smart home, household smart terminal devices such as smart speakers, cameras, and smart watches are becoming increasingly popular. Collecting and analyzing emotion and motion data based on such smart devices is becoming ever more convenient, and the applicable population is growing wider.
Therefore, how to comprehensively detect the health state of an object to be detected based on its motion data and emotional state is a technical problem that urgently needs to be solved.
In view of the above problems, the present application provides a health detection method that can obtain motion data, audio data, and video data of an object to be detected, determine a motion index of the object to be detected according to the motion data, and determine an emotion index of the object to be detected according to the audio data and the video data. Subsequently, a health index of the object to be detected can be determined according to the motion index and the emotion index. In this way, the health state of the object to be detected can be comprehensively detected based on both the motion index and the emotion index, which solves the technical problem that a typical wearable device cannot comprehensively detect the health state of the object to be detected.
The health detection method is suitable for a health detection system. Fig. 1 shows one configuration of the health detection system. As shown in fig. 1, the health detection system includes: an electronic device 101, a motion data acquisition device 102 and an audio and video data acquisition device 103.
Wherein the electronic device 101 is communicatively coupled to the motion data acquisition device 102 and the audio and video data acquisition device 103, respectively.
In practice, the electronic device 101 may be connected to any number of motion data acquisition devices 102 and audio and video data acquisition devices 103. For ease of understanding, fig. 1 illustrates an electronic device 101 coupled to a motion data acquisition device 102 and an audio and video data acquisition device 103.
In this embodiment, the motion data acquisition device 102 is configured to provide motion data to the electronic device 101, so that the electronic device 101 determines a motion index of an object to be detected according to the motion data sent by the motion data acquisition device 102.
Alternatively, the athletic data collection device 102 may be a wearable device that detects athletic data, such as a smart watch bracelet, heart rate strap, or the like.
The audio and video data collection device 103 is configured to provide audio data and video data to the electronic device 101, so that the electronic device 101 determines an emotion index of the object to be detected according to the audio data and video data sent by the audio and video data collection device 103.
Alternatively, the audio and video data collection device 103 may be a smart terminal device of a smart home, such as a smart speaker, camera, smart watch, etc.
The electronic device 101 may determine the movement index of the object to be detected and the emotion index of the object to be detected, and further determine the health index of the object to be detected according to the movement index and the emotion index, so as to implement comprehensive detection of the health status of the object to be detected.
Alternatively, the entity device of the electronic device 101 may be a server, a terminal, or other types of electronic devices, which is not limited in this embodiment of the present application.
Alternatively, the terminal may be a device that provides voice and/or data connectivity to the user, a handheld device with wireless connectivity, or other processing device connected to a wireless modem. The wireless terminal may communicate with one or more core networks via a radio access network (radio access network, RAN). The wireless terminals may be mobile terminals such as mobile telephones (or "cellular" telephones) and computers with mobile terminals, as well as portable, pocket, hand-held, computer-built-in or car-mounted mobile devices which exchange voice and/or data with radio access networks, e.g. cell phones, tablet computers, notebook computers, netbooks, personal digital assistants (personal digital assistant, PDA).
Alternatively, the server may be one server in a server cluster (including multiple servers), or may be a chip in the server, or may be a system on a chip in the server, or may be implemented by a Virtual Machine (VM) deployed on a physical machine, which is not limited in this embodiment of the present application.
Alternatively, when the electronic device 101 and the motion data acquisition device 102 are the same kind of physical device (for example, both are wearable devices), the electronic device 101 and the motion data acquisition device 102 may be two independently arranged devices, or may be integrated into the same device.
It is easy to understand that when the electronic device 101 and the motion data acquisition device 102 are integrated into the same device, communication between them is communication between modules inside that device, and the communication flow between the two is the same as in the case where the electronic device 101 and the motion data acquisition device 102 are independent of each other.
For ease of understanding, the present application is described using the example in which the electronic device 101 and the motion data acquisition device 102 are independent of each other.
Alternatively, when the electronic device 101 and the audio and video data acquisition device 103 are the same kind of physical device (for example, both are smart cameras), the electronic device 101 and the audio and video data acquisition device 103 may be two independently arranged devices, or may be integrated into the same device.
It is easy to understand that when the electronic device 101 and the audio and video data acquisition device 103 are integrated into the same device, communication between them is communication between modules inside that device, and the communication flow between the two is the same as in the case where the electronic device 101 and the audio and video data acquisition device 103 are independent of each other.
For ease of understanding, the present application is described using the example in which the electronic device 101 and the audio and video data acquisition device 103 are independent of each other.
Alternatively, the motion data acquisition device 102 and the audio and video data acquisition device 103 may be integrated into one device. For ease of understanding, the present application will be described with the example of the motion data acquisition device 102 and the audio and video data acquisition device 103 being independent of each other.
The basic hardware structure of the electronic device 101 includes the elements included in the health detection apparatus shown in fig. 2 or 3. The hardware configuration of the electronic apparatus 101 will be described below taking the health detection apparatus shown in fig. 2 and 3 as an example.
Fig. 2 is a schematic diagram of a hardware structure of a health detection device according to an embodiment of the present application. The health detection device comprises a processor 21, a memory 22, a communication interface 23, and a bus 24. The processor 21, the memory 22, and the communication interface 23 may be connected by the bus 24.
The processor 21 is a control center of the health detection device, and may be one processor or a collective name of a plurality of processing elements. For example, the processor 21 may be a general-purpose central processing unit (central processing unit, CPU), or may be another general-purpose processor. Wherein the general purpose processor may be a microprocessor or any conventional processor or the like.
As one example, processor 21 may include one or more CPUs, such as CPU 0 and CPU 1 shown in fig. 2.
Memory 22 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, or an electrically erasable programmable read-only memory (EEPROM), magnetic disk storage or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In a possible implementation, the memory 22 may exist separately from the processor 21, and the memory 22 may be connected to the processor 21 by a bus 24 for storing instructions or program code. The processor 21, when calling and executing instructions or program code stored in the memory 22, is capable of implementing the health detection method provided in the embodiments described below.
In the embodiment of the present application, the software program stored in the memory 22 is different for the electronic device 101, so the functions implemented by the electronic device 101 are different. The functions performed with respect to the respective devices will be described in connection with the following flowcharts.
In another possible implementation, the memory 22 may also be integrated with the processor 21.
The communication interface 23 is used for connecting the health detection device with other devices through a communication network, wherein the communication network can be an ethernet, a wireless access network, a wireless local area network (wireless local area networks, WLAN) and the like. The communication interface 23 may include a receiving unit for receiving data, and a transmitting unit for transmitting data.
Bus 24 may be an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus may be classified as an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 2, but this does not mean that there is only one bus or only one type of bus.
Fig. 3 shows another hardware configuration of the health detection device in the embodiment of the present application. As shown in fig. 3, the health detection device may include a processor 31 and a communication interface 32. The processor 31 is coupled to a communication interface 32.
The function of the processor 31 may be as described above with reference to the processor 21. The processor 31 also has a memory function and can function as the memory 22.
The communication interface 32 is used to provide data to the processor 31. The communication interface 32 may be an internal interface of the health detection device or an external interface of the health detection device (corresponding to the communication interface 23).
It should be noted that the structure shown in fig. 2 (or fig. 3) does not constitute a limitation on the health detection device; the health detection device may include more or fewer components than those shown in fig. 2 (or fig. 3), may combine some components, or may have a different arrangement of components.
The following describes the health detection method provided in the embodiment of the present application in detail with reference to the accompanying drawings.
The health detection method provided by the embodiment of the present application is applied to the electronic device 101 in the health detection system shown in fig. 1, as shown in fig. 4, and the health detection method provided by the embodiment of the present application includes:
S401, the electronic equipment acquires motion data, audio data and video data of an object to be detected.
In connection with fig. 1, the motion data acquisition device 102 may acquire motion data of an object to be detected and transmit the motion data of the object to be detected to an electronic device.
By way of example, a smart terminal device having a motion sensor (i.e., the motion data acquisition device 102) may implement a step-counting function and determine the step-count data as the motion data.
The smart terminal device having a motion sensor may be a small communication assistant, a smart watch, a smart terminal, or the like. The smart terminal device may periodically report the accumulated step count and position to the cloud server, and the cloud server may then report the obtained step count and position to the electronic device.
In connection with fig. 1, the audio and video data acquisition device 103 may acquire audio data and video data of an object to be detected and transmit the audio data and video data of the object to be detected to the electronic device.
For example, a network high-definition camera with a recording function (i.e., the audio and video data acquisition device 103) can record video at 720P or 1080P resolution and push the audio and video streams in real time, using a streaming protocol, to a streaming media server cluster. The streaming media server cluster may then report the acquired audio and video streams to the electronic device.
Since the raw data acquired by the acquisition devices (including the motion data, the audio data, and the video data) may include noise data, the electronic device may preprocess the acquired raw data to obtain the motion data, the audio data, and the video data.
Optionally, when the wearable motion data acquisition device is worn by the object to be detected, user information of the object to be detected can be input on the motion data acquisition device. In this case, the user information of the object to be detected can be bound to the device code of the motion data acquisition device. Subsequently, the electronic device may determine motion data of the object to be detected according to the device code of the acquisition device.
Optionally, the motion data acquired by the motion data acquisition device may include coordinate information of the object to be detected. In this case, after obtaining the coordinate information of the object to be detected, the electronic device may convert data in different coordinate systems into a unified GCJ02 coordinate system.
GCJ02 is the geographic coordinate system formulated by the State Bureau of Surveying and Mapping of China.
Optionally, the electronic device may further be deployed with an audio/video stream extraction frame filtering application module. The audio and video stream frame extraction filtering application module is used for separating audio and video, slicing the audio and video and extracting key frames.
Specifically, the audio and video stream frame extraction filtering application module can perform preliminary filtering according to an audio and video filtering algorithm to remove meaningless data, such as still pictures, silent audio segments, and the like.
Then, the audio and video stream frame-extraction filtering application module may store the audio and video segments that require further analysis to a cloud disk, abstract them into audio message events and video message events, and place these events into a message queue to await further analysis by a consumer process.
S402, the electronic equipment determines the motion index of the object to be detected according to the motion data.
Specifically, after the motion data is obtained, the electronic device may determine a motion index of the object to be detected according to the motion data.
In one implementation, the electronic device may filter the number of motion steps and the location update messages reported by the motion data acquisition device to obtain the valid continuous motion time and step count, and calculate the motion index based on the valid motion data.
In another implementation, the electronic device may directly determine the number of motion steps reported by the motion data acquisition device as the motion index of the object to be detected.
Optionally, the electronic device may determine the motion index of the object to be detected according to the motion data through other algorithms, which is not limited in the embodiment of the present application.
S403, the electronic equipment determines the emotion index of the object to be detected according to the audio data and the video data.
In one implementation, the electronic device may input the audio data and the video data acquired by the audio and video data acquisition device into a pre-trained fusion model to obtain the emotion index of the object to be detected.
In yet another implementation, the electronic device may input the audio data collected by the audio and video data collection device into a pre-trained speech emotion recognition model to obtain a first emotion index of the object to be detected. Then, the electronic device may input the video data collected by the audio and video data collecting device into a pre-trained face attribute analysis model to obtain a second emotion index of the object to be detected. Then, the electronic device may combine the first emotion index and the second emotion index to obtain an emotion index of the object to be detected.
Optionally, the electronic device may determine the emotion index of the object to be detected according to the audio data and the video data through other models or algorithms, which is not limited in the embodiment of the present application.
S404, the electronic equipment determines the health index of the object to be detected according to the movement index and the emotion index.
Specifically, after determining the movement index and the emotion index, the electronic device may comprehensively determine the health index of the object to be detected according to the movement index and the emotion index.
In one implementation, the health index is calculated from the emotion index and the movement index, each accounting for 50 points out of a full score of 100. The health index, emotion index, and movement index satisfy the following formula:
health index = 50 × movement index / 5 + 50 × emotion index / 5.
The health index may take values in the range [20, 100].
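The formula can be sketched as a small function (an illustrative sketch; the function name and default 0-5 scale are assumptions, not from the original):

```python
def health_index(movement_index: float, emotion_index: float) -> float:
    """Weighted health score: each index is on a 0-5 scale and
    contributes up to 50 of the 100 total points."""
    return 50 * movement_index / 5 + 50 * emotion_index / 5
```

With the movement index in [0, 5] and the emotion index in [2, 5] (the minimum emotion score given later is 2 points), the result falls in [20, 100], consistent with the stated range.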
From the above, the present application provides a health detection method that can obtain the motion data, audio data, and video data of an object to be detected, determine the motion index of the object to be detected according to the motion data, and determine the emotion index of the object to be detected according to the audio data and the video data. Subsequently, the health index of the object to be detected may be determined based on the movement index and the emotion index. In this way, the method can comprehensively detect the health state of the object to be detected based on its movement index and emotion index, solving the technical problem that a general wearable device cannot comprehensively detect the health state of an object to be detected.
In some embodiments, in conjunction with fig. 4, as shown in fig. 5, after determining the health index of the object to be detected according to the movement index and the emotion index, the electronic device further includes:
s501, the electronic equipment determines a detection result of the object to be detected according to the target index and a preset rule curve.
Wherein the target index includes at least one of the motor index, the emotion index, and the health index. The preset rule curve includes: a biological tri-rhythm (biorhythm) curve.
Specifically, the electronic device may calculate, according to the birth date of the object to be detected, the emotion value and physical value of the human biorhythm curve and the phase in which each lies, and determine the detection result of the object to be detected by combining these with the emotion index, movement index, and health index obtained from the actual analysis of the data.
Optionally, the electronic device may further aggregate indexes such as a movement index, an emotion index, a health index, and the like according to the date and the dimension of the object to be detected, and store the indexes in the data warehouse. Then, the electronic device can respectively conduct graph display and insight analysis of the data in the dimensions of days, weeks, months and years, so as to determine the detection result of the object to be detected.
From the above, the present application provides a health detection method, which can determine the health index of the object to be detected according to the motion index and the emotion index, and then determine the detection result of the object to be detected according to the target index and the preset rule curve, thereby comprehensively determining the detection result of the object to be detected.
In some embodiments, in conjunction with fig. 5, as shown in fig. 6, the motion data described above includes: number of exercise steps and exercise time. In this case, in S402, the method for determining, by the electronic device, the motion index of the object to be detected according to the motion data specifically includes:
s601, the electronic equipment determines the ratio between the movement steps and the movement time as the movement intensity of the object to be detected.
S602, the electronic equipment determines a movement index according to the movement step number, the movement intensity, the preset movement step number and the preset movement intensity.
Specifically, the electronic device may set a preset number of exercise steps and a preset exercise intensity of continuous exercise for each age group.
For example, the electronic device may set the preset number of exercise steps for an object to be detected in the age range of 30 to 40 years to 6000 steps, and the preset exercise intensity to 235.
The electronic device may then determine a ratio between the number of steps of movement and the movement time of the object to be detected as a movement intensity of the object to be detected.
Then, the electronic device may determine the movement index according to the number of movement steps, the movement intensity, the preset number of movement steps, and the preset movement intensity.
For example, when the absolute value of the difference between the number of moving steps of the object to be detected and the preset number of moving steps is less than or equal to 100 steps and the absolute value of the difference between the moving intensity of the object to be detected and the preset moving intensity is less than or equal to 10, the electronic device may determine the moving index of the object to be detected as 5 points.
When the absolute value of the difference between the number of moving steps of the object to be detected and the preset number of moving steps is greater than 100 steps and less than or equal to 1000 steps, and the absolute value of the difference between the moving intensity of the object to be detected and the preset moving intensity is less than or equal to 15, the electronic device can determine the moving index of the object to be detected as 4 points.
When the absolute value of the difference between the number of moving steps of the object to be detected and the preset number of moving steps is greater than 1000 steps and less than or equal to 2000 steps, and the absolute value of the difference between the moving intensity of the object to be detected and the preset moving intensity is less than or equal to 100, the electronic device may determine the moving index of the object to be detected as 3 points.
When the absolute value of the difference between the number of moving steps of the object to be detected and the preset number of moving steps is greater than 2000 steps, the electronic device may determine the moving index of the object to be detected as 2 points.
When the motion intensity of the object to be detected is less than or equal to 200, the electronic device may determine the motion index of the object to be detected as 0 point.
Alternatively, in cases other than the above, the electronic device may determine the movement index of the object to be detected as 1 point.
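The tiered scoring of S601-S602 can be sketched as follows (a hedged sketch: the function names and default presets are assumptions, the presets being the 30-40 age-group example values; the text does not specify rule precedence, so the 0-point low-intensity rule is checked first here):

```python
def motion_intensity(steps: int, minutes: float) -> float:
    """S601: motion intensity is the ratio of step count to exercise time."""
    return steps / minutes

def movement_index(steps: int, intensity: float,
                   preset_steps: int = 6000,
                   preset_intensity: float = 235) -> int:
    """S602: tiered movement score on a 0-5 scale."""
    if intensity <= 200:        # assumed precedence: low intensity scores 0
        return 0
    step_diff = abs(steps - preset_steps)
    intensity_diff = abs(intensity - preset_intensity)
    if step_diff <= 100 and intensity_diff <= 10:
        return 5
    if 100 < step_diff <= 1000 and intensity_diff <= 15:
        return 4
    if 1000 < step_diff <= 2000 and intensity_diff <= 100:
        return 3
    if step_diff > 2000:
        return 2
    return 1                    # all other cases score 1 point
```

For instance, 6050 steps at intensity 233 lands within 100 steps and 10 intensity units of the presets, so it scores the full 5 points.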
From the above, the above embodiments provide a specific implementation manner for determining the movement index of the object to be detected according to the movement data, so as to facilitate the subsequent comprehensive detection of the health status of the object to be detected based on the movement index and the emotion index of the object to be detected, thereby solving the technical problem that the general wearable device cannot comprehensively detect the health status of the object to be detected.
In some embodiments, referring to fig. 6, as shown in fig. 7, in S403, the method for determining, by the electronic device, the emotion index of the object to be detected according to the audio data and the video data specifically includes:
s701, the electronic equipment inputs the audio data into a voice emotion recognition model to obtain a first emotion recognition result.
S702, the electronic equipment inputs the video data into the face attribute analysis model to obtain a second emotion recognition result.
Alternatively, an AI analysis module may be deployed in the electronic device. After receiving the audio data and the video data, the AI analysis module may, in a message-driven manner, dynamically schedule threads for audio AI analysis and image AI analysis according to the attributes of each message.
If the message attribute has the audio file address, an AI analysis module in the electronic equipment calls a voice emotion recognition model API to carry out voiceprint emotion recognition and semantic emotion recognition so as to obtain a first emotion recognition result.
If the message attributes contain a video file address, the AI analysis module in the electronic device first extracts key frame pictures from the video data using FFmpeg, and then identifies the facial emotion using the face attribute analysis model API to obtain a second emotion recognition result.
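The key-frame extraction step can be sketched as an FFmpeg invocation assembled in Python (a hedged sketch: the original does not specify the exact FFmpeg options; selecting I-frames with the `select` filter is one common approach, and the function name and output pattern are assumptions):

```python
import subprocess

def keyframe_command(video_path: str, out_pattern: str = "frame_%03d.jpg") -> list:
    """Build an FFmpeg command that keeps only I-frames (key frames).
    -vsync vfr drops the frames the select filter rejects."""
    return [
        "ffmpeg", "-i", video_path,
        "-vf", "select='eq(pict_type,I)'",
        "-vsync", "vfr",
        out_pattern,
    ]

# To actually run it (requires ffmpeg on PATH):
# subprocess.run(keyframe_command("clip.mp4"), check=True)
```

The extracted JPEG frames would then be passed to the face attribute analysis model API.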
S703, the electronic equipment determines an emotion index according to the first emotion recognition result and the second emotion recognition result.
In some embodiments, the first emotion recognition result includes: at least one first emotion category and a first confidence level corresponding to each first emotion category. The second emotion recognition result includes: at least one second emotion category and a second confidence level corresponding to each second emotion category.
The at least one first emotion category may include six emotions: anger, fear, happiness, sadness, surprise, and other. The at least one second emotion category may likewise include these six emotions: anger, fear, happiness, sadness, surprise, and other.
Optionally, the electronic device may determine, as the emotion index, a sum of a first confidence level corresponding to each first emotion category in the first emotion recognition result and a second confidence level corresponding to each second emotion category in the second emotion recognition result.
Optionally, the electronic device may also combine the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result, select, from the plurality of target confidences included in the target emotion recognition result, at least one target confidence greater than a preset confidence, and determine the emotion index according to the number of occurrences, within a preset time period, of the at least one target emotion category corresponding to the at least one target confidence.
In some embodiments, the method for determining the emotion index by the electronic device according to the first emotion recognition result and the second emotion recognition result specifically includes:
s703-1, the electronic device combines the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result.
The target emotion recognition result comprises a plurality of target emotion categories and target confidence degrees corresponding to the target emotion categories.
S703-2, the electronic device selects at least one target confidence coefficient with the target confidence coefficient larger than the preset confidence coefficient from the target confidence coefficients, and determines the emotion index according to the times of at least one target emotion category corresponding to the at least one target confidence coefficient in the preset time period.
Specifically, the electronic device may combine the first confidence corresponding to each first emotion category with the second confidence corresponding to each second emotion category to obtain a high-confidence emotion list corresponding to the audio data and the video data.
For example, an AI analysis module in an electronic device may divide a day into a plurality of time periods. Then, an AI analysis module in the electronic device may count the number of occurrences of each emotion category in a plurality of time periods, and determine an emotion index of the object to be detected according to the number.
For example, when the full score of the emotion index of the object to be detected is 5, the electronic device may determine the target value = 2 × (count of happy emotions) - 1 × (count of sad emotions) - 1 × (count of angry emotions).
When the target value is greater than or equal to 5, the electronic device determines that the emotion index of the object to be detected is 5 points.
When the target value is greater than or equal to 0 and less than 5, the electronic device determines that the emotion index of the object to be detected is 4 points.
When the target value is more than or equal to-3 and less than 0, the electronic device determines that the emotion index of the object to be detected is 3 points.
When the target value is smaller than-3, the electronic equipment determines that the emotion index of the object to be detected is 2 points.
Optionally, the electronic device may update the calculation formula corresponding to the target value according to other emotion types (such as fear emotion, surprise emotion, etc.), which is not limited in the embodiment of the present application.
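The target-value tiering above can be sketched as follows (an illustrative sketch; the function name is an assumption, and per the note above, weights for other emotion types such as fear and surprise are not specified in the original, so they are omitted here):

```python
def emotion_index(happy_count: int, sad_count: int, angry_count: int) -> int:
    """Map counted emotion occurrences to a 2-5 emotion score."""
    target = 2 * happy_count - 1 * sad_count - 1 * angry_count
    if target >= 5:
        return 5
    if target >= 0:
        return 4
    if target >= -3:
        return 3
    return 2
```

For example, three happy occurrences and no negative ones give a target value of 6, which maps to the full 5 points.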
Optionally, after the audio data and the video data have been analyzed to obtain the emotion list, the electronic device may determine the identity information of the object to be detected by applying a face comparison model to the key video frame images carried in the video data.
If the face comparison fails, the object to be detected is considered a stranger (that is, an unauthorized object), and the object to be detected may confirm its identity by performing face annotation through the mobile phone APP.
Subsequently, the electronic device may store the organized emotion list and the identity information in a database.
As can be seen from the foregoing, the above embodiments provide a specific implementation manner for determining the emotion index of the object to be detected according to the audio data and the video data, so as to facilitate the subsequent comprehensive detection of the health status of the object to be detected based on the motion index and the emotion index of the object to be detected, and solve the technical problem that the general wearable device cannot comprehensively detect the health status of the object to be detected.
In some embodiments, in conjunction with fig. 7, as shown in fig. 8, the health detection method further comprises:
s801, when the health index is smaller than a preset threshold value and/or the detection result is abnormal, the electronic equipment sends a prompt message to the electronic equipment corresponding to the object to be detected.
Optionally, the electronic device corresponding to the object to be detected may be a wearable device worn by the object to be detected, a terminal held by the object to be detected, and other devices.
Specifically, when the health index is less than the preset threshold and/or the detection result is abnormal, it indicates that the health state of the object to be detected may be abnormal. In this case, the electronic device may send a prompt message to the electronic device corresponding to the object to be detected, prompting the object to schedule activities such as study and physical exercise.
Optionally, the electronic device may further generate a health report according to the health index level of the object to be detected in the last period of time, and send the health report to the electronic device corresponding to the object to be detected.
Illustratively, suppose the electronic device determines that the target index of the object to be detected has been continuously low for the last 3 months. If the low score is caused by emotional factors, a prompt message is sent to the electronic device corresponding to the object to be detected, reminding the object to pay attention to its emotions. If it is caused by a lack of exercise, a prompt message is sent to the electronic device corresponding to the object to be detected, reminding the object to exercise more.
Optionally, the electronic device may further send a prompt message to the electronic device of the related object of the object to be detected according to the prompt policy and the channel configured by the object to be detected.
As can be seen from the foregoing, the present application provides a health detection method, which may send a prompt message to an electronic device corresponding to an object to be detected when the health index is smaller than a preset threshold and/or the detection result is abnormal, so that the object to be detected can look up the corresponding health index and/or the detection result in time, and enrich the health detection modes.
The foregoing description of the solution provided in the embodiments of the present application has mainly been presented from the perspective of the method. To achieve the above functions, the health detection device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may divide the functional modules of the health detection device according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. Optionally, the division of the modules in the embodiments of the present application is schematic, which is merely a logic function division, and other division manners may be actually implemented.
Fig. 9 is a schematic structural diagram of a health detection device according to an embodiment of the present application. The health detection device may be used to perform the method of health detection shown in fig. 4-8. The health detection device shown in fig. 9 includes: an acquisition unit 901 and a processing unit 902;
an acquisition unit 901 for acquiring motion data, audio data, and video data of an object to be detected;
a processing unit 902, configured to determine a motion index of the object to be detected according to the motion data;
the processing unit 902 is further configured to determine an emotion index of the object to be detected according to the audio data and the video data;
the processing unit 902 is further configured to determine a health index of the object to be detected according to the movement index and the emotion index.
Optionally, the processing unit 902 is further configured to determine the detection result of the object to be detected according to the target index and a preset rule curve; the target index includes at least one of the motor index, the emotion index, and the health index; the preset rule curve includes: a biological tri-rhythm (biorhythm) curve.
Optionally, the motion data includes: number of exercise steps and exercise time;
the processing unit 902 is specifically configured to:
determining the ratio between the number of motion steps and the motion time as the motion intensity of the object to be detected;
and determining the movement index according to the movement steps, the movement intensity, the preset movement steps and the preset movement intensity.
Optionally, the processing unit 902 is specifically configured to:
inputting the audio data into a voice emotion recognition model to obtain a first emotion recognition result;
inputting the video data into a face attribute analysis model to obtain a second emotion recognition result;
and determining the emotion index according to the first emotion recognition result and the second emotion recognition result.
Optionally, the first emotion recognition result includes: at least one first emotion category and a first confidence level corresponding to each first emotion category;
the second emotion recognition result includes: at least one second emotion category and a second confidence level corresponding to each second emotion category;
The processing unit 902 is specifically configured to:
combining the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result; the target emotion recognition result comprises a plurality of target emotion categories and target confidence degrees corresponding to each target emotion category;
selecting at least one target confidence coefficient with the target confidence coefficient larger than the preset confidence coefficient from the target confidence coefficients, and determining the emotion index according to the times of at least one target emotion category corresponding to the at least one target confidence coefficient in a preset time period.
Optionally, the health detection device further comprises: a transmitting unit 903;
the sending unit 903 is configured to send a prompt message to an electronic device corresponding to the object to be detected when the health index is less than a preset threshold and/or the detection result is abnormal.
The present application also provides a computer-readable storage medium, which includes computer-executable instructions that, when executed on a computer, cause the computer to perform the health detection method provided in the above embodiments.
The embodiment of the present application also provides a computer program, which can be directly loaded into a memory and contains software code; after being loaded and executed by a computer, the computer program can implement the health detection method provided in the above embodiments.
Those of skill in the art will appreciate that in one or more of the examples described above, the functions described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer-readable storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and the division of modules or units, for example, is merely a logical function division, and other manners of division are possible when actually implemented. For example, multiple units or components may be combined or may be integrated into another device, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software functional unit. If implemented as a software functional unit and sold or used as a stand-alone product, the integrated unit may be stored in a readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The foregoing is merely a description of specific embodiments of the present application; the scope of the present application is not limited thereto. Any change or substitution readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall fall within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A health detection method, comprising:
acquiring motion data, audio data and video data of an object to be detected;
determining a motion index of the object to be detected according to the motion data;
determining an emotion index of the object to be detected according to the audio data and the video data;
and determining a health index of the object to be detected according to the motion index and the emotion index.
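For illustration only, the pipeline of claim 1 can be sketched as follows. The claim does not fix how the two sub-indices are combined into the health index, so the equal-weight average below is purely an assumption, and the index ranges are placeholders:

```python
# Hypothetical sketch of the claim-1 pipeline: combine a motion index and an
# emotion index (each assumed to lie in [0, 100]) into a health index.
# The weighted-average rule is an assumption, not taken from the patent.

def health_index(motion_index: float, emotion_index: float,
                 w_motion: float = 0.5, w_emotion: float = 0.5) -> float:
    """Combine the two sub-indices with assumed weights summing to 1."""
    return w_motion * motion_index + w_emotion * emotion_index

print(health_index(80.0, 60.0))  # 70.0
```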
2. The method according to claim 1, further comprising, after determining the health index of the object to be detected according to the motion index and the emotion index:
determining a detection result of the object to be detected according to a target index and a preset rule curve; wherein the target index comprises at least one of the motion index, the emotion index, and the health index, and the preset rule curve comprises a three-rhythm biorhythm curve.
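The preset rule curve referenced above can be illustrated with the classical biorhythm model: three sinusoids with 23-, 28-, and 33-day periods counted from the subject's date of birth. The claim only names the curve; the formulas below follow the conventional biorhythm definition, not anything fixed by the patent:

```python
import math
from datetime import date

# Classical biorhythm model (physical: 23 days, emotional: 28 days,
# intellectual: 33 days), evaluated t days after the date of birth.
# How the patent compares indices against this curve is not specified here.

def biorhythm(birth: date, day: date) -> dict:
    """Return the three rhythm values, each in [-1, 1], for the given day."""
    t = (day - birth).days
    return {
        "physical": math.sin(2 * math.pi * t / 23),
        "emotional": math.sin(2 * math.pi * t / 28),
        "intellectual": math.sin(2 * math.pi * t / 33),
    }

rhythms = biorhythm(date(1990, 1, 1), date(2022, 12, 6))
```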
3. The method according to claim 1, wherein the motion data comprises a number of exercise steps and an exercise time;
and wherein determining the motion index of the object to be detected according to the motion data comprises:
determining the ratio of the number of exercise steps to the exercise time as an exercise intensity of the object to be detected;
and determining the motion index according to the number of exercise steps, the exercise intensity, a preset number of exercise steps, and a preset exercise intensity.
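Claim 3 fixes the exercise intensity as the ratio of steps to time but leaves the scoring rule open. The capped-ratio average below is a hypothetical sketch; the preset values (8000 steps, 100 steps per minute) are placeholders, not taken from the patent:

```python
# Hypothetical scoring of claim 3: intensity = steps / time (per the claim),
# then score steps and intensity against preset targets. The normalization
# and the preset defaults are assumptions for illustration only.

def motion_index(steps: int, minutes: float,
                 preset_steps: int = 8000,
                 preset_intensity: float = 100.0) -> float:
    """Return a motion index in [0, 100] from step count and exercise time."""
    intensity = steps / minutes                      # steps per minute, per claim 3
    step_score = min(steps / preset_steps, 1.0)      # capped ratio vs. preset steps
    intensity_score = min(intensity / preset_intensity, 1.0)
    return 100.0 * (step_score + intensity_score) / 2.0
```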
4. The method according to claim 1, wherein determining the emotion index of the object to be detected according to the audio data and the video data comprises:
inputting the audio data into a voice emotion recognition model to obtain a first emotion recognition result;
inputting the video data into a face attribute analysis model to obtain a second emotion recognition result;
and determining the emotion index according to the first emotion recognition result and the second emotion recognition result.
5. The method of claim 4, wherein the first emotion recognition result comprises: at least one first emotion category and a first confidence level corresponding to each first emotion category;
the second emotion recognition result includes: at least one second emotion category and a second confidence level corresponding to each second emotion category;
wherein determining the emotion index according to the first emotion recognition result and the second emotion recognition result comprises:
combining the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result, the target emotion recognition result comprising a plurality of target emotion categories and a target confidence level corresponding to each target emotion category;
and selecting, from the target confidence levels, at least one target confidence level greater than a preset confidence level, and determining the emotion index according to the number of times the at least one target emotion category corresponding to the at least one target confidence level occurs within a preset time period.
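Claims 4 and 5 can be sketched as follows. The merge rule (maximum confidence per category), the threshold default, and the positive-category scoring are assumptions not specified by the claims; the category names are placeholders:

```python
from collections import Counter

# Hypothetical sketch of claims 4-5: merge speech and facial recognition
# results, keep categories whose confidence exceeds a preset threshold,
# then score the emotion index from how often assumed "positive" categories
# appear across the time window.

def merge_results(speech: dict, face: dict) -> dict:
    """Merge two {category: confidence} dicts, keeping the max confidence."""
    merged = dict(speech)
    for cat, conf in face.items():
        merged[cat] = max(merged.get(cat, 0.0), conf)
    return merged

def emotion_index(windows, threshold: float = 0.6,
                  positive=("happy", "calm")) -> float:
    """windows: iterable of (speech_result, face_result) pairs per sample."""
    counts = Counter()
    for speech, face in windows:
        for cat, conf in merge_results(speech, face).items():
            if conf > threshold:          # keep only confident categories
                counts[cat] += 1
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return 100.0 * sum(counts[c] for c in positive) / total
```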
6. The method according to claim 2, further comprising:
when the health index is smaller than a preset threshold and/or the detection result is abnormal, sending a prompt message to an electronic device corresponding to the object to be detected.
7. A health detection device, comprising an acquisition unit and a processing unit; wherein
the acquisition unit is configured to acquire motion data, audio data, and video data of an object to be detected;
the processing unit is configured to determine a motion index of the object to be detected according to the motion data;
the processing unit is further configured to determine an emotion index of the object to be detected according to the audio data and the video data;
and the processing unit is further configured to determine a health index of the object to be detected according to the motion index and the emotion index.
8. The health detection device according to claim 7, wherein
the processing unit is further configured to determine a detection result of the object to be detected according to a target index and a preset rule curve; wherein the target index comprises at least one of the motion index, the emotion index, and the health index, and the preset rule curve comprises a three-rhythm biorhythm curve.
9. The health detection device according to claim 7, wherein the motion data comprises a number of exercise steps and an exercise time;
and the processing unit is specifically configured to:
determine the ratio of the number of exercise steps to the exercise time as an exercise intensity of the object to be detected;
and determine the motion index according to the number of exercise steps, the exercise intensity, a preset number of exercise steps, and a preset exercise intensity.
10. The health detection device according to claim 7, wherein the processing unit is specifically configured to:
input the audio data into a voice emotion recognition model to obtain a first emotion recognition result;
input the video data into a face attribute analysis model to obtain a second emotion recognition result;
and determine the emotion index according to the first emotion recognition result and the second emotion recognition result.
11. The health detection device according to claim 10, wherein the first emotion recognition result comprises: at least one first emotion category and a first confidence level corresponding to each first emotion category;
the second emotion recognition result includes: at least one second emotion category and a second confidence level corresponding to each second emotion category;
the processing unit is specifically configured to:
combine the first emotion recognition result and the second emotion recognition result to obtain a target emotion recognition result, the target emotion recognition result comprising a plurality of target emotion categories and a target confidence level corresponding to each target emotion category;
and select, from the target confidence levels, at least one target confidence level greater than a preset confidence level, and determine the emotion index according to the number of times the at least one target emotion category corresponding to the at least one target confidence level occurs within a preset time period.
12. The health detection device according to claim 8, further comprising a sending unit;
wherein the sending unit is configured to send a prompt message to an electronic device corresponding to the object to be detected when the health index is smaller than a preset threshold and/or the detection result is abnormal.
13. A health detection device, comprising a memory and a processor; wherein the memory is configured to store computer-executable instructions, and the processor is connected to the memory via a bus; when the health detection device runs, the processor executes the computer-executable instructions stored in the memory to cause the health detection device to perform the health detection method according to any one of claims 1-6.
14. A computer-readable storage medium comprising computer-executable instructions which, when run on a computer, cause the computer to perform the health detection method according to any one of claims 1-6.
CN202211558582.6A 2022-12-06 2022-12-06 Health detection method, device and storage medium Pending CN116211247A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211558582.6A CN116211247A (en) 2022-12-06 2022-12-06 Health detection method, device and storage medium

Publications (1)

Publication Number Publication Date
CN116211247A true CN116211247A (en) 2023-06-06

Family

ID=86577448

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211558582.6A Pending CN116211247A (en) 2022-12-06 2022-12-06 Health detection method, device and storage medium

Country Status (1)

Country Link
CN (1) CN116211247A (en)

Similar Documents

Publication Publication Date Title
US10321870B2 (en) Method and system for behavioral monitoring
US9421420B2 (en) Wellness/exercise management method and system by wellness/exercise mode based on context-awareness platform on smartphone
US20180060500A1 (en) Smart health activity scheduling
US20150056589A1 (en) Wellness management method and system by wellness mode based on context-awareness platform on smartphone
CN109938711A (en) Health monitor method, system and computer readable storage medium
CN106897569A (en) A kind of method of work of intelligent health terminal system
US20210259621A1 (en) Wearable system for brain health monitoring and seizure detection and prediction
WO2016115835A1 (en) Human body characteristic data processing method and apparatus
CN108877939A (en) It is a kind of with the health management system arranged of intelligent characteristic abstraction function
US20150170674A1 (en) Information processing apparatus, information processing method, and program
CN106469297A (en) Emotion identification method, device and terminal unit
CN111063437A (en) Personalized chronic disease analysis system
Luštrek et al. Recognising lifestyle activities of diabetic patients with a smartphone
KR101584685B1 (en) A memory aid method using audio-visual data
WO2023025037A1 (en) Health management method and system, and electronic device
Alam et al. A smart segmentation technique towards improved infrequent non-speech gestural activity recognition model
CN116127082A (en) Data acquisition method, system and related device
CN113764099A (en) Psychological state analysis method, device, equipment and medium based on artificial intelligence
CN109067830A (en) Information push method, device and terminal
CN116211247A (en) Health detection method, device and storage medium
CN114444705A (en) Model updating method and device
CN112826514A (en) Atrial fibrillation signal classification method, device, terminal and storage medium
CN116049535A (en) Information recommendation method, device, terminal device and storage medium
CN107832690A (en) The method and Related product of recognition of face
CN114365149A (en) Device usage processing for generating inferred user states

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination