CN112489797A - Accompanying method, device and terminal equipment - Google Patents
Accompanying method, device and terminal equipment
- Publication number
- CN112489797A (application number CN201910859308.4A)
- Authority
- CN
- China
- Prior art keywords
- accompanying
- data
- person
- mode
- accompanied
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/009—Nursing, e.g. carrying sick persons, pushing wheelchairs, distributing drugs
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Nursing (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The application discloses an accompanying method, an accompanying device and a terminal device. The method and the device can be applied to a terminal device corresponding to the accompanied person, such as a smart bracelet or a smart electronic ornament. First, data of the accompanied person is collected, the data comprising at least one of the following: sound data, expression data, motion data and physical sign data. The data of the accompanied person is then analyzed to determine a corresponding accompanying mode, the accompanying mode including at least one of a voice response mode, a psychological consultation mode and an emergency mode. Finally, the accompanied person is accompanied according to the determined accompanying mode. The application can thus match a suitable accompanying mode to the actual situation of the accompanied person. Because the accompanying modes are rich and varied, the accompanied person's experience is enriched and the accompanying effect is improved. In addition, the technical scheme of the application can be implemented on terminal devices of various forms and is not confined to intelligent robots, so the cost of accompanying is greatly reduced, which is convenient and practical.
Description
Technical Field
The application relates to the technical field of intelligent equipment, and in particular to an accompanying method, an accompanying device and a terminal device.
Background
Nowadays, population aging in China is increasingly pronounced, and the physical and mental health of the elderly has attracted wide attention. However, many elderly people lack the companionship and care of their families for various reasons, such as their children living far away or being busy with work, which is very unfavorable for their physical and mental health.
Some intelligent accompanying robots are currently on the market for accompanying the elderly, but such robots are expensive carriers that increase the cost burden on families and are therefore not easy to popularize. In addition, their accompanying modes are relatively limited and can only meet a single accompanying need.
Disclosure of Invention
In view of these problems, the present application provides an accompanying method, an accompanying device and a terminal device that can be applied to various existing intelligent devices to achieve rich accompanying functions.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an accompanying method is provided, which is applied to a terminal device corresponding to an accompanied person; the method comprises the following steps:
collecting data of a person to be accompanied; the data of the accompanied person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
analyzing the data of the person to be accompanied to determine a corresponding accompanying mode; the accompanying mode at least comprises one of the following modes: a voice response mode, a psychological consultation mode and an emergency mode;
and accompanying the person to be accompanied according to the determined accompanying mode.
Optionally, analyzing the data of the accompanied person to determine a corresponding accompanying mode specifically includes:
inputting the data of the person to be accompanied into a pre-constructed accompanying model to obtain a corresponding accompanying mode, wherein the accompanying model is a model which is obtained by machine learning of sample data and is used for outputting the corresponding accompanying mode, and the sample data at least comprises one of the following: sound sample data, expression sample data, motion sample data and physical sign sample data.
Optionally, when the accompanying mode is the voice response mode, accompanying the accompanied person according to the determined accompanying mode specifically includes:
obtaining an emotion analysis result of the accompanied person by using the accompanying model, and obtaining accompanying data according to the emotion analysis result; the accompanying data is response audio or music;
and outputting the accompanying data.
Optionally, the method further comprises: collecting sound data of a relative or friend of the accompanied person;
the obtaining of accompanying data according to the emotion analysis result specifically includes:
and obtaining the accompanying data by using the sound data of the relative or friend of the accompanied person according to the emotion analysis result.
Optionally, when the accompanying mode is the psychological consultation mode, accompanying the accompanied person according to the determined accompanying mode specifically includes:
establishing connection with a psychological counseling institution;
sending the data of the accompanied person to a psychological consulting organization, and/or sending the emotion analysis result of the accompanied person;
and receiving and outputting the audio and video provided by the psychological consulting organization.
Optionally, the method further comprises: and sending the conversation content of the attended person and a psychological consultant of the psychological consultant organization to terminal equipment corresponding to the guardian of the attended person.
Optionally, when the accompanying mode is the emergency mode, the data of the accompanied person comprises physical sign data; accompanying the accompanied person according to the determined accompanying mode specifically includes:
and sending the physical sign data to an emergency agency closest to the accompanied person.
Optionally, analyzing the data of the accompanied person to determine a corresponding accompanying mode specifically includes:
and judging whether the sign data is abnormal or not, and if so, determining that the accompanying mode is the emergency mode.
In a second aspect, the application provides an accompanying device, which is applied to a terminal device corresponding to an attended person; the device comprises:
the data acquisition module is used for acquiring the data of the accompanied person; the data of the accompanied person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
the accompanying mode determining module is used for analyzing the data of the accompanying person to determine a corresponding accompanying mode; the accompanying mode includes at least one of: a voice response mode, a psychological consultation mode and an emergency mode;
and the accompanying module is used for accompanying the accompanying person according to the accompanying mode determined by the accompanying mode determining module.
In a third aspect, the present application provides a terminal device, including: a data acquisition device, a processor, and a storage medium;
the data acquisition device is used for acquiring data of a person under accompanying and sending the data to the processor; the data of the accompanied person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
the storage medium is for storing a computer program which, when executed by the processor, performs the following:
obtaining data of the accompanied person;
analyzing the data of the person to be accompanied to determine a corresponding accompanying mode; the accompanying mode at least comprises one of the following modes: a voice response mode, a psychological consultation mode and an emergency mode;
and accompanying the person to be accompanied according to the determined accompanying mode.
Optionally, the terminal device is any one of:
a mobile phone, a computer, a smart bracelet, a smart electronic ornament or an intelligent robot.
Compared with the prior art, the application has the following beneficial effects:
The technical scheme provided by the application can be applied to a terminal device corresponding to the accompanied person, such as a smart bracelet or a smart electronic ornament. First, data of the accompanied person is collected, the data comprising at least one of the following: sound data, expression data, motion data and physical sign data. The data of the accompanied person is then analyzed to determine a corresponding accompanying mode, the accompanying mode including at least one of a voice response mode, a psychological consultation mode and an emergency mode. Finally, the accompanied person is accompanied according to the determined accompanying mode.
The method and the device can thus match a suitable accompanying mode to the actual situation of the accompanied person (that is, the data the person provides). Because the accompanying modes are rich and varied, the accompanied person's experience is enriched and the accompanying effect is improved. In addition, the technical scheme of the application can be implemented on terminal devices of various forms and is not confined to intelligent robots, so the cost of accompanying is greatly reduced, which is convenient and practical.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a flowchart of a method for accompanying and attending provided by an embodiment of the present application;
fig. 2 is a flowchart of another accompanying method provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of an accompanying device provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
As described above, most existing accompanying products take the form of intelligent accompanying robots, but their hardware configuration is complex, their failure rate and manufacturing cost are high, and their practicability is poor. In addition, current accompanying products offer only a single accompanying mode, and the accompanying effect needs to be improved.
In view of the above problems, the inventors have studied and provided the accompanying method, device and terminal device of the present application. In the scheme provided by the application, a corresponding accompanying mode can be matched according to the sound data, expression data, motion data, physical sign data and the like of the accompanied person. The accompanying mode includes at least one of a voice response mode, a psychological consultation mode and an emergency mode. Finally, the accompanied person is accompanied according to the determined corresponding accompanying mode. The technical scheme of the application is not limited to intelligent accompanying robots and can also be applied to various terminal devices corresponding to the accompanied person, such as a smart bracelet or a mobile phone. The application range is therefore wide and the cost is reduced. In addition, the accompanying modes are rich and more targeted, which can improve the experience of the accompanied person.
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Method embodiment one
Referring to fig. 1, the figure is a flowchart of a method for accompanying and attending according to an embodiment of the present application.
The method is applied to a terminal device corresponding to the accompanied person. In practical applications, the accompanied person may be any person needing accompanying, such as an elderly person, a disabled person, a patient or a child; the specific type of accompanied person is not limited here.
The terminal device corresponding to the accompanied person may be a mobile terminal device of the accompanied person, such as a mobile phone, a tablet computer, a notebook computer, a smart bracelet, a smart electronic ornament or an intelligent robot; it may also be a fixed terminal device of the accompanied person, such as a desktop computer.
As shown in fig. 1, a method for accompanying and attending provided by an embodiment of the present application includes:
step 101: and collecting the data of the accompanied person.
The terminal device corresponding to the accompanied person may include various types of data acquisition devices, for example: a voice acquisition device for acquiring sound data of the accompanied person; an image acquisition device for acquiring images of the accompanied person, from which expression data and motion data can be captured by image processing, where motion data refers to limb movements rather than facial movements; and a physical sign acquisition device for acquiring physical sign data of the accompanied person.
As a possible implementation, the physical sign acquisition device may be a sensor installed on the terminal device corresponding to the accompanied person. Physical sign data may include, but is not limited to: heart rate, electrodermal activity (EDA), body temperature, blood oxygen saturation and electrocardiogram (ECG).
In practical application, when the terminal device corresponding to the accompanied person is working, depending on the type of data the accompanied person provides, all of the data acquisition devices may acquire valid data, or only some of them may. For example, if the accompanied person does not make a sound and only shows a pained expression, the voice acquisition device cannot acquire valid data.
Therefore, the data of the accompanied person collected in this step comprises at least one of the following: sound data, expression data, motion data and physical sign data. That is, the sound data, expression data, motion data and physical sign data of the accompanied person may all be collected, or only one, two or three of them may be collected.
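For illustration only, the collected data can be thought of as a record with optional fields, since only some acquisition devices may return valid data at a given moment. The following is a minimal sketch; the class and field names are assumptions, not from the patent:

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class CollectedData:
    sound: Optional[bytes] = None             # raw audio from the voice acquisition device
    expression: Optional[List[float]] = None  # expression features extracted from images
    motion: Optional[List[float]] = None      # limb-motion features extracted from images
    vital_signs: Optional[dict] = None        # e.g. {"heart_rate": 72, "body_temp": 36.5, "spo2": 98}

    def has_any(self) -> bool:
        """At least one data type must be present before analysis proceeds."""
        return any(v is not None for v in (self.sound, self.expression, self.motion, self.vital_signs))
```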
Step 102: and analyzing the data of the person to be accompanied to determine a corresponding accompanying mode.
In the technical scheme provided by the embodiment of the application, various available accompanying modes are provided, such as: a voice response mode, a psychological consultation mode, and an emergency mode.
In practical application, not all of the available accompanying modes are needed at every moment. For example, in some application scenarios it may not be necessary to accompany the accompanied person in the emergency mode; in others it may not be necessary to use the psychological consultation mode. The data of the accompanied person therefore needs to be analyzed to determine the corresponding accompanying mode, that is, the accompanying mode that matches the accompanying needs of the accompanied person as determined from the person's data. In practical application, after the data of the accompanied person is analyzed, the determined accompanying mode includes at least one of the following: a voice response mode, a psychological consultation mode and an emergency mode.
In the voice response mode, the terminal device of the accompanied person provides an intelligent voice response for the accompanied person. The voice response may be response audio or music; the content of the voice response provided in this mode is not limited here.
The psychological counseling mode is to provide psychological counseling service for the accompanied person.
The emergency mode is to provide emergency services for the accompanied person.
As an example, when abnormal physical sign data of the accompanied person is collected and analysis of the collected sound data shows that the accompanied person has called for help, the matched accompanying mode may include the emergency mode; and, in order to soothe the accompanied person's emotions or give encouragement and support, the accompanying mode may further include the voice response mode.
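For illustration only, the matching example above can be sketched as a simple rule; this is not the patent's method (the application describes a machine-learned accompanying model, discussed in the second method embodiment), and the function and constant names are assumptions:

```python
# Rule-of-thumb sketch: abnormal vital signs or a detected call for help trigger the
# emergency mode; negative emotion or a call for help additionally triggers the voice
# response mode to soothe or encourage the accompanied person.
VOICE_RESPONSE, PSYCH_CONSULTATION, EMERGENCY = "voice_response", "psych_consultation", "emergency"

def match_modes(signs_abnormal: bool, asked_for_help: bool, negative_emotion: bool) -> set:
    modes = set()
    if signs_abnormal or asked_for_help:
        modes.add(EMERGENCY)
    if negative_emotion or asked_for_help:
        modes.add(VOICE_RESPONSE)  # soothe / encourage while help is on the way
    return modes

print(match_modes(signs_abnormal=True, asked_for_help=True, negative_emotion=False))
```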
Step 103: and accompanying the person to be accompanied according to the determined accompanying mode.
If the determined accompanying mode includes the voice response mode, this step may be implemented as follows: first obtain the accompanying data for the voice response mode, and then output the accompanying data. The accompanying data may specifically include response audio, music and the like. Outputting the accompanying data means providing it to the accompanied person, for example by audio or video playback.
If the determined accompanying mode includes the psychological consultation mode, this step may be implemented as follows: first establish a connection with a psychological counseling institution, which may be a remote communication connection established by existing instant-messaging software; then send the data of the accompanied person to the psychological counseling institution, and/or send the emotion analysis result of the accompanied person (the emotion analysis result is obtained by analyzing the data of the accompanied person); and finally receive and output the audio and video provided by the psychological counseling institution.
If the determined accompanying mode includes the emergency mode, this step may be implemented by sending the physical sign data to the emergency agency closest to the accompanied person. As a possible implementation, the terminal device corresponding to the accompanied person has a positioning function; because the terminal device is worn or carried by the accompanied person, or is located in the same room, the distance between the terminal device and the accompanied person is very small, and the positioned location can be used as the location of the accompanied person. If the terminal device is provided with a mobile communication module or a Wi-Fi module, the emergency agency closest to the positioned location can be determined by logging in to a map application or opening a related web page (such as Baidu Maps or Amap) over the network.
In addition, if the accompanied person has limited mobility and the long-term activity area rarely changes, the corresponding terminal device may analyze the pattern of the accompanied person's activity area, determine in advance the emergency agency nearest to that long-term activity area, and store the relevant information of that agency (such as contact telephone and location information). In this way, when the accompanied person is in that area and the emergency mode needs to be started, the nearest emergency agency can be determined very quickly, saving emergency time.
As a possible implementation, if the terminal device corresponding to the accompanied person has a positioning function, the location information of the accompanied person may also be sent together with the physical sign data when the physical sign data is sent to the emergency agency in this step, so that the emergency agency can provide fast and accurate emergency services for the accompanied person.
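For illustration only, the nearest-agency selection described above can be sketched as follows, assuming the terminal stores a small list of candidate emergency agencies with their coordinates; the data layout and function names are assumptions:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (latitude, longitude) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_agency(agencies, lat, lon):
    """agencies: list of dicts like {"name": ..., "phone": ..., "lat": ..., "lon": ...}."""
    return min(agencies, key=lambda a: haversine_km(lat, lon, a["lat"], a["lon"]))

agencies = [{"name": "Agency A", "phone": "120", "lat": 39.91, "lon": 116.40},
            {"name": "Agency B", "phone": "120", "lat": 39.95, "lon": 116.30}]
print(nearest_agency(agencies, 39.92, 116.39)["name"])
```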
The accompanying method provided by the embodiment of the application is described above. The method can be applied to a terminal device corresponding to the accompanied person, such as a smart bracelet or a smart electronic ornament. First, data of the accompanied person is collected, the data comprising at least one of the following: sound data, expression data, motion data and physical sign data. The data is then analyzed to determine a corresponding accompanying mode, which includes at least one of a voice response mode, a psychological consultation mode and an emergency mode. Finally, the accompanied person is accompanied according to the determined accompanying mode.
The accompanying method provided by the application can match a suitable mode to the actual situation of the accompanied person (that is, the data the person provides). Because the accompanying modes are rich and varied, the accompanied person's experience is enriched and the accompanying effect is improved. In addition, the technical scheme of the application can be implemented on terminal devices of various forms and is not confined to intelligent robots, so the cost of accompanying is greatly reduced, which is convenient and practical.
It can be understood that in practice the accompanied person usually has a guardian, who may be a relative or friend of the accompanied person. Although applying the method provided by the embodiment of the application gives the terminal device corresponding to the accompanied person rich accompanying functions, if accompanying information about the accompanied person can also be provided to the guardian, the guardian can understand the actual situation of the accompanied person in more detail, including the person's physical and psychological health.
Therefore, as a possible implementation manner, if the accompanying mode determined in step 102 includes a psychological consultation mode, the accompanying method provided in this embodiment may further include:
and sending the conversation content of the attended person and a psychological consultant of a psychological consultant organization to the terminal equipment corresponding to the guardian of the attended person.
In specific implementation, as a possible implementation, the terminal device corresponding to the accompanied person can store the audio and video of the remote conversation established between the accompanied person and a psychological consultant of the psychological counseling institution, and send the stored audio and video to the terminal device corresponding to the guardian of the accompanied person.
By obtaining the conversation content, the guardian of the accompanied person can keep track of the person's actual situation in real time, and can also learn from the conversation the approaches a psychological consultant uses to relieve the accompanied person's negative emotions, so that the guardian can use these approaches to better accompany the person when they are needed.
As a possible implementation manner, if the accompanying mode determined in step 102 includes a first aid mode, the accompanying method provided in this embodiment may further include:
and sending the physical sign data of the attended person and the time information of the emergency rescue mechanism which sends the physical sign data to the nearest emergency rescue mechanism to the attended person to the terminal equipment corresponding to the guardian of the attended person.
In this way, the guardian can learn of the abnormality in the accompanied person's physical sign data in time and know when the emergency request was sent to the emergency agency.
In practical application, in order to analyze the data of the accompanied person quickly and determine the corresponding accompanying mode rapidly, an accompanying model can be constructed in advance, and the analysis of the data and the determination of the accompanying mode can be completed with this model.
Method embodiment two
Referring to fig. 2, the figure is a flowchart of another accompanying method provided in the embodiment of the present application.
As shown in fig. 2, the accompanying method provided in this embodiment includes:
step 201: and collecting the data of the accompanied person.
Step 201 of this embodiment is substantially the same as step 101 of the previous embodiment, and the description of step 201 may refer to the previous embodiment, which is not repeated herein.
Step 202: and inputting the data of the person to be accompanied into a pre-constructed accompanying model to obtain a corresponding accompanying mode.
The accompanying model described in this embodiment will be described and explained below.
The accompanying model is obtained by pre-training before the accompanying method provided by the embodiment of the application is executed. Specifically, the accompanying model is a model for outputting a corresponding accompanying pattern obtained by machine learning sample data. Since the model can output a corresponding accompanying pattern, the model is referred to as a accompanying model for convenience of description and understanding.
The sample data may come from a large number of different sample individuals, and the data collected from these different individuals is referred to as sample data. This embodiment does not limit the specific type of sample data. As a possible implementation, the sample data may include at least one of: sound sample data, expression sample data, motion sample data and physical sign sample data.
As an example, the sound sample data may include audio of the sample individuals under various emotions, such as audio containing crying, audio containing laughing and audio containing sighing.
As an example, the expression sample data may include expression data obtained by processing facial images of the sample individuals under various emotions and extracting features from those images. The facial images may be images in which a sample individual shows a smiling expression, a sad expression, an angry expression and so on.
As an example, the motion sample data may include motion data obtained by processing body images of the sample individuals under various emotions. The body images may be images of sample individuals in various postures; it can be understood that a posture formed by limb movements is associated with an emotion.
As an example, the vital sign sample data may include: sign data of sample individuals of different ages in a plurality of time periods. The specific type of the physical sign sample data is not limited herein.
In practical application, as a possible implementation, the sample data may be labelled, for example with type labels and emotion labels. A type label may be any one of: a sound label, an expression label, a motion label, a physical sign label and the like. An emotion label may be a positive emotion label or a negative emotion label; for example, an image showing a smiling expression is labelled with a positive emotion label, while an image showing a sad or angry expression is labelled with a negative emotion label. In addition, the positive and negative emotion labels may carry specific emotion values: the larger the absolute value of the emotion value, the stronger the corresponding positive or negative emotion. As an example, the emotion value of an image showing a sad expression may be -1.5, and the emotion value of an image showing an angry expression may be -2.
In addition, the sample data may further include reference intervals corresponding to the various types of physical sign sample data. When a piece of physical sign sample data falls outside its reference interval, an urgency value can be assigned to it according to its specific type and the amount by which it exceeds the interval. The urgency value is positive, and the higher it is, the more abnormal the physical sign sample data.
The emotion value and the urgency value can be obtained by processing and analyzing sample data.
In practical application, the correspondence among the urgency result, the emotion analysis result and the accompanying mode can be constructed in advance for the sample data, and this correspondence is used as the objective function for constructing the accompanying model by machine learning.
The urgency result is obtained from the urgency values of all types of physical sign sample data of the same sample individual. Specifically, the urgency values may be summed directly, or each may be given a certain weight before summing. In the above correspondence, when the urgency result exceeds a preset urgency threshold, the accompanying mode corresponding to that urgency result is the emergency mode.
The emotion analysis result is obtained from the emotion values of all types of sample data of the same sample individual (excluding the physical sign sample data). Specifically, the emotion values may be summed directly, or each may be given a certain weight before summing. In the correspondence, when the emotion analysis result is lower than a preset first threshold and higher than a preset second threshold (the first threshold being higher than the second), the corresponding accompanying mode is the voice response mode; when the emotion analysis result is not higher than the preset second threshold, the corresponding accompanying mode is the psychological consultation mode, or the psychological consultation mode together with the voice response mode.
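For illustration only, the correspondence described in the two paragraphs above can be written out as a threshold mapping. The numeric thresholds below are placeholders, not figures from the patent:

```python
URGENCY_THRESH = 3.0     # assumed preset urgency-result threshold
EMOTION_THRESH_1 = 0.0   # assumed preset first (higher) emotion threshold
EMOTION_THRESH_2 = -2.0  # assumed preset second (lower) emotion threshold

def modes_from_scores(urgency_result: float, emotion_result: float) -> set:
    modes = set()
    if urgency_result > URGENCY_THRESH:
        modes.add("emergency")
    if EMOTION_THRESH_2 < emotion_result < EMOTION_THRESH_1:
        modes.add("voice_response")
    elif emotion_result <= EMOTION_THRESH_2:
        # strongly negative emotion: psychological consultation, optionally with voice response
        modes.update({"psych_consultation", "voice_response"})
    return modes

print(modes_from_scores(urgency_result=4.2, emotion_result=-1.5))
```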
During training, the coefficients of each layer of the accompanying model are trained continuously according to the objective function, and the trained accompanying model is obtained after a preset termination condition is met. The preset termination condition may be that the number of training iterations reaches a preset number, or that the value of the objective function is smaller than a preset value.
With the trained accompanying model, after the data of the accompanied person is input, the model can output the corresponding accompanying mode. By constructing the accompanying model in advance and using it to determine a suitable accompanying mode, the speed at which the accompanying mode is determined can be increased, so that the accompanying needs of the accompanied person are responded to quickly and the person's experience is improved.
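For illustration only, the training described above can be sketched with a small classifier. The reduction of each sample to an (emotion value, urgency value) pair, the toy samples and the choice of a small multilayer perceptron are all assumptions; the patent does not fix a specific model architecture or feature set:

```python
from sklearn.neural_network import MLPClassifier

# toy feature vectors: [emotion value, urgency value]; labels are target accompanying modes
X = [[-1.5, 0.0], [-2.2, 0.3], [-0.8, 0.1], [0.5, 0.0], [-0.5, 4.2], [-1.0, 5.0]]
y = ["psych_consultation", "psych_consultation", "voice_response",
     "voice_response", "emergency", "emergency"]

# a small layered model trained for a bounded number of iterations, mirroring the
# "preset number of training iterations" termination condition described above
model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
model.fit(X, y)
print(model.predict([[-1.8, 0.1]]))  # expected to favour consultation, though a toy model gives no guarantee
```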
Step 203: accompanying the accompanied person according to the determined accompanying mode.
The implementation manner of step 203 in this embodiment is substantially the same as that of step 103 in the previous embodiment. Reference is therefore made to the foregoing embodiments for a related description of the present step.
When the accompany mode is the voice response mode, step 203 may be implemented as follows:
obtaining an emotion analysis result of the accompanied person by using the accompanying model, obtaining accompanying data according to the emotion analysis result, and outputting the accompanying data, where the accompanying data is response audio or music.
In this embodiment, the emotion analysis result may further include an emotion type, for example sadness, anger or joy. A knowledge base containing accompanying data (response audio or music) corresponding to the emotion analysis results of the various emotion types may be created in advance. For example, if the emotion type in the emotion analysis result is sadness, a response audio or soft, cheerful music suitable for soothing that type of emotion is output; if the emotion type is anger, a response audio or slow, soothing music suitable for soothing that type of emotion is output.
The following is an exemplary description of an implementation of the knowledge base in this embodiment.
Sound data of a relative or friend of the accompanied person (in particular, a relative or friend who has passed away) can be collected in advance, and this audio data can be processed (the processing method is not limited; for example, extraction of voice frequency and timbre, and speech recognition, may be performed). The response audio used for soothing the various types of emotions can then be regenerated from the processed sound data, that is, the original response audio is re-rendered in the voice frequency and timbre of the accompanied person's relative or friend while the speech content remains unchanged. By combining the collected sound data with the knowledge base, accompanying data (response audio) in the voice of the relative or friend can be obtained. When this accompanying data is output, the accompanied person hears the voice of a familiar relative or friend responding, which can improve the accompanying effect (for example, the soothing effect) and influence the accompanied person's emotions in a more positive direction.
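For illustration only, the knowledge base lookup and voice re-rendering described above can be sketched as follows. The mapping, file names and the voice-conversion stub are assumptions; the patent does not prescribe a specific voice-conversion technique:

```python
# mapping from emotion type to soothing response audio or music
KNOWLEDGE_BASE = {
    "sad":   {"audio": "comforting_reply.wav", "music": "soft_cheerful.mp3"},
    "angry": {"audio": "calming_reply.wav",    "music": "slow_soothing.mp3"},
}

def resynthesize_with_voice(audio_path: str, voice_profile: dict) -> str:
    """Placeholder for a voice-conversion step: a real implementation would apply the
    relative's or friend's pitch and timbre while keeping the speech content unchanged."""
    return audio_path  # pass-through stub

def get_accompanying_data(emotion_type: str, relative_voice_profile: dict = None) -> str:
    entry = KNOWLEDGE_BASE.get(emotion_type, KNOWLEDGE_BASE["sad"])
    audio = entry["audio"]
    if relative_voice_profile is not None:
        audio = resynthesize_with_voice(audio, relative_voice_profile)
    return audio

print(get_accompanying_data("angry", relative_voice_profile={"pitch": 180.0}))
```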
It should be noted that in the above embodiment the accompanying model may be continuously trained and updated. For example, medical records collected from medical institutions frequently visited by the accompanied person may be added to the training data; such records include the usual physical sign information of the accompanied person, with both abnormal and normal items identified. Using the physical sign information in the medical records, an accompanying model that better fits the physical sign state of the accompanied person can be trained, and the model can judge more accurately whether the input physical sign data of the accompanied person is abnormal, thereby improving the accuracy of the determined accompanying mode.
Based on the accompanying method provided by the foregoing embodiment, correspondingly, the application further provides an accompanying device. A specific implementation of the apparatus is described below with reference to the embodiments and the drawings.
Device embodiment
Referring to fig. 3, the figure is a schematic structural diagram of an accompanying device provided in an embodiment of the present application.
The accompanying device provided by the embodiment can be applied to the terminal equipment corresponding to the accompanied person.
As shown in fig. 3, the accompanying device provided in this embodiment includes:
a data acquisition module 301, configured to acquire data of a person being cared; the data of the accompanying person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
the accompany mode determining module 302 is used for analyzing the data of the attendee to determine a corresponding accompany mode; the accompanying mode includes at least one of: a voice response mode, a psychological consultation mode and an emergency mode;
and the accompanying module 303 is configured to accompany the accompanying person according to the accompanying mode determined by the accompanying mode determining module 302.
In practical application, a data acquisition device can be installed on the terminal equipment corresponding to the accompanying person and used for acquiring the data of the accompanying person. In this embodiment, the data obtaining module 301 may specifically obtain data of the attended person from a data collecting device of the terminal device.
The accompanying device provided by the application can match a suitable mode to the actual situation of the accompanied person (that is, the data the person provides). Because the accompanying modes are rich and varied, the accompanied person's experience is enriched and the accompanying effect is improved. In addition, the accompanying device provided by the application can be implemented on terminal devices of various forms and is not confined to intelligent robots, so the cost of accompanying is greatly reduced, which is convenient and practical.
Optionally, the accompany mode determining module 302 specifically includes:
a first determining unit, configured to input data of the person under career into a pre-constructed accompanying model to obtain a corresponding accompanying mode, where the accompanying model is a model obtained by machine learning sample data and used for outputting the corresponding accompanying mode, and the sample data at least includes one of: sound sample data, expression sample data, action sample data and physical sign sample data.
By constructing the accompanying model in advance and using it to determine a suitable accompanying mode, the speed at which the accompanying mode is determined can be increased, so that the accompanying needs of the accompanied person are responded to quickly and the person's experience is improved.
Optionally, when the accompanying mode is the voice response mode, the accompanying module 303 specifically includes:
the accompanying data acquisition unit is used for acquiring an emotion analysis result of the attendee by using the accompanying model and acquiring accompanying data according to the emotion analysis result; the accompanying data is response audio or music;
and the first output unit is used for outputting the accompanying data.
Optionally, the data obtaining module 301 is further configured to obtain sound data of a relative or friend of the accompanied person; specifically, the sound data may be obtained from the data acquisition device.
The accompanying data acquisition unit is specifically used for acquiring accompanying data by using the sound data of relatives or friends of the accompanying person according to the emotion analysis result.
Responding to the accompanied person in the voice of the person's relative or friend can improve the accompanying effect (for example, the soothing effect) and influence the accompanied person's emotions in a more positive and active direction.
Optionally, when the determined accompanying mode is the psychological consultation mode, the accompanying module 303 specifically includes:
the connection establishing unit is used for establishing connection with a psychological consultation mechanism;
the first data sending unit is used for sending the data of the attended person to a psychological consultation mechanism and/or sending an emotion analysis result of the attended person;
the audio and video receiving unit is used for receiving the audio and video provided by the psychological consultation mechanism;
and the audio and video output unit is used for outputting the received audio and video.
It can be understood that in practice the accompanied person usually has a guardian, who may be a relative or friend of the accompanied person. Although applying the method provided by the embodiment of the application gives the terminal device corresponding to the accompanied person rich accompanying functions, if accompanying information about the accompanied person can also be provided to the guardian, the guardian can understand the actual situation of the accompanied person in more detail, including the person's physical and psychological health.
Optionally, the apparatus further comprises: and the conversation content sending module is used for sending the conversation content of the attended person and a psychological consultant of the psychological consulting organization to the terminal equipment corresponding to the guardian of the attended person.
By obtaining the conversation content, the guardian of the accompanied person can keep track of the person's actual situation in real time, and can also learn from the conversation the approaches a psychological consultant uses to relieve the accompanied person's negative emotions, so that the guardian can use these approaches to better accompany the person when they are needed.
Optionally, when the accompanying mode is the emergency mode, the data of the accompanied person comprises physical sign data; the accompanying module 303 specifically includes:
and the second data sending unit is used for sending the physical sign data to an emergency organization closest to the accompanied person. Therefore, the first-aid organization can rescue the accompanied person as soon as possible.
Optionally, the accompany mode determining module 302 specifically includes:
the abnormity judging unit is used for judging whether the physical sign data is abnormal or not;
and the second determining unit is used for determining the accompanying mode as the first-aid mode when the abnormity judging unit judges that the physical sign data of the accompanied person is abnormal.
Based on the accompanying method and the accompanying device provided by the foregoing embodiments, correspondingly, the application further provides a terminal device. The following describes a specific implementation of the terminal device with reference to the embodiments and the drawings.
Terminal device embodiment
Referring to fig. 4, the figure is a schematic structural diagram of a terminal device according to an embodiment of the present application.
As shown in fig. 4, the terminal device provided in this embodiment includes:
a data acquisition device 401, a processor 402 and a storage medium 403;
the data acquisition device 401 is used for acquiring data of a person under accompanying, and sending the data to the processor 402; the data of the accompanied person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
the storage medium 403 is used for storing a computer program which, when executed by the processor 402, performs the following operations:
obtaining data of the accompanied person;
analyzing the data of the person to be accompanied to determine a corresponding accompanying mode; the accompanying mode at least comprises one of the following modes: a voice response mode, a psychological consultation mode and an emergency mode;
and accompanying the person to be accompanied according to the determined accompanying mode.
The terminal device may be any one of the following:
a mobile phone, a computer, a smart bracelet, a smart electronic ornament or an intelligent robot. Of course, the terminal device may also be another intelligent electronic device, which is not specifically limited here.
The terminal device provided by the application can match a suitable mode to the actual situation of the accompanied person (that is, the data the person provides). Because the accompanying modes are rich and varied, the accompanied person's experience is enriched and the accompanying effect is improved. In addition, the terminal device provided by the application can take various forms and is not confined to intelligent robots, so the cost of accompanying is greatly reduced, which is convenient and practical.
In addition, when the computer program stored in the storage medium 403 is executed by the processor 402, the operations described in any of the foregoing method embodiments may also be performed, so as to implement diversified accompanying for the accompanied person, improve the accompanied person's experience and realize safe and reliable accompanying.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, they are described relatively simply, and reference may be made to some of the descriptions of the method embodiments for relevant points. The above-described embodiments of the apparatus and system are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts described as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A accompany method is characterized in that the accompany method is applied to terminal equipment corresponding to a person to be accompanied; the method comprises the following steps:
collecting data of a person to be accompanied; the data of the accompanied person at least comprises one of the following data: sound data, expression data, motion data and physical sign data;
analyzing the data of the person to be accompanied to determine a corresponding accompanying mode; the accompanying mode includes at least one of: a voice response mode, a psychological consultation mode and an emergency mode;
and accompanying the person to be accompanied according to the determined accompanying mode.
2. The accompanying method as claimed in claim 1, wherein the analyzing the data of the accompanying person to determine the corresponding accompanying mode specifically comprises:
inputting the data of the person to be accompanied into a pre-constructed accompanying model to obtain a corresponding accompanying mode, wherein the accompanying model is a model which is obtained by machine learning of sample data and is used for outputting the corresponding accompanying mode, and the sample data at least comprises one of the following: sound sample data, expression sample data, motion sample data and physical sign sample data.
3. The accompanying method according to claim 2, wherein when the accompanying mode is the voice response mode, accompanying the accompanied person according to the determined accompanying mode specifically comprises:
obtaining an emotion analysis result of the attendee by using the accompanying model, and obtaining accompanying data according to the emotion analysis result; the accompanying data is response audio or music;
and outputting the accompanying data.
4. The accompanying method according to claim 3, further comprising: collecting sound data of a relative or friend of the accompanied person;
the obtaining of accompanying data according to the emotion analysis result specifically includes:
and obtaining the accompanying data by using the sound data of the relative or friend of the accompanied person according to the emotion analysis result.
5. The accompanying method according to claim 1, wherein when the accompanying mode is the psychological consultation mode, the accompanying of the accompanying person according to the determined accompanying mode specifically includes:
establishing connection with a psychological counseling institution;
sending the data of the accompanied person to a psychological consulting organization, and/or sending the emotion analysis result of the accompanied person;
and receiving and outputting the audio and video provided by the psychological consulting organization.
6. The accompanying method according to claim 1, further comprising: sending the conversation content of the accompanied person and a psychological consultant of the psychological consulting organization to the terminal device corresponding to the guardian of the accompanied person.
7. The accompanying method according to claim 1, wherein when the accompanying mode is the emergency mode, the data of the accompanied person comprises physical sign data; accompanying the accompanied person according to the determined accompanying mode specifically comprises:
and sending the physical sign data to an emergency agency closest to the accompanied person.
8. The accompanying method according to claim 1, wherein the analyzing the data of the accompanied person to determine the corresponding accompanying mode specifically comprises:
judging whether the vital sign data is abnormal, and if so, determining that the accompanying mode is the emergency mode.
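Claims 7 and 8 together suggest an emergency path: flag abnormal vital sign data and forward it to the emergency agency closest to the accompanied person. The thresholds, agency coordinates and the flat-distance nearest-agency rule in the sketch below are illustrative assumptions.

```python
import math

NORMAL_RANGES = {"heart_rate": (40, 150), "body_temperature": (35.0, 38.5)}

def vital_signs_abnormal(vital_signs: dict) -> bool:
    """True if any monitored value falls outside its assumed normal range."""
    for name, (low, high) in NORMAL_RANGES.items():
        value = vital_signs.get(name)
        if value is not None and not (low <= value <= high):
            return True
    return False

AGENCIES = [  # (name, latitude, longitude)
    ("City Hospital ER", 39.91, 116.40),
    ("District Clinic", 39.95, 116.35),
]

def nearest_agency(lat: float, lon: float):
    """Pick the agency with the smallest distance (flat-plane approximation)."""
    return min(AGENCIES, key=lambda a: math.hypot(a[1] - lat, a[2] - lon))

signs = {"heart_rate": 170, "body_temperature": 36.6}
if vital_signs_abnormal(signs):
    name, _, _ = nearest_agency(39.92, 116.41)
    print(f"emergency mode: sending vital signs {signs} to {name}")
```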
9. An accompanying device, characterized in that the device is applied to a terminal device corresponding to a person to be accompanied; the device comprises:
a data acquisition module, configured to acquire data of the accompanied person, wherein the data of the accompanied person comprises at least one of the following: sound data, expression data, motion data and vital sign data;
an accompanying mode determining module, configured to analyze the data of the accompanied person to determine a corresponding accompanying mode, wherein the accompanying mode comprises at least one of the following: a voice response mode, a psychological counseling mode and an emergency mode;
and an accompanying module, configured to accompany the accompanied person according to the accompanying mode determined by the accompanying mode determining module.
10. A terminal device, comprising: a data acquisition device, a processor, and a storage medium;
wherein the data acquisition device is configured to acquire data of the accompanied person and send the data to the processor, the data of the accompanied person comprising at least one of the following: sound data, expression data, motion data and vital sign data;
and the storage medium is configured to store a computer program which, when executed by the processor, performs the following steps:
obtaining the data of the accompanied person;
analyzing the data of the accompanied person to determine a corresponding accompanying mode, wherein the accompanying mode comprises at least one of the following: a voice response mode, a psychological counseling mode and an emergency mode;
and accompanying the accompanied person according to the determined accompanying mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910859308.4A CN112489797A (en) | 2019-09-11 | 2019-09-11 | Accompanying method, device and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910859308.4A CN112489797A (en) | 2019-09-11 | 2019-09-11 | Accompanying method, device and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112489797A (en) | 2021-03-12
Family
ID=74920155
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910859308.4A Pending CN112489797A (en) | 2019-09-11 | 2019-09-11 | Accompanying method, device and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112489797A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150190927A1 (en) * | 2012-12-21 | 2015-07-09 | Crosswing Inc. | Customizable robotic system |
CN105082150A (en) * | 2015-08-25 | 2015-11-25 | 国家康复辅具研究中心 | Robot man-machine interaction method based on user mood and intension recognition |
CN105654413A (en) * | 2016-03-23 | 2016-06-08 | 张春生 | Smart old-age community system based on cloud technology |
CN106230972A (en) * | 2016-08-30 | 2016-12-14 | 江苏艾倍科科技股份有限公司 | A kind of endowment service system of wisdom at home |
CN106236031A (en) * | 2016-08-30 | 2016-12-21 | 江苏艾倍科科技股份有限公司 | A kind of family endowment emergency relief based on the Big Dipper and alignment system |
CN106249711A (en) * | 2016-08-03 | 2016-12-21 | 海南警视者科技开发有限公司 | A kind of Multifunctional intelligent robot |
CN106671105A (en) * | 2017-01-17 | 2017-05-17 | 五邑大学 | Intelligent accompanying robot for old people |
CN107301168A (en) * | 2017-06-01 | 2017-10-27 | 深圳市朗空亿科科技有限公司 | Intelligent robot and its mood exchange method, system |
CN108711452A (en) * | 2018-01-25 | 2018-10-26 | 鲁东大学 | The health state analysis method and system of view-based access control model |
CN108717678A (en) * | 2018-05-25 | 2018-10-30 | 青岛联合创智科技有限公司 | A kind of wisdom endowment system |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150190927A1 (en) * | 2012-12-21 | 2015-07-09 | Crosswing Inc. | Customizable robotic system |
CN105082150A (en) * | 2015-08-25 | 2015-11-25 | 国家康复辅具研究中心 | Robot man-machine interaction method based on user mood and intension recognition |
CN105654413A (en) * | 2016-03-23 | 2016-06-08 | 张春生 | Smart old-age community system based on cloud technology |
CN106249711A (en) * | 2016-08-03 | 2016-12-21 | 海南警视者科技开发有限公司 | A kind of Multifunctional intelligent robot |
CN106230972A (en) * | 2016-08-30 | 2016-12-14 | 江苏艾倍科科技股份有限公司 | A kind of endowment service system of wisdom at home |
CN106236031A (en) * | 2016-08-30 | 2016-12-21 | 江苏艾倍科科技股份有限公司 | A kind of family endowment emergency relief based on the Big Dipper and alignment system |
CN106671105A (en) * | 2017-01-17 | 2017-05-17 | 五邑大学 | Intelligent accompanying robot for old people |
CN107301168A (en) * | 2017-06-01 | 2017-10-27 | 深圳市朗空亿科科技有限公司 | Intelligent robot and its mood exchange method, system |
CN108711452A (en) * | 2018-01-25 | 2018-10-26 | 鲁东大学 | The health state analysis method and system of view-based access control model |
CN108717678A (en) * | 2018-05-25 | 2018-10-30 | 青岛联合创智科技有限公司 | A kind of wisdom endowment system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11301680B2 (en) | Computing device for enhancing communications | |
US8715179B2 (en) | Call center quality management tool | |
US10448887B2 (en) | Biometric customer service agent analysis systems and methods | |
US9138186B2 (en) | Systems for inducing change in a performance characteristic | |
US8715178B2 (en) | Wearable badge with sensor | |
JP6101684B2 (en) | Method and system for assisting patients | |
JP2019145067A (en) | System and method, computer implementation method, program and computer system for physiological detection for detecting state of concentration of person for optimization of productivity and business quality | |
US10504379B2 (en) | System and method for generating an adaptive embodied conversational agent configured to provide interactive virtual coaching to a subject | |
WO2016089594A2 (en) | Conversation agent | |
US20160321401A1 (en) | System and method for topic-related detection of the emotional state of a person | |
JP7422797B2 (en) | Medical treatment support system | |
US20200375544A1 (en) | System, method and computer program product for detecting a mobile phone user's risky medical condition | |
CN113287175A (en) | Interactive health status evaluation method and system thereof | |
US10978209B2 (en) | Method of an interactive health status assessment and system thereof | |
Jarwar et al. | Exploring web objects enabled data-driven microservices for E-health service provision in IoT environment | |
Martinez et al. | A predictive model for automatic detection of social isolation in older adults | |
CN112489797A (en) | Accompanying method, device and terminal equipment | |
Samyoun et al. | VoiceCare: a voice-interactive cognitive assistant on a smartwatch for monitoring and assisting daily healthcare activities | |
Bonilla et al. | Facial recognition of emotions with smartphones to improve the elder quality of life | |
SureshKumar et al. | HELTRAK-a medical application with chatbot based on AI | |
JP7405357B2 (en) | Elderly person monitoring system | |
WO2022065386A1 (en) | Thought inference system, inference model generation system, thought inference device, inference model generation method, computer program, and inference model | |
CN111428540A (en) | Method and device for outputting information | |
US20240008766A1 (en) | System, method and computer program product for processing a mobile phone user's condition | |
Kaushan et al. | Personalized and Interactive Demented Care and Learning Mate with a Virtual System Using Emotion Recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 