WO2023228789A1 - Information processing system, information processing method, information processing device, and user terminal

Information processing system, information processing method, information processing device, and user terminal

Info

Publication number
WO2023228789A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
information processing
sharing range
sharing
Prior art date
Application number
PCT/JP2023/018045
Other languages
English (en)
Japanese (ja)
Inventor
健治 山根
乃愛 金子
咲湖 安川
律子 金野
拓 田中
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation (ソニーグループ株式会社)
Publication of WO2023228789A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/17 Details of further file system functions
    • G06F 16/176 Support for shared access to files; File sharing support
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data

Definitions

  • The present disclosure relates to an information processing system, an information processing method, an information processing device, and a user terminal.
  • Vital sensors that detect a user's biological information are known. Furthermore, in recent years, technology has been developed to predict a user's illness based on biological information detected using a vital sensor.
  • Patent Document 1 discloses a technique in which, when a user's biometric information is shared with a third party, restrictions are placed on the biometric information to be shared.
  • The present disclosure aims to provide an information processing system, an information processing method, an information processing device, and a user terminal that make it possible to easily set the sharing range of user information including biometric information.
  • The information processing system includes an information processing device and a user terminal. The information processing device includes a sharing range acquisition unit that acquires an expected sharing range that a business operator wishes to have shared with respect to user information, and a processing unit that transmits a recommended sharing range, which is calculated based on the expected sharing range and recommended for sharing, to a user terminal corresponding to the user information.
  • The user terminal includes a display control unit that displays the recommended sharing range on a display unit.
  • FIG. 1 is a schematic diagram illustrating an example of a schematic configuration of an information processing system applicable to the present disclosure.
  • FIG. 2 is a schematic diagram showing an example of a sensor and sensing data applicable to the present disclosure.
  • FIG. 3 is a functional block diagram of an example for explaining functions of an information processing system applicable to the present disclosure.
  • FIG. 4 is a diagram illustrating an example of an algorithm DB in which information such as feature value calculation algorithms applicable to the present disclosure is registered.
  • FIG. 5 is a schematic diagram illustrating an example of feature amounts that can be calculated, applicable to the present disclosure.
  • FIG. 6 is a schematic diagram showing an example of an algorithm DB in which information such as learned models is registered, applicable to the present disclosure.
  • FIG. 7 is a block diagram showing a hardware configuration of an example of an information processing device applicable to the present disclosure.
  • FIG. 8 is a block diagram showing a hardware configuration of an example of a user terminal applicable to the present disclosure.
  • FIG. 9 is a functional block diagram of an example for explaining the functions of the information processing system according to the embodiment.
  • FIG. 10 is a schematic diagram showing an example of user information held by the information holding unit according to the embodiment.
  • FIG. 11 is a schematic diagram illustrating an example of service provider sharing range information stored in a service provider sharing range storage unit according to the embodiment.
  • FIG. 12 is a flowchart of an example of processing by an access control unit according to the embodiment.
  • FIG. 13 is a flowchart of an example of processing by an intervention module according to the embodiment.
  • FIG. 14 is a flowchart of an example of a method of presenting an expected sharing range according to the embodiment.
  • FIG. 15 is a schematic diagram showing an example of an initial screen displayed on a user terminal by the user terminal program according to the embodiment.
  • FIG. 16 is a schematic diagram showing an example of a sharing range setting screen according to the embodiment.
  • FIG. 17 is a schematic diagram showing an example of an inference target value table in which an inference target value holding unit according to the embodiment holds inference target values.
  • FIG. 18 is a schematic diagram illustrating an example of a sharing status visualization screen that visualizes, for each service provider, the user's sharing status with respect to that service provider, according to the embodiment.
  • FIG. 19 is a flowchart showing an example of processing according to a first modification of the embodiment.
  • FIG. 20 is a flowchart showing an example of processing according to a second modification of the embodiment.
  • The present disclosure relates to a method for presenting the scope of data sharing when sharing user information with a third party in a healthcare application that assists user health management.
  • The third party may be, for example, a business operator that provides services to users of the healthcare application.
  • Hereinafter, such a business operator is referred to as a service provider (SP).
  • A service provider may be an individual, a corporation, a specific organization, or the like.
  • The user information may include identification information that identifies the user, attribute information of the user, and biometric information acquired from the user.
  • Non-Patent Document 1 (Yuri Rykov and 4 others, "Digital Biomarkers for Depression Screening With Wearable Devices: Cross-sectional Study With Machine Learning Modeling", [online], October 15, 2021, JMIR mHealth and uHealth, ISSN 2291-5222, [searched on April 21, 2020], Internet, <https://mhealth.jmir.org/2021/10/e24872>) reports that data from 11 digital biomarkers obtained from activity trackers show a significant relationship with the severity of depression. In this way, it has been shown that the values of such biomarkers correlate with the degree of depression, and it is said that personalized medical care can be provided based on these biomarkers.
  • Biometric information data obtained from, for example, a wearable device needs to be kept confidential.
  • The present disclosure proposes a solution to the problems in these conventional techniques.
  • The user specifies which information included in the user information is permitted to be shared with the service provider, thereby setting the user information sharing range.
  • The user information and the user information sharing range are held and managed by the server according to the present disclosure.
  • The server can infer variables indicative of the user's health status, for example, in response to a request from the service provider. At this time, the server infers the variables using only the information whose sharing is permitted by the user information sharing range.
  • The server may present to the user a recommended sharing range, that is, a sharing range whose adoption as the user information sharing range is recommended.
  • The recommended sharing range may be a combination of one or more pieces of information, among the pieces of information included in the user information held in the server, that are recommended to be shared.
  • The recommended sharing range may be specified manually by the service provider, or may be specified based on explanatory variables found to be preferable as a result of identification using algorithms such as disease and mood determination.
  • The server may maintain a table indicating, as the user information sharing range, which information included in the user information the user shares with which service provider.
  • The server may refer to the table to visualize and present to the user, for each service provider, the information to be shared within the user information sharing range.
  • The server may present a plurality of recommended sharing ranges, with reasons, for the user information sharing range specified by the user.
  • The reason given for a recommended sharing range may be, for example, the benefit that the user can obtain by applying that recommended sharing range.
  • The user may select a recommended sharing range considered appropriate from the plurality of presented recommended sharing ranges, and change the user information sharing range specified by the user based on the selected recommended sharing range.
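The per-service-provider table and permission check described above can be sketched as follows. This is an illustrative sketch only: the field names, provider names, and the `shared_view` helper are hypothetical and are not part of the disclosure.

```python
# User information held by the server: field name -> value (hypothetical fields).
user_info = {
    "user_id": "U001",
    "age": 42,
    "heart_rate": 72,
    "sleep_hours": 6.5,
    "step_count": 8200,
}

# Sharing-range table: service provider -> set of fields the user has
# permitted that provider to access (one row per provider, as in the text).
sharing_range = {
    "SP-A": {"heart_rate", "step_count"},
    "SP-B": {"sleep_hours"},
}

def shared_view(provider: str) -> dict:
    """Return only the fields the user permits this provider to see."""
    allowed = sharing_range.get(provider, set())
    return {k: v for k, v in user_info.items() if k in allowed}
```

Visualizing the sharing status per provider, as described above, then amounts to rendering `shared_view` for each key of `sharing_range`.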
  • Because the technology according to the present disclosure has the above-described configuration, it is possible to easily set the sharing range of user information, including biometric information, shared with a third party.
  • (1-3. System applicable to the present disclosure) Next, for ease of understanding, a system applicable to the present disclosure will be described using FIGS. 1 to 6.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an information processing system applicable to the present disclosure.
  • An information processing system 100 to which the present disclosure is applicable includes an information acquisition device 1, an information processing device 2, and one or more terminals.
  • A medical worker terminal 3, a community member terminal 4, a life insurance/health insurance terminal 5, a service provider terminal 6, and an analysis terminal 7 are shown as the terminals.
  • The information processing device 2 functions as the server according to the present disclosure.
  • A user of the information processing system 100 is referred to as a user U.
  • The service provider terminal 6 is a business terminal used by a service provider that provides products and services to the user U.
  • Each device and terminal included in the information processing system 100 is configured to be able to communicate via a network N such as the Internet.
  • The information processing device 2 is shown as being included in a cloud network CL connected to the network N.
  • The cloud network CL is a network that includes a plurality of computers and storage devices communicably connected to each other via a network, and that can provide computer resources in the form of services.
  • The information processing device 2 is not limited to a configuration on the cloud network CL; it may be configured as a single piece of hardware connected to the network N, or as multiple computers communicably connected to one another with functions distributed among them.
  • User U may be a patient with a disease.
  • Examples of diseases include neurological diseases, muscular degenerative diseases, cardiovascular diseases, psychiatric diseases, diabetes, immune/allergic diseases, and diseases specific to the elderly; more specifically, Parkinson's disease, muscular dystrophy, cerebellar degenerative diseases, amyotrophic lateral sclerosis (ALS), arrhythmia, heart failure, hypertension, depression, dementia, sleep disorders, asthma, hay fever, sarcopenia/frailty, and the like.
  • User U is not limited to a patient with a disease.
  • The user U may be a healthy person without any disease or the like.
  • The information acquisition device 1 includes a sensor 11 and a user terminal 12.
  • The sensor 11 is provided for the user U.
  • The sensing data may be data at a specific point in time or may be time-series data.
  • The sensor 11 may be a wearable sensor attached to the body of the user U, or may be an image sensor (for example, a monitoring camera) that captures an image of the user U, or a sound collection sensor (for example, the sound collection section of a smart speaker) that detects the sound of the user U, as described later.
  • In FIG. 1, the body parts of the user U to which the sensor 11 can be attached are illustrated as the wrist Ua, forearm Ub, upper arm Uc, neck Ud, waist Ue, thigh Uf, chest Ug, finger Uh, face Ui (eyes, mouth, etc.), and ear Uj.
  • FIG. 2 is a schematic diagram showing an example of the sensor 11 and sensing data applicable to the present disclosure.
  • Examples of the sensor 11 include an acceleration sensor, an angular velocity sensor, a position sensor, an image sensor, a sound collection sensor, and a vital sensor.
  • Various known sensors may be used as these sensors.
  • The acceleration sensor detects acceleration. Sensing data of the acceleration sensor is referred to as acceleration data.
  • The angular velocity sensor is also called a gyro sensor or the like, and detects angular velocity. Sensing data of the angular velocity sensor is referred to as angular velocity data.
  • The position sensor detects a position. Sensing data of the position sensor is referred to as position data.
  • The image sensor is an imaging element or the like, and captures an image of a subject. Sensing data from the image sensor is referred to as image data. Note that "image" may be interpreted to include video, and "imaging" may be interpreted to include shooting.
  • The sound collection sensor is a microphone or the like, and detects sound. Sensing data from the sound collection sensor is referred to as sound data.
  • The vital sensor detects biological information.
  • Sensing data of the vital sensor is referred to as vital data.
  • The vital data may include heart rate, brain waves, electrocardiogram, blood flow, blood pressure, respiration, vital capacity, electromyography, body temperature, blood sugar level, weight, vibration, shock, perspiration, and the like.
  • The image sensor and the sound collection sensor may be used as non-contact vital sensors.
  • For example, blood pressure and the like can be detected from the image data of the image sensor.
  • A heartbeat and the like can be detected from the sound data of the sound collection sensor.
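As a toy illustration of deriving a vital sign from sound data, the sketch below counts amplitude peaks in a synthetic heartbeat-like waveform to estimate beats per minute. The waveform, sample rate, and `count_peaks` helper are all hypothetical and do not represent the detection method of the disclosure.

```python
import math

SAMPLE_RATE = 100   # samples per second (hypothetical)
DURATION = 10       # seconds of "recording"
BEAT_HZ = 1.2       # synthetic heart rate: 1.2 Hz = 72 beats per minute

# Synthetic "heart sound": a rectified sinusoid with one peak per beat.
signal = [max(0.0, math.sin(2 * math.pi * BEAT_HZ * t / SAMPLE_RATE))
          for t in range(SAMPLE_RATE * DURATION)]

def count_peaks(samples, threshold=0.99):
    """Count local maxima whose amplitude exceeds the threshold."""
    peaks = 0
    for i in range(1, len(samples) - 1):
        if (samples[i] > threshold
                and samples[i] >= samples[i - 1]
                and samples[i] > samples[i + 1]):
            peaks += 1
    return peaks

beats = count_peaks(signal)
bpm = beats * 60 / DURATION   # peaks per second scaled to beats per minute
```

Real heart-sound data would of course require filtering and more robust peak detection; the point is only that a rate can be read off sound data by counting periodic events.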
  • The result of operation of the user terminal 12 by the user U may be included in the sensing data.
  • Examples of operation results include operation duration, operation time (operation timing), input voice, input documents, and the like.
  • Medication/meal information regarding user U's medication and meals may also be included in the sensing data.
  • The medication/meal information may be input and acquired by operating the user terminal 12, for example, or may be detected by a sensor provided on a medicine box, tray, or the like.
  • User U's health checkup results and the like may also be included in the sensing data.
  • The user terminal 12 is a terminal used by the user U.
  • Examples of the user terminal 12 include a smartphone, a tablet terminal, a personal computer, a television receiver, and the like; the same may apply to the other terminals described later.
  • Sensing data from the sensor 11 is transmitted to the user terminal 12 and collected there. Note that a part of the sensor 11 may be incorporated into the user terminal 12.
  • The user terminal 12 transmits the collected sensing data to the information processing device 2.
  • The information processing device 2 receives the sensing data.
  • The information processing device 2 processes the received sensing data. Details of the processing will be described later. Information obtained by the processing in the information processing device 2 (corresponding to "result information" in FIG. 3, described later) is transmitted to other devices and terminals.
  • The medical worker terminal 3 is a terminal used by a medical worker C.
  • Examples of medical workers C include doctors, nurses, pharmacists, physical therapists, and caregivers.
  • The community member terminal 4 is a terminal used by a member M.
  • The member M is a member who belongs to the same community as the user U, and is, for example, a family member of the user U, another patient with the same disease as the user U, or the like.
  • The life insurance/health insurance terminal 5, the service provider terminal 6, and the analysis terminal 7 will be described later.
  • FIG. 3 is an example functional block diagram for explaining the functions of the information processing system 100 applicable to the present disclosure.
  • Sensing data from the sensor 11 is collected by the user terminal 12.
  • The user terminal 12 includes a communication unit 13, a user interface unit 14, a processing unit 15, and a storage unit 16.
  • An example of the information stored in the storage unit 16 is an application program 161 (application software).
  • An application for use by the user U is provided by the application program 161.
  • Examples of the applications include applications that support rehabilitation of the user U, applications that support treatment of diseases, and applications that record symptoms of diseases, eating habits, and the like. More specifically, these include rehabilitation applications used when performing rehabilitation, applications that provide cognitive behavioral therapy, and applications that record symptoms of heart and respiratory diseases.
  • The communication unit 13 communicates with other devices and the like. For example, the communication unit 13 receives sensing data from the sensor 11 and result information (described later) from the information processing device 2. Furthermore, the communication unit 13 transmits sensing data to the information processing device 2.
  • The user interface unit 14 accepts operations on the user terminal 12 by the user U and presents information to the user U.
  • The processing unit 15 functions as a control unit that controls each element of the user terminal 12, and also executes various processes.
  • The processing unit 15 executes the application program 161. This provides the various applications described above.
  • Taking a rehabilitation application as an example, a rehabilitation menu indicating the contents of rehabilitation and the like is presented (displayed, etc.) by the user interface unit 14. The user U performs rehabilitation according to the presented rehabilitation menu.
  • The information processing device 2 receives sensing data from the information acquisition device 1 and processes it.
  • The information processing device 2 includes a communication unit 21, an estimation unit 22, a storage unit 23, a recommendation unit 24, and a processing unit 25.
  • Examples of information stored in the storage unit 23 include patient information 231, community information 232, an algorithm DB 233, a learned model 234, recommendation information 235, and anonymously processed information 236.
  • The patient information 231 includes information regarding the user U who is a patient.
  • Examples of the patient information 231 include user U's disease information, diagnosis information, medical records, examination information, medication information, hospital visit history information, rehabilitation history information, and the like.
  • The disease information includes the disease name of the user U mentioned above. The diagnosis information, medical records, examination information, medication information, and hospital visit history information are information related to user U's diagnosis, medical treatment, examinations, medication, and hospital visit history, and are provided, for example, from outside the information processing device 2 (the medical worker terminal 3, etc.).
  • The rehabilitation history information is information regarding past rehabilitation performed by the user U; it is acquired, for example, in the above-mentioned rehabilitation application and transmitted from the user terminal 12 to the information processing device 2.
  • The community information 232 is information about the community to which the user U belongs, and includes information about the members M and the community member terminals 4.
  • The algorithm DB 233, the learned model 234, the recommendation information 235, and the anonymously processed information 236 will be described later.
  • The communication unit 21 communicates with other devices. For example, the communication unit 21 receives sensing data from the user terminal 12.
  • The estimation unit 22 executes estimation processing based on the sensing data. For example, the estimation unit 22 calculates various indicators that can be used for estimation based on the sensing data. Examples of the indicators include indicators related to user U's physical functions, behavior, diseases, emotions, and the like. Various algorithms may be used to calculate an indicator. An algorithm may be a trained model generated by machine learning using training data. An algorithm may be designed to output an indicator when sensing data is input, or to output an indicator when a feature amount obtained from the sensing data is input. The feature amount may be calculated by the estimation unit 22 based on the sensing data. Calculation of a feature amount may be understood to include generation, extraction, and the like of the feature amount.
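The two-stage flow just described (sensing data → feature amount → indicator) can be sketched as follows. The feature definitions and the stand-in "trained model" are hypothetical illustrations, not the disclosed algorithms.

```python
def extract_features(heart_rate_series):
    """Feature amounts from time-series vital data: here, mean and range."""
    mean_hr = sum(heart_rate_series) / len(heart_rate_series)
    hr_range = max(heart_rate_series) - min(heart_rate_series)
    return [mean_hr, hr_range]

def trained_model(features):
    """Stand-in for a learned model mapping features to a 0-100 indicator."""
    mean_hr, hr_range = features
    score = 100 - abs(mean_hr - 65) - hr_range * 0.5
    return max(0.0, min(100.0, score))

# Sensing data -> feature amounts -> indicator.
series = [62, 66, 70, 64, 68]          # hypothetical heart-rate samples
index = trained_model(extract_features(series))
```

In the disclosed system the second stage would be an actual model produced by machine learning; the sketch only shows where each stage sits in the pipeline.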
  • The communication unit 21 transmits the result information to other devices and terminals, in this example, the user terminal 12, the medical worker terminal 3, the community member terminal 4, and the life insurance/health insurance terminal 5. Note that the recommendation unit 24 and the processing unit 25 of the information processing device 2 will be described later.
  • The result information is presented by the user interface unit 14.
  • The user U can thus know various indicators regarding his or her physical functions, behavior, diseases, and the like.
  • The result information transmitted from the information processing device 2 to the user terminal 12 may include content to be presented to the user U. This makes it possible to present individualized content based on the estimation results.
  • The medical worker terminal 3 includes a communication unit 31, a user interface unit 32, and a storage unit 33.
  • Medical information 331 is exemplified as the information stored in the storage unit 33.
  • The medical information 331 includes, for example, medical record information of the user U, and is used by the medical worker C to diagnose the user U.
  • The communication unit 31 communicates with other devices and the like. For example, the communication unit 31 receives result information from the information processing device 2.
  • The user interface unit 32 accepts operations on the medical worker terminal 3 by the medical worker C, presents information to the medical worker C, and the like. For example, the user interface unit 32 presents result information from the information processing device 2, and the medical information 331 is updated accordingly, such as by adding medical record information. Intervention by the medical worker C is also possible. For example, individual content such as a rehabilitation menu customized by the medical worker C to suit the user U is generated. The content is transmitted to the user terminal 12 by the communication unit 31 and presented to the user U.
  • The community member terminal 4 includes a communication unit 41 and a user interface unit 42.
  • The communication unit 41 communicates with other devices and the like. For example, the communication unit 41 receives result information from the information processing device 2.
  • The user interface unit 42 accepts operations on the community member terminal 4 by the member M, and presents information to the member M. For example, result information from the information processing device 2 is presented and shared with the member M as well.
  • The result information transmitted from the information processing device 2 to the community member terminal 4 may include content to be presented to the member M. This makes it possible to present individualized content based on the estimation results.
  • The life insurance/health insurance terminal 5 is a terminal used by life insurance companies, health insurance companies, and the like.
  • The life insurance/health insurance terminal 5 includes a communication unit 51, an analysis unit 52, and a storage unit 53.
  • Customer/employee information 531 is exemplified as the information stored in the storage unit 53.
  • The customer/employee information 531 includes information regarding user U's life insurance, health insurance, and the like.
  • The communication unit 51 communicates with the information processing device 2 and receives, for example, result information from the information processing device 2.
  • The analysis unit 52 analyzes the result information and specifies (calculates, etc.) insurance premiums and rewards. The specification may involve the work or judgment of employees of life insurance companies or health insurance companies. For example, insurance premiums may be reduced, or a change to a limited plan may be made.
  • The communication unit 51 transmits premium/reward information indicating and recommending the specified insurance premium or reward to the user terminal 12. The premium/reward information is presented by the user interface unit 14 of the user terminal 12.
  • The service provider terminal 6 (shown as SP terminal 6 in the figure) is a terminal used by the service provider.
  • Examples of products for which information is transmitted from the service provider terminal 6 to the information processing device 2 include wheelchairs, walking aids, rehabilitation equipment, health foods, and health appliances.
  • An example of a service for which information is transmitted from the service provider terminal 6 to the information processing device 2 is a health application or the like that can be executed on the user terminal 12.
  • The service provider terminal 6 includes a communication unit 61 and a user interface unit 62.
  • The service provider terminal 6 receives or generates, for example via the user interface unit 62, product/service information that associates result information with products and services. For example, conditions indicating the association between the indicators shown in the result information and a product or service are input to the service provider terminal 6, and information including these conditions is generated as the product/service information.
  • The communication unit 61 transmits the product/service information to the information processing device 2.
  • The communication unit 21 of the information processing device 2 receives the product/service information from the service provider terminal 6.
  • The recommendation unit 24 and the processing unit 25 of the information processing device 2 will now be explained.
  • The recommendation unit 24 generates recommendation information 235, including information on products and services that should be recommended to the user U, based on the product/service information from the service provider terminal 6. For example, the recommendation unit 24 determines the products and services corresponding to the result information based on the conditions input at the service provider terminal 6 described above, and generates recommendation information 235 that recommends them.
  • The communication unit 21 transmits the recommendation information 235 to the user terminal 12 and the community member terminal 4.
  • The recommendation information 235 is presented by the user interface unit 14 of the user terminal 12 or by the user interface unit 42 of the community member terminal 4.
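The condition-based matching performed by the recommendation unit can be sketched as below; the product names, the indicator thresholds, and the `recommend` helper are hypothetical stand-ins for the conditions a service provider would actually input.

```python
# Product/service information: each entry pairs a product with a condition
# on the estimated indicator (here a 0-100 score), as entered at the SP terminal.
products = [
    {"name": "walking aid",        "condition": lambda idx: idx < 40},
    {"name": "rehabilitation app", "condition": lambda idx: 40 <= idx < 70},
    {"name": "health food",        "condition": lambda idx: idx >= 70},
]

def recommend(index: float) -> list:
    """Return the names of products whose condition matches the indicator."""
    return [p["name"] for p in products if p["condition"](index)]
```

The recommendation information 235 would then be built from the matched names together with the associated product details.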
  • The processing unit 25 generates anonymously processed information 236 by anonymizing the result information.
  • The anonymously processed information 236 describes anonymized personal information and result information in association with each other.
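One common way to produce such anonymized records, sketched below, is to replace the direct identifier with a salted pseudonym and coarsen quasi-identifiers such as age; the field names and the `anonymize` helper are hypothetical and do not describe the processing unit 25's actual method.

```python
import hashlib

def anonymize(record: dict, salt: str = "site-salt") -> dict:
    """Replace the user ID with a pseudonym and coarsen the age to a band."""
    pseudo = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()[:12]
    return {
        "pseudonym": pseudo,                      # stable but not reversible
        "age_band": (record["age"] // 10) * 10,   # e.g. 42 -> 40
        "result_index": record["result_index"],   # result info kept intact
    }

anon = anonymize({"user_id": "U001", "age": 42, "result_index": 88})
```

Keeping the pseudonym stable across records lets the analysis terminal follow one (unidentified) user over time, while the raw identifier never leaves the server.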
  • The communication unit 21 of the information processing device 2 transmits the anonymously processed information 236 to the analysis terminal 7.
  • The analysis terminal 7 is a terminal used by, for example, companies that provide the above-mentioned products and services, pharmaceutical companies that conduct clinical development, and the like.
  • The analysis terminal 7 includes a communication unit 71, an analysis unit 72, and a user interface unit 73.
  • The communication unit 71 receives the anonymously processed information 236 from the information processing device 2.
  • The analysis unit 72 performs data analysis based on the anonymously processed information 236.
  • The analysis may involve work, judgment, and the like by employees of the company.
  • The user interface unit 73 presents information related to the data analysis and the like. Examples of the analysis include analysis of user groups for products such as health foods and health appliances, and data analysis for clinical development.
  • The anonymously processed information 236 can be utilized for various services, such as marketing analysis by manufacturers, analysis of the proportion of users with specific symptoms by age, gender, and the like, and monitoring of symptoms for patients taking specific drugs.
  • functions and information related to various services at the life insurance/health insurance terminal 5, the service provider terminal 6, and the analysis terminal 7, such as the recommendation section 24, the processing section 25, the recommendation information 235, and the anonymously processed information 236, are transferred to the information processing device 2.
  • the life insurance/health insurance terminal 5, the service provider terminal 6, and the analysis terminal 7 communicate with corresponding server devices and utilize their functions. According to this configuration, it is possible to reduce the processing load on the information processing device 2 and reduce costs by simplifying functions.
  • Some of the functions of a terminal may be provided in a server device or the like managed by the user of the corresponding terminal.
  • the function of the analysis unit 52 of the life insurance/health insurance terminal 5 may be provided in a server device or the like managed by an insurance company, health insurance company, or the like.
  • the life insurance/health insurance terminal 5 communicates with the server device and utilizes its functions.
  • the function of the analysis unit 72 of the analysis terminal 7 may be provided in a server device or the like managed by a pharmaceutical company or the like.
  • the analysis terminal 7 communicates with the server device and uses its functions. This makes it possible to reduce the processing burden on the life insurance/health insurance terminal 5 and the analysis terminal 7, and to reduce costs by simplifying their functions.
  • the estimation unit 22 calculates an index based on the evaluation result of the feature amount calculated from the sensing data. In that case, the estimation unit 22 calculates the feature amount from the sensing data using a feature amount calculation algorithm. Then, the estimation unit 22 calculates an index by evaluating the calculated feature amount. Information such as a feature value calculation algorithm suitable for calculating a specific index is registered in the algorithm DB 233 stored in the storage unit 23.
  • FIG. 4 is a diagram showing an example of the algorithm DB 233 in which information such as feature value calculation algorithms applicable to the present disclosure is registered.
  • the algorithm DB 233 describes indicators, input data, feature amount calculation algorithms, evaluation items, and evaluation conditions in association with each other. In FIG. 4, a specific example is shown in the first line of the algorithm DB 233.
  • An example of an indicator is a rehabilitation indicator.
  • the input data indicates the data on which the estimation is based, and in this example is image data.
  • the feature amount calculation algorithm is an algorithm that calculates feature amounts necessary for calculating an index, and in this example, it is a posture estimation algorithm that calculates a feature amount related to posture based on image data. For example, various feature amounts related to posture are calculated based on feature points in the image.
  • the evaluation item indicates a feature amount that is an evaluation item among the feature amounts calculated by the feature amount calculation algorithm.
  • the feature amounts of the evaluation items are schematically shown as feature amount A and feature amount B.
  • the evaluation condition indicates the condition for evaluating the evaluation item, and includes a weighting coefficient and a threshold value in this example.
  • the weighting coefficient is used to weight each evaluation item (feature amount A, feature amount B, etc.).
  • the threshold value is set for the score (evaluation value) calculated from each weighted evaluation item. For example, when the score is greater than or equal to the threshold, an index indicating that a rehabilitation effect has been obtained, that is, a rehabilitation effect is present, is calculated as a rehabilitation index. If the score is less than the threshold, an index indicating that no rehabilitation effect was obtained, that is, no rehabilitation effect, is calculated as a rehabilitation index.
  • the algorithm DB 233 describes feature value calculation algorithms and the like in association with each rehabilitation indicator. Furthermore, there may be various indicators other than rehabilitation indicators, and the algorithm DB 233 describes feature value calculation algorithms and the like in association with each of these indicators.
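The evaluation described above, in which each evaluation item is weighted by a coefficient and the resulting score is compared against a threshold, could be sketched as follows. This is a minimal illustration only; the feature names, weighting coefficients, and threshold are assumptions, not values from the actual algorithm DB 233.

```python
# Hedged sketch of the evaluation described above: each evaluation item
# (feature amount A, feature amount B, ...) is weighted by its coefficient,
# the weighted values are summed into a score (evaluation value), and the
# score is compared against a threshold to decide the rehabilitation index.
# All names, weights, and the threshold are illustrative assumptions.

def calculate_rehabilitation_index(features, weights, threshold):
    # Weight each evaluation item and sum into a score.
    score = sum(weights[name] * value for name, value in features.items())
    # Score at or above the threshold -> rehabilitation effect present.
    if score >= threshold:
        return "rehabilitation effect present"
    return "no rehabilitation effect"

features = {"feature_A": 0.8, "feature_B": 0.6}  # hypothetical feature amounts
weights = {"feature_A": 0.5, "feature_B": 0.5}   # hypothetical weighting coefficients
print(calculate_rehabilitation_index(features, weights, threshold=0.5))
# -> rehabilitation effect present
```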
  • FIG. 5 is a schematic diagram showing an example of computable feature amounts applicable to the present disclosure.
  • Features that can be calculated include, in addition to the posture-related feature amounts mentioned above, feature amounts related to walking, feature amounts related to wheelchairs, feature amounts related to the upper limbs, feature amounts related to rehabilitation, feature amounts related to vital stability, feature amounts related to speech, feature amounts related to facial expressions, and feature amounts related to sleep.
  • An example of a feature amount related to posture is the posture of the user U's body, more specifically, for example, the inclination of the neck Ud.
  • This feature amount may be calculated from image data as described above, or may be calculated from acceleration data, angular velocity data, etc.
  • Examples of feature amounts related to walking include the number of steps, walking distance, walking range, etc.
  • Examples of features related to wheelchairs include riding distance, riding range, and the like. These feature amounts are calculated from, for example, acceleration data, angular velocity data, position data, image data, and the like.
  • feature amounts related to the upper limb are feature amounts related to raising and lowering the upper limb, and more specifically, the upper limb raising and lowering time, the number of times the upper limb is raised and lowered, the height of the upper limb raising and lowering, and the like. This feature amount is calculated from, for example, acceleration data, angular velocity data, image data, etc.
  • Examples of features related to rehabilitation include rehabilitation implementation rate, rehabilitation quality, etc.
  • the rehabilitation implementation rate is, for example, the ratio (progress) of the rehabilitation menu actually performed to the target rehabilitation menu.
  • the quality of rehabilitation is a score etc. calculated based on other information, such as information such as raising and lowering of the upper limb.
  • This feature amount is calculated, for example, from other feature amounts such as posture-related features and upper limb-related feature amounts, rehabilitation history information (an example of the patient information 231 described above), and the like.
  • An example of a feature amount related to vital stability is the stability of user U's heartbeat, etc. This feature amount is calculated from vital data, image data, etc., for example.
  • An example of a feature amount related to utterance is utterance time, etc. This feature amount is calculated from image data, sound data, etc., for example.
  • feature amounts related to facial expressions include whether user U's facial expression is bright or dark. This feature amount is calculated from, for example, image data.
  • An example of a feature amount related to sleep is sleep time, etc. This feature amount is calculated from image data, vital data, etc., for example.
  • Various feature quantities other than those mentioned above can also be calculated.
  • keystrokes, the number of social contacts, etc. are calculated from the operation results of the user terminal 12.
  • Abnormal behavior, number of outings, outing time, etc. are calculated from the location data.
  • Speech prosody, linguistic content, etc. are calculated from the sound data, more specifically from the user U's speech data.
  • Sociability, language performance, hand movements, etc. are calculated from input documents.
  • Facial features and the like are calculated from the image data.
  • Information on line of sight, characteristic data during the game, etc. are calculated from image data, more specifically video data.
  • the estimation unit 22 may calculate an appropriate index by referring to the algorithm DB 233 as described above and using the feature value calculation algorithm registered therein.
  • the learned model 234 may be used as the algorithm.
  • the learned model 234 may be a model (such as a program) that has been machine learned using training data so as to output data corresponding to an index when data corresponding to the above-described input data is input.
  • the estimation unit 22 calculates the index by inputting input data to the trained model 234 and acquiring output data from the trained model 234.
  • the trained model 234 may be a plurality of trained models 234, each of which calculates a different index.
  • the estimation unit 22 uses an appropriate trained model 234 to calculate the index.
  • Information such as the trained model 234 used for calculation may be registered in the algorithm DB 233 stored in the storage unit 23.
  • FIG. 6 is a diagram showing an example of the algorithm DB 233 in which information such as the learned model 234 is registered, which is applicable to the present disclosure.
  • the algorithm DB 233 describes indicators, input data, and learned models 234 in association with each other.
  • In FIG. 6, a specific example is shown in the first line of the algorithm DB 233.
  • a trained model 234 for calculating a rehabilitation index from image data is schematically shown as a trained model 234A.
  • When the learned model 234A receives data corresponding to image data, it outputs data corresponding to a rehabilitation index.
  • the estimation unit 22 (see FIG. 3) refers to the algorithm DB 233 as described above and calculates an appropriate index by using the learned model 234 registered there.
  • the learned model 234 may be the feature amount calculation algorithm described above with reference to FIG. 4. In this case, the learned model 234 outputs data corresponding to the feature amount when data corresponding to the input data is input.
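The lookup described above, in which the estimation unit 22 refers to a table associating an indicator with its input data and trained model, might be sketched as follows. The table entries and the stand-in "model" are assumptions for illustration only, not the actual trained model 234A.

```python
# Illustrative sketch of selecting an entry from a table like the algorithm
# DB 233 of FIG. 6, which associates an indicator with its input data and a
# trained model. The table contents and the stand-in "model" are assumptions.

algorithm_db = {
    "rehabilitation index": {
        "input_data": "image data",
        # Stand-in for trained model 234A: image data in, index out.
        "model": lambda image_data: "rehabilitation effect present",
    },
}

def estimate_index(indicator, data):
    # Look up the entry for the requested indicator and run its model.
    entry = algorithm_db[indicator]
    return entry["model"](data)

print(estimate_index("rehabilitation index", b"<image bytes>"))
```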
  • the estimation unit 22 calculates various indicators regarding the user U and outputs result information.
  • FIG. 7 is a block diagram showing the hardware configuration of an example of the information processing device 2 applicable to the present disclosure. Note that the description here assumes that the information processing device 2 is constituted by, for example, a single computer.
  • the information processing device 2 includes a CPU (Central Processing Unit) 2000, a ROM (Read Only Memory) 2001, a RAM (Random Access Memory) 2002, a storage device 2003, a data I/F (interface) 2004, and a communication I/F (interface) 2005, which are communicably connected to each other via a bus 2010. Note that although the storage device 2003 is shown as being built into the information processing device 2 in the figure, the storage device 2003 may be configured to be externally connected to the information processing device 2.
  • the storage device 2003 is a nonvolatile storage medium such as a hard disk drive or flash memory.
  • the CPU 2000 operates according to programs stored in the ROM 2001 and the storage device 2003, using the RAM 2002 as a work memory, and controls the overall operation of the information processing device 2.
  • the data I/F 2004 is an interface for transmitting and receiving data with an external device.
  • Communication I/F 2005 controls communication via network N.
  • FIG. 8 is a block diagram showing the hardware configuration of an example of the user terminal 12 applicable to the present disclosure.
  • the user terminal 12 includes a CPU 1200, a ROM 1201, a RAM 1202, a display control unit 1203, a storage device 1204, an input device 1205, a data I/F 1206, a communication I/F 1207, and a sensor I/F 1208, which are communicably connected to each other via a bus 1210.
  • the storage device 1204 is a nonvolatile storage medium such as flash memory.
  • a hard disk drive may be used as the storage device 1204.
  • the CPU 1200 operates according to programs stored in the ROM 1201 and the storage device 1204, using the RAM 1202 as a work memory, and controls the overall operation of the user terminal 12.
  • the display control unit 1203 generates a display signal that can be displayed by the display device 1220 based on the display control information generated by the CPU 1200.
  • the display device 1220 includes a display element such as an LCD (Liquid Crystal Display), and a drive circuit for driving the display element.
  • the display device 1220 causes a display element to display a screen according to the display signal output from the display control unit 1203.
  • In the following, causing the display device 1220 to display a predetermined screen based on the display control information generated by the CPU 1200 in the user terminal 12 will be described as, for example, the user terminal 12 displaying a predetermined screen.
  • the input device 1205 accepts user operations and outputs control signals according to the user operations.
  • a touch pad that outputs a control signal according to the operating position can be used as the input device 1205.
  • the input device 1205 is not limited to this, and may include an operator such as a mechanical button, a keyboard, a pointing device such as a mouse, etc.
  • a touch panel may be configured by forming the display device 1220 and the input device 1205 integrally, so that the display by the display device 1220 is visible through the input device 1205.
  • the data I/F 1206 is an interface for transmitting and receiving data with an external device.
  • the data I/F 1206 may be an interface for wired communication such as USB (Universal Serial Bus), or may be an interface for wireless communication such as Bluetooth (registered trademark).
  • the communication I/F 1207 controls communication to the network N by, for example, wireless communication.
  • Sensor I/F 1208 is an interface for each sensor included in sensor 11.
  • the service provider terminal shown in FIG. 1 can have a configuration similar to that of a general computer including a CPU, memory, storage device, communication I/F, input device, and display device, so the description here is omitted.
  • FIG. 9 is an example functional block diagram for explaining the functions of the information processing system according to the embodiment. Note that FIG. 9 focuses on parts that are closely related to the embodiment, and descriptions of parts of the information processing system 100 described using FIGS. 1 and 3 that are less relevant to the embodiment are omitted.
  • an information processing system 100a includes an information processing device 2, a service provider terminal 6, and a user terminal 12.
  • the information processing device 2 includes an intervention notification unit 200, a recommended sharing range notification unit 201, an information holding unit 202, a recommended sharing range calculation unit 203, an access control unit 204, an SP sharing range holding unit 205, and an inference target value holding unit 206.
  • Each of these units, from the intervention notification unit 200 through the inference target value holding unit 206, is realized by running the information processing program for the server according to the embodiment on the CPU 2000.
  • the present invention is not limited to this, and some or all of these units may be realized by hardware circuits that operate in cooperation with each other.
  • the CPU 2000 configures each of the above-mentioned units as modules, for example, on the main storage area of the RAM 2002 by executing the information processing program for the server according to the embodiment.
  • the information processing program can be acquired from the outside via, for example, the network N, and installed on the information processing apparatus 2 through communication via the communication I/F 2005.
  • the information processing program is not limited to this, and the information processing program may be provided from the outside via the data I/F 2004. Further, the information processing program may be provided while being stored in a removable storage medium such as a CD (Compact Disk), a DVD (Digital Versatile Disk), or a USB (Universal Serial Bus) memory.
  • the user terminal 12 includes an intervention presentation section 120, a sharing range presentation section 121, a biometric information acquisition section 122, a personal information acquisition section 123, and a sharing range setting section 124.
  • Each of the intervention presentation unit 120, sharing range presentation unit 121, biological information acquisition unit 122, personal information acquisition unit 123, and sharing range setting unit 124 is realized by running the user terminal program according to the embodiment on the CPU 1200. The present invention is not limited to this, and some or all of these units may be realized by hardware circuits that operate in cooperation with each other.
  • the CPU 1200 executes the user terminal program according to the embodiment, thereby configuring the intervention presentation section 120, the sharing range presentation section 121, the biometric information acquisition section 122, the personal information acquisition section 123, and the sharing range setting section 124 described above, each as a module on the main storage area of the RAM 1202.
  • the program for the user terminal can be obtained from the outside via, for example, the network N, and installed on the user terminal 12 by communication via the communication I/F 1207.
  • the present invention is not limited to this, and the user terminal program may be provided from outside via the data I/F 1206.
  • the program for the user terminal may be stored and provided in a removable storage medium such as a CD, DVD, or USB memory.
  • the service provider terminal 6 includes a recommended sharing range setting section 600, an intervention module 601, and an information storage section 602.
  • the configuration of the service provider terminal 6 is not limited to this example.
  • the service provider terminal 6 may include the above-described recommended sharing range calculation unit 203 and recommended sharing range notification unit 201 in its configuration.
  • the service provider terminal 6 may include the above-described intervention notification unit 200 and recommended sharing range notification unit 201 in its configuration.
  • the recommended sharing range setting unit 600, the intervention module 601, and the information storage unit 602 are realized by executing the service provider terminal program according to the embodiment on the CPU of the service provider terminal 6.
  • the present invention is not limited to this, and some or all of these units may be realized by hardware circuits that operate in cooperation with each other.
  • the CPU executes the service provider terminal program according to the embodiment, so that the above-mentioned recommended sharing range setting section 600, intervention module 601, and information storage section 602 are each configured, for example, as modules on the main storage area of the memory of the service provider terminal 6.
  • the program for the service provider terminal can be acquired from the outside via the network N, for example, and installed on the service provider terminal 6 through communication via the communication I/F of the service provider terminal 6. Not limited to this, the program for the service provider terminal may be provided from outside via the data I/F of the service provider terminal 6. Furthermore, the service provider terminal program may be provided while being stored in a removable storage medium such as a CD, DVD, or USB memory.
  • the biological information acquisition unit 122 acquires the user's biological information.
  • the biological information acquisition unit 122 may acquire the user's biological information based on each piece of information acquired by the sensor 11.
  • the biological information acquisition unit 122 may acquire the user's biological information from a wearable device worn by the user.
  • the biological information acquisition unit 122 acquires each piece of information included in the biological information as a digital biomarker (hereinafter referred to as a biomarker).
  • the biomarker may be, for example, a digital value such as the user's daily step count, the user's heart rate, heart rate variability value, sleep time, time of each phase of sleep time (such as REM sleep), skin temperature, and the like. Values applicable as biomarkers are not limited to these.
  • the personal information acquisition unit 123 acquires the attribute information and identification information of the user, which are included in the user information of the user who uses the information processing system 100a according to the embodiment via the user terminal 12, for example.
  • a user who uses the user terminal 12 to use the information processing system 100a according to the embodiment will be simply referred to as a "user.”
  • the user inputs attribute information and identification information through a user operation on the user terminal 12.
  • the personal information acquisition unit 123 acquires these input attribute information and identification information.
  • the attribute information may include information indicating attributes of the user, such as age, gender, height, weight, and place of residence.
  • the identification information may be the user's name, or information unique to the user, issued to the user by the information processing system 100a, for example.
  • the personal information acquisition unit 123 may further acquire questionnaire indicators regarding the user and include them in the user information of the user.
  • the questionnaire index may be, for example, the depression index PHQ-9 (Patient Health Questionnaire-9), the well-being index UWES (Utrecht Work Engagement Scale), or SWLS (Satisfaction With Life Scale).
  • the questionnaire index is input into the user terminal 12 by a user operation, for example, and is acquired by the personal information acquisition unit 123.
  • the user terminal 12 transmits each piece of information acquired by the biometric information acquisition unit 122 and the personal information acquisition unit 123 to the information processing device 2 as user information.
  • the information processing apparatus 2 uses the information holding unit 202 to store and hold the user information transmitted from the user terminal 12 in, for example, a storage device 2003 that the information processing apparatus 2 has.
  • FIG. 10 is a schematic diagram showing an example of user information held by the information holding unit 202 according to the embodiment.
  • the user information includes the item "ID" and, associated with the value of the item "ID", the items "age", "gender", "biomarker #1", "biomarker #2", ..., "biomarker #N", and "PHQ-9".
  • the item “ID” stores a user ID, which is identification information for identifying a user.
  • the user ID stored in the item “ID” may be set for each user by the information processing device 2, for example.
  • the items “age” and “gender” are the age and gender of the user of the item “ID” and are attribute information of the user.
  • The items "Biomarker #1", "Biomarker #2", ..., "Biomarker #N" each store biometric information of the user of the item "ID".
  • N biomarkers are associated with each item “ID,” but this is not limited to this example.
  • each item “ID” may be associated with a different number of biomarkers.
  • The item "PHQ-9" indicates the value of the depression index PHQ-9, one of the questionnaire indicators of the user of the item "ID".
  • PHQ-9 is indicated by the total of the responses to nine questionnaire items, each scored from 0 to 3.
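As a sketch under stated assumptions, a user-information record like the one in FIG. 10 could be represented as follows, with the PHQ-9 value computed as the total of nine responses each scored 0 to 3. The ID, attributes, and biomarker values are illustrative, not actual user data.

```python
# Sketch of a user-information record like FIG. 10. The PHQ-9 value is the
# total of nine questionnaire responses, each scored 0-3, as described above.
# The ID, attributes, and biomarker values are illustrative assumptions.

def phq9_total(responses):
    # Nine items, each answered on a 0-3 scale; the index is their total.
    assert len(responses) == 9 and all(0 <= r <= 3 for r in responses)
    return sum(responses)

user_info = {
    "ID": "001",
    "age": 34,
    "gender": "F",
    "biomarker_1": 7.5,  # e.g. sleep time in hours (hypothetical)
    "PHQ-9": phq9_total([1, 0, 2, 1, 0, 1, 2, 0, 1]),
}
print(user_info["PHQ-9"])  # -> 8
```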
  • the sharing range setting unit 124 sets, for example in accordance with a user operation, the information to be shared with the service provider from among the pieces of information included in the user information held in the information holding unit 202.
  • the sharing range setting unit 124 acquires each item of user information acquired by the biometric information acquisition unit 122 and the personal information acquisition unit 123 and transmitted to the information processing device 2.
  • the sharing range setting unit 124 causes the user terminal 12 to display a screen for prompting the user to specify, among the items of the acquired user information, items to be allowed to be shared with the service provider. If the user is using services provided by multiple service providers, the sharing range setting unit 124 causes the user terminal 12 to display a screen for prompting the user to specify items to be allowed to be shared for each of the multiple service providers.
  • When the sharing range setting unit 124 completes the setting of information to be shared with the service provider, the user terminal 12 transmits to the information processing device 2 information in which a flag indicating whether sharing is permitted is set for each item of user information (service provider sharing range information).
  • the sharing range information for service providers corresponds to the above-mentioned user information sharing range that is set by specifying information that is permitted to be shared with the service provider.
  • the information processing device 2 stores and holds the service provider sharing range information transmitted from the user terminal 12 in the service provider sharing range holding unit 205.
  • the service provider sharing range holding unit 205 is shown as the SP sharing range holding unit 205.
  • Hereinafter, "service provider" may be described as "SP".
  • FIG. 11 is a schematic diagram showing an example of service provider sharing range information stored in the service provider sharing range storage unit 205 according to the embodiment.
  • section (a) shows an example of service provider sharing range information set for each user for service provider #1 (shown as SP #1 in the figure), and section (b) shows an example of service provider sharing range information set for each user for service provider #2, different from service provider #1. In this way, service provider sharing range information is held for each service provider.
  • Sections (a) and (b) have the same items, so an example of section (a) will be described.
  • the service provider sharing range information includes items corresponding to the user information held in the information holding unit 202 shown in FIG.
  • a flag "OK” is attached to items that are allowed to be shared with the service provider
  • a flag "NG” is attached to items that are not allowed to be shared.
  • the flag “abstract” is attached to the item “age” of the user whose item “ID” is the value "002".
  • the flag “abstract” indicates that sharing is permitted by abstracting the value of the item. In other words, these flags "OK”, “NG”, and “abstract” can be said to indicate conditions related to sharing of the user information.
  • A value may be abstracted by expressing it as a general concept without specifically indicating the value.
  • For example, in the case of the item "age", it is conceivable to indicate a specific age by a representative value of its range, such as "teens" or "twenties".
  • Similarly, when the content of the item "Biomarker #1" is "sleep time", a specific sleep time can be indicated by a representative value of its range, such as "5 hours or more and less than 6 hours" or "6 hours or more and less than 7 hours".
  • In this way, abstraction may be performed by representing a numerical value as a representative value of its range.
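The abstraction described above, replacing a concrete value with a representative value of its range, might look like the following sketch. The range cut-offs and label formats are assumptions for illustration, not the method actually determined for each item.

```python
# Hedged sketch of the "abstract" processing: a concrete value is replaced
# by a representative value of its range. The range cut-offs and label
# formats are assumptions for illustration.

def abstract_age(age):
    # e.g. 23 -> "20s" (i.e. "twenties"), 17 -> "10s" (i.e. "teens")
    decade = (age // 10) * 10
    return f"{decade}s"

def abstract_sleep_time(hours):
    # e.g. 6.4 hours -> "6 hours or more and less than 7 hours"
    lower = int(hours)
    return f"{lower} hours or more and less than {lower + 1} hours"

print(abstract_age(23))          # -> 20s
print(abstract_sleep_time(6.4))  # -> 6 hours or more and less than 7 hours
```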
  • the access control unit 204 extracts, from the user information held in the information holding unit 202, the information that the user has permitted to be shared with the service provider, based on the service provider sharing range information held in the service provider sharing range holding unit 205. At this time, the access control unit 204 processes the value of any item to which the flag "abstract" is attached in the service provider sharing range information, abstracting it as described above. Note that the abstraction method may be determined in advance for each item, for example.
  • the access control unit 204 associates each piece of information, including abstracted information, acquired from the information holding unit 202 with the user ID of the corresponding user, and transmits the information to the service provider terminal 6.
  • the service provider terminal 6 stores each piece of information transmitted from the access control unit 204 in the information storage unit 602 in association with a user ID that identifies the user.
  • FIG. 12 is a flowchart of an example of processing by the access control unit 204 according to the embodiment.
  • step S100 the information processing device 2 obtains a request from the service provider terminal 6 for information that the user has authorized to share with the service provider.
  • This request may include a user ID that identifies the target user.
  • the information processing device 2 passes the obtained request to the access control unit 204.
  • Next, in response to the request obtained in step S100, the access control unit 204 acquires, from the service provider sharing range holding unit 205, the service provider sharing range information set for the user ID of the target user.
  • step S102 the access control unit 204 obtains the flag of each item included in the acquired service provider sharing range information, and determines whether the obtained flag is "OK", "NG", or "abstract".
  • step S102 the access control unit 204 executes the process of step S103 for the item with the flag "OK” (step S102, "OK").
  • step S103 the access control unit 204 transmits the data stored in the relevant item out of the user information related to the user ID of the target user held in the information holding unit 202 to the service provider terminal 6 as is.
  • step S102 the access control unit 204 executes the process in step S104 for the items flagged with "NG” (step S102, "NG").
  • step S104 the access control unit 204 does not acquire the data stored in the item in the user information related to the user ID of the target user held in the information holding unit 202.
  • step S102 the access control unit 204 executes the process of step S105 for the item to which the flag "abstract" is attached (step S102, "abstract”).
  • step S105 the access control unit 204 processes and abstracts the data stored in the item, from among the user information related to the user ID of the target user held in the information holding unit 202, according to the abstraction method determined for the item. The access control unit 204 then transmits the abstracted data of the item to the service provider terminal 6.
  • the service provider terminal 6 stores each data transmitted from the access control unit 204 in steps S103 and S105 in the information storage unit 602 in association with the user ID and each item.
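The flag-based processing of FIG. 12 (send as is for "OK", withhold for "NG", abstract then send for "abstract") could be sketched minimally as follows. The user data, flags, and abstraction function are illustrative assumptions, not actual system data.

```python
# Minimal sketch of the access-control flow of FIG. 12: for each item of the
# user information, the flag in the service provider sharing range information
# decides whether the data is sent as is ("OK"), withheld ("NG"), or
# abstracted ("abstract"). All data and the abstraction function are
# illustrative assumptions.

def filter_user_info(user_info, sharing_range, abstractors):
    shared = {}
    for item, flag in sharing_range.items():
        if flag == "OK":            # corresponds to step S103: send as is
            shared[item] = user_info[item]
        elif flag == "NG":          # corresponds to step S104: do not acquire
            continue
        elif flag == "abstract":    # corresponds to step S105: abstract, then send
            shared[item] = abstractors[item](user_info[item])
    return shared

user_info = {"age": 23, "gender": "F", "biomarker_1": 6.4}
sharing_range = {"age": "abstract", "gender": "NG", "biomarker_1": "OK"}
abstractors = {"age": lambda a: f"{(a // 10) * 10}s"}
print(filter_user_info(user_info, sharing_range, abstractors))
# -> {'age': '20s', 'biomarker_1': 6.4}
```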
  • the intervention module 601 transmits an intervention message to the user terminal 12 of the user with the user ID related to the information, based on the information stored in the information storage unit 602.
  • FIG. 13 is a flowchart of an example of processing by the intervention module 601 according to the embodiment.
  • step S200 the intervention module 601 acquires data for each user based on the user ID from the data stored in the information storage unit 602.
  • the intervention module 601 acquires data such as biomarkers for each user, for example.
  • step S201 the intervention module 601 determines an operation to be performed by the intervention module 601 based on the data obtained in step S200.
  • step S202 the intervention module 601 notifies the target user of the action determined in step S201. More specifically, the intervention module 601 transmits information indicating the action determined in step S201 to the information processing device 2.
  • the information processing device 2 passes the information transmitted from the intervention module 601 to the intervention notification unit 200.
  • the intervention notification unit 200 transmits the passed information to the user terminal 12 of the target user.
  • the user terminal 12 passes the information transmitted from the intervention notification section 200 to the intervention presentation section 120.
  • the intervention presentation unit 120 presents the intervention operation by the intervention module 601 to the user using, for example, a screen display.
  • the processing by the intervention module 601 will be explained using a specific example.
• A service provider may intervene with a target user using behavioral activation in cognitive behavioral therapy, encouraging the target user to take a predetermined behavior according to the number of steps the target user takes in a day.
  • the intervention module 601 acquires the target user's data from the information storage unit 602 (step S200).
  • the acquired data includes the number of steps per day as a biomarker.
  • the intervention module 601 determines an intervention action for the target user based on the number of steps per day included in the data (step S201). For example, the intervention module 601 performs a threshold determination on the number of steps per day included in the acquired data of the target user, and determines the action for the target user.
• In this case, the intervention module 601 may set a behavioral goal of "walking at least 10,000 steps per day," create or obtain a message urging the target user to take the predetermined action, such as "Let's take action!", and notify the target user of the message (step S202).
• The intervention module 601 may hold messages for notifying the target user in advance for each range of daily step counts, or may generate them each time using artificial intelligence or the like.
  • the intervention module 601 may perform the intervention on the target user using such a rule-based intervention method.
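The rule-based threshold determination described above can be sketched as follows. The 10,000-step goal follows the example in the text, while the function name and message wording are illustrative assumptions.

```python
# Minimal rule-based intervention sketch (steps S200-S202): threshold the
# daily step count and pick a canned message. Messages are assumed examples.

DAILY_STEP_GOAL = 10_000

def choose_intervention(steps_per_day: int) -> str:
    # Threshold determination on the daily step count (step S201).
    if steps_per_day >= DAILY_STEP_GOAL:
        return "Great job! You reached your goal of 10,000 steps."
    remaining = DAILY_STEP_GOAL - steps_per_day
    return f"Let's take action! {remaining} more steps to reach 10,000 today."

print(choose_intervention(6500))
```

A deployed system could instead generate such messages with a chatbot or other artificial intelligence, as noted below.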
  • the operation by the intervention module 601 is not limited to behavioral activation in cognitive behavioral therapy.
  • the intervention module 601 may use a chatbot that automatically executes a conversation (chat) via a network using artificial intelligence to have a simulated conversation with the target user, for example.
  • the intervention module 601 may also speak to the target user in an appropriate manner that can be inferred from the current biomarker value included in the target user's data.
  • the recommended sharing range calculation unit 203 calculates data for inferring the variables set by the recommended sharing range setting unit 600 of the service provider terminal 6 from the data of each item of user information.
  • the recommended sharing range calculation unit 203 passes the calculated data to the recommended sharing range notification unit 201.
  • the recommended sharing range notification unit 201 transmits the data passed from the recommended sharing range calculation unit 203 to the user terminal 12 as expected sharing range information indicating the expected sharing range that the target user is expected to share.
  • the recommended sharing range calculation unit 203 functions as a sharing range acquisition unit that obtains the expected sharing range that the service provider expects to share with respect to user information.
• The recommended sharing range calculation unit 203 and the recommended sharing range notification unit 201 function as a processing unit that transmits the recommended sharing range, in which sharing is recommended and which is calculated based on the expected sharing range, to the user terminal corresponding to the user information.
• The user terminal 12 passes the expected sharing range information transmitted from the recommended sharing range notification unit 201 to the sharing range presentation unit 121. Based on the passed expected sharing range information, the sharing range presentation unit 121 causes the user terminal 12 to display a screen showing the items that are expected to be shared among the items included in the user information (biometric information and personal information), and presents it to the user.
  • FIG. 14 is a flowchart of an example of a method of presenting the expected sharing range according to the embodiment.
  • the flowchart in FIG. 14 includes processing of the service provider terminal 6 and processing of the information processing device 2.
  • step S300 the service provider terminal 6 uses the recommended sharing range setting unit 600 to determine, for example, a variable to be inferred based on the user information associated with the user ID of the target user, in accordance with instructions from the service provider.
  • step S301 the recommended sharing range setting unit 600 transmits information indicating the determined inference target variable to the information processing device 2.
• In the information processing device 2, the recommended sharing range calculation unit 203 calculates the inference target variable based on the user information of the target user held in the information holding unit 202, according to the information indicating the inference target variable transmitted from the recommended sharing range setting unit 600.
  • the recommended sharing range calculation unit 203 performs supervised learning for each combination of each item included in the user information of each user, using each item as an explanatory variable.
  • the recommended sharing range calculation unit 203 calculates the inference accuracy of the variables for each combination using the learning model based on this learning.
  • the recommended sharing range calculation unit 203 passes each calculated inference accuracy to the recommended sharing range notification unit 201.
• The recommended sharing range notification unit 201 sends, to users who permit sharing of item combinations with low inference accuracy, a notification prompting them to change the sharing range, based on the combinations that each user currently permits to be shared.
• The recommended sharing range notification unit 201 extracts, from the inference accuracies calculated by the recommended sharing range calculation unit 203 for each combination of items in step S302, the inference accuracy of the combination of items that each user currently permits to be shared. Based on the extracted inference accuracies, the recommended sharing range notification unit 201 identifies users whose permitted combination yields an inference accuracy lower than the highest inference accuracy calculated in step S302. The recommended sharing range notification unit 201 then sends the identified users a notification prompting them to change the sharing range to the expected sharing range.
• As a specific example, assume that the service provider has determined an index based on the PHQ-9, which is a depression index, as the variable to be inferred (step S300). Furthermore, assume that, based on the calculation results by the recommended sharing range calculation unit 203, the PHQ-9 index is inferred with the highest accuracy when using the combination of sleep time and number of steps per day among the various pieces of information included in the user information (step S302).
• In this case, the recommended sharing range notification unit 201 notifies users who share item combinations with inference accuracy lower than that of this combination with a message urging them to change the sharing range, along with the reason.
• For example, the recommended sharing range notification unit 201 sends a message urging the user to change the sharing range, such as "To improve the accuracy of PHQ-9, it is better to share your sleep time and daily step count," along with the reason, to the user terminal 12 of each target user.
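The combination search and user notification described above can be sketched as follows. This is a simplified illustration: the accuracy table stands in for models trained by supervised learning on each item combination (e.g. predicting PHQ-9), and all names and values are hypothetical.

```python
# Sketch: enumerate item combinations, look up the inference accuracy of
# each, and flag users whose currently shared combination scores below the
# best available combination. Accuracies here are stand-in values; a real
# system would fit a supervised model per combination and measure accuracy.
from itertools import combinations

ITEMS = ["sleep_time", "steps_per_day", "age", "rem_sleep_time"]

ACCURACY = {
    frozenset(["sleep_time"]): 0.55,
    frozenset(["steps_per_day"]): 0.58,
    frozenset(["sleep_time", "steps_per_day"]): 0.90,
    frozenset(["sleep_time", "steps_per_day", "age"]): 0.85,
}

def accuracy_of(combo):
    # Combinations with no trained model fall back to chance level.
    return ACCURACY.get(frozenset(combo), 0.5)

def users_to_notify(shared_by_user):
    """Users whose currently shared combination scores below the best one."""
    best = max(accuracy_of(c)
               for r in range(1, len(ITEMS) + 1)
               for c in combinations(ITEMS, r))
    return [user for user, items in shared_by_user.items()
            if accuracy_of(items) < best]

shared = {"alice": ["sleep_time"], "bob": ["sleep_time", "steps_per_day"]}
print(users_to_notify(shared))   # ['alice']
```

Each user returned here would receive the sharing-range-change message along with the reason, as described above.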
  • FIG. 15 is a schematic diagram showing an example of an initial screen displayed on the user terminal 12 by the user terminal program according to the embodiment.
  • the user terminal 12 causes the display device 1220 to display an initial screen 80.
  • the initial screen 80 includes a settings button 800, an intervention message display area 801, and a biomarker value display area 802. Note that each part and layout included in the initial screen 80 are merely examples, and the present invention is not limited to this example.
  • the settings button 800 is a button for performing various settings related to the user terminal program. By operating the settings button 800, a sharing range setting screen for setting the sharing range of user information, which will be described later, can be displayed.
  • the intervention message display area 801 displays an intervention message output by the intervention module 601 of the service provider terminal 6 and transmitted to the user terminal 12 via the information processing device 2.
• In the biomarker value display area 802, for example, a value based on the biological information acquired by the sensor 11 is displayed. In the illustrated example, "number of steps (number of steps per day)" and "sleep state (for example, sleep time)" are displayed in the biomarker value display area 802.
  • FIG. 16 is a schematic diagram showing an example of a sharing range setting screen according to the embodiment.
  • the sharing range setting screen 82 shown in FIG. 16 is displayed by, for example, the sharing range setting unit 124 on the user terminal 12 in response to an operation on the setting button 800.
  • the sharing range setting screen 82 is shown for setting the sharing range of user information for service provider #1.
  • the sharing range setting unit 124 may display a screen for specifying which service provider's sharing range is to be displayed on the sharing range setting screen 82, for example, in response to an operation on the setting button 800.
  • the sharing range setting screen 82 is provided with flag setting units 820a to 820d that set flags indicating whether sharing is possible or abstracted for each item included in the user information.
• The flag setting units 820a, 820b, 820c, and 820d set flags for the items "age" and "gender" based on the personal information included in the user information, and for the items "step count" and "sleep time" based on the biometric information (biomarkers), each flag indicating whether the corresponding item can be shared or is abstracted.
  • the display in FIG. 16 is an example, and the sharing range setting screen 82 may include more flag setting sections. Furthermore, if all flag setting sections cannot be displayed within the display range of the shared range setting screen 82, the flag setting sections that cannot be displayed may be displayed by switching or scrolling the screen.
  • Each of the flag setting sections 820a to 820d includes a switch section 821 having a knob 822, respectively.
• In the switch unit 821, when the knob 822 is moved to the left by a user operation, sharing of the corresponding item is set to disallowed, and when the knob 822 is shifted to the right, sharing of the corresponding item is set to allowed.
  • each of the flag setting units 820a to 820d may be provided with an operator for permitting sharing when abstracted, as will be described later.
• In each of the flag setting sections 820a to 820d, the portion other than the switch section 821 (referred to as the background section) can also be operated by the user. If sharing of the item corresponding to the flag setting section whose background section has been operated is set to disallowed, and sharing of that item is encouraged by the service provider, the sharing range setting unit 124 displays the message prompting the change of the sharing range, notified from the recommended sharing range notification unit 201, along with the reason, at a position corresponding to the flag setting section.
  • the information processing device 2 may hold, in the inference target value holding unit 206, a value that is the target of inference at the service provider terminal 6 (referred to as an inference target value).
• The information processing device 2 acquires the inference target value from the recommended sharing range calculation unit 203 and stores and holds it in the inference target value holding unit 206.
  • FIG. 17 is a schematic diagram showing an example of an inference target value table in which the inference target value holding unit 206 according to the embodiment holds inference target values.
• In the inference target value table, flags indicating whether or not each inference target value is to be inferred are stored for each service provider indicated by SP#1, SP#2, ..., SP#K.
• As inference target values, subjective evaluation values such as "PHQ-9" and "GAD7 (Generalized Anxiety Disorder-7)" may be applied, or values that estimate the degree of improvement, such as "increase in sleep amount," may be applied.
• In the illustrated example, for the service provider SP#1, the inference target value table stores a flag "Yes" indicating that "PHQ-9" and "increase in sleep amount" are set as inference target values, and a flag "No" indicating that "GAD7" is not an inference target value.
• For another service provider, a flag "No" is stored for "PHQ-9" and "GAD7", and a flag "Yes" is stored for "increase in sleep amount".
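The inference target value table of FIG. 17 can be sketched as a simple per-provider mapping. SP#1's flags follow the example in the text; the second provider's flags and all names are illustrative assumptions.

```python
# Illustrative sketch of the inference target value table (FIG. 17):
# per service provider, a Yes/No flag for each candidate inference target.
INFERENCE_TARGETS = {
    "SP#1": {"PHQ-9": True, "GAD7": False, "increase in sleep amount": True},
    "SP#2": {"PHQ-9": False, "GAD7": False, "increase in sleep amount": True},
}

def targets_for(provider: str) -> list[str]:
    """Names of the values the given service provider wants inferred."""
    return [name for name, flagged in INFERENCE_TARGETS[provider].items() if flagged]

print(targets_for("SP#1"))   # ['PHQ-9', 'increase in sleep amount']
```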
  • the information processing device 2 may transmit the inference target value table stored in the inference target value holding unit 206 to the user terminal 12.
• The user terminal 12 may present the inference target value table to the user using, for example, a screen configured as described with reference to FIG. 16.
  • each of the flag setting units 820a to 820d in FIG. 16 is used as a setting unit that sets a flag indicating whether or not to perform inference using each inference target value.
• For example, the flag setting section 820a sets whether or not to perform inference based on "PHQ-9", the flag setting section 820b based on "GAD7", and the flag setting section 820c based on "increase in sleep amount", and so on.
• The user terminal 12 can visualize and present to the user which items the user permits each service provider to share, and whether each item is allowed, disallowed, or allowed only when abstracted.
  • FIG. 18 is a schematic diagram showing an example of a sharing status visualization screen that visualizes the sharing status of users with service providers for each service provider, according to the embodiment.
  • the user terminal 12 acquires service provider sharing range information held by the service provider sharing range holding unit 205 from the information processing device 2 .
  • the sharing range setting unit 124 displays a sharing status visualization screen 83 shown in FIG. 18 based on the acquired service provider sharing range information.
• In FIG. 18, each column indicates a service provider #1, #2, ..., #N (shown as SP#1, SP#2, ..., SP#N in the figure), and each row indicates an item included in the user information. Note that in each item, the biomarker is indicated as "BM".
  • an area 830 displays the sharing status for the items shown in the row direction for the service providers shown in the column direction.
  • a display 831 shows the status of whether the item can be shared.
• The display 831 indicates "OK" for allowed, "NG" for disallowed, and "ABST" for allowed when abstracted.
• The area 830 includes a switch section 832 having a knob 833, and a check box 834. Similar to the sharing range setting screen 82 shown in FIG. 16, in the switch section 832, moving the knob 833 to the left sets sharing of the corresponding item to disallowed, and shifting the knob 833 to the right sets sharing of the corresponding item to allowed.
• When the check box 834 is unchecked by a user operation while the knob 833 is shifted to the right, the item is set to permit sharing only when abstracted. It is preferable that the check box 834 be operable only while the knob 833 is shifted to the right, for example, to prevent erroneous operation.
  • This check box 834 may be applied to the sharing range setting screen 82 shown in FIG. 16.
• In the embodiment described above, the recommended sharing range calculation unit 203 calculates the inference accuracy for all combinations of items. In the first modification of the embodiment, by contrast, inference based on each item is not performed; instead, the service provider prepares a plurality of sets of recommended sharing ranges along with the reasons for those sets.
  • FIG. 19 is an example flowchart showing processing according to the first modification of the embodiment.
  • the recommended sharing range setting unit 600 in the service provider terminal 6 sets one or more combinations of items included in the user information that are recommended for sharing. Examples of combinations set by the recommended sharing range setting unit 600 include user information items such as "number of steps per day,” “age,” and "REM sleep time.”
• The recommended sharing range setting unit 600 sets, for the combination set in step S400, a reason for recommending the sharing of that combination. A possible reason is, for example, "We would like the information to be shared for research use."
  • the recommended sharing range setting unit 600 transmits each set combination to the information processing device 2.
  • the information processing device 2 passes each combination transmitted from the recommended sharing range setting unit 600 to the recommended sharing range notification unit 201 via the recommended sharing range calculation unit 203.
  • the recommended sharing range notification unit 201 transmits each combination passed from the recommended sharing range calculation unit 203 to each user terminal 12.
  • Each user terminal 12 displays each combination transmitted from the recommended sharing range notification unit 201 using, for example, the sharing range presentation unit 121.
  • the user can specify a desired combination from one or more combinations displayed on the user terminal 12.
  • the user terminal 12 transmits the specified combination to the information processing device 2 from the sharing range setting unit 124, for example.
• The information processing device 2 updates the information held in the service provider sharing range holding unit 205, for example, based on the combination transmitted from the user terminal 12.
  • the service provider sets a plurality of combinations of items included in user information that are recommended for sharing.
  • the service provider specifies one variable (for example, PHQ-9) to be inferred.
  • One inference target variable is inferred using each of the plurality of set combinations recommended for sharing, and inference accuracy is calculated for each. The inference accuracy of each combination is presented to the user, and the user is prompted to change the sharing range.
  • FIG. 20 is a flowchart of an example of processing according to the second modification of the embodiment.
  • the recommended sharing range setting unit 600 in the service provider terminal 6 sets a plurality of recommended sharing combinations for items included in the user information.
  • the recommended sharing range setting unit 600 specifies one variable to be inferred.
  • the recommended sharing range setting unit 600 transmits the plurality of combinations recommended for sharing set in step S600 and the inference target variable specified in step S601 to the information processing device 2.
  • the information processing device 2 passes the plurality of combinations and one inference target variable sent from the recommended sharing range setting unit 600 to the recommended sharing range calculation unit 203.
• In step S412, the recommended sharing range calculation unit 203 infers the inference target variable specified by the recommended sharing range setting unit 600, based on the combination of items that the target user shares with the service provider, that is, the current sharing range, and calculates the inference accuracy. In addition, the recommended sharing range calculation unit 203 infers the inference target variable based on each of the combinations recommended for sharing set by the recommended sharing range setting unit 600, and calculates the inference accuracy for each. The recommended sharing range calculation unit 203 then compares the inference accuracy calculated based on the combination that the target user currently shares with each of the inference accuracies calculated based on the plurality of combinations recommended for sharing.
• The recommended sharing range notification unit 201 transmits, to the user terminal 12, the inference accuracy calculated based on the combination of items that the target user currently shares and each of the inference accuracies calculated based on the plurality of combinations recommended for sharing, which were compared in step S412, and presents them to the user.
  • the recommended sharing range notification unit 201 sends a message urging the user to change the combination of currently shared items, that is, the sharing range, along with the reason, to the user terminal 12, and notifies the user.
  • the recommended sharing range setting unit 600 specifies, for example, three combinations of items that are recommended for sharing: set A, set B, and set C, as illustrated below (step S410).
• Set A: "Number of steps per day", "Sleep time"
• Set B: "Number of steps per day", "Sleep time", "REM sleep time"
• Set C: "Age", "Number of steps per day", "Sleep time", "REM sleep time"
  • the recommended sharing range setting unit 600 specifies, for example, "PHQ-9" as a variable to be inferred (step S411).
  • the recommended sharing range calculation unit 203 infers variables to be inferred for each of the A set, B set, and C set, and the sharing range that the user currently allows sharing, and calculates the inference accuracy for each.
  • each inference accuracy is calculated as shown in (1) to (4) below. Note that in the following, inference accuracy is shown as a value of AUC (Area Under the Curve).
• (1) AUC of the sharing range that the user currently permits to be shared: 0.5
• (2) AUC of set A: 0.6
• (3) AUC of set B: 0.9
• (4) AUC of set C: 0.6
  • the recommended sharing range notification unit 201 transmits each of the values (1) to (4) described above to the user terminal 12.
  • the user terminal 12 presents each of these values (1) to (4) using the sharing range presentation unit 121, and displays a message prompting the user to change the sharing range.
• In this example, the AUC of set B in (3) is the highest value, and the AUC of the sharing range that the user currently permits in (1) is lower than each of the AUCs of sets A to C.
  • the sharing range presentation unit 121 displays a message prompting the user to change the sharing range that is currently permitted to be shared to a combination of set B, along with the reason.
  • the user terminal 12 may directly set the combination of B sets that are prompted to be changed as the sharing range.
• The user terminal 12 is not limited to this; in response to the above-mentioned message, the user terminal 12 may add to the current sharing range only those items included in the combination of set B that are not included in the current sharing range, and change them to shareable. For example, consider the three items "sleep time", "number of steps per day", and "REM sleep time", and assume that the user permits sharing of only "sleep time" among them. Meanwhile, set B also includes "number of steps per day" and "REM sleep time". In this case, since the user already permits sharing of "sleep time", only "number of steps per day" and "REM sleep time" are added and permitted to be shared.
• It is preferable for the user terminal 12 to present the sets A to C in order of how close the recommended sharing range indicated by each set is to the current sharing range, as this facilitates the user's selection of a set.
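The comparison of AUC values and selection of a recommended set in the second modification can be sketched as follows. The AUC values mirror the (1) to (4) example above, while the item names, function names, and set contents are illustrative.

```python
# Sketch of the second modification: compare the AUC of the user's current
# sharing range with candidate sets A-C, recommend the best set, and compute
# only the items that need to be newly shared. Values are assumed examples.

CANDIDATES = {
    "A": {"items": {"steps_per_day", "sleep_time"}, "auc": 0.6},
    "B": {"items": {"steps_per_day", "sleep_time", "rem_sleep_time"}, "auc": 0.9},
    "C": {"items": {"age", "steps_per_day", "sleep_time", "rem_sleep_time"}, "auc": 0.6},
}

def recommend(current_items, current_auc):
    """Return (best set name, items to add), or (None, empty) if no gain."""
    best_name, best = max(CANDIDATES.items(), key=lambda kv: kv[1]["auc"])
    if best["auc"] <= current_auc:
        return None, set()
    # Only items not already in the current sharing range need to be added.
    return best_name, best["items"] - set(current_items)

name, to_add = recommend({"sleep_time"}, 0.5)
print(name, sorted(to_add))   # B ['rem_sleep_time', 'steps_per_day']
```

This mirrors the behavior described above, where a user already sharing "sleep time" would only need to additionally permit "number of steps per day" and "REM sleep time".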
  • the information processing system 100a includes the following functions (A) to (F).
• (A) The information processing system 100a holds user information including the user's personal information and biometric information in the information processing device 2, which is a server on the network N or the cloud network CL.
• (B) The information processing system 100a defines service providers that use the information processing system 100a, and the information processing device 2 holds service provider sharing range information that manages, on a per-service-provider basis, the information that the user shares with each service provider out of the pieces of information included in the user information.
• (C) The information processing system 100a has a function of notifying the user of the information that the service provider recommends the user to share, as the expected sharing range that the user is expected to share.
• (D) The information processing system 100a has a recommended sharing range calculation function that calculates a recommended sharing range based on the user information. The recommended sharing range calculation function also has a function of presenting the reason for the calculated recommended sharing range (the benefits obtained by the user).
• (E) The information processing system 100a has a function of visualizing, on the user terminal 12, which information of the user information the user shares with which service provider.
• (F) When the user determines, from a plurality of recommended sharing ranges, a sharing range indicating the information to be shared with a service provider among the user information, the information processing system 100a has a function of presenting the recommended sharing ranges, along with their reasons, in order of closeness to the current sharing range.
• By using the functions (A), (B), and (E) described above, the information processing system 100a can clearly present to the user which information of his or her own user information is shared with which service provider.
  • the information processing system 100a can present to the user the benefits obtained by sharing information with a service provider by using the functions (A) to (D) described above.
  • the information processing system 100a presents the recommended sharing range and the reason for sharing to the user using the functions (A) to (D) and (F) described above. Therefore, the information processing system 100a can encourage the user to share information included in the user information with the service provider, thereby providing better services to the user. In addition, service providers can conduct more accurate analysis by having users share information.
• As described above, according to the information processing system 100a of the embodiment, it is possible to easily set the sharing range of user information including biometric information.
  • the present technology can also have the following configuration.
• (1) An information processing system including: an information processing device including a sharing range acquisition unit that acquires an expected sharing range that a business operator expects to share with respect to user information, and a processing unit that transmits a recommended sharing range in which sharing is recommended, calculated based on the expected sharing range, to a user terminal corresponding to the user information; and the user terminal including a display control unit that displays the recommended sharing range on a display unit.
• (2) The information processing system according to (1) above, in which the information processing device further includes a user information storage unit that stores the user information, and the processing unit acquires, in response to a request from the business operator, the user information from the user information storage unit based on a user information sharing range in which the information permitted to be shared among the user information is specified by the user corresponding to the user information.
• (3) The information processing system according to (2) above, in which the information processing device further includes a sharing range storage unit that stores the user information sharing range set by the user for each business operator, in association with the business operator.
• (4) The information processing system according to (2) or (3) above, in which the processing unit, when a condition is added to the user information sharing range, processes the user information acquired based on the user information sharing range and transmits it to the business operator terminal.
• (5) The information processing system according to (4) above, in which the condition includes abstraction of the user information.
• (6) The information processing system according to any one of (2) to (5) above, in which the processing unit calculates the inference accuracy when inferring an inference target variable, based on the inference target variable obtained from the business operator terminal and the user information corresponding to the user information sharing range.
• (7) The information processing system according to (6) above, in which the processing unit calculates the inference accuracy for each of a plurality of expected sharing ranges based on combinations of different variables obtained from the business operator terminal.
• (8) The information processing system according to (6) or (7) above, in which the processing unit, when the inference accuracy is less than or equal to a predetermined value, transmits a notification to the user terminal prompting a change in the user information sharing range.
• (9) The information processing system according to any one of (2) to (5) above, in which the processing unit transmits a plurality of expected sharing ranges based on combinations of different variables obtained from the business operator terminal to any of the user terminals.
• (10) The information processing system according to any one of (2) to (9) above, in which the display control unit sets information permitting sharing of the user information in the user information sharing range according to a user operation on the user terminal.
• (11) The information processing system according to (10) above, in which the display control unit displays the user information sharing range set according to the user operation.
• (12) The information processing system according to any one of (2) to (11) above, in which the display control unit displays, for each business operator, which information is permitted to be shared among the pieces of information included in the user information.
• (13) The information processing system according to any one of (2) to (12) above, in which the display control unit sets, for each business operator, which information included in the user information is permitted to be shared, according to a user operation.
• (14) The information processing system according to any one of (10) to (13) above, in which the display control unit adds a condition to the permission to share the information in response to a user operation.
• (15) The information processing system according to any one of (2) to (14) above, in which the display control unit updates the user information sharing range based on the recommended sharing range.
• (16) The information processing system according to any one of (1) to (15) above, in which the display control unit selects, according to a user operation, one recommended sharing range from among the plurality of recommended sharing ranges of one business operator transmitted from the information processing device.
• (17) The information processing system according to any one of (1) to (16) above, in which the user information includes identification information for identifying the user corresponding to the user information, and biometric information and attribute information of the user.
  • the user terminal acquires the biometric information from at least one of a wearable terminal capable of communicating with the user terminal and the user terminal itself; The information processing system according to (17) above.
  • An information processing method having: a sharing range acquisition step in which the information processing device acquires, from a business operator terminal corresponding to the business operator, an expected sharing range in which the business operator expects the user information to be shared; a processing step in which the information processing device transmits, to a user terminal corresponding to the user information, a recommended sharing range in which sharing is recommended, calculated based on the expected sharing range; and a display control step in which the user terminal displays the recommended sharing range on a display unit.
  • An information processing device comprising: a sharing range acquisition unit that acquires an expected sharing range in which a business operator expects user information to be shared; and a processing unit that transmits, to a user terminal corresponding to the user information, a recommended sharing range in which sharing is recommended, calculated based on the expected sharing range.
  • (21) A user terminal equipped with a display control unit that performs display on a display unit, wherein the display control unit displays the recommended sharing range in which sharing is recommended, calculated based on an expected sharing range in which a business operator expects the user information to be shared, the expected sharing range being acquired by the information processing device from a business operator terminal corresponding to the business operator.
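The flow in the claims above — a business operator's expected sharing range combined with the user's stored permissions to produce a recommended sharing range — can be illustrated with a minimal sketch. All names (`SharingRange`, `recommend_sharing_range`, the item keys) are hypothetical and do not appear in the patent; this is not the claimed calculation method, only one plausible reading of it:

```python
from dataclasses import dataclass, field


@dataclass
class SharingRange:
    # Maps an item of user information (e.g. "heart_rate", "age")
    # to whether sharing that item is permitted.
    items: dict[str, bool] = field(default_factory=dict)


def recommend_sharing_range(expected: SharingRange,
                            user_prefs: dict[str, bool]) -> SharingRange:
    """Compute a recommended sharing range: an item is recommended for
    sharing only if the business operator expects it to be shared AND
    the user's stored preferences do not forbid it (default: allow)."""
    recommended = {}
    for item, wanted in expected.items.items():
        recommended[item] = wanted and user_prefs.get(item, True)
    return SharingRange(recommended)


# The business operator expects heart-rate and sleep data to be shared.
expected = SharingRange({"heart_rate": True, "sleep_hours": True, "location": False})
# The user has previously opted out of sharing sleep data.
prefs = {"sleep_hours": False}
rec = recommend_sharing_range(expected, prefs)
print(rec.items)  # {'heart_rate': True, 'sleep_hours': False, 'location': False}
```

In the claimed system the device would transmit such a recommended range to the user terminal, whose display control unit shows it for the user to accept, adjust per item, or attach conditions to.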
  • Information processing device
  • 6 Service provider terminal
  • 12 User terminal
  • 80 Initial screen
  • 82 Sharing range setting screen
  • 100, 100a Information processing system
  • 120 Intervention presentation unit
  • 121 Sharing range presentation unit
  • 122 Biological information acquisition unit
  • 123 Personal information acquisition unit
  • 124 Sharing range setting unit
  • 200 Intervention notification unit
  • 201 Recommended sharing range notification unit
  • 202 Information holding unit
  • 203 Recommended sharing range calculation unit
  • 204 Access control unit
  • 205 Service provider sharing range holding unit
  • 206 Inference target value holding unit
  • 600 Recommended sharing range setting unit
  • 601 Intervention module
  • 602 Information storage unit
  • 800 Setting button
  • 801 Intervention message display area
  • 802 Biomarker value display area
  • 820a, 820b, 820c, 820d Flag setting section
  • 821, 832 Switch section
  • 822, 833 Knob
  • 834 Check box

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

An information processing system according to the present disclosure comprises: an information processing device including a sharing range acquisition unit that acquires an expected sharing range in which a business operator expects user information to be shared, and a processing unit that transmits, to a user terminal corresponding to the user information, a recommended sharing range in which sharing is recommended, calculated based on the expected sharing range; and the aforementioned user terminal, which includes a display control unit that displays the recommended sharing range.
PCT/JP2023/018045 2022-05-26 2023-05-15 Information processing system, information processing method, information processing device, and user terminal WO2023228789A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-085741 2022-05-26
JP2022085741 2022-05-26

Publications (1)

Publication Number Publication Date
WO2023228789A1 true WO2023228789A1 (fr) 2023-11-30

Family

ID=88919221

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/018045 WO2023228789A1 (fr) 2023-05-15 Information processing system, information processing method, information processing device, and user terminal

Country Status (1)

Country Link
WO (1) WO2023228789A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005284598A (ja) * 2004-03-29 2005-10-13 Sanyo Electric Co Ltd 個人情報提供システム、記録装置及びコンピュータプログラム
WO2012046670A1 (fr) * 2010-10-05 2012-04-12 日本電気株式会社 Système et procédé d'émission-réception d'informations personnelles, dispositif de fourniture d'informations personnelles, dispositif de gestion des préférences et programme informatique
JP2019159773A (ja) * 2018-03-13 2019-09-19 本田技研工業株式会社 データ配信制御装置、情報処理装置、及びデータ配信制御方法
JP2020046953A (ja) * 2018-09-19 2020-03-26 Kddi株式会社 プライバシ設定情報生成装置、プライバシ設定情報生成方法及びコンピュータプログラム

Similar Documents

Publication Publication Date Title
KR102116664B1 Online-based health management method and apparatus
US20190239791A1 (en) System and method to evaluate and predict mental condition
Bergmann et al. Body-worn sensor design: what do patients and clinicians want?
Ugajin et al. White-coat hypertension as a risk factor for the development of home hypertension: the Ohasama study
KR102477776B1 Method and apparatus for providing user-customized medical information
US20030050538A1 (en) System and method for medical observation system located away from a hospital
WO2020059794A1 Information processing method, information processing device, and program
US20220192556A1 (en) Predictive, diagnostic and therapeutic applications of wearables for mental health
US11120897B2 (en) System and method for tracking informal observations about a care recipient by caregivers
Sakamaki et al. Remote patient monitoring for neuropsychiatric disorders: a Scoping review of current trends and future perspectives from recent publications and upcoming clinical trials
Ghahari et al. Using cardiac implantable electronic device data to facilitate health decision making: a design study
KR101890513B1 Apparatus and method for diagnosing hand tremor
KR102144938B1 Method for calculating family history risk using personal health records
Pregnolato et al. Clinical Features to Predict the Use of a sEMG Wearable Device (REMO®) for Hand Motor Training of Stroke Patients: A Cross-Sectional Cohort Study
Elkefi Supporting patients’ workload through wearable devices and mobile health applications, a systematic literature review
WO2024024317A1 Computer program, information processing device, information processing method, and learning model generation method
JP7382741B2 Medical institution selection support device
WO2023228789A1 Information processing system, information processing method, information processing device, and user terminal
Dhamanti et al. Smart home healthcare for chronic disease management: A scoping review
Moreira et al. Design of a biomedical kit for bedridden patients: a conceptual approach
Daramola et al. Semantic integration of multiple health data for treatment decision-making in low-resource settings
WO2023007593A1 Information collection method, information collection device, and information sharing method for mobile terminal
Alshammari et al. Identification of stroke using deepnet machine learning algorithm
JP7393518B2 Training data collection device, training data collection method, and recording medium
WO2024014175A1 Depression risk determination system using fundus image, machine learning model generation device, depression risk determination device, and depression risk determination method

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23811663

Country of ref document: EP

Kind code of ref document: A1