WO2023157606A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2023157606A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information processing
processing apparatus
user terminal
personality
Prior art date
Application number
PCT/JP2023/002563
Other languages
English (en)
Japanese (ja)
Inventor
拓哉 杉谷
栄二郎 森
律子 金野
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2023157606A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • Dialogue agents have been developed that infer the user's emotions from the user's facial expressions, tone of voice, and the like, and engage in dialogue with the user.
  • Dialogue agents that can automatically interact with users can be expected to be utilized in, for example, therapeutic actions such as exercise therapy, diet therapy, and medical interviews; wellness actions such as fitness and dieting; and learning actions at schools, cram schools, and the like.
  • the present disclosure proposes an information processing device, an information processing method, and a program capable of suppressing a decrease in motivation.
  • An information processing apparatus includes a generation unit that generates a message indicating an action to be proposed to a user based on sensor data acquired by a sensor mounted on a user terminal owned by the user.
  • FIG. 1 is a block diagram showing a schematic configuration example of an information processing system according to an embodiment of the present disclosure
  • FIG. 2 is a diagram showing an example of a symptom/personality table according to an embodiment of the present disclosure
  • FIG. 3 is a diagram showing an example of a comment table according to an embodiment of the present disclosure
  • FIG. 4 is a diagram showing an example of a speaking time table according to an embodiment of the present disclosure
  • FIG. 5 is a flowchart showing a schematic operation example according to an embodiment of the present disclosure
  • FIG. 6 is a diagram showing an example of comments in response to feedback presented to a user via a user terminal according to an embodiment of the present disclosure
  • FIG. 7 is a diagram showing an example of comments corresponding to personality data presented to a user via a user terminal according to an embodiment of the present disclosure
  • FIG. 8 is a diagram showing an example of sensor data according to an embodiment of the present disclosure
  • FIG. 9 is a diagram showing an example of feedback obtained with a wearable device according to an embodiment of the present disclosure
  • FIG. 10 is a diagram showing an example of feedback obtained with a smartphone according to an embodiment of the present disclosure
  • FIG. 11 is a diagram showing an example of a visual effect visualized on a user terminal according to an embodiment of the present disclosure
  • FIG. 12 is a diagram showing an example of rewards given to a user according to an embodiment of the present disclosure
  • FIG. 13 is a diagram showing a basic treatment schedule of cognitive behavioral therapy
  • FIG. 14 is a diagram showing an example of user information stored in a symptom/personality table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 15 is a diagram showing an example of a comment regarding notification of an inquiry destination stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 16 is a diagram showing an example of comments regarding the speaking frequency stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 17 is a diagram showing an example of comments according to a user's answer to the question about the speaking frequency shown in FIG. 16, stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 18 is a diagram showing an example of comments about questions about lesson 1 stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 19 is a diagram showing an example of comments about questions about lessons and worksheets stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 20 is a diagram showing an example of comments about questions about characters stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 21 is a diagram showing an example of fixed comments for each lesson stored in a comment table when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 22 is a diagram showing an example of comments, stored in a comment table, presented to a user at random timing or according to progress or the like when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 23 is a diagram showing an example of comments, stored in a comment table, presented to a user in response to feedback from a user terminal when an embodiment of the present disclosure is applied to cognitive behavioral therapy
  • FIG. 24 is a block diagram showing an example of a hardware configuration according to an embodiment of the present disclosure
  • In the present embodiment, information such as the patient's activity and behavior is collected by a wearable device or the like worn by the patient, the treatment status and patient characteristics are analyzed from the collected information, and an avatar existing on a device or the like supports the patient and encourages treatment actions that match the patient's behavior and characteristics.
  • Specifically, the following items are executed: every time the system presents a treatment action to the patient, whether the patient actually performed the treatment is recorded from the data acquired by wearable sensors and from voice data; an agent that speaks in a manner suited to the patient's attributes is deployed on the system; based on the collected data and the patient's attributes, the proposed treatment actions and their frequency are dynamically switched; and logs are visualized.
  • FIG. 1 is a block diagram showing a schematic configuration example of an interactive medical support system (hereinafter simply referred to as a support system) as an information processing system according to this embodiment.
  • the support system 1 includes an analysis server 100, a database (DB) 200, a medical institution system 300, and a user terminal 400 connected via a predetermined network.
  • The predetermined network may be, for example, a wired or wireless LAN (Local Area Network) (including Wi-Fi), a WAN (Wide Area Network), the Internet, a mobile communication system (4G (4th Generation Mobile Communication System), 4G-LTE (Long Term Evolution), 5G, etc.), Bluetooth (registered trademark), infrared communication, or any other network capable of mutual communication.
  • The analysis server 100 is an example of an information processing apparatus according to the present disclosure, and includes a suggested action generation unit 110 and a speaking timing setting unit 120.
  • the analysis server 100 may be composed of one server, or may be composed of a plurality of servers.
  • the analysis server 100 may be composed of one or more cloud servers arranged on a network.
  • the database 200 holds, for example, a symptom/personality table 210, a comment table 220, an avatar table 230, and a speaking time table 240 as information for generating actions to be suggested to the user.
  • The medical institution system 300 is, for example, a system installed in a medical institution that treats users, and includes an information processing terminal 310 for creating actions and comments to instruct the user.
  • The user terminal 400 is an information processing device for acquiring various information such as the user's symptoms and behavior, proposing actions to the user, and communicating with the user. For example, smart glasses, tablet terminals, HMDs (Head Mounted Displays), AR (Augmented Reality) glasses, personal computers, and the like may be used. In this embodiment, a case where the user possesses a wearable device 410 and a smartphone 420 as the user terminal 400 is illustrated.
  • FIG. 2 is a diagram showing an example of the symptom/personality table according to the present embodiment.
  • The symptom/personality table 210 stores, for each user, a user ID and user name for uniquely identifying the user, information on the user's personality (hereinafter also referred to as personality data), and the sensor data acquired by the user terminal 400 and/or symptoms specified from the sensor data, each associated with information specifying the date and time at which it was acquired or specified (hereinafter also referred to as a time stamp).
  • personality data, sensor data and/or symptoms identified from sensor data may be collectively referred to as user information.
  • This user information may include, in addition to the personality data and the sensor data/symptoms, various information related to the user, such as diagnosis results by a doctor or the like input from the information processing terminal 310 of the medical institution system 300.
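As a minimal sketch, one row of the symptom/personality table 210 described above could be modeled as follows. The field and method names are illustrative assumptions, not the patent's actual schema; the point is that personality data and time-stamped sensor data/symptoms are kept together per user.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class UserRecord:
    """Hypothetical sketch of one row of the symptom/personality table 210."""
    user_id: str
    user_name: str
    personality: str                                   # personality data, e.g. "serious"
    observations: list = field(default_factory=list)   # (time stamp, kind, value) tuples

    def add_observation(self, kind: str, value, timestamp: datetime) -> None:
        # Sensor data and identified symptoms are stored with a time stamp.
        self.observations.append((timestamp, kind, value))

record = UserRecord(user_id="U001", user_name="Taro", personality="serious")
record.add_observation("heart_rate", 72, datetime(2023, 2, 1, 8, 0))
record.add_observation("symptom", "insomnia", datetime(2023, 2, 1, 22, 30))
print(len(record.observations))  # 2
```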
  • FIG. 3 is a diagram showing an example of a comment table according to this embodiment.
  • The comment table 220 holds comments presented to the user via the user terminal 400. For example, comments are classified according to their contents and attributes, and when the timing of presenting a comment to the user is predetermined, the presentation timing and the comment content are stored in association with each other.
  • the avatar table 230 retains information on avatars that are viewed by the user as dialogue agents when communicating with the user via the user terminal 400 .
  • a plurality of avatars are registered in the avatar table 230, and an avatar selected by the user or the system may be used for communication with the user.
  • The information related to the avatar may include information related to actions (which may include gestures and habits) and personality (which may include language, etc.).
  • FIG. 4 is a diagram showing an example of the speaking time table according to the present embodiment.
  • The speaking time table 240 holds, for each user, information about the speaking timings generated by the speaking timing setting unit 120 in the analysis server 100. For example, the user ID, one or more speaking timings set for each user, and information (comment specifying information) for identifying the contents of the comments to be presented to the user at each timing are stored in association with each other.
  • The information about the speaking timings is, for example, information indicating at what timing the user is to be spoken to by using the comments held in the comment table 220 or the functions provided in the user terminal 400, and may be information in which one or more speaking timings are scheduled.
  • In the following, information in which one or more speaking timings are scheduled is also referred to as a speaking schedule.
  • The comment specifying information is information for specifying the classification, comment ID, etc. in the comment table 220. In some cases, the comment classification is stored as the comment specifying information; in other cases, the comment ID is stored.
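The relationship between a speaking schedule entry and the comment table 220 could be sketched as follows. This is an illustrative assumption about the data layout: a schedule entry carries either a concrete comment ID or only a classification, in which case one comment of that class is picked; the table contents and selection rule are hypothetical.

```python
import random

# Hypothetical contents of the comment table 220: comment_id -> (classification, text)
COMMENT_TABLE = {
    "C001": ("medication_reminder", "Time to take your medicine."),
    "C002": ("medication_reminder", "Don't forget today's dose."),
    "C003": ("greeting", "Good morning! How do you feel?"),
}

def resolve_comment(spec: dict) -> str:
    """Resolve comment specifying information to a concrete comment text."""
    if "comment_id" in spec:                       # a specific comment is designated
        return COMMENT_TABLE[spec["comment_id"]][1]
    candidates = [text for (cls, text) in COMMENT_TABLE.values()
                  if cls == spec["classification"]]
    return random.choice(candidates)               # any comment of the class

# Hypothetical speaking schedule: (timing, comment specifying information)
schedule = [
    ("08:00", {"comment_id": "C003"}),
    ("12:00", {"classification": "medication_reminder"}),
]
for timing, spec in schedule:
    print(timing, resolve_comment(spec))
```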
  • FIG. 5 is a flowchart showing a schematic operation example according to this embodiment.
  • Step S101: As shown in FIG. 5, in the present embodiment, first, when a user undergoes a medical examination at a medical institution and the disease name and case are specified, for example, the medical staff (e.g., doctors, nurses, etc.) at the medical institution selects a template of a speaking schedule (hereinafter also referred to as a speaking schedule template) that serves as a base according to the user's disease name or case (step S101). At that time, an appropriate speaking schedule template may be selected from speaking schedule templates created in advance for each disease name or case.
  • Step S102: When the speaking schedule template is selected and the base of the speaking schedule for the user is determined, the user's personality data is next collected (step S102).
  • Personality data may be collected, for example, using the user terminal 400 (for example, the smartphone 420) in which an application having a text or voice chat function (hereinafter simply referred to as a chat application) is installed.
  • For example, the speaking timing setting unit 120 in the analysis server 100 may transmit a question-type comment for diagnosing the user's personality to the chat application of the smartphone 420, and the user's personality may be diagnosed based on the user's answer to this comment.
  • the user's personality may be diagnosed based on various information input by the user according to a question-type tutorial pre-installed in the chat application of the smartphone 420 .
  • the personality diagnosis of the user may be performed in the speaking timing setting unit 120, in the smart phone 420, or in another information processing unit.
  • The user's personality data diagnosed in this manner is transmitted to, for example, the database 200 and stored in the symptom/personality table 210 for each user.
  • Step S103: When the user's personality data is registered in the symptom/personality table 210, the speaking timing setting unit 120 next updates the speaking schedule stored for each user in the speaking time table 240 based on the registered personality data (step S103). For example, if the user has a serious personality, the speaking timing setting unit 120 does not have to update the speaking schedule, so that actions are proposed (for example, messages are presented) as prescribed. For example, if the user has an easygoing personality, the speaking timing setting unit 120 may update the speaking schedule so as to reduce the number of action proposals. Also, for example, if the user is forgetful, the speaking timing setting unit 120 may update the speaking schedule so as to increase the number of action proposals.
  • In step S103, if the user's feedback (sensor data, answers, etc. collected in step S106) on a previous action proposal (step S105 described later) and information such as the user's symptoms are already accumulated in the symptom/personality table 210, the speaking timing setting unit 120 may update the speaking schedule based not only on the personality data in the symptom/personality table 210 but also on the feedback from the user, the information on symptoms, and the like.
  • FIG. 6 is a diagram showing an example of comments according to feedback presented to the user via the user terminal according to this embodiment. Note that FIG. 6 illustrates a case where a reminder is presented to the user via an avatar displayed on the wearable device 410.
  • For example, if it is found from the feedback that the user tends to forget to take medication, the speaking schedule may be adjusted so as to increase the number of reminders regarding taking medication. Conversely, when the user is asked to confirm symptoms and medication status after, for example, three days and gives an answer indicating reliable adherence, such as "I'm going to take it," the speaking schedule may be adjusted to reduce the number of reminders about taking medication.
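The schedule-update rules above (keep as prescribed if serious, thin out if easygoing, add reminders if forgetful, and add more when doses are missed) can be sketched as follows. The rule magnitudes and the function name are assumptions for illustration, not taken from the patent.

```python
def update_schedule(reminders: list, personality: str, missed_doses: int) -> list:
    """Hypothetical sketch of the speaking timing setting unit 120's update rule."""
    base = list(reminders)
    if personality == "serious":
        pass                        # keep the prescribed schedule as-is
    elif personality == "easygoing":
        base = base[::2]            # reduce the number of action proposals
    elif personality == "forgetful":
        base = base + ["21:00"]     # increase the number of action proposals
    # Feedback-driven adjustment: missed medication leads to extra reminders.
    base = base + ["extra"] * missed_doses
    return base

print(update_schedule(["08:00", "12:00", "18:00"], "easygoing", 0))   # ['08:00', '18:00']
print(update_schedule(["08:00", "12:00", "18:00"], "forgetful", 1))
```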
  • Step S104: When the speaking schedule for each user is registered in the speaking time table 240, the suggested action generation unit 110 next generates a message indicating the action to be proposed next, according to the latest speaking schedule in the speaking time table 240 (step S104). If the suggested action indicated by the generated message is a therapeutic action, it may be any of various actions that the user performs for treatment, for example, taking medicine, injecting insulin, exercising, eating, going to bed, waking up, or measuring blood pressure, heart rate, body weight, and the like.
  • Step S105: The message generated in this way is transmitted from the suggested action generation unit 110 to the user terminal 400 and presented to the user on the user terminal 400 (step S105).
  • To present the message to the user, various techniques may be used: for example, the chat application of the smartphone 420 may present, as text or voice, a comment prompting the user to perform the proposed action; a specific sound may be played back; the display color, pattern, background screen, or the like of the wearable device 410 or the smartphone 420 may be changed; or the avatar displayed on the wearable device 410 or the smartphone 420 may be caused to perform actions such as specific gestures or remarks.
  • a reminder such as "Take medicine on time” or "Let's clean the room” may be presented to the user using a chat application or an avatar.
  • The timing of reminding the user may be a predetermined time (several minutes to several tens of minutes) before the time when the user should actually perform the action (taking medicine, cleaning, etc.), or may be a predetermined time of day (a specified time, the scheduled wake-up time, noon, etc.).
  • FIG. 7 is a diagram showing an example of comments corresponding to personality data presented to the user via the user terminal according to this embodiment. Note that FIG. 7 illustrates a case where a reminder is presented to the user via an avatar displayed on the wearable device 410 .
  • If the user's feedback and information such as symptoms are accumulated in the symptom/personality table 210, the action suggestion to the user in step S105 may be adjusted based not only on the personality data in the symptom/personality table 210 but also on information such as the user's feedback and symptoms.
  • the content of the action proposal may be adjusted so that the volume and tone of the reminder regarding taking the medicine are strengthened.
  • Various methods may be used, such as blinking the screen of the user terminal 400 or outputting a specific sound or melody, in addition to using a chat application or an avatar.
  • The adjustment of the action proposal content according to the user's personality data, feedback, and the like may be executed in the suggested action generation unit 110 (that is, in step S104), or may be executed in the user terminal 400 (that is, in step S105).
  • Step S106: When an action suggestion is made to the user using the user terminal 400, feedback on the action suggestion is next collected using a sensor, a chat application, or the like mounted on the user terminal 400, and the collected feedback is stored in the symptom/personality table 210 (step S106).
  • For example, various sensors such as an acceleration sensor, an angular velocity sensor, and a GPS (Global Positioning System) receiver mounted on the wearable device 410 may be used to detect the user's actions and behavior, such as taking medicine and cleaning, and the detected values (sensor data) or the behavior information specified from the detected values may be stored in the symptom/personality table 210 as feedback.
  • Also, the user's blood pressure, heart rate, and the like may be detected using a blood pressure monitor, a heart rate monitor, or the like mounted on the wearable device 410, and the detected values (sensor data) may be stored in the symptom/personality table 210 as feedback.
  • a chat application may be used to ask the user questions about his or her current mood or condition, and the answers may be stored in the symptom/personality table 210 as feedback.
  • Furthermore, for example, when a "thing" 430 such as a medicine box or a medicine bottle is equipped with a sensor and the user terminal 400 is connected to the "thing" 430, whether or not the user has taken the medication may be detected in response to data from that sensor. Alternatively, when the medicine can be recognized from a captured image or video, the user terminal 400 may determine that the user has taken the medicine correctly when the medicine is recognized.
  • FIG. 9 is a diagram showing an example of feedback obtained by the wearable device according to this embodiment
  • FIG. 10 is a diagram showing an example of feedback obtained by the smartphone according to this embodiment.
  • For example, medication actions (medication recognition), seizure actions (seizure recognition), body temperature, blood oxygen concentration, heart rate, blood pressure, sleep time, and amount of activity (calorie consumption) may be automatically obtained as feedback detected by various sensors.
  • the results of the personality diagnosis may be manually obtained as feedback through communication using a chat app or the like.
  • Step S107: After the feedback from the user is stored in the symptom/personality table 210 in this way, the suggested action generation unit 110 next sets the visual effect to be visualized on the user terminal 400, based on the collected feedback and the personality data for each user stored in the symptom/personality table 210 (step S107). This visual effect may be one of the messages indicating actions suggested to the user in this embodiment.
  • FIG. 11 is a diagram showing an example of visual effects visualized on the user terminal according to this embodiment.
  • In the example shown in FIG. 11, personality 1 is "serious", personality 2 is "easygoing", and personality 3 is "sociable".
  • For example, on the display unit of the user terminal 400 (for example, the dial or band display of the wearable device 410, the display of the smartphone 420, etc.), as the visual effect in the normal state, a black-and-white clock screen is displayed if the user is serious, a light blue gradation is displayed if the user is easygoing, and a black-and-white polka dot pattern is displayed if the user is sociable.
  • Further, for example, the display unit of the user terminal 400 may display black-and-white stripes if the user is serious, a dark blue gradation if the user is easygoing, and multi-colored polka dots if the user is sociable, as a visual effect. Further, for example, when the user forgets to take medicine, the display unit of the user terminal 400 displays a gradation of white and black if the user is serious, a black-and-white clock screen if the user is easygoing, and a flashing fluorescent color if the user is sociable, as a visual effect.
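The personality-by-state mapping in the normal and forgot-medicine examples above can be written as a simple lookup table. The dictionary layout and key names are illustrative assumptions; the effect strings are taken from the example of FIG. 11 as described in the text.

```python
# Hypothetical sketch: (personality, state) -> visual effect, per FIG. 11's example.
VISUAL_EFFECTS = {
    ("serious",   "normal"):          "black-and-white clock screen",
    ("easygoing", "normal"):          "light blue gradation",
    ("sociable",  "normal"):          "black-and-white polka dots",
    ("serious",   "forgot_medicine"): "white-and-black gradation",
    ("easygoing", "forgot_medicine"): "black-and-white clock screen",
    ("sociable",  "forgot_medicine"): "flashing fluorescent color",
}

def visual_effect(personality: str, state: str) -> str:
    """Look up the visual effect to visualize on the user terminal 400."""
    return VISUAL_EFFECTS[(personality, state)]

print(visual_effect("sociable", "forgot_medicine"))  # flashing fluorescent color
```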
  • Step S108: Next, the suggested action generation unit 110 determines whether or not the user has achieved the quota for the suggested action based on the collected feedback (step S108). For example, when the suggested action is taking medicine, the suggested action generation unit 110 determines whether or not the user has taken the medicine correctly based on the sensor data and the like fed back. Also, for example, when the suggested action is a question to the user, the suggested action generation unit 110 determines whether or not the user has answered the question. Note that if the suggested action presented to the user in step S105 does not involve a quota, such as an announcement of the next scheduled action, steps S106 to S112 may be skipped.
  • Step S109: If the user has achieved the quota for the suggested action (YES in step S108), the suggested action generation unit 110 gives the user a reward via the user terminal 400, for example (step S109), and proceeds to step S110. On the other hand, if the user has not achieved the quota for the suggested action (NO in step S108), the suggested action generation unit 110 skips step S109 and proceeds to step S110.
  • As an example of the reward, a form is exemplified in which a story such as a manga, a novel, or an animation is divided into a plurality of chapters and the next chapter is provided to the user as a reward when the quota is achieved.
  • In this case, the next chapter is provided to the user upon quota achievement and the story progresses, so it becomes possible to maintain the user's motivation for the treatment by making the user want to see the continuation of the story.
  • the suggested action generation unit 110 may display, for example, a scene of the next story or a digest on the display unit of the user terminal 400 as a standby screen or the like.
  • For example, a reward may be given to the user according to quota achievement through a virtual pet application or the like.
  • a treat 403 may be given to the virtual pet 402 through the display 401 of the user terminal 400 .
  • the virtual pet 402 displayed on the display 401 may perform a specific action or communicate with the user in a specific manner.
  • (Utilization of IoT) For example, when "things" such as candy boxes, bottles, boxes containing them, toy boxes, and bookshelves are connected to the analysis server 100 and/or the user terminal 400 via a predetermined network and can be locked, the reward may be given to the user by the suggested action generation unit 110 or the user terminal 400 unlocking the "thing" according to the achievement of the quota.
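The quota check of step S108 and the reward of step S109 can be sketched together. The function names, feedback keys, and the chapter-unlock reward model are illustrative assumptions based on the serialized-story example above.

```python
def quota_achieved(suggested_action: str, feedback: dict) -> bool:
    """Hypothetical sketch of step S108: decide quota achievement from feedback."""
    if suggested_action == "take_medicine":
        return feedback.get("medicine_taken", False)   # from sensor data
    if suggested_action == "answer_question":
        return feedback.get("answered", False)         # from the chat application
    return True  # actions without a quota (e.g. an announcement) need no check

def give_reward(story_chapter: int, achieved: bool) -> int:
    """Hypothetical sketch of step S109: unlock the next story chapter."""
    return story_chapter + 1 if achieved else story_chapter

chapter = 3
achieved = quota_achieved("take_medicine", {"medicine_taken": True})
chapter = give_reward(chapter, achieved)
print(chapter)  # 4
```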
  • Step S110: The feedback collected in step S106 and stored in the symptom/personality table 210 may be reported to the information processing terminal 310 in the medical institution system 300 (step S110).
  • The report to the medical institution may be of a push type in which information is transmitted from the analysis server 100 to the information processing terminal 310, or of a pull type in which the information processing terminal 310 acquires the information stored in the symptom/personality table 210.
  • Step S111: When the collected feedback is reported to the medical institution in this way, the speaking timing setting unit 120 next determines whether or not there is a diagnosis result of the doctor or the like for the user's symptoms specified from the feedback, or an instruction from the doctor or the like regarding future treatment actions (hereinafter collectively referred to as a diagnosis result from the medical institution) (step S111). If there is no diagnosis result from the medical institution (NO in step S111), the operation proceeds to step S113. On the other hand, if there is a diagnosis result from the medical institution (YES in step S111), the operation proceeds to step S112.
  • Step S112: If there is a diagnosis result from the medical institution (YES in step S111), the speaking timing setting unit 120 updates the speaking schedule stored for each user in the speaking time table 240 based on the feedback and personality data stored in the symptom/personality table 210, taking the diagnosis result from the medical institution into consideration (step S112). After that, the operation proceeds to step S113. Note that the overview of the update of the speaking schedule in step S112 may be the same as in step S103 except for the consideration of the diagnosis result from the medical institution, so a detailed description is omitted here.
  • Step S113: Next, the analysis server 100 determines whether or not to end this operation. When this operation is to be ended (YES in step S113), the analysis server 100 cooperates with the user terminal 400 and the medical institution system 300 to end this operation. On the other hand, if the operation is not to be ended (NO in step S113), the operation returns to step S103, and the subsequent operations are continued.
  • Fig. 13 is a diagram showing a basic treatment schedule for cognitive behavioral therapy.
  • the treatment schedule is roughly divided into five stages from 1 to 5, and quotas are imposed on the user at each stage.
  • the user has to complete a set lesson, and when the lesson is completed, the worksheet, which is the quota, is released.
  • a minimum period (for example, one week) is set for the lesson, and the user must continue the lesson for at least this minimum period at each stage.
  • the worksheets released by completing the lesson have a final deadline (for example, within 8 weeks), and the user must complete the worksheets for all stages before the final deadline.
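The five-stage schedule above (a minimum lesson period per stage, worksheets released on lesson completion, and a final deadline for all worksheets) can be sketched with the example values given in the text: one week per lesson and eight weeks overall. The function names and the day-count representation are assumptions for illustration.

```python
# Example values from the text: minimum one week per lesson, 8-week final deadline.
MIN_LESSON_DAYS = 7
FINAL_DEADLINE_DAYS = 8 * 7
NUM_STAGES = 5

def worksheet_unlocked(stage_started_day: int, today: int) -> bool:
    """The stage's worksheet is released once the lesson has continued
    for at least the minimum period."""
    return today - stage_started_day >= MIN_LESSON_DAYS

def all_on_time(worksheet_completion_days: list) -> bool:
    """All five stages' worksheets must be completed before the final deadline."""
    return (len(worksheet_completion_days) == NUM_STAGES and
            max(worksheet_completion_days) <= FINAL_DEADLINE_DAYS)

print(worksheet_unlocked(stage_started_day=0, today=7))   # True
print(all_on_time([7, 14, 21, 28, 35]))                   # True
print(all_on_time([7, 14, 21, 28, 60]))                   # False (day 60 > 56)
```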
  • FIG. 14 is a diagram showing an example of user information stored in the symptom/personality table when this embodiment is applied to cognitive behavioral therapy.
  • The user information according to this usage pattern can include a "classification", a "name (or identifier)", a "value", and a "description" for each piece of information.
  • “Name” is information that serves as an identifier for identifying each piece of information.
  • “Classification” is for grouping each piece of information according to its attributes.
  • "Value" is the value of each piece of information itself. Note that the initial values are underlined in FIG. 14. "Description" is for operators, system developers, etc. to confirm what the value of each piece of information represents.
  • a “bot” may be a character (avatar or the like) for communicating with a user using a chat application or the like.
  • The classification "relationship with bot" includes information with the name "speaking frequency" and information with the name "friendship level".
  • The value of the name "speaking frequency" has, for example, three levels of high/medium/low, and the initial value is set to "high", the highest speaking frequency.
  • The value of the name "friendship level" has, for example, five levels of 5/4/3/2/1, and the initial value is set to "1", the lowest level of intimacy.
  • other information in the user information may be, for example, information acquired by various sensors in the user terminal 400, applications (including chat applications), and the like.
  • the value of "talking frequency" is the response from the user to the question to the user (comment described later with reference to FIG. 17), the progress of the lesson by the user, the worksheet creation status, the user's personality, and the relationship with the bot. It may be changed based on the "friendship level", symptoms, recovery status, and the like.
  • a question about the current call frequency is asked using a chat application or the like; if the user answers that the calls are too frequent, the value of "call frequency" in the symptom/personality table 210 is lowered by one step, if the user answers that they are too infrequent, the value is raised by one step, and if the answer is "just right", the value is left unchanged.
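This one-step adjustment can be sketched as follows. The function name, the exact answer wordings, and the level names are assumptions for illustration, not the patent's actual implementation:

```python
LEVELS = ["low", "medium", "high"]  # the three call-frequency levels

def adjust_call_frequency(current: str, answer: str) -> str:
    """Shift the call-frequency value one step based on the user's answer.

    "too often"  -> one step lower, "too rarely" -> one step higher,
    "just right" -> unchanged; the value is clamped at the two ends.
    """
    i = LEVELS.index(current)
    if answer == "too often":
        i = max(0, i - 1)
    elif answer == "too rarely":
        i = min(len(LEVELS) - 1, i + 1)
    return LEVELS[i]
```

Because the index is clamped, answering "too often" while already at "low" (or "too rarely" at "high") leaves the value where it is.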
  • the call timing registered in the initial call schedule template may be increased or decreased based on, for example, the value of "call frequency" classified as "relationship with bot" in the user information.
  • random timing may be added to the calling schedule, or preset timing may be added.
  • timings registered in the calling schedule may be deleted at random, or preset timings may be deleted.
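Taken together, the add/delete behavior might be sketched like this. Everything here, including the function name, the day-number representation of timings, and the fallback to a random choice when no preset timing is given, is illustrative rather than the patent's actual implementation:

```python
import random

def resize_schedule(timings, call_frequency, preset_add=None,
                    preset_remove=None, rng=None):
    """Grow or shrink the calling schedule to follow the 'call frequency' value.

    timings: list of day numbers on which the bot calls out to the user.
    With frequency 'high' a preset (or random) timing is added; with 'low'
    a scheduled (or random) timing is removed; 'medium' leaves it unchanged.
    """
    rng = rng or random.Random(0)
    timings = sorted(timings)
    if call_frequency == "high":
        # add a preset timing if given, otherwise a random day in the period
        new = preset_add if preset_add is not None else rng.randint(1, 56)
        if new not in timings:
            timings.append(new)
    elif call_frequency == "low" and timings:
        # remove the preset timing if it is scheduled, otherwise a random one
        victim = preset_remove if preset_remove in timings else rng.choice(timings)
        timings.remove(victim)
    return sorted(timings)
```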
  • Example of comment table
  • When adding or deleting a call-out at a preset timing, the information about which timing to add or delete the call-out may be managed in, for example, the comment table 220.
  • FIGS. 15 to 23 are diagrams showing examples of comments stored in the comment table when this embodiment is applied to cognitive behavioral therapy.
  • FIG. 15 is a diagram showing an example of a comment regarding notification of an inquiry destination presented to the user at timings of the 5th day, the 12th day, and the 33rd day in the calling schedule in the initial state.
  • FIG. 16 is a diagram showing an example of a comment on the frequency of calls presented to the user on the 8th day and the 30th day (however, it changes according to progress) in the initial state of the call-out schedule.
  • FIG. 17 is a diagram showing an example of a comment according to the user's answer to the question about call frequency shown in FIG. 16.
  • FIG. 18 is a diagram showing an example of comments on questions about lesson 1 presented to the user on the 15th day and the 36th day in the initial calling schedule.
  • FIG. 19 is a diagram showing an example of comments about questions about lessons and worksheets presented to the user on the 40th, 27th, and 50th days in the initial speaking schedule.
  • FIG. 20 is a diagram showing an example of a comment about a character (also called a bot or avatar) presented to the user on the 23rd day and the 45th day in the initial call-out schedule.
  • FIG. 21 is a diagram showing an example of fixed comments for each lesson presented to the user on the 15th day and the 36th day in the initial call-out schedule.
  • FIG. 22 is a diagram showing an example of comments presented to the user according to progress or at random timing.
  • FIG. 23 is a diagram showing an example of comments presented to the user in response to feedback from the user terminal 400.
  • each comment record in the comment table 220 has an item of "target frequency".
  • the "target frequency" is information that controls whether a comment is presented to the user depending on the value stored in "call frequency" in the user information. For example, when "high, medium" is stored in "target frequency", the comment is presented to the user if the "call frequency" in the user information is "high" or "medium", and presentation of the comment is omitted if the "call frequency" is "low". If nothing is stored in "target frequency", or a predetermined value (for example, "-" or a specific flag) is stored, the comment may be presented to the user at random, or may be omitted at random.
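This gating could be sketched as below. The names are hypothetical, and the 50% chance used for the random case is an arbitrary assumption, since the text only says such comments may be presented or omitted at random:

```python
import random

def should_present(comment, user_call_frequency, rng=None):
    """Decide whether a comment record is shown, following 'target frequency'.

    comment['target_frequency'] is a set of levels (e.g. {'high', 'medium'}):
    the comment is presented only when the user's 'call frequency' is in it.
    The value '-' (or an empty/missing entry) marks comments that are
    presented or omitted at random.
    """
    rng = rng or random.Random()
    target = comment.get("target_frequency")
    if not target or target == "-":
        return rng.random() < 0.5   # random presentation/omission (assumed 50%)
    return user_call_frequency in target
```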
  • Hardware configuration
  • The analysis server 100 (proposed action generation unit 110, call timing setting unit 120) according to the above-described embodiments, the information processing terminal 310 in the medical institution system 300, and the various user terminals 400 (wearable device 410, smartphone 420, etc.) can each be implemented by, for example, a computer 1000 configured as shown in FIG. 24.
  • FIG. 24 is a hardware configuration diagram showing an example of the computer 1000 that realizes the functions of the analysis server 100 (proposed action generation unit 110, call timing setting unit 120), the information processing terminal 310 in the medical institution system 300, and the various user terminals 400 (wearable device 410, smartphone 420, etc.).
  • the computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. Each part of the computer 1000 is connected by a bus 1050.
  • the CPU 1100 operates based on programs stored in the ROM 1300 or HDD 1400 and controls each section. For example, the CPU 1100 loads programs stored in the ROM 1300 or HDD 1400 into the RAM 1200 and executes processes corresponding to various programs.
  • the ROM 1300 stores a boot program, such as the BIOS (Basic Input Output System), executed by the CPU 1100 when the computer 1000 is started, as well as programs that depend on the hardware of the computer 1000.
  • the HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by the CPU 1100 and data used by such programs.
  • specifically, the HDD 1400 is a recording medium that records a program for executing each operation according to the present disclosure, as an example of the program data 1450.
  • a communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet).
  • the CPU 1100 receives data from another device via the communication interface 1500, and transmits data generated by the CPU 1100 to another device.
  • the input/output interface 1600 includes the I/F section 18 described above, and is an interface for connecting the input/output device 1650 and the computer 1000 .
  • the CPU 1100 receives data from input devices such as a keyboard and mouse via the input/output interface 1600 .
  • the CPU 1100 transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600 .
  • the input/output interface 1600 may function as a media interface for reading a program or the like recorded on a predetermined recording medium.
  • examples of media include optical recording media such as a DVD (Digital Versatile Disc) and a PD (Phase change rewritable Disk), magneto-optical recording media such as an MO (Magneto-Optical disk), tape media, magnetic recording media, and semiconductor memories.
  • when the computer 1000 functions as the analysis server 100 (proposed action generation unit 110, call timing setting unit 120) according to the above-described embodiment, the information processing terminal 310 in the medical institution system 300, or one of the various user terminals 400 (wearable device 410, smartphone 420, etc.), the CPU 1100 of the computer 1000 executes a program loaded onto the RAM 1200 to implement the functions of the corresponding device.
  • the HDD 1400 also stores the programs and the like according to the present disclosure. The CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, but as another example, these programs may be acquired from another device via the external network 1550.
  • the technical category in which the above technical ideas are embodied is not limited.
  • the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus.
  • the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
  • (1) An information processing apparatus comprising a generation unit that generates a message indicating an action to be proposed to a user based on sensor data acquired by a sensor mounted on a user terminal owned by the user.
  • (2) The information processing apparatus according to (1), further comprising a diagnosis unit that diagnoses the personality of the user, wherein the generation unit generates the message further based on the personality of the user diagnosed by the diagnosis unit.
  • (3) The information processing apparatus according to (1) or (2), wherein the sensor data includes at least one of a medication-taking action, a seizure action, body temperature, blood oxygen level, heart rate, blood pressure, sleep time, and activity amount.
  • (4) The information processing apparatus according to any one of (1) to (3), wherein the generation unit generates a question to be presented to the user, and the sensor data includes an answer from the user to the question presented to the user via the user terminal.
  • (5) The information processing apparatus according to any one of (1) to (4), wherein the generation unit generates the message further based on a diagnostic result of the user's symptom specified from the sensor data.
  • (6) The information processing apparatus according to any one of (1) to (5), further comprising a setting unit that sets one or more timings for presenting the message to the user, wherein the generation unit generates a message indicating an action to be proposed to the user at each of the one or more timings.
  • the message is presented to the user in one or more of the following formats: text, voice, a display color of the user terminal, a pattern displayed on the user terminal, and an action of a character displayed on the user terminal.
  • (12) The information processing apparatus according to (10) or (11), wherein the generation unit rewards the user based on sensor data acquired by the sensor in response to presentation of the message to the user.
  • (13) The information processing apparatus according to (12), wherein the reward is given to the user via the user terminal.
  • (14) An information processing method comprising generating a message indicating an action to be proposed to a user based on sensor data acquired by a sensor mounted on a user terminal owned by the user.
  • (15) A program for causing a processor of an information processing apparatus, connected via a predetermined network to a user terminal owned by a user, to generate a message indicating an action to be proposed to the user based on sensor data acquired by a sensor mounted on the user terminal.
  • Interactive Medical Support System
    100 Analysis Server
    110 Proposed Action Generation Unit
    120 Call Timing Setting Unit
    200 Database
    210 Symptom/Personality Table
    220 Comment Table
    230 Avatar Table
    240 Call Timing Table
    300 Medical Institution System
    310 Information Processing Terminal
    400 User Terminal
    401 Display
    402 Virtual Pet
    403 Snack
    410 Wearable Device
    420 Smartphone
    430 Object

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

This information processing device comprises a generation unit that generates, on the basis of sensor data acquired by a sensor, a message indicating an action suggested to a user, said sensor being installed in a user terminal owned by the user.
PCT/JP2023/002563 2022-02-15 2023-01-27 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2023157606A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022021630 2022-02-15
JP2022-021630 2022-02-15

Publications (1)

Publication Number Publication Date
WO2023157606A1 true WO2023157606A1 (fr) 2023-08-24

Family

ID=87578324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/002563 WO2023157606A1 (fr) 2022-02-15 2023-01-27 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2023157606A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006320735A (ja) * 2000-03-14 2006-11-30 Toshiba Corp 身体装着型生活支援装置および方法
JP2013020587A (ja) * 2011-07-14 2013-01-31 Nec Corp 情報処理システム、ユーザの行動促進方法、情報処理装置及びその制御方法と制御プログラム
JP2017059261A (ja) * 2013-10-01 2017-03-23 国立大学法人東北大学 健康情報処理装置及び方法
JP2017091586A (ja) * 2017-02-13 2017-05-25 株式会社FiNC 健康管理サーバおよび健康管理サーバ制御方法並びに健康管理プログラム
JP2019197584A (ja) * 2019-07-18 2019-11-14 株式会社野村総合研究所 健康管理支援システムおよび健康管理支援プログラム
JP2020077062A (ja) * 2018-11-05 2020-05-21 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2020177674A (ja) * 2019-04-18 2020-10-29 株式会社こどもみらい 個人別の概日リズムに基づく生活時刻の提示システム

Similar Documents

Publication Publication Date Title
Novick et al. Women’s experience of group prenatal care
Cibrian et al. Supporting self-regulation of children with ADHD using wearables: tensions and design challenges
Worrall et al. The validity of functional assessments of communication and the Activity/Participation components of the ICIDH-2: do they reflect what really happens in real-life?
Dworkin et al. A realistic talking human embodied agent mobile phone intervention to promote HIV medication adherence and retention in care in young HIV-positive African American men who have sex with men: qualitative study
US20190043623A1 (en) System and method for physiological and psychological support in home healthcare
US20130017519A1 (en) System and methods for monitoring and adjusting human behavioral patterns and conditions
Thomas et al. Teaching patients with advanced cancer to self-advocate: Development and acceptability of the Strong Together™ serious game
Ankrah et al. Me, my health, and my watch: How children with ADHD understand smartwatch health data
Shin et al. Identifying opportunities and challenges: how children use technologies for managing diabetes
Costello et al. Pediatric acute and intensive care in hospitals
WO2023157606A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
Munson Child and adolescent mental health
Guthrie et al. The theoretical basis of the Conversational Model of Therapy
EP4109461A1 (fr) Assistant tridimensionnel, à miroitage atmosphérique et à variation dynamique, de l'interface addison pour environnements externes
Awada et al. Mobile@ old-an assistive platform for maintaining a healthy lifestyle for elderly people
Li A mobile game for the social and cognitive well-being of the elderly in China
Khan et al. Be your own superhero: A case of a young boy with selective mutism and complex comorbidities
Orr et al. Engagement and clinical improvement among older adult primary care patients using a mobile intervention for depression and anxiety: case studies
Pavel et al. The story of our lives: From sensors to stories in self-monitoring systems
Coziahr et al. Designing a Digital Mental Health App for Opioid Use Disorder Using the UX Design Thinking Framework
Rexhepi et al. Elena+ Care for COVID-19, a pandemic lifestyle care intervention: intervention design and study protocol
Morris Patient-provider communication: Perspectives of individuals with significant communication disabilities
Singal Adaptive mHealth Interventions for Improving Youth Responsiveness and Clinical Outcomes
EP4109459A1 (fr) Mise en miroir de l'atmosphère et de l'interface d'assistant d'addison tridimensionnel variant de manière dynamique pour des environnements comportementaux
EP4109460A1 (fr) Mise en miroir de l'atmosphère et de l'interface d'assistant d'addison tridimensionnel variant de manière dynamique pour des environnements intérieurs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23756134

Country of ref document: EP

Kind code of ref document: A1