WO2023248937A1 - Terminal, notification system, notification method, and program - Google Patents

Terminal, notification system, notification method, and program

Info

Publication number
WO2023248937A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
unit
notification
terminal
vibration
Prior art date
Application number
PCT/JP2023/022353
Other languages
French (fr)
Japanese (ja)
Inventor
琢也 神田
公海 髙橋
麻美 宮島
康雄 石榑
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社 filed Critical 日本電信電話株式会社
Publication of WO2023248937A1 publication Critical patent/WO2023248937A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to a terminal, a notification system, a notification method, and a program.
  • Techniques exist for systems to encourage action by sending notifications to users. However, many notification methods that prompt users to act can distract the user's attention and make the user feel bothered by the notification. Techniques have therefore been developed to reduce the annoyance that users feel when a system sends notifications and engages them.
  • In order to reduce the annoyance that the user feels when the system sends notifications to the user or takes other actions, Patent Document 1 proposes a threshold value, or a mental burden function based on the mental burden of presenting information accompanying the action, and discloses a technique that determines the content and presentation timing of information to be presented to the user when the condition indicated by the threshold value is satisfied.
  • the disclosed technology aims to notify the user at an appropriate timing according to the user's actions.
  • The disclosed technology is a terminal including a detection information acquisition unit that acquires information on detecting a user's motion, and a vibration control unit that, based on the information on the detected motion of the user, generates a vibration to notify the user while the user is performing the motion.
  • FIG. 1 is a diagram showing an example of the hardware configuration of a wearable terminal according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 1 of the embodiment of the present invention.
  • FIG. 3 is a flowchart showing an example of the flow of notification processing according to Example 1 of the embodiment of the present invention.
  • FIG. 4 is a diagram for explaining an operation start time and an operation end time according to Example 1 of the embodiment of the present invention.
  • FIG. 5 is a diagram for explaining an acceleration prediction method according to Example 1 of the embodiment of the present invention.
  • FIG. 6 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 2 of the embodiment of the present invention.
  • FIG. 7 is a flowchart showing an example of the flow of notification processing according to Example 2 of the embodiment of the present invention.
  • FIG. 8 is a diagram for explaining a vibration adjustment method according to Example 2 of the embodiment of the present invention.
  • FIG. 9 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 3 of the embodiment of the present invention.
  • FIG. 10 is a flowchart showing an example of the flow of notification processing according to Example 3 of the embodiment of the present invention.
  • FIG. 11 is a diagram showing an example of the hardware configuration of an electronic terminal according to Example 4 of the embodiment of the present invention.
  • FIG. 12 is a diagram showing an example of the functional configuration of the electronic terminal and wearable terminal according to Example 4 of the embodiment of the present invention.
  • FIG. 13 is a flowchart showing an example of the flow of learning processing according to Example 4 of the embodiment of the present invention.
  • FIG. 14 is a diagram for explaining an example of learning processing according to Example 4 of the embodiment of the present invention.
  • FIG. 15 is a diagram showing an example of the functional configuration of a learning device according to Example 4 of the embodiment of the present invention.
  • FIG. 16 is a flowchart showing an example of the flow of inference processing according to Example 4 of the embodiment of the present invention.
  • Patent Document 1 discloses a technique for reducing the annoyance felt by a user when a system sends a notification to the user.
  • Example 1: In this embodiment, an example of a terminal that vibrates as a notification while the user is moving his or her body will be described.
  • FIG. 1 is a diagram showing an example of the hardware configuration of a wearable terminal according to an embodiment of the present invention.
  • The wearable terminal 10 may be a device that is in constant contact with the user's body, such as a smartwatch, although constant contact is not required.
  • the wearable terminal 10 is an example of a terminal that has a function of notifying a user by generating vibrations.
  • the wearable terminal 10 includes a CPU 101, a memory 102, a communication device 103, a sensor 104, and a vibration device 105, which are interconnected via a bus.
  • the memory 102 stores programs and the like that describe the processing contents described in this embodiment.
  • the CPU 101 executes processing written in a program stored in the memory 102.
  • the communication device 103 is a device for transmitting and receiving signals with other devices.
  • the sensor 104 is a device that performs various measurements to detect the movements of the user wearing the wearable terminal 10.
  • the sensor 104 may include, for example, an acceleration sensor that measures acceleration, a touch sensor that detects contact, a camera sensor that captures an image, and the like.
  • the vibration device 105 is a device that generates vibrations in response to instructions from the CPU 101. Vibration device 105 may be composed of one or more vibrators.
  • the wearable terminal 10 may further include a display device for displaying messages.
  • When the wearable terminal 10 issues a notification, the vibration device 105 may generate vibration and the display device may display a message.
  • FIG. 2 is a diagram illustrating an example of the functional configuration of a wearable terminal according to Example 1 of the embodiment of the present invention.
  • the wearable terminal 10 includes a storage section 11 , a detection information acquisition section 12 , a motion start confirmation section 13 , an acceleration prediction section 14 , a motion end estimation section 15 , and a vibration control section 16 .
  • Each of these parts is realized by a CPU 101 and a memory 102 that constitute a computer. Specifically, each part is realized by the CPU 101 executing processing described in a program stored in the memory 102.
  • the storage unit 11 stores information indicating the message content of the notification, information indicating the timing of notification, information indicating the vibration duration d at the time of notification, etc.
  • the detection information acquisition unit 12 acquires detection information indicating the result of detecting the user's motion by the sensor 104. For example, the detection information acquisition unit 12 acquires information indicating acceleration. Note that the wearable terminal 10 may be premised on moving in space in unison with the body part of the user wearing the wearable terminal 10. That is, the acceleration of the wearable terminal 10 is considered to indicate the movement of the body part of the user wearing the wearable terminal 10.
  • The motion start confirmation unit 13 determines whether the user's motion has started based on the detection information acquired by the detection information acquisition unit 12, and estimates the operation start time T ms . Note that the motion start confirmation unit 13 may determine whether the user's motion has started based on detection information obtained by any type of sensor, as long as the information is related to the user's motion.
  • the acceleration prediction unit 14 predicts how the acceleration will change from the operation start time T ms based on the detection information acquired by the detection information acquisition unit 12. For example, the acceleration prediction unit 14 may predict acceleration based on acceleration included in detection information acquired by an acceleration sensor. Note that the acceleration prediction unit 14 may predict acceleration based on detection information obtained by any type of sensor as long as the information is related to prediction of acceleration.
  • the motion end estimating unit 15 estimates the time at which the user's motion ends (motion end time T me ) based on the acceleration prediction result.
  • Based on the time at which the vibration starts (vibration start time T ns ), the time for which the vibration continues (vibration duration d), the time at which the vibration ends (vibration end time T ne ), and the like, the operation end estimating unit 15 determines whether the vibration due to the notification ends before the operation end time T me .
  • the vibration control unit 16 controls the vibration device 105 to generate vibrations based on the vibration start time T ns , the vibration duration time d, the vibration end time T ne , the determination result by the operation end estimation unit 15, and the like.
  • FIG. 3 is a flowchart illustrating an example of the flow of notification processing according to Example 1 of the embodiment of the present invention.
  • the notification process is started when a reason for notifying the user occurs.
  • the operation start confirmation unit 13 monitors the operation (step S101). Specifically, the motion start confirmation unit 13 determines whether the user's motion has started based on the detection information acquired by the detection information acquisition unit 12. Then, the operation start confirmation unit 13 estimates the operation start time T ms .
  • the acceleration prediction unit 14 predicts the acceleration of the wearing part (the part where the user wears the wearable terminal 10) (step S102). Specifically, the acceleration prediction unit 14 predicts how the acceleration will change from the operation start time T ms based on the detection information acquired by the detection information acquisition unit 12.
  • the motion end estimating unit 15 estimates the end time of the motion (step S103). Specifically, the motion end estimating unit 15 estimates the motion end time Tme based on the acceleration prediction result.
  • Next, the operation end estimating unit 15 determines whether the notification (vibration) will end before the end time of the operation (step S104). Specifically, the operation end estimating unit 15 determines whether the vibration due to the notification will end before the operation end time T me , based on the vibration start time T ns , the vibration duration d, the vibration end time T ne , and the like.
  • If the operation end estimating unit 15 determines that the vibration due to the notification will not end before the end time of the operation (step S104: NO), the vibration is not generated at this point. If the operation end estimating unit 15 determines that the vibration will end before the end time of the operation (step S104: YES), the vibration control unit 16 generates the vibration (step S105).
  • FIG. 4 is a diagram for explaining the operation start time and operation end time according to Example 1 of the embodiment of the present invention.
  • The movement start time T ms is the time at which the user starts the movement with the arm lowered.
  • the motion end time Tme is the time when the motion of raising the arm ends and the arm becomes in a raised state.
  • FIG. 5 is a diagram for explaining an acceleration prediction method according to Example 1 of the embodiment of the present invention.
  • FIG. 5 shows a method for predicting acceleration for the user's motion shown in FIG. 4.
  • the vertical axis in FIG. 5 indicates acceleration, and the horizontal axis indicates time.
  • FIG. 5 shows an operation start time T ms , an operation end time T me , a vibration start time T ns , a vibration end time T ne , and a vibration duration time d.
  • the storage unit 11 may store in advance information indicating the vibration duration d for notification and a threshold value A th for determination regarding acceleration.
  • the operation start confirmation unit 13 monitors the operation and acquires information indicating the acceleration of the measurement result.
  • a graph 901 is an example of acceleration as a measurement result.
  • the operation start confirmation unit 13 detects the start of the operation using the threshold value A th based on the information indicating the acceleration of the measurement result. For example, the operation start confirmation unit 13 may determine that the operation has started when the acceleration exceeds the threshold value A th . The operation start confirmation unit 13 sets the time when the start of the operation is detected as the operation start time T ms .
  • The acceleration prediction unit 14 predicts how the acceleration will change after the start of the motion is detected. For example, as shown in FIG. 5, the acceleration prediction unit 14 may predict the acceleration based on a model in which, after the acceleration reaches its maximum, it decreases over time by the same amount as it increased before reaching the maximum. A graph 902 is an example of acceleration as a prediction result.
  • The acceleration of the prediction result is denoted A(t).
  • the motion end estimation unit 15 estimates the motion end time Tme based on the acceleration prediction result. For example, the motion end estimation unit 15 may estimate the time when the predicted acceleration A(t) becomes 0 as the motion end time T me . Note that the motion end estimating unit 15 may estimate the time when the predicted acceleration A(t) becomes equal to or less than the threshold value A th as the motion end time T me .
  • Note that any other method may be used as long as it can estimate the movement end time T me , such as a method using a musculoskeletal model formula, machine learning, or a statistical model.
  • In step S104 of the notification process, the motion end estimating unit 15 checks whether the vibration end time T ne , which is the vibration start time T ns plus d, is earlier than the estimated motion end time T me . If the vibration end time T ne is earlier than the estimated operation end time T me , the vibration control unit 16 generates the vibration in step S105 of the notification process.
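The decision in steps S101 through S105 can be sketched in code. This is an illustrative sketch, not the disclosed implementation: the function names, the sampling interval `dt`, and the use of the symmetric-triangle model of FIG. 5 (acceleration falls after its peak at the same rate it rose) are assumptions made for the example.

```python
# Illustrative sketch of steps S101-S105 (not the disclosed implementation).
# Assumes accelerations sampled uniformly every `dt` seconds and the
# symmetric-triangle acceleration model of FIG. 5.

def detect_motion_start(samples, a_th):
    """S101: index of the first sample whose acceleration exceeds A_th, else None."""
    for i, a in enumerate(samples):
        if a > a_th:
            return i
    return None

def predict_motion_end(samples, start_index, dt):
    """S102-S103: under the triangle model, the motion lasts twice the
    time from the detected start to the acceleration peak."""
    peak = max(range(start_index, len(samples)), key=lambda i: samples[i])
    rise_time = (peak - start_index) * dt
    t_ms = start_index * dt          # operation start time T_ms
    return t_ms + 2 * rise_time      # estimated operation end time T_me

def should_vibrate(t_ns, d, t_me):
    """S104: notify only if a vibration starting at T_ns with duration d
    would end before the estimated operation end time T_me."""
    return t_ns + d < t_me
```

For example, with samples [0, 1, 3, 5, 3, 1, 0], threshold 2, and dt = 0.1 s, the motion is detected at index 2 and estimated to end at t = 0.4 s, so a 0.1 s vibration started at t = 0.25 s would be generated, while one started at t = 0.35 s would not.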
  • Note that the method of this embodiment may be implemented in combination with the conventional techniques described above for issuing a notification, such as a method of detecting a break in behavior or a method of estimating a timing with high interruptibility.
  • vibration for notification can be generated while the user is moving the part to which the wearable terminal 10 is attached. This can reduce the annoyance felt by the user.
  • Example 2 will be described below with reference to the drawings.
  • The second embodiment differs from the first embodiment in that the intensity of vibration is adjusted according to the user's movement. Therefore, the following explanation of the second embodiment mainly describes the differences from the first embodiment; parts having the same functional configuration as the first embodiment are given the same reference numerals as in the explanation of the first embodiment, and their explanation is omitted.
  • FIG. 6 is a diagram illustrating an example of a functional configuration of a wearable terminal according to Example 2 of the embodiment of the present invention.
  • the wearable terminal 10 according to the present embodiment has a configuration in which a vibration adjustment section 17 is added to the wearable terminal 10 according to the first embodiment.
  • the vibration adjustment unit 17 adjusts the intensity of the vibration to be generated based on the prediction result of the acceleration estimated by the acceleration prediction unit 14.
  • FIG. 7 is a flowchart illustrating an example of the flow of notification processing according to Example 2 of the embodiment of the present invention.
  • the flow of the notification process according to the present embodiment is the same as the notification process according to the first embodiment with step S201 added.
  • If, in step S104, the operation end estimating unit 15 determines that the vibration will end before the end time of the operation (step S104: YES), the vibration adjustment unit 17 adjusts the intensity of the vibration (step S201). Then, in step S105, the vibration control unit 16 generates the vibration with the adjusted intensity.
  • FIG. 8 is a diagram for explaining a vibration adjustment method according to Example 2 of the embodiment of the present invention.
  • FIG. 8 shows a method for adjusting the vibration for the user's motion shown in FIG. 4.
  • In step S201 of the notification process according to the second embodiment, the vibration adjustment unit 17 adjusts the vibration strength V s (t) according to the time-series acceleration A(t) predicted by the acceleration prediction unit 14, as shown in equation (1) below.
  • V s (t) ∝ A(t) ... Equation (1)
  • Note that the vibration adjustment unit 17 may divide the predicted acceleration into groups by comparison with one or more reference threshold values, and set a different intensity for each group.
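Equation (1) and the threshold-grouping variant can be sketched as follows; the gain constant, bucket thresholds, and intensity levels are illustrative assumptions, not values from the disclosure.

```python
# Illustrative sketch of the intensity adjustment of step S201. The gain,
# thresholds, and levels are assumptions, not disclosed values.

def intensity_proportional(predicted_accel, gain=1.0):
    """Equation (1): vibration strength V_s(t) proportional to predicted A(t)."""
    return [gain * a for a in predicted_accel]

def intensity_bucketed(predicted_accel, thresholds=(1.0, 3.0), levels=(0.2, 0.6, 1.0)):
    """Variant: group predicted accelerations by reference thresholds and
    assign one intensity level per group."""
    out = []
    for a in predicted_accel:
        group = sum(a >= th for th in thresholds)  # number of thresholds reached
        out.append(levels[group])
    return out
```

The bucketed variant trades smoothness for a small, fixed set of vibrator drive levels, which may suit hardware that cannot vary amplitude continuously.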
  • Sensitivity to vibration is thought to change depending on the movement of the part on which a smartwatch or the like is worn. Conventional smartwatches and similar devices send notifications with a constant vibration regardless of the user's actions and do not respond to changes in the user's sensitivity to vibration, so the notification vibration can be annoying.
  • According to the wearable terminal 10, the installed sensor can obtain information on the acceleration of the part of the user to which the wearable terminal 10 is attached, and a vibration can be generated with an intensity corresponding to the magnitude of the acceleration. This can further reduce the annoyance felt by the user.
  • Example 3 will be described below with reference to the drawings.
  • The third embodiment differs from the first embodiment in that information indicating a notification schedule (schedule information) is acquired and vibration is generated based on the schedule. Therefore, the following explanation of the third embodiment mainly describes the differences from the first embodiment; parts having the same functional configuration as the first embodiment are given the same reference numerals as in the explanation of the first embodiment, and their explanation is omitted.
  • FIG. 9 is a diagram illustrating an example of the functional configuration of a wearable terminal according to Example 3 of the embodiment of the present invention.
  • the notification system according to this embodiment includes a wearable terminal 10 (first terminal) and an electronic terminal 20 (second terminal).
  • the electronic terminal 20 is, for example, a mobile terminal, a server device, etc., and stores schedule information. Note that the electronic terminal 20 may be any other device as long as it can store schedule information.
  • the electronic terminal 20 includes a storage section 21 and a communication section 22.
  • the storage unit 21 stores schedule information.
  • the schedule information may be information indicating, for example, the date and time of notification, day of the week, frequency, number of times, etc.
  • the communication unit 22 acquires schedule information from the storage unit 21 and transmits it to the wearable terminal 10.
  • the wearable terminal 10 has a configuration in which a communication unit 18 is added to the wearable terminal 10 according to the first embodiment.
  • the communication unit 18 receives schedule information from the electronic terminal 20 and stores it in the storage unit 11 .
  • the vibration control unit 16 generates notification vibration at an appropriate timing using the method of the first embodiment, as long as it is within the schedule time, based on the schedule information stored in the storage unit 11.
  • FIG. 10 is a flowchart illustrating an example of the flow of notification processing according to Example 3 of the embodiment of the present invention.
  • the flow of the notification process according to the present embodiment is a flow in which step S301 and step S302 are added to the notification process according to the first embodiment.
  • the communication unit 18 receives notification schedule information from the electronic terminal 20 (step S301). Schedule information is stored in the storage unit 11.
  • In step S302, the wearable terminal 10 determines whether the current time is within the scheduled time. If the wearable terminal 10 determines that it is within the scheduled time (step S302: YES), the notification vibration is generated at an appropriate timing by the method of the first embodiment. If the wearable terminal 10 determines that it is not within the scheduled time (step S302: NO), the vibration control unit 16 may immediately generate a notification vibration.
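The schedule check of step S302 can be sketched as follows, assuming (hypothetically) that the schedule information is represented as a list of start/end time windows; the representation and function name are not part of the disclosure.

```python
# Illustrative sketch of the step S302 check. The schedule representation
# (a list of (start, end) datetime windows) is an assumption.
from datetime import datetime

def within_schedule(schedule, now):
    """Return True if `now` falls inside any scheduled notification window."""
    return any(start <= now <= end for start, end in schedule)
```

For a reminder window of 09:00 to 10:00, a check at 09:30 would return True and one at 11:00 would return False.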
  • In this way, the wearable terminal can generate the notification vibration at an appropriate timing when notifying a message, such as a reminder, that has a predetermined notification time period.
  • Example 4 will be described below with reference to the drawings.
  • Healthcare apps have become commonly used in recent years.
  • a terminal equipped with a healthcare application can, for example, collect data related to a user's health (such as data when performing a certain action) and manage the user's health.
  • In this embodiment, a health care application is used on the wearable terminal 10 (for example, a smartwatch) or the electronic terminal 20 (for example, a smartphone).
  • In order to have the user perform the actions necessary for the operation of the healthcare application, such as running or answering a questionnaire (hereinafter referred to as recommended actions), the feasibility of implementing each recommended action is estimated based on user state information (e.g., action recognition information, time, GPS, etc.) that can be obtained by the electronic terminal 20 or the wearable terminal 10.
  • the wearable terminal 10 selects an appropriate recommended action based on the feasibility of implementing each recommended action, and uses the technology of the third embodiment to provide a reminder at an appropriate timing and content.
  • In Example 4, a neural network (NN) model (a feasibility estimation model) is used to estimate the feasibility of each recommended action. A softmax function is used as the activation function of the output layer of the model, and the feasibility (also called probability) of each recommended action is output by the softmax function.
  • the hardware configuration of the wearable terminal 10 is as described in the first embodiment.
  • the electronic terminal 20 may be a smartphone, a tablet, a PC (personal computer), or other devices.
  • FIG. 11 shows an example of the hardware configuration of the electronic terminal 20 in the fourth embodiment.
  • Although FIG. 11 assumes a computer such as a PC, a smartphone or a tablet also basically has the configuration of the computer shown in FIG. 11.
  • the learning device 100 which will be described later, also has a computer configuration shown in FIG. 11 as a hardware configuration.
  • the electronic terminal 20 can be realized by using hardware resources such as a CPU and memory built into a computer to execute a program corresponding to the processing performed by the electronic terminal 20.
  • the above program can be recorded on a computer-readable recording medium (such as a portable memory) and can be stored or distributed. It is also possible to provide the above program through a network such as the Internet or e-mail.
  • The computer shown in FIG. 11 includes a drive device 1000, an auxiliary storage device 1002, a memory device 1003, a CPU 1004, an interface device 1005, a display device 1006, an input device 1007, an output device 1008, and a sensor 1009, which are interconnected by a bus BS.
  • a program that realizes processing on the computer is provided, for example, by a recording medium 1001 such as a memory card.
  • the program is installed from the recording medium 1001 to the auxiliary storage device 1002 via the drive device 1000.
  • the program does not necessarily need to be installed from the recording medium 1001, and may be downloaded from another computer via a network.
  • the auxiliary storage device 1002 stores installed programs as well as necessary files, data, and the like.
  • the memory device 1003 reads and stores the program from the auxiliary storage device 1002 when there is an instruction to start the program.
  • the CPU 1004 implements functions related to the electronic terminal 20 according to programs stored in the memory device 1003.
  • the interface device 1005 is used as an interface for connecting to a network or the like.
  • a display device 1006 displays a GUI (Graphical User Interface) and the like based on a program.
  • the input device 1007 is composed of a keyboard, a mouse, buttons, a touch panel, or the like, and is used to input various operation instructions.
  • An output device 1008 outputs the calculation result.
  • The sensor 1009 may include, for example, one or more (or all) of an acceleration sensor that measures acceleration, a touch sensor that detects contact, a camera sensor that captures an image, and a position sensor (GPS device).
  • FIG. 12 is a diagram showing an example of the functional configuration of the wearable terminal 10 and the electronic terminal 20 in the fourth embodiment.
  • the electronic terminal 20 is a terminal such as a smartphone that is carried and used by a user.
  • the electronic terminal 20 according to the fourth embodiment has a configuration in which a detection information acquisition section 23 and a recommended action implementation recognition section 24 are added to the electronic terminal 20 according to the third embodiment.
  • The detection information acquisition unit 23 acquires detection information indicating the result of detecting the user's motion by the sensor 1009. The detection information acquisition unit 23 can also acquire, as detection information, the user's position information acquired by the sensor 1009 (for example, a GPS device), information such as the user's posture acquired by the sensor 1009 (for example, a camera), or other information.
  • the recommended behavior implementation recognition unit 24 checks whether the user has implemented the recommended behavior based on the information acquired by the detection information acquisition unit 23.
  • the wearable terminal 10 according to the fourth embodiment has a configuration in which a recommended action implementation recognition section 31, a user state estimation section 32, an implementation possibility estimation section 33, and a recommended action selection section 34 are added to the wearable terminal 10 according to the third embodiment.
  • the recommended behavior selection section 34 may also be referred to as the behavior selection section 34.
  • the recommended behavior implementation recognition unit 31 may be referred to as the behavior implementation recognition unit 31.
  • the recommended behavior implementation recognition unit 31 checks whether the user has implemented the recommended behavior based on the information acquired by the detection information acquisition unit 12.
  • Both the recommended action implementation recognition unit 24 and the recommended action implementation recognition unit 31 can detect implementation of a recommended action by the user, for example, by using an action recognition API.
  • the user state estimation unit 32 estimates the user state based on the information acquired by the detection information acquisition unit 12.
  • the user state estimation unit 32 may use the information acquired by the detection information acquisition unit 12 as the user state information as it is.
  • the feasibility estimating unit 33 includes a storage unit that stores weight data of the NN that estimates the feasibility of each recommended action.
  • the feasibility estimation unit 33 can operate the NN in which the weight data is set by reading the weight data (weight parameters) from the storage unit.
  • the user state obtained by the user state estimating unit 32 is input to the NN as input data, and the NN outputs the feasibility of each recommended action.
  • As the activation function of the output layer, a softmax function, which is often used in multi-class classification, is used.
  • The softmax function outputs the feasibility of each recommended-action category in the range of 0 to 1, such that the feasibilities of all categories sum to 1. Examples of outputs are: survey: 0.4, running: 0.3, medication: 0.3.
  • the above NN will be referred to as a feasibility estimation model.
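The feasibility estimation output can be sketched as a softmax over per-action scores. A single hypothetical linear layer stands in for the trained NN here; the feature vector, weights, and function names are assumptions, while the action categories follow the example outputs above.

```python
# Illustrative sketch of the feasibility estimation output (Example 4).
# A linear layer stands in for the NN; names and shapes are assumptions.
import math

ACTIONS = ["survey", "running", "medication"]

def softmax(logits):
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def estimate_feasibility(user_state, weights, biases):
    """Map a user-state feature vector to a probability per recommended action."""
    logits = [sum(w * x for w, x in zip(row, user_state)) + b
              for row, b in zip(weights, biases)]
    return dict(zip(ACTIONS, softmax(logits)))
```

Because the softmax normalizes the scores, the returned feasibilities always lie in [0, 1] and sum to 1, matching the example output above.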
  • The recommended action selection unit 34 selects one recommended action from among the plurality of recommended actions based on the feasibilities output from the implementation possibility estimation unit 33 (feasibility estimation model). The specific operation will be described later.
  • the implementation feasibility estimation unit 33 will be described as including a function of learning the implementation feasibility estimation model (creating weight data).
  • the feasibility estimating unit 33 during learning of the feasibility estimation model may be referred to as a “learning unit”.
  • the learning of the feasibility estimation model may be performed by a device (referred to as a learning device) separate from the electronic terminal 20 and the wearable terminal 10.
  • the feasibility estimating unit 33 (learning unit) of the wearable terminal 10 performs learning of the feasibility estimation model.
  • the feasibility estimation unit 33 (learning unit) adjusts the weight data (parameters) of the feasibility estimation model so that, when the information obtained by the user state estimation unit 32 is input, the model outputs the correct recommended action, that is, the action detected by the recommended action implementation recognition unit 31 or the recommended action implementation recognition unit 24 (in other words, so that the feasibility of the correct recommended action becomes 1).
  • learning may be performed using the information obtained by either the recommended action implementation recognition unit 31 or the recommended action implementation recognition unit 24.
  • the recommended action implementation recognition unit 24 acquires detection information from the detection information acquisition unit 23.
  • the recommended behavior implementation recognition unit 24 detects implementation of the recommended behavior.
  • the content of the detected recommended action is transmitted from the communication unit 22 to the communication unit 18.
  • the recommended action implementation recognition unit 31 and the user state estimation unit 32 each acquire detection information from the detection information acquisition unit 12.
  • the recommended behavior implementation recognition unit 31 detects implementation of the recommended behavior based on the detection information.
  • the user state estimation unit 32 acquires data indicating the user state based on the detection information.
  • the feasibility estimation unit 33 creates weight data for the feasibility estimation model, which is the learning target, so that the model can estimate, taking user state data as input, the class of the recommended action that is implemented from among the predefined recommended actions.
  • the weight data is created by using the recommended action recognized by the recommended action implementation recognition unit 24/recommended action implementation recognition unit 31 as the correct answer.
  • the created weight data (learned model) is stored in a storage unit included in the feasibility estimating unit 33.
  • the learning operation will be explained more specifically using an example.
  • when the recommended action implementation recognition unit 31 of the wearable terminal 10 or the recommended action implementation recognition unit 24 of the electronic terminal 20 detects the user's implementation of a recommended action, the user state at the start timing of that implementation is recorded, for example, in the storage section of the feasibility estimation unit 33, together with that timing.
  • the feasibility estimating unit 33 trains the feasibility estimation model using the recommended behavior and the user state at the same timing as the timing at which the recommended behavior was detected.
  • an example will be explained using FIG. 14.
  • the user holds a smartphone as the electronic terminal 20 and a smart watch as the wearable terminal 10.
  • the recommended action implementation recognition unit 24 or the recommended action implementation recognition unit 31 detects the implementation of “questionnaire input” as a user action (recommended action).
  • the user state estimation unit 32 acquires, as the user state, data such as the behavior "standing still", the location "outdoors", the schedule "no plans", the status of the electronic terminal 20 (smartphone) "Active", and the heart rate "82", and records them in the storage section of the feasibility estimation unit 33.
  • the user state estimating unit 32 constantly acquires the above data, and records the acquired user state data in the storage unit at the timing of recommended action detection. At the same time, the detected recommended actions are also recorded in the storage unit.
  • the feasibility estimation unit 33 (learning unit) inputs the user state at the start timing of the recorded recommended action to the feasibility estimation model, and learns the weight data of the model by setting the feasibility of the category of the recommended action performed by the user to 1 and the feasibility of the other categories to 0. In the example of FIG. 14, the weight data of the feasibility estimation model is learned so that the feasibility of the category "questionnaire" becomes 1. Learning of weight data itself is an existing technique and can be performed using, for example, error backpropagation.
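The one-hot learning step can be sketched as follows. As a simplified stand-in for the patent's NN, this uses a single-layer softmax model trained by gradient descent (the single-layer case of error backpropagation); the feature encoding of the user state, the learning rate, and all names are illustrative assumptions.

```python
import math

CATEGORIES = ["questionnaire", "running", "medication"]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

class FeasibilityModel:
    """Single-layer softmax classifier: user-state features -> per-category feasibility."""
    def __init__(self, n_features, lr=0.5):
        self.w = [[0.0] * n_features for _ in CATEGORIES]   # the "weight data"
        self.lr = lr

    def predict(self, x):
        return softmax([sum(wi * xi for wi, xi in zip(row, x)) for row in self.w])

    def train_step(self, x, correct_category):
        """Adjust weights toward a one-hot target: feasibility 1 for the performed action, 0 for others."""
        p = self.predict(x)
        target = [1.0 if c == correct_category else 0.0 for c in CATEGORIES]
        for k in range(len(CATEGORIES)):        # cross-entropy gradient: (p - target) * x
            for j in range(len(x)):
                self.w[k][j] -= self.lr * (p[k] - target[k]) * x[j]

# Hypothetical user-state features: [still, outdoors, no_plans, phone_active, heart_rate / 100]
x = [1.0, 1.0, 1.0, 1.0, 0.82]
model = FeasibilityModel(n_features=5)
for _ in range(200):
    model.train_step(x, "questionnaire")        # the detected action is the correct answer
p = model.predict(x)
assert p[CATEGORIES.index("questionnaire")] > 0.9   # feasibility of "questionnaire" approaches 1
```

Repeatedly presenting the recorded (user state, performed action) pair drives the predicted feasibility of the performed action toward 1, which is exactly the target described above.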
  • the learning of the feasibility estimation model may be performed by a device other than the wearable terminal 10 (this will be referred to as the learning device 100).
  • FIG. 15 shows a configuration example of the learning device 100.
  • the learning device 100 includes an input section 110, a learning section 120, an output section 130, and a storage section 140.
  • the recommended behavior detected in the electronic terminal 20 or the wearable terminal 10 and data on the user state at the start timing of implementation of the recommended behavior are input to the input unit 110.
  • the input recommended actions and user status data are stored in the storage unit 140.
  • the storage unit 140 stores weight data of a feasibility estimation model to be learned.
  • the learning unit 120 inputs the user state at the start timing of the recommended action to the feasibility estimation model, and learns the weight data of the model by setting the feasibility of the category of the recommended action performed by the user to 1 and the feasibility of the other categories to 0. Learning is performed using, for example, error backpropagation.
  • the learned weight data is output from the output unit 130 and input to the feasibility estimation unit 33 in the wearable terminal 10.
  • in the inference phase, the recommended action implementation recognition unit 24 and the recommended action implementation recognition unit 31 may be omitted from the electronic terminal 20 and the wearable terminal 10. Further, in the inference phase, it is assumed that the feasibility estimation unit 33 holds a learned feasibility estimation model (specifically, its weight data). The feasibility estimation unit 33 itself may be regarded as a trained feasibility estimation model.
  • the feasibility estimation unit 33 inputs the user state information obtained by the user state estimation unit 32 to the feasibility estimation model.
  • the feasibility estimation model (trained NN) outputs the feasibility of each recommended action defined in advance.
  • the feasibility estimation unit 33 inputs the feasibility of each recommended action output from the feasibility estimation model to the recommended action selection unit 34.
  • the recommended action selection unit 34 acquires notification schedule time information from the recommended action reminder data obtained from the storage unit 11, and acquires information on the recommended actions for which the current time is within the scheduled time.
  • the recommended action selection unit 34 selects the recommended action that is most likely to be implemented within the scheduled time from among the one or more recommended actions output from the implementation possibility estimation unit 33.
  • for example, if the recommended actions for which the current time is within the scheduled time are (taking medication, questionnaire), and the feasibility of each recommended-action category is (questionnaire: 0.4, running: 0.5, taking medication: 0.1), the recommended action selection unit 34 selects the questionnaire.
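The selection performed by the recommended action selection unit 34 can be sketched as follows; the values mirror the example above, and the data shapes (a set of scheduled actions, a dict of feasibilities) are assumptions for illustration.

```python
# Recommended actions whose scheduled notification window contains the current time.
within_schedule = {"medication", "questionnaire"}

# Per-category feasibilities output by the feasibility estimation model.
feasibility = {"questionnaire": 0.4, "running": 0.5, "medication": 0.1}

def select_recommended_action(within_schedule, feasibility):
    """Pick the most feasible action among those currently within their scheduled window."""
    candidates = {a: p for a, p in feasibility.items() if a in within_schedule}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

# "running" has the highest feasibility (0.5) but is outside its scheduled
# window, so the questionnaire (0.4) is selected.
assert select_recommended_action(within_schedule, feasibility) == "questionnaire"
```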
  • the vibration control unit 16 transmits the notification vibration to the user at a notification timing, determined using the operation start confirmation unit 13 and the operation end estimation unit 15, that reduces annoyance.
  • the method of notifying at a timing that reduces annoyance is the same as the method described in the first embodiment.
  • information on the selected recommended action may be included in the notification information.
  • the selected recommended action (for example, a questionnaire) may be displayed as a message on the display of the wearable terminal 10 or the electronic terminal 20 at the same time as the notification (vibration).
  • the electronic terminal 20 receives notification schedule information from, for example, a server.
  • the schedule information is stored in the storage unit 21.
  • the communication unit 22 transmits notification schedule information to the wearable terminal 10, and in S603, the communication unit 18 receives the notification schedule information.
  • Schedule information is stored in the storage unit 11.
  • the wearable terminal 10 determines whether the current time is within the scheduled time. When the wearable terminal 10 determines that the current time is within the scheduled time, it executes S605 to S609 and S611. On the other hand, if the wearable terminal 10 determines that the current time is not within the scheduled time, it ends the notification process.
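The scheduled-time check above (S604) can be sketched as follows, assuming the schedule is a simple list of (start, end) time-of-day windows; this representation is an illustrative assumption, not the patent's data format.

```python
from datetime import datetime, time

# Hypothetical notification schedule: two time-of-day windows.
schedule = [(time(9, 0), time(10, 0)), (time(20, 0), time(21, 30))]

def within_scheduled_time(now, schedule):
    """Return True if the current time of day falls inside any notification window."""
    t = now.time()
    return any(start <= t <= end for start, end in schedule)

# Inside the morning window: proceed to S605-S609 and S611.
assert within_scheduled_time(datetime(2023, 6, 19, 9, 30), schedule) is True
# Outside every window: end the notification process.
assert within_scheduled_time(datetime(2023, 6, 19, 12, 0), schedule) is False
```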
  • the user state estimating unit 32 acquires user state data.
  • the feasibility estimation unit 33 (feasibility estimation model) outputs the feasibility of each recommended action based on the user state.
  • the recommended action selection unit 34 selects the recommended action that is most likely to be implemented within the scheduled time.
  • in S607 to S609 and S611, notification (vibration) is performed at a timing that is not bothersome, that is, at a timing when the user is performing an action.
  • S607 to S609 and S611 correspond to S101 to S105 in FIG. 3 in the first embodiment.
  • in Example 4, an example was explained in which an NN is used as the feasibility estimation model, but the feasibility estimation model is not limited to an NN. Any model that can predict categories may be used.
  • the output function is not limited to a softmax function. Any method may be used as long as it can output the probability of implementation (feasibility of implementation) for each category.
  • although the fourth embodiment assumes the use of a healthcare app, the technology of the fourth embodiment can be applied to various uses regardless of whether a healthcare app is used.
  • (Section 2) further comprising: an operation start confirmation unit that determines whether the user has started a motion based on information obtained by detecting the user's motion; an acceleration prediction unit that predicts a change in acceleration in the user's motion based on information obtained by detecting the user's motion; and a motion end estimation unit that estimates an end time of the user's motion based on the predicted change in acceleration in the user's motion;
  • wherein the vibration control unit generates vibrations for notification to the user; the terminal described in Section 1.
  • (Section 3) further comprising a vibration adjustment unit that adjusts the intensity of the vibration to be generated according to a predicted change in acceleration in the user's motion;
  • (Section 4) a user state estimation unit that estimates the state of the user; an implementation possibility estimating unit that estimates the implementation possibility of one or more predetermined actions based on the state;
  • (Section 5) further comprising an action implementation recognition unit that detects implementation of the action by the user,
  • wherein the feasibility estimation model used in the feasibility estimation unit is learned using the action whose implementation was detected by the action implementation recognition unit and the user state obtained by the user state estimation unit at the timing at which the implementation of the action was detected.
  • a notification system comprising a first terminal and a second terminal
  • the first terminal includes: a communication unit that receives information indicating a schedule for notification to the user from the second terminal; a detection information acquisition unit that acquires information obtained by detecting the user's motion; and a vibration control unit that, based on the information obtained by detecting the user's motion, generates vibration for notification to the user at a scheduled notification timing so that the vibration occurs while the user is performing the motion;
  • the second terminal includes a communication unit that transmits information indicating a schedule for notification to the user to the first terminal.
  • Wearable terminal 11 Storage unit 12 Detection information acquisition unit 13 Operation start confirmation unit 14 Acceleration prediction unit 15 Operation end estimation unit 16 Vibration control unit 17 Vibration adjustment unit 18 Communication unit 20 Electronic terminal 21 Storage unit 22 Communication unit 23 Detection information acquisition unit 24 Recommended action implementation recognition unit 31 Recommended action implementation recognition unit 32 User state estimation unit 33 Implementability estimation unit 34 Recommended action selection unit 100 Learning device 110 Input unit 120 Learning unit 130 Output unit 140 Storage unit 101 CPU 102 Memory 103 Communication device 104 Sensor 105 Vibration device 1000 Drive device 1001 Recording medium 1002 Auxiliary storage device 1003 Memory device 1004 CPU 1005 Interface device 1006 Display device 1007 Input device 1008 Output device 1009 Sensor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electric Clocks (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This terminal comprises a detection information acquisition unit that acquires information about a detected motion of a user, and a vibration control unit that, on the basis of the information about the detected motion of the user, generates vibrations for notification to the user such that the vibrations occur during the motion of the user.

Description

Terminal, notification system, notification method, and program
 The present invention relates to a terminal, a notification system, a notification method, and a program.
 Techniques exist in which a system encourages action by sending notifications to the user. Many such notification methods may distract the user's attention and make the notification feel bothersome. Techniques have therefore been developed to reduce the annoyance users feel when a system sends notifications and engages them.
 For example, Patent Document 1 discloses a technique that, in order to reduce the annoyance a user feels when the system engages the user, for example by sending a notification, determines the content of the information to be presented to the user and the timing of its presentation when a condition is satisfied that is indicated by a threshold based on the mental burden of the information presentation accompanying an action, or by a mental-burden function.
JP 2019-185389 A
 In the conventional technology described above, breaks between actions that allow intervention, timings with a high possibility of interruption, and the like do not occur often in daily life, and are few compared with the frequency at which notifications should be made. There is therefore a problem that the user cannot be notified at an appropriate timing.
 The disclosed technology aims to notify the user at an appropriate timing according to the user's motion.
 The disclosed technology is a terminal including: a detection information acquisition unit that acquires information obtained by detecting a user's motion; and a vibration control unit that, based on the information obtained by detecting the user's motion, generates vibration for notification to the user so that the vibration occurs while the user is performing the motion.
 According to the disclosed technology, the user can be notified at an appropriate timing according to the user's motion.
FIG. 1 is a diagram showing an example of the hardware configuration of a wearable terminal according to an embodiment of the present invention.
FIG. 2 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 1 of the embodiment of the present invention.
FIG. 3 is a flowchart showing an example of the flow of notification processing according to Example 1 of the embodiment of the present invention.
FIG. 4 is a diagram for explaining the motion start time and the motion end time according to Example 1 of the embodiment of the present invention.
FIG. 5 is a diagram for explaining an acceleration prediction method according to Example 1 of the embodiment of the present invention.
FIG. 6 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 2 of the embodiment of the present invention.
FIG. 7 is a flowchart showing an example of the flow of notification processing according to Example 2 of the embodiment of the present invention.
FIG. 8 is a diagram for explaining a vibration adjustment method according to Example 2 of the embodiment of the present invention.
FIG. 9 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 3 of the embodiment of the present invention.
FIG. 10 is a flowchart showing an example of the flow of notification processing according to Example 3 of the embodiment of the present invention.
FIG. 11 is a diagram showing an example of the hardware configuration of an electronic terminal according to Example 4 of the embodiment of the present invention.
FIG. 12 is a diagram showing an example of the functional configuration of an electronic terminal and a wearable terminal according to Example 4 of the embodiment of the present invention.
FIG. 13 is a flowchart showing an example of the flow of learning processing according to Example 4 of the embodiment of the present invention.
FIG. 14 is a diagram for explaining an example of learning processing according to Example 4 of the embodiment of the present invention.
FIG. 15 is a diagram showing an example of the functional configuration of a learning device according to Example 4 of the embodiment of the present invention.
FIG. 16 is a flowchart showing an example of the flow of inference processing according to Example 4 of the embodiment of the present invention.
 An embodiment of the present invention (this embodiment) will be described below with reference to the drawings. The embodiment described below is merely an example, and embodiments to which the present invention can be applied are not limited to the following.
 (Problems with the conventional technology)
 First, problems with the conventional technology will be described. Many techniques exist that encourage action by sending notifications from the system side. Many of these notification methods distract the user's attention and make the user feel bothered. Many techniques have been developed to reduce the annoyance caused when the system sends notifications and engages the user. For example, Patent Document 1 discloses a technique for reducing the annoyance the user feels when the system engages the user, for example by sending a notification.
 As a means of giving reminder notifications using a wearable terminal, there are methods that notify by vibrating the terminal. While such vibration makes a reminder notification easy to notice immediately, it also strongly draws the person's attention. Methods for reducing the annoyance of notifications include detecting breaks in behavior and estimating interruptibility before notifying. However, the problem is that breaks between actions that allow intervention, timings with high interruptibility, and the like do not occur often in daily life and are few compared with the frequency of notifications.
 There are also many methods that devise the notification modality itself. For example, there are methods that convey the presence or absence of a notification to the user using changes in heat, and methods that notify by lighting an LED ambiently. However, the actuators for notification used in these methods are not installed in many of the smartwatches currently on the market, and newly attaching an actuator solely for the purpose of notification is considered very costly.
 (Summary of this embodiment)
 A timing at which the user does not feel bothered by a notification vibration is when the body part that feels the vibration is moving. Therefore, in this embodiment, to address the problems of the conventional technology described above, notifications that occur more frequently in daily life are assumed, and an example will be described in which the notification vibration is delivered at a timing when the body part of the user wearing the vibrating terminal is moving significantly. By notifying through vibration within the period in which the body part wearing the terminal is moving significantly, and by ending the vibration within that period, the annoyance the user feels from the notification vibration can be reduced.
 Examples 1 to 4 will be described below as specific examples of this embodiment.
 (Example 1)
 In this example, a terminal that delivers the notification vibration while the user is moving his or her body will be described.
 FIG. 1 is a diagram showing an example of the hardware configuration of a wearable terminal according to an embodiment of the present invention. The wearable terminal 10 may be one that is always in contact with the body, such as a smart watch, or one that is not always in contact with the body, as long as it is in contact with the body when detecting the user's movement and when giving a notification. The wearable terminal 10 is an example of a terminal having a function of notifying the user by generating vibration.
 The wearable terminal 10 includes a CPU 101, a memory 102, a communication device 103, a sensor 104, and a vibration device 105, which are interconnected via a bus.
 The memory 102 stores programs and the like describing the processing described in this embodiment. The CPU 101 executes the processing described in the programs stored in the memory 102.
 The communication device 103 is a device for transmitting and receiving signals to and from other devices.
 The sensor 104 is a device that performs various measurements for detecting the movement of the user wearing the wearable terminal 10. The sensor 104 may include, for example, an acceleration sensor that measures acceleration, a touch sensor that detects contact, a camera sensor that captures images, and the like.
 The vibration device 105 is a device that generates vibration in response to instructions from the CPU 101. The vibration device 105 may be composed of one or more vibrators.
 Note that the wearable terminal 10 may further include a display device for displaying messages. In that case, when giving a notification, the vibration device 105 may generate vibration while the display device displays a message.
 FIG. 2 is a diagram showing an example of the functional configuration of a wearable terminal according to Example 1 of the embodiment of the present invention. The wearable terminal 10 includes a storage unit 11, a detection information acquisition unit 12, a motion start confirmation unit 13, an acceleration prediction unit 14, a motion end estimation unit 15, and a vibration control unit 16. Each of these units is realized by the CPU 101 and the memory 102 constituting a computer. Specifically, each unit is realized by the CPU 101 executing the processing described in a program stored in the memory 102.
 The storage unit 11 stores information indicating the message content of a notification, information indicating the timing at which the notification is to be made, information indicating the vibration duration d at the time of notification, and the like.
 The detection information acquisition unit 12 acquires detection information indicating the result of detecting the user's motion with the sensor 104. For example, the detection information acquisition unit 12 acquires information indicating acceleration. Note that the wearable terminal 10 may be assumed to move through space together with the body part of the user wearing it. That is, the acceleration of the wearable terminal 10 can be considered to indicate the movement of the body part of the user wearing the wearable terminal 10.
 The motion start confirmation unit 13 determines whether the user's motion has started based on the detection information acquired by the detection information acquisition unit 12, and estimates the motion start time Tms. Note that the motion start confirmation unit 13 may determine whether the user's motion has started based on detection information acquired by any type of sensor, as long as the information relates to the user's motion.
 The acceleration prediction unit 14 predicts, based on the detection information acquired by the detection information acquisition unit 12, how the acceleration will change from the motion start time Tms. For example, the acceleration prediction unit 14 may predict the acceleration based on the acceleration included in the detection information acquired by an acceleration sensor. Note that the acceleration prediction unit 14 may predict the acceleration based on detection information acquired by any type of sensor, as long as the information relates to the prediction of acceleration.
 The motion end estimation unit 15 estimates the time at which the user's motion will end (motion end time Tme) based on the acceleration prediction result. The motion end estimation unit 15 then determines whether the notification vibration will end before the motion end time Tme, based on the time at which the vibration starts (vibration start time Tns), the time for which the vibration continues (vibration duration d), the time at which the vibration ends (vibration end time Tne), and the like.
 The vibration control unit 16 controls the vibration device 105 to generate vibration based on the vibration start time Tns, the vibration duration d, the vibration end time Tne, the determination result of the motion end estimation unit 15, and the like.
 FIG. 3 is a flowchart showing an example of the flow of notification processing according to Example 1 of the embodiment of the present invention. The notification processing is started when a reason for notifying the user occurs.
 The motion start confirmation unit 13 monitors the motion (step S101). Specifically, the motion start confirmation unit 13 determines whether the user's motion has started based on the detection information acquired by the detection information acquisition unit 12, and estimates the motion start time Tms.
 次に、加速度予測部14は、装着部(ユーザがウェアラブル端末10を装着している部位)の加速度を予測する(ステップS102)。具体的には、加速度予測部14は、検知情報取得部12によって取得された検知情報に基づいて、動作開始時刻Tmsから加速度がどのように変化するかを予測する。 Next, the acceleration prediction unit 14 predicts the acceleration of the wearing part (the part where the user wears the wearable terminal 10) (step S102). Specifically, the acceleration prediction unit 14 predicts how the acceleration will change from the operation start time T ms based on the detection information acquired by the detection information acquisition unit 12.
 続いて、動作終了推定部15は、動作の終了時刻を推定する(ステップS103)。具体的には、動作終了推定部15は、加速度の予測結果に基づいて、動作終了時刻Tmeを推定する。 Subsequently, the motion end estimating unit 15 estimates the end time of the motion (step S103). Specifically, the motion end estimating unit 15 estimates the motion end time Tme based on the acceleration prediction result.
 そして、動作終了推定部15は、通知(振動)の継続時間が動作の終了時刻前であるか否かを判定する(ステップS104)。具体的には、動作終了推定部15は、振動開始時刻Tns、振動継続時間d、振動終了時刻Tne等に基づいて、動作終了時刻Tmeより前に、通知による振動が終了するか否かを判定する。 Then, the motion end estimating unit 15 determines whether the notification (vibration) will finish before the end time of the motion (step S104). Specifically, based on the vibration start time Tns, the vibration duration d, the vibration end time Tne, and the like, the motion end estimating unit 15 determines whether the vibration caused by the notification will end before the motion end time Tme.
 動作終了推定部15は、通知(振動)の継続時間が動作の終了時刻前でないと判定すると(ステップS104:NO)、通知処理を終了する。他方、動作終了推定部15が、通知(振動)の継続時間が動作の終了時刻前であると判定すると(ステップS104:YES)、振動制御部16は、振動を発生させる(ステップS105)。 When the operation end estimating unit 15 determines that the duration of the notification (vibration) is not before the end time of the operation (step S104: NO), it ends the notification process. On the other hand, if the operation end estimating unit 15 determines that the duration of the notification (vibration) is before the end time of the operation (step S104: YES), the vibration control unit 16 generates vibration (step S105).
 図4は、本発明の実施の形態の実施例1に係る動作開始時刻および動作終了時刻について説明するための図である。例えば、ユーザがウェアラブル端末10を腕に装着している状態で腕を上げる動作をする場合、動作開始時刻Tmsは、腕を下げた状態において動作を開始する時刻である。動作終了時刻Tmeは、腕を上げる動作が終了し、腕を上げた状態になった時点の時刻である。 FIG. 4 is a diagram for explaining the motion start time and motion end time according to Example 1 of the embodiment of the present invention. For example, when the user raises an arm while wearing the wearable terminal 10 on that arm, the motion start time Tms is the time at which the motion starts from the arm-lowered state. The motion end time Tme is the time at which the arm-raising motion finishes and the arm is in the raised state.
 図5は、本発明の実施の形態の実施例1に係る加速度の予測方法について説明するための図である。図5は、図4に示されるユーザの動作において、加速度を予測する方法を示している。図5における縦軸は加速度を示し、横軸は時刻を示す。図5には、動作開始時刻Tms、動作終了時刻Tme、振動開始時刻Tns、振動終了時刻Tneおよび振動継続時間dがそれぞれ示されている。 FIG. 5 is a diagram for explaining an acceleration prediction method according to Example 1 of the embodiment of the present invention. FIG. 5 shows a method for predicting acceleration in the user's motion shown in FIG. The vertical axis in FIG. 5 indicates acceleration, and the horizontal axis indicates time. FIG. 5 shows an operation start time T ms , an operation end time T me , a vibration start time T ns , a vibration end time T ne , and a vibration duration time d.
 また、記憶部11には、通知を行う際の振動継続時間dを示す情報および加速度に係る判定のための閾値Athがあらかじめ格納されていてもよい。動作開始確認部13は、通知処理のステップS101において、動作のモニタリングを行い、測定結果の加速度を示す情報を取得する。グラフ901は、測定結果の加速度の一例である。 Further, the storage unit 11 may store in advance information indicating the vibration duration d for notification and a threshold value A th for determination regarding acceleration. In step S101 of the notification process, the operation start confirmation unit 13 monitors the operation and acquires information indicating the acceleration of the measurement result. A graph 901 is an example of acceleration as a measurement result.
 そして、動作開始確認部13は、測定結果の加速度を示す情報に基づいて、閾値Athを使って動作の開始を検知する。例えば、動作開始確認部13は、加速度が閾値Athを超えた場合に動作が開始したものと判断してもよい。動作開始確認部13は、動作の開始が検知された時刻を動作開始時刻Tmsとする。 Then, the operation start confirmation unit 13 detects the start of the operation using the threshold value A th based on the information indicating the acceleration of the measurement result. For example, the operation start confirmation unit 13 may determine that the operation has started when the acceleration exceeds the threshold value A th . The operation start confirmation unit 13 sets the time when the start of the operation is detected as the operation start time T ms .
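The threshold test just described can be sketched in a few lines. The following is an illustrative sketch only (the function name, the sampling format, and the concrete values are assumptions, not part of the specification): the motion start time Tms is taken as the time of the first sample whose acceleration exceeds the threshold Ath.

```python
def detect_motion_start(samples, a_th):
    """Return T_ms, the time of the first sample whose acceleration
    exceeds the threshold A_th, or None if motion has not started."""
    for t, a in samples:
        if a > a_th:
            return t
    return None

# Measured accelerations as (time, acceleration); A_th = 1.0 is crossed at t = 3.
samples = [(0, 0.0), (1, 0.2), (2, 0.7), (3, 1.4), (4, 2.0)]
```

With these made-up samples, `detect_motion_start(samples, 1.0)` returns 3, while a threshold that is never exceeded yields `None`.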
 次に、加速度予測部14は、通知処理のステップS102において、動作の開始が検知された後の加速度がどのように変化していくかを予測する。例えば、加速度予測部14は、図5に示されるように、加速度が最大となった点を起点として、最大となるまでの加速度の時間ごとの上昇量と同様の下降量で加速度が変化していくというモデルを基に、加速度を予測してもよい。グラフ902は、予測結果の加速度の一例である。以下、予測結果の加速度をA(t)とする。 Next, in step S102 of the notification process, the acceleration prediction unit 14 predicts how the acceleration will change after the start of the motion is detected. For example, as shown in FIG. 5, the acceleration prediction unit 14 may predict the acceleration based on a model in which, starting from the point at which the acceleration reaches its maximum, the acceleration falls by the same amount per unit time as it rose on its way to the maximum. Graph 902 is an example of the predicted acceleration. Hereinafter, the predicted acceleration is denoted A(t).
 続いて、動作終了推定部15は、通知処理のステップS103において、加速度の予測結果に基づいて、動作終了時刻Tmeを推定する。例えば、動作終了推定部15は、予測された加速度A(t)が0になる時刻を動作終了時刻Tmeと推定してもよい。なお、動作終了推定部15は、予測された加速度A(t)が閾値Ath以下になる時刻を動作終了時刻Tmeと推定してもよい。 Subsequently, in step S103 of the notification process, the motion end estimation unit 15 estimates the motion end time Tme based on the acceleration prediction result. For example, the motion end estimation unit 15 may estimate the time when the predicted acceleration A(t) becomes 0 as the motion end time T me . Note that the motion end estimating unit 15 may estimate the time when the predicted acceleration A(t) becomes equal to or less than the threshold value A th as the motion end time T me .
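As a concrete illustration of the mirror model and the motion-end estimate described above (all names and values here are assumptions for illustration, not part of the specification), the rise measured up to the peak can be reflected to predict the decline, and Tme taken as the first predicted time at which A(t) falls to the threshold or below:

```python
def predict_mirror(measured):
    """measured: (time, acceleration) samples from T_ms up to the peak.
    Mirror the rise: each per-step increase is replayed as an equal
    decrease after the peak, giving the predicted tail of A(t)."""
    t = measured[-1][0]  # time of the peak
    predicted = []
    for i in range(len(measured) - 1, 0, -1):
        t += measured[i][0] - measured[i - 1][0]
        predicted.append((t, measured[i - 1][1]))
    return predicted

def estimate_motion_end(measured, a_th=0.0):
    """Estimate T_me as the first predicted time with A(t) <= a_th."""
    for t, a in predict_mirror(measured):
        if a <= a_th:
            return t
    return None

# Acceleration rose 0 -> 1 -> 2 over two steps; the mirrored fall
# returns to zero two steps after the peak, so T_me = 4.
measured = [(0, 0.0), (1, 1.0), (2, 2.0)]
```

Passing a threshold `a_th` equal to Ath instead of 0 gives the alternative estimate mentioned above.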
 なお、動作終了時刻Tmeの推定方法は、筋骨格モデルの式を活用したもの、機械学習を用いたもの、統計モデルに使ったものなど、動作終了時刻Tmeの推定が可能であれば他の方法であってもよい。 Note that the motion end time Tme may be estimated by any other method capable of estimating it, such as a method utilizing musculoskeletal model equations, a method using machine learning, or a method using a statistical model.
 動作終了推定部15は、通知処理のステップS104において、振動開始時刻Tnsにdを加えた振動終了時刻Tneが、推定された動作終了時刻Tmeよりも前になるかを確認する。そして、振動制御部16は、振動終了時刻Tneが、推定された動作終了時刻Tmeよりも前になる場合には、通知処理のステップS105において、振動を発生させる。 In step S104 of the notification process, the motion end estimating unit 15 checks whether the vibration end time T ne , which is the vibration start time T ns plus d, is earlier than the estimated motion end time T me . Then, if the vibration end time T ne is earlier than the estimated operation end time T me , the vibration control unit 16 generates vibration in step S105 of the notification process.
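Steps S104 and S105 reduce to a single comparison. A minimal sketch (the names are illustrative, not from the specification):

```python
def should_vibrate(t_ns, d, t_me):
    """Step S104: permit the notification only if the vibration that
    starts at T_ns and lasts d ends before the motion end time T_me."""
    t_ne = t_ns + d  # vibration end time T_ne
    return t_ne < t_me

# A vibration of duration 0.5 starting at t = 0 fits inside a motion
# estimated to end at t = 1.0; one lasting 2.0 does not.
```

When the check fails, the notification process simply ends, matching the NO branch of step S104.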
 なお、本実施例の方法は、上述した従来の技術、例えば、行動の切れ目を検出する方法、割込み可能性(Interruptibility)の高いタイミングを推定して通知を行う方法等と組み合わせて実施してもよい。 Note that the method of this example may be implemented in combination with the conventional techniques described above, for example, a method of detecting breaks in the user's activity, or a method of estimating a timing with high interruptibility and issuing the notification at that timing.
 本実施例によれば、ユーザがウェアラブル端末10を装着している部位を動かしている間に、通知のための振動を発生させることができる。これによって、ユーザが感じる煩わしさを軽減させることができる。 According to this embodiment, vibration for notification can be generated while the user is moving the part to which the wearable terminal 10 is attached. This can reduce the annoyance felt by the user.
 (実施例2)
 以下に図面を参照して、実施例2について説明する。実施例2は、ユーザの動きに合わせて振動の強度を調整する点が、実施例1と相違する。よって、以下の実施例2の説明では、実施例1との相違点を中心に説明し、実施例1と同様の機能構成を有するものには、実施例1の説明で用いた符号と同様の符号を付与し、その説明を省略する。
(Example 2)
Example 2 will be described below with reference to the drawings. Example 2 differs from Example 1 in that the intensity of the vibration is adjusted in accordance with the user's movement. Therefore, the following description of Example 2 focuses on the differences from Example 1; components having the same functional configuration as in Example 1 are given the same reference numerals as in the description of Example 1, and their description is omitted.
 図6は、本発明の実施の形態の実施例2に係るウェアラブル端末の機能構成の一例を示す図である。本実施例に係るウェアラブル端末10は、実施例1に係るウェアラブル端末10に、振動調整部17が追加された構成である。 FIG. 6 is a diagram illustrating an example of a functional configuration of a wearable terminal according to Example 2 of the embodiment of the present invention. The wearable terminal 10 according to the present embodiment has a configuration in which a vibration adjustment section 17 is added to the wearable terminal 10 according to the first embodiment.
 振動調整部17は、加速度予測部14で推定された加速度の予測結果に基づいて、発生させる振動の強度を調整する。 The vibration adjustment unit 17 adjusts the intensity of the vibration to be generated based on the prediction result of the acceleration estimated by the acceleration prediction unit 14.
 図7は、本発明の実施の形態の実施例2に係る通知処理の流れの一例を示すフローチャートである。本実施例に係る通知処理の流れは、実施例1に係る通知処理にステップS201が追加されたフローである。 FIG. 7 is a flowchart illustrating an example of the flow of notification processing according to Example 2 of the embodiment of the present invention. The flow of the notification process according to the present embodiment is the same as the notification process according to the first embodiment with step S201 added.
 ステップS104において、動作終了推定部15が、通知(振動)の継続時間が動作の終了時刻前であると判定すると(ステップS104:YES)、振動調整部17は、振動の強度を調整する(ステップS201)。そして、ステップS105において、振動制御部16は、調整された強度の振動を発生させる。 In step S104, if the operation end estimating unit 15 determines that the duration of the notification (vibration) is before the end time of the operation (step S104: YES), the vibration adjustment unit 17 adjusts the intensity of the vibration (step S201). Then, in step S105, the vibration control unit 16 generates vibration with the adjusted intensity.
 図8は、本発明の実施の形態の実施例2に係る振動の調整方法について説明するための図である。図8は、図4に示されるユーザの動作において、振動を調整する方法を示している。 FIG. 8 is a diagram for explaining a vibration adjustment method according to Example 2 of the embodiment of the present invention. FIG. 8 shows a method for adjusting vibrations in the user's actions shown in FIG.
 実施例2に係る通知処理のステップS201において、振動調整部17は、加速度予測部14によって予測された時系列の加速度A(t)に対応して、以下の式(1)のように振動の強さV(t)を調整する。 In step S201 of the notification process according to Example 2, the vibration adjustment unit 17 adjusts the vibration strength Vs(t) in accordance with the time-series acceleration A(t) predicted by the acceleration prediction unit 14, as shown in equation (1) below.
 V(t)=αA(t)・・・式(1) V s (t)=αA(t)...Formula (1)
 なお、αは定数である。グラフ903は、このようにして調整された振動の強度を示している。 Note that α is a constant. Graph 903 shows the intensity of vibration adjusted in this way.
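Equation (1) is a plain proportional scaling. A minimal sketch (α = 0.5 is an arbitrary illustrative constant, not a value from the specification):

```python
ALPHA = 0.5  # proportionality constant α (illustrative value)

def vibration_strength(a_t, alpha=ALPHA):
    """Equation (1): V_s(t) = α * A(t)."""
    return alpha * a_t

# Applied to a predicted acceleration profile, the vibration strength
# rises and falls with the movement, as in graph 903:
profile = [vibration_strength(a) for a in (0.0, 1.0, 2.0, 1.0, 0.0)]
```

The stronger the predicted movement of the wearing part, the stronger the generated vibration.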
 また、振動調整部17は、基準となる1または複数の閾値との比較によって予測された加速度をグループ分けして、グループごとに異なる強度に調整してもよい。 Further, the vibration adjustment unit 17 may group the predicted acceleration by comparing it with one or more reference thresholds and adjust the vibration to a different intensity for each group.
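The threshold-based grouping just described can be sketched as a lookup into discrete intensity levels (the thresholds and levels below are arbitrary illustrative values):

```python
import bisect

# Two reference thresholds split the predicted acceleration into
# three groups, each mapped to its own vibration intensity.
THRESHOLDS = (0.5, 1.5)
LEVELS = (0.2, 0.6, 1.0)

def intensity_level(a_t):
    """Map a predicted acceleration to a discrete vibration intensity."""
    return LEVELS[bisect.bisect_right(THRESHOLDS, a_t)]
```

This discretized mapping is an alternative to the continuous scaling of equation (1).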
 振動に対する感度はスマートウォッチ等の装着部の動きによって変化すると考えられる。したがって、従来のスマートウォッチ等は、ユーザの動作に関わらず、一定の振動で通知を行っており、通知を受け取るユーザの振動に対する感度の変化に対応しないため、通知の振動が煩わしい場合があると考えられる。 Sensitivity to vibration is considered to change depending on the movement of the body part on which a smartwatch or the like is worn. Conventional smartwatches and the like, however, issue notifications with a constant vibration regardless of the user's movement, and do not adapt to changes in the receiving user's sensitivity to vibration, so the notification vibration can be perceived as annoying.
 本実施例に係るウェアラブル端末10によれば、搭載されるセンサによってユーザのウェアラブル端末10を装着する部位の加速度の情報を取得し、加速度の大きさに応じた強度の振動を発生させることができる。これによって、ユーザの感じる煩わしさをさらに軽減させることができる。 According to the wearable terminal 10 of this example, information on the acceleration of the body part on which the user wears the wearable terminal 10 is acquired by the on-board sensor, and vibration with an intensity corresponding to the magnitude of the acceleration can be generated. This can further reduce the annoyance felt by the user.
 (実施例3)
 以下に図面を参照して、実施例3について説明する。実施例3は、通知のスケジュールを示す情報(スケジュール情報)を取得して、スケジュールに基づいて振動を発生させる点が、実施例1と相違する。よって、以下の実施例3の説明では、実施例1との相違点を中心に説明し、実施例1と同様の機能構成を有するものには、実施例1の説明で用いた符号と同様の符号を付与し、その説明を省略する。
(Example 3)
Example 3 will be described below with reference to the drawings. Example 3 differs from Example 1 in that information indicating a notification schedule (schedule information) is acquired and vibration is generated based on the schedule. Therefore, the following description of Example 3 focuses on the differences from Example 1; components having the same functional configuration as in Example 1 are given the same reference numerals as in the description of Example 1, and their description is omitted.
 図9は、本発明の実施の形態の実施例3に係るウェアラブル端末の機能構成の一例を示す図である。本実施例に係る通知システムは、ウェアラブル端末10(第一の端末)と電子端末20(第二の端末)とを備える。 FIG. 9 is a diagram illustrating an example of the functional configuration of a wearable terminal according to Example 3 of the embodiment of the present invention. The notification system according to this embodiment includes a wearable terminal 10 (first terminal) and an electronic terminal 20 (second terminal).
 電子端末20は、例えば、モバイル端末、サーバ装置等であって、スケジュール情報が格納されている。なお、電子端末20は、スケジュール情報が格納できるものであれば、他でもよい。 The electronic terminal 20 is, for example, a mobile terminal, a server device, etc., and stores schedule information. Note that the electronic terminal 20 may be any other device as long as it can store schedule information.
 電子端末20は、記憶部21と通信部22とを備える。記憶部21は、スケジュール情報を記憶する。スケジュール情報は、例えば、通知の日時、曜日、頻度、回数等を示す情報であってもよい。 The electronic terminal 20 includes a storage section 21 and a communication section 22. The storage unit 21 stores schedule information. The schedule information may be information indicating, for example, the date and time of notification, day of the week, frequency, number of times, etc.
 通信部22は、スケジュール情報を記憶部21から取得して、ウェアラブル端末10に送信する。 The communication unit 22 acquires schedule information from the storage unit 21 and transmits it to the wearable terminal 10.
 本実施例に係るウェアラブル端末10は、実施例1に係るウェアラブル端末10に、通信部18が追加された構成である。通信部18は、電子端末20からスケジュール情報を受信して、記憶部11に格納する。 The wearable terminal 10 according to the present embodiment has a configuration in which a communication unit 18 is added to the wearable terminal 10 according to the first embodiment. The communication unit 18 receives schedule information from the electronic terminal 20 and stores it in the storage unit 11 .
 また、本実施例に係る振動制御部16は、記憶部11に格納されたスケジュール情報に基づいて、スケジュール時間内であれば、実施例1の手法により適切なタイミングに通知の振動を発生させる。 Further, the vibration control unit 16 according to the present embodiment generates notification vibration at an appropriate timing using the method of the first embodiment, as long as it is within the schedule time, based on the schedule information stored in the storage unit 11.
 図10は、本発明の実施の形態の実施例3に係る通知処理の流れの一例を示すフローチャートである。本実施例に係る通知処理の流れは、実施例1に係る通知処理にステップS301およびステップS302が追加されたフローである。 FIG. 10 is a flowchart illustrating an example of the flow of notification processing according to Example 3 of the embodiment of the present invention. The flow of the notification process according to the present embodiment is a flow in which step S301 and step S302 are added to the notification process according to the first embodiment.
 通知処理が開始されると、通信部18は、電子端末20から通知のスケジュール情報を受信する(ステップS301)。スケジュール情報は、記憶部11に格納される。 When the notification process is started, the communication unit 18 receives notification schedule information from the electronic terminal 20 (step S301). Schedule information is stored in the storage unit 11.
 続いて、ウェアラブル端末10は、スケジュール時間内であるか否かを判定する(ステップS302)。ウェアラブル端末10は、スケジュール時間内であると判定すると(ステップS302:YES)、ステップS101の処理に進む。 Next, the wearable terminal 10 determines whether it is within the scheduled time (step S302). When the wearable terminal 10 determines that it is within the scheduled time (step S302: YES), the process proceeds to step S101.
 他方、ウェアラブル端末10は、スケジュール時間内でないと判定すると(ステップS302:NO)、通知処理を終了する。 On the other hand, if the wearable terminal 10 determines that it is not within the scheduled time (step S302: NO), it ends the notification process.
 なお、振動制御部16は、スケジュール時間内に通知できなかったためにスケジュール時間を過ぎた場合には、即座に通知の振動を発生させてもよい。 Incidentally, if the scheduled time has passed because the notification could not be made within the scheduled time, the vibration control unit 16 may immediately generate a notification vibration.
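A sketch of the schedule gate of steps S301-S302, including the immediate overdue notification just mentioned. Everything here (the window representation, the return values) is an illustrative assumption rather than part of the specification:

```python
from datetime import datetime, time

def handle_notification(now, start, end, vibrate):
    """Run the schedule check (step S302): inside the window, proceed to
    the timing logic of steps S101-S105; if the window has already
    passed, vibrate immediately; otherwise keep waiting."""
    if start <= now.time() <= end:
        vibrate()              # steps S101-S105 would run here
        return "scheduled"
    if now.time() > end:       # the window was missed entirely
        vibrate()              # notify immediately
        return "overdue"
    return "waiting"           # window not yet reached

fired = []
state = handle_notification(datetime(2023, 6, 21, 10, 0),
                            time(9, 0), time(11, 0), lambda: fired.append(1))
```

A real implementation would also carry the date and the per-day schedule entries from the schedule information; this sketch keeps only the time-of-day comparison.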
 本実施例によれば、リマインダなどのように予め通知する時間帯が決まっているメッセージを通知する際のウェアラブルの振動のタイミングを適切なタイミングにすることができる。 According to this example, the vibration of the wearable terminal can be issued at an appropriate timing when notifying a message, such as a reminder, whose notification time period is determined in advance.
 (実施例4)
 以下に図面を参照して、実施例4について説明する。近年、ヘルスケアアプリが一般に使用されている。ヘルスケアアプリを搭載する端末は、例えば、ユーザの健康に関するデータ(ある行動を行ったときのデータ等)を収集し、ユーザの健康を管理することができる。実施例4では、ウェアラブル端末10あるいは電子端末20において、ヘルスケアアプリが使用されていることを想定する。
(Example 4)
Example 4 will be described below with reference to the drawings. Healthcare apps have become commonly used in recent years. A terminal equipped with a healthcare application can, for example, collect data related to a user's health (such as data when performing a certain action) and manage the user's health. In the fourth embodiment, it is assumed that a health care application is used on the wearable terminal 10 or the electronic terminal 20.
 実施例4においても実施例3と同様に、ウェアラブル端末10(例えばスマートウォッチ)と電子端末20(例えばスマートフォン)が使用される。 In the fourth embodiment, as in the third embodiment, a wearable terminal 10 (for example, a smart watch) and an electronic terminal 20 (for example, a smartphone) are used.
 実施例4では、ヘルスケアアプリの運用に必要なランニング、及びアンケート回答などの行動(以降、推奨行動と呼ぶ)をユーザに実施してもらうために、ウェアラブル端末10が、電子端末20あるいはウェアラブル端末10により取得できるユーザ状態の情報(例:行動認識の情報、時刻、GPSなど)に基づいて、各推奨行動の実施可能性を推定する。 In Example 4, in order to have the user perform actions necessary for the operation of the healthcare app, such as running and answering questionnaires (hereinafter referred to as recommended actions), the wearable terminal 10 estimates the feasibility of each recommended action based on user state information (e.g., activity recognition information, time, GPS, etc.) that can be obtained by the electronic terminal 20 or the wearable terminal 10.
 そして、ウェアラブル端末10は、各推奨行動の実施可能性に基づき、適切な推奨行動を選択し、実施例3の技術を活用することで、適切なタイミングと内容のリマインダを実施する。 Then, the wearable terminal 10 selects an appropriate recommended action based on the feasibility of implementing each recommended action, and uses the technology of the third embodiment to provide a reminder at an appropriate timing and content.
 実施例4では、各推奨行動の実施可能性を推定するために、ニューラルネットワーク(NN)のモデル(実施可能性推定モデル)を使用する。ここでは、当該モデルの出力層の活性化関数としてソフトマックス関数を使用しており、ソフトマックス関数の出力として、各推奨行動の実施可能性(確率と呼んでもよい)が出力される。 In Example 4, a neural network (NN) model (feasibility estimation model) is used to estimate the feasibility of each recommended action. Here, a softmax function is used as the activation function of the output layer of the model, and the feasibility (also called probability) of each recommended action is output as an output of the softmax function.
 以下の実施例4の説明では、実施例1及び実施例3との相違点を中心に説明し、実施例1及び実施例3と同様の機能構成を有するものには、実施例1及び実施例3の説明で用いた符号と同様の符号を付与し、その説明を省略する。 The following description of Example 4 focuses on the differences from Examples 1 and 3; components having the same functional configuration as in Examples 1 and 3 are given the same reference numerals as in the descriptions of Examples 1 and 3, and their description is omitted.
 <ハードウェア構成について>
 ウェアラブル端末10のハードウェア構成は、実施例1で説明したとおりである。電子端末20は、スマートフォンでもよいし、タブレットでもよいし、PC(パーソナルコンピュータ)でもよいし、これら以外の装置でもよい。実施例4における電子端末20のハードウェア構成例を図11に示す。図11は、PC等のコンピュータを想定した図であるが、スマートフォンやタブレットでも、基本的な構成は図11に示すものと同じであり、コンピュータの構成を備える。なお、後述する学習装置100についても、ハードウェア構成として、図11に示すコンピュータの構成を有する。
<About hardware configuration>
The hardware configuration of the wearable terminal 10 is as described in the first embodiment. The electronic terminal 20 may be a smartphone, a tablet, a PC (personal computer), or other devices. FIG. 11 shows an example of the hardware configuration of the electronic terminal 20 in the fourth embodiment. Although FIG. 11 is a diagram assuming a computer such as a PC, a smartphone or a tablet has the same basic configuration as shown in FIG. 11 and has the configuration of a computer. Note that the learning device 100, which will be described later, also has a computer configuration shown in FIG. 11 as a hardware configuration.
 電子端末20は、コンピュータに内蔵されるCPUやメモリ等のハードウェア資源を用いて、電子端末20で実施される処理に対応するプログラムを実行することによって実現することが可能である。上記プログラムは、コンピュータが読み取り可能な記録媒体(可搬メモリ等)に記録して、保存したり、配布したりすることが可能である。また、上記プログラムをインターネットや電子メール等、ネットワークを通して提供することも可能である。 The electronic terminal 20 can be realized by using hardware resources such as a CPU and memory built into a computer to execute a program corresponding to the processing performed by the electronic terminal 20. The above program can be recorded on a computer-readable recording medium (such as a portable memory) and can be stored or distributed. It is also possible to provide the above program through a network such as the Internet or e-mail.
 図11に示すコンピュータは、それぞれバスBSで相互に接続されているドライブ装置1000、補助記憶装置1002、メモリ装置1003、CPU1004、インタフェース装置1005、表示装置1006、入力装置1007、出力装置1008、センサ1009等を有する。 The computer shown in FIG. 11 includes a drive device 1000, an auxiliary storage device 1002, a memory device 1003, a CPU 1004, an interface device 1005, a display device 1006, an input device 1007, an output device 1008, a sensor 1009, and the like, which are interconnected by a bus BS.
 当該コンピュータでの処理を実現するプログラムは、例えば、メモリカード等の記録媒体1001によって提供される。プログラムを記憶した記録媒体1001がドライブ装置1000にセットされると、プログラムが記録媒体1001からドライブ装置1000を介して補助記憶装置1002にインストールされる。但し、プログラムのインストールは必ずしも記録媒体1001より行う必要はなく、ネットワークを介して他のコンピュータよりダウンロードするようにしてもよい。補助記憶装置1002は、インストールされたプログラムを格納すると共に、必要なファイルやデータ等を格納する。 A program that realizes processing on the computer is provided, for example, by a recording medium 1001 such as a memory card. When the recording medium 1001 storing the program is set in the drive device 1000, the program is installed from the recording medium 1001 to the auxiliary storage device 1002 via the drive device 1000. However, the program does not necessarily need to be installed from the recording medium 1001, and may be downloaded from another computer via a network. The auxiliary storage device 1002 stores installed programs as well as necessary files, data, and the like.
 メモリ装置1003は、プログラムの起動指示があった場合に、補助記憶装置1002からプログラムを読み出して格納する。CPU1004は、メモリ装置1003に格納されたプログラムに従って、電子端末20に係る機能を実現する。インタフェース装置1005は、ネットワーク等に接続するためのインタフェースとして用いられる。表示装置1006はプログラムによるGUI(Graphical User Interface)等を表示する。入力装置1007はキーボード及びマウス、ボタン、又はタッチパネル等で構成され、様々な操作指示を入力させるために用いられる。出力装置1008は演算結果を出力する。 The memory device 1003 reads and stores the program from the auxiliary storage device 1002 when there is an instruction to start the program. The CPU 1004 implements functions related to the electronic terminal 20 according to programs stored in the memory device 1003. The interface device 1005 is used as an interface for connecting to a network or the like. A display device 1006 displays a GUI (Graphical User Interface) and the like based on a program. The input device 1007 is composed of a keyboard, a mouse, buttons, a touch panel, or the like, and is used to input various operation instructions. An output device 1008 outputs the calculation result.
 センサ1009は、例えば、加速度を測定する加速度センサ、接触を検知するタッチセンサ、画像を撮影するカメラセンサ、及び位置センサ(GPS装置)等のうちのいずれか1つ又はいずれか複数又は全部を含んでもよい。 The sensor 1009 may include, for example, any one, any several, or all of an acceleration sensor that measures acceleration, a touch sensor that detects contact, a camera sensor that captures images, a position sensor (GPS device), and the like.
 <機能構成例>
 図12は、実施例4における、ウェアラブル端末10と電子端末20の機能構成例を示す図である。以降の説明では、電子端末20は、スマートフォン等のユーザが持ち運んで使用する端末であることを想定する。
<Functional configuration example>
FIG. 12 is a diagram showing an example of the functional configuration of the wearable terminal 10 and the electronic terminal 20 in the fourth embodiment. In the following description, it is assumed that the electronic terminal 20 is a terminal such as a smartphone that is carried and used by a user.
 実施例4における電子端末20は、実施例3に係る電子端末20に、検知情報取得部23と推奨行動実施認識部24が追加された構成を有する。 The electronic terminal 20 according to the fourth embodiment has a configuration in which a detection information acquisition section 23 and a recommended action implementation recognition section 24 are added to the electronic terminal 20 according to the third embodiment.
 検知情報取得部23は、センサ1008によってユーザの動作を検出した結果を示す検知情報を取得する。また、検知情報取得部23は、センサ1008(例えばGPS装置)によって取得したユーザの位置情報を検知情報として取得することもできるし、センサ1008(例えばカメラ)によって取得したユーザの姿勢等の情報を検知情報として取得することもできるし、これら以外の情報を取得することもできる。 The detection information acquisition unit 23 acquires detection information indicating the result of detecting the user's motion with the sensor 1008. The detection information acquisition unit 23 can also acquire, as detection information, the user's position information obtained by the sensor 1008 (for example, a GPS device), information such as the user's posture obtained by the sensor 1008 (for example, a camera), or information other than these.
 推奨行動実施認識部24は、検知情報取得部23により取得された情報に基づいて、ユーザが推奨行動を実施したかどうかを確認する。 The recommended behavior implementation recognition unit 24 checks whether the user has implemented the recommended behavior based on the information acquired by the detection information acquisition unit 23.
 実施例4におけるウェアラブル端末10は、実施例3のウェアラブル端末10に、推奨行動実施認識部31、ユーザ状態推定部32、実施可能性推定部33、推奨行動選択部34が追加された構成である。なお、推奨行動選択部34を行動選択部34と呼んでもよい。また、推奨行動実施認識部31を行動実施認識部31と呼んでもよい。 The wearable terminal 10 in Example 4 has a configuration in which a recommended action implementation recognition unit 31, a user state estimation unit 32, a feasibility estimation unit 33, and a recommended action selection unit 34 are added to the wearable terminal 10 of Example 3. Note that the recommended action selection unit 34 may also be called the action selection unit 34, and the recommended action implementation recognition unit 31 may also be called the action implementation recognition unit 31.
 推奨行動実施認識部31は、検知情報取得部12により取得された情報に基づいて、ユーザが推奨行動を実施したかどうかを確認する。 The recommended behavior implementation recognition unit 31 checks whether the user has implemented the recommended behavior based on the information acquired by the detection information acquisition unit 12.
 本実施例で想定しているヘルスケアアプリにおける推奨行動として、事前にいくつかの行動が定められているとする。また、当該ヘルスケアアプリにおける推奨行動の多くは、行動認識API等により、ウェアラブル端末10あるいは電子端末20の検知情報(センサにより得られた情報)を用いることで、その有無(例:歩行の有無、ランニングの有無等)を確認することができる。また、ウェアラブル端末10あるいは電子端末20で使用されているヘルスケアアプリ内の操作ログあるいは回答によって、アンケート回答、通知確認、及び服薬などの行動の実施の有無を確認することも可能である。 It is assumed that several actions are defined in advance as the recommended actions in the healthcare app assumed in this example. For many of these recommended actions, whether they were performed (e.g., whether the user walked, whether the user ran, etc.) can be confirmed by an activity recognition API or the like using the detection information (information obtained by the sensors) of the wearable terminal 10 or the electronic terminal 20. It is also possible to confirm whether actions such as answering a questionnaire, confirming a notification, or taking medication were performed, based on operation logs or responses within the healthcare app used on the wearable terminal 10 or the electronic terminal 20.
 推奨行動実施認識部24と推奨行動実施認識部31はいずれも、例えば上記の行動認識API等を使用することで、ユーザによる推奨行動の実施を検知することができる。 Both the recommended action implementation recognition unit 24 and the recommended action implementation recognition unit 31 can detect implementation of the recommended action by the user, for example, by using the above-mentioned action recognition API.
 ユーザ状態推定部32は、検知情報取得部12により取得された情報を基にユーザ状態を推定する。ユーザ状態推定部32は、検知情報取得部12により取得された情報をそのままユーザ状態の情報とする場合もある。 The user state estimation unit 32 estimates the user state based on the information acquired by the detection information acquisition unit 12. The user state estimation unit 32 may use the information acquired by the detection information acquisition unit 12 as the user state information as it is.
 実施可能性推定部33は、各推奨行動の実施可能性を推定するNNの重みデータを格納する記憶部を含む。実施可能性推定部33は、重みデータ(重みパラメータ)を記憶部から読み出すことで、当該重みデータがセットされたNNを動作させることができる。 The feasibility estimating unit 33 includes a storage unit that stores weight data of the NN that estimates the feasibility of each recommended action. The feasibility estimation unit 33 can operate the NN in which the weight data is set by reading the weight data (weight parameters) from the storage unit.
 当該NNには、ユーザ状態推定部32により得られたユーザ状態が入力データとして入力され、当該NNは、各推奨行動の実施可能性を出力する。 The user state obtained by the user state estimating unit 32 is input to the NN as input data, and the NN outputs the feasibility of each recommended action.
 前述したように、NNの出力層において、多値分類などでよく用いられるソフトマックス関数が使用される。当該ソフトマックス関数は、各推奨行動のカテゴリの実施可能性を0~1で出力し、すべてのカテゴリの実施可能性の合計が1になるように出力する。出力の例は次のとおりである:アンケート:0.4、ランニング:0.3、服薬:0.3。上記NNを実施可能性推定モデルと呼ぶことにする。 As described above, a softmax function, which is often used for multi-class classification, is used in the output layer of the NN. The softmax function outputs the feasibility of each recommended-action category as a value between 0 and 1 such that the feasibilities of all categories sum to 1. An example output is: questionnaire: 0.4, running: 0.3, medication: 0.3. The above NN is referred to as the feasibility estimation model.
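A softmax over three recommended-action categories can be sketched as follows; the raw scores and dictionary representation are made-up illustrations, not part of the specification:

```python
import math

def softmax(logits):
    """Map raw output-layer scores to feasibilities in (0, 1)
    that sum to 1 across all recommended-action categories."""
    m = max(logits.values())  # subtract the max for numerical stability
    exps = {k: math.exp(v - m) for k, v in logits.items()}
    total = sum(exps.values())
    return {k: e / total for k, e in exps.items()}

# Hypothetical raw scores from the feasibility estimation model:
feasibility = softmax({"questionnaire": 1.2, "running": 0.9, "medication": 0.9})
best = max(feasibility, key=feasibility.get)  # action a selector would pick
```

With these made-up scores, the feasibilities sum to 1 and the questionnaire category receives the largest share, so a selector such as the recommended action selection unit 34 would pick it.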
 実施例4では実施可能性推定モデルの学習を行う学習フェーズと、学習済みの実施可能性推定モデルを使用して推論を行う推論フェーズがある。それぞれの動作については後述する。 In the fourth embodiment, there is a learning phase in which a feasibility estimation model is learned, and an inference phase in which inference is made using the learned feasibility estimation model. Each operation will be described later.
 推奨行動選択部34は、実施可能性推定部33(実施可能性推定モデル)から出力された実施可能性を持つ複数の推奨行動の中から1つの推奨行動を選択する。具体的な動作については後述する。 The recommended action selection unit 34 selects one recommended action from among the plurality of recommended actions whose feasibilities are output by the feasibility estimation unit 33 (feasibility estimation model). The specific operation will be described later.
 <学習フェーズ>
 次に、学習フェーズにおけるシステム(電子端末20とウェアラブル端末10)の動作例を説明する。ここでは、実施可能性推定部33が、実施可能性推定モデルの学習(重みデータの作成)を行う機能を含むものとして説明する。実施可能性推定モデルの学習時における実施可能性推定部33を「学習部」と呼んでもよい。
<Learning phase>
Next, an example of the operation of the system (electronic terminal 20 and wearable terminal 10) in the learning phase will be described. Here, the implementation feasibility estimation unit 33 will be described as including a function of learning the implementation feasibility estimation model (creating weight data). The feasibility estimating unit 33 during learning of the feasibility estimation model may be referred to as a “learning unit”.
 なお、実施可能性推定モデルの学習(重みデータの作成)について、電子端末20及びウェアラブル端末10とは別の装置(学習装置と呼ぶ)で行うこととしてもよい。 Note that the learning of the feasibility estimation model (creation of weight data) may be performed by a device (referred to as a learning device) separate from the electronic terminal 20 and the wearable terminal 10.
 学習フェーズでは、ウェアラブル端末10における「ユーザ状態推定部32により得られた情報、及び、推奨行動実施認識部31により得られた情報」、及び、電子端末20における推奨行動実施認識部24により得られた情報を用いて、ウェアラブル端末10の実施可能性推定部33(学習部)が、実施可能性推定モデルの学習を行う。つまり、実施可能性推定部33(学習部)は、ユーザ状態推定部32により得られた情報を入力した実施可能性推定モデルが、正解の推奨行動(推奨行動実施認識部31/推奨行動実施認識部24により得られた行動)を出力するように(つまり、正解の推奨行動の実施可能性が1になるように)、実施可能性推定モデルの重みデータ(パラメータ)を調整する。 In the learning phase, the feasibility estimation unit 33 (learning unit) of the wearable terminal 10 trains the feasibility estimation model using the information obtained by the user state estimation unit 32 and by the recommended action implementation recognition unit 31 in the wearable terminal 10, and the information obtained by the recommended action implementation recognition unit 24 in the electronic terminal 20. That is, the feasibility estimation unit 33 (learning unit) adjusts the weight data (parameters) of the feasibility estimation model so that, given the information obtained by the user state estimation unit 32 as input, the model outputs the correct recommended action (the action obtained by the recommended action implementation recognition unit 31/recommended action implementation recognition unit 24), in other words, so that the feasibility of the correct recommended action becomes 1.
Note that training may also be performed using the information obtained by only one of the recommended action implementation recognition unit 31 and the recommended action implementation recognition unit 24.
The operations of the electronic terminal 20 and the wearable terminal 10 in the learning phase will be described with reference to the flowchart in FIG. 13. Although S401 to S403 are described before S501 to S503 below, this ordering is only for convenience of explanation; S401 to S403 and S501 to S503 are basically performed in parallel.
In S401, the recommended action implementation recognition unit 24 acquires detection information from the detection information acquisition unit 23. In S402, the recommended action implementation recognition unit 24 detects implementation of a recommended action. In S403, the content of the detected recommended action is transmitted from the communication unit 22 to the communication unit 18.
In S501, the recommended action implementation recognition unit 31 and the user state estimation unit 32 each acquire detection information from the detection information acquisition unit 12. In S502, the recommended action implementation recognition unit 31 detects implementation of a recommended action based on the detection information. In S503, the user state estimation unit 32 acquires data indicating the user state based on the detection information.
In S504, the feasibility estimation unit 33 (learning unit) creates weight data for the feasibility estimation model under training so that, given user state data as input, the model can estimate which recommended action class, among the predefined recommended actions, was implemented. Here, the weight data is created by using the recommended action recognized by the recommended action implementation recognition unit 24 or the recommended action implementation recognition unit 31 as the correct answer. The created weight data (trained model) is stored in a storage unit included in the feasibility estimation unit 33.
The learning operation will be explained more concretely using an example.
When the recommended action implementation recognition unit 31 of the wearable terminal 10 or the recommended action implementation recognition unit 24 of the electronic terminal 20 detects that the user has implemented a recommended action, the user state at the start timing of the detected implementation is recorded at that timing, for example in the storage unit of the feasibility estimation unit 33.
Then, the feasibility estimation unit 33 trains the feasibility estimation model using the recommended action and the user state recorded at the timing at which that recommended action was detected.
An example will be explained using FIG. 14. In the example of FIG. 14, it is assumed that the user carries a smartphone as the electronic terminal 20 and wears a smartwatch as the wearable terminal 10.
Here, the recommended action implementation recognition unit 24 or the recommended action implementation recognition unit 31 detects the implementation of "questionnaire input" as the user's action (recommended action).
At the start timing of the "questionnaire input", the user state estimation unit 32 acquires data indicating the user state, for example: behavior "standing still", location "outdoors", schedule "no plans", electronic terminal 20 (smartphone) state "Active", and heart rate "82", and records the data in the storage unit of the feasibility estimation unit 33.
Note that the user state estimation unit 32 constantly acquires such data and records the acquired user state data in the storage unit at the timing when a recommended action is detected. The detected recommended action is recorded in the storage unit at the same time.
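As an illustration of how such a recorded user state might be encoded as model input, the sketch below one-hot encodes the categorical fields and normalizes the heart rate. The field names, category vocabularies, and normalization constant are hypothetical choices for illustration; the embodiment does not prescribe a specific encoding.

```python
# Hypothetical encoding of a recorded user state into a feature vector.
# Field names and category vocabularies are illustrative assumptions,
# not part of the disclosed embodiment.

CATEGORIES = {
    "behavior": ["standing still", "walking", "running", "sitting"],
    "location": ["outdoors", "indoors"],
    "schedule": ["no plans", "busy"],
    "device_state": ["Active", "Idle", "Locked"],
}
MAX_HEART_RATE = 200.0  # assumed normalization constant


def encode_user_state(state):
    """One-hot encode the categorical fields and append the normalized heart rate."""
    features = []
    for field, vocab in CATEGORIES.items():
        features.extend(1.0 if state[field] == value else 0.0 for value in vocab)
    features.append(state["heart_rate"] / MAX_HEART_RATE)
    return features


# The user state recorded at the start of the "questionnaire input" example:
example_state = {
    "behavior": "standing still",
    "location": "outdoors",
    "schedule": "no plans",
    "device_state": "Active",
    "heart_rate": 82,
}
vector = encode_user_state(example_state)
```

A vector of this form would then be what the feasibility estimation model receives as its input at each detection timing.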
The feasibility estimation unit 33 (learning unit) takes the recorded user state at the start timing of the recommended action as input to the feasibility estimation model, sets the feasibility of the category of the recommended action the user implemented to 1 and the feasibility of all other categories to 0, and learns the weight data of the feasibility estimation model. In the example of FIG. 14, the weight data of the feasibility estimation model is learned so that the feasibility of the category "questionnaire" becomes 1. Learning the weight data itself is an existing technique and can be performed using, for example, error backpropagation.
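The training step described above — user state in, one-hot recommended-action category as the correct answer, weights adjusted by error backpropagation — can be sketched for a minimal single-layer softmax classifier as follows. The category list, feature size, learning rate, and random stand-in feature vector are illustrative assumptions; the embodiment only requires that some category-predicting model be trained in this manner.

```python
import math
import random

CATEGORIES = ["questionnaire", "running", "medication"]  # assumed category set
FEATURES = 12  # assumed size of the encoded user-state vector

random.seed(0)
# Single-layer weights: one row of FEATURES weights plus a bias per category.
W = [[random.gauss(0.0, 0.1) for _ in range(FEATURES)] for _ in CATEGORIES]
b = [0.0] * len(CATEGORIES)


def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [v / total for v in exps]


def predict(x):
    """Feasibility (probability) of each category for user state x."""
    logits = [sum(w * xi for w, xi in zip(W[c], x)) + b[c]
              for c in range(len(CATEGORIES))]
    return softmax(logits)


def train_step(x, correct_idx, lr=0.1):
    """One error-backpropagation step: push the correct category's feasibility toward 1."""
    p = predict(x)
    for c in range(len(CATEGORIES)):
        target = 1.0 if c == correct_idx else 0.0  # one-hot correct answer
        grad = p[c] - target                       # d(cross-entropy)/d(logit_c)
        for i in range(FEATURES):
            W[c][i] -= lr * grad * x[i]
        b[c] -= lr * grad


# Stand-in for the user state recorded when "questionnaire input" was detected.
x = [random.gauss(0.0, 1.0) for _ in range(FEATURES)]
before = predict(x)[CATEGORIES.index("questionnaire")]
for _ in range(200):
    train_step(x, CATEGORIES.index("questionnaire"))
after = predict(x)[CATEGORIES.index("questionnaire")]
```

After repeated steps the model's output for the implemented category approaches 1, which is exactly the target the text describes.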
<Other configuration examples in the learning phase>
In the above example, the feasibility estimation model is trained on the wearable terminal 10, but the present invention is not limited to such a form.
The training of the feasibility estimation model may be performed by a device other than the wearable terminal 10 (referred to as the learning device 100).
FIG. 15 shows a configuration example of the learning device 100. As shown in FIG. 15, the learning device 100 includes an input unit 110, a learning unit 120, an output unit 130, and a storage unit 140.
The recommended action detected by the electronic terminal 20 or the wearable terminal 10, together with the user state data at the start timing of the implementation of that recommended action (data acquired by the wearable terminal 10), is input to the input unit 110. The input recommended action and user state data are stored in the storage unit 140.
The storage unit 140 stores the weight data of the feasibility estimation model to be trained. The learning unit 120 takes the user state at the start timing of the recommended action as input to the feasibility estimation model, sets the feasibility of the category of the recommended action the user implemented to 1 and the feasibility of all other categories to 0, and learns the weight data of the feasibility estimation model. Learning is performed using, for example, error backpropagation.
The trained weight data is output from the output unit 130 and input to the feasibility estimation unit 33 of the wearable terminal 10.
<Inference phase>
In the inference phase, the electronic terminal 20 and the wearable terminal 10 need not include the recommended action implementation recognition unit 24 or the recommended action implementation recognition unit 31. In the inference phase, the feasibility estimation unit 33 is assumed to hold the trained feasibility estimation model (specifically, the weight data). The feasibility estimation unit 33 itself may be regarded as the trained feasibility estimation model.
In the inference phase, the feasibility estimation unit 33 inputs the user state information obtained by the user state estimation unit 32 into the feasibility estimation model. The feasibility estimation model (trained NN) outputs the feasibility of each predefined recommended action. The feasibility estimation unit 33 then passes the feasibility of each recommended action output by the model to the recommended action selection unit 34.
The recommended action selection unit 34 acquires the scheduled notification time information from the recommended action reminder data obtained from the storage unit 11, and acquires information on the recommended actions whose scheduled time includes the current time. From among the one or more recommended actions output by the feasibility estimation unit 33, the recommended action selection unit 34 selects the recommended action with the highest feasibility within the scheduled time.
For example, if the recommended actions whose scheduled time includes the current time are (taking medication, questionnaire) and the feasibility of each recommended action category is (questionnaire: 0.4, running: 0.5, taking medication: 0.1), the recommended action selection unit 34 selects the questionnaire.
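The selection just described — restrict to actions whose schedule window covers the current time, then take the one with the highest feasibility — can be sketched as follows. The feasibility values mirror the example in the text, while the schedule windows and data structures are illustrative assumptions.

```python
from datetime import datetime, time
from typing import Optional

# Feasibility output by the model for each predefined category (example values above).
feasibility = {"questionnaire": 0.4, "running": 0.5, "medication": 0.1}

# Hypothetical reminder schedule: action -> (start, end) of its notification window.
schedule = {
    "medication": (time(8, 0), time(22, 0)),
    "questionnaire": (time(9, 0), time(21, 0)),
}


def select_action(now: datetime) -> Optional[str]:
    """Pick the scheduled action with the highest feasibility at the current time."""
    candidates = [action for action, (start, end) in schedule.items()
                  if start <= now.time() <= end]
    if not candidates:
        return None  # current time is outside every schedule window
    return max(candidates, key=lambda action: feasibility[action])


chosen = select_action(datetime(2023, 6, 16, 12, 0))
```

Note that "running" has the highest raw feasibility but is not scheduled, so the questionnaire (0.4) is selected over taking medication (0.1), matching the example.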
Thereafter, in the same manner as in Embodiment 3 (and Embodiment 1), the vibration control unit 16 transmits the notification vibration to the user at a notification timing, detected by the operation start confirmation unit 13 and the operation end estimation unit 15, that reduces annoyance. The method of notifying at a timing that reduces annoyance is the same as the method described in Embodiment 1.
In the notification in Embodiment 4, the notification information may include information on the selected recommended action (for example, vibrating at a frequency corresponding to the questionnaire). In the notification in Embodiment 4, the selected recommended action (for example, a questionnaire) may also be displayed as a message on the display of the wearable terminal 10 or the electronic terminal 20 at the same time as the notification (vibration).
With reference to the flowchart in FIG. 16, an example of a processing procedure for selecting a recommended action and notifying the user at a non-bothersome timing will be described.
In S601, the electronic terminal 20 receives notification schedule information from, for example, a server. The schedule information is stored in the storage unit 21. In S602, the communication unit 22 transmits the notification schedule information to the wearable terminal 10, and in S603, the communication unit 18 receives the notification schedule information. The schedule information is stored in the storage unit 11.
In S604 (S610), the wearable terminal 10 determines whether the current time is within the scheduled time. If the wearable terminal 10 determines that the current time is within the scheduled time, it executes S605 to S609 and S611. If the wearable terminal 10 determines that the current time is not within the scheduled time, it ends the notification process.
In S605, the user state estimation unit 32 acquires user state data. In S606, the feasibility estimation unit 33 (feasibility estimation model) outputs the feasibility of each recommended action based on the user state, and the recommended action selection unit 34 selects, from among the one or more recommended actions output by the feasibility estimation unit 33, the recommended action with the highest feasibility within the scheduled time.
Thereafter, through the processing described in Embodiment 1 (S607 to S609, S611), the notification (vibration) is performed at a non-bothersome timing, namely while the user is performing a motion. S607 to S609 and S611 correspond to S101 to S105 in FIG. 3 of Embodiment 1.
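The overall control flow of S604 to S611 — check the schedule window, then deliver the vibration only while the user is in motion — can be sketched with placeholder predicates standing in for the embodiment's detectors (the predicates and their signatures are assumptions for illustration):

```python
from datetime import datetime, time


def notify_if_appropriate(now, within_schedule, user_is_moving, vibrate):
    """Sketch of S604-S611: vibrate only inside the schedule window,
    and only while the user is performing a motion (non-bothersome timing)."""
    if not within_schedule(now):   # S604/S610: outside the window -> end the process
        return False
    if not user_is_moving():       # wait for a motion before vibrating
        return False
    vibrate()                      # deliver the notification vibration
    return True


# Illustrative usage with stand-in predicates:
log = []
delivered = notify_if_appropriate(
    datetime(2023, 6, 16, 12, 0),
    within_schedule=lambda t: time(9, 0) <= t.time() <= time(21, 0),
    user_is_moving=lambda: True,
    vibrate=lambda: log.append("vibrate"),
)
```

In the actual embodiment, the motion check corresponds to the operation start confirmation unit 13 and the operation end estimation unit 15 rather than a simple boolean predicate.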
According to this embodiment, the user can be notified (reminded) at a timing when there is a high possibility that the recommended action can be implemented, so that the user can be prompted to perform the actions necessary for operating the healthcare app without feeling bothered.
Note that although Embodiment 4 describes an example in which a NN is used as the feasibility estimation model, the feasibility estimation model is not limited to a NN. Any model capable of predicting categories may be used.
Furthermore, when a NN is used, the output function is not limited to the softmax function. Any method capable of outputting the implementation probability (feasibility) of each category may be used.
Furthermore, although Embodiment 4 assumes the use of a healthcare app, the technology of Embodiment 4 is applicable to various uses regardless of whether a healthcare app is used.
(Summary of embodiments)
This specification describes at least the terminal, notification system, notification method, and program set forth in the following sections.
(Section 1)
A terminal comprising:
a detection information acquisition unit that acquires information obtained by detecting a motion of a user; and
a vibration control unit that, based on the information obtained by detecting the motion of the user, generates vibration for notifying the user such that the vibration occurs while the user is performing the motion.
(Section 2)
The terminal according to Section 1, further comprising:
an operation start confirmation unit that determines, based on the information obtained by detecting the motion of the user, whether the user has started a motion;
an acceleration prediction unit that predicts, based on the information obtained by detecting the motion of the user, a change in acceleration of the motion of the user; and
a motion end estimation unit that estimates an end time of the motion of the user based on the predicted change in acceleration of the motion of the user,
wherein the vibration control unit generates the vibration for notifying the user when it determines, based on a predefined vibration duration, that the vibration will end before the estimated end time of the motion of the user.
(Section 3)
The terminal according to Section 2, further comprising a vibration adjustment unit that adjusts the intensity of the generated vibration according to the predicted change in acceleration of the motion of the user.
(Section 4)
The terminal according to Section 1, further comprising:
a user state estimation unit that estimates a state of the user;
a feasibility estimation unit that estimates the feasibility of one or more predetermined actions based on the state; and
an action selection unit that selects one action from the one or more actions based on the feasibility.
(Section 5)
The terminal according to Section 4, further comprising an action implementation recognition unit that detects implementation of an action by the user,
wherein the feasibility estimation model used in the feasibility estimation unit is a model trained using the action whose implementation was detected by the action implementation recognition unit and the state of the user obtained by the user state estimation unit at the timing when the implementation of the action was detected.
(Section 6)
A notification system comprising a first terminal and a second terminal, wherein
the first terminal comprises:
a communication unit that receives, from the second terminal, information indicating a schedule of notifications to a user;
a detection information acquisition unit that acquires information obtained by detecting a motion of the user; and
a vibration control unit that, at a scheduled notification timing and based on the information obtained by detecting the motion of the user, generates vibration for notifying the user such that the vibration occurs while the user is performing the motion, and
the second terminal comprises a communication unit that transmits, to the first terminal, the information indicating the schedule of notifications to the user.
(Section 7)
A notification method executed by a terminal, comprising:
a step of acquiring information obtained by detecting a motion of a user; and
a step of generating, based on the information obtained by detecting the motion of the user, vibration for notifying the user such that the vibration occurs while the user is performing the motion.
(Section 8)
A program for causing a computer to function as each unit of the terminal according to any one of Sections 1 to 3.
Although the present embodiment has been described above, the present invention is not limited to this specific embodiment, and various modifications and changes are possible within the scope of the gist of the present invention as described in the claims.
This patent application claims priority based on International Patent Application PCT/JP2022/024966 filed on June 22, 2022, the entire contents of which are incorporated herein by reference.
10 Wearable terminal
11 Storage unit
12 Detection information acquisition unit
13 Operation start confirmation unit
14 Acceleration prediction unit
15 Operation end estimation unit
16 Vibration control unit
17 Vibration adjustment unit
18 Communication unit
20 Electronic terminal
21 Storage unit
22 Communication unit
23 Detection information acquisition unit
24 Recommended action implementation recognition unit
31 Recommended action implementation recognition unit
32 User state estimation unit
33 Feasibility estimation unit
34 Recommended action selection unit
100 Learning device
110 Input unit
120 Learning unit
130 Output unit
140 Storage unit
101 CPU
102 Memory
103 Communication device
104 Sensor
105 Vibration device
1000 Drive device
1001 Recording medium
1002 Auxiliary storage device
1003 Memory device
1004 CPU
1005 Interface device
1006 Display device
1007 Input device
1008 Output device
1009 Sensor

Claims (8)

1. A terminal comprising:
   a detection information acquisition unit that acquires information obtained by detecting a motion of a user; and
   a vibration control unit that, based on the information obtained by detecting the motion of the user, generates vibration for notifying the user such that the vibration occurs while the user is performing the motion.

2. The terminal according to claim 1, further comprising:
   an operation start confirmation unit that determines, based on the information obtained by detecting the motion of the user, whether the user has started a motion;
   an acceleration prediction unit that predicts, based on the information obtained by detecting the motion of the user, a change in acceleration of the motion of the user; and
   a motion end estimation unit that estimates an end time of the motion of the user based on the predicted change in acceleration of the motion of the user,
   wherein the vibration control unit generates the vibration for notifying the user when it determines, based on a predefined vibration duration, that the vibration will end before the estimated end time of the motion of the user.

3. The terminal according to claim 2, further comprising a vibration adjustment unit that adjusts the intensity of the generated vibration according to the predicted change in acceleration of the motion of the user.

4. The terminal according to claim 1, further comprising:
   a user state estimation unit that estimates a state of the user;
   a feasibility estimation unit that estimates the feasibility of one or more predetermined actions based on the state; and
   an action selection unit that selects one action from the one or more actions based on the feasibility.

5. The terminal according to claim 4, further comprising an action implementation recognition unit that detects implementation of an action by the user,
   wherein the feasibility estimation model used in the feasibility estimation unit is a model trained using the action whose implementation was detected by the action implementation recognition unit and the state of the user obtained by the user state estimation unit at the timing when the implementation of the action was detected.

6. A notification system comprising a first terminal and a second terminal, wherein
   the first terminal comprises:
   a communication unit that receives, from the second terminal, information indicating a schedule of notifications to a user;
   a detection information acquisition unit that acquires information obtained by detecting a motion of the user; and
   a vibration control unit that, at a scheduled notification timing and based on the information obtained by detecting the motion of the user, generates vibration for notifying the user such that the vibration occurs while the user is performing the motion, and
   the second terminal comprises a communication unit that transmits, to the first terminal, the information indicating the schedule of notifications to the user.

7. A notification method executed by a terminal, comprising:
   a step of acquiring information obtained by detecting a motion of a user; and
   a step of generating, based on the information obtained by detecting the motion of the user, vibration for notifying the user such that the vibration occurs while the user is performing the motion.

8. A program for causing a computer to function as each unit of the terminal according to any one of claims 1 to 5.
PCT/JP2023/022353 2022-06-22 2023-06-16 Terminal, notification system, notification method, and program WO2023248937A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPPCT/JP2022/024966 2022-06-22
PCT/JP2022/024966 WO2023248392A1 (en) 2022-06-22 2022-06-22 Terminal that generates vibration for notification

Publications (1)

Publication Number Publication Date
WO2023248937A1 true WO2023248937A1 (en) 2023-12-28

Family

ID=89379613

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2022/024966 WO2023248392A1 (en) 2022-06-22 2022-06-22 Terminal that generates vibration for notification
PCT/JP2023/022353 WO2023248937A1 (en) 2022-06-22 2023-06-16 Terminal, notification system, notification method, and program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/024966 WO2023248392A1 (en) 2022-06-22 2022-06-22 Terminal that generates vibration for notification

Country Status (1)

Country Link
WO (2) WO2023248392A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002319997A (en) * 2001-04-20 2002-10-31 Sanyo Electric Co Ltd Portable telephone set
WO2012057326A1 (en) * 2010-10-28 2012-05-03 Necカシオモバイルコミュニケーションズ株式会社 Mobile terminal device, notification method, and program
JP2012199891A (en) * 2011-03-23 2012-10-18 Ntt Docomo Inc Portable terminal and vibration function control method
JP2015116508A (en) * 2013-12-16 2015-06-25 富士通株式会社 Housing vibration equipment and electrical equipment comprising vibration equipment
JP2015525026A (en) * 2012-06-07 2015-08-27 アップル インコーポレイテッド Notification quiet hour
JP2016034480A (en) * 2014-07-31 2016-03-17 セイコーエプソン株式会社 Notification device, exercise analysis system, notification method, notification program, exercise support method, and exercise support device
WO2022014274A1 (en) * 2020-07-14 2022-01-20 ソニーグループ株式会社 Notification control device, notification control method, and notification system


Also Published As

Publication number Publication date
WO2023248392A1 (en) 2023-12-28

Similar Documents

Publication Publication Date Title
US20210349618A1 (en) System, Method and User Interface for Supporting Scheduled Mode Changes on Electronic Devices
US10504339B2 (en) Mobile device with instinctive alerts
JP6538825B2 (en) Semantic framework for variable haptic output
US6874127B2 (en) Method and system for controlling presentation of information to a user based on the user&#39;s condition
WO2018071274A1 (en) Extracting an emotional state from device data
US9456308B2 (en) Method and system for creating and refining rules for personalized content delivery based on users physical activities
CN110753514A (en) Sleep monitoring based on implicit acquisition for computer interaction
US10552772B2 (en) Break management system
WO2023248937A1 (en) Terminal, notification system, notification method, and program
JP2018503187A (en) Scheduling interactions with subjects
CN103190136B (en) Mobile terminal device, Notification Method
JP6996379B2 (en) Learning system and programs for learning system
KR20200068057A (en) Method and apparatus for helping self-development using artificial intelligence learning
KR20200068058A (en) Method and apparatus for helping self-development using artificial intelligence learning
US11435829B2 (en) Communication device and method using haptic actuator
US20240079130A1 (en) User interfaces for health tracking
CN113168596A (en) Behavior recommendation method and device, storage medium and electronic equipment
CN115345967A (en) Auxiliary review method and review consolidation system based on intelligent bedding product
JP2020166593A (en) User support device, user support method, and user support program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23827122

Country of ref document: EP

Kind code of ref document: A1