WO2023242986A1 - Action record display device - Google Patents

Action record display device

Info

Publication number
WO2023242986A1
WO2023242986A1 (application PCT/JP2022/023936)
Authority
WO
WIPO (PCT)
Prior art keywords
time
action
control unit
display device
determines
Prior art date
Application number
PCT/JP2022/023936
Other languages
French (fr)
Japanese (ja)
Inventor
菲 史
秀一郎 鬼頭
聡志 清水
丈二 五十棲
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji filed Critical 株式会社Fuji
Priority to PCT/JP2022/023936 priority Critical patent/WO2023242986A1/en
Publication of WO2023242986A1 publication Critical patent/WO2023242986A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This specification discloses an action record display device.
  • Patent Document 1, for example, discloses a graphic element for monitoring a person's activities of daily living (ADL), the element having a circle containing a symbol and a description of the ADL, together with a representation of the timing and duration of each event.
  • In a behavior record display device that displays behavior records, when multiple behavior events occur, if each time axis shows only the information corresponding to a single behavior event, the user must later check multiple time axes to review the records, which makes the behavior record difficult to grasp.
  • The main purpose of the present disclosure is to make a user's behavior record easier to grasp.
  • To achieve this main objective, the present disclosure adopts the following means.
  • The action record display device of the present disclosure is an action record display device that acquires a user's action record and displays it on a display screen, and its gist is to include a display control unit that displays an arc-shaped or circular time axis on the display screen, acquires, from among a plurality of types of icons each associated with one of a plurality of predetermined types of action events, the icon corresponding to each of a plurality of action events extracted from the action record, and controls each acquired icon to be displayed at the position on the arc or the circle corresponding to the time period in which the action event occurred.
  • In the action record display device of the present disclosure, an arc-shaped or circular time axis is displayed on the display screen, and each icon corresponding to the action events extracted from the user's action record is displayed at the position on the arc or circle corresponding to the time period in which that event occurred.
  • Since multiple icons corresponding to the respective behavioral events are displayed on a single time axis, the user can easily check the type of each behavioral event and the time period in which it occurred, making the activity record easier to grasp.
  • FIG. 1 is a schematic configuration diagram of a behavior monitoring system.
  • FIG. 2 is an explanatory diagram showing an example of sensors installed in each room of a residence.
  • FIG. 3 is a flowchart illustrating an example of the data measurement process.
  • FIG. 4 is a flowchart illustrating an example of the measurement data registration process.
  • FIG. 5 is a flowchart illustrating an example of the action record display process.
  • FIG. 6 is an explanatory diagram showing an example of the action record display screen.
  • FIG. 7 is an explanatory diagram showing an example of the icons.
  • FIG. 8 is an explanatory diagram showing how the behavior record display changes due to a clockwise swipe operation.
  • FIG. 9 is an explanatory diagram showing how the behavior record display changes due to a counterclockwise swipe operation.
  • FIG. 10 is an explanatory diagram showing how the behavior record display changes due to a left swipe operation.
  • FIG. 11 is an explanatory diagram showing how the behavior record display changes due to a right swipe operation.
  • FIG. 12 is an explanatory diagram showing how the behavior record display changes due to a pinch-in operation.
  • FIG. 13 is an explanatory diagram showing how the behavior record display changes due to a pinch-out operation.
  • FIG. 1 is a schematic configuration diagram of the behavior monitoring system 10.
  • The behavior monitoring system 10 includes a management server 20 that manages the entire system, monitoring devices 30 that are installed in residences A to C where persons to be monitored live and that monitor those persons, and a mobile terminal 50 that serves as an action record display device, is carried by the monitored person, his or her manager, or the like, and displays the monitored person's action record.
  • Residences A to C are residences where, for example, an elderly person or a person requiring care lives alone; as shown in FIG. 2, each includes an LDK room comprising a living room (L), a dining room (D), and a kitchen (K), as well as a bedroom, a bathroom, a toilet room, and an entrance.
  • the behavior monitoring system 10 can be used, for example, in place of a caregiver, to monitor the behavior of an elderly person or a care recipient as a monitoring target, and to detect behavioral abnormalities at an early stage.
  • the monitoring device 30 includes a control section 31, a communication section 32, an operation display section 33, a speaker (not shown), and sensors 40.
  • the control unit 31 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the operation display unit 33 and the speaker output various information from the management server 20 by display or sound.
  • the operation display section 33 is configured as a touch panel display that allows an operator to input operations.
  • The sensors 40 detect where the person to be monitored living in the residence is and what he or she is doing; as shown in FIG. 2, they include human sensors 41, 42, 43, 44, 45, and 46, a sleep sensor 47 provided in the bedroom, a toilet faucet sensor 48 provided in the toilet room, and a door sensor 49 provided on the entrance door.
  • the human sensors 41 to 46 are sensors that detect people within the detection area in a non-contact manner, and are configured, for example, as infrared sensors that detect infrared rays and convert them into electrical signals.
  • the human sensors 41, 42, and 43 are provided in the living room, dining room, and kitchen of the LDK room, respectively. Further, the human sensor 44 is provided in the bedroom, and the human sensor 45 is provided in the bathroom. Furthermore, a human sensor 46 is provided in the toilet room.
  • the sleep sensor 47 is, for example, a sheet-type biological sensor installed under a bed mattress in a bedroom, and acquires biological information such as the pulse and breathing of the person sleeping on the bed.
  • the toilet faucet sensor 48 detects the operation of a flush lever or a flush switch that instructs flushing of the toilet bowl, and in this embodiment, it detects a large flush operation and a small flush operation in a distinguishable manner.
  • the door sensor 49 detects the opening and closing of the entrance door, and is configured, for example, as a magnetic opening/closing sensor having a permanent magnet fixed to the door side and a magnetic sensor fixed to the frame side.
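  • The sensor arrangement described above can be summarized as a simple lookup table. This is an illustrative sketch only; the dictionary names and the `room_of` helper are assumptions, not part of the specification:

```python
# Human sensors 41-46 each monitor one room; sensors 47-49 are special-purpose.
HUMAN_SENSORS = {
    41: "living room",
    42: "dining room",
    43: "kitchen",
    44: "bedroom",
    45: "bathroom",
    46: "toilet room",
}

OTHER_SENSORS = {
    47: ("bedroom", "sleep sensor"),              # sheet-type biological sensor
    48: ("toilet room", "toilet faucet sensor"),  # distinguishes large/small flush
    49: ("entrance", "door sensor"),              # magnetic open/close sensor
}

def room_of(sensor_id):
    """Return the room monitored by the given sensor number."""
    if sensor_id in HUMAN_SENSORS:
        return HUMAN_SENSORS[sensor_id]
    return OTHER_SENSORS[sensor_id][0]
```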
  • the mobile terminal 50 is a portable communication terminal such as a smartphone or a tablet, and includes a control section 51, a communication section 52, an operation display section 53, and a storage section 54, as shown in FIG.
  • the control unit 51 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the operation display unit 53 is configured as a touch panel display that allows touch operations by an operator.
  • the storage unit 54 stores various application programs (processing programs), data files, identification information (identification ID) for identifying the mobile terminal 50, and the like.
  • the management server 20 includes a processing section 21, a communication section 22, and a storage section 23.
  • the processing unit 21 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the communication unit 22 of the management server 20 is connected to the communication unit 32 of each monitoring device 30 and the communication unit 52 of the mobile terminal 50 via the network 11, and exchanges data and signals with each other.
  • the storage unit 23 is constituted by a hard disk drive (HDD), solid state drive (SSD), etc., and receives data measured by each monitoring device 30 and stores it over a certain period of time.
  • the data measurement process is a process of measuring (collecting) the location of the person to be monitored from sensors installed in each room of the residence.
  • FIG. 3 is a flowchart illustrating an example of a data measurement process executed by the control unit 31 of each monitoring device 30. This process is repeatedly executed at predetermined time intervals.
  • the control unit 31 of the monitoring device 30 first determines whether or not there is a reaction in the living room human sensor 41 provided in the living room (step S100).
  • If the control unit 31 determines that the living room human sensor 41 has reacted, it determines that the person to be monitored is in the living room (step S102), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S100 that the living room human sensor 41 has not reacted, it next determines whether the dining room human sensor 42 provided in the dining room has reacted (step S104). If it determines that the dining room human sensor 42 has reacted, it determines that the person to be monitored is in the dining room (step S106), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S104 that the dining room human sensor 42 has not reacted, it next determines whether the kitchen human sensor 43 provided in the kitchen has reacted (step S108). If it determines that the kitchen human sensor 43 has reacted, it determines that the person to be monitored is in the kitchen (step S110), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S108 that the kitchen human sensor 43 has not reacted, it next determines whether the bedroom human sensor 44 provided in the bedroom has reacted (step S112). If it determines that the bedroom human sensor 44 has reacted, it determines that the person to be monitored is in the bedroom (step S114). The control unit 31 then acquires the detection signal of the sleep sensor 47 (step S116), transmits the determination result and the detection signal of the sleep sensor 47 to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S112 that the bedroom human sensor 44 has not reacted, it next determines whether the bathroom human sensor 45 provided in the bathroom has reacted (step S118). If it determines that the bathroom human sensor 45 has reacted, it determines that the person to be monitored is in the bathroom (step S120), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S118 that the bathroom human sensor 45 has not reacted, it next determines whether the toilet room human sensor 46 provided in the toilet room has reacted (step S122). If it determines that the toilet room human sensor 46 has reacted, it determines that the person to be monitored is in the toilet room (step S124). The control unit 31 then acquires the detection signal of the toilet faucet sensor 48 (step S126), transmits the determination result and the detection signal of the toilet faucet sensor 48 to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S122 that the toilet room human sensor 46 has not reacted, it next determines whether the entrance door sensor 49 provided on the entrance door has reacted (step S128).
  • If the control unit 31 determines that the entrance door sensor 49 has reacted, it determines whether the person to be monitored is currently determined to be at home, that is, whether the most recent result of step S134 (described later) was "at home" (step S130).
  • If the person to be monitored is determined to be at home, the control unit 31 determines that the person has gone out (step S132), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the person to be monitored is determined not to be at home (determined to be out), the control unit 31 determines that the person has returned home, that is, is now at home (step S134), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • FIG. 4 is a flowchart illustrating an example of the measurement data registration process executed by the processing unit 21 of the management server 20. This process is repeatedly executed at predetermined time intervals.
  • The processing unit 21 of the management server 20 first determines whether measurement data has been received from a monitoring device 30 (step S200). If the processing unit 21 determines in step S200 that no measurement data has been received, the process proceeds to step S206. On the other hand, if it determines that measurement data has been received, the processing unit 21 accesses a time server via the Internet, acquires the current date, time (hours, minutes, seconds), and day of the week as time information (step S202), stores the acquired time information in the storage unit 23 in association with the measurement data (step S204), and proceeds to step S206. Note that the time information may instead be acquired by reading the current time from an RTC (real-time clock).
  • Next, the processing unit 21 determines whether measurement data for a certain period (for example, one day) has accumulated in the storage unit 23 (step S206). If it determines that measurement data for the certain period has not accumulated, the process proceeds to step S212. On the other hand, if it determines that measurement data for the certain period has accumulated, it extracts predetermined behavioral events from that measurement data (step S208) and registers the extracted behavioral events in the storage unit 23 along with their occurrence times (step S210).
  • the behavioral events include waking up, going to bed, eating, excretion (small), excretion (large), bathing, and the like.
  • waking up can be determined based on the fact that the sleep sensor 47 detects a change from a sleeping state to an awake state while the person is in the bedroom.
  • Going to bed can be determined based on a change from an awake state to a sleeping state being detected by the sleep sensor 47 while the person is in the bedroom.
  • Eating can be determined based on the person staying in the dining room for a certain period of time or more after it is determined that the person has moved from the kitchen to the dining room.
  • Excretion (small) can be determined based on the fact that a small flushing operation is detected by the toilet faucet sensor 48 while the user is in the toilet room.
  • Excretion (large) can be determined based on the fact that a large flush operation is detected by the toilet faucet sensor 48 while the user is in the toilet room.
  • Taking a bath can be determined based on the determination that the user has stayed in the bathroom for a certain period of time or more.
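  • The signal-triggered rules above (waking up, going to bed, and the two excretion events) amount to a pattern match on (place, signal) pairs; the duration-based rules for eating and bathing would additionally need stay-time tracking and are omitted from this sketch. All names and the tuple format are illustrative assumptions:

```python
def extract_events(measurements):
    """measurements: time-ordered (time, place, signal) tuples.

    Returns (time, event) pairs for the signal-triggered events; the
    duration-based events (eating, bathing) are not covered by this sketch.
    """
    rules = {
        ("bedroom", "sleep->awake"): "waking up",
        ("bedroom", "awake->sleep"): "going to bed",
        ("toilet room", "small flush"): "excretion (small)",
        ("toilet room", "large flush"): "excretion (large)",
    }
    return [(t, rules[(place, sig)])
            for t, place, sig in measurements
            if (place, sig) in rules]
```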
  • Next, the processing unit 21 determines whether there is a request from the mobile terminal 50 to transmit the action record (step S212). If the processing unit 21 determines that there is no transmission request, it immediately ends the measurement data registration process. On the other hand, if it determines that there is a transmission request, it transmits the action events and their respective occurrence times stored in the storage unit 23 to the requesting mobile terminal 50 as the action record (step S214), and ends the measurement data registration process.
  • FIG. 5 is a flowchart illustrating an example of an action record display process executed by the control unit 51 of the mobile terminal 50. This process is executed when a dedicated application installed in the storage unit 54 of the mobile terminal 50 is started.
  • When the application is started, the control unit 51 of the mobile terminal 50 first transmits an action record transmission request to the management server 20 (step S300) and waits to receive the action record from the management server 20 (step S302).
  • When the control unit 51 receives the action record, it displays the action record on the screen of the operation display unit 53 (step S304).
  • FIG. 6 is an explanatory diagram showing an example of an action record display screen.
  • FIG. 7 is an explanatory diagram showing an example of an icon.
  • The action record display screen displays the user name, the date of the action record being displayed, an arc-shaped time axis T with a time scale corresponding to the target time period, and a plurality of icons I arranged on the time axis T.
  • A plurality of types of icons I are prepared; as shown in FIG. 7, each icon I is associated with one type of behavioral event, such as waking up, going to bed, eating, excretion (small), excretion (large), or bathing.
  • To display the behavior record, the control unit 51 extracts the behavior events that occurred during the target time period on the target date from the behavior record received from the management server 20, selects the icons I corresponding to the extracted behavior events, and displays (arranges) each selected icon I at the position on the time scale corresponding to the time at which its behavioral event occurred. Thereby, even with the limited screen size of the mobile terminal 50, the user can easily grasp the behavior record of the person to be monitored during the target time period on the target date via the behavior record display screen.
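  • Placing an icon on the arc amounts to mapping the event's occurrence time linearly onto the arc's angular range. The geometry below (an arc starting at 210° and sweeping −240°, radius 100) is purely an assumption for illustration; the specification does not fix any particular geometry or coordinate convention.

```python
import math

def icon_position(event_min, window_start_min, window_span_min,
                  radius=100.0, arc_start_deg=210.0, arc_sweep_deg=-240.0):
    """Return (x, y) coordinates for an icon on the arc time axis.

    Times are minutes since midnight; the event is assumed to lie within
    the displayed window [window_start_min, window_start_min + span].
    """
    frac = (event_min - window_start_min) / window_span_min  # 0.0 .. 1.0
    angle = math.radians(arc_start_deg + frac * arc_sweep_deg)
    return radius * math.cos(angle), radius * math.sin(angle)
```

For example, with a 06:00-18:00 window and the assumed geometry, an event at 12:00 lands at the top of the arc.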
  • the control unit 51 determines whether an arc swipe operation has been performed on the action record display screen (step S306).
  • The arc swipe operation is an operation of moving a finger that has touched the screen of the operation display unit 53 in an arc along the arc of the time axis while keeping it in contact; it includes a clockwise swipe operation that moves the finger clockwise and a counterclockwise swipe operation that moves the finger counterclockwise. If the control unit 51 determines that an arc swipe operation has not been performed, the process proceeds to step S314. On the other hand, if it determines that an arc swipe operation has been performed, it determines whether the arc swipe operation is a clockwise swipe operation (step S308).
  • FIG. 8 is an explanatory diagram showing how the action record display changes due to a clockwise swipe operation. As shown in the figure, when a clockwise swipe operation is performed, the time scale of the time axis shifts in the forward direction by a fixed amount of time each time the operation is performed. Then, the icon is displayed (rearranged) at a position corresponding to the time of occurrence of the behavioral event on the time scale of the time axis shifted in the forward direction.
  • FIG. 9 is an explanatory diagram showing how the behavior record display changes due to a counterclockwise swipe operation. As shown in the figure, when a counterclockwise swipe operation is performed, the time scale of the time axis shifts in the backward direction by a fixed amount of time each time the operation is performed. Then, the icon is displayed (rearranged) at a position corresponding to the time of occurrence of the behavioral event on the time scale of the time axis shifted in the backward direction.
  • the control unit 51 determines whether a left/right swipe operation has been performed (step S314).
  • The left/right swipe operation is an operation of moving a finger that has touched the screen of the operation display unit 53 in a straight line in the left/right direction while keeping it in contact; it includes a left swipe operation that moves the finger to the left and a right swipe operation that moves the finger to the right. If the control unit 51 determines that a left/right swipe operation has not been performed, the process proceeds to step S322. On the other hand, if it determines that a left/right swipe operation has been performed, it determines whether it is a left swipe operation (step S316).
  • FIG. 10 is an explanatory diagram showing how the behavior record display changes due to the left swipe operation. As shown in the figure, when a left swipe operation is performed, the target date and time is shifted to the previous day while the target time zone (time scale of the time axis) is maintained. Then, the icon is changed to one corresponding to the action event that occurred during the target time period of the target date and time shifted to the previous day.
  • FIG. 11 is an explanatory diagram showing how the action record display changes due to the right swipe operation. As shown in the figure, when a right swipe operation is performed, the target date and time is shifted to the next day while the target time zone (time scale of the time axis) is maintained. Then, the icon is changed to one corresponding to the action event that occurred during the target time period of the target date and time shifted to the next day.
  • the control unit 51 determines whether a pinch operation has been performed (step S322).
  • The pinch operation is an operation in which two fingers touch the screen of the operation display unit 53 and are then spread apart or brought together while in contact; it includes a pinch-in operation that narrows the gap between the two fingers and a pinch-out operation that widens it. If the control unit 51 determines that a pinch operation has not been performed, the process proceeds to step S330. On the other hand, if it determines that a pinch operation has been performed, it determines whether the pinch operation is a pinch-in operation (step S324). If it determines that it is a pinch-in operation, it reduces the width of the time scale on the time axis (step S326) and proceeds to step S330.
  • FIG. 12 is an explanatory diagram showing how the action record display changes due to the pinch-in operation. As shown in the figure, when the pinch-in operation is performed, the width of the time scale on the time axis is reduced, and the target time period is reduced. Then, the icon is changed to one corresponding to the action event that occurred during the reduced target time period.
  • If the control unit 51 determines that the pinch operation is not a pinch-in operation but a pinch-out operation, it expands the width of the time scale on the time axis (step S328).
  • FIG. 13 is an explanatory diagram showing how the behavior record display changes due to the pinch-out operation. As shown in the figure, when the pinch-out operation is performed, the width of the time scale on the time axis expands, and the target time period expands. Then, the icon is changed to one corresponding to the action event that occurred during the expanded target time period.
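  • The effect of each gesture on the displayed window can be summarized as a transform of the triple (target day, window start, window span), following FIGS. 8 to 13. This is a sketch under assumptions: the step size of an arc swipe and the zoom factor of a pinch are not specified, so illustrative values are used, and all names are hypothetical.

```python
def apply_gesture(gesture, day, start_min, span_min, shift_min=60, zoom=2):
    """Return the new (day, start_min, span_min) after one gesture."""
    if gesture == "swipe_cw":          # FIG. 8: time scale shifts forward
        start_min += shift_min
    elif gesture == "swipe_ccw":       # FIG. 9: time scale shifts backward
        start_min -= shift_min
    elif gesture == "swipe_left":      # FIG. 10: previous day, same window
        day -= 1
    elif gesture == "swipe_right":     # FIG. 11: next day, same window
        day += 1
    elif gesture == "pinch_in":        # FIG. 12: target time period shrinks
        span_min //= zoom
    elif gesture == "pinch_out":       # FIG. 13: target time period expands
        span_min *= zoom
    return day, start_min, span_min
```

After any transform, the icons would be re-selected and re-arranged for the new day and window, as described for each figure above.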
  • The control unit 51 then determines whether an end operation to end the display has been performed.
  • If it determines that the end operation has not been performed, the process returns to step S306 and repeats; if it determines that the end operation has been performed, it ends the action record display process.
  • The mobile terminal 50 of this embodiment is an example of the behavior record display device of the present disclosure, and the control unit 51 that executes the behavior record display process is an example of the display control unit.
  • In the embodiment, when the control unit 51 determines that the arc swipe operation is a clockwise swipe operation, it shifts the target time period in the forward direction, and when it determines that the arc swipe operation is a counterclockwise swipe operation, it shifts the target time period in the backward direction. However, the assignment may be reversed: for example, the control unit 51 may shift the target time period in the forward direction when it determines that the arc swipe operation is a counterclockwise swipe operation, and in the backward direction when it determines that it is a clockwise swipe operation.
  • Likewise, the control unit 51 may shift the target date of the target time period to the previous day when it determines that the left/right swipe operation is a right swipe operation, and to the next day when it determines that it is a left swipe operation.
  • In the action record display device described above, an arc-shaped or circular time axis is displayed on the display screen, and the icons corresponding to the behavior events extracted from the user's behavior record are displayed at the positions on the arc or circle corresponding to the time periods in which the events occurred.
  • Since multiple icons corresponding to the respective behavioral events are displayed on a single time axis, the user can easily check the type of each behavioral event and the time period in which it occurred, making the activity record easier to grasp.
  • The display screen may be a touch panel type screen, and the display control unit may control the time scale of the time axis to change when a pinch operation is performed on the display screen. In this way, the time scale of the time axis on which action records are displayed can be changed through an intuitive operation.
  • The present disclosure is not limited to the form of an action record display device, and can also be implemented in the form of an action record display method.

Abstract

Provided is an action record display device comprising a display control unit that displays an arc-shaped or circular time axis on a display screen, acquires, from among a plurality of types of icons each associated with one of a plurality of predefined types of action events, the icon corresponding to each of a plurality of action events extracted from an action record, and controls each acquired icon to be displayed at the position on the arc or the circle corresponding to the time zone in which the corresponding action event occurred.

Description

行動記録表示装置Behavior record display device
 本明細書は、行動記録表示装置について開示する。 This specification discloses an action record display device.
 従来、利用者の行動を監視し、行動記録を表示画面に表示するものが知られている。例えば、特許文献1には、人の日常生活動作(ADL)をモニタリングし、シンボルとADLの記述とを含む円と、イベントの発生タイミングや持続時間の表現と、を有するグラフィック要素を表示するものが開示されている。 Conventionally, devices that monitor user behavior and display behavior records on a display screen are known. For example, Patent Document 1 discloses a graphic element that monitors a person's activities of daily living (ADL) and has a circle including a symbol and a description of the ADL, and an expression of the timing and duration of the event. is disclosed.
Patent Document 1: JP-T-2018-513439
In an action record display device that displays action records, when multiple action events occur, if only information corresponding to a single action event is displayed on one time axis, the user must later check multiple time axes to grasp the action record, which makes the action record difficult to understand.
The main purpose of the present disclosure is to make a user's action record easier to grasp.
To achieve the above main objective, the present disclosure adopts the following means.
The action record display device of the present disclosure is
an action record display device that acquires a user's action record and displays it on a display screen,
and its gist is to comprise a display control unit that displays a time axis in the form of an arc or circle on the display screen, acquires, from among a plurality of types of icons each associated in advance with one of a plurality of types of action events, an icon corresponding to each of a plurality of action events extracted from the action record, and controls each acquired icon to be displayed at a position on the arc or circle corresponding to the time period in which the corresponding action event occurred.
In the action record display device of the present disclosure, a time axis in the form of an arc or circle is displayed on the display screen, and the icons corresponding to the action events extracted from the user's action record are displayed at the positions on the arc or circle corresponding to the time periods in which the respective events occurred. Because multiple icons, one for each action event, appear on a single time axis, the user can easily check the type of each action event and the time period in which it occurred, making the action record easier to grasp.
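The placement described above, mapping an event's time of occurrence to a position on the arc, can be sketched as follows. This is a minimal illustration only: the function name, the hour-based time values, and the mathematical (y-up) screen convention are assumptions, not part of the disclosure.

```python
import math

def icon_position(event_time_h, window_start_h, window_hours,
                  cx, cy, radius, arc_start_deg=180.0, arc_span_deg=180.0):
    # Fraction of the displayed time band that has elapsed at the event time.
    frac = (event_time_h - window_start_h) / window_hours
    # Sweep clockwise from arc_start_deg across arc_span_deg degrees.
    theta = math.radians(arc_start_deg - frac * arc_span_deg)
    # Point on the arc, with y growing upward from the arc's center (cx, cy).
    return (cx + radius * math.cos(theta), cy + radius * math.sin(theta))
```

With the defaults, a band such as 06:00-18:00 spans a semicircle from the left end of the arc, over the top, to the right end, and an event halfway through the band lands at the top of the arc.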
FIG. 1 is a schematic configuration diagram of a behavior monitoring system.
FIG. 2 is an explanatory diagram showing an example of sensors installed in each room of a residence.
FIG. 3 is a flowchart illustrating an example of a data measurement process.
FIG. 4 is a flowchart illustrating an example of a measurement data registration process.
FIG. 5 is a flowchart illustrating an example of an action record display process.
FIG. 6 is an explanatory diagram showing an example of an action record display screen.
FIG. 7 is an explanatory diagram showing an example of icons.
FIG. 8 is an explanatory diagram showing how the action record display changes with a clockwise swipe operation.
FIG. 9 is an explanatory diagram showing how the action record display changes with a counterclockwise swipe operation.
FIG. 10 is an explanatory diagram showing how the action record display changes with a left swipe operation.
FIG. 11 is an explanatory diagram showing how the action record display changes with a right swipe operation.
FIG. 12 is an explanatory diagram showing how the action record display changes with a pinch-in operation.
FIG. 13 is an explanatory diagram showing how the action record display changes with a pinch-out operation.
Next, embodiments for carrying out the present disclosure will be described with reference to the drawings.
FIG. 1 is a schematic configuration diagram of the behavior monitoring system 10. As shown in FIG. 1, the behavior monitoring system 10 includes a management server 20 that manages the entire system, monitoring devices 30 installed in residences A to C, where persons to be monitored live, to monitor those persons, and a mobile terminal 50 serving as an action record display device that can be carried by the monitored person, their caregiver, or the like and displays the monitored person's action record. Residences A to C are residences in which, for example, elderly people or persons requiring care live alone; as shown in FIG. 2, each has an LDK (living/dining/kitchen) room, a bedroom, a washroom, a bathroom, a toilet room, and an entrance. The behavior monitoring system 10 can be used, for example, in place of a caregiver to monitor the behavior of an elderly person or a care recipient and to detect behavioral abnormalities at an early stage.
The monitoring device 30 includes a control unit 31, a communication unit 32, an operation display unit 33, a speaker (not shown), and sensors 40. The control unit 31 is configured as a microprocessor centered on a CPU and includes a ROM, a RAM, and the like in addition to the CPU. The operation display unit 33 and the speaker output various information from the management server 20 by display or sound. The operation display unit 33 is configured as a touch panel display that accepts operation input by an operator.
The sensors 40 detect where the person to be monitored is in the residence and what they are doing. As shown in FIG. 2, they include human presence sensors 41, 42, 43, 44, 45, and 46 installed in the respective rooms, a sleep sensor 47 provided in the bedroom, a toilet faucet sensor 48 provided in the toilet room, and a door sensor 49 provided on the entrance door.
The human presence sensors 41 to 46 detect a person within a detection area without contact and are configured, for example, as infrared sensors that sense infrared rays and convert them into electrical signals. The sensors 41, 42, and 43 are provided in the living, dining, and kitchen areas of the LDK room, respectively. The sensor 44 is provided in the bedroom, the sensor 45 in the bathroom, and the sensor 46 in the toilet room.
The sleep sensor 47 is, for example, a sheet-type biological sensor placed under the bed mattress in the bedroom and acquires biological information such as the pulse and breathing of the person lying on the bed. The toilet faucet sensor 48 detects operation of the flush lever or flush switch that triggers flushing of the toilet bowl; in this embodiment, it distinguishes between a large-flush operation and a small-flush operation. The door sensor 49 detects opening and closing of the entrance door and is configured, for example, as a magnet-type open/close sensor having a permanent magnet fixed to the door and a magnetic sensor fixed to the frame.
The mobile terminal 50 is a portable communication terminal such as a smartphone or tablet and, as shown in FIG. 1, includes a control unit 51, a communication unit 52, an operation display unit 53, and a storage unit 54. The control unit 51 is configured as a microprocessor centered on a CPU and includes a ROM, a RAM, and the like in addition to the CPU. The operation display unit 53 is configured as a touch panel display that accepts touch operations by an operator. The storage unit 54 stores various application programs (processing programs), data files, identification information (an identification ID) for identifying the mobile terminal 50, and the like.
The management server 20 includes a processing unit 21, a communication unit 22, and a storage unit 23. The processing unit 21 is configured as a microprocessor centered on a CPU and includes a ROM, a RAM, and the like in addition to the CPU. The communication unit 22 of the management server 20 is connected via the network 11 to the communication unit 32 of each monitoring device 30 and the communication unit 52 of the mobile terminal 50, and exchanges data and signals with them. The storage unit 23 is constituted by a hard disk drive (HDD), a solid state drive (SSD), or the like, and receives the data measured by each monitoring device 30 and stores it for a certain period.
Next, the operation of the behavior monitoring system configured as described above will be explained, that is, the operations of the monitoring device 30, the management server 20, and the mobile terminal 50, in that order.
The data measurement process measures (collects) the location of the person to be monitored using the sensors provided in each room of the residence. FIG. 3 is a flowchart illustrating an example of the data measurement process executed by the control unit 31 of each monitoring device 30. This process is executed repeatedly at predetermined time intervals.
When the data measurement process is executed, the control unit 31 of the monitoring device 30 first determines whether the human presence sensor 41 provided in the living room has been triggered (step S100). If it determines that the living-room sensor 41 has been triggered, the control unit 31 determines that the person to be monitored is in the living room (step S102), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S100 that the living-room sensor 41 has not been triggered, it next determines whether the human presence sensor 42 provided in the dining area has been triggered (step S104). If it determines that the dining sensor 42 has been triggered, the control unit 31 determines that the person to be monitored is in the dining area (step S106), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S104 that the dining sensor 42 has not been triggered, it next determines whether the human presence sensor 43 provided in the kitchen has been triggered (step S108). If it determines that the kitchen sensor 43 has been triggered, the control unit 31 determines that the person to be monitored is in the kitchen (step S110), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S108 that the kitchen sensor 43 has not been triggered, it next determines whether the human presence sensor 44 provided in the bedroom has been triggered (step S112). If it determines that the bedroom sensor 44 has been triggered, the control unit 31 determines that the person to be monitored is in the bedroom (step S114). The control unit 31 then acquires the detection signal of the sleep sensor 47 (step S116), transmits the determination result and the detection signal of the sleep sensor 47 to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S112 that the bedroom sensor 44 has not been triggered, it next determines whether the human presence sensor 45 provided in the bathroom has been triggered (step S118). If it determines that the bathroom sensor 45 has been triggered, the control unit 31 determines that the person to be monitored is in the bathroom (step S120), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S118 that the bathroom sensor 45 has not been triggered, it next determines whether the human presence sensor 46 provided in the toilet room has been triggered (step S122). If it determines that the toilet-room sensor 46 has been triggered, the control unit 31 determines that the person to be monitored is in the toilet room (step S124). The control unit 31 then acquires the detection signal of the toilet faucet sensor 48 (step S126), transmits the determination result and the detection signal of the toilet faucet sensor 48 to the management server 20 as measurement data (step S136), and ends the data measurement process.
If the control unit 31 determines in step S122 that the toilet-room sensor 46 has not been triggered, it next determines whether the door sensor 49 provided on the entrance door has been triggered (step S128). If it determines that the door sensor 49 has been triggered, the control unit 31 determines whether the person to be monitored is currently determined to be at home by step S134, described later (step S130). If the person is determined to be at home, the control unit 31 determines that the person has gone out (step S132), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process. If, on the other hand, the person is not determined to be at home (that is, is determined to be out), the control unit 31 determines that the person has returned home, in other words is now at home (step S134), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
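The fixed priority chain of FIG. 3 (steps S100 through S134) can be summarized in the following sketch. The dictionary-based sensor interface and the sensor key names are assumptions introduced here for illustration and do not appear in the disclosure.

```python
def classify_location(readings, was_at_home):
    """Return (verdict, now_at_home) for one pass of the FIG. 3 chain."""
    # Sensors are checked in the same fixed priority order as steps S100-S128.
    order = [("living", "living room"), ("dining", "dining area"),
             ("kitchen", "kitchen"), ("bedroom", "bedroom"),
             ("bathroom", "bathroom"), ("toilet", "toilet room")]
    for key, room in order:
        if readings.get(key):
            return f"in {room}", True  # detected indoors, so at home
    if readings.get("entrance_door"):
        # Door activity toggles the at-home state (steps S130-S134).
        return ("went out", False) if was_at_home else ("returned home", True)
    return "no event", was_at_home
```

Because the checks are ordered, a reading from a higher-priority room (for example, the living room) masks any simultaneous lower-priority reading, matching the early-exit structure of the flowchart.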
Next, the operation of the management server 20 will be explained. FIG. 4 is a flowchart illustrating an example of the measurement data registration process executed by the processing unit 21 of the management server 20. This process is executed repeatedly at predetermined time intervals.
When the measurement data registration process is executed, the processing unit 21 of the management server 20 first determines whether measurement data has been received from a monitoring device 30 (step S200). If it determines that no measurement data has been received, the processing unit 21 proceeds to step S206. If it determines that measurement data has been received, the processing unit 21 accesses a time server via the Internet to acquire the current date, time (hours/minutes/seconds), and day of the week as time information (step S202), stores the acquired time information in the storage unit 23 in association with the measurement data (step S204), and proceeds to step S206. The time information may instead be acquired by reading the current time from an RTC (real-time clock).
Next, the processing unit 21 determines whether measurement data for a certain period (for example, one day) has accumulated in the storage unit 23 (step S206). If it determines that a full period of measurement data has not yet accumulated, the processing unit 21 proceeds to step S212. If it determines that a full period of measurement data has accumulated, the processing unit 21 extracts predefined action events from that measurement data (step S208) and registers the extracted action events in the storage unit 23 together with their occurrence times (step S210). In this embodiment, the action events include waking up, going to bed, meals, excretion (small), excretion (large), and bathing. For example, waking up can be determined based on the sleep sensor 47 detecting a change from a sleeping state to an awake state while the person is in the bedroom. Going to bed can be determined based on the sleep sensor 47 detecting a change from an awake state to a sleeping state while the person is in the bedroom. A meal can be determined based on a stay in the dining area of a certain length or longer being determined after a move from the kitchen to the dining area. Excretion (small) can be determined based on the toilet faucet sensor 48 detecting a small-flush operation while the person is in the toilet room. Excretion (large) can be determined based on the toilet faucet sensor 48 detecting a large-flush operation while the person is in the toilet room. Bathing can be determined based on a stay in the bathroom of a certain length or longer being determined.
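The rule-based extraction of step S208 might look as follows in outline. The record representation, the event labels, and the stay-duration thresholds are assumptions; the disclosure specifies only the detection conditions, not their concrete encoding.

```python
def extract_events(records):
    """Scan chronological (time, kind, detail) tuples for action events."""
    events = []
    prev_sleep = None
    for t, kind, detail in records:
        if kind == "sleep_state":
            if prev_sleep == "asleep" and detail == "awake":
                events.append((t, "wake up"))    # sleeping -> awake transition
            elif prev_sleep == "awake" and detail == "asleep":
                events.append((t, "go to bed"))  # awake -> sleeping transition
            prev_sleep = detail
        elif kind == "toilet_flush":
            events.append((t, "excretion (small)" if detail == "small"
                           else "excretion (large)"))
        elif kind == "dining_stay" and detail >= 20:  # minutes; threshold assumed
            events.append((t, "meal"))
        elif kind == "bath_stay" and detail >= 10:    # minutes; threshold assumed
            events.append((t, "bath"))
    return events
```

Each extracted tuple pairs the event with its occurrence time, which is exactly what step S210 registers and what the mobile terminal later receives as the action record.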
Next, the processing unit 21 determines whether a request to transmit the action record has been received from a mobile terminal 50 (step S212). If it determines that no transmission request has been received, the processing unit 21 ends the measurement data registration process. If it determines that a transmission request has been received, the processing unit 21 transmits the action events stored in the storage unit 23, together with their occurrence times, as the action record to the mobile terminal 50 that made the request (step S214), and ends the measurement data registration process.
Next, the operation of the mobile terminal 50 that receives the action record will be described. FIG. 5 is a flowchart illustrating an example of the action record display process executed by the control unit 51 of the mobile terminal 50. This process is executed when a dedicated application installed in the storage unit 54 of the mobile terminal 50 is started.
When the action record display process is executed, the control unit 51 of the mobile terminal 50 first transmits an action record transmission request to the management server 20 (step S300) and waits to receive the action record from the management server 20 (step S302). Upon receiving the action record, the control unit 51 displays it on the screen of the operation display unit 53 (step S304). FIG. 6 is an explanatory diagram showing an example of the action record display screen. FIG. 7 is an explanatory diagram showing an example of the icons. As illustrated, the action record display screen shows the user name, the date of the displayed action record, an arc-shaped time axis T with a time scale matching the target time band (time segment), and a plurality of icons I arranged on the time axis T. Multiple types of icons I are prepared; as shown in FIG. 7, each is associated with a type of action event such as waking up, going to bed, a meal, excretion (small), excretion (large), or bathing. The action record display is performed by extracting, from the action record received from the management server 20, the action events that occurred in the target time band on the target date, selecting the icon I corresponding to each extracted action event, and displaying (placing) the selected icon I at the position on the time scale corresponding to the time at which that action event occurred. This allows the user, even on the limited screen size of the mobile terminal 50, to easily grasp the monitored person's action record for the target time band on the target date via the action record display screen.
Next, the control unit 51 determines whether an arc swipe operation has been performed on the action record display screen (step S306). An arc swipe operation moves a finger touching the screen of the operation display unit 53 along the arc of the time axis while keeping it in contact with the screen; it may be a clockwise swipe, which moves the finger clockwise, or a counterclockwise swipe, which moves the finger counterclockwise. If the control unit 51 determines that no arc swipe operation has been performed, it proceeds to step S314. If it determines that an arc swipe operation has been performed, the control unit 51 determines whether it is a clockwise swipe (step S308). If it determines that the operation is a clockwise swipe, the control unit 51 shifts the target time band (time segment) forward (step S310) and proceeds to step S314. FIG. 8 is an explanatory diagram showing how the action record display changes with a clockwise swipe operation. As illustrated, each clockwise swipe shifts the time scale of the time axis forward by a fixed amount of time, and the icons are displayed (rearranged) at the positions corresponding to the occurrence times of the action events on the forward-shifted time scale. If, on the other hand, the control unit 51 determines that the arc swipe is a counterclockwise swipe rather than a clockwise one, it shifts the target time band (time segment) backward (step S312). FIG. 9 is an explanatory diagram showing how the action record display changes with a counterclockwise swipe operation. As illustrated, each counterclockwise swipe shifts the time scale of the time axis backward by a fixed amount of time, and the icons are displayed (rearranged) at the positions corresponding to the occurrence times of the action events on the backward-shifted time scale.
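The shifting of the target time band by arc swipes (steps S308 to S312) amounts to moving the start of the displayed band by a fixed step per swipe. A minimal sketch, in which the one-hour step and the 24-hour wraparound are assumptions:

```python
def shift_window(window_start_h, swipe, step_h=1.0):
    # Clockwise advances the band; counterclockwise moves it back.
    delta = step_h if swipe == "clockwise" else -step_h
    # Wrap around midnight so the band always names a valid hour of day.
    return (window_start_h + delta) % 24.0
```

After each shift, the icons only need to be re-placed against the new band start; the underlying action record is unchanged.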
Next, the control unit 51 determines whether a horizontal swipe operation has been performed (step S314). A horizontal swipe operation moves a finger touching the screen of the operation display unit 53 in a straight line to the left or right while keeping it in contact with the screen; it may be a left swipe, which moves the finger leftward, or a right swipe, which moves the finger rightward. If the control unit 51 determines that no horizontal swipe has been performed, it proceeds to step S322. If it determines that a horizontal swipe has been performed, the control unit 51 determines whether it is a left swipe (step S316). If it determines that the operation is a left swipe, the control unit 51 shifts the target date of the target time band to the previous day (step S318) and proceeds to step S322. FIG. 10 is an explanatory diagram showing how the action record display changes with a left swipe operation. As illustrated, each left swipe shifts the target date to the previous day while keeping the target time band (the time scale of the time axis) unchanged, and the icons are replaced with those corresponding to the action events that occurred in the target time band on the previous day. If, on the other hand, the control unit 51 determines that the horizontal swipe is a right swipe rather than a left one, it shifts the target date of the target time band to the next day (step S320). FIG. 11 is an explanatory diagram showing how the action record display changes with a right swipe operation. As illustrated, each right swipe shifts the target date to the next day while keeping the target time band unchanged, and the icons are replaced with those corresponding to the action events that occurred in the target time band on the next day.
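The date shift performed for horizontal swipes (steps S316 to S320) can be sketched as a one-day offset of the target date, the time band itself being left unchanged; the function name is an illustrative assumption.

```python
from datetime import date, timedelta

def shift_target_date(target, swipe):
    # Left swipe -> previous day; right swipe -> next day (FIGS. 10 and 11).
    return target + timedelta(days=-1 if swipe == "left" else 1)
```

Using `datetime.date` keeps month and year boundaries correct for free, so swiping left from the first of a month lands on the last day of the previous month.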
 次に、制御部51は、ピンチ操作がなされたか否かを判定する(ステップS322)。ピンチ操作は、操作表示部53の画面に2本の指をタッチし、タッチしたまま2本の指を広げたり狭めたりする操作であり、2本の指を狭めるピンチイン操作と、2本の指を広げるピンチアウト操作とがある。制御部51は、ピンチ操作がなされていないと判定すると、ステップS330に進む。一方、制御部51は、ピンチ操作がなされたと判定すると、ピンチ操作がピンイン操作であるか否かを判定する(ステップS324)。制御部51は、ピンチイン操作であると判定すると、時間軸の時間目盛りの幅(時間スケール)を縮小して(ステップS326)、ステップS330に進む。図12は、ピンチイン操作による行動記録表示の変化の様子を示す説明図である。図示するように、ピンチイン操作がなされると、時間軸の時間目盛りの幅が縮小し、対象時間帯が縮小する。そして、アイコンは、縮小した対象時間帯に発生した行動イベントに対応するものに変更される。一方、制御部51は、ピンチ操作がピンチイン操作ではなくピンチアウト操作であると判定すると、時間軸の時間目盛りの幅(時間スケール)を拡大する(ステップS328)。図13は、ピンチアウト操作による行動記録表示の変化の様子を示す説明図である。図示するように、ピンチアウト操作がなされると、時間軸の時間目盛りの幅が拡大し、対象時間帯が拡大する。そして、アイコンは、拡大した対象時間帯に発生した行動イベントに対応するものに変更される。 Next, the control unit 51 determines whether a pinch operation has been performed (step S322). The pinch operation is an operation in which two fingers are touched on the screen of the operation display unit 53, and the two fingers are spread or narrowed while touching. There is a pinch-out operation to widen the image. If the control unit 51 determines that a pinch operation has not been performed, the process proceeds to step S330. On the other hand, if the control unit 51 determines that a pinch operation has been performed, it determines whether the pinch operation is a pin-in operation (step S324). If the control unit 51 determines that it is a pinch-in operation, it reduces the width of the time scale (time scale) on the time axis (step S326), and proceeds to step S330. FIG. 12 is an explanatory diagram showing how the action record display changes due to the pinch-in operation. As shown in the figure, when the pinch-in operation is performed, the width of the time scale on the time axis is reduced, and the target time period is reduced. Then, the icon is changed to one corresponding to the action event that occurred during the reduced target time period. 
On the other hand, if the control unit 51 determines that the pinch operation is a pinch-out operation rather than a pinch-in operation, it expands the width of the time graduations on the time axis (the time scale) (step S328). FIG. 13 is an explanatory diagram showing how the action record display changes in response to a pinch-out operation. As shown in the figure, when a pinch-out operation is performed, the width of the time graduations on the time axis expands and the target time period widens. The icons are then changed to those corresponding to the action events that occurred during the widened target time period.
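The pinch handling of steps S326/S328 amounts to rescaling the displayed time window. The following is a minimal sketch, assuming a symmetric zoom about the window's midpoint and an illustrative zoom factor of 2 (the publication does not fix a zoom ratio); all names are hypothetical:

```python
def resize_time_window(start_hour: float, end_hour: float, pinch: str,
                       factor: float = 2.0) -> tuple[float, float]:
    """Narrow (pinch-in) or widen (pinch-out) the target time period
    about its midpoint, clamping the result to a single day (0-24 h)."""
    mid = (start_hour + end_hour) / 2.0
    span = end_hour - start_hour
    if pinch == "in":
        span /= factor      # step S326: time graduations shrink
    elif pinch == "out":
        span *= factor      # step S328: time graduations expand
    else:
        raise ValueError(f"unexpected pinch kind: {pinch!r}")
    new_start = max(0.0, mid - span / 2.0)
    new_end = min(24.0, mid + span / 2.0)
    return new_start, new_end
```

As in the embodiment, the icons would then be refreshed to show only the action events that fall inside the resized window.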
Then, the control unit 51 determines whether an end operation to end the display has been performed (step S330). If the control unit 51 determines that the end operation has not been performed, it returns to step S306 and repeats the process; if it determines that the end operation has been performed, it ends the action record display process.
Here, the correspondence between the main elements of the embodiment and the main elements of the present disclosure described in the claims will be explained. That is, the mobile terminal 50 of this embodiment is an example of the action record display device of the present disclosure, and the control unit 51 that executes the action record display process is an example of the display control unit.
Note that the present disclosure is in no way limited to the embodiments described above and can, needless to say, be implemented in various forms as long as they fall within the technical scope of the present disclosure.
For example, in the embodiment described above, the control unit 51 displays an arc-shaped time axis on the action record display screen of the operation display unit 53, but it may instead display a circular time axis.
Further, in the embodiment described above, an example was shown in which, when the control unit 51 determines that the arc swipe operation is a clockwise swipe operation, it shifts the target time period in the forward direction, and when it determines that the arc swipe operation is a counterclockwise swipe operation, it shifts the target time period in the backward direction. However, for example, the control unit 51 may instead shift the target time period in the forward direction when it determines that the arc swipe operation is a counterclockwise swipe operation, and shift the target time period in the backward direction when it determines that the arc swipe operation is a clockwise swipe operation.
Further, in the embodiment described above, an example was shown in which, when the control unit 51 determines that the left/right swipe operation is a left swipe operation, it shifts the target date and time of the target time period to the previous day, and when it determines that the left/right swipe operation is a right swipe operation, it shifts the target date and time of the target time period to the next day. However, for example, the control unit 51 may instead shift the target date and time of the target time period to the previous day when it determines that the left/right swipe operation is a right swipe operation, and shift the target date and time of the target time period to the next day when it determines that the left/right swipe operation is a left swipe operation.
Further, in the embodiment described above, the management server 20 transmits the action record of the person being monitored to the mobile terminal 50, and the mobile terminal 50 displays the action record on the operation display unit 53 by executing the action record display process based on the received action record. However, the management server 20 may instead transmit the action record to the monitoring device 30, and the monitoring device 30 may display the action record on the operation display unit 33 by executing a process similar to the action record display process described above based on the received action record.
In the action record display device of the present disclosure described above, a time axis in the form of an arc or a circle is displayed on the display screen, and the icons corresponding to the plurality of action events extracted from the user's action record are each displayed at the position on the arc or the circle corresponding to the time period in which the respective action event occurred. Since multiple icons, each corresponding to an action event, are thus displayed on a single time axis, the user can easily check the types of action events that occurred and the time periods in which they occurred, making the action record easier to grasp.
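The core placement rule — each icon sits at the point of the arc corresponding to its event time — reduces to mapping a timestamp to an angle. The following sketch assumes a semicircular arc swept from left (window start) to right (window end); the geometry parameters are illustrative assumptions, not taken from the publication:

```python
import math

def icon_position(event_hour: float, window_start: float, window_end: float,
                  radius: float = 1.0) -> tuple[float, float]:
    """Map an event time to (x, y) on a semicircular time axis.

    The window start maps to 180 degrees (left end of the arc) and the
    window end to 0 degrees (right end), so icons progress clockwise
    over the top of the arc as time advances.
    """
    frac = (event_hour - window_start) / (window_end - window_start)
    angle = math.radians(180.0 * (1.0 - frac))
    return radius * math.cos(angle), radius * math.sin(angle)
```

A display control unit like the one described could compute such a position for every extracted action event and draw the event's icon there.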
In such an action record display device of the present disclosure, the display screen may be a touch-panel screen, and the display control unit may display the time axis with a predetermined time segment by default and control the time segment to shift forward or backward when a swipe operation is performed on the display screen along the arc or the circle of the time axis. In this way, the time segment of the time axis on which action records are displayed can be changed through an intuitive operation.
Further, in the action record display device of the present disclosure, the display screen may be a touch-panel screen, and the display control unit may control the time scale of the time axis to widen or narrow when a pinch operation is performed on the display screen. In this way, the time scale of the time axis on which action records are displayed can be changed through an intuitive operation.
Furthermore, the action record display device of the present disclosure may be installed in a portable mobile terminal. Since the screen size of a mobile terminal is limited, applying the action record display device of the present disclosure there is all the more meaningful.
The present disclosure is not limited to the form of an action record display device and can also take the form of an action record display method.
Note that this specification also discloses the technical idea of changing "the action record display device according to claim 1" in claim 3 as filed to "the action record display device according to claim 1 or 2".
The present disclosure is applicable to the manufacturing industry for action record display devices.
10 behavior monitoring system, 11 network, 20 management server, 21 processing unit, 22 communication unit, 23 storage unit, 30 monitoring device, 31 control unit, 32 communication unit, 33 operation display unit, 40 sensors, 41, 42, 43, 44, 45, 46 human presence sensor, 47 sleep sensor, 48 toilet faucet sensor, 49 door sensor, 50 mobile terminal, 51 control unit, 52 communication unit, 53 operation display unit, 54 storage unit.

Claims (4)

  1.  An action record display device that acquires a user's action record and displays it on a display screen, the action record display device comprising:
     a display control unit that displays a time axis in the form of an arc or a circle on the display screen, obtains, from among a plurality of types of icons respectively associated with a plurality of predetermined types of action events, the icons respectively corresponding to a plurality of action events extracted from the action record, and controls the obtained icons to be displayed at the positions on the arc or the circle corresponding to the time periods in which the respective action events occurred.
  2.  The action record display device according to claim 1, wherein
     the display screen is a touch-panel screen, and
     the display control unit displays the time axis with a predetermined time segment by default and, when a swipe operation is performed on the display screen along the arc or the circle of the time axis, controls the time segment to shift forward or backward.
  3.  The action record display device according to claim 1, wherein
     the display screen is a touch-panel screen, and
     the display control unit controls the time scale of the time axis to widen or narrow when a pinch operation is performed on the display screen.
  4.  The action record display device according to any one of claims 1 to 3, wherein
     the action record display device is installed in a portable mobile terminal.
PCT/JP2022/023936 2022-06-15 2022-06-15 Action record display device WO2023242986A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/023936 WO2023242986A1 (en) 2022-06-15 2022-06-15 Action record display device


Publications (1)

Publication Number Publication Date
WO2023242986A1 true WO2023242986A1 (en) 2023-12-21

Family

ID=89192462


Country Status (1)

Country Link
WO (1) WO2023242986A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1185450A (en) * 1997-09-10 1999-03-30 Canon Inc Method and device for processing information
JP2007082190A (en) * 2005-09-09 2007-03-29 Lg Electronics Inc Event display device of mobile communication terminal and method thereof
US20080081594A1 (en) * 2006-09-29 2008-04-03 Lg Electronics Inc. Event information display apparatus and method for mobile communication terminal
JP2008211794A (en) * 2007-02-27 2008-09-11 Lg Electronics Inc Method and apparatus for displaying event of mobile terminal
JP2016202347A (en) * 2015-04-17 2016-12-08 セイコーエプソン株式会社 Biological information processing system, biological information processing device, and analysis result information generation method
WO2017022306A1 (en) * 2015-08-05 2017-02-09 ソニー株式会社 Information processing system and information processing method
JP2017074174A (en) * 2015-10-14 2017-04-20 株式会社日立製作所 Living body action evaluation device and living body action evaluation method
JP2019200779A (en) * 2019-02-06 2019-11-21 株式会社ニューロスペース Presentation device, method for presentation, and program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22946805

Country of ref document: EP

Kind code of ref document: A1