WO2023242986A1 - Action Record Display Device (行動記録表示装置) - Google Patents

Action Record Display Device (行動記録表示装置)

Info

Publication number
WO2023242986A1
WO2023242986A1 (PCT/JP2022/023936)
Authority
WO
WIPO (PCT)
Prior art keywords
time
action
control unit
display device
determines
Prior art date
Application number
PCT/JP2022/023936
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
菲 史
秀一郎 鬼頭
聡志 清水
丈二 五十棲
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to PCT/JP2022/023936 priority Critical patent/WO2023242986A1/ja
Priority to JP2024527983A priority patent/JPWO2023242986A1/ja
Publication of WO2023242986A1 publication Critical patent/WO2023242986A1/ja

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This specification discloses an action record display device.
  • Patent Document 1 discloses a graphic element for monitoring a person's activities of daily living (ADL), which has a circle including a symbol and a description of the ADL, together with an expression of the timing and duration of the event.
  • In a behavior record display device that displays behavior records, when multiple behavior events occur, displaying only the information corresponding to one behavior event on one time axis forces the user to check multiple time axes, which makes it difficult to grasp the behavior records later.
  • the main purpose of this disclosure is to make it easier to understand the behavior records of users.
  • the present disclosure has taken the following measures to achieve the above-mentioned main objective.
  • The action record display device of the present disclosure is an action record display device that acquires a user's action record and displays it on a display screen.
  • The gist of the device is to include a display control unit that displays a time axis of an arc or a circle on the display screen, acquires, for each of a plurality of action events extracted from the action record, the corresponding icon from among a plurality of types of icons respectively associated with a plurality of predetermined types of action events, and controls each acquired icon to be displayed at the position of the arc or the circle corresponding to the time period in which that action event occurred.
  • In the action record display device of the present disclosure, a time axis of an arc or a circle is displayed on the display screen, and each icon corresponding to the plurality of action events extracted from the user's action record is displayed at the position of the arc or circle corresponding to the time period in which that action event occurred.
  • Since the icons corresponding to the individual action events are displayed on a single time axis, the user can easily check both the type of each action event that occurred and the time period in which it occurred, which makes the activity record easier to understand.
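The display principle above, placing each event icon on an arc or circle at the position corresponding to its occurrence time, amounts to a single linear interpolation into an angle. The following is an illustrative sketch only, not the disclosed implementation; the function name, the screen-coordinate convention, and the full-day window are assumptions:

```python
import math

def time_to_arc_position(occurrence_min, start_min, span_min,
                         cx, cy, radius, start_deg=-90.0, sweep_deg=360.0):
    """Map a time (minutes) to (x, y) on an arc or circle time axis.

    start_min/span_min define the displayed time band; the arc begins at
    start_deg (-90 is the 12 o'clock position in screen coordinates) and
    covers sweep_deg degrees (360 for a full circle, less for an arc).
    """
    frac = (occurrence_min - start_min) / span_min  # 0.0 .. 1.0 along the band
    angle = math.radians(start_deg + frac * sweep_deg)
    return cx + radius * math.cos(angle), cy + radius * math.sin(angle)

# An event at 06:00 on a full-day (00:00-24:00) circular axis sits a quarter
# of the way around, i.e. at the 3 o'clock position of the circle.
x, y = time_to_arc_position(6 * 60, 0, 24 * 60, cx=0.0, cy=0.0, radius=100.0)
```

Each icon would be drawn centered on the returned point; a shorter arc (for example a 12-hour band on a semicircle) only changes `span_min` and `sweep_deg`.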
  • FIG. 1 is a schematic configuration diagram of the behavior monitoring system.
  • FIG. 2 is an explanatory diagram showing an example of the sensors installed in each room of a residence.
  • FIG. 3 is a flowchart illustrating an example of the data measurement process.
  • FIG. 4 is a flowchart illustrating an example of the measurement data registration process.
  • FIG. 5 is a flowchart illustrating an example of the action record display process.
  • FIG. 6 is an explanatory diagram showing an example of the action record display screen.
  • FIG. 7 is an explanatory diagram showing an example of the icons.
  • FIG. 8 is an explanatory diagram showing how the action record display changes due to a clockwise swipe operation.
  • FIG. 9 is an explanatory diagram showing how the action record display changes due to a counterclockwise swipe operation.
  • FIG. 10 is an explanatory diagram showing how the action record display changes due to a left swipe operation.
  • FIG. 11 is an explanatory diagram showing how the action record display changes due to a right swipe operation.
  • FIG. 12 is an explanatory diagram showing how the action record display changes due to a pinch-in operation.
  • FIG. 13 is an explanatory diagram showing how the action record display changes due to a pinch-out operation.
  • FIG. 1 is a schematic configuration diagram of the behavior monitoring system 10.
  • The behavior monitoring system 10 includes a management server 20 that manages the entire system, monitoring devices 30 that are installed in the residences A to C where the persons to be monitored live and that monitor those persons, and a mobile terminal 50, serving as an action record display device, that is carried by the person to be monitored, his or her caregiver or manager, etc. and displays the action record of the monitored person.
  • Residences A to C are residences where, for example, an elderly person or a person requiring care lives alone; as shown in FIG. 2, each includes an LDK room (living room, dining room, and kitchen), a bedroom, and other rooms.
  • the behavior monitoring system 10 can be used, for example, in place of a caregiver, to monitor the behavior of an elderly person or a care recipient as a monitoring target, and to detect behavioral abnormalities at an early stage.
  • the monitoring device 30 includes a control section 31, a communication section 32, an operation display section 33, a speaker (not shown), and sensors 40.
  • the control unit 31 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the operation display unit 33 and the speaker output various information from the management server 20 by display or sound.
  • the operation display section 33 is configured as a touch panel display that allows an operator to input operations.
  • The sensors 40 are sensors for detecting where the person to be monitored living in the residence is and what that person is doing; as shown in FIG. 2, they include human sensors 41, 42, 43, 44, 45, and 46, a sleep sensor 47 provided in the bedroom, a toilet faucet sensor 48 provided in the toilet room, and a door sensor 49 provided on the entrance door.
  • the human sensors 41 to 46 are sensors that detect people within the detection area in a non-contact manner, and are configured, for example, as infrared sensors that detect infrared rays and convert them into electrical signals.
  • the human sensors 41, 42, and 43 are provided in the living room, dining room, and kitchen of the LDK room, respectively. Further, the human sensor 44 is provided in the bedroom, and the human sensor 45 is provided in the bathroom. Furthermore, a human sensor 46 is provided in the toilet room.
  • the sleep sensor 47 is, for example, a sheet-type biological sensor installed under a bed mattress in a bedroom, and acquires biological information such as the pulse and breathing of the person sleeping on the bed.
  • the toilet faucet sensor 48 detects the operation of a flush lever or a flush switch that instructs flushing of the toilet bowl, and in this embodiment, it detects a large flush operation and a small flush operation in a distinguishable manner.
  • the door sensor 49 detects the opening and closing of the entrance door, and is configured, for example, as a magnetic opening/closing sensor having a permanent magnet fixed to the door side and a magnetic sensor fixed to the frame side.
  • the mobile terminal 50 is a portable communication terminal such as a smartphone or a tablet, and includes a control section 51, a communication section 52, an operation display section 53, and a storage section 54, as shown in FIG.
  • the control unit 51 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the operation display unit 53 is configured as a touch panel display that allows touch operations by an operator.
  • the storage unit 54 stores various application programs (processing programs), data files, identification information (identification ID) for identifying the mobile terminal 50, and the like.
  • the management server 20 includes a processing section 21, a communication section 22, and a storage section 23.
  • the processing unit 21 is configured as a microprocessor centered on a CPU, and includes a ROM, a RAM, etc. in addition to the CPU.
  • the communication unit 22 of the management server 20 is connected to the communication unit 32 of each monitoring device 30 and the communication unit 52 of the mobile terminal 50 via the network 11, and exchanges data and signals with each other.
  • the storage unit 23 is constituted by a hard disk drive (HDD), solid state drive (SSD), etc., and receives data measured by each monitoring device 30 and stores it over a certain period of time.
  • the data measurement process is a process of measuring (collecting) the location of the person to be monitored from sensors installed in each room of the residence.
  • FIG. 3 is a flowchart illustrating an example of a data measurement process executed by the control unit 31 of each monitoring device 30. This process is repeatedly executed at predetermined time intervals.
  • In the data measurement process, the control unit 31 of the monitoring device 30 first determines whether there is a reaction in the living room human sensor 41 provided in the living room (step S100).
  • If the control unit 31 determines that there is a reaction in the living room human sensor 41, it determines that the person to be monitored is in the living room (step S102), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S100 that there is no reaction in the living room human sensor 41, it next determines whether there is a reaction in the dining room human sensor 42 provided in the dining room (step S104). If the control unit 31 determines that there is a reaction in the dining room human sensor 42, it determines that the person to be monitored is in the dining room (step S106), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S104 that there is no reaction in the dining room human sensor 42, it next determines whether there is a reaction in the kitchen human sensor 43 provided in the kitchen (step S108). If the control unit 31 determines that there is a reaction in the kitchen human sensor 43, it determines that the person to be monitored is in the kitchen (step S110), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S108 that there is no reaction in the kitchen human sensor 43, it next determines whether there is a reaction in the bedroom human sensor 44 provided in the bedroom (step S112). If the control unit 31 determines that there is a reaction in the bedroom human sensor 44, it determines that the person to be monitored is in the bedroom (step S114). Then, the control unit 31 acquires the detection signal of the sleep sensor 47 (step S116), transmits the determination result and the detection signal of the sleep sensor 47 to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S112 that there is no reaction in the bedroom human sensor 44, it next determines whether there is a reaction in the bathroom human sensor 45 provided in the bathroom (step S118). If the control unit 31 determines that there is a reaction in the bathroom human sensor 45, it determines that the person to be monitored is in the bathroom (step S120), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S118 that there is no reaction in the bathroom human sensor 45, it next determines whether there is a reaction in the toilet room human sensor 46 provided in the toilet room (step S122). If the control unit 31 determines that there is a reaction in the toilet room human sensor 46, it determines that the person to be monitored is in the toilet room (step S124). Then, the control unit 31 acquires the detection signal of the toilet faucet sensor 48 (step S126), transmits the determination result and the detection signal of the toilet faucet sensor 48 to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines in step S122 that there is no reaction in the toilet room human sensor 46, it next determines whether there is a reaction in the entrance door sensor 49 provided on the entrance door (step S128).
  • If the control unit 31 determines that there is a reaction in the entrance door sensor 49, it determines whether the person to be monitored was determined to be at home in step S134, which will be described later (step S130).
  • If the person to be monitored was determined to be at home, the control unit 31 determines that the person to be monitored has gone out (step S132), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
  • If the control unit 31 determines that the person to be monitored is not at home (determined to be out of the house), it determines that the person to be monitored has returned home, that is, is at home (step S134), transmits the determination result to the management server 20 as measurement data (step S136), and ends the data measurement process.
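The cascade of steps S100 to S134 is a fixed-priority chain: each human sensor is checked in turn, and the entrance door sensor is consulted only when no room sensor reacts. A minimal sketch of that chain (the dict-based readings format, sensor keys, and return values are assumptions for illustration, not the disclosed implementation):

```python
def determine_location(readings, was_at_home):
    """Mimic the FIG. 3 cascade: check the human sensors in a fixed
    priority order, then fall back to the door sensor to toggle home/out.

    readings: dict of sensor name -> bool (True = the sensor reacted).
    was_at_home: result of the previous cycle (plays the role of step S130).
    Returns (location_or_state, at_home).
    """
    order = [("living", "living room"), ("dining", "dining room"),
             ("kitchen", "kitchen"), ("bedroom", "bedroom"),
             ("bathroom", "bathroom"), ("toilet", "toilet room")]
    for sensor, room in order:
        if readings.get(sensor):
            return room, True
    if readings.get("door"):
        # Door opened: at home -> went out; out -> returned home.
        return ("went out", False) if was_at_home else ("returned home", True)
    return "unknown", was_at_home

loc, home = determine_location({"bedroom": True, "door": True}, True)
# The bedroom sensor wins over the door sensor: it is checked first.
```

The list order encodes the same priority as the flowchart, so a reaction in an earlier room masks any later sensor in the same cycle.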
  • FIG. 4 is a flowchart illustrating an example of the measurement data registration process executed by the processing unit 21 of the management server 20. This process is repeatedly executed at predetermined time intervals.
  • In the measurement data registration process, the processing unit 21 of the management server 20 first determines whether measurement data has been received from the monitoring device 30 (step S200). If the processing unit 21 determines in step S200 that no measurement data has been received, the process proceeds to step S206. On the other hand, when it determines that measurement data has been received, the processing unit 21 accesses a time server via the Internet, acquires the current date, time (hours/minutes/seconds), and day of the week as time information (step S202), stores the acquired time information in the storage unit 23 in association with the measurement data (step S204), and proceeds to step S206. Note that the time information may instead be acquired by reading the current time from an RTC (real-time clock).
  • Next, the processing unit 21 determines whether measurement data for a certain period (for example, one day) has been accumulated in the storage unit 23 (step S206). If the processing unit 21 determines that measurement data for the certain period has not been accumulated, the process proceeds to step S212. On the other hand, if the processing unit 21 determines that measurement data for the certain period has been accumulated, it extracts predetermined behavioral events from that measurement data (step S208) and registers the extracted behavioral events in the storage unit 23 together with their occurrence times (step S210).
  • the behavioral events include waking up, going to bed, eating, excretion (small), excretion (large), bathing, and the like.
  • waking up can be determined based on the fact that the sleep sensor 47 detects a change from a sleeping state to an awake state while the person is in the bedroom.
  • Going to bed can be determined based on a change from an awake state to a sleeping state being detected by the sleep sensor 47 while the person is in the bedroom.
  • Meals can be determined based on a stay in the dining room of a certain period or longer after it is determined that the person has moved from the kitchen to the dining room.
  • Excretion (small) can be determined based on the fact that a small flushing operation is detected by the toilet faucet sensor 48 while the user is in the toilet room.
  • Excretion (large) can be determined based on the fact that a large flush operation is detected by the toilet faucet sensor 48 while the user is in the toilet room.
  • Taking a bath can be determined based on the determination that the user has stayed in the bathroom for a certain period of time or more.
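The extraction rules above are pattern matches over the time-stamped measurement stream. A hedged sketch of two of them, meals and excretion (the record format, room names, and the 15-minute threshold are assumptions; the disclosure only requires a stay of "a certain period or more"):

```python
def extract_events(records, meal_min_stay=15):
    """Extract behavioral events from time-ordered (time, room, extra) records.

    Only two of the disclosed rules are sketched here: meals (a move from
    the kitchen to the dining room followed by a stay of at least
    meal_min_stay minutes) and excretion (a flush signal detected while in
    the toilet room). Times are minutes since midnight.
    """
    events = []
    for i, (t, room, extra) in enumerate(records):
        if room == "toilet room" and extra in ("flush_small", "flush_large"):
            kind = ("excretion (small)" if extra == "flush_small"
                    else "excretion (large)")
            events.append((t, kind))
        if room == "dining room" and i > 0 and records[i - 1][1] == "kitchen":
            # Look ahead: how long does the dining-room stay last?
            end = t
            for t2, room2, _ in records[i:]:
                if room2 != "dining room":
                    break
                end = t2
            if end - t >= meal_min_stay:
                events.append((t, "meal"))
    return events

recs = [(420, "kitchen", None), (425, "dining room", None),
        (445, "dining room", None), (450, "living room", None),
        (600, "toilet room", "flush_small")]
# A 20-minute dining stay after the kitchen -> a meal at 07:05;
# the small flush at 10:00 -> excretion (small).
```

The remaining rules (waking up, going to bed, bathing) follow the same shape with the sleep sensor state or bathroom stay substituted for the conditions above.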
  • Next, the processing unit 21 determines whether there is a request to transmit an action record from the mobile terminal 50 (step S212). If the processing unit 21 determines that there is no request to transmit an action record, it immediately ends the measurement data registration process. On the other hand, if the processing unit 21 determines that there is a request to transmit an action record, it transmits the action events stored in the storage unit 23, together with their respective occurrence times, as an action record to the mobile terminal 50 that made the request (step S214), and ends the measurement data registration process.
  • FIG. 5 is a flowchart illustrating an example of an action record display process executed by the control unit 51 of the mobile terminal 50. This process is executed when a dedicated application installed in the storage unit 54 of the mobile terminal 50 is started.
  • In the action record display process, the control unit 51 of the mobile terminal 50 first transmits an action record transmission request to the management server 20 (step S300) and waits to receive the action record from the management server 20 (step S302).
  • When the control unit 51 receives the action record, it displays the action record on the screen of the operation display unit 53 (step S304).
  • FIG. 6 is an explanatory diagram showing an example of an action record display screen.
  • FIG. 7 is an explanatory diagram showing an example of an icon.
  • As shown in FIG. 6, the action record display screen includes the user name, the date of the action record to be displayed, an arc-shaped time axis T with a time scale corresponding to the target time period (time division), and a plurality of icons I arranged on the time axis T.
  • A plurality of types of icons I are prepared; as shown in FIG. 7, each icon I is associated with a type of behavioral event, such as waking up, going to bed, eating, excretion (small), excretion (large), and bathing.
  • The behavior record is displayed by extracting, from the behavior record received from the management server 20, the behavior events that occurred during the target time period on the target date, selecting the icon I corresponding to each extracted behavior event, and displaying (arranging) each selected icon I at the position on the time scale corresponding to the occurrence time of that behavior event. Thereby, even with the limited screen size of the mobile terminal 50, the user can easily grasp the behavior record of the person to be monitored during the target time period on the target date via the behavior record display screen.
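The selection step described above, filtering the received action record down to the target time band and pairing each event with its icon, can be sketched as follows (the icon identifiers and data shapes are hypothetical, chosen only for illustration):

```python
# Hypothetical icon identifiers, one per predetermined behavioral event type.
ICONS = {"wake up": "I_wake", "go to bed": "I_bed", "meal": "I_meal",
         "excretion (small)": "I_exc_s", "excretion (large)": "I_exc_l",
         "bath": "I_bath"}

def icons_for_band(action_record, band_start, band_end):
    """Return (time, icon) pairs for events inside [band_start, band_end).

    action_record: list of (time_in_minutes, event_type) from the server.
    Events whose type has no registered icon are skipped.
    """
    return [(t, ICONS[kind]) for t, kind in action_record
            if band_start <= t < band_end and kind in ICONS]

record = [(370, "wake up"), (430, "meal"), (1320, "go to bed")]
placed = icons_for_band(record, 360, 720)  # 06:00-12:00 target band
```

Each returned pair would then be positioned on the arc at the angle corresponding to its time, so a single time axis carries icons for every event type at once.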
  • Next, the control unit 51 determines whether an arc swipe operation has been performed on the action record display screen (step S306).
  • The arc swipe operation is an operation in which a finger touching the screen of the operation display unit 53 is moved in an arc along the arc of the time axis while remaining in contact; there is a clockwise swipe operation that moves the finger clockwise and a counterclockwise swipe operation that moves the finger counterclockwise. If the control unit 51 determines that an arc swipe operation has not been performed, the process proceeds to step S314. On the other hand, if the control unit 51 determines that an arc swipe operation has been performed, it determines whether the arc swipe operation is a clockwise swipe operation (step S308).
  • FIG. 8 is an explanatory diagram showing how the action record display changes due to a clockwise swipe operation. As shown in the figure, when a clockwise swipe operation is performed, the time scale of the time axis shifts in the forward direction by a fixed amount of time each time the operation is performed. Then, the icon is displayed (rearranged) at a position corresponding to the time of occurrence of the behavioral event on the time scale of the time axis shifted in the forward direction.
  • FIG. 9 is an explanatory diagram showing how the behavior record display changes due to a counterclockwise swipe operation. As shown in the figure, when a counterclockwise swipe operation is performed, the time scale of the time axis shifts in the backward direction by a fixed amount of time each time the operation is performed. Then, the icon is displayed (rearranged) at a position corresponding to the time of occurrence of the behavioral event on the time scale of the time axis shifted in the backward direction.
  • Next, the control unit 51 determines whether a left/right swipe operation has been performed (step S314).
  • The left/right swipe operation is an operation in which a finger touching the screen of the operation display unit 53 is moved in a straight line in the left/right direction while remaining in contact; there is a left swipe operation that moves the finger to the left and a right swipe operation that moves the finger to the right. If the control unit 51 determines that a left/right swipe operation has not been performed, the process proceeds to step S322. On the other hand, if the control unit 51 determines that a left/right swipe operation has been performed, it determines whether it is a left swipe operation (step S316).
  • FIG. 10 is an explanatory diagram showing how the behavior record display changes due to the left swipe operation. As shown in the figure, when a left swipe operation is performed, the target date and time is shifted to the previous day while the target time zone (time scale of the time axis) is maintained. Then, the icon is changed to one corresponding to the action event that occurred during the target time period of the target date and time shifted to the previous day.
  • FIG. 11 is an explanatory diagram showing how the action record display changes due to the right swipe operation. As shown in the figure, when a right swipe operation is performed, the target date and time is shifted to the next day while the target time zone (time scale of the time axis) is maintained. Then, the icon is changed to one corresponding to the action event that occurred during the target time period of the target date and time shifted to the next day.
  • Next, the control unit 51 determines whether a pinch operation has been performed (step S322).
  • The pinch operation is an operation in which two fingers touch the screen of the operation display unit 53 and are spread apart or brought together while remaining in contact; there is a pinch-in operation that brings the two fingers together and a pinch-out operation that spreads them apart. If the control unit 51 determines that a pinch operation has not been performed, the process proceeds to step S330. On the other hand, if the control unit 51 determines that a pinch operation has been performed, it determines whether the pinch operation is a pinch-in operation (step S324). If the control unit 51 determines that it is a pinch-in operation, it reduces the width of the time scale on the time axis (step S326) and proceeds to step S330.
  • FIG. 12 is an explanatory diagram showing how the action record display changes due to the pinch-in operation. As shown in the figure, when the pinch-in operation is performed, the width of the time scale on the time axis is reduced, and the target time period is reduced. Then, the icon is changed to one corresponding to the action event that occurred during the reduced target time period.
  • If the control unit 51 determines that the pinch operation is not a pinch-in operation but a pinch-out operation, it expands the width of the time scale on the time axis (step S328).
  • FIG. 13 is an explanatory diagram showing how the behavior record display changes due to the pinch-out operation. As shown in the figure, when the pinch-out operation is performed, the width of the time scale on the time axis expands, and the target time period expands. Then, the icon is changed to one corresponding to the action event that occurred during the expanded target time period.
  • Then, the control unit 51 determines whether an end operation to end the display has been performed (step S330).
  • If the control unit 51 determines that the end operation has not been performed, the process returns to step S306 and is repeated; if it determines that the end operation has been performed, the control unit 51 ends the action record display process.
  • The mobile terminal 50 of this embodiment is an example of the behavior record display device of the present disclosure, and the control unit 51 that executes the behavior record display process is an example of the display control unit.
  • control unit 51 determines that the arc swipe operation is a clockwise swipe operation, it shifts the target time period in the forward direction, and determines that the arc swipe operation is a counterclockwise swipe operation. Then, an example of shifting the target time period in the backward direction is shown. However, for example, if the control unit 51 determines that the arc swipe operation is a counterclockwise swipe operation, it shifts the target time period in the forward direction, and if it determines that the arc swipe operation is a clockwise swipe operation, it shifts the target time period. The band may be shifted in the return direction.
  • Likewise, the control unit 51 may shift the target date of the target time period to the previous day when it determines that the left/right swipe operation is a right swipe operation, and to the next day when it determines that it is a left swipe operation.
  • As described above, in the action record display device of the present disclosure, a time axis of an arc or a circle is displayed on the display screen, and each icon corresponding to the plurality of behavior events extracted from the user's behavior record is displayed at the position of the arc or circle corresponding to the time period in which that behavior event occurred.
  • Since the icons corresponding to the individual behavioral events are displayed on a single time axis, the user can easily check the type of each behavioral event that occurred and the time period in which it occurred, which makes the activity records easier to understand.
  • The display screen may be a touch panel type screen, and the display control unit may perform control so as to change the time scale of the time axis when a pinch operation is performed on the display screen. In this way, the time scale of the time axis on which action records are displayed can be changed through an intuitive operation.
  • the present disclosure is not limited to the form of an action record display device, but can also be applied to a form of an action record display method.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/023936 WO2023242986A1 (ja) 2022-06-15 2022-06-15 行動記録表示装置
JP2024527983A JPWO2023242986A1 2022-06-15 2022-06-15

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/023936 WO2023242986A1 (ja) 2022-06-15 2022-06-15 行動記録表示装置

Publications (1)

Publication Number Publication Date
WO2023242986A1 true WO2023242986A1 (ja) 2023-12-21

Family

ID=89192462

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/023936 WO2023242986A1 (ja) 2022-06-15 2022-06-15 行動記録表示装置

Country Status (2)

Country Link
JP (1) JPWO2023242986A1
WO (1) WO2023242986A1

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1185450A (ja) * 1997-09-10 1999-03-30 Canon Inc 情報処理方法及び装置
JP2007082190A (ja) * 2005-09-09 2007-03-29 Lg Electronics Inc 移動通信端末機のイベント表示装置及びその方法
US20080081594A1 (en) * 2006-09-29 2008-04-03 Lg Electronics Inc. Event information display apparatus and method for mobile communication terminal
JP2008211794A (ja) * 2007-02-27 2008-09-11 Lg Electronics Inc 移動端末機のイベント表示方法及びその装置
JP2016202347A (ja) * 2015-04-17 2016-12-08 セイコーエプソン株式会社 生体情報処理システム、生体情報処理装置及び解析結果情報の生成方法
WO2017022306A1 (ja) * 2015-08-05 2017-02-09 ソニー株式会社 情報処理システム、及び情報処理方法
JP2017074174A (ja) * 2015-10-14 2017-04-20 株式会社日立製作所 生体行動評価装置及び生体行動評価方法
JP2019200779A (ja) * 2019-02-06 2019-11-21 株式会社ニューロスペース 提示装置、提示方法及びプログラム


Also Published As

Publication number Publication date
JPWO2023242986A1 2023-12-21

Similar Documents

Publication Publication Date Title
US10217342B2 (en) Method and process for determining whether an individual suffers a fall requiring assistance
US8659423B2 (en) Smart display device for independent living care
JP2009254817A (ja) 人の認知能力を監視するシステム及び方法
JP6357623B2 (ja) 情報処理装置、プログラム及び情報処理方法
JP7021634B2 (ja) 被監視者監視システムの中央処理装置および中央処理方法ならびに被監視者監視システム
KR102404885B1 (ko) IoT 기반 고독사 방지 시스템 및 그 방법
WO2017082037A1 (ja) 被監視者監視システムの中央処理装置および該方法ならびに前記被監視者監視システム
US20250090709A1 (en) Sanitation method and system
JP3606386B2 (ja) 要介護者の異常検出システム
JP6445815B2 (ja) 情報処理装置、プログラム及び情報処理方法
JP2004133777A (ja) 監視対象者生活監視装置、監視対象者追跡装置、監視対象者生活監視方法、監視対象者追跡方法、コンピュータ・プログラム及び記録媒体
WO2017026309A1 (ja) センサ装置及び介護支援システム
JP2015060530A (ja) 見守りシステム、見守り方法、見守り端末、管理端末、プログラム、記録媒体
JP7396390B2 (ja) 被監視者監視システムの中央処理装置および中央処理方法
WO2023242986A1 (ja) 行動記録表示装置
JP7420207B2 (ja) 睡眠状態判定システム、及び睡眠状態判定方法
JP2004046560A (ja) 独居居住者ライフラインデータ処理システム
EP3832618A1 (en) Monitoring system
JPWO2019142566A1 (ja) 被監視者監視支援システムおよび被監視者監視支援方法
WO2025027708A1 (ja) 生活行動表示装置
JP7147787B2 (ja) 被監視者監視支援装置、被監視者監視支援方法、および、被監視者監視支援プログラム
JP7268604B2 (ja) 被監視者監視支援装置、被監視者監視支援システム、被監視者監視支援方法、および、被監視者監視支援プログラム
TWI854720B (zh) 照護系統
JP6679019B1 (ja) 情報処理システム、コンピュータプログラム、及び情報処理方法。
GB2579674A (en) Monitoring method and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22946805

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024527983

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22946805

Country of ref document: EP

Kind code of ref document: A1