WO2021192399A1 - Behavior recognition server, behavior recognition system, and behavior recognition method - Google Patents

Behavior recognition server, behavior recognition system, and behavior recognition method

Info

Publication number
WO2021192399A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor information
behavior
unit
observed person
sensor
Prior art date
Application number
PCT/JP2020/042057
Other languages
English (en)
Japanese (ja)
Inventor
健太郎 佐野
大平 昭義
佐知 田中
卓男 姚
浩平 京谷
優佑 円谷
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Priority to CN202080064882.1A
Publication of WO2021192399A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M11/00 Telephonic communication systems specially adapted for combination with other electrical systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information

Definitions

  • the present invention relates to a behavior recognition server, a behavior recognition system, and a behavior recognition method.
  • IoT (Internet of Things)
  • Patent Document 1 describes a method of smoothing the reactions of a sensor and associating them with one action, according to a sensor reaction definition prepared in advance, even when the sensor reacts a plurality of times within one second.
  • on the other hand, erroneous detection may occur in which a person who is actually present is measured as not detected. For example, there is a false-detection factor in which the infrared rays emitted from the motion sensor for detection are blocked by the illumination light in the room. As a result, the motion sensor overlooks a stationary person, and even when a person is relaxing in the room, it may be mistakenly recognized that no one is there.
  • the main object of the present invention is to suppress the reduction of recognition accuracy caused by sensor information that includes false-detection data.
  • the behavior recognition server of the present invention has the following features.
  • the present invention includes a sensor information acquisition unit that acquires sensor information indicating a detection result for each sensor from a set of sensors that detect an observed person.
  • a sensor information conversion unit that, based on each reaction time at which the observed person is detected in the time-series sensor information, converts the sensor information into a probability density function in the time direction that takes its maximum value at that reaction time,
  • and a behavior classification unit that classifies the behavior of the observed person at each time based on the converted sensor information. It is further characterized by having a behavior output unit that converts the classified behavior of the observed person into data and outputs it. Other means will be described later.
  • FIG. 1 is a configuration diagram of an action recognition system.
  • the behavior recognition system is configured so that the observer 3u remotely monitors, using the observer terminal 3, the living condition of the observed person 2u living at home 2h.
  • the behavior recognition server 1 recognizes the living state of the observed person 2u based on the sensor information acquired from various sensors 2, and notifies the observer terminal 3 of the recognition result.
  • the observer 3u who sees the display screen of the observer terminal 3 can grasp the living state of the observed person 2u.
  • the observed person 2u is, for example, a care recipient, and the observer 3u is, for example, the family of the care recipient.
  • a behavior recognition system may be introduced in a hospital or a long-term care facility instead of the home 2h, in which case the observer 3u becomes a doctor or a care manager.
  • the sensor 2 may be, for example, a sensor incorporated in a home electric appliance such as the refrigerator 2a or the autonomous mobile vacuum cleaner 2b, or a standalone sensor such as the motion sensor 2c. It is desirable that a sensor 2 such as the motion sensor 2c be installed so that its measurement area does not face the entrance of the room. This installation prevents the motion sensor 2c from erroneously detecting a person other than the observed person 2u passing through the corridor outside the room.
  • FIG. 2 is a hardware configuration diagram of the behavior recognition system.
  • the sensor 2 has a detection unit 122 that detects the observed person 2u, a communication unit 121 that notifies other devices of the sensor information detected by the detection unit 122, a notification unit 123 that notifies the observed person 2u of messages from the observer 3u, and the like.
  • the action recognition server 1 has a communication unit 111 that receives the sensor information from the sensors 2 and notifies the observer terminal 3 of the recognition result derived from that sensor information, a control unit 112 that recognizes the living state of the observed person 2u, and a storage unit 113 that stores the data used for the processing of the control unit 112.
  • the observer terminal 3 has a communication unit 131 that receives the recognition result for the observed person 2u, a notification unit 132 that notifies the observer 3u of that recognition result, and an input unit 133 operated to input messages and the like addressed to the observed person 2u.
  • the action recognition server 1 is configured as a computer having a CPU (Central Processing Unit) as an arithmetic unit (control unit 112), a memory as a main storage device, and a hard disk as an external storage device (storage unit 113).
  • the CPU constitutes a control unit (control means) composed of the respective processing units by executing a program (also called an application) read into the memory.
  • FIG. 3 is a configuration diagram showing details of the action recognition server 1.
  • the control unit 112 (FIG. 2) of the action recognition server 1 has a sensor information acquisition unit 11, a sensor information conversion unit 11T, a time information acquisition unit 12, an image conversion unit 13, an action classification unit 14, an action correction unit 15, a current action storage unit 16, and an action output unit 17.
  • the storage unit 113 (FIG. 2) of the action recognition server 1 stores the layout data 13L and the classification model 14m.
  • FIG. 4 is a flowchart showing the processing of the action recognition server 1.
  • the sensor information acquisition unit 11 acquires sensor information from the sensors 2 (refrigerator 2a, vacuum cleaner 2b, motion sensor 2c) installed in the home 2h (S101).
  • the data format of the sensor information may differ depending on the type of the sensor 2.
  • the sensor information conversion unit 11T receives sensor information in the discrete-value data format of 0 or 1 from the sensor information acquisition unit 11 and converts the discrete-value sensor information into sensor information in the form of a probability density function (S102; see FIGS. 5 to 9 below).
  • from input data whose discrete value is "1" at a time t at which the sensor reacted, the sensor information conversion unit 11T creates output data by setting the function value at time t to the maximum value (for example, "1") and adding function values less than the maximum before and after that time in the time direction (FIG. 5). These function values are calculated by the sensor information conversion unit 11T so that they become smaller as the time difference from time t becomes larger.
  • on the other hand, the sensor information conversion unit 11T passes input data in any data format other than discrete values through as output data without conversion.
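  • As an illustrative aid (not part of the patent text), the conversion of S102 can be sketched in Python. The Gaussian kernel and the width sigma below are assumptions; the patent only requires a function that peaks at the maximum value at each reaction time, decays as the time difference grows, and merges overlaps by the maximum (or sum).

      import numpy as np

      def to_probability_density(detections, sigma=2.0):
          """Convert a 0/1 detection series into values that peak at 1.0
          at each reaction time and decay before and after it."""
          detections = np.asarray(detections, dtype=float)
          times = np.arange(len(detections))
          reaction_times = times[detections == 1]
          if reaction_times.size == 0:
              return np.zeros(len(detections))
          # One bell curve per reaction time, peak value 1.0 at that time.
          diffs = times[:, None] - reaction_times[None, :]
          curves = np.exp(-(diffs ** 2) / (2.0 * sigma ** 2))
          # Merge overlapping sections by taking the maximum (graph 213);
          # curves.sum(axis=1) would realize the "sum of the curves" variant.
          return curves.max(axis=1)

      raw = np.zeros(16)
      raw[[2, 4, 6, 10, 12]] = 1            # reactions at t1..t5
      smooth = to_probability_density(raw)  # nonzero around each peak

    Even if the "1" at one of these times were missed, the merged values around it would remain nonzero, which is the relief effect described for FIG. 6 below.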
  • the image conversion unit 13 converts a set of sensor information at a predetermined time into an image, based on the per-sensor sensor information that is the output data of the sensor information conversion unit 11T (S103).
  • in the layout data 13L referred to by the image conversion unit 13 during this conversion, information about the layout within the image, such as in which part of the image the sensor information of which sensor 2 is to be arranged, is defined in advance (FIGS. 10 and 11).
  • the image conversion unit 13 may acquire, via the time information acquisition unit 12, time information indicating the predetermined time at which the sensor information was measured, and include that time information in the imaging target. If the sensor 2 includes a time stamp in the sensor information, the time information acquisition unit 12 acquires that time; if there is no time stamp, the reception time of the sensor information is used instead. Note that the behavior classification unit 14 may also accept sensor information and time information that have not been imaged, omitting the imaging of the sensor information by the image conversion unit 13.
  • the behavior classification unit 14 classifies the behavior of the observed person 2u at the time indicated by the time information from the image data representing the sensor information (S104). For this classification process, a classification model 14m is prepared in advance that, given image data as input, converts the corresponding behavior into data and outputs it. The classification model 14m is trained by a machine learning algorithm such as deep learning.
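  • As an illustrative aid (not part of the patent text), a classification model 14m could be realized, for example, as a small convolutional network. PyTorch, the 12x12 single-channel input, and the behavior count below are assumptions; the patent specifies only that image data goes in and a behavior classification comes out.

      import torch
      import torch.nn as nn

      class BehaviorClassifier(nn.Module):
          def __init__(self, num_behaviors=8):   # behavior count is assumed
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1x12x12 -> 16x12x12
                  nn.ReLU(),
                  nn.MaxPool2d(2),                             # -> 16x6x6
                  nn.Flatten(),
                  nn.Linear(16 * 6 * 6, num_behaviors),        # behavior logits
              )

          def forward(self, x):
              return self.net(x)

      model = BehaviorClassifier()
      image = torch.rand(1, 1, 12, 12)          # one imaged sensor snapshot (S103)
      behavior_id = model(image).argmax(dim=1)  # index of the classified behavior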
  • the behavior correction unit 15 corrects an unnatural behavior that occurs only momentarily by referring to the behaviors temporally before and after each behavior output by the behavior classification unit 14 (see FIG. 13 below). When the behavior to be focused on this time (the current behavior) changes locally from the behaviors before and after it (S111, Yes), the behavior correction unit 15 corrects the local behavior so as to be consistent with the surrounding behaviors and then accumulates the corrected behavior in the current behavior storage unit 16 (S112). On the other hand, when there is no local change (S111, No), the natural behavior is accumulated in the current behavior storage unit 16 as it is (S113).
  • the action output unit 17 outputs the action recognition results accumulated in the current action storage unit 16 to the outside (the observer terminal 3).
  • the output destination of the action recognition result is not limited to the customer environment (observer terminal 3), and may be output to another system such as a database system or a cloud system.
  • FIG. 5 shows a time series graph of sensor information in a state where there is no detection omission.
  • Graph 211 is the sensor information input from the sensor information acquisition unit 11 to the sensor information conversion unit 11T.
  • graph 211 contains the discrete value "1" indicating detection of the observed person 2u at each of the reaction times t1 to t5.
  • the graph 212 is the result of the sensor information conversion unit 11T converting the discrete value sensor information into the probability density function using the graph 211 as input data.
  • the sensor information conversion unit 11T receives the discrete value “1” of the reaction time t1 and converts it into a probability density function of the curve m1 having the reaction time t1 as a peak. Similarly, the sensor information conversion unit 11T creates a curve m2 at the reaction time t2, a curve m3 at the reaction time t3, a curve m4 at the reaction time t4, and a curve m5 at the reaction time t5, respectively.
  • as the distribution into which the sensor information is converted as a probability density function, the sensor information conversion unit 11T can apply, for example, a normal distribution, a Student's t distribution, a U distribution, or an arbitrary distribution used in other statistical fields.
  • Graph 213 integrates the overlapping sections between the curves of Graph 212.
  • in overlapping sections, the sensor information conversion unit 11T adopts the maximum value of the curves, but the sum of the curves may be adopted instead.
  • the value of the probability density function at each time is uniquely obtained in the graph 213.
  • since the function values at the reaction times t1 to t5 are not "0" even after the conversion by the sensor information conversion unit 11T, correct detection results are not erased by the sensor information conversion unit 11T.
  • FIG. 6 shows a time-series graph in a state where some detection omissions have occurred relative to the time-series graph of FIG. 5.
  • Graph 221 is sensor information input from the sensor information acquisition unit 11 to the sensor information conversion unit 11T.
  • at times t2 and t4, the discrete value becomes "0" due to detection omission.
  • at the other times, the discrete value "1" is correctly detected as in FIG. 5.
  • Graph 222 is the result of the sensor information conversion unit 11T converting the sensor information of discrete values into a probability density function using the graph 221 as input data.
  • in graph 222, the curve m2 at time t2 and the curve m4 at time t4 are missing compared with graph 212 of FIG. 5.
  • Graph 223 merges the overlapping sections between the curves of graph 222, similarly to graph 213 of FIG. 5.
  • here, the sensor information (function value) at time t2 is not "0", because it is influenced by the probability density functions (curves m1 and m3) from the temporally nearby times t1 and t3.
  • similarly, the function value at time t4 is influenced by the probability density function (curve m5) from the temporally nearby time t5. In this way, even if detection omissions occur at times t2 and t4, they can be relieved by using other signals in the temporal vicinity as probability density functions.
  • FIG. 7 shows time-series graphs in which probability density functions other than curves are applied to the same input data as the time-series graph of FIG. 5. Like graph 211, graph 231 contains the discrete value "1" indicating detection of the observed person 2u at each of the times t1 to t5.
  • graph 232 is the result of the sensor information conversion unit 11T converting graph 231, as input data, into linearly approximated probability density functions with peaks at each time t1 to t5 of the discrete value "1". Linear approximation requires less calculation. Besides linear approximation, the sensor information conversion unit 11T may also use the curve approximation shown in FIG. 5, polynomial approximation (not shown), or the like.
  • graph 233 is the result of the sensor information conversion unit 11T converting graph 231, as input data, into random values within predetermined ranges.
  • the range in which the random value can be taken differs depending on whether the discrete value of the input data is "0" or "1".
  • Discrete value "1" of the input data → output data: a random value in the range 0.7 to 1.0
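  • As an illustrative aid (not part of the patent text), the two alternative conversions of FIG. 7 can be sketched as follows. The triangle half-width and the random range for a discrete "0" are assumptions; only the 0.7 to 1.0 range for a discrete "1" is taken from the text.

      import numpy as np

      def triangular_density(detections, half_width=3):
          """Linear-approximation peaks: 1.0 at each reaction time, falling
          linearly to 0.0 at +/- half_width steps (graph 232)."""
          detections = np.asarray(detections, dtype=float)
          times = np.arange(len(detections))
          reaction_times = times[detections == 1]
          if reaction_times.size == 0:
              return np.zeros(len(detections))
          dist = np.abs(times[:, None] - reaction_times[None, :])
          return np.clip(1.0 - dist / half_width, 0.0, None).max(axis=1)

      def random_encoding(detections, rng=None):
          """Random-value conversion (graph 233)."""
          rng = rng or np.random.default_rng()
          detections = np.asarray(detections)
          out = rng.uniform(0.0, 0.3, size=len(detections))   # assumed range for "0"
          hits = detections == 1
          out[hits] = rng.uniform(0.7, 1.0, size=hits.sum())  # range given in the text
          return out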
  • FIG. 8 is a graph when the probability density function is applied to the spatial axis.
  • in the above description, the sensor information conversion unit 11T pseudo-creates detection signals around the time at which the discrete value "1" of the input data occurred, by applying the probability density function to the time axis.
  • on the other hand, by applying the probability density function to the spatial axis, the sensor information conversion unit 11T may also pseudo-create detection signals for the places (bedroom, kitchen) around the place (living room) where the discrete value "1" of the input data occurred.
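  • As an illustrative aid (not part of the patent text), the spatial variant can be sketched with a hop-distance map between rooms. The floor plan, the room names, and the decay factor below are illustrative assumptions modeled on the living room/bedroom/kitchen example.

      # Hop distances between rooms, as would be read off a floor plan (FIG. 9).
      HOPS = {
          ("living", "kitchen"): 1,
          ("living", "bedroom"): 1,
          ("kitchen", "bedroom"): 2,
      }

      def spatial_density(detected_room, rooms, decay=0.5):
          """1.0 in the detected room, decay**distance in nearby rooms."""
          def distance(a, b):
              if a == b:
                  return 0
              return HOPS.get((a, b), HOPS.get((b, a), 99))
          return {room: decay ** distance(detected_room, room) for room in rooms}

      # A motion-sensor hit in the living room also yields 0.5 for the
      # adjacent bedroom and kitchen as pseudo detection signals.
      print(spatial_density("living", ["living", "kitchen", "bedroom"]))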
  • FIG. 9 is a plan view showing a specific example of the space to which the graph of FIG. 8 is applied.
  • FIG. 10 is an explanatory diagram showing an example of layout data 13L used by the image conversion unit 13 for imaging processing.
  • in the layout data 13L, the data contents to be written at each position in square image data of 12 cells vertically by 12 cells horizontally are arranged as in-figure symbols such as "T" and "ACC1".
  • a "cell" is the smallest unit into which the image area is subdivided, and at least one writing area is assigned to each piece of sensor information and to the time information.
  • FIG. 11 is an explanatory table of the layout data 13L of FIG.
  • the topmost "T" in FIG. 10 corresponds to the in-figure symbol "T" in the first row, "time", of FIG. 11.
  • the image data arranged at the position of the topmost "T" in FIG. 10 is the time data acquired by the time information acquisition unit 12. That is, the single image shown in FIG. 12 is the result of aggregating and visualizing a set of sensor information measured at the same measurement time (the time data of "T") by the sensors 2 arranged at the respective locations.
  • the types of sensor 2 whose information the sensor information conversion unit 11T converts into a probability density function in S102 include, for example, those that detect the motion of the observed person 2u, such as acceleration sensors and (door) open/close sensors, and those that detect the presence of the observed person 2u, such as motion sensors.
  • the third column, "number of cells", of the explanatory table indicates the size of the writing area.
  • when a writing area spans a plurality of cells, the image conversion unit 13 fills those cells in the image by copying and writing the same data content to the plurality of locations.
  • the number of cells in the layout data 13L indicates the weight of the information to be written: the more cells assigned, the greater the influence on the behavior classification.
  • the distribution of the number of cells is determined, for example, by the following policy.
  • Since humans customarily act according to the time of day, such as going out during the day and sleeping at night, the time information "T" is allocated more cells (24 cells) than other sensor information.
  • Since the actions a human can take are narrowed down to some extent by where the person is, the sensor information (location information) of the motion sensors "HM1 to HM5" is allocated more cells (12 cells) than other sensor information.
  • The day-of-week information "DoW" is allocated more cells (12 cells) than the sensor information that measures the environment of the home 2h.
  • As sensor information that detects human motion, the acceleration sensors "ACC1 to ACC4" and the open/close sensors "OC1 to OC3" are allocated more cells (4 cells) than the sensor information that measures the environment of the home 2h.
  • the fourth column, "value", of the explanatory table indicates the data content to be written in the writing area.
  • the value "0.31" of the time "T" indicates 7:40 am, where 0:00 corresponds to the value "0.00" and 23:59 to the value "1.00".
  • the day of the week is selected from seven values, where Monday corresponds to the value "0.00" and Sunday to the value "1.00".
  • the "value" described above is a value in an arbitrary range based on each piece of sensor information; besides the case of writing the color corresponding to the value of each piece of sensor information as described above, it also includes the case of writing the value of each piece of sensor information as it is.
  • the "humidity" value "0.66, 0.57, 0.64, 0.58, 0.7" indicates, from left to right, the value "0.66" of the first humidity sensor, the value "0.57" of the second humidity sensor, ..., and the value "0.7" of the fifth humidity sensor.
  • the layout data 13L described above arranges sensor information of the same type close together in the image. Alternatively, sensor information from sensors installed in the same place (room) may be arranged close together in the image.
  • FIG. 12 is an explanatory diagram of image data as a result of writing the “value” of FIG. 11 with respect to the layout data 13L of FIG.
  • in-figure symbols such as "T" and "ACC1" are also shown here for clarity, but in reality they are omitted from the image.
  • the image conversion unit 13 writes black indicating the value “0” in the writing area of “ACC1”.
  • the image conversion unit 13 writes white, indicating the value "1", in the writing area of "HM4". That is, the larger the value to be written, the closer the color is to white.
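  • As an illustrative aid (not part of the patent text), the imaging of S103 amounts to writing each normalized value into every cell of its writing area. The cell coordinates below are hypothetical stand-ins for layout data 13L, chosen so that the time "T" gets 24 cells and a motion sensor 12 cells, mirroring the weighting policy above.

      import numpy as np

      # Symbol -> list of (row, col) cells it occupies (hypothetical layout).
      LAYOUT_13L = {
          "T":    [(r, c) for r in (0, 1) for c in range(12)],  # 24 cells
          "HM1":  [(2, c) for c in range(12)],                  # 12 cells
          "ACC1": [(3, 0), (3, 1), (4, 0), (4, 1)],             # 4 cells
      }

      def to_image(values, size=12):
          """Write each value into all of its cells; 0 renders black, 1 white."""
          image = np.zeros((size, size))
          for symbol, cells in LAYOUT_13L.items():
              for row, col in cells:
                  image[row, col] = values.get(symbol, 0.0)
          return image

      # 7:40 am normalized to a fraction of the day (about 0.32).
      snapshot = to_image({"T": (7 * 60 + 40) / (24 * 60), "HM1": 1.0, "ACC1": 0.0})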
  • the classification model 14m is defined by associating the image data created by the image conversion unit 13 with the behavior of the observed person 2u ("returning home") indicated by that image data.
  • the behavior classification unit 14 refers to the classification models 14m registered in the past, and when image data matching or similar to the image data of a classification model 14m is detected for the current observed person 2u, the corresponding behavior "returning home" of that classification model 14m is output as the classification result (S104).
  • as the action label, a person such as the observer 3u may teach a meaningful label such as "returning home" or "resting".
  • alternatively, action labels such as "behavior A" or "behavior B" that are automatically classified by machine learning may be used; these are simply groups of similar actions to which no meaning is attached.
  • FIG. 13 is a time series graph showing the processing contents of the action correction unit 15.
  • Graph 241 shows the output data of the action classification unit 14 before correction. In graph 241, the observed person 2u is basically detected as being out, but suppose that a bathing behavior of 5 minutes (ΔT1) is detected at 10:00 and a cleaning behavior of 3 minutes (ΔT2) at 15:00.
  • Graph 242 shows the output data of the behavior correction unit 15 after correction. When a behavior different from the preceding and following behaviors is detected only momentarily, the behavior correction unit 15 corrects that behavior so that it matches the preceding and following behaviors (S112).
  • when the period of such a differing behavior is short, the action correction unit 15 determines that it is an unnatural action to be corrected. As a result, the bathing behavior at 10:00 and the cleaning behavior at 15:00 are each corrected to "going out", the same behavior as before and after them.
  • as a method of detecting unnatural actions to be corrected, the action correction unit 15 may refer not only to the period of the action but also to the type of the action.
  • for example, the behavior correction unit 15 may treat as a correction target an action (going out) that would be unnatural to occur immediately after (one minute after) the preceding action (relaxing).
  • in deciding whether to correct a behavior that differs from those before and after it, the behavior correction unit 15 may change the predetermined period Th used for comparison depending on the type of the differing behavior. For example, a bathing behavior is corrected as unnatural if it lasts less than 20 minutes (predetermined period Th1), while a cleaning behavior is corrected as unnatural if it lasts less than 5 minutes (predetermined period Th2).
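  • As an illustrative aid (not part of the patent text), the correction of S111/S112 can be sketched as run-length smoothing over a per-minute behavior sequence. The thresholds mirror the predetermined periods Th1/Th2, and the rule that both neighbors must agree is an assumption consistent with the "sandwiched" examples above.

      from itertools import groupby

      THRESHOLD_MIN = {"bathing": 20, "cleaning": 5}   # predetermined periods Th

      def correct(behaviors):
          """behaviors: one label per minute, e.g. ["out", "out", "bathing", ...]."""
          runs = [(label, len(list(g))) for label, g in groupby(behaviors)]
          out = []
          for i, (label, length) in enumerate(runs):
              prev_label = runs[i - 1][0] if i > 0 else None
              next_label = runs[i + 1][0] if i + 1 < len(runs) else None
              # Local change (S111, Yes): a short run sandwiched between two
              # identical behaviors is overwritten by that behavior (S112).
              if (length < THRESHOLD_MIN.get(label, 0)
                      and prev_label is not None and prev_label == next_label):
                  label = prev_label
              out.extend([label] * length)
          return out

      # The 5-minute bathing burst inside "out" is corrected back to "out".
      print(correct(["out"] * 10 + ["bathing"] * 5 + ["out"] * 10))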
  • there is also a method of improving the accuracy of action recognition by shortening the time interval of action detection, but that method causes complicated control.
  • as described above, even when the sensor information contains detection omissions, the sensor information conversion unit 11T can relieve those omissions by using probability density functions based on nearby sensor information on the time axis or on the spatial axis. As a result, it is possible to suppress the decrease in recognition accuracy caused by sensor information that includes false-detection data.
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • the above-described embodiment has been described in detail in order to explain the present invention in an easy-to-understand manner, and is not necessarily limited to those having all the described configurations.
  • it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware by designing a part or all of them as, for example, an integrated circuit.
  • each of the above configurations, functions, and the like may be realized by software by the processor interpreting and executing a program that realizes each function.
  • control lines and information lines indicate those that are considered necessary for explanation, and do not necessarily indicate all the control lines and information lines in the product. In practice, it can be considered that almost all configurations are interconnected.
  • the communication means for connecting each device is not limited to the wireless LAN, and may be changed to a wired LAN or other communication means.
  • 1 Action recognition server; 2 Sensor; 2u Observed person; 3 Observer terminal; 11 Sensor information acquisition unit; 11T Sensor information conversion unit; 12 Time information acquisition unit; 13 Image conversion unit; 13L Layout data; 14 Behavior classification unit; 14m Classification model; 15 Behavior correction unit; 16 Current behavior storage unit; 17 Action output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Emergency Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Epidemiology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Telephonic Communication Services (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Image Processing (AREA)

Abstract

A behavior recognition server (1) is provided, comprising: a sensor information acquisition unit (11) that acquires, from a set of sensors (2) for detecting an observed person (2u), sensor information from each of the sensors (2); a sensor information conversion unit (11T) that, based on a reaction time at which the observed person (2u) is detected in the time-series sensor information, converts the sensor information into a probability density function in the time direction whose maximum value is at the reaction time; a behavior classification unit (14) that classifies the behavior of the observed person (2u) at each time; and a behavior output unit (17) that converts the classified behavior of the observed person (2u) into data and outputs the data.
PCT/JP2020/042057 2020-03-25 2020-11-11 Behavior recognition server, system, and method WO2021192399A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080064882.1A 2020-03-25 2020-11-11 Behavior recognition server, behavior recognition system, and behavior recognition method CN114402575B (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020054435A 2020-03-25 Behavior recognition server, behavior recognition system, and behavior recognition method
JP2020-054435 2020-03-25

Publications (1)

Publication Number Publication Date
WO2021192399A1 (fr) 2021-09-30

Family

ID=77891272

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/042057 2020-03-25 2020-11-11 Behavior recognition server, system, and method WO2021192399A1 (fr)

Country Status (3)

Country Link
JP (1) JP7436257B2 (fr)
CN (1) CN114402575B (fr)
WO (1) WO2021192399A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023048268A1 (fr) 2021-09-27 2023-03-30 富士フイルム株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030229471A1 (en) * 2002-01-22 2003-12-11 Honeywell International Inc. System and method for learning patterns of behavior and operating a monitoring and response system based thereon
JP2004145820A (ja) * 2002-10-28 2004-05-20 Nippon Telegr & Teleph Corp <Ntt> 生活動作検出方法、装置、プログラム、および該プログラムを記録した記録媒体
JP2019087179A (ja) * 2017-11-10 2019-06-06 富士通株式会社 分析装置、分析方法およびプログラム

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3996428B2 (ja) 2001-12-25 2007-10-24 松下電器産業株式会社 異常検知装置及び異常検知システム
CN1322465C (zh) * 2005-08-15 2007-06-20 阜阳师范学院 自动指纹识别方法中的图像分割及指纹纹路距离提取方法
JP2011232871A (ja) * 2010-04-26 2011-11-17 Sony Corp 情報処理装置、テキスト選択方法及びプログラム
JP2012058780A (ja) * 2010-09-03 2012-03-22 Toyota Motor Corp 環境マップ作成装置及び方法、行動予測装置及び方法
JP5593486B2 (ja) * 2012-10-18 2014-09-24 独立行政法人産業技術総合研究所 センサネットワークシステム
JP2016006611A (ja) 2014-06-20 2016-01-14 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
EP3243554B1 (fr) * 2015-01-05 2022-07-20 Sony Group Corporation Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
DE102015207415A1 (de) * 2015-04-23 2016-10-27 Adidas Ag Verfahren und Gerät zum Verknüpfen von Bildern in einem Video einer Aktivität einer Person mit einem Ereignis
KR20170084445A (ko) * 2016-01-12 2017-07-20 삼성에스디에스 주식회사 시계열 데이터를 이용한 이상 감지 방법 및 그 장치
JP2017224174A (ja) * 2016-06-15 2017-12-21 シャープ株式会社 情報取得端末、情報収集装置、行動観察システム、情報取得端末の制御方法、および、情報収集装置の制御方法
JP6890813B2 (ja) 2016-08-22 2021-06-18 学校法人慶應義塾 行動検知システム、情報処理装置、プログラム
CN106644436B (zh) * 2016-12-16 2019-02-01 中国西电电气股份有限公司 一种断路器机械特性的评定方法
JP6795093B2 (ja) * 2017-06-02 2020-12-02 富士通株式会社 判定装置、判定方法及び判定プログラム
JP2019054333A (ja) * 2017-09-13 2019-04-04 株式会社東芝 無線端末、無線通信システム、無線通信方法及び無線通信プログラム
CN108764059B (zh) * 2018-05-04 2021-01-01 南京邮电大学 一种基于神经网络的人体行为识别方法及系统
JP2019213030A (ja) * 2018-06-04 2019-12-12 凸版印刷株式会社 看視システム
JP7085750B2 (ja) 2018-07-18 2022-06-17 株式会社Z-Works 生活習慣分析システム、生活習慣分析方法及びプログラム
CN109362066B (zh) * 2018-11-01 2021-06-25 山东大学 一种基于低功耗广域物联网和胶囊网络的实时行为识别系统及其工作方法


Also Published As

Publication number Publication date
JP2021157275A (ja) 2021-10-07
CN114402575A (zh) 2022-04-26
JP7436257B2 (ja) 2024-02-21
CN114402575B (zh) 2023-12-12

Similar Documents

Publication Publication Date Title
Ghayvat et al. Smart aging system: uncovering the hidden wellness parameter for well-being monitoring and anomaly detection
Monekosso et al. Behavior analysis for assisted living
CN108348160B Monitoring a person's daily living activities
Aran et al. Anomaly detection in elderly daily behavior in ambient sensing environments
Sunder et al. Incidence, characteristics, and mortality of infective endocarditis in France in 2011
Dahmen et al. Smart secure homes: a survey of smart home technologies that sense, assess, and respond to security threats
US20210241923A1 (en) Sensor-based machine learning in a health prediction environment
US20180174671A1 (en) Cognitive adaptations for well-being management
JP2005509218A (ja) 質を堅持するための患者データマイニング
Pike et al. Sensor networks and data management in healthcare: Emerging technologies and new challenges
EP3163545A1 Detection of abnormal activity for elderly and disabled persons
Robin et al. The epidemiology of acute rheumatic fever in Northland, 2002-2011
Howedi et al. An entropy-based approach for anomaly detection in activities of daily living in the presence of a visitor
WO2021192399A1 Behavior recognition server, system, and method
WO2021192398A1 Behavior recognition server and method
Gonzalez et al. Variational autoencoders for anomaly detection in the behaviour of the elderly using electricity consumption data
JP2019155071A (ja) 事象予測システム、センサ信号処理システム、事象予測方法及びプログラム
Bijlani et al. An unsupervised data-driven anomaly detection approach for adverse health conditions in people living with dementia: Cohort study
CN107958434B Intelligent care method, apparatus, electronic device, and storage medium
Payandeh et al. Application of modified pagerank algorithm for anomaly detection in movements of older adults
Gargees et al. Early illness recognition in older adults using transfer learning
Annapragada et al. SWIFT: A deep learning approach to prediction of hypoxemic events in critically-Ill patients using SpO2 waveform prediction
Ou et al. Identifying Elderlies at Risk of Becoming More Depressed with Internet-of-Things
Jiang et al. Recognising activities at home: Digital and human sensors
EP3163546A1 Method and device for detecting abnormal behavior of a user

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20927577

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20927577

Country of ref document: EP

Kind code of ref document: A1