WO2023112757A1 - Sensing system, sensing device, and sensing method - Google Patents

Sensing system, sensing device, and sensing method

Info

Publication number
WO2023112757A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
importance
data
sensing system
sensors
Prior art date
Application number
PCT/JP2022/044782
Other languages
French (fr)
Japanese (ja)
Inventor
健太郎 佐野
佐知 田中
昭義 大平
シャヘッド サルワル
景子 藤咲
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Publication of WO2023112757A1 publication Critical patent/WO2023112757A1/en

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention relates to a sensing system, a sensing device, and a sensing method.
  • Patent Literature 1 describes a watching device in a target person's house in which, when a human sensor (human body detection means) other than the one that responded immediately before detects movement, measurement of an elapsed time t is started. If another human sensor responds before the measured value t reaches the movement-confirmation time T, the measured value t is reset and measurement is restarted. When the measured value t reaches the movement-confirmation time T after measurement was started by the response of one human sensor, it is determined that the person being watched has moved into the watching area of that human sensor. According to this technology, the amount of information to be stored can be reduced by identifying important motion sensors without interfering with grasping the daily activities of the person being watched over.
  • in this technology, the movement of a person is confirmed using the movement-confirmation time T when a plurality of human sensors react.
  • however, since lifestyle habits differ from person to person and may change according to life stage, it is difficult to set the movement-confirmation time T appropriately in advance.
  • the purpose of the present invention is to build a system that appropriately recognizes human behavior by utilizing sensor information.
  • the sensing system of the present invention is a sensing system comprising a plurality of sensors including at least one human sensor, wherein the space in which the sensors are installed is divided into a plurality of areas.
  • a plurality of sensors are installed in each of the plurality of areas; as the subject moves, discrete data from at least one of the sensors is converted into continuous data and treated as a degree of importance, and the behavior of the person to be measured is recognized based on weights calculated using the information of the plurality of sensors, including the degree of importance.
  • FIG. 1 is a block diagram showing the configuration of a sensing system according to an embodiment;
  • FIG. 2 is a block diagram showing the internal configuration of a calculation unit;
  • FIG. 3A is a diagram showing a case where sensor data are discrete values at time t;
  • FIG. 3B is a diagram showing creation of curve data m1 to m5 for discrete values of reaction times t1 to t5;
  • FIG. 3C is a diagram in which overlapping sections of curve data m1 to m5 in FIG. 3B are integrated;
  • FIG. 4A is a diagram showing an example of applying linear approximation to discrete values of reaction times t1 to t5;
  • FIG. 4B is a diagram showing an example of converting discrete values of reaction times t1 to t5 into rectangular functions;
  • FIG. 4C is a diagram showing an example of randomizing the binary sensor data 18 at the time t when the sensor 10 reacts;
  • FIG. 5 is a diagram showing the addition of function values less than the maximum value of the sensor data in the spatial direction, centering on the sensor that reacted, to form sensor information;
  • FIG. 6 is a plan view showing a specific example of a space to which sensor information is added in the spatial direction;
  • FIG. 7A is a diagram showing the time change of the importance of a first sensor;
  • FIG. 7B is a diagram showing the time change of the sensor information of a second sensor;
  • FIG. 7C is a diagram showing changes over time in weights calculated by a weight determination unit 102b;
  • FIG. 8A is a diagram showing the time evolution of the importance indicated by the curve of a third sensor;
  • FIG. 8B is a diagram showing the time evolution of importance indicated using a rectangular function of a fourth sensor;
  • FIG. 8C is a diagram showing temporal changes in weights calculated by the weight determination unit 102b;
  • FIG. 9 is a flow chart explaining the operation of the sensing system;
  • FIG. 10 is a block diagram showing the configuration of a sensing system according to another embodiment;
  • FIG. 11 is a diagram showing an example of in-home arrangement of the sensors.
  • FIG. 1 is a block diagram showing the configuration of a sensing system 100 according to an embodiment.
  • the sensing system 100 includes an information acquisition unit 101 , a calculation unit 102 , a storage unit 103 , a display unit 104 and a sensor 10 .
  • the configuration of the sensing system 100 excluding the sensor 10 is called a sensing device.
  • in this embodiment, a computer comprising a CPU that performs arithmetic processing, a memory, a camera, a communication unit, an operation unit, a display unit, and a nonvolatile storage medium executes a program stored in the nonvolatile storage medium, thereby realizing the information acquisition unit 101, the calculation unit 102, and the display unit 104, which constitute the sensing system 100.
  • the storage unit 103 is configured with a memory or a nonvolatile storage medium.
  • the information acquisition unit 101 acquires the sensor data 18 from the sensor 10 and also acquires the body information 105 of the person to be analyzed.
  • the physical information 105 of the person to be analyzed includes, for example, height, weight, date of birth, BMI (body mass index), body fat percentage, visceral fat level, muscle mass, body water percentage, and body age.
  • the information acquisition unit 101 acquires the sensor data 18 by connecting directly to the sensor 10 via Ethernet, wireless communication, or the like. When collecting data from a locally installed PC connected to the sensor 10 via a gateway, the information acquisition unit 101 accesses the PC via a local network or the Internet to acquire the sensor data 18. Furthermore, when the sensor data 18 is collected, directly or via a gateway, on a cloud computing service server, the information acquisition unit 101 may acquire the sensor data 18 by accessing the server via the Internet.
  • the information acquisition unit 101 receives sensor data 18 from an external server in a file format such as csv (Comma Separated Value) using Python (registered trademark), java (registered trademark), c language, or the like.
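As a minimal illustration of receiving the sensor data 18 in csv format, the following Python sketch parses rows into tuples. The column names (`timestamp`, `sensor_id`, `value`) and the sample payload are assumptions for illustration, not part of this specification.

```python
import csv
import io

# Hypothetical csv payload as it might arrive from an external server;
# the column names are illustrative assumptions.
SAMPLE = "timestamp,sensor_id,value\n2022-12-01T00:00:00,motion_1,1\n"

def read_sensor_csv(text):
    """Parse csv rows into (timestamp, sensor_id, int value) tuples."""
    return [(row["timestamp"], row["sensor_id"], int(row["value"]))
            for row in csv.DictReader(io.StringIO(text))]
```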
  • the information acquisition unit 101 refers to the storage unit 103 and acquires the physical information 105 of the analysis subject, which the system administrator stores in the storage unit 103 based on the application documents that the analysis subject fills out when applying for the service.
  • the calculation unit 102 analyzes a plurality of sensor data 18 and recognizes human behavior. Specifically, the calculation unit 102 classifies one or more types of sensor data 18 into one or more classes. Here, classification means grouping the sensor data 18 based on their characteristics and assigning label names that indicate human actions, such as "cleaning" and "washing". The configuration of the calculation unit 102 will be described later.
  • the storage unit 103 accumulates at least one of the information acquired by the information acquisition unit 101 and the output of the calculation unit 102.
  • the data may be tagged with attributes such as age, gender, origin, language, religion, and tastes and preferences of the person to be analyzed.
  • time-series analysis refers to visualizing various data in graphs with time on the horizontal axis, and is defined as calculating, in the time direction, the rate of change, moving average, variance, standard deviation, and the like of data values, the number of classifications, or the duration of each classification, and performing operations such as error analysis and polynomial approximation.
  • time-series analyses may include comparison with the average values of persons belonging to the same category in terms of age, gender, origin, religion, lifestyle, occupation, medical history, and the like.
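As one concrete instance of the time-series operations mentioned above, a simple moving average can be sketched as follows; the window size is an illustrative assumption.

```python
import statistics

def moving_average(values, window=3):
    """Simple moving average over a sliding window, one of the
    time-series operations described above."""
    return [statistics.mean(values[i:i + window])
            for i in range(len(values) - window + 1)]
```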
  • the display unit 104 receives information from the calculation unit 102 and the storage unit 103, and displays the information to the administrator of the sensing system 100, system users, analysis subjects, and the like.
  • the display unit 104 displays numbers, characters, tables, graphs, and the like in arbitrary formats.
  • FIG. 2 is a block diagram showing the internal configuration of the calculation unit 102. The calculation unit 102 is composed of an importance determination unit 102a, a weight determination unit 102b, and a classification unit 102c.
  • the importance determination unit 102a calculates the importance of the sensor data 18 acquired by the calculation unit 102.
  • the degree of importance is calculated using the response (change) of the sensor data 18 in the temporal direction and the spatial direction, and is defined as an index that gives a clue as to which sensor data 18 should be relatively emphasized. A method for calculating the degree of importance is described in detail below.
  • the importance determination unit 102a takes as input, as sensor information, the acquired sensor data 18 having a discrete value (for example, 1) at the time t when the sensor 10 reacts, as shown in FIG. 3A.
  • the sensor 10 is an example of a human sensor (pyroelectric sensor) using a pyroelectric effect.
  • a human sensor that uses the pyroelectric effect outputs a value of 0 when a person stops moving for a certain period of time even within the detection range, or when a person leaves the detection range, that is, when there is no temperature change for a certain period of time.
  • FIG. 3A shows temporal changes in sensor information from one human sensor.
  • the human sensor is in a non-reacting state between reaction time t1 and reaction time t2; the same applies between reaction times t2 and t3 and between reaction times t4 and t5.
  • between reaction time t3 and reaction time t4, it is assumed that the person to be analyzed stops moving or leaves the detection range of the human sensor after reaction time t3, and returns to the detection range of the human sensor at reaction time t4.
  • the importance determination unit 102a sets the function value at time t to the maximum value (for example, 1) as shown in FIG. 3B, and creates sensor information by adding function values less than the maximum value in the time direction before and after time t. Specifically, the importance determination unit 102a creates curve data m1 to m5 for the discrete values (e.g., 1) of the sensor data 18 at reaction times t1 to t5.
  • the function value less than the maximum value is calculated by the importance determination unit 102a so that the function value decreases as the time difference from time t increases.
  • for the function value less than the maximum value in the time direction before and after the sensor reaction time t, the importance determination unit 102a applies a normal distribution, a Student's t distribution, a U distribution, or any other distribution used in the statistical field.
  • the importance determination unit 102a sets the human existence probability to 1.0 when the sensor information (the value of the sensor data 18) is 1, so that the continuous values of the created sensor information can be treated as the existence probability of a human in the time-series direction.
  • FIG. 3C is a diagram that integrates overlapping sections of the curve data m1 to m5 in FIG. 3B. If the created curve data overlap, the importance determining unit 102a integrates the overlapped curves into a curve having the maximum value. Alternatively, the importance determination unit 102a adopts the sum of a plurality of curves and normalizes it by the maximum value of the entire sensor information. Thereby, the importance determining unit 102a can uniquely determine the value of the sensor information at each time.
  • the sensor information in FIG. 3C is characterized by having a large value (area) where the sensor 10 responds continuously and where the elapsed time from a reaction of the sensor 10 is short, and it can be used as an indicator of which sensor should be emphasized. In this specification, sensor information converted so as to be continuously distributed in the time direction, as shown in FIG. 3C, is defined as the importance.
  • the importance determining unit 102a converts discrete sensor data 18 into continuous sensor data 18 continuously distributed in the time direction, as shown in FIGS. 3B and 3C. As a result, it is possible to prevent the value from suddenly becoming 0 or invalid due to an unintended loss of the sensor data 18, so that a system that is robust against loss of the sensor data 18 can be realized.
  • the importance determination unit 102a converts the discrete sensor data 18 into continuous values having an arbitrary distribution used in the statistical field, as shown in FIG. 3B.
  • other conversion methods may be used.
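A minimal sketch of the conversion described above, assuming a normal-distribution curve around each reaction time and merging overlaps by taking the maximum as in FIG. 3C; the time grid and the width `sigma` are illustrative assumptions.

```python
import numpy as np

def importance_curve(reaction_times, t_grid, sigma=60.0):
    """Place a bell curve (maximum value 1) around each discrete
    reaction time and merge overlapping sections by taking the
    pointwise maximum, so the importance at each time is unique."""
    curves = [np.exp(-0.5 * ((t_grid - rt) / sigma) ** 2)
              for rt in reaction_times]
    return np.max(curves, axis=0)

t = np.arange(0, 600, 1.0)                 # 10 minutes on a 1 s grid
imp = importance_curve([60, 120, 400], t)  # reactions at t1..t3 [s]
```

Normalizing by the overall maximum instead of max-merging, as the text also allows, would only change the `np.max` reduction to a sum followed by division by its peak.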
  • FIG. 4A shows an example in which, when converting discrete values into continuous values by adding function values less than the maximum value in the time direction before and after the sensor reaction time t, the importance determination unit 102a applies linear approximation to the discrete values of reaction times t1 to t5. Compared with the case of FIG. 3B, the amount of calculation of the importance determination unit 102a can be reduced.
  • FIG. 4B shows an example in which the importance determination unit 102a converts the discrete values of reaction times t1 to t5, which are the sensor data 18 at the times t when the sensor 10 reacted, into rectangular functions. This is effective for sensor data 18 whose value is approximately constant for a short period of time, such as data from an illuminance sensor.
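The rectangular-function conversion of FIG. 4B can be sketched as holding the importance at 1 within a fixed half-width of each reaction time; the half-width value is an illustrative assumption.

```python
import numpy as np

def rect_importance(reaction_times, t_grid, half_width=30.0):
    """Rectangular-function conversion: importance is 1 within
    +/- half_width of each reaction time, and 0 elsewhere."""
    out = np.zeros_like(t_grid)
    for rt in reaction_times:
        out[np.abs(t_grid - rt) <= half_width] = 1.0
    return out
```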
  • FIG. 4C is an example in which the importance determination unit 102a randomizes the binary sensor data 18 at the time t when the sensor 10 reacts (for example, 1 or 0 for a sensor whose maximum value is 1 and minimum value is 0).
  • the possible range of random values is set separately for the minimum value (e.g., 0) and the maximum value (e.g., 1) of the discrete values of the sensor data 18. For example, if the sensor data 18 is "0", it is converted to a random value in the range of 0 to 0.3, and if it is "1", it is converted to a random value in the range of 0.7 to 1.0.
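The randomization of FIG. 4C can be sketched as follows, using the 0-0.3 and 0.7-1.0 bands given in the text.

```python
import random

def randomize_binary(value, low_band=(0.0, 0.3), high_band=(0.7, 1.0)):
    """Map a binary sensor reading to a random value in a band:
    0 -> [0.0, 0.3], 1 -> [0.7, 1.0]."""
    lo, hi = high_band if value == 1 else low_band
    return random.uniform(lo, hi)
```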
  • the importance determination unit 102a not only adds values in the time-axis direction to obtain sensor information, but also adds function values less than the maximum value of the sensor data 18 in the spatial direction, centering on the sensor 10 that responded.
  • for example, the importance determination unit 102a may add sensor information values to the bedroom and the kitchen, which are spatially adjacent to and reachable from the living room.
  • FIG. 6 is a plan view showing a specific example of a space to which the sensor information shown in FIG. 5 is applied.
  • starting from the living room (existence probability: 1.0) where the sensor data 18 occurred, the importance determination unit 102a adds a relatively large value (existence probability: 0.7) to the nearest spaces, the kitchen and the corridor, and decreases the added value with increasing distance from the living room.
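The spatial spreading above can be sketched by decaying the existence probability with the number of hops from the area whose sensor reacted. The 1.0 and 0.7 values follow the example in the text; the room adjacency and hop counts are illustrative assumptions.

```python
# Hops from the living room, where the sensor data 18 occurred;
# this adjacency is an illustrative assumption based on FIG. 6.
HOPS_FROM_SOURCE = {"living": 0, "kitchen": 1, "corridor": 1, "bedroom": 2}

def spatial_importance(source_prob=1.0, decay=0.7):
    """Existence probability per area, decreasing with distance
    (in hops) from the area where the sensor reacted."""
    return {area: source_prob * decay ** h
            for area, h in HOPS_FROM_SOURCE.items()}
```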
  • the weight determination unit 102b converts the importance determined by the importance determination unit 102a into a weight.
  • the weight is defined as an index indicating which sensor 10 data is relatively emphasized when analyzing multiple types of sensor data 18 .
  • the degree of importance may be used as the weight as it is, or it may be converted by one of the several methods described in detail below.
  • FIGS. 7A, 7B, and 7C show a first example in which the weight determination unit 102b converts the importance into a weight based on a combination of a plurality of sensors.
  • the weight determination unit 102b calculates the weight by multiplying the time change of the importance of the first sensor shown in FIG. 7A by the time change of the sensor information (sensor data 18) of the second sensor shown in FIG. 7B.
  • FIG. 7C is a diagram showing the change over time in the weight calculated by the weight determination unit 102b.
  • the first sensor is, for example, a human sensor installed in the kitchen.
  • the importance determining unit 102a calculates the importance with the maximum value being "1", which can be regarded as the existence probability of a person.
  • the second sensor is, for example, a door open/close sensor.
  • the second sensor which is a door open/close sensor, is pre-installed in the connected home appliance, or is externally attached to the refrigerator or microwave oven.
  • the second sensor measures sensor information indicating the open/close status of the doors of refrigerators, microwave ovens, and the like.
  • the graph in FIG. 7C of the time change of the weight calculated by the weight determination unit 102b indicates that the larger the weight, the higher the possibility that a person is present in the kitchen and cooking.
  • human behavior can be recognized with higher accuracy than when based only on the data of the human sensor.
  • it functions effectively when actions such as going to the kitchen to wash hands during a meal or washing dishes in the kitchen after a meal are distinguished from cooking.
  • the weight determination unit 102b calculates the weight by multiplying the importance of the first sensor by the sensor information of the second sensor. Alternatively, if the importance of the first sensor is 0.5 or more, the sensor information of the second sensor may be used as the weight as it is, and if the importance of the first sensor is less than 0.5, the weight may be ignored (the weight value set to 0).
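The two weighting options above (a product, or thresholded gating at 0.5) can be sketched as follows; the function and parameter names are illustrative assumptions.

```python
def weight_from_pair(imp_first, info_second, threshold=None):
    """Weight for a sensor pair: the product of the first sensor's
    importance and the second sensor's information, or, when a
    threshold is given, the second sensor's information gated by the
    first sensor's importance (0 below the threshold)."""
    if threshold is None:
        return imp_first * info_second
    return info_second if imp_first >= threshold else 0.0
```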
  • FIG. 8A is a diagram showing a temporal change in the degree of importance indicated by the curve of the third sensor determined by the degree-of-importance determining unit 102a.
  • FIG. 8B is a diagram showing changes over time in importance indicated by using the rectangular function of the fourth sensor determined by the importance determining unit 102a.
  • the weight determination unit 102b calculates the weight by multiplying the temporal change in the importance of the third sensor in FIG. 8A by the temporal change in the importance of the fourth sensor in FIG. 8B.
  • FIG. 8C is a diagram showing the change over time in the weight calculated by the weight determination unit 102b.
  • the third sensor is, for example, a human sensor installed in the bedroom.
  • the importance determining unit 102a calculates the importance with the maximum value being "1", which can be regarded as the existence probability of a person.
  • the fourth sensor is, for example, an illuminance sensor in the bedroom, which measures the illuminance in the room as sensor information by being pre-installed in the lighting or installed in the room as an afterthought. Since the measured value of the illuminance sensor is considered to be a constant value for a short period of time, it is appropriate to calculate the degree of importance using a rectangular function.
  • to express the binary ON/OFF state of the lighting, the importance may be binarized using a certain threshold value (for example, 10 [lx]): "1" indicating ON when the measured value is equal to or higher than the threshold, and "0" indicating OFF when it is below the threshold.
  • the graph of FIG. 8C showing changes in weight over time calculated by the weight determining unit 102b indicates that the greater the weight, the higher the possibility that a person is present in the bedroom and active.
  • human actions can be recognized with higher accuracy than when based only on the data of the human sensor. In particular, this works effectively for distinguishing from sleep the case where the person is in the bedroom but not asleep, doing activities such as relaxing or reading.
  • the weight determination unit 102b calculates the weight by multiplying the temporal change in the importance of the third sensor by the temporal change in the importance of the fourth sensor. Alternatively, the weight may be calculated by multiplication only if the importance of the third sensor and the importance of the fourth sensor are each equal to or greater than a predetermined value, and otherwise ignored (the weight value set to 0).
  • the classification unit 102c classifies human behavior based on the weights determined by the weight determination unit 102b. Algorithms used for classification are not particularly limited.
  • the classification unit 102c may set a threshold for each weight, and identify the corresponding action when the weight is greater than or equal to the threshold. Specifically, in FIG. 8C, if the weight has a value of 0.8 or greater, the person is classified as relaxing in the bedroom.
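The per-weight thresholding described above can be sketched as follows. The 0.8 value follows the FIG. 8C example; the weight names and the second table entry are illustrative assumptions.

```python
# weight name -> (threshold, action label); 0.8 follows the FIG. 8C
# example, the entries themselves are illustrative assumptions
THRESHOLDS = {
    "bedroom_active": (0.8, "relaxing in bedroom"),
    "kitchen_active": (0.8, "cooking"),
}

def classify(weights):
    """Return the action labels whose weight meets its threshold."""
    return [label for name, (th, label) in THRESHOLDS.items()
            if weights.get(name, 0.0) >= th]
```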
  • the classification unit 102c may use supervised machine learning: teacher data and learning data including the cases to be classified are prepared in advance, a classifier capable of classifying data into as many classes as there are types of teacher data is trained, and human behavior is classified by the trained classifier. Further, the classification unit 102c may learn the physical information 105 of each person to be analyzed and prepare a classifier for each person to be analyzed.
  • an example of teacher data is human behavior such as “sleep” and “going out”
  • an example of learning data is sensor data 18 of a time period corresponding to "sleep” and "going out”.
  • the sensor data 18 may be converted into importance or weight and used as learning data.
  • the type of machine learning is not limited as long as it learns using teacher data and acquires a classification function.
  • methods using decision trees (including boosting), logistic regression, k-nearest neighbors, support vector machines, random forests and their ensembles, and classifiers constructed using fully connected layers of deep learning, CNNs, RNNs, and the like can be utilized.
  • the classification unit 102c may, for example, use unsupervised learning as a classification method to group collected sensor data 18 having similar feature amounts.
  • the type of machine learning is not limited as long as it has a classification function without teacher data: clustering methods such as the k-means, k-means++, x-means, and k-shape methods and mixed Gaussian models; methods using anomaly detection techniques such as One-Class SVM, Elliptic Envelope, Isolation Forest, and Local Outlier Factor; clustering methods that combine these with dimensionality reduction methods such as t-SNE and autoencoders; and methods such as Adversarial Autoencoders, which use the low-dimensional space of generative adversarial networks and gather data into a predetermined distribution.
  • the number of classifications may be determined by any method: it may be set arbitrarily, determined as an appropriate value using methods such as the elbow method or the silhouette method, or, without being predetermined, derived by defining a distance index between feature amounts and counting items that differ by more than a predetermined value as different classifications.
  • FIG. 9 is a flow diagram illustrating the operation of the sensing system 100. The sensing system 100 starts processing in response to an operation by the system administrator.
  • step S1 the information acquisition unit 101 acquires the sensor data 18 of the sensor 10 (human sensor, door sensor, etc.) and the body information 105 of the person to be analyzed.
  • step S2 the information acquisition unit 101 transmits the data acquired in step S1 to the calculation unit 102 and the storage unit 103.
  • step S3 the calculation unit 102 uses the sensor data 18 received from the information acquisition unit 101 to perform action recognition.
  • the importance determination unit 102a converts discrete data from the sensor data 18 of the plurality of sensors 10 into continuous data to obtain the importance
  • the weight determination unit 102b calculates the change over time of a weight that is the product of the importance of a sensor 10 and raw sensor data 18, or the product of the importances of the plurality of sensors 10
  • the classification unit 102c recognizes actions based on the weight.
  • step S4 the calculation unit 102 transmits the recognition result to the display unit 104 and the storage unit 103.
  • step S5 the storage unit 103 transmits the recognition result notified from the calculation unit 102 in step S4, together with the past accumulated data of the person to be analyzed or of other persons, to the calculation unit 102.
  • step S6 the calculation unit 102 utilizes the current action recognition result obtained in step S3 and past accumulated data to calculate a time-series change and transmit it to the display unit 104.
  • step S7 the display unit 104 displays the current action recognition result and its chronological change transmitted in step S6.
  • step S8 the sensing system 100 determines whether or not to end the process; it may end as it is (Yes in S8) or move to step S1 and repeat the process.
  • a loop that returns immediately to the start of the process (S1) may be implemented by presetting (No in S8).
  • a business operator that provides services and products to the elderly, such as an insurance company or a day service provider, utilizes the sensing system 100 of the embodiment.
  • the sensor data 18 of the human sensor 11, the illuminance sensor 12, the temperature/humidity sensor 13, the door opening/closing sensor 14, and the acceleration sensor 15 (hereinafter collectively referred to as the sensor 10) are transmitted to the external server 31 directly or via the router 16, and then sent to the information acquisition unit 101 of the sensing system 100.
  • the sensor 10 is installed when there is an application for subscription to a service or product provided by the business operator; a power source is secured by a power connection, a battery, or energy harvesting such as sunlight or vibration, and measurement is started.
  • FIG. 11 is a diagram showing an example of in-home arrangement of the sensors 10 of the present embodiment.
  • At least two types of the human sensor 11, the illuminance sensor 12 and the temperature/humidity sensor 13 may be treated as a set as an environment sensor.
  • a microphone (not shown), a noise sensor, an air pressure sensor, an odor sensor, a pressure sensor, a wearable sensor, a weight scale, a body composition monitor, an indoor/outdoor image sensor, an image sensor built into a robot cleaner, and the like may also be incorporated into the system.
  • the home is divided into multiple areas, and a human sensor 11, illuminance sensor 12, and temperature/humidity sensor 13 are installed in each area.
  • the area division method may be for each room.
  • the ranges in which each action is performed may be separate areas.
  • a door open/close sensor 14 is installed in each door, microwave oven 22, and refrigerator 23 to sense the open/close status of the door and home appliances.
  • Acceleration sensors 15 are installed in the washing machine 21, the kitchen sink, and the robot cleaner 24 to sense the usage conditions of each.
  • the data of these sensors 10 are collected by the router 16, and the router 16 transmits the data to the information acquisition unit 101.
  • instead of the data of each sensor 10 attached to a home appliance, the data of a connected home appliance may be used.
  • since the human sensor 11 has a viewing range of ±45 degrees in the horizontal direction and about 5 m in the depth direction, it is desirable to install it so that it does not detect information from areas other than the area in which it is installed. For example, it is important to arrange each human sensor 11 so that the central axis 17 of its visual field range does not substantially face the doorway to an adjacent area.
  • the type of sensor used can be changed according to the service or product. For example, some or all of the human sensor 11, the illuminance sensor 12, and the temperature/humidity sensor 13 may be combined into one environment sensor. Data linkage with other commercially available devices, such as online-connectable wearable sensors, weight scales, and body composition meters, may also be included.
  • the external server 31 is not limited to a server owned by the company; it may be a server managed by the business operator, a server managed by a sensor manufacturer, or a web service server that can be rented by the general public. Furthermore, data may be stored in a different environment for each business, read by the sensing system 100, and connected within the system. However, these servers need to be able to communicate with the sensing system 100, which is built either in the company's internal environment or in a cloud environment.
  • the information acquisition unit 101 of the sensing system 100 acquires the sensor data 18 of the sensor 10 collected by the external server 31 and the body information 105 of the person to be analyzed into the sensing system 100 .
  • The body information 105 of the person to be analyzed may be acquired by allowing it to be entered into the sensor 10, by including it in the application form for subscribing to services and products, or by allowing it to be entered later from a personal page on the company's website.
  • Alternatively, measurement instruments such as a grip strength meter and a body composition meter may be provided, or the information may be obtained in cooperation with a third party such as another business operator, a local government, or a non-profit organization.
  • The information acquisition unit 101 transmits the data of the sensors 10 and the body information 105 of the person to be analyzed to the calculation unit 102 and the accumulation unit 103, as described in FIG.
  • The calculation unit 102 recognizes human behavior from the information of the sensors 10 received from the information acquisition unit 101 and the body information 105 of the person to be analyzed.
  • Human actions include, for example, "sleep", "going out", "relax", "cooking", "eating", "cleaning", "washing", and "others".
  • The accumulation unit 103 accumulates the data of the sensors 10 and the body information 105 of the person to be analyzed. When it receives the body information 105 of a person from the information acquisition unit 101, it retrieves history data of the same person, such as past physical information and activity amounts, and transmits the data to the information acquisition unit 101.
  • The accumulation unit 103 may be provided in the external server 31.
  • In that case, the information acquisition unit 101 transmits the past history data of the person to be analyzed stored in the external server 31, together with the current data, to the calculation unit 102. This allows the calculation unit 102 to perform time-series analysis.
  • The calculation unit 102 transmits the current action recognition result and its time-series analysis result to the accumulation unit 103 and the display unit 104.
  • The accumulation unit 103 organizes and accumulates the results output by the calculation unit 102 for each person to be analyzed and for each category, such as the person's age, sex, origin, religion, lifestyle, occupation, and medical history.
  • The display unit 104 displays the received results to the system operator or business operator.
  • The display method may differ between the system operator and the business operator.
  • For the system operator, for example, the display unit shows, in addition to the current behavior of the person to be analyzed and its time-series analysis results, information on the operating state of each sensor, the battery status of each sensor, the indicated sensor values, and the dates and times at which the business operator referred to the system, together with the frequency, duration, and content of those references.
  • For an insurer, the system may provide materials for inferring the likelihood that insurance will apply to the current person to be analyzed by simultaneously showing the past insurance coverage histories of others, their types of daily behavior, and the chronological changes in those behaviors. Armed with this information, the insurer can consider interventions that modify the types of daily behavior of the person to be analyzed, and their changes over time, in a direction that makes insurance claims less likely. Such interventions include, for example, recommendations such as "Let's go out for about 30 minutes every day" and "Let's go to bed 15 minutes earlier today."
  • For a day service provider, the system may provide materials for examining the impact of day service content on the lifestyle of the person to be analyzed by simultaneously showing the day service schedule and the changes in the types of daily activities. Based on this information, the day service provider can consider a program for steering the chronological change in each person's types of daily behavior in a desirable direction.
  • Businesses can give feedback to the system administrator about the displayed content.
  • Means of feedback include verbal communication, email, and posting via the system administrator's home page.
  • The system administrator may change the display contents based on the feedback from the business operator.
  • The contents of the sensing system 100 of the embodiment can also be applied to individuals who want to watch over their elderly parents.
  • An individual who wants to watch over an elderly parent may sign a contract for a watching service provided by a business operator and receive sensors by mail.
  • By installing the sensors in the elderly parent's home and sending the data to the business operator, the individual can receive the elderly parent's types of daily behavior as analysis results and monitor the parent's lifestyle habits.
  • FIG. 12 is a diagram showing an example of the display contents of the display unit 104 in the watching service.
  • On the display of the terminal of the individual who wants to watch over an elderly parent, the display unit 104 shows the current action recognition result ("cooking") recognized by the calculation unit 102 and the amount of activity derived from the action recognition.
  • As a time-series analysis result, a pie chart showing the time ratio of daily activities such as "sleep", "eating", and "relaxing" is displayed.
  • The display unit 104 also displays the chronological change in the amount of activity for each month, together with analysis comments on the living situation.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above embodiments have been described in detail to facilitate understanding of the present invention, and the invention is not necessarily limited to those having all the described configurations.
  • Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • 10 Sensor, 18 Sensor data, 100 Sensing system (sensing device), 101 Information acquisition unit, 102 Calculation unit, 102a Importance determination unit, 102b Weight determination unit, 102c Classification unit, 103 Accumulation unit, 104 Display unit, 105 Physical information of person to be analyzed

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

This sensing system (100) comprises a plurality of sensors (10), including at least one human sensor. The space in which the sensors are installed is divided into a plurality of areas, and sensors are installed in each of the areas. As a person to be measured moves, sensor information obtained by converting the discrete data of at least one of the sensors into continuous data is computed as a degree of importance. The behavior of the person is then recognized on the basis of weights calculated using information from the plurality of sensors, including the degree of importance, so that the sensor information is utilized to appropriately recognize human behavior.

Description

SENSING SYSTEM, SENSING DEVICE, AND SENSING METHOD
The present invention relates to a sensing system, a sensing device, and a sensing method.
In recent years, communication and sensor technologies have developed, and studies are under way on recognizing human movements and actions from sensing results in daily life and using them for monitoring.
For example, Patent Literature 1 describes: "When a human sensor (human body detection means) other than the one that responded immediately before responds, the watching device in the target person's house starts time measurement for movement determination. If another human sensor responds before the measured value t reaches the movement confirmation time T, the measured value t is reset and measurement is restarted. After measurement is started by the response of one human sensor, when the measured value t reaches the movement confirmation time T, it is determined that the person being watched over has moved to the watching area of that human sensor." According to this technology, the amount of information to be stored can be reduced by identifying the important human sensors, without interfering with grasping the daily activities of the person being watched over.
JP-A-2003-132462
According to the prior art described above, when a plurality of human sensors respond, the movement of a person is confirmed using the movement confirmation time T. However, since lifestyle habits differ from person to person and may change with life stages, it is difficult to set an appropriate movement confirmation time T in advance.
An object of the present invention is to build a system that appropriately recognizes human behavior by utilizing sensor information.
To solve the above problem, the sensing system of the present invention comprises a plurality of sensors including at least one human sensor. The space in which the sensors are installed is divided into a plurality of areas, and a plurality of sensors are installed in each of the areas. As the person to be measured moves, sensor information obtained by converting the discrete data of at least one of the sensors into continuous data is computed as a degree of importance, and the behavior of the person to be measured is recognized based on weights calculated using the information of the plurality of sensors, including the degree of importance.
According to the present invention, a system that appropriately recognizes human behavior can be built by utilizing sensor information.
FIG. 1 is a block diagram showing the configuration of a sensing system according to an embodiment.
FIG. 2 is a block diagram showing the internal configuration of a calculation unit.
FIG. 3A is a diagram showing a case where sensor data are discrete values at time t.
FIG. 3B is a diagram showing the creation of curve data m1 to m5 for discrete values at reaction times t1 to t5.
FIG. 3C is a diagram in which the overlapping sections of the curve data m1 to m5 in FIG. 3B are integrated.
FIG. 4A is a diagram showing an example of applying linear approximation to discrete values at reaction times t1 to t5.
FIG. 4B is a diagram showing an example of converting discrete values at reaction times t1 to t5 into rectangular functions.
FIG. 4C is a diagram showing an example of randomizing the binary sensor data 18 at the time t when the sensor 10 responded.
FIG. 5 is a diagram showing the addition of function values, less than the maximum sensor data value, in the spatial direction around a responding sensor to form sensor information.
FIG. 6 is a plan view showing a specific example of a space to which spatial addition of sensor information is applied.
FIG. 7A is a diagram showing the time change of the importance of a first sensor.
FIG. 7B is a diagram showing the time change of the sensor information of a second sensor.
FIG. 7C is a diagram showing the time change of the weights calculated by the importance determination unit 102a.
FIG. 8A is a diagram showing the time change of importance indicated by the curve of a third sensor.
FIG. 8B is a diagram showing the time change of importance indicated using a rectangular function of a fourth sensor.
FIG. 8C is a diagram showing the time change of the weights calculated by another importance determination unit 102b.
FIG. 9 is a flow chart explaining the operation of the sensing system.
FIG. 10 is a block diagram showing the configuration of a sensing system according to another embodiment.
FIG. 11 is a diagram showing an example arrangement of the sensors 10 in a house according to the present embodiment.
FIG. 12 is a diagram showing an example of the display contents of the display unit 104 in the watching service.
Embodiments of the present invention will be described in detail below with reference to the drawings. However, the present invention is not limited to the following embodiments, and various modifications and applications within the technical concept of the present invention are also included in its scope.
FIG. 1 is a block diagram showing the configuration of a sensing system 100 according to an embodiment.
The sensing system 100 includes an information acquisition unit 101, a calculation unit 102, an accumulation unit 103, a display unit 104, and sensors 10. In this specification, the configuration of the sensing system 100 excluding the sensors 10 (the information acquisition unit 101, the calculation unit 102, the accumulation unit 103, and the display unit 104) is called a sensing device.
More specifically, a computer comprising a CPU that performs arithmetic processing, a memory, a camera, a communication unit, an operation unit, a display, and a nonvolatile storage medium constitutes the sensing system 100: the CPU executes a program stored in the nonvolatile storage medium, thereby functioning as the information acquisition unit 101, the calculation unit 102, and the display unit 104. The accumulation unit 103 is implemented by the memory or the nonvolatile storage medium.
The information acquisition unit 101 acquires the sensor data 18 from the sensors 10 and also acquires the body information 105 of the person to be analyzed.
Here, the body information 105 of the person to be analyzed includes, for example, height, weight, date of birth, BMI (body mass index), body fat percentage, visceral fat level, muscle mass, body water percentage, and metabolic body age.
The information acquisition unit 101 acquires the sensor data 18 by connecting directly to the sensors 10 via Ethernet, wireless communication, or the like. When the sensor data are collected, via a gateway, on a locally installed PC connected to the sensors 10, the information acquisition unit 101 accesses the PC via a local network or the Internet to acquire the sensor data 18. Furthermore, when the sensor data 18 are collected, directly or via a gateway, on a server of a cloud computing service, the information acquisition unit 101 may acquire the sensor data 18 by accessing the server via the Internet.
The information acquisition unit 101 receives the sensor data 18 from an external server in a file format such as CSV (Comma-Separated Values), using Python (registered trademark), Java (registered trademark), the C language, or the like.
In addition, the information acquisition unit 101 refers to the accumulation unit 103 to acquire the body information 105 of the person to be analyzed, which the system administrator has stored in the accumulation unit 103 based on the application documents that the person fills out when applying for the service.
The calculation unit 102 analyzes the plurality of sensor data 18 and recognizes human behavior. Specifically, the calculation unit 102 classifies one or more types of sensor data 18 into one or more classes. Here, classification means grouping the sensor data 18 based on their characteristics; when human behavior is recognized, each classification result is assigned a label name indicating a human action, such as "sleep", "going out", "relax", "cooking", "eating", "cleaning", or "washing".
The configuration of the calculation unit 102 will be described later.
The accumulation unit 103 accumulates at least one of the information acquired by the information acquisition unit 101 and the output of the calculation unit 102. In addition to accumulating data for each person to be analyzed, the data may be tagged with attributes such as the person's age, sex, origin, language, religion, and tastes and preferences.
The accumulation unit 103 may also transmit the history data accumulated for the person to be analyzed to the calculation unit 102, and the calculation unit 102 may perform time-series analysis of these data. In this specification, time-series analysis is defined as visualizing various data in graphs with time on the horizontal axis, and performing operations on the data values, the number of classes, or the duration of each class, such as calculating the rate of change over time, moving averages, variance, and standard deviation, as well as error analysis and polynomial approximation. These time-series analyses may include comparison with the average values of persons belonging to the same category in terms of age, sex, origin, religion, lifestyle, occupation, medical history, and the like.
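As a minimal sketch of two of the time-series operations named above (moving average and rate of change over time), the following functions operate on a hypothetical list of daily activity amounts; the function names and window size are illustrative, not part of the specification.

```python
def moving_average(values, window=3):
    """Simple moving average over a sequence of daily values.

    Returns one averaged value per full window, so the output is
    shorter than the input by window - 1 samples.
    """
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]


def rate_of_change(values):
    """Difference between consecutive samples in the time direction."""
    return [b - a for a, b in zip(values, values[1:])]
```

For example, `moving_average([1, 2, 3, 4], window=2)` yields `[1.5, 2.5, 3.5]`, smoothing day-to-day fluctuation before comparison with category averages.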
The display unit 104 receives information from the calculation unit 102 and the accumulation unit 103, and displays it to the administrator of the sensing system 100, system users, persons to be analyzed, and others. The display unit 104 can display in any format, including numbers, characters, tables, and graphs.
Next, the operation of the calculation unit 102 will be described in detail.
FIG. 2 is a block diagram showing the internal configuration of the calculation unit 102.
The calculation unit 102 is composed of an importance determination unit 102a, a weight determination unit 102b, and a classification unit 102c.
The importance determination unit 102a calculates a degree of importance for the sensor data 18 acquired by the calculation unit 102. In this specification, the degree of importance is calculated using the responses (changes) of the sensor data 18 in the temporal and spatial directions, and is defined as an index that, when a plurality of sensors 10 are present, gives a clue as to which sensor's data 18 should be relatively emphasized. The method of calculating the degree of importance is described in detail below.
First, consider the case where, as shown in FIG. 3A, the importance determination unit 102a receives as sensor information a discrete value (for example, 1) of the acquired sensor data 18 at the time t when the sensor 10 responded.
In FIG. 3A, the sensor 10 is exemplified by a human sensor based on the pyroelectric effect (a pyroelectric sensor). Such a sensor outputs 0 when a person stops moving for a certain period of time even within the detection range, or when the person leaves the detection range, that is, when there is no temperature change for a certain period of time. FIG. 3A shows the temporal change of the sensor information of one human sensor. In FIG. 3A, the human sensor does not respond between reaction times t1 and t2; this occurs, for example, if the person to be analyzed stops moving. The same applies between reaction times t2 and t3 and between reaction times t4 and t5. Between reaction times t3 and t4, either the person stopped moving, or the person left the detection range of the human sensor after reaction time t3 and returned to it at reaction time t4.
When the discrete-valued sensor information of FIG. 3A is input, the importance determination unit 102a creates sensor information, as shown in FIG. 3B, in which the function value at time t is set to the maximum value (for example, 1) and function values less than the maximum value are added in the time direction before and after t. Specifically, the importance determination unit 102a creates curve data m1 to m5 for the discrete values (for example, 1) of the sensor data 18 at reaction times t1 to t5.
Here, the function values less than the maximum value are calculated by the importance determination unit 102a so that the function value decreases as the time difference from time t increases. For example, the importance determination unit 102a applies a normal distribution, a Student's t distribution, a U (Universal) distribution, or any other distribution used in the field of statistics to the function values added before and after the sensor reaction time t.
When the sensor 10 is a human sensor, the importance determination unit 102a treats a sensor information value of 1 (the value of the sensor data 18) as a human existence probability of 1.0, and the created continuous sensor information values can be treated as the existence probability of a person in the time-series direction.
FIG. 3C is a diagram in which the overlapping sections of the curve data m1 to m5 in FIG. 3B are integrated. When the created curve data overlap, the importance determination unit 102a merges them by taking, in each overlapping section, the maximum value of the overlapping curves. Alternatively, the importance determination unit 102a may take the sum of the curves and normalize it by the overall maximum value of the sensor information. In this way, the importance determination unit 102a can uniquely determine the value of the sensor information at each time.
The sensor information in FIG. 3C has the characteristic that, except when the elapsed time since a reaction is short, it takes larger values (area) the more continuously the sensor 10 responds. It can therefore serve as an index of which sensor to emphasize when a plurality of sensors 10 respond.
In this specification, sensor information converted so as to be continuously distributed in the time direction, as in FIG. 3C, is defined as the degree of importance.
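The conversion described above can be sketched as follows, assuming a normal (Gaussian) curve of peak 1.0 placed at each reaction time and merging overlaps by taking the maximum. The function name `importance` and the width parameter `sigma` are illustrative assumptions, not values from the specification.

```python
import math

def importance(event_times, t, sigma=30.0):
    """Degree of importance at time t (in seconds).

    A Gaussian-shaped curve with peak 1.0 is centered at each sensor
    reaction time; overlapping curves are merged by taking the maximum,
    as in the FIG. 3C description.
    """
    if not event_times:
        return 0.0
    return max(math.exp(-((t - te) ** 2) / (2.0 * sigma ** 2))
               for te in event_times)

# The importance equals 1.0 exactly at a reaction time and decays
# smoothly before and after it, so an unintended gap in the discrete
# data does not make the value drop abruptly to 0.
```

With this shape, a sensor that reacts repeatedly accumulates a larger area under its importance curve, matching the text's criterion for deciding which sensor to emphasize.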
As shown in FIGS. 3B and 3C, the importance determination unit 102a converts the discrete sensor data 18 into continuous sensor data 18 distributed continuously in the time direction. This prevents the value from abruptly becoming 0 or invalid when sensor data 18 are unintentionally lost, realizing a system that is robust against missing sensor data 18.
The above description explained that the importance determination unit 102a converts the discrete sensor data 18 into continuous values having an arbitrary distribution used in the field of statistics, as shown in FIG. 3B; however, the conversion is not limited to this, and other conversion methods may be used.
FIG. 4A shows an example in which, when converting discrete values into continuous values by adding function values less than the maximum value in the time direction before and after the sensor reaction time t, the importance determination unit 102a applies linear approximation instead of curves to the discrete values at reaction times t1 to t5. Compared with the case of FIG. 3B, this reduces the computational load on the importance determination unit 102a.
FIG. 4B shows an example in which the importance determination unit 102a converts the discrete values at reaction times t1 to t5, which are the sensor data 18 at the times t when the sensor 10 responded, into rectangular functions. This is effective for sensor data 18 whose values are approximately constant over short periods, such as those from temperature/humidity sensors, barometric pressure sensors, and illuminance sensors.
FIG. 4C shows an example in which the importance determination unit 102a randomizes binary sensor data 18 (for example, 1 or 0 for a sensor whose maximum value is 1 and minimum value is 0) at the time t when the sensor 10 responded.
More specifically, the randomization is set so that the possible range of the random value differs between the minimum discrete value (for example, 0) and the maximum discrete value (for example, 1) of the sensor data 18. For example, when the sensor data 18 is "0", it is converted into a random value in the range 0 to 0.3, and when it is "1", into a random value in the range 0.7 to 1.0.
This avoids anomalies during analysis caused by changes in the values of the sensor data 18 and allows the system to run stably.
Specifically, effects such as preventing division of real numbers by zero during calculation, improving the generalization performance of machine learning by dispersing discrete values over a certain range, and stabilizing calculations by preventing abrupt changes from the minimum value to the maximum value can be expected.
Furthermore, as shown in FIG. 5, the importance determination unit 102a not only adds values in the time-axis direction to form sensor information, but also adds function values less than the maximum value of the sensor data 18 in the spatial direction, centered on the sensor 10 that responded, to form sensor information.
Specifically, when a sensor responds in the living room, the importance determination unit 102a may also add sensor information values to the bedroom and kitchen, which are spatially adjacent to the living room and reachable from it.
FIG. 6 is a plan view showing a specific example of a space to which the sensor information shown in FIG. 5 is applied. Starting from the living room where the sensor data 18 occurred (existence probability: 1.0), the importance determination unit 102a adds relatively large values (existence probability: 0.7) to the nearest spaces, the kitchen and the corridor, and adds smaller values the farther a space is from the living room.
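The spatial spreading can be sketched as a breadth-first walk over a room adjacency graph, where each hop away from the responding sensor's room multiplies the existence probability by a decay factor. The adjacency map below is an assumed floor plan (the specification only names the living room, kitchen, corridor, and bedroom), and the 0.7 decay matches the example values in the text.

```python
from collections import deque

# Hypothetical room adjacency, loosely modeled on the FIG. 6 floor plan.
ADJACENCY = {
    "living":   ["kitchen", "corridor"],
    "kitchen":  ["living"],
    "corridor": ["living", "bedroom", "bathroom"],
    "bedroom":  ["corridor"],
    "bathroom": ["corridor"],
}

def spatial_probabilities(origin, decay=0.7):
    """Propagate an existence probability of 1.0 from the origin room.

    Breadth-first traversal: each room one hop farther from the origin
    receives the previous room's value multiplied by `decay`, so rooms
    farther from the responding sensor get smaller added values.
    """
    probs = {origin: 1.0}
    queue = deque([origin])
    while queue:
        room = queue.popleft()
        for nxt in ADJACENCY[room]:
            if nxt not in probs:
                probs[nxt] = probs[room] * decay
                queue.append(nxt)
    return probs
```

Starting from the living room, the kitchen and corridor receive 0.7 and the bedroom, two hops away, receives 0.49, mirroring the "smaller values farther from the living room" behavior.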
Next, the weight determination unit 102b in FIG. 2 will be described. The weight determination unit 102b converts the degree of importance determined by the importance determination unit 102a into a weight. In this specification, the weight is defined as an index of which sensor's data 18 to emphasize relatively when analyzing multiple types of sensor data 18. The degree of importance may be used as the weight as it is, or it may be converted by one of the methods described in detail below.
A first example, in which the weight determination unit 102b converts degrees of importance into a weight by combining a plurality of sensors, is described with reference to FIGS. 7A, 7B, and 7C.
FIG. 7A is a diagram showing the time change of the importance of a first sensor determined by the importance determination unit 102a.
The weight determination unit 102b uses the raw time change of the sensor information (sensor data 18) of the second sensor shown in FIG. 7B, and calculates the weight by multiplying the importance of the first sensor by the sensor information of the second sensor. FIG. 7C is a diagram showing the time change of the weight calculated in this way.
 More specifically, the first sensor is, for example, a human presence sensor installed in the kitchen. Because the importance determination unit 102a calculates the importance with a maximum value of 1, the importance can be regarded as the probability that a person is present.
 The second sensor is, for example, a door open/close sensor. It is provided on a refrigerator, a microwave oven, or the like, either built into a connected home appliance in advance or attached externally. The second sensor measures sensor information indicating the open/close status of the door of the refrigerator, microwave oven, or the like.
 Continuing the description of the weights, the graph of the temporal change of the weight in FIG. 7C calculated by the weight determination unit 102b indicates that the larger the weight, the higher the possibility that a person is present in the kitchen and is operating a home appliance closely related to cooking.
 Therefore, by recognizing human behavior based on the temporal change of the weight as shown in FIG. 7C, the behavior can be recognized with higher accuracy than when only the data of the human presence sensor is used. This works particularly well for distinguishing from cooking such behaviors as going to the kitchen to wash one's hands during a meal, or washing dishes in the kitchen after a meal.
 It was described above that the weight determination unit 102b calculates the weight by multiplying the importance of the first sensor by the sensor information of the second sensor. Alternatively, if the importance of the first sensor is 0.5 or more, the sensor information of the second sensor may be used as the weight, and if the importance of the first sensor is less than 0.5, the weight may be ignored (the weight value set to 0).
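The two weighting rules just described can be sketched element-wise over aligned time series. The function names and the sample values below are ours, chosen only for illustration.

```python
def weight_by_product(importance, sensor_value):
    """Weight = importance of the first sensor x raw value of the second sensor."""
    return [i * v for i, v in zip(importance, sensor_value)]

def weight_by_gate(importance, sensor_value, threshold=0.5):
    """Use the second sensor's raw value as the weight only while the first
    sensor's importance is at or above the threshold; otherwise set it to 0."""
    return [v if i >= threshold else 0.0 for i, v in zip(importance, sensor_value)]

presence = [0.2, 0.6, 0.9, 0.4]  # importance of a kitchen presence sensor
door     = [0.0, 1.0, 1.0, 1.0]  # door open/close sensor readings (raw 0/1)
print(weight_by_product(presence, door))  # [0.0, 0.6, 0.9, 0.4]
print(weight_by_gate(presence, door))     # [0.0, 1.0, 1.0, 0.0]
```

The product form lets the weight vary continuously with the presence probability, while the gated form simply passes the second sensor's value through whenever presence is sufficiently likely.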
 Next, a second example, in which the weight determination unit 102b converts importance into a weight by combining a plurality of sensors, will be described with reference to FIGS. 8A, 8B, and 8C.
 FIG. 8A is a diagram showing the temporal change in the importance of the third sensor, represented by a curve, as determined by the importance determination unit 102a.
 FIG. 8B is a diagram showing the temporal change in the importance of the fourth sensor, represented by a rectangular function, as determined by the importance determination unit 102a.
 The weight determination unit 102b calculates the weight by multiplying the temporal change in the importance of the third sensor in FIG. 8A by the temporal change in the importance of the fourth sensor in FIG. 8B. FIG. 8C is a diagram showing the temporal change of the weight calculated by the weight determination unit 102b.
 More specifically, the third sensor is, for example, a human presence sensor installed in the bedroom. Because the importance determination unit 102a calculates the importance with a maximum value of 1, the importance can be regarded as the probability that a person is present.
 The fourth sensor is, for example, an illuminance sensor in the bedroom; built into the lighting in advance or retrofitted in the room, it measures the illuminance of the room as sensor information. Since the measured value of an illuminance sensor can be considered constant over a short period, it is appropriate to calculate its importance with a rectangular function.
 Furthermore, in the case of the illuminance sensor, in order to express the binary ON/OFF state of the lighting, a fixed threshold (for example, 10 lx) may be set, and the importance may be set to 1 (ON) when the measured value is equal to or greater than the threshold and 0 (OFF) when it is below the threshold.
 In this case, the graph of the temporal change of the weight in FIG. 8C calculated by the weight determination unit 102b indicates that the larger the weight, the higher the possibility that a person is present in the bedroom and is active.
 Therefore, by recognizing human behavior based on the weights shown in FIG. 8C, the behavior can be recognized with higher accuracy than when only the data of the human presence sensor is used. This works particularly well for distinguishing from sleep such behaviors as relaxing or reading, when the person is in the bedroom but not asleep.
 It was described above that the weight determination unit 102b calculates the weight by multiplying the temporal change in the importance of the third sensor by the temporal change in the importance of the fourth sensor. Alternatively, the product may be calculated as the weight only when the importances of the third and fourth sensors are each equal to or greater than a predetermined value; otherwise the weight may be ignored (the weight value set to 0).
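The second example can be sketched in the same element-wise style: the illuminance reading is first binarized with the threshold mentioned above, and the weight is the product of the two importances, optionally zeroed unless both importances reach a predetermined value. Function names, thresholds, and sample readings are illustrative assumptions.

```python
def illuminance_importance(lux_values, threshold=10.0):
    """Rectangular (binary) importance: 1 while the light is ON, else 0."""
    return [1.0 if lux >= threshold else 0.0 for lux in lux_values]

def weight_two_importances(imp_a, imp_b, min_value=None):
    """Weight = product of two importances; optionally zeroed unless both
    importances reach `min_value` (the predetermined value in the text)."""
    weights = []
    for a, b in zip(imp_a, imp_b):
        if min_value is not None and (a < min_value or b < min_value):
            weights.append(0.0)
        else:
            weights.append(a * b)
    return weights

presence = [0.9, 0.8, 0.3]                     # bedroom presence importance
light = illuminance_importance([200, 150, 3])  # lux readings -> [1.0, 1.0, 0.0]
print(weight_two_importances(presence, light))                 # [0.9, 0.8, 0.0]
print(weight_two_importances(presence, light, min_value=0.5))  # [0.9, 0.8, 0.0]
```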
 Returning now to FIG. 2, the classification unit 102c will be described.
 The classification unit 102c classifies human behavior based on the weights determined by the weight determination unit 102b. The algorithm used for classification is not particularly limited.
 For example, the classification unit 102c may set a threshold for each weight and identify the corresponding behavior when the weight is equal to or greater than that threshold. Specifically, in FIG. 8C, when the weight has a value of 0.8 or more, the person is classified as relaxing in the bedroom.
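Threshold-based classification of this kind can be sketched as a per-behavior lookup; the behavior names and threshold values below are illustrative, not taken from the patent.

```python
def classify_by_threshold(weights, thresholds):
    """weights / thresholds: dicts keyed by behavior name.
    Returns every behavior whose weight reaches its own threshold."""
    return [name for name, w in weights.items() if w >= thresholds.get(name, 1.0)]

weights = {"relaxing_in_bedroom": 0.85, "cooking": 0.30}
thresholds = {"relaxing_in_bedroom": 0.8, "cooking": 0.5}
print(classify_by_threshold(weights, thresholds))  # ['relaxing_in_bedroom']
```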
 The classification unit 102c may also use supervised machine learning: it prepares in advance teacher data containing the cases to be classified together with training data, trains a classifier capable of sorting data into as many classes as there are types of teacher data, and classifies human behavior with the trained classifier.
 The classification unit 102c may further learn the physical information 105 of each person to be analyzed and prepare a separate classifier for each person.
 Here, an example of the teacher data is a human behavior such as "sleep" or "going out", and an example of the training data is the sensor data 18 of the time period corresponding to "sleep" or "going out"; this sensor data 18 may be converted into importance or weight before being used as training data.
 The type of machine learning is not limited as long as it learns from teacher data and acquires a classification function. For example, methods using decision trees including boosting, logistic regression, the k-nearest-neighbor method, support vector machines, random forests and their ensembles, and classifiers built from the fully connected layers of deep learning, CNNs, RNNs, and the like can be used.
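As one of the supervised options listed above, a toy k-nearest-neighbor classifier over weight features is sketched below. The feature vectors (for example, per-window weights of two sensors) and the labels are invented for illustration only.

```python
from collections import Counter
import math

def knn_predict(train, sample, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest training points."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], sample))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical training windows: (bedroom weight, entrance-door weight) -> behavior
train = [
    ((0.90, 0.10), "sleep"), ((0.80, 0.20), "sleep"), ((0.85, 0.15), "sleep"),
    ((0.10, 0.90), "going_out"), ((0.20, 0.80), "going_out"), ((0.15, 0.70), "going_out"),
]
print(knn_predict(train, (0.82, 0.18)))  # sleep
print(knn_predict(train, (0.12, 0.85)))  # going_out
```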
 Furthermore, as a classification method, the classification unit 102c may use, for example, unsupervised learning to group together data whose features are similar among the collected sensor data 18.
 The type of machine learning is not limited as long as it has a classification function that requires no teacher data. Examples include clustering methods such as the k-means, k-means++, x-means, and k-shape methods and Gaussian mixture models; methods based on anomaly detection techniques such as one-class SVM, elliptic envelope, isolation forest, and local outlier factor; combinations of these with dimensionality-reduction techniques such as t-SNE and autoencoders; and methods that collect data into a predetermined distribution using the low-dimensional space of generative adversarial networks, typified by adversarial autoencoders.
 The number of classes may be determined in any way: it may be set arbitrarily, determined at a value considered appropriate using techniques such as the elbow method or the silhouette method, or left undetermined in advance by defining a distance measure between features and counting items as belonging to different classes whenever they differ by more than a predetermined value.
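As a minimal sketch of the unsupervised route, a one-dimensional k-means over per-window feature values is shown below (the data and the choice of k = 2 are illustrative; in practice k could be selected with the elbow or silhouette method mentioned above).

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Toy 1-D k-means: returns the sorted cluster centers."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest center
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
    return sorted(centers)

features = [0.1, 0.12, 0.15, 0.8, 0.85, 0.9]  # hypothetical per-window weights
print(kmeans_1d(features, 2))  # two centers, one near 0.12 and one near 0.85
```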
 FIG. 9 is a flowchart explaining the operation of the sensing system 100.
 The sensing system 100 starts processing in response to an operation by the system administrator.
 In step S1, the information acquisition unit 101 acquires the sensor data 18 of the sensors 10 (human presence sensors, door sensors, and the like) and the physical information 105 of the person to be analyzed.
 In step S2, the information acquisition unit 101 transmits the data acquired in step S1 to the calculation unit 102 and the accumulation unit 103.
 In step S3, the calculation unit 102 performs behavior recognition using the sensor data 18 received from the information acquisition unit 101.
 More specifically, in the calculation unit 102, the importance determination unit 102a converts the discrete data in the sensor data 18 of the plurality of sensors 10 into continuous data to obtain importances; the weight determination unit 102b calculates the temporal change of a weight formed as the product of a sensor's importance and raw sensor data 18, or as the product of two importances; and the classification unit 102c recognizes behavior based on the weight.
 In step S4, the calculation unit 102 transmits the recognition result to the display unit 104 and the accumulation unit 103.
 In step S5, the accumulation unit 103 transmits the recognition result notified by the calculation unit 102 in step S4, together with the past accumulated data of the person to be analyzed or of others, to the calculation unit 102.
 In step S6, the calculation unit 102 uses the current behavior recognition result obtained in step S3 and the past accumulated data to calculate the time-series change, and transmits it to the display unit 104.
 In step S7, the display unit 104 displays the current behavior recognition result and its time-series change transmitted in step S6.
 In step S8, the sensing system 100 determines whether or not to end the processing; it may end as it is (Yes in S8), or return to step S1 and repeat the processing. A loop that returns to immediately after the start of processing (S1) may be implemented by setting this in advance (No in S8).
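The control flow of steps S1 through S8 can be sketched as a simple loop; every component below is a placeholder standing in for the corresponding unit in the text, not an actual implementation.

```python
def run_sensing_loop(acquire, recognize, accumulate, display, should_stop):
    """One pass per iteration: acquire (S1-S2), recognize (S3), accumulate
    and retrieve history (S4-S5), display (S6-S7), then decide whether to
    stop or repeat (S8)."""
    while True:
        sensor_data, body_info = acquire()
        result = recognize(sensor_data, body_info)
        history = accumulate(result)
        display(result, history)
        if should_stop():
            break

# Minimal demonstration with stub components:
log = []
run_sensing_loop(
    acquire=lambda: ({"presence": 1}, {"age": 80}),
    recognize=lambda sensors, body: "cooking",
    accumulate=lambda result: [result],
    display=lambda result, history: log.append((result, history)),
    should_stop=lambda: True,
)
print(log)  # [('cooking', ['cooking'])]
```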
 Next, with reference to FIG. 10, a case will be described in which a business operator that provides services and products to the elderly, such as an insurance company or a day-care service provider, uses the sensing system 100 of the embodiment.
 The human presence sensor 11, illuminance sensor 12, temperature/humidity sensor 13, door open/close sensor 14, and acceleration sensor 15 in FIG. 10 correspond to the sensors 10 in FIG. 1. The sensor data 18 of each of these sensors (hereinafter collectively referred to as the sensors 10) is transmitted, directly or via the router 16, to the external server 31 and then to the information acquisition unit 101 of the sensing system 100.
 The sensors 10 are installed when a subscription to a service or product provided by the business operator is applied for; power is secured by a mains connection, a battery, or energy harvesting from sunlight, vibration, or the like, and measurement is started.
 FIG. 11 is a diagram showing an example of the in-home arrangement of the sensors 10 of the present embodiment.
 At least two of the human presence sensor 11, the illuminance sensor 12, and the temperature/humidity sensor 13 may be combined into a set and treated as an environment sensor. In addition, a microphone, a noise sensor, an atmospheric pressure sensor, an odor sensor, a pressure-sensitive sensor, a wearable sensor, a weight scale, a body composition meter, indoor and outdoor image sensors, an image sensor built into a robot vacuum cleaner, and the like (none of which are shown) may be incorporated into the system.
 The home is divided into a plurality of areas, and a human presence sensor 11, an illuminance sensor 12, and a temperature/humidity sensor 13 are installed in each area. The areas may be divided room by room; when a plurality of behaviors take place in one room, the range in which each behavior takes place may be treated as a separate area.
 A door open/close sensor 14 is installed on each door and on the microwave oven 22 and the refrigerator 23 to sense the open/close status of the doors and home appliances. Acceleration sensors 15 are installed on the washing machine 21, the kitchen sink, and the robot vacuum cleaner 24 to sense their respective usage.
 The data of these sensors 10 is collected by the router 16, which transmits it to the information acquisition unit 101. Note that the data of any sensor 10 installed on a home appliance may be substituted by replacing that sensor 10 with a connected home appliance and using its data instead.
 Here, since the human presence sensor 11 has a field of view of about ±45 degrees horizontally and about 5 m in depth, it is desirable to install it so that it does not detect information from areas other than the one in which it is installed. For example, it is important to arrange each human presence sensor 11 so that the central axis 17 of its field of view does not substantially face a doorway to an adjacent area.
 The types of sensors used can be changed according to the service and product; for example, some or all of the human presence sensor 11, the illuminance sensor 12, and the temperature/humidity sensor 13 may be combined into a single environment sensor. Data linkage with other commercially available devices, such as online-connectable wearable sensors, weight scales, and body composition meters, may also be included.
 The owner of the external server 31 is likewise not limited: it may be a server managed by the business operator, a server managed by the sensor manufacturer, or a generally rentable server typified by web services. Furthermore, each business operator may store its data in a separate environment, with the sensing system 100 reading each and connecting them internally. However, these servers must be able to communicate with the sensing system 100 built either in the business operator's in-house environment or in a cloud environment.
 The information acquisition unit 101 of the sensing system 100 takes the sensor data 18 of the sensors 10 collected on the external server 31 and the physical information 105 of the person to be analyzed into the sensing system 100.
 The physical information 105 of the person to be analyzed is acquired by allowing it to be entered into a sensor 10, by making it an item on the application form for the service or product, or by allowing it to be entered additionally from the user's personal page on the business operator's website. Separately, measuring instruments such as a grip strength meter or a body composition meter may be provided, or the information may be obtained in cooperation with third parties such as other business operators, local governments, or non-profit organizations.
 As described with reference to FIG. 1, the information acquisition unit 101 transmits the data of the sensors 10 and the physical information 105 of the person to be analyzed to the calculation unit 102 and the accumulation unit 103.
 As described with reference to FIG. 1, the calculation unit 102 recognizes human behavior from the sensor 10 information received from the information acquisition unit 101 and the physical information 105 of the person to be analyzed. Here, human behaviors include, for example, "sleep", "going out", "relaxing", "cooking", "eating", "cleaning", "laundry", and "others".
 As described with reference to FIG. 1, the accumulation unit 103 accumulates the data of the sensors 10 and the physical information 105 of the person to be analyzed; it also receives the physical information 105 from the information acquisition unit 101, searches for history data such as past physical information and activity levels of the same person, and transmits it to the information acquisition unit 101.
 The accumulation unit 103 may be provided in the external server 31; in this case, the information acquisition unit 101 transmits the past history data of the person to be analyzed held on the external server 31, together with the current data, to the calculation unit 102. This allows the calculation unit 102 to perform time-series analysis.
 The calculation unit 102 transmits the current behavior recognition result and its time-series analysis result to the accumulation unit 103 and the display unit 104. The accumulation unit 103 organizes and accumulates the results output by the calculation unit for each person to be analyzed and for each category such as the person's age, sex, origin, religion, lifestyle, occupation, and medical history.
 The display unit 104 displays the received results to the system operator or the business operator. The display method may differ between the system operator and the business operator.
 To the system operator it displays, for example, in addition to the current behavior of the person to be analyzed and its time-series analysis results, information on the operating state of the sensors, information on the battery status of the sensors, the sensor readings not processed by the calculation unit 102, and information on the date and time, frequency, duration, and content of the business operator's references to the system.
 To the business operator, in addition to the current behavior recognition result of the person to be analyzed and its time-series change, the information is converted into and displayed as the information each business operator particularly needs. Specifically, for an insurance company, showing the past insurance claim histories of others together with the types of daily behavior and their time-series changes can provide material for inferring the likelihood of an insurance claim for the current person being analyzed. Based on this information, the insurer can consider intervention measures to steer the types of the person's daily behavior and their time-series changes in a direction that makes an insurance claim less likely. Such interventions include, for example, making recommendations such as "Let's go out for about 30 minutes every day" or "Let's go to bed 15 minutes earlier today."
 In the case of a day-care service provider, showing the schedule of the day-care service together with the changes in the types of daily behavior can provide material for examining the influence of the day-care content on the lifestyle of the person being analyzed. Based on this information, the day-care provider can consider a program for correcting the time-series change in the types of daily behavior for each person in a desirable direction.
 The business operator can return feedback on the displayed content to the system administrator. Means of feedback include verbal communication, e-mail, and posting via the system administrator's website. The system administrator may change the displayed content based on the feedback from the business operator.
 The sensing system 100 of the embodiment can also be applied to individuals who want to watch over their elderly parents. For example, an individual who wants to watch over an elderly parent may sign up for the business operator's monitoring service and have sensors sent to them. By installing the sensors in the elderly parent's home and transmitting the data to the business operator, the individual can receive, as analysis results, the types of the parent's daily behavior and monitor their lifestyle. Based on the results, the individual can then take actions such as encouraging a parent who rarely goes out to do so, or visiting a parent who wakes up late to check on them.
 FIG. 12 is a diagram showing an example of the display content of the display unit 104 in the monitoring service.
 On the display of the terminal of the individual watching over the elderly parent, the display unit 104 displays the current behavior recognition result recognized by the calculation unit 102 ("cooking"), the recognized activity level, and, as time-series analysis results, a pie chart showing the proportions of time spent on the daily behaviors "sleep", "eating", and "relaxing". The display unit 104 further displays the month-by-month time-series change in activity level and an analysis comment on the living situation.
 The present invention is not limited to the embodiments described above and includes various modifications. The above embodiments have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to one having all the described configurations. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
 10 sensor
 18 sensor data
 100 sensing system (sensing device)
 101 information acquisition unit
 102 calculation unit
 102a importance determination unit
 102b weight determination unit
 102c classification unit
 103 accumulation unit
 104 display unit
 105 physical information of the person to be analyzed

Claims (9)

  1.  A sensing system comprising a plurality of sensors including at least one human presence sensor, wherein
     the space in which the sensors are installed is divided into a plurality of areas, and a plurality of sensors are installed in each of the plurality of areas,
     as the person to be measured moves, sensor information obtained by converting the discrete data of at least one of the sensors into continuous data is obtained as an importance, and
     behavior recognition of the person to be measured is performed based on a weight calculated using the information of the plurality of sensors including the importance.
  2.  The sensing system according to claim 1, wherein
     each sensor is arranged so as not to detect information from areas other than the area in which it is installed, and the central axis of the measurement range of at least one sensor does not substantially face a doorway to an adjacent area.
  3.  The sensing system according to claim 1, wherein
     the weight is obtained from the importance of a sensor and the data of a sensor.
  4.  The sensing system according to claim 1, wherein
     the weight is obtained from the importances of two sensors.
  5.  The sensing system according to any one of claims 1 to 4, wherein
     the importance is obtained by converting the temporally or spatially discrete data of the sensor into temporally or spatially continuous data using any one of a curve, a straight line, a rectangle, and random numbers.
  6.  The sensing system according to any one of claims 1 to 4, wherein
     time-series analysis of the behavior recognition of the person to be measured is performed.
  7.  A sensing method for a sensing system comprising a plurality of sensors including at least one human presence sensor, the method comprising:
     a step of acquiring sensor data from the sensors installed in each area of a space divided into a plurality of areas;
     a step of obtaining, as an importance, sensor information obtained by converting the discrete data of the sensor data into continuous data; and
     a step of recognizing behavior based on a weight calculated using the information of the plurality of sensors including the importance.
  8.  A sensing device that performs behavior recognition of a person to be measured from the sensor data of a plurality of sensors including at least one human presence sensor, the device comprising:
     an information acquisition unit that acquires, as sensor information, sensor data from the sensors installed in each area of a space divided into a plurality of areas;
     an importance determination unit that obtains, as an importance, sensor information obtained by converting the discrete data of the sensor data into continuous data;
     a weight determination unit that calculates a weight using the information of the plurality of sensors including the importance; and
     a classification unit that performs behavior recognition of the person to be measured based on the weight.
  9.  In the sensing device according to claim 8,
     the weight determination unit calculates the weight from the importance of a sensor and data of another sensor, or calculates the weight from the importances of a plurality of sensors.
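The two claimed alternatives for the weight determination unit can be sketched as below. Both formulas are illustrative assumptions, since the application does not specify how the combination is computed.

```python
def weight_from_importance_and_data(importance, other_sensor_value, beta=0.3):
    """First alternative: combine one sensor's importance with another
    sensor's raw data (e.g., a door sensor state); beta is an assumed
    coupling coefficient."""
    return importance * (1.0 + beta * other_sensor_value)

def weights_from_importances(importances):
    """Second alternative: derive weights from several sensors'
    importances, here by simple normalization to sum to 1."""
    total = sum(importances) or 1.0
    return [v / total for v in importances]

print(weight_from_importance_and_data(0.8, 1))
print(weights_from_importances([0.5, 0.3, 0.2]))
```

In the first alternative a corroborating reading from a second sensor boosts the weight; in the second, sensors compete for weight in proportion to their own importances.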
PCT/JP2022/044782 2021-12-17 2022-12-05 Sensing system, sensing device, and sensing method WO2023112757A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021205155A JP2023090273A (en) 2021-12-17 2021-12-17 Sensing system, sensing device, and sensing method
JP2021-205155 2021-12-17

Publications (1)

Publication Number Publication Date
WO2023112757A1 true WO2023112757A1 (en) 2023-06-22

Family

ID=86774620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/044782 WO2023112757A1 (en) 2021-12-17 2022-12-05 Sensing system, sensing device, and sensing method

Country Status (2)

Country Link
JP (1) JP2023090273A (en)
WO (1) WO2023112757A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200210265A1 (en) * 2018-12-31 2020-07-02 Tata Consultancy Services Limited Method and system for prediction of correct discrete sensor data based on temporal uncertainty
JP2020160608A (en) * 2019-03-25 2020-10-01 株式会社日立製作所 Abnormality detection system
JP2021157274A (en) * 2020-03-25 2021-10-07 株式会社日立製作所 Behavior recognition server and behavior recognition method
JP2021162366A (en) * 2020-03-30 2021-10-11 株式会社豊田中央研究所 Sensor delay time estimation device


Also Published As

Publication number Publication date
JP2023090273A (en) 2023-06-29

Similar Documents

Publication Publication Date Title
Rajan Jeyaraj et al. Smart-monitor: Patient monitoring system for IoT-based healthcare system using deep learning
US11355227B2 (en) Activity capability monitoring
Amiribesheli et al. A review of smart homes in healthcare
Verma et al. Fog assisted-IoT enabled patient health monitoring in smart homes
Suryadevara et al. Smart homes
Chung et al. Ambient context-based modeling for health risk assessment using deep neural network
Forkan et al. CoCaMAAL: A cloud-oriented context-aware middleware in ambient assisted living
Meng et al. Towards online and personalized daily activity recognition, habit modeling, and anomaly detection for the solitary elderly through unobtrusive sensing
Tunca et al. Multimodal wireless sensor network-based ambient assisted living in real homes with multiple residents
Verma et al. A comprehensive framework for student stress monitoring in fog-cloud IoT environment: m-health perspective
Kim et al. Prediction model of user physical activity using data characteristics-based long short-term memory recurrent neural networks
JP7316038B2 (en) Event prediction system, sensor signal processing system and program
Sebbak et al. Dempster–Shafer theory-based human activity recognition in smart home environments
Roy et al. A middleware framework for ambiguous context mediation in smart healthcare application
Morita et al. Health monitoring using smart home technologies: Scoping review
Huang et al. A semantic approach with decision support for safety service in smart home management
Bacciu et al. Smart environments and context-awareness for lifestyle management in a healthy active ageing framework
Buchmayr et al. A survey on situation-aware ambient intelligence systems
JP2022543082A (en) Systems and methods for analyzing patient health by monitoring energy usage
Agarwal Weighted support vector regression approach for remote healthcare monitoring
Fatima et al. Analysis and effects of smart home dataset characteristics for daily life activity recognition
WO2023112757A1 (en) Sensing system, sensing device, and sensing method
Ji et al. A Systematic Review of Sensing Technology in Human-Building Interaction Research
Chiriac et al. Towards combining validation concepts for short and long-term ambient health monitoring
Echeverría et al. A semantic framework for continuous u-health services provisioning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22907285

Country of ref document: EP

Kind code of ref document: A1