WO2016189202A1 - Monitoring system and method - Google Patents

Monitoring system and method

Info

Publication number
WO2016189202A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing device
data
person
basis
monitored
Prior art date
Application number
PCT/FI2016/050356
Other languages
English (en)
Other versions
WO2016189202A8 (fr)
Inventor
Sami Nurmela
Pasi Nurmela
Original Assignee
Seniortek Oy
Priority date
Filing date
Publication date
Application filed by Seniortek Oy filed Critical Seniortek Oy
Publication of WO2016189202A1
Publication of WO2016189202A8

Classifications

    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G08B21/0423 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons, based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
    • G08B21/0469 Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
    • G08B21/0476 Cameras to detect unsafe condition, e.g. video cameras
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/0507 Detecting, measuring or recording for diagnosis using microwaves or terahertz waves
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1115 Monitoring leaving of a patient support, e.g. a bed or a wheelchair
    • A61B5/1116 Determining posture transitions
    • A61B5/1117 Fall detection
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V40/174 Facial expression recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G16H40/63 ICT specially adapted for the management or operation of medical equipment or devices for local operation

Definitions

  • The invention relates to a monitoring system and a monitoring method.
  • The movement of people suffering from dementia and mental disorders can be monitored by means of a wristband, for example, fastened around the wrist and wirelessly connected by radio path with access control base stations attached to a building.
  • A display in the system control room may show a current state, and the system may give an alarm if the person takes the wristband off or tries to leave the monitored or allowed area without permission.
  • This type of monitoring makes it possible to give an alarm and provide help if a person remains in place for an unusually long time due to a fall or an illness, for example, or moves in a manner differing from what has been previously agreed.
  • A problem with the above is that the person can only be helped after the occurrence of a problematic event, and the alarm is not given or the help does not arrive immediately, but after a delay. There is therefore a need to improve monitoring.
  • An object of the invention is to provide an improved solution. This is achieved by a monitoring system according to claim 1.
  • The invention also relates to a monitoring method according to claim 14.
  • Figure 1 shows an example of a monitoring system.
  • Figure 2 shows an example of a non-image-forming system attachable to the monitoring system.
  • Figure 3 shows a second example of a non-image-forming system attachable to the monitoring system.
  • Figure 4 shows a third example of a non-image-forming system attachable to the monitoring system.
  • Figure 5 shows an example of a flowchart of the monitoring method.
  • Figure 1 shows an example of a preventive monitoring system with which a person 100 is monitored within a monitored area 102.
  • The monitored area 102 may be inside a building or outside. There may be several monitored areas 102 and persons 100, and one monitored area 102 may have one or more monitored persons.
  • The monitoring system comprises at least one image-forming measuring device 104, 106. Each image-forming device 104, 106 is directed to the monitored area 102 and produces image data from it.
  • The image-forming device 104 may comprise a video camera, thermographic camera and/or imaging radar, for example.
  • An imaging radar can transmit and receive electromagnetic UWB (ultra-wide band) pulses, from the reflection of which the imaging radar can form image data.
  • The UWB pulses may pass through a person, walls, and furniture.
  • The UWB pulses can form an image at different distances and from different directions.
  • The UWB reflections may also yield heartbeat data and/or breathing data of a person 100, for instance.
  • The image-forming device 104 can form an image on its detector matrix by optical radiation.
  • The image-forming device 104 can form an image on its detector matrix by visible light and/or infrared light.
  • A processing device 108 receives image data from each measuring device 104, 106 and measures, on the basis of the image data, at least one of the following features of a monitored person 100 in the monitored area 102: the movement and position of one or more body parts, whereby the processing device 108 generates measuring data.
  • The processing device 108 that receives image data from at least one measuring device 104, 106 may comprise one or more processors 1081, one or more memories 1082 and one or more suitable computer programs for processing the image data.
  • The processing device 108 may also control one or more measuring devices 104, 106.
  • The processing device 108 may also receive other data.
  • The processing device 108 stores measuring data on a monitored person 100.
  • The processing device 108 compares sets of measuring data stored at different times with each other and generates change data on the basis of differences in features present in the comparison.
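The comparison step above can be sketched as follows. This is a minimal illustration, assuming each stored measuring-data set is a name-to-value dictionary of measured features; all feature names and figures are hypothetical, not taken from the publication.

```python
def generate_change_data(earlier: dict, later: dict) -> dict:
    """Return the per-feature difference between two stored measuring-data sets.

    Features present in only one set are ignored, since no change can be
    computed for them.
    """
    common = earlier.keys() & later.keys()
    return {feature: later[feature] - earlier[feature] for feature in common}

# Example: step length shortened and sway increased between two measurements.
monday = {"step_length_cm": 52.0, "sway_amplitude_cm": 2.1}
friday = {"step_length_cm": 47.5, "sway_amplitude_cm": 3.4}
change = generate_change_data(monday, friday)
```

The resulting change data (here a negative step-length difference and a positive sway difference) is what the processing device would pass on to the user interface for display.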
  • The processing device 108 controls a user interface 110 to display at least said change data.
  • The user interface 110 may be a display, for instance.
  • The display may be of liquid-crystal- or LED-based technology.
  • The change data may be presented in graphical format, in alphanumeric table format, as one or more freeze-frames, video images or the like.
  • Instead of or in addition to alphanumeric presentation, the presentation may also comprise other characters.
  • The characters may comprise logograms, abjads or hyphens.
  • The processing device 108 and user interface 110 may be separate from the monitored area 102.
  • The processing device 108 and user interface 110 may reside in a separate control room, where the user interface 110 is monitored and used by the nursing staff.
  • The persons 100 being monitored are usually prevented from using the user interface 110, and the monitored persons 100 usually do not have access to the control room, at least without a member of the nursing staff.
  • The control room and monitored area 102 may reside in the same building or in connection with the same building.
  • A measurable feature may refer to the position or movement of a person 100.
  • The processing device 108 may store measuring data on at least one measured movement or position.
  • The processing device 108 may measure at least one movement made by the monitored person 100 in the monitored area 102.
  • The processing device 108 may store measuring data on at least one measured movement.
  • The processing device 108 may generate change data on the basis of differences in movement in the measuring data.
  • The movement of a body part may refer to its path.
  • The movement of a body part may refer to the movement of the hand, wrist, elbow, shoulder, leg, head, shoulders, pelvis, eyes, face when making expressions, mouth, the entire body or the like.
  • The movement of a body part may also refer to the movement of one body part in relation to another.
  • The movement of a body part may then be the relative movement of the feet, for instance. This movement can be measured by the length of a step, for example.
  • A movement of the body also refers to the outcome resulting from the movement. Other measuring values may include the speed, acceleration, path length and/or rotational angle of a body part, for instance.
  • The differing of features from a predefined value or characteristic may be displayed on the user interface 110, and on the basis of what is shown, the nursing staff may initiate action to correct the difference or to reduce problems caused by the difference.
  • The position may refer to the posture: the upright position or straightness of the entire body, the slump of the entire body, the straightness or bend of the knees or the like.
  • In an upright position, the posture is good: the back is straight in a correct manner and the knees are also straight. In a slumped position, the back bends forward and the person is stooped.
  • The knees may also be at least slightly bent.
  • The position may refer to the upright position, sitting position, lying-down position or a combination thereof. A slump and/or the bending of knees may mean that the person 100 may fall or drop things in the future, which is good to know so that the nursing staff may try to prevent it in advance.
  • The movement of one or more body parts may also refer to swaying or trembling.
  • The entire body may, in a stationary position, sway back and forth.
  • Excessive swaying may be a sign that the person 100 may fall or drop things in the future, which is good to know so that the nursing staff may try to prevent it in advance. Swaying may be so extensive that the person 100 may need to take a balancing step or steps.
  • The hand, hands, leg, legs and/or head of the person 100 may tremble. Trembling may also foretell a problem that the nursing staff may try to prevent in advance, when the information based on the measuring is available.
  • The nursing staff can offer human help to the person 100 in problematic situations, provide auxiliary devices for performing problematic things, remove problematic substances, devices or equipment from the reach of the person 100, provide physiotherapy and/or offer medication or other medical assistance.
  • An auxiliary device may be a rollator, bottle opener or the like.
  • Problematic substances, devices, and equipment may comprise chemicals, such as detergents; electric devices, such as electric tools (e.g. a drill); and/or sharp objects, such as knives.
  • The monitoring system may comprise as measuring devices at least two video cameras that are directed to the monitored area 102 from diagonal directions. When an image is formed from two diagonal directions, each measured feature can be better detected: the measured feature is then clearly visible from the view angle of at least one video camera 104, 106. In addition, possible pieces of furniture or other structures blocking visibility will probably not obscure the view angle of both video cameras at the same time. These advantages apply in general whenever more than one measuring device 104, 106 is used, whether video cameras or not.
  • The processing device 108 may compare the first and second sets of measuring data, in which the first measuring data is generated time-wise before the second measuring data, and generate change data on the basis of the differences in the compared sets of measuring data. This way, it is possible to see how one or more features measured of the person 100 develop over time. For instance, it is possible to monitor the development of the length of a step. It is also possible to monitor the change in the shaking of a hand after starting medication for Parkinson's disease.
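One way to quantify how a feature develops over successive measuring sessions is a least-squares trend line; the sketch below is a hypothetical illustration (a simple linear fit over session indices, with made-up step-length values), not the method prescribed by the publication.

```python
def trend_per_session(values: list[float]) -> float:
    """Least-squares slope of the values against session index 0, 1, 2, ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical step lengths (cm) measured at four consecutive sessions.
step_lengths = [52.0, 51.0, 50.0, 49.0]
slope = trend_per_session(step_lengths)  # negative slope: steps are shortening
```

A steadily negative slope for step length, for example, would be the kind of development the nursing staff could react to before a fall occurs.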
  • The processing device 108 may receive data on a procedure done on the monitored person 100.
  • The processing device 108 may compare said first measuring data generated before said procedure and said second measuring data generated after the procedure. This way, it is possible to see how one or more features of the person 100 change or develop under the effect of the procedure. If more procedures are done on the person 100, the effect of each of them on the person 100 may be monitored in this manner.
  • The procedure may refer to a change in medication, physiotherapy, diet, physical stress or the like. In general, the procedure may change factors affecting functioning, disability and health in the sense of the ICF classification (International Classification of Functioning, Disability and Health). A possible change in one or more of these factors can be monitored by using the features measured from the person 100.
  • The ICF classification can be used as a target in the measurement, in which case the procedure done on the person 100 aims at maintaining or achieving the desired ICF classification level.
  • The processing device 108 may receive data on a change in the procedure done on the monitored person 100.
  • The processing device 108 may compare said first measuring data generated before said change in the procedure and said second measuring data generated after the change in the procedure. This way, it is possible to see how one or more features measured of the person 100 change or develop under the effect of the change in the procedure.
  • The feature may be coded according to the ICF classification and processed and expressed according to the code.
  • The processing device 108 may receive at least one limit value related to a measured feature.
  • The limit value may lie on the boundary between an accepted characteristic of the feature and one differing from the accepted.
  • An accepted characteristic means that the staff will not initiate action to correct the characteristic.
  • A characteristic differing from the accepted means that the staff will initiate action to return the characteristic to the accepted.
  • The processing device 108 may compare said at least one limit value and stored measuring data for the purpose of performing control action. Control action means that the processing device 108 controls the user interface 110 to present information on exceeding the limit value to the staff, if the limit value is exceeded.
  • At least one limit value may be stored by the nursing staff into the memory of the processing device 108. Said at least one limit value may in one way or another be related to the functioning and/or risks of the person 100.
  • The user interface 110 may indicate that the limit value has been exceeded with a signal light, which may be personal, in writing or in some other corresponding manner.
  • The limit value is exceeded if the characteristic or value of the feature shifts from the acceptable to the differing from acceptable.
  • The nursing staff may then initiate procedures on the person 100 to correct or relieve the situation.
  • The limit value is correspondingly crossed if the characteristic or value of the feature shifts from the differing from acceptable back to the acceptable.
  • The nursing staff may then note that the situation has corrected itself and act accordingly. Any action may then be reduced, or action related to the measured features stopped.
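The limit-value check described above works in both directions: a feature leaving the accepted range and a feature returning to it are both events the staff should see. A hedged sketch, with illustrative tremor-amplitude values that are not from the publication:

```python
def crossings(values: list[float], limit: float) -> list[tuple[int, str]]:
    """Return (index, direction) for each crossing of the limit value.

    'out' means the feature left the accepted range (staff should act);
    'back' means it returned to the accepted range (action may be reduced).
    """
    events = []
    for i in range(1, len(values)):
        if values[i - 1] <= limit < values[i]:
            events.append((i, "out"))
        elif values[i - 1] > limit >= values[i]:
            events.append((i, "back"))
    return events

# Hypothetical hand-tremor amplitudes (cm) over successive measurements.
tremor = [0.8, 0.9, 1.4, 1.6, 1.1, 0.7]
alerts = crossings(tremor, limit=1.0)
```

Each returned event is what would trigger the control action, i.e. the processing device driving the user interface to show the staff that the limit was crossed.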
  • The monitoring system comprises an alarm device 150.
  • The alarm device 150 may be a fixed device in a control room, and it may be part of the user interface 110 or a separate device.
  • The alarm device 150 may be part of a portable electronic device, such as a user terminal, mobile phone, computer, tablet or the like.
  • The portable alarm device 150 may belong to one or more persons in the nursing staff and/or a person close to the monitored person 100, such as a relative or friend.
  • The processing device 108 may control the alarm device 150 to give an alarm.
  • The processing device 108 may learn a characteristic property of each measured feature of the monitored person 100.
  • The characteristic property may, thus, be personal.
  • A characteristic property may refer to an average or typical value or magnitude of the feature during the measuring period.
  • The measuring period may be a fixed or flexible measuring window.
  • The processing device 108 may set a limit value related to each measured feature on the basis of said characteristic property.
  • The processing device 108 may compare said at least one limit value and stored measuring data for the purpose of performing control action.
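The learning step above can be sketched as an average over a sliding measuring window, from which a personal limit is derived with a tolerance margin. The window size, the margin, and the sample values below are assumptions for illustration, not parameters from the publication.

```python
from collections import deque

class FeatureLearner:
    """Learns a personal characteristic value of one measured feature."""

    def __init__(self, window: int = 5, margin: float = 0.2):
        self.samples = deque(maxlen=window)  # flexible measuring window
        self.margin = margin                 # assumed relative tolerance

    def observe(self, value: float) -> None:
        self.samples.append(value)

    def characteristic(self) -> float:
        """Average of the feature over the current measuring window."""
        return sum(self.samples) / len(self.samples)

    def limit_exceeded(self, value: float) -> bool:
        """True if a new measurement deviates beyond the personal limit."""
        c = self.characteristic()
        return abs(value - c) > self.margin * c

learner = FeatureLearner()
for v in [50.0, 51.0, 49.0, 50.0, 50.0]:  # hypothetical step lengths, cm
    learner.observe(v)
```

Because the limit is derived from the person's own history, the same deviation can be significant for one person and normal for another, which matches the "personal" characteristic property described above.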
  • The processing device 108 may control at least one device 112, 114, 116 acting on the monitored area 102, if the limit value has been exceeded.
  • The at least one device 112, 114, 116 may comprise one or more lighting fixtures, one or more blinds, air-conditioning, a temperature controller or the like.
  • The processing device 108 need not necessarily determine the emotional state of the person 100; instead, pre-set limit values may be set on the basis of what people think the features of a sad or depressed person 100 look like. Limit values related to other emotional states can be set correspondingly.
  • The limit values may be based on the Geriatric Depression Scale (GDS) or Georgia scale.
  • Increasing (decreasing) air-conditioning, either by increasing (decreasing) the power of the air-conditioning device or opening (closing) a window, can also change the measured features of the monitored person 100. Further, a change in the temperature control or in music may also change the measured features of the monitored person 100.
  • In an embodiment, the processing device 108 determines the emotional state of the person 100 on the basis of measuring data. In an embodiment, the processing device 108 determines the emotional state of the person 100 on the basis of change data.
  • The processing device 108 may receive at least one new limit value related to at least one measured feature as the procedures directed to the person change or as said at least one feature of the person changes on the basis of the measured change data.
  • The change in procedure may be a change in medication (increasing a drug, decreasing a drug, starting a drug, stopping a drug), physiotherapy (starting, stopping, increasing, decreasing) or the setting or removal of a cast on a leg, for instance.
  • The processing device 108 may detect, on the basis of the measuring data, at least one predefined, intentionally produced characteristic of a feature.
  • The predefined characteristic may be a predefined movement, for instance.
  • The predefined movement may be a predefined movement of a hand, the head and/or the face. This way, it is possible to perform a control based on motion detection, for example.
  • The processing device 108 may then control at least one device 112, 114, 116 in the monitored area 102 on the basis of the detected predefined property.
  • A disabled person 100 may open a water tap, for instance, with one or more predefined movements of one or more body parts. Similarly, he or she may close it with one or more similar or different predefined movements.
  • The movement may comprise hand movements or facial movements such as expressions, for instance.
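The dispatch from a recognised predefined movement to a device action can be sketched as below. The gesture names and the water-tap interface are hypothetical; the publication only says that predefined movements may control devices in the monitored area.

```python
class WaterTap:
    """Stand-in for a controllable device in the monitored area."""

    def __init__(self):
        self.open = False

    def set_open(self, state: bool) -> None:
        self.open = state

def handle_gesture(gesture: str, tap: WaterTap) -> None:
    """Map a recognised predefined movement to a device action."""
    actions = {
        "hand_wave_right": lambda: tap.set_open(True),   # open the tap
        "hand_wave_left": lambda: tap.set_open(False),   # close the tap
    }
    action = actions.get(gesture)
    if action is not None:
        action()  # unrecognised gestures are ignored

tap = WaterTap()
handle_gesture("hand_wave_right", tap)  # tap is now open
```

In a real system the gesture recogniser, not a string, would drive this dispatch; the table-of-actions shape is only one plausible way to bind predefined movements to devices 112, 114, 116.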
  • The processing device 108 may determine the functioning of the person 100 on the basis of measuring data. The measuring data may then be measured repeatedly.
  • The functioning of a person may be defined according to the ICF classification and/or according to different geriatric evaluation principles. Functioning may also or alternatively be defined by case, facility or country.
  • The processing device 108 may define the functioning of the person 100 several times on the basis of motion data measured and stored of one and the same movement.
  • The processing device 108 may generate forecast data for forecasting the development of the person's 100 features on the basis of the change data.
  • The processing device 108 may control a user interface 110 to display the forecast data.
  • The forecast data forecasts the development related to the person's 100 movement.
  • The forecast data may be presented in graphical format, in alphanumeric table format, as one or more freeze-frames, video images or the like. Instead of or in addition to alphanumeric presentation, the presentation may also comprise other characters.
  • The characters may comprise logograms, abjads or hyphens.
  • The processing device 108 may determine the current location of the person 100 on the basis of said image data.
  • The location data may relate to the interior of a room or to room-specific data in a flat. Outside, the location data may relate to a location inside a monitored external area 102.
  • The processing device 108 may determine the time of stay of the person 100 at a defined location on the basis of said image data. If the person 100 stays in the toilet, for instance, for longer than a predefined time, the processing device 108 may control the user interface to display information on too long a stay. This way, the nursing staff is aware of the matter and may take action. In addition, the processing device 108 may control the alarm device to give an alarm due to a longer than predefined stay in the toilet. The time of stay can be measured in any space instead of the toilet, and the information can be displayed and an alarm given in a corresponding manner.
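The time-of-stay check reduces to comparing the elapsed time at the current location against a per-location threshold. The threshold values and location names below are illustrative assumptions only.

```python
# Assumed per-location stay limits in seconds (not values from the publication).
STAY_LIMITS_S = {"toilet": 20 * 60, "hallway": 10 * 60}

def stay_alert(location: str, entered_at: float, now: float) -> bool:
    """True if the person has stayed at the location longer than allowed."""
    limit = STAY_LIMITS_S.get(location)
    if limit is None:
        return False  # no limit configured for this location
    return (now - entered_at) > limit

# Entered the toilet at t=0; checking 25 minutes later exceeds the 20-minute limit.
alarm = stay_alert("toilet", entered_at=0.0, now=25 * 60)
```

A true result is what would drive both the user-interface notification and, optionally, the alarm device 150.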
  • The processing device 108 may receive external data.
  • External data may be received from a national patient information register, for instance.
  • External data may be received from an electronic device 170 that the person carries.
  • The electronic device 170 carried by the person may be a mobile phone and/or an activity bracelet, for example.
  • The monitoring system may also direct at least two non-overlapping beams 212, 214 to the monitored area 102 by using at least two motion detectors 208, 210 that are fastened to the building and do not generate images.
  • The monitored area 102 may be a room or flat with a place for sleeping 202, for example.
  • At least one beam 212, 214 of said at least two non-overlapping beams 212, 214 extends, with a width of at most tens of centimetres, across the monitored room. The movement of each person can be indicated in at least two locations 216, 218 defined by the beams 212, 214.
  • The processing unit 108 may detect the movement of each person within a predefined time without giving an alarm with the alarm device 150, if the locations of the person where the movement is indicated form a succession of predefined consecutive locations 216, 218 from a predefined initial location 10 to a predefined destination location 20, which may be the toilet, for example. Otherwise, the processing unit 108 will control the alarm means 150 to give an alarm.
  • The initial location 10 and destination location 20 are within the monitored area 102. This solution is described in more detail in patent publication US 8,026,820.
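A hedged sketch of the route check just described: an alarm is suppressed only if movement is indicated at the predefined consecutive locations, in order, from the initial location to the destination, within a predefined time. The route encoding, location names and time limit are illustrative assumptions.

```python
def needs_alarm(detections: list[tuple[float, str]],
                route: list[str],
                time_limit_s: float) -> bool:
    """True if the alarm device should be triggered.

    detections: (timestamp_s, location) pairs in the order they occurred.
    """
    if not detections:
        return True  # no movement indicated at all
    places = [p for _, p in detections]
    elapsed = detections[-1][0] - detections[0][0]
    in_order = places == route
    return not (in_order and elapsed <= time_limit_s)

# Assumed route from initial location 10 via beam locations 216, 218 to the toilet 20.
ROUTE = ["initial_10", "location_216", "location_218", "toilet_20"]

ok_walk = [(0.0, "initial_10"), (20.0, "location_216"),
           (40.0, "location_218"), (55.0, "toilet_20")]
stalled = [(0.0, "initial_10"), (20.0, "location_216")]
```

The "stalled" pattern, where the person is indicated at some but not all consecutive locations, is exactly the case in which the processing unit controls the alarm means 150 to give an alarm.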
  • The monitoring system may also comprise first sensor means 302 for detecting with a beam 314 a movement that is higher than a first predetermined height (EK), second sensor means 304 for detecting with a beam 316, on a floor 310 of a building 90, a movement that is lower than a second predetermined height (TK), and third sensor means 306 for detecting an arrival to a sleeping place 202 and a departure from the sleeping place 202.
  • The processing unit 108 can control the alarm device 150 to give an alarm if a predetermined first delay, beginning at the arrival to the sleeping place 202, has elapsed before information about the departure from the sleeping place 202 has been received from the third sensor means 306.
  • The processing unit 108 can direct the alarm device 150 to give an alarm if the processing unit 108 has received no motion detections from either the first sensor means 302 or the second sensor means 304 during a predetermined second delay. This solution is described in more detail in patent publication FI 123399.
  • The monitoring system may also monitor the monitored person 100 in a flat, which in this example is the monitored area 102, by using a room sensor 404, door sensor 406, and bed sensor 306.
  • There may be other persons 100', 100", 100"' monitored in a similar manner in other rooms 102', 102" and possibly in the hallway.
  • The beam of the room sensor 404 covers the room entirely or almost entirely.
  • A near sensor 410 can also be used close to the bed 202 to monitor the movement of the person 100 in the room.
  • The beam 122 of the near sensor 410 is narrow, with a width H of less than 20 cm, for instance.
  • The beam 407 of the door sensor 406 is also narrow, similar to the near sensor 410 beam 122.
  • The door sensor 406 is in the immediate vicinity of the door 420 or in the doorway. All these sensors 404, 406, 306, 410 may be non-image-forming motion detectors.
  • the monitoring system may comprise an alarm unit 150.
  • a processing unit 108 may control an alarm device 150 to give an alarm, if the room sensor 404 does not detect movement in the flat within a time period longer than a room delay, and if there are no detections by a door sensor 406 and bed sensor 306 during said room delay.
  • the processing unit 108 may set a rest delay, which is longer than the room delay, if the bed sensor 306 has detected a presence in the bed 202.
  • during the rest delay, the processing unit 108 does not control the alarm unit 150 to give an alarm, even though the room sensor 404 does not detect movement for a time longer than the room delay.
  • the processing unit 108 may shift to the room delay or control the alarm unit 150 to give an alarm after the end of the rest delay, if it takes a longer time than the room delay to detect movement in the flat.
  • the processing unit 108 may start, after a previous check-up delay has ended, a check-up delay that is shorter than the room delay, during which the processing unit 108 does not control the alarm unit 150 to give an alarm.
  • the processing unit 108 may, during a time outside the rest delay, control the alarm unit 150 to give an alarm, if the room sensor 404 does not detect movement in the flat during the check-up delay, or the bed sensor 306 does not detect a presence in the bed 202.
  • the processing unit 108 may end the check-up delay and start the room delay, if movement is detected in the flat during the check-up delay without detection of movement by the door sensor 406 at a time outside the rest delay.
  • the processing unit 108 may control during the rest delay the alarm unit 150 to give an alarm, if the room sensor 404 has detected movement in the flat during the check-up delay without a detection by the door sensor 406.
  • the processing unit 108 may end the check-up delay, if the door sensor 406 has carried out detection during a check-up delay within the rest delay. This solution is described in more detail in patent publication FI 123909.
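The core of the room-delay and rest-delay interaction above can be sketched as a single predicate. This is a simplified illustration under invented names and example values; the full check-up-delay state machine of FI 123909 is not reproduced:

```python
# Hypothetical, simplified sketch of the room-delay / rest-delay rule above.
ROOM_DELAY = 3 * 3600   # example value only, in seconds
REST_DELAY = 10 * 3600  # longer than the room delay, set on presence in bed

def should_alarm(now, last_room_motion, last_door_event, last_bed_event,
                 in_bed_since):
    """Return True when the basic room-delay rule fires.

    During the rest delay (presence detected in the bed) the room-delay
    alarm is suppressed; outside it, silence from the room sensor, door
    sensor, and bed sensor for longer than the room delay raises an alarm.
    """
    # Rest delay: presence in bed suppresses the room-delay alarm.
    if in_bed_since is not None and now - in_bed_since < REST_DELAY:
        return False
    # Room delay: alarm if no room movement and no door or bed detections.
    events = [t for t in (last_room_motion, last_door_event, last_bed_event)
              if t is not None]
    last_event = max(events) if events else 0
    return now - last_event > ROOM_DELAY
```

A caller would evaluate this predicate periodically and drive the alarm unit from its result.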
  • FIG. 5 shows an example of a flowchart of the monitoring method.
  • image data is produced by at least one image-forming measuring device 104, 106 that is directed to the monitored area 102.
  • in step 502, on the basis of the image data, at least one of the following features of the monitored person 100 is measured by the processing device 108 in the monitored area 102: the movement and position of one or more body parts.
  • in step 504, measuring data on the monitored person 100 is stored in the processing device 108.
  • sets of measuring data stored at different times are compared with each other in the processing device 108 to form differences in features.
  • change data is generated on the basis of the differences in features in the comparison.
  • in step 510, said change data is displayed on the user interface 110.
  • the monitoring system comprises at least one processor and a memory containing a computer program
  • the computer program and the memory can, together with said at least one processor, cause the monitoring system to execute the steps referred to in the method.
  • the computer program may be placed on a computer program distribution means for the distribution thereof.
  • the computer program distribution means is readable with a data processing device, and it may encode computer program commands for performing the operation of the processing device 108.
  • the distribution means may be a solution known per se for distributing a computer program, for instance a computer-readable medium, a program storage medium, a computer-readable memory, a computer- readable software distribution package, a computer-readable signal, a computer-readable telecommunication signal or a computer-readable compressed software package.
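The measure–store–compare–display flow of the method above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the feature extraction is stubbed out, and a "feature set" is just a list of numbers standing in for measured body-part positions and movements:

```python
# Hypothetical sketch of the monitoring method's flowchart steps above.
stored_measurements = []  # measuring data stored at different times

def measure_features(image_data):
    """Stand-in for measuring features from image data (step 502)."""
    return [float(x) for x in image_data]

def process_frame(image_data):
    """Measure, store, compare, and generate change data."""
    features = measure_features(image_data)
    stored_measurements.append(features)  # step 504: store measuring data
    if len(stored_measurements) < 2:
        return None
    # Compare sets of measuring data stored at different times and form
    # the differences in features; the differences become the change data.
    oldest, newest = stored_measurements[0], stored_measurements[-1]
    return [new - old for new, old in zip(newest, oldest)]

process_frame([0.0, 1.0])
change = process_frame([0.5, 0.8])  # change data for the user interface (step 510)
```

The real processing device would compare many stored sets over longer spans; the pairwise difference here only shows where the change data comes from.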

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Social Psychology (AREA)
  • Dentistry (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Psychology (AREA)
  • Emergency Management (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Primary Health Care (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to a monitoring system for monitoring a person (100) to be monitored in a monitored area (102). At least one image-forming measuring device (104, 106) is directed to the monitored area (102) and produces image data of the monitored area (102). A processing device (108) measures, on the basis of said image data, at least one of the following features of the monitored person (100) in the monitored area (102): the movement and position of one or more body parts. The processing device (108) stores measuring data of the monitored person (100). The processing device (108) compares sets of measuring data stored at different times with each other, and generates change data on the basis of the differences in features present in the comparison. A user interface (110) displays at least said change data.
PCT/FI2016/050356 2015-05-26 2016-05-25 Système et procédé de surveillance WO2016189202A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20155392 2015-05-26
FI20155392A FI126359B (fi) 2015-05-26 2015-05-26 Valvontajärjestelmä ja valvontamenetelmä

Publications (2)

Publication Number Publication Date
WO2016189202A1 true WO2016189202A1 (fr) 2016-12-01
WO2016189202A8 WO2016189202A8 (fr) 2017-02-09

Family

ID=57179516

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2016/050356 WO2016189202A1 (fr) 2015-05-26 2016-05-25 Système et procédé de surveillance

Country Status (2)

Country Link
FI (1) FI126359B (fr)
WO (1) WO2016189202A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111552281A (zh) * 2020-04-13 2020-08-18 程国军 一种智能耕种系统及其装置
CN108652637B (zh) * 2018-06-30 2024-04-12 源珈力医疗器材国际贸易(上海)有限公司 一种穿戴式跌倒预测防护系统及其预测方法

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008064431A1 (fr) * 2006-12-01 2008-06-05 Latrobe University Procédé et système de surveillance des changements d'état émotionnel
US20090119843A1 (en) * 2007-11-12 2009-05-14 Valence Broadband, Inc. Monitoring patient support exiting and initiating response
US20110295583A1 (en) * 2010-05-27 2011-12-01 Infrared Integrated Systems Limited Monitoring changes in behavior of a human subject
US20120075464A1 (en) * 2010-09-23 2012-03-29 Stryker Corporation Video monitoring system
US20120161969A1 (en) * 2009-09-03 2012-06-28 Koninklijke Philips Electronics N.V. Consciousness monitoring
US20120313785A1 (en) * 2011-04-04 2012-12-13 Alarm.Com Medication management and reporting technology
US20140198954A1 (en) * 2011-07-28 2014-07-17 Adrian BULZACKI Systems and methods of detecting body movements using globally generated multi-dimensional gesture data
US20140253710A1 (en) * 2013-03-06 2014-09-11 Nk Works Co., Ltd. Information processing apparatus for watching, information processing method and non-transitory recording medium recorded with program
US20140371544A1 (en) * 2013-06-14 2014-12-18 Medtronic, Inc. Motion-based behavior identification for controlling therapy
US20150022338A1 (en) * 2013-07-17 2015-01-22 Vivint, Inc. Geo-location services
US20150109442A1 (en) * 2010-09-23 2015-04-23 Stryker Corporation Video monitoring system


Also Published As

Publication number Publication date
WO2016189202A8 (fr) 2017-02-09
FI126359B (fi) 2016-10-31
FI20155392A (fi) 2016-10-31

Similar Documents

Publication Publication Date Title
US20240081748A1 (en) Systems for automatic assessment of fall risk
JP6720961B2 (ja) 姿勢検知装置および姿勢検知方法
EP3525673B1 (fr) Procédé et appareil pour déterminer un risque de chute
ES2381712T3 (es) Sistema de detección de caídas
JP6150207B2 (ja) 監視システム
JP2011129120A (ja) 一群の個人の歩行特性を監視するためのシステム及び方法
JP7271915B2 (ja) 画像処理プログラムおよび画像処理装置
US10262517B2 (en) Real-time awareness of environmental hazards for fall prevention
EP3074961A1 (fr) Systèmes et procédés pour analyse d'activité de sujet
JP6720909B2 (ja) 行動検知装置、該方法および該プログラム、ならびに、被監視者監視装置
JP6983866B2 (ja) 転倒検出に関するデバイス、システム、及び方法
JP6852733B2 (ja) 生体監視装置及び生体監視方法
WO2019013257A1 (fr) Système d'aide à la surveillance et son procédé de commande, et programme
JP6048630B1 (ja) 行動検知装置および行動検知方法ならびに被監視者監視装置
WO2016189202A1 (fr) Système et procédé de surveillance
US20180322334A1 (en) Person Monitoring Device And Method, And Person Monitoring System
EP3819864A1 (fr) Programme de détection d'objet cible et dispositif de détection d'objet cible
JP6908028B2 (ja) 被監視者監視装置、該方法、該システムおよびプログラム
JP7047945B2 (ja) 情報処理装置、情報処理方法、及びプログラム
JP6115689B1 (ja) 転倒検知装置および転倒検知方法ならびに被監視者監視装置
JP7342863B2 (ja) コンピュータで実行されるプログラム、情報処理システム、および、コンピュータで実行される方法
Spournias et al. Smart health monitoring using AI techniques in AAL environments
JP2020190889A (ja) 要介護者見守りシステム
JP2021033379A (ja) 画像処理システム、画像処理プログラム、および画像処理方法
JP7215481B2 (ja) コンピューターで実行されるプログラム、情報処理装置、および、コンピューターで実行される方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16799416

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16799416

Country of ref document: EP

Kind code of ref document: A1