EP1566782A1 - Method and device for monitoring elderly people at home - Google Patents

Method and device for monitoring elderly people at home. Download PDF

Info

Publication number
EP1566782A1
EP1566782A1 (application EP05290367A)
Authority
EP
European Patent Office
Prior art keywords
person
monitored
environment
processor
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP05290367A
Other languages
English (en)
French (fr)
Other versions
EP1566782B1 (de)
Inventor
Delphine Guegan-Bourgoin
Sylvie Jumperts
Frédéric Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
France Telecom SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom SA
Priority to PL05290367T (PL1566782T3)
Publication of EP1566782A1
Application granted
Publication of EP1566782B1
Legal status: Active

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0453 Sensor means for detecting worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • G08B21/0446 Sensor means for detecting worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait

Definitions

  • The present invention relates to a method and a device for the secure home care of at least one person moving about in a predetermined environment.
  • The invention lies in the field of the secure home care of a person, for example a person with reduced autonomy living at home.
  • The object of the invention is to overcome the drawbacks of the prior art by providing a method and a device for the secure home care of at least one person moving about in a predetermined environment, such as a dwelling, in which a communication device is brought close to the person to be monitored whenever necessary.
  • The invention also aims to solve the problem of hidden zones in the environment in which the person to be monitored lives, by making it possible to command the movement of a communication device comprising image capture means.
  • To that end, the invention proposes a method for the secure home care of at least one person moving about in a predetermined environment, characterized in that a device able to move in the environment in which the person to be monitored lives includes communication means, and the method includes the steps of: receiving at least one event from a group of predetermined events; determining the position of the person to be monitored in the environment in which that person moves; commanding the movement of the mobile device to the determined position; and establishing a communication with a remote processing device or with another person through the communication means.
  • Correspondingly, the invention relates to a server for the secure home care of at least one person moving about in a predetermined environment, characterized in that a device able to move in the environment in which the person to be monitored lives includes communication means, and the server comprises: means for receiving at least one event from a group of predetermined events; means for determining the position of the person to be monitored in the environment in which that person moves; means for commanding the movement of the mobile device to the determined position; and means for establishing a communication with a remote processing device or with another person through the communication means.
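The four claimed steps can be pictured as a small event handler. This is an illustrative sketch only, not the patented implementation; all names (`Event`, `locate_person`, `move_robot`, `open_channel`) are hypothetical stand-ins for the server's modules.

```python
# Sketch of the claimed sequence: receive a predetermined event, locate the
# person, command the robot to that position, then open communication.
from dataclasses import dataclass

EVENT_TYPES = {"fall", "physiological_check", "state_check", "call_request"}

@dataclass
class Event:
    kind: str

def handle_event(event, locate_person, move_robot, open_channel):
    """Run the claimed steps for one predetermined event."""
    if event.kind not in EVENT_TYPES:
        return None                      # ignore events outside the group
    room = locate_person()               # step 2: position in the dwelling
    move_robot(room)                     # step 3: command the mobile device
    return open_channel()                # step 4: establish communication

# Minimal stubs standing in for the server's real modules:
log = []
result = handle_event(
    Event("fall"),
    locate_person=lambda: "living room",
    move_robot=lambda room: log.append(("goto", room)),
    open_channel=lambda: "channel-open",
)
```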
  • Thus, the person to be monitored no longer has to carry a communication device permanently.
  • The position of the person to be monitored is determined automatically, and a device able to move in the environment in which the person to be monitored lives is commanded to go to the place where that person is located.
  • The group of predetermined events includes events representative of a detection of a fall of the person to be monitored, events representative of a check of the physiological data of the person to be monitored, events representative of a check of the state of the person to be monitored, and events representative of a request to establish a telephone call between the person to be monitored and another person.
  • The determination of the position of the person to be monitored in the environment in which she moves is made from sensors placed at different locations in that environment.
  • When the predetermined event is representative of a fall of the person to be monitored, the generation of a voice message is commanded inviting the establishment of a communication with another person selected from a group of people including at least the family of the person to be monitored, the doctor of the person to be monitored, and a helpline for people in danger.
  • When the predetermined event is representative of a check of the emotional state of the person to be monitored, the generation of a voice message inviting the establishment of a communication with a voice server is commanded, and/or the diffusion of a predetermined perfume by the communication device is commanded.
  • The voice server establishes a dialogue with the person to be monitored on themes depending on that person's emotional state.
  • At least one command is received from the communication device of the other person, and the or each command is transferred to the device able to move in the environment in which the person to be monitored lives.
  • The physiological data and the emotional state of the person to be monitored are obtained through measurements of that person's cutaneous electrical resistance and potential, skin temperature and blood microcirculation, as well as heart rate.
  • The device able to move in the environment in which the person to be monitored lives further comprises means for capturing and transferring images and/or means for diffusing perfumes.
  • The invention also relates to computer programs stored on an information medium, said programs including instructions for implementing the secure home care method described above when they are loaded into and executed by a computer system.
  • Fig. 1 represents the architecture of the secure home care system.
  • A device 180 able to move in the environment in which the person to be monitored lives, and including communication means, is placed in the home of a person to be monitored 120.
  • This device 180, subsequently called robot 180, is able to go to the person to be monitored 120 upon receipt of a predetermined event.
  • The dwelling of the person to be monitored is equipped with a plurality of presence detectors 130a and 130b placed in the various rooms of the dwelling. Only two detectors are represented for the sake of simplification; of course, a larger number of detectors 130 may be present in the dwelling to be monitored.
  • The detectors 130 make it possible to locate the person to be monitored 120 precisely in the dwelling, and thus to determine the room in which she is located.
  • The detectors 130 are acoustic detectors constituting a network of directional microphones, ultrasonic detectors or infrared detectors; they may even be replaced by a positioning system such as GPS when the person to be monitored moves about in large spaces. The detectors are distributed in the dwelling according to the number of rooms and their shape, so as to avoid uncovered areas or blind spots.
  • The detectors 130 transmit to the secure home care server 100 information enabling the latter to locate the person to be monitored in the dwelling.
  • In a variant, the person to be monitored 120 carries equipment allowing the robot 180 to locate her more precisely within a room of the dwelling.
  • A secure home care server 100 determines the path that the robot 180 must take to reach the person to be monitored.
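One simple way to determine such a path, sketched here purely for illustration, is to model the dwelling plan as a room-adjacency graph and run a breadth-first search from the robot's room to the person's room. The plan below is an invented example; the patent does not specify the path-finding technique.

```python
# Hypothetical path determination over a room-adjacency graph (BFS).
from collections import deque

DWELLING = {                      # room -> rooms reachable through a door
    "hall": ["kitchen", "living room"],
    "kitchen": ["hall"],
    "living room": ["hall", "bedroom"],
    "bedroom": ["living room"],
}

def path_to_person(start, target, plan=DWELLING):
    """Return the list of rooms from the robot to the person, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in plan[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

The room sequence returned would then be translated into the low-level movement instructions described further below.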
  • The person to be monitored 120 is equipped with a plurality of sensors. Only two groups of sensors 121 and 122 are represented for the sake of simplification; of course, a larger number of sensors may equip the person to be monitored 120. Among these sensors, a group of sensors 121 detects a fall of the person 120. These sensors are, for example and without limitation, inclination detectors, motion detectors or accelerometers.
  • A group of sensors 122 makes it possible to measure the physiological data of the person to be monitored 120.
  • These non-invasive sensors are, for example, devices measuring cutaneous electrical resistance, for example in the palm of the hand of the person to be monitored 120, or devices measuring her skin temperature, microcirculation or cardiac rhythm. Some of these sensors may be placed in the clothing of the person to be monitored 120, as described in the publication "Philips invents intelligent biomedical clothing for personal healthcare", October 8, 2003.
  • The information provided by these sensors 122 is used, among other things, in combination by the secure home care server 100 to determine the primary emotions felt by the person to be monitored 120, for example according to the system presented in the journal "Pour la Science", number 313, November 2003, "Emotion detector", or in the publication by E. Vernet-Maury, O. Robin and A. Dittmar, "Study of the emotional response to odor by non-invasive sensors", Biomedical Microsensors and Microsystems, UMR 5511 CNRS-LPM-INSA Lyon.
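A very rough sketch of how such measurements might be combined into a coarse emotional label is given below. The thresholds and the two-emotion rule are invented for illustration only; the classification described in the cited publications is far richer.

```python
# Illustrative combination of non-invasive cues into a primary-emotion label.
# Thresholds are hypothetical, not taken from the cited work.
def primary_emotion(skin_resistance_drop, heart_rate, skin_temp_delta):
    """Guess a coarse emotional label from three physiological cues."""
    # A sharp drop in skin resistance or a raised heart rate reads as arousal.
    aroused = skin_resistance_drop > 0.2 or heart_rate > 100
    if not aroused:
        return "calm"
    # Warmer skin with arousal read here as positive, colder as negative.
    return "joy" if skin_temp_delta >= 0 else "fear"
```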
  • The device 180 able to move in the environment in which the person to be monitored lives is, according to the present invention, a robot able to move in an environment such as the dwelling of the person to be monitored 120.
  • The robot 180 is, for example, a robot marketed by the company Wany Robotics® under the name Pekee®.
  • The robot 180 includes a control module 180a adapted to control the various elements of the robot 180.
  • The control module 180a controls, among other things, the movement means of the robot 180 as a function of commands received from the secure home care server 100, or as a function of information received from a module 180b measuring the distance between the robot 180 and possible obstacles that could hinder the robot's path.
  • The control module 180a receives from the secure home care server 100, via a network interface 180c and a wireless telecommunications network 190, various control signals or sound signals.
  • The control module 180a controls the movement of the robot 180 and/or the activation of the various sensors or transducers equipping the robot 180 according to these control signals.
  • The various sensors or transducers are, for example, an audiovisual capture device such as a camera 180d, and a sound card 180f comprising at least one loudspeaker and a microphone.
  • The network interface 180c and the sound card 180f comprising at least one loudspeaker and a microphone constitute the communication means included in the device able to move in the environment in which the person to be monitored 120 lives.
  • The control module 180a transfers to the secure home care server 100, through the network interface 180c and the wireless telecommunications network 190, images and/or image sequences and/or voice signals generated by the person to be monitored.
  • The control module 180a also controls a perfume diffusion device 180e, which diffuses perfumes in response to commands generated by the secure home care server 100.
  • The perfume diffusion device 180e is, for example, in accordance with the device described in French patent application FR 2823442, "Programmable fragrance diffusion system and method of implementing such a system".
  • The wireless telecommunications network 190 is, for example, a WiFi-type telecommunications network.
  • WiFi®, for "Wireless Fidelity", is the name of the 802.11 standard.
  • In a variant, the wireless telecommunications network 190 is a Bluetooth®-type network or a ZigBee®-type network.
  • The secure home care server 100 is a computer placed in the home of the person to be monitored.
  • The secure home care server 100 is able to exchange information with a communication device 110a, 110b or 110c via the telecommunications network 150.
  • The secure home care server 100 is also able to exchange information with the robot 180 through the telecommunications network 190.
  • The secure home care server 100 is also able to exchange information with a voice server 155 via the telecommunications network 190.
  • The secure home care server 100 is able to analyze and interpret simple voice commands issued by the person to be monitored, and to store predetermined voice messages transmitted to the robot 180 for reproduction to the person to be monitored.
  • The secure home care server 100 will be described in more detail with reference to Fig. 2.
  • The telecommunications network 150 is, for example, the Internet, to which communication devices 110 are connected: for example a doctor's computer 110a, a mobile telephone 110b of an assistance service, such as an SPV E200-type mobile telephone marketed by the company Orange®, or the mobile telephone of other persons such as family members of the person to be monitored 120.
  • The communication devices 110 may alternatively be connected to the telecommunications network 150 via a WiFi network or a GPRS-type cellular telephone network.
  • The communication devices 110 are able to receive an HTML-type page transmitted by the secure home care server 100 and to command the movement of the robot 180 when a call has been received, in order to communicate with the person to be monitored 120 or to make a diagnosis of the person to be monitored 120.
  • This page allows the robot 180 to be controlled by the user of a communication device 110 via the telecommunications networks 150 and 190 and the secure home care server 100.
  • This page is transmitted by the secure home care server 100 in the form of a PHP ("Hypertext Preprocessor") file.
  • The PHP file includes a declaration of the HTML document comprising, among other things: a table representing the various action buttons allowing control of the robot and the sending of the associated labels; information allowing the PHP script to recover the values corresponding to the direction, robot-control and speed labels; information allowing the transfer of the labels to the PHP module processing them; information defining the different values of the direction label according to the zone activated by the user; a table transmitting the different values of the robot-control label according to the screen zone activated by the user; a drop-down menu for selecting a speed value of the robot for the speed label; data validating the transfer of one or more commands to the secure home care server 100; and a label interpretation module and a robot control module allowing the sending of commands and command parameters interpretable by the robot 180.
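The label chain just described, from a label sent by the page to a command parameter and then to a command interpretable by the robot, can be sketched as follows. The label names, parameter values and command strings are invented for illustration; the patent does not define a concrete vocabulary.

```python
# Hypothetical label -> parameter -> robot-command translation.
LABEL_TO_PARAM = {            # interpretation module (labels -> parameters)
    "dir_left": ("turn", -90),
    "dir_right": ("turn", 90),
    "go": ("advance", 1),
    "back": ("advance", -1),
}

def to_robot_commands(labels, speed=1):
    """Translate page labels into low-level robot command strings."""
    commands = []
    for label in labels:
        action, value = LABEL_TO_PARAM[label]
        commands.append(f"{action} {value * speed}")
    return commands
```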
  • The voice server 155 comprises a dialogue module 156, a voice recognition module 157 and a voice synthesis module 158.
  • The voice server 155 is a server capable of interacting intelligently with the person to be monitored 120.
  • The voice server 155, and more particularly the dialogue module 156, is based on natural dialogue technology with an intelligent agent such as described in WO 0039672, "Model and method of implementation of a dialoguing rational agent, server and multi-agent system for implementation", or in the publications of D. Sadek, "Design Considerations on Dialogue Systems: From Theory to Technology - The Case of Artimis", Proceedings of the ESCA TR Workshop on Interactive Dialogue in Multimodal Systems (IDS'99), Germany, 1999, and of D. Sadek, P. Bretier and F. Panaget, "ARTIMIS: Natural dialogue meets rational agency", Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI'97), Nagoya, Japan, pages 1030 to 1035, 1997.
  • The database 159 includes information associated with the person to be monitored 120.
  • This database includes the areas of interest of the person to be monitored, such as painting, cinema or travel, information about the personal history of the person to be monitored 120, and information about the family of the person to be monitored 120.
  • Fig. 2 represents a block diagram of the secure home care server according to the present invention.
  • The secure home care server 100 is, for example, a computer placed in the home of the person to be monitored 120.
  • The secure home care server 100 includes a communication bus 301 to which are connected a central processing unit 300, a non-volatile memory 302, a random access memory 303, a database 106, two network interfaces 160 and 170, a screen 304 and a keyboard 305.
  • The non-volatile memory 302 stores the programs implementing the invention, which will be described later with reference to Figs. 3 to 8.
  • The non-volatile memory 302 is, for example, a hard disk. More generally, the programs according to the present invention are stored in a storage means. This storage means can be read by a computer or a microprocessor 300. The storage means may or may not be integrated into the secure home care server 100, and may be removable. When the secure home care server 100 is switched on, the programs that will be described later with reference to Figs. 3 to 8 are transferred to the RAM 303, which then contains the executable code of the invention as well as the data necessary for its implementation.
  • The non-volatile memory 302 stores predetermined voice messages which are transmitted to the communication means included in the robot 180 or to a communication device. The non-volatile memory 302 also stores software for recognizing voice commands made by the person to be monitored.
  • The secure home care server 100 also includes a screen 304 and a keyboard 305 serving as a human-machine interface with the user of the secure home care system according to the present invention. Through this human-machine interface, the user defines the plan of the dwelling of the person to be monitored 120, as well as the various points where the detectors 130 are placed.
  • The secure home care server 100 also includes a telecommunications network interface 160.
  • This interface consists, for example, of an ADSL-type modem able to communicate with a communication device 110 via the telecommunications network 150, or with the voice server 155, which analyzes the words of the person to be monitored 120 using the voice recognition module 157 and forms responses to the person to be monitored 120 from the dialogue module 156, the voice synthesis module 158 and the associated database 159.
  • The secure home care server 100 also includes a wireless telecommunications network interface 170.
  • This interface is, for example, a wireless radio interface complying with the 802.11 standard.
  • The database 106 stores the various physiological measurements recorded for the person to be monitored 120.
  • The secure home care server 100 includes presentation modules 101. These presentation modules 101 define the human-machine interface that will be reproduced by the communication device 110 used by the doctor, the emergency services or the family of the person to be monitored.
  • The secure home care server 100 includes a label reference module. This module makes it possible to associate predetermined information with a command of the robot 180 when the user uses the keyboard of a communication device 110 to perform that command.
  • The secure home care server 100 includes an interpretation module. This module makes it possible to associate at least one predetermined command parameter with each label processed by the label reference module or received from the communication device 110 via the telecommunications network 150.
  • The secure home care server 100 includes a robot control module. This module makes it possible to associate, with each command parameter determined by the interpretation module, one or more commands interpretable by the robot 180.
  • The robot control module is adapted to the type of robot 180 used in this invention.
  • The presentation modules, the label reference module, the interpretation module and the robot control module thus allow a robot 180 to be controlled from a conventional communication device.
  • In a variant, the secure home care server 100 is placed on a site remote from the dwelling of the person to be monitored 120.
  • The secure home care server 100 then manages a plurality of people to be monitored and controls the respective robots of the people to be monitored.
  • The exchange of information between the secure home care server 100 and the robots is then carried out through the telecommunications network 150, to which is connected a gateway placed in each dwelling of the people to be monitored.
  • Each gateway provides the transfer of information between the telecommunications network 150 and a wireless network, such as a WiFi® or ZigBee® network, connecting the gateway to the robot present in the dwelling.
  • Fig. 3 represents the master algorithm performed by the secure home care server.
  • The algorithm of Fig. 3 is the master algorithm that controls the activation of the various algorithms which will be described later with reference to Figs. 4 to 8. This algorithm is performed continuously by the processor 300 of the secure home care server 100.
  • In step E300 the processor 300 checks whether or not the person to be monitored 120 has fallen. For this, the processor 300 obtains the information measured by a sensor 121, which is for example a tilt sensor, compares it to measurements made previously and stored in the database 106, and if a significant difference is determined, the processor 300 considers that the person to be monitored 120 has fallen or is in an abnormal position. Of course, sensors such as accelerometers can also be used in place of, or in combination with, the inclination sensor. If the person to be monitored 120 has fallen or is in an abnormal position, the processor 300 proceeds to step E301.
  • In step E301 the processor 300 controls the activation of the algorithm for bringing the robot closer to the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 4. When the robot has approached the person to be monitored, the processor 300 proceeds to the next step E302.
  • In step E302 the processor 300 controls the activation of the algorithm for calling a help desk for the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 5. When the help desk call algorithm for the person to be monitored 120 is completed, the processor 300 returns to step E300 described above.
  • If, at the test of step E300, the person to be monitored 120 is considered to be in a normal position, the processor 300 proceeds to step E303.
  • In step E303 the processor 300 checks whether or not a call has been received.
  • This call is, for example, an incoming telephone call for the person to be monitored 120, or a call commanded by the person to be monitored 120 so that the robot 180 comes closer to her.
  • This call is, for example, made by the person to be monitored 120 by pressing a button on a control box that the person to be monitored carries on her, or by a predetermined voice command made by the person to be monitored 120, picked up by microphones placed at different places in the dwelling, retransmitted to the secure home care server 100 and interpreted by it. If a call is received, the processor 300 proceeds to step E304.
  • In step E304 the processor 300 controls the activation of the algorithm for bringing the robot 180 closer to the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 4. When the robot 180 has approached the person to be monitored 120, the processor 300 proceeds to the next step E305.
  • In step E305 the processor 300 controls the activation of the algorithm for providing services to the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 6. When the algorithm for providing services to the person to be monitored 120 is finished, the processor 300 goes on to the next step E306.
  • In step E306 the processor 300 checks whether or not the delay for performing physiological measurements on the person to be monitored has elapsed. This delay is, for example, configurable by the person to be monitored 120 or by her doctor. In fact, the secure home care server 100 periodically, for example every two hours, takes a reading of the various measurements made by the physiological sensors 122 placed on the person to be monitored or in her clothes. If less than two hours have elapsed since the last reading, the processor 300 proceeds to step E308. If at least two hours have passed since the last reading, the processor 300 proceeds to step E307.
  • In step E307 the processor 300 controls the activation of the algorithm for checking the physiological data of the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 7. When the algorithm for checking the physiological data of the person to be monitored 120 is completed, the processor 300 goes to the next step E308.
  • In step E308 the processor 300 checks whether or not the delay for checking the emotional state of the person to be monitored 120 has elapsed. This delay is, for example, configurable by the person to be monitored 120 or by her doctor. In fact, the secure home care server 100 periodically, for example every two hours, takes a reading of the various measurements made by the physiological sensors 122 placed on the person to be monitored 120 in order to determine her emotional state. If less than two hours have elapsed since the last reading, the processor 300 returns to step E300 and repeats the algorithm of Fig. 3. If at least two hours have elapsed since the last reading, the processor 300 proceeds to step E309.
  • In step E309 the processor 300 controls the activation of the algorithm for checking the emotional state of the person to be monitored 120. This algorithm will be described in more detail with reference to Fig. 8. When the algorithm for checking the emotional state of the person to be monitored 120 is over, the processor 300 returns to step E300 and repeats the algorithm of Fig. 3.
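The master loop of Fig. 3 can be condensed into a single step function, sketched below under the assumption of a two-hour period for both checks. The function names are placeholders for the algorithms of Figs. 4 to 8, not the patented code.

```python
# One pass through the Fig. 3 master loop (E300..E309), illustrative only.
PERIOD = 2 * 3600  # two hours, in seconds

def master_step(fallen, call_received, now, last_physio, last_emotion, acts):
    """Record which sub-algorithms fire; return updated reading timestamps."""
    if fallen:                       # E300 -> E301 (approach), E302 (helpdesk)
        acts.append("approach")
        acts.append("helpdesk")
        return last_physio, last_emotion
    if call_received:                # E303 -> E304 (approach), E305 (services)
        acts.append("approach")
        acts.append("services")
    if now - last_physio >= PERIOD:  # E306 -> E307 (physiological check)
        acts.append("physio_check")
        last_physio = now
    if now - last_emotion >= PERIOD: # E308 -> E309 (emotional-state check)
        acts.append("emotion_check")
        last_emotion = now
    return last_physio, last_emotion
```

A fall short-circuits the loop, matching the figure: the periodic checks only run when the person is in a normal position.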
  • Fig. 4 represents the algorithm, performed by the secure home care server, for bringing the robot closer to the person.
  • In step E400 the processor 300 obtains information from the various detectors 130 placed in the dwelling of the person to be monitored 120.
  • The detectors 130 transmit to the secure home care server 100 the information enabling the latter to locate the person to be monitored 120 in the dwelling, that is to say, to detect her presence in a room of the dwelling.
  • From this information the secure home care server 100 determines, in step E402, the position, and more precisely the room, in which the person to be monitored 120 is located.
  • The processor 300 determines in step E403, from the position determined in the previous step and from data representative of the dwelling, the path that the robot 180 must take to reach the person to be monitored 120.
  • The processor 300 proceeds to the next step E404 and transfers, via the telecommunications network 190, the various instructions allowing the robot 180 to move in the dwelling and reach the person to be monitored 120.
  • These instructions are, for example, a sequence of commands to advance in one direction, to change direction to the left or to the right, or to move back.
  • These instructions also include commands to change the speed of movement of the robot 180.
  • The various instructions further include an identifier of the robot, among the set of robots, that must perform the movement.
  • The processor 300 commands, at step E405, the robot 180 to activate the module 180b measuring the distance between the robot 180 and possible obstacles that could hinder the course of the robot 180.
  • The control module 180a modifies the path received from the secure home care server 100 so as to avoid the obstacles present on the received path.
  • The processor 300 also transfers to the robot 180 a command to activate infrared sensors placed on the robot 180.
  • The infrared sensors being sensitive to temperature, the robot 180 can thus determine the exact position of the person to be monitored in the room, move close to her, or even in some cases follow her movements.
  • In a variant, the person to be monitored 120 carries radio transmitter equipment allowing the robot 180 to locate her more precisely.
  • Fig. 5 represents the algorithm, performed by the secure home care server, for calling a help desk.
  • In step E500 the processor 300 controls the transfer of a voice message to the robot 180 via the telecommunications network 190.
  • This voice message is, for example, a message of the form "Mr. X, do you feel well, do you need help?". This voice message is stored in the database 106 of the secure home care server 100. On receipt of this message, the robot 180 reproduces it via the sound card 180f.
  • In step E501 the processor 300 checks whether the person to be monitored 120 has responded to the previously generated message. Indeed, the robot 180 captures, through the microphone of the sound card 180f, the sound signals emitted in its environment and transfers the sounds recorded by the microphone to the secure home care server 100 via the network interface 180c and the telecommunications network 190. The processor 300 analyzes the received signal and determines whether a voice signal is present in the recorded sounds. If not, the person to be monitored has not responded to the voice message, and the processor 300 then proceeds to step E502.
  • In step E502 the processor 300 makes a telephone call to a help desk for people in danger with a communication device 110.
  • This communication device is, for example, the communication device 110b of Fig. 1.
  • This call is made through the telecommunications network 150 or through a conventional telephone network.
  • In a variant, the doctor and/or the family of the person to be monitored are called in addition to, or instead of, the help desk for people in danger.
  • The processor 300 then proceeds with the transfer of an HTML ("Hypertext Markup Language") page to the communication device 110b.
  • This page allows the control of the robot 180. Via this page, a person at the help desk for people in danger may control the movements of the robot 180, view the various sensor readings or the images provided by the camera 180d, and thus evaluate the condition of the person to be monitored 120.
  • The control of the movement of the robot 180 is preferentially carried out via labels, which will be described later with reference to Fig. 9.
  • The processor 300 then controls the transfer of data to the communication device 110b.
  • These data are a history of the physiological data of the person to be monitored 120, a reading of the physiological sensors 122 at the time of the help desk call, the audible signals captured by the microphone of the sound card 180f and relayed to the secure home care server 100, as well as the images of the person to be monitored 120 captured by the camera 180d of the robot 180 and transmitted to the secure home care server 100.
  • This done, the processor 300 proceeds to the next step E505.
  • At step E505 the processor 300 enters a loop waiting for the receipt of one or more commands generated by the called party, for example a person at the help desk for people at risk, via the page previously sent.
  • When a command is received, the processor 300 proceeds to the next step E506.
  • The help desk operator can, by moving the robot around the person 120, make a first diagnosis and determine whether an intervention at the home of the person to be monitored is necessary.
  • At step E506 the processor 300 transfers the previously received command to the robot 180 via the telecommunication network 190. It should be noted that the received command may be transcribed into a language interpretable by the robot 180 prior to its transfer.
  • At step E507 the processor 300 checks whether the help desk for people at risk has stopped transmitting commands for a predetermined time. If so, the present algorithm stops; if not, the processor 300 returns to step E505 and processes the received command.
  • If at step E501 the person to be monitored 120 has responded to the message generated at step E500, the processor 300 proceeds to step E508.
  • At step E508 the processor 300 commands the generation by the robot 180 of a voice message.
  • This voice message is stored in the database 106 of the secure home care server 100.
  • This message is an invitation to place a telephone call to predetermined people, such as the family or the doctor of the person to be monitored 120.
  • The robot 180 plays this message back via the audio card 180f. Sound signals picked up by the audio card 180f of the robot 180 are transferred to the secure home care server 100 and are then analyzed. These sound signals are processed by the secure home care server with the aid of voice recognition software to determine whether the person to be monitored wishes a telephone communication to be established.
  • The processor 300 checks at step E509 whether the person to be monitored wishes a telephone call to be established and, if not, stops the present algorithm. If the person to be monitored wishes a telephone communication to be established, the processor 300 proceeds to step E510 and establishes a telephone call, through the telecommunication network 150 or a conventional telephone network, with the person with whom the person to be monitored 120 wishes to communicate.
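For illustration only (the patent defines no source code), the check-and-escalate flow of Fig. 5 (steps E500 to E510) can be sketched in Python as follows. The function name, parameters and return labels are illustrative assumptions, not part of the patent.

```python
def supervise_wellbeing(voice_detected: bool, wants_call: bool) -> str:
    """Illustrative sketch of the Fig. 5 flow: prompt the person (E500),
    check for a vocal response (E501), then either escalate to the help
    desk (E502) or offer to place a call on their behalf (E508-E510)."""
    # E500: a voice prompt such as "Mr. X, do you feel well?" is played.
    # E501: the server analyzes the sounds captured by the robot's microphone.
    if not voice_detected:
        # E502: no response -> call the help desk and hand over robot control.
        return "call_help_desk"
    # E508-E509: the person answered -> offer to call family or the doctor.
    if wants_call:
        # E510: establish the requested telephone call.
        return "establish_call"
    return "stop"

assert supervise_wellbeing(False, False) == "call_help_desk"
assert supervise_wellbeing(True, True) == "establish_call"
assert supervise_wellbeing(True, False) == "stop"
```

The key point of the flow is that silence, not an explicit alarm, is what triggers the escalation to the help desk.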
  • Fig. 6 shows the algorithm for providing services to a person, performed by the secure home care server.
  • At step E600 the processor 300 of the secure home care server 100 determines whether a remote party is trying to establish a telephone communication with the person to be monitored 120 through the telecommunication network 150 or a conventional telephone network.
  • If so, the processor 300 commands at step E601 the generation of a message to the robot 180 so that it activates the sound card 180f.
  • The processor 300 transfers the voice information generated by the remote party to the audio card of the robot 180 via the telecommunication network 190.
  • Likewise, the processor 300 transfers to the remote party the sound information picked up by the microphone of the sound card 180f and received by the server 100 via the telecommunication network 190.
  • A telephone call is thus established at step E602 between the remote party and the person to be monitored.
  • The processor 300 then proceeds to step E603, which consists of a loop waiting for the end of the ongoing telephone call.
  • When the call ends, the processor 300 stops the present algorithm.
  • If at step E600 it has not been detected that a remote party is trying to establish a telephone call with the person to be monitored 120, the processor 300 proceeds to step E604.
  • At step E604 the processor 300 commands the generation of a message to the robot 180 so that it activates the sound card 180f.
  • The processor 300 then determines at step E605 whether the person to be monitored 120 wishes to establish a telephone call with a remote correspondent via the telecommunication network 150 or a conventional telephone network. If so, the processor 300 proceeds to step E606.
  • The person to be monitored 120 expresses the wish to establish a communication by generating a voice command, or a predetermined command via a predetermined key of the robot 180 or of a control device that the person to be monitored 120 carries.
  • The processor 300 analyzes the sound signals picked up by the microphone of the audio card 180f of the robot 180 to determine the telephone number or the identifier of the correspondent that the person to be monitored 120 wishes to call.
  • The robot 180 may also include a keyboard allowing the person to be monitored 120 to dial the telephone number or identifier.
  • The processor 300 then commands at step E608 the dialing of the telephone number on a telephone network, or commands the establishment of a communication, via the telecommunication network 150, with the correspondent that the person to be monitored 120 wishes to reach.
  • The processor 300 then proceeds to step E609, which consists of a loop waiting for the end of the ongoing telephone call.
  • When the call ends, the processor 300 stops the present algorithm.
  • If at step E605 it has not been detected that the person to be monitored 120 wishes to establish a telephone call with a remote correspondent, the processor 300 goes to step E610.
  • At step E610 the processor 300 generates a command for the transmission of a voice message to the person to be monitored 120, reproduced by the loudspeaker of the audio card 180f, inviting the person to dialogue, through the robot 180, with the voice server 155.
  • The processor 300 determines at step E611 whether the person to be monitored 120 wishes to dialogue with the voice server 155 through the telecommunication network 150. If so, the processor 300 proceeds to step E612. If not, the processor 300 stops the present algorithm.
  • The person to be monitored 120 expresses the wish to dialogue by generating a voice command or a predetermined command via a predetermined key of the robot 180.
  • At step E612 the processor 300 establishes a connection with the voice server 155.
  • The dialogue module 156 of the voice server 155 includes an intelligent agent as described in WO 0039672; it is able to dialogue with the person to be monitored 120 and uses for this purpose information on the person to be monitored stored in the database 159.
  • The processor 300 then ensures at step E613 the transfer of the conversation between the person to be monitored 120 and the voice server 155 via the robot 180 and the telecommunication networks 150 and 190.
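For illustration only, the decision order of Fig. 6 (incoming call first, then an outgoing call requested by the person, then the voice server dialogue) can be sketched as follows; the function and label names are assumptions, not part of the patent.

```python
def route_service(incoming_call: bool,
                  person_wants_call: bool,
                  person_wants_voice_server: bool) -> str:
    """Illustrative sketch of the Fig. 6 priority order: an incoming call
    (E600) takes precedence, then an outgoing call requested by the person
    (E605), then a dialogue with the voice server 155 (E611)."""
    if incoming_call:
        return "bridge_incoming_call"   # E601-E603: relay audio both ways
    if person_wants_call:
        return "dial_outgoing_call"     # E606-E609: dial and bridge
    if person_wants_voice_server:
        return "connect_voice_server"   # E612-E613: relay the dialogue
    return "stop"
```

Note that the robot itself never places calls: in every branch the server 100 bridges audio between the robot's sound card 180f and the chosen remote endpoint.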
  • Fig. 7 represents the algorithm for monitoring the physiological data of a person, performed by the secure home care server.
  • At step E700 the processor 300 of the secure home care server 100 receives, via the telecommunication network 190, the physiological data of the person to be monitored 120 measured by the sensors 122. The physiological data are then stored in the database 106 at step E701.
  • The processor 300 then proceeds at step E702 to read, from the database 106, the physiological data stored during previous checks of the physiological data of the person to be monitored.
  • The physiological data thus read constitute reference physiological data for the person to be monitored.
  • The processor 300 then proceeds at step E703 to compare the data received at step E700 with the physiological data read at step E702.
  • At step E704 the processor 300 checks whether the comparison is satisfactory. If so, the processor 300 stops the present algorithm. If not, the processor 300 proceeds to the next step E705.
  • At step E705 the processor 300 checks whether the comparison is representative of a physical condition considered very bad or worrying. If so, the processor 300 proceeds to the next step E706.
  • At step E706 the processor 300 commands the execution of the algorithm of Fig. 4, previously described, to bring the robot 180 close to the person to be monitored 120.
  • This done, the processor 300 commands at step E707 the execution of the algorithm of Fig. 5, previously described.
  • If the comparison is not representative of a worrying physical condition, the processor 300 goes from step E705 to step E708 and commands at this step the execution of the algorithm of Fig. 4, previously described, to bring the robot 180 close to the person to be monitored 120.
  • This done, the processor 300 commands the establishment of a telephone call with the doctor of the person to be monitored and the transfer, at step E709, of a voice message informing the doctor of the deterioration of the physiological data of the person to be monitored.
  • The processor 300 then reads at step E710 the data stored at step E701 and transfers them to the voice server 155 for voice synthesis, so as to deliver them to the doctor in the form of a voice message at step E711.
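For illustration only, the comparison and severity grading of Fig. 7 (steps E703 to E705) can be sketched as follows. The tolerance fractions and all names are illustrative assumptions; the patent does not specify how the comparison is graded.

```python
def check_physiology(measured: dict, reference: dict,
                     warn_frac: float = 0.10, alarm_frac: float = 0.25) -> str:
    """Illustrative sketch of Fig. 7 (E703-E705): compare fresh sensor
    readings with stored reference values and grade the worst relative
    deviation. Thresholds are assumed, not taken from the patent."""
    worst = max(abs(measured[k] - reference[k]) / reference[k] for k in reference)
    if worst < warn_frac:
        return "ok"          # E704: comparison satisfactory, stop
    if worst >= alarm_frac:
        return "help_desk"   # E706-E707: worrying state, approach and call help desk
    return "notify_doctor"   # E708-E711: degraded state, inform the doctor

ref = {"heart_rate": 70.0, "temperature": 37.0}
assert check_physiology({"heart_rate": 72.0, "temperature": 37.1}, ref) == "ok"
assert check_physiology({"heart_rate": 100.0, "temperature": 37.0}, ref) == "help_desk"
assert check_physiology({"heart_rate": 80.0, "temperature": 37.0}, ref) == "notify_doctor"
```

In both abnormal branches the robot is first brought close to the person (Fig. 4); only the severity of the deviation decides whether the help desk or the doctor is contacted.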
  • Fig. 8 represents the algorithm for monitoring the emotional state of a person, performed by the secure home care server.
  • At step E800 the processor 300 of the secure home care server 100 receives, via the telecommunication network 190, the physiological data of the person to be monitored 120 measured by the sensors 122.
  • The physiological data are then stored in the database 106 at step E801.
  • At step E802 the processor 300 determines the primary emotions felt by the person to be monitored 120 from the data received at step E800, in accordance with the system presented in the magazine "For Science", issue 313 of November 2003, "Emotion detector", or in the publication by E. Vernet-Maury, O. Robin and A. Dittmar, "Study of the emotional response to odors by non-invasive sensors", Microsystems Biomedical Microsensors, UMR 5511 CNRS-LPM-INSA of Lyon. From measurements of the electrical resistance and potential of the skin taken on the palm of the hand, and of the temperature, blood flow and heart rate of the person to be monitored 120, the processor determines the emotional state of the person to be monitored 120. The emotional state is thus classified into four primary emotions: sadness, disgust, anger and fear. This operation performed, the processor 300 proceeds to the next step E803.
  • At step E803 the processor 300 determines whether the emotional state determined at step E802 corresponds to an emotional state of sadness. If so, the processor 300 proceeds to step E804.
  • At step E804 the processor 300 commands the generation of a message to the robot 180 to control the perfume dispenser 180e so that it diffuses a perfume, for example vanilla, to comfort the person to be monitored 120.
  • Other perfumes adapted to the person to be monitored can also be diffused. These scents can be determined experimentally, as described in the publication by E. Vernet-Maury, O. Robin and A. Dittmar previously mentioned.
  • The processor 300 then generates at step E805 an activation command for the sound card 180f of the robot 180.
  • The processor 300 establishes at step E806 a dialogue between the person to be monitored and the voice server 155. For this, the processor 300 establishes a connection with the voice server 155.
  • The dialogue module 156 of the voice server 155 dialogues with the person to be monitored 120 and uses the information about the person to be monitored stored in the database 159. This information relates, for example, to the past of the person to be monitored.
  • The processor 300 ensures the transfer of the conversation between the person to be monitored 120 and the voice server 155 through the robot 180 and the telecommunication networks 150 and 190.
  • The processor 300 waits a predetermined time at step E807 and, when the predetermined time has elapsed, repeats the present algorithm until the person to be monitored is no longer in an emotional state of sadness.
  • At step E808 the processor 300 determines whether the emotional state determined at step E802 corresponds to an emotional state of disgust. If so, the processor 300 proceeds to step E809.
  • At step E809 the processor 300 commands the generation of a message to the robot 180 to control the perfume dispenser 180e so that it diffuses a perfume, for example lemon, to comfort the person to be monitored 120.
  • Other perfumes adapted to the person to be monitored can also be diffused.
  • The processor 300 then generates at step E810 an activation command for the sound card 180f of the robot 180.
  • The processor 300 establishes at step E811 a dialogue between the person to be monitored and the voice server 155. For this, the processor 300 establishes a connection with the voice server 155.
  • The dialogue module 156 of the voice server 155 dialogues with the person to be monitored 120 and uses the information about the interests of the person to be monitored 120 stored in the database 159.
  • The processor 300 transfers the conversation between the person to be monitored 120 and the voice server 155 via the robot 180 and the telecommunication networks 150 and 190.
  • The processor 300 waits a predetermined time at step E812 and, when the predetermined time has elapsed, repeats the present algorithm until the person to be monitored is no longer in an emotional state of disgust.
  • The processor 300 then determines whether the emotional state determined at step E802 corresponds to an emotional state of anger. If so, the processor 300 proceeds to step E814.
  • At step E814 the processor 300 generates an activation command for the sound card 180f of the robot 180.
  • The processor 300 establishes at step E815 a dialogue between the person to be monitored and the voice server 155. For this, the processor 300 establishes a connection with the voice server 155.
  • The dialogue module 156 of the voice server 155 dialogues with the person to be monitored 120 and uses the information about the interests of the person to be monitored 120 stored in the database 159.
  • The processor 300 transfers the conversation between the person to be monitored 120 and the voice server 155 via the robot 180 and the telecommunication networks 150 and 190.
  • The processor 300 waits a predetermined time at step E816 and, when the predetermined time has elapsed, repeats the present algorithm until the person to be monitored is no longer in an emotional state of anger.
  • If the person to be monitored is not in an emotional state of anger, the processor 300 proceeds to step E817.
  • At step E817 the processor 300 determines whether the emotional state determined at step E802 corresponds to an emotional state of fear. If not, the processor 300 stops the present algorithm. If so, the processor 300 proceeds to step E818.
  • At step E818 the processor 300 generates an activation command for the sound card 180f of the robot 180.
  • The processor 300 establishes at step E819 a dialogue between the person to be monitored and the voice server 155. For this, the processor 300 establishes a connection with the voice server 155.
  • The dialogue module 156 of the voice server 155 dialogues with the person to be monitored 120 and uses the information about the family of the person to be monitored stored in the database 159.
  • The processor 300 ensures the transfer of the conversation between the person to be monitored 120 and the voice server 155 through the robot 180 and the telecommunication networks 150 and 190.
  • The processor 300 waits a predetermined time at step E820 and, when the predetermined time has elapsed, repeats the present algorithm until the person to be monitored is no longer in an emotional state of fear.
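For illustration only, the emotion-to-response mapping of Fig. 8 can be sketched as a lookup table. The vanilla and lemon scents and the dialogue topics (past, interests, family) come from the text above; the data structure and names are illustrative assumptions.

```python
# Illustrative sketch of Fig. 8: each detected primary emotion triggers a
# response; scents apply only to sadness and disgust, while all four
# branches open a dialogue with the voice server 155 on a chosen topic.
RESPONSES = {
    "sadness": {"scent": "vanilla", "dialogue_topic": "past"},       # E804-E807
    "disgust": {"scent": "lemon",   "dialogue_topic": "interests"},  # E809-E812
    "anger":   {"scent": None,      "dialogue_topic": "interests"},  # E814-E816
    "fear":    {"scent": None,      "dialogue_topic": "family"},     # E818-E820
}

def respond_to_emotion(emotion: str):
    """Return the response for a detected primary emotion, or None when
    no primary emotion is detected and the algorithm stops."""
    return RESPONSES.get(emotion)

assert respond_to_emotion("sadness")["scent"] == "vanilla"
assert respond_to_emotion("fear")["dialogue_topic"] == "family"
assert respond_to_emotion("calm") is None
```

Each branch then loops, re-evaluating the emotional state after a predetermined delay until the emotion is no longer detected.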
  • Fig. 9 is a table including information used by the tag referencing module according to the present invention.
  • The table of Fig. 9 consists of three columns denoted 920 to 922.
  • Column 920 includes examples of commands generated by the user of a communication device 110.
  • Column 921 includes the name of the tag associated with each of the commands included in column 920, and column 922 includes the value of the tag associated with each of the commands included in column 920.
  • The table of Fig. 9 is made up of twelve rows; each row corresponds to a command made by the user of a control device. Of course, a larger or smaller number of commands can be considered according to the present invention.
  • Row 900 associates with the command "↑" a Direction tag having the value North.
  • Row 901 associates with the command "↓" a Direction tag having the value South.
  • Row 902 associates with the command "←" a Direction tag having the value West.
  • Row 903 associates with the command "→" a Direction tag having the value East.
  • Row 904 associates with the command "↖" a Direction tag having the value North-West.
  • Row 905 associates with the command "↗" a Direction tag having the value North-East.
  • Row 906 associates with the command "↙" a Direction tag having the value South-West.
  • Row 907 associates with the command "↘" a Direction tag having the value South-East.
  • Row 908 associates with the Reflex command a robot control tag having the value On or Off, that is, active or inactive.
  • The Reflex command allows the robot 180 to enter an automatic obstacle detection procedure and to modify its movements according to the detected obstacles. During the automatic obstacle detection procedure, the robot 180 transmits to the server 100 any movement modification that it performs according to the detected obstacles.
  • Row 909 associates with the movement speed command a Speed tag having a user-definable value. This parameterizable value makes it possible to modify the speed of movement of the robot 180.
  • Row 910 associates with the temperature measurement command a Temp tag having as value the Tempin variable, whose value is the temperature measured by the robot.
  • Row 911 associates with the robot index command, used to select a robot among a set of robots, a Robotnum tag having as value an index previously assigned to the robot that the user of the secure home care system wishes to command.
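For illustration only, the tag table of Fig. 9 can be sketched as a command-to-(tag name, tag value) mapping. The patent only defines the associations; the dictionary, the function and the idea of overriding a default value are illustrative assumptions.

```python
# Illustrative sketch of the Fig. 9 tag table: each user command is
# translated into a (tag name, tag value) pair before being sent to the
# robot 180. Only a subset of the twelve rows is shown.
COMMAND_TAGS = {
    "↑": ("Direction", "North"),       # row 900
    "↓": ("Direction", "South"),       # row 901
    "←": ("Direction", "West"),        # row 902
    "→": ("Direction", "East"),        # row 903
    "Reflex": ("Reflex", "On"),        # row 908: On/Off, i.e. active/inactive
}

def encode_command(command: str, value=None):
    """Map a user command to its tag; 'value' overrides the default,
    e.g. a user-defined movement speed for the Speed tag (row 909)."""
    if command == "Speed":
        return ("Speed", value)
    name, default = COMMAND_TAGS[command]
    return (name, value if value is not None else default)

assert encode_command("↑") == ("Direction", "North")
assert encode_command("Speed", 3) == ("Speed", 3)
assert encode_command("Reflex", "Off") == ("Reflex", "Off")
```

This separation lets the server transcribe high-level user commands into a form interpretable by the robot, as noted for step E506 above.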

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Telephonic Communication Services (AREA)
  • Alarm Systems (AREA)
  • Debugging And Monitoring (AREA)
EP05290367A 2004-02-23 2005-02-17 Verfahren und Vorrichtung zur Überwachung von alten Leuten zu Hause. Active EP1566782B1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PL05290367T PL1566782T3 (pl) 2004-02-23 2005-02-17 Sposób i urządzenie do utrzymywania w bezpieczeństwie co najmniej jednej osoby poruszającej się w określonym środowisku

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0401801A FR2866739A1 (fr) 2004-02-23 2004-02-23 Procede et dispositif de maintien securise d'au moins une personne evoluant dans un environnement predetermine
FR0401801 2004-02-23

Publications (2)

Publication Number Publication Date
EP1566782A1 true EP1566782A1 (de) 2005-08-24
EP1566782B1 EP1566782B1 (de) 2012-08-22

Family

ID=34708019

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05290367A Active EP1566782B1 (de) 2004-02-23 2005-02-17 Verfahren und Vorrichtung zur Überwachung von alten Leuten zu Hause.

Country Status (4)

Country Link
EP (1) EP1566782B1 (de)
ES (1) ES2393718T3 (de)
FR (1) FR2866739A1 (de)
PL (1) PL1566782T3 (de)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108831110A (zh) * 2018-07-17 2018-11-16 同济大学 基于穿戴式设备的老人跌倒检测及防走失监护系统及方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4665385A (en) * 1985-02-05 1987-05-12 Henderson Claude L Hazardous condition monitoring system
US5515858A (en) * 1992-02-28 1996-05-14 Myllymaeki; Matti Wrist-held monitoring device for physical condition
US6002994A (en) * 1994-09-09 1999-12-14 Lane; Stephen S. Method of user monitoring of physiological and non-physiological measurements
US6313743B1 (en) * 1997-08-01 2001-11-06 Siemens Aktiengellschaft Home emergency warning system
FR2837016A3 (fr) * 2002-02-07 2003-09-12 Christa Duschek Appareil qui donne un signal pour appeler secours


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009037612A3 (en) * 2007-09-19 2009-08-20 Koninkl Philips Electronics Nv Method and apparatus for detecting an abnormal situation
CN101802881B (zh) * 2007-09-19 2012-08-15 皇家飞利浦电子股份有限公司 检测异常情况的设备和方法
WO2011010191A1 (en) * 2009-07-22 2011-01-27 Koninklijke Philips Electronics N.V. Fall detectors and a method of detecting falls
RU2559933C2 (ru) * 2009-07-22 2015-08-20 Конинклейке Филипс Электроникс Н.В. Детекторы падения и способ обнаружения падений
US9974908B2 (en) 2009-07-22 2018-05-22 Koninklijke Philips N.V. Fall detectors and a method of detecting falls
WO2018114209A3 (de) * 2016-12-21 2018-08-30 Service-Konzepte MM AG Autonomes haushaltsgerät und sitz- oder liegemöbel hierzu sowie haushaltsgerät
US11406235B2 (en) 2016-12-21 2022-08-09 Service-Konzepte MM AG Autonomous domestic appliance and seating or reclining furniture as well as domestic appliance
CN110087024A (zh) * 2019-03-08 2019-08-02 合肥泛米智能科技有限公司 一种养老居家的监控装置
DE102022135078A1 (de) 2022-08-31 2024-02-29 VIVAI Software AG Servereinrichtung mit Alarmierungseinheit und damit ausgeführtes Verfahren
WO2024062284A1 (de) 2022-08-31 2024-03-28 VIVAI Software AG Multifunktionales assistenzsystem, insbesondere fur alleinlebende personen

Also Published As

Publication number Publication date
EP1566782B1 (de) 2012-08-22
PL1566782T3 (pl) 2013-01-31
ES2393718T3 (es) 2012-12-27
FR2866739A1 (fr) 2005-08-26

Similar Documents

Publication Publication Date Title
EP1566782B1 (de) Verfahren und Vorrichtung zur Überwachung von alten Leuten zu Hause.
EP1091273B1 (de) Mobiler Roboter und Steuerverfahren für einen mobilen Roboter
US20230087729A1 (en) Information processing using a population of data acquisition devices
EP1264470B1 (de) System zur häuslichen medizinischen fernhilfe
EP3844988B1 (de) Übertragungsverfahren eines elektronischen alarms über ein intelligentes telefon, und vorrichtung zur umsetzung dieses verfahrens
JP2005305631A (ja) ロボットおよびその制御方法
CN109074035A (zh) 多功能的每房间自动化系统
US9843916B2 (en) Systems and methods for automatic emergency contact routing
CN104767860B (zh) 来电提示方法、装置及终端
FR2916981A1 (fr) Defibrillateur portable, systeme configure pour surveiller un patient et procede pour le traitement d'un patient
EP3412036B1 (de) Verfahren zur unterstützung einer hörgeschädigten person beim folgen eines gesprächs
EP2362582A1 (de) Verfahren und Vorrichtung zur kontextuellen Haustechnik Automatisierung
WO2005073875A1 (fr) Systeme et procede de reconnaissance de sequence sonore
EP1566245A1 (de) Verfahren und Vorrichtung zur Verarbeitung einer Robotsteuerung durch einer Fernbedienungsvorrichtung, die über ein Telekommunikationsnetzwerk zu einem Server verbunden ist
EP2015552A2 (de) Modulares Alarmsystem und -verfahren
FR2617299A1 (fr) Systeme a reconnaissance vocale, autonome et portable, de gestion de fichiers d'ordinateur
FR3080943A1 (fr) Procede d’assistance d’une personne equipee d’un telephone cellulaire
US10887552B1 (en) Door-knocking for teleconferencing
FR2863433A1 (fr) Procede et dispositif de traitement d'alertes
WO2017103480A1 (fr) Système et procédé de réalisation d'un tour de table lors d'une réunion à distance
JP7425413B2 (ja) 被監視者監視支援装置、被監視者監視支援方法、被監視者監視支援システムおよび被監視者監視支援サーバ装置
EP1416422B1 (de) Anlage zum zentralen Verwalten von Nachrichten für Teilnehmer mit entsprechenden Kennungen
JP2005217508A (ja) コミュニケーション装置
JP2002351992A (ja) ヘルスケアシステムおよびその端末装置
EP1425870A1 (de) Rundsende-dienstverfahren

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR LV MK YU

17P Request for examination filed

Effective date: 20060128

AKX Designation fees paid

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20070824

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: FRANCE TELECOM

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIN1 Information on inventor provided before grant (corrected)

Inventor name: JUMPERTS, SYLVIE

Inventor name: MARTIN, FREDERIC

Inventor name: GUEGAN-BOURGOIN, DELPHINE

Inventor name: LEDUNOIS, VALERIE

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 572314

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120915

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602005035744

Country of ref document: DE

Effective date: 20121018

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20120822

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2393718

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20121227

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 572314

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120822

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

Effective date: 20120822

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121222

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

REG Reference to a national code

Ref country code: PL

Ref legal event code: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121123

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121224

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20130523

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121122

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602005035744

Country of ref document: DE

Effective date: 20130523

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130228

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130228

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130228

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130217

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120822

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130217

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20050217

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 12

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 13

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20240301

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240123

Year of fee payment: 20

Ref country code: GB

Payment date: 20240123

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: PL

Payment date: 20240126

Year of fee payment: 20

Ref country code: IT

Payment date: 20240123

Year of fee payment: 20

Ref country code: FR

Payment date: 20240123

Year of fee payment: 20

Ref country code: BE

Payment date: 20240123

Year of fee payment: 20