WO2007072356A2 - Positioning system for patient monitoring sensors - Google Patents

Positioning system for patient monitoring sensors

Info

Publication number
WO2007072356A2
WO2007072356A2 (PCT/IB2006/054854)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
pattern
skin
positioning system
texture
Prior art date
Application number
PCT/IB2006/054854
Other languages
French (fr)
Other versions
WO2007072356A3 (en)
Inventor
Gerd Lanfermann
Richard D. Willmann
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2007072356A2 publication Critical patent/WO2007072356A2/en
Publication of WO2007072356A3 publication Critical patent/WO2007072356A3/en

Classifications

    • A61B 5/0064 Body surface scanning (measuring for diagnostic purposes using light)
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/064 Determining position of a probe within the body, employing means separate from the probe, using markers
    • A61B 5/6842 Indicating the position of the sensor on the body by marking the skin
    • A61B 90/90 Identification means for patients or instruments, e.g. tags
    • A61B 90/92 Identification means coded with colour
    • A61B 90/94 Identification means coded with symbols, e.g. text
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • A61B 2090/373 Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B 2090/3937 Visible markers

Abstract

This invention relates to a positioning system for monitoring sensors or treatment devices which are to be accurately located on the skin of a patient, especially where professional assistance is unavailable, for example in a domestic environment. It comprises a positioning system for a patient monitoring sensor or treatment device with imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.

Description

Positioning system for patient monitoring sensors
This invention relates to a positioning system for monitoring sensors or treatment devices which are to be accurately located on the skin of a patient, especially where professional assistance is unavailable, for example in a domestic environment.
The use of sophisticated sensing and actuating devices for medical purposes has increasingly spread into the unsupervised home environment, in addition to their application in well-supervised patient care facilities such as hospitals. However, it is very difficult for a non-professional user or their carer to utilize such equipment effectively, because of the requirement for accurately reproducible results when sensors are repeatedly removed and replaced. Accordingly, the present invention provides a positioning system for a patient monitoring sensor or treatment device (hereinafter referred to as "a sensor") comprising imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.
The texture or pattern may consist of a natural pattern such as a pattern of moles or varying skin color, or an artificial pattern, such as can be provided by UV active markers.
The detection means may operate using visible light and/or UV or IR which may reveal additional texture information. In operation of the system, each sensor is initially placed in the required position by a medical professional, and the system learns the exact location from the local skin pattern.
Subsequently, when the patient is required to re-position the sensor himself, he places it over the area of the desired location, and the system produces signals instructing him to move or rotate the sensor, based on its recognition of the skin pattern, until the required position over the target location is achieved, and the device can be placed in contact with the skin. Preferably, the system incorporates an inertia sensor in order to ensure that the final placement towards the target position is executed in a consistent way. This ensures that a lay person can attach the sensor at exactly the same position as done previously by a medical professional, thereby significantly improving the quality of data which can be gathered. It also ensures that, when the device has to be utilized over a period of time, during which it is removed and repeatedly reattached, the consistency of positioning will still be maintained.
In addition, under some circumstances a patient may be required to re-attach the sensor in an area where he or she cannot see it directly, for example, on the patient's own back.
The system is applicable to the attachment of various kinds of sensors and transducers, such as capacitive or inductive types, and also to treatment electrodes such as those used for AEDs (Automatic External Defibrillators), which require exact positioning of their stimulating pads in order to operate effectively.
One embodiment of the invention will now be described by way of example, with reference to the accompanying drawings, in which:
Figs. 1A to 1E illustrate the process of using skin pattern recognition to control navigation of a sensor type device;
Fig. 2 is a diagrammatic view of a signal interface presented to the user, in order to provide suitable positioning guidance; and
Fig. 3 is a diagrammatic view of one practical form of the system;
Fig. 4 is an overall schematic diagram of the modes of operation of the system; and
Fig. 5 is a detailed flowchart of operation in the patient guiding mode.
Referring firstly to Figure 1A, the desired target position for location of a sensor or stimulating pad, for example, is indicated by reference 2, and this is located on an area of the patient's skin which, in the normal way, carries a distinctive pattern of various larger or smaller, lighter or darker areas such as moles. This position is initially determined by a medical professional.
Figure 1B shows the individual, slightly darkened areas of skin, which are identified by a pattern of crosses 4, 6, 8, 10, etc., whose coordinates relative to the target area 2 are memorized in the system, for each sensor, after it has been properly positioned. As shown in Figure 4, this is achieved using the "training mode" of the system. In operation this proceeds as follows:
(a) The medical professional activates the training mode of the device and places the device above the desired location (e.g. at a height of 10 cm). The system's camera image is acquired and analyzed for skin texture, e.g. moles, as shown in Fig. 1, at step 40 in Figure 4.
(b) If there is not enough skin texture in the image area, the medical professional is instructed by the device to increase the distance to the skin or to move the device in a circle around the desired area (so that more area is covered).
(c) The system generates a reference pattern or map that is stored in the system's memory.
(d) The system may use different wavelengths in addition to visible light; UV or IR may reveal additional texture information. In this case, the device is fitted with appropriate light-emitting units (LEDs).
(e) The device is lowered by the medical professional onto the patient's skin, and the device tracks the skin texture until the camera opening is obscured by the skin.
(f) From the last distribution of the texture, the valid position is calculated (42, 44 in Figure 4).
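In software terms, the training steps above amount to recording the detected skin features as coordinates relative to the target position. The sketch below is illustrative only; the patent does not prescribe an implementation, and the function name, the coordinate convention, and the idea of operating on pre-detected mole centroids are all assumptions:

```python
import numpy as np

def learn_reference(mole_xy, target_xy):
    # Training mode (steps 40-44 in Fig. 4): store each detected
    # mole centroid relative to the chosen target position 2.
    # mole_xy: (N, 2) centroids from the camera image; how the moles
    # themselves are detected is left to the imaging front end.
    mole_xy = np.asarray(mole_xy, dtype=float)
    target_xy = np.asarray(target_xy, dtype=float)
    return mole_xy - target_xy  # the stored "pattern of crosses"
```

The stored array plays the role of the reference pattern or map of step (c).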
When the patient subsequently attempts to locate the sensor in the approximate area of the target 2, in the "guiding mode" of Figure 4, he will typically, at first, place it too far to one side as indicated in the diagram of Figure 1C in which it can be seen that only the left hand group of mole patterns, including areas 6 and 10 can be detected. Accordingly, the system will identify the current skin texture pattern and compare it with the stored reference image, and signal that the sensor should be moved to the right, as indicated by the arrow 12 in Figure 1C.
Similarly, if the patient initially positions the sensor too far below the required target position 2, as indicated in Figure 1D, again, only part of the required mole pattern including areas 6 and 8 can be detected by the sensor, so the user will be directed to move it in an upward direction as indicated by the arrow 14 of Figure 1D, until the proper position is acquired as indicated in Figure 1E, with the target 2 centralized.
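One simple way to realize this comparison is nearest-neighbour matching between the currently visible features and the stored reference, averaging the residuals to obtain a movement instruction. The patent does not specify a matching algorithm, so this is a hedged sketch under that assumption:

```python
import numpy as np

def estimate_offset(reference, detected):
    # Guiding mode: pair each currently visible mole with its nearest
    # stored reference mole, then average the residuals.  The result
    # is the translation the sensor must make to centre the target,
    # even when only part of the pattern is visible (Figs. 1C/1D).
    reference = np.asarray(reference, dtype=float)
    detected = np.asarray(detected, dtype=float)
    dist = np.linalg.norm(detected[:, None, :] - reference[None, :, :], axis=2)
    matched = reference[dist.argmin(axis=1)]  # nearest-neighbour pairing
    return (detected - matched).mean(axis=0)
```

For example, a sensor held one unit to the left of the target sees every reference mole shifted one unit to the right, and the estimate is a "move right" instruction.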
If the user positions the sensor in a region that is not recognized, he may also be instructed to move it further from his skin, or to move it in a circular pattern, to increase the area covered. Figure 2 illustrates a possible type of user instruction interface (48 in Figure 4) which can be used to give the necessary directional information to the user in the "guiding mode". This will be capable of displaying downwardly or upwardly directed arrows A and B, as well as rightwardly and leftwardly directed arrows such as C and D to indicate required movement in the corresponding lateral directions.
Of course, it is also possible that the electrode may need to be moved in a specific rotational direction; for that purpose, clockwise and counter-clockwise arrow signs, such as those indicated at F and E respectively, may also be displayable.
Once the device is accurately positioned, a centrally located display element indicated by the letter L in Figure 2 will be activated so as to indicate that the device can be lowered into place.
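The mapping from the measured positional and rotational errors to the Fig. 2 display elements could look like the following sketch; the tolerances, the sign conventions, and the rotate-before-translate priority are all illustrative assumptions:

```python
def arrow_for(offset, rotation=0.0, tol=0.5, rot_tol=5.0):
    # Map a positional error (dx, dy) and a rotational error (degrees)
    # to the Fig. 2 display elements: B/A up/down, C/D right/left,
    # F/E clockwise/counter-clockwise, L "lower into place".
    dx, dy = offset
    if abs(rotation) > rot_tol:
        return "F" if rotation > 0 else "E"  # correct rotation first
    if abs(dx) > tol:
        return "C" if dx > 0 else "D"        # move right / left
    if abs(dy) > tol:
        return "B" if dy > 0 else "A"        # move up / down
    return "L"                               # centred: lower the device
```

Only when both errors fall inside tolerance does the central element L light up, matching the behaviour described for Figure 2.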
Figure 3 illustrates a practical form of a sensor arrangement of the system in use, in which a skin-attached device 16 includes an integrated imaging device 18 comprising an optical system 20, an image acquisition device 22, and an image processing unit 24. The optical system is arranged to cover a reasonably wide angle of view, as indicated by lines 26 in the Figure, so as to encompass a reasonable number of areas 28 of different coloration in the desired skin area. The imaging device may utilize UV or IR light as well as, or instead of, visible wavelengths, in which case it will be fitted with appropriate light emitting devices such as LEDs. The device also incorporates a contact sensor to confirm when it has reached a contact position with the skin, and an inertia sensor, such as an etched-beam capacitive device, to accurately control the lowering of the device into its final placement in the target position. This ensures that it is not shifted sideways at the last moment before making contact with the skin, or immediately afterwards.
Figure 5 is a detailed flowchart of operation of the device in the "guiding mode". At the start (50) the image is acquired and the detected pattern is analyzed (52). If it cannot be identified (54), the patient is instructed to increase the sensor distance from the skin or move the sensor around (56) and the image acquisition step (52) is repeated. If the pattern is identified, the location and orientation are determined (58) and checked against the desired position and orientation (60). If this is incorrect, the patient is instructed to shift or rotate the sensor (62) and the acquisition step (52) is repeated.
If the pattern is correctly centered and oriented, the patient is instructed to lower the sensor (64) and the signal from the inertia sensor is monitored (66) to determine whether the patient's movement is steady in the lowering direction; if rapid movement is detected, the image acquisition is repeated (52).
If the inertia sensor does not detect rapid movement, the system checks for the occurrence of sensor contact with the skin (68); when this occurs, the system determines that the process is complete (70), otherwise the image acquisition process is restarted (52).
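The Figure 5 flowchart (steps 50 to 70) can be summarized as a simple control loop. The callable names below stand in for the device hardware and are assumptions, not part of the disclosure:

```python
def guiding_loop(acquire, guide, lowering_steady, skin_contact, max_steps=100):
    # Fig. 5 control loop.  The four callables stand in for hardware:
    #   acquire() -> (pattern_found, correctly_positioned)
    #   guide(msg) shows an instruction to the patient
    #   lowering_steady() reads the inertia sensor
    #   skin_contact() reads the contact sensor
    for _ in range(max_steps):
        found, positioned = acquire()                 # steps 52-54, 58-60
        if not found:
            guide("increase distance or move the sensor around")  # 56
            continue
        if not positioned:
            guide("shift or rotate the sensor")       # 62
            continue
        guide("lower the sensor")                     # 64
        if not lowering_steady():                     # 66: rapid movement?
            continue                                  # re-acquire the image
        if skin_contact():                            # 68
            return True                               # 70: process complete
    return False
```

Each branch that fails falls back to image acquisition, mirroring the repeated returns to step 52 in the flowchart.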

Claims

CLAIMS:
1. A positioning system for a patient monitoring sensor or treatment device (hereinafter referred to as "a sensor") comprising imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.
2. A positioning system according to claim 1 in which the texture or pattern comprises a naturally occurring pattern of moles or skin coloration.
3. A positioning system according to claim 1 in which the texture or pattern comprises artificial markers.
4. A positioning system according to any preceding claim in which the imaging means is adapted to use UV or IR wavelengths and includes corresponding light emitting devices.
5. An imaging system according to any preceding claim in which the device is adapted to provide visible signals to the user to guide the sensor to the desired position.
6. An imaging system according to any preceding claim in which the device is adapted to provide audible signals to the user to guide the sensor to the desired position.
7. An imaging system according to any preceding claim which comprises a separate signaling unit adapted to communicate wirelessly with the sensor device.
8. An imaging system according to any preceding claim in which the image processing unit is separated from the image sensor and is adapted to communicate wirelessly with the sensor device.
PCT/IB2006/054854 2005-12-21 2006-12-14 Positioning system for patient monitoring sensors WO2007072356A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP05112601 2005-12-21
EP05112601.9 2005-12-21

Publications (2)

Publication Number Publication Date
WO2007072356A2 true WO2007072356A2 (en) 2007-06-28
WO2007072356A3 WO2007072356A3 (en) 2007-11-15

Family

ID=38091206

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/054854 WO2007072356A2 (en) 2005-12-21 2006-12-14 Positioning system for patient monitoring sensors

Country Status (1)

Country Link
WO (1) WO2007072356A2 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0560331A1 (en) * 1992-03-11 1993-09-15 Bodenseewerk Gerätetechnik GmbH Positioning device for a part of the body for therapeutic treatment
EP0898931A2 (en) * 1997-09-01 1999-03-03 Kyoto Daiichi Kagaku Co., Ltd. Probe positioning method and device therefor
EP0917854A2 (en) * 1997-11-21 1999-05-26 Kyoto Dai-ichi Kagaku Co., Ltd. Non-contact non-invasive measuring method and apparatus
WO1999027839A2 (en) * 1997-12-01 1999-06-10 Cosman Eric R Surgical positioning system
US6147749A (en) * 1995-08-07 2000-11-14 Kyoto Daiichi Kagaku Co., Ltd Method and apparatus for measuring concentration by light projection
EP1652470A1 (en) * 2004-10-26 2006-05-03 Hitachi, Ltd. Optical measuring instrument for living body


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10143373B2 (en) 2011-02-17 2018-12-04 Tyto Care Ltd. System and method for performing an automatic and remote trained personnel guided medical examination
EP2675345A1 (en) * 2011-02-17 2013-12-25 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
JP2014518642A (en) * 2011-02-17 2014-08-07 イーオン メディカル リミテッド System and method for performing automatic and self-guided medical examinations
EP2675345A4 (en) * 2011-02-17 2015-01-21 Eon Medical Ltd System and method for performing an automatic and self-guided medical examination
WO2012111012A1 (en) 2011-02-17 2012-08-23 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
CN102670177A (en) * 2011-03-15 2012-09-19 明达医学科技股份有限公司 Skin optical diagnosis device and operation method thereof
CN102670177B (en) * 2011-03-15 2015-05-20 明达医学科技股份有限公司 Skin optical diagnosis device and operation method thereof
WO2014207733A1 (en) * 2013-06-26 2014-12-31 Sensible Medical Innovations Ltd. Controlling electromagnetic (em) transmission based on pressure parameters
CN105517483A (en) * 2013-06-26 2016-04-20 明智医疗创新有限公司 Controlling electromagnetic (EM) transmission based on pressure parameters
US10610100B2 (en) 2013-06-26 2020-04-07 Sensible Medical Innovations Ltd. Controlling electromagnetic (EM) transmission based on pressure parameters
CN105517483B (en) * 2013-06-26 2019-07-23 明智医疗创新有限公司 The method for controlling electromagnetic transmission according to pressure parameter
US10582901B2 (en) 2014-04-24 2020-03-10 Koninklijke Philips N.V. Recognizer of staff or patient body parts using markers to prevent or reduce unwanted irradiation
JP2017534388A (en) * 2014-11-06 2017-11-24 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Skin treatment system
EP3215219A1 (en) * 2014-11-06 2017-09-13 Koninklijke Philips N.V. Skin treatment system
JP2020039894A (en) * 2014-11-06 2020-03-19 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Skin treatment system
CN107072552A (en) * 2014-11-06 2017-08-18 皇家飞利浦有限公司 Skin treatment system
US10722726B2 (en) * 2014-11-06 2020-07-28 Koninklijke Philips N.V. Skin treatment system
EP3215219B1 (en) * 2014-11-06 2024-01-10 Koninklijke Philips N.V. Skin treatment system
WO2018104518A1 (en) 2016-12-08 2018-06-14 Koninklijke Philips N.V. Surface tissue tracking
CN110072435A (en) * 2016-12-08 2019-07-30 皇家飞利浦有限公司 Surface texture tracking
US11071459B2 (en) 2016-12-08 2021-07-27 Koninklijke Philips N.V. Surface tissue tracking
CN110072435B (en) * 2016-12-08 2022-07-19 皇家飞利浦有限公司 Surface tissue tracking
US11571130B2 (en) 2016-12-08 2023-02-07 Koninklijke Philips N.V. Surface tissue tracking

Also Published As

Publication number Publication date
WO2007072356A3 (en) 2007-11-15

Similar Documents

Publication Publication Date Title
WO2007072356A2 (en) Positioning system for patient monitoring sensors
US7452336B2 (en) Interactive neural training device
CN105682537B (en) Automatic Perimeter
US5953102A (en) Method for substantially objective testing of the visual capacity of a test subject
EP2964078B1 (en) System and method for determining vital sign information
KR102199189B1 (en) Acupuncture training system using mixed reality and acupuncture training method thereof
CA2926709C (en) Pressure ulcer detection methods, devices and techniques
EP3307142B1 (en) Apparatus and method for inspecting skin lesions
ES2262423A1 (en) Automatic control system for radiation device used in medical diagnosis, determines true gazing direction of operator of medical device, relative to monitor and accordingly controls activation of pedal
CN107000777B (en) It is present in the system of near steering wheel for vehicle and its position for detecting hand and/or finger
CN106943678A (en) A kind of method and device of automatic radiotherapy position
CN207941171U (en) Body fitness testing system
KR20150118242A (en) Brain damage and cognitive function in the visual perception training system using visual perception system and training method
US20030125638A1 (en) Optical stimulation of the human eye
US20190282082A1 (en) Apparatus and method for capturing a visual field of a person having a scotoma
JP2009276860A (en) Biological pattern imaging device, biological pattern imaging method, and biological pattern imaging program
WO2015119630A1 (en) Vision training method and apparatus
CN102934050B (en) The method and apparatus of option for rendering
KR101089116B1 (en) Patient Position Monitoring Device
US8167813B2 (en) Systems and methods for locating a blood vessel
CN110251074B (en) Multifunctional medical detection system
CN110711357A (en) Walking training guidance system
CA3019931A1 (en) Pressure ulcer detection methods, devices and techniques
RU2522848C1 (en) Method of controlling device using eye gestures in response to stimuli
KR102325431B1 (en) Rehabilitation cure and evaluation system based on information and communication technology using dual-task stacking cone

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06842525

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 06842525

Country of ref document: EP

Kind code of ref document: A2