WO2007072356A2 - Positioning system for patient monitoring sensors - Google Patents

Positioning system for patient monitoring sensors

Info

Publication number
WO2007072356A2
WO2007072356A2 (PCT/IB2006/054854)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
pattern
system according
adapted
positioning system
Prior art date
Application number
PCT/IB2006/054854
Other languages
French (fr)
Other versions
WO2007072356A3 (en)
Inventor
Gerd Lanfermann
Richard D. Willmann
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards Gmbh
Priority date
Filing date
Publication date
Priority to EP05112601.9
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2007072356A2
Publication of WO2007072356A3

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/0059: Detecting, measuring or recording for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062: Arrangements for scanning
    • A61B5/0064: Body surface scanning
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/064: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using markers
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/684: Indicating the position of the sensor on the body
    • A61B5/6842: Indicating the position of the sensor on the body by marking the skin
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90: Identification means for patients or instruments, e.g. tags
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90: Identification means for patients or instruments, e.g. tags
    • A61B90/92: Identification means for patients or instruments, e.g. tags coded with colour
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90: Identification means for patients or instruments, e.g. tags
    • A61B90/94: Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937: Visible markers

Abstract

This invention relates to a positioning system for monitoring sensors or treatment devices which are to be accurately located on the skin of a patient, especially where professional assistance is unavailable, for example in a domestic environment. It comprises a positioning system for a patient monitoring sensor or treatment device with imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.

Description

Positioning system for patient monitoring sensors

This invention relates to a positioning system for monitoring sensors or treatment devices which are to be accurately located on the skin of a patient, especially where professional assistance is unavailable, for example in a domestic environment.

The use of sophisticated sensing and actuating devices for medical purposes has increasingly spread into the unsupervised home environment, in addition to their application in well-supervised patient care facilities such as hospitals. However, it is very difficult for a non-professional user or their carer to utilize such equipment effectively, because accurately reproducible results are required even when sensors are repeatedly removed and replaced.

Accordingly, the present invention provides a positioning system for a patient monitoring sensor or treatment device (hereinafter referred to as "a sensor") comprising imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.

The texture or pattern may consist of a natural pattern such as a pattern of moles or varying skin color, or an artificial pattern, such as can be provided by UV active markers.

The detection means may operate using visible light and/or UV or IR light, which may reveal additional texture information. In operation of the system, each sensor is initially placed in the required position by a medical professional, and the system learns the exact location from the local skin pattern.

Subsequently, when the patient is required to re-position the sensor himself, he places it over the area of the desired location, and the system produces signals instructing him to move or rotate the sensor, based on its recognition of the skin pattern, until the required position over the target location is achieved, and the device can be placed in contact with the skin. Preferably, the system incorporates an inertia sensor in order to ensure that the final placement towards the target position is executed in a consistent way. This ensures that a lay person can attach the sensor at exactly the same position as done previously by a medical professional, thereby significantly improving the quality of data which can be gathered. It also ensures that, when the device has to be utilized over a period of time, during which it is removed and repeatedly reattached, the consistency of positioning will still be maintained.

In addition, under some circumstances a patient may be required to re-attach the sensor in an area where he or she cannot see it directly, for example, on the patient's own back.

The system is applicable to the attachment of various kinds of sensors and transducers, such as capacitive or inductive types, and also to treatment electrodes such as those used for AEDs (Automated External Defibrillators), which require exact positioning of their stimulating pads in order to operate effectively.

One embodiment of the invention will now be described by way of example, with reference to the accompanying drawings, in which:

Figs. 1A to 1E illustrate the process of using skin pattern recognition to control navigation of a sensor type device;

Fig. 2 is a diagrammatic view of a signal interface presented to the user, in order to provide suitable positioning guidance;

Fig. 3 is a diagrammatic view of one practical form of the system;

Fig. 4 is an overall schematic diagram of the modes of operation of the system; and

Fig. 5 is a detailed flowchart of operation in the patient guiding mode.

Referring firstly to Figure 1A, the desired target position for location of a sensor or stimulating pad, for example, is indicated by reference 2, and this is located on an area of the patient's skin which, in the normal way, carries a distinctive pattern of various larger or smaller, lighter or darker areas such as moles. This position is initially determined by a medical professional.

Figure 1B shows the individual, slightly darkened areas of skin, which are identified by a pattern of crosses 4, 6, 8, 10 etc., whose coordinates relative to the target area 2 are memorized in the system, for each sensor, after it has been properly positioned. As shown in Figure 4, this is achieved using the "training mode" of the system, which operates as follows:

(a) The medical professional activates the training mode of the device and places the device above the desired location, e.g. at a distance of 10 cm. The system's camera image is acquired and analyzed for skin texture, e.g. moles, as shown in Fig. 1 (step 40 in Figure 4).

(b) If there is not enough skin texture in the image area, the device instructs the medical professional to increase the distance to the skin or to move the device in a circle around the desired area, so that more area is covered.

(c) The system generates a reference pattern or map that is stored in the system's memory.

(d) The system may use different wavelengths in addition to visible light; UV or IR may reveal additional texture information. In this case, the device is fitted with appropriate light emitting units (LEDs).

(e) The device is lowered by the medical professional onto the patient's skin, and the device tracks the skin texture until the camera opening is obscured by the skin.

(f) From the last distribution of the texture, the valid position is calculated (steps 42, 44 in Figure 4).
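The pattern-learning part of the training mode can be sketched in a few lines of code. This is an illustrative sketch only, not the patent's implementation: it assumes a grayscale image normalized to [0, 1], finds dark spots (candidate moles) by thresholding and flood fill, and stores their centroids relative to the target position. The names `detect_spots` and `learn_reference` are invented for illustration.

```python
import numpy as np

def detect_spots(image, threshold=0.5):
    """Return (row, col) centroids of dark spots in a grayscale image
    normalized to [0, 1]. A simple 4-connected flood fill stands in for
    whatever feature detector a real device would use."""
    mask = image < threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # Collect all connected dark pixels of this spot.
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                                mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                centroids.append(np.mean(pixels, axis=0))
    return np.array(centroids)

def learn_reference(image, target_rc):
    """Training mode: store spot coordinates relative to the target
    position, as in Figure 1B where crosses are memorized relative to
    target area 2."""
    return detect_spots(image) - np.asarray(target_rc)
```

For example, an image with two dark patches yields two stored centroids, each expressed as an offset from the chosen target point.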

When the patient subsequently attempts to locate the sensor in the approximate area of the target 2, in the "guiding mode" of Figure 4, he will typically, at first, place it too far to one side, as indicated in the diagram of Figure 1C, in which it can be seen that only the left-hand group of mole patterns, including areas 6 and 10, can be detected. Accordingly, the system will identify the current skin texture pattern, compare it with the stored reference image, and signal that the sensor should be moved to the right, as indicated by the arrow 12 in Figure 1C.

Similarly, if the patient initially positions the sensor too far below the required target position 2, as indicated in Figure ID, again, only part of the required mole pattern including areas 6 and 8 can be detected by the sensor, so the user will be directed to move it in an upward direction as indicated by the arrow 14 of Figure ID, until the proper position is acquired as indicated in Figure IE, with the target 2 centralized.
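The comparison just described, matching the currently detected pattern against the stored reference to decide which way the sensor should move, can be sketched as follows. This is a simplified illustration, not the patent's algorithm: it assumes the correspondence between stored and detected spots is already known, and the sign convention (camera fixed to the sensor, so moving the sensor right shifts image content left) is an assumption made here. All function names are invented.

```python
import numpy as np

def guidance_offset(reference, current):
    """Mean displacement of the detected spot pattern relative to the
    stored reference, as (row, col). Assumes spots are listed in
    matching order; a real system would first solve correspondence."""
    return np.mean(np.asarray(current) - np.asarray(reference), axis=0)

def instruction(offset, tolerance=1.0):
    """Turn the offset into a user instruction like the arrows of
    Figure 2. Positive column offset means the pattern appears further
    right than when trained, so the sensor must move right to
    re-center it (and analogously for rows)."""
    dr, dc = offset
    if abs(dr) <= tolerance and abs(dc) <= tolerance:
        return "LOWER"  # target centered: activate the 'L' indicator
    if abs(dc) >= abs(dr):
        return "RIGHT" if dc > 0 else "LEFT"
    return "DOWN" if dr > 0 else "UP"
```

In the Figure 1C situation, the visible spots would appear displaced to one side of their stored positions, producing a lateral instruction until the offset falls within tolerance.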

If the user positions the sensor in a region that is not recognized, he may also be instructed to move it further from his skin, or to move it in a circular pattern, to increase the area covered.

Figure 2 illustrates a possible type of user instruction interface (48 in Figure 4) which can be used to give the necessary directional information to the user in the "guiding mode". This will be capable of displaying downwardly or upwardly directed arrows A and B, as well as rightwardly and leftwardly directed arrows such as C and D, to indicate required movement in the corresponding lateral directions.

Of course, it is also possible that the electrode may need to be moved in a specific rotational direction, and for that purpose clockwise and counter-clockwise arrow signs, such as those indicated at F and E respectively, may also be displayable.

Once the device is accurately positioned, a centrally located display element indicated by the letter L in Figure 2 will be activated so as to indicate that the device can be lowered into place.
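The rotational guidance mentioned above can likewise be estimated from the stored pattern. The sketch below is an assumption-laden simplification: it derives the in-plane rotation from the angle of the line joining two corresponding spots (a real system might fit a full rigid transform to all spots), and the mapping of sign to "clockwise" depends on the image coordinate convention chosen here.

```python
import math

def rotation_error(reference, current):
    """In-plane rotation (radians) between the stored pattern and the
    pattern seen now, from the first two spots of each, given as
    (row, col) pairs in matching order."""
    def angle(p, q):
        return math.atan2(q[0] - p[0], q[1] - p[1])
    err = angle(current[0], current[1]) - angle(reference[0], reference[1])
    # Wrap into [-pi, pi) so the shorter rotation direction is chosen.
    return (err + math.pi) % (2 * math.pi) - math.pi

def rotation_instruction(err_rad, tolerance=0.05):
    """Map the error to the F/E arrows of Figure 2. With rows growing
    downward, rotating the sensor in the same sense as the apparent
    pattern rotation cancels it (an assumed convention)."""
    if abs(err_rad) <= tolerance:
        return "OK"
    return "CLOCKWISE" if err_rad > 0 else "COUNTER-CLOCKWISE"
```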

Figure 3 illustrates a practical form of a sensor arrangement of the system in use, in which a skin-attached device 16 includes an integrated imaging device 18 comprising an optical system 20, an image acquisition device 22, and an image processing unit 24. The optical system is arranged to cover a reasonably wide angle of view, as indicated by lines 26 in the Figure, so as to encompass a reasonable number of areas 28 of different coloration in the desired skin area. The imaging device may utilize UV or IR light as well as, or instead of, visible wavelengths, in which case it will be fitted with appropriate light emitting devices such as LEDs. The device also incorporates a contact sensor to confirm when it has reached a contact position with the skin, and an inertia sensor, such as an etched-beam capacitive device, to accurately control the lowering of the device into its final placement in the target position. This ensures that it is not shifted sideways at the last moment before making contact with the skin, or immediately afterwards.

Figure 5 is a detailed flowchart of operation of the device in the "guiding mode". At the start (50) the image is acquired and the detected pattern is analyzed (52). If it cannot be identified (54), the patient is instructed to increase the sensor distance from the skin or move the sensor around (56), and the image acquisition step (52) is repeated. If the pattern is identified, the location and orientation are determined (58) and checked against the desired position and orientation (60). If these are incorrect, the patient is instructed to shift or rotate the sensor (62) and the acquisition step (52) is repeated.

If the pattern is correctly centered and oriented, the patient is instructed to lower the sensor (64), and the signal from the inertia sensor is monitored (66) to determine whether the patient's movement is steady in the lowering direction; if rapid movement is detected, the image acquisition is repeated (52).

If the inertia sensor signal does not detect rapid movement, the system checks for the occurrence of sensor contact with the skin (68); when this occurs, the system determines that the process is complete (70), otherwise the image acquisition process is restarted (52).
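The guiding-mode flowchart of Figure 5 is essentially a loop over acquisition, identification, location check, and contact check. The sketch below renders it as such a loop; every callback name is illustrative, standing in for a hardware or image-processing stage, and is not taken from the patent.

```python
def guiding_mode(acquire, identify, locate, inertia_ok, in_contact, instruct):
    """Loop sketch of the Figure 5 flowchart. Step numbers from the
    figure are noted in comments; callbacks are hypothetical stand-ins
    for camera, image processing, inertia sensor, and contact sensor."""
    while True:
        image = acquire()                      # steps 50/52: acquire image
        pattern = identify(image)              # step 52: analyze pattern
        if pattern is None:                    # step 54: not identified
            instruct("increase distance or move the sensor around")  # 56
            continue
        offset = locate(pattern)               # steps 58/60: check position
        if offset != (0, 0):
            instruct("shift or rotate the sensor")                   # 62
            continue
        instruct("lower the sensor")           # step 64
        if not inertia_ok():                   # step 66: rapid movement?
            continue                           # re-acquire (back to 52)
        if in_contact():                       # step 68: skin contact?
            return "complete"                  # step 70
        # No contact yet: restart acquisition (back to 52).
```

For instance, feeding the loop an identification stage that fails once before succeeding produces first the "move around" instruction and then the "lower" instruction before completing.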

Claims

CLAIMS:
1. A positioning system for a patient monitoring sensor or treatment device (hereinafter referred to as "a sensor") comprising imaging means for detecting a texture or pattern on the skin of a patient to identify a sensor location, an image processing unit which is adapted to learn the required location of a sensor by storing the local texture or pattern, and means for guiding a user to reposition the sensor in the desired location by reference to the stored pattern.
2. A positioning system according to claim 1 in which the texture or pattern comprises a naturally occurring pattern of moles or skin coloration.
3. A positioning system according to claim 1 in which the texture or pattern comprises artificial markers.
4. A positioning system according to any preceding claim in which the imaging means is adapted to use UV or IR wavelengths and includes corresponding light emitting devices.
5. A positioning system according to any preceding claim in which the device is adapted to provide visible signals to the user to guide the sensor to the desired position.
6. A positioning system according to any preceding claim in which the device is adapted to provide audible signals to the user to guide the sensor to the desired position.
7. A positioning system according to any preceding claim which comprises a separate signaling unit adapted to communicate wirelessly with the sensor device.
8. A positioning system according to any preceding claim in which the image processing unit is separated from the image sensor and is adapted to communicate wirelessly with the sensor device.
PCT/IB2006/054854 2005-12-21 2006-12-14 Positioning system for patient monitoring sensors WO2007072356A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP05112601 2005-12-21
EP05112601.9 2005-12-21

Publications (2)

Publication Number Publication Date
WO2007072356A2 (en) 2007-06-28
WO2007072356A3 (en) 2007-11-15

Family

ID=38091206

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/054854 WO2007072356A2 (en) 2005-12-21 2006-12-14 Positioning system for patient monitoring sensors

Country Status (1)

Country Link
WO (1) WO2007072356A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012111012A1 (en) 2011-02-17 2012-08-23 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
CN102670177A (en) * 2011-03-15 2012-09-19 明达医学科技股份有限公司 Skin optical diagnosing apparatus and operating method thereof
EP2675345A1 (en) * 2011-02-17 2013-12-25 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
WO2014207733A1 (en) * 2013-06-26 2014-12-31 Sensible Medical Innovations Ltd. Controlling electromagnetic (em) transmission based on pressure parameters
WO2018104518A1 (en) 2016-12-08 2018-06-14 Koninklijke Philips N.V. Surface tissue tracking
US10143373B2 (en) 2011-02-17 2018-12-04 Tyto Care Ltd. System and method for performing an automatic and remote trained personnel guided medical examination

Citations (6)

Publication number Priority date Publication date Assignee Title
EP0560331A1 (en) * 1992-03-11 1993-09-15 Bodenseewerk Gerätetechnik GmbH Positioning device for a part of the body for therapeutic treatment
EP0898931A2 (en) * 1997-09-01 1999-03-03 Kurashiki Boseki Kabushiki Kaisha Probe positioning method and device therefor
EP0917854A2 (en) * 1997-11-21 1999-05-26 Kurashiki Boseki Kabushiki Kaisha Non-contact non-invasive measuring method and apparatus
WO1999027839A2 (en) * 1997-12-01 1999-06-10 Cosman Eric R Surgical positioning system
US6147749A (en) * 1995-08-07 2000-11-14 Kyoto Daiichi Kagaku Co., Ltd Method and apparatus for measuring concentration by light projection
EP1652470A1 (en) * 2004-10-26 2006-05-03 Hitachi, Ltd. Optical measuring instrument for living body


Also Published As

Publication number Publication date
WO2007072356A3 (en) 2007-11-15

Similar Documents

Publication Publication Date Title
EP0625024B1 (en) Motorized mammographic biopsy apparatus
KR101331655B1 (en) Electronic data input system
EP2210552B1 (en) Apparatus for tracking insertion depth
Barea et al. Wheelchair guidance strategies using EOG
JP6129161B2 System and method for measuring reactions of the head, eyes, eyelids and pupils
EP1800616B1 (en) Computer assisted surgery system with light source
ES2323743T3 (en) Thermometric device and method.
US9980778B2 (en) Instrument having radio frequency identification systems and methods for use
JP6308940B2 (en) System and method for identifying the eye-tracking scene reference position
JP5812986B2 (en) Shape identification acuity evaluation and tracking system
US7033025B2 (en) Interactive occlusion system
US20070275830A1 (en) Gait training system using motion analysis
ES2327633T3 (en) Installation to see and keep an eye and gaze direction thereof.
JP5654595B2 System and measuring method for measuring and/or training the visual ability of a subject
JP5146692B2 System for optical position measurement and guidance of a rigid or semi-flexible needle to a target
US20080312709A1 (en) Wearable medical treatment device with motion/position detection
van Beers et al. How humans combine simultaneous proprioceptive and visual position information
EP1374758A1 (en) Device for detecting measurement data of an eye
US7532201B2 (en) Position tracking device
US6161033A (en) Image guided surgery system
US5668622A (en) Device for measuring the position of the fixing point of an eye on a target, method for illuminating the eye and application for displaying images which change according to the movements of the eye
JP4382171B2 Equipment for mapping the electrical activity of the heart
US20060036163A1 (en) Method of, and apparatus for, controlling medical navigation systems
US5035500A (en) Automated ocular perimetry, particularly kinetic perimetry
US7549743B2 (en) Systems and methods for improving visual discrimination

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06842525

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 06842525

Country of ref document: EP

Kind code of ref document: A2