WO2009141769A1 - Reproducible positioning of sensing and/or treatment devices - Google Patents


Info

Publication number
WO2009141769A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
target location
treatment device
reference image
image
Application number
PCT/IB2009/051998
Other languages
French (fr)
Inventor
Yan Liu
Golo Von Basum
Bastiaan W. M. Moeskops
Calina Ciuhu
Kiran K. Thumma
Original Assignee
Koninklijke Philips Electronics N.V.
Application filed by Koninklijke Philips Electronics N.V.
Publication of WO2009141769A1


Classifications

    • A61B5/6842 Indicating the position of the sensor on the body by marking the skin
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0066 Optical coherence imaging
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/064 Determining position of a probe within the body employing means separate from the probe, using markers
    • A61B5/6843 Monitoring or controlling sensor contact pressure
    • A61B90/90 Identification means for patients or instruments, e.g. tags
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2090/373 Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B2090/3937 Visible markers
    • A61B2090/395 Visible markers with marking agent for marking skin or other tissue
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B5/14532 Measuring characteristics of blood in vivo for measuring glucose, e.g. by tissue impedance measurement
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • G06T2207/30088 Skin; Dermal
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular
    • G06T2207/30204 Marker

Definitions

  • the present invention relates to reproducibly positioning a sensing and/or treatment device on a target location on a body part of a human being or an animal. More particularly, the present invention relates to a repositioning system, a method for making such a repositioning system and a method for positioning a sensing and/or treatment device on a target position on a body part of a human being or an animal by using such a repositioning system.
  • instruments have been developed for minimally invasive measurement of physiological parameters in a human or animal body, such as for example glucose measurements, e.g. based on optical methods. These methods make use of a sensor implanted beneath the skin which is in contact with subcutaneous fluids.
  • the sensor may include gels, particles, liquids which are biodegradable.
  • the biosensor that has to be implanted is small in size, and does not require a complicated or painful insertion below the skin.
  • Non-invasive measurement is the most desirable method for consumers, but uncertainty and inaccuracy have hampered the acceptance of non-invasive tests. There is a strong need in the non-invasive glucose-monitoring market to solve these inaccuracy and unreliability problems.
  • an irreproducible factor is the placement of the measurement device on the skin.
  • the morphology of the skin 1 is different at different locations (see Fig. 1), which leads to variations in the optical properties from site to site. It has been recognised that reproducible placement of the measurement device on a specific location at the skin is an important factor for improving accuracy of the technique used and the results obtained.
  • Another irreproducible factor is the relative position of the sensing device to an implanted minimally invasive microsensor.
  • It is therefore desirable that the measurement device is always located on a same location on the skin for each measurement.
  • a number of solutions have been tried: - Permanently fixing the device to the skin, e.g. by means of stickers. However, it appeared that this could lead to skin irritation.
  • WO 2007/072356 describes a method to find back a same position on the skin of a patient.
  • the main purpose of this document is manual repositioning of a measurement device by lay persons.
  • the position has to be trained once by a professional.
  • Preferred devices are e.g. defibrillators. All operations are performed completely manually.
  • the professional finds a specific position for 'train mode'.
  • the subject has to replace the measurement device, also referred to as probe in 'guide mode'.
  • Image acquisition is done by a camera at a distance of about 10 cm from the measurement site. After finding a correct spot, the probe is lowered to the skin. There is no update of the specific position during lowering of the measuring device, e.g. sensor, towards the measurement site on the skin.
  • a device and method according to embodiments of the present invention allow reproducibly positioning a sensing and/or treatment device on a same measurement site on a body part of a human being or animal. Therefore, a device and method according to embodiments of the present invention allow reproducibly measuring physiological parameters at a same measurement site on the body part of the human being or animal without the need for permanently fixing at least part of a device to the skin. This may be important for, for example, patients for whom on a regular basis, e.g. every day or a few times a day, a same measurement has to be performed, e.g. glucose monitoring in case of diabetes patients. In these cases it may be important to always perform the measurement at a same target location in order to obtain comparable measurement results.
  • a device and method according to embodiments of the present invention may provide an accuracy of better than 0.5 mm, i.e. by using a device and method according to embodiments of the present invention a sensing device may be positioned and repositioned at substantially the same target location showing a deviation of less than 0.5 mm, for example a deviation of less than 0.3 mm, less than 0.1 mm, or less than 0.01 mm.
  • a device and method according to embodiments of the invention are applicable to and suitable for being used with various minimally invasive or non-invasive measuring devices, and are particularly suitable for, but not limited to, being used with minimally invasive or non-invasive measuring devices which use optical detection.
  • the above objective is accomplished by a method and device according to the present invention.
  • the present invention provides a repositioning system for positioning a sensing and/or treatment device at a target location.
  • the repositioning system comprises an imaging system for identifying the target location having a marker on a body part of a human being or an animal, thereby obtaining a reference image of the target location, and for acquiring at least one measured image of the target location.
  • the imaging system furthermore comprises: registration means for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image, and signal generating means for generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value.
  • the repositioning system furthermore comprises actuator means for driving a sensing and/or treatment device to guide it towards the target location thereby using the driving signal.
  • By target location direction is meant the direction in which a sensing and/or treatment device has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter has to be performed or some kind of treatment is to be carried out.
  • the desired location can be one where there is a subcutaneous microsensor implanted.
  • the imaging target location direction refers to both a direction of spatial displacement and a change of orientation of the sensing and/or treatment device.
  • a repositioning system according to embodiments of the present invention provides a means which is applicable and suitable for various minimally invasive or non-invasive measurements, such as e.g. minimally invasive or non-invasive daily glucose monitoring.
  • the repositioning system may furthermore comprise the sensing and/or treatment device.
  • the repositioning system may furthermore comprise a control unit comprising a memory for storing the at least one reference image.
  • the control unit may furthermore comprise an algorithm for registering and comparing the at least one measured image to the reference image.
  • the imaging system and the sensing and/or treatment device may be incorporated in one common device.
  • An advantage hereof is easier handling of the repositioning system.
  • the imaging system and the sensing and/or treatment device may be formed by separate devices.
  • the repositioning system may furthermore comprise a user interface for providing feedback to a user for guiding the sensing and/or treatment device towards the target location.
  • the repositioning system may furthermore comprise a marker means for providing a marker on the target location and an ink reservoir for providing the marker means with ink.
  • the imaging system may furthermore comprise a radiation beam generating means for generating a radiation beam, and the sensing and/or treatment device may furthermore comprise a radiation beam detection means for detecting the radiation beam.
  • the imaging system may comprise at least one of charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal scanning laser microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound.
  • the marker may comprise at least one of an artificial marker, a natural marker, a sub-surface natural marker or a sub-surface artificial marker.
  • the sub-surface artificial marker may be a subcutaneous microsensor.
  • the repositioning system may furthermore comprise a pressure and/or suction system comprising at least one pressure and/or suction element.
  • the target location may be adapted in shape to the shape of the sensing and/or treatment device or to the volume of the subcutaneous measurement and/or treatment site (e.g. that contains a microsensor), such that an optimal contact may always be achieved between the sensing and/or treatment device and the target location, or an optimal contact may be achieved with a region of the skin above a volume containing a microsensor. This increases accuracy of the measurement performed after positioning of the sensing and/or treatment device on the target location.
  • the present invention provides a method for making a repositioning system.
  • the method comprises: providing an imaging system for acquiring at least one reference image and at least one measured image, the imaging system comprising a registration means for registering the at least one measured image to the reference image and signal generating means for generating a driving signal indicative of the target location direction, and providing actuator means for driving a sensing and/or treatment device to guide it towards the target location thereby using the driving signal.
  • the method may furthermore comprise providing a control unit comprising a memory for storing the at least one reference image.
  • the method may furthermore comprise providing a sensing and/or treatment device.
  • providing an imaging system and providing a sensing and/or treatment device may be performed by providing an imaging system and a sensing and/or treatment device incorporated in one common device.
  • providing an imaging system and providing a sensing and/or treatment device may be performed by providing a separate imaging system and a separate sensing and/or treatment device.
  • the method may furthermore comprise providing a user interface for providing feedback to a user for guiding the sensing and/or treatment device towards the target location.
  • the method may furthermore comprise: providing a radiation beam generating means to the imaging system for generating a radiation beam, and providing a radiation beam detection means to the sensing and/or treatment device for detecting the radiation beam.
  • the present invention provides a repositioning system formed by the method according to embodiments of the present invention.
  • the present invention provides a method for positioning a sensing and/or treatment device on a target location on a body part of a human being or an animal. The method comprises: identifying the target location having a marker on or in the body part by acquiring at least one reference image, acquiring at least one measured image of the target location, determining spatial alignment between the at least one measured image and the reference image, generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value, and driving the sensing and/or treatment device with the driving signal to guide it towards the target location.
  • a method according to embodiments of the invention allows registering with an accuracy comparable with pixel size, e.g. to an accuracy of about 0.1 mm. Therefore, a method according to embodiments of the invention allows reproducibly positioning a sensing and/or treatment device to a target location on a human body.
  • the method may furthermore comprise storing the reference image in a memory device of a control unit.
  • the method may furthermore comprise, before driving the sensing and/or treatment device with the driving signal, providing feedback to a user interface incorporated in the sensing and/or treatment device for giving instructions to a user on how to move and/or rotate the sensing and/or treatment device.
  • Acquiring at least one reference image and acquiring at least one measured image of the target location may be performed by using at least one of charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI) or confocal scanning laser microscopy (CLSM).
  • the present invention provides a controller for controlled driving of a sensing and/or treatment device towards a target location on a body part of a human being or an animal.
  • the controller comprises a control unit for controlling actuator means adapted for driving the sensing and/or treatment device.
  • the control unit may comprise: a memory for storing a reference image, and a computing unit for registering and comparing measured images to the reference image and for determining a driving signal for driving the sensing and/or treatment device.
  • the present invention also provides, in a further aspect, a computer program product for performing, when executed on a computing means, a method according to embodiments of the invention.
  • the present invention also provides, in a further aspect, a machine readable data storage device storing the computer program product according to embodiments of the invention.
  • the present invention also provides, in a further aspect, transmission of the computer program product according to embodiments of the invention over a local or wide area telecommunications network.
  • Fig. 1 schematically shows skin composition of part of a body of a human being or animal
  • Fig. 2 schematically illustrates a repositioning system according to embodiments of the present invention
  • Fig. 3 to Fig. 5 illustrate particular steps of a method according to embodiments of the present invention
  • Fig. 6 illustrates an embodiment of an algorithm for use with a method according to embodiments of the present invention
  • Fig. 7 illustrates recordings of spatial alignment with respect to the target as defined by the reference image
  • Fig. 8 to Fig. 19 schematically illustrate different implementations of a repositioning system and a method according to embodiments of the present invention
  • Fig. 20 schematically illustrates a system controller for use with a driver system according to embodiments of the present invention
  • Fig. 21 is a schematic representation of a processing system as can be used for performing a method according to embodiments of the present invention.
  • an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
  • the present invention provides a repositioning system for a sensing and/or treatment device on a target location on a body part of a human being or an animal, a method for making such a repositioning system and a method for positioning a sensing and/or treatment device on a target position on a body part of a human being or an animal by using such a repositioning system.
  • Minimally invasive glucose monitoring can also be used to control glucose levels.
  • Minimally invasive techniques can rely on optical means for receiving signals from a subcutaneous microsensor in contact with subcutaneous body tissue and fluids. The analytes present in the blood can be determined by analysing the light returned from the microsensor. Near infrared spectroscopy has also proved to be a promising method for determination of the glucose levels minimally invasively.
  • the term 'minimally invasive', as used for example in the phrase 'minimally invasive measurement of physiological parameters in a human or animal body', includes those methods where there is a minor level of invasion.
  • An example is where the measurement itself is non- invasive but the determination of the analyte is done with the help of an implanted microsensor, e.g. a subcutaneous microsensor.
  • the sensor is implanted beneath the skin which is in contact with subcutaneous fluids.
  • the sensor may include gels, particles, liquids which are biodegradable.
  • the biosensor that has to be implanted is small in size, and does not require a complicated or painful insertion below the skin.
  • the microsensor comprises an assay such as for example for the determination of glucose, e.g. based on optical methods.
  • the measurement of glucose through spectroscopy can be made by a change in the absorption of light according to the absorption and scattering properties of minimally invasive microsensors, or to the change in light emitted or reflected from such microsensors located below the skin.
  • Such methods using microsensors may include, for example, observing fluorescence of microcapsules (e.g. the microcapsules can be polyelectrolyte microcapsules), or detecting glucose using boronic acid-substituted viologens in fluorescent hydrogels, in which a fluorescent anionic dye and a viologen appended to boronic acid, which serve as glucose receptors, are immobilised into a hydrogel, the fluorescence of the dye being modulated by the quenching efficiency of the viologen-based receptor, which is dependent upon the glucose concentration, as well as other methods.
  • a device and method according to embodiments of the present invention allow reproducibly positioning a sensing and/or treatment device on a same measurement site on a body part of a human being or animal. Therefore, a device and method according to embodiments of the present invention allow reproducibly measuring physiological parameters at a same measurement site on the body part of the human being or animal without the need for permanently fixing at least part of a device to the skin.
  • a device and method according to embodiments of the present invention may provide an accuracy of better than 0.5 mm, i.e. a sensing and/or treatment device may be positioned and repositioned at substantially the same target location showing a deviation of less than 0.5 mm, for example a deviation of less than 0.3 mm, less than 0.1 mm or less than 0.01 mm.
  • a device and method according to embodiments of the invention are applicable to and suitable for being used with various minimally invasive or non-invasive measuring devices, such as minimally invasive or non-invasive glucose measuring devices, and are particularly suitable for, but not limited to, being used with minimally invasive or non-invasive measuring devices which use optical detection.
  • Further applications of a repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the present invention may include measurements of skin properties such as e.g. skin cancer, skin aging, etc., for example by means of light.
  • a device and method according to embodiments of the present invention may be applied with sensing methods and/or with treatment methods known by a person skilled in the art.
  • a device and method according to embodiments of the present invention may generally be applied to sensing methods which benefit from reproducibly placing the sensor at a same position.
  • Such sensing methods may, for example, comprise ultrasound measurements, temperature sensing, pressure sensing, measurements using parts of the electromagnetic spectrum (such as e.g. optical, microwave, radiowave methods), skin impedance/resistance measurements and capacitance measurements, and flux measurements of compounds (such as Trans Epidermal Water Loss).
  • a device and method according to embodiments of the present invention may be for performing a treatment at a reproducible position, such as heat or light treatment for hair removal, treatment of skin disorders by e.g. light, skin rejuvenation, injection of medication, implanting or tissue sampling such as e.g. taking a biopsy.
  • embodiments of the invention provide a repositioning system 10 comprising an imaging system 2.
  • the imaging system 2 is suitable for identifying an imaging target location having a marker on or in a body part of a human being or an animal, thereby obtaining a reference image of the imaging target location, and for acquiring at least one measured image of the imaging target location.
  • This imaging target location is identified as the target location at which subsequent measurement of a physiological parameter and/or treatment has to be performed by means of a sensing and/or treatment device.
  • a plurality of subsequently measured images may be acquired. In the latter case, at least two of the subsequently measured images may be measured under different circumstances, e.g. they may be acquired for different positions and/or under different angles and/or under different illumination conditions.
  • the terms imaging target location and target location may be used interchangeably. It has to be understood that these terms are intended to indicate a same location, i.e. the location at which measurement of a physiological parameter and/or treatment has, repeatedly or not, to be performed.
  • the imaging system 2 furthermore comprises: registration means 3a adapted for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image, and signal generating means 3b adapted for generating a driving signal 5 indicative of the imaging target location direction when the spatial alignment is not approached or reached within a predetermined threshold value.
  • the repositioning system furthermore comprises actuator means for driving the sensing and/or treatment device to guide it towards the target location thereby using the driving signal.
  • the driving signal 5 may be generated until spatial alignment is obtained, between the at least one measured image and the reference image, or until spatial alignment is approached or reached within a predetermined threshold value. Hence, the driving signal 5 may be generated until the spatial alignment is approached or reached within the predetermined threshold value. In other words, the driving signal 5 may be generated until the sensing and/or treatment device arrives at the imaging target location or in a predetermined neighbourhood thereof.
  • By imaging target location direction is meant the direction in which a sensing and/or treatment device has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter and/or treatment has to be performed. It has to be noted that the imaging target location direction refers to both a direction of spatial displacement and a change of orientation of the sensing and/or treatment device.
  • a desired imaging target location has to be defined and stored. This is the role of the reference image. Different factors can influence the choice of such a target location. It can be defined by a sensing and/or treatment system used to perform the measurement of the physiological parameter and/or to perform treatment, by recording an image just before, during or just after performing the sensing and/or treatment action.
  • a 'reference image' has similar characteristics as a 'measured image', except that the 'reference image' was taken at an earlier time, and specifies the desired imaging target location of the repositioning system.
  • the 'reference image' can be static, specified only at initial use or during a position calibration measurement.
  • the 'reference image' can be dynamic, and can be updated with every sensing and/or treatment action.
  • the first case might be sensitive to changes of the target area over time (e.g. in case of aging and/or injury), while the latter should allow adapting to gradual changes.
  • At least two images are to be acquired.
  • One is the 'reference image', specifying and identifying the target location.
  • the second is the 'measured image', specifying the current state of the system, including measured position and orientation.
  • the availability of these two images may be sufficient to calculate a transformation between the two, resulting in alignment. A better accuracy and reliability might be obtained if a continuous stream of 'measured images' is available.
  • the repositioning system 10 furthermore comprises actuator means 4 generating, based on the driving signal 5, actuation signals 7 for driving the sensing and/or treatment device 6 to guide it towards the imaging target location.
  • the repositioning system 10 allows reproducibly positioning a sensing and/or treatment device 6 on a same target location. This may be desired for, for example, patients for whom on a regular basis, e.g. every day or a few times a day, a same measurement has to be performed, for example glucose monitoring in case of diabetes patients. In these cases it may be important to always perform the measurement at a same target location in order to obtain comparable measurement results.
  • the sensing and/or treatment device 6 and the imaging system 2 may be comprised in a same device.
  • the sensing and/or treatment device 6 may be part of the repositioning system 10 itself.
  • there may be a fixed positional relationship between the imaging system 2 and the sensing and/or treatment device 6 (see further with respect to Fig. 8). This fixed relationship can be limited to parameters included in the movement of the sensing and/or treatment device 6, also referred to as transformation. If a 2D position is reproducibly desired, and only XY is calculated (see co-ordinate system further in Fig. 8), then at least the XY relationship should be fixed.
  • the Z distance between imaging system 2 and sensing and/or treatment device 6 may still be flexible, since this parameter is not part of the calculated transformation.
  • the imaging system 2, e.g. a camera, may also have the function of the sensing and/or treatment device 6.
  • the sensing and/or treatment device 6 may be separated from the imaging system 2 and may, in that case, not be part of the repositioning system 10 (see further with respect to Fig. 9). In that case, no fixed positional relationship exists between the imaging system 2 and the sensing and/or treatment device 6. It has to be noted that, according to embodiments of the invention, the target location at which the sensing and/or treatment device 6 eventually arrives may, within a predefined range, be different from the target location of the imaging system 2, i.e. the target location which was originally set.
  • the imaging system 2 is adapted for acquiring at least one reference image and for acquiring at least one measured image.
  • the imaging system 2 may have a high spatial resolution of tens of µm.
  • the imaging system 2 may be able to detect structures with sizes of between tens of µm and tens of cm, for example sizes between 0.01 mm and 10 cm, e.g. between 0.1 mm and 10 cm.
  • These structures may also be referred to as markers.
  • markers may for example be, but are not limited to, an artificial marker provided on the skin of the body such as e.g. a stamp, a natural marker such as e.g. skin texture or colours, a sub-surface natural marker such as vessels or skin structures, or a sub-surface artificial marker such as an implanted marker, for example a subcutaneous microsensor as used in minimally invasive testing.
  • In case of artificial markers such as e.g. stamps or a microsensor, these can be visible or invisible to the human eye.
  • Artificial markers may be provided by e.g. drawing a mark onto the target location. This may, for example, be done by hand by a user, a doctor or a nurse or may be done by use of the repositioning system 10 itself which may then comprise a built-in marker means 23 (see Fig. 23 and corresponding description).
  • In case of sub-surface markers, these markers are located below skin level, which makes them less sensitive to changes in time and less exposed to environmental influences like hydration or sweating compared to markers that are provided on the outside of the skin.
  • sub-surface natural markers may be sweat glands, hair follicles or capillaries.
  • the imaging system 2 may use any suitable known imaging technique such as for example charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal laser scanning microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound.
  • the repositioning system 10 furthermore comprises registration means 3a for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image.
  • the degree of alignment can be calculated from an image registration algorithm, such as e.g. Scale Invariant Feature Transform (SIFT).
  • unique features are identified between the measured image and the reference image. If features present in the measured image are also present in the reference image, then the necessary transformation between the two images, or in other words the required movement of the sensing and/or treatment device 6 in order to arrive at the imaging target location, can be calculated using affine transformations.
  • In a closed-loop system, using a stream, i.e. a plurality of subsequently measured images, repositioning is completed when the calculated values for the necessary transformation or movement have decreased below the set target (e.g. 0.1 mm, 0.3 mm or 0.5 mm).
  • In case only a single measured image is acquired, the repositioning system 10 only executes the transformation as calculated from the comparison between this one measured image and the reference image.
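  • By way of illustration only (not part of the original disclosure), the feature-based registration step described above could be sketched in Python with OpenCV as shown below; the function name, the partial affine model (rotation, uniform scale and translation) and the 32 pixels-per-mm scale are assumptions made for the sketch, echoing the SIFT and affine-transformation approach named in the text.

```python
# Illustrative sketch only: feature-based registration (SIFT + partial affine),
# approximating the role of registration means 3a / signal generating means 3b.
# Assumes OpenCV >= 4.4 and two grayscale images of the same skin site.
import cv2
import numpy as np

PIXELS_PER_MM = 32.0  # assumed camera resolution, cf. the 32 px/mm figure quoted later

def estimate_driving_signal(reference_img, measured_img):
    """Return (dx_mm, dy_mm, rotation_deg) needed to map the measured image onto
    the reference image, or None if too few reliable features were matched."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference_img, None)
    kp_mes, des_mes = sift.detectAndCompute(measured_img, None)
    if des_ref is None or des_mes is None:
        return None

    # Match descriptors and keep only unambiguous matches (Lowe's ratio test).
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_mes, des_ref, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    if len(good) < 4:
        return None

    src = np.float32([kp_mes[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Rotation + uniform scale + translation mapping measured -> reference.
    M, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None

    dx_mm = M[0, 2] / PIXELS_PER_MM
    dy_mm = M[1, 2] / PIXELS_PER_MM
    rotation_deg = float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))
    return dx_mm, dy_mm, rotation_deg

# In a closed-loop setup this would be re-evaluated on every new measured image, and
# repositioning would be considered complete once, for example,
# (dx_mm**2 + dy_mm**2) ** 0.5 drops below the set target (0.1, 0.3 or 0.5 mm).
```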
  • a driving signal 5 is generated by means of a signal generating means 3b and this driving signal 5 is sent to an actuator means 4 where it is used to provide actuation signals 7 to the sensing and/or treatment device 6 for moving of the sensing and/or treatment device 6 towards the target location.
  • the actuator means 4 may, for example, comprise XY-stages, Z-stages or rotation stages, which may use DC motors, stepper motors, voice coils or may comprise any other suitable actuator known by a person skilled in the art.
  • the actuator means 4 may obtain, e.g. receive or generate, a signal representative of the target location direction, which may aid in determining the direction of moving of the sensing and/or treatment device 6. Details on the method and the use of the repositioning system 10 will be described further in the description.
  • the sensing and/or treatment device 6 may be any suitable sensing and/or treatment device known by a person skilled in the art, and may for example be a spectroscopic sensing device suitable for performing minimally invasive or non-invasive measurement of glucose.
  • the actuator means 4 may be part of the imaging system 2, or may be part of the sensing and/or treatment device 6 or may be a separate device (as is illustrated schematically in Fig. 2). The actuator means 4 must be placed such that it moves the sensing and/or treatment device 6 relative to the skin.
  • the imaging system 2 can also be part of the actuated system, i.e. it may be driven by the driving signal 5 together with the sensing and/or treatment device 6.
  • Alternatively, the imaging system 2 is not actuated, i.e. is not driven by the driving signal 5, and remains stationary.
  • the sensing and/or treatment device 6 may furthermore comprise a user interface such that the sensing and/or treatment device 6 can first be guided roughly towards the target location by a user and can then be fine-tuned by the actuator means 4 which further accurately guides the sensing and/or treatment device 6 towards the target location (see further).
  • the user interface may comprise indication means for indicating the direction in which the sensing and/or treatment device 6 has to be moved and/or rotated.
  • the repositioning system 10 may furthermore comprise a built-in marker means such as e.g. a stamp for providing a marker to the target location and a reservoir comprising a substance, e.g. an ink reservoir, connected to the marker means for providing the substance, e.g. ink to the marker means (see further).
  • the marker can be provided in the same handling as the positioning of the sensing and/or treatment device 6 at the target location.
  • the imaging system 2 may furthermore comprise radiation beam generating means such as e.g. a laser 25 for generating a radiation beam, e.g. laser beam.
  • the sensing and/or treatment device 6 may furthermore comprise radiation beam detection means, e.g. a photodiode 26, for detecting the radiation beam, e.g. laser beam and in that way providing information on the target location (see further).
  • the radiation beam generating means, e.g. laser 25, may be used to identify the target location in comparison with the reference image.
  • an image processing algorithm locates the imaging target location within the current field of view of the imaging system 2. The target location can then be illuminated with the radiation beam generating means, e.g. the laser 25.
  • the sensing and/or treatment device 6 is then moved towards the target location. When it arrives at the target location, it will detect the radiation beam, e.g. laser beam by means of the radiation beam detection means, e.g. a photodiode 26. Upon detection of the radiation beam, e.g. laser beam, it may be decided to stop movement of the sensing and/or treatment device 6.
  • the radiation beam detection means such as e.g. a photodiode 26, may be for detecting the radiation beam, e.g. laser beam, when the target location is reached and for communicating this to the user in some suitable way.
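  • As an illustrative sketch only, the laser/photodiode variant described above reduces to a stop-on-detection loop; the two callables below are hypothetical interfaces standing in for the actuator means 4 and the photodiode 26, not an API from the disclosure.

```python
# Minimal sketch (hypothetical hardware interfaces) of the laser/photodiode variant:
# the imaging system keeps the laser 25 pointed at the target location, and the
# sensing and/or treatment device 6 is moved until its photodiode 26 sees the beam.
def guide_with_laser(photodiode_detects_beam, step_towards_target, max_steps=500):
    """photodiode_detects_beam: callable -> bool, True when the laser spot is detected.
    step_towards_target: callable performing one small actuator move in the direction
    of the illuminated spot. Both are assumed, application-specific interfaces."""
    for _ in range(max_steps):
        if photodiode_detects_beam():
            return True        # target location reached; movement may be stopped
        step_towards_target()  # one actuator increment towards the laser spot
    return False               # target not reached within the allowed number of steps
```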
  • the repositioning system 10 may furthermore comprise a pressure and/or suction system comprising at least one pressure and/or suction element for providing a particular pressure to the surface at the target location on the skin of a human being or an animal for adapting the target location in shape to the shape of the sensing and/or treatment device 6. In that way, an optimal contact may be achieved between the sensing and/or treatment device 6 and the target location. This increases accuracy of the measurement performed after positioning of the sensing and/or treatment device 6 on the target location.
  • the present invention provides a method for positioning a sensing and/or treatment device 6 on a target location on a body part of a human being or an animal by using a repositioning system 10 according to embodiments of the invention as described above.
  • the method comprises: - identifying a target location having a marker on or in the body part by acquiring at least one reference image, acquiring at least one measured image of the target location, registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image, generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value, and driving the sensing and/or treatment device 6 based on the driving signal to guide it towards the target location.
  • By target location direction is meant the direction in which a sensing and/or treatment device 6 has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter and/or treatment has to be performed.
  • At least two images are to be acquired.
  • One is the 'reference image', specifying and identifying the target location.
  • the second is the 'measured image', specifying the current state of the system, including measured position and orientation. The availability of these two images may be sufficient to calculate a transformation between the two, resulting in alignment.
  • a better accuracy and reliability might be obtained if a continuous stream of 'measured images' is available, i.e. when a plurality of subsequently measured images is acquired.
  • at least two of the subsequently measured images may be measured under different circumstances, e.g. they may be acquired for different positions and/or under different angles and/or under different illumination conditions. Acquisition of a plurality of images would allow a closed loop system, which can compensate for disturbances during the alignment procedure. If for example, a patients arm would shift during alignment, this would only be noticed when 'measured images' are available that show this disturbance, and allow the system to compensate.
  • a method according to embodiments of the invention allows registering the at least one measured image to the reference image with an accuracy comparable with pixel size, e.g. to an accuracy of about 0.1 mm. Therefore, the method according to embodiments of the invention allows reproducibly positioning a sensing and/or treatment device at a target location on a body part of a human being or an animal.
  • an imaging target location having a marker is identified on or in a body part of a human being or an animal.
  • the marker may, for example but not limited thereto, comprise an artificial marker provided on the skin of the body such as e.g. a stamp, or a smart tattoo, a natural marker such as e.g. skin texture or colours, a sub-surface natural marker such as a vessel or skin structure or a sub-surface artificial marker such as an implanted marker.
  • the implanted marker may be a subcutaneous microsensor. In case of using artificial markers such as e.g. stamps or microsensors, these can be visible or invisible to the human eye. Artificial markers may be provided by e.g. drawing a mark onto the target location. This may, for example, be done by hand by a user, a doctor or a nurse, or may be done by use of the repositioning system itself, which may then comprise a built-in marker means as described above.
  • Artificial markers may also be provided subcutaneously, e.g. by inserting or implanting a microsensor at the target location. In case of sub-surface markers, such markers are located below skin level, which makes them less sensitive to changes in time and to environmental influences like hydration or sweating compared to markers that are provided on the outside of the skin. Examples of sub-surface natural markers may be sweat glands, hair follicles or capillaries. An example of a sub-surface artificial marker is a microsensor.
  • Identification of the imaging target location having a marker is obtained by acquiring at least one reference image.
  • one reference image may be acquired.
  • the method may comprise acquiring a reference map comprising multiple reference images.
  • the method may comprise acquiring a larger reference image taken by a separate imaging device or at a different distance than the following subsequently measured images (see hereinafter).
  • the imaging system 2 may comprise two different imaging devices. The two different imaging devices may operate at the same or different frequencies or range of light frequencies.
  • At least one measured image is acquired.
  • a plurality of subsequently measured images may be acquired.
  • at least two, and in embodiments of the present invention optionally each of the subsequently measured images may be measured under different circumstances, i.e. they may be acquired for different positions and/or under different angles and/or under different illumination conditions.
  • Acquiring the reference image and the at least one measured image may be performed using any known suitable imaging technology such as e.g. charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal laser scanning microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound.
  • affine image registration techniques may be used to register the at least one measured image to the reference image.
  • any other suitable registration technique known by a person skilled in the art may be used.
  • a survey of image registration methods is given in "Image registration methods: a survey", Barbara Zitová, Jan Flusser, Image and Vision Computing 21 (2003) 977-1000.
  • Feature detection can be based on area-based methods or feature-based methods. Feature-based methods may analyse region features, line features or point features.
  • Matching of features in the reference image with those in the measured image can be done using area-based methods, such as correlation-like methods, Fourier methods, mutual information methods or optimization methods. Also, feature matching can be accomplished using feature-based methods, such as using spatial relations, invariant descriptors, relaxation methods or pyramids and wavelets.
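  • Purely as an illustration of the area-based (Fourier) option mentioned above, the sketch below estimates the translation between a reference image and a measured image with phase correlation; it only covers the translation case (rotation and scale would still require an affine or feature-based method), and the function name is an assumption.

```python
# Illustrative sketch of an area-based (Fourier) matching method: phase correlation.
# Recovers the integer-pixel translation of `measured` relative to `reference`.
import numpy as np

def phase_correlation_shift(reference, measured):
    """Both inputs: 2-D float arrays of identical shape (grayscale images)."""
    rows, cols = reference.shape
    cross_power = np.fft.fft2(measured) * np.conj(np.fft.fft2(reference))
    cross_power /= np.abs(cross_power) + 1e-12           # keep phase information only
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Peaks in the upper/right half correspond to negative shifts (FFT wrap-around).
    if dy > rows // 2:
        dy -= rows
    if dx > cols // 2:
        dx -= cols
    return dy, dx   # shift of the measured image w.r.t. the reference, in pixels
```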
  • The accuracy of a method according to embodiments of the invention will be demonstrated by means of Fig. 3, Fig. 4 and Fig. 5. It has thus to be noted that these figures are not intended to illustrate the method itself, but are intended to indicate the accuracy of the method.
  • Fig. 3 shows an example of acquisition of a reference image 13 and subsequently measured images 14.
  • the reference image has been rotated by -90°. This is only for the ease of illustration and this rotation is not intended to be part of the method according to embodiments of the invention.
  • fourteen images of a body part 11 of a human being were acquired, i.e. one reference image 13 (indicated by the dashed square) and thirteen subsequently measured images 14.
  • the subsequently measured images 14 were measured under different circumstances, e.g. under different positions and/or under different angles and/or under different illumination conditions.
  • the target location had an artificial marker 12, e.g. a marker 12 had been drawn on the body part 11. All images were acquired with a resolution up to 3-pixel accurate repositioning (current resolution is 32 pixels per mm).
  • In a next step, the thirteen subsequently measured images 14 were registered to the reference image 13.
  • affine image registration techniques have been used to register the subsequently measured images 14 to the reference image 13.
  • image processing/registration tools from Matlab have been employed.
  • the result after registration of the subsequently measured images 14 to the reference image 13 is shown in Fig. 4. These images may be referred to as registered images.
  • Fig. 5 is a sum or combination of all registered images shown in Fig. 4. This Fig. 5 shows the overlap of the registered images shown in Fig. 4 and gives an indication of the registration accuracy by calculated resolution.
  • a method according to embodiments of the invention allows registering at an accuracy comparable to pixel size, which is 0.1 mm. Again it is indicated that the above description with respect to Fig. 3, Fig. 4 and Fig. 5 is only to indicate accuracy of a method according to embodiments of the invention and is not intended to illustrate the method according to embodiments of the invention itself.
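  • As a check added here for concreteness (not part of the original text), the 3-pixel repositioning figure quoted with Fig. 3, at the stated resolution of 32 pixels per mm, corresponds to

$$\frac{3\ \text{pixels}}{32\ \text{pixels/mm}} \approx 0.094\ \text{mm} \approx 0.1\ \text{mm},$$

which is consistent with the roughly 0.1 mm registration accuracy mentioned above.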
  • a driving signal 5 indicative of the target location direction is determined and the sensing and/or treatment device 6 is driven based on that driving signal 5 to guide it towards the target location.
  • the method according to embodiments of the invention allows for reproducibly placing or positioning a sensing and/or treatment device 6 onto a target location on a body part of a human being or animal by using feedback from the imaging system 2. This may be important for, for example, patients for whom on a regular time basis, e.g. every day or a few times a day, a same measurement has to be performed.
  • the sensing and/or treatment device 6 is guided towards the imaging target location based on the driving signal 5 obtained from registering at least one measured image 14 to at least one reference image 13.
  • Actuator means 4 may be used for generating actuation signals 7 for driving the sensing and/or treatment device 6 based on the driving signal 5.
  • guiding the sensing and/or treatment device 6 towards the target location may be completely actuator based, i.e. the sensing and/or treatment device 6 may be guided towards the target location via actuator means 4 that is directly driven by the driving signal 5.
  • guiding the sensing and/or treatment device 6 towards the target location can be done in a two-step manner, where a first step is movement of the sensing and/or treatment device 6 by a user, for example visually or according to feedback given by means of a user interface, and a second step then is fine adjustment, e.g. via actuator means 4 that is directly driven by the driving signal 5.
  • Fig. 6 gives an example of an algorithm for using the imaging system 2 to find back the target location and using it to drive the sensing and/or treatment device 6 towards that target location.
  • A target location having a marker 12 is identified by acquiring at least one reference image 13 (see step 20 in Fig. 6). This step is also referred to as position registration.
  • The reference image 13 is then stored in a memory device (step 30 in Fig. 6).
  • A body part is imaged by acquiring at least one measured image 14 (step 40 in Fig. 6).
  • The at least one measured image 14 is then registered to the at least one reference image 13 to determine spatial alignment between the at least one measured image and the reference image (step 50 in Fig. 6).
  • If spatial alignment between the at least one measured image and the reference image is not approached or reached within a predetermined threshold value (step 60 in Fig. 6), the sensing and/or treatment device 6 may be rotated and/or moved based on a driving signal 5 applied to actuator means 4 (step 70 in Fig. 6), the sensing and/or treatment device 6 being guided towards the target location, and steps 50 and 60 may be repeated. If spatial alignment between the at least one measured image and the reference image is approached or reached within the predetermined threshold value, a driving signal 5 indicative of "target location reached" is sent to the sensing and/or treatment device 6. The sensing and/or treatment device 6 is loaded (step 80 in Fig. 6) and performs a required measurement such as e.g. non-invasive monitoring of the amount of glucose in the blood of the user (step 90 in Fig. 6).
  • A new marker 12 may be printed on the target location and a new position registration may be performed (step 100 in Fig. 6), which then again may be stored in the memory device (step 30 in Fig. 6). The complete loop is sketched in code after this item.
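  • The sketch below is a minimal, hypothetical rendering of the algorithm of Fig. 6 as a control loop. The `imaging`, `device` and `registrar` objects stand in for the imaging system 2, the sensing and/or treatment device 6 with its actuator means 4, and an image registration routine; their method names are assumptions, and `driving_signal` refers to the earlier sketch.

```python
def repositioning_loop(imaging, device, registrar, threshold_mm=0.1, max_iter=100):
    reference = imaging.acquire()                # step 20: position registration of the marked target
    imaging.store_reference(reference)           # step 30: keep the reference image in memory
    for _ in range(max_iter):
        measured = imaging.acquire()             # step 40: acquire a measured image of the body part
        warp, _ = registrar(reference, measured)  # step 50: register (e.g. register_affine above)
        dx, dy, rot, reached = driving_signal(warp, threshold_mm)   # step 60: alignment check
        if reached:
            break                                # "target location reached" is signalled to the device
        device.move(dx, dy, rot)                 # step 70: actuate towards the target, then repeat 50/60
    else:
        raise RuntimeError("target location not reached within the iteration budget")
    device.load()                                # step 80: load the sensing and/or treatment device
    result = device.measure()                    # step 90: e.g. non-invasive glucose monitoring
    device.print_marker()                        # step 100: optionally print a new marker and re-register
    return result
```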
  • Fig. 7 illustrates recordings of spatial alignment with respect to the imaging target location as defined by the reference image.
  • The reference image was acquired at the start of the measurement, leading to an initial alignment of the reference image with the measured image.
  • The image registration algorithm continuously calculates the spatial transformations necessary to match the measured image to the reference image. In this test, a misalignment was introduced and then corrected using the near real-time feedback from the image registration algorithm as described above.
  • The method according to embodiments of the present invention thus allows accurately and reproducibly positioning a sensing and/or treatment device 6 on a target location on a body part 11 of a human being or an animal. This increases accuracy of the measurement performed with the sensing and/or treatment device 6 and improves quality of the results.
  • In the following, examples are given of implementations of a repositioning system 10 and of a method for positioning a sensing and/or treatment device 6 on a target location on a body part of a human being or an animal by using such a repositioning system 10 according to embodiments of the invention. It has to be understood that these implementations are only for the purpose of illustration and explanation and are not intended to limit the invention in any way.
  • Fig. 8 schematically illustrates an example of an implementation of a repositioning system 10 according to an embodiment of the present invention and how it can be used.
  • In this example, the imaging system 2 and the sensing and/or treatment device 6 may be incorporated in one common device.
  • The sensing and/or treatment device 6 may be part of the repositioning system 10 itself.
  • A marker, for example an artificial marker 12, is provided to the target location 16 on a body part 11 of a human being or an animal, in the example given on an arm of a human being.
  • This target location 16 is identified by acquiring at least one reference image 13 by means of the imaging system 2.
  • At least one measured image 14 is acquired in the region of interest 15, which is then registered to the reference image 13 to determine spatial alignment between the at least one measured image 14 and the reference image 13.
  • A driving signal 5 is generated as described above for driving the sensing and/or treatment device 6 to guide it to the target location 16.
  • The driving signal 5 may cause the sensing and/or treatment device 6 to move and/or rotate (this is indicated by arrows 17 in Fig. 7).
  • Non-invasive measurement may then be performed, such as e.g. glucose monitoring.
  • In a further example, another implementation of a repositioning system 10 according to embodiments of the present invention, and how it can be used, is schematically illustrated.
  • The imaging system 2 and the sensing and/or treatment device 6 may be two separate devices which are not interconnected by a rigid connection but, e.g., by a communicative channel 18. According to these embodiments, there is no fixed positional relationship between the imaging system 2 and the sensing and/or treatment device 6.
  • Driving the sensing and/or treatment device 6 for guiding it to the target location 16 on a body part 11 of a human being, in the example given an arm, may be performed in the same way as described in the former example and in the embodiments hereabove.
  • In this case, the driving signal 5 is sent to the sensing and/or treatment device 6 through the communicative channel 18.
  • The actuator means 4 may be part of the imaging system 2 or may be part of the sensing and/or treatment device 6.
  • The sensing and/or treatment device 6 may furthermore comprise a user interface 19. This is schematically illustrated in Fig. 10.
  • Via the user interface 19, the user gets direct feedback on the movements and/or rotations he has to make with the sensing and/or treatment device 6 in order to guide the sensing and/or treatment device 6 towards the target location 16 (indicated by arrows 21).
  • The user interface 19 may comprise indicator means to warn the user when the target location 16 is reached, so that the user may stop moving and/or rotating the sensing and/or treatment device 6.
  • A driving signal 5 may then be generated as described above for positioning the sensing and/or treatment device 6 substantially exactly at the target location 16.
  • Subsequently, non-invasive measurements may be performed, such as e.g. glucose monitoring.
  • In a further example, the repositioning system 10 may comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19 and a control unit 22.
  • Specific/unique natural markers such as e.g. skin texture or skin colour may be used as a marker 12 to identify a specific target location 16 and to register and store this target location 16 in a memory device of the repositioning system 10 as a reference image 13.
  • The control unit 22 may comprise a memory, e.g. electronic memory, to store the reference image 13, as well as an algorithm, of which an example was given in Fig. 6, for registering and comparing measured images 14 to the reference image 13. Generating a driving signal 5 for driving the sensing and/or treatment device 6 to guide it towards the target location 16 may then be done as described above; a compact sketch of these control unit responsibilities is given below.
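  • The following is a compact, illustrative sketch of the control unit responsibilities just described: storing the reference image 13 in memory and turning each measured image 14 into a driving signal. The class name, method names and the `registrar` callable are assumptions, and `driving_signal` again refers to the earlier sketch.

```python
from typing import Optional
import numpy as np

class RepositioningControlUnit:
    """Holds the reference image and derives a driving signal from each measured image."""

    def __init__(self, registrar, threshold_mm: float = 0.1):
        self._registrar = registrar          # e.g. register_affine from the sketch above
        self._threshold_mm = threshold_mm
        self._reference: Optional[np.ndarray] = None

    def store_reference(self, reference_image: np.ndarray) -> None:
        # Memory function: keep the reference image of the marked target location.
        self._reference = reference_image.copy()

    def update(self, measured_image: np.ndarray):
        # Computing function: register and compare, then derive the driving signal.
        if self._reference is None:
            raise RuntimeError("no reference image stored")
        warp, _ = self._registrar(self._reference, measured_image)
        return driving_signal(warp, self._threshold_mm)
```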
  • According to yet a further example, the repositioning system 10 may, similar to the example illustrated in Fig. 11, comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19 and a control unit 22.
  • The repositioning system 10 may furthermore comprise a built-in marker means 23, such as e.g. a stamp, to provide a marker 12 to the target location 16 on a body part 11, and a reservoir comprising a long-lasting substance, e.g. an ink reservoir 24, to provide the marker means 23 with the long-lasting substance, e.g. ink.
  • Alternatively, the repositioning system 10 may furthermore comprise a built-in microsensor implanting means to insert a microsensor below the skin at the target location 16 on a body part 11.
  • A specific target location 16 is marked with a long-lasting marker 12 by using the built-in marker means 23.
  • An advantage of this example is that the marker 12 can be provided in a same handling as the positioning of the sensing and/or treatment device 6 at the target location 16. Generating a driving signal 5 for driving the sensing and/or treatment device 6 to guide it towards the target location 16 may then be done as described above.
  • In this example, the imaging system 2 and the sensing and/or treatment device 6 are incorporated in one common device.
  • Alternatively, the imaging system 2 and the sensing and/or treatment device 6 are two separate devices which are interconnected by a communicative channel 18.
  • In a further example, the repositioning system 10 may comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19, a built-in marker means 23 such as a stamp for providing a marker 12 onto the target location 16 and an ink reservoir 24 for providing the built-in marker means 23 with ink.
  • In this example, the imaging system 2 is separated from the sensing and/or treatment device 6 and is connected to the sensing and/or treatment device 6 by means of a communicative channel 18.
  • The sensing and/or treatment device 6 is, in the example given, together with the actuator means 4, the built-in marker means 23 and the ink reservoir 24, guided to the target location 16 by the driving signal 5, which is generated as already described above.
  • The driving signal 5 is sent towards the sensing and/or treatment device 6 by means of the communicative channel 18.
  • The actuator means 4 forms, together with the user interface 19, the built-in marker means 23 and the ink reservoir 24, part of the sensing and/or treatment device 6.
  • According to yet another example, the repositioning system 10 may, similar to the previous example, comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19, a built-in marker means 23 such as a stamp for providing a marker 12 onto the target location 16 and an ink reservoir 24 for providing the built-in marker means 23 with ink (or, in an alternative, a built-in microsensor implanting means to insert a microsensor below the skin at the target location 16 on a body part 11).
  • Again, the imaging system 2 is separated from the sensing and/or treatment device 6 and is connected to the sensing and/or treatment device 6 by means of a communicative channel 18.
  • In this example, the repositioning system 10 may furthermore comprise a radiation beam generating means 25, such as e.g. a laser, for generating a radiation beam such as e.g. a laser beam.
  • The radiation beam generating means 25, e.g. laser, is situated in the imaging system 2.
  • The radiation beam generating means 25, e.g. laser, may be used to identify the target location 16 by comparison with the reference image 13.
  • The sensing and/or treatment device 6 may furthermore comprise a radiation beam detection means 26, such as e.g. a photodiode, for detecting the radiation beam, e.g. laser beam, when the target location 16 is reached and for communicating this to the user, as was described earlier.
  • The following is an example of a method according to embodiments of the invention which uses Laser Speckle Contrast Analysis (LASCA).
  • A light source suitable for use with LASCA imaging may, for example, comprise a laser diode emitting around 650 nm, with an output power higher than 25 mW. Such laser diodes may be readily available at relatively low cost. No scanning optics are required, because LASCA is a non-scanning, full-field technique.
  • A conventional CCD camera can be used for recording of the images.
  • The 2D images need to be acquired and processed.
  • For this purpose, DAQ electronics and a microprocessor may be required.
  • An update rate of several Hz is required. Such a rate can readily be obtained with modern microprocessor power. If necessary, a compromise may be made between the desired resolution of the map and the update rate of the user feedback indicators.
  • A typical image obtained by LASCA is shown in Fig. 15.
  • In this image, subsurface blood vessels 27 are imaged; a sketch of the underlying speckle-contrast computation is given below.
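  • The following sketch shows the spatial speckle-contrast computation that underlies LASCA (local contrast K = standard deviation divided by mean within a small sliding window); regions with blood flow blur the speckle and show up as low contrast, which is how the subsurface blood vessels 27 become visible. The window size and the small regularisation constant are illustrative choices, not values taken from the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw_speckle: np.ndarray, window: int = 7) -> np.ndarray:
    """Return the local speckle-contrast map K = std/mean of a raw speckle image."""
    img = raw_speckle.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    variance = np.clip(mean_sq - mean * mean, 0.0, None)   # guard against round-off
    return np.sqrt(variance) / (mean + 1e-12)               # low K where flow blurs the speckle
```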
  • The present example combines LASCA with the use of a reference image 13, as is illustrated in Fig. 16.
  • Reference number 14 indicates a measured image.
  • The circle with reference number 16 indicates the target location.
  • The reference image 13 is acquired such that it is much larger than the measured images 14, so as to be able to find the location of the measured image 14 with respect to the reference image 13.
  • A thick branching 28 of the blood vessel 27 is used as a marker 12 of the target location 16.
  • A user interface 19 on the sensing and/or treatment device 6 indicates to a user the direction in which the sensing and/or treatment device 6 has to be moved in order to be guided to the target location 16.
  • This example illustrates the use of sub-surface natural markers, in the example given blood vessels 27, in order to position a sensing and/or treatment device 6 at a target location 16; a sketch of locating a small measured image inside the larger reference image follows below.
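  • As a hypothetical illustration of the step of finding a small measured image 14 inside the much larger reference image 13, the sketch below uses normalised cross-correlation (OpenCV template matching). It only recovers a translational offset; rotation would require an affine registration step as sketched earlier, and the function name is an assumption.

```python
import cv2
import numpy as np

def locate_in_reference(reference: np.ndarray, measured: np.ndarray):
    """Return the (x, y) offset of the best match of `measured` within `reference`, and its score."""
    # Both images are assumed to be single-channel and of the same dtype (e.g. uint8).
    result = cv2.matchTemplate(reference, measured, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)   # maximum of the correlation map is the best match
    return top_left, score
```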
  • A suitable repositioning system 10 which may be used with the method described in this example is illustrated in Fig. 17.
  • The repositioning system 10 may comprise an imaging system 2, a sensing and/or treatment device 6, a control unit 22 and a user interface 19.
  • The imaging system 2 and the sensing and/or treatment device 6 may be part of the same device.
  • The actuator means 4 may be part of the imaging system 2 or may be part of the sensing and/or treatment device 6.
  • The sensing and/or treatment device 6 may be any suitable sensing device known by a person skilled in the art, and may for example be a spectroscopic sensing device suitable for performing minimally invasive or non-invasive measurement of glucose.
  • The control unit 22 may comprise a self-locating algorithm, of which an example is given in Fig. 6. This algorithm matches the at least one real-time measured image 14 to the reference image 13 stored in the memory device, and determines the location of the sensing and/or treatment device 6 with respect to the marker. Then, appropriate user feedback is given to guide the user to move the sensing and/or treatment device 6 to the target location 16.
  • The actuator means 4 may then furthermore guide the sensing and/or treatment device 6 by means of the driving signal obtained as described above, for fine tuning the position of the sensing and/or treatment device 6; a sketch of such a coarse user hint with an actuator hand-off is given below.
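  • Purely as an illustration of the two-step guidance just described, the sketch below maps a residual offset onto a coarse movement hint for the user interface 19 and hands over to the actuator means 4 once the device is close enough. The hint strings, the 1 mm hand-off distance and the sign conventions are assumptions.

```python
def user_hint(dx_mm: float, dy_mm: float, coarse_limit_mm: float = 1.0) -> str:
    """Translate a residual offset into a movement hint for the user interface."""
    if abs(dx_mm) <= coarse_limit_mm and abs(dy_mm) <= coarse_limit_mm:
        return "hold still: fine adjustment by actuator"
    horizontal = "move right" if dx_mm > 0 else "move left"
    vertical = "move down" if dy_mm > 0 else "move up"
    # Report the dominant direction first so the user makes one movement at a time.
    return horizontal if abs(dx_mm) >= abs(dy_mm) else vertical
```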
  • In a further example, the body part 11 of the human being or animal may be placed onto the repositioning system 10.
  • In this case, the repositioning system 10 may be larger than in the other cases.
  • A schematic illustration of the present example is given in Fig. 18.
  • The repositioning system 10 comprises a large part 29 onto which a body part 11 can be provided, an imaging system 2 and a sensing and/or treatment device 6, which are optionally incorporated in a common device.
  • The other parts of the repositioning system 10 are not illustrated in this figure.
  • According to still a further example, the imaging system 2 may comprise two different imaging parts. This example is illustrated in Fig. 19.
  • A first imaging part 31 may be used for acquiring the reference image 13 and is thus different from the second imaging part 32, which may be used for acquiring the at least one measured image 14.
  • The first imaging part 31 may, according to embodiments, work in a non-contact fashion.
  • This first imaging part 31 may also be referred to as an initial use device.
  • The initial use device may acquire a high quality reference image 13 or a reference map comprising a plurality of reference images 13. It may then communicate the reference image 13 or reference map to the second imaging part 32.
  • In this example, the second imaging part 32 of the imaging system 2 and the sensing and/or treatment device 6 are incorporated in a common device.
  • The other parts of the repositioning system 10 have not been shown in this figure.
  • The repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal may be used with any technique in which analytes have to be measured within the skin of a body part 11 of the human being or animal.
  • The repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the invention can, for example, be applied for minimally invasive or non-invasive glucose detection by means of optical spectroscopy.
  • Further applications of the repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal may include measurements of skin properties such as e.g. skin cancer, skin aging, etc., by any means of light.
  • The present invention also provides a system controller 100 for use in a repositioning system 10 for controlled driving of a sensing and/or treatment device 6 of the repositioning system 10 according to embodiments of the present invention.
  • The system controller 100, which is schematically illustrated in Fig. 20, may comprise a control unit 22 for controlling actuator means 4 adapted for driving the sensing and/or treatment device 6.
  • The control unit 22 may comprise a memory, e.g. electronic memory, to store the reference image 13, as well as an algorithm, of which an example was given in Fig. 6, for registering and comparing measured images 14 to the reference image 13.
  • The system controller 100 may include a computing device, e.g. a microprocessor, for instance a micro-controller.
  • Alternatively, the system controller 100 may include a programmable controller, for instance a programmable digital logic device such as a Programmable Array Logic (PAL), a Programmable Logic Array, a Programmable Gate Array, especially a Field Programmable Gate Array (FPGA).
  • The use of an FPGA allows subsequent programming of the repositioning system 10, e.g. by downloading the required settings of the FPGA.
  • The system controller 100 may be operated in accordance with settable parameters, such as driving parameters, for example temperature and timing parameters.
  • Fig. 21 shows one configuration of a processing system 200 that includes at least one customisable or programmable processor 41 coupled to a memory subsystem 42 that includes at least one form of memory, e.g. RAM, ROM, and so forth.
  • The processor 41 or processors may be a general purpose or a special purpose processor, and may be for inclusion in a device, e.g. a chip, that has other components that perform other functions.
  • One or more aspects of the method according to embodiments of the present invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
  • The processing system may include a storage subsystem 43 that has at least one disk drive and/or CD-ROM drive and/or DVD drive.
  • A display system, a keyboard, a pointing device or a touchscreen may be included as part of a user interface subsystem 44 to provide for a user to manually input information, such as parameter values indicating directions of movement.
  • More elements such as network connections, interfaces to various devices, and so forth, may be included in some embodiments, but are not illustrated in Fig. 21.
  • The various elements of the processing system 40 may be coupled in various ways, including via a bus subsystem 45, shown in Fig. 21 for simplicity as a single bus, but which will be understood by those in the art to include a system of at least one bus.
  • The memory of the memory subsystem 42 may at some time hold part or all (in either case shown as 46) of a set of instructions that, when executed on the processing system 40, implement the steps of the method embodiments described herein.
  • The present invention also includes a computer program product which provides the functionality of any of the methods according to embodiments of the present invention when executed on a computing device. Such a computer program product can be tangibly embodied in a carrier medium carrying machine-readable code for execution by a programmable processor.
  • The present invention thus relates to a carrier medium carrying a computer program product that, when executed on computing means, provides instructions for executing any of the methods as described above.
  • The term "carrier medium" refers to any medium that participates in providing instructions to a processor for execution.
  • Non-volatile media includes, for example, optical or magnetic disks, such as a storage device which is part of mass storage.
  • Common forms of computer readable media include a CD-ROM, a DVD, a flexible disk or floppy disk, a tape, a memory chip or cartridge, or any other medium from which a computer can read.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
  • The computer program product can also be transmitted via a carrier wave in a network, such as a LAN, a WAN or the Internet.
  • Transmission media can take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • Transmission media include coaxial cables, copper wire and fibre optics, including the wires that comprise a bus within a computer.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Robotics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention provides a repositioning system (10) for positioning a sensing and/or treatment device (6) at a target location (16), the repositioning system (10) comprising an imaging system (2) for identifying a target location (16) having a marker (12) on a body part (11) of a human being or an animal, thereby obtaining a reference image (13) of the target location (16), and for acquiring at least one measured image (14) of the target location (16). The imaging system (2) furthermore comprises registration means (3) for registering the at least one measured image (14) to the reference image (13), and signal generating means for generating a driving signal (5) indicative of the target location direction when spatial alignment between the at least one measured image (14) and reference image (13) is not approached within a predetermined threshold value. The repositioning system (10) furthermore comprises actuator means (4) for driving the sensing and/or treatment device (6) to guide it towards the target location thereby using the driving signal (5). The present invention also provides a method for making such a repositioning system (10) and a method for positioning a sensing and/or treatment device (6) on a target location (16) on a body part (11) of a human being or an animal by using such a repositioning system (10).

Description

Reproducible positioning of sensing and/or treatment devices
TECHNICAL FIELD OF THE INVENTION
The present invention relates to reproducibly positioning a sensing and/or treatment device on a target location on a body part of a human being or an animal. More particularly, the present invention relates to a repositioning system, a method for making such a repositioning system and a method for positioning a sensing and/or treatment device on a target position on a body part of a human being or an animal by using such a repositioning system.
BACKGROUND OF THE INVENTION Obtaining values for biological or physical quantities in a living body in a noninvasive way has been thoroughly studied over the last decades. Obtaining accurately reproducible results by using sophisticated sensing and actuating devices for medical purposes may become difficult when sensors have to be repeatedly removed and replaced. Currently, many efforts have been put in developing instruments for non- invasive measurement of physiological parameters in a human or animal body, such as for example glucose measurements, based on optical methods. Although these methods have proven to have sufficient sensitivity for in-vitro and/or ex-vivo glucose quantification, devices based on such currently existing techniques have not been successfully brought to the market. The main reason for that is that the accuracy of recently developed devices is not sufficient to get an FDA (food and drug administration) approval.
Also instruments have been developed for minimally invasive measurement of physiological parameters in a human or animal body, such as for example glucose measurements, e.g. based on optical methods. These methods make use of a sensor implanted beneath the skin which is in contact with subcutaneous fluids. The sensor may include gels, particles, liquids which are biodegradable. Preferably, the biosensor that has to be implanted is small in size, and does not require a complicated or painful insertion below the skin.
Non-invasive measurement is the most desirable method for consumers. However, uncertainty and inaccuracy have hampered the acceptance of non-invasive tests. There is a strong need in the non-invasive glucose-monitoring market to solve the inaccuracy or unreliability problems.
Many techniques have been investigated to non-invasively detect skin analyte(s) concentration by means of optical, electrical and/or optoelectronic methods, such as for example non-invasive glucose monitoring. Typically, in vivo measurements deal with a larger number of chemical, physical, and physiological interfering elements compared to in vitro measurements. These interfering elements induce irreproducibility and inaccuracy of non-invasive measurements. Information on the presence of interfering elements, their effects, and variability range are often not known. Typically, data analysis of a system where a number of interfering elements is present is based on chemometric tools, e.g. multivariate analysis. However, for complex and variable systems such as e.g. human tissue, chemometric analysis becomes more complicated and may be prone to large errors. The accuracy and reproducibility of these measurements are generally poor due to the many interfering elements and irreproducibility compared with the in vitro case. An example of an irreproducible factor is the placement of the measurement device on the skin. The morphology of the skin 1 is different at different locations (see Fig. 1), which leads to variations in the optical properties from site to site. It has been recognised that reproducible placement of the measurement device on a specific location at the skin is an important factor for improving accuracy of the technique used and the results obtained. Another irreproducible factor is the relative position of the sensing device to an implanted minimally invasive microsensor.
It is therefore beneficial to the accuracy of the measurement performed that the measurement device is always located on a same location on the skin for each measurement. A number of solutions have been tried: - Permanently fixing the device to the skin, e.g. by means of stickers. However, it appeared that this could lead to skin irritation.
Permanently fixing a mounting position to the skin. However, this also leads to irritation and moreover, appeared not to have good enough accuracy.
WO 2007/072356 describes a method to find back a same position on a skin of a patient. The main purpose of this document is manual repositioning of a measurement device by lay persons. The position has to be trained once by a professional. Preferred devices are e.g. defibrillators. All operations are performed completely manually. First, the professional finds a specific position in 'train mode'. Later, the subject has to replace the measurement device, also referred to as probe, in 'guide mode'. Image acquisition is done by a camera at a distance of about 10 cm from the measurement site. After finding a correct spot, the probe is lowered to the skin. There is no update on the specific position during lowering of the measuring device, e.g. sensor, towards the measurement site on the skin.
SUMMARY OF THE INVENTION
It is an object of embodiments of the present invention to provide a good repositioning system for a sensing and/or treatment device on a target location on a body part of a human being or an animal, a method for making such a repositioning system and a method for positioning a sensing and/or treatment device on a target position on a body part of a human being or an animal by using such a repositioning system.
A device and method according to embodiments of the present invention allow reproducibly positioning a sensing and/or treatment device on a same measurement site on a body part of a human being or animal. Therefore, a device and method according to embodiments of the present invention allow reproducibly measuring physiological parameters at a same measurement site on the body part of the human being or animal without the need for permanently fixing at least part of a device to the skin. This may be important for, for example, patients for whom on a regular basis, e.g. every day or a few times a day, a same measurement has to be performed, e.g. glucose monitoring in case of diabetes patients. In these cases it may be important to always perform the measurement at a same target location in order to obtain comparable measurement results.
A device and method according to embodiments of the present invention may provide an accuracy of better than 0.5 mm, i.e. by using a device and method according to embodiments of the present invention a sensing device may be positioned and repositioned at substantially the same target location showing a deviation of less than 0.5 mm, for example a deviation of less than 0.3 mm, less than 0.1 mm, or less than 0.01 mm.
A device and method according to embodiments of the invention are applicable to and suitable for being used with various minimally invasive or non-invasive measuring devices, and are particularly suitable for, but not limited to, being used with minimally invasive or non-invasive measuring devices which use optical detection. The above objective is accomplished by a method and device according to the present invention.
In a first aspect, the present invention provides a repositioning system for positioning a sensing and/or treatment device at a target location. The repositioning system comprises an imaging system for identifying the target location having a marker on a body part of a human being or an animal, thereby obtaining a reference image of the target location, and for acquiring at least one measured image of the target location. The imaging system furthermore comprises: registration means for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image, and signal generating means for generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value. The repositioning system furthermore comprises actuator means for driving a sensing and/or treatment device to guide it towards the target location thereby using the driving signal.
With "target location direction" is meant the direction in which a sensing and/or treatment device has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter has to be performed or some kind of treatment is to be carried out. The desired location can be one where there is a subcutaneous microsensor implanted.
It has to be noted that the imaging target location direction refers to both a direction of spatial displacement and to change of orientation of the sensing and/or treatment device.
It is an advantage of a repositioning system according to embodiments of the present invention that it is minimally invasive but more preferably non-invasive and allows for reproducible measurements at a same location, without permanently fixing anything to the skin and thereby providing an accuracy of better than 0.5 mm, for example better than 0.3 mm, better than 0.1 mm or better than 0.01 mm. A repositioning system according to embodiments of the present invention provides a means which is applicable and suitable for various minimally invasive or non-invasive measurements, such as e.g. minimally invasive or non-invasive daily glucose monitoring.
The repositioning system may furthermore comprise the sensing and/or treatment device.
The repositioning system may furthermore comprise a control unit comprising a memory for storing the at least one reference image. According to embodiments of the invention, the control unit may furthermore comprise an algorithm for registering and comparing the at least one measured image to the reference image.
According to embodiments of the invention, the imaging system and the sensing and/or treatment device may be incorporated in one common device. An advantage hereof is easier handling of the repositioning system.
According to other embodiments of the invention, the imaging system and the sensing and/or treatment device may be formed by separate devices.
The repositioning system may furthermore comprise a user interface for providing feedback to a user for guiding the sensing and/or treatment device towards the target location.
An advantage hereof is that it may be easy for a user to guide the sensing and/or treatment device towards the target location. Further fine tuning of the positioning of the sensing and/or treatment device can then be done by means of the actuator means. The repositioning system may furthermore comprise a marker means for providing a marker on the target location and an ink reservoir for providing the marker means with ink.
An advantage hereof is that the marker can be provided in a same handling as the positioning of the sensing device at the target location. The imaging system may furthermore comprise a radiation beam generating means for generating a radiation beam, and the sensing and/or treatment device may furthermore comprise a radiation beam detection means for detecting the radiation beam. The imaging system may comprise at least one of charged coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal scanning laser microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound.
The marker may comprise at least one of an artificial marker, a natural marker, a sub-surface natural marker or a sub-surface artificial marker. The sub-surface artificial marker may be a subcutaneous microsensor.
The repositioning system may furthermore comprise a pressure and/or suction system comprising at least one pressure and/or suction element.
By providing such a pressure and/or suction system, the target location may be adapted in shape to the shape of the sensing and/or treatment device or to the volume of the subcutaneous measurement and/or treatment site (e.g. one that contains a microsensor), such that an optimal contact may always be achieved between the sensing and/or treatment device and the target location, or an optimal contact may be achieved with a region of the skin above a volume containing a microsensor. This increases accuracy of the measurement performed after positioning of the sensing and/or treatment device on the target location.
In a second aspect, the present invention provides a method for making a repositioning system. The method comprises: providing an imaging system for acquiring at least one reference image and at least one measured image, the imaging system comprising a registration means for registering the at least one measured image to the reference image and signal generating means for generating a driving signal indicative of the target location direction, and providing actuator means for driving a sensing and/or treatment device to guide it towards the target location thereby using the driving signal.
The method may furthermore comprise providing a control unit comprising a memory for storing the at least one reference image.
The method may furthermore comprise providing a sensing and/or treatment device. According to embodiments of the invention providing an imaging system and providing a sensing and/or treatment device may be performed by providing an imaging system and a sensing and/or treatment device incorporated in one common device. According to other embodiments of the invention, providing an imaging system and providing a sensing and/or treatment device may be performed by providing a separate imaging system and a separate sensing and/or treatment device.
According to embodiments of the invention, the method may furthermore comprise providing a user interface for providing feedback to a user for guiding the sensing and/or treatment device towards the target location.
The method may furthermore comprise: providing a radiation beam generating means to the imaging system for generating a radiation beam, and providing a radiation beam detection means to the sensing and/or treatment device for detecting the radiation beam.
In a further aspect, the present invention provides a repositioning system formed by the method according to embodiments of the present invention. In yet a further aspect, the present invention provides a method for positioning a sensing and/or treatment device on a target location on a body part of a human being or an animal. The method comprises: identifying the target location having a marker on or in the body part by acquiring at least one reference image, acquiring at least one measured image of the target location, determining spatial alignment between the at least one measured image and the reference image, generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value, and driving the sensing and/or treatment device with the driving signal to guide it towards the target location.
A method according to embodiments of the invention allows registering with an accuracy comparable with pixel size, e.g. to an accuracy of about 0.1 mm. Therefore, a method according to embodiments of the invention allows reproducibly positioning a sensing and/or treatment device to a target location on a human body.
The method may furthermore comprise storing the reference image in a memory device of a control unit.
According to embodiments of the invention, the method may furthermore comprise, before driving the sensing and/or treatment device with the driving signal, providing feedback to a user interface incorporated in the sensing and/or treatment device for giving instructions to a user on how to move and/or rotate the sensing and/or treatment device.
Acquiring at least one reference image and acquiring at least one measured image of the target location may be performed by using at least one of charged coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI) or confocal scanning laser microscopy (CLSM).
In a further aspect, the present invention provides a controller for controlled driving of a sensing and/or treatment device towards a target location on a body part of a human being or an animal. The controller comprises a control unit for controlling actuator means adapted for driving the sensing and/or treatment device.
The control unit may comprise: a memory for storing a reference image, and a computing unit for registering and comparing measured images to the reference image and for determining a driving signal for driving the sensing and/or treatment device.
The present invention also provides, in a further aspect, a computer program product for performing, when executed on a computing means, a method according to embodiments of the invention.
The present invention also provides, in a further aspect, a machine readable data storage device storing the computer program product according to embodiments of the invention. The present invention also provides, in a further aspect, transmission of the computer program product according to embodiments of the invention over a local or wide area telecommunications network.
Particular and preferred aspects of the invention are set out in the accompanying independent and dependent claims. Features from the dependent claims may be combined with features of the independent claims and with features of other dependent claims as appropriate and not merely as explicitly set out in the claims.
The above and other characteristics, features and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of the invention. This description is given for the sake of example only, without limiting the scope of the invention. The reference figures quoted below refer to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 schematically shows skin composition of part of a body of a human being or animal,
Fig. 2 schematically illustrates a repositioning system according to embodiments of the present invention,
Fig. 3 to Fig. 5 illustrate particular steps of a method according to embodiments of the present invention, Fig. 6 illustrates an embodiment of an algorithm for use with a method according to embodiments of the present invention,
Fig. 7 illustrates recordings of spatial alignment with respect to the target as defined by the reference image, Fig. 8 to Fig. 19 schematically illustrate different implementations of a repositioning system and a method according to embodiments of the present invention, Fig. 20 schematically illustrates a system controller for use with a driver system according to embodiments of the present invention, Fig. 21 is a schematic representation of a processing system as can be used for performing a method according to embodiments of the present invention.
In the different figures, the same reference signs refer to the same or analogous elements.
DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
The present invention will be described with respect to particular embodiments and with reference to certain drawings but the invention is not limited thereto but only by the claims. Any reference signs in the claims shall not be construed as limiting the scope. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn on scale for illustrative purposes.
Where the term "comprising" is used in the present description and claims, it does not exclude other elements or steps. Where an indefinite or definite article is used when referring to a singular noun e.g. "a" or "an", "the", this includes a plural of that noun unless something else is specifically stated.
Furthermore, the terms first, second and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a sequence, either temporally, spatially, in ranking or in any other manner. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments. Similarly it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
The present invention provides a repositioning system for a sensing and/or treatment device on a target location on a body part of a human being or an animal, a method for making such a repositioning system and a method for positioning a sensing and/or treatment device on a target position on a body part of a human being or an animal by using such a repositioning system.
Minimally invasive glucose monitoring can also be used to control glucose levels. Minimally invasive techniques can rely on optical means for receiving signals from a subcutaneous microsensor in contact with subcutaneous body tissue and fluids. The analytes present in the blood can be determined by analysing the light returned from the microsensor. Near infrared spectroscopy has also proved to be a promising method for determination of the glucose levels minimally invasively.
The term "minimally invasive" as used for example in the phrase minimally invasive measurement of physiological parameters in a human or animal body, includes those methods where there is a minor level of invasion. An example is where the measurement itself is non-invasive but the determination of the analyte is done with the help of an implanted microsensor, e.g. a subcutaneous microsensor. The sensor is implanted beneath the skin which is in contact with subcutaneous fluids. The sensor may include gels, particles, liquids which are biodegradable. Preferably, the biosensor that has to be implanted is small in size, and does not require a complicated or painful insertion below the skin. The microsensor comprises an assay such as for example for the determination of glucose, e.g. based on optical methods.
The measurement of glucose through spectroscopy can be made by a change in the absorption of light according to the absorption and scattering properties of minimally invasive microsensors, or to the change in light emitted or reflected from such microsensors located below the skin. Such methods using microsensors may include, for example: observing fluorescence (e.g. fluorescence resonance energy transfer) of a competitive binding assay encapsulated in microcapsules, for example based on competitive binding between the protein Concanavalin A and various saccharide molecules, specifically a glycodendrimer and glucose, where the microcapsules can be polyelectrolyte microcapsules; detecting glucose using boronic acid-substituted viologens in fluorescent hydrogels, in which a fluorescent anionic dye and a viologen are appended to boronic acid, which serve as glucose receptors and are immobilised into a hydrogel, the fluorescence of the dye being modulated by the quenching efficiency of the viologen-based receptor, which is dependent upon the glucose concentration; and other methods, e.g. to monitor oxygen or pH, or other "smart tattoo" methods. A device and method according to embodiments of the present invention allow reproducibly positioning a sensing and/or treatment device on a same measurement site on a body part of a human being or animal. Therefore, a device and method according to embodiments of the present invention allow reproducibly measuring physiological parameters at a same measurement site on the body part of the human being or animal without the need for permanently fixing at least part of a device to the skin. A device and method according to embodiments of the present invention may provide an accuracy of better than 0.5 mm, i.e. by using a device and method according to embodiments of the present invention a sensing and/or treatment device may be positioned and repositioned at substantially the same target location showing a deviation of less than 0.5 mm, for example a deviation of less than 0.3 mm, less than 0.1 mm or less than 0.01 mm.
A device and method according to embodiments of the invention are applicable to and suitable for being used with various minimally invasive or non-invasive measuring devices, such as minimally invasive or non-invasive glucose measuring devices, and are particularly suitable for, but not limited to, being used with minimally invasive or non-invasive measuring devices which use optical detection. Further applications of a repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the present invention may include measurements of skin properties such as e.g. skin cancer, skin aging, etc., for example by any means of light. A device and method according to embodiments of the present invention may be applied with sensing methods and/or with treatment methods known by a person skilled in the art. For example, a device and method according to embodiments of the present invention may generally be applied to sensing methods which benefit from reproducibly placing the sensor at a same position. Such sensing methods may, for example, comprise ultrasound measurements, temperature sensing, pressure sensing, measurements using parts of the electromagnetic spectrum (such as e.g. optical, microwave, radiowave methods), skin impedance/resistance measurements and capacitance measurements, and flux measurements of compounds (such as Trans Epidermal Water Loss). Furthermore, a device and method according to embodiments of the present invention may be used for performing a treatment at a reproducible position, such as heat or light treatment for hair removal, treatment of skin disorders by e.g. light, skin rejuvenation, injection of medication, implanting or tissue sampling such as e.g. taking a biopsy.
In a first aspect, as schematically illustrated in Fig. 2, embodiments of the invention provide a repositioning system 10 comprising an imaging system 2. The imaging system 2 is suitable for identifying an imaging target location having a marker on or in a body part of a human being or an animal, thereby obtaining a reference image of the imaging target location, and for acquiring at least one measured image of the imaging target location. This imaging target location is identified as the target location at which subsequent measurement of a physiological parameter and/or treatment has to be performed by means of a sensing and/or treatment device. According to embodiments of the invention, a plurality of subsequently measured images may be acquired. In the latter case, at least two of the subsequently measured images may be measured under different circumstances, e.g. they may be acquired for different positions and/or under different angles and/or under different illumination conditions.
In the further description the terms imaging target location and target location may be used next to each other. It has to be understood that these terms are intended to indicate a same location, i.e. the location at which measurement of a physiological parameter and/or treatment has, repeatedly or not, to be performed. The imaging system 2 furthermore comprises: registration means 3 a adapted for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image, and signal generating means 3b adapted for generating a driving signal 5 indicative of the imaging target location direction when the spatial alignment is not approached or reached within a predetermined threshold value.
The repositioning system furthermore comprises actuator means for driving the sensing and/or treatment device to guide it towards the target location thereby using the driving signal. The driving signal 5 may be generated until spatial alignment is obtained, between the at least one measured image and the reference image, or until spatial alignment is approached or reached within a predetermined threshold value. Hence, the driving signal 5 may be generated until the spatial alignment is approached or reached within the predetermined threshold value. In other words, the driving signal 5 may be generated until the sensing and/or treatment device arrives at the imaging target location or in a predetermined neighbourhood thereof.
With "imaging target location direction" is meant the direction in which a sensing and/or treatment device has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter and/or treatment has to be performed. It has to be noted that the imaging target location direction refers to both a direction of spatial displacement and a change of orientation of the sensing and/or treatment device.
To be able to come back to a same position every time, or in other words, to be able to repeatedly perform measurement of a physiological parameter or treatment at a same position, a desired imaging target location has to be defined and stored. This is the role of the reference image. Different factors can influence the choice of such a target location. It can be defined by a sensing and/or treatment system used to perform the measurement of the physiological parameter and/or to perform treatment, by recording an image just before, during or just after performing the sensing and/or treatment action. A 'reference image' has similar characteristics as a 'measured image', except that the 'reference image' was taken at an earlier time, and specifies the desired imaging target location of the repositioning system.
According to embodiments of the invention, the 'reference image' can be static, specified only at initial use or during a position calibration measurement. Alternatively, according to other embodiments of the invention, the 'reference image' can be dynamic, and can be updated with every sensing and/or treatment action. The first case might be sensitive to changes of the target area over time (e.g. in case of aging and/or injury), while the latter should allow adapting to gradual changes.
In order to determine a desired movement of a sensing and/or treatment device towards a target location in order to align with this target location, at least two images are to be acquired. One is the 'reference image', specifying and identifying the target location. The second is the 'measured image', specifying the current state of the system, including measured position and orientation. The availability of these two images may be sufficient to calculate a transformation between the two, resulting in alignment. A better accuracy and reliability might be obtained if a continuous stream of
'measured images' is available, i.e. when a plurality of subsequently measured images is acquired. According to these embodiments, and to further improve accuracy, at least two of the subsequently measured images may be measured under different circumstances, e.g. they may be acquired at different positions and/or under different angles and/or under different illumination conditions. Acquisition of a plurality of images would allow a closed loop system, which can compensate for disturbances during the alignment procedure. If, for example, a patient's arm shifts during alignment, this will only be noticed when 'measured images' are available that show this disturbance and allow the system to compensate. The repositioning system 10 furthermore comprises actuator means 4 generating, based on the driving signal 5, actuation signals 7 for driving the sensing and/or treatment device 6 to guide it towards the imaging target location.
The repositioning system 10 according to embodiments of the invention allows reproducibly positioning a sensing and/or treatment device 6 on a same target location. This may be desired, for example, for patients for whom a same measurement has to be performed on a regular basis, e.g. every day or a few times a day, for example glucose monitoring in case of diabetes patients. In these cases it may be important to always perform the measurement at a same target location in order to obtain comparable measurement results.
According to embodiments of the invention, the sensing and/or treatment device 6 and the imaging system 2 may be comprised in a same device. In that case, the sensing and/or treatment device 6 may be part of the repositioning system 10 itself. According to these embodiments, there may be a fixed positional relationship between the imaging system 2 and the sensing and/or treatment device 6 (see further with respect to Fig. 8). This fixed relationship can be limited to the parameters included in the movement of the sensing and/or treatment device 6, also referred to as the transformation. If a reproducible 2D position is desired, and only XY is calculated (see the co-ordinate system further in Fig. 8), then at least the XY relationship should be fixed. However, the Z distance between the imaging system 2 and the sensing and/or treatment device 6 may still be flexible, since this parameter is not part of the calculated transformation. According to further, specific embodiments, the imaging system 2, e.g. a camera, may also have the function of the sensing and/or treatment device 6.
According to other embodiments of the invention, the sensing and/or treatment device 6 may be separated from the imaging system 2 and may, in that case, not be part of the repositioning system 10 (see further with respect to Fig. 9). In that case, no fixed positional relationship exists between the imaging system 2 and the sensing and/or treatment device 6. It has to be noted that, according to embodiments of the invention, the target location at which the sensing and/or treatment device 6 eventually arrives may, within a predefined range, be different from the target location of the imaging system 2, i.e. the target location which was originally set.
The imaging system 2 is adapted for acquiring at least one reference image and for acquiring at least one measured image. According to embodiments of the invention the imaging system 2 may have a high spatial resolution of tens of μm. Furthermore, the imaging system 2 may be able to detect structures with sizes of between tens of μm and tens of cm, for example sizes between 0.01 mm and 10 cm, e.g. between 0.1 mm and 10 cm. These structures may also be referred to as markers. These markers may for example be, but are not limited to, an artificial marker provided on the skin of the body such as e.g. a stamp, a natural marker such as e.g. skin textures or colours, a sub-surface natural marker such as vessels or skin structures, or a sub-surface artificial marker such as an implanted marker, for example a subcutaneous microsensor as used in minimally invasive testing. In case of using artificial markers such as e.g. stamps or a microsensor, these can be visible or invisible to the human eye. Artificial markers may be provided by e.g. drawing a mark onto the target location. This may, for example, be done by hand by a user, a doctor or a nurse, or may be done by use of the repositioning system 10 itself, which may then comprise a built-in marker means 23 (see further in this description). In case of sub-surface markers, these markers are located below skin level, which makes them less sensitive to changes in time and less exposed to environmental influences like hydration or sweating compared to markers that are provided at the outside of the skin. Examples of sub-surface natural markers may be sweat glands, hair follicles or capillaries.
The imaging system 2 may use any suitable known imaging technique such as for example charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal laser scanning microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound.
The repositioning system 10 furthermore comprises registration means 3a for registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image. The degree of alignment can be calculated from an image registration algorithm, such as e.g. the Scale Invariant Feature Transform (SIFT). By using such an algorithm, unique features are identified in the measured image and the reference image. If features present in the measured image are also present in the reference image, then the necessary transformation between the two images, or in other words the required movement of the sensing and/or treatment device 6 in order to arrive at the imaging target location, can be calculated using affine transformations. In a closed loop system, using a stream, i.e. a plurality of subsequently measured images, repositioning is completed when these calculated values for the necessary transformation or movement have decreased below the set target (e.g. 0.1 mm, 0.3 mm or 0.5 mm).
In case only one measured image is acquired and compared to the reference image, the repositioning system 10 only executes the transformation as calculated from the comparison between this one measured image and the reference image.
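By way of illustration only, and not as part of the disclosed system, a minimal sketch of such a feature-based registration step is given below. Python with the OpenCV library is assumed (neither is prescribed by the present description); the function name and the 0.75 ratio-test value are illustrative assumptions.

import cv2
import numpy as np

# Sketch only: estimate the affine transformation (translation and rotation) that maps
# the measured image onto the reference image from matched SIFT features.
def estimate_required_movement(reference_img, measured_img):
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference_img, None)
    kp_mea, des_mea = sift.detectAndCompute(measured_img, None)
    if des_ref is None or des_mea is None:
        return None  # no features found, registration not possible
    matches = cv2.BFMatcher().knnMatch(des_mea, des_ref, k=2)
    # Keep only sufficiently unique matches (Lowe ratio test).
    good = [m for m, n in (p for p in matches if len(p) == 2) if m.distance < 0.75 * n.distance]
    if len(good) < 3:
        return None  # not enough common features between the two images
    src = np.float32([kp_mea[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    if M is None:
        return None
    dx, dy = M[0, 2], M[1, 2]                          # required translation in pixels
    angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))   # required rotation in degrees
    return dx, dy, angle

The returned translation and rotation correspond to the necessary transformation or movement mentioned above; dividing the pixel values by the image resolution (e.g. 32 pixels per mm) expresses them in millimetres so they can be compared with the set target.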
When the spatial alignment between the measured image and the reference image is not approached or reached within the pre-determined threshold value, a driving signal 5 is generated by means of a signal generating means 3b and this driving signal 5 is sent to an actuator means 4 where it is used to provide actuation signals 7 to the sensing and/or treatment device 6 for moving the sensing and/or treatment device 6 towards the target location. The actuator means 4 may, for example, comprise XY-stages, Z-stages or rotation stages, which may use DC motors, stepper motors, voice coils or may comprise any other suitable actuator known by a person skilled in the art.
In order to provide the actuation signal 7, the actuator means 4 may obtain, e.g. receive or generate, a signal representative of the target location direction, which may aid in determining the direction of movement of the sensing and/or treatment device 6. Details on the method and the use of the repositioning system 10 will be described further in the description.
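Purely as a hypothetical illustration (the present description does not prescribe any particular conversion), the sketch below turns a remaining misalignment, expressed in pixels, into step commands for an XY stepper stage; the pixel pitch, steps-per-millimetre value and threshold used here are assumed example values.

# Hypothetical sketch; parameter values are assumptions, not taken from the disclosure.
PIXELS_PER_MM = 32.0   # example image resolution mentioned elsewhere in this description
STEPS_PER_MM = 400.0   # assumed resolution of the XY stepper stage
THRESHOLD_MM = 0.1     # assumed predetermined threshold value

def to_actuation_signal(dx_px, dy_px):
    """Convert the remaining misalignment (pixels) into motor steps, or None if aligned."""
    dx_mm, dy_mm = dx_px / PIXELS_PER_MM, dy_px / PIXELS_PER_MM
    if max(abs(dx_mm), abs(dy_mm)) < THRESHOLD_MM:
        return None  # spatial alignment approached within the predetermined threshold
    return round(dx_mm * STEPS_PER_MM), round(dy_mm * STEPS_PER_MM)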
According to embodiments of the invention, the sensing and/or treatment device 6 may be any suitable sensing and/or treatment device known by a person skilled in the art, and may for example be a spectroscopic sensing device suitable for performing minimally invasive or non-invasive measurement of glucose. According to embodiments of the present invention, the actuator means 4 may be part of the imaging system 2, or may be part of the sensing and/or treatment device 6 or may be a separate device (as is illustrated schematically in Fig. 2). The actuator means 4 must be placed such that it moves the sensing and/or treatment device 6 relative to the skin. The imaging system 2 can also be part of the actuated system, i.e. can also be driven by the driving signal 5 thereby moving together or in close co-operation with the sensing and/or treatment device. Alternatively, according to other embodiments of the invention, the imaging system 2 is not actuated, i.e. is not driven by the driving signal 5, and remains stationary.
According to embodiments of the invention, the sensing and/or treatment device 6 may furthermore comprise a user interface such that the sensing and/or treatment device 6 can first be guided roughly towards the target location by a user and can then be fine-tuned by the actuator means 4 which further accurately guides the sensing and/or treatment device 6 towards the target location (see further). The user interface may comprise indication means for indicating the direction in which the sensing and/or treatment device 6 has to be moved and/or rotated.
According to embodiments of the present invention, the repositioning system 10 may furthermore comprise a built-in marker means such as e.g. a stamp for providing a marker to the target location and a reservoir comprising a substance, e.g. an ink reservoir, connected to the marker means for providing the substance, e.g. ink, to the marker means (see further). An advantage of this embodiment is that the marker can be provided in a same handling as the positioning of the sensing and/or treatment device 6 at the target location. According to other embodiments of the present invention, the repositioning system 10 may furthermore comprise a built-in microsensor implanting means or any other smart tattoo placing means. An advantage of this embodiment is that the microsensor can be provided in a same handling as the positioning of the sensing and/or treatment device 6 at the target location.
According to still further embodiments of the invention, the imaging system 2 may furthermore comprise radiation beam generating means such as e.g. a laser 25 for generating a radiation beam, e.g. laser beam. According to these embodiments, the sensing and/or treatment device 6 may furthermore comprise radiation beam detection means, e.g. a photodiode 26, for detecting the radiation beam, e.g. laser beam and in that way providing information on the target location (see further). The radiation beam generating means, e.g. laser 25, may be used to identify the target location in comparison with the reference image. In this case, an image processing algorithm locates the imaging target location, within the current field of view of the imaging system 2. The target location can then be illuminated with the radiation beam generating means, e.g. laser 25. The sensing and/or treatment device 6 is then moved towards the target location. When it arrives at the target location, it will detect the radiation beam, e.g. laser beam by means of the radiation beam detection means, e.g. a photodiode 26. Upon detection of the radiation beam, e.g. laser beam, it may be decided to stop movement of the sensing and/or treatment device 6.
The radiation beam detection means, such as e.g. a photodiode 26, may be for detecting the radiation beam, e.g. laser beam, when the target location is reached and for communicating this to the user in some suitable way. According to embodiments of the invention, the repositioning system 10 may furthermore comprise a pressure and/or suction system comprising at least one pressure and/or suction element for providing a particular pressure to the surface at the target location on the skin of a human being or an animal for adapting the target location in shape to the shape of the sensing and/or treatment device 6. In that way, an optimal contact may be achieved between the sensing and/or treatment device 6 and the target location. This increases accuracy of the measurement performed after positioning of the sensing and/or treatment device 6 on the target location.
More details and examples of the repositioning system 10 according to embodiments of the invention will be described further on in the description.
In a second aspect, the present invention provides a method for positioning a sensing and/or treatment device 6 on a target location on a body part of a human being or an animal by using a repositioning system 10 according to embodiments of the invention as described above. The method comprises: identifying a target location having a marker on or in the body part by acquiring at least one reference image; acquiring at least one measured image of the target location; registering the at least one measured image to the reference image to determine spatial alignment between the at least one measured image and the reference image; generating a driving signal indicative of the target location direction when the spatial alignment is not approached or reached within a predetermined threshold value; and driving the sensing and/or treatment device 6 based on the driving signal to guide it towards the target location. With "target location direction" is meant the direction in which a sensing and/or treatment device 6 has to be moved and/or rotated in order to arrive at a location where measurement of a physiological parameter and/or treatment has to be performed.
In order to determine a desired movement of a sensing and/or treatment device towards a target location in order to align with this target location, at least two images are to be acquired. One is the 'reference image', specifying and identifying the target location. The second is the 'measured image', specifying the current state of the system, including measured position and orientation. The availability of these two images may be sufficient to calculate a transformation between the two, resulting in alignment.
A better accuracy and reliability might be obtained if a continuous stream of 'measured images' is available, i.e. when a plurality of subsequently measured images is acquired. According to these embodiments, and to further improve accuracy, at least two of the subsequently measured images may be measured under different circumstances, e.g. they may be acquired at different positions and/or under different angles and/or under different illumination conditions. Acquisition of a plurality of images would allow a closed loop system, which can compensate for disturbances during the alignment procedure. If, for example, a patient's arm shifts during alignment, this will only be noticed when 'measured images' are available that show this disturbance and allow the system to compensate. A method according to embodiments of the invention allows registering the at least one measured image to the reference image with an accuracy comparable with the pixel size, e.g. to an accuracy of about 0.1 mm. Therefore, the method according to embodiments of the invention allows reproducibly positioning a sensing and/or treatment device at a target location on a body part of a human being or an animal.
In a first step of a method according to embodiments of the invention, an imaging target location having a marker is identified on or in a body part of a human being or an animal. The marker may, for example but not limited thereto, comprise an artificial marker provided on the skin of the body such as e.g. a stamp or a smart tattoo, a natural marker such as e.g. skin texture or colours, a sub-surface natural marker such as a vessel or skin structure, or a sub-surface artificial marker such as an implanted marker. The implanted marker may be a subcutaneous microsensor. In case of using artificial markers such as e.g. stamps or microsensors, these can be visible or invisible to the human eye. Artificial markers may be provided by e.g. drawing a mark onto the target location. This may, for example, be done by hand by a user, a doctor or a nurse, or may be done by use of the repositioning system itself, which may then comprise a built-in marker means as described above. Artificial markers may also be provided subcutaneously, e.g. by inserting or implanting a microsensor at the target location. In case of sub-surface markers, such markers are located below skin level, which makes them less sensitive to changes in time and to environmental influences like hydration or sweating compared to markers that are provided at the outside of the skin. Examples of sub-surface natural markers may be sweat glands, hair follicles or capillaries. An example of a sub-surface artificial marker is a microsensor.
Identification of the imaging target location having a marker is obtained by acquiring at least one reference image. According to embodiments of the invention, one reference image may be acquired. However, according to other embodiments of the invention, the method may comprise acquiring a reference map comprising multiple reference images. According to embodiments of the invention, the method may comprise acquiring a larger reference image taken by a separate imaging device or at a different distance than the following subsequently measured images (see hereinafter). In this case, the imaging system 2 may comprise two different imaging devices. The two different imaging devices may operate at the same or different frequencies or range of light frequencies.
Next, at least one measured image is acquired. According to embodiments of the invention, a plurality of subsequently measured images may be acquired. In the latter case, at least two, and in embodiments of the present invention optionally each, of the subsequently measured images may be measured under different circumstances, i.e. they may be acquired at different positions and/or under different angles and/or under different illumination conditions.
Acquiring the reference image and the at least one measured image may be performed using any known suitable imaging technology such as e.g. charge-coupled device (CCD) cameras (e.g. VIS, UV, IR), laser speckle contrast analysis (LASCA), optical coherence tomography (OCT), orthogonal polarized spectral imaging (OPSI), confocal laser scanning microscopy (CLSM), luminescence such as e.g. fluorescence, photoacoustics or ultrasound. After acquiring the reference image, at least one measured image is acquired.
The at least one measured image is then registered to the reference image to determine spatial alignment between the at least one measured image and the reference image. According to embodiments of the invention, affine image registration techniques may be used to register the at least one measured image to the reference image. According to other embodiments, any other suitable registration technique known by a person skilled in the art may be used. A survey of image registration methods is given in "Image registration methods: a survey", Barbara Zitová, Jan Flusser, Image and Vision Computing 21 (2003) 977-1000. Registration can be based on area-based methods or feature-based methods. Feature-based methods may analyse region features, line features or point features. Matching of features in the reference image with those in the measured image can be done using area-based methods, such as correlation-like methods, Fourier methods, mutual information methods or optimization methods. Feature matching can also be accomplished using feature-based methods, such as methods using spatial relations, invariant descriptors, relaxation methods or pyramids and wavelets. Hereinafter, the accuracy of a method according to embodiments of the invention will be demonstrated by means of Fig. 3, Fig. 4 and Fig. 5. It has to be noted that these figures are not intended to illustrate the method itself, but are intended to indicate the accuracy of the method.
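Before turning to Figs. 3 to 5, a brief sketch of one such area-based (Fourier-domain) approach is given below: phase correlation recovers the translation between a measured image and the reference image with sub-pixel precision. Python with the scikit-image library is assumed here, neither being required by the present description, and plain phase correlation only recovers translation, not rotation.

from skimage.registration import phase_cross_correlation

# Sketch only: area-based (Fourier) registration recovering the translation, in mm,
# of the measured image with respect to the reference image.
def translation_to_reference(reference_img, measured_img, pixels_per_mm=32.0):
    shift, error, _ = phase_cross_correlation(reference_img, measured_img,
                                              upsample_factor=10)
    dy_px, dx_px = shift  # scikit-image returns the shift in (row, column) order
    return dx_px / pixels_per_mm, dy_px / pixels_per_mm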
Fig. 3 shows an example of acquisition of a reference image 13 and subsequently measured images 14. In Fig. 3 the reference image has been rotated by -90°. This is only for ease of illustration and this rotation is not intended to be part of the method according to embodiments of the invention.
For illustrating accuracy of a method according to embodiments of the invention, fourteen images of a body part 11 of a human being were acquired, i.e. one reference image 13 (indicated by the dashed square) and thirteen subsequently measured images 14. The subsequently measured images 14 were measured under different circumstances, e.g. at different positions and/or under different angles and/or under different illumination conditions. In the example given, the target location had an artificial marker 12, e.g. a marker 12 had been drawn on the body part 11. All images were acquired at a resolution of 32 pixels per mm, allowing repositioning accurate to within about 3 pixels.
In a next step, the thirteen subsequently measured images were registered to the reference image 13. In the example given, affine image registration techniques have been used to register the subsequently measured images 14 to the reference image 13. In the present example, image processing/registration tools from Matlab have been employed. The result after registration of the subsequently measured images 14 to the reference image 13 is shown in Fig. 4. These images may be referred to as registered images.
In a next step, an overlap is determined between the at least two registered images, in the example given the thirteen registered images. Fig. 5 is a sum or combination of all registered images shown in Fig. 4. Fig. 5 shows the overlap of the registered images shown in Fig. 4 and gives an indication of the registration accuracy at the calculated resolution. By using affine image registration techniques, a method according to embodiments of the invention allows registering at an accuracy comparable to the pixel size, which is 0.1 mm. Again, it is noted that the above description with respect to Fig. 3, Fig. 4 and Fig. 5 is only intended to indicate the accuracy of a method according to embodiments of the invention and is not intended to illustrate the method according to embodiments of the invention itself.
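A minimal sketch of this accuracy check is given below, assuming Python with OpenCV rather than the Matlab tools actually used in the example: each measured image is warped into the reference frame with its estimated 2x3 affine matrix and the registered images are averaged, so that residual misalignment shows up as blur around the marker, as in Fig. 5.

import cv2
import numpy as np

# Sketch only: average the registered images to visualise the registration accuracy.
def overlap_of_registered_images(reference_img, measured_imgs, affines):
    h, w = reference_img.shape[:2]
    acc = np.zeros((h, w), dtype=np.float64)
    for img, M in zip(measured_imgs, affines):       # one 2x3 affine matrix per image
        acc += cv2.warpAffine(img, M, (w, h)).astype(np.float64)
    return acc / len(measured_imgs)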
When the spatial alignment between the at least one measured image and the reference image is not approached or reached within a predetermined threshold value, a driving signal 5 indicative of the target location direction is determined and the sensing and/or treatment device 6 is driven based on that driving signal 5 to guide it towards the target location.
After this, at least one new image is measured, registered to the reference image and compared so as to determine the degree of spatial alignment between the at least one measured image and the reference image and, depending on whether the spatial alignment is approached or reached within a pre-determined threshold value or not, the sensing and/or treatment device 6 is driven or not. This series of steps may be repeated until the sensing and/or treatment device 6 has reached its target location. Hence, the method according to embodiments of the invention allows for reproducibly placing or positioning a sensing and/or treatment device 6 onto a target location on a body part of a human being or animal by using feedback from the imaging system 2. This may be important, for example, for patients for whom a same measurement has to be performed on a regular basis, e.g. every day or a few times a day, such as glucose monitoring in case of diabetes patients. In these cases it may be important to always perform the measurement at a same target location in order to obtain comparable measurement results. The sensing and/or treatment device 6 is guided towards the imaging target location based on the driving signal 5 obtained from registering at least one measured image 14 to at least one reference image 13. Actuator means 4 may be used for generating actuation signals 7 for driving the sensing and/or treatment device 6 based on the driving signal 5. According to embodiments of the invention, guiding the sensing and/or treatment device 6 towards the target location may be completely actuator based, i.e. the sensing and/or treatment device 6 may be guided towards the target location via actuator means 4 that is directly driven by the driving signal 5. According to other embodiments of the invention, guiding the sensing and/or treatment device 6 towards the target location can be done in a two-step manner, where a first step is movement of the sensing and/or treatment device 6 by a user, for example visually or based on feedback given by means of a user interface, and a second step is fine adjustment, e.g. via actuator means 4 that is directly driven by the driving signal 5.
Fig. 6 gives an example of an algorithm for using the imaging system 2 to find the target location again and for driving the sensing and/or treatment device 6 towards that target location. First, a target location having a marker 12 is identified by acquiring at least one reference image 13 (see step 20 in Fig. 6). This step is also referred to as position registration. The reference image 13 is then stored in a memory device (step 30 in Fig. 6). Next, a body part is imaged by acquiring at least one measured image 14 (step 40 in Fig. 6). The at least one measured image 14 is then registered to the at least one reference image 13 to determine spatial alignment between the at least one measured image and the reference image (step 50 in Fig. 6). It is then determined whether the spatial alignment is approached or reached within a predetermined threshold value or not (see step 60 in Fig. 6). If the spatial alignment between the at least one measured image and the reference image is not approached or reached within the predetermined threshold value, the sensing and/or treatment device 6 may be rotated and/or moved based on a driving signal 5 applied to actuator means 4 (step 70 in Fig. 6), the sensing and/or treatment device 6 being guided towards the target location, and steps 50 and 60 may be repeated. If the spatial alignment between the at least one measured image and the reference image is approached or reached within the predetermined threshold value, a driving signal 5 indicative of "target location reached" is sent to the sensing and/or treatment device 6. The sensing and/or treatment device 6 is loaded (step 80 in Fig. 6) for performing a required measurement such as e.g. non-invasive monitoring of the amount of glucose in the blood of the user (step 90 in Fig. 6). For each new measurement, a new marker 12 may be printed on the target location and a new position registration may be performed (step 100 in Fig. 6), which then again may be stored in the memory device (step 30 in Fig. 6).
Fig. 7 illustrates recordings of spatial alignment with respect to the imaging target location as defined by the reference image. In the example given, the reference image was acquired at the start of the measurement, leading to an initial alignment of the reference image with the measured image. Using a continuous stream of measured images while moving the sensing and/or treatment device 6, the image registration algorithm continuously calculates the spatial transformations necessary to match the measured image to the reference image. In this test, a misalignment was introduced, and then corrected using the near real-time feedback from the image registration algorithm as described above.
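The loop of Fig. 6 can be summarised by the schematic sketch below. All helper objects and names (imaging_system, register, actuator, sensor) are hypothetical placeholders rather than the actual implementation, and the 0.1 mm threshold is only an example value taken from the ranges mentioned above.

# Schematic sketch of the closed loop of Fig. 6; helper objects are hypothetical and the
# registration function (e.g. the SIFT-based sketch given earlier) is passed in by the caller.
def reposition_and_measure(imaging_system, actuator, sensor, reference_img, register,
                           threshold_mm=0.1, max_iterations=100):
    for _ in range(max_iterations):
        measured_img = imaging_system.acquire()                      # step 40
        dx_mm, dy_mm, angle = register(measured_img, reference_img)  # step 50
        if max(abs(dx_mm), abs(dy_mm)) < threshold_mm:               # step 60
            break                                 # target location reached
        actuator.drive(dx_mm, dy_mm, angle)                          # step 70
    else:
        raise RuntimeError("target location not reached within the allowed iterations")
    return sensor.measure()                                          # steps 80 and 90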
The method according to embodiments of the present invention allows accurately and reproducibly positioning a sensing and/or treatment device 6 on a target location on a body part 11 of a human being or an animal. This increases accuracy of the measurement performed with the sensing and/or treatment device 6 and improves quality of the results.
Hereinafter, different implementations will be described of a repositioning system 10 and a method for positioning a sensing and/or treatment device 6 on a target location on a body part of a human being or an animal by using such a repositioning system 10 according to embodiments of the invention. It has to be understood that these implementations are only for the purpose of illustration and explanation and are not intended to limit the invention in any way.
Fig. 8 schematically illustrates an example of an implementation of a repositioning system 10 according to an embodiment of the present invention and how it can be used. According to the present example, the imaging system 2 and the sensing and/or treatment device 6 may be incorporated in one common device. In that case, the sensing and/or treatment device 6 may be part of the repositioning system 10 itself. According to these embodiments, there may be a fixed positional relationship between the imaging system 2 and the sensing and/or treatment device 6, as was described earlier. A marker, for example an artificial marker 12, is provided at the target location 16 on a body part 11 of a human being or an animal, in the example given on an arm of a human being. This target location 16 is identified by acquiring at least one reference image 13 by means of the imaging system 2. Next, at least one measured image 14 is acquired in the region of interest 15, which is then registered to the reference image 13 to determine spatial alignment between the at least one measured image 14 and the reference image 13. In a next step, when spatial alignment between the at least one measured image 14 and the reference image 13 is not approached or reached within the predetermined threshold value, a driving signal 5 is generated as described above for driving the sensing and/or treatment device 6 to guide it to the target location 16. The driving signal 5 may cause the sensing and/or treatment device 6 to move and/or rotate (this is indicated by arrows 17 in Fig. 8). When the sensing and/or treatment device 6 arrives at the target location 16, non-invasive measurement may be performed, such as e.g. glucose monitoring.
Fig. 9 schematically illustrates another example of an implementation of a repositioning system 10 according to embodiments of the present invention and how it can be used. According to this example, the imaging system 2 and the sensing and/or treatment device 6 may be two separate devices which are interconnected by a communicative channel 18. According to these embodiments, there is no fixed positional relationship between the imaging system 2 and the sensing and/or treatment device 6. Driving the sensing and/or treatment device 6 for guiding it to the target location 16 on a body part 11, in the example given an arm, of a human being may be performed in a same way as described in the former example and in the embodiments hereabove. The driving signal 5 is sent to the sensing and/or treatment device 6 through the communicative channel 18. According to this example, the actuator means 4 may be part of the imaging system 2 or may be part of the sensing and/or treatment device 6.
According to the embodiment illustrated in Fig. 9, the sensing and/or treatment device 6 may furthermore comprise a user interface 19. This is schematically illustrated in Fig. 10. In this figure only the sensing and/or treatment device 6 located on a body part 11 of a user is illustrated for the ease of explanation. By means of the user interface 19 the user gets direct feedback on the movements and/or rotations he has to make with the sensing and/or treatment device 6 in order to guide the sensing and/or treatment device 6 towards the target location 16 (indicated by arrows 21). The user interface 19 may comprise indicator means to warn the user when the target location 16 is reached and the user may stop moving and/or rotating the sensing and/or treatment device 6. For fine tuning the location of the sensing and/or treatment device 6, a driving signal 5 may be generated as described above for substantially exactly positioning the sensing and/or treatment device 6 at the target location 16. When the sensing and/or treatment device 6 is substantially exactly positioned at the target location 16 non-invasive measurements may be performed, such as e.g. glucose monitoring.
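As a purely hypothetical illustration of this two-step guiding (the names and sign conventions below are assumptions, not part of the disclosure), the remaining misalignment obtained from image registration could be mapped onto a simple direction indication of the user interface 19, after which the actuator means takes over for the fine adjustment.

# Hypothetical sketch: map the remaining misalignment onto a coarse direction indication.
def direction_indication(dx_mm, dy_mm, coarse_threshold_mm=0.5):
    if max(abs(dx_mm), abs(dy_mm)) < coarse_threshold_mm:
        return "hold still - fine adjustment by actuator"
    horizontal = "move right" if dx_mm > 0 else "move left"   # sign convention assumed
    vertical = "move down" if dy_mm > 0 else "move up"        # sign convention assumed
    # Indicate the dominant direction first, e.g. on arrow-shaped indicators.
    return horizontal if abs(dx_mm) >= abs(dy_mm) else vertical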
In a further example of which a cross-section is schematically illustrated in Fig. 11, the repositioning system 10 may comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19 and a control unit 22. According to this example specific/unique natural markers such as e.g. skin texture or skin colour may be used as a marker 12 to identify a specific target location 16 and to register and store this target location 16 in a memory device of the repositioning system 10 as a reference image 13. The control unit 22 may comprise a memory, e.g. an electronic memory, to store the reference image 13, as well as an algorithm, of which an example was given in Fig. 6, for registering and comparing measured images 14 to the reference image 13. Generating a driving signal 5 for driving the sensing and/or treatment device 6 to guide it towards the target location 16 may then be done as described above.
According to another example of which a cross-section is illustrated in Fig. 12, the repositioning system 10 may, similar to the example illustrated in Fig. 11, comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19 and a control unit 22. According to the present example, the repositioning system 10 may furthermore comprise a built-in marker means 23, such as e.g. a stamp, to provide a marker 12 to the target location 16 on a body part 11 and a reservoir comprising a long-lasting substance, e.g. an ink reservoir 24, to provide the marker means 23 with the long-lasting substance, e.g. ink. According to another example, the repositioning system 10 may furthermore comprise a built-in microsensor implanting means to insert a microsensor below the skin at the target location 16 on a body part 11. In this example, a specific target location 16 is marked by a long-lasting marker 12 by using the built-in marker means 23. An advantage of this example is that the marker 12 can be provided in a same handling as the positioning of the sensing and/or treatment device 6 at the target location 16. Generating a driving signal 5 for driving the sensing and/or treatment device 6 to guide it towards the target location 16 may then be done as described above.
In the two examples illustrated in Fig. 11 and Fig. 12, the imaging system 2 and the sensing and/or treatment device 6 are incorporated in one common device. Hereinafter, some examples will be described in which the imaging system 2 and the sensing and/or treatment device 6 are two separate devices which are interconnected by a communicative channel 18.
In a further example of which a cross-section is illustrated in Fig. 13, the repositioning system 10 may comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19, a built-in marker means 23 such as a stamp for providing a marker 12 onto the target location 16 and an ink reservoir 24 for providing the built-in marker means 23 with ink. The imaging system 2 is separated from the sensing and/or treatment device 6 and is connected to the sensing and/or treatment device 6 by means of a communicative channel 18. The sensing and/or treatment device 6 is, in the example given, together with the actuator means 4, the built-in marker means 23 and the ink reservoir 24, guided to the target location 16 by the driving signal 5 which is generated as already described above. The driving signal 5 is sent towards the sensing and/or treatment device 6 by means of the communicative channel 18. In the example given, the actuator means 4 forms, together with the user interface 19, the built-in marker means 23 and the ink reservoir
24, part of the sensing and/or treatment device 6.
In yet another example of which a cross-section is illustrated in Fig. 14, the repositioning system 10 may, similar to the previous example, comprise an imaging system 2, actuator means 4, a sensing and/or treatment device 6, a user interface 19, a built-in marker means 23 such as a stamp for providing a marker 12 onto the target location 16 and an ink reservoir 24 for providing the built-in marker means 23 with ink (or in an alternative a built-in microsensor implanting means to insert a microsensor below the skin at the target location 16 on a body part 11). Again, the imaging system 2 is separated from the sensing and/or treatment device 6 and is connected to the sensing and/or treatment device 6 by means of a communicative channel 18. In the present example, the repositioning system 10 may furthermore comprise a radiation beam generating means 25 such as e.g. a laser for generating a radiation beam such as e.g. a laser beam. The radiation beam generating means
25, e.g. a laser, is situated in the imaging system 2. According to this example, while the imaging system 2 is acquiring at least one measured image 14 in real time, the radiation beam generating means 25, e.g. laser, may be used to identify the target location 16 in comparison with the reference image 13. The sensing and/or treatment device 6 may furthermore comprise a radiation beam detection means 26, such as e.g. a photodiode, for detecting the radiation beam, e.g. laser beam, when the target location 16 is reached and communicating this to the user, as was described earlier.
In a further example, a method according to embodiments of the invention will be illustrated. The present example makes use of a sub-surface imaging technique, i.e. Laser Speckle Contrast Analysis (LASCA) (see "Laser Doppler, speckle and related techniques for blood perfusion mapping and imaging", Physiol. Meas. 22 (2001) R35-R66, for information on this technique). The LASCA technique is able to acquire a 2D image of sub-surface blood vessels in real time. A light source suitable for being used with LASCA imaging may, for example, comprise a laser diode around 650 nm, with an output power higher than 25 mW. Such laser diodes may be readily available at relatively low cost. No scanning optics are required, because LASCA is a non-scanning, full-field technique. A conventional CCD camera can be used for recording the images. To convert the images to a map of blood vessels, the 2D images need to be acquired and processed. For this, DAQ electronics and a microprocessor may be required. For the device to be useful to the user, an update rate of several Hz is required. Such a rate can readily be obtained with modern microprocessor power. If necessary, a compromise may be made between the desired resolution of the map and the update rate of the user feedback indicators.
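The LASCA processing step itself amounts to computing the local speckle contrast K = σ/mean over a small sliding window; where blood moves, the speckle is blurred and the contrast drops, so sub-surface vessels stand out in the contrast map. A minimal sketch is given below, assuming Python with NumPy/SciPy (not prescribed by the present description); the 7x7 window is an assumed example value.

import numpy as np
from scipy.ndimage import uniform_filter

# Sketch only: local speckle contrast K = sigma / mean over a sliding window.
def speckle_contrast_map(raw_speckle_img, window=7):
    img = raw_speckle_img.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    variance = np.clip(mean_sq - mean * mean, 0.0, None)  # guard against negative rounding
    return np.sqrt(variance) / (mean + 1e-12)             # low K where blood flows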
A typical image obtained by LASCA is shown in Fig. 15. In this figure, subsurface blood vessels 27 are imaged. The present example combines LASCA with the use of a reference image 13 as is illustrated in Fig. 16. Reference number 14 indicates a measured image. The circle with reference number 16 indicates the target location. In the present example, the reference image 13 is acquired such that it is much larger than the measured images 14, so as to be able to find the location of the measured image 14 with respect to the reference image 13. A thick branching 28 of the blood vessel 27 is used as a marker 12 of the target location 16. A user interface 19 on the sensing and/or treatment device 6 indicates to a user the direction in which the sensing and/or treatment device 6 has to be moved in order to be guided to the target location 16. Hence, this example illustrates the use of sub-surface natural markers, in the example given blood vessels 27, in order to position a sensing and/or treatment device 6 at a target location 16.
A suitable repositioning system 10 which may be used with the method described in this example is illustrated in Fig. 17. The repositioning system 10 according to this example may comprise an imaging system 2, a sensing and/or treatment device 6, a control unit 22 and a user interface 19. In this example, the imaging system 2 and the sensing and/or treatment device 6 may be part of the same device. The actuator means 4 may be part of the imaging system 2 or may be part of the sensing and/or treatment device 6. The sensing and/or treatment device 6 may be any suitable sensing device known by a person skilled in the art, and may for example be a spectroscopic sensing device suitable for performing minimally invasive or non-invasive measurement of glucose. The control unit 22 may comprise a self-locating algorithm, of which an example is given in Fig. 6. This algorithm matches the at least one real-time measured image 14 to the reference image 13 stored in the memory device, and determines the location of the sensing and/or treatment device 6 with respect to the marker. Then, appropriate user feedback is given to guide the user to move the sensing and/or treatment device 6 to the target location 16. The actuator means 4 may then furthermore guide the sensing and/or treatment device 6 by means of the driving signal obtained as described above for fine tuning the position of the sensing and/or treatment device 6.
In a further example, instead of bringing the repositioning system 10 towards a body part 11 of a human being or an animal, the body part 11 of the human being or animal may be placed onto the repositioning system 10. In that case, there is no need for the user or anyone else to hold the system 10. According to this example, the repositioning system 10 may be larger than in the other cases. A schematic illustration of the present example is illustrated in Fig. 18. The repositioning system 10 comprises a large part 29 such that a body part 11 can be provided onto the repositioning system 10, an imaging system 2 and a sensing and/or treatment device 6 which are optionally incorporated in a common device. For the ease of illustration, the other parts of the repositioning system are not illustrated in this figure.
In a last example, the imaging system 2 may comprise two different imaging parts. This example is illustrated in Fig. 19. A first imaging part 31 may be used for acquiring the reference image 13 and is thus different from the second imaging part 32, which may be used for acquiring the at least one measured image 14. The first imaging part 31 may, according to embodiments, work in a non-contact fashion. This first imaging part 31 may also be referred to as the initial use device. The initial use device may acquire a high quality reference image 13 or a reference map comprising a plurality of reference images 13. It may then communicate the reference image 13 or reference map to the second imaging part 32. In the example given, the second imaging part 32 of the imaging system 2 and the sensing and/or treatment device 6 are incorporated in a common device. For the ease of illustration the other parts of the repositioning system 10 have not been shown in this figure.
The repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the present invention may be used with any technique in which analytes have to be measured within the skin of a body part 11 of the human being or animal. The repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the invention can, for example, be applied for minimally invasive or non-invasive glucose detection by means of optical spectroscopy. Further applications of the repositioning system 10 and method for positioning a sensing and/or treatment device 6 at a target location 16 on a body part of a human being or an animal according to embodiments of the present invention may include measurements of skin properties, such as e.g. skin cancer or skin aging, by means of light.
In a further aspect, the present invention also provides a system controller 100 for use in a repositioning system 10 for controlled driving of a sensing and/or treatment device 6 of the repositioning system 10 according to embodiments of the present invention. The system controller 100, which is schematically illustrated in Fig. 20, may comprise a control unit 22 for controlling actuator means 4 adapted for driving the sensing and/or treatment device 6. The control unit 22 may comprise a memory, e.g. an electronic memory, to store the reference image 13, as well as an algorithm, of which an example was given in Fig. 6, for registering and comparing measured images 14 to the reference image 13. The system controller 100 may include a computing device, e.g. a microprocessor, for instance it may be a micro-controller. In particular, it may include a programmable controller, for instance a programmable digital logic device such as a Programmable Array Logic (PAL), a Programmable Logic Array, a Programmable Gate Array, especially a Field Programmable Gate Array (FPGA). The use of an FPGA allows subsequent programming of the repositioning system 10, e.g. by downloading the required settings of the FPGA. The system controller 100 may be operated in accordance with settable parameters, such as driving parameters, for example temperature and timing parameters.
The method described above according to embodiments of the present invention may be implemented in a processing system 200 such as shown in Fig. 21. Fig. 21 shows one configuration of processing system 200 that includes at least one customisable or programmable processor 41 coupled to a memory subsystem 42 that includes at least one form of memory, e.g. RAM, ROM, and so forth. It is to be noted that the processor 41 or processors may be a general purpose, or a special purpose processor, and may be for inclusion in a device, e.g. a chip that has other components that perform other functions. Thus, one or more aspects of the method according to embodiments of the present invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The processing system may include a storage subsystem 43 that has at least one disk drive and/or CD-ROM drive and/or DVD drive. In some implementations, a display system, a keyboard, a pointing device or a touchscreen may be included as part of a user interface subsystem 44 to provide for a user to manually input information, such as parameter values indicating directions of movement. More elements such as network connections, interfaces to various devices, and so forth, may be included in some embodiments, but are not illustrated in Fig. 21. The various elements of the processing system 200 may be coupled in various ways, including via a bus subsystem 45, shown in Fig. 21 for simplicity as a single bus, but which will be understood by those skilled in the art to include a system of at least one bus. The memory of the memory subsystem 42 may at some time hold part or all (in either case shown as 46) of a set of instructions that, when executed on the processing system 200, implement the steps of the method embodiments described herein.
The present invention also includes a computer program product which provides the functionality of any of the methods according to embodiments of the present invention when executed on a computing device. Such a computer program product can be tangibly embodied in a carrier medium carrying machine-readable code for execution by a programmable processor. The present invention thus relates to a carrier medium carrying a computer program product that, when executed on computing means, provides instructions for executing any of the methods as described above. The term "carrier medium" refers to any medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to non-volatile media and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as a storage device which is part of mass storage. Common forms of computer readable media include a CD-ROM, a DVD, a flexible disk or floppy disk, a tape, a memory chip or cartridge or any other medium from which a computer can read. Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution. The computer program product can also be transmitted via a carrier wave in a network, such as a LAN, a WAN or the Internet. Transmission media can take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. Transmission media include coaxial cables, copper wire and fibre optics, including the wires that comprise a bus within a computer.
It is to be understood that although preferred embodiments, specific constructions and configurations, as well as materials, have been discussed herein for devices according to the present invention, various changes or modifications in form and detail may be made without departing from the scope of this invention as defined by the appended claims.


CLAIMS:
1. A repositioning system (10) for positioning a sensing and/or treatment device (6) at a target location (16), the repositioning system (10) comprising an imaging system (2) for identifying the target location (16) having a marker (12) in or on a body part (11) of a human being or an animal, thereby obtaining a reference image (13) of the target location (16), and for acquiring at least one measured image (14) of the target location (16), the imaging system (2) furthermore comprising: registration means (3) for registering the at least one measured image (14) to the reference image (13) to determine spatial alignment between the at least one measured image (14) and the reference image (13), and - signal generating means for generating a driving signal (5) indicative of the target location direction when spatial alignment between the at least one measured image and the reference image is not approached within a predetermined threshold value, and wherein the repositioning system (10) furthermore comprises actuator means (4) for driving the sensing and/or treatment device (6) to guide it towards the target location thereby using the driving signal (5).
2. A repositioning system (10) according to claim 1, furthermore comprising the sensing device (6) and/or a treatment device.
3. A repositioning system (10) according to claim 1 or 2, furthermore comprising a control unit (22) comprising a memory for storing the at least one reference image (13).
4. A repositioning system (10) according to claim 2 or 3, wherein the imaging system (2) and the sensing and/or treatment device (6) are incorporated in one common device.
5. A repositioning system (10) according to claim 2 or 3, wherein the imaging system (2) and the sensing and/or treatment device (6) are formed by separate devices.
6. A repositioning system (10) according to any of the previous claims, furthermore comprising a user interface (19) for providing feedback to a user for guiding the sensing and/or treatment device (6) towards the target location (16).
7. A repositioning system (10) according to any of the previous claims, furthermore comprising a marker means (23) for providing a marker (12) on the target location (16) and an ink reservoir (24) for providing the marker means (23) with ink, or a marker means (23) for implanting a microsensor at the target location (16).
8. A repositioning system (10) according to any of the previous claims, wherein the imaging system (2) furthermore comprises a radiation beam generating means (25) for generating a radiation beam and wherein the sensing and/or treatment device (6) furthermore comprises a radiation beam detection means (26) for detecting the radiation beam.
9. A method for making a repositioning system (10), the method comprising: providing an imaging system (2) for acquiring at least one reference image (13) and at least one measured image (14), the imaging system (2) comprising a registration means (3) for registering the at least one measured image (14) to the reference image (13) and signal generating means for generating a driving signal (5) indicative of the target location direction, and providing actuator means (4) for driving the sensing and/or treatment device (6) to guide it towards the target location thereby using the driving signal (5).
10. Method according to claim 9, furthermore comprising providing a control unit (22) comprising a memory for storing the at least one reference image (13).
11. Method according to any of claims 9 or 10, furthermore comprising providing a user interface (19) for providing feedback to a user for guiding the sensing and/or treatment device (6) towards the target location (16).
12. A method for positioning a sensing and/or treatment device (6) on a target location (16) on a body part (11) of a human being or an animal, the method comprising: identifying the target location (16) having a marker (12) in or on the body part (11) by acquiring at least one reference image (13), acquiring at least one measured image (14) of the target location (16), determining spatial alignment between the at least one measured image (14) and the reference image (13), generating a driving signal (5) indicative of the target location direction when spatial alignment between the at least one measured image (14) and the reference image (13) is not approached within a predetermined threshold value, and driving the sensing and/or treatment device (6) with the driving signal (5) to guide it towards the target location (16).
13. Method according to claim 12, furthermore comprising storing the reference image (13) in a memory device of a control unit (22).
14. Method according to claim 12 or 13, furthermore comprising, before driving the sensing and/or treatment device (6) with the driving signal (5), providing feedback to a user interface incorporated in the sensing and/or treatment device (6) for giving instructions to a user on how to move and/or rotate the sensing and/or treatment device (6).
15. A controller (100) for controlled driving of a sensing and/or treatment device (6) towards a target location (16) on a body part (11) of a human being or an animal, the controller (100) comprising a control unit (22) for controlling actuator means (4) adapted for driving the sensing and/or treatment device (6).
PCT/IB2009/051998 2008-05-19 2009-05-14 Reproducible positioning of sensing and/or treatment devices WO2009141769A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08156439.5 2008-05-19
EP08156439 2008-05-19

Publications (1)

Publication Number Publication Date
WO2009141769A1 true WO2009141769A1 (en) 2009-11-26

Family

ID=41130484

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/051998 WO2009141769A1 (en) 2008-05-19 2009-05-14 Reproducible positioning of sensing and/or treatment devices

Country Status (1)

Country Link
WO (1) WO2009141769A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012111012A1 (en) 2011-02-17 2012-08-23 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
CN102670177A (en) * 2011-03-15 2012-09-19 明达医学科技股份有限公司 Skin optical diagnosis device and operation method thereof
EP2675345A1 (en) * 2011-02-17 2013-12-25 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
WO2014045558A1 (en) * 2012-09-20 2014-03-27 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
JP2014509010A (en) * 2011-02-17 2014-04-10 イーオン メディカル リミテッド System and method for performing medical tests guided by automatic and remote trained persons
EP2781184A1 (en) * 2013-03-18 2014-09-24 MedSense Inc. Positioning system for pulse measurement devices
WO2015016290A1 (en) * 2013-07-31 2015-02-05 Canon Kabushiki Kaisha Image processing apparatus and image processing method
WO2015016292A3 (en) * 2013-07-31 2015-04-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
JP2017074398A (en) * 2016-12-06 2017-04-20 ソニー株式会社 Information processing apparatus, information processing method, program, and measurement system
WO2017151441A3 (en) * 2016-02-29 2017-10-19 Truinject Medical Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
WO2018055401A3 (en) * 2016-09-22 2018-05-03 Guy's And St. Thomas' Nhs Foundation Trust System for marking the surface of a patient's tissue in the course of imaging
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
US10290232B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
US10643497B2 (en) 2012-10-30 2020-05-05 Truinject Corp. System for cosmetic and therapeutic training
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
EP3673790A1 (en) * 2018-12-24 2020-07-01 Koninklijke Philips N.V. Device to detect and mark skin areas of interest for user self-detection or other devices
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
WO2024135313A1 (en) * 2022-12-19 2024-06-27 株式会社資生堂 Position identification method and position identification device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19529950C1 (en) * 1995-08-14 1996-11-14 Deutsche Forsch Luft Raumfahrt Guiding method for stereo laparoscope in minimal invasive surgery
WO1999027839A2 (en) * 1997-12-01 1999-06-10 Cosman Eric R Surgical positioning system
WO2005065565A1 (en) * 2003-12-31 2005-07-21 Palomar Medical Technologies, Inc. Dermatological treatment with visualization
US20080033410A1 (en) * 2006-08-02 2008-02-07 Rastegar Jahangir S Automated laser-treatment system with real-time integrated 3D vision system for laser debridement and the like
WO2008032234A1 (en) * 2006-09-11 2008-03-20 Philips Intellectual Property & Standards Gmbh System and method for positioning electrodes on a patient body

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2675345A4 (en) * 2011-02-17 2015-01-21 Eon Medical Ltd System and method for performing an automatic and self-guided medical examination
EP2675345A1 (en) * 2011-02-17 2013-12-25 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
US10143373B2 (en) 2011-02-17 2018-12-04 Tyto Care Ltd. System and method for performing an automatic and remote trained personnel guided medical examination
WO2012111012A1 (en) 2011-02-17 2012-08-23 Eon Medical Ltd. System and method for performing an automatic and self-guided medical examination
JP2014509010A (en) * 2011-02-17 2014-04-10 イーオン メディカル リミテッド System and method for performing medical tests guided by automatic and remote trained persons
JP2014518642A (en) * 2011-02-17 2014-08-07 イーオン メディカル リミテッド System and method for performing automatic and self-guided medical examinations
CN102670177A (en) * 2011-03-15 2012-09-19 明达医学科技股份有限公司 Skin optical diagnosis device and operation method thereof
CN102670177B (en) * 2011-03-15 2015-05-20 明达医学科技股份有限公司 Skin optical diagnosis device and operation method thereof
KR20150058194A (en) * 2012-09-20 2015-05-28 소니 주식회사 Information processing apparatus, information processing method, program, and measuring system
US9646378B2 (en) 2012-09-20 2017-05-09 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
KR102200740B1 (en) * 2012-09-20 2021-01-08 소니 주식회사 Information processing apparatus, information processing method, program, and measuring system
WO2014045558A1 (en) * 2012-09-20 2014-03-27 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
JP2014061057A (en) * 2012-09-20 2014-04-10 Sony Corp Information processor, information processing method, program, and measurement system
CN104684461A (en) * 2012-09-20 2015-06-03 索尼公司 Information processing apparatus, information processing method, program, and measuring system
US11854426B2 (en) 2012-10-30 2023-12-26 Truinject Corp. System for cosmetic and therapeutic training
US11403964B2 (en) 2012-10-30 2022-08-02 Truinject Corp. System for cosmetic and therapeutic training
US10643497B2 (en) 2012-10-30 2020-05-05 Truinject Corp. System for cosmetic and therapeutic training
US10902746B2 (en) 2012-10-30 2021-01-26 Truinject Corp. System for cosmetic and therapeutic training
US9364182B2 (en) 2013-03-18 2016-06-14 Maisense Inc. Pulse measurement devices for bio-signals
EP2781184A1 (en) * 2013-03-18 2014-09-24 MedSense Inc. Positioning system for pulse measurement devices
GB2534051A (en) * 2013-07-31 2016-07-13 Canon Kk Image processing apparatus and image processing method
WO2015016290A1 (en) * 2013-07-31 2015-02-05 Canon Kabushiki Kaisha Image processing apparatus and image processing method
CN105431077A (en) * 2013-07-31 2016-03-23 佳能株式会社 Image processing apparatus and image processing method
GB2534051B (en) * 2013-07-31 2017-06-14 Canon Kk Image processing apparatus and image processing method
US9700199B2 (en) 2013-07-31 2017-07-11 Canon Kabushiki Kaisha Image processing apparatus and image processing method
WO2015016292A3 (en) * 2013-07-31 2015-04-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US10244937B2 (en) 2013-07-31 2019-04-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
US10896627B2 (en) 2021-01-19 Truinject Corp. Injection site training system
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10290232B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
US12070581B2 (en) 2015-10-20 2024-08-27 Truinject Corp. Injection system
WO2017151441A3 (en) * 2016-02-29 2017-10-19 Truinject Medical Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US10743942B2 (en) 2016-02-29 2020-08-18 Truinject Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
US11730543B2 (en) 2016-03-02 2023-08-22 Truinject Corp. Sensory enhanced environments for injection aid and social training
WO2018055401A3 (en) * 2016-09-22 2018-05-03 Guy's And St. Thomas' Nhs Foundation Trust System for marking the surface of a patient's tissue in the course of imaging
JP2017074398A (en) * 2016-12-06 2017-04-20 ソニー株式会社 Information processing apparatus, information processing method, program, and measurement system
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
US11710424B2 (en) 2017-01-23 2023-07-25 Truinject Corp. Syringe dose and position measuring apparatus
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
EP3673790A1 (en) * 2018-12-24 2020-07-01 Koninklijke Philips N.V. Device to detect and mark skin areas of interest for user self-detection or other devices
WO2024135313A1 (en) * 2022-12-19 2024-06-27 株式会社資生堂 Position identification method and position identification device

Similar Documents

Publication Publication Date Title
WO2009141769A1 (en) Reproducible positioning of sensing and/or treatment devices
RU2535605C2 (en) Recalibration of pre-recorded images during interventions using needle device
US20190239751A1 (en) Compact Optical Imaging Devices, Systems, and Methods
US7930015B2 (en) Methods and sensors for monitoring internal tissue conditions
JP4849755B2 (en) Imaging apparatus and sample analysis method
US7697966B2 (en) Noninvasive targeting system method and apparatus
US8244332B2 (en) Three-dimensional breast anatomy imaging system
US20110242301A1 (en) Image processing device, image processing method, and program
CA2860026A1 (en) Biopsy device with integrated optical spectroscopy guidance
US20060241450A1 (en) Ultrasound guided tissue measurement system
JP2013542768A5 (en)
JP2013542768A (en) Flexible leash with integrated sensors for dynamic instrument tracking
CN104067313B (en) Imaging device
JP2005501586A (en) Multi-sensor probe for tissue identification
CN110037660A (en) Pressure sore detection system based on near-infrared spectrum technique
US20110066092A1 (en) Perfusion regulation device
JP3182601B2 (en) Tissue type recognition method and apparatus therefor
CN115005989A (en) Multi-light fusion brain positioning method and corresponding positioning system
EP2983581A1 (en) System and method for imaging biomarkers indicative of cardiac thermal ablation lesions
RU2804292C1 (en) Device for carrying out low-traumatic optical biopsy
WO2022243714A1 (en) Depth-surface imaging device for registering ultrasound images to each other and to surface images by using surface information
US20170164837A1 (en) Intraoperative guidance system for tumor surgery
JP2000225108A (en) Biomodel for non-invasive biological inspection
JP2024518392A (en) Augmented reality headsets and probes for medical imaging
CN115024819A (en) Multi-light fusion brain positioning system

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 09750226

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 09750226

Country of ref document: EP

Kind code of ref document: A1