WO2013057703A1 - Body surface feedback for medical interventions - Google Patents

Body surface feedback for medical interventions Download PDF

Info

Publication number
WO2013057703A1
WO2013057703A1 (PCT/IB2012/055729)
Authority
WO
WIPO (PCT)
Prior art keywords
garment
feedback
recited
subject
sensors
Prior art date
Application number
PCT/IB2012/055729
Other languages
English (en)
French (fr)
Inventor
Tobias Klinder
Robert Manzke
Raymond Chan
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Priority to CN201280051305.4A priority Critical patent/CN103889259B/zh
Priority to US14/352,693 priority patent/US20140323856A1/en
Priority to EP12805526.6A priority patent/EP2747590A1/en
Priority to JP2014536394A priority patent/JP2014534848A/ja
Publication of WO2013057703A1 publication Critical patent/WO2013057703A1/en

Links

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 - Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 - Sensor mounted on worn items
    • A61B5/6804 - Garments; Clothes
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B5/742 - Details of notification to user or communication with user or patient; user input means using visual displays
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 - Electrical control of surgical instruments
    • A61B2017/00115 - Electrical control of surgical instruments with audible or visual output
    • A61B2017/00119 - Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681 - Aspects not otherwise provided for
    • A61B2017/00694 - Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00699 - Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement caused by respiration, e.g. by triggering
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 - Tracking techniques
    • A61B2034/2061 - Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 - Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983 - Reference marker arrangements for use with image guided surgery
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 - Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061 - Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 - Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/065 - Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • A61B5/1135 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing by monitoring thoracic expansion
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B5/746 - Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 - Surgical systems with images on a monitor during operation

Definitions

  • This disclosure relates to optical shape sensing and more particularly to systems and methods which employ optical shape sensing and sensory feedback for medical applications.
  • Breathing motion can be a problem in any thoracic or abdominal imaging procedure because, e.g., standard image reconstruction algorithms usually implicitly assume a static scan object. Motion therefore can affect imaging of larger volumes when the time needed for the imaging procedure is comparable to or even longer than the respiratory period. Image-guided interventional procedures can suffer from the effect of motion during the intervention, since this can make automated assignment of reference points in the live images difficult.
  • Devices for monitoring respiratory motion such as spirometers or breathing belts can yield only crude information about the patient's breathing motion state at specific points in time during the acquisition.
  • precise knowledge of both the state of the respiratory cycle and the shape of body contours and surfaces at a given time becomes increasingly important. Even if breathing is accounted for in images, it may not be sufficient feedback for a physician attempting an operative task, such as inserting a needle or the like.
  • Making use of pre-operative information (e.g., patient-specific anatomy) and having direct, real-time feedback of certain interactions (e.g., needle insertion) would therefore be beneficial.
  • a system for providing sensory feedback includes a garment configured to flexibly and snugly fit over a portion of a subject.
  • the garment includes one or more sensors disposed therein to monitor activity of the subject or monitor points of interest of the subject.
  • An interpretation module is coupled with the sensors to receive sensor signals and interpret the sensor signals to determine if conditions are met to provide feedback signals to the garment.
  • a feedback modality is incorporated into the garment and is responsive to the feedback signals such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.
  • the feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
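  • As a rough illustration only (a minimal sketch in Python; all names are hypothetical and not taken from the patent), the interpretation step described above might check incoming sensor readings against programmed conditions and emit feedback signals for the garment:

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class FeedbackSignal:
            target: str  # e.g., an LED or actuator identifier on the garment
            action: str  # e.g., "illuminate"

        def interpret(readings: Dict[str, float],
                      conditions: Dict[str, Callable[[Dict[str, float]], bool]]) -> List[FeedbackSignal]:
            """Emit a feedback signal for every programmed condition that is met."""
            return [FeedbackSignal(target=name, action="illuminate")
                    for name, is_met in conditions.items() if is_met(readings)]

        # Example condition: flag chest expansion beyond a threshold.
        conditions = {"breath_led": lambda r: r.get("chest_strain", 0.0) > 0.8}
        print(interpret({"chest_strain": 0.9}, conditions))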
  • a garment providing sensory feedback includes a fabric configured to fit over at least a portion of a subject.
  • One or more sensors are incorporated in the fabric to perform measurements from the subject.
  • a feedback modality is incorporated into the fabric and is responsive to one or more feedback signals derived from measurements such that the feedback modality emits energy from the fabric to provide sensory information to assist a physician during a procedure.
  • the feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
  • a method for providing sensory feedback includes providing a garment configured to fit over at least a portion of a subject, one or more sensors incorporated in the garment to perform measurements from the subject, and a feedback modality incorporated into the garment and being responsive to the one or more feedback signals derived from measurements such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure; generating the one or more feedback signals derived from the measurements; and activating the feedback modality in accordance with the feedback signals to emit energy from the garment to provide sensory information to assist a physician.
  • FIG. 1 is a block/flow diagram showing an illustrative system/method which employs sensory feedback from a patient during a procedure in accordance with the present principles
  • FIG. 2 is a schematic diagram showing a garment for generating sensory feedback in accordance with one illustrative embodiment
  • FIG. 3 is a diagram showing a garment or manifold in accordance with another illustrative embodiment.
  • FIG. 4 is a flow diagram showing steps for measuring data and generating sensory feedback from a garment in accordance with an illustrative embodiment of the present invention.
  • a measurement system and method provide an adaptable and optimized setup configured to provide status feedback of a patient.
  • Lighting elements or other feedback mechanisms are associated with a patient's body surface (e.g., by including them, together with sensors, in a medical vest or other garment) to provide visual/audio feedback to a physician about a physical state or states of the patient, and also to support augmented reality.
  • Augmented reality here refers to the use of pre-operatively collected data during a present-time procedure.
  • the data collected preoperatively may be employed to indicate points of interest directly on a patient during a procedure.
  • shape sensing technology is employed to determine shape information of a patient's body.
  • the shape sensing data may be collected using a garment equipped with shape sensing technology.
  • Shape information can be derived from a variety of systems. These include optical shape interrogation systems (e.g., fiber optic Bragg sensors, Rayleigh scattering, Brillouin scattering, optical intensity-based attenuation), multi-coil arrays for electromagnetic (EM) localization of points on the apparatus, laser scanning systems for three-dimensional surface estimation, and optical/acoustic marker/emitter arrays for camera-based (e.g., time-of-flight) localization.
  • a shape sensing garment may be equipped with sensors and/or feedback devices to indicate positions on the patient or a status or activity of the patient (e.g., breathing cycles, swallowing, muscle twitching, etc.).
  • one or more shape sensing optical fibers are employed in a garment with electrical, thermal or other measurement sensors. The fibers and other sensors work in conjunction to provide signals to feedback devices such as light emitting diodes or other feedback mechanisms to give guidance to the physician using preoperative data or currently measured data.
  • the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any instruments employed in tracking or analyzing complex biological or mechanical systems.
  • the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastrointestinal tract, excretory organs, blood vessels, etc.
  • the elements depicted in the FIGS may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
  • processors can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared.
  • explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.
  • embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
  • System 100 may include a workstation or console 112 from which a procedure is supervised and managed. Procedures may include any procedure including but not limited to biopsies, ablations, injection of medications, etc. Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications. It should be understood that the function and components of system 100 may be integrated into one or more workstations or systems.
  • Memory 116 may store an interpretation module 115 configured to interpret electromagnetic, optical, acoustic, etc. feedback signals from a sensitized flexible garment 106.
  • the garment 106 may include fiber sensors, optical, acoustic, electrical or electromagnetic markers or sensors, etc. embedded therein with known geometry or with a geometry that is initialized before use.
  • a shape interrogation console 122 measures the marker/sensor distribution over the surface of interest and supplies feedback about calibration / reference sections and measurement sections to the interpretation module 115.
  • the shape interrogation module 122 sends and receives light to/from optical fibers or provides electrical power or signals to sensors 104.
  • the optical fiber sensors 104 are woven or otherwise integrated into garment 106 in a pattern that allows for stretching of the underlying textile substrate while accounting for the fact that the overall fiber sensor length in the textile can change only minimally (e.g., a 2D spiral pattern or 2D sinusoidal pattern embedded within the flexible membrane).
  • the fibers for sensors 104 are locally anchored at control points to provide a strain in the fiber during the flexure of the subject 148.
  • control points can constrain the fiber in all degrees of freedom relative to the mesh, e.g., at the fiber tip, whereas others can allow for a sliding degree of freedom so that the fiber can slide freely relative to the mesh pattern to accommodate any overall path length changes in the patterned structure as the mesh deforms.
  • the interpretation module 115 may include the capability of receiving multiple inputs from multiple devices or systems to interpret an event or dynamic occurrence during a medical procedure, diagnostic test, etc.
  • a medical imaging device 110 and/or a tracking module 117 may also be included and may provide additional feedback to the interpretation module 115.
  • the interpretation module 115 is configured to use the signal feedback (and any other feedback) to account for errors or aberrations related to dynamic changes of a patient's body.
  • a subject 148 or a region of interest 140 on the subject 148 is covered or constrained by the flexible garment 106.
  • the flexible garment 106 may include a fabric or netting configured to stretch corresponding with movement or flexure of the subject 148 or the region of interest 140.
  • garment 106 includes a feedback modality or feedback mechanisms 108.
  • the feedback mechanisms 108 are configurable to react to stimuli collected by sensors 104 (or by other external data).
  • feedback mechanisms 108 include lights, such as light emitting diodes (LEDs) 109 integrated into the fabric of garment 106.
  • the LEDs 109 are distributed within and throughout the garment 106 to provide feedback as to locations and/or events occurring relative to the patient.
  • a medical device 102 may include, e.g., a needle, a catheter, a guide wire, an endoscope, a probe, a robot, an electrode, a filter device, a balloon device or other medical component, etc.
  • the device 102 is to be inserted in the patient.
  • the device 102 has its coordinate system registered to pre-operative data 119 (e.g., image data).
  • an LED 109 nearest to a tip of the device 102 (which can be tracked by a tracking device 107) is illuminated on the garment 106 to provide visual feedback to the physician and any other person present in the environment.
  • One or more tracking devices or cameras 107 may be incorporated into the device 102, so tracking information can be provided.
  • the tracking devices 107 may include electromagnetic (EM) trackers, fiber optic tracking, robotic positioning systems, cameras, etc.
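  • A minimal sketch of the nearest-LED behavior described above, assuming the tip and LED positions have already been brought into one registered coordinate frame (names and values are hypothetical):

        import numpy as np

        def nearest_led(tip_xyz: np.ndarray, led_xyz: np.ndarray) -> int:
            """Index of the garment LED closest to the tracked device tip.
            tip_xyz: (3,) position; led_xyz: (N, 3) positions, same frame and units."""
            return int(np.argmin(np.linalg.norm(led_xyz - tip_xyz, axis=1)))

        leds = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 50.0, 0.0]])  # mm
        print(nearest_led(np.array([45.0, 5.0, 10.0]), leds))  # -> 1 (the LED at x = 50)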
  • breathing, heartbeat and swallowing data are collected by sensors 104 in the garment 106.
  • the patient's breathing is visually indicated by a first LED (e.g., a white LED illuminated at each breathing cycle), the patient's heartbeat is indicated by a second LED (e.g., a red LED that is illuminated at each beat) and the patient's swallowing is indicated by a third LED (e.g., a blue LED that is illuminated at each swallow).
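  • Purely to illustrate the color coding above (the controller call is a hypothetical stand-in, not an API from the patent), the event-to-LED mapping reduces to a small table:

        EVENT_LED = {"breath": "white", "heartbeat": "red", "swallow": "blue"}

        def on_event(event: str) -> None:
            color = EVENT_LED.get(event)
            if color:
                print(f"pulse {color} LED")  # stand-in for a real garment-controller call

        on_event("heartbeat")  # -> pulse red LED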
  • the garment 106 is spatially registered with pre-operative data 119, such as pre-operative images.
  • image landmarks may be indicated using LEDs 109 to assist in locating a proper insertion point. In addition, a depth of the needle may be indicated on the garment 106 by a number or color of LEDs 109 that have been lit.
  • the system 100 can be configured to be sensitive to any event or movement of a patient.
  • the sensors 104 are configured to relay positional, temperature, electric field information, shape information, etc. back to the interpretation module 115.
  • the garment 106 may be disposed over a mid-section of a patient (148) such that during a breathing cycle sensors 104 sense the dynamic shape changes of the abdomen or chest. This information may be interpreted using the interpretation module 115 which computes distances or changes in distances between nodes or positions in the garment 106.
  • the mesh deflections may then be employed to account for breathing in images taken by an imaging device 110 or assist in the timing of an action during a medical procedure (e.g., inserting a device on an exhale, etc.), or any other event or action that needs compensation for dynamic changes.
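  • A sketch of that distance-based interpretation, assuming node positions on the garment are available as an N x 3 array (a crude surrogate, not the patent's algorithm):

        import numpy as np

        def mean_node_spacing(nodes_xyz: np.ndarray) -> float:
            """Mean pairwise distance between garment mesh nodes (N x 3 array)."""
            d = np.linalg.norm(nodes_xyz[:, None, :] - nodes_xyz[None, :, :], axis=-1)
            iu = np.triu_indices(len(nodes_xyz), k=1)
            return float(d[iu].mean())

        def is_exhaling(prev_nodes: np.ndarray, curr_nodes: np.ndarray) -> bool:
            """Crude respiratory surrogate: chest nodes moving closer together."""
            return mean_node_spacing(curr_nodes) < mean_node_spacing(prev_nodes)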
  • the garment 106 may be applied to any portion of the subject's anatomy to collect dynamic data.
  • the garment 106 may be placed over the arms, legs, abdomen, chest, neck, head, or combinations thereof.
  • the garment 106 may be made adjustable to be configured for different sized subjects, different sized appendages, etc.
  • the garment 106 may be employed during a medical procedure to assist a clinician in performing the procedure.
  • Workstation 112 may include a display 118 for viewing internal images of the subject 148 using the imaging system 110.
  • the imaging system 110 may include one or more imaging modalities, such as, e.g., ultrasound, photoacoustics, a magnetic resonance imaging (MRI) system, a fluoroscopy system, a computed tomography (CT) system, positron emission tomography (PET), single photon emission computed tomography (SPECT), or other system.
  • Imaging system 110 may be provided to collect real-time intraoperative imaging data.
  • the imaging data may be displayed on display 118.
  • Display 118 may permit a user to interact with the workstation 112 and its components and functions. This is further facilitated by an interface 120 which may include a keyboard, mouse, a joystick or any other peripheral or control to permit user interaction with the workstation 112.
  • a controller module 126 or other device is provided to condition signals and to control feedback mechanisms 108.
  • the controller module 126 may generate control signals to control various controllers, sensors, radiation sources/beams, etc. in accordance with programmed conditions for which feedback is desired. The manner of the feedback response can also be programmed.
  • the controller 126 receives data from the interpretation module 115 and issues commands or signals to the feedback mechanisms 108.
  • the interpretation module 115 dynamically provides information collected and interpreted from the sensors 104 in garment 106 to the controller 126 to render the feedback to the physician in real-time, which may then be employed in administering medication, making decisions, etc.
  • garment 106 includes a vest or manifold 202.
  • the vest 202 is formed from a mesh or fabric 206.
  • the mesh 206 or vest 202 measures body surface deformation continuously in time and space with high spatial resolution (e.g., a shape sensing vest).
  • the vest 202 is preferably flexible with a snug fit over the subject 148.
  • sensing fibers 210 are integrated in the vest 202 and are employed to determine a shape of the chest of the subject 148.
  • the sensing fibers 210 are also employed to determine dynamic geometry changes in the subject 148 and/or monitor a status of the subject 148.
  • the sensing fiber(s) 210 may include a single optical fiber integrated into the vest 202 that spirals around the subject's body and hence delivers a sufficient picture of the geometry or may include multiple fibers integrated into the vest 202.
  • the sensing fibers 210 may include one or more fiber optic Bragg gratings (FBGs), which are segments of an optical fiber that reflect particular wavelengths of light and transmit all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror.
  • an FBG can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
  • a fundamental principle behind the operation of an FBG is Fresnel reflection at each of the interfaces where the refractive index changes. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission.
  • the Bragg wavelength is sensitive to strain as well as to temperature, which means that FBGs can be used as sensing elements in fiber optical sensors. In an FBG sensor, an applied strain (ε) and a change in temperature (ΔT) cause a shift Δλ_B in the Bragg wavelength λ_B; the relative shift is approximately given by: Δλ_B/λ_B = C_S·ε + C_T·ΔT.
  • the coefficient C_S is called the coefficient of strain; its magnitude is usually around 0.8×10⁻⁶ per microstrain (με), or in absolute terms about 1 pm/με.
  • the coefficient C_T describes the temperature sensitivity of the sensor and is made up of the thermal expansion coefficient and the thermo-optic effect. Its value is around 7×10⁻⁶/K (or, as an absolute quantity, 13 pm/K).
  • One of the main advantages of the technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure allows for evaluation of the curvature of the structure as a function of longitudinal position and hence for the three dimensional form of such a structure to be precisely determined.
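  • To make the relation above concrete, here is a small numeric sketch using the coefficient magnitudes quoted in the text (the measurement values themselves are invented for illustration):

        C_S = 0.8e-6  # strain coefficient, per microstrain (from the text)
        C_T = 7.0e-6  # temperature coefficient, per kelvin (from the text)

        def strain_microstrain(rel_shift: float, d_temp_K: float) -> float:
            """Invert dLambda/Lambda = C_S*eps + C_T*dT for strain, in microstrain."""
            return (rel_shift - C_T * d_temp_K) / C_S

        # A 1550 nm grating shifting by 2 pm while warming by 0.1 K:
        rel_shift = 2e-12 / 1550e-9
        print(f"{strain_microstrain(rel_shift, 0.1):.2f} microstrain")  # ~0.74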
  • the vest 202 may include an integrated lighting arrangement 220 (LEDs) and/or a visual display screen 222 within the vest 202.
  • the lighting arrangement 220 may include a grid or grids of LEDs 224 disposed on a surface or integrated within the vest 202.
  • the grid of LEDs 224 may include or be associated with tracking devices 226 or with the sensing fibers 210 such that each LED 224 can be located relative to the subject 148.
  • the grid pattern shown in FIG. 2 is illustrative and non-limiting as other configurations are also contemplated.
  • An initialization process may be employed once the vest 202 is securely positioned on the subject 148 to register the location of the vest 202 and/or the LEDs 224 (or display screen 222) with positions or references on the subject 148.
  • a registration scheme is needed between pre-operative data (e.g., images) and intra-operative patient positions.
  • Such registration may employ known technologies and methods. For example, fiducial markers on the patient and/or in the vest 202 may be registered with corresponding points in the pre-operative images.
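  • One standard choice for such point-based registration is a least-squares rigid fit (Kabsch/SVD). The following is a generic sketch under that assumption, not the patent's prescribed method:

        import numpy as np

        def rigid_register(src: np.ndarray, dst: np.ndarray):
            """Least-squares R, t such that dst_i ~ R @ src_i + t (both N x 3, N >= 3)."""
            c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
            H = (src - c_src).T @ (dst - c_dst)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:  # correct an improper (reflected) rotation
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = c_dst - R @ c_src
            return R, t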
  • the signals from the sensors in the vest 202 are interpreted by the shape interpretation module (module 115), which generates signals that are addressed to one or more locations in the grid of LEDs 224 by the controller 126 (FIG. 1).
  • the LEDs 224 that satisfy the programmed conditions are illuminated by the controller 126 and provide feedback to the physician or others.
  • In order to bridge the gap between the pre-operative information (in particular, the anatomical information provided by a computed tomography (CT) or a magnetic resonance (MR) scan, together with associated plans, e.g., needle insertion points on the body surface) and the live view of the physician in the operating room, augmented reality is employed.
  • This may entail a camera system mounted on the head of the physician together with a special set of glasses designed so that, while looking through them, the physician can virtually overlay preoperative information over a patient or region of interest. This feature can be switched on or off.
  • relevant pre-operative information may be displayed directly on the body surface by making use of vest 202 with integrated lighting or display screen 222, which is worn by the patient throughout the procedure.
  • This display screen 222 may include a flexible display integrated with the fabric of vest 202.
  • preoperative images may be registered to the patient and displayed on the screen 222.
  • Representations of instruments or overlays may be generated on the screen in accordance with information received from the sensors to integrate visual feedback via integrated display / lighting components.
  • the visual feedback feature(s) of the vest 202 may be used independently of the shape sensing.
  • the LEDs 224 provide visual feedback in the form of information encoded spatially and temporally via color, intensity, phase/timing, direction, etc. Depending on the application these LEDs 224 can be spread sparsely or densely. Information can be presented to reflect guidance information for navigation of an instrument through an access port based on real time measurements of surface anatomy deformation (that in turn reflects organ and target motion relative to an original planned path). Other sensory information may be provided to the physician and can take the form of acoustic or haptic / tactile feedback that is spatially modulated over the surface of the vest/manifold 202.
  • knowing the deformation of the outer body surface is of wide interest for many applications.
  • One application includes respiratory motion compensation which is a problem for many image guided interventions.
  • Another application is the deformation of organs and tissue due to applied forces during the interventions (e.g., needle insertion).
  • the vest provides effective feedback, e.g., through visualization.
  • the visual feedback of deformation measurements makes use of lighting from LEDs 224 or from the display screen 222 included in the clothing.
  • the sensor / feedback enabled clothing itself may take the form of a surgical drape or other sterile manifold disposed over the patient. This component may be reusable or disposable in nature, depending on the application and costs involved.
  • the use for this application would be to convert a measurement from the shape sensing fibers into an LED signal. By doing so, the physician has direct visual feedback of, e.g., how much deformation was caused when the needle was inserted or where areas of high respiratory motion exist.
  • a pre-procedural scan of the patient is acquired.
  • the patient wears the shape sensing vest 202 with the lighting system included.
  • a continuous mapping of the outer surface can be calculated between the pre-operative data and the current patient deformation. This mapping also gives an estimate of the internal organ position (plus locations of derived information, e.g., during planning as a region of interest or insertion point).
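  • As a deliberately simple stand-in for such a mapping (the patent does not commit to a particular scheme), surface displacements measured by the vest could be interpolated to an internal point of interest, e.g., by inverse-distance weighting:

        import numpy as np

        def estimate_displacement(point: np.ndarray,
                                  surf_pts: np.ndarray,
                                  surf_disp: np.ndarray,
                                  power: float = 2.0) -> np.ndarray:
            """Inverse-distance-weighted estimate of motion at an internal point.
            surf_pts: (N, 3) vest node positions; surf_disp: (N, 3) displacements."""
            d = np.linalg.norm(surf_pts - point, axis=1)
            w = 1.0 / np.maximum(d, 1e-9) ** power
            return (w[:, None] * surf_disp).sum(axis=0) / w.sum()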
  • the lighting system LEDs 224 and/or display screen 222 are employed to display useful information during the intervention or other procedure, e.g., to activate certain LEDs 224 to display the insertion points or even to use the LEDs 224 to display the organ position mapped on a surface of vest 202.
  • the shape sensing vest 202 permits tracking of patient/organ deformation in real-time.
  • the LED feedback is able to account for, e.g., respiratory motion, reactionary forces, etc.
  • Display screen 222 may include a higher-resolution flexible display integrated in the vest 202 for displaying medical (imaging) data.
  • the medical data can be filtered or transformed depending on input of the deformation sensors 210 in the vest 202 or other data acquired by other sensors, e.g., location of a needle tip.
  • the display screen 222 and/or LEDs 224 may indicate medical alarms and/or sites where complications may have occurred.
  • the vest 202 may be employed to display medical condition diagnoses depending on other sensors internal or external to the vest 202.
  • a garment 302 can have sterility maintaining access ports 310 to permit perforations to be made through the sensing manifold by percutaneous interventional devices.
  • the garment 302 can have a modular design which permits a re-usable and sterilizable sensing matrix 312 to be embedded in or mated with a sterile, disposable portion 315 that makes direct contact with the patient 148.
  • Portion 315 includes a surface or interface 314 which couples or connects with matrix 312.
  • the garment 302 may take forms other than a vest, e.g., a pair of trunks, a bra, a skull cap, a sock, a glove, etc.
  • the clothing may include elastic, Spandex™, or another form of elastic clothing.
  • the garment 302 (or any other garment in accordance with the present principles) may include adjustment mechanisms 304 to be adaptively sized to snugly fit a patient.
  • the adjustment mechanisms 304 may include hook and loop connectors, buckles, elastic bands, zippers, snaps, adhesive strips, inflatable cuffs, suction devices or any other devices for connecting to a body surface and/or adjusting to a size.
  • a method for providing sensory feedback is illustratively shown in accordance with illustrative embodiments.
  • a garment is provided, which is configured to fit over at least a portion of a subject, e.g., an arm, leg, chest, etc.
  • One or more sensors are incorporated in the garment to perform measurements of the subject.
  • a feedback modality is incorporated into the garment and is responsive to the one or more feedback signals derived from measurements of the one or more sensors or other devices, such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.
  • Sensory information may include light, sound, tactile information, etc.
  • Embodiments may include one or more types of sensory information employed together.
  • the garment may include one or more shape sensing optical fibers disposed therein.
  • the optical fibers may be woven into the fabric of the garment and are preferably attached to the fabric such that flexing the fabric imparts a strain in the fiber.
  • the optical fibers may be tied to the fabric, glued or otherwise coupled to the fabric to permit flexure but also to constrain the motion of the portion of the subject.
  • one or more feedback signals are generated.
  • the feedback signals are preferably derived from measurements of the one or more sensors. Other sensors or devices may also be employed to trigger the generation of feedback signals. These sensors or devices may include monitoring devices or stored data not necessarily located on the garment.
  • the measurements of the one or more sensors are interpreted to determine one or more of anatomical movement, anatomical function, a position of an interventional device, etc.
  • the feedback modality is activated to emit energy from the garment to provide sensory information to assist a physician.
  • lights or other devices may be selectively illuminated in an array to indicate, e.g., a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc.
  • a display screen may be illuminated to render images to indicate a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc.
  • the images may include pre-operative images.
  • the sensory feedback may be employed by physicians to determine a patient status, determine triggering events, understand the boundaries of an organ, obtain anatomical responses, etc.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Length Measuring Devices By Optical Means (AREA)
PCT/IB2012/055729 2011-10-21 2012-10-19 Body surface feedback for medical interventions WO2013057703A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201280051305.4A CN103889259B (zh) 2011-10-21 2012-10-19 Body surface feedback for medical interventions
US14/352,693 US20140323856A1 (en) 2011-10-21 2012-10-19 Body surface feedback for medical interventions
EP12805526.6A EP2747590A1 (en) 2011-10-21 2012-10-19 Body surface feedback for medical interventions
JP2014536394A JP2014534848A (ja) 2011-10-21 2012-10-19 Body surface feedback for medical interventions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161549943P 2011-10-21 2011-10-21
US61/549,943 2011-10-21

Publications (1)

Publication Number Publication Date
WO2013057703A1 true WO2013057703A1 (en) 2013-04-25

Family

ID=47427397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/055729 WO2013057703A1 (en) 2011-10-21 2012-10-19 Body surface feedback for medical interventions

Country Status (4)

Country Link
EP (1) EP2747590A1 (en)
JP (1) JP2014534848A (ja)
CN (1) CN103889259B (zh)
WO (1) WO2013057703A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016533849A (ja) * 2013-09-12 2016-11-04 Intuitive Surgical Operations, Inc. Shape sensor systems for localizing movable targets
JP2018500984A (ja) * 2014-12-11 2018-01-18 Koninklijke Philips N.V. Cable loop detection mechanism for improved MRI safety
US10952810B2 (en) 2012-07-09 2021-03-23 Koninklijke Philips N.V. Method and system for adaptive image guided intervention
US11129679B2 (en) * 2017-11-14 2021-09-28 Mako Surgical Corp. Fiber optic tracking system

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI610656 (zh) * 2016-01-22 2018-01-11 Zhou Chang An Wearable physiological monitoring device
WO2019031041A1 (ja) * 2017-08-10 2019-02-14 Shinshu University Knitted fabric incorporating an optical fiber sensor, and method for manufacturing the same
CN110742618A (zh) * 2019-10-29 2020-02-04 Nantong University Intelligent swallowing detection system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090171233A1 (en) * 2006-06-02 2009-07-02 Koninklijke Philips Electronics N.V. Biofeedback system and display device
US20100234714A1 (en) * 2006-08-17 2010-09-16 Koninklijke Philips Electronics N.V. Dynamic body state display device
US20110054303A1 (en) * 2000-01-04 2011-03-03 George Mason Intellectual Properties, Inc. Apparatus for registering and tracking an instrument

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6381482B1 (en) * 1998-05-13 2002-04-30 Georgia Tech Research Corp. Fabric or garment with integrated flexible information infrastructure
US5876357A (en) * 1997-11-20 1999-03-02 Labor Control System (L.C.S.) Ltd. Uterine cervix dilation, effacement, and consistency monitoring system
JP4353668B2 (ja) * 1999-12-23 2009-10-28 Hill-Rom Services, Inc. Surgical theater system
US20040034297A1 (en) * 2002-08-13 2004-02-19 General Electric Company Medical device positioning system and method
US7226454B2 (en) * 2004-12-07 2007-06-05 Arizant Healthcare Inc. Warming device with varied permeability

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110054303A1 (en) * 2000-01-04 2011-03-03 George Mason Intellectual Properties, Inc. Apparatus for registering and tracking an instrument
US20090171233A1 (en) * 2006-06-02 2009-07-02 Koninklijke Philips Electronics N.V. Biofeedback system and display device
US20100234714A1 (en) * 2006-08-17 2010-09-16 Koninklijke Philips Electronics N.V. Dynamic body state display device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10952810B2 (en) 2012-07-09 2021-03-23 Koninklijke Philips N.V. Method and system for adaptive image guided intervention
JP2016533849A (ja) * 2013-09-12 2016-11-04 Intuitive Surgical Operations, Inc. Shape sensor systems for localizing movable targets
US10376321B2 (en) 2013-09-12 2019-08-13 Intuitive Surgical Operations, Inc. Shape sensor systems for localizing movable targets
JP2018500984A (ja) * 2014-12-11 2018-01-18 Koninklijke Philips N.V. Cable loop detection mechanism for improved MRI safety
US11129679B2 (en) * 2017-11-14 2021-09-28 Mako Surgical Corp. Fiber optic tracking system

Also Published As

Publication number Publication date
CN103889259B (zh) 2017-01-18
JP2014534848A (ja) 2014-12-25
EP2747590A1 (en) 2014-07-02
CN103889259A (zh) 2014-06-25

Similar Documents

Publication Publication Date Title
JP6072017B2 (ja) Dynamic constraining with optical shape sensing
US20140323856A1 Body surface feedback for medical interventions
WO2013057703A1 (en) Body surface feedback for medical interventions
US11219487B2 (en) Shape sensing for orthopedic navigation
US10610085B2 (en) Optical sensing-enabled interventional instruments for rapid distributed measurements of biophysical parameters
US11266466B2 (en) Shape sensor systems with redundant sensing
US20170296292A1 (en) Systems and Methods for Surgical Imaging
EP2677937B1 (en) Non-rigid-body morphing of vessel image using intravascular device shape
CN105934215B (zh) 具有光学形状感测的成像设备的机器人控制
US20130216025A1 (en) Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments
WO2014001977A2 (en) Fiber optic sensor guided navigation for vascular visualization and monitoring
JP6706576B2 (ja) Shape-sensed robotic ultrasound for minimally invasive interventions
CA3028792A1 (en) Sensored surgical tool and surgical intraoperative tracking and imaging system incorporating same
JP2017500935A5 (ja)
US11406278B2 (en) Non-rigid-body morphing of vessel image using intravascular device shape
CN109715054A (zh) 与体外图像中的仪器相关的图像对象的可视化
WO2018160955A1 (en) Systems and methods for surgical tracking and visualization of hidden anatomical features
EP3944254A1 (en) System for displaying an augmented reality and method for generating an augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12805526

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014536394

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 14352693

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE