WO2019048269A1 - System for venipuncture and arterial line guidance with augmented reality - Google Patents

System for venipuncture and arterial line guidance with augmented reality

Info

Publication number
WO2019048269A1
Authority
WO
WIPO (PCT)
Prior art keywords
needle
target
blood vessel
corrective action
current position
Application number
PCT/EP2018/072951
Other languages
French (fr)
Inventor
Christine Menking SWISHER
Portia E. SINGH
Mladen Milosevic
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V.
Publication of WO2019048269A1


Classifications

    • A: HUMAN NECESSITIES
      • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
            • A61B 5/0059: using light, e.g. diagnosis by transillumination, diascopy, fluorescence
              • A61B 5/0075: by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
              • A61B 5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
            • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for; Heart catheters for measuring blood pressure
              • A61B 5/02007: Evaluating blood vessel condition, e.g. elasticity, compliance
            • A61B 5/48: Other medical applications
              • A61B 5/4887: Locating particular structures in or on the body
                • A61B 5/489: Blood vessels
          • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
            • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
              • A61B 8/5269: involving detection or reduction of artifacts
          • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
            • A61B 17/34: Trocars; Puncturing needles
              • A61B 17/3403: Needle locating or guiding means
            • A61B 2017/00017: Electrical control of surgical instruments
              • A61B 2017/00115: with audible or visual output
                • A61B 2017/00119: alarm; indicating an abnormal situation
          • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
            • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
              • A61B 2034/107: Visualisation of planned trajectories or target regions
            • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
              • A61B 2034/2046: Tracking techniques
                • A61B 2034/2055: Optical tracking systems
          • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B 90/08: Accessories or related features not otherwise provided for
              • A61B 2090/0807: Indication means
            • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
              • A61B 90/37: Surgical systems with images on a monitor during operation
              • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
        • A61M: DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
          • A61M 5/00: Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
            • A61M 5/42: having means for desensitising skin, for protruding skin to facilitate piercing, or for locating point where body is to be pierced
              • A61M 5/427: Locating point where body is to be pierced, e.g. vein location means using ultrasonic waves, injection site templates

Definitions

  • the at least one electronic processor 20 is programmed to identify corrective action to align the current position of the needle from 108 with the target needle position from 106. This operation is performed when the determined current position of the needle 12 (determined at 108) is offset by more than some allowable tolerance from the determined target position of the needle (determined at 106).
  • the corrective action can include moving the position, the angle, and/or the speed of the needle 12 to align with the determined target position; this can be displayed with the 3D map 32 on the display device 24, or presented aurally via a loudspeaker using synthesized or prerecorded verbiage (e.g. speaking "Use a shallower angle" or "Increase insertion speed", or so forth). A minimal sketch of this comparison logic appears after this list.
  • the device 10 includes a feedback mechanism 34 configured to present a corrective action to a user during insertion of the needle 12 into a detected blood vessel.
  • the feedback mechanism 34 includes an augmented-reality heads-up display (AR-HUD) device 36 with one or more AR-HUD displays 40.
  • the illustrative design employs left-eye and right-eye displays 40, but alternatively the display can be a single large window that spans both eyes.
  • the stereo camera 14 is mounted to the AR-HUD device 36 to provide a "first person view" so as to align the AR content with the actual view seen through the transparent display(s) 40 of the AR-HUD.
  • the AR-HUD device 36 can be configured as a helmet, a headband, glasses, goggles, or other suitable embodiment in order to be worn on the head of the user.
  • the stereo camera 14 is mounted to the AR-HUD device 36 (e.g., over the user's forehead, or as two stereo cameras disposed on the lenses of the glasses).
  • the AR-HUD 36 transmits the stereo images comprising a video stream from the stereo camera 14 to the processor 20 of FIGURE 1.
  • the AR content is transmitted back from the processor 20 to the AR-HUD 36, which displays the generated at least one of text and graphics (or, more generally, the AR content) on the AR-HUD display 40, for example text such as "needle too shallow" or "target depth reached", or so forth.
  • graphics may be superimposed as AR content, e.g. showing the correct angle of the needle as a translucent line that terminates at the point on the target blood vessel where the needle should be inserted.
  • the clinician merely needs to align the physical needle with this translucent line.
  • the graphics can include a "level" graphic to show the stability of the needle 12.
  • the processor 20 is programmed to use Simultaneous Localization and Mapping (SLAM) processing to align the AR content (e.g. the superimposed translucent target needle angle/position, textual instruction annotations, and/or so forth) with the recorded video.
  • SLAM processing may be formulated mathematically by the probabilistic formulation P(c_1(t), c_2(t), ..., c_n(t), p(t) | f_1, ..., f_t), where c_1(t), c_2(t), ..., c_n(t) are the locations of reference points of the needle 12 (or the connected barrel of the syringe 13) at a time t (where without loss of generality n reference points are assumed), p(t) is the location of the user at time t, and f_1, ..., f_t are the frames of the obtained video up to the time t.
  • SLAM algorithms known for robotic vision mapping, self-driving vehicle navigation technology, or so forth are suitably applied to implement the AR content-to-video mapping.
  • the feedback mechanism 34 includes a speaker 42 configured to provide audio instructions to a user presenting the corrective action.
  • the text generated for the AR-HUD device 36 can also be broadcast as audio by the speaker 42.
  • the feedback mechanism 34 includes one or more light emitting diodes (LEDs) 44, configured to illuminate to indicate the corrective action during insertion of the needle into the detected blood vessel.
  • the illumination of the LEDs 44 can show the target position, target speed, and/or target angle of the needle 12.
  • the corrective action includes an adjustment of an angle of the current position of the needle 12 to align with the target needle angle.
  • the corrective action includes an adjustment of a rate of change of the current position of the needle 12 to align with the target needle speed.
  • the feedback mechanism 34 includes a haptic device 46 built into or secured with the needle 12 (e.g. via the barrel of the syringe 13 in FIGURE 5) and configured to present the corrective action comprising haptic feedback to the user.
  • the haptic device 46 may vibrate when the needle 12 deviates from the target needle position during needle insertion.
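
The corrective-action logic described in the bullets above reduces to comparing the tracked needle pose against the stored targets and emitting an instruction for each out-of-tolerance component. The following is a minimal Python sketch of that comparison; the tolerance values and exact message strings are illustrative assumptions, not values specified in the disclosure.

```python
# Sketch of the corrective-action comparison described above: the current
# needle pose is compared against the stored targets and a human-readable
# instruction is produced for each component outside its tolerance.
# All tolerances and message wordings are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class NeedlePose:
    depth_mm: float      # insertion depth below the skin surface
    angle_deg: float     # angle relative to the skin plane
    speed_mm_s: float    # advancement speed of the needle tip

def corrective_actions(current: NeedlePose, target: NeedlePose,
                       tol_depth_mm: float = 1.0, tol_angle_deg: float = 5.0,
                       tol_speed_mm_s: float = 0.5) -> list:
    """Return feedback strings for every out-of-tolerance pose component."""
    actions = []
    if current.depth_mm < target.depth_mm - tol_depth_mm:
        actions.append("Needle too shallow")
    elif current.depth_mm > target.depth_mm + tol_depth_mm:
        actions.append("Needle too deep")
    if abs(current.angle_deg - target.angle_deg) > tol_angle_deg:
        actions.append("Use a shallower angle"
                       if current.angle_deg > target.angle_deg
                       else "Use a steeper angle")
    if current.speed_mm_s < target.speed_mm_s - tol_speed_mm_s:
        actions.append("Increase insertion speed")
    elif current.speed_mm_s > target.speed_mm_s + tol_speed_mm_s:
        actions.append("Decrease insertion speed")
    return actions  # an empty list means the needle is on track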

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Vascular Medicine (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Dermatology (AREA)
  • Anesthesiology (AREA)
  • Hematology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Infusion, Injection, And Reservoir Apparatuses (AREA)

Abstract

A needle placement assistance device (10) for assisting in venipuncture or arterial line placement includes a stereo camera (14) configured to acquire stereo images of a target portion of a patient. A needle tracker (16) is configured to track a current position of an associated needle (12). The device also includes at least one electronic processor (20); and a non-transitory storage medium (26) storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method (100). The method includes: performing machine vision processing of the stereo images to generate a three-dimensional (3D) map (32) of the target portion; detecting a blood vessel in the 3D map of the target portion; determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and identifying corrective action to align a current position of the needle with the target needle position.

Description

SYSTEM FOR VENIPUNCTURE AND ARTERIAL LINE GUIDANCE WITH AUGMENTED REALITY
FIELD
The following relates generally to the venipuncture arts, arterial line placement arts, nursing and patient care arts, augmented reality arts, and related arts.
BACKGROUND
Venipuncture and arterial line placement provide access to a patient's venous and arterial blood systems, respectively. Venipuncture is used for tasks such as drawing blood for testing or blood donation, administering intravenous (IV) fluids, and the like. Venipuncture is a very common medical procedure: by some estimates around one billion venipuncture procedures are performed each year. Arterial lines are used for drawing arterial blood gas (ABG) samples, direct arterial blood pressure monitoring, and the like. Venipuncture and arterial line placement are commonly performed by nurses, doctors, and other medical professionals. Accurate initial placement of the hypodermic needle or IV needle in venipuncture greatly improves patient experience by minimizing skin penetrations that can lead to pain and potential pathways for infection, and avoids delays and improves clinical workflow. However, by some estimates accurate placement on the first attempt is achieved less than half the time. Arterial line placement is a more difficult procedure due to the deeper location of arteries compared with veins, leading to increased pain and potential for injury in the case of repeated arterial line placement attempts.
Prior work has shown significant needle tip movement during venipuncture blood sample collection. Needle motion due to switching of hands while removing a first blood tube from the system and replacing with the second blood tube is common in vacuum venipuncture. Some complications that can arise from excessive movement during venipuncture include nerve damage, hematomas, and neuropathic pain (see, e.g., Fujii C. Clarification of the characteristics of needle-tip movement during vacuum venipuncture to improve safety. Vascular Health and Risk Management. 2013;9:381-390. doi: 10.2147/VHRM.S47490).
Improving the accuracy of venipuncture would have a large societal impact, fulfilling a widespread unmet need. In the US alone, approximately one billion venipunctures are performed annually for blood draws and IV therapy. In hospital settings, virtually all patients receive IVs, which can require multiple attempts to place successfully. This results in wasted time for clinical staff, delay of diagnostic evaluation or treatment, and, most notably, significantly reduced patient satisfaction scores. Patient satisfaction is particularly important for hospital reimbursement under the Affordable Care Act, under which Medicare payments are withheld from hospitals with unacceptable patient satisfaction scores. The problem is even more acute when placing arterial lines: arteries are harder to identify visually because they lie deeper, and placement errors can cause dangerous complications such as hematoma and nerve or vascular damage to the extremity.
Currently, a device called AccuVein® (available from AccuVein, Inc., Huntington, New York) has had success using infrared (IR) light to detect the vasculature via hemoglobin absorption. However, the device provides no guidance as to aspects such as target depth, target angle, and target speed, which are important for accurate needle placement. The device also does not provide real-time feedback to the clinician during the needle insertion phase of the venipuncture procedure.
The following discloses new and improved systems and methods that address the above-referenced issues, and others.
SUMMARY
In one disclosed aspect, a needle placement assistance device for assisting in venipuncture or arterial line placement includes a stereo camera configured to acquire stereo images of a target portion of a patient. A needle tracker is configured to track a current position of an associated needle. The device also includes at least one electronic processor, and a non-transitory storage medium storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method. The method includes: performing machine vision processing of the stereo images to generate a three-dimensional (3D) map of the target portion; detecting a blood vessel in the 3D map of the target portion; determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and identifying corrective action to align a current position of the needle with the target needle position.
In another disclosed aspect, a needle placement assistance device for assisting in venipuncture or arterial line placement includes a stereo camera configured to acquire stereo images of a target portion of a patient. A needle tracker is configured to track a current position of an associated needle. A feedback mechanism is configured to present the corrective action to a user during insertion of the needle into the detected blood vessel. The feedback mechanism includes at least an augmented-reality heads-up display (AR-HUD) device including an AR-HUD display. The stereo camera is mounted to the AR-HUD device. The device includes at least one electronic processor, and a non-transitory storage medium storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method. The method includes: performing machine vision processing of the stereo images to generate a three-dimensional (3D) map of the target portion; detecting a blood vessel in the 3D map of the target portion; determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and identifying corrective action to align a current position of the needle with the target needle position. The needle tracker comprises the at least one electronic processor and the non-transitory storage medium. The needle placement assistance method further includes performing needle tracking machine vision processing of the stereo images to determine the current position of the needle relative to the detected blood vessel.
In another disclosed aspect, a needle placement assistance device for assisting in venipuncture or arterial line placement includes a stereo camera configured to acquire stereo images of a target portion of a patient. A needle tracker is configured to track a current position of an associated needle. A feedback mechanism is configured to present the corrective action to a user during insertion of the needle into the detected blood vessel. The device also includes at least one electronic processor, and a non-transitory storage medium storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method. The method includes: performing machine vision processing of the stereo images to generate a three-dimensional (3D) map of the target portion; detecting a blood vessel in the 3D map of the target portion; determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and identifying corrective action to align a current position of the needle with the target needle position. The feedback mechanism includes at least a speaker configured to provide audio instructions to a user presenting the corrective action.
One advantage resides in providing a device or method which improves the likelihood of successful first placement for venipuncture or arterial line placement procedures. Another advantage resides in providing a device that provides real-time feedback for a medical professional during needle insertion into a blood vessel.
Another advantage resides in providing textual or graphic feedback to a medical professional to correct a position of a needle during needle insertion into a blood vessel.
Another advantage resides in providing an augmented reality device to correct a position of a needle during needle insertion into a blood vessel.
Another advantage resides in providing monitoring and real-time feedback as to compliance with a target depth during a venipuncture or arterial line placement procedure.
Another advantage resides in providing monitoring and real-time feedback as to compliance with a target angle during the needle insertion phase of a venipuncture or arterial line placement procedure.
Another advantage resides in providing monitoring and real-time feedback as to compliance with a target speed during the needle insertion phase of a venipuncture or arterial line placement procedure.
A given embodiment may provide none, one, two, more, or all of the foregoing advantages, and/or may provide other advantages as will become apparent to one of ordinary skill in the art upon reading and understanding the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure may take form in various components and arrangements of components, and in various steps and arrangements of steps. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the disclosure.
FIGURE 1 diagrammatically illustrates a needle placement assistance device in accordance with one aspect.
FIGURE 2 diagrammatically illustrates a flowchart of a method of use of the device of FIGURE 1.
FIGURES 3 and 4 diagrammatically illustrate image processing operations of the vasculature imaging process of FIGURE 2.
FIGURE 5 diagrammatically illustrates a feedback mechanism of the device of FIGURE 1.
DETAILED DESCRIPTION
The following discloses a guidance approach for assisting in venipuncture or arterial line placement. While some existing devices such as AccuVein® assist in identifying a vein or artery to target, the following describes systems that go beyond this to provide guidance as to the location, depth, angle, and/or speed of insertion, including real-time feedback provided to the clinician during the needle insertion phase of the procedure.
To this end, a stereo camera is provided to generate stereo images of the arm or other anatomical target. A blood vessel detector is applied to detect a suitable vein or artery. The stereo images are also analyzed using machine vision techniques to construct a three-dimensional (3D) map of the arm or other target. A knowledge base is referenced to determine the depth, angle, and optionally speed of insertion. These may depend on factors such as the type of operation being performed (e.g. venipuncture versus arterial line) and possibly information about the blood vessel being targeted in accord with the output of the blood vessel detector.
Thereafter, the stereo camera produces stereo video of the insertion process, from which relevant parameters such as the needle tip location, angle, and rate of positional change (i.e. speed) are measured. Optionally, the syringe may include one or more tracking markers to assist in tracking the needle.
Many seemingly static videos contain subtle changes that are invisible to the human eye. However, it is possible to measure these small changes via the use of algorithms such as Eulerian video magnification (see, e.g., Wu et al., "Eulerian Video Magnification for Revealing Subtle Changes in the World", ACM Transactions on Graphics vol. 31 no. 4 (Proc. SIGGRAPH, 2012)). Previously, it has been shown that the human pulse can be measured from pulse-induced color variations or small motions of pulsatile flow in conventional videos, a technique called remote photoplethysmography (rPPG) (see, e.g., Wang et al., "Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG", IEEE Transactions on Biomedical Engineering vol. 62 no. 2 (2014)).
The system further includes a feedback output. In some embodiments, this is an augmented reality (AR) head-up display (HUD) with one or more see-through AR display(s). The provided feedback guidance may take various forms. In a straightforward approach, guidance may be displayed as AR content such as text or standard icons indicating, for example, whether the needle tip is positioned at the target blood vessel, or advice such as "needle too shallow" or "target depth reached". In another embodiment, graphics may be superimposed as AR content, e.g. showing the correct angle of the needle as a translucent line that terminates at the point on the target blood vessel where the needle should be inserted. Here, the clinician merely needs to align the physical needle with this translucent line.
For the AR-HUD design, the stereo camera should be mounted on the AR-HUD to provide a "first person view", so as to align the AR content with the actual view seen through the transparent display(s) of the AR-HUD. While an AR-HUD is preferred, other feedback mechanisms are contemplated. For example, purely audio feedback could be provided. In another approach, the syringe itself could include feedback indicators: e.g., a green LED on the syringe could light up to indicate the tip is at the correct insertion point, a level display could indicate whether the needle is too shallow or too steep, or haptic feedback of various types could be provided. In these non-AR embodiments, the stereo camera could be mounted in a stationary location (e.g. at a blood lab station), and no HUD would be needed.
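
Aligning the superimposed guidance with the wearer's view amounts to projecting geometry known in the 3D map into display coordinates. The sketch below illustrates the idea with a simple pinhole projection; the camera intrinsics, the in-plane insertion direction, and the assumption that the see-through display shares the stereo camera's calibrated viewpoint are simplifications introduced for illustration, not details from the disclosure.

```python
# Minimal sketch: project a 3D target insertion line (known in the camera /
# 3D-map frame) onto the 2D display so it can be drawn as a translucent
# overlay. Assumes a pinhole model and a display calibrated to the stereo
# camera's viewpoint (illustrative simplifications).
import numpy as np

def project_point(p_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3D point (metres, camera frame) to pixels."""
    x, y, z = p_cam
    return (fx * x / z + cx, fy * y / z + cy)

def overlay_insertion_line(entry_point, angle_deg, length_m, intrinsics):
    """Sample the target insertion line in 3D and project it for drawing."""
    direction = np.array([np.cos(np.radians(angle_deg)), 0.0,
                          np.sin(np.radians(angle_deg))])  # in-plane tilt only
    samples = [entry_point + t * length_m * direction
               for t in np.linspace(0.0, 1.0, 20)]
    return [project_point(p, *intrinsics) for p in samples]
```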
With reference to FIGURE 1, an illustrative needle placement assistance device or system 10 is shown. The device 10 is configured to assist in placement of an associated needle 12 (e.g., the needle of an illustrative syringe 13, or of an arterial line, or of a vascular catheter, and the like) in a patient (and more particularly, into an artery or vein of the patient). To do so, the needle placement assistance device 10 includes a stereo camera 14 configured to acquire stereo images or videos of a target portion of a patient, which can be used in various subsequent operations (e.g., detecting a blood vessel in the patient, determining a needle position, and the like), and a needle tracker 16 configured to track a current position of the associated needle 12. The stereo camera 14 typically includes multiple lenses or lens assemblies, with a separate sensor for each lens that forms an image on a digital detector array (e.g. a CCD imaging array, a CMOS imaging array, et cetera) to capture 3D images. The stereo camera 14 preferably has color video capability, e.g. by having an imaging array with pixels sensitive to red, green, and blue light (or another set of colors substantially spanning the visible spectrum, e.g. 400-700 nm). The stereo camera 14 optionally may include other typical features, such as a built-in flash (not shown) and/or an ambient light sensor (not shown) for setting exposure times.
The system 10 also includes a computer, workstation, mobile device (e.g., a cellular telephone, a tablet computer, a personal digital assistant (PDA), or so forth) or other electronic data processing device 18 with typical components, such as at least one electronic processor 20 operably connected to the needle tracker 16, at least one user input device (e.g., a mouse, a keyboard, a trackball, and/or the like) 22, and a display device 24. In some embodiments, the display device 24 can be a separate component from the computer 18, and can be an LCD display, an OLED display, a touch-sensitive display, or the like. The workstation 18 can also include one or more databases 26 (stored in a non-transitory storage medium such as RAM or ROM, a magnetic disk, an electronic medical record (EMR) database, a picture archiving and communication system (PACS) database, and the like). For example, video of venipuncture or arterial line placement procedures could be stored for later use in clinical training or for assessment of clinician skills. The computer 18 is connected to the stereo camera 14 and the needle tracker 16 via a wired or wireless communication link or network (e.g., Wi-Fi, 4G, or another wireless communication link).
The at least one electronic processor 20 is operatively connected with a non-transitory storage medium 26 that stores instructions which are readable and executable by the at least one electronic processor 20 to perform disclosed operations including performing a needle placement assistance method or process 100. The non-transitory storage medium 26 may, for example, comprise a hard disk drive, RAID, or other magnetic storage medium; a solid state drive, flash drive, electronically erasable read-only memory (EEROM) or other electronic memory; an optical disk or other optical storage; various combinations thereof; or so forth. In some examples, the needle placement assistance method or process 100 may be performed by cloud processing. The non-transitory storage medium 26 stores data related to one or more of target needle depth, target needle angle, and target needle speed. For example, the non-transitory storage medium 26 suitably stores data related to proper IV placement and arterial line placement, including appropriate needle 12 angles and depths for various vessel sizes, depths from the surface, branching, and trajectory of the needle for various venipuncture and/or arterial line procedures, optionally further segmented into different target depth/angle/speed settings for different blood vessels that may be targeted for such procedures. In general, venipuncture target depths are shallower than arterial line target depths because veins typically lie shallower than arteries; moreover, different veins may be generally located at different depths, and likewise for different arteries. Target needle speed is also usually higher for arteries than for veins, and target angle may be steeper for arteries in order to more effectively achieve the desired deeper penetration.
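
To make the structure of such a knowledge base concrete, the following is a minimal sketch of how target depth/angle/speed data might be stored and looked up per procedure and vessel. The vessel names and all numeric values are hypothetical placeholders; the disclosure does not specify concrete numbers.

```python
# Illustrative sketch of the target-parameter knowledge base described above.
# All vessel names and numeric values are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class NeedleTargets:
    depth_mm: float      # target insertion depth below the skin surface
    angle_deg: float     # target insertion angle relative to the skin
    speed_mm_s: float    # target advancement speed of the needle tip

# Keyed by (procedure, vessel); arterial entries are deeper, steeper, and
# faster, mirroring the general trends stated in the text.
TARGET_TABLE = {
    ("venipuncture", "median_cubital_vein"): NeedleTargets(4.0, 20.0, 2.0),
    ("venipuncture", "cephalic_vein"):       NeedleTargets(5.0, 25.0, 2.0),
    ("arterial_line", "radial_artery"):      NeedleTargets(8.0, 40.0, 4.0),
}

def lookup_targets(procedure: str, vessel: str) -> NeedleTargets:
    """Return stored target depth/angle/speed for a procedure-vessel pair."""
    return TARGET_TABLE[(procedure, vessel)]
```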
In some embodiments, the needle tracker 16 can be any suitable sensor disposed to move with the needle 12, including one or more of: an accelerometer; a gyroscope; and an electromagnetic (EM) tracking system with an EM sensor. In other embodiments, the needle tracker 16 includes the at least one electronic processor 20 and the non-transitory storage medium 26. In further embodiments, the needle tracker 16 can also include one or more visual markers 30 that are disposed to move with the needle 12. The visual markers 30 can include one or more of Bokode visual tags, frequency-modulated LEDs, and other suitable visual markers. In the case of visual markers, the needle tracker 16 may be viewed as further including the stereo camera 14, which records video of the markers.
With reference to FIGURE 2, an illustrative embodiment of the needle placement assistance method 100 is diagrammatically shown as a flowchart. At 102, the workstation 18 receives a time sequence of stereo images (i.e. stereo video 103 of an illustrative wrist targeted for venipuncture, see FIGURE 1) from the stereo camera 14 and performs machine vision processing of the stereo images 103 to generate a three-dimensional (3D) map 32 of the target portion. To do so, the target portion of the patient (e.g., an arm) is detected and tracked. In a detection operation, the foreground and background of an initial stereo image frame are removed using a fully convolutional neural network (FCN) that is trained for recognition and segmentation of the body part that is the subject of the venipuncture or arterial line placement procedure as a joint task. The body part is then detected using a Haar Cascade classifier. Features of the body part such as joints, boundaries, or so forth are detected using a mixture of experts per feature. Features may be tied together, e.g. the mixture of experts for one body part feature depends on a neighboring feature. In subsequent frames of the stereo video, confidence maps are also produced using sequential detection from a mixture of experts, but are constrained using optical flow from other frames; to account for motion between frames, the optical flow is weighted by the temporal distance between frames. To create the 3D map 32 (i.e., of the arm and hand), a segmentation superpixel is dilated to prevent over-segmentation of regions of interest. Key points are extracted using a SIFT (scale-invariant feature transform) algorithm, and these key points are used to create a geometrical mesh using non-uniform rational b-splines (NURBS). Once generated, the 3D map 32 is displayed on the display device 24. It should be noted that this is one illustrative 3D map segmentation process, and other machine vision processing approaches may be employed for this task.
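
As one illustration of the key-point step of this pipeline, the sketch below extracts SIFT key points with OpenCV and recovers sparse depth from stereo disparity. It stands in for the full FCN/Haar/NURBS chain described above, and the rectified-stereo and known-calibration assumptions are simplifications for illustration.

```python
# Sketch of two steps of the 3D-mapping stage: SIFT key point extraction and
# stereo triangulation via disparity (depth = focal * baseline / disparity).
# Assumes rectified stereo frames and known calibration; the FCN segmentation,
# Haar cascade, and NURBS meshing steps are not reproduced here.
import cv2
import numpy as np

def sparse_3d_points(left_gray, right_gray, focal_px, baseline_m):
    """Return a sparse (x, y, depth) point cloud from a rectified stereo pair."""
    sift = cv2.SIFT_create()
    kp_l, des_l = sift.detectAndCompute(left_gray, None)
    kp_r, des_r = sift.detectAndCompute(right_gray, None)

    # Match descriptors between the two views with cross-checking.
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(des_l, des_r)

    points = []
    for m in matches:
        xl, yl = kp_l[m.queryIdx].pt
        xr, _ = kp_r[m.trainIdx].pt
        disparity = xl - xr
        if disparity > 1.0:  # discard near-zero disparities (far or mismatched)
            z = focal_px * baseline_m / disparity
            points.append((xl, yl, z))
    return np.array(points)  # key points to be meshed, e.g. with NURBS
```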
At 104, the at least one electronic processor 20 is programmed to detect a blood vessel in the 3D map 32 of the target portion. With continuing reference to FIGURES 1 and 2, and with further reference to FIGURES 3 and 4, to detect a blood vessel in the 3D map 32, a two-step approach includes the first step 60, in which subtle signal variations in the 3D map 32 are amplified. FIGURE 3 illustrates one approach. In the example of FIGURE 3, the operation 60 employs amplification of motion variation using the Eulerian Video Magnification algorithm. See Wu et al., "Eulerian Video Magnification for Revealing Subtle Changes in the World", ACM Transactions on Graphics vol. 31 no. 4 (Proc. SIGGRAPH, 2012). The amplification of variations by this approach enhances variations not easily detected from the raw video. For example, as shown in FIGURE 3, after signal amplification a pixel 70 which is over vasculature has a sinusoidal signal 72 in the expected frequency range consistent with the arterial waveform (e.g. corresponding to the cardiac cycle or pulse rate). Another pixel 74, which is not located over vasculature, has a signal 76 that does not have a frequency consistent with physiology.
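The following Python sketch illustrates the underlying idea of amplifying subtle temporal variations in a physiologically plausible frequency band; it is a simplified per-pixel band-pass amplification under assumed parameters, not the full Laplacian-pyramid method of Wu et al.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def amplify_temporal_band(frames: np.ndarray, fps: float,
                          lo_hz: float = 0.7, hi_hz: float = 3.3,
                          alpha: float = 50.0) -> np.ndarray:
    """Amplify subtle per-pixel intensity variations in a pulse band.

    frames : array of shape (T, H, W), grayscale video as floats.
    fps    : frame rate of the stereo video.
    lo_hz, hi_hz : pass band, here roughly 42-200 beats/min (assumed).
    alpha  : amplification factor applied to the band-passed signal.
    Returns the video with band-passed variations amplified, so that
    pixels over pulsating vasculature stand out.
    """
    b, a = butter(2, [lo_hz, hi_hz], btype="band", fs=fps)
    bandpassed = filtfilt(b, a, frames, axis=0)  # temporal filter per pixel
    return frames + alpha * bandpassed
```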
With reference to FIGURE 4, in the second step 62, pixels consistent with vascular physiology are identified. In the illustrative example of FIGURE 4, the signal is decomposed into its frequency components via Fourier transform to produce frequency spectra 80. One method to extract information is to identify the peaks within the physiologically feasible passband 82 (e.g. corresponding to the credible range of pulse rates for the patient, e.g. having a lower limit of 40 beats/min or some other lowest value that is realistic for a patient and an upper limit of 200 beats/min or some other highest value that is realistic for a patient) by windowing followed by a slope inversion and a local peak search. Other approaches can be employed to identify pixels representing vasculature based on the temporal variation of the values of the pixels produced by the computation 60.
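As a minimal sketch of this second step, the temporal signal of each pixel may be Fourier transformed and the pixel classified as vascular when its dominant frequency falls inside the feasible pulse-rate passband; the passband limits below mirror the illustrative 40-200 beats/min range, and the dominant-peak test is a simplified stand-in for the windowing and local peak search described above.

```python
import numpy as np

def is_vascular_pixel(signal: np.ndarray, fps: float,
                      lo_bpm: float = 40.0, hi_bpm: float = 200.0) -> bool:
    """Classify one pixel as vasculature if the dominant frequency of its
    temporal intensity signal falls inside the feasible pulse-rate band.

    signal : intensity time series of the pixel from the amplified video.
    fps    : frame rate of the stereo video.
    """
    signal = signal - signal.mean()               # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs_bpm = np.fft.rfftfreq(signal.size, d=1.0 / fps) * 60.0
    peak_idx = int(np.argmax(spectrum[1:])) + 1   # skip zero-frequency bin
    return lo_bpm <= freqs_bpm[peak_idx] <= hi_bpm
```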
The approach of FIGURES 3 and 4 is one illustrative example of blood vessel detection, and other approaches can be used, for example relying upon detection of edges of vessels in the image. In another variant, the stereo camera may include an infrared imager that provides thermal imaging of blood vessels. The infrared image is spatially aligned with the 3D map from the stereo camera to provide the blood vessel detection.
At 106, the at least one electronic processor 20 is programmed to determine a target needle position relative to the detected blood vessel based on the data related to one or more of target depth, target angle, and target speed (i.e., the data stored in the non-transitory storage medium 26). The target needle position can include one or more of the location of the needle (based on the target depth and target angle), a translational location in the 3D map 32, and the target speed. From the target depth, target angle, and target speed data stored in the non-transitory storage medium 26 and the blood vessels and other vasculature detected in the 3D map 32, the at least one electronic processor 20 is programmed to calculate an optimal (i.e., target) location (e.g., x, y, and z coordinates), angle, and insertion depth of the needle 12 into the detected blood vessel on the 3D map 32. To determine the optimal location, the at least one electronic processor 20 is programmed to find branch points and determine vessel size in the 3D map 32, and to use the data stored in the non-transitory storage medium 26 to calculate the target needle position.

At 108, the at least one electronic processor 20 is programmed to perform needle tracking machine vision processing of the stereo images to determine the current position of the needle 12 relative to the detected blood vessel. In some examples, the at least one electronic processor 20 is programmed to determine the current position of the needle 12 relative to the detected blood vessel by detecting the visual markers 30 connected to the needle in the 3D map 32. At least two such markers 30 are generally employed to determine the current angle of the needle. It may be noted that the markers usually cannot be disposed on the needle 12 itself, but rather are disposed on a component that is in a rigid spatial relationship with the needle 12, such as the barrel of the illustrative syringe 13. To determine the current actual needle speed, the needle positions in successive frames of the video are suitably determined to assess a distance traveled, and the time interval between the successive frames provides the time dimension of the speed (distance/time).
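The speed computation described above reduces to a displacement divided by the inter-frame interval. A minimal Python sketch, assuming the marker-derived needle tip positions are already expressed in the coordinate frame of the 3D map 32, is:

```python
import numpy as np

def needle_speed(tip_prev: np.ndarray, tip_curr: np.ndarray, fps: float) -> float:
    """Estimate the current needle speed from tip positions in two
    successive video frames.

    tip_prev, tip_curr : (x, y, z) needle tip positions, in millimeters,
                         in the coordinate frame of the 3D map.
    fps                : frame rate; the inter-frame interval is 1/fps.
    Returns speed in mm/s (distance / time).
    """
    distance_mm = float(np.linalg.norm(tip_curr - tip_prev))
    return distance_mm * fps
```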
At 110, the at least one electronic processor 20 is programmed to identify a corrective action to align the current position of the needle from 108 with the target needle position from 106. This operation is performed when the current position of the needle 12 (determined at 108) is offset by more than some allowable tolerance from the target position of the needle (determined at 106). The corrective action can include adjusting the position, the angle, and/or the speed of the needle 12 to align with the determined target position, and can be displayed with the 3D map 32 on the display device 24 or presented aurally via a loudspeaker using synthesized or prerecorded verbiage (e.g. speaking "Use a more shallow angle" or "Increase insertion speed" or so forth).
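One possible realization of the tolerance check and corrective action selection is sketched below in Python; the tolerance values and instruction strings are illustrative assumptions rather than values prescribed by this disclosure.

```python
def corrective_actions(current: dict, target: dict,
                       tol_angle_deg: float = 5.0,
                       tol_depth_mm: float = 1.0,
                       tol_speed_mm_s: float = 1.0) -> list[str]:
    """Compare the tracked needle state with the target state and return
    human-readable corrective instructions for each out-of-tolerance
    parameter. Both dicts carry 'angle_deg', 'depth_mm', 'speed_mm_s'."""
    actions = []
    if current["angle_deg"] - target["angle_deg"] > tol_angle_deg:
        actions.append("Use a more shallow angle")
    elif target["angle_deg"] - current["angle_deg"] > tol_angle_deg:
        actions.append("Use a steeper angle")
    if target["depth_mm"] - current["depth_mm"] > tol_depth_mm:
        actions.append("Advance the needle further")
    if target["speed_mm_s"] - current["speed_mm_s"] > tol_speed_mm_s:
        actions.append("Increase insertion speed")
    return actions  # empty list means the needle is within tolerance
```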
With continuing reference to FIGURES 1 and 2, and referring now to FIGURE 5, in some embodiments, the device 10 includes a feedback mechanism 34 configured to present a corrective action to a user during insertion of the needle 12 into a detected blood vessel. In some embodiments, the feedback mechanism 34 includes an augmented-reality heads-up display (AR-HUD) device 36 with one or more AR-HUD displays 40. The illustrative design employs left-eye and right-eye displays 40, but alternatively the display can be a single large window that spans both eyes. In some examples, the stereo camera 14 is mounted to the AR-HUD device 36 to provide a "first person view" so as to align the AR content with the actual view seen through the transparent display(s) 40 of the AR-HUD. In some examples, the AR-HUD device 36 can be configured as a helmet, a headband, glasses, goggles, or other suitable embodiment in order to be worn on the head of the user. The stereo camera 14 is mounted to the AR-HUD device 36 (e.g., positioned to overlie the user's forehead, or implemented as two stereo cameras disposed on lenses of the glasses). To perform operation 110 using the AR-HUD approach of FIGURE 5, the AR-HUD 36 transmits the stereo images comprising a video stream from the stereo camera 14 to the processor 20 of FIGURE 1. After operations 104-110 are performed at the processor 20 (i.e., the blood vessel, target needle position, current needle position, and correction are determined) and AR content for correcting any needle insertion errors is generated, the AR content is transmitted back from the processor 20 to the AR-HUD 36, which displays the generated at least one of text and graphics (or, more generally, the AR content) on the AR-HUD display 40. For example, the text may read "needle too shallow", "target depth reached", or so forth. In another example, graphics may be superimposed as AR content, e.g. showing the correct angle of the needle as a translucent line that terminates at the point on the target blood vessel where the needle should be inserted. Here, the clinician merely needs to align the physical needle with this translucent line. In one example, the graphics can include a "level" graphic to show the stability of the needle 12.
To generate the text or graphics aligned with or positioned proximate to an actually observed feature such as the blood vessel, the processor 20 is programmed to use Simultaneous Localization and Mapping (SLAM) processing to align the AR content (e.g. the superimposed translucent target needle angle/position, textual instruction annotations, and/or so forth) with the recorded video. The SLAM processing may be formulated mathematically by the probabilistic formulation P(c1(t), c2(t), ..., cn(t), p(t) | f1, ..., ft), where c1(t), c2(t), ..., cn(t) are the locations of reference points of the needle 12 (or the connected barrel of the syringe 13) at a time t (where without loss of generality n reference points are assumed), p(t) is the location of the user at time t, and f1, ..., ft are the frames of the obtained video up to the time t. Various SLAM algorithms known for robotic vision mapping, self-driving vehicle navigation technology, or so forth are suitably applied to implement the AR content-to-video mapping.
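A complete SLAM pipeline is beyond the scope of a short example, but the per-frame registration of the marker reference points c1(t), ..., cn(t) to the camera view, which such a pipeline requires, can be sketched as a standard perspective-n-point solve. The following Python sketch assumes a calibrated camera and a known marker layout on the syringe barrel; it is illustrative only and is not the SLAM formulation itself.

```python
import cv2
import numpy as np

def register_needle_markers(marker_points_3d: np.ndarray,
                            marker_points_2d: np.ndarray,
                            camera_matrix: np.ndarray,
                            dist_coeffs: np.ndarray):
    """Estimate the pose of the marker constellation relative to the
    camera for one video frame (n >= 4 markers assumed for the default
    OpenCV solver).

    marker_points_3d : (n, 3) known marker layout on the syringe barrel,
                       in the barrel's own coordinate frame (assumed known).
    marker_points_2d : (n, 2) detected marker locations in the frame.
    camera_matrix    : 3x3 intrinsic matrix of the calibrated camera.
    dist_coeffs      : lens distortion coefficients.
    Returns rotation and translation vectors used to place AR overlays
    (e.g., the translucent target-angle line) in the camera view.
    """
    ok, rvec, tvec = cv2.solvePnP(marker_points_3d, marker_points_2d,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("Pose estimation failed for this frame")
    return rvec, tvec
```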
Referring back to FIGURE 5, in another embodiment, the feedback mechanism 34 includes a speaker 42 configured to provide audio instructions to a user presenting the corrective action. In this embodiment, the textual corrective instructions generated for the AR-HUD device 36 can additionally or alternatively be output as audio via the speaker 42.
In another embodiment, the feedback mechanism 34 includes one or more light emitting diodes (LEDs) 44, configured to illuminate to indicate the corrective action during insertion of the needle into the detected blood vessel. The illumination of the LEDs 44 can show the target position, target speed, and/or target angle of the needle 12. In one example, the corrective action includes an adjustment of an angle of the current position of the needle 12 to align with the target needle angle. In another example, the corrective action includes an adjustment of a rate of change of the current position of the needle 12 to align with the target needle speed.
In another embodiment, the feedback mechanism 34 includes a haptic device 46 built into or secured with the needle 12 (e.g. via the barrel of the syringe 13 in FIGURE 5) and configured to present the corrective action comprising haptic feedback to the user. For example, the haptic device 46 may vibrate when the needle 12 deviates from the target needle position during needle insertion.
The disclosure has been described with reference to the preferred embodiments. Modifications and alterations may occur to others upon reading and understanding the preceding detailed description. It is intended that the disclosure be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims

CLAIMS:
1. A needle placement assistance device (10) for assisting in venipuncture or arterial line placement, the device comprising:
a stereo camera (14) configured to acquire stereo images of a target portion of a patient;
a needle tracker (16) configured to track a current position of an associated needle (12);
at least one electronic processor (20); and
a non-transitory storage medium (26) storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method (100) including:
performing machine vision processing of the stereo images to generate a three-dimensional (3D) map (32) of the target portion;
detecting a blood vessel in the 3D map of the target portion;
determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and
identifying corrective action to align a current position of the needle with the target needle position.
2. The device (10) of claim 1, further including:
a feedback mechanism (34) configured to present the corrective action to a user during insertion of the needle (12) into the detected blood vessel.
3. The device (10) of claim 2, wherein the feedback mechanism (34) includes an augmented-reality heads-up display (AR-HUD) device (36) including an AR-HUD display (40), the stereo camera (14) being mounted to the AR-HUD device.
4. The device (10) of claim 3, wherein the processor (20) is programmed to:
receive the stereo images comprising a video stream from the stereo camera (14);
generate at least one of text and graphics indicative of the corrective action; and
control the AR-HUD display (40) to display the generated at least one of text and graphics.
5. The device (10) of claim 2, wherein the feedback mechanism (34) includes a speaker (42) configured to provide audio instructions to a user presenting the corrective action.
6. The device (10) of any one of claims 2-5, wherein the feedback mechanism (34) includes one or more light emitting diodes (LEDs) (44), the LEDs being configured to illuminate to indicate the corrective action during insertion of the needle into the detected blood vessel.
7. The device (10) of claim 6, wherein the corrective action includes an adjustment of an angle of the current position of the needle (12) to align with the target needle angle.
8. The device (10) of claim 6, wherein the corrective action includes an adjustment of a rate of change of the current position of the needle (12) to align with the target needle speed.
9. The device (10) of claim 2, wherein the feedback mechanism (34) includes a haptic device (46) operatively connected with the needle (12) and configured to present the corrective action comprising haptic feedback to the user.
10. The device (10) of any one of claims 1-9, wherein the needle tracker (16) comprises the at least one electronic processor (20) and the non-transitory storage medium (26), wherein the needle placement assistance method (100) further includes:
performing needle tracking machine vision processing of the stereo images to determine the current position of the needle (12) relative to the detected blood vessel.
11. The device (10) of claim 10, wherein the needle tracker (16) further comprises visual markers (30) disposed to move with the needle (12), wherein the needle tracking machine vision processing further includes:
detecting the visual markers in the stereo images.
12. The device (10) of any one of claims 1-11, wherein the needle tracker (16) comprises one or more of: an accelerometer; a gyroscope; and an electromagnetic (EM) tracking system with an EM sensor disposed to move with the needle (12).
13. A needle placement assistance device (10) for assisting in venipuncture or arterial line placement, the device comprising:
a stereo camera (14) configured to acquire stereo images of a target portion of a patient;
a needle tracker (16) configured to track a current position of an associated needle (12);
a feedback mechanism (34) configured to present the corrective action to a user during insertion of the needle (12) into the detected blood vessel, the feedback mechanism (34) including at least an augmented-reality heads-up display (AR-HUD) device (36) including an AR-HUD display (40), the stereo camera (14) being mounted to the AR-HUD device;
at least one electronic processor (20); and
a non-transitory storage medium (26) storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method (100) including:
performing machine vision processing of the stereo images to generate a three-dimensional (3D) map (32) of the target portion;
detecting a blood vessel in the 3D map of the target portion;
determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and
identifying corrective action to align a current position of the needle with the target needle position;
wherein the needle tracker (16) comprises the at least one electronic processor (20) and the non-transitory storage medium (26), wherein the needle placement assistance method (100) further includes:
performing needle tracking machine vision processing of the stereo images to determine the current position of the needle (12) relative to the detected blood vessel.
14. The device (10) of claim 13, wherein the processor (20) is programmed to:
receive the stereo images comprising a video stream from the stereo camera (14);
generate at least one of text and graphics indicative of the corrective action; and
control the AR-HUD display (40) to display the generated at least one of text and graphics.
15. The device (10) of either one of claims 13 and 14, wherein the feedback mechanism (34) further includes one or more light emitting diodes (LEDs) (44), the LEDs being configured to illuminate to indicate the corrective action during insertion of the needle into the detected blood vessel;
wherein the corrective action includes at least one of:
an adjustment of an angle of the current position of the needle (12) to align with the target needle angle; and
an adjustment of a rate of change of the current position of the needle (12) to align with the target needle speed.
16. The device (10) of any one of claims 13-15, wherein the needle tracker (16) further comprises visual markers (30) disposed to move with the needle (12), wherein the needle tracking machine vision processing further includes:
detecting the visual markers in the stereo images.
17. A needle placement assistance device (10) for assisting in venipuncture or arterial line placement, the device comprising:
a stereo camera (14) configured to acquire stereo images of a target portion of a patient;
a needle tracker (16) configured to track a current position of an associated needle (12);
a feedback mechanism (34) configured to present the corrective action to a user during insertion of the needle (12) into the detected blood vessel;
at least one electronic processor (20); and
a non-transitory storage medium (26) storing data related to one or more of target needle depth, target needle angle, and target needle speed, and instructions readable and executable by the at least one electronic processor to perform a needle placement assistance method (100) including:
performing machine vision processing of the stereo images to generate a three-dimensional (3D) map (32) of the target portion;
detecting a blood vessel in the 3D map of the target portion;
determining a target needle position relative to the blood vessel detected by the machine vision processing based on the data related to one or more of target depth, target angle, and target speed; and
identifying corrective action to align a current position of the needle with the target needle position;
wherein the feedback mechanism includes at least a speaker (42) configured to provide audio instructions to a user presenting the corrective action.
18. The device (10) of claim 17, wherein the feedback mechanism (34) further includes one or more light emitting diodes (LEDs) (44), the LEDs being configured to illuminate to indicate the corrective action during insertion of the needle into the detected blood vessel;
wherein the corrective action includes at least one of:
an adjustment of an angle of the current position of the needle (12) to align with the target needle angle; and
an adjustment of a rate of change of the current position of the needle (12) to align with the target needle speed.
19. The device (10) of either one of claims 17 and 18, wherein the needle tracker (16) comprises the at least one electronic processor (20), the non-transitory storage medium (26) and visual markers (30) disposed to move with the needle (12), wherein the needle placement assistance method (100) further includes:
performing needle tracking machine vision processing of the stereo images to detect the visual markers in the stereo images and determine the current position of the needle (12) relative to the detected blood vessel.
20. The device (10) of either one of claims 17 and 18, wherein the needle tracker (16) comprises one or more of: an accelerometer; a gyroscope; and an electromagnetic (EM) tracking system with an EM sensor disposed to move with the needle (12).
PCT/EP2018/072951 2017-09-05 2018-08-27 System for venipuncture and arterial line guidance with augmented reality WO2019048269A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762554114P 2017-09-05 2017-09-05
US62/554114 2017-09-05

Publications (1)

Publication Number Publication Date
WO2019048269A1 true WO2019048269A1 (en) 2019-03-14

Family

ID=63491586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/072951 WO2019048269A1 (en) 2017-09-05 2018-08-27 System for venipuncture and arterial line guidance with augmented reality

Country Status (1)

Country Link
WO (1) WO2019048269A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2289578A1 (en) * 2008-06-16 2011-03-02 Nory Co., Ltd. Syringe needle guiding apparatus
WO2012088471A1 (en) * 2010-12-22 2012-06-28 Veebot, Llc Systems and methods for autonomous intravenous needle insertion
WO2014139022A1 (en) * 2013-03-15 2014-09-18 Synaptive Medical (Barbados) Inc. Systems and methods for navigation and simulation of minimally invasive therapy
US20150065916A1 (en) * 2013-08-29 2015-03-05 Vasculogic, Llc Fully automated vascular imaging and access system
US20160324580A1 (en) * 2015-03-23 2016-11-10 Justin Esterberg Systems and methods for assisted surgical navigation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party

Title
FUJII, C.: "Clarification of the characteristics of needle-tip movement during vacuum venipuncture to improve safety", Vascular Health and Risk Management, vol. 9, 2013, pages 381-390
WANG et al.: "Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG", IEEE Transactions on Biomedical Engineering, vol. 62, no. 2, 2014
WU et al.: "Eulerian Video Magnification for Revealing Subtle Changes in the World", ACM Transactions on Graphics, vol. 31, no. 4 (Proc. SIGGRAPH), 2012, XP002781054

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE1027866B1 (en) * 2019-12-16 2021-07-15 Medtec Bv SYSTEM FOR REPEATABLE CANULATION OF A SUBJECT
CN112075981A (en) * 2020-08-27 2020-12-15 同济大学 Venipuncture robot control method, device and computer-readable storage medium
CN112075981B (en) * 2020-08-27 2021-09-03 同济大学 Venipuncture robot control method, device and computer-readable storage medium
CN113229936A (en) * 2021-05-06 2021-08-10 卫飞鹏 Method and system for improving liver intervention target positioning accuracy

Similar Documents

Publication Publication Date Title
RU2740259C2 (en) Ultrasonic imaging sensor positioning
US20190239850A1 (en) Augmented/mixed reality system and method for the guidance of a medical exam
JP6987893B2 (en) General-purpose devices and methods for integrating diagnostic trials into real-time treatment
US5912720A (en) Technique for creating an ophthalmic augmented reality environment
US20150366628A1 (en) Augmented surgical reality environment system
US11351312B2 (en) Use of infrared light absorption for vein finding and patient identification
WO2019048269A1 (en) System for venipuncture and arterial line guidance with augmented reality
Richa et al. Fundus image mosaicking for information augmentation in computer-assisted slit-lamp imaging
US20220139044A1 (en) Method and system for proposing and visualizing dental treatments
US20200359884A1 (en) System and method for detecting abnormal tissue using vascular features
US20230000419A1 (en) Electrogram Annotation System
CN107810535A (en) Wear-type computing device, method and computer program product
Crimi et al. Automatic measurement of venous pressure using B-mode ultrasound
Berger et al. Design considerations for a computer-vision-enabled ophthalmic augmented reality environment
CN111462314B (en) Organ three-dimensional image reconstruction method, operation navigation method and operation auxiliary system
KR102576422B1 (en) Method of displaying on a head mounted display apparatus and apparatus using the same
EP4287136A1 (en) System of vein location for medical interventions and biometric recognition using mobile devices
US11986248B2 (en) Apparatus and method for matching the real surgical image with the 3D-based virtual simulated surgical image based on POI definition and phase recognition
Hoyoux et al. A new computer vision-based system to help clinicians objectively assess visual pursuit with the moving mirror stimulus for the diagnosis of minimally conscious state
NL2034443A (en) Ultrasonic display system and method based on augmented reality
Ansar et al. Vein Detection and Cannula Insertion using Raspberry Pi
Kaur et al. A novel Weighted Integral Energy Functional (WIEF) algorithm: Augmented Reality (AR) for visualising the blood vessels in breast implant surgeries
Monserrat et al. Markerless monocular tracking system for guided external eye surgery
Louhith et al. Vein Detection using Infrared Technology and Machine Learning for Cannulation
CN111724883A (en) Medical data processing method, apparatus, system, and storage medium

Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 18765381; country of ref document: EP; kind code of ref document: A1)
NENP Non-entry into the national phase (ref country code: DE)
122 Ep: PCT application non-entry in European phase (ref document number: 18765381; country of ref document: EP; kind code of ref document: A1)