US20210015447A1 - Breast ultrasound workflow application - Google Patents

Breast ultrasound workflow application

Info

Publication number
US20210015447A1
US20210015447A1
Authority
US
United States
Prior art keywords
ultrasound
interest
region
lesion
workflow tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US16/979,964
Other languages
English (en)
Inventor
Shawn St. Pierre
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hologic Inc
Original Assignee
Hologic Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hologic Inc filed Critical Hologic Inc
Priority to US16/979,964
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT: SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FAXITRON BIOPTICS, LLC; FOCAL THERAPEUTICS, INC.; GEN-PROBE INCORPORATED; GEN-PROBE PRODESSE, INC.; HOLOGIC, INC.
Publication of US20210015447A1
Assigned to HOLOGIC, INC.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignor: ST. PIERRE, SHAWN

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/0825: Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the breast, e.g. mammography
    • A61B 10/0041: Detection of breast cancer
    • A61B 10/0233: Pointed or sharp biopsy instruments
    • A61B 8/0841: Detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/085: Detecting or locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/14: Echo-tomography
    • A61B 8/461: Displaying means of special interest
    • A61B 8/463: Displaying multiple images, or images and diagnostic data, on one display
    • A61B 8/467: Special arrangements for interfacing with the operator or the patient, characterised by special input means
    • A61B 8/469: Special input means for selection of a region of interest
    • A61B 8/5223: Processing of medical diagnostic data for extracting a diagnostic or physiological parameter
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS
    • G16H 50/30: ICT specially adapted for medical diagnosis; for calculating health indices; for individual health risk assessment

Definitions

  • Procedures involving the use of an ultrasound device may include initial diagnostic exams, guidance during interventional procedures, excision confirmation, post excision marker placement, pre-surgical localization procedures, and/or visualization during a surgical procedure.
  • a technician or radiologist uses the ultrasound device to scan the breast to identify and visualize the existence of potentially cancerous breast lesions, cysts, or other regions or features of interest.
  • an interventional procedure such as a biopsy, may be performed to excise a sample of tissue from the detected lesion or cyst to determine whether that tissue is malignant or benign.
  • the interventional procedure may require the use of an ultrasound device to visualize and locate the detected lesion, guide the technician during the biopsy procedure, confirm that the region of interest has been accurately sampled, and place a marker for subsequent procedures.
  • a recommended course of action may be to remove the identified tumor in a surgical procedure.
  • a separate pre-surgical procedure called localization, may also include the use of an ultrasound device to locate the tumor in order to guide the surgeon during surgical removal of the tumor.
  • Examples of the present disclosure describe systems and methods for executing an ultrasound workflow tool that facilitates procedures, such as a diagnostic exam, an interventional procedure, a localization procedure and/or a removal (excision) surgery, by anticipating procedure workflow steps and automatically providing pertinent prompts and information in real time.
  • A computing device having a processor and memory may be provided.
  • The computing device stores computer-executable instructions that implement an ultrasound workflow tool for facilitating a diagnostic procedure.
  • The computer-executable instructions cause the computing device to receive one or more signals from an ultrasound probe scanning an anatomical feature of a patient and to detect, by the ultrasound workflow tool, a region of interest associated with the scanned anatomical feature.
  • The computer-executable instructions further cause the computing device to cause display of the region of interest on a display of the computing device and to provide a prompt for classifying the detected region of interest.
  • The computer-executable instructions further cause the computing device to provide a template for identifying at least one location of the classified region of interest and to provide a review screen comprising one or more fields for characterizing the classified region of interest. Additionally, the computing device is programmed to receive a description into each of the one or more fields for characterizing the classified region of interest and to generate a report of the diagnostic scan based in part on the classified region of interest, the received at least one location, and the received one or more descriptions. As should be appreciated, one or more of the described steps may be optionally performed.
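The claimed sequence above (detect a region of interest, prompt for a classification, mark a location on a template, fill review-screen fields, and compile a report) can be sketched as a minimal state holder. All names here (`DiagnosticWorkflow`, `RegionOfInterest`, the field names) are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class RegionOfInterest:
    classification: str = ""                 # e.g. "lesion" or "cyst"
    location: tuple = ()                     # position marked on the template
    descriptions: dict = field(default_factory=dict)  # review-screen fields

class DiagnosticWorkflow:
    """Hypothetical holder for the diagnostic-exam steps."""

    def __init__(self):
        self.regions = []

    def detect_region(self):
        # Detect an ROI in the incoming ultrasound signal (detection itself elided).
        roi = RegionOfInterest()
        self.regions.append(roi)
        return roi

    def generate_report(self):
        # Compile classification, location, and descriptions for each ROI.
        return [{"classification": r.classification,
                 "location": r.location,
                 **r.descriptions} for r in self.regions]

wf = DiagnosticWorkflow()
roi = wf.detect_region()
roi.classification = "lesion"                # operator answers the prompt
roi.location = (2, 30)                       # e.g. 2 o'clock, 30 mm from nipple
roi.descriptions.update(shape="oval", margins="circumscribed")
report = wf.generate_report()
```

Each entry in the generated report combines the classification, the template location, and the review-screen descriptions, mirroring the report contents recited above.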
  • a method for facilitating an interventional procedure is performed by an ultrasound workflow tool operating on a computing device having a processor operatively coupled to an ultrasound device.
  • the method includes receiving one or more signals from an ultrasound probe scanning an anatomical feature of a patient and detecting, by the workflow tool, a lesion associated with the anatomical feature. Additionally, the method includes capturing a pre-fire image of a needle positioned at the lesion and automatically annotating the pre-fire image. The method further includes capturing a post-fire image of the needle at the lesion and automatically annotating the post-fire image. Also, the method includes generating a report of the interventional procedure based in part on the annotated pre-fire image and the annotated post-fire image. As should be appreciated, one or more of the described steps may be optionally performed.
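The capture-and-annotate sequence for the interventional method can be sketched as follows; the frame identifiers, field names, and report format are assumptions for illustration, since the text does not specify a data format:

```python
from datetime import datetime, timezone

def annotate(frame_id, stage, lesion_id):
    """Attach a stage label and a timestamp to a captured frame."""
    return {"frame": frame_id, "lesion": lesion_id, "stage": stage,
            "captured": datetime.now(timezone.utc).isoformat()}

def interventional_report(lesion_id, pre_fire_frame, post_fire_frame):
    """Build the procedure report from the annotated pre- and post-fire images."""
    return {"lesion": lesion_id,
            "images": [annotate(pre_fire_frame, "pre-fire", lesion_id),
                       annotate(post_fire_frame, "post-fire", lesion_id)]}

report = interventional_report("L1", "frame_041", "frame_058")
```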
  • a method for facilitating a localization procedure is provided.
  • the method is performed by an ultrasound workflow tool operating on a computing device having a processor operatively coupled to an ultrasound device.
  • the method includes receiving one or more signals from an ultrasound probe scanning an anatomical feature of a patient and causing display of an image associated with the scanned anatomical feature on a display of the computing device.
  • the method further includes detecting, by the workflow tool, a first position of a localization device in a region associated with the scanned anatomical feature, determining a distance between the cancerous tissue and the first position of the localization device and providing the distance.
  • Based on detecting a movement of the localization device to a second position, the method includes recalculating the distance and providing the recalculated distance.
  • the method includes providing guidance for advancing the localization device toward the cancerous tissue.
  • one or more of the described steps may be optionally performed.
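The distance computation and recalculation in the localization method above can be illustrated with a plain Euclidean distance. The coordinates, the millimetre units, and how positions are tracked are assumptions for this sketch:

```python
import math

def distance(target, device_position):
    """Euclidean distance between the target tissue and the device tip."""
    return math.dist(target, device_position)

target = (10.0, 20.0, 15.0)           # lesion position, mm (illustrative)
first = (40.0, 20.0, 15.0)            # first detected device position
second = (25.0, 20.0, 15.0)           # position after the device is moved

d1 = distance(target, first)          # distance provided to the operator
d2 = distance(target, second)         # recalculated after movement
advancing = d2 < d1                   # basis for guidance toward the tissue
```

Comparing successive distances is one simple way the tool could provide the guidance described in the next step, i.e. whether the device is advancing toward the cancerous tissue.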
  • a method executed by a computing device having at least a processor that implements a workflow tool for facilitating an ultrasound diagnostic procedure includes receiving one or more signals from an ultrasound probe scanning an anatomical feature of a patient, detecting, by the ultrasound workflow tool, a region of interest associated with the scanned anatomical feature, and causing display of the region of interest on the display.
  • the method further includes providing a prompt for classifying the detected region of interest and receiving a classification of the region of interest.
  • the method includes providing a template for identifying at least one location of the classified region of interest and providing a review screen comprising one or more fields for characterizing the classified region of interest.
  • the method includes receiving a description into each of the one or more fields for characterizing the classified region of interest and generating a report of the diagnostic scan based in part on the classified region of interest, the received at least one location, and the received one or more descriptions.
  • the described steps may be optionally performed.
  • FIG. 1A depicts an example of a wireless ultrasound system.
  • FIG. 1B depicts an example of a wireless ultrasound system used to scan a patient.
  • FIG. 1C depicts an example of a suitable operating environment for the disclosed ultrasound workflow tool.
  • FIG. 2A depicts an example method for facilitating a diagnostic scan using the disclosed ultrasound workflow tool.
  • FIGS. 2B-2L depict example screenshots of facilitating a diagnostic scan and the generated report.
  • FIG. 3A depicts an example method for facilitating an interventional procedure using the disclosed ultrasound workflow tool.
  • FIGS. 3B-3I depict example screenshots of the resulting report generated and annotated images.
  • FIG. 4 depicts an example method for facilitating a localization and an excision surgery procedure using the disclosed ultrasound workflow tool.
  • the disclosure generally relates to an ultrasound workflow tool that facilitates procedures, such as a diagnostic exam, an interventional procedure, a localization procedure and/or a removal (excision) surgery, by anticipating procedure workflow steps and automatically providing pertinent prompts and information in real time.
  • the ultrasound operator typically scans the breast and captures a number of two-dimensional images. These two-dimensional images may be annotated with minimal information associated with identified anomalies, such as indications of which anatomical feature was scanned (e.g., right or left breast), a number of anomalies identified, estimated polar (or clock) coordinates of such anomalies (e.g., with a nipple of the breast as center point), or basic tissue depth (e.g., first, second, third depth layers).
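The polar (clock) annotation mentioned above can be sketched as a simple angle-to-clock-hour mapping. The orientation convention (0 degrees at 12 o'clock, hours increasing clockwise) and the annotation string are assumptions for illustration:

```python
def angle_to_clock(angle_deg):
    """Map an angle measured clockwise from 12 o'clock to a clock hour."""
    hour = round((angle_deg % 360) / 30)
    return 12 if hour % 12 == 0 else hour

def polar_to_annotation(angle_deg, radius_mm, side="right"):
    """Compose a minimal annotation of the kind described above."""
    return f"{side} breast, {angle_to_clock(angle_deg)} o'clock, {radius_mm} mm from nipple"

annotation = polar_to_annotation(90, 30)
```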
  • One example is WO2017/060791 to Roger, the disclosures of which are hereby incorporated herein by reference.
  • the present application discloses and claims an integrated workflow tool that not only anticipates each step of a procedure workflow, providing pertinent prompts, templates and information to facilitate the procedure and inform the practitioner, but may draw from information and data captured during previous procedures to further augment and facilitate a current procedure.
  • the scan is thereafter printed and placed in the patient's file.
  • the radiologist or surgeon later analyzes the scan to confirm the operator's findings and may order an interventional procedure to determine whether a region of interest is cancerous. For instance, during an interventional procedure, such as an image-guided breast biopsy, potentially cancerous tissue is extracted from a targeted location of a patient for further examination. Once the needle is inserted into the patient, the location of the needle can no longer be visually identified. Accordingly, during the biopsy, the surgeon may use an ultrasound device to guide the needle to the lesion, while also capturing images during the entirety of the procedure.
  • the surgeon will capture images as the needle is being guided to the lesion as well as capture images before the needle is fired, after the needle is fired into the lesion site, and to obtain a cross-sectional image of the lesion.
  • the medical professional holds the ultrasound probe in one hand while guiding the needle to the lesion site with the other hand. Accordingly, capturing images and characterizing regions of interest is cumbersome and inefficient. However, these images serve as evidentiary proof of the complete performance of the procedure as well as the basis for later analysis of the lesion tissue. After completion of the interventional procedure, the medical professional later characterizes and annotates each of the images.
  • an excision procedure may be ordered.
  • a device is placed to pinpoint the location of the cancerous tissue using an imaging modality. This localization procedure allows the surgeon to use the localization device to remove the cancerous tissue from the patient.
  • the medical professional will hold the ultrasound device in one hand while placing either a wire or a wireless localization device at the site of the cancerous tissue.
  • During the excision procedure, the surgeon therefore relies on the pre-implanted localization device to target the cancerous tissue by locating the localization device using imaging to guide the incision instrument to the cancer site, and to make any necessary adjustments throughout this process.
  • the live ultrasound image is used to make sure the entirety of the cancerous tissue and, optionally, the surrounding margins have been removed.
  • the present application seeks to address and overcome many of the challenges of the above-described, multi-procedural set of workflows for diagnosing and treating cancerous tissues.
  • the use of ultrasound requires close contact of the ultrasound probe to the body in order to provide imaging.
  • the specific ultrasound-guided procedures such as the diagnostic exam, interventional procedure, and localization procedure described above each have unique challenges.
  • the information obtained about the patient from the diagnostic exam may not be accessible for the purposes of the interventional procedure and the medical professional performing the biopsy would have to determine the location of the cancerous tissue for the purposes of targeting the correct area of the breast.
  • the medical professional operates the ultrasound transceiver with one hand and the biopsy device with the other, making it hard to make any notations during the procedure.
  • many images may need to be captured during the procedure.
  • the medical professional typically goes back to characterize each image at a later time—oftentimes hours after the procedure, and sometimes even after multiple other procedures. The medical professional may therefore spend unnecessary additional time recalling context relating to each of the captured images, making it challenging for the medical professional to accurately recall the information pertaining to each image.
  • the nature of the breast tissue of the patient affects how the medical professional operates the biopsy tool to obtain the sample of tissue, for example if the tissue is dense, whether there are calcifications, breast implants, etc. Without having access to the diagnostic exam, the medical professional has to spend unnecessary additional time understanding the nature of the breast tissue in order to perform the interventional procedure.
  • For the localization procedure, which often occurs several weeks to months after the interventional procedure and is likely conducted at a different facility with different medical staff, information obtained during the interventional procedure may not be accessible for purposes of placing the localization device.
  • the legacy ultrasound images and associated information may not be accessible and/or compatible for reference or side-by-side comparison with displayed images generated by the ultrasound device being used for the localization procedure.
  • the medical professional relies exclusively on the localization device placed prior to the procedure to locate the cancerous tissue and identify the boundaries thereof to excise a correct amount of cancerous tissue.
  • the information obtained during the diagnostic exam, the interventional procedure, or the localization procedure may not be accessible during the excision procedure, leaving the medical professional to spend time re-evaluating the characteristics of the tissue and the location of the lesion, in addition to performing the surgery.
  • Embodiments described herein improve upon the prior technology by providing novel and inventive ultrasound workflow tools.
  • the enhanced ultrasound workflow tools facilitate use of an ultrasound device during operation of a diagnostic exam, an interventional procedure, and/or a localization procedure. Not only are these workflow tools customized for each procedure, but they are integrated to provide cross-procedure information at a physician's fingertips. That is, all of the imaging and information collected via ultrasound for a particular patient across different procedures, different medical facilities, and different medical staff may be compiled and accessed during subsequent procedures via the workflow tools.
  • Such compilation and integration of clinical and/or diagnostic information regarding patient lesion(s) enables medical professionals to make well-informed assessments and to perform procedures more precisely and efficiently, not only saving time and effort on behalf of medical personnel but potentially reducing the durations of at least some procedures—benefiting both patients and medical facilities.
  • the workflow tools are able to anticipate the next step in a workflow of each procedure to provide prompts and/or additional information (e.g., imaging and diagnostics from previous procedures) that are pertinent to the particular step.
  • the ultrasound workflow tools are part of an application that is executed on a wireless computing device, and is used in connection with a wireless ultrasound device.
  • the ultrasound device is a wired device that is capable of communicating with the wired computing device on which the ultrasound workflow tool application operates.
  • the ultrasound workflow tool enables the ultrasound operator to concisely characterize each lesion as it is identified during the exam.
  • These characterizations may include, for example, the following details about the lesion: size, location (region and depth), readability, parenchymal pattern, classification as a focal lesion, margins, shape, echo pattern, orientation, posterior features, calcifications, and classification (lesion, cyst, etc.).
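One way to hold the listed characterization details is a simple record. Field names mirror the list above; the types and defaults are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LesionCharacterization:
    """Hypothetical record of the per-lesion details listed above."""
    size_mm: Optional[float] = None
    region: Optional[str] = None           # location: region
    depth: Optional[str] = None            # location: depth
    readability: Optional[str] = None
    parenchymal_pattern: Optional[str] = None
    focal_lesion: Optional[bool] = None
    margins: Optional[str] = None
    shape: Optional[str] = None
    echo_pattern: Optional[str] = None
    orientation: Optional[str] = None
    posterior_features: Optional[str] = None
    calcifications: Optional[bool] = None
    classification: Optional[str] = None   # e.g. "lesion" or "cyst"

c = LesionCharacterization(size_mm=8.5, shape="oval", classification="cyst")
```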
  • the ultrasound workflow tool may prompt the operator to describe the detected area as a lesion or a cyst and to identify the location (region and even depth) of that lesion/cyst on a pictogram.
  • the ultrasound workflow tool may further prompt the operator to further characterize each detected lesion or cyst according to the above features using, for example, a series of check boxes and text inputs.
  • the ultrasound workflow tool may provide the ultrasound operator with an automatically characterized description of the detected lesion, which the operator need only confirm or edit. The operator repeats these steps until the diagnostic exam is completed.
  • the ultrasound workflow tool gathers the data obtained during the exam to generate a detailed report for the patient's file.
  • This report includes a complete characterization of each detected lesion and a pictogram of the patient's breasts that visually depicts the location of each detected lesion.
  • the ultrasound workflow tool enables an ultrasound operator to quickly and efficiently characterize detected lesions during or immediately following the diagnostic exam. Because this is done during or immediately following completion of the scan, the ultrasound operator does not have to rely on memory to add any characterizations to the patient's report.
  • the report generated by the ultrasound workflow tool is a complete report, fully characterizing each lesion detected during the exam, and not just a mere annotation of a scan.
  • Any follow up diagnostic scan or another ultrasound procedure would be able to access the information, including characterization of the lesion, entered during the diagnostic exam and generated as part of the report described above.
  • the next medical professional can save time by referencing the information previously entered and compiled by the workflow tool.
  • the ultrasound workflow tool facilitates the procedure for the medical professional, who is also operating the ultrasound with one hand and the biopsy device in the other, by automatically capturing images as the medical professional performs the biopsy.
  • a biopsy procedure requires the medical professional to perform a sequence of steps, each of which must be carefully imaged and documented.
  • the information entered as part of the diagnostic workflow such as the location and characterization of the lesion, can be accessed by the workflow tool during the interventional procedure and used to locate the lesion for the purpose of the biopsy, thereby saving the medical professional time and effort in having to locate the lesion all over again.
  • the ultrasound workflow tool therefore facilitates the procedure by initially prompting the medical professional to input information about the biopsy, such as, for example, the number of lesions to be tested, the type of needle used, and the needle gauge. Once this information is provided, the ultrasound workflow tool populates information about the needle so that images captured can be carefully related to the needle use. Thereafter, when the biopsy procedure begins, the ultrasound workflow tool automatically captures images as the medical professional performs the biopsy and automatically annotates each image.
  • the ultrasound workflow tool predicts and detects each step of the biopsy procedure and annotates each image captured with a description of that image for that particular lesion (e.g., image of initial placement of needle, needle pre-fire image, needle post-fire image, lesion cross-sectional image, etc.). Accordingly, the ultrasound workflow tool facilitates the procedure by automatically annotating many images captured during an interventional procedure. This avoids the excessive time normally taken by the medical professional to later annotate images, which oftentimes occurs hours after the procedure. This automatic annotation during the procedure also increases the accuracy of the annotations, avoiding situations in which the surgeon is required to recall the context relating to the particular procedure and each image captured, which may result in inaccurate annotations or missing information.
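The stage-prediction behavior described above, labeling each captured frame with the expected step of the biopsy sequence, might be sketched as follows. The stage names come from the text; the sequencing logic and class names are assumptions:

```python
# Expected image sequence for each lesion, per the description above.
BIOPSY_STAGES = [
    "initial needle placement",
    "needle pre-fire",
    "needle post-fire",
    "lesion cross-section",
]

class StageAnnotator:
    """Hypothetical annotator that advances through the expected stages."""

    def __init__(self, lesion_id):
        self.lesion_id = lesion_id
        self.index = 0

    def annotate_next(self, frame_id):
        # Label the captured frame with the predicted stage, then advance.
        stage = BIOPSY_STAGES[min(self.index, len(BIOPSY_STAGES) - 1)]
        self.index += 1
        return f"{self.lesion_id}: {stage} ({frame_id})"

ann = StageAnnotator("lesion 1")
labels = [ann.annotate_next(f"frame_{i}") for i in range(4)]
```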
  • the ultrasound workflow tool facilitates the procedure for the medical professional who is also operating the ultrasound and localization device by providing the medical professional with step-by-step guidance of the localization device to the cancerous tissue.
  • the localization device provides guidance during the excision surgical process to accurately excise cancerous tissue and a sufficient margin area around the cancerous tissue.
  • the medical professional initially locates the cancerous tissue by placing the ultrasound probe on the patient's breast to visually identify, using the ultrasound device, a previously implanted marker at the lesion site. Either a wire or wireless localization device is then placed at the site of the lesion and the marker.
  • a marker is not used during the interventional procedure, and the medical professional would need to visually identify the location of the lesion without the aid of the marker. Without the workflow tool, the medical professional would need to both locate and place the localization device on their own, determining the location of the lesion. With the workflow tool, significant time is saved and accuracy of determining placement is improved.
  • FIG. 1A depicts an example of a system 100 including an ultrasound device 101 connected to a wireless computing device 110 on which the disclosed ultrasound workflow tool operates.
  • the ultrasound device 101 is a wireless ultrasound device and in other examples the ultrasound device is wired.
  • the ultrasound device 101 includes an ultrasound probe 102 that includes an ultrasonic transducer 104 .
  • the ultrasonic transducer 104 is configured to emit an array of ultrasonic sound waves 106 .
  • the ultrasonic transducer 104 converts an electrical signal into ultrasonic sound waves 106 .
  • the ultrasonic transducer 104 may also be configured to detect ultrasonic sound waves, such as ultrasonic sound waves that have been reflected from internal portions of a patient.
  • the ultrasonic transducer 104 may incorporate a capacitive transducer and/or a piezoelectric transducer, as well as other suitable transducing technology.
  • the ultrasound probe 102 may also include a probe localization transceiver 108 .
  • the probe localization transceiver 108 is a transceiver that emits a signal providing localization information for the ultrasound probe 102 .
  • the probe localization transceiver 108 may include an RFID chip or device for sending and receiving information.
  • the signal emitted by the probe localization transceiver 108 may be processed to determine the orientation or location of the ultrasound probe 102 .
  • the orientation and location of the ultrasound probe 102 may be determined or provided in three-dimensional components, such as Cartesian coordinates or spherical coordinates.
  • the orientation and location of the ultrasound probe 102 may also be determined or provided relative to other items, such as an incision instrument, a marker, a magnetic direction, a normal to gravity, etc. With the orientation and location of the ultrasound probe 102, additional information can be generated and provided to the surgeon to facilitate guiding the surgeon to a lesion within the patient, as described further below. While the term transceiver is used herein, it is intended to cover transmitters, receivers, and transceivers, along with any combination thereof.
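The Cartesian and spherical components mentioned above are related by the standard conversion. The probe-tracking interface itself is not specified in the text, so only the coordinate math is shown:

```python
import math

def cartesian_to_spherical(x, y, z):
    """Return (r, theta, phi): radius, polar angle from +z, azimuth (radians)."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r else 0.0   # polar angle from the +z axis
    phi = math.atan2(y, x)                   # azimuth in the x-y plane
    return r, theta, phi

r, theta, phi = cartesian_to_spherical(0.0, 3.0, 4.0)
```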
  • the ultrasonic transducer 104 is operatively connected to a wireless computing device 110 .
  • the wireless computing device 110 may be a part of a computing system, including processors and memory configured to produce and analyze ultrasound images. Further discussion of a suitable computing system is provided below with reference to FIG. 1D .
  • the wireless computing device 110 is configured to display ultrasound images based on an ultrasound imaging of a patient.
  • the ultrasound imaging performed in the ultrasound localization system 100 is primarily B-mode imaging, which results in a two-dimensional ultrasound image of a cross-section of a portion of the interior of a patient.
  • the brightness of the pixels in the resultant image generally corresponds to amplitude or strength of the reflected ultrasound waves.
  • Other ultrasound imaging modes may also be utilized.
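  • The brightness mapping described for B-mode imaging can be sketched as below. Clinical scanners typically apply logarithmic compression to the echo amplitude before display; this linear clamp-and-scale version is a deliberate simplification, and the function name and 8-bit range are assumptions.

```python
def amplitude_to_brightness(amplitude, max_amplitude=1.0):
    """Map a reflected-wave amplitude to an 8-bit pixel brightness.

    Simplified linear mapping: clamp to [0, max_amplitude], then
    scale to 0-255. Stronger reflections yield brighter pixels.
    """
    clamped = max(0.0, min(amplitude, max_amplitude))
    return round(255 * clamped / max_amplitude)
```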
  • the wireless computing device 110 is further configured to operate the ultrasound workflow tool. As will be described in further detail herein, the ultrasound workflow tool may be used by an ultrasound operator in relation to a diagnostic exam, an interventional procedure, and an excision surgery procedure.
  • FIG. 1B depicts an example of the system 100 including a wireless ultrasound device 101 in use with a patient 112 .
  • the ultrasound probe 102 is in contact with a portion of the patient 112 , such as a breast of the patient 112 .
  • the ultrasound probe 102 is being used to image a portion of the patient 112 containing a lesion 114 .
  • a marker 116 has been implanted at the lesion 114 .
  • the marker 116 may be implanted at the lesion during or in association with a biopsy procedure prior to the surgical procedure discussed herein.
  • the marker 116 allows for the lesion 114 to be localized through the use of the wireless ultrasound device during a later excision surgery procedure.
  • the ultrasonic transducer 104 emits an array of ultrasonic sound waves 106 into the interior of the patient 112 .
  • a portion of the ultrasonic sound waves 106 is reflected off internal components of the patient 112 , as well as the marker 116 when the marker 116 is in the field of view, and returns to the ultrasound probe 102 as reflected ultrasonic sound waves 120 .
  • the reflected ultrasonic sound waves 120 may be detected by the ultrasonic transducer 104 .
  • the ultrasonic transducer 104 receives the reflected ultrasonic sound waves 120 and converts the ultrasonic sound waves 120 into an electric signal that can be processed and analyzed to generate ultrasound image data on display 110 .
  • the depth of the marker 116 or other objects in an imaging plane may be determined from the time between a pulse of ultrasonic waves 106 being emitted from the ultrasound probe 102 and the reflected ultrasonic waves 120 being detected by the ultrasound probe 102 .
  • the speed of sound in soft tissue is well known, and the effect of tissue composition on the speed of sound is also determinable. Accordingly, based on the time of flight of the ultrasonic waves 106 (more specifically, half the time of flight), the depth of the object within an ultrasound image may be determined. Other corrections or methods for determining object depth, such as compensating for refraction and variant speed of waves through tissue, may also be implemented. Those having skill in the art will understand further details of depth measurements in medical ultrasound imaging technology.
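  • The half-time-of-flight calculation described above can be sketched as follows. The 1540 m/s figure is the conventional average speed of sound in soft tissue; the function name is illustrative and refraction or tissue-variation corrections are omitted.

```python
SPEED_OF_SOUND_SOFT_TISSUE_M_S = 1540.0  # conventional soft-tissue average

def depth_from_time_of_flight(round_trip_seconds,
                              speed=SPEED_OF_SOUND_SOFT_TISSUE_M_S):
    """Return object depth in meters.

    The pulse travels to the reflector and back, so the one-way
    distance uses half the measured round-trip time.
    """
    return speed * round_trip_seconds / 2.0
```

  • For example, a 65-microsecond round trip corresponds to a depth of roughly 5 cm.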
  • the wireless computing device 110 is wirelessly connected to the wireless ultrasound device 101 .
  • the wireless computing device 110 displays an ultrasound image 130 , including an image of the implanted marker 116 , on the display of the wireless computing device 110 .
  • as the ultrasound probe 102 detects images, those images are simultaneously displayed on the wireless computing device 110 .
  • the wireless computing device 110 executes the ultrasound workflow tool, which is operatively connected to the wireless ultrasound device 101 .
  • the ultrasound workflow tool is used to facilitate performing a diagnostic scan, an interventional procedure, and a localization and excision surgery procedure.
  • the ultrasound workflow tool is further configured to automatically generate reports reflecting data obtained during each of the procedures.
  • the ultrasound workflow tool executed on the wireless computing device 110 may be used during a diagnostic breast exam.
  • the ultrasound operator may operate the wireless ultrasound device 101 in one hand, while operating the wireless computing device 110 in the other.
  • the operator may identify various areas of interest as shown on the display of the wireless computing device 110 .
  • the operator can use the ultrasound workflow tool to classify the areas of interest as a lesion or a cyst as they are detected during the diagnostic exam.
  • the operator can also use the ultrasound workflow tool to mark, on a pictogram of the breast displayed on a display, a location indicating the region and depth of the potential lesion. This pictogram or location information can be later provided in a chart automatically generated by the ultrasound workflow tool.
  • the ultrasound workflow tool executed on the wireless computing device 110 may be used during an interventional procedure such as an image-guided breast biopsy.
  • the medical professional operates the wireless ultrasound device to visually guide the needle to a region of interest while also capturing images.
  • because the medical professional is operating the ultrasound probe 102 in one hand and the needle in the other, the medical professional does not have the ability to annotate images during the procedure.
  • the ultrasound workflow tool therefore is used to prompt the medical professional to advance through each step within the biopsy process, automatically capturing necessary images, and automatically annotating those images. Those annotated images can thereafter be provided in a report attached to the patient's file.
  • the ultrasound workflow tool executed on the wireless computing device 110 may be used during the pre-surgical localization procedure in order to place a localization device that would later be used in the excision surgery.
  • the medical professional will locate the cancerous tissue by placing the ultrasound probe on the patient's breast to visually identify a previously implanted marker at the cancer site.
  • the ultrasound workflow tool can facilitate the procedure by providing the medical professional with updated, detailed visualization information on the display of the wireless computing device 110 .
  • the ultrasound workflow tool can provide step-by-step audio instructions to guide the incision tool during the excision surgery to the cancerous tissue, as well as guide the surgeon during the excising process.
  • FIG. 1C depicts an example of a suitable operating environment 150 for incorporation into the disclosed system.
  • operating environment 150 typically includes at least one processing unit 152 and memory 154 .
  • the memory 154 may store instructions to perform the active monitoring embodiments disclosed herein.
  • memory 154 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two.
  • This most basic configuration is illustrated in FIG. 1E by dashed line 156 .
  • environment 150 may also include storage devices (removable 158 , and/or non-removable 160 ) including, but not limited to, magnetic or optical disks or tape.
  • environment 150 may also have input device(s) 164 such as keyboard, mouse, pen, voice input, etc. and/or output device(s) 166 such as a display, speakers, printer, etc.
  • the input devices 164 may also include one or more antennas to detect signals emitted from the various transceivers in the system 100 .
  • Also included in the environment may be one or more communication connections 162 , such as LAN, WAN, point to point, etc. In embodiments, the connections may be operable to facilitate point-to-point communications, connection-oriented communications, connectionless communications, etc.
  • Operating environment 150 typically includes at least some form of computer readable media.
  • Computer readable media can be any available media that can be accessed by processing unit 152 or other devices comprising the operating environment.
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store the desired information.
  • Computer storage media does not include communication media.
  • Communication media embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, microwave, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • the operating environment 150 may be a single computer operating in a networked environment using logical connections to one or more remote computers.
  • the remote computer may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above as well as others not so mentioned.
  • the logical connections may include any method supported by available communications media.
  • FIG. 2A depicts an example method 200 for facilitating a diagnostic scan using the disclosed ultrasound workflow tool.
  • the method 200 may be performed by the ultrasound workflow tool executed on a wireless device having a processor such as for example the system 100 and the operating environment 150 .
  • the method 200 begins at operation 202 in which the ultrasound workflow tool displays a prompt to select a workflow from among a plurality of workflows.
  • the ultrasound workflow tool can facilitate various operations such as a diagnostic breast exam, an interventional procedure, a localization and/or an excision surgery.
  • a workflow is a sequence of predetermined steps that are performed during a particular procedure.
  • the prompt displayed is a drop down menu or some other suitable menu used to select one of a plurality of workflows associated with a procedure. In this way, regardless of the experience of a medical professional, the workflow tool anticipates a next step of the procedure and provides a suitable menu or prompt to the medical professional, thereby ensuring that the workflow tool collects appropriate information as the procedure progresses.
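  • The workflow-selection and step-anticipation behavior described above might be modeled as below. The workflow names and step lists are purely illustrative assumptions; the patent does not specify this data structure.

```python
# Hypothetical mapping from selectable workflows to their predetermined steps.
WORKFLOWS = {
    "diagnostic": ["scan", "classify", "locate", "review", "report"],
    "interventional": ["select_needle", "scan", "pre_fire", "post_fire", "report"],
    "localization": ["scan", "identify_marker", "place_device", "report"],
}

def next_step(workflow, completed_steps):
    """Anticipate the next step of the selected workflow, or None when
    the predetermined sequence is complete."""
    steps = WORKFLOWS[workflow]
    index = len(completed_steps)
    return steps[index] if index < len(steps) else None
```

  • With such a table, the tool can present the appropriate menu or prompt regardless of the medical professional's experience level.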
  • the ultrasound workflow tool receives a selection of a diagnostic workflow, related to a diagnostic breast exam.
  • the diagnostic breast exam procedure is a breast scan to identify any areas of interest that may be potentially cancerous.
  • the breast scan may be performed by a medical professional using an ultrasound device having a probe that is used to scan the patient's breasts.
  • the ultrasound workflow tool receives one or more signals from the ultrasound device.
  • These one or more signals may include one or more images of the patient's breast as the ultrasound device is used to scan the patient's breast during the diagnostic scan.
  • the ultrasound device is rotated around the patient's breast until a region of interest is detected.
  • a potential region of interest may look like a darker or lighter portion on the ultrasound image.
  • the ultrasound workflow tool may identify the region of interest based on a visual identification of the scan displayed on the display of the wireless computing device. In an example, the operator may select an option at the ultrasound workflow tool to indicate the region of interest.
  • if a region of interest is not identified (NO at operation 208 ), the method 200 returns to operation 206 . If a region of interest is identified (YES at operation 208 ), the method 200 proceeds to operation 210 in which the ultrasound workflow tool causes display of the region of interest. In one example, this may include a zoomed-in image of the region of interest.
  • the ultrasound workflow tool provides a prompt to classify the region of interest.
  • the region of interest may be, for example, a lesion or a cyst.
  • the region of interest may include more than one lesion or cyst.
  • each lesion (or cyst) may be evaluated independently based on the operations described below, or the collection of lesions (or cysts) may be evaluated collectively based on the operations below, or some combination thereof.
  • a lesion is an abnormal mass of tissue or swelling and a cyst is a sac that may be filled with air, fluid, or another material.
  • the ultrasound workflow tool may display an option (e.g., a drop-down menu, a check box, or other suitable selection option) to characterize the detected region as a lesion or a cyst.
  • the ultrasound workflow tool receives the selected classification.
  • the selected classification of the region of interest may be a lesion classification or a cyst classification.
  • the ultrasound workflow tool displays a template with one or more fields for characterizing the classified region of interest (e.g., a pictogram or another schematic representation of the breast).
  • the pictogram may be, for example, a two-dimensional line diagram of the breast that enables the ultrasound operator to quickly, easily, and more accurately identify the location of the classified region of interest (e.g., lesion or cyst).
  • the pictogram includes numerically identified circular regions that radiate from the nipple.
  • a cross-sectional side view pictogram is provided, displaying alphabetical regions that radiate from the top surface of the breast to identify depth.
  • the ultrasound workflow tool receives a location of the classified region of interest.
  • the ultrasound workflow tool may receive a signal in response to the operator's selection, on the displayed pictogram, of a location of the anomaly on the pictogram.
  • the ultrasound workflow tool may also receive a selection, from a cross-sectional pictogram view, of a depth of the anomaly.
  • the ultrasound workflow tool may translate these one or more selections into number-letter coordinates identifying a location of the classified anomaly. Alternatively or additionally, only the location relating to the nipple is selected, thereby generating a numerical region identification of the location of the classified anomaly.
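  • The translation of pictogram selections into number-letter coordinates could be sketched as below. The coordinate format (numeric radial region from the nipple, optional letter for depth) follows the description above, but the function and its exact output format are assumptions.

```python
def pictogram_coordinate(region_number, depth_letter=None):
    """Combine a numbered radial region with an optional depth letter
    into a number-letter coordinate, e.g. region 3 at depth 'a' -> '3A'.

    When no depth is selected, only the numerical region identification
    is returned.
    """
    if depth_letter is None:
        return str(region_number)
    return f"{region_number}{depth_letter.upper()}"
```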
  • the ultrasound workflow tool determines whether the scan is complete. For example, a scan could be complete after the entire breast is scanned. In another example, the ultrasound operator may make a selection to end the scan. In still another example, the scan may be complete when the last lesion (or cyst) of a region of interest is characterized and, if not, may return to operation 214 to classify the next lesion (or cyst). If the scan is not complete (NO at operation 216 ), the method 200 proceeds to operation 206 in which the ultrasound workflow tool receives one or more signals from the ultrasound device during a continuation of the diagnostic scan of the patient's breast.
  • the method 200 proceeds to operation 220 in which a review screen is displayed at the ultrasound workflow tool.
  • the review screen enables the ultrasound operator to further characterize each classified region of interest identified during the diagnostic scan.
  • the review screen comprises a field for receiving a description corresponding to one or more parameters associated with the region of interest.
  • the review screen may therefore prompt the ultrasound operator to provide further information regarding the following parameters for each classified region of interest: a readability finding, the parenchymal pattern (e.g., homogeneous dense, scattered density, involuted, etc.), whether it is a focal lesion, the size, margin, shape, echo pattern, orientation, posterior features, calcifications, and associated features.
  • although a list of features is described, it is understood that this is an exemplary list of features and that more or fewer features may be used to describe the classified region of interest.
  • the ultrasound workflow tool receives the characterizations of each classified region of interest.
  • the features listed above are displayed as check boxes or as drop-down menu options that allow the ultrasound operator to quickly and efficiently characterize each classified region of interest identified during the scan.
  • text box inputs may be provided that allow the operator to provide further description such as, for example, indications, mammography correlations, clinical correlations, impressions, and recommendations.
  • the ultrasound workflow tool automatically processes each classified region of interest and automatically populates the findings. Accordingly, in such an example, the ultrasound operator need only confirm, deny, or edit such findings.
  • the ultrasound workflow tool generates a report.
  • the ultrasound workflow tool collects all the images captured during the scan, the classifications, the identified location information, the characterizations, and any comments into a report that is attached to the patient's file.
  • the report can further include the pictogram showing a pictorial location of each detected lesion or cyst.
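  • The report assembly described above might be sketched as a simple aggregation step. The field names here are illustrative assumptions, not the disclosed report schema.

```python
def build_report(patient_id, images, findings, pictogram):
    """Collect the scan's images, classifications/characterizations,
    and pictogram into one report record for the patient's file."""
    return {
        "patient": patient_id,
        "images": list(images),
        "findings": list(findings),
        "pictogram": pictogram,
    }
```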
  • FIG. 2B depicts an example user interface 250 of the ultrasound workflow tool displaying an image of a breast as scanned by the wireless ultrasound probe.
  • the ultrasound workflow tool can be used to perform a diagnostic exam during which the ultrasound operator can view an image of the breast during a breast scan.
  • FIG. 2C illustrates an example user interface 252 of the ultrasound workflow tool displaying a lesion as detected on the ultrasound image during a diagnostic exam.
  • the ultrasound workflow tool displays a classification user interface 254 , which enables the ultrasound operator to quickly classify the region of interest as a lesion or a cyst, and to mark the boundaries of the identified region (e.g., height, width, length).
  • FIG. 2D illustrates an example user interface 256 of the ultrasound workflow tool displaying a lesion as marked by the ultrasound operator.
  • the ultrasound workflow tool enables the ultrasound operator to quickly and accurately mark the boundaries of the classified region of interest (e.g., lesion or cyst).
  • the marking further corresponds to a size of the classified region of interest, as shown on the user interface 256 .
  • This user interface 256 illustrates an example in which the ultrasound operator marked both the height and length of the classified region of interest.
  • FIG. 2E depicts an example user interface 258 of the ultrasound workflow tool displaying a region of interest as marked by the ultrasound operator.
  • This user interface illustrates an example in which the ultrasound operator marked only the height of the region of interest. This marking corresponds to a height size of the region of interest, as shown on the user interface 258 .
  • FIG. 2F illustrates an example annotation user interface 260 of the ultrasound workflow tool.
  • the annotation user interface 260 allows the ultrasound operator to indicate, for example, the breast and a region in which the classified region of interest was found.
  • the ultrasound workflow tool may allow the ultrasound operator to indicate a region in which the classified region of interest is located for example, by indicating an orientation from the nipple (e.g., 12:00, 1:00, 2:00 . . . ) and a depth of the classified region of interest. These may be translated into letter-number coordinates specifically pinpointing the classified region of interest.
  • FIG. 2G depicts an example extended view 262 of the annotation user interface of the ultrasound workflow tool.
  • This extended view 262 shows the various options the ultrasound operator can use to select the type, orientation, and region of the classified region of interest.
  • FIG. 2H illustrates an example second annotation user interface 264 of the ultrasound workflow tool.
  • This second annotation user interface 264 enables the ultrasound operator to further indicate the type of diagnostic exam the operator is performing.
  • FIG. 2I illustrates an example review screen 266 of the ultrasound workflow tool.
  • the review screen 266 is displayed as part of the application operating on the wireless computing device after completion of the diagnostic exam and provides a summary of the findings during the diagnostic exam.
  • the review screen provides a summary of each classified region of interest identified, the location of each classified region of interest, the size of each classified region of interest, and the orientation of each classified region of interest.
  • the review screen 266 may further display calculations automatically performed based on the operator's performance of the diagnostic exam, including for example, classification of regions of interest, volume, depth-to-width ratio, among others.
  • FIG. 2J illustrates an example user interface of the ultrasound workflow tool displaying another review screen 268 in which impressions can be captured. Information from other prior diagnostic exams and/or reports can be auto populated into these fields or entered from medical records associated with the patient.
  • FIG. 2K illustrates an example characterization review screen 270 of the ultrasound workflow tool.
  • the ultrasound operator can characterize each classified region of interest at the characterization review screen 270 .
  • the ultrasound operator can indicate the margins of the classified region of interest, shape, echo pattern, and orientation. Although a specific set of characteristics is shown, it is understood that more or fewer characteristics may be displayed as contemplated by the disclosed ultrasound workflow tool.
  • FIG. 2L depicts an example extended view of the characterization review screen 272 displayed on the ultrasound workflow tool.
  • the ultrasound operator can indicate any posterior features, calcifications, and associated features.
  • FIG. 3A depicts an example method 300 for facilitating an interventional procedure using the disclosed ultrasound workflow tool.
  • the method 300 may be performed by the ultrasound workflow tool executing on a wireless device having a processor, such as for example the system 100 and the operating environment 150 .
  • the method 300 begins at operation 302 in which the ultrasound workflow tool displays a prompt to select a workflow from among a plurality of workflows.
  • the ultrasound workflow tool can assist an ultrasound operator in various operations such as a diagnostic breast exam, an interventional procedure, a localization procedure, and/or excision surgery.
  • a workflow is a sequence of predetermined steps that are performed during a particular procedure.
  • the prompt displayed is a drop down menu or some other suitable menu used to select one of a plurality of workflows associated with a procedure.
  • the ultrasound workflow tool receives a selection of an interventional workflow procedure, such as a biopsy exam.
  • a breast biopsy is a procedure performed to excise a sample of potentially cancerous tissue from a detected lesion or cyst to determine whether that tissue is malignant or benign. Accordingly, a biopsy may be performed sometime after a diagnostic breast exam, for example. Although this example may refer to the potentially cancerous tissue as a lesion, it is understood that this method 300 may also be used during a biopsy of a cyst.
  • the ultrasound workflow tool displays a prompt asking the surgeon to select the needle type and needle gauge used during the biopsy procedure.
  • potentially cancerous tissue is extracted from a targeted location of a patient using a needle. Once the needle is inserted into the patient, the location of the needle can no longer be visually identified. Accordingly, during the biopsy, the surgeon uses an ultrasound device to guide the needle to the lesion, while also capturing images during the entirety of the procedure. Understanding the needle type and gauge is important because different needles have different sizes, and some are spring-loaded or have other “firing” mechanisms that cause a portion of the biopsy needle to extend to capture a sample of the lesion tissue.
  • the ultrasound workflow tool may display a prompt asking the surgeon or other person to select the type and gauge of the needle from among a plurality of optional needle types and sizes.
  • the prompt provides a drop-down menu, a selection box, or other suitable selection feature to select the appropriate needle and type.
  • the ultrasound workflow tool receives a selection of the needle type and gauge.
  • a selection is received in response to the surgeon's selection at the prompt displayed at operation 306 .
  • the ultrasound workflow tool may populate needle and gauge information. This information may be used during the biopsy operation (e.g., in order to detect various steps within the procedure).
  • operation 308 may be optional and the method may advance from operation 306 to operation 310 .
  • the ultrasound workflow tool receives one or more signals from the ultrasound device.
  • the medical professional starts scanning the patient's breast using the ultrasound device.
  • the ultrasound workflow tool receives one or more signals, which may include one or more images of the patient's breast as the ultrasound device is used to scan the patient's breast during the interventional procedure.
  • the ultrasound workflow tool determines if a lesion is identified.
  • a lesion is identified by the ultrasound workflow tool based on one or more images received from the ultrasound device during a scan of the patient's breast.
  • a lesion may look like a darker or lighter portion on the ultrasound image.
  • the ultrasound workflow tool may identify the lesion based on a visual identification of the scan displayed on the display of the wireless computing device.
  • the ultrasound workflow tool determines that a lesion is identified using the ultrasound probe based on the surgeon's indication that the lesion is found by scanning the patient's breast using the ultrasound device.
  • the ultrasound workflow tool may use image analysis or machine-learning techniques to automatically identify the lesion as the surgeon scans the patient's breast and uses that understanding to automatically capture the image.
  • the information from the diagnostic scan such as the scan performed in steps described in reference to FIG. 2A , may be used to suggest an area where the lesion is located. This determination can be confirmed by the surgeon performing the interventional procedure.
  • the surgeon may select an option at the ultrasound workflow tool when a lesion is identified. One or more images may be captured of the identified lesion. If a lesion is not identified (NO at operation 312 ), the method 300 returns to operation 310 and the surgeon continues to scan the patient's breast. If a lesion is identified (YES at operation 312 ), the method proceeds to operation 314 .
  • the ultrasound workflow tool evaluates the lesion to determine whether the lesion was previously characterized.
  • the interventional procedure may be performed sometime after one or more potentially cancerous lesions are identified in a diagnostic exam.
  • the ultrasound workflow tool determines whether the lesion identified in the scan at operation 312 is a lesion that was previously characterized during a diagnostic exam.
  • the lesion was not previously characterized, in which case the lesion may be a newly developed lesion or a lesion not identified during a diagnostic scan.
  • the lesion identified in operation 312 may have been previously identified and characterized in a diagnostic exam.
  • the ultrasound workflow tool evaluates the size, location, shape, etc. of the identified lesion to make this determination.
  • the ultrasound workflow tool automatically annotates one or more images captured of the characterized lesion.
  • the annotation may include a description of the image (e.g., initial image of the lesion), a description or identification of the particular lesion (e.g., as being a previously characterized lesion or a new lesion), a timestamp indicating the time at which the image was captured, the name of the surgeon performing the procedure, the patient's name, etc.
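  • The automatic annotation described above can be sketched as attaching a structured record to each captured image. The field names, and the use of a UTC timestamp, are illustrative assumptions.

```python
from datetime import datetime, timezone

def annotate_image(image_id, description, lesion_id, surgeon, patient,
                   needle_type=None, needle_gauge=None):
    """Build an annotation record for a captured ultrasound image,
    including optional needle details for pre-fire/post-fire images."""
    annotation = {
        "image": image_id,
        "description": description,     # e.g. "initial image of the lesion"
        "lesion": lesion_id,            # previously characterized or new
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "surgeon": surgeon,
        "patient": patient,
    }
    if needle_type is not None:
        annotation["needle"] = {"type": needle_type, "gauge": needle_gauge}
    return annotation
```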
  • the ultrasound workflow tool captures an image of the needle before the needle is fired into the potentially cancerous lesion.
  • the ultrasound workflow tool may receive an indication that the needle is positioned at the lesion.
  • the ultrasound workflow tool makes this determination based on the surgeon's indication, via the ultrasound device, a foot pedal, or another device operatively connected to one or more of the ultrasound device and the ultrasound workflow tool, that the needle has advanced to the lesion site.
  • the ultrasound workflow tool may receive an indication based on a signal received from the RFID transmitter in the needle.
  • the ultrasound workflow tool may use image analysis or machine-learning techniques to automatically identify the location of the needle as the surgeon scans the patient's breast and advances the needle to the appropriate position and use that understanding to automatically capture the image.
  • some biopsy needles are spring-loaded or have other “firing” mechanisms that cause a portion of the biopsy needle to extend to capture a sample of the lesion tissue. Accordingly, at operation 318 , the ultrasound workflow tool captures an image of the needle before the needle is fired into the potentially cancerous lesion.
  • the ultrasound workflow tool automatically annotates the one or more pre-fire images captured.
  • the annotation may include a description of the image (e.g., pre-fire image of the needle positioned at or near the lesion site), a description or identification of the particular lesion (e.g., as being a previously characterized lesion or a new lesion), the needle type and gauge, orientation of the needle aperture, a timestamp indicating the time at which the image was captured, the name of the surgeon performing the procedure, the patient's name, etc.
  • the ultrasound workflow tool determines that the needle fired into the lesion. In one example, the ultrasound workflow tool makes this determination based on the surgeon's indication, using the ultrasound device or other device operatively connected to the ultrasound device or workflow tool, that the needle has been fired into the lesion site. Alternatively or additionally, this determination is made based on a signal received from the RFID transmitter in the needle. Alternatively or additionally, the ultrasound workflow tool may use image analysis or machine-learning techniques to automatically identify when the needle has fired into the lesion. As described herein, some biopsy needles are spring-loaded or have other “firing” mechanisms that cause a portion of the biopsy needle to extend to capture a sample of the lesion tissue. Accordingly, in operation 322 , the ultrasound workflow tool may receive an indication that the needle is in a post-fire position.
  • the ultrasound workflow tool captures a post-fire image of the needle at the lesion site.
  • the post-fire image may include an image of a portion of the needle fired into the lesion.
  • the ultrasound workflow tool determines that the needle is in a post-fire position and uses that understanding to automatically capture the image. Accordingly, at operation 324, one or more post-fire images are captured.
  • the ultrasound workflow tool annotates the one or more post-fire images captured.
  • the annotation may include a description of the image (e.g., post-fire image of the needle fired into the lesion site), a description or identification of the particular lesion (e.g., as being a previously characterized lesion or a new lesion), the needle type and gauge, orientation of the needle aperture, a timestamp indicating the time at which the image was captured, the name of the surgeon performing the procedure, the patient's name, etc.
  • the ultrasound workflow tool determines that the ultrasound probe has rotated.
  • the detection of a rotation of the ultrasound probe is important in order to capture a cross-sectional view of the particular lesion.
  • the ultrasound workflow tool makes this determination based on the surgeon's indication, using the ultrasound device, that the ultrasound probe has been rotated such that a cross-sectional view of the lesion is now in focus.
  • the ultrasound workflow tool may use image analysis to automatically identify when the ultrasound probe has rotated and focused on a cross-sectional view of the lesion.
  • the ultrasound workflow tool may make this determination based on a signal received from the ultrasound probe.
  • the ultrasound workflow tool captures a cross-sectional image of the lesion.
  • the cross-sectional image may include an image of the needle fired into the lesion.
  • the cross-sectional image may include an image of the needle retracted from the lesion site.
  • the ultrasound workflow tool determines that the probe has rotated based on the surgeon's indication or based on a signal received from the ultrasound probe, such as an indication from an internal accelerometer within the probe.
  • the ultrasound workflow tool may use image analysis or machine-learning techniques to automatically identify the rotation of the probe and use that understanding to automatically capture the image. Accordingly, at operation 330, one or more cross-sectional images of the lesion are captured.
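Under the accelerometer-signal variant described above, the rotation check could reduce to comparing probe headings against a threshold. This is a hypothetical illustration only — the function name, the 75° threshold, and the single-axis heading model are all assumptions:

```python
ROTATION_THRESHOLD_DEG = 75.0  # assumed threshold for an approximately orthogonal (cross-sectional) view

def probe_rotated(initial_heading_deg, current_heading_deg, threshold=ROTATION_THRESHOLD_DEG):
    """Return True when the probe has rotated enough to present a cross-sectional view.

    Headings are assumed to come from an accelerometer/IMU inside the probe.
    """
    delta = abs(current_heading_deg - initial_heading_deg) % 360.0
    delta = min(delta, 360.0 - delta)  # shortest angular distance, handles wrap-around
    return delta >= threshold

# e.g., a rotation from 10 degrees to 100 degrees (~90 degrees) would trigger capture
```

When this predicate becomes true, the tool could automatically capture the cross-sectional image, mirroring operation 330.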
  • the ultrasound workflow tool annotates the one or more cross-sectional images captured.
  • the annotation may include a description of the image (e.g., cross-sectional image of the lesion), a description or identification of the particular lesion (e.g., as being a previously characterized lesion or a new lesion), a timestamp indicating the time at which the image was captured, the name of the surgeon performing the procedure, the patient's name, etc.
  • the ultrasound workflow tool generates a report.
  • the ultrasound workflow tool collects the needle information provided before the interventional procedure, the images captured during the procedure, and the associated annotations. All such information is compiled into a report that is attached to the patient's file. Accordingly, the method 300 provides a quick, accurate, and efficient solution for annotating images captured during an interventional procedure and compiling them into a workflow report, eliminating or minimizing the time it would otherwise take the surgeon to prepare the report hours after the procedure.
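The report-compilation step can be sketched as gathering the three sources named above — needle information, captured images, and their annotations — into one structure. The field names and `compile_procedure_report` helper are illustrative assumptions, not the application's actual format:

```python
def compile_procedure_report(patient, needle_info, captured_images):
    """Assemble pre-procedure needle information and the captured,
    annotated images into a single report structure for the patient's file."""
    return {
        "patient": patient,
        "needle": needle_info,
        "images": captured_images,  # each entry: {"stage": ..., "annotation": ...}
        "stages_documented": [img["stage"] for img in captured_images],
    }

report = compile_procedure_report(
    patient="Patient A",
    needle_info={"type": "spring-loaded core", "gauge": 14},
    captured_images=[
        {"stage": "pre-fire", "annotation": "needle positioned at lesion site"},
        {"stage": "post-fire", "annotation": "needle fired into lesion"},
        {"stage": "cross-section", "annotation": "cross-sectional view of lesion"},
    ],
)
```

Because every image was annotated at capture time, the compiled report requires no after-the-fact dictation by the surgeon.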
  • FIG. 3B illustrates an example user interface 350 of the ultrasound workflow tool displaying an image of a breast as scanned by the wireless ultrasound probe.
  • the ultrasound workflow tool can be used to perform an interventional procedure during which an ultrasound operator (e.g., a surgeon) can view an image of the breast during an interventional procedure.
  • FIG. 3C illustrates an example user interface of the ultrasound workflow tool displaying a needle selection screen 352 .
  • the needle selection screen 352 enables the ultrasound operator to select the particular needle that will be used during the interventional procedure.
  • FIG. 3D illustrates an example user interface of the ultrasound workflow tool displaying a patient demographics screen 354 .
  • the patient demographics screen 354 enables the ultrasound operator to provide patient information in advance of the interventional procedure.
  • FIG. 3E illustrates an example user interface of the ultrasound workflow tool displaying an indications user interface 356 .
  • the indications user interface 356 enables the ultrasound operator to provide any notes in advance of the interventional procedure.
  • FIG. 3F illustrates an example user interface 358 of the ultrasound workflow tool displaying an initial image of the breast.
  • the initial image may be captured automatically by the ultrasound workflow tool in response to automatically detecting the lesion.
  • the ultrasound operator will capture images as the needle is guided to the lesion, before the needle is fired, and after the needle is fired into the lesion site, and will also obtain a cross-sectional image of the lesion.
  • these multiple images are captured during the procedure to serve as evidentiary proof of the complete performance of the procedure and to support later analysis of the lesion tissue. Accordingly, an initial image of the breast may be captured and displayed.
  • FIG. 3G illustrates an example user interface 360 of the ultrasound workflow tool displaying an image of the breast as the needle is guided to the lesion site. As described, this image may be automatically captured in response to detection, by the ultrasound workflow tool, of the needle and the lesion.
  • FIG. 3H illustrates an example user interface of the ultrasound workflow tool displaying a review screen 362 in which impressions can be captured. This review screen 362 may be displayed after completion of the interventional procedure.
  • FIG. 3I illustrates an example user interface of the ultrasound workflow tool displaying an end screen 364 .
  • the end screen may be displayed after the interventional procedure is completed and after all impressions or notes are provided by the ultrasound operator.
  • This end screen 364 indicates the exam is completed.
  • This end screen 364 may further provide a summary of the images taken.
  • FIG. 4A depicts an example method 400 for facilitating performing a localization pre-surgical procedure using the disclosed ultrasound workflow tool.
  • the method 400 may be performed by the ultrasound workflow tool executing on a wireless device having a processor, such as for example the system 100 and the operating environment 150 .
  • the ultrasound workflow tool facilitates operating the ultrasound device by providing step-by-step guidance for placing the localization device at or near the cancerous tissue.
  • a localization device such as, for example, a wire, is placed at the lesion site.
  • the localization device is used by a medical professional (generally a different medical professional) during the surgical procedure to guide an excision instrument to the cancerous tissue in order to excise cancerous tissue and a sufficient margin around the cancerous tissue.
  • the medical professional locates the cancerous tissue by placing the ultrasound probe on the patient's breast to visually identify a previously implanted marker (e.g., implanted during a biopsy procedure) at the lesion site. Either a wire or wireless localization device is then placed at the site of the lesion and the marker. In some examples, however, a marker is not used.
  • the method 400 begins at operation 402 in which the ultrasound workflow tool displays a prompt to select a workflow from among a plurality of workflows.
  • the ultrasound workflow tool can facilitate various operations such as a diagnostic breast exam, an interventional procedure, and a localization procedure.
  • a workflow is a sequence of predetermined steps that are performed during a particular procedure.
  • the prompt displayed is a drop-down menu or some other suitable menu used to select one of a plurality of workflows associated with a procedure.
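Since a workflow is defined above as a sequence of predetermined steps, the selection prompt can be pictured as a menu keyed by procedure type. The step names and the `select_workflow` helper below are hypothetical, chosen only to mirror the three procedures the tool is said to support:

```python
# Hypothetical mapping of selectable workflows to their predetermined steps.
WORKFLOWS = {
    "diagnostic": ["scan breast", "identify region of interest", "annotate", "report"],
    "interventional": ["enter needle info", "guide needle", "pre-fire image",
                       "post-fire image", "cross-sectional image", "report"],
    "localization": ["scan breast", "identify cancerous tissue",
                     "track localization device", "guide placement", "report"],
}

def select_workflow(name):
    """Return the step sequence for the chosen workflow, mimicking the
    drop-down prompt described above."""
    if name not in WORKFLOWS:
        raise ValueError(f"unknown workflow: {name}")
    return list(WORKFLOWS[name])
```

Selecting the localization workflow would then drive the tool through the operations that follow in method 400.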
  • the ultrasound workflow tool receives a selection of a localization workflow for a localization procedure.
  • a localization device is placed at the lesion site, and is later used by the medical professional during a surgical procedure to guide an excision instrument to the cancerous tissue in order to excise cancerous tissue and a sufficient margin around the cancerous tissue.
  • a localization procedure may be performed sometime after tissue excised from a biopsy is found to be malignant and before a surgical procedure to fully remove the cancerous tissue.
  • the ultrasound workflow tool receives one or more signals from the ultrasound device.
  • the anatomical feature may be a patient's breast.
  • the one or more signals may include one or more images of the patient's breast as the ultrasound device is used to scan the patient's breast during the localization procedure.
  • the ultrasound workflow tool causes display of an image of the scan. In one example, this may include a zoomed-in image of the scan.
  • the ultrasound workflow tool identifies the cancerous tissue.
  • the ultrasound workflow tool identifies cancerous tissue based on receiving a signal from the ultrasound probe in response to the medical professional's indication that the cancerous tissue is found using the ultrasound device.
  • the ultrasound workflow tool may use image analysis or machine-learning techniques to automatically identify the cancerous tissue as the medical professional scans the patient's breast.
  • the information from the diagnostic scan, such as the scan performed in the steps described with reference to FIG. 2A, may be used to suggest an area where the cancerous tissue is located. This determination can be confirmed by the medical professional performing the localization procedure.
  • the identified cancerous tissue may be displayed on a display of the wireless computing device on which the workflow tool application operates.
  • the ultrasound workflow tool evaluates the cancerous tissue to determine whether the identified cancerous tissue was previously characterized.
  • the localization procedure is performed sometime after performing an interventional procedure (e.g., a breast biopsy) to determine that a lesion is cancerous.
  • the ultrasound workflow tool determines whether the cancerous tissue identified in the scan at operation 410 is the same lesion that was previously characterized as being cancerous after an interventional procedure.
  • the cancerous tissue was not previously characterized, in which case the cancerous tissue may be newly developed or tissue that was not identified during a diagnostic scan or an interventional procedure.
  • the cancerous tissue evaluated in operation 412 may have been previously identified and characterized in a diagnostic exam and tested after an interventional procedure.
  • the ultrasound workflow tool evaluates the size, location, shape, etc., of the identified cancerous tissue to make this determination.
  • the ultrasound workflow tool detects a first position of the localization device.
  • the localization device includes a transceiver having an RFID chip for providing orientation and location information. Accordingly, at operation 414, the ultrasound workflow tool detects the first position of the localization device based on one or more signals received from the RFID chip. In some examples, this first position is located within a region associated with the scanned anatomical feature.
  • the ultrasound workflow tool determines a first distance between the cancerous tissue and the localization device.
  • the site of the cancerous tissue may include a previously implanted marker. Accordingly, in some examples, the ultrasound workflow tool determines a distance from the localization device to the cancerous tissue (or to a marker implanted at the cancerous tissue). The ultrasound workflow tool can, in some examples, determine a distance between the cancerous tissue and the localization device based on knowledge of the respective locations.
  • the ultrasound workflow tool provides the calculated first distance between the localization device and the cancerous tissue (or marker implanted at the cancerous tissue). This calculated distance can be displayed on a screen of the ultrasound workflow tool operating on a wireless computing device. In aspects, the display may further indicate orientation information of the marker, the localization device, or both. Alternatively or additionally, the ultrasound workflow tool provides an audio signal specifying the calculated distance between the localization device and the cancerous tissue (or the implanted marker).
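Once the device position (from the RFID transceiver) and the lesion or marker position (from the ultrasound image) are known in a common coordinate frame, the first-distance computation reduces to a Euclidean distance. A minimal sketch, assuming 3-D positions in millimeters — the function name and coordinate convention are illustrative:

```python
import math

def device_to_lesion_distance(device_pos, lesion_pos):
    """Straight-line distance (assumed millimeters) between the localization
    device position reported by its RFID transceiver and the lesion or
    implanted marker located in the ultrasound image."""
    return math.dist(device_pos, lesion_pos)

# e.g., a device at (10, 20, 5) mm and a marker at (13, 24, 5) mm are 5 mm apart
```

The resulting value is what the tool would display on screen or speak as an audio signal.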
  • the ultrasound workflow tool determines if the localization device has moved. If, at operation 420, the ultrasound workflow tool determines that the localization device has not moved (NO at operation 420), the method 400 flows to operation 418. If the ultrasound workflow tool determines that the localization device has moved (YES at operation 420), the method 400 proceeds to operation 422, where the ultrasound workflow tool recalculates the distance between the localization device and the cancerous tissue (or the implanted marker).
  • the ultrasound workflow tool provides the recalculated distance computed in operation 422 .
  • This recalculated distance can be displayed on a screen of the ultrasound workflow tool operating on the wireless computing device.
  • the display may further indicate orientation information of the marker, the localization device, or both.
  • the ultrasound workflow tool provides voice guidance specifying the location of the cancerous tissue with reference to the location of the localization device.
  • this voice guidance specifies the location of the cancerous tissue with reference to the localization device and the direction to which the medical professional should guide the localization device in order to ultimately place the localization device at or near the cancerous tissue. Accordingly, in some examples, the medical professional need only listen to voice guidance to determine how to guide the localization device to the cancerous tissue.
  • the ultrasound workflow tool determines whether the localization device is located at or near the cancerous tissue. As described, the ultrasound workflow tool determines the relative distance between the localization device and the cancerous tissue, and can therefore calculate when the localization device is located at the cancerous tissue. Alternatively or additionally, the ultrasound workflow tool provides an audio signal specifying that the localization device is now located at the cancerous tissue. If the ultrasound workflow tool determines that the localization device is not located at or near the cancerous tissue (NO at operation 426), the method 400 flows to operation 420 to determine whether the localization device has moved. If the ultrasound workflow tool determines that the localization device is located at or near the cancerous tissue (YES at operation 426), the method 400 flows to operation 428.
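Operations 420, 422, 424, and 426 together form a guidance loop: skip recalculation while the device is stationary; otherwise recompute the distance, announce it, and test for arrival. A minimal sketch of that loop, with an assumed 2 mm arrival threshold and illustrative names (`positions` stands in for the stream of tracked device positions, `announce` for the tool's audio/visual output):

```python
import math

def guide_localization_device(positions, lesion_pos, announce, arrival_mm=2.0):
    """Recalculation loop for guiding the localization device to the lesion.

    Each time the device moves, recompute and announce the distance to the
    cancerous tissue (or implanted marker); stop once the device is at or
    near the target. The 2 mm threshold is an assumption for illustration.
    """
    last = None
    for pos in positions:
        if pos == last:                      # NO at operation 420: device has not moved
            continue
        last = pos                           # YES at operation 420: recalculate (422)
        d = math.dist(pos, lesion_pos)
        announce(f"{d:.1f} mm from target")  # provide recalculated distance (424)
        if d <= arrival_mm:                  # YES at operation 426: at or near target
            announce("localization device at target site")
            return True
    return False
```

In practice the announcements would be rendered as on-screen distances or the voice guidance described above, leaving the medical professional's hands free for the probe and the device.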
  • the ultrasound workflow tool provides guidance to implant the localization device at the cancerous tissue site.
  • this guidance is a step-by-step instruction displayed on the display of the ultrasound workflow tool or some other display operatively connected to the ultrasound workflow tool.
  • step-by-step audio instructions to implant the localization device are provided in operation 428 .
  • the medical professional does not need to rely solely on a rough visual interpretation of the location of the cancerous tissue and the boundaries thereof when placing the localization device at the site of the cancerous tissue. Furthermore, because the medical professional must hold the ultrasound device in one hand while placing a wire or wireless localization device with the other hand, the medical professional can use the hands-free workflow tool to facilitate locating and guiding the localization device to the cancerous tissue. With the workflow tool, significant time is saved and accuracy of determining placement is improved.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Vascular Medicine (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oncology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
US16/979,964 2018-05-07 2019-05-07 Breast ultrasound workflow application Pending US20210015447A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/979,964 US20210015447A1 (en) 2018-05-07 2019-05-07 Breast ultrasound workflow application

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862668078P 2018-05-07 2018-05-07
PCT/US2019/031094 WO2019217405A1 (en) 2018-05-07 2019-05-07 Breast ultrasound workflow application
US16/979,964 US20210015447A1 (en) 2018-05-07 2019-05-07 Breast ultrasound workflow application

Publications (1)

Publication Number Publication Date
US20210015447A1 true US20210015447A1 (en) 2021-01-21

Family

ID=66821368

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/979,964 Pending US20210015447A1 (en) 2018-05-07 2019-05-07 Breast ultrasound workflow application

Country Status (7)

Country Link
US (1) US20210015447A1 (ja)
EP (1) EP3790468A1 (ja)
JP (2) JP7463287B2 (ja)
KR (1) KR20210006360A (ja)
CN (1) CN112020332A (ja)
AU (1) AU2019265531A1 (ja)
WO (1) WO2019217405A1 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210251602A1 (en) * 2018-08-22 2021-08-19 Koninklijke Philips N.V. System, device and method for constraining sensor tracking estimates in interventional acoustic imaging
US11379990B2 (en) * 2018-02-21 2022-07-05 Covidien Lp Locating tumors using structured light scanning
EP4335380A1 (en) * 2022-09-12 2024-03-13 FUJI-FILM Corporation Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4279142A1 (en) 2021-01-15 2023-11-22 Reel Tech Co., Ltd. Portable emergency escape apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140180106A1 (en) * 2011-08-31 2014-06-26 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus, ultrasonic diagnostic apparatus control method, and medical image diagnostic apparatus
US20170235903A1 (en) * 2014-05-30 2017-08-17 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Systems and methods for contextual imaging workflow
US20180342060A1 (en) * 2017-05-25 2018-11-29 Enlitic, Inc. Medical scan image analysis system
US20210004960A1 (en) * 2018-03-07 2021-01-07 Koninklijke Philips N.V. Display of medical image data
US20210093301A1 (en) * 2018-04-09 2021-04-01 Koninklijke Philips N.V. Ultrasound system with artificial neural network for retrieval of imaging parameter settings for recurring patient

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3392462B2 (ja) * 1993-05-28 2003-03-31 株式会社東芝 医用診断支援システム
US7736313B2 (en) 2004-11-22 2010-06-15 Carestream Health, Inc. Detecting and classifying lesions in ultrasound images
WO2006128302A1 (en) * 2005-06-02 2006-12-07 The Medipattern Corporation System and method of computer-aided detection
US20090309874A1 (en) * 2008-06-11 2009-12-17 Siemens Medical Solutions Usa, Inc. Method for Display of Pre-Rendered Computer Aided Diagnosis Results
EP2353070A1 (en) 2008-11-06 2011-08-10 Koninklijke Philips Electronics N.V. Breast ultrasound annotation user interface
US8139832B2 (en) * 2008-12-12 2012-03-20 Hologic, Inc. Processing medical images of the breast to detect anatomical abnormalities therein
JP6385697B2 (ja) 2014-03-26 2018-09-05 キヤノンメディカルシステムズ株式会社 医用画像診断装置及び医用画像診断装置における穿刺針の管理装置
EP3320851A4 (en) 2015-07-09 2019-10-23 Olympus Corporation ULTRASONIC OBSERVATION DEVICE, ULTRASONIC OBSERVATION SYSTEM, OPERATING METHOD FOR ULTRASONIC OBSERVATION DEVICE AND OPERATING PROGRAM FOR ULTRASONIC OBSERVATION
WO2017060791A1 (en) 2015-10-08 2017-04-13 Koninklijke Philips N.V. Apparatuses, methods, and systems for annotation of medical images
US10499882B2 (en) 2016-07-01 2019-12-10 yoR Labs, Inc. Methods and systems for ultrasound imaging
US10346982B2 (en) * 2016-08-22 2019-07-09 Koios Medical, Inc. Method and system of computer-aided detection using multiple images from different views of a region of interest to improve detection accuracy

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140180106A1 (en) * 2011-08-31 2014-06-26 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus, ultrasonic diagnostic apparatus control method, and medical image diagnostic apparatus
US20170235903A1 (en) * 2014-05-30 2017-08-17 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Systems and methods for contextual imaging workflow
US20180342060A1 (en) * 2017-05-25 2018-11-29 Enlitic, Inc. Medical scan image analysis system
US20210004960A1 (en) * 2018-03-07 2021-01-07 Koninklijke Philips N.V. Display of medical image data
US20210093301A1 (en) * 2018-04-09 2021-04-01 Koninklijke Philips N.V. Ultrasound system with artificial neural network for retrieval of imaging parameter settings for recurring patient

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11379990B2 (en) * 2018-02-21 2022-07-05 Covidien Lp Locating tumors using structured light scanning
US11636606B2 (en) 2018-02-21 2023-04-25 Covidien Lp Locating tumors using structured light scanning
US20210251602A1 (en) * 2018-08-22 2021-08-19 Koninklijke Philips N.V. System, device and method for constraining sensor tracking estimates in interventional acoustic imaging
EP4335380A1 (en) * 2022-09-12 2024-03-13 FUJI-FILM Corporation Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus

Also Published As

Publication number Publication date
KR20210006360A (ko) 2021-01-18
JP2021522875A (ja) 2021-09-02
CN112020332A (zh) 2020-12-01
EP3790468A1 (en) 2021-03-17
WO2019217405A1 (en) 2019-11-14
JP7463287B2 (ja) 2024-04-08
JP2023076710A (ja) 2023-06-01
AU2019265531A1 (en) 2020-10-08

Similar Documents

Publication Publication Date Title
US20210015447A1 (en) Breast ultrasound workflow application
US20230103969A1 (en) Systems and methods for correlating regions of interest in multiple imaging modalities
WO2019100212A1 (zh) 用于规划消融的超声系统及方法
US20110208052A1 (en) Breast ultrasound annotation user interface
US10813625B2 (en) Ultrasound image diagnostic apparatus
EP1787594A2 (en) System and method for improved ablation of tumors
CN111465351A (zh) 具有先进的活检部位标记物的超声定位系统
CN109313698B (zh) 同步的表面和内部肿瘤检测
US9892557B2 (en) Integrated system for focused treatment and methods thereof
EP3795108A1 (en) Systems and methods for planning medical procedures
JP2008188163A (ja) 超音波診断装置、医用画像処理装置、及び医用画像処理プログラム
US20230098305A1 (en) Systems and methods to produce tissue imaging biomarkers
CN113662594B (zh) 乳腺穿刺定位/活检方法、装置、计算机设备和存储介质
CN108403145A (zh) 医用信息处理系统及医用图像处理装置
US12114933B2 (en) System and method for interventional procedure using medical images
JP2021146210A (ja) 生検針の再構成誤差を評価するための方法およびシステム
JP6258026B2 (ja) 超音波診断装置
US20230124481A1 (en) Systems and methods for identifying regions of interest in multiple imaging modalities
JP2011136044A (ja) 超音波診断装置
US20230263577A1 (en) Automatic ablation antenna segmentation from ct image
JP5380079B2 (ja) 超音波治療支援装置および超音波治療支援プログラム

Legal Events

Date Code Title Description
AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: SECURITY INTEREST;ASSIGNORS:HOLOGIC, INC.;FAXITRON BIOPTICS, LLC;FOCAL THERAPEUTICS, INC.;AND OTHERS;REEL/FRAME:054089/0804

Effective date: 20201013

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

AS Assignment

Owner name: HOLOGIC, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ST. PIERRE, SHAWN;REEL/FRAME:054992/0891

Effective date: 20210113

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS