JP5519937B2 - Anatomical labeling system and method on PACS - Google Patents

Anatomical labeling system and method on PACS

Info

Publication number
JP5519937B2
Authority
JP
Japan
Prior art keywords
image
images
anatomical
reference
acquired
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2008540244A
Other languages
Japanese (ja)
Other versions
JP2009515599A (en)
Inventor
ラウ,デニー・ウィングチャン
イェルリ,ヴィジェイカルヤン
オーウェン,フランク
フレドリック,ペリー・スコット
ボーリュー,クリストファー・フレドリック
バース,リチャード・アレン
ゴールド,ゲーリー・エヴァン
パイク,デイヴィッド・ソンウン
ラマン,レグハフ
サマラ,ヤシーン
Original Assignee
General Electric Company
Stanford University (The Board of Trustees of the Leland Stanford Junior University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US11/273,285
Priority to US11/273,285 (US7590440B2)
Application filed by General Electric Company and Stanford University (The Board of Trustees of the Leland Stanford Junior University)
Priority to PCT/US2006/043964 (WO2007059020A2)
Publication of JP2009515599A
Application granted
Publication of JP5519937B2
Application status is Active
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 19/00 Digital computing or data processing equipment or methods, specially adapted for specific applications
    • G06F 19/30 Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
    • G06F 19/32 Medical data management, e.g. systems or protocols for archival or communication of medical images, computerised patient records or computerised general medical references
    • G06F 19/321 Management of medical image data, e.g. communication or archiving systems such as picture archiving and communication systems [PACS] or related medical protocols such as digital imaging and communications in medicine protocol [DICOM]; Editing of medical image data, e.g. adding diagnosis information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/38 Registration of image sequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof

Description

  The present invention relates generally to anatomical labeling on a picture archiving and communication system (PACS). The present invention particularly relates to anatomical labeling on a PACS that allows image presentation and analysis specific to anatomical structures.

  A clinical or medical environment is a crowded, demanding setting that benefits from better organization and improved ease of use of the imaging systems, data storage systems, and other equipment used within it. Medical environments such as hospitals and clinics encompass many different professionals, patients, and pieces of equipment. Healthcare personnel must manage multiple patients, systems, and tasks while providing high-quality service to patients. Medical personnel may encounter many problems or obstacles in their workflow.

  Medical environments such as hospitals and clinics include hospital information systems (HIS), clinical information systems such as radiology information systems (RIS), and storage systems such as picture archiving and communication systems (PACS). Stored information may include, for example, patient medical histories, imaging data, test results, diagnostic information, management information, and/or scheduling information. Information may be stored centrally or divided among multiple locations. Medical personnel may wish to access patient information or other information at various points in the medical workflow. For example, during surgery, medical personnel may access patient information, such as images of a patient's anatomy, stored in a medical information system. Alternatively, during an ongoing medical procedure, medical personnel may enter new information, such as history, diagnostic, or treatment information, into the medical information system.

  A PACS may connect to medical diagnostic imaging devices and employ an acquisition gateway (between the acquisition device and the PACS), storage and archiving units, display workstations, databases, and sophisticated data processors. These components are integrated by a communication network and a data management system. In general, the goals of a PACS include streamlining healthcare operations, facilitating distributed remote examination and diagnosis, and improving patient care.

  A common application of a PACS system is to provide one or more medical images for examination by a medical professional. For example, a PACS system can provide a series of X-ray images to a display workstation, where the images are displayed for a diagnostic examination by a radiologist. Based on the presentation of these images, the radiologist can provide a diagnosis. For example, the radiologist can diagnose a tumor or lesion in X-ray images of a patient's lungs.

  PACS is complex to configure and operate. Furthermore, using PACS requires training and preparation that can vary from user to user. Accordingly, systems and methods that facilitate PACS operation are highly desirable. There is a need for systems and methods that improve the ease and automation of PACS.

  A computed tomography ("CT") examination may include images obtained from scanning a large section of a patient's body. For example, a chest/abdominal/pelvic CT examination includes one or more images of a plurality of different anatomical structures. However, each anatomical structure is better viewed under different window level settings. Thus, when interpreting a chest/abdominal/pelvic CT examination, a radiologist or other personnel switches between different window level settings, for example, to view images of the different anatomical structures. It would be beneficial to radiologists and other personnel if the window level setting(s) were adjusted automatically based on the image(s) and/or anatomical structure(s) being viewed.

  Currently, image review workstations are unable to correlate image content with anatomical structures to facilitate the presentation of relevant anatomical data. However, medical personnel, such as radiologists, may be interested in seeing information about a particular anatomy and/or other patient data when viewing and/or interpreting a patient's image(s). For example, when viewing an axial CT image that includes the liver, a radiologist may wish to learn about disease processes associated with the liver, or about clinical tests of the patient related to the liver. Therefore, an image review workstation that is able to recognize an anatomical structure of interest, such as the liver, retrieve information related to that anatomical structure, and present it to the user is highly desirable.

  During the examination interpretation process, a radiologist and/or other medical personnel may wish to record image findings as a mechanism for report construction. In the case of structured reporting, radiologists have found the data entry mechanism too cumbersome. That is, because there are so many possible findings related to an examination procedure, the findings must be organized in some hierarchical structure. The multiple hierarchical levels and selection options require the radiologist to perform extensive manual operations.

  For example, a chest/abdomen/pelvic CT examination may include images of the liver, pancreas, stomach, and so on. If the radiologist wishes to enter findings related to the liver, he or she must currently traverse a hierarchy of options presented in the GUI before the desired finding can be identified.

A radiologist may wish to see an image that is specific to a particular organ when viewing the patient's image during the examination interpretation process. For example, a patient with a history of colon cancer undergoes a CT examination that includes images of the stomach, small intestine, liver, pancreas, colon, and the like. The radiologist may want to see an image of the colon first. If the colon does not show significant abnormalities, the radiologist may suspect that the reported symptoms are related to liver disease and would like to see an image containing the liver. However, there are currently no methods on image review workstations that allow radiologists to view organ-specific images. The radiologist can only see the images in order.
U.S. Patent Application Publication No. 2003/0228042; U.S. Pat. No. 6,912,888; U.S. Pat. No. 6,910,278; U.S. Pat. No. 6,873,421; U.S. Pat. No. 6,841,780; U.S. Pat. No. 6,825,937; U.S. Pat. No. 6,788,210; U.S. Pat. No. 6,636,255; U.S. Pat. No. 6,603,103; U.S. Pat. No. 6,438,272; U.S. Pat. No. 6,252,623; U.S. Pat. No. 6,229,913; U.S. Pat. No. 6,084,712; U.S. Pat. No. 6,040,910; U.S. Pat. No. 5,835,218; U.S. Pat. No. 5,825,495; U.S. Pat. No. 5,636,025; U.S. Pat. No. 5,189,493; U.S. Pat. No. 5,069,548; U.S. Pat. No. 4,984,893; U.S. Pat. No. 4,983,404; U.S. Pat. No. 4,641,972

  Accordingly, there is a need for systems and methods for improved image presentation and analysis. What is needed is a system and method for anatomical labeling that facilitates anatomy-specific image presentation and analysis.

  Certain embodiments of the present invention provide systems and methods for image registration and display of related information. Certain embodiments provide an image registration system for correlating clinical information with at least one image. The system includes a reference image set that includes one or more reference images and an image registration module for registering one or more acquired images with the reference image set. At least one reference anatomical structure is identified within each image of the one or more reference images. Related anatomical information is correlated with the at least one reference anatomical structure within each image of the one or more reference images. The image registration module registers at least one acquired anatomical structure in the acquired image or images to the reference image set. The image registration module associates the related anatomical information with the acquired image or images based on the reference image set.

  In one embodiment, the system may include more than one reference image set based on patient characteristics. Patient characteristics include weight, height, gender, nationality, and the like. The combination of patient characteristics may represent different reference sets, for example. In one embodiment, the newly acquired image exam may be classified based on different characteristics associated with the reference image set.

  In one embodiment, the system further includes a display module that can display the acquired image or images and the associated anatomical information. The related information may be stored, for example, as metadata related to the acquired image or images. In one embodiment, the associated anatomical information includes clinical information associated with at least one acquired anatomical structure. In one embodiment, the associated anatomical information includes a window level setting for displaying the acquired image or images based on the anatomical structure.

  In one embodiment, the associated anatomical information allows the user to request an image based on a voice command. The voice command may relate to anatomical structures in one or more acquired images, for example. In one embodiment, the associated anatomical information can narrow the selection of structured report findings based on at least one anatomical structure.

  Certain embodiments provide a computer-readable storage medium that includes a set of instructions for execution on a processor, such as a computer or other processing device. The set of instructions includes an image registration routine configured to register an acquired image with respect to at least one reference image set, wherein the at least one reference image set includes associated anatomical information, and a display routine for displaying the acquired image based on the relevant anatomical information. In one embodiment, the associated anatomical information facilitates audio navigation of acquired images based on anatomical structure. In one embodiment, the relevant anatomical information includes, for example, patient clinical information, reference sources, disease processes, related images, and/or drug interactions.

  In one embodiment, the relevant anatomical information is displayed with the acquired image. In one embodiment, the associated anatomical information specifies display settings, such as window level settings, for displaying at least one of the acquired images. The associated anatomical information may include multiple display settings with associated priorities, and the acquired image is displayed according to the settings of the anatomical structure having the highest associated priority in the acquired image. In one embodiment, the relevant anatomical information can narrow the selection of findings to be entered into a structured report.

  Certain embodiments provide a method for correlating an anatomical structure in an acquired image with a reference image. The method includes identifying one or more anatomical portions in the acquired image, mapping the acquired image to a reference image based on the one or more anatomical portions, storing relevant anatomical information for the acquired image, and displaying the acquired image based on the relevant anatomical information. The method may also include controlling the display of the acquired image based on a voice command, where the voice command relates to the relevant anatomical information.

  In one embodiment, the relevant anatomical information is displayed with the acquired image. Related anatomical information may include, for example, clinical information, reference information, disease process information, related images, and/or drug interaction information. In one embodiment, a list of findings related to the relevant anatomical information may be displayed for input into a structured report. In one embodiment, the acquired image is displayed according to display settings, such as window level settings and/or other display settings, based on the relevant anatomical information.

  Certain embodiments provide a method for constructing a structured report based on image anatomy. The method involves identifying an anatomical structure in an image, mapping the anatomical structure to a reference image set, and presenting a list of image findings related to the anatomical structure for input into a structured report.

  The foregoing summary ("means for solving the problems"), as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the invention is not limited to the arrangements and instrumentalities shown in the attached drawings.

  FIG. 1 illustrates an exemplary picture archiving and communication system (PACS) 100 used in accordance with an embodiment of the present invention. The PACS system 100 includes an imaging modality 110, an acquisition workstation 120, a PACS server 130, and one or more PACS workstations 140. The system 100 may include any number of imaging modalities 110, acquisition workstations 120, PACS servers 130, and PACS workstations 140, and is in no way limited to the embodiment of the system 100 shown in FIG. 1. The components of the system 100 may communicate via, for example, wired and/or wireless communications, and may be, for example, separate systems and/or integrated to varying degrees.

  In operation, imaging modality 110 obtains one or more images of the patient's anatomy. Imaging modality 110 may include any device capable of capturing an image of a patient's anatomy, such as a medical diagnostic imaging device. For example, the imaging modality 110 may include an X-ray imaging apparatus, an ultrasonic scanner, a magnetic resonance imaging apparatus, and the like. Image data representing the image (s) is communicated between the imaging modality 110 and the acquisition workstation 120. The image data may be communicated electronically via, for example, a wired or wireless connection.

  In one embodiment, the acquisition workstation 120 may apply one or more preprocessing functions to the image data, for example, to prepare the image for viewing on the PACS workstation 140. For example, the acquisition workstation 120 may convert raw image data to a DICOM standard format or attach a DICOM header. A preprocessing function may be characterized as being applied at the beginning of the imaging and display workflow and as being modality specific (eg, contrast or frequency compensation features that are specific to a particular X-ray imaging device). Preprocessing functions differ from the processing functions applied to image data in that processing functions are not modality specific and are applied at the end of the imaging and display workflow (eg, at the display workstation 140).
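
  As a concrete illustration of this kind of preprocessing, the sketch below wraps raw pixel data in a DICOM dataset and attaches basic header attributes using the pydicom library. This is a minimal sketch under the assumption that pydicom and numpy are available; the `wrap_raw_image` helper and the chosen attributes are illustrative choices, not the acquisition workstation's actual implementation.

```python
# Minimal sketch (assumes pydicom and numpy): attach a basic DICOM header to
# raw pixel data so that a PACS workstation can interpret the image.
# The helper name and chosen attributes are illustrative, not from the patent.
import datetime

import numpy as np
from pydicom.dataset import Dataset, FileMetaDataset
from pydicom.uid import ExplicitVRLittleEndian, generate_uid

CR_IMAGE_STORAGE = "1.2.840.10008.5.1.4.1.1.1"  # Computed Radiography Image Storage


def wrap_raw_image(pixels: np.ndarray, patient_id: str, modality: str = "CR") -> Dataset:
    """Wrap a raw 16-bit image in a DICOM dataset with minimal header fields."""
    file_meta = FileMetaDataset()
    file_meta.TransferSyntaxUID = ExplicitVRLittleEndian
    file_meta.MediaStorageSOPClassUID = CR_IMAGE_STORAGE
    file_meta.MediaStorageSOPInstanceUID = generate_uid()

    ds = Dataset()
    ds.file_meta = file_meta
    ds.SOPClassUID = CR_IMAGE_STORAGE
    ds.SOPInstanceUID = file_meta.MediaStorageSOPInstanceUID
    ds.PatientID = patient_id
    ds.Modality = modality
    ds.StudyDate = datetime.date.today().strftime("%Y%m%d")
    ds.Rows, ds.Columns = pixels.shape
    ds.SamplesPerPixel = 1
    ds.PhotometricInterpretation = "MONOCHROME2"
    ds.BitsAllocated, ds.BitsStored, ds.HighBit = 16, 16, 15
    ds.PixelRepresentation = 0
    ds.PixelData = pixels.astype(np.uint16).tobytes()
    return ds


if __name__ == "__main__":
    raw = np.zeros((64, 64), dtype=np.uint16)  # stand-in for acquired raw data
    dataset = wrap_raw_image(raw, patient_id="PAT-001")
    print(dataset.PatientID, dataset.Modality, dataset.Rows, dataset.Columns)
```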

  The image data may then be communicated between the acquisition workstation 120 and the PACS server 130. The image data may be communicated electronically via, for example, a wired or wireless connection.

  The PACS server 130 may include a computer readable storage medium suitable for storing image data for later retrieval and viewing on the PACS workstation 140. The PACS server 130 may also include one or more software applications for additional processing and / or preprocessing of image data by one or more PACS workstations 140.

  One or more PACS workstations 140 are capable of, or configured for, communication with the server 130. The PACS workstations 140 may include, for example, general-purpose processing circuitry, a PACS server 130 interface, software memory, and/or an image display monitor. The PACS server 130 interface may be implemented as a network card connecting to a TCP/IP-based network, but may also be implemented as a parallel port interface, for example.

  The PACS workstation 140 may retrieve or receive image data from the server 130 for display to one or more users. For example, the PACS workstation 140 may retrieve or receive image data representing a computed radiography (“CR”) image of the patient's chest. The radiologist or user may then examine an image of any object of interest, such as a tumor, lesion, etc.

  The PACS workstation 140 may also apply, or be configured to apply, processing functions to the image data. For example, a user may wish to apply a processing function that enhances features within an image represented by the image data. Thus, processing functions may adjust an image of a patient's anatomy to facilitate the user's diagnosis. Such processing functions may include any software-based application that can alter the visual appearance or representation of the image data. For example, a processing function may include one or more of image inversion, image zooming, image panning, changes to the window and/or grayscale representation level of the image data, and changes to image contrast and/or brightness.

  In one embodiment, PACS system 100 may provide one or more perspectives for viewing images and / or accessing applications on PACS workstation 140. The perspective may be provided locally on the PACS workstation 140 and / or remotely from the PACS server 130. In one embodiment, the PACS system 100 includes a perspective manager that can be used to view an image with multiple perspectives. PACS server 130 and / or PACS workstation 140 may include a perspective manager, or the perspective manager may be implemented in a separate system. In one embodiment, each PACS workstation 140 may include a perspective manager.

  A user may wish to apply additional processing steps to one or more images in order to further improve the features in the images. For example, the user may wish to apply additional processing functions or steps to the image in order to change the presentation of the image to suit the user's level of confidence in making an accurate diagnosis. In other words, different users may wish to apply additional processing steps that are different from those included in the default image processing workflow.

  The additional image processing stage(s) may include any image processing stage useful for preparing an image for a diagnostic examination. For example, as described above, an image processing stage (whether a default image processing stage or an additional image processing stage) may change one or more of image inversion, image zooming, image panning, and the image window, level, brightness, and contrast settings.

  The PACS workstation 140 may retrieve or receive image data from the server 130 for display to one or more users. For example, the PACS workstation 140 may retrieve or receive image data representing a computed radiographic image of the patient's chest. The radiologist may then inspect the image displayed on the display device for any object of interest, eg, a tumor, lesion, etc.

  The PACS workstation 140 may also retrieve or be configured to retrieve one or more hanging protocols from the server 130. For example, a default hanging protocol may be communicated from server 130 to PACS workstation 140. The hanging protocol may be communicated between server 130 and PACS workstation 140 via, for example, a wired or wireless connection.

  In general, the PACS workstation 140 may present images representing image data retrieved and/or received from the server 130. The PACS workstation 140 may present the images according to a hanging protocol. As described above, a hanging protocol is a set of display rules for presenting, formatting, and otherwise organizing images on the display device of the PACS workstation 140. A display rule is a convention for presenting one or more images in a particular temporal and/or spatial layout or sequence. For example, a hanging protocol may include a set of computer-readable instructions (or display rules, for example) that direct a computer to display multiple images at specific locations on a display device and/or to display the images in a specific sequence or order. In another example, a hanging protocol may include a set of computer-readable instructions that direct the computer to place multiple images on multiple screens and/or display areas of the display device. In general, a hanging protocol may be used to present multiple images for a diagnostic examination of the patient anatomy featured in the images.

  The hanging protocol may, for example, direct the PACS workstation 140 to display an anteroposterior ("AP") image adjacent to a lateral image of the same anatomical structure. In another example, the hanging protocol may direct the PACS workstation 140 to display the AP image before displaying the lateral image. In general, the hanging protocol determines how multiple images are presented spatially and/or temporally at the PACS workstation 140.
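
  To make the idea of a hanging protocol as machine-readable display rules concrete, the following minimal sketch represents a protocol as data and applies it to lay out an AP/lateral pair. The field names and the `HangingProtocol` class are illustrative assumptions, not the patent's data model.

```python
# Illustrative sketch of a hanging protocol as a set of machine-readable
# display rules (not the patent's actual data model).
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class DisplayRule:
    view_name: str        # eg "AP" or "LATERAL"
    viewport: int         # which screen/viewport the image is placed in
    display_order: int    # temporal order in which the image is shown


@dataclass
class HangingProtocol:
    name: str
    rules: List[DisplayRule]

    def layout(self, images: Dict[str, str]) -> List[Tuple[int, str]]:
        """Return (viewport, image) pairs in presentation order."""
        ordered = sorted(self.rules, key=lambda r: r.display_order)
        return [(r.viewport, images[r.view_name])
                for r in ordered if r.view_name in images]


# Example: show the AP image next to the lateral image of the same anatomy.
chest_protocol = HangingProtocol(
    name="chest_ap_lateral",
    rules=[DisplayRule("AP", viewport=0, display_order=0),
           DisplayRule("LATERAL", viewport=1, display_order=1)],
)
print(chest_protocol.layout({"AP": "img_ap.dcm", "LATERAL": "img_lat.dcm"}))
```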

  A hanging protocol is different from a default display protocol ("DDP"). In general, a DDP is a default workflow that applies a series of image processing functions to image data. The image processing functions are applied to the image data in order to present an image (based on the image data) to the user. The image processing functions alter the appearance of the image data. For example, an image processing function may change the contrast level of the image.

  A DDP generally includes processing steps or functions that are applied before any diagnostic examination of an image. For example, processing functions may be applied to the image data to enhance features within the image (based on the image data). Such processing functions can include any software-based application that can alter the visual appearance or representation of the image data. For example, a processing function may include one or more of image inversion, image zooming, image panning, changes to the window and/or level settings of the image data representation, and changes to the contrast and/or brightness settings of the image data representation.

  A DDP is usually based on the type of imaging modality used to obtain the image data. For example, image data obtained with C-arm imaging devices in general, or with a particular C-arm imaging device, may have the same or similar DDPs applied. In general, a DDP attempts to present the image data in a manner most useful to many users.

  Conversely, applying a hanging protocol to image data does not change the appearance of the image (based on the image data); rather, as described above, it determines how the image(s) will be presented.

  The server 130 may store multiple hanging protocols and/or DDPs. A hanging protocol and/or DDP that is stored on the server 130 and has not yet been modified or customized is a default hanging protocol/DDP. A default hanging protocol and/or DDP may be selected from a plurality of default hanging protocols and/or DDPs based on any number of relevant factors, such as, for example, manual selection, user identification, and/or preprocessing of the image data.

  Specifically, a default hanging protocol and/or DDP may be selected based on manual selection simply by communicating the default protocol once a user selects that particular protocol. The user may make the selection, for example, at the PACS workstation 140.

  In another example, the default protocol may be selected based on user identification. For example, a user may have a preferred DDP. The DDP may have been customized to suit the user's preferences for a particular temporal and/or spatial layout of images. When the user gains access to the PACS workstation 140 (eg, by entering a correct login and password combination or by some other type of user identification procedure), the preferred DDP may be communicated to the PACS workstation 140, for example.

  In another example, the default protocol may be selected based on preprocessing of the image data. Preprocessing of the image data may include any image processing known to those skilled in the art that prepares an image for user review. Preprocessing may also include, for example, computer-aided diagnosis ("CAD") of the image data. CAD of image data may involve a computer (or similar operating unit) automatically analyzing the image data for an object of interest. For example, the CAD may include a software application that analyzes the image data for nodules, lesions, tumors, and the like, for example in the lungs. However, a CAD application can include any automatic analysis of image data known to those skilled in the art.

  For example, a default hanging protocol corresponding to a CAD finding of a lung tumor may provide for presentation of the posteroanterior ("PA") and lateral lung images adjacent to each other, followed by presentation of the computed tomography ("CT") lung images, followed, for example, by presentation of the magnetic resonance ("MR") lung images. In general, default hanging protocols corresponding to CAD findings are designed to present the images in a spatial and/or temporal layout useful to the radiologist. For example, a radiologist may be greatly aided in reviewing a CAD finding by viewing adjacent PA and lateral lung images, followed by previously obtained multi-slice CT and MR images of the lungs.

  Thus, based on CAD findings, a default protocol may be selected from a plurality of default protocols and applied on the workstation 140 to present images to the user.
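
  The selection logic described above (manual choice, user identity, or a CAD finding) could be sketched as a simple lookup over a protocol registry, as below; the registry contents, the user preferences, and the function name are hypothetical placeholders rather than anything specified by the patent.

```python
# Hypothetical sketch of selecting a default hanging protocol / DDP from a
# registry based on manual selection, user identity, or a CAD finding.
DEFAULT_PROTOCOLS = {
    "chest_ap_lateral": "PA and lateral chest images side by side",
    "lung_cad_review": "PA + lateral, then CT lung series, then MR lung series",
}

USER_PREFERENCES = {"dr_smith": "chest_ap_lateral"}          # assumed user prefs
CAD_FINDING_PROTOCOLS = {"lung_tumor": "lung_cad_review"}    # assumed CAD mapping


def select_default_protocol(manual_choice=None, user_id=None, cad_finding=None):
    """Pick a protocol name; manual choice wins, then user preference, then CAD."""
    if manual_choice in DEFAULT_PROTOCOLS:
        return manual_choice
    if user_id in USER_PREFERENCES:
        return USER_PREFERENCES[user_id]
    if cad_finding in CAD_FINDING_PROTOCOLS:
        return CAD_FINDING_PROTOCOLS[cad_finding]
    return "chest_ap_lateral"  # fall back to a generic default


print(select_default_protocol(user_id="dr_smith"))        # -> chest_ap_lateral
print(select_default_protocol(cad_finding="lung_tumor"))  # -> lung_cad_review
```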

  In one embodiment, a reference image set may be collected and stored, for example, in a PACS (eg, the PACS server 130 and/or workstation 140) or other image data storage device. The reference images may be a collected set of references, or "gold standards," and/or may be organized for a particular patient, for example. For example, five sets of images may serve as a reference set for acquired image data. The images in the reference set(s) are labeled with the main anatomical structures or other features in each image. When image registration is performed, an acquired image study, such as a newly acquired radiology image study, may be automatically labeled based on the labels found in the reference image set(s). Labels may be correlated to a newly acquired image at, for example, the patient level or the organ level. For example, the liver may be identified in a newly acquired image based on the registration between the newly acquired image and one of the reference images. A label assigned to a structure, such as the liver, in the reference image(s) is assigned to the corresponding structure in the acquired image.
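
  One way the label propagation described here might be realized is to resample the reference set's label map through a registration transform, so that each labeled structure in the reference images lands on the corresponding region of the new study. The sketch below assumes SimpleITK and a transform produced by a separate registration step (such as the one sketched after the FIG. 2 discussion below); it is illustrative, not the patented method.

```python
# Sketch (assumes SimpleITK): propagate anatomy labels from a labeled reference
# ("gold standard") image onto a newly acquired image by resampling the
# reference label map through a registration transform. Illustrative only.
import SimpleITK as sitk


def propagate_labels(acquired: sitk.Image,
                     reference_labels: sitk.Image,
                     transform: sitk.Transform) -> sitk.Image:
    """Map reference labels (eg 1=liver, 2=pancreas) into the acquired image space."""
    return sitk.Resample(
        reference_labels,          # labeled reference volume
        acquired,                  # resample onto the acquired image grid
        transform,                 # maps acquired-image points into reference space
        sitk.sitkNearestNeighbor,  # keep labels discrete
        0,                         # background label outside the reference
        reference_labels.GetPixelID(),
    )
```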

  In one embodiment, a user may search multiple images based on labels assigned to one or more features in the images. In one embodiment, voice commands may be used to search the multiple images of a study based on the labels. For example, a radiologist may say "liver," and the PACS automatically displays the first liver image of the study. Thus, the user may search an image set by anatomical structure, such as by organ. Alternatively, the user may search an image set by patient, for example by saying a patient name, identifier, or the like, or by other criteria.

  In one embodiment, a library of standard or general scans may be categorized by one or more characteristics such as modality, patient, gender, weight, height, age, nationality, disease, etc. Any number of images may be included in the library and classified. A reference set of images may be selected from a library based on one or more characteristics. The PACS may automatically match the reference set to the desired characteristic (s) based on data input at the time of image acquisition, for example. In one embodiment, the system may include more than one reference image set based on patient characteristics. The combination of patient characteristics may represent different reference sets, for example. In one embodiment, newly acquired image exams may be classified based on different characteristics associated with the reference image set.

  In one embodiment, clinical information may be displayed based on anatomical structure. For example, a patient may have pathology, laboratory results, microbiology, and medical history data, as well as images. Radiologists or other medical personnel may wish to consider clinical information, such as test results, when examining the images. Furthermore, the results may relate only to specific parts of the anatomy. Once the PACS determines which images and/or anatomical structures are being reviewed by medical personnel, the PACS may automatically display the relevant results. A PACS or other system may map results and/or other data to the associated anatomical structures. The PACS registers the viewed image(s) with the reference image(s). Based on the registration, the PACS may determine an anatomical structure and retrieve related data mapped to that anatomical structure.

  FIG. 2 illustrates an image registration system 200 for registering images with respect to an established reference image set according to an embodiment of the present invention. The system 200 includes a reference image set 210, an acquired image set 220, and anatomical or clinical information 230. The reference image set 210 and the acquired image set 220 may include, for example, the same or related modalities. Alternatively, the reference image set 210 and the acquired image set 220 may include images of different modalities. The reference image set 210 and/or the acquired image set 220 may include one or more images obtained from CT, MR, digital radiography ("DR"), X-ray, ultrasound, nuclear medicine, single photon emission computed tomography ("SPECT"), positron emission tomography ("PET"), and/or other imaging systems. For example, the reference image set 210 may include one or more images showing a plurality of anatomical structures, such as the liver, pancreas, kidneys, and/or large intestine. The reference image set 210 and/or the acquired image set 220 may include one or more image subsets. In one embodiment, the reference image set 210 and/or a subset of the reference image set 210 may be organized by anatomy, disease, patient, and/or other criteria. The acquired image set 220 and/or a subset of the acquired image set 220 may also be organized by anatomy, disease, patient, and/or other criteria. The anatomical structure information 230 may include, for example, patient clinical information, reference sources, disease processes, images, drug interactions, and/or other information.

  The reference image set 210 may be used to correlate the anatomy shown in the image(s) with relevant clinical information from one or more sources. Image registration techniques, such as cross-correlation, minimization of variance, mutual information, principal axes, manual registration, and/or other registration techniques, may be used to correlate images and/or points in the acquired image set 220 with images and/or points in the reference image set 210. The anatomical information included in the reference image set 210 represents the image content of the reference image set 210. An image registration module may be used to register the acquired image set 220 with respect to the reference image set 210. The image registration module may be implemented in, for example, a PACS workstation, a PACS server, an image viewer, and/or another processor. Image registration, display, and/or other functionality may be implemented with a set of instructions in, for example, hardware, firmware, and/or software.
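
  As a hedged example of the registration step itself, the sketch below uses SimpleITK's Mattes mutual-information metric, one of the techniques named above, to register a reference image to an acquired image; the transform it returns could drive the label propagation sketched earlier. The parameter values are generic defaults, not settings from the patent.

```python
# Sketch (assumes SimpleITK): rigidly register a reference image (moving) to an
# acquired image (fixed) with a mutual-information metric, one of the techniques
# listed above. Parameter values are generic illustrative defaults.
import SimpleITK as sitk


def register_to_reference(acquired: sitk.Image, reference: sitk.Image) -> sitk.Transform:
    acquired = sitk.Cast(acquired, sitk.sitkFloat32)
    reference = sitk.Cast(reference, sitk.sitkFloat32)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsRegularStepGradientDescent(
        learningRate=1.0, minStep=1e-4, numberOfIterations=200)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetInitialTransform(
        sitk.CenteredTransformInitializer(
            acquired, reference, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY))

    # Execute(fixed, moving): the returned transform maps points of the acquired
    # (fixed) image into reference space, so it can also be used to resample the
    # reference label map onto the acquired image, as sketched earlier.
    return reg.Execute(acquired, reference)
```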

  Once the acquired image set 220 is aligned, the reference image set 210 may be used to retrieve relevant clinical data or other information 230. Related anatomical or clinical data 230 may include, for example, patient clinical information, reference sources, disease or disease processes, related image (s), drug interactions, and / or other information.

  For example, a series of chest CT images of the patient may be obtained as acquired image data 220. The images include, for example, the patient's liver, pancreas, kidney and large intestine. The image anatomy is registered based on the corresponding anatomical structures in the reference image set 210. For example, the liver in the acquired image set 220 is correlated to the liver in the reference image set 210, and so on. Once the acquired image 220 is registered with respect to the reference image 210, clinical data 230 associated with the reference image 210 may be retrieved for the acquired image 220. Relevant clinical data 230 may be provided to the user along with the acquired image 220.

  FIG. 3 shows a flowchart of a method 300 for automatic display adjustment based on image anatomy used in accordance with an embodiment of the present invention. In one embodiment, the method 300 may be used to automatically adjust window level settings and/or other display settings of a display, such as the display of the PACS workstation 140, the display of an image viewer, and/or other displays. In one embodiment, a displayed image is correlated to an anatomical part, which is in turn correlated to, for example, a preset window level setting.

  First, at step 310, the main organ and / or other anatomical portion of interest is identified for each image of the examination. For example, the heart, lungs, liver, kidneys, etc. may be identified in a series of images obtained for the patient. Then, at step 320, information regarding the identified anatomical structure in the image is stored for the image. For example, information identifying the heart and lungs in the image may be stored as metadata for the image. The metadata may be stored as image header information and / or may be stored, for example, in a database that references the image.

  Next, at step 330, metadata associated with the image is read. The metadata may be read from the image header and / or a database or other storage that stores the metadata. At step 340, window level settings associated with metadata (ie, anatomical parts in the image) are used to display the image. The images may be displayed on a monitor or other display associated with, for example, a PACS, image viewer, and / or other storage system.
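
  The metadata round trip of steps 320 through 340 might look like the sketch below, which records the identified anatomy in a private DICOM tag and reads it back when the image is requested for display. The private group/element numbers and helper names are assumptions made for illustration; a database keyed by image identifier, as the text also mentions, would work equally well.

```python
# Sketch (assumes pydicom): store the anatomy identified in an image as
# metadata in a private DICOM tag (step 320), then read it back before display
# (step 330). The private group/element values are illustrative only.
from pydicom.dataset import Dataset

PRIVATE_CREATOR_TAG = 0x00110010   # hypothetical private creator slot
ANATOMY_LIST_TAG = 0x00111001      # hypothetical element holding the labels


def store_anatomy_labels(ds: Dataset, labels: list) -> None:
    """Record eg ['heart', 'lung'] in the image header (step 320)."""
    ds.add_new(PRIVATE_CREATOR_TAG, "LO", "ANATOMY_LABELING")
    ds.add_new(ANATOMY_LIST_TAG, "LO", "\\".join(labels))


def read_anatomy_labels(ds: Dataset) -> list:
    """Read the labels back when the image is requested for display (step 330)."""
    if ANATOMY_LIST_TAG in ds:
        return str(ds[ANATOMY_LIST_TAG].value).split("\\")
    return []


if __name__ == "__main__":
    ds = Dataset()
    store_anatomy_labels(ds, ["heart", "lung"])
    print(read_anatomy_labels(ds))  # -> ['heart', 'lung']
```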

  Identification of the organs and anatomical parts in an image may be accomplished by various methods. For example, a technician may view each image and identify the anatomical structures within each image. The information may be entered into a database or other data store, for example. Alternatively, automatic image registration may be performed against a "gold" standard reference set of images. For example, for each examination procedure (eg, a CT chest/abdominal/pelvic examination reference image set), a "gold" standard reference set of images may be stored in the PACS, image viewer, and/or other image storage system. Each image in the reference set is labeled with its associated significant anatomy, manually and/or automatically, by a radiologist or other personnel. Once anatomical labeling is performed, the image reference set becomes an anatomical atlas. In the atlas, every image contains meta-image data that describes the associated anatomy. Thus, the atlas effectively holds mapping information for mapping image content to anatomical structures.

  For each newly obtained exam, the exam image(s) are registered to the associated reference set of atlas image(s) so that the newly obtained exam image(s) best match the atlas reference exam image(s). After registration, the newly acquired examination images are also mapped to the anatomy found in the atlas reference examination. The mapping information may be indexed, for example, in a database and/or other data storage. When a radiologist or other medical personnel is looking at an image, a system such as a PACS or other image viewing/storage system automatically recognizes that the image contains anatomical structures A, B, and C, and that window level setting D should be used to display the image.

  For example, a table or other data structure may be used to map anatomical structures to display window level settings and/or other display settings. In addition, the table may include priorities for certain anatomical structures relative to other anatomical structures. In one embodiment, if an image includes anatomical structures A, B, and C, and each anatomical structure is mapped to a different window level or other setting, a prioritization scheme may be used to determine which anatomical structure's settings to apply, based on the relative priorities of the anatomical structures.

  As shown in Table 1 below, if an image contains both the lung and the heart, the window level setting is set to the lung window level setting because the lung has a higher priority than the heart.
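
  The priority idea behind Table 1 can be sketched as a small lookup keyed by anatomy, as below. The priority numbers and the window width/center values (typical CT windows, in Hounsfield units) are assumed example values, not figures taken from the patent.

```python
# Illustrative sketch of a priority table mapping anatomy to window/level
# display settings. The numbers are assumed example values (typical CT
# windows), not values specified by the patent.
WINDOW_LEVEL_TABLE = {
    # anatomy: (priority, window width, window center) -- higher priority wins
    "lung":  (2, 1500, -600),
    "heart": (1, 350, 50),
    "liver": (1, 150, 60),
}


def pick_window_level(anatomy_in_image: list) -> tuple:
    """Return the window/level of the highest-priority structure in the image."""
    known = [a for a in anatomy_in_image if a in WINDOW_LEVEL_TABLE]
    if not known:
        return (400, 40)  # fallback soft-tissue window (assumed default)
    best = max(known, key=lambda a: WINDOW_LEVEL_TABLE[a][0])
    _, width, center = WINDOW_LEVEL_TABLE[best]
    return (width, center)


# An image containing both lung and heart is displayed with the lung window,
# because the lung has the higher priority (as in Table 1).
print(pick_window_level(["heart", "lung"]))  # -> (1500, -600)
```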

FIG. 4 shows a flowchart of a method 400 for displaying images and related clinical information based on image anatomy used in accordance with an embodiment of the present invention. At step 410, for each image, the main organ and/or other anatomical part(s) of interest are identified. Items of interest in the image may be identified manually and/or automatically, as described above with respect to FIG. 3.

  Then, at step 420, information related to the item of interest in the image is stored as metadata for the image. The metadata may be stored as header information and / or in a database or other data structure that associates the image with the metadata. At step 430, an image is requested for display. For example, the image is retrieved for display on a PACS workstation or image viewer. At step 440, when the image is displayed, the metadata is read and the relevant clinical information associated with the anatomical portion is also displayed.

  Identification of the organs and anatomical parts in an image may be accomplished by a variety of methods, including the manual and automatic analyses described above. For example, for each examination procedure, an image reference set is stored in the PACS and/or other image viewing or analysis system. Each image in the reference set is labeled with its associated salient anatomy. Once anatomical labeling is performed, the image reference set becomes an anatomical atlas. In the anatomical atlas, every image contains meta-image data that describes the associated anatomy. Thus, the anatomical atlas effectively holds mapping information for mapping image content to anatomical structures.

  For each newly obtained exam, the exam is registered to the associated reference set of atlas images so that the images of the newly obtained exam best match the atlas reference exam. After registration, the newly acquired examination images are also mapped to the anatomy in the atlas. The anatomical mapping information may be indexed in a database or other data structure. When a user views an image, a viewing system, such as a PACS workstation or image viewer, recognizes that the image is associated with anatomical structures A, B, and C, and presents clinical information, such as disease processes and patient information associated with anatomical structures A, B, and C, to the user.
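
  A sketch of how a viewer might gather relevant clinical information once the anatomy of a displayed image is known appears below; the dictionary standing in for the clinical database and its contents are placeholders, not an interface defined by the patent.

```python
# Hypothetical sketch: once an image is known to contain certain anatomy,
# look up and present clinical data mapped to that anatomy (method 400).
CLINICAL_DB = {
    # anatomy -> clinical items associated with the current patient (placeholders)
    "liver": ["liver function test results", "prior liver lesion follow-up note"],
    "lung":  ["prior chest CT report", "pulmonary function test results"],
}


def clinical_info_for_image(anatomy_labels: list) -> dict:
    """Gather clinical information for every labeled structure in the image."""
    return {a: CLINICAL_DB.get(a, []) for a in anatomy_labels}


# When a liver-containing image is displayed, liver-related results appear with it.
print(clinical_info_for_image(["liver"]))
```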

  FIG. 5 shows a flowchart of a method 500 for simplifying image reporting in accordance with an embodiment of the present invention. In one embodiment, the image reporting hierarchy may be simplified by presenting only the findings related to the anatomical structures included in the image. Thus, instead of presenting options related to all images in the exam, only options related to the displayed image are presented. In order to display image-specific choices, the content of one image is distinguished from that of the other images in the examination. For example, image content may be distinguished based on the body organs included in the image.

  At step 510, organs and/or other anatomical parts of interest may be identified within each acquired image. Then, at step 520, the identified anatomical information is stored as image metadata (eg, as header information or in a database). At step 530, the image is displayed. At step 540, the associated image metadata is read. Next, at step 550, the selection of structured report findings is narrowed to those associated with the anatomical portions identified by the metadata.

  Identification of organs and/or other anatomical parts in the image(s) may be accomplished in various ways. For example, a technician may manually view each image, identify the anatomical structures within each image, and enter that information into a database or other data structure. An automatic method is to perform automatic image registration against a gold standard reference set of images. For example, for each examination procedure, a gold standard reference set of images may be stored. Each image in the reference set may be labeled with its associated salient anatomy (eg, automatically by the PACS and/or manually by a radiologist or other personnel). Once anatomical labeling is performed, the image reference set becomes an anatomical atlas. In the atlas, every image contains meta-image data that describes the associated anatomy. Thus, the atlas effectively holds mapping information for mapping image content to anatomical structures.

  For each new exam obtained, the exam image (s) are aligned with the associated reference set of atlas images to best match the image of the newly obtained exam with the atlas reference exam. After registration, the newly acquired examination image is also mapped to the anatomy of the reference examination. The mapping information may be indexed in a database or other data structure, for example. A radiologist or other user viewing an image may wish to enter findings in a structured report. Anatomical structures A, B, C, etc. associated with the image may be identified based on the mapping information. A list of findings relating only to the anatomical structures A, B, C may then be presented to an observer, such as a radiologist or other medical personnel.
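
  The narrowing of structured-report choices described for method 500 could be sketched as filtering a findings hierarchy by the anatomy labels attached to the displayed image, as below; the hierarchy contents are illustrative placeholders.

```python
# Sketch of method 500's idea: present only the report findings that relate
# to the anatomy in the displayed image. The findings lists are placeholders.
FINDINGS_HIERARCHY = {
    "liver":    ["hepatic lesion", "hepatomegaly", "fatty infiltration"],
    "pancreas": ["pancreatic mass", "pancreatitis"],
    "stomach":  ["gastric wall thickening"],
}


def findings_for_image(anatomy_labels: list) -> list:
    """Flatten the hierarchy to only the branches matching the image's anatomy."""
    choices = []
    for anatomy in anatomy_labels:
        choices.extend(FINDINGS_HIERARCHY.get(anatomy, []))
    return choices


# A liver-containing image yields only liver findings for the structured report.
print(findings_for_image(["liver"]))
```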

  FIG. 6 shows a flowchart of a method 600 for identifying items in an image according to an embodiment of the invention. The content within each image may be identified to allow the display of organ-specific image(s). First, at step 610, for each image, an organ of interest and/or other anatomical portion may be identified. Identification of items such as organs and/or anatomical parts in the image(s) may be performed in various ways, such as by manually entering anatomical information into a database or other data structure and/or by automatically registering the images against an image reference set. Each image in the reference set may be labeled with its associated anatomy. Once anatomical labeling is performed, the image reference set becomes an anatomical atlas. Each image in the anatomical atlas includes meta-image data that describes the associated anatomy. The atlas contains information that maps image content to anatomical structures.

  The newly obtained image is aligned with a reference set of atlas images so that the newly obtained image of the examination matches the atlas reference examination. After registration, the newly acquired examination image is also mapped to the anatomy.

  At step 620, the anatomical identification information is stored as metadata for the image. The metadata may be stored in a number of ways, such as storing the metadata as image header information or in a database, table or other data structure that references the image. The mapping information may be indexed, for example in a database or other data structure.

  At step 630, the user may request the desired organ. For example, a user may enter a desired organ name into software on a PACS workstation, image viewer, and / or other system. Alternatively, the user may speak the name of the desired organ and the speech recognition engine converts the speech data into text information to identify the relevant image data. At step 640, a first image of a sequence including the desired organ is displayed based on the name of the desired organ. Thus, for example, a radiologist may request a patient's liver image verbally, and a first image of the patient's series of liver images is retrieved and displayed, for example, on a PACS display. Subsequent liver images may be displayed as well. The user can also retrieve relevant clinical information related to, for example, the patient and / or anatomy in the image. The user can also switch to a different patient and / or anatomy, for example based on input and / or voice commands.
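
  Method 600's organ-specific navigation might look like the sketch below, in which the speech-recognition engine is reduced to a stub and the organ-to-image index is a plain dictionary; both are assumptions made for illustration rather than the patent's implementation.

```python
# Sketch of method 600 (steps 630-640): jump to the first image of a series
# that contains the requested organ. The recognizer is stubbed out and the
# index structure is an assumption, not the patent's implementation.
SERIES_INDEX = {
    # organ -> ordered list of image identifiers containing that organ
    "liver": ["img_041.dcm", "img_042.dcm", "img_043.dcm"],
    "colon": ["img_077.dcm", "img_078.dcm"],
}


def recognize_speech(audio_bytes: bytes) -> str:
    """Stub for a speech-recognition engine that converts audio to text."""
    return "liver"  # pretend the radiologist said "liver"


def first_image_for_request(audio_bytes: bytes) -> str:
    organ = recognize_speech(audio_bytes).strip().lower()
    images = SERIES_INDEX.get(organ)
    if not images:
        raise KeyError(f"no images labeled with organ '{organ}'")
    return images[0]  # subsequent images in the list can then be paged through


print(first_image_for_request(b""))  # -> img_041.dcm
```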

  Certain embodiments may be provided as a set of instructions that reside on a computer-readable medium, such as a memory or hard disk, for execution on a computer or other processing device, such as a PACS workstation or image viewer.

  Thus, certain embodiments provide for automatic application of display settings, such as presetting of window level settings, based on anatomical structures in the displayed image (s). Certain embodiments provide the ability to search multiple images by anatomy in addition to continuous navigation. Thus, a user such as a radiologist does not need to manually search for an image containing the desired anatomy to be viewed. Certain embodiments allow a user to search for images based on voice commands as well as keyboard or mouse input. Furthermore, certain embodiments improve the usefulness of structured reporting by dynamically filtering the choice of entering findings and other information into the structured report. Particular embodiments correlate image content with text and other data to allow presentation of image-specific finding selections for input to structured reports.

  Certain embodiments reduce the time used by radiologists and other medical personnel to search for relevant anatomical information. Certain embodiments display the image (s) as well as anatomical information associated with the displayed image (s) for review by the user. Certain embodiments correlate image content with text and other data to allow searching and presenting relevant anatomical information from reference sources and patient medical history. Certain embodiments allow users such as radiologists and other medical personnel to make a more accurate diagnosis, for example, when equipped with more relevant clinical and reference information.

  Although the invention has been described with reference to certain embodiments, those skilled in the art will understand that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention include all embodiments falling within the scope of the appended claims.

FIG. 1 illustrates an exemplary picture archiving and communication system used in accordance with an embodiment of the present invention.
FIG. 2 illustrates an image registration system for registering an image with a reference image set according to an embodiment of the present invention.
FIG. 3 is a flowchart of a method for automatic display adjustment based on image anatomy used in accordance with an embodiment of the present invention.
FIG. 4 is a flowchart of a method for displaying images and associated clinical information based on image anatomy used in accordance with an embodiment of the present invention.
FIG. 5 is a flowchart of a method for simplifying image reporting according to an embodiment of the present invention.
FIG. 6 is a flowchart of a method for identifying items in an image according to an embodiment of the invention.

Claims (4)

  1. An image registration system (100) for correlating clinical information to at least one image, comprising:
    a set of reference images (210) including one or more reference images, wherein at least one reference anatomical structure is identified within each image of the one or more reference images, and related anatomical information is correlated to the at least one reference anatomical structure in each image of the one or more reference images;
    an image registration module (110) for registering one or more acquired images (220) with the reference image set, the image registration module (110) registering at least one acquired anatomical structure in the one or more acquired images to the reference image set and associating the related anatomical information with the one or more acquired images based on the reference image set; and
    a display module capable of displaying the one or more acquired images (220) and the related anatomical information, the display module automatically adjusting display settings of the one or more acquired images based on at least a portion of the anatomical information;
    wherein the system comprises a PACS workstation operable to read metadata including the anatomical portions associated with the one or more acquired images and, via the display module, to adjust window level settings associated with the metadata so as to display the images on the PACS workstation;
    wherein the related anatomical information includes a plurality of display settings associated with priorities, and an acquired image is displayed according to the window level setting of the anatomical structure having the highest associated priority in the acquired image; and
    wherein the related anatomical information can narrow the selection of findings to be entered into a structured report.
  2.   The system (100) of claim 1, wherein the associated anatomical information includes clinical information associated with the acquired at least one anatomical structure.
  3.   The system (100) of claim 1, wherein the associated anatomical information allows a user to request an image based on a voice command.
  4. The system (100) of claim 3, wherein the voice command relates to an anatomical structure in the acquired one or more images (220).
JP2008540244A 2005-11-14 2006-11-13 Anatomical labeling system and method on PACS Active JP5519937B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/273,285 2005-11-14
US11/273,285 US7590440B2 (en) 2005-11-14 2005-11-14 System and method for anatomy labeling on a PACS
PCT/US2006/043964 WO2007059020A2 (en) 2005-11-14 2006-11-13 System and method for anatomy labeling on a pacs

Publications (2)

Publication Number Publication Date
JP2009515599A JP2009515599A (en) 2009-04-16
JP5519937B2 true JP5519937B2 (en) 2014-06-11

Family

ID=37773581

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008540244A Active JP5519937B2 (en) 2005-11-14 2006-11-13 Anatomical labeling system and method on PACS

Country Status (4)

Country Link
US (1) US7590440B2 (en)
EP (1) EP1955237A2 (en)
JP (1) JP5519937B2 (en)
WO (1) WO2007059020A2 (en)

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003076003A2 (en) 2002-03-06 2003-09-18 Tomotherapy Incorporated Method for modification of radiotherapy treatment delivery
US8315450B2 (en) * 2004-11-24 2012-11-20 Wisconsin Alumni Research Foundation Method and system for display of medical image data
JP5312801B2 (en) * 2005-02-08 2013-10-09 コーニンクレッカ フィリップス エヌ ヴェ Medical image viewing protocol
US8232535B2 (en) 2005-05-10 2012-07-31 Tomotherapy Incorporated System and method of treating a patient with radiation therapy
US8442287B2 (en) 2005-07-22 2013-05-14 Tomotherapy Incorporated Method and system for evaluating quality assurance criteria in delivery of a treatment plan
CA2616313A1 (en) * 2005-07-22 2007-02-01 Tomotherapy Incorporated System and method of recommending a location for radiation therapy treatment
AT507879T (en) 2005-07-22 2011-05-15 Tomotherapy Inc System for the administration of radiation therapy into a moving target area
AU2006272742A1 (en) 2005-07-22 2007-02-01 Tomotherapy Incorporated System and method of delivering radiation therapy to a moving region of interest
CA2616293A1 (en) 2005-07-23 2007-02-01 Tomotherapy Incorporated Radiation therapy imaging and delivery utilizing coordinated motion of gantry and couch
US20070076929A1 (en) * 2005-10-05 2007-04-05 General Electric Company System and method for automatic post processing image generation
CN101505652A (en) * 2005-10-14 2009-08-12 断层放疗公司 Method and interface for adaptive radiation therapy
US20070223793A1 (en) * 2006-01-19 2007-09-27 Abraham Gutman Systems and methods for providing diagnostic imaging studies to remote users
US7765109B2 (en) * 2006-01-19 2010-07-27 AG Mednet, Inc. Systems and methods for obtaining readings of diagnostic imaging studies
JP2008079760A (en) * 2006-09-27 2008-04-10 Fujifilm Corp Method and apparatus for image compression processing and medical network system
US20080117225A1 (en) * 2006-11-21 2008-05-22 Rainer Wegenkittl System and Method for Geometric Image Annotation
US20080175460A1 (en) * 2006-12-19 2008-07-24 Bruce Reiner Pacs portal with automated data mining and software selection
US8442290B2 (en) * 2007-01-19 2013-05-14 Mayo Foundation For Medical Education And Research Simultaneous dual window/level settings for display of CT colonography images
EP1946702B1 (en) * 2007-01-22 2012-03-07 BrainLAB AG Illustration of anatomic structure
JP2011500293A (en) 2007-10-25 2011-01-06 トモセラピー・インコーポレーテッド Method for adapting radiotherapy dose splitting
US8467497B2 (en) * 2007-10-25 2013-06-18 Tomotherapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
WO2009055801A2 (en) * 2007-10-25 2009-04-30 Tomo Therapy Incorporated System and method for motion adaptive optimization for radiation therapy delivery
US20090138280A1 (en) * 2007-11-26 2009-05-28 The General Electric Company Multi-stepped default display protocols
RU2493593C2 (en) * 2007-12-13 2013-09-20 Конинклейке Филипс Электроникс Н.В. Method of extracting data from set of data of medical images
US8577115B2 (en) * 2008-03-04 2013-11-05 Tomotherapy Incorporated Method and system for improved image segmentation
US20090313170A1 (en) * 2008-06-16 2009-12-17 Agmednet, Inc. Agent for Medical Image Transmission
US8244008B2 (en) * 2008-07-21 2012-08-14 International Business Machines Corporation Methods involving optimizing and mapping images
US8755635B2 (en) * 2008-08-11 2014-06-17 Siemens Aktiengesellschaft Method and system for data dependent multi phase visualization
US8370293B2 (en) * 2008-08-21 2013-02-05 Terarecon Inc. Workflow template management for medical image data processing
US8385688B2 (en) * 2008-08-27 2013-02-26 International Business Machines Corporation System and method for automatic recognition and labeling of anatomical structures and vessels in medical imaging scans
WO2010025372A2 (en) * 2008-08-28 2010-03-04 Tomotherapy Incorporated System and method of contouring a target area
US8363784B2 (en) * 2008-08-28 2013-01-29 Tomotherapy Incorporated System and method of calculating dose uncertainty
US20100054555A1 (en) * 2008-08-29 2010-03-04 General Electric Company Systems and methods for use of image recognition for hanging protocol determination
DE102009006148B4 (en) * 2009-01-07 2010-11-18 Siemens Aktiengesellschaft Method, monitor control module, system and computer program for displaying medical images
WO2010102068A2 (en) * 2009-03-03 2010-09-10 Tomotherapy Incorporated System and method of optimizing a heterogeneous radiation dose to be delivered to a patient
US20110019889A1 (en) * 2009-06-17 2011-01-27 David Thomas Gering System and method of applying anatomically-constrained deformation
CN102802519B (en) * 2009-06-25 2014-12-31 株式会社日立医疗器械 Medical Imaging Apparatus
WO2011041412A2 (en) * 2009-09-29 2011-04-07 Tomotherapy Incorporated Patient support device with low attenuation properties
US8401148B2 (en) 2009-10-30 2013-03-19 Tomotherapy Incorporated Non-voxel-based broad-beam (NVBB) algorithm for intensity modulated radiation therapy dose calculation and plan optimization
JP5462614B2 (en) * 2009-12-18 2014-04-02 株式会社日立メディコ Medical image diagnostic apparatus and medical image system
US20110182493A1 (en) * 2010-01-25 2011-07-28 Martin Huber Method and a system for image annotation
DE102010012797B4 (en) * 2010-03-25 2019-07-18 Siemens Healthcare Gmbh Computer-aided evaluation of an image data set
WO2012024635A2 (en) * 2010-08-19 2012-02-23 Medicis Pharmaceutical Corporation Mid-face aesthetic scale and related methods
JP5683174B2 (en) * 2010-08-31 2015-03-11 キヤノン株式会社 Image processing apparatus and control method thereof
US9262444B2 (en) 2010-11-24 2016-02-16 General Electric Company Systems and methods for applying series level operations and comparing images using a thumbnail navigator
US8837791B2 (en) 2010-12-22 2014-09-16 Kabushiki Kaisha Toshiba Feature location method and system
US9510771B1 (en) 2011-10-28 2016-12-06 Nuvasive, Inc. Systems and methods for performing spine surgery
JP2014124268A (en) * 2012-12-25 2014-07-07 Toshiba Corp Medical diagnostic apparatus
US9443633B2 (en) 2013-02-26 2016-09-13 Accuray Incorporated Electromagnetically actuated multi-leaf collimator
US9642560B2 (en) * 2013-04-03 2017-05-09 Brainlab Ag Method and device for determining the orientation of a co-ordinate system of an anatomical object in a global co-ordinate system
US20140344701A1 (en) * 2013-05-17 2014-11-20 Algotec Systems Ltd. Method and system for image report interaction for medical image software
US9305358B2 (en) * 2013-07-01 2016-04-05 Kabushiki Kaisha Toshiba Medical image processing
US9600778B2 (en) * 2013-07-02 2017-03-21 Surgical Information Sciences, Inc. Method for a brain region location and shape prediction
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
JP2015084890A (en) * 2013-10-30 2015-05-07 コニカミノルタ株式会社 Image display device and image display method
US10083510B2 (en) 2014-02-27 2018-09-25 Koninklijke Philips N.V. Unsupervised training for an atlas-based registration
US9952301B2 (en) * 2014-03-11 2018-04-24 Hologic, Inc. System and method for selecting and modifying a hanging protocol for displaying MRI information
JP6523686B2 (en) * 2015-01-05 2019-06-05 キヤノンメディカルシステムズ株式会社 X-ray diagnostic device
US20170083665A1 (en) * 2015-09-23 2017-03-23 Siemens Healthcare Gmbh Method and System for Radiology Structured Report Creation Based on Patient-Specific Image-Derived Information
CN106997573A * 2016-01-22 2017-08-01 广东福地新视野光电技术有限公司 A two-way referral consultation system supporting tiered diagnosis and treatment, and a referral method therefor
US10452813B2 (en) * 2016-11-17 2019-10-22 Terarecon, Inc. Medical image identification and interpretation
US10188361B2 (en) * 2017-03-27 2019-01-29 Siemens Healthcare Gmbh System for synthetic display of multi-modality data

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62121577A (en) * 1985-11-22 1987-06-02 Toshiba Corp Medical image display device
JPH01140377A (en) * 1987-11-27 1989-06-01 Toshiba Corp Medical image data base system
US5825908A (en) * 1995-12-29 1998-10-20 Medical Media Systems Anatomical visualization and measurement system
US6674879B1 (en) * 1998-03-30 2004-01-06 Echovision, Inc. Echocardiography workstation
US6425865B1 (en) * 1998-06-12 2002-07-30 The University Of British Columbia Robotically assisted medical ultrasound
US6514201B1 (en) 1999-01-29 2003-02-04 Acuson Corporation Voice-enhanced diagnostic medical ultrasound system and review station
US6785410B2 (en) * 1999-08-09 2004-08-31 Wake Forest University Health Sciences Image reporting method and system
US7646898B1 (en) 2000-11-24 2010-01-12 Kent Ridge Digital Labs Methods and apparatus for processing medical images
JP2002165786A (en) * 2000-12-01 2002-06-11 Hitachi Medical Corp X-ray ct apparatus
US7158692B2 * 2001-10-15 2007-01-02 Insightful Corporation System and method for mining quantitative information from medical images
AU2002359444A1 (en) 2001-11-21 2003-06-10 Viatronix Incorporated Imaging system and method for cardiac analysis
US20030228042A1 (en) * 2002-06-06 2003-12-11 Usha Sinha Method and system for preparation of customized imaging atlas and registration with patient images
JP4405172B2 (en) * 2003-04-03 2010-01-27 東芝医用システムエンジニアリング株式会社 Medical system
US7343030B2 (en) * 2003-08-05 2008-03-11 Imquant, Inc. Dynamic tumor treatment system
US7142633B2 (en) * 2004-03-31 2006-11-28 General Electric Company Enhanced X-ray imaging system and method
US7418120B2 (en) * 2004-09-22 2008-08-26 General Electric Company Method and system for structuring dynamic data

Also Published As

Publication number Publication date
US7590440B2 (en) 2009-09-15
US20070127790A1 (en) 2007-06-07
JP2009515599A (en) 2009-04-16
EP1955237A2 (en) 2008-08-13
WO2007059020A3 (en) 2007-10-04
WO2007059020A2 (en) 2007-05-24

Similar Documents

Publication Publication Date Title
US7289651B2 (en) Image reporting method and system
US7945083B2 (en) Method for supporting diagnostic workflow from a medical imaging apparatus
US8913808B2 (en) Systems and methods for viewing medical images
US7979383B2 (en) Atlas reporting
US5779634A (en) Medical information processing system for supporting diagnosis
JP2004005364A (en) Similar image retrieval system
US20050147284A1 (en) Image reporting method and system
US20040151358A1 (en) Medical image processing system and method for processing medical image
US7949166B2 (en) Diagnosis support system
US9171130B2 (en) Multiple modality mammography image gallery and clipping system
US7388974B2 (en) Medical image processing apparatus
JP5416335B2 (en) Real-time interactive data analysis management tool
US20050111733A1 (en) Automated digitized film slicing and registration tool
US20050114140A1 (en) Method and apparatus for contextual voice cues
US20130093781A1 (en) Examination information display device and method
AU2004266022B2 (en) Computer-aided decision support systems and methods
US20090054755A1 (en) Medical imaging system
US20120183188A1 (en) Medical image display apparatus, method, and program
US9014485B2 (en) Image reporting method
US20100114597A1 (en) Method and system for medical imaging reporting
US8254649B2 (en) Medical image observation system
US20110311026A1 (en) Mobile radiography imaging apparatus using prior related images before current image exposure and methods for same
DE102005004383B4 (en) Method and device for controlling an imaging modality
WO2009073185A1 (en) Systems and methods for efficient imaging
EP1878239A2 (en) Method and apparatus for automated quality assurance in medical imaging

Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20091113

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091113

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20091113

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20101111

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111226

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120110

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120404

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130326

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130620

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20140311

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20140404

R150 Certificate of patent or registration of utility model

Ref document number: 5519937

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250