EP3028197B1 - Reporting tool with integrated lesion stager

Info

Publication number
EP3028197B1
EP3028197B1 (application EP14777799.9A)
Authority
EP
European Patent Office
Prior art keywords
variable
value
user input
computed output
output value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP14777799.9A
Other languages
German (de)
French (fr)
Other versions
EP3028197A1 (en)
Inventor
Merlijn Sevenster
Paul Joseph Chang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Publication of EP3028197A1
Application granted
Publication of EP3028197B1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 - Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 - Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 15/00 - ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10072 - Tomographic images
    • G06T 2207/10088 - Magnetic resonance imaging [MRI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30004 - Biomedical image processing
    • G06T 2207/30096 - Tumor; Lesion

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Molecular Biology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Description

  • A medical image may be generated to determine whether a patient has a medical issue. For example, a magnetic resonance imaging (MRI) scan may be used to determine whether a patient has a mass. In another example, an X-ray may be used to determine whether a patient has a lesion, a fracture, etc. When a physician or technician views the medical image, varying interpretations may be made from a common image. That is, a report may be made by the physician or technician, but the report may include different characteristics of the medical issue. Ideally, if two radiologists interpret the same imaging study, substantially identical reports should result. However, for a variety of reasons, this is not always the case. Consumers of reports may become confused if radiologists use different terminology for the same finding, which may lead to suboptimal care or inefficiency, such as when an incorrect exam is ordered.
  • One of the root causes of inter-radiologist variability is that there are too many reporting guidelines and staging schemes to memorize. A reporting guideline may be a hierarchically related set of instructions that indicates to the user the information that is recommended to be included given the data already obtained. For example, if a user indicates that an imaging study includes a mass, the dimensions of the mass should be included. Physicians/technicians who create reports may come to a common conclusion but provide different interpretations, such as identifying a mass but describing the mass with different dimensions. The physician/technician may also be unfamiliar with the reporting guidelines/staging schemes that indicate the information that should be included, resulting in such information being omitted.
  • With regard to imaging studies, additional schemes or features may be recommended to be used or included. A classification scheme may be utilized in which the classification scheme provides a set of rules that classify a finding in terms of a given, finite number of classes. For example, a grading system for tumors is a typical classification scheme. In another example, a nomogram may be utilized in which the nomogram is a function that returns a value based on input values of various parameters. For example, the nomogram may generate a value for a body mass index (BMI) in which a person's height and weight are the input values.
  • WO 2012/123829 A1 discloses systems and methods for intelligently combining medical findings received across different modalities. The system comprises an extraction module extracting contextual information from an image of an area of interest including annotations, a feature selection module building a current feature vector using the extracted contextual information and the annotations, and a referencing engine computing a similarity score between the current feature vector and a prior feature vector of a prior image. The method comprises extracting contextual information from an image of an area of interest including annotations, building a current feature vector using the extracted contextual information and the annotations, and computing a similarity score between the current feature vector and a prior feature vector of a prior image.
  • Accordingly, it is desirable to reduce inter-radiologist variability when reviewing an imaging study and also integrate the various schemes and nomograms. Thus, there is a need for a reporting tool that reduces inter-radiologist variability and incorporates the various schemes and features.
  • The invention provides a computer-implemented method, a reporting tool and a non-transitory computer readable storage medium as defined in the independent claims. Preferred embodiments of the invention are defined in the dependent claims.
    • Fig. 1 shows a system for a reporting tool of an imaging study according to the exemplary embodiments.
    • Fig. 2 shows a device incorporating the reporting tool of Fig. 1 according to the exemplary embodiments.
    • Fig. 3 shows a method of generating data from a reporting tool of an imaging study according to the exemplary embodiments.
  • The exemplary embodiments may be further understood with reference to the following description of the exemplary embodiments and the related appended drawings, wherein like elements are provided with the same reference numerals. The exemplary embodiments are related to a system and method of providing a reporting tool that receives a user input and variable values thereof to generate a computed output value. For example, the user input may relate to a lesion and the variable values may relate to dimensions and a margin of the lesion such that the computed output value is a Breast Imaging-Reporting and Data System (BI-RADS) score. The reporting tool, the user input, the variable values, the computed output value, and a related method will be explained in further detail below.
  • Initially, it is noted that, for concreteness, several assumptions may be made regarding the manner in which the computed output value is generated. For example, a domain with a finite number of variable values may be assumed. That is, the variable values associated with the user input may be defined with a predetermined number such that the variable values relate to selected parameters of the user input. Specifically, with regard to a lesion, each variable is a parameter from a given set such as "lesion size" or "calcification", and the values thereof are real values, such as 2.0, or objects from a normalized list such as microcalcification, calcification, large calcification, none, etc. In another example, each variable value is assigned a default value of unknown until it is provided by the user. Accordingly, the manner of generating the computed output value integrates one classification scheme or nomogram, and generation of the computed output value is treated as a mathematical function that accepts the user input having at least one or potentially all of the predetermined variable values. However, it should be noted that the exemplary embodiments may also relate to a process in which the above noted assumptions are not made. For example, the variable values may not have a predetermined number, and other variable values that would otherwise not be utilized may be provided by the user or determined.
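The assumptions above can be pictured as a fixed set of variable parameters whose values start out as an unknown placeholder and are filled in from the user input, after which a single function maps the completed set to the computed output value. The Python sketch below is purely illustrative: the parameter names, the "?" placeholder, and the classification rule are assumptions made here and are not taken from the patent.

```python
# Illustrative sketch only: a finite, predetermined set of variable parameters
# with a default "unknown" value, filled in from user input and passed to one
# classification function. Parameter names and the rule are hypothetical.
UNKNOWN = "?"

variable_values = {"lesion_size": UNKNOWN, "calcification": UNKNOWN, "margin": UNKNOWN}

def compute_output(values: dict) -> str:
    """Generation of the output is treated as a function over the variable values."""
    if UNKNOWN in values.values():
        raise ValueError("at least one required variable value is still unknown")
    # Placeholder rule standing in for a real classification scheme or nomogram.
    return "class A" if values["lesion_size"] < 2.0 else "class B"

variable_values.update({"lesion_size": 2.0, "calcification": "none", "margin": "smooth"})
print(compute_output(variable_values))  # -> class B
```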
  • Fig. 1 shows a system for a reporting tool 100 of an imaging study according to the exemplary embodiments. The reporting tool 100 is configured to receive a user input such that a predetermined set of variable values is defined as a function of the user input. The reporting tool 100 subsequently receives select variable values. The reporting tool 100 determines whether the select variable values are satisfactory to generate a computed output value. If at least one further variable value is required, a request is made such that the further variable value is received. When sufficient variable values are received, the reporting tool 100 generates the computed output value. The reporting tool 100 includes a context definition engine 105, a variable value extraction engine 110, a guideline engine 115, a computation engine 120, and an integration engine 125.
  • The context definition engine 105 receives a user input. The user input may take a variety of forms, such as an image. In a specific exemplary embodiment, the image is an MRI, an X-ray, etc. The image is an initial user input provided by the user. A variety of associated information may further be included within this initial user input. That is, additional user input is provided with the initial user input. In a first example, there may be image annotations. Accordingly, the annotations in the image may be considered by the context definition engine 105, such as a dimensional or angular measurement, an arrow, a box, a graphical item attached to the image, etc. In a second example, free text may be included in the user input. Accordingly, text that may be dictated or typed by the user is considered by the context definition engine 105. In a third example, structured content may be included in the user input. Accordingly, an object of structured content is considered by the context definition engine 105, such as an XML object. The context definition engine 105 is alerted that an XML object should be taken as context via a standard user interface means (e.g., a button). Therefore, the context definition engine 105 extracts the information that is included in the user input.
  • It should be noted that prior to the context definition engine 105 receiving the user input, the user may manually enter the further information in the user input using the above noted examples. Accordingly, the user may initially view the user input such as an image (e.g., MRI) and make measurements or perform other forms of information gathering. The user may then include the information in the user input prior to the context definition engine 105 receiving the user input.
  • The context definition engine 105 also restricts the scope of the available information associated with a use of the reporting tool 100. In a first example, the reporting tool 100 relates to a particular functionality. Accordingly, the context definition engine 105 limits the information from the user input to the parameters defined therein. In a specific exemplary embodiment, if the user input includes image annotations, select ones of the annotations are considered if they pertain to the functionality while others are omitted. In a second example, the reporting tool 100 determines the functionality associated with the user input upon receiving it. Accordingly, the context definition engine 105 receives the user input and subsequently defines the parameters of the information associated therewith in an ad hoc manner. Therefore, the context definition engine 105 restricts the information to pertinent data that is used subsequently as a function of the reporting tool 100 or the user input.
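To make the context definition step concrete, the user input can be modeled as an image reference accompanied by annotations, free text, and structured content, with the scope restriction implemented as a filter that keeps only the annotation types relevant to the tool's configured functionality. This is a hypothetical sketch; the field names, annotation types, and filter criterion are assumptions for illustration only.

```python
# Hypothetical model of a user input and of the context-restriction step.
from dataclasses import dataclass, field

@dataclass
class UserInput:
    image_id: str                                    # e.g. an MRI or X-ray reference
    annotations: list = field(default_factory=list)  # measurements, arrows, boxes, ...
    free_text: str = ""                              # dictated or typed text
    structured_content: str = ""                     # e.g. an XML object as a string

def restrict_to_context(user_input: UserInput, relevant_types: set) -> list:
    """Keep only the annotations that pertain to the configured functionality."""
    return [a for a in user_input.annotations if a.get("type") in relevant_types]

ui = UserInput(
    image_id="breast_mri_001",
    annotations=[{"type": "measurement", "value": "2 cm x 3 cm"},
                 {"type": "arrow", "value": None}],
)
print(restrict_to_context(ui, {"measurement"}))  # only the measurement annotation is kept
```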
  • The variable value extraction engine 110 generates a mapping from variable parameters to their respective values. Initially, it is noted that, depending on the context, select variable parameters may be considered, of which further select ones have a higher relevance or priority than others. The variable value extraction engine 110 extracts the values received by the context definition engine 105 and associates these values with the appropriate variable parameters. Once all values from the context definition engine are associated with their variable parameters, the variable value extraction engine 110 uses a default value (e.g., "?") for any variable parameter for which no value is obtained. It is noted that an actual real number is not used as the default because a zero (0) value may itself be an actual value to associate with a particular variable parameter.
  • As discussed above, the context definition engine 105 receives a user input having a variety of information associated therewith, such as an image annotation, free text, and structured content. With particular regard to these examples, the variable value extraction engine 110 associates the values included in this user input information with the appropriate variable parameters, thereby extracting a variable-value vector from the content. In the first example of image annotations, imaging features are extracted such as the length or surface of a measurement, the angle between two lines, the surface of a box, etc. The variable value extraction engine 110 directly extracts these values from the user input. In another exemplary embodiment, the variable value extraction engine 110 automatically extracts additional features from the user input when they are not specifically denoted in the information included therein. For example, a contour analysis is extracted. In a further exemplary embodiment, the variable value extraction engine 110 receives a particular annotation that requires further input. For example, if the user draws a box via a user interface in the image received by the context definition engine 105, the variable value extraction engine 110 requests additional information such as the particular variable parameter the box defines (e.g., tumor, cyst, hemorrhage, etc.). In the second example of free text, the variable value extraction engine 110 includes a natural language processing engine that automatically extracts values from the selected text. In a specific example, the selected text is normalized against a background vocabulary such as SNOMED CT or RadLex. The natural language processing engine further uses a negation detection application or another application that interconnects extracted findings to increase the granularity of the information extraction. In the third example of structured content, since the content is structured, a query in the content's native language is used for the information extraction.
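A simplified version of this extraction step is sketched below: a dimension value is pulled from free text with a pattern match, and values are read out of structured XML content with a query. The regular expression, the XML element names, and the parameter names are assumptions for illustration; as described above, a production system would rely on natural language processing normalized against a vocabulary such as SNOMED CT or RadLex.

```python
# Illustrative extraction of variable-value pairs from free text and from
# structured (XML) content. The pattern and element names are hypothetical.
import re
import xml.etree.ElementTree as ET

def extract_from_free_text(text: str) -> dict:
    pairs = {}
    match = re.search(r"(\d+(?:\.\d+)?\s*cm\s*x\s*\d+(?:\.\d+)?\s*cm)", text, re.IGNORECASE)
    if match:
        pairs["dimensions"] = match.group(1)
    if "spiculated" in text.lower():
        pairs["margin"] = "spiculated"
    return pairs

def extract_from_structured(xml_string: str) -> dict:
    root = ET.fromstring(xml_string)
    return {child.tag: child.text for child in root}

print(extract_from_free_text("Lesion measuring 2 cm x 3 cm in the upper outer quadrant."))
print(extract_from_structured("<finding><dimensions>2 cm x 3 cm</dimensions></finding>"))
```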
  • The guideline engine 115 is used to determine whether further information is required. That is, the guideline engine 115 fills gaps in the information extraction that has been performed. For example, when three (3) parameter values are required to determine a computed output value but only two (2) parameter values are received or extracted, the guideline engine 115 requests that the third parameter value be entered. Accordingly, the guideline engine 115 is configured with predetermined settings or rules to verify whether the at least one value and associated variable parameter are satisfactory to generate the computed output value.
  • The guideline engine 115 receives the extracted values and the associated variable parameters from the variable value extraction engine 110. In a specific exemplary embodiment, the guideline engine 115 returns a Boolean value indicating whether the extracted values and the associated variable parameters are satisfactory to generate the computed output value. If the guideline engine 115 returns a "yes" or "satisfactory" value, the variable value extraction engine 110 has all the necessary information. Conversely, if the guideline engine 115 returns a "no" or "unsatisfactory" value, the variable value extraction engine 110 still requires at least one further value to associate with a variable parameter. The guideline engine 115 further indicates which of the at least one further value is required. Thus, upon receiving this information, the variable value extraction engine 110 requests the user for the at least one further value.
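The completeness check performed by the guideline engine can be pictured as comparing the extracted variable-value pairs against a rule listing the parameters required for a given computed output, and returning a Boolean result together with the parameters that are still missing. The rule table and names below are hypothetical placeholders for the predetermined settings/rules described above.

```python
# Hypothetical guideline check: Boolean result plus the missing variable
# parameters that must still be requested from the user.
REQUIRED_FOR_OUTPUT = {"bi_rads_score": {"dimensions", "margin"}}  # assumed rule set

def check_guidelines(pairs: dict, output_name: str):
    required = REQUIRED_FOR_OUTPUT[output_name]
    missing = {p for p in required if pairs.get(p) in (None, "?")}
    return len(missing) == 0, missing

satisfactory, missing = check_guidelines({"dimensions": "2 cm x 3 cm"}, "bi_rads_score")
print(satisfactory, missing)  # False {'margin'} -> request the margin value from the user
```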
  • The guideline engine 115 is further configured with a Help-functionality. Specifically, the Help-functionality relates to the predetermined settings or rules that indicate whether values and variable parameters result in a "satisfactory" response. The user may utilize a user interface to access the Help-functionality. In a first exemplary embodiment, the guideline engine 115 determines the relevant predetermined settings/rules that the user may be requesting as a function of the user input and/or the extracted values. In a second exemplary embodiment, the guideline engine 115 provides a menu for the user to select the relevant predetermined settings/rules. Subsequently, the user may view the predetermined settings/rules to aid in determining whether the extracted values for the associated variable parameter will result in a "satisfactory" response.
  • The computation engine 120 generates the computed output value as a function of the variable values and their associated variable parameters. The computation engine 120 receives the variable values and the associated variable parameters to generate the computed output value. The computed output value may be a class or value based on algebraic computations or statistical estimations preprogrammed into the computation engine 120. The computation engine 120 may generate the computed output value in a variety of manners. For example, the computed output value may be an XML object that indicates the formula being used, the value, the units thereof, etc. In another example, the computation engine 120 generates the computed output value as free text or as an annotation.
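The XML form of the computed output value mentioned above can be illustrated with a small helper that records the value together with the formula used and its units. The element and attribute names are assumptions; the classification scheme or nomogram that produces the value is taken as preprogrammed and is not shown.

```python
# Illustrative packaging of a computed output value as an XML object that
# records the formula used, the value, and its units. Names are hypothetical.
import xml.etree.ElementTree as ET

def package_output(name: str, value, formula: str, units: str = "") -> str:
    element = ET.Element("computedOutput", {"name": name, "formula": formula, "units": units})
    element.text = str(value)
    return ET.tostring(element, encoding="unicode")

print(package_output("bi_rads_score", 2, formula="BI-RADS assessment"))
# <computedOutput name="bi_rads_score" formula="BI-RADS assessment" units="">2</computedOutput>
```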
  • The integration engine 125 integrates the computed output value into a report that is created for the user input. The integration engine 125 receives the computed output value, such as an XML object, and integrates its semantics into a workflow. This may be done in a variety of manners. In a first example, the computed output value is returned to the user such that the user is able to view the outcome. In a second example, the computed output value is automatically inserted into the report, such as the XML object being converted to natural language and inserted into a suitable area in the report. In a third example, the computed output value is attached as metadata.
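As a rough illustration of the second integration option, the XML object can be converted to a natural-language sentence and inserted into a designated section of the report. The sentence template, the section marker, and the XML layout are assumptions made only for this sketch.

```python
# Hypothetical conversion of a computed-output XML object into report text
# inserted under an assumed "IMPRESSION:" section heading.
import xml.etree.ElementTree as ET

def insert_into_report(report: str, output_xml: str, section: str = "IMPRESSION:") -> str:
    element = ET.fromstring(output_xml)
    sentence = f"Computed {element.get('name').replace('_', ' ')}: {element.text}."
    return report.replace(section, f"{section}\n{sentence}", 1)

report = "FINDINGS:\nLesion measuring 2 cm x 3 cm with a spiculated margin.\n\nIMPRESSION:\n"
xml_obj = '<computedOutput name="bi_rads_score" formula="BI-RADS assessment">2</computedOutput>'
print(insert_into_report(report, xml_obj))
```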
  • It should be noted that the reporting tool 100 may be configured to determine further output data. For example, the computation engine 120 may also generate follow-up or treatment data that is derived from the computed output value, the extracted values with the associated variable parameter, additional received values, etc. The computation engine 120 may also generate this further output data in the variety of manners discussed above.
  • It should also be noted that the reporting tool 100 may be a processor configured to perform the above described manner of generating the computed output value. Fig. 2 shows a device 200 incorporating the reporting tool 100 of Fig. 1 according to the exemplary embodiments. The device 200 includes a processor 205 that incorporates the reporting tool 100, including the context definition engine 105, the variable value extraction engine 110, the guideline engine 115, the computation engine 120, and the integration engine 125. The device 200 also includes a memory arrangement storing the data related to the reporting tool 100. For example, the user input that is received by the context definition engine 105 is stored thereon, the extracted values of the variable value extraction engine 110 are stored thereon, the settings/rules of the guideline engine 115 are stored thereon, etc.
  • According to the exemplary embodiments, the reporting tool 100 generates at least one computed output value as a function of the values extracted from a user input and/or user input values entered by the user. The reporting tool 100 may relate to a variety of different types of context. In a specific example, the reporting tool 100 is used in relation to a breast MRI. As a preliminary step, a radiologist views the breast MRI. The radiologist may determine that there is a lesion, and measurements of the lesion may be made by the user. The user includes the measurements in the breast MRI as an annotation, using free text, as structured content, etc. Subsequently, the reporting tool 100 is utilized such that the context definition engine 105 receives the breast MRI with the measurement data of the lesion. The context definition engine 105 determines that the received user input is an image (e.g., breast MRI) and further determines the relevant information to be considered. The variable value extraction engine 110 extracts the dimensions of the lesion (e.g., 2 cm x 3 cm) from the included information (e.g., annotation, free text, structured content, etc.). The variable value extraction engine 110 also associates the measurement data with the variable parameter of dimensions. Accordingly, a variable-value pair <dimension value = "2 cm x 3 cm"/> may be added to the information vector of the user input. The guideline engine 115 receives the variable-value pair and determines that the extracted information does not include a value for the margin of the lesion. Accordingly, the guideline engine 115 returns an "unsatisfactory" result to the variable value extraction engine 110 indicating that the margin value is missing. The variable value extraction engine 110 requests this missing information from the user. The user enters a further user input value using any of the aforementioned manners and specifies the margin value, such as "spiculated." The variable value extraction engine 110 associates this value with the appropriate variable parameter such that a variable-value pair <margin = "spiculated"/> is also added to the information vector of the user input. This process continues until the guideline engine 115 returns a result that the information vector is "satisfactory" (i.e., sufficiently complete for computations). Once this result is received by the variable value extraction engine 110, the computation engine 120 receives the information vector including the variable values and associated variable parameters. Subsequently, the computation engine 120 determines the computed output value, such as a BI-RADS score of the lesion. For example, given the dimensions and margin of the lesion, the computation engine 120 may determine a BI-RADS score of two (2). The computation engine 120 may further determine a further output value such as a recommended treatment. For example, the computation engine 120 may determine that a yearly follow-up is recommended. In another example, the computation engine 120 includes other output values such as a background information section (e.g., a URL). Finally, the integration engine 125 receives the computed output value and the further output data from the computation engine 120 such that structured content such as an XML object is generated and included in a report for the breast MRI. The integration engine 125 further incorporates the variable values and the associated variable parameters in the report as an XML object.
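The walkthrough above can be condensed into the following self-contained sketch: the dimensions are extracted first, the guideline check flags the missing margin, the user supplies "spiculated", and a score of two with a yearly follow-up recommendation is produced. Every function body here is a simplified stand-in chosen to mirror the example values in the text; none of it reproduces actual BI-RADS logic.

```python
# Self-contained, simplified walkthrough of the breast MRI example.
REQUIRED = {"dimensions", "margin"}

def guideline_check(pairs: dict) -> set:
    """Return the still-missing variable parameters ("unsatisfactory" if non-empty)."""
    return {p for p in REQUIRED if p not in pairs}

def ask_user(parameter: str) -> str:
    # Stand-in for prompting the user; the real tool would request input interactively.
    return {"margin": "spiculated"}[parameter]

def compute(pairs: dict) -> dict:
    # Placeholder outputs chosen to match the example in the text, not real BI-RADS rules.
    return {"bi_rads_score": 2, "recommendation": "yearly follow-up"}

pairs = {"dimensions": "2 cm x 3 cm"}  # extracted from the annotated breast MRI
while True:
    missing = guideline_check(pairs)
    if not missing:                    # "satisfactory": sufficiently complete
        break
    for parameter in missing:
        pairs[parameter] = ask_user(parameter)
print(compute(pairs))  # {'bi_rads_score': 2, 'recommendation': 'yearly follow-up'}
```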
  • Fig. 3 shows a method 300 of generating data from a reporting tool of an imaging study according to the exemplary embodiments. Specifically, the method 300 relates to receiving a user input that includes information and requesting/receiving further necessary information such that a computed output value is determined as a function thereof. Accordingly, prior to the method 300, the user may include the information in the user input. The method 300 will be discussed with reference to the reporting tool 100 of Fig. 1.
  • In step 305, the context definition engine 105 receives the user input. As discussed above, the user input may be of a variety of types, such as an image. Thus, the reporting tool 100 may relate to an imaging study. In step 310, the context definition engine 105 determines the context of the user input. As discussed above, the context definition engine 105 may automatically determine the context to which the user input relates, thereby restricting the relevant information that is to be considered. In another example, the context definition engine 105 may be predetermined for a particular context and have predetermined settings to restrict the relevant information. In yet another example, the context definition engine 105 may be manually set by the user for the particular context and may further be manually set to restrict the relevant information.
  • In step 315, the variable value extraction engine 110 receives and/or extracts the variable values associated with the user input. As discussed above, the user input may include information such as annotations, free text, structured content, etc. The variable value extraction engine 110 extracts the variable values therein. The variable value extraction engine 110 also receives variable values that may be entered by the user. Thus, in step 320, the variable value extraction engine 110 determines the variable parameter with which each variable value is to be associated. For example, if a variable value is "2 cm x 3 cm," the variable value extraction engine 110 pairs this variable value with the variable parameter of "dimensions."
  • In step 325, a determination is made by the guideline engine 115 whether a further variable value or variable parameter is required. Specifically, the guideline engine 115 is programmed with predetermined settings and/or rules to make this determination. In the above example of dimensions, the guideline engine 115 determines that a variable value for margin is also required. Thus, the method 300 continues to step 330 where the variable value extraction engine 110 requests the user to provide this variable value. The user enters the margin variable value as "spiculated." This process continues until the guideline engine 115 indicates that no further variable value is required. The method 300 then continues to step 335 where the computation engine 120 receives the variable-value pairs (i.e., variable values and associated variable parameter) in order to determine a computed output value as a function thereof.
  • It should be noted that the steps described above for the method 300 are only exemplary. The method 300 may include additional steps. For example, after step 335, the method 300 may include a step of integrating the computed output value in a report. Specifically, the integration engine 125 may integrate the computed output value as an XML object in the report at a predetermined or selected area. In another example, if step 325 determines that further variable values are required, the method 300 may include a step of returning a Boolean result and indicating the further variable value that is required. In yet another example, after step 335, the method 300 may include another step of determining further output values such as treatment recommendations and background information. Thus, utilizing the above noted step of including the computed output value in the report, this step may further include this value/information in the report. In a further example, the method 300 may include a step of receiving a user entry indicating the manner in which the report is to be generated. For example, the user may indicate that the computed output value is to be included in a specific format (e.g., annotation, free text, etc.). In yet another further example, the method 300 may include a step of assigning a default unknown value to variable parameters that are not associated with a variable value. Thus, the guideline engine 115 may be aware of select variable parameters that are necessary for the computed output value to be generated.
  • The exemplary embodiments provide a system and method to generate a computed output value from variable values and associated variable parameters that are extracted and/or received from a user and/or user input. The variable values are extracted from information included in the user input. The extracted variable values are also selected for extraction as a function of the context in which the user input relates. The computed output value may ultimately be integrated into a report that is created for the user input.
  • Those skilled in the art will understand that the above-described exemplary embodiments may be implemented in any suitable software or hardware configuration or combination thereof. An exemplary hardware platform for implementing the exemplary embodiments may include, for example, an Intel x86 based platform with a compatible operating system, a Mac platform with Mac OS, etc. In a further example, the exemplary embodiments of the reporting tool may be embodied as a program containing lines of code stored on a non-transitory computer readable storage medium that, when compiled, may be executed on a processor.

Claims (12)

  1. A computer-implemented method for computing an assessment of a medical issue of a patient, comprising:
    receiving (305) a user input, the user input comprising medical image data of the patient and at least one variable value, the variable value being indicative of a variable parameter of the medical image data related to a medical issue of the patient;
    associating (320) each of the at least one variable values with a corresponding variable parameter to generate at least one variable-value pair;
    determining (325) whether at least one further variable-value pair related to a medical issue of the patient is required to determine a computed output value indicative of an assessment of a medical issue of the patient as a function of select ones of the at least one variable-value pair,
    wherein the determination (325) for at least one further variable-value pair is a function of at least one of predetermined settings and predetermined rules;
    when at least one further variable-value pair is required,
    - determining (325) the at least one further variable-value pair necessary to determine the computed output value,
    - requesting the at least one further variable-value pair, and
    - receiving (330) the at least one further variable-value pair prior to determining the computed output value; and
    determining (335) the computed output value as a function of the at least one further variable-value pair and select ones of the at least one generated variable-value pair.
  2. The method of claim 1, further comprising:
    determining (310) a context of the user input; and
    extracting (315) select ones of the at least one variable value from the user input as a function of the context.
  3. The method of claim 1, further comprising:
    integrating the computed output value in a report for the user input.
  4. The method of claim 1, wherein the at least one variable value is included in the user input as at least one of an annotation, a free text, and a structured content.
  5. The method of claim 1, further comprising:
    determining (335) at least one further computed output value as a function of the computed output value.
  6. The method of claim 1, wherein the medical image data is a magnetic resonance imaging (MRI) image, the at least one variable value is a width and a length of a lesion, the variable parameter is dimensions of the lesion, and the computed output value is a Breast Imaging-Reporting and Data System (BI-RADS) score.
  7. A reporting tool (100) for computing an assessment of a medical issue of a patient, comprising:
    a processor (205) configured to receive a user input, the user input comprising medical image data of the patient and at least one variable value, the variable value being indicative of a variable parameter of the medical image data related to a medical issue of the patient; and
    a memory arrangement (210) configured to store the user input, the at least one variable value, and the variable parameter,
    wherein the processor (205) is configured to
    associate (320) each of the at least one variable values with a corresponding variable parameter to generate at least one variable-value pair;
    determine (325) whether at least one further variable-value pair related to a medical issue of the patient is required to determine a computed output value indicative of an assessment of a medical issue of the patient as a function of select ones of the at least one variable-value pair,
    wherein the determination (325) for at least one further variable-value pair is a function of at least one of predetermined settings and predetermined rules;
    when at least one further variable-value pair is required,
    - determine (325) the at least one further variable-value pair necessary to determine the computed output value,
    - request the at least one further variable-value pair, and
    - receive (330) the at least one further variable-value pair prior to determining the computed output value; and
    determine a computed output value as a function of the at least one further variable-value pair and select ones of the at least one generated variable-value pair.
  8. The reporting tool (100) of claim 7, wherein the processor (205) is configured to determine a context of the user input and extract (315) select ones of the at least one variable value from the user input as a function of the context.
  9. The reporting tool (100) of claim 7, wherein the processor (205) is configured to integrate the computed output value in a report for the user input.
  10. The reporting tool (100) of claim 7, wherein the at least one variable value is included in the user input as at least one of an annotation, a free text, and a structured content.
  11. The reporting tool (100) of claim 7, wherein the processor (205) is configured to determine at least one further computed output value as a function of the computed output value.
  12. A non-transitory computer readable storage medium (210) storing a set of instructions that are executable by a processor (205), the instructions causing the processor to perform the steps of the method of claim 1.
EP14777799.9A 2013-07-29 2014-07-28 Reporting tool with integrated lesion stager Active EP3028197B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361859423P 2013-07-29 2013-07-29
PCT/IB2014/063472 WO2015015393A1 (en) 2013-07-29 2014-07-28 Reporting tool with integrated lesion stager

Publications (2)

Publication Number Publication Date
EP3028197A1 EP3028197A1 (en) 2016-06-08
EP3028197B1 true EP3028197B1 (en) 2019-04-17

Family

ID=51655784

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14777799.9A Active EP3028197B1 (en) 2013-07-29 2014-07-28 Reporting tool with integrated lesion stager

Country Status (5)

Country Link
US (2) US9875538B2 (en)
EP (1) EP3028197B1 (en)
JP (1) JP6606074B2 (en)
CN (1) CN105683973B (en)
WO (1) WO2015015393A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11321372B2 (en) * 2017-01-03 2022-05-03 The Johns Hopkins University Method and system for a natural language processing using data streaming

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5836877A (en) 1997-02-24 1998-11-17 Lucid Inc System for facilitating pathological examination of a lesion in tissue
US7418119B2 (en) * 2002-10-31 2008-08-26 Siemens Computer Aided Diagnosis Ltd. Display for computer-aided evaluation of medical images and for establishing clinical recommendation therefrom
US8014576B2 (en) 2005-11-23 2011-09-06 The Medipattern Corporation Method and system of computer-aided quantitative and qualitative analysis of medical images
KR100763526B1 (en) * 2005-12-12 2007-10-04 한국전자통신연구원 Device and method for management of application context
US7664786B2 (en) * 2005-12-12 2010-02-16 Electronics And Telecommunications Research Institute Apparatus and method for managing application context
US8195594B1 (en) 2008-02-29 2012-06-05 Bryce thomas Methods and systems for generating medical reports
US9364194B2 (en) * 2008-09-18 2016-06-14 General Electric Company Systems and methods for detecting regions of altered stiffness
CN102197413B (en) * 2008-10-29 2017-03-22 皇家飞利浦电子股份有限公司 Analyzing an at least three-dimensional medical image
US20100158332A1 (en) 2008-12-22 2010-06-24 Dan Rico Method and system of automated detection of lesions in medical images
KR101667428B1 (en) 2009-08-25 2016-10-18 한국전자통신연구원 Preamble generation method and apparatus of station, data frmae generation method
JP5486364B2 (en) 2009-09-17 2014-05-07 富士フイルム株式会社 Interpretation report creation apparatus, method and program
JP5398518B2 (en) * 2009-12-25 2014-01-29 キヤノン株式会社 Medical diagnosis support device
US9378331B2 (en) * 2010-11-19 2016-06-28 D.R. Systems, Inc. Annotation and assessment of images
RU2604698C2 (en) 2011-03-16 2016-12-10 Конинклейке Филипс Н.В. Method and system for intelligent linking of medical data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US20160148374A1 (en) 2016-05-26
WO2015015393A1 (en) 2015-02-05
US20180144469A1 (en) 2018-05-24
US9875538B2 (en) 2018-01-23
US10650514B2 (en) 2020-05-12
CN105683973A (en) 2016-06-15
JP2016531661A (en) 2016-10-13
EP3028197A1 (en) 2016-06-08
JP6606074B2 (en) 2019-11-13
CN105683973B (en) 2019-06-25

Similar Documents

Publication Publication Date Title
CN108475538B (en) Structured discovery objects for integrating third party applications in an image interpretation workflow
JP6749835B2 (en) Context-sensitive medical data entry system
KR101878217B1 (en) Method, apparatus and computer program for medical data
RU2604698C2 (en) Method and system for intelligent linking of medical data
US20190108175A1 (en) Automated contextual determination of icd code relevance for ranking and efficient consumption
US10803980B2 (en) Method, apparatus, and computer program product for preparing a medical report
US10235360B2 (en) Generation of pictorial reporting diagrams of lesions in anatomical structures
US20240266028A1 (en) Device, system, and method for determining a reading environment by synthesizing downstream needs
US8515213B2 (en) System, method and computer instructions for aiding image analysis
WO2015031296A1 (en) System and method for implementing clinical decision support for medical imaging analysis
US10650514B2 (en) Reporting tool with integrated lesion stager
US20230420096A1 (en) Document creation apparatus, document creation method, and document creation program
EP3489962A1 (en) Method for controlling an evaluation device for medical images of patient, evaluation device, computer program and electronically readable storage medium
CN109906487A (en) The system and method that structuring Finding Object (SFO) for carrying out workflow sensitivity for clinical care continuity is recommended
US10956411B2 (en) Document management system for a medical task
US20210357634A1 (en) Methods and systems for processing documents with task-specific highlighting
RU2740219C2 (en) Context-sensitive medical guidance engine
US20240021320A1 (en) Worklist prioritization using non-patient data for urgency estimation
CN114091861A (en) Film reading task allocation method and device, readable storage medium and electronic equipment
CN114091862A (en) Method and device for determining reading result, readable storage medium and electronic equipment
WO2022192893A1 (en) Artificial intelligence system and method for generating medical impressions from text-based medical reports
Samuelson et al. Analyzing ROC curves using the effective set-size model

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160229

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20170626

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20181106

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014044954

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1122354

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190515

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20190417

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190717

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190817

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190717

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190718

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1122354

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190417

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190817

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014044954

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

RAP2 Party data changed (patent owner data changed or rights of a patent transferred)

Owner name: KONINKLIJKE PHILIPS N.V.

26N No opposition filed

Effective date: 20200120

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190731

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190731

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190728

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190728

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20140728

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190417

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240730

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240724

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240725

Year of fee payment: 11