WO2022220383A1 - Method and system for measuring the size change of a target lesion in an X-ray image - Google Patents
- Publication number
- WO2022220383A1 (PCT application PCT/KR2022/001892)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target lesion
- occupancy
- ray image
- size
- change
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
- G06V10/451—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- the present disclosure relates to a method and system for measuring a change in the size of a target lesion in an X-ray image, and more specifically, to a method and system that calculate the occupancy of the region corresponding to the target lesion within a reference region of the X-ray image and measure the size change of the target lesion based on the calculated occupancy.
- changes in a lesion visible in the image may constitute clinically important information.
- X-rays may be taken continuously to observe a change in the size of a pneumothorax after a procedure. If the size of the pneumothorax decreases, it means the treatment is going well; if the size of the pneumothorax increases, other measures may be required urgently.
- information on whether a lesion has remained the same size over time or has increased or decreased in size may be necessary for treating that lesion. Information on the size change of a lesion over time can therefore be clinically very important.
- CAD: Computer-Aided Detection
- existing approaches include a method that directly compares the absolute size of a lesion in two X-ray images, and a method that compares lesion sizes after aligning the two X-ray images at the pixel level (registration).
- with the first method, which compares the absolute size of the lesion in two X-ray images, the apparent size of the lesion in the image varies with factors such as the patient's posture and the distance to the imaging device, so the measured change in lesion size may be inaccurate.
- with the second method, the difference in lesion size can be predicted more accurately, provided pixel-level registration succeeds.
- however, pixel-level alignment itself is computationally expensive, and depending on the situation (for example, when the patient's condition differs greatly between the two X-ray images) or the state of the images (for example, their resolution or storage format), aligning the two X-ray images may be practically impossible.
- in addition, pixel-level alignment is algorithmically very complex, the computing power it requires makes commercialization difficult, and developing such an algorithm can incur enormous cost.
- the present disclosure provides a method and system for measuring a change in the size of a target lesion to solve the above problems.
- the present disclosure may be implemented in various ways, including as a method, an apparatus (system), a computer program, or a computer-readable storage medium storing instructions.
- a method of measuring a size change of a target lesion in an X-ray image includes receiving a first X-ray image including the target lesion and a second X-ray image including the target lesion, calculating the occupancy of a region corresponding to the target lesion within a reference region in each of the first X-ray image and the second X-ray image, and measuring a change in the size of the target lesion based on the calculated occupancy.
- the calculating includes: determining a first reference region and a second reference region from the first X-ray image and the second X-ray image, respectively; identifying the target lesion from each of the first X-ray image and the second X-ray image; calculating a first occupancy occupied by the identified target lesion in the first reference region; and calculating a second occupancy occupied by the identified target lesion in the second reference region.
- the determining of the first reference region and the second reference region includes inputting the first X-ray image to a reference region extraction model to output the first reference region, and inputting the second X-ray image to the reference region extraction model to output the second reference region, wherein the reference region extraction model is trained using a plurality of reference X-ray images and label information for reference regions.
- the second reference area corresponds to the first reference area.
- the calculating includes: calculating the first occupancy by dividing the number of pixels in the area occupied by the target lesion in the first reference region by the number of pixels in the first reference region; and calculating the second occupancy by dividing the number of pixels in the area occupied by the target lesion in the second reference region by the number of pixels in the second reference region.
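The pixel-count definition above can be sketched as follows. This is an illustrative NumPy implementation, not the patent's actual code; the masks, shapes, and the function name `occupancy` are invented for the example.

```python
import numpy as np

def occupancy(lesion_mask: np.ndarray, reference_mask: np.ndarray) -> float:
    """Occupancy = (# lesion pixels inside the reference region) / (# reference-region pixels)."""
    ref_pixels = int(reference_mask.sum())
    if ref_pixels == 0:
        raise ValueError("reference region is empty")
    lesion_in_ref = int(np.logical_and(lesion_mask, reference_mask).sum())
    return lesion_in_ref / ref_pixels

# Toy 4x4 masks: the reference region (e.g. a lung) covers 8 pixels,
# and the lesion covers 2 of them -> occupancy 2/8 = 0.25.
ref = np.zeros((4, 4), dtype=bool)
ref[:, :2] = True
lesion = np.zeros((4, 4), dtype=bool)
lesion[0, 0] = lesion[1, 0] = True
print(occupancy(lesion, ref))  # 0.25
```

The same ratio is computed once per X-ray image, giving the first and second occupancies that are later compared.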
- determining the first reference region and the second reference region includes determining a score for the first reference region and a score for the second reference region.
- identifying the target lesion includes determining a score for the target lesion within the first reference region and determining a score for the target lesion within the second reference region.
- the calculating includes calculating the first occupancy based on the score for the first reference region and the score for the target lesion within the first reference region, and calculating the second occupancy based on the score for the second reference region and the score for the target lesion within the second reference region.
- the measuring includes determining whether the size of the target lesion changes, based on the first occupancy and the second occupancy.
- the step of determining whether the size of the target lesion has changed based on the first occupancy and the second occupancy includes calculating an occupancy change amount for the target lesion based on the first occupancy and the second occupancy, and comparing the calculated occupancy change amount with a reference value to determine whether the size of the target lesion has changed.
- the reference value is determined based on a numerical value associated with a prediction accuracy or a numerical value associated with a target metric calculated for the test set.
- the reference value includes a first reference value and a second reference value. if the calculated occupancy change for the target lesion is greater than or equal to the first reference value, it is determined that the size of the target lesion has increased; if it is smaller than the first reference value but greater than or equal to the second reference value, it is determined that the size of the target lesion is unchanged; and if it is smaller than the second reference value, it is determined that the size of the target lesion has decreased.
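The two-threshold rule can be sketched as follows. The 0.02 / -0.02 reference values are illustrative placeholders (the document derives the actual values from prediction accuracy or a target metric on a test set), and the function name is hypothetical.

```python
def classify_size_change(occ_change: float,
                         first_ref: float = 0.02,
                         second_ref: float = -0.02) -> str:
    """Map an occupancy change (second occupancy minus first occupancy)
    to increased / unchanged / decreased using two reference values."""
    if occ_change >= first_ref:
        return "increased"
    if occ_change >= second_ref:  # below first_ref but at or above second_ref
        return "unchanged"
    return "decreased"

print(classify_size_change(0.05))   # increased
print(classify_size_change(0.0))    # unchanged
print(classify_size_change(-0.05))  # decreased
```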
- determining whether the size of the target lesion has changed based on the first occupancy and the second occupancy includes calculating an occupancy change amount for the target lesion based on the first occupancy and the second occupancy, inputting the calculated occupancy change amount into a change determination model, and determining whether the size of the target lesion has changed based on the output determination result, wherein the change determination model is a machine learning model trained to output a determination of whether the size of a reference target lesion has changed, given an input reference occupancy change amount.
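As a toy stand-in for such a change determination model, here is a nearest-centroid classifier over occupancy change values. The training data, labels, and resulting decision boundaries are all invented for illustration; the document does not specify the model architecture.

```python
import numpy as np

# Invented training data: reference occupancy changes with labels
# 0 = decreased, 1 = unchanged, 2 = increased.
X = np.array([-0.10, -0.06, -0.04, -0.01, 0.00, 0.01, 0.04, 0.07, 0.12])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])

# "Training": one centroid (mean occupancy change) per class.
centroids = np.array([X[y == c].mean() for c in (0, 1, 2)])

def predict_change(occ_change: float) -> str:
    """Return the class whose centroid is closest to the occupancy change."""
    labels = ("decreased", "unchanged", "increased")
    return labels[int(np.argmin(np.abs(centroids - occ_change)))]

print(predict_change(0.09))   # increased
print(predict_change(-0.08))  # decreased
```

A learned model like this replaces hand-tuned reference values with boundaries fitted to reference data.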
- the reference area is an area determined by dividing each entire area of the first X-ray image and the second X-ray image into a plurality of areas.
- the information processing system includes a memory storing one or more instructions and a processor configured, by executing the one or more stored instructions, to receive a first X-ray image including a target lesion and a second X-ray image including the target lesion, calculate the occupancy of the region corresponding to the target lesion within a reference region in each of the first X-ray image and the second X-ray image, and measure a change in the size of the target lesion based on the calculated occupancy.
- the change in the size of the lesion can be accurately measured even with a small amount of computation.
- the measurement of the size change of the target lesion may not be significantly affected by variables such as the patient's posture and the distance from the imaging device.
- even if the apparent sizes in the image vary, the ratio of the size of the target lesion to the size of the lung may be maintained. Therefore, a change in this ratio may indicate a change in the target lesion.
- a plurality of lesions included in the entire area (eg, lung, etc.) in the X-ray image may be detected separately, and the occupancy and/or change amount for each lesion may be calculated. Therefore, the size change for each lesion can be measured.
- FIG. 1 is a diagram illustrating an example in which an information processing system according to an embodiment of the present disclosure provides a measurement result of a size change of a target lesion.
- FIG. 2 is a block diagram illustrating an internal configuration of an information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart illustrating a method of measuring a size change of a target lesion in an X-ray image according to an embodiment of the present disclosure.
- FIG. 4 is a diagram illustrating an example in which the information processing system receives label information for learning a reference region extraction model through a user terminal according to an embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example in which the information processing system determines a reference region from an X-ray image and outputs it through a user terminal according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of calculating a first occupancy from a first X-ray image and calculating a second occupancy from a second X-ray image according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of generating a measurement result of a size change of a target lesion based on a first occupancy and a second occupancy according to an embodiment of the present disclosure.
- FIG. 8 is a diagram illustrating an example of measuring a size change of a target lesion in a chest X-ray image according to an embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating an example of determining the occupancy of a target lesion in a reference area and measuring a size change of the target lesion according to an embodiment of the present disclosure.
- FIG. 10 is an exemplary diagram illustrating an artificial neural network model according to an embodiment of the present disclosure.
- 'module' or 'unit' used in the specification means a software or hardware component, and 'module' or 'unit' performs certain roles.
- 'module' or 'unit' is not meant to be limited to software or hardware.
- a 'module' or 'unit' may be configured to reside on an addressable storage medium or may be configured to execute on one or more processors.
- a 'module' or 'unit' may include at least one of components such as software components, object-oriented software components, class components and task components, as well as processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, or variables.
- components and 'modules' or 'units' may be combined into a smaller number of components and 'modules' or 'units', or further separated into additional components and 'modules' or 'units'.
- a 'module' or a 'unit' may be implemented with a processor and a memory.
- 'Processor' should be construed broadly to include general purpose processors, central processing units (CPUs), microprocessors, digital signal processors (DSPs), controllers, microcontrollers, state machines, and the like.
- a 'processor' may refer to an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a field-programmable gate array (FPGA), or the like.
- ASIC: application-specific integrated circuit
- PLD: programmable logic device
- FPGA: field-programmable gate array
- 'processor' may also refer to a combination of processing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Also, 'memory' should be construed broadly to include any electronic component capable of storing electronic information.
- RAM: random access memory
- ROM: read-only memory
- NVRAM: non-volatile random access memory
- PROM: programmable read-only memory
- EPROM: erasable programmable read-only memory
- a memory is said to be in electronic communication with the processor if the processor is capable of reading information from and/or writing information to the memory.
- a memory integrated in the processor is in electronic communication with the processor.
- an 'X-ray image' may refer to any image captured by examination equipment that transmits X-rays through at least a part of the human body.
- the examination equipment may include not only a general X-ray machine but also a special-purpose X-ray machine adapted to a specific part of the human body (eg, a dedicated mammography machine), but is not limited thereto.
- a 'target lesion' may refer to data/information, an image region, an object, etc. that are to be measured for size change.
- the 'target lesion' may include a target to be detected through an X-ray image, such as cancer or pneumothorax.
- a 'pixel' may refer to a pixel included in an X-ray image.
- the 'number of pixels' may refer to the number of pixels corresponding to a specific region in the X-ray image.
- the larger the number of pixels, the larger the size of the specific area in the X-ray image, and the smaller the number of pixels, the smaller the size of the specific area in the X-ray image.
- an 'artificial neural network model' is an example of a machine learning model, and may include any model used to infer an answer to a given input.
- the artificial neural network model may include an input layer, a plurality of hidden layers, and an output layer.
- each layer may include one or more nodes.
- an artificial neural network model may be trained to determine, identify, and/or detect a reference region and/or a target lesion region in an X-ray image.
- the artificial neural network model may be trained to output information about the size change of the target lesion based on the occupancy change for the target lesion (eg, the difference between the first occupancy and the second occupancy for the target lesion, or a value obtained by subtracting the first occupancy from the second occupancy).
- the artificial neural network model may include weights associated with a plurality of nodes included in the artificial neural network model.
- the weight may include any parameter associated with the artificial neural network model.
- 'each of A and B' may refer to a component (eg, a region) included in A and a component included in B.
- a 'reference area in each of the first X-ray image and the second X-ray image' may refer to a reference area in the first X-ray image and a reference area in the second X-ray image.
- the 'region corresponding to the target lesion among the reference regions in each of the first X-ray image and the second X-ray image' may refer to the region corresponding to the target lesion within the reference region of the first X-ray image and the region corresponding to the target lesion within the reference region of the second X-ray image.
- an 'instruction' is one or more instructions grouped based on function, and may refer to a component of a computer program that is executed by a processor.
- a 'user' may refer to a person who uses a user terminal.
- a user may include an annotator that performs an annotation operation.
- the user may include a doctor, a patient, etc. provided with a measurement result of a size change of a target lesion.
- a user may refer to a user terminal, and conversely, a user terminal may refer to a user. That is, the terms user and user terminal may be used interchangeably herein.
- 'annotation' may refer to annotation information (eg, label information, etc.) determined according to performing an annotation operation and/or an annotation operation.
- 'annotation information' may refer to information for an annotation operation and/or information generated by an annotation operation (eg, label information).
- the 'total area' may refer to an area of a photographing target included in an X-ray image.
- the entire area may refer to an area in which an object (eg, a patient) is photographed except for a background area in the X-ray image.
- the entire area may refer to an area in which a target tissue, organ, organ system, etc. to be observed through the X-ray image is captured.
- the 'reference region' may refer to at least a partial region of the entire region.
- 'occupancy' may refer to the occupancy ratio of the target area to the reference area.
- the 'occupancy' may be calculated as a ratio of the size of the target lesion area to the size of the reference area.
- the 'occupancy' may be calculated as a ratio of the number of pixels in the target lesion region to the number of pixels in the reference region.
- the 'occupancy' may be calculated based on the size of the region as well as the predicted score (eg, probability value) of the region.
- the 'occupancy' may be calculated as the ratio of the prediction score of the target lesion region (eg, the sum or average of the prediction scores of the pixels included in the target lesion region) to the size of the reference region.
- the 'occupancy' may be calculated as the ratio of the prediction score of the target lesion region (eg, the sum or average of the prediction scores of the pixels included in the target lesion region) to the prediction score of the reference region (eg, the sum or average of the prediction scores of the pixels included in the reference region).
- the 'occupancy' may be calculated based on not only the size of the target lesion, but also a probability map (eg, heat map) of the target lesion and/or the state of the target lesion.
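One of the score-based variants above (summed lesion scores over summed reference-region scores) can be sketched like this. It is a minimal NumPy example assuming per-pixel prediction scores in [0, 1]; the maps and values are made up for illustration.

```python
import numpy as np

def score_weighted_occupancy(lesion_scores: np.ndarray,
                             reference_scores: np.ndarray) -> float:
    """Ratio of summed lesion prediction scores to summed reference-region scores."""
    denom = float(reference_scores.sum())
    if denom == 0.0:
        raise ValueError("reference scores sum to zero")
    return float(lesion_scores.sum()) / denom

# Toy probability maps in [0, 1]: every reference pixel scored 1.0 (sum 16),
# a 2x2 lesion patch scored 0.5 per pixel (sum 2.0) -> occupancy 2/16 = 0.125.
ref_scores = np.ones((4, 4))
lesion_scores = np.zeros((4, 4))
lesion_scores[:2, :2] = 0.5
print(score_weighted_occupancy(lesion_scores, ref_scores))  # 0.125
```

Unlike the pure pixel-count ratio, this version down-weights low-confidence lesion pixels instead of counting them fully.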
- the information processing system 100 is any computing device used to measure a size change of a target lesion.
- the computing device may refer to any type of device with computing capability, and may be, for example, a notebook computer, a desktop computer, a laptop computer, a tablet computer, a server, a cloud system, or a user terminal, but is not limited thereto.
- although the information processing system 100 is illustrated as one computing device in FIG. 1, the present disclosure is not limited thereto, and the information processing system 100 may be configured to process information and/or data in a distributed manner across a plurality of computing devices.
- the information processing system 100 may be configured to be connected to and communicate with each of an image capturing apparatus (eg, an X-ray image capturing apparatus), a user terminal, and/or a storage system (or apparatus).
- the storage system may be a device or a cloud system that stores and manages various data related to a machine learning model for measuring a change in the size of a target lesion.
- the storage system may store and manage various data using a database.
- the various data may include any data related to the machine learning model, for example, an X-ray image, label information for a reference region, a test set, a machine learning model, and the like, but is not limited thereto.
- the information processing system 100 may compare a reference region (eg, a lung region) in an X-ray image with the target lesion region observed in that reference region, and calculate the occupancy of the target lesion with respect to the reference region. That is, the information processing system 100 may calculate the occupancy of the target lesion with respect to the reference region for each of a plurality of X-ray images captured at different times. The information processing system 100 may then measure a change in the size of the target lesion in the reference region from the change in the occupancy of the target lesion.
- the information processing system 100 may receive a first X-ray image 110 including a target lesion and a second X-ray image 120 including the target lesion from an image capturing device, a user terminal, and/or a storage system (or device).
- the information processing system 100 may sequentially receive the first X-ray image 110 and the second X-ray image 120 .
- the information processing system 100 may simultaneously receive the first X-ray image 110 and the second X-ray image 120 .
- the first X-ray image 110 and the second X-ray image 120 may correspond to images obtained by photographing the same object at different points in time.
- the first X-ray image 110 and/or the second X-ray image 120 received by the information processing system 100 may not include a target lesion.
- the information processing system 100 may generate and/or output the measurement result 130 of the size change of the target lesion based on the received first X-ray image 110 and the second X-ray image 120 .
- the measurement result 130 of the size change of the target lesion may include the absolute size value of the target lesion included in each of the first X-ray image 110 and the second X-ray image 120, the occupancy, whether the size of the target lesion has changed, the degree of change, and the like.
- the information processing system 100 may calculate the occupancy occupied by the area corresponding to the target lesion within the reference region in each of the first X-ray image 110 and the second X-ray image 120. To this end, the information processing system 100 may determine a first reference region and a second reference region from the first X-ray image 110 and the second X-ray image 120, respectively, and may identify and/or detect the target lesion in each of the first X-ray image 110 and the second X-ray image 120.
- the reference region may be one of a plurality of regions (eg, the left lung, the right lung, etc.) obtained by dividing the entire region (eg, the lung region) of each of the first X-ray image 110 and the second X-ray image 120.
- the information processing system 100 may input the first X-ray image 110 to the reference region extraction model to output a first reference region (eg, the left lung region in the first X-ray image), and input the second X-ray image 120 to the reference region extraction model to output a second reference region corresponding to the first reference region (eg, the left lung region in the second X-ray image).
- the information processing system 100 and/or the storage system may include a reference region extraction model trained using a plurality of reference X-ray images and label information for the reference regions thereof.
- the information processing system 100 may calculate the first occupancy based on the score for the first reference region and the score for the target lesion, and may calculate the second occupancy based on the score for the second reference region and the score for the target lesion.
- the information processing system 100 may measure a change in the size of the target lesion based on the calculated occupancies. For example, the information processing system 100 may determine whether the size of the target lesion has changed based on the first occupancy occupied by the identified target lesion in the first reference region and the second occupancy occupied by the identified target lesion in the second reference region. That is, the information processing system 100 may measure the size change of the target lesion based on the change in occupancy, rather than on the absolute size of the target lesion included in the X-ray image.
- the information processing system 100 may calculate an occupancy change for the target lesion based on the first occupancy and the second occupancy, and compare the calculated occupancy change with a reference value to determine whether the size of the target lesion has changed.
- the reference value may include a first reference value and a second reference value. In this case, when the calculated occupancy change for the target lesion is greater than or equal to the first reference value, it may be determined that the size of the target lesion has increased. When the calculated occupancy change for the target lesion is smaller than the first reference value and greater than or equal to the second reference value, it may be determined that the size of the target lesion is unchanged. When the calculated occupancy change for the target lesion is smaller than the second reference value, it may be determined that the size of the target lesion has decreased. Additionally or alternatively, the reference value may be determined based on a numerical value associated with a prediction accuracy or a numerical value associated with a target metric calculated for the test set.
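As a concrete illustration, the three-way decision described above can be sketched as follows; the function name and the reference values are hypothetical, chosen only to make the logic explicit.

```python
def classify_size_change(occ1: float, occ2: float, t1: float, t2: float) -> str:
    """Classify the target-lesion size change from two occupancies.

    occ1/occ2: occupancies in the earlier/later X-ray images.
    t1/t2: first/second reference values (t1 > t2), as in the text.
    """
    change = occ2 - occ1  # occupancy change for the target lesion
    if change >= t1:
        return "increased"
    if change >= t2:  # smaller than t1 but not smaller than t2
        return "unchanged"
    return "decreased"


# Illustrative reference values only; the text derives them from a test set.
print(classify_size_change(0.153, 0.129, t1=0.01, t2=-0.01))
```

With these illustrative thresholds, a drop in occupancy from 0.153 to 0.129 is classified as "decreased".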
- the information processing system 100 may calculate the occupancy change amount for the target lesion based on the first occupancy and the second occupancy, input the calculated occupancy change amount into the change determination model, and determine whether the size of the target lesion has changed based on the output determination result.
- the information processing system 100 and/or the storage system may include a change determination model, which may include a machine learning model trained to output a determination result as to whether the size of a reference target lesion has changed, based on an input value for a reference occupancy change of the target lesion.
- the information processing system 100 may include a memory 210 , a processor 220 , a communication module 230 , and an input/output interface 240 . As shown in FIG. 2 , the information processing system 100 may be configured to communicate information and/or data through a network using the communication module 230 .
- the memory 210 may include any non-transitory computer-readable recording medium.
- the memory 210 may include a random access memory (RAM), a read only memory (ROM), a disk drive, a solid state drive (SSD), a flash memory, and/or a non-volatile mass storage device.
- a non-volatile mass storage device such as a ROM, an SSD, a flash memory, a disk drive, etc. may be included in the information processing system 100 as a permanent storage device separate from the memory.
- the memory 210 may store an operating system and at least one program code (eg, code for an application for measuring a change in the size of a target lesion, a program for determining a reference region from an X-ray image, a program for identifying a target lesion from an X-ray image, a program for calculating the occupancy of the target lesion, etc.).
- These software components may be loaded from a computer-readable recording medium separate from the memory 210 .
- the separate computer-readable recording medium may include a recording medium directly connectable to the information processing system 100, for example, a computer-readable recording medium such as a floppy drive, a disk, a tape, a DVD/CD-ROM drive, a memory card, and the like.
- the software components may be loaded into the memory 210 through the communication module 230 rather than a computer-readable recording medium.
- the at least one program may be loaded into the memory 210 based on a computer program (eg, an application for measuring a change in the size of a target lesion) installed by files provided through the communication module 230 by developers or by a file distribution system that distributes installation files of applications.
- the processor 220 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input/output operations.
- the command may be received from a user terminal (not shown) or another external system and provided to the processor by the memory 210 or the communication module 230.
- the processor 220 may be configured to execute a received instruction according to program code stored in a recording device such as the memory 210 .
- the communication module 230 may provide a configuration or function for the user terminal and/or the image capturing apparatus and the information processing system 100 to communicate with each other through a network, and may provide a configuration or function for the information processing system 100 to communicate with a storage device and/or another system (eg, a separate cloud system, etc.).
- control signals, commands, data, etc. provided under the control of the processor 220 of the information processing system 100 are provided to the user terminal through the communication module 230 and the network via the communication module of the user terminal.
- the information processing system 100 may receive an X-ray image including a target lesion from an external device (eg, a storage device, an imaging device, an external system, etc.) through the communication module 230.
- the information processing system 100 may provide the measurement result of the size change of the target lesion to the user terminal through the communication module 230 .
- the input/output interface 240 of the information processing system 100 may be a means for interfacing with a device (not shown) for input or output that may be connected to, or included in, the information processing system 100.
- although the input/output interface 240 is illustrated as an element configured separately from the processor 220 in FIG. 2, the present invention is not limited thereto, and the input/output interface 240 may be configured to be included in the processor 220.
- the information processing system 100 may include more components than those shown in FIG. 2. However, most conventional components need not be explicitly illustrated.
- the processor 220 of the information processing system 100 may be configured to manage, process and/or store information and/or data received from a plurality of user terminals and/or a plurality of external systems.
- the processor 220 may store, process, and transmit the received first X-ray image, the second X-ray image, and the like.
- the processor 220 may calculate the occupancy occupied by the area corresponding to the target lesion among the reference areas in each of the first X-ray image and the second X-ray image.
- the processor 220 may measure a change in the size of the target lesion based on the calculated occupancy rate, and transmit the measurement result to the user terminal.
- the method 300 for measuring a change in size of a target lesion may be performed by a processor (eg, at least one processor of an information processing system).
- the method 300 for measuring a change in the size of a target lesion may be started when the processor receives a first X-ray image including the target lesion and a second X-ray image including the target lesion ( S310 ).
- the processor may calculate the occupancy occupied by the area corresponding to the target lesion among the reference areas in each of the first X-ray image and the second X-ray image (S320).
- the reference area may refer to an area determined by dividing each entire area of the first X-ray image and the second X-ray image into a plurality of areas.
- the processor determines a first reference area and a second reference area from each of the first X-ray image and the second X-ray image, identifies a target lesion from each of the first X-ray image and the second X-ray image, A first occupancy occupied by the identified target lesion in the first reference region and a second occupancy occupied by the identified target lesion in the second reference region may be calculated.
- the processor may input the first X-ray image to the reference region extraction model to output a first reference region, and input the second X-ray image to the reference region extraction model to output a second reference region corresponding to the first reference region.
- the reference region extraction model may be trained using a plurality of reference X-ray images and label information for the reference regions thereof.
- the processor may calculate the first occupancy by dividing the number of pixels in the area occupied by the target lesion in the first reference region by the number of pixels in the first reference region, and may calculate the second occupancy by dividing the number of pixels in the area occupied by the target lesion in the second reference region by the number of pixels in the second reference region.
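The pixel-count division in the step above can be sketched as follows, assuming the reference region and the identified lesion are available as binary masks (eg, outputs of the reference region extraction model and a lesion identification step); `occupancy` is a hypothetical helper name, not part of the patent.

```python
def occupancy(lesion_mask, reference_mask):
    """Occupancy = pixels of the lesion inside the reference region,
    divided by the total pixels of the reference region.

    Both masks are nested lists of 0/1 values with the same shape.
    """
    ref_pixels = sum(v for row in reference_mask for v in row)
    lesion_pixels = sum(
        l * r  # count a pixel only where lesion and reference overlap
        for lrow, rrow in zip(lesion_mask, reference_mask)
        for l, r in zip(lrow, rrow)
    )
    return lesion_pixels / ref_pixels
```

For a 2x2 reference region fully set to 1 with a single lesion pixel, the occupancy is 1/4 = 0.25.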
- the processor may measure a change in the size of the target lesion based on the calculated occupancies (S330). In an embodiment, the processor may determine whether the size of the target lesion has changed based on the first occupancy and the second occupancy. For example, the processor may determine whether the size of the target lesion has changed by comparing the calculated occupancy change amount for the target lesion (eg, a value obtained by subtracting the first occupancy from the second occupancy) with a reference value. In this case, the reference value may include a first reference value and a second reference value, and when the value obtained by subtracting the first occupancy from the second occupancy is greater than or equal to the first reference value, it may be determined that the size of the target lesion has increased.
- the processor may input the difference between the first occupancy and the second occupancy into the change determination model, and determine whether the size of the target lesion has changed based on the output determination result.
- the change determination model may include a machine learning model trained to output a determination result on whether the size of the reference target lesion changes based on the input value for the reference occupancy difference.
- the reference region may be set to a region that is commonly present in the X-ray images (eg, a past X-ray image and a current X-ray image).
- the reference region may be a region determined by dividing the entire region of the X-ray image into a plurality of regions. For example, a reference region including at least one of the six independent areas used to read the lung in a chest X-ray image ('Upper right area', 'Upper left area', 'Mid right area', 'Mid left area', 'Lower right area', and 'Lower left area') may be set.
- each of the six independent regions, or a region fused from some of the six independent regions, may correspond to the reference region.
- the information processing system 100 may determine the reference region from the X-ray image through the reference region extraction model. To this end, the information processing system 100 may generate/learn a reference region extraction model. In an embodiment, the information processing system 100 may learn an algorithm for finding a reference region in each X-ray image through a machine learning method.
- the reference region extraction model may refer to a segmentation artificial neural network model.
- the generated/learned reference region extraction model may be stored in an information processing system and/or a storage system. In order to generate/learn the reference region extraction model, the information processing system 100 may be configured to be connected to and communicate with the user terminal 420 and/or the storage system 410 .
- the information processing system 100 may output a reference X-ray image to be annotated to the user terminal 420 .
- the reference X-ray image to be annotated may be received from the storage system 410 .
- the user (eg, an annotator, etc.) may divide the entire region (eg, the lung region) included in the reference X-ray image output through the user terminal 420 into a plurality of regions (eg, the upper right area, the mid right area, the lower right area, the upper left area, the mid left area, the lower left area, etc.), determine label information (eg, annotation information) for each region, and provide it to the information processing system 100.
- the user may provide the label information for the reference region of the reference X-ray image to the information processing system 100 through the user terminal 420.
- the reference region of the reference X-ray image may include, among the plurality of regions, at least one region in which a change in the size of the target lesion is to be measured.
- the information processing system 100 may generate and/or train the reference region extraction model by using the received reference X-ray images and the label information for their reference regions.
- the information processing system 100 may provide a plurality of chest X-ray images to the user terminal 420 as training images to be used as training materials for the reference region extraction model.
- the user may perform annotation on the six regions of the lung included in the plurality of chest X-ray images through the user terminal 420, and provide a plurality of training images including label information for the six regions of the lung, as the annotation result, to the information processing system 100.
- the plurality of training images may include a training image 430 including label information for 6 regions of the lung.
- the information processing system 100 may generate/learn a model for determining each of the six regions of the lung from the chest X-ray image based on the plurality of training images including the training image 430 including the label information.
- the information processing system 100 may generate and/or train a model for determining a reference region from a chest X-ray image, based on the training image and the label information.
- the reference region may include at least one region among six regions of the lung.
- the information processing system 100 may learn an algorithm that receives one X-ray image as an input and minimizes the loss with respect to the label information (ie, annotation information) for the six regions of the lung. For example, the information processing system 100 may calculate the probability that each pixel of the image corresponds to each region by using the label information for the six regions of the lung in the X-ray image. In this case, it may be assumed that the image is not flipped, in order to distinguish the left and right of each region. Also, in order to remove noise, values lower than a set threshold may be clipped.
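The noise-removal step mentioned above (clipping values below a threshold) might look like the following sketch, where the per-pixel probability map and the 0.3 threshold are purely illustrative assumptions:

```python
def clip_low_probabilities(prob_map, threshold=0.3):
    """Zero out per-pixel region probabilities below the threshold,
    as a simple noise-removal step (threshold value is illustrative)."""
    return [[p if p >= threshold else 0.0 for p in row] for row in prob_map]
```

Pixels whose region probability falls below the threshold are treated as not belonging to the region.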
- although one user terminal 420 is illustrated in FIG. 4, the present invention is not limited thereto, and a plurality of user terminals 420 may be configured to be connected to and communicate with the information processing system 100.
- the storage system 410 is illustrated as one device in FIG. 4 , the present invention is not limited thereto, and may be configured as a system supporting a cloud or a plurality of storage devices.
- each component of a system for generating/learning a reference region extraction model represents functional elements that are functionally separated, and a plurality of components may be implemented in a form that is integrated with each other in an actual physical environment.
- the information processing system 100 and the storage system 410 are illustrated as separate systems, but the present invention is not limited thereto, and may be integrated into one system.
- FIG. 5 is a diagram illustrating an example in which the information processing system 100 determines a reference region from an X-ray image and outputs it through the user terminal 520 according to an embodiment of the present disclosure.
- the information processing system 100 may determine a reference region in the X-ray image.
- the reference region may be determined by dividing the entire region of the X-ray image into a plurality of regions.
- the information processing system 100 may determine the first reference area and the second reference area from each of the first X-ray image and the second X-ray image.
- the information processing system 100 may input the first X-ray image to the reference region extraction model to output a first reference region, and input the second X-ray image to the reference region extraction model to output a second reference region.
- the second reference area may correspond to the first reference area.
- for example, when the area corresponding to the left lung in the first X-ray image corresponds to the first reference region, the area corresponding to the left lung in the second X-ray image may also correspond to the second reference region.
- the information processing system 100 may receive a target X-ray image (eg, a first X-ray image and a second X-ray image) for measuring a size change of a target lesion.
- the information processing system 100 may receive the target X-ray image from the storage system 510 , the user terminal 520 , and/or the image capturing apparatus.
- the information processing system 100 may determine a reference region from the received target X-ray image. For example, the information processing system 100 may determine the reference region by dividing the entire region included in the target X-ray image into a plurality of regions.
- the information processing system 100 may determine the reference region by dividing the lung region included in the target X-ray image into six regions (eg, 'Upper right area', 'Upper left area', 'Mid right area', 'Mid left area', 'Lower right area', and 'Lower left area').
- the reference region may include at least a portion of the plurality of regions.
- a tissue (eg, lung) region included in the target X-ray image may be divided into a plurality of regions, and any combination of the plurality of divided regions may be determined as a reference region.
- the occupancy may be calculated by calculating the ratio of the size of the lesion in the reference region, which is a combination of the plurality of regions.
- the reference region may be determined as a right lung region including an 'Upper right region', a 'Mid right region', and a 'Lower right region'.
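Under the same binary-mask assumption used earlier, a fused reference region such as this right lung can be obtained as the pixel-wise union of the constituent region masks; `fuse_regions` is an illustrative helper name, not part of the patent.

```python
def fuse_regions(*region_masks):
    """Pixel-wise union of several binary region masks (eg, the upper,
    mid, and lower right lung areas) into one fused reference region."""
    fused = [[0] * len(row) for row in region_masks[0]]
    for mask in region_masks:
        for i, row in enumerate(mask):
            for j, v in enumerate(row):
                fused[i][j] = 1 if (fused[i][j] or v) else 0
    return fused
```

The fused mask can then be fed to the same occupancy computation as any single region.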
- each component of a system for generating/learning a reference region extraction model represents functional elements that are functionally separated, and a plurality of components may be implemented in a form that is integrated with each other in an actual physical environment.
- the information processing system 100 and the storage system 510 are illustrated as separate systems, but the present invention is not limited thereto, and may be integrated into one system.
- the processor may receive the first X-ray image 620 including the target lesion and the second X-ray image 630 including the target lesion, and calculate the occupancy occupied by the area corresponding to the target lesion within the reference region in each of the first X-ray image 620 and the second X-ray image 630.
- the occupancy may refer to the size (eg, number of pixels) of the lesion located in the reference region compared to the size of the reference region (eg, number of pixels, etc.).
- for example, the occupancy of the target lesion with respect to the left lung may refer to the size (eg, number of pixels) of the target lesion located in the left lung relative to the size (eg, number of pixels, etc.) of the left lung in the X-ray image.
- the processor 610 determines a first reference area 626 and a second reference area 636 from each of the first X-ray image 620 and the second X-ray image 630 , and the first X-ray A target lesion may be identified from each of the image 620 and the second X-ray image 630 . Then, the processor 610 calculates the first occupancy 628 occupied by the identified target lesion in the first reference region 626 and the second occupancy 638 occupied by the identified target lesion in the second reference region 636 . can be calculated.
- the processor 610 may calculate the first occupancy 628 by dividing the number of pixels in the area 622 occupied by the target lesion in the first X-ray image 620 by the number of pixels in the first reference region 626, and may calculate the second occupancy 638 by dividing the number of pixels in the area 632 occupied by the target lesion in the second X-ray image 630 by the number of pixels in the second reference region 636.
- the processor can compare the size of the target lesion in the two X-ray images, regardless of the size of the X-ray image or the position of the patient, and can more accurately calculate the change in the size of the target lesion.
- the processor may receive the first chest X-ray image 620, determine the lung region (ie, the entire region) 624 included in the first chest X-ray image 620, and determine, from the determined lung region 624, the first reference region 626 corresponding to the left lung region. The processor may also identify the target lesion region 622 included in the first chest X-ray image 620. Here, the target lesion region may refer to the area occupied by the target lesion within the reference region. Thereafter, the processor may calculate the first occupancy 628 by dividing the number of pixels in the target lesion region 622 by the number of pixels in the first reference region 626.
- likewise, the processor may receive the second chest X-ray image 630, determine the lung region 634 included in the second chest X-ray image 630, and determine, among the determined lung region 634, the second reference region 636 corresponding to the left lung region.
- the processor may identify the target lesion region 632 included in the second chest X-ray image 630 . Thereafter, the processor may calculate the second occupancy 638 by dividing the number of pixels in the target lesion region 632 by the number of pixels in the second reference region 636 .
- although it is illustrated that the processor receives the first X-ray image 620 and the second X-ray image 630 separately and calculates the occupancies, the present invention is not limited thereto.
- the processor may simultaneously receive the first X-ray image 620 and the second X-ray image 630 to calculate the occupancy.
- the processor may sequentially receive the first X-ray image 620 and the second X-ray image 630 to calculate the occupancy.
- the processor may measure a change in the size of the target lesion based on the calculated occupancy.
- the change measuring unit 710 included in the processor may receive the first occupancy 628 and the second occupancy 638 calculated as described above, measure the size change of the target lesion, and generate the measurement result 720 of the change in the size of the target lesion.
- the target lesion size change measurement result 720 may include first occupancy information, second occupancy information, occupancy change information of the target lesion, and/or information on the size change of the target lesion (eg, increase, decrease, no change, etc.). Here, the occupancy change information of the target lesion may be calculated as in Equation 3 below.
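Equation 3 itself is not reproduced in this excerpt. Judging from the worked numerical example later in the description (0.12867 - 0.15277 ≈ -0.0241), it is presumably of the form:

```latex
\text{change\_ratio} \;=\; \text{occupancy}_{2} - \text{occupancy}_{1},
\qquad
\text{occupancy}_{i} = \frac{\text{area}_{i}}{\text{lung\_area}_{i}}
```

where area_i is the pixel size of the target lesion and lung_area_i is the pixel size of the reference region in the i-th X-ray image.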
- the user may need information on whether the target lesion has changed (eg, information on whether increase, decrease, or no change) as well as the numerically expressed change in occupancy of the target lesion.
- the change measuring unit 710 may determine whether the size of the target lesion changes based on the first occupancy 628 and the second occupancy 638 . For example, the change measuring unit 710 may determine whether the size of the target lesion has changed by comparing a value obtained by subtracting the first occupancy 628 from the second occupancy 638 with a reference value. That is, the change measuring unit 710 may determine whether the size of the target lesion changes based on a heuristic reference value.
- the reference value may include a first reference value (t1) and a second reference value (t2). When the value obtained by subtracting the first occupancy 628 from the second occupancy 638 (ie, the occupancy change amount of the target lesion) is greater than or equal to the first reference value t1, it may be determined that the size of the target lesion has increased.
- when the value obtained by subtracting the first occupancy 628 from the second occupancy 638 is smaller than the first reference value t1 and greater than or equal to the second reference value t2, it may be determined that the size of the target lesion is unchanged.
- when the value obtained by subtracting the first occupancy 628 from the second occupancy 638 is smaller than the second reference value t2, it may be determined that the size of the target lesion has decreased. That is, a slight change may be regarded as no change, because the X-ray image may not convey completely accurate information.
- the reference values may be determined based on a numerical value associated with a prediction accuracy or a numerical value associated with a target metric calculated for a test set. That is, in order to set the first reference value t1 and the second reference value t2, the processor may receive a test set and obtain reference values for which a specific metric (eg, AUC, accuracy, etc.) is high on the received test set. For example, the processor may search for a reference value arbitrarily. Alternatively, since the Area Under Curve (AUC) metric is computed over a curve of operating points, the sensitivity and/or specificity at a specific reference value may be calculated.
- the processor may find a point at which sensitivity and/or specificity is maximized and set it as a reference value. In this manner, the processor may set the first reference value between 'increase' and 'no change', and the second reference value between 'decrease' and 'no change'.
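One way to realize this operating-point search, assuming a labeled test set of occupancy-change amounts, is to sweep candidate thresholds and maximize Youden's J statistic (sensitivity + specificity - 1); the function below is a sketch under that assumption, not the patent's actual procedure.

```python
def pick_reference_value(changes, labels):
    """Pick the reference value maximizing sensitivity + specificity
    (Youden's J) over a labeled test set.

    changes: occupancy-change amounts; labels: 1 if the lesion truly
    increased, 0 otherwise.  Returns (best_threshold, best_j).
    """
    best_t, best_j = None, -1.0
    positives = sum(labels)
    negatives = len(labels) - positives
    for t in sorted(set(changes)):  # each observed change is a candidate
        tp = sum(1 for c, y in zip(changes, labels) if c >= t and y == 1)
        tn = sum(1 for c, y in zip(changes, labels) if c < t and y == 0)
        sens = tp / positives   # sensitivity at threshold t
        spec = tn / negatives   # specificity at threshold t
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

Run once on cases labeled "increased vs not" to obtain t1, and symmetrically on "decreased vs not" to obtain t2.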
- the change measuring unit 710 may input the difference between the first occupancy 628 and the second occupancy 638 into the change determination model, and determine whether the size of the target lesion has changed based on the output determination result.
- the difference between the first occupancy 628 and the second occupancy 638 may refer to a value obtained by subtracting the first occupancy 628 from the second occupancy 638 (that is, the change in occupancy for the target lesion).
- the change determination model may include a machine learning model trained to output a determination result on whether the size of a reference target lesion has changed, based on an input value for a reference occupancy difference (or a reference occupancy change for the target lesion). In order to generate/train the change determination model, a user's annotation work may be required.
- the processor may receive the user's label information (increase, decrease, or no change) on whether the size of the target lesion has changed for a reference occupancy change amount, and generate/train a machine learning model that, given an occupancy change amount as input, outputs one of increase, decrease, or no change of the target lesion.
- the change measuring unit 710 may determine whether the size of the target lesion has changed by using the generated/trained machine learning model (e.g., the change determination model).
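One minimal way such a change determination model could be realized is a one-dimensional three-class classifier fitted from annotated occupancy deltas. The midpoint fitting rule and all names and values below are illustrative assumptions, not the disclosure's prescribed model.

```python
# Hypothetical sketch of the change-determination model: a 1-D classifier that
# maps an occupancy change (second occupancy minus first) to 'increase',
# 'no_change', or 'decrease', fitted from user-annotated examples.

def fit_change_model(deltas, annotations):
    """Learn two cutoffs separating the three annotated classes."""
    inc = [d for d, a in zip(deltas, annotations) if a == "increase"]
    dec = [d for d, a in zip(deltas, annotations) if a == "decrease"]
    same = [d for d, a in zip(deltas, annotations) if a == "no_change"]
    upper = (min(inc) + max(same)) / 2   # boundary: no_change vs. increase
    lower = (max(dec) + min(same)) / 2   # boundary: decrease vs. no_change

    def predict(delta):
        if delta >= upper:
            return "increase"
        if delta <= lower:
            return "decrease"
        return "no_change"

    return predict

model = fit_change_model(
    deltas=[0.06, 0.04, 0.005, -0.004, -0.05, -0.07],
    annotations=["increase", "increase", "no_change", "no_change",
                 "decrease", "decrease"],
)
```

A trained neural network over the same scalar input (as discussed later for model 1000) would play the same role; this sketch only shows the input/output contract.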
- the processor may measure the size of the entire region (e.g., the lungs) and/or the size of a reference region (e.g., the right lung) within the X-ray image.
- the processor may measure the size and location of the target lesion through a computer aided detection (CAD) method or an existing algorithm.
- the size of the right lung region (lung_area1) on the first chest X-ray image 810 may be measured as 11389.0, and the size of the right lung region (lung_area2) on the second chest X-ray image 820 may be measured as 12076.0. That is, even for X-ray images of the lungs of the same subject/object, the size of the lung region may be measured differently depending on the size of the image and the position and/or state of the subject/object.
- the size of the target region (area1) within the right lung region on the first chest X-ray image 810 may be measured as about 1739.8578, and the size of the target region (area2) within the right lung region on the second chest X-ray image 820 may be measured as about 1553.8666.
- the first occupancy, 1739.8578/11389.0, may be calculated as about 0.15277, and the second occupancy, 1553.8666/12076.0, may be calculated as about 0.12867.
- the occupancy change amount (change_ratio) for the target lesion, 0.12867 - 0.15277, may be calculated as about -0.0241.
- the processor may determine whether the size of the target lesion has changed by comparing the calculated occupancy change amount of about -0.0241 with a reference value. For example, when the calculated occupancy change amount is smaller than the reference value (e.g., the second reference value), the processor may determine that the size of the target lesion has decreased. Alternatively, when the calculated occupancy change amount is smaller than the first reference value and greater than or equal to the second reference value, the processor may determine that the size of the target lesion has not changed. As another example, the processor may input the calculated occupancy change amount (e.g., the difference between the first occupancy and the second occupancy) into the change determination model and determine whether the size of the target lesion has changed based on the output determination result.
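The worked example above can be reproduced as plain arithmetic, using the values quoted in the text:

```python
# Occupancy is lesion area divided by reference-region area, and the change
# amount is the second occupancy minus the first (values from the example).

lung_area1, lesion_area1 = 11389.0, 1739.8578   # first chest X-ray image
lung_area2, lesion_area2 = 12076.0, 1553.8666   # second chest X-ray image

occ1 = lesion_area1 / lung_area1   # first occupancy, about 0.15277
occ2 = lesion_area2 / lung_area2   # second occupancy, about 0.12867
change_ratio = occ2 - occ1         # about -0.0241: the lesion shrank
```

A negative change_ratio smaller than the second reference value would then be read as a size decrease, per the comparison logic above.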
- a change in this ratio (i.e., the occupancy) may represent a substantial change in the target lesion.
- a plurality of lesions included in the lung may be detected separately, and since the occupancy rate and/or occupancy change amount for each lesion may be calculated, the size change for each lesion may be measured.
- the processor may determine (or calculate) a score for the first reference region 910 (e.g., the size of the first reference region, the number of pixels, etc.), a score for the second reference region 920 (e.g., the size of the second reference region, the number of pixels, etc.), a score for the target lesion 912 within the first reference region, and/or a score for the target lesion 922 within the second reference region.
- using the change algorithm 900, the processor may calculate a change score based on the score for the first reference region 910, the score for the second reference region 920, the score for the target lesion 912 within the first reference region, and the score for the target lesion 922 within the second reference region.
- using the reference region extraction model, the processor may determine, for each of the first X-ray image (taken at time t1) and the second X-ray image (taken at time t2, different from t1), the reference region (910, 920) and a score for that reference region. The processor may also determine a score for the target lesion 912 within the first reference region by using a target lesion detection model, a segmentation model, or the like. To this end, the processor may calculate a target lesion prediction score (or a heat map value) for each of the plurality of pixels included in the first reference region.
- similarly, the processor may determine a score for the target lesion 922 within the second reference region by calculating a target lesion prediction score for each of the plurality of pixels included in the second reference region using a target lesion detection model, a segmentation model, or the like.
- the processor may determine a heat map value (e.g., a heat map value determined based on a target lesion prediction score) for each of the plurality of pixels included in the target lesion region within each reference region and, as in Equation 4 below, calculate a score for the target lesion within each reference region.
- the heat map may represent a heat map value for the target lesion region
- f(x) may represent an arbitrary function for the x value
- output_score may represent a score for the target lesion.
- f(heat map) may represent the sum of the heat map values of the plurality of pixels included in the target lesion region.
- alternatively, f(heat map) may represent the average of the heat map values of the plurality of pixels included in the target lesion region.
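Both candidate choices of f in Equation 4 can be sketched directly; the heat-map values below are invented for illustration and do not come from the figures.

```python
# Equation 4 sketch: the lesion score is f applied to the per-pixel heat-map
# values inside the lesion region, where f may be, e.g., the sum or the mean.

heatmap = [0.9, 0.8, 0.75, 0.6]   # toy lesion-pixel prediction scores

output_score_sum = sum(heatmap)                   # f = sum of heat-map values
output_score_avg = sum(heatmap) / len(heatmap)    # f = average heat-map value
```

Any monotone aggregation would serve as f; sum weights a lesion by both confidence and extent, while the mean reflects confidence alone.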
- the processor may calculate a change score as shown in Equation 5 below.
- the processor may determine whether the size of the target lesion has changed and/or the amount of size change based on the calculated scores.
- output_score_t1 is the score for the target lesion within the first reference region
- area_t1 is the score for the first reference region
- output_score_t2 is the score for the target lesion within the second reference region
- area_t2 is the score for the second reference region
- g(x, y, z, r) may represent any function that calculates the first occupancy and the second occupancy based on x, y, z, r, thereby calculating a size change score of the target lesion.
- the processor may calculate the first occupancy based on the score for the first reference region 910 and the score for the target lesion 912 within the first reference region, and may calculate the second occupancy based on the score for the second reference region 920 and the score for the target lesion 922 within the second reference region.
- the processor may calculate the first occupancy rate and the second occupancy rate by using Equations 6 and 7 below.
- the processor may determine whether the size of the target lesion has changed based on the first occupancy and the second occupancy. That is, the processor may calculate a change score (e.g., a change score or an occupancy change amount) based on the first occupancy and the second occupancy, and may determine whether and/or to what degree the size of the target lesion has changed based on the calculated change score. For example, the processor may calculate a change score using Equation 8 below.
- the processor may calculate a change score using Equation 9 below.
- the processor may calculate a change score using Equation 10 below.
- in Equations 8, 9, and 10 above, one symbol may represent the first occupancy, and the other may represent the second occupancy.
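Equations 6 and 7 can be combined with one plausible concrete form of the change score. The disclosure leaves g(...) and Equations 8 to 10 generic, so the difference used here is only an assumption, and all numbers are invented.

```python
# Hedged sketch of Equations 6-8: each occupancy is the lesion score divided
# by its reference-region score, and one simple change score is their
# difference (one candidate form; the patent does not fix g(...)).

output_score_t1, area_t1 = 120.0, 1000.0   # lesion / region scores at t1
output_score_t2, area_t2 = 90.0, 1000.0    # lesion / region scores at t2

occ_t1 = output_score_t1 / area_t1         # Eq. 6: first occupancy
occ_t2 = output_score_t2 / area_t2         # Eq. 7: second occupancy
change_score = occ_t2 - occ_t1             # one candidate change score
```

A ratio occ_t2 / occ_t1 or a normalized difference would be equally valid instances of the generic g(x, y, z, r).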
- FIG. 9 illustrates a heat map for a target lesion within a reference region as an example of calculating a score for a target lesion, but is not limited thereto.
- a predicted value, a probability map, and the like of each of a plurality of pixels included in the reference region for the target lesion may be used.
- the artificial neural network model 1000 may be a statistical learning algorithm implemented based on the structure of a biological neural network, or a structure for executing such an algorithm, in machine learning technology and cognitive science.
- in the artificial neural network model 1000, nodes (artificial neurons that, as in a biological neural network, form a network through synaptic connections) repeatedly adjust their synaptic weights, learning to reduce the error between a correct output corresponding to a specific input and the inferred output, so that the model can represent a machine learning model with problem-solving ability.
- the artificial neural network model 1000 may include arbitrary probabilistic models, neural network models, etc. used in artificial intelligence learning methods such as machine learning and deep learning.
- the artificial neural network model 1000 may include an artificial neural network model configured to determine a reference region from an input X-ray image. Additionally or alternatively, the artificial neural network model 1000 may include an artificial neural network model for identifying a target lesion from an input X-ray image. Additionally or alternatively, it may include an artificial neural network model configured to receive the difference between the first occupancy and the second occupancy (e.g., the occupancy change amount for the target lesion) as input and to output a determination result on whether the size of the target lesion has changed (e.g., increase, decrease, or no change of the target lesion).
- the artificial neural network model 1000 may be implemented as a multilayer perceptron (MLP) composed of multiple layers of nodes and the connections between them.
- the artificial neural network model 1000 may be implemented using one of various artificial neural network model structures including MLP.
- the artificial neural network model 1000 may be composed of an input layer 1020 that receives an input signal or data 1010 from the outside, an output layer 1040 that outputs an output signal or data 1050 corresponding to the input data, and n hidden layers 1030_1 to 1030_n (where n is a positive integer) that are located between the input layer 1020 and the output layer 1040, receive a signal from the input layer 1020, extract features, and transmit them to the output layer 1040.
- the output layer 1040 receives signals from the hidden layers 1030_1 to 1030_n and outputs them to the outside.
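The layered structure just described can be illustrated with a tiny forward pass. The layer sizes, weights, and the choice of ReLU are invented for the sketch; a real model 1000 would have trained weights.

```python
# Illustrative minimal MLP forward pass: input layer -> hidden layers with a
# nonlinearity -> output layer, matching the structure described in the text.

def dense(x, weights, biases):
    """One fully connected layer; weights is a list of per-unit weight rows."""
    return [sum(xi * wi for xi, wi in zip(x, row)) + b
            for row, b in zip(weights, biases)]

def relu(v):
    return [max(0.0, x) for x in v]

def mlp_forward(x, layers):
    """layers: list of (weights, biases); ReLU between layers, linear output."""
    for i, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if i < len(layers) - 1:
            x = relu(x)
    return x

layers = [
    ([[2.0], [-1.0]], [0.0, 0.0]),   # hidden layer: 1 input -> 2 units
    ([[1.0, 1.0]], [0.5]),           # output layer: 2 units -> 1 output
]
```

For the change-determination use described above, the single input would be the occupancy difference and the output a three-way class score.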
- the learning methods of the artificial neural network model 1000 include a supervised learning method, which learns to be optimized for solving a problem through the input of a teacher signal (correct answer), and an unsupervised learning method, which does not require a teacher signal.
- the information processing system may train the artificial neural network model 1000, using supervised and/or unsupervised learning, to output the reference region from an X-ray image.
- the information processing system may train the artificial neural network model 1000 with supervised learning, using a plurality of reference X-ray images and label information on the reference regions of those images, to output the reference region from an X-ray image.
- the information processing system may also train the artificial neural network model 1000, using supervised and/or unsupervised learning, to divide the entire region (e.g., the lung region) of an X-ray image into a plurality of regions and to determine and output a reference region.
- the information processing system may train the artificial neural network model 1000, using supervised and/or unsupervised learning, to output a determination result on whether the size of the target lesion has changed when the difference between the first occupancy and the second occupancy is input to the change determination model.
- the information processing system may train the artificial neural network model 1000 with supervised learning to output a determination result as to whether the size of a reference target lesion has changed, based on an input value for a reference occupancy difference (e.g., a reference occupancy change amount for the target lesion).
- the artificial neural network model 1000 trained in this way may be stored in a memory (not shown) of the information processing system and, in response to an input X-ray image received from the communication module and/or the memory, may determine a reference region and/or a target lesion region in the X-ray image and output the determined region(s). Additionally or alternatively, the artificial neural network model 1000 may, in response to an input of the occupancy difference for the target lesion across a plurality of X-ray images (e.g., the occupancy change amount for the target lesion), determine whether the size of the target lesion has changed and output the determination result.
- the input variable of the machine learning model for determining the reference region may be one or more X-ray images.
- an input variable input to the input layer 1020 of the artificial neural network model 1000 may be an image vector 1010 composed of one or more X-ray images as one vector data element.
- the output variable output from the output layer 1040 of the artificial neural network model 1000 may be a vector 1050 indicating or characterizing a reference region and/or a target lesion region in the X-ray image.
- the output layer 1040 of the artificial neural network model 1000 may be configured to output a vector indicating or characterizing a plurality of regions obtained by dividing the entire region of the X-ray image.
- the reference area may be an area including at least one of a plurality of areas.
- the output variable of the artificial neural network model 1000 is not limited to the types described above, and may include any information/data indicating a reference region and/or a target lesion region in an X-ray image.
- the input variable of the machine learning model for determining whether the size of the target lesion changes may be the difference between the first occupancy and the second occupancy for the target lesion (e.g., the value obtained by subtracting the first occupancy from the second occupancy, i.e., the occupancy change amount for the target lesion).
- the input variable input to the input layer 1020 of the artificial neural network model 1000 may be a numerical vector 1010 in which the difference between the first occupancy and the second occupancy constitutes one vector data element.
- the output variable output from the output layer 1040 of the artificial neural network model 1000 may be a vector 1050 indicating or characterizing the determination result as to whether the size of the target lesion has changed.
- the output variable of the artificial neural network model 1000 is not limited to the type described above, and may include any information/data indicating a result of determining whether the size of the target lesion changes.
- the output layer 1040 of the artificial neural network model 1000 may be configured to output a vector indicating the reliability and/or accuracy of the output reference region, target lesion region, and/or determination result on whether the size of the target lesion has changed.
- a plurality of output variables corresponding to a plurality of input variables may be matched to the input layer 1020 and the output layer 1040 of the artificial neural network model 1000, respectively, and the model may be trained to extract a correct output corresponding to a specific input by adjusting the synaptic values between the nodes included in the input layer 1020, the hidden layers 1030_1 to 1030_n, and the output layer 1040. Through this training process, features hidden in the input variables of the artificial neural network model 1000 can be identified, and the synaptic values (or weights) between the nodes of the artificial neural network model 1000 can be adjusted so that the error between the output variable calculated from the input variables and the target output is reduced.
- using the artificial neural network model 1000 trained in this way, in response to an input X-ray image, information about the reference region and/or the target lesion region in the X-ray image (e.g., location, size, number of pixels, etc.) may be output. Additionally, using the artificial neural network model 1000, in response to an input occupancy change amount for the target lesion (e.g., the difference between the first occupancy and the second occupancy for the target lesion), a determination result as to whether the size of the target lesion has changed may be output.
- while example implementations may utilize aspects of the presently disclosed subject matter in the context of one or more standalone computer systems, the subject matter is not so limited and may instead be implemented in connection with any computing environment, such as a network or distributed computing environment. Still further, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be effected across a plurality of devices. Such devices may include PCs, network servers, and handheld devices.
Claims (20)
- 1. A method, performed by at least one computing device, of measuring a change in size of a target lesion in an X-ray image, the method comprising: receiving a first X-ray image including the target lesion and a second X-ray image including the target lesion; calculating an occupancy that a region corresponding to the target lesion occupies within a reference region in each of the first X-ray image and the second X-ray image; and measuring a change in size of the target lesion based on the calculated occupancy.
- 2. The method of claim 1, wherein the calculating comprises: determining a first reference region and a second reference region from the first X-ray image and the second X-ray image, respectively; identifying the target lesion in each of the first X-ray image and the second X-ray image; and calculating a first occupancy that the identified target lesion occupies within the first reference region and a second occupancy that the identified target lesion occupies within the second reference region.
- 3. The method of claim 2, wherein determining the first reference region and the second reference region comprises: inputting the first X-ray image into a reference region extraction model to output the first reference region; and inputting the second X-ray image into the reference region extraction model to output the second reference region, the second reference region corresponding to the first reference region, wherein the reference region extraction model is trained using a plurality of reference X-ray images and label information on reference regions thereof.
- 4. The method of claim 2, wherein the calculating comprises: calculating the first occupancy by dividing the number of pixels of the region occupied by the target lesion within the first reference region by the number of pixels corresponding to the first reference region; and calculating the second occupancy by dividing the number of pixels of the region occupied by the target lesion within the second reference region by the number of pixels corresponding to the second reference region.
- 5. The method of claim 2, wherein determining the first reference region and the second reference region comprises determining a score for the first reference region and a score for the second reference region; wherein identifying the target lesion comprises: determining a score for the target lesion within the first reference region; and determining a score for the target lesion within the second reference region; and wherein the calculating comprises: calculating the first occupancy based on the score for the first reference region and the score for the target lesion within the first reference region; and calculating the second occupancy based on the score for the second reference region and the score for the target lesion within the second reference region.
- 6. The method of claim 2, wherein the measuring comprises determining whether the size of the target lesion has changed, based on the first occupancy and the second occupancy.
- 7. The method of claim 6, wherein determining whether the size of the target lesion has changed comprises: calculating an occupancy change amount for the target lesion based on the first occupancy and the second occupancy; and determining whether the size of the target lesion has changed by comparing the calculated occupancy change amount for the target lesion with a reference value.
- 8. The method of claim 7, wherein the reference value is determined based on a numerical value associated with a target metric calculated for a test set or a numerical value associated with a prediction accuracy.
- 9. The method of claim 7, wherein the reference value includes a first reference value and a second reference value; the size of the target lesion is determined to have increased when the calculated occupancy change amount for the target lesion is greater than or equal to the first reference value; the size of the target lesion is determined to be unchanged when the calculated occupancy change amount for the target lesion is less than the first reference value and greater than or equal to the second reference value; and the size of the target lesion is determined to have decreased when the calculated occupancy change amount for the target lesion is less than the second reference value.
- 10. The method of claim 6, wherein determining whether the size of the target lesion has changed comprises: calculating an occupancy change amount for the target lesion based on the first occupancy and the second occupancy; and determining whether the size of the target lesion has changed based on a determination result output by inputting the calculated occupancy change amount for the target lesion into a change determination model, wherein the change determination model includes a machine learning model trained to output a determination result on whether the size of a reference target lesion has changed based on an input value for a reference occupancy change amount for the target lesion.
- 11. An information processing system comprising: a memory storing one or more instructions; and a processor configured, by executing the stored one or more instructions, to receive a first X-ray image including a target lesion and a second X-ray image including the target lesion, calculate an occupancy that a region corresponding to the target lesion occupies within a reference region in each of the first X-ray image and the second X-ray image, and measure a change in size of the target lesion based on the calculated occupancy.
- 12. The information processing system of claim 11, wherein the processor is further configured to determine a first reference region and a second reference region from the first X-ray image and the second X-ray image, respectively, identify the target lesion in each of the first X-ray image and the second X-ray image, and calculate a first occupancy that the identified target lesion occupies within the first reference region and a second occupancy that the identified target lesion occupies within the second reference region.
- 13. The information processing system of claim 12, wherein the processor is further configured to input the first X-ray image into a reference region extraction model to output the first reference region, and to input the second X-ray image into the reference region extraction model to output the second reference region, the second reference region corresponding to the first reference region, wherein the reference region extraction model is trained using a plurality of reference X-ray images and label information on reference regions thereof.
- 14. The information processing system of claim 12, wherein the processor is further configured to calculate the first occupancy by dividing the number of pixels of the region occupied by the target lesion within the first reference region by the number of pixels corresponding to the first reference region, and to calculate the second occupancy by dividing the number of pixels of the region occupied by the target lesion within the second reference region by the number of pixels corresponding to the second reference region.
- 15. The information processing system of claim 12, wherein the processor is further configured to determine a score for the first reference region and a score for the second reference region, determine a score for the target lesion within the first reference region, determine a score for the target lesion within the second reference region, calculate a first occupancy based on the score for the first reference region and the score for the target lesion within the first reference region, and calculate a second occupancy based on the score for the second reference region and the score for the target lesion within the second reference region.
- 16. The information processing system of claim 12, wherein the processor is further configured to determine whether the size of the target lesion has changed based on the first occupancy and the second occupancy.
- 17. The information processing system of claim 16, wherein the processor is further configured to calculate an occupancy change amount for the target lesion based on the first occupancy and the second occupancy, and to determine whether the size of the target lesion has changed by comparing the calculated occupancy change amount for the target lesion with a reference value.
- 18. The information processing system of claim 17, wherein the reference value is determined based on a numerical value associated with a target metric calculated for a test set or a numerical value associated with a prediction accuracy.
- 19. The information processing system of claim 17, wherein the reference value includes a first reference value and a second reference value; the size of the target lesion is determined to have increased when the calculated occupancy change amount for the target lesion is greater than or equal to the first reference value; the size of the target lesion is determined to be unchanged when the calculated occupancy change amount for the target lesion is less than the first reference value and greater than or equal to the second reference value; and the size of the target lesion is determined to have decreased when the calculated occupancy change amount for the target lesion is less than the second reference value.
- 20. The information processing system of claim 16, wherein the processor is further configured to calculate an occupancy change amount for the target lesion based on the first occupancy and the second occupancy, and to determine whether the size of the target lesion has changed based on a determination result output by inputting the calculated occupancy change amount for the target lesion into a change determination model, wherein the change determination model includes a machine learning model trained to output a determination result on whether the size of a reference target lesion has changed based on an input value for a reference occupancy change amount for the target lesion.
Priority Applications (3)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023532580A (publication JP2023552163A) | 2021-04-12 | 2022-02-08 | Method and system for measuring size change of target lesion in X-ray image |
| EP22788241.2A (publication EP4272648A1) | 2021-04-12 | 2022-02-08 | Method and system for measuring size change of target lesion in X-ray image |
| US18/270,886 (publication US20240029258A1) | 2021-04-12 | 2022-02-08 | Method and system for measuring size change of target lesion in X-ray image |
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020210047446A (publication KR102577161B1) | 2021-04-12 | 2021-04-12 | Method and system for measuring size change of target lesion in X-ray image |
| KR10-2021-0047446 | 2021-04-12 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2022220383A1 | 2022-10-20 |
Family

ID=83640396

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2022/001892 | Method and system for measuring size change of target lesion in X-ray image | 2021-04-12 | 2022-02-08 |
Country Status (5)

| Country | Publication |
|---|---|
| US | US20240029258A1 |
| EP | EP4272648A1 |
| JP | JP2023552163A |
| KR | KR102577161B1 |
| WO | WO2022220383A1 |
Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030018245A1 | 2001-07-17 | 2003-01-23 | Accuimage Diagnostics Corp. | Methods for generating a lung report |
| JP2007534447A | 2004-04-26 | 2007-11-29 | Yankelevitz, David F. | Medical imaging system for precise measurement and evaluation of changes in a target lesion |
| JP4664489B2 | 2000-12-22 | 2011-04-06 | Toshiba Corp. | Radiotherapy planning system |
| JP2019154943A | 2018-03-15 | 2019-09-19 | Life Science Computing Corp. | Method and system for detecting lesions using artificial intelligence |
| KR102186893B1 | 2019-10-15 | 2020-12-04 | Leadbrain Co., Ltd. | Medical image processing system using an AI-based curriculum learning method, and intelligent medical diagnosis and treatment system |
Family Cites Families (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101611367B1 | 2013-11-18 | 2016-04-12 | Asan Social Welfare Foundation | Apparatus and method for brain disease diagnosis service |
| JP6855228B2 | 2016-12-13 | 2021-04-07 | Canon Medical Systems Corp. | Medical image diagnosis support apparatus, control method therefor, and program |
| KR101910822B1 | 2017-04-12 | 2018-10-24 | Korea University Research and Business Foundation | Apparatus and method for tracking the same lesion region across a plurality of medical images |
| KR102318959B1 | 2019-09-10 | 2021-10-27 | Keimyung University Industry-Academic Cooperation Foundation | Method for predicting the likelihood of lung cancer using an AI model that interprets medical images, and medical image analysis apparatus |
Application events:

- 2021-04-12: KR application KR1020210047446 filed; granted as KR102577161B1 (active, IP Right Grant)
- 2022-02-08: WO application PCT/KR2022/001892 filed; published as WO2022220383A1 (active, Application Filing)
- 2022-02-08: EP application EP22788241.2A filed; published as EP4272648A1 (pending)
- 2022-02-08: JP application JP2023532580A filed; published as JP2023552163A (pending)
- 2022-02-08: US application US18/270,886 filed; published as US20240029258A1 (pending)
Also Published As

| Publication number | Publication date |
|---|---|
| US20240029258A1 | 2024-01-25 |
| KR102577161B1 | 2023-09-11 |
| JP2023552163A | 2023-12-14 |
| EP4272648A1 | 2023-11-08 |
| KR20220141194A | 2022-10-19 |
Legal Events

| Code | Title | Details |
|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22788241; Country: EP; Kind code: A1 |
| WWE | WIPO information: entry into national phase | Ref document number: 2023532580; Country: JP |
| WWE | WIPO information: entry into national phase | Ref document number: 18270886; Country: US |
| WWE | WIPO information: entry into national phase | Ref document number: 2022788241; Country: EP |
| ENP | Entry into the national phase | Ref document number: 2022788241; Country: EP; Effective date: 2023-08-03 |
| NENP | Non-entry into the national phase | Ref country code: DE |