WO2018233520A1 - Method and apparatus for generating a predicted image - Google Patents

Method and apparatus for generating a predicted image (一种生成预测图像的方法及装置)

Info

Publication number: WO2018233520A1
Authority: WO (WIPO, PCT)
Application number: PCT/CN2018/090930
Prior art keywords: patient, image, lesion, stage, lesion image
Priority date: 2017-06-19
Filing date: 2018-06-13
Other languages: English (en), French (fr)
Inventors: 董文储, 张振中
Original assignee: 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Application filed by 京东方科技集团股份有限公司
Publication of WO2018233520A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 - for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40 - for processing medical images, e.g. editing
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 - for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present disclosure relates to the technical field of medicine, and in particular to a method and apparatus for generating a predicted image.
  • a method for generating a predicted image includes: acquiring feature information of a patient; determining, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and generating, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
  • generating a predicted lesion image for the patient based on the lesion image and the determined phase comprises: inputting the lesion image and the determined phase into a pre-established image prediction model to generate The patient's predicted lesion image.
  • the image prediction model is pre-established by acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive, and inputting the stages and the corresponding sample lesion images into a prediction model for training so as to establish the image prediction model.
  • the acquiring the feature information of the patient comprises: receiving medical record data of the patient; and extracting feature information of the patient from the medical record data.
  • the determining, according to the feature information of the patient, the stage in the disease course corresponding to the lesion image of the patient comprises: determining, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types are obtained by applying a clustering algorithm to patient feature information serving as samples, and each cluster type corresponds to one image classification model; and inputting the lesion image of the patient into the first image classification model corresponding to the first cluster type to determine the stage in the disease course corresponding to the lesion image.
  • the image classification model is obtained by: acquiring a sample lesion image corresponding to different stages in the disease course of each cluster type; and inputting the sample lesion image corresponding to the different stages into the classification model for training, and obtaining An image classification model corresponding to each cluster type.
  • generating the predicted lesion image for the patient based on the lesion image and the determined phase comprises: determining, according to the patient's characteristic information, the first of the patient's cluster type to which the patient belongs a cluster type, wherein the cluster type of the patient is obtained from patient feature information as a sample by using a clustering algorithm; according to the determined stage in the disease path corresponding to the lesion image of the patient, from the first A predicted lesion image for the patient is generated in the image information corresponding to the class type, wherein the image information includes a correspondence relationship between all stages in the course of the disease and the lesion image.
  • an apparatus for generating a predicted image includes: a feature information acquiring module configured to acquire feature information of a patient; and a phase determining module configured to determine the feature according to the feature information of the patient a stage in the course of the disease corresponding to the patient's lesion image; a predicted lesion image generation module configured to generate a predicted lesion image for the patient based on the lesion image and the determined phase, wherein the predicted lesion The image is the lesion image corresponding to the subsequent phase of the phase.
  • the predicted lesion image generation module is configured to generate a predicted lesion image of the patient by inputting the lesion image and the determined phase into a pre-established image prediction model.
  • the image prediction model is pre-established by acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive, and inputting the stages and the corresponding sample lesion images into a prediction model for training so as to establish the image prediction model.
  • the feature information acquiring module includes: a medical record data receiving submodule configured to receive medical record data of the patient; and a feature information extracting submodule configured to extract the characteristics of the patient from the medical record data information.
  • the stage determining module includes: a first cluster type determining submodule configured to determine, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types are obtained by applying a clustering algorithm to patient feature information serving as samples, and each cluster type corresponds to one image classification model; and an image classification submodule configured to determine the stage in the disease course corresponding to the lesion image by inputting the previously obtained lesion image of the patient into the first image classification model corresponding to the first cluster type.
  • a computing device comprising: a processor; and a memory storing computer-executable instructions that, when executed by the processor, perform any of the above methods.
  • FIG. 1a illustrates a flow chart of a method of generating a predicted image, in accordance with one embodiment of the present disclosure
  • Figure 1b shows a schematic diagram of the evolution of a predicted image in a natural development scenario in accordance with one embodiment of the present disclosure
  • FIG. 1c shows a schematic diagram of the evolution of a predicted image when treatment plan one is used, in accordance with the one embodiment of the present disclosure
  • Figure 1d illustrates a flow diagram of generating a predicted image model in accordance with one of the embodiments of the present disclosure
  • Figure 1e illustrates a flow diagram of generating a predicted image model in accordance with one of the embodiments of the present disclosure
  • Figure 1f illustrates a flow diagram of generating a predicted lesion image in accordance with one of the embodiments of the present disclosure
  • Figure 1g illustrates a flow diagram of generating a predicted lesion image in accordance with one of the embodiments of the present disclosure
  • Figure 1h illustrates a flow diagram of generating a model for generating a predicted image, in accordance with one of the embodiments of the present disclosure
  • FIG. 2a shows a flowchart of a method of generating a predicted image, in accordance with another embodiment of the present disclosure
  • FIG. 2b shows a flow chart of a stage of acquiring a lesion image according to another embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of an apparatus for generating a predicted image according to still another embodiment of the present disclosure
  • FIG. 4 is a schematic structural diagram of an apparatus for generating a predicted image according to still another embodiment of the present disclosure.
  • FIG. 5 illustrates an example computing device that can implement various techniques in an embodiment of the present disclosure.
  • FIG. 1a illustrates a flow chart of a method of generating a predicted image, in accordance with one embodiment of the present disclosure.
  • the method of generating a predicted image may include the following steps 101-103.
  • the characteristic information of the patient may include one or more of a patient's age, weight, symptoms, test items in the case, and the like.
  • acquiring the feature information of the patient may include: receiving medical record data of the patient; and extracting feature information of the patient from the medical record data.
  • When the patient goes to the hospital for an examination, the patient is asked to fill in personal information, such as name, age, height, and weight, in a medical record data form, and after each examination item is completed, the doctor also enters the examination results into the patient's medical record data form. After the examination of the patient is completed, the patient's feature information can be extracted from the patient's medical record data.
  • After acquiring the patient's feature information, step 102 is performed. At step 102, a stage in the disease course corresponding to the lesion image of the patient is determined based on the feature information of the patient.
  • a course of disease can include multiple stages.
  • the lesion image may be a medical image such as an X-ray image, a B-scan ultrasound image, a gastroscope photograph, an MRI image, or the like, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage in the disease course.
  • the stages in the course of the disease are based on age, height, weight, type of illness, and so on. For example, a lesion image of a patient having an age of 25 to 35 years old, a height of 168 to 178 cm, a body weight of 65 to 78 kg, the same type of illness, and a similar severity of the disease can be determined to be at the same stage.
  • the predicted image evolution for different stages of lesion images is different in the case of a natural progression or in the case of a treatment regimen.
  • the stage in the course of the disease corresponding to the lesion image of the patient may be determined according to the age, height, weight, type of disease, severity of the disease, and the like of the patient in the characteristic information of the patient.
  • Step 103 is performed after determining the stage in the course of the disease corresponding to the patient's lesion image.
  • a predicted lesion image for the patient is generated based on the lesion image and the determined phase, wherein the predicted lesion image is a lesion image corresponding to a subsequent phase of the phase.
  • the lesion image and the stage in the determined course of disease can be input into a pre-established image prediction model to generate a predicted lesion image for the patient.
  • the cluster type (i.e., the first cluster type) to which the patient belongs may be determined from a plurality of predetermined cluster types according to the feature information of the patient, where the cluster types are obtained by applying a clustering algorithm to the feature information of patients serving as samples; then, based on the stage in the disease course determined from the feature information of the patient, the predicted lesion image for the patient is determined from the image information corresponding to the first cluster type, where the image information includes the correspondence between all stages in the disease course and lesion images.
  • a clustering algorithm is applied to the feature information of the M patients as samples to obtain K categories, each of which represents one type of the disease.
  • the clustering algorithm is, for example, a K-means, a BIRCH, or a DENCLUE algorithm.
  • the cluster type of the patient is obtained by applying a clustering algorithm to the feature information of the patient as a sample, wherein each cluster type corresponds to one image classification model.
  • medical record data of a plurality of patients having a certain disease may be acquired in advance from medical big data, and corresponding feature information (such as age, weight, symptoms, and the examination items recorded in the cases) may be extracted from the medical record data of the plurality of patients; a clustering algorithm is then applied to the extracted feature information to obtain the cluster types for patients, wherein one cluster type corresponds to one type of the disease.
  • the image information associated with each cluster type contains the correspondence between stages in the disease course and lesion images. If the image information associated with the first cluster type already contains the lesion image for a given stage in the patient's disease course, the lesion image corresponding to that stage is determined directly from the image information associated with the first cluster type, without using an image prediction model to process the current patient's lesion image.
  • the predicted lesion image may be a lesion image corresponding to a subsequent stage of the determined stage.
  • the predicted lesion image may include a predicted subsequent image of the lesion under natural progression, i.e., how the lesion would develop in the future if no treatment plan were used.
  • as shown in FIG. 1b, the first picture is the patient's current gastroscope photograph, in which the shadowed area of the stomach is small. Without any treatment plan, the condition gradually worsens over time; by stage x (the middle picture), the shadowed area of the stomach is noticeably larger than in the first picture, indicating that the patient's stomach disease has worsened, and if treatment is still not applied, by stage x+N (the last picture) the patient's entire stomach has been invaded by the virus.
  • the patient can thus visually observe, based on the predicted lesion image, how the lesion would evolve in the future without any treatment plan.
  • the predicted lesion image may also include a predicted subsequent image under a certain treatment plan, i.e., how the lesion would develop in the future if that treatment plan were used.
  • as shown in FIG. 1c, the first picture is the patient's current gastroscope photograph, in which part of the stomach is shadowed. After treatment plan one is adopted, the condition is relieved over time; by stage x (the middle picture), the shadowed area of the stomach is noticeably smaller than in the first picture, indicating that the patient's stomach disease has improved, and if treatment plan one continues to be used, by stage x+N (the last picture) the patient's stomach no longer shows any shadowed area, indicating that the stomach disease has healed.
  • the patient can thus visually observe, based on the predicted lesion image, how the lesion would evolve in the future under treatment plan one.
  • it can be understood that lesion images at different stages of the disease course evolve differently, both under natural progression and under the same treatment plan.
  • likewise, lesion images at the same stage of the disease course evolve differently under different treatment plans; this is described in detail in the following embodiments and is not repeated here.
  • the image prediction model may be pre-established in the following manner:
  • Step N1: acquire stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive;
  • Step N2: input the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training, so as to establish the image prediction model.
  • stages in the course of the sample and corresponding sample lesion images may be obtained from medically relevant big data.
  • Medical-related big data includes detailed information about all patients during hospital examinations, such as age, height, weight, gender, and various examination data, etc., and thus includes various types of diseases in medical-related big data.
  • the stage of the disease and the corresponding lesion image may be obtained from medically relevant big data.
  • the image prediction model can be obtained by training the stage in the sample course (where the stage is continuous) and the corresponding sample lesion image into the prediction model.
  • the manner in which the image prediction model is acquired is shown in FIGS. 1d to 1e. First, as shown in FIG. 1d, medical record data of a plurality of patients having the same type of disease are acquired from medical big data, and the patients' feature information is extracted from the medical record data of the plurality of patients.
  • the patient's age, medical history, concurrent conditions, physical sign parameters, blood routine and other laboratory data can be obtained, and the lesion images at each stage of the disease course can be obtained.
  • then, as shown in FIG. 1e, predicted images of stages 1 to n in the disease course are generated based on the acquired feature information and lesion images of the plurality of patients with that type of disease, thereby establishing an image prediction model for that type of disease. In this way, image prediction models for different types of diseases can be established.
  • the current patient's lesion image and the stage in the corresponding disease course can be input into the established image prediction model to generate a predicted lesion image of the patient.
  • for example, as shown in FIG. 1f, the current patient's medical record data and related vital sign parameters such as age and weight are input into the image prediction model to obtain a matched patient type; after the matched patient type is acquired, the patient's current lesion image is input to obtain the image prediction model matching that lesion image.
  • then, as shown in FIG. 1g, based on the patient's current lesion image and its corresponding stage X in the disease course, the matched image prediction model outputs subsequent predicted images of the corresponding future stages in the disease course, such as the predicted image of stage X+1.
  • a predicted image of stage X+N can also be output, and a predicted image of a specific stage can be output according to the patient's needs.
  • an image prediction model can be established by, for example, inputting each of the consecutive stages of the sample disease course and the corresponding sample lesion images into a commonly used prediction model (e.g., a grey prediction model) for training. This can be done in the manner shown in FIG. 1h:
  • Step M1: establish an image generation model for the lesion images of each stage of each type of patient, wherein the image generation model may include an encoder and a decoder (both the encoder and the decoder may be CNN (Convolutional Neural Network) models commonly used in the prior art);
  • Step M2: input the lesion image of the i-th stage in the disease course of a patient of type i into the encoder;
  • Step M3: the encoder encodes the input image into a fixed-length vector d;
  • Step M4: input the vector d into the decoder, and take the lesion image corresponding to the (i+1)-th stage in the disease course of the patient of type i as the output.
  • An image generation model can be produced in the above manner. After the image generation model has been trained, a stage i+1 lesion image can be generated from an input stage i lesion image of the patient; the stage i+1 lesion image can then be used as input to generate a stage i+2 lesion image, and so on. In this way, lesion images of the consecutive stages i+1, i+2, ..., i+n following the patient's current stage can be obtained. The lesion images of the different stages are then used as training data to train the prediction model and obtain the image prediction model.
  • since medical big data contains a large number of lesion images of the same stage for each type of patient, embodiments of the present disclosure can use the same-stage lesion images of the same patient type acquired from medical big data to train the image generation model, so as to obtain a unified lesion image of that patient type at the current stage, and further obtain unified lesion images of that patient type at different stages for training, thereby obtaining a predicted image model for predicting the predicted lesion images of each future stage corresponding to the patient's lesion image.
  • the lesion image and the stage are input into a pre-established image prediction model to generate a predicted lesion image for the patient.
  • the patient can visually observe his or her future lesion evolution based on the generated predicted lesion image.
  • FIG. 2a shows a flowchart of a method of generating a predicted image, which may include the following steps 201-204, in accordance with another embodiment of the present disclosure.
  • the patient's characteristic information may include one or more of the patient's age, weight, symptoms, test items in the case, and the like.
  • a first cluster type to which the patient belongs is determined in a cluster type of the patient based on the characteristic information of the patient.
  • the cluster type of the patient is obtained from the feature information of the patient as a sample by using a clustering algorithm, wherein each cluster type corresponds to an image classification model.
  • the medical record data of a plurality of patients suffering from a certain disease can be obtained in advance from medical big data, and corresponding feature information (such as age, weight, symptoms, and the test items in the cases) is extracted from the medical record data of the plurality of patients. Then, according to the extracted feature information, a clustering algorithm is used to cluster the plurality of patients to obtain the cluster types of patients, wherein one cluster type corresponds to one type of the disease.
  • after the current patient's feature information is received, the clustering algorithm may be used to obtain the first cluster type to which the patient belongs from the predetermined cluster types of patients.
  • at step 203, the previously obtained lesion image of the patient is input into the image classification model corresponding to the first cluster type, and the stage in the disease course corresponding to the lesion image is determined.
  • since different cluster types correspond to different image classification models, once the first cluster type to which the patient belongs has been determined, the image classification model corresponding to that first cluster type can be obtained, and the stage in the disease course corresponding to the patient's lesion image can then be determined.
  • the associated image classification model can be obtained by:
  • Step S1: acquire the sample lesion images corresponding to the different stages in the disease course under each cluster type;
  • Step S2: input the sample lesion images corresponding to the different stages into a classification model for training, and obtain the image classification model corresponding to each cluster type.
  • for example, data of M patients (such as case data, medical image data, and physical signs and symptoms) are first acquired from medical big data, and the corresponding feature information (such as age, weight, symptoms, and the test items in the cases) is extracted from these data.
  • based on the extracted feature information, a clustering algorithm is used to cluster the M patients, and the corresponding cluster types of the patients are obtained, wherein each cluster type corresponds to one type of a certain disease.
  • then, a classifier is established for each cluster type, the classifier is used to classify the stages in the disease course of the lesion images of the patients belonging to that cluster type, and all the lesion images of the M patients in that cluster type, together with their stages, are used as training data to generate the corresponding image classification model.
  • the classification of the stages in the course of the lesion image by the classifier to obtain the image classification model can be performed as follows:
  • the patient lesion images for a certain type of disease are first acquired, and each lesion image is input into a CNN model, which outputs a corresponding d-dimensional vector, where d is a positive integer greater than or equal to 2; the d-dimensional vector is used to represent the patient's lesion image. The d-dimensional vector is then input into an SVM (Support Vector Machine) classifier to determine the stage to which the d-dimensional vector belongs, that is, the stage in the disease course to which the patient's lesion image belongs. After the stages of all the patients' lesion images have been determined, the stages in the disease course to which the lesion images of patients with that type of disease belong are used as training data to obtain the image classification model.
  • the lesion image and the stage in the course of disease are entered into a pre-established image prediction model to generate a predicted lesion image of the patient.
  • the predicted lesion image may be a lesion image corresponding to a subsequent stage of the stage. For example, if the stage of the current lesion image is x, the predicted lesion image is the predicted lesion image of the next stage or of the next several stages (e.g., stage x+1, ..., stage x+n) after the current stage.
  • Embodiments of the present disclosure do not limit how many stages of predicted lesion images are obtained.
  • the acquired stage of the lesion image of the current patient and the stage of the disease corresponding to the lesion image are input into a pre-established image prediction model, and the predicted lesion image is output from the prediction model.
  • the lesion image and the stage in the course of the disease are input into a pre-established image prediction model to generate a predicted lesion image of the patient. Therefore, the patient can visually observe his or her future lesion evolution based on the generated predicted lesion image.
  • FIG. 3 is a block diagram showing an apparatus for generating a predicted image according to still another embodiment of the present disclosure, which may include:
  • a feature information obtaining module 301 configured to acquire feature information of the patient
  • a stage determination module 302 configured to determine a stage in a disease course corresponding to the lesion image of the patient based on the characteristic information of the patient;
  • the predicted lesion image generation module 303 is configured to generate a predicted lesion image for the patient based on the lesion image and the determined phase, wherein the predicted lesion image is a lesion corresponding to a subsequent stage of the stage image.
  • the predicted lesion image generation module is configured to generate a predicted lesion image of the patient by inputting the lesion image and the determined phase into a pre-established image prediction model.
  • the feature information obtaining module 301 can include:
  • a medical record data receiving sub-module configured to receive medical record data of the patient
  • a feature information extraction sub-module configured to extract feature information of the patient from the medical record data.
  • the image prediction model is established by:
  • acquiring stages in a sample disease course, wherein the stages are consecutive, and the corresponding sample lesion images; and
  • inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
  • FIG. 4 is a schematic structural diagram of an apparatus for generating a predicted image according to still another embodiment of the present disclosure, which may include:
  • a feature information obtaining module 401 configured to acquire feature information of the patient
  • a stage determination module 402 configured to determine a stage in a disease course corresponding to the lesion image of the patient based on the characteristic information of the patient;
  • the predicted lesion image generation module 403 is configured to generate a predicted lesion image for the patient based on the lesion image and the determined phase, wherein the predicted lesion image is a lesion corresponding to a subsequent stage of the stage image.
  • the predicted lesion image generation module is configured to generate a predicted lesion image of the patient by inputting the lesion image and the determined phase into a pre-established image prediction model.
  • phase determining module 402 can include:
  • a first cluster type determining sub-module 4022 configured to determine, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types of patients are obtained by applying a clustering algorithm to patient feature information serving as samples, and each cluster type corresponds to an image classification model;
  • the image classification sub-module 4024 is configured to determine a stage in the course of the disease corresponding to the lesion image by inputting the previously acquired lesion image of the patient into the first image classification model corresponding to the first cluster type.
  • the embodiments of the present disclosure can be implemented by hardware, software, firmware or any combination thereof.
  • the technical solutions of the embodiments of the present disclosure may be embodied in the form of computer-executable instructions, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.). When the computer-executable instructions are executed, any of the methods described in the embodiments of the present disclosure is performed.
  • the technical solutions of the embodiments of the present disclosure may also be embodied in the form of a computing device (which may be a personal computer, a server, or a network device, etc.) that includes a processor and a memory, the memory storing computer-executable instructions which, when executed by the processor, perform any of the methods described in the embodiments of the present disclosure.
  • FIG. 5 illustrates an example computing device 500 that can implement the various techniques described herein.
  • Computing device 500 can be, for example, a server, a device associated with a client (eg, a client device), a system on a chip, and/or any other suitable computing device or computing system.
  • the example computing device 500 as illustrated includes a processing system 501, one or more computer-readable media 502, and one or more I/O interfaces 503 that are communicatively coupled to one another. Although not shown, computing device 500 may further include a system bus or other data and command transmission system that couples the various components to each other.
  • Processing system 501 represents functionality for performing one or more operations using hardware. Accordingly, processing system 501 is illustrated as including hardware elements 504 that can be configured as processors, functional blocks, and the like.
  • the processor can be comprised of a semiconductor and/or a transistor (eg, an electronic integrated circuit (IC)).
  • the processor-executable instructions can be electronically executable instructions.
  • Computer readable medium 502 is illustrated as including memory/storage 505.
  • Memory/storage 505 can include volatile media (such as random access memory (RAM)) and/or non-volatile media (such as read-only memory (ROM), flash memory, optical disks, magnetic disks, and so forth). Memory/storage 505 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disc, etc.).
  • One or more input/output interfaces 503 represent functionality for allowing a user to input commands and information to computing device 500 using various input devices and also to allow various output devices to present information to users and/or other components or devices.
  • examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice input), a scanner, touch functionality (e.g., a capacitive or other sensor configured to detect a physical touch), and a camera (which may use visible or non-visible wavelengths, such as infrared frequencies, to detect movements that do not involve touch, such as gestures).
  • Examples of output devices include display devices (eg, monitors or projectors), speakers, printers, network cards, tactile response devices, and the like.
  • generally, such modules include routines, programs, objects, elements, components, data structures, and the like that perform particular tasks or implement particular abstract data types.
  • as used herein, the terms "module", "functionality", and "component" generally represent software, firmware, hardware, or a combination thereof.
  • the features of the techniques described herein are platform-independent, which means that the techniques can be implemented on a variety of computing platforms having multiple processors.
  • Software, hardware or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer readable storage medium and/or by one or more hardware elements 504.
  • Computing device 500 can be configured to implement specific instructions and/or functionality corresponding to software and/or hardware modules.
  • computing device 500 can take a variety of different configurations, such as computers, mobile devices, and televisions.
  • the techniques described herein may be supported by these various configurations of computing device 500 and are not limited to the specific examples of the techniques described herein.
  • This functionality may also be implemented in whole or in part by using a distributed system, such as being implemented on a "cloud".
  • a person skilled in the art can understand that the drawings are only schematic diagrams of alternative embodiments, and the modules or processes in the drawings are not necessarily required to implement the disclosure.
  • modules in the apparatus described in the embodiments may be distributed in the manner described in the embodiments, or may be distributed in a manner different from that described in the embodiment.
  • the modules of the above embodiments may be combined into one module, or may be further split into multiple sub-modules.

Abstract

A method and apparatus for generating a predicted image. The method includes: acquiring feature information of a patient; determining, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and generating, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage. Based on the generated predicted lesion image, the patient can visually observe how his or her lesion will evolve in the future.

Description

Method and apparatus for generating a predicted image
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of Chinese Patent Application No. 201710467221.3, filed on June 19, 2017, the entire disclosure of which is incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to the technical field of medicine, and in particular to a method and apparatus for generating a predicted image.
BACKGROUND
In the current medical field, the processing and analysis of lesion images is one of the commonly used means: by analyzing lesion images (such as X-ray images, B-scan ultrasound images, gastroscope photographs, MRI images, and so on), a doctor can gain an in-depth understanding of the patient's lesion and give reasonable treatment advice.
However, since most patients have only a limited grasp of medical expertise, what the patient learns is merely the result of the doctor's analysis of the lesion image; the patient cannot accurately and vividly foresee how his or her lesion will evolve in the future.
SUMMARY
According to one aspect of the present disclosure, a method of generating a predicted image is provided, including: acquiring feature information of a patient; determining, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and generating, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
Optionally, generating the predicted lesion image for the patient based on the lesion image and the determined stage includes: inputting the lesion image and the determined stage into a pre-established image prediction model to generate the predicted lesion image for the patient.
Optionally, the image prediction model is pre-established by: acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive; and inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
Optionally, acquiring the feature information of the patient includes: receiving medical record data of the patient; and extracting the feature information of the patient from the medical record data.
Optionally, determining, according to the feature information of the patient, the stage in the disease course corresponding to the lesion image of the patient includes: determining, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types of patients are obtained by applying a clustering algorithm to patient feature information serving as samples, and each cluster type corresponds to one image classification model; and inputting the lesion image of the patient into the first image classification model corresponding to the first cluster type to determine the stage in the disease course corresponding to the lesion image.
Optionally, the image classification model is obtained by: acquiring the sample lesion images corresponding to the different stages in the disease course under each cluster type; and inputting the sample lesion images corresponding to the different stages into a classification model for training to obtain the image classification model corresponding to each cluster type.
Optionally, generating the predicted lesion image for the patient based on the lesion image and the determined stage includes: determining, according to the feature information of the patient, the first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types of patients are obtained by using a clustering algorithm on patient feature information serving as samples; and generating the predicted lesion image for the patient from the image information corresponding to the first cluster type according to the determined stage in the disease course corresponding to the lesion image of the patient, wherein the image information includes the correspondence between all stages in the disease course and lesion images.
According to another aspect of the present disclosure, an apparatus for generating a predicted image is provided, including: a feature information acquiring module configured to acquire feature information of a patient; a stage determining module configured to determine, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and a predicted lesion image generating module configured to generate, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
Optionally, the predicted lesion image generating module is configured to generate the predicted lesion image of the patient by inputting the lesion image and the determined stage into a pre-established image prediction model.
Optionally, the image prediction model is pre-established by: acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive; and inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
Optionally, the feature information acquiring module includes: a medical record data receiving submodule configured to receive medical record data of the patient; and a feature information extracting submodule configured to extract the feature information of the patient from the medical record data.
Optionally, the stage determining module includes: a first cluster type determining submodule configured to determine, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types of patients are obtained by using a clustering algorithm on patient feature information serving as samples, and each cluster type corresponds to one image classification model; and an image classification submodule configured to determine the stage in the disease course corresponding to the lesion image by inputting the previously obtained lesion image of the patient into the first image classification model corresponding to the first cluster type.
According to yet another aspect of the present disclosure, a computing device is provided, including: a processor; and a memory storing computer-executable instructions which, when executed by the processor, perform any of the methods described above.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1a shows a flowchart of a method of generating a predicted image according to one embodiment of the present disclosure;
FIG. 1b shows a schematic diagram of the evolution of a predicted image under natural progression according to the one embodiment of the present disclosure;
FIG. 1c shows a schematic diagram of the evolution of a predicted image when treatment plan one is used according to the one embodiment of the present disclosure;
FIG. 1d shows a flowchart of generating a predicted image model according to the one embodiment of the present disclosure;
FIG. 1e shows a flowchart of generating a predicted image model according to the one embodiment of the present disclosure;
FIG. 1f shows a flowchart of generating a predicted lesion image according to the one embodiment of the present disclosure;
FIG. 1g shows a flowchart of generating a predicted lesion image according to the one embodiment of the present disclosure;
FIG. 1h shows a flowchart of generating a model for generating a predicted image according to the one embodiment of the present disclosure;
FIG. 2a shows a flowchart of a method of generating a predicted image according to another embodiment of the present disclosure;
FIG. 2b shows a flowchart of acquiring the stage to which a lesion image belongs according to the other embodiment of the present disclosure;
FIG. 3 shows a schematic structural diagram of an apparatus for generating a predicted image according to still another embodiment of the present disclosure;
FIG. 4 shows a schematic structural diagram of an apparatus for generating a predicted image according to yet another embodiment of the present disclosure; and
FIG. 5 illustrates an example computing device that can implement various techniques in embodiments of the present disclosure.
DETAILED DESCRIPTION
To make the above objects, features and advantages of the present disclosure more apparent, the present disclosure is described in further detail below with reference to the accompanying drawings and specific embodiments.
FIG. 1a shows a flowchart of a method of generating a predicted image according to one embodiment of the present disclosure. Referring to FIG. 1a, the method of generating a predicted image may include the following steps 101 to 103.
At step 101, feature information of a patient is acquired. The feature information of the patient may include one or more of the patient's age, weight, symptoms, test items in the case, and so on.
Optionally, acquiring the feature information of the patient may include: receiving medical record data of the patient; and extracting the feature information of the patient from the medical record data.
When the patient goes to the hospital for an examination, the patient is asked to fill in personal information, such as name, age, height, and weight, in a medical record data form, and after each examination item is completed, the doctor also enters the examination results into the patient's medical record data form. After the examination of the patient is completed, the patient's feature information can be extracted from the patient's medical record data.
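As a purely illustrative sketch (the patent does not prescribe any data format or field names), extracting a numeric feature vector from a structured medical record might look like the following; the field names and the encoding are assumptions made only for this example:

```python
from typing import Dict, List

# Hypothetical example: turn one patient's medical record fields into a
# fixed-order numeric feature vector (age, weight, symptom flags, test values).
SYMPTOMS = ["stomach_pain", "nausea", "weight_loss"]          # assumed vocabulary
TEST_ITEMS = ["hemoglobin", "white_cell_count"]               # assumed test items

def record_to_features(record: Dict) -> List[float]:
    features = [float(record.get("age", 0)), float(record.get("weight", 0))]
    # one-hot style flags for symptoms mentioned in the record
    features += [1.0 if s in record.get("symptoms", []) else 0.0 for s in SYMPTOMS]
    # numeric lab results, defaulting to 0 when an item is missing
    features += [float(record.get("tests", {}).get(t, 0.0)) for t in TEST_ITEMS]
    return features

example = {"age": 46, "weight": 72, "symptoms": ["nausea"], "tests": {"hemoglobin": 13.2}}
print(record_to_features(example))   # [46.0, 72.0, 0.0, 1.0, 0.0, 13.2, 0.0]
```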
After the patient's feature information is acquired, step 102 is performed. At step 102, a stage in the disease course corresponding to the lesion image of the patient is determined according to the feature information of the patient. A disease course may include multiple stages.
The lesion image may be a medical image such as an X-ray image, a B-scan ultrasound image, a gastroscope photograph, an MRI image, or the like, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage in the disease course.
The stages in the disease course are divided based on age, height, weight, type of illness, and so on. For example, the lesion images of patients who are 25 to 35 years old, 168 to 178 cm tall, and 65 to 78 kg in weight, have the same type of illness, and have a similar severity of illness can be determined to be at the same stage.
Under natural progression, or when a treatment plan is used, the predicted image evolution is different for lesion images at different stages.
In the embodiments of the present disclosure, which stage of the disease course the patient's lesion image is at can be determined according to the patient's feature information, for example according to the patient's age, height, weight, type of illness, severity of the illness, and other information in the feature information.
After the stage in the disease course corresponding to the patient's lesion image is determined, step 103 is performed. At step 103, a predicted lesion image for the patient is generated based on the lesion image and the determined stage, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
As one example, the lesion image and the determined stage in the disease course can be input into a pre-established image prediction model to generate the predicted lesion image for the patient.
As another example, the cluster type to which the patient belongs (i.e., the first cluster type) may be determined from a plurality of predetermined cluster types according to the feature information of the patient, the cluster types being obtained by using a clustering algorithm on the feature information of patients serving as samples; then, based on the stage in the disease course determined from the patient's feature information, the predicted lesion image for the patient is determined from the image information corresponding to the first cluster type, the image information including the correspondence between all stages in the disease course and lesion images. As an example, a clustering algorithm is applied to the feature information of M patients serving as samples to obtain K categories, each of which represents one type of the disease. The clustering algorithm is, for example, K-means, BIRCH, or DENCLUE.
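For illustration only, clustering patient feature vectors with K-means, one of the algorithms named above, might be sketched as follows; the feature matrix and the number of cluster types K are placeholders rather than values from the disclosure:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical sketch: cluster M patients into K types from their feature vectors.
rng = np.random.default_rng(0)
patient_features = rng.normal(size=(500, 7))   # M=500 patients, 7 features each (assumed)

K = 4                                          # number of cluster types (assumed)
kmeans = KMeans(n_clusters=K, n_init=10, random_state=0).fit(patient_features)

# Each sample patient now belongs to one cluster type; a new patient is assigned
# to its "first cluster type" with predict().
new_patient = rng.normal(size=(1, 7))
first_cluster_type = int(kmeans.predict(new_patient)[0])
print("cluster sizes:", np.bincount(kmeans.labels_), "new patient ->", first_cluster_type)
```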
In the embodiments of the present disclosure, the cluster types of patients are obtained by applying a clustering algorithm to the feature information of patients serving as samples, wherein each cluster type corresponds to one image classification model.
As an example, the medical record data of a plurality of patients suffering from a certain disease may be acquired in advance from medical big data, and the corresponding feature information (such as age, weight, symptoms, and the examination items recorded in the cases) may be extracted from the medical record data of the plurality of patients; a clustering algorithm is then used to obtain the cluster types for patients from the extracted feature information, wherein one cluster type corresponds to one type of the disease.
The image information associated with each cluster type contains the correspondence between the stages in the disease course and lesion images. If the image information associated with the first cluster type already contains the lesion image of a certain future stage in the patient's disease course, the lesion image corresponding to the stage required by the current patient is determined directly from the image information associated with the first cluster type, without using an image prediction model to process the current patient's lesion image.
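A minimal sketch of such a lookup, assuming the image information is stored as a mapping from (cluster type, stage) to a stored lesion image, is shown below; the data structure is an assumption made for illustration, not a structure defined by the patent:

```python
# Hypothetical "image information" table: (cluster type, stage) -> stored lesion image.
image_information = {
    (0, 1): "cluster0_stage1.png",
    (0, 2): "cluster0_stage2.png",
    # ... entries for other cluster types and stages
}

def stored_predicted_image(cluster_type: int, wanted_stage: int):
    key = (cluster_type, wanted_stage)
    if key in image_information:
        return image_information[key]   # reuse the stored lesion image directly
    return None                          # caller falls back to the image prediction model

print(stored_predicted_image(0, 2), stored_predicted_image(0, 9))
```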
In the embodiments of the present disclosure, the predicted lesion image may be a lesion image corresponding to a subsequent stage of the determined stage. The predicted lesion image may include a predicted subsequent image of the lesion under natural progression, i.e., how the lesion would develop in the future if no treatment plan were used. As shown in FIG. 1b, the first picture is the patient's current gastroscope photograph, in which the shadowed area of the stomach is small. If no treatment plan is adopted, the condition gradually worsens over time; by stage x (the middle picture), the shadowed area of the stomach is noticeably larger than in the first picture, indicating that the patient's stomach disease has worsened, and if treatment is still not applied, by stage x+N (the last picture) the patient's entire stomach has been invaded by the virus. The patient can thus visually observe, based on the predicted lesion image, how the lesion would evolve in the future without any treatment plan.
Correspondingly, the predicted lesion image may also include a predicted subsequent image under a certain treatment plan, i.e., how the lesion would develop in the future if that treatment plan were adopted. As shown in FIG. 1c, the first picture is the patient's current gastroscope photograph, in which part of the stomach is shadowed. After treatment plan one is adopted, the condition is relieved over time; by stage x (the middle picture), the shadowed area of the stomach is noticeably smaller than in the first picture, indicating that the patient's stomach disease has improved, and if treatment plan one continues to be used, by stage x+N (the last picture) the patient's stomach no longer shows any shadowed area, indicating that the stomach disease has healed. The patient can thus visually observe, based on the predicted lesion image, how the lesion would evolve in the future under treatment plan one.
It can be understood that lesion images at different stages of the disease course evolve differently under natural progression or under the same treatment plan, and that lesion images at the same stage of the disease course evolve differently under different treatment plans; this is described in detail in the following embodiments and is not repeated here.
Optionally, the image prediction model may be pre-established in the following manner:
Step N1: acquire stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive;
Step N2: input the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training, so as to establish the image prediction model.
In the embodiments of the present disclosure, the stages in the sample disease course and the corresponding sample lesion images can be acquired from medical big data. Medical big data includes the detailed information of all patients examined in hospitals, such as age, height, weight, gender, and various examination data, and therefore includes the stages in the disease course and the corresponding lesion images for all types of diseases.
With the help of medical big data, the stages in the sample disease course and the corresponding sample lesion images for each type of disease can be acquired. The image prediction model can be obtained by inputting the stages in the sample disease course (where the stages are consecutive) and the corresponding sample lesion images into the prediction model for training.
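As one hypothetical way to organize such training data (the disclosure does not specify a format), consecutive-stage sample pairs could be assembled per patient type as follows; the record fields are assumptions made for the example:

```python
from collections import defaultdict

# Hypothetical sketch: assemble (stage-i image, stage-(i+1) image) training pairs per
# patient type from sample records drawn from medical big data.
sample_records = [
    {"patient_type": "gastric_A", "patient_id": 1, "stage": 1, "image": "p1_s1.png"},
    {"patient_type": "gastric_A", "patient_id": 1, "stage": 2, "image": "p1_s2.png"},
    {"patient_type": "gastric_A", "patient_id": 1, "stage": 3, "image": "p1_s3.png"},
]

def consecutive_pairs(records):
    by_patient = defaultdict(dict)
    for r in records:
        by_patient[(r["patient_type"], r["patient_id"])][r["stage"]] = r["image"]
    pairs = defaultdict(list)
    for (ptype, _), stage_images in by_patient.items():
        for stage in sorted(stage_images):
            if stage + 1 in stage_images:        # keep only consecutive stages
                pairs[ptype].append((stage_images[stage], stage_images[stage + 1]))
    return pairs

print(dict(consecutive_pairs(sample_records)))
# {'gastric_A': [('p1_s1.png', 'p1_s2.png'), ('p1_s2.png', 'p1_s3.png')]}
```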
For example, the manner of acquiring the image prediction model is shown in FIGS. 1d to 1e. First, as shown in FIG. 1d, the medical record data of a plurality of patients having the same type of disease are acquired from medical big data, and the patients' feature information is then extracted from the medical record data of the plurality of patients. As shown in the figure, the patient's age, medical history, concurrent conditions, physical sign parameters, blood routine and other laboratory data can be obtained, and the lesion images at each stage of the disease course can be obtained. Then, as shown in FIG. 1e, predicted images of stages 1 to n in the disease course are generated based on the acquired feature information and lesion images of the plurality of patients with that type of disease, thereby establishing an image prediction model for that type of disease. In this way, image prediction models for different types of diseases can be established.
After the image prediction model is established, the current patient's lesion image and the corresponding stage in the disease course can be input into the established image prediction model to generate the patient's predicted lesion image. For example, as shown in FIG. 1f, the current patient's medical record data and related vital sign parameters such as age and weight are input into the image prediction model to obtain a matched patient type; after the matched patient type is obtained, the patient's current lesion image is input to obtain the image prediction model matching that lesion image. Then, as shown in FIG. 1g, based on the patient's current lesion image and its corresponding stage X in the disease course, the matched image prediction model outputs the subsequent predicted images of the corresponding future stages in the disease course, for example the predicted image of stage X+1; it can also output the predicted image of stage X+N, and can output the predicted image of a specific stage according to the patient's needs.
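The following function is a hypothetical composition of the inference flow of FIGS. 1f to 1g; it simply ties together the kinds of components sketched elsewhere in this description (a clustering model, per-type stage classifiers, an image feature extractor, and a stage-to-stage generator), all of which are passed in as parameters and are not APIs defined by the patent:

```python
def predict_future_images(patient_features, current_image, n_future_stages,
                          kmeans, stage_classifiers, image_to_vector, generator_rollout):
    """Illustrative composition only; every component here is a hypothetical stand-in."""
    # 1. Match the patient to a cluster type from the feature information (FIG. 1f).
    cluster_type = int(kmeans.predict([patient_features])[0])
    # 2. Determine the stage of the current lesion image with that type's classifier.
    stage = int(stage_classifiers[cluster_type].predict(image_to_vector(current_image))[0])
    # 3. Generate predicted images for stages stage+1 ... stage+n (FIG. 1g).
    future_images = generator_rollout(cluster_type, current_image, n_future_stages)
    return cluster_type, stage, future_images
```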
In one embodiment, an image prediction model can be established by inputting each of the consecutive stages in the sample disease course and the corresponding sample lesion images into a commonly used prediction model (such as a grey prediction model) for training. This can be done, for example, in the manner shown in FIG. 1h:
Step M1: establish an image generation model for the lesion images of each stage of each type of patient, wherein the image generation model may include an encoder and a decoder (both the encoder and the decoder may be CNN (Convolutional Neural Network) models commonly used in the prior art);
Step M2: input the lesion image of the i-th stage in the disease course of a patient of type i into the encoder;
Step M3: the encoder encodes the input image into a fixed-length vector d;
Step M4: input the vector d into the decoder, and take the lesion image corresponding to the (i+1)-th stage in the disease course of the patient of type i as the output.
An image generation model can be produced in the above manner. After the image generation model has been trained, a stage i+1 lesion image can be generated from the input stage i lesion image of the patient, and the stage i+1 lesion image can in turn be used as input to generate a stage i+2 lesion image. In this way, the lesion images of the consecutive stages i+1, i+2, ..., i+n following the patient's current stage can be obtained. The lesion images of the different stages are then used as training data to train the prediction model and obtain the image prediction model.
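A minimal, hypothetical sketch of such an encoder-decoder image generation model and of the stage-by-stage roll-out, written in PyTorch, is given below; the image size, the vector length d, and the layer shapes are assumptions for illustration and are not taken from the patent:

```python
import torch
import torch.nn as nn

class StageGenerator(nn.Module):
    """Encode a stage-i lesion image into a fixed-length vector d, decode stage i+1."""
    def __init__(self, d: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, d),                            # fixed-length vector d
        )
        self.decoder = nn.Sequential(
            nn.Linear(d, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_step(model, optimizer, img_stage_i, img_stage_i_plus_1):
    # Supervised pair: input is the stage-i image, target is the stage-(i+1) image.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(img_stage_i), img_stage_i_plus_1)
    loss.backward()
    optimizer.step()
    return loss.item()

def roll_out(model, current_image, n_steps: int):
    # After training, feed each generated image back in to obtain stages i+1 ... i+n.
    images, img = [], current_image
    with torch.no_grad():
        for _ in range(n_steps):
            img = model(img)
            images.append(img)
    return images

model = StageGenerator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_i = torch.rand(8, 1, 64, 64)        # dummy batch of stage-i images
x_next = torch.rand(8, 1, 64, 64)     # dummy stage-(i+1) targets
print("loss:", train_step(model, opt, x_i, x_next), "rollout:", len(roll_out(model, x_i[:1], 3)))
```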
Since the number of same-stage lesion images of each type of patient that can be acquired from medical big data is considerable, the embodiments of the present disclosure can use the same-stage lesion images of the same type of patient acquired from medical big data to train the image generation model, so as to obtain a unified lesion image of that type of patient at the current stage, and further obtain unified lesion images of that type of patient at different stages for training, thereby obtaining a predicted image model for predicting the predicted lesion images of each future stage corresponding to the patient's lesion image.
In the embodiments of the present disclosure, after the stage in the disease course corresponding to the patient's lesion image is determined, the lesion image and the stage are input into the pre-established image prediction model to generate the predicted lesion image for the patient. In this way, the patient can visually observe, based on the generated predicted lesion image, how his or her lesion will evolve in the future.
FIG. 2a shows a flowchart of a method of generating a predicted image according to another embodiment of the present disclosure, which may include the following steps 201 to 204.
At step 201, feature information of a patient is acquired. In the embodiments of the present disclosure, the feature information of the patient may include one or more of the patient's age, weight, symptoms, test items in the case, and so on.
After the patient's feature information is acquired, step 202 is performed. At step 202, a first cluster type to which the patient belongs is determined from among the cluster types of patients according to the feature information of the patient.
In the embodiments of the present disclosure, the cluster types of patients are obtained by using a clustering algorithm on the feature information of patients serving as samples, wherein each cluster type corresponds to an image classification model.
The medical record data of a plurality of patients suffering from a certain disease can be acquired in advance from medical big data, and the corresponding feature information (such as age, weight, symptoms, and the test items in the cases) is extracted from the medical record data of the plurality of patients. Then, according to the extracted feature information, a clustering algorithm is used to cluster the plurality of patients to obtain the cluster types of patients, wherein one cluster type corresponds to one type of the disease.
After the current patient's feature information is received, the clustering algorithm can be used to obtain the first cluster type to which the patient belongs from the predetermined cluster types of patients.
After the first cluster type to which the current patient belongs is determined, step 203 is performed. At step 203, the previously obtained lesion image of the patient is input into the image classification model corresponding to the first cluster type, and the stage in the disease course corresponding to the lesion image is determined.
Since different cluster types correspond to different image classification models, after the first cluster type to which the current patient belongs is determined, the image classification model corresponding to the first cluster type can be obtained, and the stage in the disease course corresponding to the patient's lesion image can then be determined.
Optionally, the image classification model may be obtained in the following manner:
Step S1: acquire the sample lesion images corresponding to the different stages in the disease course under each cluster type;
Step S2: input the sample lesion images corresponding to the different stages into a classification model for training, and obtain the image classification model corresponding to each cluster type.
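For illustration, one way to organize the per-cluster-type training of steps S1 and S2 is sketched below with placeholder data; a logistic regression classifier is used here only for brevity, whereas the embodiment described next uses a CNN together with an SVM:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical sketch: train one stage classifier per cluster type.
# samples_by_cluster maps cluster type -> (lesion-image feature vectors, stage labels);
# the data below are random placeholders.
rng = np.random.default_rng(1)
samples_by_cluster = {
    c: (rng.normal(size=(200, 128)), rng.integers(1, 6, size=200)) for c in range(4)
}

stage_classifiers = {}
for cluster_type, (features, stages) in samples_by_cluster.items():
    clf = LogisticRegression(max_iter=1000)
    stage_classifiers[cluster_type] = clf.fit(features, stages)

# At inference time, look up the classifier of the patient's first cluster type.
print(stage_classifiers[0].predict(rng.normal(size=(1, 128))))
```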
For example, the data of M patients (such as case data, medical image data, and physical signs and symptoms) are first acquired from medical big data, and the corresponding feature information (such as age, weight, symptoms, and the test items in the cases) is then extracted from these patients' data. Based on the extracted feature information, a clustering algorithm is used to cluster the M patients, and the corresponding cluster types of patients are obtained, wherein each cluster type corresponds to one type of a certain disease. Then, a classifier is established for each cluster type, the classifier is used to classify the stages in the disease course of the lesion images of the patients belonging to that cluster type, and all the lesion images of the M patients in that cluster type, together with their stages, are used as training data to generate the corresponding image classification model.
Classifying the stages in the disease course of the lesion images by means of a classifier to obtain the image classification model can be performed in the following manner:
As shown in FIG. 2b, the patient lesion images for a certain type of disease are first acquired, and each patient lesion image is input into a CNN model, which outputs a corresponding d-dimensional vector, where d is a positive integer greater than or equal to 2; the d-dimensional vector is used to represent the patient's lesion image. The d-dimensional vector is then input into an SVM (Support Vector Machine) classifier to determine the stage to which the d-dimensional vector belongs, that is, the stage in the disease course to which the patient's lesion image belongs. After the stages to which all the patients' lesion images belong have been determined, the stages in the disease course to which the lesion images of the patients with that type of disease belong are used as training data for training to obtain the image classification model.
How the CNN model outputs the d-dimensional vector and how the SVM classifier determines the stage to which the d-dimensional vector belongs are technical means commonly used in the art.
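A minimal sketch of that CNN-to-vector-to-SVM pipeline, under an assumed image size and with a small untrained CNN standing in for whatever feature extractor is actually used, might look like this:

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

# Hypothetical sketch of: lesion image -> CNN -> d-dimensional vector -> SVM stage classifier.
d = 64
cnn = nn.Sequential(
    nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, d),
)

def image_to_vector(images: torch.Tensor) -> np.ndarray:
    with torch.no_grad():
        return cnn(images).numpy()

# Dummy training data: lesion images and their known stages (labels 1..5).
images = torch.rand(300, 1, 64, 64)
stages = np.random.randint(1, 6, size=300)

svm = SVC(kernel="rbf")
svm.fit(image_to_vector(images), stages)

# Stage of a new lesion image.
print(int(svm.predict(image_to_vector(torch.rand(1, 1, 64, 64)))[0]))
```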
At step 204, the lesion image and the stage in the disease course are input into the pre-established image prediction model to generate the predicted lesion image of the patient.
In the embodiments of the present disclosure, the predicted lesion image may be a lesion image corresponding to a subsequent stage of the stage. For example, if the stage of the current lesion image is x, the predicted lesion image is the predicted lesion image of the next stage or of the next several stages (e.g., stage x+1, ..., stage x+n) after the current stage.
The embodiments of the present disclosure do not limit how many stages of predicted lesion images are obtained.
The acquired lesion image of the current patient and the stage in the disease course corresponding to the lesion image are input into the pre-established image prediction model, and the prediction model outputs the predicted lesion image.
In the embodiments of the present disclosure, after the stage in the disease course corresponding to the patient's lesion image is determined, the lesion image and the stage in the disease course are input into the pre-established image prediction model to generate the predicted lesion image of the patient. Therefore, the patient can visually observe, based on the generated predicted lesion image, how his or her lesion will evolve in the future.
FIG. 3 shows a schematic structural diagram of an apparatus for generating a predicted image according to still another embodiment of the present disclosure, which may include:
a feature information acquiring module 301 configured to acquire feature information of a patient;
a stage determining module 302 configured to determine, according to the feature information of the patient, a stage in a disease course corresponding to the lesion image of the patient; and
a predicted lesion image generating module 303 configured to generate, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
Optionally, the predicted lesion image generating module is configured to generate the predicted lesion image of the patient by inputting the lesion image and the determined stage into a pre-established image prediction model.
The feature information acquiring module 301 may include:
a medical record data receiving submodule configured to receive medical record data of the patient; and
a feature information extracting submodule configured to extract the feature information of the patient from the medical record data.
Optionally, the image prediction model is established by:
acquiring stages in a sample disease course, wherein the stages are consecutive, and the corresponding sample lesion images; and
inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
FIG. 4 shows a schematic structural diagram of an apparatus for generating a predicted image according to yet another embodiment of the present disclosure, which may include:
a feature information acquiring module 401 configured to acquire feature information of a patient;
a stage determining module 402 configured to determine, according to the feature information of the patient, a stage in a disease course corresponding to the lesion image of the patient; and
a predicted lesion image generating module 403 configured to generate, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
Optionally, the predicted lesion image generating module is configured to generate the predicted lesion image of the patient by inputting the lesion image and the determined stage into a pre-established image prediction model.
Optionally, the stage determining module 402 may include:
a first cluster type determining submodule 4022 configured to determine, according to the feature information of the patient, a first cluster type to which the patient belongs from among the cluster types of patients, wherein the cluster types of patients are obtained by using a clustering algorithm on patient feature information serving as samples, and each cluster type corresponds to an image classification model; and
an image classification submodule 4024 configured to determine the stage in the disease course corresponding to the lesion image by inputting the previously obtained lesion image of the patient into the first image classification model corresponding to the first cluster type.
From the description of the above embodiments, those skilled in the art can clearly understand that the embodiments of the present disclosure can be implemented by hardware, software, firmware, or any combination thereof. The technical solutions of the embodiments of the present disclosure may be embodied in the form of computer-executable instructions, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.); when the computer-executable instructions are executed, any of the methods described in the embodiments of the present disclosure is performed. The technical solutions of the embodiments of the present disclosure may also be embodied in the form of a computing device (which may be a personal computer, a server, or a network device, etc.) that includes a processor and a memory, the memory storing computer-executable instructions which, when executed by the processor, perform any of the methods described in the embodiments of the present disclosure.
FIG. 5 illustrates an example computing device 500 that can implement the various techniques described herein. The computing device 500 may be, for example, a server, a device associated with a client (e.g., a client device), a system on a chip, and/or any other suitable computing device or computing system.
The example computing device 500 as illustrated includes a processing system 501, one or more computer-readable media 502, and one or more I/O interfaces 503 that are communicatively coupled to one another. Although not shown, the computing device 500 may further include a system bus or other data and command transmission system that couples the various components to each other.
The processing system 501 represents functionality for performing one or more operations using hardware. Accordingly, the processing system 501 is illustrated as including hardware elements 504, which may be configured as processors, functional blocks, and the like. For example, a processor may be composed of semiconductors and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically executable instructions.
The computer-readable media 502 are illustrated as including memory/storage 505. The memory/storage 505 may include volatile media (such as random access memory (RAM)) and/or non-volatile media (such as read-only memory (ROM), flash memory, optical disks, magnetic disks, and the like). The memory/storage 505 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disc, etc.).
The one or more input/output interfaces 503 represent functionality that allows a user to input commands and information to the computing device 500 using various input devices, and also allows information to be presented to the user and/or other components or devices using various output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice input), a scanner, touch functionality (e.g., a capacitive or other sensor configured to detect a physical touch), and a camera (which may use visible or non-visible wavelengths, such as infrared frequencies, to detect movements that do not involve touch, such as gestures). Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a haptic response device, and the like.
Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and the like that perform particular tasks or implement particular abstract data types. As used herein, the terms "module", "functionality", and "component" generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques can be implemented on a variety of computing platforms having a variety of processors.
Software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or by one or more hardware elements 504. The computing device 500 may be configured to implement specific instructions and/or functions corresponding to software and/or hardware modules.
In various implementations, the computing device 500 may take a variety of different configurations, such as a computer, a mobile device, and a television. The techniques described herein may be supported by these various configurations of the computing device 500 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented in whole or in part by using a distributed system, such as being implemented on a "cloud". Those skilled in the art can understand that the drawings are merely schematic diagrams of optional embodiments, and the modules or processes in the drawings are not necessarily required for implementing the present disclosure.
Those skilled in the art can understand that the modules in the apparatus described in the embodiments may be distributed in the manner described in the embodiments, or may be distributed in a manner different from that described in the embodiments. The modules of the above embodiments may be combined into one module, or may be further split into multiple submodules.
For simplicity of description, each of the foregoing method embodiments is expressed as a series of combinations of actions, but those skilled in the art should understand that the present disclosure is not limited by the described order of actions, and some steps may be performed in other orders or simultaneously. Those skilled in the art should also understand that the embodiments described in the specification are all optional embodiments, and the actions and modules involved are not necessarily required by the present disclosure.
It should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
Obviously, those skilled in the art can make various changes and modifications to the embodiments of the present disclosure without departing from the spirit and scope of the present disclosure. If these modifications and variations of the embodiments of the present disclosure fall within the scope of the claims of the present disclosure and their equivalents, the present disclosure is also intended to include these changes and modifications.

Claims (13)

  1. A method of generating a predicted image, comprising:
    acquiring feature information of a patient;
    determining, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and
    generating, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
  2. The method according to claim 1, wherein generating the predicted lesion image for the patient based on the lesion image and the determined stage comprises:
    inputting the lesion image and the determined stage into a pre-established image prediction model to generate the predicted lesion image for the patient.
  3. The method according to claim 2, wherein the image prediction model is pre-established by:
    acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive; and
    inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
  4. The method according to claim 1, wherein acquiring the feature information of the patient comprises:
    receiving medical record data of the patient; and
    extracting the feature information of the patient from the medical record data.
  5. The method according to claim 1, wherein determining, according to the feature information of the patient, the stage in the disease course corresponding to the lesion image of the patient comprises:
    determining, according to the feature information of the patient, a first cluster type to which the patient belongs from among cluster types of patients, wherein the cluster types of patients are obtained by applying a clustering algorithm to patient feature information serving as samples, and each cluster type corresponds to one image classification model; and
    inputting the lesion image of the patient into a first image classification model corresponding to the first cluster type, and determining the stage in the disease course corresponding to the lesion image.
  6. The method according to claim 5, wherein the image classification model is obtained by:
    acquiring sample lesion images corresponding to different stages in the disease course under each cluster type; and
    inputting the sample lesion images corresponding to the different stages into a classification model for training to obtain the image classification model corresponding to each cluster type.
  7. The method according to claim 1, wherein generating the predicted lesion image for the patient based on the lesion image and the determined stage comprises:
    determining, according to the feature information of the patient, a first cluster type to which the patient belongs from among cluster types of patients, wherein the cluster types of patients are obtained by using a clustering algorithm on patient feature information serving as samples; and
    generating the predicted lesion image for the patient from image information corresponding to the first cluster type according to the determined stage in the disease course corresponding to the lesion image of the patient, wherein the image information comprises a correspondence between all stages in the disease course and lesion images.
  8. An apparatus for generating a predicted image, comprising:
    a feature information acquiring module configured to acquire feature information of a patient;
    a stage determining module configured to determine, according to the feature information of the patient, a stage in a disease course corresponding to a lesion image of the patient; and
    a predicted lesion image generating module configured to generate, based on the lesion image and the determined stage, a predicted lesion image for the patient, wherein the predicted lesion image is a lesion image corresponding to a subsequent stage of the stage.
  9. The apparatus according to claim 8, wherein the predicted lesion image generating module is configured to generate the predicted lesion image of the patient by inputting the lesion image and the determined stage into a pre-established image prediction model.
  10. The apparatus according to claim 8, wherein the image prediction model is pre-established by:
    acquiring stages in a sample disease course and the corresponding sample lesion images, wherein the stages in the sample disease course are consecutive; and
    inputting the stages in the sample disease course and the corresponding sample lesion images into a prediction model for training to establish the image prediction model.
  11. The apparatus according to claim 8, wherein the feature information acquiring module comprises:
    a medical record data receiving submodule configured to receive medical record data of the patient; and
    a feature information extracting submodule configured to extract the feature information of the patient from the medical record data.
  12. The apparatus according to claim 8, wherein the stage determining module comprises:
    a first cluster type determining submodule configured to determine, according to the feature information of the patient, a first cluster type to which the patient belongs from among cluster types of patients, wherein the cluster types of patients are obtained by using a clustering algorithm on patient feature information serving as samples, and each cluster type corresponds to one image classification model; and
    an image classification submodule configured to determine the stage in the disease course corresponding to the lesion image by inputting the previously obtained lesion image of the patient into a first image classification model corresponding to the first cluster type.
  13. A computing device, comprising:
    a processor; and
    a memory storing computer-executable instructions which, when executed by the processor, perform the method according to any one of claims 1 to 7.
PCT/CN2018/090930 2017-06-19 2018-06-13 Method and apparatus for generating a predicted image WO2018233520A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710467221.3A CN107292103B (zh) 2017-06-19 2017-06-19 Predicted image generation method and apparatus
CN201710467221.3 2017-06-19

Publications (1)

Publication Number Publication Date
WO2018233520A1 (zh)

Family

ID=60097375

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090930 WO2018233520A1 (zh) 2017-06-19 2018-06-13 Method and apparatus for generating a predicted image

Country Status (2)

Country Link
CN (1) CN107292103B (zh)
WO (1) WO2018233520A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107292103B (zh) 2017-06-19 2020-07-31 京东方科技集团股份有限公司 Predicted image generation method and apparatus
CN108122613B (zh) 2018-01-15 2022-04-01 北京颐圣智能科技有限公司 Health prediction method and apparatus based on a health prediction model
CN110246563A (zh) 2019-06-14 2019-09-17 合肥大族科瑞达激光设备有限公司 Holmium laser treatment device
CN110544534B (zh) 2019-08-30 2022-04-19 中国人民解放军联勤保障部队第九〇〇医院 Method and system for automatically evaluating the treatment effect of skin diseases
CN113096756B (zh) 2021-04-26 2023-12-22 讯飞医疗科技股份有限公司 Disease condition evolution classification method and apparatus, electronic device, and storage medium
CN115376698B (zh) 2022-10-25 2023-04-11 北京鹰瞳科技发展股份有限公司 Apparatus, method and storage medium for predicting the progression of fundus diseases

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105005714A (zh) 2015-06-18 2015-10-28 中国科学院自动化研究所 Non-small cell lung cancer prognosis method based on tumor phenotype features
CN105653858A (zh) 2015-12-31 2016-06-08 中国科学院自动化研究所 Radiomics-based auxiliary prognosis system and method for diseased tissue
CN106339593A (zh) 2016-08-31 2017-01-18 青岛睿帮信息技术有限公司 Kawasaki disease classification and prediction method based on medical data modeling
CN107292103A (zh) 2017-06-19 2017-10-24 京东方科技集团股份有限公司 Predicted image generation method and apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7346203B2 (en) * 2003-11-19 2008-03-18 General Electric Company Methods and apparatus for processing image data to aid in detecting disease
CN106355023A (zh) 2016-08-31 2017-01-25 北京数字精准医疗科技有限公司 Open quantitative analysis method and system based on medical images
CN106599553B (zh) 2016-11-29 2019-08-16 中国科学院深圳先进技术研究院 Disease early-warning apparatus


Also Published As

Publication number Publication date
CN107292103A (zh) 2017-10-24
CN107292103B (zh) 2020-07-31

Similar Documents

Publication Publication Date Title
WO2018233520A1 (zh) 一种生成预测图像的方法及装置
Wang et al. Should health care demand interpretable artificial intelligence or accept “black box” medicine?
Wu et al. Comparison of chest radiograph interpretations by artificial intelligence algorithm vs radiology residents
Yu et al. Assessment of automated identification of phases in videos of cataract surgery using machine learning and deep learning techniques
US9760990B2 (en) Cloud-based infrastructure for feedback-driven training and image recognition
US11521716B2 (en) Computer-implemented detection and statistical analysis of errors by healthcare providers
US20210042916A1 (en) Deep learning-based diagnosis and referral of diseases and disorders
Alkhodari et al. Detection of COVID-19 in smartphone-based breathing recordings: A pre-screening deep learning tool
US20190259473A1 (en) Identification of individuals by trait prediction from the genome
US20200019823A1 (en) Medical image analysis method applying machine learning and system thereof
JP2020518050A (ja) エンティティ間のコンテキスト的類似度の学習及び適用
US20190237200A1 (en) Recording medium recording similar case retrieval program, information processing apparatus, and similar case retrieval method
JP2021518599A (ja) 医用レポート内のテキストデータに基づいて医用画像を生成する方法及びシステム
JP7320280B2 (ja) ラベル収集装置、ラベル収集方法及びラベル収集プログラム
TW202217843A (zh) 基於深度學習的遠端舌診方法及電腦程式產品以及裝置
Khalsa et al. Artificial intelligence and cardiac surgery during COVID‐19 era
McCullough et al. Convolutional neural network models for automatic preoperative severity assessment in unilateral cleft lip
US10936962B1 (en) Methods and systems for confirming an advisory interaction with an artificial intelligence platform
Bacellar et al. Covid-19 chest x-ray image classification using deep learning
WO2020036207A1 (ja) 医療用情報処理システム、医療用情報処理装置、および医療用情報処理方法
US20200286615A1 (en) Method for analysing a medical imaging data set, system for analysing a medical imaging data set, computer program product and a computer-readable medium
JP2022504508A (ja) モデル支援型事象予測のためのシステム及び方法
Attallah MonDiaL-CAD: Monkeypox diagnosis via selected hybrid CNNs unified with feature selection and ensemble learning
US20200253548A1 (en) Classifying a disease or disability of a subject
KR20230030992A (ko) 의료상담 콘텐츠에 기초한 의료상담자 추천 방법 및 시스템

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18821510; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
32PN Ep: Public notification in the EP bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/06/2020))
122 Ep: PCT application non-entry in European phase (Ref document number: 18821510; Country of ref document: EP; Kind code of ref document: A1)