CN117670830A - Index data determining method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN117670830A
Authority
CN
China
Prior art keywords
image
diseased
area
sample
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311666499.5A
Other languages
Chinese (zh)
Inventor
罗立刚
高光明
侯波林
张娟娟
罗祥凤
程学亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Linkdoc Technology Beijing Co ltd
Original Assignee
Linkdoc Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Linkdoc Technology Beijing Co ltd filed Critical Linkdoc Technology Beijing Co ltd
Priority to CN202311666499.5A priority Critical patent/CN117670830A/en
Publication of CN117670830A publication Critical patent/CN117670830A/en
Pending legal-status Critical Current


Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Image Processing (AREA)

Abstract

The application provides a method and apparatus for determining index data, an electronic device, and a storage medium, wherein the method includes: acquiring a current diseased image of a diseased subject; determining a first lesion area in the current diseased image; obtaining, based on the current diseased image and its first lesion area, the corresponding second lesion area in a historical diseased image; acquiring static feature data of the first lesion area and static feature data of the second lesion area; and determining, based on these static feature data, dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image. The dynamic index data is used to determine the lesion stage of the diseased subject at the time of the current diseased image, enabling accurate determination of that stage.

Description

Index data determining method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method and apparatus for determining index data, an electronic device, and a storage medium.
Background
With improvements in computed tomography (CT) and the spread of screening among groups at high risk of cancer, more and more cancerous nodules are being detected. The size, solid-component proportion, and growth of these nodules are important for determining the lesion stage of the cancer and for formulating later treatment regimens. In the related art, only qualitative, static characterization indexes such as size and volume can be obtained, and only through manual measurement; objective dynamic index data describing the growth of cancerous nodules is lacking. How to accurately obtain dynamic index data characterizing nodule growth, and thereby accurately determine the lesion stage of the cancer, is therefore a technical problem to be solved.
Disclosure of Invention
The application provides an index data determining method, an index data determining device, electronic equipment and a storage medium, so as to at least solve the technical problems in the prior art.
According to a first aspect of the present application, there is provided an index data determining method, the method comprising:
acquiring a current diseased image for a diseased subject;
determining a first focus area in the current diseased image;
obtaining a second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image; the current disease image and the historical disease image are disease images of a disease object in different disease periods;
acquiring static feature data of the first lesion area and static feature data of the second lesion area;
determining dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image based on the static feature data of the first focus area and the static feature data of the second focus area; the dynamic index data is used to determine a focal stage at which the diseased subject is located for the current diseased image.
In the above-mentioned scheme, the determining the first lesion area in the current diseased image includes:
inputting the current diseased image into a pre-trained detection model to obtain a region of interest in the current diseased image; the region of interest includes the first lesion area and an area of the current diseased image other than the first lesion area;
inputting the region of interest into a segmentation model to obtain the first lesion area in the current diseased image; the segmentation model is obtained by training a model to be trained on single-channel binarized sample regions of interest.
In the above-mentioned scheme, the obtaining the second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image includes:
performing image registration on the current diseased image and the historical diseased image based on an affine transformation to obtain an image registration result;
determining, based on the image registration result, the dimension correspondence between the current diseased image and the historical diseased image through an intersection-over-union (IoU) function;
and performing region registration on the current diseased image and the historical diseased image based on the dimension correspondence to obtain the second lesion area corresponding to the first lesion area in the historical diseased image.
In the above solution, the determining, based on the static feature data of the first lesion area and the static feature data of the second lesion area, dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image includes:
determining a target static feature data value based on the static feature data of the first lesion area and the static feature data of the second lesion area;
and determining dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image according to the target static characteristic data value, the acquisition time of the current diseased image and the acquisition time of the historical diseased image.
In the above scheme, the method further comprises:
inputting the static characteristic data of the first focus area and the dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image into a stage prediction model to obtain an output result of the stage prediction model, wherein the output result is used for representing the focus stage of the diseased object relative to the current diseased image; the stage prediction model is obtained by training a model to be trained by a current disease sample image with a disease stage label.
In the above scheme, the stage prediction model is obtained by training a model to be trained from a current disease sample image with a disease stage label, and includes:
determining a first lesion sample area in a current diseased sample image;
obtaining a second focus sample area corresponding to the first focus sample area in the historical diseased sample image based on the current diseased sample image and the first focus sample area of the current diseased sample image; the current diseased sample image and the historical diseased sample image are diseased sample images of a diseased subject in different diseased periods;
acquiring static feature sample data of the first lesion sample area and static feature sample data of the second lesion sample area;
determining dynamic index sample data of a first focus sample area in a current disease sample image relative to a second focus sample area in a historical disease sample image based on the static feature sample data of the first focus sample area and the static feature sample data of the second focus sample area;
and inputting the dynamic index sample data, the static characteristic sample data of the first focus sample area and the corresponding focus stage label into a model to be trained, and training the model to be trained to obtain a stage prediction model.
In the above-mentioned aspect, the determining, based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area, dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image includes:
determining a target static feature sample data value based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area;
and determining dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image according to the target static feature sample data value, the acquisition time of the current diseased sample image, and the acquisition time of the historical diseased sample image.
According to a second aspect of the present application, there is provided an index data determination device, the device comprising:
a first acquisition unit configured to acquire a current diseased image for a diseased subject;
a first determining unit configured to determine a first lesion area in a current diseased image;
a second acquisition unit configured to obtain a second lesion area corresponding to the first lesion area in the historical diseased image based on the current diseased image and the first lesion area of the current diseased image; the current diseased image and the historical diseased image being diseased images of the diseased subject in different disease periods;
a third acquisition unit configured to acquire static feature data of the first lesion area and static feature data of the second lesion area;
a second determining unit, configured to determine dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image based on the static feature data of the first lesion area and the static feature data of the second lesion area; the dynamic index data is used to determine a focal stage at which the diseased subject is located for the current diseased image.
According to a third aspect of the present application, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods described herein.
According to a fourth aspect of the present application, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method described herein.
In the application, a first focus area in a current diseased image is determined by acquiring the current diseased image for a diseased subject. Obtaining a second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image; wherein the current disease image and the historical disease image are disease images of the disease subject in different disease periods. Acquiring static characteristic data of a first focus area and static characteristic data of a second focus area, and determining dynamic index data of the first focus area in a current diseased image relative to the second focus area in a historical diseased image based on the static characteristic data of the first focus area and the static characteristic data of the second focus area; the dynamic index data is used to determine the stage of the lesion where the diseased subject is located for the current diseased image. Dynamic index data representing the growth performance of a focus (particularly the actual components of the focus) can be accurately obtained, and then the focus stage of a diseased subject can be accurately determined.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Fig. 1 shows a schematic implementation flow chart of an index data determining method according to an embodiment of the present application;
Fig. 2 shows an application block diagram of the index data determining method according to an embodiment of the present application;
Fig. 3 shows an application schematic diagram of the index data determining method according to an embodiment of the present application;
Fig. 4 shows a schematic diagram of the composition structure of an index data determining apparatus according to an embodiment of the present application;
Fig. 5 shows a schematic diagram of the composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more obvious and understandable, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiment of the application provides an index data determining method, which is used for determining a first focus area in a current diseased image by acquiring the current diseased image of a diseased object. Obtaining a second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image; wherein the current disease image and the historical disease image are disease images of the disease subject in different disease periods. Acquiring static characteristic data of a first focus area and static characteristic data of a second focus area, and determining dynamic index data of the first focus area in a current diseased image relative to the second focus area in a historical diseased image based on the static characteristic data of the first focus area and the static characteristic data of the second focus area; the dynamic index data is used to determine the stage of the lesion where the diseased subject is located for the current diseased image. Dynamic index data representing the growth performance of a focus (particularly the actual components in the focus) can be accurately obtained, and then the focus stage of a diseased subject can be accurately determined.
The method for determining the index data according to the embodiment of the present application is described in detail below.
As shown in fig. 1, the method includes:
s101: a current diseased image is acquired for a diseased subject.
In this step, the current diseased image is a computed tomography (CT) image taken of the diseased subject in the current period. In embodiments of the present application, the diseased subject may be a diseased organ of a cancer patient, such as a lung or a breast. The current diseased image is thus a CT image of the patient's diseased part (organ), and is acquired by taking a CT scan of the diseased subject at the current moment.
S102: a first lesion area in the current lesion image is determined.
In this step, after the current diseased image of the diseased subject is acquired in step S101, the (first) lesion area in the current diseased image may be output through a pre-trained model, to facilitate the subsequent determination of the lesion stage of the diseased subject. For the specific process of determining the first lesion area, refer to the detailed description of the relevant parts below, which is not repeated here.
S103: obtaining a second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image; the current disease image and the historical disease image are disease images of a disease object in different disease periods.
In this step, the historical diseased image and the current diseased image are diseased images taken at different times of the same diseased subject. Illustratively, the current diseased image is a CT image taken now of the lung of a lung cancer patient, and the historical diseased image is a CT image of the same lung taken two months earlier. The first lesion area is a lesion area in the current diseased image; the second lesion area is a lesion area in the historical diseased image. Based on the current diseased image and its first lesion area, the corresponding (second) lesion area in the historical diseased image is obtained, so that dynamic index data characterizing lesion growth can subsequently be derived from the features of the lesion in different periods. For the specific process of obtaining the second lesion area, refer to the detailed description of the relevant parts below, which is not repeated here.
S104: static feature data of the first focal region and static feature data of the second focal region are acquired.
In this step, the static feature data includes feature data such as the size, volume, and solid-component proportion of the lesion. The static feature data of the first lesion area and the static feature data of the second lesion area are extracted, thereby obtaining the static feature data of each.
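For illustration only, static features of the kind named in this step (size, volume, occupancy proportion) can be derived from a 3D binary lesion mask such as the segmentation model's output. The helper below is our assumption, not part of the application; it treats "size" as an equivalent-sphere diameter and "proportion" as the fraction of the lesion's bounding box that the lesion fills:

```python
import numpy as np

def static_features(mask: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> dict:
    """Compute simple static features from a 3D binary lesion mask.

    mask    : 3D array, 1 inside the lesion, 0 elsewhere
    spacing : voxel size (z, y, x) in mm
    """
    voxel_volume = spacing[0] * spacing[1] * spacing[2]   # mm^3 per voxel
    n_voxels = int(mask.sum())
    volume = n_voxels * voxel_volume                       # lesion volume, mm^3
    # Equivalent-sphere diameter as a single "size" number
    diameter = 2.0 * (3.0 * volume / (4.0 * np.pi)) ** (1.0 / 3.0)
    # Fraction of the lesion's bounding box occupied by lesion voxels
    zs, ys, xs = np.nonzero(mask)
    bbox_voxels = ((zs.max() - zs.min() + 1)
                   * (ys.max() - ys.min() + 1)
                   * (xs.max() - xs.min() + 1))
    ratio = n_voxels / bbox_voxels
    return {"volume_mm3": volume, "diameter_mm": diameter, "bbox_ratio": ratio}
```

The same helper would be applied to the first and the second lesion masks to obtain the two sets of static feature data compared in step S105.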
S105: determining dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image based on the static feature data of the first focus area and the static feature data of the second focus area; the dynamic index data is used to determine a focal stage at which the diseased subject is located for the current diseased image.
In this step, dynamic index data of the first lesion area with respect to the second lesion area is determined based on the static feature data of the first lesion area and the static feature data of the second lesion area obtained in step S104. The dynamic index data characterizes the dynamic growth condition of the focus, and can be combined with the static characteristic data of the first focus area to determine the focus stage of the current diseased image and the diseased object.
In practical applications, the focal stage is the infiltration stage of cancer, and can characterize the benign and malignant condition of cancer tumor of a patient. Illustratively, taking cancer as lung adenocarcinoma as an example, focal stages can generally be divided into three stages: carcinoma in situ, micro-invasive and invasive adenocarcinoma. The benign and malignant conditions of the cancer tumor of the patient can be obtained based on the focus stage by determining the focus stage of the diseased subject, and then a doctor is assisted to formulate a corresponding treatment scheme for the patient according to the benign and malignant conditions of the tumor. The specific determination process of the dynamic index data is described in detail below, and is not repeated.
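The stage prediction described here takes the static feature data of the first lesion area together with the dynamic index data as model input. As an illustrative sketch only (the application does not specify the model family, and the nearest-centroid classifier below is a toy stand-in for the trained stage prediction model), the feature assembly and classification might look like:

```python
import numpy as np

STAGES = ["carcinoma_in_situ", "micro_invasive", "invasive_adenocarcinoma"]

def build_feature_vector(static_feats: dict, dynamic_feats: dict) -> np.ndarray:
    """Concatenate static feature data and dynamic index data into one input."""
    return np.array([static_feats[k] for k in sorted(static_feats)]
                    + [dynamic_feats[k] for k in sorted(dynamic_feats)],
                    dtype=float)

class NearestCentroidStager:
    """Toy stand-in for the stage prediction model: nearest class centroid."""

    def fit(self, X, y):
        X = np.asarray(X, float)
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.classes_])
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        # Distance from each sample to each class centroid
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]
```

In practice the model would be trained on labeled sample images as described in the training scheme above; the class indices returned here would map onto stage labels such as those in `STAGES`.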
In the scheme shown in S101 to S105, a first lesion area in a current diseased image is determined by acquiring the current diseased image for a diseased subject. Obtaining a second focus area corresponding to the first focus area in the historical diseased image based on the current diseased image and the first focus area of the current diseased image; wherein the current disease image and the historical disease image are disease images of the disease subject in different disease periods. Acquiring static characteristic data of a first focus area and static characteristic data of a second focus area, and determining dynamic index data of the first focus area in a current diseased image relative to the second focus area in a historical diseased image based on the static characteristic data of the first focus area and the static characteristic data of the second focus area; the dynamic index data is used to determine the stage of the lesion where the diseased subject is located for the current diseased image. Dynamic index data representing the growth of the focus can be accurately obtained, and then the focus stage of a diseased subject can be accurately determined.
In an alternative, the determining the first lesion area in the current diseased image includes:
inputting the current diseased image into a pre-trained detection model to obtain an interested region in the current diseased image; the interested area comprises a first focus area and an area except the first focus area in the current disease image;
Inputting the region of interest into a segmentation model to obtain a first focus region in the current diseased image; the segmentation model is obtained by training a model to be trained through a single-channel binarization interesting sample area.
In this application, a pre-trained detection model is used to determine a region of interest in a diseased image. The region of interest includes the first lesion area and an area other than the first lesion area. With reference to fig. 2, put simply, inputting the current diseased image into the detection model yields a rough region around the first lesion area, i.e., the region of interest. The detection model may be obtained by training, on sample diseased images, a deep-learning 3D-CNN (convolutional neural network) based on the RPN (Region Proposal Network) of Faster R-CNN, a classical algorithm in object detection. The model as a whole has a U-shaped architecture and learns from both shallow local features (such as the size of the lesion area) and deep semantic features (such as the distinctions and connections between the lesion area and surrounding non-lesion features). It can therefore effectively overcome difficulties such as the large size variation and diverse morphology of nodules (lesions), and can accurately and efficiently detect the region of interest in the current diseased image, including its three-dimensional coordinates and diameter information.
After the region of interest is obtained, it is input into the segmentation model, which segments out the first lesion area. The segmentation model is obtained by training a model to be trained on single-channel binarized sample regions of interest. Specifically, a DenseUNet network model may be trained in a supervised manner using deep-learning semantic segmentation techniques. The single-channel binarized sample region of interest is input into the model to be trained, where a pixel value of 0 represents the background region and a pixel value of 1 represents the first lesion area. The model to be trained is learned and optimized to obtain the final segmentation model, which can segment accurately down to the pixel level.
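The single-channel binarization of a sample region of interest, and a loss suitable for supervised training of such a segmentation model, can be sketched as follows. Note the Dice loss is our assumption as a common choice for this kind of binary mask segmentation; the application does not name the loss function:

```python
import numpy as np

def binarize_roi(roi: np.ndarray, threshold: float) -> np.ndarray:
    """Single-channel binarization: 1 = lesion foreground, 0 = background."""
    return (roi >= threshold).astype(np.uint8)

def dice_loss(pred: np.ndarray, target: np.ndarray, eps: float = 1e-6) -> float:
    """Soft Dice loss between predicted probabilities and a binary mask.

    0 means perfect overlap; values near 1 mean no overlap.
    """
    inter = float((pred * target).sum())
    denom = float(pred.sum() + target.sum())
    return 1.0 - (2.0 * inter + eps) / (denom + eps)
```

During training, the network's per-pixel probabilities would be compared against the binarized mask with this loss and the model parameters updated accordingly.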
In an optional solution, the obtaining, based on the current diseased image and the first focal region of the current diseased image, a second focal region corresponding to the first focal region in the historical diseased image includes:
performing image registration on the current diseased image and the historical diseased image based on affine transformation to obtain an image registration result;
based on the image registration result, determining the dimension corresponding relation between the current diseased image and the historical diseased image through an intersection ratio function;
and performing region registration on the current diseased image and the historical diseased image based on the dimension correspondence to obtain the second lesion area corresponding to the first lesion area in the historical diseased image.
In the present application, it is understood that the first lesion area is a lesion area of the diseased part in the current diseased image; there may be one first lesion area, or two or more. Likewise, the second lesion area is a lesion area of the diseased part in the historical diseased image, and there may be one, or two or more. The second lesion area is obtained in the same manner as the first: the historical diseased image is input into the pre-trained detection model to obtain its region of interest, and the region of interest is input into the segmentation model, which segments out the second lesion area. At this point, the second lesion areas obtained are the one or more lesion areas of the diseased part in the historical diseased image. To ensure the accuracy of the dynamic index data characterizing the dynamic growth of the same lesion, the (first) lesion area in the current diseased image must be matched with the (second) lesion area in the historical diseased image to obtain the second lesion area corresponding to the first. Thus, after the second lesion areas in the historical diseased image are obtained, the one corresponding to (matching) the first lesion area may be determined through the scheme of the present application:
Consider that there may be differences in angle and orientation when the current and historical diseased images are taken. Suppose, for example, that the lung of a lung cancer patient is imaged, with the patient lying flat when the current diseased image is taken but lying on their side when the historical diseased image was taken. The angle and orientation of the diseased subject in the current diseased image then differ from those in the historical diseased image. The current and historical diseased images are therefore registered by affine transformation, so that the angle and orientation of the diseased subject are kept consistent between the two images.
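One standard way to realize such an affine registration, sketched here under our own assumptions (the application only states that an affine transformation is used), is to estimate the affine map from matched anatomical landmark points by least squares:

```python
import numpy as np

def estimate_affine(src_pts: np.ndarray, dst_pts: np.ndarray):
    """Least-squares affine (A, t) mapping src points onto dst points.

    src_pts, dst_pts : (N, 3) arrays of corresponding landmark coordinates,
    e.g. from the historical and current diseased images. Solves
    dst ≈ src @ A.T + t in the least-squares sense.
    """
    n = src_pts.shape[0]
    homo = np.hstack([src_pts, np.ones((n, 1))])             # (N, 4)
    params, *_ = np.linalg.lstsq(homo, dst_pts, rcond=None)  # (4, 3)
    A = params[:3].T                                          # linear part (3, 3)
    t = params[3]                                             # translation (3,)
    return A, t
```

The estimated (A, t) would then be used to resample the historical diseased image into the current image's coordinate frame (e.g. with `scipy.ndimage.affine_transform`), giving the image registration result referred to above.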
After image registration of the current and historical diseased images, the dimension correspondence between them is determined through an intersection-over-union (IoU) function. It will be appreciated that the current and historical diseased images can be regarded as sliced CT images of the diseased part: each is obtained by scanning the part from top to bottom and compositing multiple image layers. Because the scan interval set on the CT machine may differ between acquisitions, for diseased images of the same part taken at different times the diseased part may lie in the second layer of the current diseased image but in the fourth layer of the historical one. The correspondence between the dimensions (layer numbers) of the two images is therefore determined through the IoU function; once this correspondence is established, the regions of the corresponding layers in the two images are compared and registered, achieving more accurate region registration. For the specific principles of the IoU function, please refer to the related literature, which is not repeated in the present application.
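The layer matching by intersection-over-union described above can be sketched as follows; `match_slice` is a hypothetical helper name, not from the application:

```python
import numpy as np

def iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two 2D binary masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(inter) / float(union) if union else 0.0

def match_slice(current_slice: np.ndarray, historical_volume: np.ndarray) -> int:
    """Index of the historical layer whose mask best overlaps the current one."""
    scores = [iou(current_slice, historical_volume[k])
              for k in range(historical_volume.shape[0])]
    return int(np.argmax(scores))
```

Applying this to each layer of the current diseased image yields the layer-number correspondence used for the subsequent region registration.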
After the current and historical diseased images have been image-registered and their dimension correspondence determined, iteration proceeds step by step over the distribution characteristics of the contour feature points of the first lesion area (such as their distances to surrounding feature points) to obtain the second lesion area in the historical diseased image with the highest similarity to the first lesion area.
In the application, image registration is performed on the current and historical diseased images through affine transformation to obtain an image registration result. Based on this result, the dimension correspondence between the two images is determined through the intersection-over-union function. Region registration is then performed on the two images according to the dimension correspondence, so that the second lesion area corresponding to the first lesion area in the historical diseased image can be obtained accurately, and the dynamic index data can in turn be obtained accurately.
In an optional solution, the determining, based on the static feature data of the first lesion area and the static feature data of the second lesion area, dynamic index data of the first lesion area in the current disease image relative to the second lesion area in the historical disease image includes:
Determining a target static feature data value based on the static feature data of the first lesion area and the static feature data of the second lesion area;
and determining dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image according to the target static characteristic data value, the acquisition time of the current diseased image and the acquisition time of the historical diseased image.
In the present application, the target static feature data value is the difference between the static feature data of the first lesion area and the static feature data of the second lesion area. The dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image can be calculated according to formula (1):

F_delta = (F_current - F_prior) / (T_current - T_prior)    (1)

where F_delta represents the dynamic index data; F_current represents the static feature data of the first lesion area; F_prior represents the static feature data of the second lesion area; F_current - F_prior is the target static feature data value; T_current represents the acquisition time of the current diseased image (the acquisition time being the shooting time of the CT image); and T_prior represents the acquisition time of the historical diseased image.
In this application, a target static feature data value is determined based on the static feature data of the first lesion area and the static feature data of the second lesion area, and the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image is determined according to the target static feature data value, the acquisition time of the current diseased image, and the acquisition time of the historical diseased image. In this way, dynamic index data characterizing lesion growth (for example, the growth rate) can be obtained accurately, providing a data basis for accurately determining the lesion stage of the diseased subject.
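A minimal sketch of the growth-rate computation of formula (1), assuming the static feature is a scalar (for example, nodule size in millimeters) and that time is measured in 30-day months (the time unit is our assumption; the description does not fix one):

```python
from datetime import date

def dynamic_index(f_current: float, f_prior: float,
                  t_current: date, t_prior: date) -> float:
    """Formula (1): (F_current - F_prior) / (T_current - T_prior),
    expressed here as feature growth per 30-day month."""
    months = (t_current - t_prior).days / 30.0
    return (f_current - f_prior) / months
```

For a nodule measured at 3 mm and, 60 days later, at 4 mm, this yields 0.5 mm per month, matching the growth-rate values used in the examples of the description.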
In an alternative, the method further comprises:
inputting the static characteristic data of the first focus area and the dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image into a stage prediction model to obtain an output result of the stage prediction model, wherein the output result is used for representing the focus stage of the diseased object relative to the current diseased image; the stage prediction model is obtained by training a model to be trained by a current disease sample image with a disease stage label.
In this application, as shown in fig. 2, after the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image is obtained, the dynamic index data and the static feature data of the first lesion area are input into the stage prediction model to obtain the lesion stage of the diseased subject in the current diseased image. Illustratively, taking a lung cancer patient as an example, suppose the static feature data of the first lesion area indicates a current pulmonary nodule size of 4 mm and the dynamic index data characterizes the nodule as growing at 0.5 mm per month relative to the previous scan. Inputting the static feature data and the dynamic index data into the stage prediction model yields carcinoma in situ as the lesion stage of the diseased subject for the current diseased image.
When the static feature data of the first lesion area again indicates a current pulmonary nodule size of 4 mm but the dynamic index data characterizes the nodule as growing at 1 centimeter per month relative to the previous scan, inputting the static feature data and the dynamic index data into the stage prediction model yields invasive adenocarcinoma as the lesion stage of the diseased subject for the current diseased image.
It is understood that carcinoma in situ is the earliest stage of cancer, while invasive adenocarcinoma is a malignant stage. For pulmonary nodules of the same size, the larger the dynamic index data, the faster the nodule grows and the higher its degree of malignancy; the smaller the dynamic index data, the slower the growth and the lower the degree of malignancy. Therefore, for a given nodule size, different dynamic index data generally correspond to different lesion stages of the diseased subject.
The stage prediction model is obtained by training a model to be trained by a current disease sample image with a disease stage label. The specific training process of the stage prediction model is described in detail below, and is not repeated.
The static feature data of the first lesion area and the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image are used together as the input of the stage prediction model, so that two dimensions of the lesion are combined: its latest state (the static feature data of the first lesion area) and its growth trend (the dynamic index data). Predicting the lesion stage of the diseased subject for the current diseased image from both the static and the dynamic aspects makes the prediction result more comprehensive and accurate.
In an alternative, the stage prediction model is obtained by training a model to be trained from a current disease sample image with a disease stage label, and the method comprises the following steps:
determining a first lesion sample area in a current diseased sample image;
obtaining a second focus sample area corresponding to the first focus sample area in the historical diseased sample image based on the current diseased sample image and the first focus sample area of the current diseased sample image; the current diseased sample image and the historical diseased sample image are diseased sample images of a diseased subject in different diseased periods;
Acquiring static characteristic sample data of a first focus sample area and static characteristic sample data of a second focus sample area;
determining dynamic index sample data of a first focus sample area in a current disease sample image relative to a second focus sample area in a historical disease sample image based on the static feature sample data of the first focus sample area and the static feature sample data of the second focus sample area;
and inputting the dynamic index sample data, the static characteristic sample data of the first focus sample area and the corresponding focus stage label into a model to be trained, and training the model to be trained to obtain a stage prediction model.
In this application, the current diseased sample image is input into the trained detection model to obtain a rough region containing the first lesion sample area, namely the sample region of interest. The sample region of interest is then input into the segmentation model, which segments out the first lesion sample area.
Considering that the angle and direction at which the current diseased sample image and the historical diseased sample image were captured may differ, suppose, by way of example, that the lung of a lung cancer patient is imaged: the patient lies flat when the current diseased sample image is taken but lies on one side when the historical diseased sample image is taken. The angle and direction of the diseased object in the current diseased sample image then differ from those in the historical diseased sample image. The two sample images are therefore registered through affine transformation so that the angle and direction of the diseased object are kept consistent between them.
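The affine alignment of orientations can be illustrated on landmark coordinates. This toy 2-D example (the matrix and points are ours, not from the description) maps points through x' = Ax + b, with a 90-degree in-plane rotation plus a shift standing in for the supine-versus-side-lying correction:

```python
import numpy as np

def apply_affine(points: np.ndarray, A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Map landmark coordinates through the affine transform x' = A @ x + b."""
    return points @ A.T + b

# Toy values: a 90-degree rotation and a translation (illustrative only).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
b = np.array([4.0, 0.0])
landmarks = np.array([[1.0, 1.0], [2.0, 3.0]])
aligned = apply_affine(landmarks, A, b)  # landmarks in the registered frame
```

In practice the affine parameters would be estimated from the two images (for example, by optimizing an image-similarity measure) rather than given, and the transform would be applied to the full voxel grid.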
After image registration of the current diseased sample image and the historical diseased sample image, the dimension correspondence between the two sample images is determined through the intersection-over-union function. It will be appreciated that both sample images can be regarded as sliced CT images of the diseased portion: when either image is captured, the diseased portion is scanned from top to bottom, yielding a diseased sample image composed of multiple image layers. Since the scan interval set on the CT machine may differ between shots, for diseased sample images of the same diseased portion captured at different times, the diseased portion may lie on the second layer of the current diseased sample image but on the fourth layer of the historical diseased sample image. The dimension (layer number) correspondence between the two sample images therefore needs to be determined through the intersection-over-union function; once this correspondence is determined, the regions of the corresponding layers are compared and registered, which enables more accurate region registration of the two diseased sample images.
After image registration is performed on the current diseased sample image and the historical diseased sample image and the dimension correspondence between them is determined, iteration proceeds step by step according to the distribution characteristics of the contour feature points of the first lesion sample area (for example, their distances to surrounding feature points) to obtain the second lesion sample area in the historical diseased sample image that has the highest similarity to the first lesion sample area.
The static feature sample data of the first lesion sample area and of the second lesion sample area are extracted, and the dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image is determined from them. The dynamic index sample data, the static feature sample data of the first lesion sample area, and the corresponding lesion stage label are then input into the model to be trained. Taking a lung cancer patient as an example, suppose that when the lesion stage label is carcinoma in situ, the static feature sample data of the corresponding first lesion sample area indicates a current pulmonary nodule size of 4 mm and the dynamic index sample data characterizes the nodule as growing at 0.5 mm per month relative to the previous scan; this static feature sample data, the dynamic index sample data, and the carcinoma-in-situ label are input into the model to be trained. When the lesion stage label is invasive adenocarcinoma, the static feature sample data of the corresponding first lesion sample area again indicates a current pulmonary nodule size of 4 mm while the dynamic index sample data characterizes the nodule as growing at 1 centimeter per month relative to the previous scan; this static feature sample data, the dynamic index sample data, and the invasive-adenocarcinoma label are input into the model to be trained.
It is understood that carcinoma in situ is the earliest stage of cancer, while invasive adenocarcinoma is a malignant stage. For pulmonary nodules of the same size, the larger the dynamic index sample data, the faster the nodule grows and the higher its degree of malignancy; the smaller the dynamic index sample data, the slower the growth and the lower the degree of malignancy. Therefore, for a given nodule size, different dynamic index sample data generally correspond to different lesion stage labels.
The model to be trained learns the dynamic index sample data corresponding to different lesion stages together with the static feature sample data of the first lesion sample area, yielding the trained stage prediction model. The stage prediction model is used to predict the lesion stage of the diseased subject for the current diseased image.
The static feature sample data of the first lesion sample area and the dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image are used together as the input of the model to be trained, so that two dimensions of the lesion are combined: its latest state (the static feature sample data of the first lesion sample area) and its growth trend (the dynamic index sample data). Training the model from both the static and the dynamic aspects makes the training result more comprehensive and accurate.
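The training-and-prediction flow above can be sketched with a toy stand-in for the model to be trained. The description does not name a model architecture; a 1-nearest-neighbour rule over (static feature, dynamic index) pairs, seeded with the example values from the description (0.5 mm/month for carcinoma in situ; 1 cm/month, i.e. 10 mm/month, for invasive adenocarcinoma), is used purely for illustration:

```python
import math

class StagePredictor:
    """Toy 1-nearest-neighbour stand-in for the stage prediction model.
    Each training sample pairs static feature sample data (nodule size, mm)
    with dynamic index sample data (growth rate, mm/month) and a stage label."""

    def fit(self, samples):
        # samples: list of (static_mm, dynamic_mm_per_month, label) tuples
        self.samples = samples
        return self

    def predict(self, static_mm: float, dynamic_mm_per_month: float) -> str:
        def distance(sample):
            return math.hypot(sample[0] - static_mm,
                              sample[1] - dynamic_mm_per_month)
        return min(self.samples, key=distance)[2]

# Training pairs taken from the description's examples.
model = StagePredictor().fit([
    (4.0, 0.5, "carcinoma in situ"),
    (4.0, 10.0, "invasive adenocarcinoma"),
])
```

A real stage prediction model would be a trained classifier over many labelled samples; this sketch only shows how the static and dynamic dimensions jointly drive the predicted stage.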
In an optional aspect, the determining, based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area, dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image includes:
determining a target static feature sample data value based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area;
and determining dynamic index sample data of a first focus sample area in the current diseased sample image relative to a second focus sample area in the historical diseased sample image according to the target static characteristic sample data value, the acquisition time of the current diseased sample image and the acquisition time of the historical diseased sample image.
In the present application, the target static feature sample data value is the difference between the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area. The dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image can be calculated according to formula (1), which is not described again here.
In the present application, a target static feature sample data value is determined based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area, and the dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image is determined according to the target static feature sample data value, the acquisition time of the current diseased sample image, and the acquisition time of the historical diseased sample image. In this way, dynamic index sample data characterizing lesion growth (for example, the growth rate) can be obtained accurately, providing a data basis for accurate training of the stage prediction model.
In one embodiment, the method for determining index data of the present application will be described by taking the lung of a patient suffering from lung cancer as an example.
As shown in fig. 3, a current diseased image of the lung of a lung cancer patient is acquired and input into the pre-trained detection model to obtain the region of interest in the current diseased image. The region of interest is input into the segmentation model to obtain the first lesion area in the current diseased image. Based on the current diseased image and its first lesion area, the second lesion area corresponding to the first lesion area in the historical diseased image is obtained. The static feature data of the first lesion area and of the second lesion area are acquired, and the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image is determined from them. The static feature data of the first lesion area and the dynamic index data are input into the stage prediction model to obtain the lesion stage of the diseased subject for the current diseased image, such as one of the carcinoma-in-situ, minimally invasive adenocarcinoma, and invasive adenocarcinoma stages. For the specific process of the operations shown in fig. 3, refer to the detailed description of the related parts, which is not repeated.
In this application, a current diseased image of a diseased subject is acquired and the first lesion area in it is determined. Based on the current diseased image and its first lesion area, the second lesion area corresponding to the first lesion area in the historical diseased image is obtained, where the current diseased image and the historical diseased image are diseased images of the diseased subject in different disease periods. The static feature data of the first lesion area and of the second lesion area are acquired, and the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image is determined from them; the dynamic index data is used to determine the lesion stage of the diseased subject for the current diseased image. Dynamic index data characterizing the growth behavior of the lesion (in particular of the actual components within the lesion) can thus be obtained accurately, and in turn the lesion stage of the diseased subject can be determined accurately. Furthermore, doctors can efficiently and accurately formulate the treatment scheme best suited to the patient according to the obtained lesion stage.
An embodiment of the present application provides an index data determining apparatus, as shown in fig. 4, including:
a first acquisition unit 401 for acquiring a current diseased image for a diseased subject;
a first determining unit 402 for determining a first lesion area in the current diseased image;
a second obtaining unit 403, configured to obtain, based on the current diseased image and the first focal region of the current diseased image, a second focal region corresponding to the first focal region in the historical diseased image; the current disease image and the historical disease image are disease images of a disease object in different disease periods;
a third obtaining unit 404, configured to obtain static feature data of the first focal region and static feature data of the second focal region;
a second determining unit 405, configured to determine dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image based on the static feature data of the first lesion area and the static feature data of the second lesion area; the dynamic index data is used to determine a focal stage at which the diseased subject is located for the current diseased image.
In an alternative solution, the first determining unit 402 is configured to input the current diseased image into a pre-trained detection model to obtain a region of interest in the current diseased image, where the region of interest comprises the first lesion area and an area of the current diseased image beyond the first lesion area; and to input the region of interest into a segmentation model to obtain the first lesion area in the current diseased image, the segmentation model being obtained by training a model to be trained with single-channel binarized sample regions of interest.
In an optional solution, the second obtaining unit 403 is configured to perform image registration on the current diseased image and the historical diseased image based on affine transformation, so as to obtain an image registration result; based on the image registration result, determining the dimension corresponding relation between the current diseased image and the historical diseased image through an intersection ratio function; and carrying out region registration on the current diseased image and the historical diseased image based on the dimension corresponding relation to obtain a second focus region corresponding to the first focus region in the historical diseased image.
In an alternative solution, the second determining unit 405 is configured to determine a target static feature data value based on the static feature data of the first lesion area and the static feature data of the second lesion area; and determining dynamic index data of the first focus area in the current diseased image relative to the second focus area in the historical diseased image according to the target static characteristic data value, the acquisition time of the current diseased image and the acquisition time of the historical diseased image.
In an alternative, the apparatus further comprises:
a fourth obtaining unit, configured to input static feature data of the first focal region and dynamic index data of the first focal region in the current diseased image relative to the second focal region in the historical diseased image to a stage prediction model, so as to obtain an output result of the stage prediction model, where the output result is used to characterize a focal stage where the diseased object is located for the current diseased image; the stage prediction model is obtained by training a model to be trained by a current disease sample image with a disease stage label.
In an optional aspect, the fourth obtaining unit is configured to determine a first focal sample area in the current diseased sample image; obtaining a second focus sample area corresponding to the first focus sample area in the historical diseased sample image based on the current diseased sample image and the first focus sample area of the current diseased sample image; the current diseased sample image and the historical diseased sample image are diseased sample images of a diseased subject in different diseased periods; acquiring static characteristic sample data of a first focus sample area and static characteristic sample data of a second focus sample area; determining dynamic index sample data of a first focus sample area in a current disease sample image relative to a second focus sample area in a historical disease sample image based on the static feature sample data of the first focus sample area and the static feature sample data of the second focus sample area; and inputting the dynamic index sample data, the static characteristic sample data of the first focus sample area and the corresponding focus stage label into a model to be trained, and training the model to be trained to obtain a stage prediction model.
In an optional solution, the fourth obtaining unit is configured to determine a target static feature sample data value based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area; and determining dynamic index sample data of a first focus sample area in the current diseased sample image relative to a second focus sample area in the historical diseased sample image according to the target static characteristic sample data value, the acquisition time of the current diseased sample image and the acquisition time of the historical diseased sample image.
It should be noted that, since the index data determining apparatus of the embodiments of the present application solves problems on a principle similar to that of the foregoing index data determining method, for the implementation process, implementation principle, and beneficial effects of the apparatus, refer to the corresponding descriptions of the method; repetition is omitted.
According to embodiments of the present application, an electronic device and a readable storage medium are also provided.
Fig. 5 shows a schematic block diagram of an example electronic device 500 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device 500 includes a computing unit 501 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 502 or a computer program loaded from a storage unit 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device 500 may also be stored. The computing unit 501, ROM 502, and RAM 503 are connected to each other by a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
A number of components in electronic device 500 are connected to I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, etc.; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508 such as a magnetic disk, an optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the electronic device 500 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 501 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 501 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The calculation unit 501 performs the respective methods and processes described above, for example, the index data determination method. For example, in some embodiments, the metric data determination method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the computing unit 501, one or more steps of the index data determination method described above may be performed. Alternatively, in other embodiments, the computing unit 501 may be configured to perform the index data determination method by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor and may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present application may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that the flows shown above may be used in various forms, with steps reordered, added, or deleted. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The foregoing describes merely specific embodiments of the present application, but the scope of protection of the present application is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed herein are intended to be covered by the scope of the present application. Therefore, the scope of protection of the present application shall be determined by the claims.

Claims (10)

1. A method of determining index data, the method comprising:
acquiring a current diseased image of a diseased subject;
determining a first lesion area in the current diseased image;
obtaining a second lesion area, corresponding to the first lesion area, in a historical diseased image based on the current diseased image and the first lesion area of the current diseased image, wherein the current diseased image and the historical diseased image are diseased images of the diseased subject in different disease periods;
acquiring static feature data of the first lesion area and static feature data of the second lesion area; and
determining dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image based on the static feature data of the first lesion area and the static feature data of the second lesion area, wherein the dynamic index data is used to determine the lesion stage of the diseased subject with respect to the current diseased image.
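The final step of claim 1 can be illustrated with a minimal Python sketch. The specification does not fix a particular static feature or dynamic index, so the choices below (lesion volume as the static feature, per-day volume growth rate as the dynamic index) are hypothetical assumptions for illustration only:

```python
from datetime import date

def dynamic_index(volume_current: float, volume_historical: float,
                  t_current: date, t_historical: date) -> float:
    """Hypothetical dynamic index: lesion volume growth rate in mm^3/day.

    volume_current / volume_historical stand in for the static feature
    data of the first and second lesion areas; the acquisition dates of
    the two diseased images supply the time base.
    """
    delta_days = (t_current - t_historical).days
    if delta_days <= 0:
        raise ValueError("current image must be acquired after the historical one")
    return (volume_current - volume_historical) / delta_days

# Example: a lesion grew from 400 to 460 mm^3 over 30 days -> 2.0 mm^3/day
rate = dynamic_index(460.0, 400.0, date(2023, 12, 6), date(2023, 11, 6))
```

A downstream stage predictor (claim 5) would consume such a rate together with the current static feature data.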
2. The method of claim 1, wherein the determining the first lesion area in the current diseased image comprises:
inputting the current diseased image into a pre-trained detection model to obtain a region of interest in the current diseased image, wherein the region of interest comprises the first lesion area and an area of the current diseased image other than the first lesion area; and
inputting the region of interest into a segmentation model to obtain the first lesion area in the current diseased image, wherein the segmentation model is obtained by training a model to be trained on single-channel binarized sample regions of interest.
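The single-channel binarized regions of interest used to train the segmentation model in claim 2 could be produced along the lines of the following sketch. The channel-averaging step and the 0.5 threshold are illustrative assumptions; the claim specifies only that the sample regions are single-channel and binarized:

```python
import numpy as np

def binarize_roi(roi: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Collapse a region of interest to a single-channel binary mask.

    roi: H x W (grayscale) or H x W x C array with intensities in [0, 1].
    Multi-channel input is averaged down to one channel; pixels at or
    above `threshold` become 1, all others 0.
    """
    if roi.ndim == 3:                 # average channels -> single channel
        roi = roi.mean(axis=-1)
    return (roi >= threshold).astype(np.uint8)

mask = binarize_roi(np.array([[0.2, 0.8], [0.6, 0.4]]))
```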
3. The method according to claim 1, wherein the obtaining the second lesion area, corresponding to the first lesion area, in the historical diseased image based on the current diseased image and the first lesion area of the current diseased image comprises:
performing image registration on the current diseased image and the historical diseased image based on an affine transformation to obtain an image registration result;
determining, based on the image registration result, a dimensional correspondence between the current diseased image and the historical diseased image through an intersection-over-union (IoU) function; and
performing region registration on the current diseased image and the historical diseased image based on the dimensional correspondence to obtain the second lesion area corresponding to the first lesion area in the historical diseased image.
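The intersection-over-union function invoked in claim 3 is a standard overlap measure; a minimal sketch follows. The axis-aligned (x1, y1, x2, y2) box parameterization is an assumption, since the claim does not specify how candidate regions are represented after the affine registration:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Clamp the overlap to zero when the boxes are disjoint
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0
```

After registration, a lesion box in the current image would be matched to the historical-image candidate that maximizes this score.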
4. The method according to any one of claims 1 to 3, wherein the determining the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image based on the static feature data of the first lesion area and the static feature data of the second lesion area comprises:
determining a target static feature data value based on the static feature data of the first lesion area and the static feature data of the second lesion area; and
determining the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image according to the target static feature data value, the acquisition time of the current diseased image, and the acquisition time of the historical diseased image.
5. The method according to claim 1, further comprising:
inputting the static feature data of the first lesion area and the dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image into a stage prediction model to obtain an output result of the stage prediction model, the output result representing the lesion stage of the diseased subject with respect to the current diseased image, wherein the stage prediction model is obtained by training a model to be trained on current diseased sample images with lesion stage labels.
6. The method of claim 5, wherein the stage prediction model being obtained by training a model to be trained on current diseased sample images with lesion stage labels comprises:
determining a first lesion sample area in a current diseased sample image;
obtaining a second lesion sample area, corresponding to the first lesion sample area, in a historical diseased sample image based on the current diseased sample image and the first lesion sample area of the current diseased sample image, wherein the current diseased sample image and the historical diseased sample image are diseased sample images of a diseased subject in different disease periods;
acquiring static feature sample data of the first lesion sample area and static feature sample data of the second lesion sample area;
determining dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area; and
inputting the dynamic index sample data, the static feature sample data of the first lesion sample area, and the corresponding lesion stage labels into the model to be trained, and training the model to be trained to obtain the stage prediction model.
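The training loop of claim 6 can be sketched with a deliberately simple stand-in model. The claims do not fix an architecture, so the nearest-centroid classifier, the two-element feature vector [static feature, dynamic index], and the stage labels below are all illustrative assumptions:

```python
# Hypothetical "stage prediction model": one centroid per lesion stage,
# learned from labeled [static_feature, dynamic_index] sample vectors.

def train_stage_model(samples, labels):
    """samples: list of feature vectors; labels: lesion stage per sample.
    Returns a {stage: centroid} mapping (the trained model)."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict_stage(model, x):
    """Assign the stage whose centroid is nearest in squared distance."""
    return min(model, key=lambda y: sum((a - b) ** 2 for a, b in zip(model[y], x)))

model = train_stage_model(
    [[400.0, 0.1], [420.0, 0.2], [900.0, 3.0], [950.0, 2.5]],
    ["early", "early", "advanced", "advanced"],
)
stage = predict_stage(model, [430.0, 0.3])
```

In the claimed method this role would be filled by a model trained on real diseased sample images, not by a toy classifier.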
7. The method of claim 6, wherein the determining the dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area comprises:
determining a target static feature sample data value based on the static feature sample data of the first lesion sample area and the static feature sample data of the second lesion sample area; and
determining the dynamic index sample data of the first lesion sample area in the current diseased sample image relative to the second lesion sample area in the historical diseased sample image according to the target static feature sample data value, the acquisition time of the current diseased sample image, and the acquisition time of the historical diseased sample image.
8. An index data determining apparatus, characterized in that the apparatus comprises:
a first acquisition unit configured to acquire a current diseased image of a diseased subject;
a first determining unit configured to determine a first lesion area in the current diseased image;
a second acquisition unit configured to obtain a second lesion area, corresponding to the first lesion area, in a historical diseased image based on the current diseased image and the first lesion area of the current diseased image, wherein the current diseased image and the historical diseased image are diseased images of the diseased subject in different disease periods;
a third acquisition unit configured to acquire static feature data of the first lesion area and static feature data of the second lesion area; and
a second determining unit configured to determine dynamic index data of the first lesion area in the current diseased image relative to the second lesion area in the historical diseased image based on the static feature data of the first lesion area and the static feature data of the second lesion area, wherein the dynamic index data is used to determine the lesion stage of the diseased subject with respect to the current diseased image.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-7.
CN202311666499.5A 2023-12-06 2023-12-06 Index data determining method and device, electronic equipment and storage medium Pending CN117670830A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311666499.5A CN117670830A (en) 2023-12-06 2023-12-06 Index data determining method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117670830A true CN117670830A (en) 2024-03-08

Family

ID=90074713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311666499.5A Pending CN117670830A (en) 2023-12-06 2023-12-06 Index data determining method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117670830A (en)

Similar Documents

Publication Publication Date Title
CN107665736B (en) Method and apparatus for generating information
CN114565763B (en) Image segmentation method, device, apparatus, medium and program product
CN111932552B (en) Aorta modeling method and device
US20210327583A1 (en) Determination of a growth rate of an object in 3d data sets using deep learning
CN113240661A (en) Deep learning-based lumbar vertebra analysis method, device, equipment and storage medium
CN117373070B (en) Method and device for labeling blood vessel segments, electronic equipment and storage medium
US20150278976A1 (en) Systems and methods for using geometry sensitivity information for guiding workflow
CN116245832B (en) Image processing method, device, equipment and storage medium
CN115409856B (en) Lung medical image processing method, device, equipment and storage medium
CN114972220B (en) Image processing method and device, electronic equipment and readable storage medium
CN115482261A (en) Blood vessel registration method, device, electronic equipment and storage medium
CN115049590B (en) Image processing method and device, electronic equipment and storage medium
CN117670830A (en) Index data determining method and device, electronic equipment and storage medium
CN115631370A (en) Identification method and device of MRI (magnetic resonance imaging) sequence category based on convolutional neural network
CN115861189A (en) Image registration method and device, electronic equipment and storage medium
CN115578564B (en) Training method and device for instance segmentation model, electronic equipment and storage medium
CN115690143B (en) Image segmentation method, device, electronic equipment and storage medium
CN115482358B (en) Triangular mesh curved surface generation method, device, equipment and storage medium
CN114972242B (en) Training method and device for myocardial bridge detection model and electronic equipment
CN115187582B (en) Lymph node segmentation method and device, electronic equipment and readable storage medium
CN116128863B (en) Medical image processing method, device and equipment
CN115358976B (en) Image identification method, device, equipment and storage medium
CN117635578A (en) Image processing method, device, electronic equipment and storage medium
CN117253601A (en) Medical treatment effect prediction method and device, electronic equipment and storage medium
CN117115134A (en) Method and device for determining proportion of solid components in lung nodule

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination