CN114708974A - Method for predicting hospitalization duration of new coronary pneumonia patient and related product

Method for predicting hospitalization duration of new coronary pneumonia patient and related product

Info

Publication number
CN114708974A
Authority
CN
China
Prior art keywords
image
feature
chest
fusion
new coronary
Prior art date
Legal status
Pending
Application number
CN202210631043.4A
Other languages
Chinese (zh)
Inventor
王振常
雷娜
魏璇
吕晗
金连宝
任玉雪
陈伟
李维
吴伯阳
Current Assignee
Beijing Zhituo Vision Technology Co ltd
Dalian University of Technology
Capital Normal University
Beijing Friendship Hospital
Original Assignee
Beijing Zhituo Vision Technology Co ltd
Dalian University of Technology
Capital Normal University
Beijing Friendship Hospital
Priority date
Filing date
Publication date
Application filed by Beijing Zhituo Vision Technology Co ltd, Dalian University of Technology, Capital Normal University, Beijing Friendship Hospital
Priority to CN202210631043.4A
Publication of CN114708974A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G06F18/253 - Fusion techniques of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Artificial Intelligence (AREA)
  • Economics (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • Development Economics (AREA)
  • Pathology (AREA)
  • Game Theory and Decision Science (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present disclosure relates to a method and related product for predicting the length of hospital stay of a patient with new coronary pneumonia. The method comprises the following steps: acquiring a chest CT image of a new coronary pneumonia patient; obtaining feature information and image information related to the chest CT image based on the chest CT image; and inputting the chest CT image, the feature information and the image information into a pre-trained prediction model for prediction, so as to obtain a prediction result of the hospitalization duration of the new coronary pneumonia patient. With the scheme of the present disclosure, an accurate prediction result of the length of stay of a new coronary pneumonia patient can be obtained, providing a reference for the allocation and use of medical resources.

Description

Method for predicting hospitalization duration of new coronary pneumonia patient and related product
Technical Field
The present disclosure relates generally to the technical field of length of stay prediction. More particularly, the present disclosure relates to a method, apparatus, and computer-readable storage medium for predicting length of stay of a new coronary pneumonia patient.
Background
Chest CT examination is one of the important means of fighting new coronary pneumonia. It is widely used in the examination of febrile patients and the screening of hospital admissions, and it continues to play a significant role in epidemic prevention. The chest CT image acquired in such an examination contains a large amount of information indicating the progression and outcome of new coronary pneumonia. However, most studies investigate only the lesions; to a certain extent they neglect the overall characteristics of the patient as a whole person, which are reflected both in the intrapulmonary lesions and in the visible extrapulmonary image features beyond the lesions, and thus fail to comprehensively depict the overall health status of new coronary pneumonia patients.
In addition, after a new coronary pneumonia patient is diagnosed, the patient's length of hospital stay significantly affects the number of beds and the allocation and use of inpatient supplies, and excessive allocation causes unnecessary waste of resources. Therefore, under the current situation in which outbreaks recur and must be fought at any moment, fully understanding and predicting the patient's length of stay has important guiding significance for the reasonable allocation and use of anti-epidemic medical resources. How to accurately predict the length of hospital stay of new coronary pneumonia patients has therefore become a technical problem in urgent need of a solution.
Disclosure of Invention
To at least partially address the technical problems noted in the background, aspects of the present disclosure provide a solution for predicting the length of stay in hospital of a new coronary pneumonia patient. By using the scheme of the present disclosure, the prediction result of the hospitalization duration of a new coronary pneumonia patient can be obtained efficiently and accurately, thereby providing a reference basis for the allocation and use of medical resources. To this end, the present disclosure provides solutions in a number of aspects as follows.
In a first aspect, the present disclosure provides a method for predicting length of stay in a new coronary pneumonia patient, comprising: acquiring a chest CT image of a patient with new coronary pneumonia; obtaining feature information and image information related to the chest CT image based on the chest CT image; and inputting the chest CT image, the characteristic information and the image information into a pre-trained prediction model for prediction so as to obtain a prediction result of the hospitalization duration of the new coronary pneumonia patient.
In one embodiment, the feature information relates to characteristics of the lung region and includes at least Gaussian curvature, mean curvature and/or Ricci curvature, and the image information includes at least image information relating to muscle and abdominal fat, image information relating to vertebral bone, and image information relating to the liver, cardiovascular system and thyroid in the chest CT image.
In another embodiment, the prediction model comprises a feature extraction module, a fusion module and a classifier connected in sequence, and inputting the chest CT image, the feature information and the image information into a pre-trained prediction model for prediction to obtain the prediction result of the length of stay in hospital of the new coronary pneumonia patient comprises: inputting the chest CT image and the feature information into the feature extraction module to respectively perform feature extraction operations so as to obtain intermediate features corresponding to the chest CT image and the feature information; performing a feature fusion operation on the corresponding intermediate features by using the fusion module to obtain initial features of the chest CT image; splicing the initial features and the image information to obtain final features of the chest CT image; and classifying based on the final features by the classifier to obtain a prediction result of the length of stay of the new coronary pneumonia patient.
In yet another embodiment, the inputting the CT image of the chest and the feature information into the feature extraction module to respectively perform feature extraction operations to obtain intermediate features corresponding to the two comprises: inputting the chest CT image into the feature extraction module to perform a plurality of feature extraction operations to obtain a plurality of first intermediate features; and inputting the feature information to the feature extraction module to perform a plurality of feature extraction operations to obtain a plurality of second intermediate features.
In yet another embodiment, the fusion module comprises a plurality of fusion modules, and performing the feature fusion operation on the corresponding intermediate features with the fusion modules to obtain initial features of the chest CT image comprises: selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features, respectively; sequentially performing a feature fusion operation on the corresponding first target intermediate features and the corresponding second target intermediate features by using the plurality of fusion modules to obtain corresponding fusion results, and inputting the corresponding fusion results to the next fusion module; performing, by the next fusion module, the next feature fusion operation on the fusion result of the previous fusion module, the corresponding first target intermediate feature and the corresponding second target intermediate feature, until the fusion result of the last fusion module is obtained; and taking the fusion result of the last fusion module as the initial feature of the chest CT image.
In yet another embodiment, the method further comprises: adjusting the decision boundary of the classifier based on an optimal transport map to obtain a final prediction result of the length of stay of the new coronary pneumonia patient.
In yet another embodiment, adjusting the decision boundary of the classifier based on the optimal transport map comprises: acquiring the weights of the classifier and the hospitalization durations of the new coronary pneumonia patients; and calculating an optimal transport map in the feature space of the final features, based on the hospitalization durations of the new coronary pneumonia patients and the weights of the classifier, to adjust the decision boundary of the classifier.
In yet another embodiment, calculating an optimal transport map in the feature space of the final features based on the hospitalization durations of the new coronary pneumonia patients and the weights of the classifier to adjust the decision boundary of the classifier comprises: setting the final features as points within a source domain and the weights of the classifier as target points; counting the frequency of the hospitalization durations of the new coronary pneumonia patients; adjusting the measure of the target points according to the frequency to obtain the adjusted measure of the target points; and calculating an optimal transport map based on the measures of the points within the source domain, the target points and the adjusted measure of the target points, to adjust the decision boundary of the classifier.
In a second aspect, the present disclosure also provides a device for predicting the length of stay of a new coronary pneumonia patient, comprising: a processor; and a memory storing program instructions for predicting the length of stay of a new coronary pneumonia patient, which, when executed by the processor, cause the device to perform the method according to any of the foregoing embodiments.
In a third aspect, the present disclosure also provides a computer-readable storage medium having stored thereon computer-readable instructions for predicting the length of stay of a new coronary pneumonia patient, which, when executed by one or more processors, implement the method according to any of the foregoing embodiments.
According to the scheme described above, the feature information and the image information are obtained from the chest CT image, and the obtained feature information and image information, together with the chest CT image, are input into a pre-trained prediction model for prediction, so as to obtain a prediction result of the hospitalization duration of the new coronary pneumonia patient. On this basis, information describing the overall health state of the new coronary pneumonia patient can be obtained more comprehensively, and the network model then makes a prediction based on this comprehensive information, so that an accurate prediction result regarding the hospitalization duration of the new coronary pneumonia patient can be obtained efficiently. Based on the prediction result, a reference basis is conveniently provided for the allocation and use of medical resources, thereby avoiding waste of resources. Furthermore, the decision boundary of the classifier is adjusted through the optimal transport map, which effectively solves the problem of the unbalanced distribution of hospital stays and improves the accuracy of the prediction result.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. In the drawings, several embodiments of the disclosure are illustrated by way of example and not by way of limitation, and like or corresponding reference numerals indicate like or corresponding parts and in which:
fig. 1 is an exemplary flow diagram illustrating a method for predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure;
fig. 2 is an exemplary block diagram illustrating a prediction model predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure;
FIG. 3 is an exemplary block diagram illustrating the obtaining of initial features of a chest CT image in accordance with an embodiment of the present disclosure;
FIG. 4 is an exemplary diagram illustrating a classifier's decision boundary before and after adjustment according to an embodiment of the present disclosure;
fig. 5 is an exemplary block diagram illustrating an overview for predicting length of stay for a new coronary pneumonia patient, according to an embodiment of the present disclosure; and
fig. 6 is a block diagram illustrating an apparatus for predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings. It should be understood that the embodiments described in this specification are only some, and not all, of the embodiments of the present disclosure, and are provided to facilitate a clear understanding of the solutions and to comply with legal requirements. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed in this specification without making any creative effort, shall fall within the protection scope of the present disclosure.
Fig. 1 is an exemplary flow diagram illustrating a method 100 for predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure. As shown in fig. 1, at step S102, a chest CT image of a new coronary pneumonia patient is acquired. In one embodiment, the aforementioned chest CT image may be acquired by, for example, a Computed Tomography ("CT") technique or device. At step S104, feature information and image information related to the chest CT image are obtained based on the acquired chest CT image. The aforementioned feature information relates to characteristics of the lung region and may include, but is not limited to, Gaussian curvature, mean curvature, and/or Ricci curvature. The image information includes, but is not limited to, image information related to muscle and abdominal fat (e.g., fat area) in the chest CT image, image information related to vertebral bone (e.g., bone density), and image information related to the liver, cardiovascular system and thyroid (e.g., liver CT value, thyroid density, etc.).
In one implementation scenario, the above-mentioned Gaussian curvature, mean curvature and/or Ricci curvature may be determined by generating, based on the lung lesion region in the chest CT image, a two-dimensional mesh or a tetrahedral mesh connected by a plurality of vertices. Specifically, for the Gaussian curvature, at a vertex of the original uncut closed mesh it is equal to 2π minus the sum of the angles corresponding to that vertex in its adjacent mesh faces. Denoting the Gaussian curvature by k, this can be written as

k = 2π − (θ_1 + θ_2 + … + θ_n),

where θ_i denotes the angle at the mesh vertex in its i-th adjacent mesh face and n denotes the number of mesh faces adjacent to that vertex. For the mean curvature, the lung lesion region is assumed to be described by a function f, and at a vertex v the normal vector of the isosurface is n = ∇f/|∇f|. The mean curvature K at the vertex v can then be defined as half the divergence of this unit normal field, i.e.

K = (1/2) ∇·(∇f/|∇f|).

For the Ricci curvature, the weights of the edges adjacent to the vertices of the tetrahedral mesh may be defined first. Denoting by w_ij the weight of the edge e_ij connecting the vertices v_i and v_j, formula (1) expresses w_ij in terms of the vertex weights w_i and w_j, the set of all edges adjacent to v_i other than e_ij, and the set of all edges adjacent to v_j other than e_ij. Further, the weight of their common edges may be defined by formula (2). Combining formula (1) and formula (2) yields the weights of the edges adjacent to the vertices of the tetrahedral mesh. Based on the foregoing weights, the Ricci curvature Ric at each vertex can be obtained according to formula (3), in which the summation runs over the edges adjacent to the vertex and the normalization uses the number of edges adjacent to that vertex. Further, the weights along three mutually orthogonal axes at the vertex (i.e., the x-axis, the y-axis and the z-axis) may be calculated based on formula (1) and formula (2), respectively, and the weights of the three axes are taken as the Ricci curvature values. The aforementioned weights along the three axes may be regarded as the components of a three-dimensional tensor; thus, the Ricci curvature values may be represented as a three-dimensional tensor.
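By way of illustration only, the angle-deficit form of the per-vertex Gaussian curvature described above can be computed on a closed triangle mesh roughly as in the following sketch; the array layouts and the function name are assumptions, not part of the disclosure.

```python
# Minimal sketch: vertex Gaussian curvature as the angle deficit 2*pi - sum of
# incident face angles, assuming `vertices` is (N, 3) float and `faces` (M, 3) int.
import numpy as np

def gaussian_curvature(vertices: np.ndarray, faces: np.ndarray) -> np.ndarray:
    angle_sum = np.zeros(len(vertices))
    for tri in faces:
        for i in range(3):
            v = tri[i]
            a, b = tri[(i + 1) % 3], tri[(i + 2) % 3]
            e1 = vertices[a] - vertices[v]
            e2 = vertices[b] - vertices[v]
            cos_t = np.dot(e1, e2) / (np.linalg.norm(e1) * np.linalg.norm(e2) + 1e-12)
            angle_sum[v] += np.arccos(np.clip(cos_t, -1.0, 1.0))
    return 2.0 * np.pi - angle_sum   # k[v] = 2*pi - sum of face angles at vertex v
```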
In another implementation scenario, the various image information may be obtained by, for example, Quantitative Computed Tomography ("QCT"). For example, DICOM data of the chest CT image are imported into the QCT software and the region of interest (e.g., taken at the lumbar 1-2 disc level) is measured. By applying the QCT body composition measurement function, subcutaneous fat and intra-abdominal fat are distinguished by the outer edge of the abdominal wall muscles, and abdominal organs and blood vessels are avoided when identifying the intra-abdominal fat. The muscle edge of the abdominal wall is then manually delineated to determine the abdominal wall muscle region of interest, and the area data of the abdominal wall muscle, the subcutaneous fat and the intra-abdominal fat are quantitatively recorded. Further, taking the average of the absolute values of cancellous bone density at the first lumbar vertebra (L1) and the second lumbar vertebra (L2) as the standard for evaluating osteoporosis, the absolute bone density values are quantitatively recorded as continuous variables, and the corresponding qualitative categories are recorded (for example, an absolute bone density value >120 mg/cm3 is within the normal range; 80 to 120 mg/cm3 is low bone mass; and <80 mg/cm3 is osteoporosis).
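Purely as an illustration of the bone-density thresholds quoted above, a hypothetical helper might encode them as follows; the function name and the assumption that the input is already in mg/cm3 are illustrative only.

```python
# Qualitative bone-density category from the absolute QCT value (mg/cm^3).
def classify_bone_density(bmd_mg_per_cm3: float) -> str:
    if bmd_mg_per_cm3 > 120:
        return "normal"
    if bmd_mg_per_cm3 >= 80:
        return "low bone mass"
    return "osteoporosis"
```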
For a chest CT image of, for example, a new coronary pneumonia patient, the liver CT values at the lumbar 1-2 disc level can be measured. Specifically, according to the Couinaud eight-segment method, a region of interest with a diameter of 1 cm is defined at the center of each indicated liver segment while avoiding intrahepatic vessels, bile ducts and ligaments, the CT value is recorded, and the average CT value of the liver segments is taken as the liver CT value (HU) at the lumbar 1-2 intervertebral disc level. The CT value of the spleen at the lumbar 1-2 intervertebral disc level is also measured, and the corresponding fatty liver grade is recorded (a liver/spleen CT ratio between 0.7 and 1.0 indicates mild fatty liver, a ratio between 0.5 and 0.7 indicates moderate fatty liver, and a ratio of 0.5 or less indicates severe fatty liver).
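Similarly, the liver/spleen CT-ratio grading described above could be sketched as below; the function name is hypothetical, and the behaviour for ratios above 1.0 (treated here as no fatty liver) is an assumption not stated in the text.

```python
# Fatty-liver grade from the liver/spleen CT ratio (thresholds as in the text).
def fatty_liver_grade(liver_hu: float, spleen_hu: float) -> str:
    ratio = liver_hu / spleen_hu
    if ratio <= 0.5:
        return "severe fatty liver"
    if ratio <= 0.7:
        return "moderate fatty liver"
    if ratio <= 1.0:
        return "mild fatty liver"
    return "no fatty liver"   # assumption for ratios above 1.0
```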
In some embodiments, a cross-sectional DICOM image of the mid-plane of the four chambers of the heart may also be taken from the chest CT image and imported into the ITK-SNAP software. The volume of interest can be measured automatically by manually delineating the heart borders on the ITK-SNAP software and recording the heart volume at the four-chamber central level (mm 3). In addition, by marking each branch calcification region of the coronary artery by using the coronary stenosis analysis AI software and Agatston's score system and setting the threshold to 130HU, the calcification scores of the left main trunk, left anterior descending branch, circumflex branch and right coronary artery can be automatically obtained and the total calcification score can be calculated. Similarly, aortic calcium scores can be obtained. In addition, the maximum cross section shown by the thyroid gland is selected from the chest CT image, and the region of interest is defined with a diameter of 1cm, and the isthmus and bilateral lobe CT values are measured respectively, and the average CT value thereof is taken as the measurement result (HU) of the thyroid gland density. Further, the volume of interest was automatically measured by taking the largest cross sectional DICOM image of the thyroid gland display and importing it into the ITK-SNAP software, by manually delineating the thyroid edge on the ITK-SNAP software, and the thyroid volume (mm 3) was recorded.
After obtaining the above feature information and image information, at step S106, the chest CT image, the feature information and the image information are input into a pre-trained prediction model for prediction to obtain a prediction result of the length of stay of the new coronary pneumonia patient. In one embodiment, the prediction model may include a feature extraction module, a fusion module and a classifier connected in sequence. The chest CT image and the feature information may first be input to the feature extraction module to respectively perform feature extraction operations to obtain the corresponding intermediate features. Then, a feature fusion operation is performed on the corresponding intermediate features by using the fusion module to obtain initial features of the chest CT image, and the initial features are spliced with the image information to obtain final features of the chest CT image. Further, classification is performed based on the final features by using the classifier so as to obtain a prediction result of the length of stay of the new coronary pneumonia patient. The prediction model will be described in detail later in conjunction with fig. 2. In an implementation scenario, the feature extraction module may include a plurality of convolutional layers and a plurality of pooling layers, and the fusion module may include a plurality of convolutional layers and pooling layers. A plurality of first intermediate features may be obtained by performing a plurality of feature extraction operations on the chest CT image using the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module, and a plurality of second intermediate features may be obtained by performing a plurality of feature extraction operations on the feature information using the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module. Based on the plurality of first intermediate features and the plurality of second intermediate features extracted by the feature extraction module, the initial features of the chest CT image can be obtained by performing fusion operations thereon by using a plurality of fusion modules.
Specifically, the initial features of the chest CT image may be obtained by first selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features described above, respectively, then sequentially performing a feature fusion operation on the respective first target intermediate features and the respective second target intermediate features using the plurality of fusion modules to obtain the respective fusion results, and inputting each fusion result to the next fusion module. The next fusion module then performs the next feature fusion operation on the fusion result of the previous fusion module, the corresponding first target intermediate feature and the corresponding second target intermediate feature, until the fusion result of the last fusion module is obtained; the fusion result of the last fusion module is used as the initial feature of the chest CT image. In some embodiments, the aforementioned feature fusion operations may include, but are not limited to, addition, subtraction, multiplication, and maximum-value operations. This will be described in detail later in connection with fig. 3.
As described above, after the initial features are obtained, the initial features are spliced with the image information to obtain final features of the chest CT image, and then the final features are classified by using a classifier, so that a prediction result of the length of stay in hospital of a new coronary pneumonia patient can be obtained. In an implementation scenario, the prediction of the length of stay of the new coronary pneumonia patient is represented by a vector corresponding to the period of the length of stay of the new coronary pneumonia patient. As an example, assuming that the output prediction result is (1, 0, 0), it indicates that the length of hospitalization of the new coronary pneumonia patient is one week. Similarly, the output prediction result is (0, 1, 0), which indicates that the hospitalization time of the new coronary pneumonia patient is 2 weeks; the output prediction result is (0, 0, 1), and the hospitalization time of the new coronary pneumonia patient is 3 weeks.
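For illustration, such a one-hot prediction vector could be decoded into a number of weeks as in the following minimal sketch; the function name is hypothetical and not part of the disclosure.

```python
# Decode a one-hot length-of-stay vector into weeks, matching the
# (1,0,0) -> 1 week, (0,1,0) -> 2 weeks, (0,0,1) -> 3 weeks convention above.
import numpy as np

def decode_prediction(one_hot) -> int:
    return int(np.argmax(one_hot)) + 1

assert decode_prediction((0, 1, 0)) == 2   # two weeks of hospitalization
```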
As can be seen from the above description, the embodiments of the present disclosure comprehensively extract corresponding information based on the chest CT image of the new coronary pneumonia patient, so as to comprehensively depict the overall health status of the new coronary pneumonia patient. The comprehensively extracted information includes, for example, geometric feature information related to the lung lesion region (Gaussian curvature, mean curvature and/or Ricci curvature) and image information of regions other than the lung (fat area, bone density, liver CT value, thyroid density and the like); a pre-trained prediction model then makes a prediction based on this comprehensively extracted information, so that an accurate prediction result regarding the hospitalization duration of the new coronary pneumonia patient can be obtained efficiently. Based on the prediction result, medical resources can be conveniently and reasonably allocated and used, avoiding waste of resources.
Fig. 2 is an exemplary block diagram illustrating a prediction model predicting length of stay of a new coronary pneumonia patient according to an embodiment of the present disclosure. It should be understood that fig. 2 is a specific embodiment of the method 100 of fig. 1, and thus the description above with respect to fig. 1 applies equally to fig. 2.
As shown in fig. 2, the prediction model of an embodiment of the present disclosure may include a feature extraction module 201, a fusion module 202 and a classifier 203. When the chest CT image 204, the feature information 205 and the image information 206 are input into the pre-trained prediction model for prediction, feature extraction is first performed on the chest CT image 204 and the feature information 205 by the feature extraction module 201 to obtain a plurality of corresponding first intermediate features 204-1 and a plurality of corresponding second intermediate features 205-1, respectively. Further, a feature fusion operation may be performed on the extracted plurality of first intermediate features 204-1 and plurality of second intermediate features 205-1 via the plurality of fusion modules 202 to obtain initial features 207 of the chest CT image. For example, a plurality of first target intermediate features and a plurality of second target intermediate features are first selected from the plurality of first intermediate features 204-1 and the plurality of second intermediate features 205-1, respectively, and a feature fusion operation is then performed based on the plurality of first target intermediate features and the plurality of second target intermediate features via the plurality of fusion modules 202 to acquire the initial features 207 of the chest CT image (as shown in fig. 3, for example).
As previously described, the above-described feature information 205 may include Gaussian curvature, mean curvature and/or Ricci curvature. In one implementation scenario, when the extracted feature information is the Gaussian curvature, the mean curvature and the Ricci curvature, the Gaussian curvature and the mean curvature may be superimposed and regarded as two-dimensional tensor data. Thus, feature extraction can be performed on the two-dimensional tensor data and the Ricci curvature respectively to obtain a plurality of corresponding intermediate features. That is, in this scenario, the input of the feature extraction module includes three types of data, namely the chest CT image, the two-dimensional tensor data formed by superimposing the Gaussian curvature and the mean curvature, and the Ricci curvature.
As further shown in fig. 2, the image information 206 (e.g., area data of the abdominal wall muscles, subcutaneous fat and intra-abdominal fat, absolute bone density value, liver CT value, aortic calcium score, thyroid density, etc.) is spliced with the initial features 207 to obtain the final features of the chest CT image. It is understood that the aforementioned image information is usually real-valued data; its dimensions can be normalized and tiled and then spliced with the initial features to obtain the final features of the chest CT image. Next, the final features are classified by the classifier 203, and a prediction result 208 of the length of stay of the new coronary pneumonia patient is finally output.
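To make the data flow of fig. 2 concrete, the following is a minimal PyTorch-style sketch of the three-stage structure (per-modality feature extraction, fusion, splicing with the image information, classification). It is illustrative only: the layer sizes and channel counts are assumptions, only two input branches are shown, and the MFS fusion modules are replaced by a single fully connected layer.

```python
import torch
import torch.nn as nn

class LengthOfStayPredictor(nn.Module):
    def __init__(self, feat_dim=256, tabular_dim=16, num_classes=3):
        super().__init__()
        # one small convolutional branch per image-like input (CT patch; curvature maps)
        self.ct_branch = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(16 * 4 * 4, feat_dim))
        self.curv_branch = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(16 * 4 * 4, feat_dim))
        # stand-in for the MFS fusion modules: a single fully connected layer
        self.fusion = nn.Linear(2 * feat_dim, feat_dim)
        self.classifier = nn.Linear(feat_dim + tabular_dim, num_classes)

    def forward(self, ct_patch, curvature_maps, image_info):
        a = self.ct_branch(ct_patch)            # intermediate features of the CT image
        b = self.curv_branch(curvature_maps)    # intermediate features of the curvature data
        initial = self.fusion(torch.cat([a, b], dim=1))   # "initial features"
        final = torch.cat([initial, image_info], dim=1)   # splice in the QCT-derived scalars
        return self.classifier(final)                     # length-of-stay logits
```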
Fig. 3 is an exemplary structural block diagram illustrating the obtaining of initial features of a chest CT image according to an embodiment of the present disclosure. As shown in fig. 3, it is assumed that a plurality of first target intermediate features 301 are selected from the plurality of first intermediate features corresponding to the chest CT image, a plurality of second target intermediate features 302 are selected from the plurality of second intermediate features corresponding to the two-dimensional tensor data obtained by superimposing the Gaussian curvature and the mean curvature, and a plurality of second target intermediate features 303 are selected from the plurality of second intermediate features corresponding to the Ricci curvature. In this scenario, a feature fusion operation is performed on the corresponding target intermediate features via a plurality of fusion modules 304 (i.e., the fusion modules 202 shown in fig. 2 described above), and the fusion result output by the last fusion module 304 is taken as the initial feature 305 of the chest CT image.
Specifically, a feature fusion operation is first performed on the corresponding first target intermediate feature 301-1, the corresponding second target intermediate feature 302-1 and the corresponding second target intermediate feature 303-1 via the first fusion module 304, so as to obtain the fusion result corresponding to the first fusion module 304. The fusion result corresponding to the first fusion module 304 is then input to the second fusion module 304, and the second fusion module 304 performs a feature fusion operation based on the fusion result of the previous fusion module (i.e., the first fusion module 304), the corresponding first target intermediate feature 301-2, the corresponding second target intermediate feature 302-2 and the corresponding second target intermediate feature 303-2, so as to obtain the fusion result corresponding to the second fusion module 304. Similarly, a feature fusion operation is performed via the third fusion module 304 based on the fusion result of the second fusion module 304, the corresponding first target intermediate feature 301-3, the corresponding second target intermediate feature 302-3 and the corresponding second target intermediate feature 303-3 to obtain the fusion result corresponding to the third fusion module 304, and a feature fusion operation is performed via the fourth fusion module 304 based on the fusion result of the third fusion module 304, the corresponding first target intermediate feature 301-4, the corresponding second target intermediate feature 302-4 and the corresponding second target intermediate feature 303-4 to obtain the fusion result corresponding to the fourth fusion module 304.
In the present disclosure, the plurality of fusion modules may be, for example, multi-factor fusion structure ("MFS") modules, and each of the plurality of fusion modules is a fully connected layer. In the feature fusion operation performed by the fusion modules, the first fusion performs addition, subtraction, multiplication and maximum-value operations on the corresponding first target intermediate feature and the corresponding second target intermediate features, and fuses the results of these operations (for example, as shown in the dashed box in the figure) to obtain the fusion result of the first fusion module. Each of the other fusion modules performs the addition, subtraction, multiplication and maximum-value operations on the fusion result of the previous fusion module, the corresponding first target intermediate feature and the corresponding second target intermediate features, and fuses the results of these operations to obtain its corresponding fusion result. Taking the fourth fusion module 304 as an example, it performs addition (indicated by the "+" symbol in the figure), subtraction (indicated by the "-" symbol in the figure), multiplication (indicated by the "×" symbol in the figure) and maximum-value (indicated by the "max" symbol in the figure) operations based on the fusion result of the third fusion module 304, the corresponding first target intermediate feature 301-4, the corresponding second target intermediate feature 302-4 and the corresponding second target intermediate feature 303-4, and the results of these operations are fused (for example, spliced) to obtain the fusion result corresponding to the fourth fusion module 304. Assuming that the fourth fusion module 304 is the last one, its fusion result is the initial feature 305 of the chest CT image. In this way, data of multiple types can be deeply fused, so that the information contained in the initial features is richer.
It will be understood from the above that the final features of the chest CT image can be obtained by splicing the initial features and the image information, and a prediction result of the length of stay of the new coronary pneumonia patient can then be output by classifying the aforementioned final features with the classifier. It is understood that, when the prediction model is trained, the chest CT image, the feature information and the image information may be used as training data, and the prediction model may be trained with the length of stay of the new coronary pneumonia patient as the label. However, the distribution of the length of stay usually exhibits a long-tailed distribution, that is, some classes have a large number of samples while the remaining classes have only a few samples; directly using such samples in training makes the classifier in the prediction model less accurate, thereby resulting in a less accurate prediction result. For this reason, the embodiments of the present disclosure further propose adjusting the decision boundary of the classifier based on an optimal transport map to obtain the final prediction result of the length of stay of the new coronary pneumonia patient. In one embodiment, the weights of the classifier and the lengths of stay of the new coronary pneumonia patients may be obtained first, and an optimal transport map is then calculated in the feature space of the final features based on the lengths of stay of the new coronary pneumonia patients and the weights of the classifier, so as to adjust the decision boundary of the classifier.
Specifically, the final features described above may first be set as the points in the source domain and the weights of the classifier as the target points. In this implementation scenario, the points in the source domain are the final features extracted from the chest CT image of each patient (e.g., each new coronary pneumonia patient), and the target points are the corresponding classifier weights used when classifying each final feature. Then, the frequency of the hospitalization durations of the new coronary pneumonia patients is counted; that is, the number of patients whose length of stay falls within each time period is counted. After the frequencies are obtained, the measure of the target points is adjusted according to the frequencies to obtain the adjusted measure of the target points. In one embodiment, the measure of each target point may be adjusted according to a formula in which the adjusted measure of the target point is determined from the counted frequency and the number of distribution types, so that the adjusted measures of the target points are balanced across the classes (for example, uniform, as illustrated in fig. 4).
Further, an optimal transport map is calculated based on the measures of the points in the source domain, the target points and the adjusted measure of the target points, so as to adjust the decision boundary of the classifier. In one embodiment, an auxiliary function may first be constructed and then continuously optimized using the points in the source domain, the target points and the adjusted measure of the target points, so as to determine the optimal transport map and thereby adjust the decision boundary of the classifier. In this implementation scenario, denote the constructed auxiliary function by h, a point in the source domain by x, a target point by y_j, and the adjusted measure of the target points by ν. First, the inner product of each target point and each point in the source domain can be calculated, and for each target point the number of source-domain points for which the sum of this inner product and the auxiliary function attains its maximum is counted; that is, for each target point y_j, the number m_j of source-domain points x for which the quantity ⟨x, y_j⟩ + h_j is maximal over all target points is counted. For example, assuming that the number of points in the source domain is 100 and 20 of them attain their maximum at y_j, then m_j is 20. Then, the frequency is calculated from the counts of maxima relative to the total number of points; denoting the frequency by f_j, f_j = m_j / N, where N denotes the total number of source-domain points (e.g., 100). Further, the difference between the frequency f and the adjusted measure ν of the target points is calculated. The auxiliary function h is then updated by a gradient descent method, and the iteration continues until this difference is smaller than a preset threshold, at which point the optimal transport map T can be obtained: T sends each point x in the source domain to the target point y_j for which ⟨x, y_j⟩ + h_j is maximal. Based on the optimal transport map T obtained in the foregoing manner, the decision boundary of the classifier can be adjusted so that the target measures are evenly distributed, thereby solving the problem of the unbalanced distribution of hospital stays, for example as shown in fig. 4.
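A rough numpy sketch of this boundary-adjustment loop is given below, under the assumptions that the auxiliary function is a vector h over the target points and that the adjusted target measure is supplied by the caller (e.g., uniform); all names are illustrative and not part of the disclosure.

```python
import numpy as np

def fit_optimal_transport(features, weights, target_measure, lr=0.1, tol=1e-3, iters=5000):
    """features: (N, d) final features; weights: (K, d) classifier weights."""
    h = np.zeros(len(weights))
    assign = None
    for _ in range(iters):
        scores = features @ weights.T + h       # inner products plus auxiliary function
        assign = scores.argmax(axis=1)          # target point attaining the maximum
        freq = np.bincount(assign, minlength=len(weights)) / len(features)
        grad = freq - target_measure
        if np.abs(grad).max() < tol:            # measures matched: transport map reached
            break
        h -= lr * grad                          # gradient-descent update of h
    return h, assign

# e.g. target_measure = np.full(num_classes, 1.0 / num_classes) for a uniform boundary
```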
Fig. 4 is an exemplary diagram illustrating the decision boundary of a classifier before and after adjustment according to an embodiment of the present disclosure. The upper left diagram in fig. 4 shows the source domain before adjustment, and the lower left diagram shows the source domain after adjustment. The upper right diagram shows the measures of the target points before adjustment, which are 0.7, 0.1 and 0.2, respectively, i.e., the distribution of the lengths of stay before adjustment is uneven. The lower right diagram shows the measures of the adjusted target points, which are all 0.33, i.e., the decision boundary of the adjusted classifier is uniformly distributed. On this basis, the precision of the classifier can be improved, and a more accurate prediction result can thereby be obtained.
Fig. 5 is an exemplary block diagram illustrating an overview for predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure. As shown in fig. 5, a chest CT image 501 (i.e., the chest CT image 204 shown in fig. 2), two-dimensional tensor data 502 obtained by superimposing the Gaussian curvature and the mean curvature, and the Ricci curvature 503 are input to the feature extraction module, and a feature extraction operation is performed by the feature extraction module. In one embodiment, since the sizes of the chest CT images 501 are not uniform, the feature extraction module may employ a sliding window (the size may be 64 × 64), so that the size of the chest CT image 501 is 64 × 64. The size of the two-dimensional tensor data is 64 × 64. The Ricci curvature can be expressed as a three-dimensional tensor, whose size can be 64 × 3. The Gaussian curvature, the mean curvature and the Ricci curvature may be obtained based on the chest CT image, as specifically described with reference to fig. 1, and the details are not repeated here. As previously described, the aforementioned feature extraction module may include a plurality of convolutional layers and a plurality of pooling layers, and each convolutional layer may include convolution, batch normalization (BatchNorm) and ReLU. As an example, 10 convolutional layers and 4 pooling layers may be employed.
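By way of example, a backbone with 10 convolution + BatchNorm + ReLU blocks and 4 pooling layers, operating on 64 × 64 patches, could be assembled as in the following sketch; the channel widths and the positions of the pooling layers are assumptions.

```python
import torch.nn as nn

def conv_block(c_in, c_out):
    # convolution + batch normalization + ReLU, as described in the text
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                         nn.BatchNorm2d(c_out), nn.ReLU())

def make_feature_extractor(in_channels=1):
    widths = [16, 16, 32, 32, 64, 64, 128, 128, 256, 256]   # 10 convolutional layers
    layers, c_prev = [], in_channels
    for i, c in enumerate(widths):
        layers.append(conv_block(c_prev, c))
        if i in (1, 3, 5, 7):                                # 4 pooling layers
            layers.append(nn.MaxPool2d(2))
        c_prev = c
    return nn.Sequential(*layers)
```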
As further shown in the figure, feature extraction is performed via the feature extraction module on the chest CT image 501, the two-dimensional tensor data 502 obtained by superimposing the Gaussian curvature and the mean curvature, and the Ricci curvature 503, so as to obtain a plurality of corresponding first intermediate features 504, a plurality of second intermediate features 505 and a plurality of second intermediate features 506, respectively. Next, a plurality of target intermediate features are selected from the plurality of first intermediate features and the second intermediate features and input to the fusion modules 507 for fusion, so as to obtain the initial features 508 of the chest CT image. In one exemplary scenario, assume that first intermediate features 504-1 through 504-5 are selected from the plurality of first intermediate features 504, second intermediate features 505-1 through 505-5 are selected from the plurality of second intermediate features 505, and second intermediate features 506-1 through 506-5 are selected from the plurality of second intermediate features 506. In an implementation scenario, the choice of the target intermediate features may be determined based on the number of layers of the residual module. After the target features are obtained, they may be fused using the plurality of fusion modules 507.
For example, the first intermediate feature 504-1, the second intermediate feature 505-1 and the second intermediate feature 506-1 are first fused (including performing addition, subtraction, multiplication and maximum operation) by the first fusion module 507 to obtain corresponding fusion results. Then, the fusion result of the first fusion module 507 is input to the second fusion module 507, and the second fusion module 507 performs fusion based on the fusion result of the first fusion module 507, the first intermediate feature 504-2, the second intermediate feature 505-2, and the second intermediate feature 506-2, and inputs the fusion result to the next fusion module 507 until obtaining the fusion result of the last fusion module (shown as the fifth fusion module), and the fusion result of the last fusion module is used as the initial feature 508 of the CT image of the chest. For more details on obtaining the initial features, reference may be made to what is described above with reference to fig. 3, and the disclosure is not repeated here.
Further, the image information 509 obtained based on the chest CT image 501 (for example, area data of the abdominal wall muscles, subcutaneous fat and intra-abdominal fat, absolute bone density value, liver CT value, aortic calcium score, thyroid density, and the like) is spliced with the initial features 508 to obtain the final features of the chest CT image. The final features are then classified by the classifier 510 (i.e., the classifier 203 shown in fig. 2 above) to output a prediction result of the length of stay of the new coronary pneumonia patient. As described above, the optimal transport map 511 can be used to adjust the decision boundary of the classifier 510, so as to improve the precision of the classifier and make the final prediction result 512 output by the prediction model more accurate.
Fig. 6 is a block diagram illustrating an apparatus 600 for predicting length of stay for a new coronary pneumonia patient according to an embodiment of the present disclosure. It is to be understood that the device implementing aspects of the present disclosure may be a single device (e.g., a computing device) or a multifunction device including various peripheral devices.
As shown in fig. 6, the apparatus of the present disclosure may include a central processing unit ("CPU") 611, which may be a general-purpose CPU, a dedicated CPU or another execution unit on which information processing and programs run. Further, the device 600 may also include a mass storage memory 612 and a read-only memory ("ROM") 613, wherein the mass storage memory 612 may be configured to store various types of data, including various types of data related to the acquired chest CT images of new coronary pneumonia patients, algorithmic data, intermediate results, and the various programs needed to operate the device 600. The ROM 613 may be configured to store the power-on self-test for the device 600, the initialization of various functional modules in the system, drivers for basic input/output of the system, and the data and instructions required to boot the operating system.
Optionally, the device 600 may also include other hardware platforms or components, such as the illustrated tensor processing unit ("TPU") 614, graphics processing unit ("GPU") 615, field programmable gate array ("FPGA") 616, and machine learning unit ("MLU") 617. It is to be understood that although various hardware platforms or components are shown in the device 600, this is by way of illustration and not of limitation, and one skilled in the art can add or remove corresponding hardware as may be desired. For example, the device 600 may include only a CPU, associated memory devices, and interface devices to implement the disclosed method for predicting length of stay in a new coronary pneumonia patient.
In some embodiments, to facilitate the transfer and interaction of data with external networks, the device 600 of the present disclosure also includes a communication interface 618, such that it may connect to a local area network/wireless local area network ("LAN/WLAN") 605 via the communication interface 618, which in turn may connect to a local server 606 or to the Internet ("Internet") 607 via the LAN/WLAN. Alternatively or additionally, the device 600 of the present disclosure may also be directly connected to the Internet or a cellular network through the communication interface 618 based on wireless communication technology, such as 3rd-generation ("3G"), 4th-generation ("4G") or 5th-generation ("5G") wireless communication technology. In some application scenarios, the device 600 of the present disclosure may also access the server 608 and the database 609 of an external network as needed in order to obtain various known image models, data and modules, and may remotely store various data, such as various types of data or instructions for presenting chest CT images, feature information and image information, as well as a plurality of intermediate features obtained by performing feature extraction on the chest CT images and the feature information.
The peripheral devices of the apparatus 600 may include a display device 602, an input device 603 and a data transmission interface 604. In one embodiment, the display device 602 may, for example, include one or more speakers and/or one or more visual displays, configured for voice prompting and/or visual display of the chest CT images, the feature extraction process of the feature information, the feature fusion process, the initial features or the final features of the present disclosure. The input device 603 may include other input buttons or controls, such as a keyboard, a mouse, a microphone or a gesture-capture camera, configured to receive input of the chest CT images, the feature information and the image information, and/or user instructions. The data transmission interface 604 may include, for example, a serial interface, a parallel interface, a universal serial bus ("USB") interface, a small computer system interface ("SCSI"), serial ATA, FireWire ("FireWire"), PCI Express and a high-definition multimedia interface ("HDMI"), which are configured for data transfer and interaction with other devices or systems. In accordance with aspects of the present disclosure, the data transmission interface 604 may receive a chest CT image of a new coronary pneumonia patient from a CT device and transmit the chest CT image, the feature information and the image information, or various other types of data or results, to the device 600.
The aforementioned CPU 611, mass memory 612, ROM 613, TPU 614, GPU 615, FPGA 616, MLU 617 and communication interface 618 of the device 600 of the present disclosure may be interconnected by a bus 619 and enable data interaction with peripheral devices via the bus. Through the bus 619, the CPU 611 may control other hardware components and their peripherals in the device 600, in one embodiment.
A device for predicting length of stay in hospital for a new coronary pneumonia patient that may be used to carry out the present disclosure is described above in connection with fig. 6. It is to be understood that the device structures or architectures herein are merely exemplary, and that the implementations and implementation entities of the present disclosure are not limited thereto but may be modified without departing from the spirit of the present disclosure.
From the above description in conjunction with the accompanying drawings, those skilled in the art will also appreciate that embodiments of the present disclosure may also be implemented by software programs. The present disclosure thus also provides a computer program product. The computer program product may be used to implement the method for predicting the length of stay in hospital of a new coronary pneumonia patient described in this disclosure in conjunction with figs. 1-5.
It should be noted that while the operations of the disclosed methods are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into a single step, and/or a single step may be broken down into multiple steps.
It should be understood that when the terms first, second, third, fourth, etc. are used in the claims, specification, and drawings of the present disclosure, they are used only to distinguish one object from another and not to describe a particular order. The terms "comprises" and "comprising," when used in the specification and claims of this disclosure, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the disclosure herein is for the purpose of describing particular embodiments only, and is not intended to be limiting of the disclosure. As used in the specification and claims of this disclosure, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the term "and/or" as used in the specification and claims of this disclosure refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
Although the embodiments of the present disclosure are described above, these descriptions are only examples provided to facilitate understanding of the present disclosure and are not intended to limit its scope or application scenarios. It will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the disclosure, and that the scope of the disclosure is to be limited only by the appended claims.

Claims (10)

1. A method for predicting the length of stay of a new coronary pneumonia patient, comprising:
acquiring a chest CT image of a patient with new coronary pneumonia;
obtaining feature information and image information related to the chest CT image based on the chest CT image; and
inputting the chest CT image, the feature information, and the image information into a pre-trained prediction model for prediction so as to obtain a prediction result of the length of stay of the new coronary pneumonia patient.
2. The method of claim 1, wherein the feature information relates to features of lung regions and includes at least Gaussian curvature, mean curvature, and/or Ricci curvature, and wherein the image information includes at least image information relating to muscle and abdominal fat, image information relating to vertebral bone, and image information relating to the liver, cardiovascular system, and thyroid in the chest CT image.
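By way of illustration only, a common way to obtain a per-vertex Gaussian-curvature feature of a segmented lung surface is the discrete angle-defect formula on a triangle mesh. The sketch below assumes the lung region has already been segmented and meshed into `vertices` and `faces`; the mesh construction, the mean-curvature and Ricci-curvature terms, and how the per-vertex values are aggregated into the feature information are not specified here and are assumptions, not the claimed procedure itself.

```python
import numpy as np

def gaussian_curvature(vertices, faces):
    """Discrete Gaussian curvature via the angle-defect formula.

    vertices: (N, 3) float array of mesh vertex positions.
    faces:    (M, 3) int array of triangle vertex indices.
    Returns an (N,) array: 2*pi minus the sum of the interior angles
    incident to each vertex (valid for interior vertices of a closed mesh).
    """
    angle_sum = np.zeros(len(vertices))
    for tri in faces:
        pts = vertices[tri]                       # the three corners of this triangle
        for i in range(3):
            a = pts[(i + 1) % 3] - pts[i]
            b = pts[(i + 2) % 3] - pts[i]
            cos_t = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            angle_sum[tri[i]] += np.arccos(np.clip(cos_t, -1.0, 1.0))
    return 2.0 * np.pi - angle_sum
```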
3. The method of claim 2, wherein the prediction model comprises a feature extraction module, a fusion module, and a classifier connected in sequence, and wherein inputting the chest CT image, the feature information, and the image information into the pre-trained prediction model for prediction to obtain the prediction result of the length of stay of the new coronary pneumonia patient comprises:
inputting the chest CT image and the feature information into the feature extraction module to respectively perform feature extraction operations so as to obtain intermediate features corresponding to the chest CT image and the feature information;
performing a feature fusion operation on the corresponding intermediate features by using the fusion module to obtain initial features of the chest CT image;
concatenating the initial features and the image information to obtain final features of the chest CT image; and
classifying, with the classifier, based on the final features to obtain a prediction of length of stay of the new coronary pneumonia patient.
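A minimal PyTorch sketch of the pipeline described in claim 3 is given below, purely for illustration. The use of a small 3D CNN for the CT volume, an MLP for the curvature-type feature information, a single linear fusion layer, the tensor sizes, and the discretization of the length of stay into `num_classes` bins are all assumptions; the claim does not fix these details.

```python
import torch
import torch.nn as nn

class LengthOfStayPredictor(nn.Module):
    def __init__(self, feat_dim=3, image_info_dim=8, num_classes=4):
        super().__init__()
        # feature extraction module: one branch per input modality
        self.ct_branch = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten())           # -> (B, 8)
        self.feat_branch = nn.Sequential(
            nn.Linear(feat_dim, 8), nn.ReLU())               # -> (B, 8)
        # fusion module: combine the two intermediate features
        self.fusion = nn.Linear(16, 16)
        # classifier over the final (fused + image-information) feature
        self.classifier = nn.Linear(16 + image_info_dim, num_classes)

    def forward(self, ct_volume, feature_info, image_info):
        f_ct = self.ct_branch(ct_volume)
        f_fi = self.feat_branch(feature_info)
        initial = torch.relu(self.fusion(torch.cat([f_ct, f_fi], dim=1)))
        final = torch.cat([initial, image_info], dim=1)      # concatenate image information
        return self.classifier(final)                        # length-of-stay class logits
```

Under these assumed sizes, `LengthOfStayPredictor()(torch.randn(2, 1, 32, 64, 64), torch.randn(2, 3), torch.randn(2, 8))` returns a `(2, 4)` tensor of class logits.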
4. The method of claim 3, wherein inputting the chest CT image and the feature information into the feature extraction module to respectively perform feature extraction operations to obtain the intermediate features corresponding to the chest CT image and the feature information comprises:
inputting the chest CT image into the feature extraction module to perform a plurality of feature extraction operations to obtain a plurality of first intermediate features; and
inputting the feature information to the feature extraction module to perform a plurality of feature extraction operations to obtain a plurality of second intermediate features.
5. The method of claim 4, wherein the fusion module comprises a plurality of fusion modules, and wherein performing the feature fusion operation on the corresponding intermediate features with the fusion module to obtain the initial features of the chest CT image comprises:
selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features, respectively;
sequentially performing, with the plurality of fusion modules, feature fusion operations on the corresponding first target intermediate features and the corresponding second target intermediate features to obtain corresponding fusion results, and inputting each corresponding fusion result to the next fusion module;
performing, with the next fusion module, the next feature fusion operation on the fusion result of the previous fusion module, the corresponding first target intermediate feature, and the corresponding second target intermediate feature, until the fusion result of the final fusion module is obtained; and
taking the fusion result of the final fusion module as the initial features of the chest CT image.
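The cascaded fusion of claims 4 and 5 can be pictured, under illustrative assumptions, as a chain of small fusion layers in which each stage receives the previous stage's result together with one pair of target intermediate features. The common feature width `d`, the number of stages, and the use of a linear layer plus ReLU per stage are placeholders chosen only to make the sketch concrete; the claims do not prescribe them.

```python
import torch
import torch.nn as nn

class CascadedFusion(nn.Module):
    """Chain of fusion modules: stage 0 fuses the first pair of target features;
    each later stage fuses the previous result with the next pair."""

    def __init__(self, d=16, num_stages=3):
        super().__init__()
        self.stages = nn.ModuleList(
            [nn.Linear(2 * d, d)] + [nn.Linear(3 * d, d) for _ in range(num_stages - 1)])

    def forward(self, first_targets, second_targets):
        # first_targets / second_targets: lists of (B, d) target intermediate features,
        # one pair per fusion stage
        fused = torch.relu(self.stages[0](
            torch.cat([first_targets[0], second_targets[0]], dim=1)))
        for stage, f1, f2 in zip(self.stages[1:], first_targets[1:], second_targets[1:]):
            fused = torch.relu(stage(torch.cat([fused, f1, f2], dim=1)))
        return fused  # taken as the initial features of the chest CT image
```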
6. The method of claim 3, further comprising:
adjusting a decision boundary of the classifier based on an optimal transport map to obtain a final prediction of the length of stay of the new coronary pneumonia patient.
7. The method of claim 6, wherein adjusting the decision boundary of the classifier based on the optimal transport map comprises:
acquiring the weight of the classifier and the length of stay of the new coronary pneumonia patient; and
calculating an optimal transport map based on the length of stay of the new coronary pneumonia patient and the weight of the classifier in the feature space of the final features to adjust the decision boundary of the classifier.
8. The method of claim 7, wherein calculating the optimal transport map based on the length of stay of the new coronary pneumonia patient and the weight of the classifier in the feature space of the final features to adjust the decision boundary of the classifier comprises:
setting the final features as points within a source domain and the weights of the classifier as target points;
counting the frequencies of the lengths of stay of new coronary pneumonia patients;
adjusting the measures of the target points according to the frequencies to obtain adjusted measures of the target points; and
calculating the optimal transport map based on the points within the source domain, the target points, and the adjusted measures of the target points to adjust the decision boundary of the classifier.
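As an illustration of the kind of computation claims 7 and 8 describe, the sketch below uses an entropically regularized (Sinkhorn) approximation in place of an exact semi-discrete optimal transport map: the final features act as source points with a uniform measure, the classifier weight vectors act as target points, the target measure is set from the observed class frequencies, and each sample is then assigned to the class that receives most of its transport mass. The Sinkhorn substitution, the squared-Euclidean cost, and the regularization strength are assumptions for illustration only, not the patented procedure.

```python
import numpy as np

def sinkhorn(a, b, cost, reg=0.1, n_iter=200):
    """Entropic-regularized OT plan between histograms a (N,) and b (K,) with cost (N, K)."""
    K_mat = np.exp(-cost / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K_mat.T @ u)
        u = a / (K_mat @ v)
    return u[:, None] * K_mat * v[None, :]        # transport plan with marginals ~ a, b

def ot_adjusted_prediction(final_features, classifier_weights, class_frequencies, reg=0.1):
    """Assign each sample to the class that receives most of its transport mass.

    final_features:     (N, d) final features entering the classifier (source points)
    classifier_weights: (K, d) one weight vector per length-of-stay class (target points)
    class_frequencies:  (K,)   observed frequencies of the length-of-stay classes,
                               used as the adjusted measure of the target points
    """
    a = np.full(len(final_features), 1.0 / len(final_features))   # uniform source measure
    b = class_frequencies / class_frequencies.sum()               # adjusted target measure
    # squared Euclidean cost between every sample and every class weight vector
    cost = ((final_features[:, None, :] - classifier_weights[None, :, :]) ** 2).sum(-1)
    plan = sinkhorn(a, b, cost, reg)
    return plan.argmax(axis=1)                                    # OT-adjusted class per sample
```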
9. A device for predicting the length of stay of a new coronary pneumonia patient, comprising:
a processor; and
a memory storing program instructions for predicting the length of stay of a new coronary pneumonia patient, which, when executed by the processor, cause the device to implement the method of any one of claims 1-8.
10. A computer-readable storage medium having stored thereon computer-readable instructions for predicting the length of stay of a new coronary pneumonia patient, wherein the computer-readable instructions, when executed by one or more processors, implement the method of any one of claims 1-8.
CN202210631043.4A 2022-06-06 2022-06-06 Method for predicting hospitalization duration of new coronary pneumonia patient and related product Pending CN114708974A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210631043.4A CN114708974A (en) 2022-06-06 2022-06-06 Method for predicting hospitalization duration of new coronary pneumonia patient and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210631043.4A CN114708974A (en) 2022-06-06 2022-06-06 Method for predicting hospitalization duration of new coronary pneumonia patient and related product

Publications (1)

Publication Number Publication Date
CN114708974A true CN114708974A (en) 2022-07-05

Family

ID=82177990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210631043.4A Pending CN114708974A (en) 2022-06-06 2022-06-06 Method for predicting hospitalization duration of new coronary pneumonia patient and related product

Country Status (1)

Country Link
CN (1) CN114708974A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111401138A (en) * 2020-02-24 2020-07-10 上海理工大学 Countermeasure optimization method for generating countermeasure neural network training process
CN112767340A (en) * 2021-01-13 2021-05-07 大连理工大学 Apparatus and related products for assessing focal zone based on neural network model
CN113052186A (en) * 2021-03-17 2021-06-29 华中科技大学同济医学院附属协和医院 Imaging-based method and system for diagnosing and tracking new coronary pneumonia

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Junyi et al.: "Multi-center diagnosis of autism spectrum disorder based on optimal transport", 《数据采集与处理》 (Journal of Data Acquisition and Processing) *

Similar Documents

Publication Publication Date Title
US10776922B2 (en) Systems and methods for analysis of blood flow state
CN106887000B (en) Gridding processing method and system for medical image
Kamiya et al. Automated segmentation of psoas major muscle in X-ray CT images by use of a shape model: preliminary study
EP3570288A1 (en) Method for obtaining at least one feature of interest
EP3471054B1 (en) Method for determining at least one object feature of an object
CN106573150A (en) Suppression of vascular structures in images
JP7170747B2 (en) Similarity determination device, method and program
JP6914233B2 (en) Similarity determination device, method and program
JP7004829B2 (en) Similarity determination device, method and program
CN112381822B (en) Method for processing images of focal zones of the lungs and related product
CN112750110A (en) Evaluation system for evaluating lung lesion based on neural network and related products
CN115035375A (en) Method for feature extraction of chest CT image and related product
CN114708974A (en) Method for predicting hospitalization duration of new coronary pneumonia patient and related product
EP3270308B1 (en) Method for providing a secondary parameter, decision support system, computer-readable medium and computer program product
CN112884706B (en) Image evaluation system based on neural network model and related product
Pepe et al. Semi-supervised virtual regression of aortic dissections using 3D generative inpainting
Tobon-Gomez et al. 3D mesh based wall thickness measurement: identification of left ventricular hypertrophy phenotypes
CN115064250A (en) Method for adjusting distribution of stay in hospital and related product
Tomic et al. Simulation of breast lesions based upon fractal Perlin noise
CN113222985A (en) Image processing method, image processing device, computer equipment and medium
Hoye et al. Bias and variability in morphology features of lung lesions across CT imaging conditions
WO2020044736A1 (en) Similarity determination device, method, and program
Koutkalaki et al. Towards a foot bio-model for performing finite element analysis for footwear design optimization using a cloud infrastructure
CN114708973B (en) Device and storage medium for evaluating human health
Połap et al. Lung segmentation on x-ray images with neural validation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination