CN115064250A - Method for adjusting the distribution of length of stay in hospital and related product - Google Patents

Method for adjusting the distribution of length of stay in hospital and related product

Info

Publication number
CN115064250A
CN115064250A (application CN202210635531.2A)
Authority
CN
China
Prior art keywords
image
feature
stay
chest
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210635531.2A
Other languages
Chinese (zh)
Inventor
雷娜
王振常
任玉雪
陈伟
金连宝
魏璇
吕晗
李维
吴伯阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhituo Vision Technology Co ltd
Dalian University of Technology
Capital Normal University
Beijing Friendship Hospital
Original Assignee
Beijing Zhituo Vision Technology Co ltd
Dalian University of Technology
Capital Normal University
Beijing Friendship Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhituo Vision Technology Co ltd, Dalian University of Technology, Capital Normal University, Beijing Friendship Hospital
Priority to CN202210635531.2A
Publication of CN115064250A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10072 - Tomographic images
    • G06T2207/10081 - Computed x-ray tomography [CT]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present disclosure relates to a method and related product for adjusting the distribution of length of stay in hospital. The method comprises the following steps: acquiring chest CT images of a plurality of pneumonia patients and the length of stay corresponding to each pneumonia patient; acquiring feature information and image information related to the chest CT images based on the chest CT images; obtaining final features of the chest CT images from the chest CT images, the feature information and the image information; and computing an optimal transport map based on the final features and the corresponding lengths of stay to adjust the distribution of length of stay. With the disclosed scheme, the distribution of length of stay can be adjusted effectively, alleviating the problem of an imbalanced length-of-stay distribution.

Description

Method for adjusting the distribution of length of stay in hospital and related product
Technical Field
The present disclosure relates generally to the field of long-tail data processing. More particularly, the present disclosure relates to a method, an apparatus, and a computer-readable storage medium for adjusting the distribution of length of stay in hospital.
Background
Naturally collected data typically exhibit a long-tailed class distribution: a few classes contain a large number of samples, while the many remaining classes contain far fewer. In the real world, the distribution of length of stay is a typical long-tailed distribution, and using it directly makes it difficult to obtain accurate results; for example, it is hard to obtain accurate predictions of how long a patient will remain hospitalized. Current approaches to long-tailed distributions mainly rely on resampling and data augmentation. For resampling, most class-rebalancing methods improve tail-class and overall performance at the expense of head-class performance; because the amount of data is limited, they cannot fundamentally address the shortage of information in the tail classes and may overfit them. For data augmentation, simply applying existing, class-agnostic augmentation techniques to a long-tailed learning task is of limited value: although overall performance may improve, the majority classes contain more data and therefore receive more augmentation, which further aggravates the class imbalance. How to effectively handle the long-tailed distribution of length of stay has therefore become a technical problem in urgent need of a solution.
Disclosure of Invention
To at least partially solve the technical problems mentioned in the Background, the present disclosure provides a solution for adjusting the distribution of length of stay. With the disclosed scheme, the distribution of length of stay can be adjusted effectively, thereby alleviating the problem of an imbalanced distribution. To this end, the present disclosure provides solutions in the following aspects.
In a first aspect, the present disclosure provides a method for adjusting the distribution of length of stay in hospital, comprising: acquiring chest CT images of a plurality of pneumonia patients and the length of stay corresponding to each pneumonia patient; acquiring feature information and image information related to the chest CT image based on the chest CT image; obtaining final features of the chest CT image from the chest CT image, the feature information and the image information; and computing an optimal transport map based on the final features and the corresponding lengths of stay to adjust the distribution of length of stay.
In one embodiment, the feature information relates to features of the lung region and comprises at least Gaussian curvature, mean curvature and/or Ricci curvature, and the image information comprises at least image information related to muscle and abdominal fat, image information related to vertebral bone, and image information related to the liver, cardiovascular system and thyroid in the chest CT image.
In another embodiment, obtaining the final features of the chest CT image from the chest CT image, the feature information and the image information comprises: performing feature extraction operations on the chest CT image and the feature information, respectively, with a feature extraction module to obtain their corresponding intermediate features; performing a feature fusion operation on the corresponding intermediate features with a fusion module to obtain initial features of the chest CT image; and concatenating the initial features with the image information to obtain the final features of the chest CT image.
In yet another embodiment, the feature extraction module comprises a plurality of convolutional layers and a plurality of pooling layers, and performing feature extraction operations on the chest CT image and the feature information, respectively, with the feature extraction module to obtain their corresponding intermediate features comprises: performing a plurality of feature extraction operations on the chest CT image with the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module to obtain a plurality of first intermediate features; and performing a plurality of feature extraction operations on the feature information with the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module to obtain a plurality of second intermediate features.
In yet another embodiment, there are a plurality of fusion modules, and performing the feature fusion operation on the corresponding intermediate features with the fusion modules to obtain the initial features of the chest CT image comprises: selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features, respectively; sequentially performing a feature fusion operation on the corresponding first target intermediate features and second target intermediate features with the plurality of fusion modules to obtain corresponding fusion results, and inputting each fusion result to the next fusion module; performing the next feature fusion operation with the next fusion module on the fusion result of the previous fusion module and the corresponding first and second target intermediate features, until the fusion result of the last fusion module is obtained; and taking the fusion result of the last fusion module as the initial features of the chest CT image.
In yet another embodiment, computing an optimal transport map based on the final features and the corresponding lengths of stay to adjust the distribution of length of stay comprises: classifying the final features with a classification module and obtaining the weights of the classification module; and computing an optimal transport map in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module to adjust the distribution of length of stay.
In yet another embodiment, computing an optimal transport map in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module to adjust the distribution of length of stay comprises: taking the final features as points in the source domain and the weights of the classification module as target points; counting the frequency of the lengths of stay of the pneumonia patients; adjusting the measure of the target points according to the frequency to obtain the adjusted measure of the target points; and computing an optimal transport map based on the points in the source domain, the target points and the adjusted measure of the target points to adjust the distribution of length of stay.
In yet another embodiment, adjusting the measure of the target points according to the frequency comprises adjusting the measure of the target points based on a formula that appears as an image in the original publication, in which v_j denotes the adjusted measure of the j-th target point, n_j denotes the frequency, and K denotes the number of distribution classes.
In a second aspect, the present disclosure also provides an apparatus for adjusting the distribution of length of stay in hospital, comprising: a processor; and a memory storing program instructions for adjusting the distribution of length of stay, which, when executed by the processor, cause the apparatus to perform the method of any of the foregoing embodiments.
In a third aspect, the present disclosure also provides a computer-readable storage medium having stored thereon computer-readable instructions for adjusting the distribution of length of stay in hospital, which, when executed by one or more processors, implement the method of any of the foregoing embodiments.
With the disclosed scheme, the final features are obtained based on the chest CT image, the feature information and the image information, and an optimal transport map is computed from the final features and the lengths of stay so as to adjust the distribution of length of stay. In other words, the distribution of length of stay is adjusted through the optimal transport map, which effectively alleviates the problem of an imbalanced length-of-stay distribution. Based on the adjusted distribution, subsequent work can, for example, predict the length of stay of pneumonia patients and obtain accurate predictions, thereby providing a reference for the allocation and use of medical resources.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. In the drawings, several embodiments of the disclosure are illustrated by way of example and not by way of limitation, and like or corresponding reference numerals indicate like or corresponding parts, in which:
fig. 1 is an exemplary flow diagram illustrating a method for adjusting a distribution of length of stay in a hospital according to an embodiment of the present disclosure;
fig. 2 is an exemplary block diagram illustrating obtaining the final features of a chest CT image from the chest CT image, the feature information and the image information according to an embodiment of the present disclosure;
FIG. 3 is an exemplary block diagram illustrating obtaining the initial features of a chest CT image in accordance with an embodiment of the present disclosure;
FIG. 4 is an exemplary flow diagram illustrating adjustments to the distribution of length of stay in a hospital according to an embodiment of the present disclosure;
FIG. 5 is an exemplary diagram illustrating the distribution of length of stay before and after adjustment according to an embodiment of the present disclosure;
FIG. 6 is an exemplary block diagram illustrating an overview of adjusting the distribution of length of stay in a hospital according to an embodiment of the present disclosure; and
fig. 7 is a block diagram illustrating an apparatus for adjusting the distribution of length of stay in a hospital according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. It should be understood that the embodiments described in this specification are only some of the embodiments of the present disclosure, provided to facilitate a clear understanding of the solutions and to comply with legal requirements, rather than all of its embodiments. All other embodiments obtained by a person skilled in the art from the embodiments disclosed in this specification without creative effort shall fall within the protection scope of the present disclosure.
Fig. 1 is an exemplary flow diagram illustrating a method 100 for adjusting the distribution of length of stay in hospital according to an embodiment of the present disclosure. As shown in fig. 1, at step S102, chest CT images of a plurality of pneumonia patients and the corresponding length of stay of each patient are acquired. In one embodiment, the chest CT images may be acquired by, for example, computed tomography ("CT") equipment. Based on the acquired chest CT image, at step S104, feature information and image information related to the chest CT image are acquired from the chest CT image. The feature information relates to features of the lung region and may include, but is not limited to, Gaussian curvature, mean curvature and/or Ricci curvature. The image information includes, but is not limited to, image information related to muscle and abdominal fat (e.g., fat area) in the chest CT image, image information related to vertebral bone (e.g., bone density), and image information related to the liver, cardiovascular system and thyroid (e.g., liver CT value, thyroid density, etc.).
In one implementation scenario, the Gaussian curvature, mean curvature and/or Ricci curvature may be determined by generating a mesh of connected vertices (a two-dimensional surface mesh or a tetrahedral mesh) based on the lung lesion region in the chest CT image. Specifically, the Gaussian curvature at a vertex of the original, uncut closed mesh equals 2π minus the sum of the angles at that vertex in its adjacent faces. Denoting the Gaussian curvature by k, this is k = 2π - Σ_{i=1..n} θ_i, where θ_i is the angle at the mesh vertex in the i-th adjacent face and n is the number of faces adjacent to that vertex. For the mean curvature, let the lung lesion region be represented by a function F; the normal vector of the iso-surface on which a vertex x lies is the normalized gradient ∇F/|∇F|, and the mean curvature K at the vertex x may be defined as the divergence of this unit normal, K = ∇·(∇F/|∇F|).
For the Ricci curvature, a weight f(e) may first be defined for each edge e joining two vertices of the tetrahedral mesh; it is given by equation (1), which appears as an image in the original publication. In equation (1), ω_e denotes the weight of the edge e, ω_{v1} and ω_{v2} denote the weights of the vertices v_1 and v_2, respectively, E_{v1} denotes all edges adjacent to the vertex v_1 (excluding the edge e), and E_{v2} denotes all edges adjacent to the vertex v_2 (excluding the edge e). Further, a common weight ω_e may be defined and is expressed by equation (2), which also appears as an image in the original publication. Combining equations (1) and (2) yields the weight f(e) of each edge of the tetrahedral mesh. Based on the weights f(e), the Ricci curvature Ric at each vertex can then be obtained according to equation (3), likewise shown as an image in the original, in which e_v denotes an edge adjacent to the vertex v, E_v denotes the set of all edges adjacent to the vertex v, and deg(v) denotes |E_v|, i.e., the number of edges adjacent to the vertex v. Further, the weights along three mutually orthogonal axes at each vertex (i.e., the x-, y- and z-axes) may be computed from equations (1) and (2), respectively, and these three axial weights may be taken as the Ricci curvature values. The three axial weights form tensor data of a three-dimensional tensor; thus the Ricci curvature values may be represented as a three-dimensional tensor.
In another implementation scenario, the various kinds of image information may be obtained by, for example, quantitative computed tomography ("QCT"). For example, the DICOM data of the chest CT image is imported into the QCT software and the region of interest (e.g., taken at the level of the lumbar 1-2 intervertebral disc) is measured. Using the QCT body-composition measurement function, subcutaneous fat and intra-abdominal fat are separated along the outer edge of the abdominal wall muscles, and abdominal organs and blood vessels are excluded when identifying intra-abdominal fat. The abdominal wall muscle edge is then delineated manually to determine the abdominal wall muscle region of interest, and the area data of the abdominal wall muscle, subcutaneous fat and intra-abdominal fat are recorded quantitatively. Further, taking the mean of the cancellous bone density values at the first lumbar vertebra (L1) and the second lumbar vertebra (L2) as the criterion for evaluating osteoporosis, the continuous bone density values are recorded quantitatively and the corresponding qualitative data are recorded (for example, a bone density value >120 mg/cm3 is in the normal range, 80-120 mg/cm3 indicates low bone mass, and <80 mg/cm3 indicates osteoporosis).
For the chest CT image of, for example, a pneumonia patient, the liver CT value at the lumbar 1-2 intervertebral disc level can be measured. Specifically, according to the Couinaud eight-segment method, a region of interest with a diameter of 1 cm is placed at the center of each displayed liver segment while avoiding intrahepatic vessels, bile ducts and ligaments, the CT value is recorded, and the mean CT value of the liver segments is taken as the liver CT value (HU) at the lumbar 1-2 intervertebral disc level. The spleen CT value at the lumbar 1-2 intervertebral disc level is also measured, and the corresponding fatty liver grade is recorded (a liver/spleen CT ratio between 0.7 and 1.0 indicates mild fatty liver, a ratio between 0.5 and 0.7 indicates moderate fatty liver, and a ratio of 0.5 or less indicates severe fatty liver).
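To make the thresholds above concrete, the following Python sketch turns a measured bone density value and a liver/spleen CT ratio into the qualitative grades described in the preceding two paragraphs. The function names, return labels and boundary handling are illustrative assumptions, not part of the original disclosure.

```python
def grade_bone_density(bmd_mg_cm3: float) -> str:
    """Qualitative bone status from the mean L1-L2 cancellous bone density (mg/cm3)."""
    if bmd_mg_cm3 > 120:
        return "normal"
    if bmd_mg_cm3 >= 80:
        return "low bone mass"
    return "osteoporosis"

def grade_fatty_liver(liver_hu: float, spleen_hu: float) -> str:
    """Fatty liver grade from the liver/spleen CT-value ratio.

    The label for ratios above 1.0 and the handling of exact boundary values
    are assumptions; the original text only gives the ranges.
    """
    ratio = liver_hu / spleen_hu
    if ratio > 1.0:
        return "no fatty liver"
    if ratio >= 0.7:
        return "mild fatty liver"
    if ratio > 0.5:
        return "moderate fatty liver"
    return "severe fatty liver"
```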
In some embodiments, a cross-sectional DICOM image at the mid-level of the four heart chambers may also be taken from the chest CT image and imported into the ITK-SNAP software. By manually delineating the heart border in ITK-SNAP, the volume of interest can be measured automatically and the heart volume at the four-chamber mid-level (mm3) recorded. In addition, by marking the calcified regions of each coronary branch with coronary stenosis analysis AI software using the Agatston scoring system and a threshold of 130 HU, the calcification scores of the left main trunk, left anterior descending branch, circumflex branch and right coronary artery can be obtained automatically and the total calcification score calculated. The aortic calcium score can be obtained similarly. In addition, the largest cross-section showing the thyroid is selected from the chest CT image, a region of interest with a diameter of 1 cm is placed, the CT values of the isthmus and both lobes are measured, and their mean CT value is taken as the thyroid density measurement (HU). Further, the largest cross-sectional DICOM image showing the thyroid is imported into ITK-SNAP, the thyroid edge is delineated manually, the volume of interest is measured automatically, and the thyroid volume (mm3) is recorded.
After obtaining the above feature information and image information, at step S106, the final features of the chest CT image are obtained from the chest CT image, the feature information and the image information. In one embodiment, a feature extraction module may first be used to perform feature extraction operations on the chest CT image and the feature information, respectively, to obtain their corresponding intermediate features. A fusion module then performs a feature fusion operation on the corresponding intermediate features to obtain initial features of the chest CT image, and the initial features are concatenated with the image information to obtain the final features of the chest CT image. In an implementation scenario, the feature extraction module may include a plurality of convolutional layers and a plurality of pooling layers, and the fusion module may include a plurality of convolutional layers and pooling layers. A plurality of first intermediate features may be obtained by performing a plurality of feature extraction operations on the chest CT image with the plurality of convolutional layers and pooling layers in the feature extraction module, and a plurality of second intermediate features may be obtained by performing a plurality of feature extraction operations on the feature information with the plurality of convolutional layers and pooling layers in the feature extraction module.
The initial features of the chest CT image may be obtained by first selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features, respectively, then sequentially performing a feature fusion operation on the corresponding first and second target intermediate features with a plurality of fusion modules to obtain corresponding fusion results, and inputting each fusion result to the next fusion module. The next fusion module then performs the next feature fusion operation on the fusion result of the previous fusion module and the corresponding first and second target intermediate features, until the fusion result of the last fusion module is obtained; this last fusion result serves as the initial features of the chest CT image. In some embodiments, the feature fusion operation may include, but is not limited to, addition, subtraction, multiplication and maximum operations, as described in detail later in connection with figs. 2-3.
Based on the obtained final features of the chest CT image, at step S108, an optimal transport map is computed from the final features and the corresponding lengths of stay to adjust the distribution of length of stay. In one embodiment, the final features may first be classified by a classification module and the weights of the classification module obtained; an optimal transport map is then computed in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module to adjust the distribution of length of stay. More specifically, the target measure may be determined from the feature space of the final features, the weights of the classification module and the corresponding lengths of stay, and the optimal transport map may then be computed from the points of the feature space (the source domain) and the target measure so as to adjust the distribution of length of stay, as described in detail later in connection with fig. 4.
As can be seen from the above description, the embodiments of the present disclosure adjust the distribution of length of stay by extracting the final features of the chest CT images of pneumonia patients and computing an optimal transport map based on the feature space of the final features and the corresponding lengths of stay. The distribution of length of stay can be adjusted effectively based on the optimal transport map, alleviating the problem of an imbalanced length-of-stay distribution. Furthermore, the embodiments of the present disclosure deeply fuse multiple types of data through the feature extraction module and the fusion modules, so that the extracted features are richer. In addition, the embodiments also fuse in the image information, making the extracted features more comprehensive, which helps improve classification accuracy and in turn facilitates adjusting the distribution of length of stay.
Fig. 2 is an exemplary structural block diagram illustrating obtaining the final features of a chest CT image from the chest CT image, the feature information and the image information according to an embodiment of the present disclosure. It is to be understood that fig. 2 is a specific embodiment of step S106 of the method 100 of fig. 1, and therefore the description made above with respect to fig. 1 applies equally to fig. 2.
As shown in fig. 2, a chest CT image 201 and feature information 202 obtained based on the chest CT image 201 are first input to a feature extraction module 203 for feature extraction, yielding their respective intermediate features. It should be understood that the feature extraction operations on the chest CT image 201 and on the feature information 202 are independent of each other. As noted above, the feature extraction module 203 may include a plurality of convolutional layers and a plurality of pooling layers. In an implementation scenario, the chest CT image 201 passes through the convolutional and pooling layers to produce a plurality of corresponding first intermediate features 204, and the feature information 202 passes through the convolutional and pooling layers to produce a plurality of corresponding second intermediate features 205. As described above, the feature information 202 may include Gaussian curvature, mean curvature and Ricci curvature.
In an implementation scenario, since chest CT images 201 vary in size, the feature extraction module 203 may employ a sliding window (for example of size 64 x 64), so that the input derived from the chest CT image 201 has a size of 64 x 64. In this scenario, because the Gaussian curvature and the mean curvature are volume data consistent in size with the chest CT image 201, they are stacked and treated as two-dimensional tensor data of size 64 x 64 before being input to the feature extraction module 203. The Ricci curvature can be expressed as a three-dimensional tensor, whose size may be 64 x 64 x 3.
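The following Python sketch illustrates the input preparation just described: a 64 x 64 sliding-window patch is cropped, the Gaussian and mean curvature maps are stacked as a two-channel input, and the Ricci curvature is kept as a 64 x 64 x 3 tensor. The function name and the NumPy representation are assumptions used only for illustration.

```python
import numpy as np

def prepare_inputs(ct_slice, gauss_curv, mean_curv, ricci_curv, top, left, win=64):
    """Crop one sliding-window patch and assemble the three network inputs.

    ct_slice, gauss_curv, mean_curv: 2-D arrays aligned with the CT slice.
    ricci_curv: array of shape (H, W, 3) holding the three axial Ricci weights.
    """
    rows, cols = slice(top, top + win), slice(left, left + win)
    ct_patch = ct_slice[rows, cols][np.newaxis, ...]              # (1, 64, 64)
    curvature_patch = np.stack([gauss_curv[rows, cols],
                                mean_curv[rows, cols]])           # (2, 64, 64)
    ricci_patch = ricci_curv[rows, cols, :]                       # (64, 64, 3)
    return ct_patch, curvature_patch, ricci_patch
```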
Based on the extracted plurality of first intermediate features 204 and plurality of second intermediate features 205, a plurality of first target intermediate features 206 and a plurality of second target intermediate features 207 are first selected from the first intermediate features 204 and the second intermediate features 205, respectively. Next, the first target intermediate features 206 and the second target intermediate features 207 are input into fusion modules 208, which perform feature fusion operations to obtain initial features 209 of the chest CT image. In some embodiments, there may be a plurality of fusion modules 208, and each fusion module 208 may be, for example, a multi-factor fusion structure ("MFS") module. Except for the first fusion module, the inputs of each fusion module comprise the fusion result of the previous fusion module and the corresponding first and second target intermediate features. How the initial features are obtained will be described in detail later in connection with fig. 3.
Further, the initial features 209 are concatenated with image information 210 obtained based on the chest CT image 201 (e.g., the area data of the abdominal wall muscle, subcutaneous fat and intra-abdominal fat, the bone density value, the liver CT value, the aortic calcium score, the thyroid density, etc.) to obtain the final features of the chest CT image 201. It is understood that the aforementioned image information is usually real-valued data, which can be normalized in dimension and tiled before being concatenated with the initial features to obtain the final features of the chest CT image.
In summary, the chest CT image and the feature information (e.g., Gaussian curvature, mean curvature and Ricci curvature) are first input into the feature extraction module for feature extraction, so as to obtain a plurality of intermediate features corresponding to each. Corresponding target intermediate features are then selected from these intermediate features and fused by the fusion modules, yielding the initial features of the chest CT image. How the initial features of the chest CT image are obtained is described in detail below in conjunction with fig. 3.
Fig. 3 is an exemplary structural block diagram illustrating obtaining the initial features of a chest CT image according to an embodiment of the present disclosure. As shown in fig. 3, assume that a plurality of first target intermediate features 301 are selected from the plurality of first intermediate features corresponding to the chest CT image, a plurality of second target intermediate features 302 are selected from the plurality of second intermediate features corresponding to the Gaussian and mean curvature, and a plurality of second target intermediate features 303 are selected from the plurality of second intermediate features corresponding to the Ricci curvature. In this scenario, feature fusion operations are performed on the corresponding target intermediate features via a plurality of fusion modules (e.g., four MFS modules 304 are shown in the figure by way of example), and the fusion result output by the last MFS module 304 is taken as the initial features 305 of the chest CT image.
Specifically, a feature fusion operation is first performed on the corresponding first target intermediate feature 301-1, second target intermediate feature 302-1 and second target intermediate feature 303-1 via the first MFS module 304 to obtain the fusion result of the first MFS module 304. Next, this fusion result is input to the second MFS module 304, which performs a feature fusion operation on the fusion result of the previous fusion module (i.e., the first MFS module 304), the corresponding first target intermediate feature 301-2, second target intermediate feature 302-2 and second target intermediate feature 303-2 to obtain the fusion result of the second MFS module 304. Similarly, the third MFS module 304 performs a feature fusion operation on the fusion result of the second MFS module 304, the corresponding first target intermediate feature 301-3, second target intermediate feature 302-3 and second target intermediate feature 303-3 to obtain its fusion result, and the fourth MFS module 304 performs a feature fusion operation on the fusion result of the third MFS module 304, the corresponding first target intermediate feature 301-4, second target intermediate feature 302-4 and second target intermediate feature 303-4 to obtain its fusion result. In this way, multiple types of image data can be deeply fused, so that the obtained initial features contain richer information.
In the disclosed example, the MFS module described above may include a fully connected layer. When performing the feature fusion operation, the first MFS module adds, subtracts, multiplies and takes the maximum of the corresponding first target intermediate feature and second target intermediate features, and fuses the results of these operations (for example, as shown in the dashed boxes in the figure) to obtain the fusion result of the first MFS module. Each of the other MFS modules adds, subtracts, multiplies and takes the maximum of the fusion result of the previous MFS module and the corresponding first and second target intermediate features, and fuses the results of these operations to obtain its fusion result. Taking the fourth MFS module 304 as an example, it performs addition (indicated by the "+" symbol in the figure), subtraction (the "-" symbol), multiplication (the "x" symbol) and maximum (the "max" symbol) operations on the fusion result of the third MFS module 304, the corresponding first target intermediate feature 301-4, second target intermediate feature 302-4 and second target intermediate feature 303-4, and fuses (e.g., concatenates) the results of these operations to obtain the fusion result of the fourth MFS module 304. Since the fourth MFS module is the last one, its fusion result is the initial features 305 of the chest CT image.
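A minimal PyTorch sketch of such a fusion block is given below. It applies element-wise add, subtract, multiply and max operations to its inputs, concatenates the results and passes them through a fully connected layer. The class name, the layer sizes and the way the operations are reduced over more than two inputs are assumptions for illustration, since the original disclosure does not specify them.

```python
import torch
import torch.nn as nn

class MFSModule(nn.Module):
    """Multi-factor fusion block: element-wise add / subtract / multiply / max
    over its inputs, followed by concatenation and a fully connected layer.

    The reduction order for subtraction and the hidden size are illustrative
    assumptions; the patent text does not pin them down.
    """

    def __init__(self, feat_dim: int, out_dim: int):
        super().__init__()
        self.fc = nn.Linear(4 * feat_dim, out_dim)

    def forward(self, inputs):
        # inputs: list of tensors of identical shape (batch, feat_dim), e.g.
        # [previous fusion result, CT feature, curvature feature, Ricci feature].
        stacked = torch.stack(inputs)
        added = stacked.sum(dim=0)
        subtracted = inputs[0]
        for t in inputs[1:]:
            subtracted = subtracted - t
        multiplied = stacked.prod(dim=0)
        maxed = stacked.max(dim=0).values
        fused = torch.cat([added, subtracted, multiplied, maxed], dim=-1)
        return self.fc(fused)
```

Under this assumed interface, `MFSModule(256, 256)([prev, ct_feat, curv_feat, ricci_feat])` would produce the next fusion result in the chain of fig. 3.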
As described above, the final features of the chest CT image can be obtained by concatenating the initial features with the image information. Further, the final features are classified with a classification module, the weights of the classification module are obtained, and an optimal transport map is computed in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module so as to adjust the distribution of length of stay, as described in detail below in connection with fig. 4.
Fig. 4 is an exemplary flow diagram illustrating adjustment of the distribution of length of stay according to an embodiment of the present disclosure. As shown in fig. 4, at step S402, the final features are taken as points in the source domain and the weights of the classification module are taken as target points. In an implementation scenario, the points in the source domain are the final features extracted from the chest CT image of each patient (e.g., each pneumonia patient), and the target points are the corresponding weights used when classifying each final feature. Next, at step S404, the frequencies of the lengths of stay of the pneumonia patients are counted; that is, the number n_j of pneumonia patients whose length of stay falls in the j-th time period is counted.
After the frequencies are obtained, at step S406 the measure of the target points is adjusted according to the frequencies to obtain the adjusted measure of the target points. In one embodiment, the measure of the target points may be adjusted according to a formula that appears as an image in the original publication, in which v_j denotes the adjusted measure of the j-th target point, n_j denotes the frequency, and K denotes the number of distribution classes.
Further, at step S408, an optimal transport map is computed based on the points in the source domain, the target points and the adjusted measure of the target points, so as to adjust the distribution of length of stay. In one embodiment, an auxiliary function may first be constructed, and the optimal transport map is then determined by iteratively optimizing the auxiliary function using the points in the source domain, the target points and the adjusted measure of the target points. In this implementation scenario, denote the auxiliary function by h_k, a point in the source domain by x_i, a target point by ω_k, and the adjusted measure of the target points by v_j. First, the inner product of each target point with each point in the source domain is computed, and each source point x_i is associated with the target point at which the sum of this inner product and the auxiliary function, ⟨x_i, ω_k⟩ + h_k, attains its maximum; the number of source points attaining the maximum at ω_k is counted and denoted by j (the corresponding expression appears as an image in the original publication). For example, if there are 100 points in the source domain and 20 of them attain the maximum at ω_k, then j is 20. Next, the frequency is computed from this count as j/N, where N denotes the total number of source points (e.g., 100). A gradient is then computed from this frequency and the adjusted measure of the target points (the corresponding expressions appear as images in the original publication), and h_k is updated by gradient descent. The iteration continues until the computed quantity falls below a preset threshold, at which point the optimal transport map T(·) is obtained (its expression likewise appears as an image in the original publication).
Based on the optimal transport map T(·) obtained above, the distribution of length of stay can be adjusted to a uniform distribution, thereby alleviating the problem of an imbalanced length-of-stay distribution, as shown for example in fig. 5.
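Before turning to fig. 5, the following NumPy sketch illustrates one common way to realize the iteration described above, namely semi-discrete optimal transport solved by gradient descent on the auxiliary (dual) variable h. The update rule (empirical frequency minus target measure), the learning rate and the uniform target measure are assumptions used for illustration; the patent itself shows the corresponding formulas only as images.

```python
import numpy as np

def fit_transport_map(x, omega, v, lr=0.1, tol=1e-3, max_iter=10_000):
    """Semi-discrete optimal transport sketch.

    x:     (N, d) source points (final features of the patients).
    omega: (K, d) target points (classifier weights).
    v:     (K,)   adjusted target measure (e.g., uniform 1/K).
    Returns h (K,) and the assignment of each source point to a target point.
    """
    n, k = len(x), len(omega)
    h = np.zeros(k)
    for _ in range(max_iter):
        # Assign each source point to the target maximizing <x_i, w_k> + h_k.
        scores = x @ omega.T + h            # (N, K)
        assign = scores.argmax(axis=1)      # (N,)
        freq = np.bincount(assign, minlength=k) / n
        grad = freq - v                     # empirical vs. desired measure
        if np.abs(grad).max() < tol:
            break
        h -= lr * grad                      # gradient-descent update of h
    return h, assign

# Example: push 3 length-of-stay classes toward a uniform target measure.
# rng = np.random.default_rng(0)
# h, assign = fit_transport_map(rng.normal(size=(100, 8)),
#                               rng.normal(size=(3, 8)),
#                               np.full(3, 1 / 3))
```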
Fig. 5 is an exemplary schematic diagram illustrating the distribution of length of stay before and after adjustment according to an embodiment of the present disclosure. The upper left panel in fig. 5 shows the source domain before adjustment and the lower left panel shows the source domain after adjustment. The upper right panel shows the measures of the target points before adjustment, which are 0.7, 0.1 and 0.2, respectively; that is, the distribution of length of stay before adjustment is uneven. The lower right panel shows the measures of the adjusted target points, which are all 0.33; that is, the adjusted distribution of length of stay is uniform.
Fig. 6 is an exemplary block diagram illustrating an overview of adjusting the distribution of length of stay according to an embodiment of the present disclosure. As shown in fig. 6, a chest CT image 601 (i.e., the chest CT image 201 shown in fig. 2 and described above), two-dimensional tensor data 602 obtained by stacking the Gaussian curvature and the mean curvature, and the Ricci curvature 603 are input to a feature extraction module, which performs the feature extraction operation. The Gaussian curvature, the mean curvature and the Ricci curvature may be obtained based on the chest CT image, as described with reference to fig. 1 and not repeated here. As noted above, the feature extraction module may include a plurality of convolutional layers and a plurality of pooling layers, and each convolutional layer may include convolution, batch normalization and ReLU. As an example, 10 convolutional layers and 4 pooling layers may be employed.
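A minimal PyTorch sketch of such a backbone is shown below: each convolutional layer is convolution + batch normalization + ReLU, and four pooling layers are interleaved so that intermediate features can be tapped for the fusion modules. The channel widths, the placement of the pooling layers and the way intermediate features are collected are assumptions for illustration; the original only states the layer counts.

```python
import torch.nn as nn

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """One convolutional layer: convolution + BatchNorm + ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class FeatureExtractor(nn.Module):
    """Backbone with 10 conv layers and 4 pooling layers that also returns
    intermediate features produced after selected stages."""

    def __init__(self, in_ch: int):
        super().__init__()
        widths = [32, 32, 64, 64, 128, 128, 256, 256, 512, 512]
        layers, prev = [], in_ch
        for i, w in enumerate(widths):
            layers.append(conv_block(prev, w))
            prev = w
            if i in (1, 3, 5, 7):           # 4 pooling layers
                layers.append(nn.MaxPool2d(2))
        self.stages = nn.ModuleList(layers)

    def forward(self, x):
        intermediates = []
        for layer in self.stages:
            x = layer(x)
            if isinstance(layer, nn.MaxPool2d):
                intermediates.append(x)     # candidate target intermediate features
        intermediates.append(x)             # final feature map
        return intermediates
```

With four pooling stages plus the final output, this sketch yields five intermediate features per input stream, matching the five target intermediate features per stream shown in fig. 6.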
As further shown in the figure, the chest CT image 601, the stacked two-dimensional tensor data 602 of Gaussian and mean curvature, and the Ricci curvature 603 each undergo feature extraction via the feature extraction module, producing a corresponding plurality of first intermediate features 604, a plurality of second intermediate features 605 and a plurality of second intermediate features 606, respectively. Next, a plurality of target intermediate features are selected from the first and second intermediate features and input into the MFS modules 607 for fusion, so as to obtain the initial features 608 of the chest CT image. In one exemplary scenario, assume that first intermediate features 604-1 to 604-5 are selected from the plurality of first intermediate features 604, second intermediate features 605-1 to 605-5 are selected from the plurality of second intermediate features 605, and second intermediate features 606-1 to 606-5 are selected from the plurality of second intermediate features 606. In an implementation scenario, the choice of target intermediate features may be determined based on the number of layers of the residual module. After the target features are obtained, they may be fused using a plurality of MFS modules 607.
For example, the first MFS module 607 first fuses (including performing the addition, subtraction, multiplication and maximum operations on) the first intermediate feature 604-1, the second intermediate feature 605-1 and the second intermediate feature 606-1 to obtain its fusion result. This fusion result is input to the second MFS module 607, which fuses it with the first intermediate feature 604-2, the second intermediate feature 605-2 and the second intermediate feature 606-2 and passes its fusion result to the next MFS module 607, and so on, until the fusion result of the last MFS module (the fifth MFS module in the figure) is obtained; this last fusion result is taken as the initial features 608 of the chest CT image. In this way, multiple types of image data can be deeply fused so that the initial features contain richer information. For more details on obtaining the initial features, reference may be made to the description of fig. 3 above, which is not repeated here.
Further, the image information 609 obtained based on the chest CT image 601 (e.g., the area data of the abdominal wall muscle, subcutaneous fat and intra-abdominal fat, the bone density value, the liver CT value, the aortic calcium score, the thyroid density, etc.) is concatenated with the initial features 608 to obtain the final features of the chest CT image. The classification module 610 is then used to classify the final features and obtain the weights of the classification module 610, so that the optimal transport map can be computed in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module to adjust the distribution of length of stay. For more details on adjusting the distribution of length of stay, reference may be made to the description of fig. 4 above, which is not repeated here.
Fig. 7 is a block diagram illustrating an apparatus 700 for adjusting the distribution of length of stay according to an embodiment of the present disclosure. It is to be understood that the device implementing aspects of the present disclosure may be a single device (e.g., a computing device) or a multifunction device including various peripheral devices.
As shown in fig. 7, the apparatus of the present disclosure may include a central processing unit ("CPU") 711, which may be a general-purpose CPU, a special-purpose CPU, or another execution unit on which processing and programs run. Further, the device 700 may also include a mass storage 712 and a read-only memory ("ROM") 713, wherein the mass storage 712 may be configured to store various types of data, including the various data related to the acquired chest CT images of pneumonia patients, algorithm data, intermediate results, and the various programs needed to operate the device 700. The ROM 713 may be configured to store the data and instructions required for power-on self-test of the device 700, initialization of the functional modules in the system, drivers for basic input/output of the system, and booting the operating system.
Optionally, the device 700 may also include other hardware platforms or components, such as the illustrated tensor processing unit ("TPU") 714, graphics processing unit ("GPU") 715, field-programmable gate array ("FPGA") 716 and machine learning unit ("MLU") 717. It is to be understood that although various hardware platforms or components are shown in the device 700, this is illustrative and not limiting, and those skilled in the art may add or remove corresponding hardware as needed. For example, the device 700 may include only a CPU, the associated memory devices and interface devices to implement the disclosed method for adjusting the distribution of length of stay.
In some embodiments, to facilitate the transfer and interaction of data with external networks, the device 700 of the present disclosure further includes a communication interface 718 through which it may connect to a local area network/wireless local area network ("LAN/WLAN") 705 and, in turn, to a local server 706 via the LAN/WLAN or to the Internet 707. Alternatively or additionally, the device 700 of the present disclosure may connect directly to the Internet or a cellular network through the communication interface 718 based on wireless communication technology, such as 3rd-generation ("3G"), 4th-generation ("4G") or 5th-generation ("5G") wireless communication technology. In some application scenarios, the device 700 of the present disclosure may also access a server 708 and a database 709 of an external network as needed to obtain various known image models, data and modules, and may remotely store various data, such as the chest CT images, the feature information, the image information, and the plurality of intermediate features obtained by performing feature extraction on the chest CT images and the feature information.
The peripheral devices of the apparatus 700 may include a display device 702, an input device 703 and a data transmission interface 704. In one embodiment, the display device 702 may, for example, include one or more speakers and/or one or more visual displays configured to provide voice prompts and/or visual display of the chest CT image, the feature extraction process, the feature fusion process, or the initial or final features of the present disclosure. The input device 703 may include, for example, a keyboard, a mouse, a microphone, a gesture-capture camera or other input buttons or controls, configured to receive the chest CT images, the feature information and the image information and/or user instructions. The data transmission interface 704 may include, for example, a serial interface, a parallel interface, a universal serial bus ("USB") interface, a small computer system interface ("SCSI"), serial ATA, FireWire, PCI Express or a high-definition multimedia interface ("HDMI"), configured for data transfer and interaction with other devices or systems. In accordance with aspects of the present disclosure, the data transmission interface 704 may receive the chest CT images of pneumonia patients from a CT device and transmit the chest CT images, the feature information and the image information, or various other types of data or results, to the device 700.
The aforementioned CPU 711, mass storage 712, ROM 713, TPU 714, GPU 715, FPGA 716, MLU 717 and communication interface 718 of the device 700 of the present disclosure may be interconnected by a bus 719 and exchange data with the peripheral devices through the bus. In one embodiment, through the bus 719, the CPU 711 may control the other hardware components and their peripherals in the device 700.
An apparatus that can be used to carry out the present disclosure for adjusting the distribution of length of stay is described above in connection with fig. 7. It is to be understood that the device structures or architectures herein are merely exemplary, and the implementations and implementation entities of the present disclosure are not limited thereto and may be changed without departing from the spirit of the present disclosure.
From the above description in conjunction with the accompanying drawings, those skilled in the art will also appreciate that embodiments of the present disclosure may be implemented by software programs. The present disclosure thus also provides a computer program product, which may be used to implement the method for adjusting the distribution of length of stay described in the present disclosure in connection with figs. 1-6.
It should be noted that while the operations of the disclosed methods are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be broken down into multiple steps.
It should be understood that when the terms first, second, third, fourth, etc. are used in the claims, the specification and the drawings of the present disclosure, they are used only to distinguish one object from another, and not to describe a particular order. The terms "comprises" and "comprising," when used in the specification and claims of this disclosure, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the disclosure herein is for the purpose of describing particular embodiments only, and is not intended to be limiting of the disclosure. As used in the specification and claims of this disclosure, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the term "and/or" as used in the specification and claims of this disclosure refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
Although embodiments of the present disclosure have been described above, the descriptions are merely examples intended to facilitate understanding of the present disclosure and are not intended to limit its scope or application scenarios. It will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the disclosure, and that the scope of the disclosure is limited only by the appended claims.

Claims (10)

1. A method for adjusting a distribution of length of stay in a hospital, comprising:
acquiring chest CT images of a plurality of pneumonia patients and the lengths of stay corresponding to the pneumonia patients;
acquiring feature information and image information related to the chest CT image based on the chest CT image;
acquiring final features of the chest CT image according to the chest CT image, the feature information, and the image information; and
calculating an optimal transport map based on the final features and the corresponding lengths of stay, so as to adjust the distribution of lengths of stay.
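To make the flow recited in claim 1 easier to follow, here is a purely illustrative Python sketch. Every helper in it is a hypothetical stub standing in for the operations detailed in the dependent claims, not the patented implementation.

```python
# Illustrative outline of claim 1 only; all helpers are hypothetical stubs.
import numpy as np

def extract_feature_info(ct_volume):
    # Stand-in for curvature-type features of the lung region (cf. claim 2).
    return np.zeros_like(ct_volume)

def extract_imagery_info(ct_volume):
    # Stand-in for measurements of muscle, fat, vertebral bone, liver, etc.
    return np.zeros(8)

def build_final_feature(ct_volume, feature_info, imagery_info):
    # Stand-in for the extraction/fusion/concatenation of claims 3-5.
    pooled = np.array([ct_volume.mean(), feature_info.mean()])
    return np.concatenate([pooled, imagery_info])

def compute_ot_adjustment(final_features, lengths_of_stay):
    # Stand-in for the optimal-transport step of claims 6-8; here it only
    # returns the empirical distribution of the observed lengths of stay.
    values, counts = np.unique(lengths_of_stay, return_counts=True)
    return values, counts / counts.sum()

# Toy usage with random "CT volumes" and lengths of stay (in days).
ct_images = [np.random.rand(16, 64, 64) for _ in range(5)]
stays = np.array([7, 10, 7, 14, 21])
features = np.stack([
    build_final_feature(img, extract_feature_info(img), extract_imagery_info(img))
    for img in ct_images
])
print(compute_ot_adjustment(features, stays))
```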
2. The method of claim 1, wherein the feature information relates to features of the lung region and includes at least Gaussian curvature, mean curvature, and/or Ricci curvature, and wherein the image information includes at least image information relating to muscle and abdominal fat, image information relating to vertebral bone, and image information relating to the liver, cardiovascular system, and thyroid in the chest CT image.
3. The method of claim 2, wherein acquiring the final features of the chest CT image according to the chest CT image, the feature information, and the image information comprises:
performing a feature extraction operation on each of the chest CT image and the feature information by using a feature extraction module, to obtain intermediate features corresponding to the chest CT image and to the feature information;
performing a feature fusion operation on the corresponding intermediate features by using a fusion module, to obtain initial features of the chest CT image; and
concatenating the initial features and the image information to acquire the final features of the chest CT image.
4. The method of claim 3, wherein the feature extraction module comprises a plurality of convolutional layers and a plurality of pooling layers, and wherein performing the feature extraction operation on each of the chest CT image and the feature information with the feature extraction module to obtain the intermediate features corresponding thereto comprises:
performing a plurality of feature extraction operations on the chest CT image with the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module to obtain a plurality of first intermediate features; and
performing a plurality of feature extraction operations on the feature information using the plurality of convolutional layers and the plurality of pooling layers in the feature extraction module to obtain a plurality of second intermediate features.
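As a rough, non-authoritative sketch of the kind of extractor recited in claim 4, the snippet below stacks a few convolution and pooling layers in PyTorch and keeps the output of every stage as an "intermediate feature". The layer counts, channel widths, and the use of 2-D (rather than 3-D) convolutions are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class SketchExtractor(nn.Module):
    """Toy stand-in for the claimed feature-extraction module."""
    def __init__(self, in_channels=1, widths=(8, 16, 32)):
        super().__init__()
        stages, prev = [], in_channels
        for w in widths:
            stages.append(nn.Sequential(
                nn.Conv2d(prev, w, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.MaxPool2d(2),               # halves the spatial resolution
            ))
            prev = w
        self.stages = nn.ModuleList(stages)

    def forward(self, x):
        intermediates = []
        for stage in self.stages:
            x = stage(x)
            intermediates.append(x)            # keep every stage's output
        return intermediates

extractor = SketchExtractor()
ct_slice = torch.randn(1, 1, 128, 128)         # a single CT slice, batch of 1
feature_map = torch.randn(1, 1, 128, 128)      # curvature map of the same size
first_intermediates = extractor(ct_slice)      # "first intermediate features"
second_intermediates = extractor(feature_map)  # "second intermediate features"
print([t.shape for t in first_intermediates])
```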
5. The method of claim 4, wherein a plurality of the fusion modules are provided, and wherein performing the feature fusion operation on the corresponding intermediate features with the fusion modules to obtain the initial features of the chest CT image comprises:
selecting a plurality of first target intermediate features and a plurality of second target intermediate features from the plurality of first intermediate features and the plurality of second intermediate features, respectively;
sequentially performing the feature fusion operation on the corresponding first target intermediate features and the corresponding second target intermediate features by using the plurality of fusion modules to obtain corresponding fusion results, and inputting each corresponding fusion result to the next fusion module;
performing, by the next fusion module, a next feature fusion operation on the fusion result of the preceding fusion module, the corresponding first target intermediate feature, and the corresponding second target intermediate feature, until the fusion result of the final fusion module is obtained; and
taking the fusion result of the final fusion module as the initial features of the chest CT image.
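Claim 5 does not prescribe how each fusion module combines its inputs, so the sketch below is only one plausible reading: pooled intermediate features are concatenated with the previous fusion result and projected by a linear layer, and each stage feeds the next. The dimensions, the pooling assumption, and the choice of linear projections are illustrative assumptions, not the claimed design.

```python
import torch
import torch.nn as nn

class FusionStage(nn.Module):
    """Toy fusion module: merges one pair of target intermediate features
    with the previous stage's fusion result (if any) into a single vector."""
    def __init__(self, dim_a, dim_b, dim_prev, dim_out):
        super().__init__()
        self.project = nn.Linear(dim_a + dim_b + dim_prev, dim_out)

    def forward(self, feat_a, feat_b, prev=None):
        parts = [feat_a, feat_b] if prev is None else [feat_a, feat_b, prev]
        return torch.relu(self.project(torch.cat(parts, dim=-1)))

# Three pairs of (already pooled) target intermediate features; the
# dimensions and the number of stages are arbitrary illustrative choices.
dims = [8, 16, 32]
first_targets = [torch.randn(1, d) for d in dims]   # from the chest CT image
second_targets = [torch.randn(1, d) for d in dims]  # from the feature information

stages, prev_dim = [], 0
for d in dims:
    stages.append(FusionStage(d, d, prev_dim, dim_out=32))
    prev_dim = 32

fused = None
for stage, fa, fb in zip(stages, first_targets, second_targets):
    fused = stage(fa, fb, fused)        # feed each result into the next stage

initial_features = fused                # fusion result of the final stage
print(initial_features.shape)           # torch.Size([1, 32])
```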
6. The method of claim 4, wherein calculating an optimal transport map based on the final features and the corresponding lengths of stay to adjust the distribution of lengths of stay comprises:
classifying the final features by using a classification module and acquiring the weights of the classification module; and
calculating an optimal transport map in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module, so as to adjust the distribution of lengths of stay.
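One way to read claim 6 is that, once a linear classification module has been trained on the final features, the rows of its weight matrix serve as the target points of the transport problem. The snippet below illustrates that reading only; the feature dimension of 40 and the five length-of-stay classes are arbitrary illustrative values.

```python
import torch.nn as nn

num_classes, feature_dim = 5, 40              # e.g. 5 length-of-stay buckets
classifier = nn.Linear(feature_dim, num_classes)

# Each row of the (trained) weight matrix is treated as one target point in
# the feature space of the final features.
target_points = classifier.weight.detach()    # shape: (num_classes, feature_dim)
print(target_points.shape)
```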
7. The method of claim 6, wherein calculating the optimal transport map in the feature space of the final features based on the corresponding lengths of stay and the weights of the classification module to adjust the distribution of lengths of stay comprises:
setting the final features as points in a source domain and the weights of the classification module as target points;
counting the frequencies of the lengths of stay of the pneumonia patients;
adjusting the measures of the target points according to the frequencies to obtain the adjusted measures of the target points; and
calculating an optimal transport map based on the points in the source domain, the target points, and the adjusted measures of the target points, so as to adjust the distribution of lengths of stay.
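Claim 7 leaves the optimal-transport solver unspecified. As one hedged possibility, the sketch below treats the final features as uniformly weighted source points and the classifier rows as target points weighted by the adjusted class frequencies, and approximates the transport plan with entropy-regularized Sinkhorn iterations rather than an exact map; the data shapes and frequencies are made up for the example.

```python
import numpy as np

def sinkhorn_plan(source, targets, target_weights, reg=0.1, iters=500):
    """Entropy-regularized transport plan between uniformly weighted source
    points and weighted target points (an approximation, not an exact OT map)."""
    n, k = len(source), len(targets)
    a = np.full(n, 1.0 / n)                          # uniform source measure
    b = np.asarray(target_weights, dtype=float)
    b = b / b.sum()                                  # adjusted target measure
    cost = ((source[:, None, :] - targets[None, :, :]) ** 2).sum(-1)
    K = np.exp(-cost / (reg * cost.mean()))          # reg is relative to mean cost
    u, v = np.ones(n), np.ones(k)
    for _ in range(iters):                           # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]               # rows ~ a, columns ~ b

rng = np.random.default_rng(0)
final_features = rng.normal(size=(100, 40))          # 100 patients, 40-dim features
target_points = rng.normal(size=(5, 40))             # e.g. classifier weight rows
class_frequencies = np.array([40, 25, 15, 12, 8])    # counted lengths of stay
plan = sinkhorn_plan(final_features, target_points, class_frequencies)
print(plan.sum(axis=0))                              # matches the adjusted target measure
```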
8. The method of claim 7, wherein adjusting the measures of the target points according to the frequencies comprises adjusting the measure of each target point based on the following formula:
(The adjustment formula is provided as an image, RE-FDA0003795772860000031, in the original publication.)
wherein v_j denotes the adjusted measure of the target point, n_j denotes the frequency, and K denotes the number of distribution types.
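Because the adjustment formula itself appears only as an image in the publication, it cannot be reproduced verbatim here. The snippet below is a hypothetical stand-in consistent with the surrounding definitions (v_j the adjusted measure, n_j the frequency, K the number of distribution types), assuming the adjustment is a simple frequency normalization.

```python
import numpy as np

def adjust_target_measures(frequencies):
    """Hypothetical adjustment: give target point j a measure v_j proportional
    to its counted frequency n_j over the K distribution types (assumed form)."""
    n = np.asarray(frequencies, dtype=float)
    return n / n.sum()

# Counted frequencies of K = 5 length-of-stay classes among the patients.
print(adjust_target_measures([40, 25, 15, 12, 8]))   # measures sum to 1
```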
9. An apparatus for adjusting the distribution of length of stay in a hospital, comprising:
a processor; and
a memory storing program instructions for adjusting a distribution of length of stay which, when executed by the processor, cause the apparatus to implement the method of any one of claims 1-8.
10. A computer-readable storage medium having stored thereon computer-readable instructions for adjusting a distribution of length of stay in a hospital, which, when executed by one or more processors, implement the method of any one of claims 1-8.
CN202210635531.2A 2022-06-06 2022-06-06 Method for adjusting distribution of stay in hospital and related product Pending CN115064250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210635531.2A CN115064250A (en) 2022-06-06 2022-06-06 Method for adjusting distribution of stay in hospital and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210635531.2A CN115064250A (en) 2022-06-06 2022-06-06 Method for adjusting distribution of stay in hospital and related product

Publications (1)

Publication Number Publication Date
CN115064250A true CN115064250A (en) 2022-09-16

Family

ID=83199556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210635531.2A Pending CN115064250A (en) 2022-06-06 2022-06-06 Method for adjusting distribution of stay in hospital and related product

Country Status (1)

Country Link
CN (1) CN115064250A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107117510A (en) * 2017-04-14 2017-09-01 特斯联(北京)科技有限公司 A kind of intelligent elevator remote monitoring system
CN111401138A (en) * 2020-02-24 2020-07-10 上海理工大学 Countermeasure optimization method for generating countermeasure neural network training process
CN111814871A (en) * 2020-06-13 2020-10-23 浙江大学 Image classification method based on reliable weight optimal transmission
CN111932541A (en) * 2020-10-14 2020-11-13 北京信诺卫康科技有限公司 CT image processing method for predicting prognosis of COVID-19 pneumonia
CN112488992A (en) * 2020-11-13 2021-03-12 上海健康医学院 Epidermal growth factor receptor mutation state judgment method, medium and electronic device
CN112767340A (en) * 2021-01-13 2021-05-07 大连理工大学 Apparatus and related products for assessing focal zone based on neural network model
CN113223711A (en) * 2021-04-29 2021-08-06 天津大学 Multi-modal data-based readmission prediction model
CN113420820A (en) * 2021-06-29 2021-09-21 河北工程大学 Regularized optimal transmission theory-based unbalanced data classification method
CN113284149A (en) * 2021-07-26 2021-08-20 长沙理工大学 COVID-19 chest CT image identification method and device and electronic equipment

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LEI N et al.: "FFT-OT: A Fast Algorithm for Optimal Transportation", IEEE Xplore, 28 February 2022 (2022-02-28) *
王卫兵; 徐倩; 韩再博: "Flame and smoke detection based on optimal mass transport optical flow and neural networks" (基于最优质量传输光流法和神经网络的火焰和烟雾检测), Journal of Harbin University of Science and Technology, no. 01, 15 February 2017 (2017-02-15) *
王振常 et al.: "Exploring the factors influencing the length of hospital stay of COVID-19 patients based on CT images and clinical classification" (基于CT图像及临床分型探讨新型冠状病毒肺炎患者住院时长的影响因素), Journal of Clinical and Experimental Medicine, 10 May 2022 (2022-05-10) *
王时龙 et al.: "Research and Development of China's Strategic Emerging Industries: Smart Industry" (中国战略性新兴产业研究与发展 智慧工业), 31 December 2021, China Machine Press, pages 94-97 *

Similar Documents

Publication Publication Date Title
CN107123112B (en) Blood flow state analysis system and method
CN106887000B (en) Gridding processing method and system for medical image
CN102460471B (en) Systems for computer aided lung nodule detection in chest tomosynthesis imaging
Kalra Developing fe human models from medical images
US8335359B2 (en) Systems, apparatus and processes for automated medical image segmentation
WO2021128825A1 (en) Three-dimensional target detection method, method and device for training three-dimensional target detection model, apparatus, and storage medium
CN106573150B (en) The covering of image medium vessels structure
US8149237B2 (en) Information processing apparatus and program
CN104240287B (en) 2017-12-22 2014-06-27 Method and system for generating a coronary artery panorama from CT images
US10275909B2 (en) Systems and methods for an integrated system for visualizing, simulating, modifying and 3D printing 3D objects
Kamiya et al. Automated segmentation of psoas major muscle in X-ray CT images by use of a shape model: preliminary study
CN115578404A (en) Liver tumor image enhancement and segmentation method based on deep learning
CN107708550A (en) For the surface modeling for the segmentation acoustic echo structure for detecting and measuring anatomic abnormalities
US11756292B2 (en) Similarity determination apparatus, similarity determination method, and similarity determination program
Davamani et al. Biomedical image segmentation by deep learning methods
CN112381822B (en) Method for processing images of focal zones of the lungs and related product
Hu et al. Multi-rigid image segmentation and registration for the analysis of joint motion from three-dimensional magnetic resonance imaging
CN112381824B (en) Method for extracting geometric features of image and related product
CN112750110A (en) Evaluation system for evaluating lung lesion based on neural network and related products
CN115035375A (en) Method for feature extraction of chest CT image and related product
JP6827707B2 (en) Information processing equipment and information processing system
CN115064250A (en) Method for adjusting distribution of stay in hospital and related product
CN112884706B (en) Image evaluation system based on neural network model and related product
EP3270308B1 (en) Method for providing a secondary parameter, decision support system, computer-readable medium and computer program product
CN114708974A (en) 2022-07-05 2022-04-21 Method for predicting hospitalization duration of COVID-19 pneumonia patients and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination