CN117152178A - Thyroid gland cutting device and cutting method - Google Patents
- Publication number
- CN117152178A (application number CN202311215560.4A)
- Authority
- CN
- China
- Prior art keywords
- thyroid
- image
- abnormal
- pixel
- cutting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 210000001685 thyroid gland Anatomy 0.000 title claims abstract description 220
- 238000005520 cutting process Methods 0.000 title claims abstract description 133
- 238000000034 method Methods 0.000 title claims abstract description 38
- 238000010191 image analysis Methods 0.000 claims abstract description 31
- 238000003860 storage Methods 0.000 claims abstract description 16
- 238000004891 communication Methods 0.000 claims abstract description 4
- 230000002159 abnormal effect Effects 0.000 claims description 168
- 238000010586 diagram Methods 0.000 claims description 41
- 238000012545 processing Methods 0.000 claims description 18
- 238000004458 analytical method Methods 0.000 claims description 10
- 230000004807 localization Effects 0.000 claims 1
- 208000013076 thyroid tumor Diseases 0.000 abstract description 9
- 208000024770 Thyroid neoplasm Diseases 0.000 abstract description 7
- 230000008569 process Effects 0.000 description 18
- 238000004590 computer program Methods 0.000 description 5
- 206010028980 Neoplasm Diseases 0.000 description 4
- 201000011510 cancer Diseases 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 201000002510 thyroid cancer Diseases 0.000 description 3
- 238000001514 detection method Methods 0.000 description 2
- 230000006872 improvement Effects 0.000 description 2
- 230000036210 malignancy Effects 0.000 description 2
- 230000003211 malignant effect Effects 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 238000006467 substitution reaction Methods 0.000 description 2
- 208000004434 Calcinosis Diseases 0.000 description 1
- 208000009453 Thyroid Nodule Diseases 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 208000016842 benign thyroid gland neoplasm Diseases 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000003745 diagnosis Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000001575 pathological effect Effects 0.000 description 1
- 238000012216 screening Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 238000002604 ultrasonography Methods 0.000 description 1
- 238000012285 ultrasound imaging Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Theoretical Computer Science (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- General Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Quality & Reliability (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
The application provides a thyroid gland cutting device and cutting method, relating to the technical field of image recognition. The device comprises an image acquisition module, a cutting module and a terminal processor, the image acquisition module and the cutting module being in communication connection with the terminal processor; the image acquisition module is used for acquiring a thyroid ultrasound image of a patient; the terminal processor comprises an image analysis unit and a storage unit; the cutting module is used for cutting the thyroid at the patient's neck. The device and method solve the problem in the prior art that the obtained cutting area cannot be framed with sufficient specificity, so that the cutting device deviates from the cutting area when cutting a thyroid tumor within it, cannot be accurately positioned, and the cutting process of the thyroid tumor is affected.
Description
Technical Field
The application relates to the technical field of image recognition, in particular to a thyroid cutting device and a thyroid cutting method.
Background
Epidemiological studies show that the incidence of thyroid cancer has risen rapidly in recent years, and malignant thyroid tumors are its main pathological type, accounting for a significant proportion of cases. Ultrasound is widely recognized as the primary diagnostic tool for thyroid nodule screening and as a method for preoperative assessment of thyroid malignancy. Ultrasound imaging features such as microcalcifications, solid composition, and irregular edges and shapes are typical indicators of suspected thyroid malignancy.
Existing improvements to thyroid cutting generally raise the accuracy of distinguishing benign from malignant thyroid tumors. For example, the application with publication number CN108520518A discloses a thyroid tumor ultrasound image identification method and device that assists doctors in diagnosing benign and malignant thyroid tumors and achieved an accuracy above 90% in a benign/malignant detection test on thyroid ultrasound images, which is of great reference value for clinical diagnosis. Other improvements refine the ultrasound image of the thyroid tumor so that the obtained tumor position is more accurate. However, the obtained cutting area still cannot be framed with sufficient specificity, so the cutting device deviates from the cutting area when cutting a thyroid tumor within it and cannot be accurately positioned, which affects the cutting process of the thyroid tumor; the existing thyroid cutting technology therefore needs improvement.
Disclosure of Invention
In view of the defects in the prior art, the application aims to provide a thyroid gland cutting device and cutting method that solve the problem in the prior art that the obtained cutting area cannot be framed with sufficient specificity, so that the cutting device deviates from the cutting area when cutting a thyroid tumor within it, cannot be accurately positioned, and the cutting process of the thyroid tumor is affected.
In order to achieve the above purpose, the application provides a thyroid gland cutting device, which comprises an image acquisition module, a cutting module and a terminal processor, wherein the image acquisition module and the cutting module are in communication connection with the terminal processor;
the image acquisition module is used for acquiring a thyroid ultrasonic image of a patient and recording the thyroid ultrasonic image as a thyroid image;
the terminal processor comprises an image analysis unit and a storage unit, wherein the image analysis unit is used for analyzing thyroid images and selecting a cutting area in a frame mode;
the storage unit stores a plurality of groups of thyroid ultrasonic images under the condition of thyroid normal;
the cutting module cuts thyroid of the neck of the patient based on the cutting area in the image analysis unit.
Further, the image acquisition module is configured with an image positioning strategy, the image positioning strategy includes:
acquiring size data of the neck of a patient and acquiring a region of the neck of the patient to obtain a thyroid image;
the method comprises the steps of marking an image formed by size data of the neck of a patient as a neck image, putting the neck image into a rectangular coordinate system of the neck, and marking the area where the neck image is positioned as a neck area;
the region in the neck region, in which thyroid images are acquired from the neck of the patient, is marked as the image region.
Further, the image acquisition module is configured with an image pairing strategy, and the image pairing strategy comprises:
acquiring the height data, weight data and neck width-length data of the patient; acquiring the height data, weight data and neck width-length data of a plurality of groups of users with a normal thyroid from the storage unit; taking the absolute values of the differences between the patient's height, weight and neck width-length data and the corresponding data of each user with a normal thyroid, and recording them respectively as the height absolute difference, weight absolute difference and neck-width absolute difference; adding the height absolute difference, weight absolute difference and neck-width absolute difference to obtain a comparison total difference; and selecting the thyroid ultrasound image of the user corresponding to the minimum value among the plurality of comparison total differences, recorded as the comparison image;
and overlapping the comparison image with the neck region, marking a region corresponding to the image region in the comparison image based on the position parameter of the image region in the neck region, and marking the region as a normal neck image.
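The image pairing strategy above amounts to a nearest-neighbor lookup on three body measurements. A minimal Python sketch, assuming each record is a plain dict (the field and function names are illustrative, not from the patent):

```python
def select_comparison_image(patient, references):
    """Pick the stored thyroid-normal ultrasound image whose user data
    best matches the patient (minimum comparison total difference).

    patient: dict with 'height', 'weight' and 'neck' (width-length) values.
    references: list of dicts with the same keys plus an 'image' entry.
    """
    def total_difference(ref):
        # Sum of the height, weight and neck-width absolute differences.
        return (abs(patient["height"] - ref["height"])
                + abs(patient["weight"] - ref["weight"])
                + abs(patient["neck"] - ref["neck"]))

    best = min(references, key=total_difference)
    return best["image"]
```

The record closest to the patient under the summed absolute differences supplies the comparison image.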
Further, the image analysis unit is configured with a first image processing policy, the first image processing policy comprising:
carrying out pixelation processing on the normal neck image, and marking the normal neck image after the pixelation processing as a normal pixel image;
carrying out pixelation treatment on the thyroid image, and marking the thyroid image after the pixelation treatment as a thyroid pixel image;
acquiring the number of pixel points of a first row of a normal pixel image, and marking the number as K, wherein K is a positive integer;
dividing the normal pixel image into K columns based on the pixel points of the first row of the normal pixel image, and marking the K columns as normal columns 1 to K;
the thyroid pixel image is divided into K columns based on the pixel points of the first row of the thyroid pixel image, and is denoted as thyroid column 1 to thyroid column K.
Further, the image analysis unit is configured with a comparison group division policy, and the comparison group division policy includes:
marking the normal column 1 and the thyroid column 1 as a first comparison group 1, and obtaining a first comparison group 2 to a first comparison group K based on the normal column 2 to the normal column K and the thyroid column 2 to the thyroid column K;
for any one of the first comparison groups 1 to K, acquiring the number of pixel points in a normal column, and marking the number as L, wherein L is a positive integer;
the pixel points in the normal columns are marked as normal pixel points 1 to L from top to bottom, the pixel value of each pixel point in the normal pixel points 1 to L is obtained, and the pixel values are marked as normal values 1 to L;
marking the pixel points of the thyroid column as thyroid pixel point 1 to thyroid pixel point L from top to bottom, obtaining the pixel value of each of thyroid pixel points 1 to L, and marking them as thyroid value 1 to thyroid value L;
the normal value 1 and the thyroid value 1 are recorded as second comparison group 1, and second comparison group 2 to second comparison group L are obtained based on normal value 2 to normal value L and thyroid value 2 to thyroid value L.
Further, the image analysis unit is configured with a first comparison strategy, which includes:
for any one of the second comparison groups 1 to L, the absolute value of the difference between the normal value and the thyroid value is recorded as the pixel difference, and when the pixel difference is greater than or equal to the standard pixel difference, the second comparison group is recorded as an abnormal comparison group;
when the pixel difference is smaller than the standard pixel difference, the second comparison group is marked as a normal comparison group;
when the number of abnormal comparison groups in the second comparison groups 1 to L is greater than or equal to the standard abnormal number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns;
when no normal comparison group exists between the continuous first number of abnormal comparison groups in the second comparison groups 1 to L, the continuous first number of abnormal comparison groups are marked as first abnormal blocks;
when no normal comparison group exists between the continuous second number of abnormal comparison groups in the second comparison groups 1 to L, the continuous second number of abnormal comparison groups are marked as second abnormal blocks, wherein the first abnormal blocks and the second abnormal blocks do not conflict with each other;
and when the number of the first abnormal blocks in the second comparison groups 1 to L is more than or equal to the third number or the number of the second abnormal blocks is more than or equal to the fourth number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns.
Further, the image analysis unit is configured with a second image processing policy, the second image processing policy comprising:
acquiring all abnormal thyroid columns from the thyroid pixel image, marking, based on all abnormal comparison groups, the pixel points corresponding to the abnormal comparison groups in all abnormal thyroid columns, and recording the marked pixel points as abnormal points;
for any one of thyroid columns 2 to K-1, when abnormal points exist in the thyroid columns, acquiring the nearest thyroid columns with abnormal points on two sides of the thyroid column, and marking the nearest thyroid columns as adjacent abnormal columns;
for any abnormal point in the thyroid gland column, connecting the abnormal point with the nearest abnormal point in the adjacent abnormal column, and marking the connecting line as an abnormal line;
and acquiring images after all abnormal lines in the thyroid pixel image are connected, and marking the images as an abnormal block diagram.
Further, the image analysis unit is configured with a block diagram analysis strategy, the block diagram analysis strategy comprising:
and acquiring an abnormal line at the outermost side of the abnormal block diagram, and recording a closed graph formed by connecting the abnormal line at the outermost side of the abnormal block diagram as a cutting graph.
Further, the block diagram analysis strategy further comprises:
placing the abnormal block diagram into a plane rectangular coordinate system, acquiring a minimum circumscribed rectangle of the abnormal block diagram by using a minimum circumscribed rectangle algorithm, and marking the minimum circumscribed rectangle as a cut rectangle;
the cut rectangle is placed into the thyroid pixel image based on the position of the anomaly block diagram in the thyroid pixel image.
Further, the cutting module is configured with a cutting area selection strategy, the cutting area selection strategy comprising:
and acquiring a thyroid pixel image, moving the cutting device to the area where the cutting rectangle is located based on the cutting rectangle in the thyroid pixel image, and marking the area where the cutting rectangle is located as a cutting area.
Further, the cutting module is configured with a cutting strategy comprising:
based on the cutting pattern, the cutting pattern is cut in the cutting area using a cutting device.
In another aspect, the present application also provides a thyroid cutting method, including:
step S1, acquiring a thyroid ultrasonic image of a patient, and marking the thyroid ultrasonic image as a thyroid image;
s2, analyzing thyroid images, and selecting a cutting area by a frame;
and S3, cutting thyroid on the neck of the patient based on the cutting area in the image analysis unit.
The application has the following beneficial effects: the thyroid image and the normal image of the region where the thyroid is located are pixelated and then compared and analyzed; pixelation makes it convenient to compare the thyroid image with the normal neck image;
meanwhile, the normal neck image is obtained from the region where the thyroid is located, so its size data are the same as those of the thyroid image; after pixelation, the normal pixel image and the thyroid pixel image therefore contain the same number of pixel points and can be compared and analyzed one by one;
after the analysis is finished, connecting the abnormal lines yields the abnormal block diagram, from which the cutting pattern to be cut and the cutting area in which the cutting device is to be placed can be obtained more intuitively.
Additional aspects of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic block diagram of an apparatus of the present application;
FIG. 2 is a flow chart of the steps of the method of the present application;
FIG. 3 is a schematic view of the acquisition of a normal neck image according to the present application;
fig. 4 is a schematic diagram of the acquisition of a normal column according to the present application.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application.
Embodiments of the application and features of the embodiments may be combined with each other without conflict.
Example 1
In a first aspect, referring to fig. 1, the present application provides a thyroid gland cutting device, which includes an image acquisition module, a cutting module and a terminal processor, wherein the image acquisition module and the cutting module are in communication connection with the terminal processor;
the image acquisition module is used for acquiring a thyroid ultrasonic image of a patient and recording the thyroid ultrasonic image as a thyroid image;
the image acquisition module is configured with an image positioning strategy, and the image positioning strategy comprises:
acquiring size data of the neck of a patient and acquiring a region of the neck of the patient to obtain a thyroid image;
marking the image formed from the size data of the patient's neck as the neck image, placing the neck image into a neck rectangular coordinate system, and marking the area where the neck image is located as the neck region;
the region within the neck region from which the thyroid image of the patient's neck is acquired is marked as the image region;
in the implementation process, the image area is an area where the thyroid of the neck of the patient is located, and the thyroid of the neck of the patient can be positioned by acquiring the image area;
the image acquisition module is also configured with an image pairing strategy, and the image pairing strategy comprises:
acquiring the height data, weight data and neck width-length data of the patient; acquiring the height data, weight data and neck width-length data of a plurality of groups of users with a normal thyroid from the storage unit; taking the absolute values of the differences between the patient's height, weight and neck width-length data and the corresponding data of each user with a normal thyroid, and recording them respectively as the height absolute difference, weight absolute difference and neck-width absolute difference; adding the height absolute difference, weight absolute difference and neck-width absolute difference to obtain a comparison total difference; and selecting the thyroid ultrasound image of the user corresponding to the minimum value among the plurality of comparison total differences, recorded as the comparison image;
referring to fig. 3, Q1 is the neck rectangular coordinate system, Q2 is the neck region, Q3 is the image region, Q4 is the comparison image and Q5 is the normal neck image; the comparison image is overlapped with the neck region, and based on the position parameters of the image region within the neck region, the region corresponding to the image region is marked in the comparison image and recorded as the normal neck image;
in the specific implementation process, acquiring the patient's height data, weight data and neck width-length data makes it possible to find, in the storage unit, the normal neck image that best matches the patient's physical condition; meanwhile, because the normal neck image is obtained from the position parameters of the image region within the neck region, its size data are the same as those of the image region, which facilitates the subsequent image analysis;
the image analysis unit is configured with a first image processing strategy comprising:
carrying out pixelation processing on the normal neck image, and marking the normal neck image after the pixelation processing as a normal pixel image;
carrying out pixelation treatment on the thyroid image, and marking the thyroid image after the pixelation treatment as a thyroid pixel image;
in the specific implementation process, pixelation allows the images to be compared and analyzed more systematically; meanwhile, since the size data of the normal neck image and of the image region are the same, the two contain the same number of pixel points, so the pixel points can be analyzed one-to-one;
acquiring the number of pixel points of a first row of a normal pixel image, and marking the number as K, wherein K is a positive integer;
dividing the normal pixel image into K columns based on the pixel points of the first row of the normal pixel image, and marking the K columns as normal columns 1 to K;
dividing the thyroid pixel image into K columns based on the pixel points of the first row of the thyroid pixel image, and marking the thyroid pixel image as thyroid columns 1 to thyroid columns K;
in the implementation process, the number of the normal columns is equal to the number of the thyroid columns because the size data of the normal neck image and the size data of the image area are the same;
the image analysis unit is configured with a comparison group division strategy, and the comparison group division strategy comprises:
marking the normal column 1 and the thyroid column 1 as a first comparison group 1, and obtaining a first comparison group 2 to a first comparison group K based on the normal column 2 to the normal column K and the thyroid column 2 to the thyroid column K;
for any one of the first comparison groups 1 to K, acquiring the number of pixel points in a normal column, and marking the number as L, wherein L is a positive integer;
the pixel points in the normal columns are marked as normal pixel points 1 to L from top to bottom, the pixel value of each pixel point in the normal pixel points 1 to L is obtained, and the pixel values are marked as normal values 1 to L;
marking the pixel points of the thyroid column as thyroid pixel point 1 to thyroid pixel point L from top to bottom, obtaining the pixel value of each of thyroid pixel points 1 to L, and marking them as thyroid value 1 to thyroid value L;
in the specific implementation process, because the size data of the normal neck image and of the image region are the same, the number of normal values equals the number of thyroid values;
marking the normal value 1 and the thyroid value 1 as second comparison group 1, and obtaining second comparison group 2 to second comparison group L based on normal value 2 to normal value L and thyroid value 2 to thyroid value L;
the image analysis unit is configured with a first comparison strategy, the first comparison strategy comprising:
for any one of the second comparison groups 1 to L, the absolute value of the difference between the normal value and the thyroid value is recorded as the pixel difference, and when the pixel difference is greater than or equal to the standard pixel difference, the second comparison group is recorded as an abnormal comparison group;
in the specific implementation process, the standard pixel difference is 100; when the difference between the pixel values of two pixel points exceeds 100, the color difference between the two pixel points is large and they should be marked;
when the pixel difference is smaller than the standard pixel difference, the second comparison group is marked as a normal comparison group;
when the number of abnormal comparison groups in the second comparison groups 1 to L is greater than or equal to the standard abnormal number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns;
in the specific implementation process, the standard abnormal number is set to 30% of L; when the number of abnormal comparison groups is greater than or equal to 30% of the total number of second comparison groups, many abnormalities have occurred among the second comparison groups 1 to L, and the thyroid column corresponding to them is abnormal;
when no normal comparison group exists between the continuous first number of abnormal comparison groups in the second comparison groups 1 to L, the continuous first number of abnormal comparison groups are marked as first abnormal blocks;
when no normal comparison group exists between the continuous second number of abnormal comparison groups in the second comparison groups 1 to L, the continuous second number of abnormal comparison groups are marked as second abnormal blocks, wherein the first abnormal blocks and the second abnormal blocks do not conflict with each other;
in the specific implementation process, the first number is set to 20, the second number to 10, the third number to 5 and the fourth number to 15; the same pixel point may be recorded in both a first abnormal block and a second abnormal block, so the first abnormal block and the second abnormal block can overlap and do not conflict with each other;
when the number of the first abnormal blocks in the second comparison groups 1 to L is more than or equal to the third number or the number of the second abnormal blocks is more than or equal to the fourth number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns;
the image analysis unit is configured with a second image processing strategy comprising:
acquiring all abnormal thyroid columns from the thyroid pixel image, marking, based on all abnormal comparison groups, the pixel points corresponding to the abnormal comparison groups in all abnormal thyroid columns, and recording the marked pixel points as abnormal points;
for any one of thyroid columns 2 to K-1, when abnormal points exist in the thyroid columns, acquiring the nearest thyroid columns with abnormal points on two sides of the thyroid column, and marking the nearest thyroid columns as adjacent abnormal columns;
for any abnormal point in the thyroid gland column, connecting the abnormal point with the nearest abnormal point in the adjacent abnormal column, and marking the connecting line as an abnormal line;
acquiring images after all abnormal lines in the thyroid pixel images are connected, and marking the images as an abnormal block diagram;
the image analysis unit is configured with a block diagram analysis strategy, which includes:
acquiring the outermost abnormal lines of the abnormal block diagram, and recording the closed graph formed by connecting these outermost abnormal lines as a cutting graph;
in a specific implementation, the outermost abnormal lines can be obtained by filling the interior of the abnormal block diagram and then extracting the contour of the filled image;
the block diagram analysis strategy further comprises:
placing the abnormal block diagram into a plane rectangular coordinate system, obtaining the minimum circumscribed rectangle of the abnormal block diagram with a minimum circumscribed rectangle algorithm, and marking it as the cutting rectangle;
placing the cutting rectangle into the thyroid pixel image based on the position of the abnormal block diagram in the thyroid pixel image;
the cutting module is configured with a cutting area selection strategy, which includes:
acquiring a thyroid pixel image, moving a cutting device to an area where the cutting rectangle is located based on the cutting rectangle in the thyroid pixel image, and marking the area where the cutting rectangle is located as a cutting area;
the cutting unit is configured with a cutting strategy comprising:
based on the cutting graph, cutting along the cutting graph within the cutting area using the cutting device.
The terminal processor comprises an image analysis unit and a storage unit, wherein the image analysis unit is used for analyzing the thyroid image and framing out a cutting area;
the storage unit stores multiple groups of thyroid ultrasonic images acquired when the thyroid is normal;
the cutting module cuts the thyroid of the patient's neck based on the cutting area in the image analysis unit.
Example 2
In a second aspect, the present application also provides a thyroid cutting method comprising:
step S1, acquiring a thyroid ultrasonic image of a patient, and marking the thyroid ultrasonic image as a thyroid image;
step S2, analyzing the thyroid image, and framing out a cutting area;
step S2 comprises the following sub-steps:
step S201, acquiring size data of the patient's neck and acquiring the region of the neck from which the thyroid image is obtained;
recording the image formed by the size data of the patient's neck as a neck image, placing the neck image into a rectangular coordinate system, and marking the area where the neck image is located as the neck region;
marking, within the neck region, the region from which the thyroid image of the patient's neck is acquired, and recording it as the image region;
in a specific implementation, the image region is the area where the thyroid of the patient's neck is located, so acquiring the image region locates the thyroid of the patient's neck;
step S202, acquiring the patient's height data, weight data and neck width length data, acquiring from the storage unit the thyroid ultrasonic image whose height, weight and neck width length data are closest to those of the patient, and recording it as the comparison image;
overlapping the comparison image with the neck region, marking, based on the position parameters of the image region within the neck region, the region of the comparison image corresponding to the image region, and recording it as the normal neck image;
in a specific implementation, acquiring the patient's height, weight and neck width length data allows the normal neck image that best matches the patient's physique to be found in the storage unit; because the normal neck image is obtained from the position parameters of the image region within the neck region, its size data are the same as those of the image region, which facilitates the subsequent image analysis;
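In a specific implementation, the matching described above (spelled out in claim 3 as the sum of height, weight and neck width absolute differences) could be sketched as follows; the record layout, field order and sample values are illustrative assumptions, not part of the application:

```python
# Hypothetical stored records: (height_cm, weight_kg, neck_width_cm, image_id).
def select_comparison_image(patient, records):
    """Pick the stored record minimizing the summed absolute difference
    across height, weight and neck width (the 'comparison total difference')."""
    def total_diff(rec):
        return (abs(patient[0] - rec[0])
                + abs(patient[1] - rec[1])
                + abs(patient[2] - rec[2]))
    return min(records, key=total_diff)

records = [
    (160, 55, 11.0, "img_a"),
    (175, 80, 13.5, "img_b"),
    (168, 64, 12.0, "img_c"),
]
best = select_comparison_image((167, 63, 12.1), records)
print(best[3])  # img_c has the smallest summed absolute difference
```

Summing unweighted absolute differences treats a 1 cm neck width gap the same as a 1 cm height gap; a practical system might normalize or weight the three terms.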
step S203, performing pixelation processing on the normal neck image, and recording the processed image as the normal pixel image;
performing pixelation processing on the thyroid image, and recording the processed image as the thyroid pixel image;
in a specific implementation, pixelation allows the images to be compared and analyzed more systematically; because the size data of the normal neck image and of the image region are the same, the two contain the same number of pixel points, so the pixel points can be analyzed one-to-one;
acquiring the number of pixel points of a first row of a normal pixel image, and marking the number as K, wherein K is a positive integer;
referring to fig. 4, where Z1 is a normal pixel image, Z2 is a normal column 1, Z3 is a normal column 2, Z4 is a normal column K-1, Z5 is a normal column K, and the normal pixel image is divided into K columns based on the pixel points of the first row of the normal pixel image, and is recorded as a normal column 1 to a normal column K;
dividing the thyroid pixel image into K columns based on the pixel points of the first row of the thyroid pixel image, and marking the thyroid pixel image as thyroid columns 1 to thyroid columns K;
in the implementation process, the number of the normal columns is equal to the number of the thyroid columns because the size data of the normal neck image and the size data of the image area are the same;
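In a specific implementation, the column division of step S203 might look like the following sketch, where an image is represented as a list of pixel rows (a simplification of real ultrasonic image data):

```python
def split_into_columns(image):
    """Split a pixel image (list of rows) into K columns, one per pixel of the
    first row, mirroring the 'normal column 1..K' / 'thyroid column 1..K' division."""
    k = len(image[0])                       # K = number of pixel points in the first row
    return [[row[j] for row in image] for j in range(k)]

img = [[10, 20, 30],
       [11, 21, 31]]
cols = split_into_columns(img)
print(len(cols), cols[0])  # 3 columns; column 1 is [10, 11]
```

Because the normal pixel image and the thyroid pixel image have the same size data, applying this division to both yields the same number of columns, as the description notes.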
step S204, marking the normal column 1 and the thyroid column 1 as a first comparison group 1, and obtaining a first comparison group 2 to a first comparison group K based on the normal column 2 to the normal column K and the thyroid column 2 to the thyroid column K;
for any one of the first comparison groups 1 to K, acquiring the number of pixel points in a normal column, and marking the number as L, wherein L is a positive integer;
the pixel points in the normal columns are marked as normal pixel points 1 to L from top to bottom, the pixel value of each pixel point in the normal pixel points 1 to L is obtained, and the pixel values are marked as normal values 1 to L;
marking the pixel points of the thyroid column, from top to bottom, as thyroid pixel point 1 to thyroid pixel point L, obtaining the pixel value of each of thyroid pixel points 1 to L, and recording them as thyroid value 1 to thyroid value L;
in a specific implementation, the number of normal values equals the number of thyroid values because the size data of the normal neck image and of the image region are the same;
recording normal value 1 and thyroid value 1 as second comparison group 1, and obtaining second comparison group 2 to second comparison group L based on normal value 2 to normal value L and thyroid value 2 to thyroid value L;
step S205, for any one of second comparison groups 1 to L, recording the absolute value of the difference between the normal value and the thyroid value as the pixel difference; when the pixel difference is greater than or equal to the standard pixel difference, the second comparison group is marked as an abnormal comparison group;
in a specific implementation, the standard pixel difference is set to 100; when the pixel values of two pixel points differ by 100 or more, the color difference between them is considered significant and the group should be marked;
when the pixel difference is smaller than the standard pixel difference, the second comparison group is marked as a normal comparison group;
when the number of abnormal comparison groups in the second comparison groups 1 to L is greater than or equal to the standard abnormal number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns;
in a specific implementation, the standard abnormal number is set to 30% of L; when the number of abnormal comparison groups reaches or exceeds 30% of the total number of second comparison groups, second comparison groups 1 to L contain many abnormalities, and the thyroid column corresponding to them is abnormal;
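A minimal sketch of the step S205 comparison, using the thresholds given above (standard pixel difference 100, standard abnormal number 30% of L); the sample pixel values are hypothetical:

```python
STANDARD_PIXEL_DIFF = 100          # threshold named in the description
STANDARD_ABNORMAL_RATIO = 0.30     # 30% of L, per the description

def classify_column(normal_values, thyroid_values):
    """Mark each second comparison group abnormal when |normal - thyroid| >= 100,
    and flag the whole column abnormal when >= 30% of the groups are abnormal."""
    flags = [abs(n - t) >= STANDARD_PIXEL_DIFF
             for n, t in zip(normal_values, thyroid_values)]
    column_abnormal = sum(flags) >= STANDARD_ABNORMAL_RATIO * len(flags)
    return flags, column_abnormal

normal = [200, 200, 200, 200, 200]
thyroid = [90, 95, 205, 198, 200]   # first two groups differ by >= 100
flags, abnormal = classify_column(normal, thyroid)
print(flags, abnormal)  # [True, True, False, False, False] True
```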
when a run of the first number of consecutive abnormal comparison groups in second comparison groups 1 to L contains no normal comparison group, the run is marked as a first abnormal block;
when a run of the second number of consecutive abnormal comparison groups in second comparison groups 1 to L contains no normal comparison group, the run is marked as a second abnormal block, wherein the first abnormal blocks and the second abnormal blocks do not conflict with each other;
in a specific implementation, the first number is set to 20, the second number to 10, the third number to 5, and the fourth number to 15; the same pixel point may belong to both a first abnormal block and a second abnormal block, so the two kinds of blocks may overlap, which is why they do not conflict;
when the number of first abnormal blocks in second comparison groups 1 to L is greater than or equal to the third number, or the number of second abnormal blocks is greater than or equal to the fourth number, the thyroid column in the first comparison group corresponding to second comparison groups 1 to L is marked as an abnormal thyroid column;
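The abnormal-block counting can be sketched as follows, under one reading of the definition above: a block is any window of the given length containing only abnormal comparison groups, and windows may overlap, consistent with the statement that first and second abnormal blocks do not conflict. The flag sequence is hypothetical:

```python
FIRST_NUMBER, SECOND_NUMBER = 20, 10   # run lengths named in the description
THIRD_NUMBER, FOURTH_NUMBER = 5, 15    # block-count thresholds

def count_runs(flags, run_len):
    """Count (possibly overlapping) windows of run_len consecutive abnormal
    comparison groups with no normal group in between."""
    return sum(all(flags[i:i + run_len])
               for i in range(len(flags) - run_len + 1))

# 12 abnormal groups, one normal group, then 3 more abnormal groups.
flags = [True] * 12 + [False] + [True] * 3
print(count_runs(flags, SECOND_NUMBER))  # 3 windows of 10 consecutive abnormal groups
```

Whether the application intends overlapping windows or maximal runs is not stated; the sketch follows the overlap reading because the description explicitly allows a pixel point to belong to both kinds of block.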
step S206, acquiring all abnormal thyroid columns from the thyroid pixel image, marking, based on all abnormal comparison groups, the pixel points in the abnormal thyroid columns that correspond to those abnormal comparison groups, and recording the marked pixel points as abnormal points;
for any one of thyroid columns 2 to K-1 that contains abnormal points, acquiring the nearest thyroid column containing abnormal points on each side of that column, and marking these columns as adjacent abnormal columns;
for any abnormal point in the thyroid gland column, connecting the abnormal point with the nearest abnormal point in the adjacent abnormal column, and marking the connecting line as an abnormal line;
acquiring the image formed after all abnormal lines in the thyroid pixel image are connected, and marking it as an abnormal block diagram;
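The abnormal-line construction of step S206 can be sketched as follows; abnormal points are represented as a mapping from column index to row indices, which is an assumed data layout rather than one given in the application:

```python
def abnormal_lines(points_by_col):
    """Connect every abnormal point to the nearest abnormal point in the closest
    abnormal column on each side, yielding the 'abnormal lines'.
    points_by_col maps column index -> row indices of abnormal points."""
    cols = sorted(c for c, rows in points_by_col.items() if rows)
    lines = set()
    for idx, c in enumerate(cols):
        for side in (idx - 1, idx + 1):
            if 0 <= side < len(cols):
                nc = cols[side]                  # nearest abnormal column on this side
                for r in points_by_col[c]:
                    nearest = min(points_by_col[nc], key=lambda rr: abs(rr - r))
                    # frozenset makes the line undirected, so duplicates collapse.
                    lines.add(frozenset({(c, r), (nc, nearest)}))
    return lines

pts = {1: [4, 9], 3: [5], 6: [7]}
print(len(abnormal_lines(pts)))  # 3 undirected lines between adjacent abnormal columns
```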
step S207, acquiring the outermost abnormal lines of the abnormal block diagram, and recording the closed graph formed by connecting these outermost abnormal lines as a cutting graph;
in a specific implementation, the outermost abnormal lines can be obtained by filling the interior of the abnormal block diagram and then extracting the contour of the filled image;
placing the abnormal block diagram into a plane rectangular coordinate system, obtaining the minimum circumscribed rectangle of the abnormal block diagram with a minimum circumscribed rectangle algorithm, and marking it as the cutting rectangle;
placing the cutting rectangle into the thyroid pixel image based on the position of the abnormal block diagram in the thyroid pixel image;
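A simplified sketch of the step S207 rectangle: an axis-aligned bounding box over the abnormal points. The application does not specify its minimum circumscribed rectangle algorithm; a rotated minimum-area rectangle (e.g. OpenCV's cv2.minAreaRect) would be a closer match, but the axis-aligned version shows the idea with no dependencies:

```python
def bounding_rectangle(points):
    """Axis-aligned bounding rectangle of the abnormal block diagram's points,
    returned as ((min_x, min_y), (max_x, max_y)) corner coordinates.
    A stand-in for the minimum circumscribed rectangle of the description."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs), min(ys)), (max(xs), max(ys))

pts = [(2, 3), (5, 1), (4, 6)]
print(bounding_rectangle(pts))  # ((2, 1), (5, 6))
```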
step S3, cutting the thyroid of the patient's neck based on the cutting area in the image analysis unit;
step S3 comprises the following sub-steps:
step S301, acquiring the thyroid pixel image, moving the cutting device to the area where the cutting rectangle is located based on the cutting rectangle in the thyroid pixel image, and marking that area as the cutting area;
step S302, based on the cutting graph, cutting along the cutting graph within the cutting area using the cutting device.
Example 3
In a third aspect, the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above. By the above technical solution, the computer program, when executed by the processor, performs the method in any of the alternative implementations of the above embodiments to implement the following functions: firstly, acquiring a thyroid ultrasonic image of a patient, and recording the thyroid ultrasonic image as a thyroid image; then analyzing the thyroid image, and selecting a cutting area by a frame; finally, based on the cutting area in the image analysis unit, the thyroid gland of the neck of the patient is cut.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein. The storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
The above examples are only specific embodiments of the present application and are not intended to limit its protection scope. Although the present application has been described in detail with reference to the foregoing examples, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently substituted, within the technical scope of the present disclosure; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included in its protection scope. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (12)
1. The thyroid gland cutting device is characterized by comprising an image acquisition module, a cutting module and a terminal processor, wherein the image acquisition module and the cutting module are in communication connection with the terminal processor;
the image acquisition module is used for acquiring a thyroid ultrasonic image of a patient and recording the thyroid ultrasonic image as a thyroid image;
the terminal processor comprises an image analysis unit and a storage unit, wherein the image analysis unit is used for analyzing the thyroid image and framing out a cutting area;
the storage unit stores multiple groups of thyroid ultrasonic images acquired when the thyroid is normal;
the cutting module cuts the thyroid of the patient's neck based on the cutting area in the image analysis unit.
2. The thyroid cutting device of claim 1, wherein the image acquisition module is configured with an image localization strategy comprising:
acquiring size data of the patient's neck and acquiring the region of the neck from which the thyroid image is obtained;
recording the image formed by the size data of the patient's neck as a neck image, placing the neck image into a rectangular coordinate system, and marking the area where the neck image is located as the neck region;
the region within the neck region from which the thyroid image of the patient's neck is acquired is marked as the image region.
3. The thyroid cutting device of claim 2, wherein the image acquisition module is configured with an image pairing strategy comprising:
acquiring the patient's height data, weight data and neck width length data; acquiring from the storage unit the height data, weight data and neck width length data of multiple groups of users whose thyroids are normal; taking the absolute values of the differences between the patient's height, weight and neck width length data and those of each user with a normal thyroid, recording them respectively as the height absolute difference, the weight absolute difference and the neck width absolute difference, and adding the three to obtain a comparison total difference; selecting the thyroid ultrasonic image of the user corresponding to the minimum value among the multiple groups of comparison total differences, and recording it as the comparison image;
and overlapping the comparison image with the neck region, marking a region corresponding to the image region in the comparison image based on the position parameter of the image region in the neck region, and marking the region as a normal neck image.
4. A thyroid-cutting apparatus as claimed in claim 3, wherein the image analysis unit is configured with a first image processing strategy comprising:
carrying out pixelation processing on the normal neck image, and marking the normal neck image after the pixelation processing as a normal pixel image;
carrying out pixelation treatment on the thyroid image, and marking the thyroid image after the pixelation treatment as a thyroid pixel image;
acquiring the number of pixel points of a first row of a normal pixel image, and marking the number as K, wherein K is a positive integer;
dividing the normal pixel image into K columns based on the pixel points of the first row of the normal pixel image, and marking the K columns as normal columns 1 to K;
the thyroid pixel image is divided into K columns based on the pixel points of the first row of the thyroid pixel image, and is denoted as thyroid column 1 to thyroid column K.
5. The thyroid cutting apparatus of claim 4, wherein the image analysis unit is configured with a comparison grouping strategy comprising:
marking the normal column 1 and the thyroid column 1 as a first comparison group 1, and obtaining a first comparison group 2 to a first comparison group K based on the normal column 2 to the normal column K and the thyroid column 2 to the thyroid column K;
for any one of the first comparison groups 1 to K, acquiring the number of pixel points in a normal column, and marking the number as L, wherein L is a positive integer;
the pixel points in the normal columns are marked as normal pixel points 1 to L from top to bottom, the pixel value of each pixel point in the normal pixel points 1 to L is obtained, and the pixel values are marked as normal values 1 to L;
marking the pixel points of the thyroid column, from top to bottom, as thyroid pixel point 1 to thyroid pixel point L, obtaining the pixel value of each of thyroid pixel points 1 to L, and recording them as thyroid value 1 to thyroid value L;
normal value 1 and thyroid value 1 are recorded as second comparison group 1, and second comparison group 2 to second comparison group L are obtained based on normal value 2 to normal value L and thyroid value 2 to thyroid value L.
6. The thyroid-cutting device of claim 5, wherein the image analysis unit is configured with a first comparison strategy comprising:
for any one of second comparison groups 1 to L, the absolute value of the difference between the normal value and the thyroid value is recorded as the pixel difference; when the pixel difference is greater than or equal to the standard pixel difference, the second comparison group is recorded as an abnormal comparison group;
when the pixel difference is smaller than the standard pixel difference, the second comparison group is marked as a normal comparison group;
when the number of abnormal comparison groups in the second comparison groups 1 to L is greater than or equal to the standard abnormal number, marking thyroid columns in the first comparison groups corresponding to the second comparison groups 1 to L as abnormal thyroid columns;
when a run of the first number of consecutive abnormal comparison groups in second comparison groups 1 to L contains no normal comparison group, the run is marked as a first abnormal block;
when a run of the second number of consecutive abnormal comparison groups in second comparison groups 1 to L contains no normal comparison group, the run is marked as a second abnormal block, wherein the first abnormal blocks and the second abnormal blocks do not conflict with each other;
and when the number of first abnormal blocks in second comparison groups 1 to L is greater than or equal to the third number, or the number of second abnormal blocks is greater than or equal to the fourth number, the thyroid column in the first comparison group corresponding to second comparison groups 1 to L is marked as an abnormal thyroid column.
7. The thyroid-cutting device of claim 6, wherein the image analysis unit is configured with a second image processing strategy comprising:
acquiring all abnormal thyroid columns from the thyroid pixel image, marking, based on all abnormal comparison groups, the pixel points in the abnormal thyroid columns that correspond to those abnormal comparison groups, and recording the marked pixel points as abnormal points;
for any one of thyroid columns 2 to K-1 that contains abnormal points, acquiring the nearest thyroid column containing abnormal points on each side of that column, and marking these columns as adjacent abnormal columns;
for any abnormal point in the thyroid gland column, connecting the abnormal point with the nearest abnormal point in the adjacent abnormal column, and marking the connecting line as an abnormal line;
and acquiring the image formed after all abnormal lines in the thyroid pixel image are connected, and marking it as an abnormal block diagram.
8. The thyroid-cutting device of claim 7, wherein the image analysis unit is configured with a block-diagram analysis strategy comprising:
and acquiring the outermost abnormal lines of the abnormal block diagram, and recording the closed graph formed by connecting these outermost abnormal lines as a cutting graph.
9. The thyroid-cutting device of claim 8, wherein the block diagram analysis strategy further comprises:
placing the abnormal block diagram into a plane rectangular coordinate system, obtaining the minimum circumscribed rectangle of the abnormal block diagram with a minimum circumscribed rectangle algorithm, and marking it as the cutting rectangle;
the cutting rectangle is placed into the thyroid pixel image based on the position of the abnormal block diagram in the thyroid pixel image.
10. The thyroid-cutting device of claim 9, wherein the cutting module is configured with a cutting area selection strategy comprising:
and acquiring a thyroid pixel image, moving the cutting device to the area where the cutting rectangle is located based on the cutting rectangle in the thyroid pixel image, and marking the area where the cutting rectangle is located as a cutting area.
11. The thyroid-cutting device of claim 10, wherein the cutting module is configured with a cutting strategy comprising:
based on the cutting graph, the cutting device cuts along the cutting graph within the cutting area.
12. A thyroid cutting method, adapted for use with the thyroid cutting apparatus of any one of claims 1-11, comprising:
step S1, acquiring a thyroid ultrasonic image of a patient, and marking the thyroid ultrasonic image as a thyroid image;
step S2, analyzing the thyroid image, and framing out a cutting area;
and step S3, cutting the thyroid of the patient's neck based on the cutting area in the image analysis unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311215560.4A CN117152178B (en) | 2023-09-20 | 2023-09-20 | Thyroid gland cutting device and cutting method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117152178A true CN117152178A (en) | 2023-12-01 |
CN117152178B CN117152178B (en) | 2024-04-02 |
Family
ID=88904499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311215560.4A Active CN117152178B (en) | 2023-09-20 | 2023-09-20 | Thyroid gland cutting device and cutting method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117152178B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108364293A (en) * | 2018-04-10 | 2018-08-03 | 复旦大学附属肿瘤医院 | A kind of on-line training thyroid tumors Ultrasound Image Recognition Method and its device |
CN115239655A (en) * | 2022-07-15 | 2022-10-25 | 北京精康科技有限责任公司 | Thyroid ultrasonic image tumor segmentation and classification method and device |
CN116452464A (en) * | 2023-06-09 | 2023-07-18 | 天津市肿瘤医院(天津医科大学肿瘤医院) | Chest image enhancement processing method based on deep learning |
Also Published As
Publication number | Publication date |
---|---|
CN117152178B (en) | 2024-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109636808B (en) | Lung lobe segmentation method based on full convolution neural network | |
CN113112443B (en) | Method and device for segmenting ultrasonic image focus and computer equipment | |
US20090196475A1 (en) | Automatic mask design and registration and feature detection for computer-aided skin analysis | |
CN110599465B (en) | Image positioning method and device, computer equipment and storage medium | |
WO2013028762A1 (en) | Method and system for integrated radiological and pathological information for diagnosis, therapy selection, and monitoring | |
KR102349515B1 (en) | Tumor automatic segmentation based on deep learning in a medical image | |
EP3063736B1 (en) | Registration of tissue slice image | |
CN112712522A (en) | Automatic segmentation method for oral cancer epithelial tissue region of pathological image | |
CN117372439B (en) | Nuclear magnetism and CT fusion-based uterine lesion position identification method, system and medium | |
CN113989294A (en) | Cell segmentation and typing method, device, equipment and medium based on machine learning | |
KR20210020619A (en) | Abdominal organ status diagnosis based on abnominal organ volume change analysis using abnominal organ automatic segmentation | |
CN111798410A (en) | Cancer cell pathological grading method, device, equipment and medium based on deep learning model | |
CN117152178B (en) | Thyroid gland cutting device and cutting method | |
US20090310883A1 (en) | Image processing apparatus, method, and program | |
CN110060246B (en) | Image processing method, device and storage medium | |
CN113920114A (en) | Image processing method, image processing apparatus, computer device, storage medium, and program product | |
KR20160140194A (en) | Method and apparatus for detecting abnormality based on personalized analysis of PACS image | |
CN115227393A (en) | Puncture path planning method in liver tumor radio frequency ablation | |
JP6697274B2 (en) | Medical information processing apparatus, medical image diagnostic apparatus, medical information processing method, and medical information processing program | |
CN113823419A (en) | Operation process recording method, device, medium and computing equipment | |
CN110738664B (en) | Image positioning method and device, computer equipment and storage medium | |
KR102332472B1 (en) | Tumor automatic segmentation using deep learning based on dual window setting in a medical image | |
CN117557560A (en) | Method and system for identifying focus of lung nodule based on PET and CT image fusion | |
US20240037855A1 (en) | Method for generating three-dimensional prostate pathological image, and system therefor | |
CN113425266B (en) | Skin cancer screening system based on infrared imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||