CN116269507B - Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment - Google Patents


Info

Publication number
CN116269507B
CN116269507B (application CN202310580524.1A)
Authority
CN
China
Prior art keywords
image
hyperuricemia
gouty nephropathy
classification
prediction model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310580524.1A
Other languages
Chinese (zh)
Other versions
CN116269507A (en)
Inventor
郑敏
马立勇
李广涵
刘健
张波
武敬平
田艳
牟姗
李文歌
郑宏岩
牛宇政
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Japan Friendship Hospital
Original Assignee
China Japan Friendship Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Japan Friendship Hospital filed Critical China Japan Friendship Hospital
Priority to CN202310580524.1A priority Critical patent/CN116269507B/en
Publication of CN116269507A publication Critical patent/CN116269507A/en
Application granted granted Critical
Publication of CN116269507B publication Critical patent/CN116269507B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 - involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 - for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 - involving processing of medical diagnostic data
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Vascular Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The application discloses a classification method and device for hyperuricemia and gouty nephropathy, and an electronic device. The classification method comprises: acquiring a B-mode ultrasound image of the kidney region of a subject to be predicted, and performing gouty nephropathy prediction on the B-mode ultrasound image through a hyperuricemia and gouty nephropathy classification prediction model to obtain classification results for hyperuricemia and gouty nephropathy. Processing the B-mode ultrasound image with the classification prediction model thus makes it possible to conveniently and rapidly screen patients for hyperuricemia and to classify their gouty nephropathy, which facilitates the subsequent application of the model to individualized diagnosis in gouty nephropathy staging.

Description

Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment
Technical Field
The present application relates to the field of medical imaging, and in particular to a method and device for classifying hyperuricemia and gouty nephropathy, and to an electronic device.
Background
Gouty nephropathy, also known as urate nephropathy, is kidney damage caused by hyperuricemia resulting from excessive production or decreased excretion of uric acid in the blood. Its clinical manifestations may include uric acid stones, low-molecular-weight proteinuria, edema, nocturia, hypertension, hematuria, increased urinary uric acid, and impaired renal tubular function.
At present, the symptoms of early gouty nephropathy are not obvious; by the time a patient notices marked fatigue, anemia, or other symptoms and seeks medical care, the disease may already have passed its early stage, making later treatment more difficult. After a B-mode ultrasound image of the kidney region of a patient is acquired, being able to conveniently and rapidly screen for hyperuricemia, and then diagnose and classify gouty nephropathy, is therefore of great significance for the accurate diagnosis and treatment of gouty nephropathy.
Disclosure of Invention
Therefore, an object of the present application is to provide a classification method, a classification device, and an electronic device based on a hyperuricemia and gouty nephropathy classification prediction model.
According to an embodiment of the first aspect of the present application, a classification method based on a hyperuricemia and gouty nephropathy classification prediction model is provided, including: acquiring a B-mode ultrasound image of the kidney region of a subject to be predicted; inputting the B-mode ultrasound image into an embedding layer of the classification prediction model, partitioning the image into blocks through the embedding layer, determining the image feature vector and position vector corresponding to each of the resulting image blocks, and concatenating each block's image feature vector with its position vector to obtain a concatenated feature vector for each block; inputting the concatenated feature vectors of the image blocks into an encoding layer of the classification prediction model, so that the encoding layer fuses the image feature vectors of the blocks to obtain a feature image of the B-mode ultrasound image; and inputting the feature image into a classification layer of the classification prediction model to obtain the hyperuricemia and gouty nephropathy classification result for the B-mode ultrasound image.
Optionally, as a first possible implementation of the first aspect, the encoding layer includes an attention sub-layer and an encoding sub-layer, and obtaining the feature image of the B-mode ultrasound image includes: inputting the concatenated feature vectors of the image blocks into the attention sub-layer to determine, through the attention sub-layer, the attention weight corresponding to each image block; and, in the encoding sub-layer, performing weighted fusion of the concatenated feature vectors of the corresponding image blocks based on the attention weights, so as to obtain the feature image of the B-mode ultrasound image.
Optionally, as a second possible implementation of the first aspect, determining the attention weights through the attention sub-layer includes: in the attention sub-layer, determining the degree of association between every pair of image blocks from their concatenated feature vectors; and determining the attention weight of each image block from these degrees of association.
Optionally, as a third possible implementation of the first aspect, the method further includes: obtaining the first image size supported by the classification prediction model; determining the second image size of the B-mode ultrasound image; and, when the two sizes do not match, scaling the B-mode ultrasound image so that its size after scaling equals the first image size.
Optionally, as a fourth possible implementation of the first aspect, the classification prediction model is trained as follows: obtaining B-mode ultrasound images of the kidney regions of sample objects together with their corresponding hyperuricemia and gouty nephropathy classification results; and training the model with the B-mode ultrasound images of the sample objects as input and the corresponding classification results as output.
With the classification method based on the hyperuricemia and gouty nephropathy classification prediction model of the embodiment of the present application, a B-mode ultrasound image of the kidney region of the subject to be predicted is first acquired and input into the embedding layer of the model. The embedding layer partitions the image into blocks, determines the image feature vector and position vector of each resulting image block, and concatenates them to obtain a concatenated feature vector per block. The concatenated feature vectors are then input into the encoding layer, which fuses the image feature vectors of the blocks to obtain a feature image of the B-mode ultrasound image. Finally, the feature image is input into the classification layer to obtain the hyperuricemia and gouty nephropathy classification result for the B-mode ultrasound image. Processing the image through the model in this way makes it possible to conveniently and rapidly determine the classification results corresponding to the subject's B-mode ultrasound image, facilitating the subsequent application of the model to staged diagnosis and treatment of gouty nephropathy.
According to an embodiment of the second aspect of the present application, a hyperuricemia and gouty nephropathy classification apparatus is provided, comprising: an acquisition module for acquiring a B-mode ultrasound image of the kidney region of a subject to be predicted; an embedding-layer module for inputting the B-mode ultrasound image into the embedding layer of the classification prediction model, partitioning the image into blocks, determining the image feature vector and position vector of each resulting image block, and concatenating them to obtain a concatenated feature vector per block; an encoding-layer module for inputting the concatenated feature vectors into the encoding layer of the model, which fuses the image feature vectors of the blocks to obtain a feature image of the B-mode ultrasound image; and a classification-layer module for inputting the feature image into the classification layer of the model to obtain the hyperuricemia and gouty nephropathy classification result for the B-mode ultrasound image.
The classification apparatus of the embodiment of the present application operates in the same way: the B-mode ultrasound image of the kidney region of the subject to be predicted is acquired and passed through the embedding layer (blocking, feature and position vectors, concatenation), the encoding layer (fusion into a feature image), and the classification layer of the hyperuricemia and gouty nephropathy classification prediction model, yielding the classification result for the image. The model thus allows the hyperuricemia and gouty nephropathy classification results corresponding to the subject's B-mode ultrasound image to be determined conveniently and rapidly, facilitating the subsequent application of the model to staged diagnosis and treatment of gouty nephropathy.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flow chart of a classification method based on the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
Fig. 2 is a flow chart of another classification method based on the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
Fig. 3 is a schematic flow chart of training the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
Fig. 4 is another schematic flow chart of training the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a classification device based on the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary and intended for the purpose of explaining the present application and are not to be construed as limiting the present application.
A classification method, apparatus and electronic device based on classification prediction models of hyperuricemia and gouty nephropathy according to embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a classification method based on classification prediction models of hyperuricemia and gouty nephropathy according to an embodiment of the present application.
As shown in fig. 1, the method comprises the steps of:
step 101, acquiring a B-mode ultrasonic image of a kidney region of an object to be predicted.
The classification method based on the hyperuricemia and gouty nephropathy classification prediction model in this embodiment is performed by a classification device based on that model. The classification device may be implemented in software and/or hardware, and may itself be an electronic device or be configured within one.
The electronic device may be any device with computing capability, for example a personal computer, a mobile terminal, or a server; the mobile terminal may be, for example, a vehicle-mounted device, a mobile phone, a tablet computer, a personal digital assistant, or a wearable device with an operating system and a touch screen and/or display screen.
The classification device in this example is used for classifying and predicting hyperuricemia and gouty nephropathy.
The object to be predicted is a virtual object created for a medical object in the real world, which may be a patient (a person) or another animal such as a cat or dog. The virtual object may be represented by a patient identifier, such as a patient account number, a patient ID (a number or identity-card number), or a patient name.
In this example, a patient is taken as an example of the medical object.
In some examples, the kidney region of the patient corresponding to the object to be predicted can be scanned by an ultrasound image acquisition device to obtain a B-mode ultrasound image of that kidney region.
As an exemplary embodiment, in order for the input image to satisfy the requirements of the classification prediction model, after a B-mode ultrasound image of the kidney region of the subject is acquired, the first image size supported by the model is obtained and the second image size of the B-mode ultrasound image is determined; when the two sizes do not match, the B-mode ultrasound image is scaled so that its size after scaling equals the first image size.
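The size check and scaling step can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 224x224 target size is an assumption (borrowed from the worked example later in this description), and a nearest-neighbor resize stands in for whatever interpolation a real pipeline would use.

```python
import numpy as np

def resize_to_model_input(img: np.ndarray, target: tuple) -> np.ndarray:
    """Nearest-neighbor resize of an (H, W) grayscale image to target (H', W')."""
    h, w = img.shape
    th, tw = target
    if (h, w) == (th, tw):              # sizes already match: no scaling needed
        return img
    rows = np.arange(th) * h // th      # source row index for each target row
    cols = np.arange(tw) * w // tw      # source column index for each target column
    return img[rows[:, None], cols]

# A fake 300x400 B-mode image, scaled to a hypothetical 224x224 model input size.
img = np.random.rand(300, 400)
resized = resize_to_model_input(img, (224, 224))
print(resized.shape)  # (224, 224)
```

In practice a clinical pipeline would use a proper interpolation routine; the point here is only the size comparison followed by scaling.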
It should be understood that training of the classification prediction model is completed before the model is formally used; that is, the hyperuricemia and gouty nephropathy classification prediction model is pre-trained.
The hyperuricemia and gouty nephropathy classification prediction model in this example may be built on the Vision Transformer (ViT) model.
Step 102, inputting the B-mode ultrasound image into the embedding layer of the classification prediction model, partitioning the image into blocks through the embedding layer, determining the image feature vector and position vector corresponding to each of the resulting image blocks, and concatenating each block's image feature vector with its position vector to obtain a concatenated feature vector for each block.
It should be noted that, in the embodiment of the present application, the embedding layer partitions the B-mode ultrasound image into blocks of a preset size, thereby obtaining a plurality of image blocks.
The preset size is configured in the embedding layer in advance; for example, it may be 16x16. It can be set according to actual requirements, and this embodiment places no particular limitation on it.
For example, if the preset size of the embedding layer is 16x16 and the B-mode ultrasound image size accepted by the classification prediction model is 224x224, the embedding layer partitions the B-mode ultrasound image into 16x16 blocks, yielding 196 image blocks (224/16 = 14 blocks per side, and 14 x 14 = 196).
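The blocking arithmetic above can be sketched directly with array reshaping. This is an illustration of the partitioning step only, under the assumed 224x224 image and 16x16 preset size:

```python
import numpy as np

def split_into_blocks(img: np.ndarray, block: int) -> np.ndarray:
    """Partition an (H, W) image into non-overlapping (block, block) tiles.

    Returns an array of shape (num_blocks, block, block), in row-major block order."""
    h, w = img.shape
    assert h % block == 0 and w % block == 0, "image must divide evenly into blocks"
    tiles = img.reshape(h // block, block, w // block, block)
    return tiles.transpose(0, 2, 1, 3).reshape(-1, block, block)

img = np.zeros((224, 224))
blocks = split_into_blocks(img, 16)
print(blocks.shape)  # (196, 16, 16): 224/16 = 14 blocks per side, 14 * 14 = 196
```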
In some examples, each image block may be represented as a vector to obtain its image feature vector. Specifically, each image feature vector may be determined from the pixel values of all channels of the block; for example, for each image block, the pixel values of every channel may be flattened and combined into a single vector, giving the image feature vector of that block.
The position vector in this example is obtained by a vector representation (position encoding) of each image block's position within the B-mode ultrasound image.
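The embedding step described above, flattening each block into an image feature vector and concatenating a position vector, can be sketched as follows. The one-hot position encoding is an assumption for illustration; the text does not specify the form of the position vector:

```python
import numpy as np

def embed_blocks(blocks: np.ndarray) -> np.ndarray:
    """blocks: (N, block, block) single-channel tiles.

    Returns (N, block*block + N): each row is the flattened pixel values of one
    block (its image feature vector) concatenated with a one-hot position vector
    marking that block's index in the image (assumed encoding)."""
    n = blocks.shape[0]
    feats = blocks.reshape(n, -1)   # image feature vector: all pixel values, flattened
    pos = np.eye(n)                 # one-hot position encoding, one row per block
    return np.concatenate([feats, pos], axis=1)

blocks = np.random.rand(196, 16, 16)
vecs = embed_blocks(blocks)
print(vecs.shape)  # (196, 452): 16*16 = 256 pixel features + 196 position entries
```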
Step 103, inputting the concatenated feature vectors of the image blocks into the encoding layer of the classification prediction model, so that the encoding layer fuses the image feature vectors of the blocks to obtain the feature image of the B-mode ultrasound image.
Step 104, inputting the feature image into the classification layer of the classification prediction model to obtain the hyperuricemia and gouty nephropathy classification result for the B-mode ultrasound image.
The hyperuricemia classification results in this example include: hyperuricemia present, or hyperuricemia absent. The gouty nephropathy classification results include: no gouty nephropathy, or gouty nephropathy; gouty nephropathy may be further classified as early, middle, or end stage, or alternatively as stage 1, stage 2, stage 3, stage 4, or stage 5.
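As an illustration of the classification layer, the sketch below uses a linear head with softmax over an assumed label set. The label names merely echo the categories listed above; the actual output space and head architecture of the model are not specified in this description:

```python
import numpy as np

# Hypothetical label set, combining the hyperuricemia and 5-stage categories above.
LABELS = ["no hyperuricemia", "hyperuricemia without gouty nephropathy",
          "gouty nephropathy stage 1", "gouty nephropathy stage 2",
          "gouty nephropathy stage 3", "gouty nephropathy stage 4",
          "gouty nephropathy stage 5"]

def classify(feature: np.ndarray, W: np.ndarray, b: np.ndarray) -> str:
    """Linear layer + softmax over the assumed label set; returns the top label."""
    logits = feature @ W + b
    probs = np.exp(logits - logits.max())   # shift for numerical stability
    probs /= probs.sum()
    return LABELS[int(np.argmax(probs))]

rng = np.random.default_rng(0)
feat = rng.normal(size=64)                             # pooled feature from the encoder
W = rng.normal(size=(64, len(LABELS)))                 # untrained weights, illustration only
b = np.zeros(len(LABELS))
print(classify(feat, W, b))                            # one of the labels above
```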
It can be understood that at present a professional physician cannot directly diagnose the stage of gouty nephropathy from a B-mode ultrasound image. By processing the image through the classification prediction model, the gouty nephropathy stage corresponding to the image can be determined, which makes it convenient for the physician to subsequently reach a final diagnosis for the subject based on the model's output together with other medical examination results.
With the hyperuricemia and gouty nephropathy classification method described above, a B-mode ultrasound image of the kidney region of the subject to be predicted is first acquired and input into the embedding layer of the classification prediction model. The embedding layer partitions the image into blocks, determines the image feature vector and position vector of each resulting image block, and concatenates them to obtain a concatenated feature vector per block. The concatenated feature vectors are then input into the encoding layer, which fuses the image feature vectors of the blocks to obtain a feature image of the B-mode ultrasound image. Finally, the feature image is input into the classification layer to obtain the hyperuricemia and gouty nephropathy classification result. Processing the B-mode ultrasound image through the model in this way makes it possible to conveniently and rapidly determine the classification results corresponding to the subject's image, facilitating the subsequent application of these results to the staged diagnosis and treatment of hyperuricemia and gouty nephropathy.
In some exemplary embodiments, to clarify how the encoding layer of the classification prediction model obtains the feature image of the B-mode ultrasound image, the classification method of this embodiment is further described below with reference to fig. 2.
Fig. 2 is a flow chart of another classification method based on classification prediction models of hyperuricemia and gouty nephropathy according to an embodiment of the present application.
As shown in fig. 2, the method may include:
In step 201, the concatenated feature vectors corresponding to the image blocks are input into the attention sub-layer.
It should be understood that the encoding layer of the classification prediction model in this example includes an attention sub-layer and an encoding sub-layer.
In step 202, the attention weights corresponding to the image blocks are determined through the attention sub-layer.
In some exemplary embodiments, in the attention sub-layer, the degree of association between every pair of image blocks is determined from their concatenated feature vectors, and the attention weight of each image block is determined from these degrees of association.
In some examples, for any two of the image blocks, the degree of association between them may be computed from their respective concatenated feature vectors.
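The two steps above can be sketched as follows. The text specifies only "a degree of association between every two blocks" followed by per-block weights; the scaled dot product as the similarity function, and summing each block's associations before softmax normalization, are assumptions for illustration:

```python
import numpy as np

def attention_weights(vecs: np.ndarray) -> np.ndarray:
    """vecs: (N, D) concatenated feature vectors of the image blocks.

    Computes the degree of association between every pair of blocks as a scaled
    dot product, sums each block's associations, and softmax-normalizes the sums
    into one attention weight per block."""
    d = vecs.shape[1]
    assoc = vecs @ vecs.T / np.sqrt(d)   # (N, N) pairwise association degrees
    scores = assoc.sum(axis=1)           # total association of each block
    scores -= scores.max()               # shift for numerical stability
    w = np.exp(scores)
    return w / w.sum()                   # (N,) one weight per image block

vecs = np.random.rand(6, 8)
w = attention_weights(vecs)
print(w.shape, np.isclose(w.sum(), 1.0))  # (6,) True
```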
Step 203, in the encoding sub-layer, performing weighted fusion of the concatenated feature vectors of the corresponding image blocks based on the attention weights, so as to obtain the feature image of the B-mode ultrasound image.
In some examples, the concatenated feature vector of each image block is weighted by the block's attention weight, and the weighted vectors are then assembled according to the position encodings of the blocks within the B-mode ultrasound image to obtain the feature image.
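The weighted-fusion step can be sketched as follows, assuming one scalar attention weight per block and a row-major block grid as the "position" used for reassembly (both assumptions, since the text leaves the exact assembly unspecified):

```python
import numpy as np

def weighted_fusion(vecs: np.ndarray, weights: np.ndarray, grid: tuple) -> np.ndarray:
    """vecs: (N, D) concatenated feature vectors; weights: (N,) per-block attention
    weights; grid: (rows, cols) block layout in the original image, rows*cols == N.

    Weights each block's vector, then arranges the weighted vectors back into
    block-grid order to form the feature image of shape (rows, cols, D)."""
    weighted = vecs * weights[:, None]    # scale every block vector by its weight
    rows, cols = grid
    return weighted.reshape(rows, cols, -1)

vecs = np.ones((4, 3))
weights = np.array([0.1, 0.2, 0.3, 0.4])
feat_img = weighted_fusion(vecs, weights, (2, 2))
print(feat_img.shape)  # (2, 2, 3): a 2x2 block grid with 3 features per block
```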
In this example, the attention weights of the image blocks are determined accurately through the attention sub-layer, and in the encoding sub-layer the concatenated feature vectors of the corresponding image blocks are fused by weighting with those attention weights. The feature image of the B-mode ultrasound image can therefore be determined accurately.
In some exemplary embodiments, to clarify how the hyperuricemia and gouty nephropathy classification prediction model is trained, its training is further described below with reference to fig. 3.
Fig. 3 is a schematic flow chart of training the hyperuricemia and gouty nephropathy classification prediction model according to an embodiment of the present application.
as shown in fig. 3, the process of training the classification prediction model for hyperuricemia and gouty nephropathy may include:
step 301, obtaining a B-mode ultrasonic image of a kidney region of a sample object, and classifying results of hyperuricemia and gouty nephropathy corresponding to the sample object.
Wherein the sample object is a virtual object selected as a sample. Wherein the virtual object is a virtual object created for a medical object in the real world, which may be a patient (person), or other animal, such as a cat, dog, etc. The virtual object may appear as a patient identification. The patient identification may be a patient account number, a patient ID (number or identification card number), a patient name, or the like.
Step 302, taking a B-mode ultrasonic image of a kidney region of a sample object as input of a classification prediction model of hyperuricemia and gouty nephropathy.
Step 303, taking the classification result of the hyperuricemia and the gouty nephropathy corresponding to the sample object as the output of the classification prediction model of the hyperuricemia and the gouty nephropathy.
Step 304, training of a classification prediction model of hyperuricemia and gouty nephropathy is completed according to the B-type ultrasonic image of the kidney region of the sample object and the classification result of hyperuricemia and gouty nephropathy corresponding to the sample object.
In this example, the classification prediction model for hyperuricemia and gouty nephropathy is trained on a plurality of sample objects and their corresponding classification results of hyperuricemia and gouty nephropathy, yielding a trained model. The trained model can then conveniently classify gouty nephropathy from the B-mode ultrasound image of the kidney region of an object to be tested.
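Steps 301 to 304 amount to a standard supervised-training loop. The sketch below substitutes a NumPy logistic-regression classifier for the ViT model so the loop stays self-contained (the flattened-pixel features, learning rate, and epoch count are illustrative assumptions):

```python
import numpy as np

def train_classifier(images, labels, epochs=200, lr=0.5):
    """Fit a logistic-regression stand-in: images are the model input
    (step 302), labels the expected output (step 303)."""
    X = np.array([img.ravel() for img in images], dtype=float)  # flatten each image
    y = np.asarray(labels, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):                      # step 304: iterate until trained
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
        grad = p - y                             # cross-entropy gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def predict(w, b, image):
    return int(1.0 / (1.0 + np.exp(-(image.ravel() @ w + b))) > 0.5)

# toy data: "images" whose mean intensity separates the two classes
rng = np.random.default_rng(0)
imgs = [rng.normal(c, 0.1, size=(4, 4)) for c in (0.0, 1.0) for _ in range(10)]
labs = [0] * 10 + [1] * 10
w, b = train_classifier(imgs, labs)
```

In the patent's setting the stand-in classifier would be replaced by the ViT model and the toy arrays by labelled B-mode ultrasound images.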
For clarity, the process of training the classification prediction model for hyperuricemia and gouty nephropathy in this embodiment is described below as an example with reference to fig. 4.
Fig. 4 is another schematic flow chart of training a classification prediction model for hyperuricemia and gouty nephropathy according to an embodiment of the present application.
As shown in fig. 4, the process of training the classification prediction model for hyperuricemia and gouty nephropathy is as follows:
Step 401, obtaining a preset number of virtual objects in each of the 5 stages of confirmed hyperuricemia and gouty nephropathy, obtaining a preset number of virtual objects in a healthy kidney state, and obtaining B-mode ultrasound images of the kidney regions of all obtained virtual objects.
The preset number is set according to requirements; for example, it may be 100 or 200. Its value is not specifically limited in this embodiment and may be set according to actual requirements.
And 402, labeling the B-mode ultrasonic images of the kidney areas of all the virtual objects, and generating a data set according to the B-mode ultrasonic images of the kidney areas of all the virtual objects and the corresponding classification labeling results of hyperuricemia and gouty nephropathy.
Step 403, taking the data with the first preset proportion in the data set as a training data set, and taking the data with the second preset proportion in the data set as a test data set.
It is understood that the first preset proportion and the second preset proportion sum to 1 and are set in advance when training the classification prediction model for hyperuricemia and gouty nephropathy. For example, the first preset proportion may be 80% and the second preset proportion 20%; alternatively, the first may be 75% and the second 25%. The values of the two proportions may be preset according to actual requirements, and this embodiment is not limited in this respect.
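A minimal sketch of step 403's split, assuming an in-memory list of labelled samples and a first preset proportion of 80% (the function name and fixed seed are illustrative assumptions):

```python
import random

def split_dataset(dataset, first_proportion=0.8, seed=42):
    """Shuffle the labelled data set, then take the first preset
    proportion as training data and the remainder as test data."""
    data = list(dataset)
    random.Random(seed).shuffle(data)            # deterministic shuffle
    cut = int(len(data) * first_proportion)
    return data[:cut], data[cut:]                # (training set, test set)

samples = [(f"image_{i}", i % 5) for i in range(100)]  # 100 labelled images
train_set, test_set = split_dataset(samples)
```

Since the two proportions sum to 1, the test set is simply the remainder after the training cut.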
Step 404, training an initial ViT (Vision Transformer) model on the training data set to obtain the classification prediction model for hyperuricemia and gouty nephropathy.
And step 405, inputting the B-type ultrasonic image in the test data set into a classification prediction model of hyperuricemia and gouty nephropathy to obtain classification prediction results of hyperuricemia and gouty nephropathy corresponding to the B-type ultrasonic image.
Step 406, determining preset performance indexes of the classification prediction model of hyperuricemia and gouty nephropathy based on the classification prediction result of the hyperuricemia and gouty nephropathy corresponding to the B-type ultrasonic image and the classification labeling result of the hyperuricemia and gouty nephropathy corresponding to the B-type ultrasonic image.
The preset performance indexes can comprise classification accuracy rate indexes, specificity indexes, sensitivity indexes and the like.
The index value corresponding to the classification accuracy index is calculated as: accuracy = (TP + TN) / N.
The index value of the sensitivity index is calculated as: sensitivity = TP / (TP + FN).
The index value of the specificity index is calculated as: specificity = TN / (TN + FP).
TP represents the number of samples in the test data set that are actually positive examples and are judged to be positive examples by the classification prediction model of hyperuricemia and gouty nephropathy; TN represents the number of samples that are actually negative examples and are judged to be negative examples by the model; FP represents the number of samples that are actually negative examples but are judged to be positive examples by the model; FN represents the number of samples that are actually positive examples but are judged to be negative examples by the model; N is the total number of test samples in the test data set. The closer each index is to 1, the better the performance of the method.
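The three index values can be computed directly from the TP/TN/FP/FN counts defined above (with N = TP + TN + FP + FN); a small sketch with illustrative counts:

```python
def classification_indexes(tp, tn, fp, fn):
    """Accuracy, sensitivity and specificity from the confusion counts
    defined in the text; each value lies in [0, 1], closer to 1 is better."""
    n = tp + tn + fp + fn                 # total number of test samples
    accuracy = (tp + tn) / n
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    return accuracy, sensitivity, specificity

acc, sen, spe = classification_indexes(tp=40, tn=45, fp=5, fn=10)
```

The counts here are made up for illustration; in the patent's workflow they would come from comparing model predictions with the labelled test data set.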
In some examples, the predictive performance indicators for each gouty kidney disease stage obtained from testing the classification predictive model for hyperuricemia and gouty kidney disease based on the test data set are shown in table 1.
Table 1 example predictive performance on stage of gouty nephropathy
Based on the table, it can be seen that the classification prediction model of hyperuricemia and gouty nephropathy trained by the example can complete the prediction of gouty nephropathy stage, and the accuracy is over 80%.
Here, the gouty nephropathy in fig. 4 refers to chronic gouty nephropathy.
Step 407, determining the classification prediction model of hyperuricemia and gouty nephropathy as an effective model under the condition that the preset performance index of the classification prediction model of hyperuricemia and gouty nephropathy meets the preset requirement.
In some examples, where the preset performance indexes include the three indexes above (classification accuracy, specificity, and sensitivity), it may be determined whether each index value is greater than its corresponding preset threshold; if all three are, the classification prediction model of hyperuricemia and gouty nephropathy is determined to be an effective model.
The preset thresholds corresponding to the three performance indicators may be the same or different, which is not specifically limited in this embodiment.
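Step 407's check can be sketched as comparing each index value against its own preset threshold (the threshold values below are illustrative assumptions; as noted, they may be equal or differ per index):

```python
def is_effective_model(index_values, thresholds):
    """Return True only if every performance index exceeds its threshold."""
    return all(index_values[name] > thresholds[name] for name in thresholds)

values = {"accuracy": 0.85, "sensitivity": 0.80, "specificity": 0.90}
limits = {"accuracy": 0.80, "sensitivity": 0.75, "specificity": 0.75}
effective = is_effective_model(values, limits)
```

A model failing any single index would not be marked effective, matching the "each greater than" condition in the text.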
It can be understood that, when the classification prediction model of hyperuricemia and gouty nephropathy is an effective model, it can be used to process the B-mode ultrasound image of an object to be predicted and accurately determine the corresponding classification result of hyperuricemia and gouty nephropathy.
Fig. 5 is a schematic structural diagram of a classification device based on classification prediction models of hyperuricemia and gouty nephropathy according to an embodiment of the present application.
As shown in fig. 5, the classification device based on the classification prediction model of hyperuricemia and gouty nephropathy comprises: an acquisition module 51, an embedding layer module 52, an encoding layer module 53, a classification layer module 54, wherein:
an acquisition module 51 is configured to acquire a B-mode ultrasound image of a kidney region of the object to be predicted.
The embedding layer module 52 is configured to input the B-mode ultrasound image to an embedding layer in the classification prediction model for hyperuricemia and gouty nephropathy, perform a blocking process on the B-mode ultrasound image through the embedding layer, determine image feature vectors and position vectors corresponding to the image blocks obtained by the blocking, and perform a stitching process on the image feature vectors and the position vectors corresponding to the image blocks, respectively, so as to obtain stitched feature vectors corresponding to the image blocks.
The encoding layer module 53 is configured to input the spliced feature vectors corresponding to the plurality of image blocks into an encoding layer in the classification prediction model for hyperuricemia and gouty nephropathy, so as to perform fusion processing on the image feature vectors corresponding to the plurality of image blocks through the encoding layer, so as to obtain a feature image of the B-type ultrasound image.
The classification layer module 54 is configured to input the feature image to a classification layer in a classification prediction model of hyperuricemia and gouty nephropathy, so as to obtain classification results of hyperuricemia and gouty nephropathy of the B-mode ultrasound image.
Further, the coding layer module 53 includes: an attention sub-layer unit 531 and a coding sub-layer unit 532.
And the attention sub-layer unit 531 is configured to input the spliced feature vectors corresponding to the image blocks into an attention sub-layer, so as to determine the attention weights corresponding to the image blocks through the attention sub-layer.
The encoding sublayer unit 532 is configured to perform weighted fusion processing on the spliced feature vectors corresponding to the corresponding image blocks based on the attention weights corresponding to the image blocks, so as to obtain a feature image of the B-type ultrasound image.
In one embodiment of the present application, the attention sub-layer unit 531 is specifically configured to:
In the attention sublayer, determining the association degree between every two image blocks according to the splicing feature vectors corresponding to the image blocks;
and determining the attention weight corresponding to each image block according to the association degree.
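A minimal sketch of this two-step computation, using dot products of the stitched feature vectors as the pairwise association degree and a softmax to turn aggregated association into per-block weights (a common attention formulation; the patent does not fix the exact functions, so this is an assumption):

```python
import numpy as np

def attention_weights(stitched_vecs):
    """stitched_vecs: (num_blocks, dim).
    1) association degree between every two image blocks (scaled dot product);
    2) attention weight per block via softmax over each block's total
       association with all blocks."""
    dim = stitched_vecs.shape[1]
    assoc = stitched_vecs @ stitched_vecs.T / np.sqrt(dim)  # (num_blocks, num_blocks)
    score = assoc.sum(axis=1)                 # aggregate association per block
    score -= score.max()                      # numerical stability for exp
    w = np.exp(score)
    return w / w.sum()                        # weights sum to 1

vecs = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
weights = attention_weights(vecs)
```

Blocks that associate strongly with many others receive larger weights, which is the behaviour the attention sub-layer relies on.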
In one embodiment of the present application, the classification device based on the classification prediction model of hyperuricemia and gouty nephropathy may further include:
the image size processing module is used for acquiring a first image size supported by the classification prediction model of hyperuricemia and gouty nephropathy; determining a second image size of the B-mode ultrasound image; in the case where the first image size and the second image size are not identical, the B-mode ultrasound image is subjected to scaling processing such that the image size of the B-mode ultrasound image after the scaling processing is the same as the first image size.
In one embodiment of the present application, the classification prediction model for hyperuricemia and gouty nephropathy is trained as follows: B-mode ultrasound images of the kidney regions of sample objects and the corresponding actual classification results of hyperuricemia and gouty nephropathy are obtained; the B-mode ultrasound images are used as the input of the classification prediction model, the actual classification results are used as its output, and the classification prediction model for hyperuricemia and gouty nephropathy is trained.
It should be noted that the foregoing explanation of the method embodiment is also applicable to the apparatus of this embodiment, and will not be repeated here.
According to the hyperuricemia and gouty nephropathy classification device, a B-mode ultrasound image of the kidney region of an object to be predicted is first acquired and input to the embedding layer in the classification prediction model for hyperuricemia and gouty nephropathy. The embedding layer performs blocking processing on the image, determines the image feature vectors and position vectors of the resulting image blocks, and stitches each block's image feature vector with its position vector to obtain the stitched feature vectors. These stitched feature vectors are then input to the coding layer in the model, which fuses the image feature vectors of the image blocks to obtain a feature image of the B-mode ultrasound image. Finally, the feature image is input to the classification layer in the model to obtain the classification result of hyperuricemia and gouty nephropathy for the B-mode ultrasound image. The B-mode ultrasound image is thus processed entirely by the classification prediction model, so the classification result for the object to be detected can be determined conveniently and rapidly, facilitating subsequent application of the model in the classification, diagnosis, and treatment of hyperuricemia and gouty nephropathy.
The embodiment of the application also provides electronic equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the classification method based on the classification prediction model of hyperuricemia and gouty nephropathy of any one of the embodiments is realized when the processor executes the computer program.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
As shown in fig. 6, the electronic device 600 may include: memory 601, processor 602, communication interface 603, bus architecture 604, wherein memory 601 is used for storing computer-executable instructions for executing the present application and is controlled by processor 602 for execution. The processor 602 is configured to execute computer-executable instructions stored in the memory 601, thereby implementing a classification method for hyperuricemia and gouty nephropathy provided in the above-described embodiments of the present application.
The communication interface 603, the processor 602, and the memory 601 may be connected to each other through a bus architecture 604; the bus architecture 604 may be a peripheral component interconnect (peripheral component interconnect, PCI) bus, an extended industry standard architecture (extended industry standard architecture, EISA) bus, or the like. The bus architecture 604 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean that there is only one bus or only one type of bus.
The processor 602 may be a CPU, microprocessor, ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application.
The communication interface 603 uses any transceiver-like means for communicating with other devices or communication networks, such as ethernet, radio access network (radio access network, RAN), wireless local area network (wireless local area networks, WLAN), wired access network, etc.
The memory 601 may be, but is not limited to, a ROM or other type of static storage device that can store static information and instructions, a RAM or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor through the bus architecture 604. The memory may also be integrated with the processor.
The electronic device exists in a variety of forms including, but not limited to:
(1) A mobile communication device: such devices are characterized by mobile communication capabilities and are primarily targeted to provide voice and data communications. Such terminals include: smart phones (e.g., iPhone), multimedia phones, functional phones, and low-end phones, etc.
(2) Ultra mobile personal computer device: such devices are in the category of personal computers, having computing and processing functions, and generally also having mobile internet access characteristics. Such terminals include: PDA, MID, and UMPC devices, etc., such as iPad.
(3) Portable entertainment device: such devices may display and play multimedia content. The device comprises: audio, video players (e.g., iPod), palm game consoles, electronic books, and smart toys and portable car navigation devices.
(4) And (3) a server: the configuration of the server includes a processor, a hard disk, a memory, a system bus, and the like, and the server is similar to a general computer architecture, but is required to provide highly reliable services, and thus has high requirements in terms of processing capacity, stability, reliability, security, scalability, manageability, and the like.
(5) Other electronic devices with data interaction functions.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" is at least two, such as two, three, etc., unless explicitly defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, and the program may be stored in a computer readable storage medium, where the program when executed includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented as software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives, and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.

Claims (6)

1. A classification method based on classification prediction models of hyperuricemia and gouty nephropathy, the method comprising:
acquiring a B-type ultrasonic image of a kidney region of an object to be predicted;
inputting the B-type ultrasonic image into an embedding layer in the classification prediction model of hyperuricemia and gouty nephropathy, performing blocking processing on the B-type ultrasonic image through the embedding layer, determining image feature vectors and position vectors corresponding to a plurality of image blocks obtained by blocking, and performing splicing processing on the image feature vectors and the position vectors corresponding to the image blocks respectively to obtain spliced feature vectors corresponding to the image blocks;
inputting the spliced feature vectors corresponding to the image blocks into a coding layer in the classification prediction model of hyperuricemia and gouty nephropathy, so as to perform fusion processing on the image feature vectors corresponding to the image blocks through the coding layer, and obtain a feature image of the B-type ultrasonic image;
Inputting the characteristic image into a classification layer in the classification prediction model of hyperuricemia and gouty nephropathy to obtain a classification result of hyperuricemia and gouty nephropathy of the B-type ultrasonic image;
the coding layer includes: an attention sub-layer and a coding sub-layer; and the fusion processing on the image feature vectors corresponding to the image blocks through the coding layer to obtain the feature image of the B-type ultrasonic image includes:
inputting the spliced feature vectors corresponding to the image blocks into the attention sub-layer so as to determine the attention weights corresponding to the image blocks through the attention sub-layer;
in the coding sublayer, carrying out weighted fusion processing on spliced feature vectors corresponding to corresponding image blocks based on the attention weights corresponding to the image blocks so as to obtain a feature image of the B-type ultrasonic image;
the inputting the spliced feature vectors corresponding to the image blocks into the attention sub-layer to determine the attention weights corresponding to the image blocks through the attention sub-layer, including:
In the attention sublayer, determining the association degree between every two image blocks according to the splicing feature vectors corresponding to the image blocks;
and determining the attention weight corresponding to each image block according to the association degree.
2. The method according to claim 1, wherein the method further comprises:
acquiring a first image size supported by the classification prediction model of hyperuricemia and gouty nephropathy;
determining a second image size of the B-mode ultrasound image;
and in the case that the first image size and the second image size are not consistent, performing scaling processing on the B-mode ultrasonic image so that the image size of the B-mode ultrasonic image after scaling processing is the same as the first image size.
3. The method according to any one of claims 1-2, wherein the classification prediction model for hyperuricemia and gouty nephropathy is trained in the following manner:
b ultrasonic images of kidney areas of sample objects and corresponding classification results of hyperuricemia and gouty nephropathy are obtained;
and taking the B-type ultrasonic image of the kidney region of the sample object as input of the classification prediction model of hyperuricemia and gouty nephropathy, taking the classification result of the hyperuricemia and gouty nephropathy corresponding to the sample object as output of the classification prediction model of hyperuricemia and gouty nephropathy, and training the classification prediction model of hyperuricemia and gouty nephropathy.
4. A classification device based on classification prediction models of hyperuricemia and gouty nephropathy, the device comprising:
the acquisition module is used for acquiring a B-type ultrasonic image of the kidney region of the object to be predicted;
the embedding layer module is used for inputting the B-type ultrasonic image into an embedding layer in the classification prediction model of hyperuricemia and gouty nephropathy, carrying out blocking processing on the B-type ultrasonic image through the embedding layer, determining image feature vectors and position vectors corresponding to a plurality of image blocks obtained by blocking, and respectively carrying out splicing processing on the image feature vectors and the position vectors corresponding to the image blocks so as to obtain spliced feature vectors corresponding to the image blocks;
the coding layer module is used for inputting the spliced feature vectors corresponding to the image blocks into a coding layer in the classification prediction model of hyperuricemia and gouty nephropathy so as to perform fusion processing on the image feature vectors corresponding to the image blocks through the coding layer to obtain a feature image of the B-type ultrasonic image;
the classification layer module is used for inputting the characteristic image into a classification layer in the hyperuricemia and gouty nephropathy classification prediction model so as to obtain a hyperuricemia and gouty nephropathy classification result of the B-type ultrasonic image;
The coding layer includes: an attention sub-layer and a coding sub-layer; and the fusion processing on the image feature vectors corresponding to the image blocks through the coding layer to obtain the feature image of the B-type ultrasonic image includes:
inputting the spliced feature vectors corresponding to the image blocks into the attention sub-layer so as to determine the attention weights corresponding to the image blocks through the attention sub-layer;
in the coding sublayer, carrying out weighted fusion processing on spliced feature vectors corresponding to corresponding image blocks based on the attention weights corresponding to the image blocks so as to obtain a feature image of the B-type ultrasonic image;
the inputting the spliced feature vectors corresponding to the image blocks into the attention sub-layer to determine the attention weights corresponding to the image blocks through the attention sub-layer, including:
in the attention sublayer, determining the association degree between every two image blocks according to the splicing feature vectors corresponding to the image blocks;
And determining the attention weight corresponding to each image block according to the association degree.
5. The device of claim 4, wherein the classification prediction model for hyperuricemia and gouty nephropathy is trained in the following manner:
b ultrasonic images of kidney areas of sample objects and corresponding classification results of hyperuricemia and gouty nephropathy are obtained;
and taking the B-type ultrasonic image of the kidney region of the sample object as input of the classification prediction model of hyperuricemia and gouty nephropathy, taking the classification result of hyperuricemia and gouty nephropathy as output of the classification prediction model of hyperuricemia and gouty nephropathy, and training the classification prediction model of hyperuricemia and gouty nephropathy.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any one of claims 1 to 3 when the computer program is executed.
CN202310580524.1A 2023-05-23 2023-05-23 Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment Active CN116269507B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310580524.1A CN116269507B (en) 2023-05-23 2023-05-23 Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment

Publications (2)

Publication Number Publication Date
CN116269507A CN116269507A (en) 2023-06-23
CN116269507B true CN116269507B (en) 2023-07-25

Family

ID=86818957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310580524.1A Active CN116269507B (en) 2023-05-23 2023-05-23 Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment

Country Status (1)

Country Link
CN (1) CN116269507B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020257482A1 (en) * 2019-06-19 Seno Medical Instruments, Inc. Method and system for managing feature reading and scoring in ultrasound and/or optoacoustic images
CN112754522A (en) * 2019-11-01 2021-05-07 深圳迈瑞生物医疗电子股份有限公司 Doppler calculus imaging method and ultrasonic imaging device
CN112819773A (en) * 2021-01-28 2021-05-18 清华大学 Ultrasonic image quantitative evaluation method
WO2021125950A1 (en) * 2019-12-17 2021-06-24 Universiteit Maastricht Image data processing method, method of training a machine learning data processing model and image processing system
CN113842166A (en) * 2021-10-25 2021-12-28 上海交通大学医学院 Ultrasonic image acquisition method based on ultrasonic imaging equipment and related device
CN216602913U (en) * 2021-12-10 2022-05-27 浙江省人民医院 Hyperuricemia patient joint ultrasonic screening device
CN115120262A (en) * 2021-03-24 2022-09-30 陈海冰 Identification device based on ultrasonic image
CN115462836A (en) * 2022-09-26 2022-12-13 河南中医药大学第一附属医院 Obstetrical and gynecological clinical prenatal monitoring system for pregnant women

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6736864B2 (en) * 2015-10-09 2020-08-05 コニカミノルタ株式会社 Ultrasound diagnostic imaging device

Also Published As

Publication number Publication date
CN116269507A (en) 2023-06-23

Similar Documents

Publication Publication Date Title
JP7085062B2 (en) Image segmentation methods, equipment, computer equipment and computer programs
CN110807788B (en) Medical image processing method, medical image processing device, electronic equipment and computer storage medium
CN110310256B (en) Coronary stenosis detection method, coronary stenosis detection device, computer equipment and storage medium
CN108846829B (en) Lesion site recognition device, computer device, and readable storage medium
JP6474946B1 (en) Image analysis result providing system, image analysis result providing method, and program
CN109102501B (en) Joint image processing method and image processing equipment
CN111951276A (en) Image segmentation method and device, computer equipment and storage medium
US11361435B2 (en) Processing fundus images using machine learning models to generate blood-related predictions
CN112333165A (en) Identity authentication method, device, equipment and system
CN116269507B (en) Method and device for classifying hyperuricemia and gouty nephropathy and electronic equipment
KR20220039881A (en) Method for evaluating diagnostic ablility of an expert, artificial intelligence based method for assisting diagnosis and device using the same
CN116825236A (en) Method, device, equipment and medium for generating drug molecules of protein targets
CN112101114A (en) Video target detection method, device, equipment and storage medium
CN111403032A (en) Child brain development level assessment method, system and storage device
CN111899848B (en) Image recognition method and device
CN113705595A (en) Method, device and storage medium for predicting degree of abnormal cell metastasis
CN116342986B (en) Model training method, target organ segmentation method and related products
CN115089112B (en) Post-stroke cognitive impairment risk assessment model building method and device and electronic equipment
CN113192031A (en) Blood vessel analysis method, blood vessel analysis device, computer equipment and storage medium
CN115601303A (en) Method and device for evaluating activity of rheumatoid arthritis, electronic device, and medium
CN116824213A (en) Muck truck mud detection method and device based on multi-view feature fusion
CN110853012B (en) Method, apparatus and computer storage medium for obtaining cardiac parameters
CN116259072B (en) Animal identification method, device, equipment and storage medium
CN111310669A (en) Real-time measuring method and device for head circumference of fetus
CN117079825B (en) Disease occurrence probability prediction method and disease occurrence probability determination system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant