WO2023030426A1 - Polyp recognition method and apparatus, medium and device

Publication number: WO2023030426A1
Application number: PCT/CN2022/116425
Authority: WIPO (PCT)
Prior art keywords: polyp, target, image, pixel, probability
Other languages: English (en), Chinese (zh)
Inventors: 边成, 李剑, 杨志雄
Applicant: 北京字节跳动网络技术有限公司 (Beijing ByteDance Network Technology Co., Ltd.)

Classifications

    • G06T7/0012: Biomedical image inspection (G PHYSICS; G06 COMPUTING; G06T IMAGE DATA PROCESSING OR GENERATION; G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting (G06F ELECTRIC DIGITAL DATA PROCESSING; G06F18/00 Pattern recognition; G06F18/20 Analysing; G06F18/21 Design or setup of recognition systems or techniques)
    • G06T2207/10068: Endoscopic image (G06T2207/00 Indexing scheme for image analysis or image enhancement; G06T2207/10 Image acquisition modality)
    • G06T2207/30032: Colon polyp (G06T2207/30 Subject of image; G06T2207/30004 Biomedical image processing; G06T2207/30028 Colon; Small intestine)

Definitions

  • The present disclosure relates to the field of image processing, and in particular to a polyp identification method, apparatus, medium and device.
  • Endoscopes are widely used for colon screening and polyp detection, but detection accuracy largely depends on the experience of the endoscopist. Because the characteristics of polyps are difficult to identify and many polyps are small, the missed-detection rate of polyp detection is relatively high, which greatly increases the difficulty of early polyp screening.
  • Deep learning can be used to train a model for use in a computer-aided diagnosis system for polyp identification and segmentation.
  • However, when an input differs from the data the model was trained on, the accuracy of the results output by the model drops greatly.
  • Moreover, usually only the recognition result of the model can be output; it is difficult to determine the accuracy of that result, so the user cannot judge its reliability.
  • In a first aspect, the present disclosure provides a polyp identification method, the method comprising: receiving an image of a polyp to be identified; obtaining, according to the polyp image and a polyp recognition model, feature maps of the polyp image corresponding to multiple output nodes of a target feature layer of the model;
  • performing multiple samplings on the multiple feature maps to obtain multiple target feature map sets, wherein each target feature map set includes the target feature maps obtained in one sampling of the multiple feature maps;
  • for each target feature map set, determining the polyp recognition probability corresponding to the polyp image according to each target feature map in the set, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image;
  • determining, according to the polyp recognition probabilities determined under the target feature map sets, a target recognition result of the polyp image and an uncertainty measure corresponding to that result.
  • In a second aspect, the present disclosure provides a polyp identification device, the device comprising:
  • a receiving module configured to receive an image of a polyp to be identified;
  • a processing module configured to obtain, according to the polyp image and a polyp recognition model, feature maps of the polyp image corresponding to multiple output nodes of a target feature layer of the model;
  • a sampling module configured to perform multiple samplings on the multiple feature maps to obtain multiple target feature map sets, wherein each target feature map set includes the target feature maps obtained in one sampling of the multiple feature maps;
  • a first determining module configured to determine, for each target feature map set, the polyp recognition probability corresponding to the polyp image according to each target feature map in the set, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image;
  • a second determining module configured to determine, according to the polyp recognition probabilities determined under the target feature map sets, a target recognition result of the polyp image and an uncertainty measure corresponding to that result.
  • In a third aspect, the present disclosure provides a computer-readable medium on which a computer program is stored; when the program is executed by a processing device, the steps of the method described in the first aspect are implemented.
  • In a fourth aspect, an electronic device is provided, including: a storage device on which a computer program is stored; and a processing device configured to execute the computer program in the storage device to implement the steps of the method in the first aspect.
  • With the above technical solution, multiple feature maps corresponding to the polyp image are obtained from the polyp image and the polyp recognition model; the feature maps are then sampled multiple times, and the final recognition result and its corresponding uncertainty measure are determined from the sampled feature maps. Thus, on the basis of an existing polyp identification model, the volatility of the identification result across the multiple samplings yields an uncertainty measure, so that the user can be shown the uncertainty of the recognition result alongside the result itself.
  • The user can then act on the identification result based on its uncertainty measure; for example, results with a high uncertainty measure can be identified manually, further ensuring the accuracy of polyp identification.
  • This provides effective and accurate data support for decisions based on the recognition results and improves the user experience.
  • FIG. 1 is a flowchart of a polyp identification method provided according to an embodiment of the present disclosure
  • Fig. 2 is a schematic structural diagram of a polyp recognition model provided according to an embodiment of the present disclosure
  • Fig. 3 is a schematic structural diagram of an encoder in a polyp recognition model
  • Fig. 4 is a block diagram of a polyp identification device provided according to an embodiment of the present disclosure.
  • FIG. 5 shows a schematic structural diagram of an electronic device suitable for implementing the embodiments of the present disclosure.
  • the term “comprise” and its variations are open-ended, ie “including but not limited to”.
  • the term “based on” is “based at least in part on”.
  • the term “one embodiment” means “at least one embodiment”; the term “another embodiment” means “at least one further embodiment”; the term “some embodiments” means “at least some embodiments.” Relevant definitions of other terms will be given in the description below.
  • An existing trained polyp recognition model can be used to recognize polyp images. Such recognition is performed by a function with a large number of parameters fixed by training; that is, once an image to be recognized is input into the polyp recognition model, the model will give an output result no matter what the input is, and it is difficult for the user to determine the accuracy and reliability of that output.
  • the present disclosure provides the following embodiments.
  • FIG. 1 is a flowchart of a polyp identification method provided according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes:
  • In step 11, an image of a polyp to be identified is received.
  • the polyp image may be an acquired endoscopic image (such as a gastroscopy image, a colonoscopy image, etc.).
  • For example, data collection can be performed on patients to obtain detection data containing polyps. To ensure uniform processing of polyp images, the detection data can then be standardized; for example, the white-light endoscopic images of polyps contained in the detection data can be taken as the polyp images.
  • Furthermore, the resolution and size of the polyp images can be standardized to obtain polyp images of uniform size, which facilitates the subsequent identification process.
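The standardization step above can be sketched as follows. This is a minimal illustration only, assuming a NumPy image array, nearest-neighbour resizing, a 224x224 target size and [0, 1] scaling; none of these specifics are fixed by the disclosure:

```python
import numpy as np

def standardize_polyp_image(image: np.ndarray, size: int = 224) -> np.ndarray:
    """Resize an H x W x C endoscopic frame to a uniform size x size x C
    array (nearest-neighbour, for illustration) and scale values to [0, 1]."""
    h, w, _ = image.shape
    rows = np.arange(size) * h // size   # source row index for each output row
    cols = np.arange(size) * w // size   # source column index for each output column
    resized = image[rows][:, cols].astype(np.float32)
    return resized / 255.0

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
polyp_image = standardize_polyp_image(frame)
print(polyp_image.shape)  # (224, 224, 3)
```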
  • In step 12, feature maps of the polyp image corresponding to the multiple output nodes of the target feature layer of the polyp recognition model are obtained according to the polyp image and the polyp recognition model.
  • the polyp recognition model may be a transformer model, as shown in FIG. 2 , which may include a linear projection layer 21 , an encoder 22 and an output layer 23 .
  • the encoder can be the Encoder in the Vision Transformer, and the linear projection layer (Linear Projection) can be understood as a fully connected layer.
  • The encoder includes multi-head self-attention (MSA), normalization (Norm) and a classification layer; the classification layer can be an MLP (multilayer perceptron).
  • each feature layer in the encoder contains a plurality of output nodes (ie, neurons), so that the feature map output by each output node in a specific layer can be obtained.
  • the target feature layer is the last feature layer among the multiple feature layers of the polyp recognition model, so as to ensure the accuracy and comprehensiveness of features in the feature map output by the target feature layer.
  • In step 13, the multiple feature maps are sampled multiple times to obtain multiple target feature map sets, wherein each target feature map set includes the target feature maps obtained in one sampling of the multiple feature maps.
  • Usually, the multiple feature maps are fused into one target feature map, and upsampling and softmax processing are then performed according to that target feature map and an output layer to obtain the final recognition result.
  • In this embodiment, multiple samplings may instead be performed on the multiple feature maps, so that the final recognition result is determined from the multiple sampling results. It should be noted that if the uncertainty measure of the recognition result is low, the features in the polyp image to be recognized already exist in the distribution of the data on which the polyp recognition model was trained, and the prediction results determined from the samplings should be similar; if the uncertainty measure is high, those features do not exist in that distribution, and the prediction results determined from the multiple samplings will be more random.
  • In step 14, for each target feature map set, the polyp recognition probability corresponding to the polyp image is determined according to each target feature map in the set, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image.
  • In step 15, the target recognition result of the polyp image and the uncertainty measure corresponding to that result are determined according to the polyp recognition probabilities determined under the target feature map sets.
  • In the above technical solution, multiple feature maps corresponding to the polyp image are obtained according to the polyp image and the polyp recognition model, the feature maps are sampled multiple times, and the final recognition result and its uncertainty measure are determined from the sampled feature maps. Therefore, on the basis of an existing polyp identification model, the volatility of the identification result across the multiple samplings can be used to determine an uncertainty measure, so that the user is shown the uncertainty measure alongside the recognition result. The user can then act on the result based on its uncertainty measure (for example, manually reviewing results with a high uncertainty measure), further ensuring the accuracy of polyp identification, providing effective and accurate data support for decisions based on the recognition results, and improving the user experience.
  • step 12 may include the following steps:
  • a polyp image can be divided into multiple sub-images of equal size (which can be represented as patches) according to a specified size.
  • A polyp image can be represented as x ∈ R^(H×W×C), where H × W denotes the height and width of the polyp image and C denotes its number of channels; for an RGB image, C = 3. The image can then be segmented according to a specified size P × P to obtain N = HW/P² sub-images.
  • For example, if the polyp image is 224×224 and the specified size is 16×16, the polyp image can be divided into (224/16)² = 196 sub-images.
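The 224x224 / 16x16 patching arithmetic above can be sketched directly with array reshaping; the function name and the use of NumPy are illustrative assumptions:

```python
import numpy as np

def split_into_patches(image: np.ndarray, patch: int = 16) -> np.ndarray:
    """Split an H x W x C image into N = (H/P)*(W/P) flattened sub-images,
    each of length P*P*C, matching the 224x224 / 16x16 example."""
    h, w, c = image.shape
    gh, gw = h // patch, w // patch
    x = image.reshape(gh, patch, gw, patch, c)   # carve into P x P blocks
    x = x.transpose(0, 2, 1, 3, 4)               # (gh, gw, P, P, C)
    return x.reshape(gh * gw, patch * patch * c) # (N, P*P*C)

img = np.zeros((224, 224, 3))
patches = split_into_patches(img)
print(patches.shape)  # (196, 768): 196 sub-images of 16*16*3 values
```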
  • According to the image vector and position vector corresponding to each sub-image, the joint vector corresponding to the sub-image is determined, where the position vector is used to indicate the position of the sub-image in the polyp image.
  • Specifically, the linear projection layer can first flatten each sub-image into a one-dimensional vector and then apply a linear transformation to that vector (which can be understood as passing it through a fully connected layer), reducing the dimension of each sub-image to obtain an image vector corresponding to the sub-image (which can be expressed as a patch embedding); the image vector can represent the sub-image.
  • In FIG. 2, the 6 blocks output by Linear Projection are the image vectors.
  • a position vector (which may be expressed as position embedding) for indicating the position of the sub-image in the polyp image may also be generated, where the size of the position embedding is the same as that of the patch embedding.
  • In FIG. 2, the 6 blocks identified by the numbers 1-6 are the position embeddings corresponding to the sub-images.
  • In addition, an extra image vector (the block identified by the symbol "#") and an extra position vector (the block identified by the number 0) may be included.
  • The position embedding can be randomly generated; during training, the encoder learns a representation of the position of the corresponding sub-image in the polyp image.
  • Let x_p^i E ∈ R^D denote the image vector representing the i-th sub-image and E_pos^i ∈ R^D denote its position vector, where D is used to denote the dimension of the embedding vector space.
  • The encoder can generate the encoding vector corresponding to each sub-image from that sub-image's joint vector, and can also generate the encoding vector corresponding to the polyp image from the joint vectors of all sub-images. The encoding vector of a sub-image can be understood as a vector learned by the encoder that represents that sub-image, and the encoding vector of the polyp image as a learned vector that represents the entire polyp image.
  • the model can include multiple encoders, and the joint vector corresponding to each sub-image can be input into each encoder, and the encoder outputs the encoding vector corresponding to each sub-image to obtain the encoding vector corresponding to the polyp image.
  • the patch embedding and position embedding are spliced and input to the encoder.
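The joint-vector construction described above can be sketched as follows. It assumes element-wise addition of the patch embedding and position embedding (one reading of "spliced", as in the standard Vision Transformer; concatenation is an alternative reading), and the embedding dimension D = 512 and random initialisation are illustrative, not from the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)
N, PATCH_DIM, D = 196, 768, 512  # 196 sub-images of 16*16*3 values; D is assumed

E = rng.normal(0.0, 0.02, (PATCH_DIM, D))    # linear projection (fully connected layer)
E_pos = rng.normal(0.0, 0.02, (N, D))        # randomly generated position embeddings

patches = rng.normal(0.0, 1.0, (N, PATCH_DIM))  # flattened sub-images
patch_embedding = patches @ E                   # image vectors, one per sub-image
joint = patch_embedding + E_pos                 # joint vectors fed to the encoder
print(joint.shape)  # (196, 512)
```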
  • The Multi-Head Attention in the encoder can divide the patch embedding + position embedding into h groups, input them into h attention structures, concatenate (concat) the obtained results, and normalize them using Norm to obtain the attention feature map corresponding to the polyp image.
  • Each attention structure can involve a Query, Key and Value, where Keys and Values come in pairs: for a given query vector q ∈ R^d, k key vectors (each d-dimensional, stacked into a matrix K) are matched through inner-product calculation; the resulting inner products are normalized by softmax to obtain k weights, and the attention output corresponding to the query vector is the weighted average of the value vectors corresponding to the k key vectors (stacked into a matrix V).
  • The attention output is:
    Attention(Q, K, V) = softmax(QKᵀ/√d)V
    where the scaling factor √d can be set according to the actual application scenario to avoid the influence on the variance brought by the dot product.
  • Further, h attention heads can be defined, that is, h self-attentions are applied to the joint vectors of the polyp image: the joint vectors are split into h sequences of size N × d, self-attention is applied to each, and the results are concatenated to obtain the attention feature map MSA(X) corresponding to the polyp image X, as follows:
    MSA(X) = Concat(SA₁(X), SA₂(X), …, SA_h(X))
  • the attention feature map can be input into the classification layer MLP to obtain the encoding vector corresponding to the polyp image, that is, the feature map.
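A minimal sketch of the scaled dot-product attention and multi-head concatenation just described; per-head projection matrices and the Norm step are omitted for brevity, and the shapes are illustrative assumptions:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d))  # k softmax-normalized weights per query
    return weights @ V                       # weighted average of the value vectors

def msa(X, h=4):
    """Split the joint vectors into h heads, self-attend each, concat the results."""
    heads = np.split(X, h, axis=-1)          # h sequences of size N x d
    return np.concatenate([attention(Hd, Hd, Hd) for Hd in heads], axis=-1)

X = np.random.default_rng(1).normal(size=(196, 512))
out = msa(X)
print(out.shape)  # (196, 512)
```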
  • In step 13, multiple samplings are performed on the multiple feature maps to obtain multiple target feature map sets. An exemplary implementation is as follows; this step may include:
  • the target distribution for sampling is determined according to the target activation rate corresponding to the target feature layer.
  • Multiple samplings are performed from the multiple feature maps according to the target distribution to obtain multiple target feature map sets, wherein the target feature maps in each target feature map set conform to the target distribution.
  • the target activation rate may be set according to an actual usage scenario.
  • In model training, dropout can usually be used to reduce the interaction between hidden-layer nodes in the network. For example, during forward propagation, the activation of a given neuron can be stopped with a certain probability, so that the output of the next layer does not depend too heavily on particular local features, enhancing the generalization of the model.
  • Accordingly, when performing polyp identification based on the trained polyp identification model, the dropout method can be applied to the multiple feature maps output by the target feature layer, sampling from the multiple feature maps to obtain multiple target feature maps.
  • the target distribution may be a Bernoulli distribution, and when the target activation rate p is determined, the target distribution may be expressed as a Bernoulli (1-p) distribution.
  • For example, the target feature map set z_l can be determined as follows:
    z_l = M_l ⊙ (z_{l-1} W_l), with the entries of M_l drawn from Bernoulli(1 − p)
    where z_{l-1} is used to represent the multiple feature maps corresponding to the target feature layer, W_l is used to represent the weight corresponding to the target feature layer, and M_l is used to represent the dropout mask corresponding to the target feature layer.
  • Therefore, the same sampling method as in the training process of the polyp recognition model can be used to sample the multiple feature maps multiple times, so that multiple different target feature map sets are obtained, ensuring the randomness of the target feature maps. Recognition results are predicted from the different target feature map sets, so that the uncertainty measure of the recognition result can be determined from the differences between the recognition results corresponding to the sets, providing reliable data support. Moreover, this method of feature map sampling is fast and simple, which can also improve the efficiency of polyp identification to a certain extent.
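The Bernoulli(1 − p) dropout sampling above can be sketched as repeated Monte-Carlo masking of the target layer's feature maps; the map count, spatial size, p = 0.5 and T = 5 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
num_maps, H, W = 64, 14, 14
p = 0.5                                           # target activation (dropout) rate
feature_maps = rng.normal(size=(num_maps, H, W))  # z_{l-1}: target-layer outputs

def sample_target_set(feature_maps, p, rng):
    """One sampling: keep each feature map independently with probability
    1 - p, i.e. a dropout mask with entries ~ Bernoulli(1 - p)."""
    mask = rng.binomial(1, 1 - p, size=feature_maps.shape[0]).astype(bool)
    return feature_maps[mask]

T = 5  # number of samplings
target_sets = [sample_target_set(feature_maps, p, rng) for _ in range(T)]
print(len(target_sets))  # 5 (generally different) target feature map sets
```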
  • Another exemplary implementation of performing multiple samplings on the multiple feature maps in step 13 to obtain multiple target feature map sets is as follows; this step may include: determining the correlation matrix corresponding to the multiple feature maps.
  • The correlation matrix is a correlation coefficient matrix composed of the correlation coefficients between the columns of a matrix; that is, the element in row i and column j of the correlation matrix is the correlation coefficient between column i and column j of the original matrix, and each column of the original matrix can correspond to the feature map of one output node.
  • Multiple determinantal point process (DPP) calculations are performed according to the correlation matrix and the target activation rate corresponding to the target feature layer, and each DPP calculation determines the maximum-volume sub-matrix of the correlation matrix corresponding to the target activation rate, wherein the feature maps corresponding to the elements contained in the maximum-volume sub-matrix serve as one target feature map set.
  • The determinantal point process (DPP) is a probabilistic model that can convert complex probability calculations into simple determinant calculations, computing the probability of each subset through the determinant of a kernel matrix.
  • In this way, the subset with the largest correlation and diversity among the feature maps can be determined through maximum a posteriori probability estimation, so that multiple feature maps are determined as one target feature map set.
  • The probability of the occurrence of the empty set is given, that is, by the target activation rate, and the probability of occurrence of a subset Y is P(Y) ∝ det(L_Y), where L_Y is used to indicate the sub-matrix of the matrix L formed by the rows and columns whose subscripts belong to Y.
  • Each feature map can be regarded as a one-dimensional vector in a space, and the volume of the parallelepiped spanned by the selected vectors is used to measure the probability of the appearance of the target feature map set; the physical meaning of the determinant is exactly this volume. Therefore, the maximum a posteriori probability estimation can be transformed into the problem of finding the largest determinant. Based on this, solving the determinant problem can determine the maximum-volume sub-matrix corresponding to the target activation rate, and the feature map corresponding to each vector in that sub-matrix is taken as a target feature map. The process of solving the determinant is a well-known technique in the art and will not be repeated here.
  • Since the samplings are random, the feature maps contained in the maximum-volume sub-matrix determined in each sampling can form a different set, so that uncertainty analysis can be performed based on the different sets obtained from the multiple samplings.
  • Thus, multiple samplings can be performed from the multiple feature maps based on the determinantal point process to obtain multiple target feature map sets, ensuring the diversity of the target feature maps within each set, so that as many features of the polyp image as possible are collected in each sampling and the accuracy of the target recognition result is ensured.
  • In addition, each sampling's prediction is based on different and diverse target feature maps, which can also improve the accuracy of the uncertainty measure to a certain extent.
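One common way to approximate the maximum a posteriori (maximum-volume) subset of a DPP is a greedy determinant search; the sketch below assumes a small number of feature maps, a correlation-matrix kernel, and a fixed subset size k standing in for the target activation rate:

```python
import numpy as np

def greedy_max_volume(L: np.ndarray, k: int):
    """Greedy MAP sketch for a DPP with kernel L: repeatedly add the item
    whose inclusion maximizes det(L_Y), the volume of the selected
    sub-matrix, until k items are chosen."""
    selected = []
    for _ in range(k):
        best, best_det = None, -np.inf
        for j in range(L.shape[0]):
            if j in selected:
                continue
            idx = selected + [j]
            d = np.linalg.det(L[np.ix_(idx, idx)])  # volume of candidate subset
            if d > best_det:
                best, best_det = j, d
        selected.append(best)
    return selected

rng = np.random.default_rng(3)
F = rng.normal(size=(8, 20))            # 8 feature maps, flattened to vectors
L = np.corrcoef(F)                      # correlation (kernel) matrix
print(sorted(greedy_max_volume(L, 4)))  # indices of one target feature map set
```

In practice the exhaustive inner loop would be replaced by an incremental Cholesky update, but the determinant-as-volume objective is the same.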
  • In step 14, an exemplary implementation of determining the polyp recognition probability corresponding to the polyp image according to each target feature map in the target feature map set is as follows; this step may include:
  • Multiple target feature maps are weighted and summed to obtain a fusion feature map.
  • For example, the multiple target feature maps can be weighted and summed according to the weights of the output nodes corresponding to the determined target feature maps to obtain the fusion feature map, wherein the weight corresponding to each output node is determined during training of the polyp recognition model.
  • Upsampling is performed according to the fusion feature map and a fully convolutional network to obtain a predicted feature map of the same size as the polyp image.
  • In a fully convolutional network, the last fully connected layer of the convolutional neural network is replaced by a convolutional layer.
  • This convolutional layer is used to upsample the feature map of the last convolutional layer, so that the feature map is restored to the same size as the input image, that is, the predicted feature map is obtained.
  • Since the polyp image and the predicted feature map are the same size, and the predicted feature map is obtained by upsampling the fusion feature map, the elements of the predicted feature map correspond one-to-one with the pixels of the polyp image, and their position information also corresponds; that is, the spatial information of the original polyp image is preserved in the predicted feature map.
  • Thus, a predicted feature map of the same size as the polyp image can be obtained through the fully convolutional network. While the spatial information of the polyp image is preserved, softmax processing can be performed on each element value in the predicted feature map, so that the recognition probability of each classification is obtained for each element of the predicted feature map. In this way, the polyp image can be recognized with pixel-level prediction: for each pixel in the polyp image, the corresponding target probability distribution is determined, the classification of the pixel is determined from it, and the final recognition result of the polyp image is determined from the classifications of all pixels. This effectively increases the fineness of polyp recognition and improves its accuracy to a certain extent.
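The fuse-upsample-softmax pipeline of step 14 can be sketched as below. Nearest-neighbour upsampling stands in for the FCN's learned upsampling, and the map count, class count and weights are illustrative assumptions:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_and_predict(target_maps, weights, out_size):
    """Weighted-sum fusion of the sampled target feature maps, upsampling
    back to the polyp-image size, then per-pixel softmax over the classes."""
    fused = np.tensordot(weights, target_maps, axes=1)  # (n_classes, h, w)
    h, w = fused.shape[-2:]
    rows = np.arange(out_size) * h // out_size          # nearest-neighbour indices
    cols = np.arange(out_size) * w // out_size
    up = fused[:, rows][:, :, cols]                     # (n_classes, H, H)
    return softmax(up, axis=0)                          # target probability distribution

maps = np.random.default_rng(4).normal(size=(3, 2, 14, 14))  # 3 maps, 2 classes
w = np.array([0.5, 0.3, 0.2])                                # trained node weights (assumed)
probs = fuse_and_predict(maps, w, out_size=224)
print(probs.shape)  # (2, 224, 224): a class distribution per pixel
```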
  • In step 15, an exemplary implementation of determining the target recognition result of the polyp image and the uncertainty measure corresponding to that result according to the polyp recognition probabilities determined under the target feature map sets is as follows; this step may include:
  • For each pixel in the polyp image, determining the probability values that the pixel corresponds to the same classification in the target probability distributions of the multiple polyp recognition probabilities, and determining the average of those probability values as the target probability of the pixel for that classification.
  • As described above, each pixel in the polyp image corresponds to probabilities for the classifications S1, S2, …, Sn, that is, the target probability distribution corresponding to the pixel.
  • For each pixel x in the polyp image, the average of the probability values corresponding to the same classification in the target probability distributions of the polyp recognition probabilities obtained in the T samplings can be taken as the target probability: for example, the average of the probability values corresponding to classification S1 in the T target probability distributions determined for pixel x is taken as the target probability of pixel x for S1, the average for classification S2 as the target probability for S2, and so on, determining the target probabilities of pixel x for classifications S1 and S2 through Sn.
  • For the other pixels, the target probabilities corresponding to each classification can be determined in the same manner.
  • The target recognition result is thus determined from the data of multiple samplings, which guarantees the accuracy of the target recognition result to a certain extent and provides accurate data support for subsequent image-processing decisions.
  • this step may include:
  • For each pixel in the polyp image, determining the classification with the maximum target probability for that pixel as the classification corresponding to the pixel, wherein the target recognition result includes the classification corresponding to each pixel in the polyp image.
  • In this way, pixel-level recognition of the polyp image can be realized, the fineness of the target recognition result is improved, and the polyp missed-detection rate can be reduced to a certain extent.
  • the polyp classification corresponding to the polyp image can be comprehensively determined according to the classification corresponding to each pixel in the polyp image in the target recognition result, so as to realize the recognition and classification of polyps in the polyp image and improve user experience.
  • An uncertainty measure corresponding to the target recognition result is determined according to multiple polyp recognition probabilities corresponding to the polyp image, and each pixel in the polyp image corresponds to a target probability of each classification.
  • the uncertainty measure is used to represent the uncertainty of the target recognition result output by the polyp recognition model.
  • If the polyp recognition model has learned fewer features relevant to the polyp image, the output target recognition result has higher uncertainty; in this case it is difficult for the model to accurately recognize the polyp image based on the learned knowledge.
  • the sum of the prediction variances of each pixel in the polyp image is determined as the uncertainty measure.
  • If the features of the polyp image are features the polyp recognition model has learned, the model can accurately recognize the polyp image, the confidence of the corresponding recognition result is higher, and the uncertainty is lower.
  • If the features of the polyp image are known to the polyp recognition model, similar recognition results should be obtained from the data of the multiple samplings. If the features of the polyp image are not among the features the model has learned, the model is more likely to recognize the image through mismatched experience; the confidence of the corresponding recognition result is then lower, that is, the uncertainty is higher.
  • If the features of the polyp image are unknown to the polyp recognition model, predictions from the data of the multiple samplings are highly random, and different recognition results may be obtained for different sampling data.
  • the uncertainty measurement analysis can be realized through the variance between the prediction probabilities corresponding to the target feature map sets obtained by multiple sampling.
  • For example, if the number of classifications corresponding to each pixel is n and the number of samplings is T, the variance can be calculated from the probabilities that pixel x corresponds to a classification S_i in the target probability distributions of the T samplings and the determined target probability of pixel x for that classification:
    Var_i(x) = (1/T) Σ_{t=1}^{T} (p_t^i(x) − p̄^i(x))²
    where Var_i(x) is used to represent the prediction variance of pixel x corresponding to the i-th classification, p_t^i(x) is used to represent the probability that pixel x corresponds to the i-th classification in the target probability distribution of the t-th sampling, and p̄^i(x) is used to represent the target probability of pixel x for the i-th classification.
  • in this way, the prediction variance corresponding to each pixel and each classification can be determined; the sum of a pixel's prediction variances over all classifications is then determined as the prediction variance of that pixel, and the sum of the prediction variances of all pixels is determined as the uncertainty measure.
  • the uncertainty of the target recognition result can thus be characterized by determining the variance between the prediction probabilities determined in the multiple sampling processes. If the variance is small, the polyp recognition probabilities determined from the multiple sampled data are similar, that is, the target recognition result is determined based on knowledge known to the polyp recognition model, and the accuracy of the target recognition result is high, i.e., its uncertainty is low. If the variance is large, the polyp recognition probabilities determined from the multiple sampled data differ considerably, that is, the target recognition result is not determined based on knowledge known to the polyp recognition model; the accuracy of the target recognition result is insufficient, i.e., its uncertainty is high. In that case the user can be prompted, so that the user can focus on recognition results with high uncertainty, which broadens the usage scenarios and applicability of the polyp recognition method and provides accurate and reliable data support for subsequent image processing decisions.
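The variance-based uncertainty computation described above can be sketched as follows. This is a minimal NumPy illustration, not the disclosure's implementation; the array shapes and function name are assumptions:

```python
import numpy as np

def uncertainty_measure(sampled_probs):
    """Compute the uncertainty measure from T sampled probability maps.

    sampled_probs: array of shape (T, H, W, n) -- for each of the T
    samplings, the target probability distribution over n classifications
    for every pixel of an H x W polyp image.
    """
    # Target probability: mean over the T samplings (per pixel, per class).
    target_prob = sampled_probs.mean(axis=0)                           # (H, W, n)

    # Prediction variance per pixel and class: (1/T) * sum_t (p_t - mean)^2
    per_class_var = ((sampled_probs - target_prob) ** 2).mean(axis=0)  # (H, W, n)

    # Prediction variance of a pixel = sum over its classifications.
    per_pixel_var = per_class_var.sum(axis=-1)                         # (H, W)

    # Uncertainty measure = sum of all per-pixel prediction variances.
    return per_pixel_var.sum()

# Identical predictions across samplings -> zero uncertainty.
same = np.tile(np.array([[[0.9, 0.1]]]), (5, 1, 1, 1))  # T=5, 1x1 image, n=2
print(uncertainty_measure(same))  # 0.0
```

When the T samplings agree, the measure is zero; disagreement between samplings drives it up, matching the small-variance/large-variance discussion above.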
  • the present disclosure also provides a polyp identification device, as shown in FIG. 4 , the device 10 includes:
  • a receiving module 100 configured to receive an image of a polyp to be identified
  • a processing module 200 configured to obtain, according to the polyp image and the polyp recognition model, feature maps corresponding to the polyp image corresponding to multiple output nodes of the target feature layer of the polyp recognition model;
  • the sampling module 300 is configured to perform multiple samplings on the multiple feature maps to obtain multiple sets of target feature maps, wherein each set of target feature maps includes the target feature maps obtained by sampling the multiple feature maps once;
  • the first determination module 400 is configured to, for each set of target feature maps, determine the polyp recognition probability corresponding to the polyp image according to each target feature map in the set, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image;
  • the second determination module 500 is configured to determine, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, the target recognition result of the polyp image and the uncertainty measure corresponding to the target recognition result.
  • the sampling module includes:
  • the first determination submodule is used to determine the target distribution for sampling according to the target activation rate corresponding to the target feature layer;
  • the first sampling sub-module is configured to perform multiple samplings from the multiple feature maps according to the target distribution to obtain multiple sets of target feature maps, wherein the target feature maps in each set of target feature maps conform to the target distribution.
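One way to read this: with a target activation rate p for the target feature layer, the target distribution can be taken as a Bernoulli distribution, and each sampling independently keeps each feature map with probability p. The sketch below makes that assumption; the function name, the Bernoulli choice, and the non-empty fallback are illustrative, not mandated by the disclosure:

```python
import random

def sample_feature_map_sets(feature_maps, activation_rate, num_samplings, seed=0):
    """Draw `num_samplings` target feature map sets from `feature_maps`.

    Each sampling keeps every feature map independently with probability
    `activation_rate`, so each resulting set conforms to the (assumed
    Bernoulli) target distribution determined by the activation rate.
    """
    rng = random.Random(seed)
    sets = []
    for _ in range(num_samplings):
        kept = [fm for fm in feature_maps if rng.random() < activation_rate]
        if not kept:                       # guarantee a non-empty set
            kept = [rng.choice(feature_maps)]
        sets.append(kept)
    return sets

maps = [f"feature_map_{i}" for i in range(8)]
sets = sample_feature_map_sets(maps, activation_rate=0.5, num_samplings=3)
print(len(sets))  # 3
```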
  • the sampling module includes:
  • the second determining submodule is used to determine the feature map of the output node in the target feature layer, and determine the correlation matrix corresponding to the target feature layer;
  • the second sampling sub-module is used to perform multiple determinant dot product process calculations according to the correlation matrix and the target activation rate corresponding to the target feature layer, and to determine the maximum volume sub-matrix of the correlation matrix obtained by each determinant dot product process calculation, wherein the feature maps corresponding to the elements contained in the maximum volume sub-matrix serve as a set of target feature maps.
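The disclosure does not spell out the "determinant dot product process" here; one common way to obtain a maximum-volume sub-matrix of a correlation matrix is a greedy determinant-maximisation (DPP-style) selection, sketched below with NumPy. The greedy strategy, the function name, and choosing k from the activation rate are illustrative assumptions, not the patent's method:

```python
import numpy as np

def greedy_max_volume_indices(corr, k):
    """Greedily pick k indices whose principal sub-matrix of `corr`
    has (approximately) maximal determinant ("volume")."""
    n = corr.shape[0]
    chosen = []
    for _ in range(k):
        best_idx, best_det = None, -np.inf
        for i in range(n):
            if i in chosen:
                continue
            cand = chosen + [i]
            det = np.linalg.det(corr[np.ix_(cand, cand)])
            if det > best_det:
                best_det, best_idx = det, i
        chosen.append(best_idx)
    return sorted(chosen)

# Correlation matrix of 4 feature maps; maps 0 and 1 are near-duplicates,
# so a max-volume selection of 2 avoids picking both.
corr = np.array([[1.0, 0.98, 0.1, 0.2],
                 [0.98, 1.0, 0.1, 0.2],
                 [0.1,  0.1, 1.0, 0.3],
                 [0.2,  0.2, 0.3, 1.0]])
idx = greedy_max_volume_indices(corr, k=2)
print(idx)  # [0, 2]
```

Highly correlated feature maps shrink the determinant, so this kind of selection favours diverse (weakly correlated) maps in each sampled set.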
  • the first determination module includes:
  • the fusion sub-module is used to weight and sum the multiple target feature maps to obtain a fusion feature map;
  • the upsampling submodule is used to perform upsampling according to the fusion feature map and the full convolutional network to obtain a prediction feature map of the same size as the polyp image;
  • the processing sub-module is configured to perform softmax processing on the element value of each element in the prediction feature map, and obtain the target probability distribution corresponding to the pixel corresponding to the element in the polyp image.
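A compact sketch of that pipeline follows. Equal fusion weights and nearest-neighbour upsampling are stand-ins for the learned fusion weights and the fully convolutional network of the disclosure; names and shapes are assumptions:

```python
import numpy as np

def predict_probabilities(target_feature_maps, out_h, out_w, weights=None):
    """Fuse target feature maps, upsample to image size, apply softmax.

    target_feature_maps: array (m, h, w, n) -- m sampled maps with n
    classification channels. Returns (out_h, out_w, n) per-pixel
    target probability distributions.
    """
    m = target_feature_maps.shape[0]
    if weights is None:
        weights = np.full(m, 1.0 / m)     # stand-in for learned fusion weights
    fused = np.tensordot(weights, target_feature_maps, axes=1)   # (h, w, n)

    # Nearest-neighbour upsampling as a stand-in for FCN upsampling.
    h, w, _ = fused.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    upsampled = fused[rows][:, cols]                             # (out_h, out_w, n)

    # Per-pixel softmax over the n classification channels.
    e = np.exp(upsampled - upsampled.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

probs = predict_probabilities(np.random.rand(3, 4, 4, 2), out_h=8, out_w=8)
print(probs.shape)  # (8, 8, 2)
```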
  • the second determination module includes:
  • the third determination submodule is used to determine, for each pixel in the polyp image, the probability values corresponding to the same classification in the target probability distributions of the multiple polyp recognition probabilities of the pixel, and to determine the average of these probability values as the target probability that the pixel corresponds to that classification;
  • the fourth determining submodule is used to determine the target recognition result according to the target probability corresponding to each classification for each pixel in the polyp image;
  • the fifth determining sub-module is used to determine, according to the multiple polyp recognition probabilities corresponding to the polyp image and the target probabilities corresponding to each classification of each pixel in the polyp image, the uncertainty measure corresponding to the target recognition result.
  • the fourth determination submodule includes:
  • for each pixel in the polyp image, determine the classification with the maximum target probability corresponding to the pixel as the classification corresponding to the pixel, wherein the target recognition result includes the classification corresponding to each pixel in the polyp image.
  • the fifth determining submodule includes:
  • the sixth determining submodule is used, for each pixel in the polyp image, to determine the prediction variance corresponding to the pixel and a classification according to the probability that the pixel corresponds to the same classification in the target probability distribution of each polyp recognition probability and the target probability that the pixel corresponds to that classification, and to determine the sum of the pixel's prediction variances under each classification as the prediction variance of the pixel;
  • a seventh determining submodule configured to determine the sum of prediction variances of each pixel in the polyp image as the uncertainty measure.
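The third and fourth submodules above amount to averaging the per-sampling distributions into a per-pixel target probability and taking the class with the maximum target probability as that pixel's classification. A brief sketch, with illustrative names and shapes:

```python
import numpy as np

def target_recognition(sampled_probs):
    """sampled_probs: (T, H, W, n) per-sampling target probability
    distributions.  Returns:
      target_prob -- (H, W, n) mean probability per pixel and class;
      result      -- (H, W) index of the classification with maximum
                     target probability, i.e. the target recognition
                     result for each pixel.
    """
    target_prob = sampled_probs.mean(axis=0)   # average over the T samplings
    result = target_prob.argmax(axis=-1)       # max-target-probability class
    return target_prob, result

T, H, W, n = 4, 2, 2, 3
probs = np.random.dirichlet(np.ones(n), size=(T, H, W))  # valid distributions
target_prob, result = target_recognition(probs)
print(result.shape)  # (2, 2)
```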
  • referring to FIG. 5, it shows a schematic structural diagram of an electronic device 600 suitable for implementing the embodiments of the present disclosure.
  • the terminal equipment in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and vehicle-mounted terminals (such as car navigation terminals), as well as fixed terminals such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 5 is only an example, and should not limit the functions and scope of use of the embodiments of the present disclosure.
  • an electronic device 600 may include a processing device (such as a central processing unit, a graphics processing unit, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device 600.
  • the processing device 601, ROM 602, and RAM 603 are connected to each other through a bus 604.
  • An input/output (I/O) interface 605 is also connected to the bus 604 .
  • the following devices can be connected to the I/O interface 605: an input device 606 including, for example, a touch screen, a touchpad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 608 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 609.
  • the communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While FIG. 5 shows electronic device 600 having various means, it should be understood that implementing or having all of the means shown is not a requirement. More or fewer means may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product, which includes a computer program carried on a non-transitory computer readable medium, where the computer program includes program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602.
  • when the computer program is executed by the processing device 601, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are performed.
  • the above-mentioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium or any combination of the above two.
  • a computer readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to, electrical connections with one or more wires, portable computer diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can transmit, propagate, or transport a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted by any appropriate medium, including but not limited to wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
  • the client and the server can communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (e.g., a communication network).
  • Examples of communication networks include local area networks ("LANs"), wide area networks ("WANs"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device, or may exist independently without being incorporated into the electronic device.
  • the above-mentioned computer-readable medium carries one or more programs, and when the one or more programs are executed by the electronic device, the electronic device: receives a polyp image to be identified; obtains, according to the polyp image and a polyp identification model, feature maps corresponding to the polyp image at a plurality of output nodes of a target feature layer of the polyp recognition model; performs multiple samplings on the plurality of feature maps to obtain multiple sets of target feature maps, wherein each set of target feature maps includes the target feature maps obtained by sampling the plurality of feature maps once; for each set of target feature maps, determines, according to each target feature map in the set, the polyp identification probability corresponding to the polyp image, wherein the polyp identification probability includes a target probability distribution corresponding to each pixel in the polyp image; and determines, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, the target recognition result of the polyp image and the uncertainty measure corresponding to the target recognition result.
  • Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, or combinations thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, connected via the Internet using an Internet service provider).
  • each block in a flowchart or block diagram may represent a module, program segment, or portion of code, which contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the modules involved in the embodiments described in the present disclosure may be implemented by software or by hardware. The name of a module does not, in some cases, constitute a limitation on the module itself; for example, the receiving module can also be described as "a module that receives the image of the polyp to be identified".
  • for example, without limitation, exemplary types of hardware logic components that may be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and so forth.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • a machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • a machine-readable medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatus, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer discs, hard drives, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), optical fiber, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • Example 1 provides a polyp identification method, wherein the method includes: receiving a polyp image to be identified; obtaining, according to the polyp image and a polyp recognition model, feature maps corresponding to the polyp image at multiple output nodes of a target feature layer of the polyp recognition model; performing multiple samplings on the multiple feature maps to obtain multiple sets of target feature maps, wherein each set of target feature maps includes the target feature maps obtained by sampling the multiple feature maps once; for each set of target feature maps, determining, according to each target feature map in the set, the polyp recognition probability corresponding to the polyp image, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image; and determining, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, a target recognition result of the polyp image and an uncertainty measure corresponding to the target recognition result.
  • Example 2 provides the method of Example 1, wherein performing multiple samplings on the multiple feature maps to obtain multiple sets of target feature maps includes: determining a target distribution for sampling according to the target activation rate corresponding to the target feature layer; and performing multiple samplings from the multiple feature maps according to the target distribution to obtain multiple sets of target feature maps, wherein the target feature maps in each set conform to the target distribution.
  • Example 3 provides the method of Example 1, wherein performing multiple samplings on the multiple feature maps to obtain multiple sets of target feature maps includes: determining, according to the feature maps of the output nodes in the target feature layer, the correlation matrix corresponding to the target feature layer; and performing multiple determinant dot product process calculations according to the correlation matrix and the target activation rate corresponding to the target feature layer, and determining the maximum volume sub-matrix of the correlation matrix obtained by each determinant dot product process calculation, wherein the feature maps corresponding to the elements contained in the maximum volume sub-matrix serve as a set of target feature maps.
  • Example 4 provides the method of Example 1, wherein determining the polyp recognition probability corresponding to the polyp image according to each target feature map in the target feature map set includes: weighting and summing the multiple target feature maps to obtain a fusion feature map; performing upsampling according to the fusion feature map and a fully convolutional network to obtain a prediction feature map of the same size as the polyp image; and performing softmax processing on the element value of each element in the prediction feature map to obtain the target probability distribution corresponding to the pixel corresponding to the element in the polyp image.
  • Example 5 provides the method of Example 1, wherein determining, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, the target recognition result of the polyp image and the uncertainty measure corresponding to the target recognition result includes: for each pixel in the polyp image, determining the probability values of the pixel corresponding to the same classification in the target probability distributions of the multiple polyp identification probabilities, and determining the average of these probability values as the target probability that the pixel corresponds to that classification; determining the target recognition result according to the target probability corresponding to each classification for each pixel in the polyp image; and determining the uncertainty measure corresponding to the target recognition result according to the multiple polyp recognition probabilities corresponding to the polyp image and the target probability of each classification for each pixel in the polyp image.
  • Example 6 provides the method of Example 5, wherein determining the target recognition result according to the target probability corresponding to each classification for each pixel in the polyp image includes:
  • for each pixel in the polyp image, determining the classification with the maximum target probability corresponding to the pixel as the classification corresponding to the pixel, wherein the target recognition result includes the classification corresponding to each pixel in the polyp image.
  • Example 7 provides the method of Example 5, wherein determining the uncertainty measure corresponding to the target recognition result according to the multiple polyp identification probabilities corresponding to the polyp image and the target probability of each classification for each pixel in the polyp image includes: for each pixel in the polyp image, determining the prediction variance corresponding to the pixel and a classification according to the probability that the pixel corresponds to the same classification in the target probability distribution of each polyp recognition probability and the target probability that the pixel corresponds to that classification, and determining the sum of the pixel's prediction variances under each classification as the prediction variance of the pixel; and determining the sum of the prediction variances of each pixel in the polyp image as the uncertainty measure.
  • Example 8 provides a polyp identification device, wherein the device includes:
  • a receiving module configured to receive an image of a polyp to be identified
  • a processing module configured to obtain, according to the polyp image and the polyp recognition model, feature maps corresponding to the polyp image corresponding to multiple output nodes of the target feature layer of the polyp recognition model;
  • a sampling module configured to perform multiple samplings on the multiple feature maps to obtain multiple sets of target feature maps, wherein each set of target feature maps includes the target feature maps obtained by sampling the multiple feature maps once;
  • the first determining module is configured to, for each set of target feature maps, determine the polyp recognition probability corresponding to the polyp image according to each target feature map in the set, wherein the polyp recognition probability includes the target probability distribution corresponding to each pixel in the polyp image;
  • the second determination module is configured to determine, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, the target recognition result of the polyp image and the uncertainty measure corresponding to the target recognition result.
  • Example 9 provides a computer-readable medium on which a computer program is stored, wherein, when the program is executed by a processing device, the method described in any one of Examples 1-7 is implemented A step of.
  • Example 10 provides an electronic device, including:
  • a storage device on which a computer program is stored;
  • a processing device configured to execute the computer program in the storage device, so as to implement the steps of the method of any one of Examples 1 to 7.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to a polyp recognition method and apparatus, a medium, and a device. The method includes: receiving a polyp image to be recognized; obtaining, according to the polyp image and a polyp recognition model, feature maps that correspond to the polyp image and respectively correspond to a plurality of output nodes of a target feature layer of the polyp recognition model; sampling the plurality of feature maps multiple times to obtain a plurality of sets of target feature maps; for each set of target feature maps, determining, according to each target feature map in the set, a polyp recognition probability corresponding to the polyp image; and determining, according to the polyp recognition probability corresponding to the polyp image determined under each set of target feature maps, a target recognition result of the polyp image and an uncertainty metric corresponding to the target recognition result. Thus, by sampling a feature map multiple times, a user can be informed of the uncertainty metric of a recognition result at the same time as the recognition result itself.
PCT/CN2022/116425 2021-09-02 2022-09-01 Procédé et appareil de reconnaissance de polype, support et dispositif WO2023030426A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111028068.7 2021-09-02
CN202111028068.7A CN113470026B (zh) 2021-09-02 2021-09-02 息肉识别方法、装置、介质及设备

Publications (1)

Publication Number Publication Date
WO2023030426A1 true WO2023030426A1 (fr) 2023-03-09

Family

ID=77867225

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/116425 WO2023030426A1 (fr) 2021-09-02 2022-09-01 Procédé et appareil de reconnaissance de polype, support et dispositif

Country Status (2)

Country Link
CN (1) CN113470026B (fr)
WO (1) WO2023030426A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113470026B (zh) * 2021-09-02 2021-11-05 北京字节跳动网络技术有限公司 息肉识别方法、装置、介质及设备

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100183210A1 (en) * 2009-01-22 2010-07-22 Van Uitert Robert L Computer-assisted analysis of colonic polyps by morphology in medical images
WO2016161115A1 (fr) * 2015-03-31 2016-10-06 Mayo Foundation For Medical Education And Research Système et procédés de détection automatique de polypes à l'aide de réseaux neuronaux convolutionnels
WO2017027475A1 (fr) * 2015-08-07 2017-02-16 Jianming Liang Procédés, systèmes et moyens permettant simultanément de surveiller la qualité d'une vidéo coloscopique et de détecter des polypes dans une coloscopie
CN109447973A (zh) * 2018-10-31 2019-03-08 腾讯科技(深圳)有限公司 一种结肠息肉图像的处理方法和装置及系统
US20190080454A1 (en) * 2016-05-19 2019-03-14 Avantis Medical Systems, Inc. Methods for polyp detection
CN111105412A (zh) * 2019-12-30 2020-05-05 郑州大学 一种用于肠道息肉检测识别的智能辅助系统
CN111784628A (zh) * 2020-05-11 2020-10-16 北京工业大学 基于有效学习的端到端的结直肠息肉图像分割方法
CN112465766A (zh) * 2020-11-25 2021-03-09 武汉楚精灵医疗科技有限公司 扁平、微小息肉图像识别方法
CN112884702A (zh) * 2020-12-29 2021-06-01 香港中文大学深圳研究院 一种基于内窥镜图像的息肉识别系统和方法
CN113470026A (zh) * 2021-09-02 2021-10-01 北京字节跳动网络技术有限公司 息肉识别方法、装置、介质及设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107239733A (zh) * 2017-04-19 2017-10-10 上海嵩恒网络科技有限公司 连续手写字识别方法及系统
US10878269B2 (en) * 2018-06-19 2020-12-29 Sap Se Data extraction using neural networks
CN109003260B (zh) * 2018-06-28 2021-02-09 深圳视见医疗科技有限公司 Ct图像肺结节检测方法、装置、设备及可读存储介质
CN109472789A (zh) * 2018-11-20 2019-03-15 北京贝叶科技有限公司 一种用于皮肤病理图像处理的神经网络训练方法及装置

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100183210A1 (en) * 2009-01-22 2010-07-22 Van Uitert Robert L Computer-assisted analysis of colonic polyps by morphology in medical images
WO2016161115A1 (fr) * 2015-03-31 2016-10-06 Mayo Foundation For Medical Education And Research Système et procédés de détection automatique de polypes à l'aide de réseaux neuronaux convolutionnels
WO2017027475A1 (fr) * 2015-08-07 2017-02-16 Jianming Liang Procédés, systèmes et moyens permettant simultanément de surveiller la qualité d'une vidéo coloscopique et de détecter des polypes dans une coloscopie
US20190080454A1 (en) * 2016-05-19 2019-03-14 Avantis Medical Systems, Inc. Methods for polyp detection
CN109447973A (zh) * 2018-10-31 2019-03-08 腾讯科技(深圳)有限公司 一种结肠息肉图像的处理方法和装置及系统
CN111105412A (zh) * 2019-12-30 2020-05-05 郑州大学 一种用于肠道息肉检测识别的智能辅助系统
CN111784628A (zh) * 2020-05-11 2020-10-16 北京工业大学 基于有效学习的端到端的结直肠息肉图像分割方法
CN112465766A (zh) * 2020-11-25 2021-03-09 武汉楚精灵医疗科技有限公司 扁平、微小息肉图像识别方法
CN112884702A (zh) * 2020-12-29 2021-06-01 香港中文大学深圳研究院 一种基于内窥镜图像的息肉识别系统和方法
CN113470026A (zh) * 2021-09-02 2021-10-01 北京字节跳动网络技术有限公司 息肉识别方法、装置、介质及设备

Also Published As

Publication number Publication date
CN113470026B (zh) 2021-11-05
CN113470026A (zh) 2021-10-01

Similar Documents

Publication Publication Date Title
WO2022252881A1 (fr) Procédé et appareil de traitement d'image, support lisible et dispositif électronique
WO2022078197A1 (fr) Procédé et appareil de segmentation de nuage de points, dispositif et support de stockage
JP2023547917A (ja) 画像分割方法、装置、機器および記憶媒体
WO2023030298A1 (fr) Procédé de typage de polype, procédé d'entraînement de modèle et appareil associé
WO2023030370A1 (fr) Procédé et appareil de détection d'image d'endoscope, support de stockage et dispositif électronique
CN111275721A (zh) 一种图像分割方法、装置、电子设备及存储介质
WO2023030523A1 (fr) Procédé et appareil de positionnement de cavité tissulaire pour un endoscope, support et dispositif
WO2023077995A1 (fr) Procédé et appareil d'extraction d'informations, dispositif, support et produit
WO2022171036A1 (fr) Procédé de suivi de cible vidéo, appareil de suivi de cible vidéo, support de stockage et dispositif électronique
WO2023030373A1 (fr) Procédé et appareil pour positionner une cavité tissulaire, et support lisible et dispositif électronique
WO2023030097A1 (fr) Procédé et appareil pour déterminer la propreté d'une cavité tissulaire, et support lisible et dispositif électronique
CN112800276B (zh) 视频封面确定方法、装置、介质及设备
WO2023030427A1 (fr) Procédé d'entraînement pour modèle génératif, procédé et appareil d'identification de polypes, support et dispositif
WO2023035896A1 (fr) Procédé et appareil de reconnaissance vidéo, support lisible, et dispositif électronique
WO2023185516A1 (fr) Procédé et appareil d'apprentissage de modèle de reconnaissance d'image, procédé et appareil de reconnaissance, support et dispositif
US20240013564A1 (en) System, devices and/or processes for training encoder and/or decoder parameters for object detection and/or classification
WO2023030426A1 (fr) Procédé et appareil de reconnaissance de polype, support et dispositif
WO2023125008A1 (fr) Procédé et appareil de traitement d'image d'endoscope basé sur l'intelligence artificielle, support et dispositif
CN111402113B (zh) 图像处理方法、装置、电子设备及计算机可读介质
CN113140012B (zh) 图像处理方法、装置、介质及电子设备
CN114240867A (zh) 内窥镜图像识别模型的训练方法、内窥镜图像识别方法及装置
WO2023130925A1 (fr) Procédé et appareil de reconnaissance de police, support lisible et dispositif électronique
US20240029420A1 (en) System, devices and/or processes for application of kernel coefficients
WO2023016290A1 (fr) Procédé et appareil de classification de vidéo, support lisible et dispositif électronique
US20240037713A1 (en) System, devices and/or processes for image anti-aliasing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22863561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE