CN117611584B - Tissue engineering peripheral nerve graft culture method and system - Google Patents
- Publication number
- CN117611584B (application CN202410086454.9A)
- Authority
- CN
- China
- Prior art keywords
- nerve graft
- image
- nerve
- graft
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/11—Surgical instruments, devices or methods, e.g. tourniquets for performing anastomosis; Buttons for anastomosis
- A61B17/1128—Surgical instruments, devices or methods, e.g. tourniquets for performing anastomosis; Buttons for anastomosis of nerves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/11—Surgical instruments, devices or methods, e.g. tourniquets for performing anastomosis; Buttons for anastomosis
- A61B2017/1132—End-to-end connections
Abstract
The invention discloses a method and a system for culturing a tissue-engineered peripheral nerve graft. During culture, section images of the graft are acquired; while these images do not yet match the target graft that is to be used for nerve grafting, the time remaining until the graft becomes suitable for grafting is predicted. A feature matrix is extracted from each image, and the feature matrices of all images are combined into a three-dimensional feature matrix. From these features, the growth state of different regions of the nerve section, i.e., the change in pixel distribution between images, is detected and used for prediction. The per-time-point feature matrices are stacked into a three-dimensional graft feature matrix, a temporal feature three-dimensional matrix is obtained through a modified time convolution network, and the two are combined to predict the culture time at which the tissue-engineered peripheral nerve graft becomes capable of undergoing nerve grafting. Sections are acquired at that time for matching, which improves the matching rate between the nerve graft and the recipient nerve and thereby improves the nerve repair effect.
Description
Technical Field
The invention relates to the field of computer technology, and in particular to a method and a system for culturing a tissue-engineered peripheral nerve graft.
Background
At present, the reconstruction of missing nerve bundles with nerve grafts has developed considerably. A provided nerve graft comprises sensory nerve bundle channels, motor nerve bundle channels and mixed nerve bundle channels, each matched to the nerve bundles of the corresponding section of a normal nerve. A nerve graft whose outer shape matches the nerve segment to be repaired also has internal nerve bundle channels whose shapes precisely match the nerve bundles to be repaired. In addition, different neurotrophic factors are loaded into the nerve graft to promote and directionally guide nerve bundles with different functions, so the shapes of the sensory, motor and mixed nerve bundle channels must be highly matched to the repaired nerve bundles. During the culture of a tissue-engineered peripheral nerve graft, cut-surface images of the graft must be acquired with high precision, yet repeated detection damages the cultured graft. It is therefore necessary to find the proper time at which the graft corresponds with high coincidence to the nerve bundles to be repaired, so that a surgical graft for repairing nerve defects with a wider range of application and better quality can be obtained.
Disclosure of Invention
The present invention provides a method and a system for culturing tissue-engineered peripheral nerve grafts, intended to solve the above-mentioned problems in the prior art.
In a first aspect, an embodiment of the present invention provides a method for culturing a tissue-engineered peripheral nerve graft, including:
obtaining a set of nerve graft images; the nerve graft image set comprises a plurality of nerve graft images of the detected cross sections at a plurality of time points during tissue-engineered peripheral nerve graft culture;
obtaining a current nerve graft image and a labeled nerve graft image; the current nerve graft image is the image in the nerve graft image set acquired at the time point closest to the current time point; the labeled nerve graft image is an image of a nerve graft that is expected to be capable of undergoing nerve grafting;
inputting the labeled nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value; the detection value represents the similarity between the current graft and a nerve graft capable of undergoing nerve grafting;
judging the change in the growth speed of the nerve graft through a nerve graft growth network according to the detection value, the labeled nerve graft image and the nerve graft image set, to obtain a predicted time length;
acquiring a new nerve graft image, inputting the new nerve graft image and the labeled nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting; the new nerve graft image is the nerve graft image acquired at the time point equal to the current time point plus the predicted time length.
Optionally, the nerve graft growth network comprises a nerve graft detection network, a time convolution network and a prediction change network;
the prediction variation network includes a time variation network and a nerve graft prediction network.
Optionally, the judging the change in the growth speed of the nerve graft through the nerve graft growth network according to the detection value, the labeled nerve graft image and the nerve graft image set, to obtain the predicted time length, includes:
inputting each nerve graft image in the nerve graft image set into the nerve graft detection network and extracting features to obtain a graft feature matrix;
obtaining, from the plurality of nerve graft images, a corresponding plurality of graft feature matrices;
constructing the plurality of graft feature matrices into a graft three-dimensional matrix;
inputting the nerve graft images into a time convolution network in temporal order, from the earliest to the most recent, to obtain a temporal feature three-dimensional matrix; the temporal feature three-dimensional matrix represents the change features of the nerve graft images, including time information;
obtaining the predicted time length through a prediction change network according to the graft three-dimensional matrix, the temporal feature three-dimensional matrix, the labeled nerve graft image and the current nerve graft image in the nerve graft image set.
Optionally, the obtaining the predicted time length through the prediction change network according to the graft three-dimensional matrix, the temporal feature three-dimensional matrix, the labeled nerve graft image and the current nerve graft image in the nerve graft image set includes:
inputting the graft three-dimensional matrix into a time change network and extracting time change features to obtain graft change features;
inputting the temporal feature three-dimensional matrix into the time change network and extracting time change features to obtain time change features;
fusing the graft change features and the time change features to obtain a fusion matrix;
inputting the fusion matrix, the current nerve graft image and the labeled nerve graft image into a nerve graft prediction network for prediction to obtain the predicted time length.
Optionally, the inputting the graft three-dimensional matrix into the time change network and extracting the time change features to obtain the graft change features includes:
convolving the graft three-dimensional matrix along the time direction with a three-dimensional convolution kernel of size 2xNxN to obtain initial graft change features; when the nerve graft image set contains M nerve graft images, M-1 initial graft change features are correspondingly obtained;
inputting the M-1 initial graft change features into a three-dimensional convolutional network to obtain the graft change features.
Optionally, the training process of the prediction change network includes:
obtaining a first training subset; the first training subset is a non-empty subset of the nerve graft image set;
obtaining a second training subset; the second training subset is a subset of the nerve graft image set that is mutually exclusive with the first training subset;
obtaining a training image; the training image includes a training fusion matrix, a first nerve graft image, and a second nerve graft image;
the first nerve graft image is the nerve graft image in the first training subset whose time point is closest to the current time point;
the second nerve graft image is a randomly selected nerve graft image from the second training subset;
obtaining labeled data; the labeled data is the time length obtained by subtracting the time point corresponding to the first nerve graft image from the time point corresponding to the second nerve graft image;
inputting the training fusion matrix, the first nerve graft image and the second nerve graft image into the nerve graft prediction network for prediction to obtain a training time length;
performing a loss calculation on the training time length and the labeled data to obtain a loss value;
training the time change network and the nerve graft prediction network according to the loss value to obtain a trained prediction change network.
Optionally, the method for acquiring the training fusion matrix includes:
extracting features through the nerve graft detection network based on the first training subset to obtain a first graft three-dimensional matrix;
obtaining a first temporal feature matrix through the time convolution network based on the first training subset;
inputting the first graft three-dimensional matrix into the time change network and extracting time change features to obtain first graft change features;
inputting the first temporal feature matrix into the time change network and extracting time change features to obtain first time change features;
fusing the first graft change features and the first time change features to obtain the training fusion matrix.
Optionally, the nerve graft growth network is trained by back-propagating the loss value obtained while training the prediction change network, yielding a trained nerve graft growth network.
Optionally, the training method of the nerve graft detection network includes:
acquiring a plurality of similar nerve graft image groups and a plurality of dissimilar nerve graft image groups; a similar nerve graft image group includes two nerve graft images whose Euclidean distance is less than a threshold; a dissimilar nerve graft image group includes two nerve graft images whose Euclidean distance is greater than or equal to the threshold;
inputting a similar nerve graft image group or a dissimilar nerve graft image group into the nerve graft detection network to obtain a detection value; a detection value of 0 indicates dissimilarity, and a detection value of 1 indicates similarity.
In a second aspect, embodiments of the present invention provide a culture system for tissue-engineered peripheral nerve grafts, comprising:
an acquisition module, used for: obtaining a set of nerve graft images, the nerve graft image set comprising a plurality of nerve graft images of the detected cross sections at a plurality of time points during tissue-engineered peripheral nerve graft culture; and obtaining a current nerve graft image and a labeled nerve graft image, the current nerve graft image being the image in the nerve graft image set acquired at the time point closest to the current time point, and the labeled nerve graft image being an image of a nerve graft expected to be capable of undergoing nerve grafting;
a detection module, used for: inputting the labeled nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value, the detection value representing the similarity between the current graft and a nerve graft capable of undergoing nerve grafting;
a growth change judging module, used for: judging the change in the growth speed of the nerve graft through a nerve graft growth network according to the detection value, the labeled nerve graft image and the nerve graft image set, to obtain a predicted time length;
a re-detection module, used for: acquiring a new nerve graft image, inputting the new nerve graft image and the labeled nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting, the new nerve graft image being the nerve graft image acquired at the time point equal to the current time point plus the predicted time length.
Compared with the prior art, the embodiments of the invention achieve the following beneficial effects. The embodiments provide a method and a system for culturing a tissue-engineered peripheral nerve graft, the method comprising: obtaining a set of nerve graft images, the set comprising a plurality of nerve graft images of the detected cross sections at a plurality of time points during tissue-engineered peripheral nerve graft culture; obtaining a current nerve graft image and a labeled nerve graft image, the current nerve graft image being the image in the set acquired at the time point closest to the current time point, and the labeled nerve graft image being an image of a nerve graft expected to be capable of undergoing nerve grafting; inputting the labeled nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value, the detection value representing the similarity between the current graft and a nerve graft capable of undergoing nerve grafting; judging the change in the growth speed of the nerve graft through a nerve graft growth network according to the detection value, the labeled nerve graft image and the nerve graft image set, to obtain a predicted time length; and acquiring a new nerve graft image at the time point equal to the current time point plus the predicted time length, inputting it and the labeled nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting.
According to the method, while the section images obtained during culture do not yet match the target graft required for nerve grafting, the time at which the cultured tissue-engineered peripheral nerve graft becomes capable of undergoing nerve grafting is predicted. A feature matrix is acquired for each image, and the feature matrices of all images are combined into a three-dimensional feature matrix. From these features, the growth state of different regions of the nerve section, i.e., the change in pixel distribution between images, is detected and used for prediction. The per-time-point feature matrices are stacked into a three-dimensional graft feature matrix, a temporal feature three-dimensional matrix is obtained through a modified time convolution network, and the two are combined to predict the culture time at which nerve grafting becomes possible. Sections are acquired at that time for matching, which improves the matching rate between the nerve graft and the recipient nerve and thereby improves the nerve repair effect.
Drawings
Fig. 1 is a flowchart of a method for culturing a tissue-engineered peripheral nerve graft according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings.
Example 1
As shown in fig. 1, an embodiment of the present invention provides a method for culturing a tissue-engineered peripheral nerve graft, the method comprising:
s101: a set of nerve graft images is obtained. The set of nerve graft images includes a plurality of nerve graft images corresponding to detected cross-sections at a plurality of time points during tissue engineering peripheral nerve graft culture.
S102: a current nerve graft image and a labeling nerve graft image are obtained. The current nerve graft image is a nerve graft image with a time length from a current time point in the nerve graft image set longer than that of other nerve graft images. The labeled nerve graft image is an image of a nerve graft expected to be capable of performing nerve grafting.
S103: and inputting the marked nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value. The detection value indicates a similarity to a nerve graft capable of detecting nerve grafting.
S104: and judging the growth speed change of the nerve graft through a nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set, and obtaining the prediction time length.
S105: and acquiring a new nerve graft image, inputting the new nerve graft image and the marked nerve graft image into a nerve graft detection network, and judging whether the nerve graft can carry out nerve grafting or not. The new nerve graft image is a nerve graft image acquired at the current time point plus the time point of the predicted time length.
Optionally, the nerve graft growth network includes a nerve graft detection network, a time convolution network, and a prediction change network.
The prediction variation network includes a time variation network and a nerve graft prediction network.
Optionally, the judging the change in the growth speed of the nerve graft through the nerve graft growth network according to the detection value, the labeled nerve graft image and the nerve graft image set, to obtain the predicted time length, includes:
Each nerve graft image in the nerve graft image set is input into the nerve graft detection network and features are extracted to obtain a graft feature matrix.
The graft feature matrix is a matrix formed from the unclassified output features of the nerve graft detection network.
The plurality of nerve graft images correspondingly yield a plurality of graft feature matrices.
The plurality of graft feature matrices are constructed into a graft three-dimensional matrix.
The nerve graft images are input into the time convolution network in temporal order, from the earliest to the most recent, to obtain a temporal feature three-dimensional matrix. The temporal feature three-dimensional matrix represents the change features of the nerve graft images, including time information.
The predicted time length is obtained through the prediction change network according to the graft three-dimensional matrix, the temporal feature three-dimensional matrix, the labeled nerve graft image and the current nerve graft image in the nerve graft image set.
Through the above steps, a feature matrix is obtained for each image, and the feature matrices of all images are combined into a three-dimensional feature matrix. From these features, the growth state of different regions of the nerve section, i.e., the change in pixel distribution between images, is detected and used for prediction.
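As a minimal sketch of the stacking step above, the per-image graft feature matrices can be combined into the graft three-dimensional matrix with NumPy; `feature_extractor` is a placeholder for the detection network's unclassified feature output.

```python
import numpy as np

def build_graft_tensor(feature_extractor, images):
    """Stack one N x N graft feature matrix per nerve graft image into an
    M x N x N graft three-dimensional matrix, ordered earliest to most recent."""
    matrices = [feature_extractor(img) for img in images]
    return np.stack(matrices, axis=0)  # shape (M, N, N)
```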
Optionally, the obtaining the predicted time length through the prediction change network according to the graft three-dimensional matrix, the temporal feature three-dimensional matrix, the labeled nerve graft image and the current nerve graft image in the nerve graft image set includes:
The graft three-dimensional matrix is input into the time change network and time change features are extracted to obtain graft change features.
The temporal feature three-dimensional matrix is input into the time change network and time change features are extracted to obtain time change features.
The graft change features and the time change features are fused to obtain a fusion matrix.
The fusion is performed by adding the graft change features and the features at the corresponding positions in the time change features and then averaging.
The fusion matrix, the current nerve graft image and the labeled nerve graft image are input into the nerve graft prediction network for prediction to obtain the predicted time length.
In this way, the change of the graft over time is captured, and the temporal feature three-dimensional matrix serves as an auxiliary signal for adjusting the graft three-dimensional matrix, thereby tuning the prediction change network.
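The fusion described above, element-wise addition of corresponding positions followed by averaging, reduces to the following one-liner; the function name is illustrative.

```python
import numpy as np

def fuse_features(graft_change, time_change):
    """Fuse two equally shaped feature arrays by adding corresponding
    entries and averaging, as described in the embodiment."""
    assert graft_change.shape == time_change.shape
    return (graft_change + time_change) / 2.0
```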
Optionally, the inputting the graft three-dimensional matrix into the time change network and extracting the time change features to obtain the graft change features includes:
The graft three-dimensional matrix is convolved along the time direction with a three-dimensional convolution kernel of size 2xNxN to obtain initial graft change features. When the nerve graft image set contains M nerve graft images, M-1 initial graft change features are correspondingly obtained.
Here, 2 corresponds to two adjacent time points, and N equals the side length of the graft feature matrix.
M denotes the number of nerve graft images used for determining the change speed; in this embodiment, it is the number of elements in the nerve graft image set, that is, the number of nerve graft images the set contains. M also equals the number of time points at which the nerve graft images in the set were acquired.
The convolution kernel is shifted along the time direction with a stride of 1.
The M-1 initial graft change features are input into a three-dimensional convolutional network to obtain the graft change features.
In this embodiment, the three-dimensional convolutional network is a three-dimensional convolutional neural network (3D-CNN).
By sliding the 2xNxN three-dimensional convolution kernel with a stride of 1, the feature relationship between each pair of adjacent time points is obtained, which captures how the graft changes over time.
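The sliding 2xNxN convolution along the time axis can be sketched as below. Using a single kernel that produces one scalar response per adjacent pair of time points is an assumption for clarity; a real layer would use multiple kernels and a nonlinearity.

```python
import numpy as np

def temporal_change_features(graft_tensor, kernel):
    """Slide a 2 x N x N kernel over an M x N x N graft tensor along the
    time axis with stride 1, yielding M - 1 change responses."""
    M = graft_tensor.shape[0]
    assert kernel.shape == (2,) + graft_tensor.shape[1:]
    return np.array([float(np.sum(graft_tensor[t:t + 2] * kernel))
                     for t in range(M - 1)])
```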
Optionally, the training process of the prediction change network includes:
A first training subset is obtained. The first training subset is a non-empty subset of the nerve graft image set.
A second training subset is obtained. The second training subset is a subset of the nerve graft image set that is mutually exclusive with the first training subset.
A training image is obtained. The training image includes a training fusion matrix, a first nerve graft image, and a second nerve graft image.
The first nerve graft image is the nerve graft image in the first training subset whose time point is closest to the current time point.
The second nerve graft image is a randomly selected nerve graft image from the second training subset.
Labeled data are obtained. The labeled data are the time length obtained by subtracting the time point corresponding to the first nerve graft image from the time point corresponding to the second nerve graft image.
The training fusion matrix, the first nerve graft image and the second nerve graft image are input into the nerve graft prediction network for prediction to obtain a training time length.
A loss calculation is performed on the training time length and the labeled data to obtain a loss value.
The time change network and the nerve graft prediction network are trained according to the loss value to obtain a trained prediction change network.
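One training step of the prediction change network might compute its loss as follows. The squared-error loss and the `predict_net` callable are assumptions, since the embodiment does not name a specific loss function.

```python
def prediction_loss(predict_net, fusion_matrix, first_img, second_img,
                    t_first, t_second):
    """Loss for one training example: the label is the second image's time
    point minus the first image's time point; squared error is assumed."""
    label = t_second - t_first
    predicted = predict_net(fusion_matrix, first_img, second_img)
    return (predicted - label) ** 2
```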
Optionally, the method for acquiring the training fusion matrix includes:
Features are extracted through the nerve graft detection network based on the first training subset to obtain a first graft three-dimensional matrix.
A first temporal feature matrix is obtained through the time convolution network based on the first training subset.
The first graft three-dimensional matrix is input into the time change network and time change features are extracted to obtain first graft change features.
The first temporal feature matrix is input into the time change network and time change features are extracted to obtain first time change features.
The first graft change features and the first time change features are fused to obtain the training fusion matrix.
Optionally, the nerve graft growth network is trained by back-propagating the loss value obtained while training the prediction change network, yielding a trained nerve graft growth network.
Because the nerve graft growth network is connected to the output value of the prediction change network, the nerve graft detection network, the time convolution network, the time change network and the nerve graft prediction network can all be trained by back-propagating this loss value.
Optionally, the training method of the nerve graft detection network includes:
A plurality of similar nerve graft image groups and a plurality of dissimilar nerve graft image groups are acquired. A similar nerve graft image group comprises two nerve graft images whose Euclidean distance is less than a threshold; a dissimilar nerve graft image group comprises two nerve graft images whose Euclidean distance is greater than or equal to the threshold.
A similar or dissimilar nerve graft image group is input into the nerve graft detection network to obtain a training detection value. A training detection value of 1 indicates similarity; a training detection value of 0 indicates dissimilarity.
In this embodiment, the threshold is 0.9.
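The pair construction for this Siamese-style training can be sketched as follows. The images are flattened before taking the Euclidean distance; the 0.9 threshold is the one stated in this embodiment and presumably applies to normalized feature vectors rather than raw pixel intensities (an assumption, since the patent does not say):

```python
import numpy as np

def label_image_pair(img_a, img_b, threshold=0.9):
    """Return 1 (similar) if the Euclidean distance between two nerve graft
    images is below the threshold, else 0 (dissimilar)."""
    a = np.asarray(img_a, dtype=float).ravel()
    b = np.asarray(img_b, dtype=float).ravel()
    return 1 if np.linalg.norm(a - b) < threshold else 0

def build_pair_sets(images, threshold=0.9):
    """Split all image pairs into similar and dissimilar groups, as required
    for training the nerve graft detection network."""
    similar, dissimilar = [], []
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            pair = (images[i], images[j])
            (similar if label_image_pair(*pair, threshold) else dissimilar).append(pair)
    return similar, dissimilar
```

The detection network is then trained to reproduce these 0/1 labels from raw image pairs.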
Example 2
Based on the above-mentioned method for culturing the tissue-engineered peripheral nerve graft, the embodiment of the invention also provides a system for culturing the tissue-engineered peripheral nerve graft, which comprises an acquisition module, a detection module, a growth change judgment module and a re-detection module.
The acquisition module is used for acquiring a nerve graft image set. The set comprises a plurality of nerve graft images of the detected cross-section at a plurality of time points during tissue-engineered peripheral nerve graft culture. A current nerve graft image and a labeling nerve graft image are also obtained. The current nerve graft image is the nerve graft image in the set whose time length from the current time point is shorter than that of the other nerve graft images, i.e. the most recently acquired image. The labeling nerve graft image is an image of a nerve graft expected to be capable of undergoing nerve grafting.
The detection module is used for inputting the labeling nerve graft image and the current nerve graft image into the nerve graft detection network to obtain a detection value. The detection value represents the similarity of the current graft to a nerve graft capable of undergoing nerve grafting.
The growth change judging module is used for judging the change in the growth speed of the nerve graft through the nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set, thereby obtaining a prediction time length.
The re-detection module is used for acquiring a new nerve graft image, inputting the new nerve graft image and the labeling nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting. The new nerve graft image is a nerve graft image acquired at the time point obtained by adding the prediction time length to the current time point.
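Taken together, the four modules implement one culture-monitoring cycle. A minimal orchestration sketch, with the trained networks and the imaging step stood in by plain callables (all names here are illustrative, not from the source):

```python
class NerveGraftCultureSystem:
    """Sketch of the acquisition / detection / growth-change-judgment /
    re-detection pipeline. `acquire`, `detector` and `growth_net` are
    assumed callables standing in for the imaging step and the trained
    networks."""

    def __init__(self, acquire, detector, growth_net, ready_threshold=0.5):
        self.acquire = acquire              # time_point -> nerve graft image
        self.detector = detector            # (labeled_img, img) -> detection value
        self.growth_net = growth_net        # (detection, labeled_img, images) -> hours
        self.ready_threshold = ready_threshold

    def run_cycle(self, image_set, labeled_img, current_time):
        current_img = image_set[-1]         # most recently acquired image
        detection = self.detector(labeled_img, current_img)
        predicted = self.growth_net(detection, labeled_img, image_set)
        # Re-detect at the current time plus the predicted time length.
        new_img = self.acquire(current_time + predicted)
        ready = self.detector(labeled_img, new_img) >= self.ready_threshold
        return predicted, ready
```

With stub callables in place of the networks, one cycle returns the predicted wait and a readiness flag, which is exactly the control loop the modules describe.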
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The required structure for such a system is apparent from the description above. In addition, the present invention is not directed to any particular programming language. It will be appreciated that the teachings of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in an apparatus according to embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
Claims (7)
1. A method of culturing a tissue-engineered peripheral nerve graft, comprising:
obtaining a set of nerve graft images; the nerve graft image set comprises a plurality of nerve graft images corresponding to the detected cross sections at a plurality of time points in the tissue engineering peripheral nerve graft culture process;
obtaining a current nerve graft image and a labeling nerve graft image; the current nerve graft image is the nerve graft image in the nerve graft image set whose time length from the current time point is shorter than that of the other nerve graft images; the labeling nerve graft image is an image of a nerve graft expected to be capable of undergoing nerve grafting;
inputting the labeling nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value; the detection value represents the similarity to a nerve graft capable of undergoing nerve grafting;
judging the growth speed change of the nerve graft through a nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set to obtain a prediction time length;
acquiring a new nerve graft image, inputting the new nerve graft image and the labeling nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting; the new nerve graft image is a nerve graft image acquired at the time point obtained by adding the prediction time length to the current time point;
the nerve graft growth network comprises a nerve graft detection network, a time convolution network and a prediction change network;
the prediction change network comprises a time change network and a nerve graft prediction network;
the judging of the growth speed change of the nerve graft through the nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set to obtain the prediction time length comprises the following steps:
inputting the nerve graft image in the nerve graft image set into a nerve graft detection network, extracting features, and obtaining a graft feature matrix;
the plurality of nerve graft images correspondingly yield a plurality of graft feature matrices;
constructing the plurality of graft feature matrices into a graft three-dimensional matrix;
inputting the nerve graft images into a time convolution network in time order from earliest to most recent to obtain a time-feature three-dimensional matrix; the time-feature three-dimensional matrix represents the change features of the nerve graft images, including time information;
obtaining the prediction time length through a prediction change network according to the graft three-dimensional matrix, the time-feature three-dimensional matrix, the labeling nerve graft image and the current nerve graft image in the nerve graft image set;
the training method of the nerve graft detection network comprises the following steps:
acquiring a plurality of similar nerve graft image sets and a plurality of dissimilar nerve graft image sets; the group of similar nerve graft images includes two nerve graft images having a Euclidean distance less than a threshold; the dissimilar nerve graft image group comprises two nerve graft images with Euclidean distance greater than or equal to a threshold value;
inputting the similar nerve graft image group or the dissimilar nerve graft image group into a nerve graft detection network to obtain a detection value; the detection value of 1 indicates similarity; the detection value of 0 indicates dissimilarity.
2. The method according to claim 1, wherein the obtaining of the prediction time length through the prediction change network according to the graft three-dimensional matrix, the time-feature three-dimensional matrix, the labeling nerve graft image and the current nerve graft image in the nerve graft image set comprises:
inputting the graft three-dimensional matrix into a time change network and extracting time change features to obtain graft change features;
inputting the time-feature three-dimensional matrix into the time change network and extracting time change features to obtain time change features;
fusing the graft change features and the time change features to obtain a fusion matrix;
and inputting the fusion matrix, the current nerve graft image and the labeling nerve graft image into a nerve graft prediction network for prediction to obtain the prediction time length.
3. The method according to claim 2, wherein the inputting of the graft three-dimensional matrix into the time change network to extract time change features and obtain the graft change features comprises:
convolving the graft three-dimensional matrix along the time direction with a three-dimensional convolution kernel of size 2×N×N to obtain graft change features; when the nerve graft image set contains M nerve graft images, M-1 graft change features are correspondingly obtained;
and inputting the M-1 graft change features into a three-dimensional convolution network to obtain the graft change features.
4. The method of claim 2, wherein the training of the predictive variability network comprises:
obtaining a first training subset; the first training subset is a non-empty subset of a set of nerve graft images;
obtaining a second training subset; the second training subset is a subset of the set of nerve graft images that is mutually exclusive to the first training subset;
obtaining a training image; the training image includes a training fusion matrix, a first nerve graft image, and a second nerve graft image;
the first nerve graft image is the nerve graft image in the first training subset whose time length from the current time point is shorter than that of the other images in the first training subset;
the second nerve graft image is a random one of the nerve graft images in the second training subset;
obtaining annotation data; the annotation data is the time length obtained by subtracting the time point corresponding to the first nerve graft image from the time point corresponding to the second nerve graft image;
inputting the training fusion matrix, the first nerve graft image and the second nerve graft image into a nerve graft prediction network for prediction to obtain training time length;
carrying out loss calculation on the training time length and the labeling data to obtain a loss value;
and training a time change network and a nerve graft prediction network according to the loss value to obtain a trained prediction change network.
5. The method of claim 4, wherein the training fusion matrix acquisition method comprises:
based on the first training subset, extracting features through a nerve graft detection network to obtain a first graft three-dimensional matrix;
based on the first training subset, obtaining a first time feature matrix through a time convolution network;
inputting the first graft three-dimensional matrix into a time change network and extracting time change features to obtain first graft change features;
inputting the first time feature matrix into the time change network and extracting time change features to obtain first time change features;
and fusing the first graft change features with the first time change features to obtain the training fusion matrix.
6. The method of claim 1, wherein the nerve graft growth network is trained by back-propagating the loss value of the prediction change network, thereby obtaining a trained nerve graft growth network.
7. A culture system for tissue engineered peripheral nerve grafts, comprising:
the acquisition module is used for: obtaining a nerve graft image set; the nerve graft image set comprises a plurality of nerve graft images corresponding to the detected cross sections at a plurality of time points in the tissue engineering peripheral nerve graft culture process; obtaining a current nerve graft image and a labeling nerve graft image; the current nerve graft image is the nerve graft image in the nerve graft image set whose time length from the current time point is shorter than that of the other nerve graft images; the labeling nerve graft image is an image of a nerve graft expected to be capable of undergoing nerve grafting;
a detection module: inputting the labeling nerve graft image and the current nerve graft image into a nerve graft detection network to obtain a detection value; the detection value represents the similarity to a nerve graft capable of undergoing nerve grafting;
a growth change judging module: judging the growth speed change of the nerve graft through a nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set to obtain a prediction time length;
a re-detection module: acquiring a new nerve graft image, inputting the new nerve graft image and the labeling nerve graft image into the nerve graft detection network, and judging whether the nerve graft can undergo nerve grafting; the new nerve graft image is a nerve graft image acquired at the time point obtained by adding the prediction time length to the current time point;
the nerve graft growth network comprises a nerve graft detection network, a time convolution network and a prediction change network;
the prediction change network comprises a time change network and a nerve graft prediction network;
the judging of the growth speed change of the nerve graft through the nerve graft growth network according to the detection value, the labeling nerve graft image and the nerve graft image set to obtain the prediction time length comprises the following steps:
inputting the nerve graft image in the nerve graft image set into a nerve graft detection network, extracting features, and obtaining a graft feature matrix;
the plurality of nerve graft images correspondingly yield a plurality of graft feature matrices;
constructing the plurality of graft feature matrices into a graft three-dimensional matrix;
inputting the nerve graft images into a time convolution network in time order from earliest to most recent to obtain a time-feature three-dimensional matrix; the time-feature three-dimensional matrix represents the change features of the nerve graft images, including time information;
obtaining the prediction time length through a prediction change network according to the graft three-dimensional matrix, the time-feature three-dimensional matrix, the labeling nerve graft image and the current nerve graft image in the nerve graft image set;
the training method of the nerve graft detection network comprises the following steps:
acquiring a plurality of similar nerve graft image sets and a plurality of dissimilar nerve graft image sets; the group of similar nerve graft images includes two nerve graft images having a Euclidean distance less than a threshold; the dissimilar nerve graft image group comprises two nerve graft images with Euclidean distance greater than or equal to a threshold value;
inputting the similar nerve graft image group or the dissimilar nerve graft image group into a nerve graft detection network to obtain a detection value; the detection value of 1 indicates similarity; the detection value of 0 indicates dissimilarity.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410086454.9A CN117611584B (en) | 2024-01-22 | 2024-01-22 | Tissue engineering peripheral nerve graft culture method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117611584A CN117611584A (en) | 2024-02-27 |
CN117611584B true CN117611584B (en) | 2024-04-12 |
Family
ID=89944674
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104623738A (en) * | 2015-01-12 | 2015-05-20 | 南通大学 | Tissue engineering nerve graft with suspension fiber scaffold and preparation method thereof |
CN110507857A (en) * | 2019-08-30 | 2019-11-29 | 江南大学 | A kind of engineered nerve graft and preparation method thereof |
CN114288478A (en) * | 2021-12-24 | 2022-04-08 | 南通大学 | Tissue engineering nerve complex and preparation method and application thereof |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2536174B (en) * | 2013-12-17 | 2020-12-16 | Dtherapeutics Llc | Devices, systems and methods for tissue engineering of luminal grafts |
CA3179113A1 (en) * | 2020-06-03 | 2021-12-09 | Paul Holzer | Selection and monitoring methods for xenotransplantation |
Non-Patent Citations (1)
Title |
---|
Construction of tissue engineered nerve grafts and their application in peripheral nerve regeneration; Xiaosong Gu et al.; Progress in Neurobiology; 2012-02-29; pp. 204-230 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||