CN116630862A - Intelligent preparation method and system of animal embryo extract - Google Patents

Intelligent preparation method and system of animal embryo extract

Info

Publication number
CN116630862A
CN116630862A (application CN202310662939.3A)
Authority
CN
China
Prior art keywords
centrifugal
feature
centrifugal state
feature vector
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310662939.3A
Other languages
Chinese (zh)
Inventor
潘力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Kaixin Advertising Trading Co ltd
Original Assignee
Hunan Kaixin Advertising Trading Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Kaixin Advertising Trading Co ltd
Priority to CN202310662939.3A
Publication of CN116630862A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61K PREPARATIONS FOR MEDICAL, DENTAL OR TOILETRY PURPOSES
    • A61K 35/00 Medicinal preparations containing materials or reaction products thereof with undetermined constitution
    • A61K 35/12 Materials from mammals; Compositions comprising non-specified tissues or cells; Compositions comprising non-embryonic stem cells; Genetically modified cells
    • A61K 35/48 Reproductive organs
    • A61K 35/50 Placenta; Placental stem cells; Amniotic fluid; Amnion; Amniotic stem cells
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V 10/765 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/70 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in livestock or poultry

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Developmental Biology & Embryology (AREA)
  • Cell Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Immunology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Epidemiology (AREA)
  • Pharmacology & Pharmacy (AREA)
  • Medicinal Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Zoology (AREA)
  • Virology (AREA)
  • Biotechnology (AREA)
  • Reproductive Health (AREA)
  • Pregnancy & Childbirth (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to the field of intelligent preparation, and particularly discloses an intelligent preparation method and an intelligent preparation system of animal embryo extract.

Description

Intelligent preparation method and system of animal embryo extract
Technical Field
The application relates to the field of intelligent preparation, and in particular relates to an intelligent preparation method and system of animal embryo extract.
Background
Animal embryo extract is a hormone extracted from animal embryo tissue that can promote cell division and growth, and it has important application value in the biotechnology and medical fields.
In the conventional process of preparing animal embryo extract, a large proportion of the bioactive substances are destroyed, greatly reducing the beneficial biological effects. An improved solution is therefore desired.
Disclosure of Invention
The present application has been made to solve the above-mentioned technical problems. The embodiment of the application provides an intelligent preparation method and an intelligent preparation system of animal embryo extract, which realize automatic control of centrifuge operation based on a centrifugal monitoring video by adopting deep learning and artificial intelligence technology, thereby avoiding subjectivity and inaccuracy of manual judgment in the traditional control process, improving purity and yield of hydrolysate, saving energy and reducing cost.
According to one aspect of the present application, there is provided an intelligent preparation method of animal embryo extract, comprising: acquiring a centrifugal monitoring video of a predetermined time period captured by a camera; extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; passing the plurality of centrifugal state monitoring key frames respectively through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices; respectively expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining a centrifugal state time sequence associated feature vector through a transformer-based centrifugal state associated feature extractor; calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector composed of a plurality of Euclidean distance values; fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector to obtain a classification feature vector; and passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether to stop the operation of the centrifuge.
In the above method for intelligently preparing animal embryo extract, extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video comprises: sampling the centrifugal monitoring video at a predetermined sampling frequency to obtain the plurality of centrifugal state monitoring key frames.
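The fixed-frequency sampling step above can be sketched as follows. The frame count, sampling interval, and helper name are illustrative assumptions, since the patent does not specify concrete values.

```python
def sample_key_frames(frames, sampling_interval):
    """Keep every `sampling_interval`-th frame as a centrifugal state monitoring key frame."""
    return frames[::sampling_interval]

# stand-in for 100 decoded frames of the centrifugal monitoring video
video_frames = list(range(100))
key_frames = sample_key_frames(video_frames, sampling_interval=10)
# → frames 0, 10, 20, ..., 90
```

In practice the interval would be chosen so the key frames still cover every visible change in the hydrolysate's state while discarding the redundant near-duplicate frames the description mentions.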
In the above method for intelligently preparing animal embryo extract, passing the plurality of centrifugal state monitoring key frames through a convolutional neural network model including a depth feature fusion module to obtain a plurality of centrifugal state feature matrices comprises: extracting a shallow feature matrix from a shallow layer of the convolutional neural network model; extracting a deep feature matrix from a deep layer of the convolutional neural network model; and fusing the shallow feature matrix and the deep feature matrix to obtain the centrifugal state feature matrix; wherein the ratio between the indices of the deep layer and the shallow layer is greater than or equal to 5 and less than or equal to 10.
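A minimal numpy sketch of the shallow/deep fusion idea: a feature map is taken from an early layer and from a late layer whose index ratio falls in the stated [5, 10] range, then the two are fused position-wise. The 3x3 mean filter standing in for a convolution layer, the layer indices, and the equal fusion weights are all illustrative assumptions, not the patent's actual network.

```python
import numpy as np

def mean_filter(x):
    """3x3 mean filter with edge padding, a stand-in for one conv layer."""
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

frame = np.arange(64, dtype=float).reshape(8, 8)   # stand-in key frame
shallow_idx, deep_idx = 1, 6                        # deep/shallow ratio = 6, within [5, 10]

feat = frame
for layer in range(1, deep_idx + 1):
    feat = mean_filter(feat)
    if layer == shallow_idx:
        shallow = feat.copy()                       # shallow feature matrix
deep = feat                                         # deep feature matrix

state_matrix = 0.5 * shallow + 0.5 * deep           # position-wise depth feature fusion
```

The point of the fusion is visible even in this toy: `shallow` still tracks local texture of the frame, while `deep` is heavily smoothed, and the fused matrix carries both.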
In the above method for intelligently preparing animal embryo extract, respectively expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining the centrifugal state time sequence associated feature vector through a transformer-based centrifugal state associated feature extractor comprises: one-dimensionally arranging the plurality of centrifugal state feature vectors to obtain a global centrifugal state feature vector; calculating the product between the global centrifugal state feature vector and the transpose vector of each centrifugal state feature vector in the plurality of centrifugal state feature vectors to obtain a plurality of self-attention correlation matrices; respectively normalizing each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of normalized self-attention correlation matrices; passing each normalized self-attention correlation matrix through a Softmax classification function to obtain a plurality of probability values; weighting each centrifugal state feature vector in the plurality of centrifugal state feature vectors by the corresponding probability value to obtain a plurality of context semantic centrifugal state feature vectors; and concatenating the plurality of context semantic centrifugal state feature vectors to obtain the centrifugal state time sequence associated feature vector.
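The transformer-style global association step can be sketched with plain dot-product attention. The tiny vector sizes, the slice-wise scoring against the global vector, and the scaled Softmax are illustrative assumptions standing in for the patent's extractor.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# unfolded centrifugal state feature matrices (three frames, 4-dim each)
state_vectors = [np.ones(4) * k for k in (1.0, 2.0, 3.0)]
global_vec = np.concatenate(state_vectors)          # one-dimensional arrangement

# score each frame vector against its slice of the global vector,
# normalise the scores into attention weights, then weight and concatenate
scores = np.array([v @ global_vec[i * 4:(i + 1) * 4]
                   for i, v in enumerate(state_vectors)])
weights = softmax(scores / np.sqrt(4))              # scaled, Softmax-normalised
context = [w * v for w, v in zip(weights, state_vectors)]
timing_feature = np.concatenate(context)            # time sequence associated vector
```

Frames whose state dominates the global context end up with larger weights, which is how the extractor surfaces the temporal variation pattern of the centrifugation process.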
In the intelligent preparation method of animal embryo extract, calculating the Euclidean distance value between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain the centrifugal state neighborhood time sequence associated feature vector composed of a plurality of Euclidean distance values comprises: performing matrix expansion on the plurality of centrifugal state feature matrices to obtain a plurality of centrifugal state feature vectors; calculating the Euclidean distance value between every two adjacent centrifugal state feature vectors in the plurality of centrifugal state feature vectors with the following distance formula: $d(V_a, V_b) = \sqrt{\sum_{i=1}^{n} (v_{a,i} - v_{b,i})^2}$, wherein $v_{a,i}$ and $v_{b,i}$ respectively represent the feature values at each position of any two adjacent centrifugal state feature vectors $V_a$ and $V_b$, $n$ represents the dimension of the feature vectors, and $d(V_a, V_b)$ represents the Euclidean distance value between the two adjacent centrifugal state feature vectors; and one-dimensionally arranging the plurality of Euclidean distance values to obtain the centrifugal state neighborhood time sequence associated feature vector.
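The adjacent-frame distance step maps directly to code. The matrix sizes and values below are illustrative; the computation itself follows the distance formula as stated.

```python
import numpy as np

def neighborhood_vector(state_matrices):
    """Flatten each state matrix, then take Euclidean distances between consecutive vectors."""
    vecs = [m.ravel() for m in state_matrices]
    return np.array([np.linalg.norm(a - b) for a, b in zip(vecs, vecs[1:])])

# three 2x2 centrifugal state feature matrices (toy values)
mats = [np.full((2, 2), k, dtype=float) for k in (0.0, 3.0, 7.0)]
dists = neighborhood_vector(mats)
# → [6.0, 8.0]: sqrt(4 * 3^2) between frames 0-1, sqrt(4 * 4^2) between frames 1-2
```

A sharp jump in these distances signals that the hydrolysate's appearance changed markedly between consecutive key frames, which is the spatial-change cue the neighborhood vector is meant to capture.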
In the above method for intelligently preparing animal embryo extract, fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector to obtain the classification feature vector comprises: fusing the centrifugal state neighborhood time sequence associated feature vector $V_1$ and the centrifugal state time sequence associated feature vector $V_2$ in a transformer-like spatial migration permutation fusion manner to obtain the classification feature vector $V_c$; wherein the fusion is defined in terms of a distance matrix $D$ between the two vectors, in which the feature value $d_{ij}$ at position $(i, j)$ is the Euclidean distance between the $i$-th feature value of $V_1$ and the $j$-th feature value of $V_2$; a mask threshold hyperparameter $\delta$ applied to $D$; a Softmax function; position-wise addition, subtraction and multiplication of feature vectors; and matrix multiplication; and wherein both $V_1$ and $V_2$ are row vectors.
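A very loose sketch of a masked, distance-based fusion built from the components the text enumerates (pairwise Euclidean distance matrix, mask threshold, Softmax weights, position-wise combination). The patent's exact formula is not fully recoverable from the source, so the masking rule, the Softmax axis, and the final additive combination below are all assumptions.

```python
import numpy as np

def fuse(v1, v2, delta=10.0):
    """Fuse two row vectors via a masked distance matrix and Softmax weights (assumed form)."""
    d = np.abs(v1[:, None] - v2[None, :])           # d_ij = distance between v1_i and v2_j
    masked = np.where(d <= delta, -d, -np.inf)      # mask out pairs farther than delta
    w = np.exp(masked - masked.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)            # row-wise Softmax weights
    return v1 + w @ v2                              # position-wise combination (assumed)

v_neighborhood = np.array([1.0, 2.0, 3.0])          # neighborhood time sequence vector
v_timing = np.array([1.5, 2.5, 3.5])                # time sequence associated vector
fused = fuse(v_neighborhood, v_timing)
```

The intent mirrored here is that each position of the fused vector draws mostly on the nearby (small-distance) positions of the other vector, which is what gives the fusion its tolerance to the spatial migration between the two feature distributions.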
In the above method for intelligently preparing animal embryo extract, passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether to stop the operation of the centrifuge, comprises: performing full-connection coding on the classification feature vector by using a plurality of fully connected layers of the classifier to obtain a coded classification feature vector; and passing the coded classification feature vector through a Softmax classification function of the classifier to obtain the classification result.
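The two-label classification head described above can be sketched as a small fully connected network followed by a Softmax over the "stop" / "keep running" labels. The fixed random weights, layer sizes, and label names are illustrative stand-ins for trained parameters.

```python
import numpy as np

def classify(feature, w1, b1, w2, b2):
    """Fully connected encoding of the classification feature vector, then 2-way Softmax."""
    hidden = np.maximum(0.0, feature @ w1 + b1)     # fully connected layer + ReLU
    logits = hidden @ w2 + b2                       # 2-way output layer
    e = np.exp(logits - logits.max())
    probs = e / e.sum()                             # Softmax probabilities
    return ("stop", "run")[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
feature = rng.normal(size=6)                        # stand-in fused classification vector
w1, b1 = rng.normal(size=(6, 4)), np.zeros(4)
w2, b2 = rng.normal(size=(4, 2)), np.zeros(2)
label, probs = classify(feature, w1, b1, w2, b2)
```

In the described system the argmax label would be fed back to the centrifuge controller, closing the loop between the monitoring video and the stop decision.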
According to another aspect of the present application, there is provided an intelligent preparation system of animal embryo extract, comprising: a monitoring video acquisition module for acquiring a centrifugal monitoring video of a predetermined time period captured by a camera; a sampling module for extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; a centrifugal state feature extraction module for passing the plurality of centrifugal state monitoring key frames respectively through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices; a centrifugal state associated feature extraction module for respectively expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining a centrifugal state time sequence associated feature vector through a transformer-based centrifugal state associated feature extractor; a Euclidean distance calculation module for calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector composed of the plurality of Euclidean distance values; a fusion module for fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector to obtain a classification feature vector; and
a classification result generation module for passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether to stop the operation of the centrifuge.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions that, when executed by the processor, cause the processor to perform the method of intelligent preparation of animal embryo extract as described above.
According to a further aspect of the present application there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method for intelligent preparation of animal embryo extract as described above.
Compared with the prior art, the intelligent preparation method and system for animal embryo extract provided by the application use deep learning and artificial intelligence to automatically control centrifuge operation based on a centrifugal monitoring video, thereby avoiding the subjectivity and inaccuracy of manual judgment in the traditional control process, improving the purity and yield of the hydrolysate, saving energy, and reducing cost.
Drawings
The above and other objects, features and advantages of the present application will become more apparent from the following detailed description of embodiments of the present application with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification; they illustrate the application together with its embodiments and do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 is a schematic view of a scenario of an intelligent preparation method of animal embryo extract according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for intelligently preparing animal embryo extract according to an embodiment of the present application;
FIG. 3 is a system architecture diagram of an intelligent preparation method of animal embryo extract according to an embodiment of the present application;
FIG. 4 is a flow chart of convolutional neural network coding in an intelligent preparation method of animal embryo extract according to an embodiment of the present application;
FIG. 5 is a flowchart of the centrifugal state associated feature extraction process in an intelligent preparation method of animal embryo extract according to an embodiment of the present application;
FIG. 6 is a block diagram of an intelligent preparation system for animal embryo extract in accordance with an embodiment of the present application;
FIG. 7 is a block diagram of an electronic device according to an embodiment of the application.
Detailed Description
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Summary of the application: the application provides a method for hydrolyzing placenta, which comprises the following specific steps. Step 1: weigh fresh or frozen placenta, remove the fascia and large blood vessels in a clean, sterile environment, and wash it in sterile ice water at 4-10 °C to obtain a washed placenta. Step 2: cut the washed placenta obtained in step 1 into small pieces and mince them in a triturator to obtain minced placenta. Step 3: to the minced placenta obtained in step 2, add an equal weight of small-molecule-cluster water with pH greater than 9.6; place the mixture in a sealed sterile stainless steel vessel, quick-freeze it overnight at -20 °C or below, then thaw it completely at 3-5 °C, and repeat the quick-freeze/thaw cycle several times to obtain a hydrolysate. Step 4: centrifuge the hydrolysate obtained in step 3 for 30-60 minutes to obtain a supernatant and a precipitate, with the centrifuge running at a speed of at least 3500 RPM. Step 5: filter the supernatant obtained in step 4 through qualitative filter paper to obtain a filtrate; pass the filtrate through a 10 kDa ultrafiltration column to obtain a permeate with molecular weight below 10 kDa; then pass the permeate through a 1 kDa ultrafiltration column to obtain an ultrafiltrate with molecular weight of 1-10 kDa. Step 6: vacuum-dry the precipitate obtained in step 4 at 56-65 °C and grind it to a superfine powder to obtain a dried placenta product, which is the animal embryo extract.
In this way, the method avoids the contaminants and other harmful factors introduced by the strong-acid and strong-alkali hydrolysis of the prior art; at the same time, the repeated freeze-thaw cycling between -20 °C and about 4 °C avoids the damage that high temperature causes to bioactive substances, preserving the activity of the biomolecules in the placenta to the maximum extent.
In step 4 above, determining when to stop the centrifuge is a critical technical problem. Stopping the centrifuge too early affects the purity and yield of the hydrolysate and thus the quality and effect of the subsequent steps, while stopping it too late increases unnecessary energy consumption. In the conventional control process, however, the decision of when to stop the centrifuge is made manually, and because such judgment is highly subjective, the result may be inaccurate.
Aiming at the problems, the technical concept of the application is to realize automatic control of the operation of the centrifugal machine based on centrifugal monitoring video by utilizing deep learning and artificial intelligence technology, thereby avoiding subjectivity and inaccuracy of manual judgment in the traditional control process, improving the purity and yield of hydrolysate, saving energy and reducing cost.
Specifically, in the technical scheme of the application, a centrifugal monitoring video of a predetermined time period captured by a camera is first acquired. By using a camera to collect the centrifugal monitoring video, the centrifugation process can be monitored in real time, and the state of the hydrolysate can be observed from the monitoring video.
Given the large amount of redundant information in the centrifugal monitoring video, such as many similar and repeated frames, this redundancy would greatly increase the computational load of the network model. Therefore, in the technical scheme of the application, a plurality of centrifugal state monitoring key frames are extracted from the centrifugal monitoring video to improve computational efficiency.
Then, the plurality of centrifugal state monitoring key frames are respectively passed through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices. Through the convolutional neural network model, deep feature extraction can be performed on each centrifugal state monitoring key frame, covering both shallow features such as color, texture and shape and deep features that capture the essence of the centrifugal state. Meanwhile, by introducing the depth feature fusion module, features of different layers (namely, shallow features and deep features) can be fused, improving the characterization capability and discriminability of the features.
The state of the hydrolysate during centrifugation presents a complex, dynamically varying distribution in the time dimension; that is, during centrifugation, the centrifugal state changes continuously over time. In the technical scheme of the application, after the plurality of centrifugal state feature matrices are respectively expanded into centrifugal state feature vectors, the centrifugal state time sequence associated feature vector is obtained through a transformer-based centrifugal state associated feature extractor, which performs global semantic association on the plurality of centrifugal state feature vectors. The resulting time sequence associated feature vector reflects the temporal variation pattern of the centrifugation process and provides strong support for optimizing and controlling it.
And then, calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector consisting of the plurality of Euclidean distance values. That is, the degree of change in the centrifugal state in space can be reflected by calculating the euclidean distance value between the adjacent two centrifugal state feature matrices. The Euclidean distance values can form a centrifugal state neighborhood time sequence associated feature vector, and the change rule of the centrifugal state in space is reflected.
As described above, the centrifugal state neighborhood time sequence associated feature vector reflects the spatial association features of the centrifugal state, while the centrifugal state time sequence associated feature vector reflects its temporal association features. The two vectors are therefore fused to obtain a classification feature vector that captures both aspects.
Further, the classification feature vector is passed through a classifier to obtain a classification result, the classification result being used to indicate whether to stop the centrifuge. The classifier maps different input features to different output labels, realizing classification and identification of samples. Specifically, with the classification feature vector as input, the trained classifier can judge the state of the current hydrolysate based on historical data and give corresponding feedback guidance for preparation and production. More specifically, the classifier has two classification labels, namely "stop the operation of the centrifuge" and "do not stop the operation of the centrifuge", representing the current operation adjustment strategy of the centrifuge. Thus, automatic control of the centrifuge can be realized according to the classification result during preparation and production.
Here, when the classification feature vector is obtained by fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector, it should be noted that the centrifugal state time sequence associated feature vector expresses the contextually associated features of the depth-fused image feature semantics of the plurality of centrifugal state monitoring key frames, whereas the centrifugal state neighborhood time sequence associated feature vector is obtained by calculating the Euclidean distance value between every two adjacent centrifugal state feature matrices. As a result, the overall feature distribution of the centrifugal state neighborhood time sequence associated feature vector exhibits a spatial migration in the high-dimensional feature space relative to that of the centrifugal state time sequence associated feature vector. Therefore, if the fusion of the two vectors under this spatial migration can be improved, the feature expression effect of the classification feature vector can be improved as well.
Accordingly, the applicant adopts a transformer-like spatial migration permutation fusion to fuse the centrifugal state neighborhood time sequence associated feature vector, denoted $V_1$, and the centrifugal state time sequence associated feature vector, denoted $V_2$. Specifically, a distance matrix $D$ is computed between the two vectors, wherein the feature value $d_{ij}$ at position $(i, j)$ of $D$ is the Euclidean distance between the $i$-th feature value of $V_1$ and the $j$-th feature value of $V_2$; $\delta$ is a mask threshold hyperparameter, and both vectors are row vectors.
Here, the transformer-like spatial migration permutation fusion performs mask prediction, in the manner of a transformer mechanism, on the spatial distances between pairs of feature values of $V_1$ and $V_2$ through the differential characterization of those feature value pairs. This realizes an edge-affine encoding of the classification feature vector in the high-dimensional feature space and keeps the classification feature vector invariant under global rotation and translation of $V_1$ and $V_2$ within the transformer mechanism, thereby achieving the spatial migration permutability of the feature distributions of $V_1$ and $V_2$. In this way, the fusion effect of the classification feature vector on the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector is improved, and so is its feature expression effect.
Based on the above, the application provides an intelligent preparation method of animal embryo extract, which comprises the following steps: acquiring a centrifugal monitoring video of a preset time period acquired by a camera; extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; the plurality of centrifugal state monitoring key frames are respectively passed through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrixes; respectively expanding the plurality of centrifugal state feature matrixes into centrifugal state feature vectors, and then obtaining centrifugal state time sequence associated feature vectors through a centrifugal state associated feature extractor based on a converter; calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector consisting of a plurality of Euclidean distance values; fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector to obtain a classification feature vector; and passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the centrifuge stops running or not.
Fig. 1 is a schematic view of a scenario of an intelligent preparation method of animal embryo extract according to an embodiment of the present application. As shown in fig. 1, in this application scenario, a centrifugal monitoring video of a predetermined period of time is acquired by a camera (e.g., C as illustrated in fig. 1). The monitoring video is then input to a server (e.g., S in fig. 1) deployed with an intelligent preparation algorithm for animal embryo extract, where the server processes the input video with that algorithm to generate a classification result indicating whether to stop the operation of the centrifuge.
Having described the basic principles of the present application, various non-limiting embodiments of the present application will now be described in detail with reference to the accompanying drawings.
Exemplary Method: Fig. 2 is a flow chart of an intelligent preparation method of animal embryo extract according to an embodiment of the present application. As shown in fig. 2, the intelligent preparation method of animal embryo extract according to the embodiment of the application comprises the following steps: S110, acquiring a centrifugal monitoring video of a predetermined period of time acquired by a camera; S120, extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; S130, passing the plurality of centrifugal state monitoring key frames respectively through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices; S140, respectively expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining the centrifugal state time-sequence associated feature vector through a converter-based centrifugal state associated feature extractor; S150, calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time-sequence associated feature vector composed of the plurality of Euclidean distance values; S160, fusing the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector to obtain a classification feature vector; and S170, passing the classification feature vector through a classifier to obtain a classification result, the classification result indicating whether to stop the operation of the centrifuge.
Fig. 3 is a system architecture diagram of an intelligent preparation method of animal embryo extract according to an embodiment of the present application. As shown in fig. 3, in the network structure, first, a centrifugal monitoring video of a predetermined period of time acquired by a camera is acquired; then, a plurality of centrifugal state monitoring key frames are extracted from the centrifugal monitoring video; the plurality of centrifugal state monitoring key frames are respectively passed through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices; the plurality of centrifugal state feature matrices are respectively expanded into centrifugal state feature vectors, which are then passed through a converter-based centrifugal state associated feature extractor to obtain the centrifugal state time-sequence associated feature vector; next, Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices are calculated to obtain a centrifugal state neighborhood time-sequence associated feature vector composed of the plurality of Euclidean distance values; the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector are fused to obtain a classification feature vector; further, the classification feature vector is passed through a classifier to obtain a classification result indicating whether the centrifuge should be stopped.
More specifically, in step S110, a centrifugal monitoring video of a predetermined period of time acquired by a camera is acquired. In the technical scheme of the application, the centrifugal process needs to be monitored in real time so that the operation of the centrifuge can be controlled automatically based on the centrifugal monitoring video; this avoids the subjectivity and inaccuracy of manual judgment in the traditional control process and improves the purity and yield of the hydrolysate. In a specific example, the centrifugal process can be monitored in real time by collecting the centrifugal monitoring video with the camera, and the state of the hydrolysate can be read from the discretely sampled monitoring video.
More specifically, in step S120, a plurality of centrifugal state monitoring key frames are extracted from the centrifugal monitoring video. Given the large amount of redundant information in the centrifugal monitoring video, such as many similar and repeated frames, this redundancy would greatly increase the computational load of the network model. Therefore, in the technical scheme of the application, a plurality of centrifugal state monitoring key frames are extracted from the centrifugal monitoring video to improve computational efficiency. More specifically, the centrifugal monitoring video is sampled at a predetermined sampling frequency to obtain the plurality of centrifugal state monitoring key frames.
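As a rough sketch of this sampling step, the key-frame indices for a fixed-rate sampler can be computed as below; the function name, frame rate and sampling frequency are illustrative assumptions, not values given in the application.

```python
# Sketch of key-frame selection by uniform sampling; the frame rate and
# sampling frequency below are assumed values for illustration.
def keyframe_indices(total_frames: int, fps: float, sample_hz: float) -> list:
    """Return frame indices sampled at `sample_hz` key frames per second
    from a clip of `total_frames` frames recorded at `fps`."""
    step = max(1, round(fps / sample_hz))   # frames between two key frames
    return list(range(0, total_frames, step))

# A 10-second clip at 30 fps, sampled at 2 key frames per second:
idx = keyframe_indices(total_frames=300, fps=30.0, sample_hz=2.0)
print(len(idx), idx[:4])   # 20 key frames: 0, 15, 30, 45, ...
```

The selected indices would then be used to pull the corresponding frames from the decoded video stream.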
More specifically, in step S130, the plurality of centrifugal state monitoring key frames are respectively passed through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices. That is, the convolutional neural network model comprising the depth feature fusion module is used to perform feature extraction on the plurality of centrifugal state monitoring key frames; in one example, it can extract both shallow features such as color, texture and shape, and deep features related to the essence of the centrifugal state. Meanwhile, by introducing the depth feature fusion module, features of different layers (i.e., shallow features and deep features) can be fused, improving the characterization capability and discriminability of the features. More specifically, the convolutional neural network comprises a plurality of mutually cascaded neural network layers, each comprising a convolution layer, a pooling layer and an activation layer. In the encoding process, each layer of the convolutional neural network, in its forward pass, performs kernel-based convolution on the input data with the convolution layer, pools the resulting convolutional feature map with the pooling layer, and applies the activation layer to the pooled feature map.
Fig. 4 is a flowchart of convolutional neural network coding in an intelligent preparation method of animal embryo extract according to an embodiment of the present application. As shown in fig. 4, in the convolutional neural network coding process, it includes: s210, extracting a shallow feature matrix from a shallow layer of the convolutional neural network model; s220, extracting a deep feature matrix from the deep layer of the convolutional neural network model; s230, fusing the shallow feature matrix and the deep feature matrix to obtain the centrifugal state feature matrix; wherein the ratio between the deep layer and the shallow layer is more than or equal to 5 and less than or equal to 10.
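The shallow/deep fusion of steps S210-S230 can be sketched as follows. The application does not specify the fusion operator, so nearest-neighbour upsampling followed by channel averaging is an illustrative assumption, as are all shapes and names.

```python
import numpy as np

def fuse_shallow_deep(shallow: np.ndarray, deep: np.ndarray) -> np.ndarray:
    """Fuse a shallow feature map (C1, H, W) with a deep one (C2, h, w):
    nearest-neighbour upsample the deep map to spatial size (H, W),
    concatenate along channels, then average over channels to obtain
    one (H, W) centrifugal state feature matrix."""
    C1, H, W = shallow.shape
    C2, h, w = deep.shape
    up = deep.repeat(H // h, axis=1).repeat(W // w, axis=2)  # assumes H % h == 0, W % w == 0
    both = np.concatenate([shallow, up], axis=0)             # (C1 + C2, H, W)
    return both.mean(axis=0)                                  # (H, W)

shallow = np.ones((8, 16, 16))        # e.g. texture/shape features
deep = np.full((32, 4, 4), 3.0)       # e.g. semantic features
M = fuse_shallow_deep(shallow, deep)
print(M.shape, M[0, 0])               # (16, 16) 2.6
```

In a trained network the channel averaging would typically be replaced by a learned 1x1 convolution.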
More specifically, in step S140, the plurality of centrifugal state feature matrices are respectively expanded into centrifugal state feature vectors, which are then passed through a converter-based centrifugal state associated feature extractor to obtain the centrifugal state time-sequence associated feature vector. The state of the hydrolysate during centrifugation presents a complex, dynamically varying distribution in the time dimension; that is, the centrifugal state changes continuously as time passes. Therefore, in the technical scheme of the application, after the plurality of centrifugal state feature matrices are respectively expanded into centrifugal state feature vectors, the centrifugal state time-sequence associated feature vector is obtained through the converter-based centrifugal state associated feature extractor. Here, the converter-based centrifugal state associated feature extractor performs global semantic association on the plurality of centrifugal state feature vectors to extract the centrifugal state time-sequence associated feature vector, which reflects the temporal variation law of the centrifugation process and provides strong support for its optimization and control.
Fig. 5 is a flowchart of the centrifugal state associated feature extraction process in the intelligent preparation method of animal embryo extract according to an embodiment of the present application. As shown in fig. 5, each layer of the converter-based centrifugal state associated feature extractor performs the following processing on its input data in the forward pass of that layer: S310, one-dimensionally arranging the plurality of centrifugal state feature vectors to obtain a global centrifugal state feature vector; S320, calculating the product between the global centrifugal state feature vector and the transpose vector of each centrifugal state feature vector in the plurality of centrifugal state feature vectors to obtain a plurality of self-attention association matrices; S330, respectively normalizing each self-attention association matrix in the plurality of self-attention association matrices to obtain a plurality of normalized self-attention association matrices; S340, passing each normalized self-attention association matrix through a Softmax classification function to obtain a plurality of probability values; S350, weighting each centrifugal state feature vector by the corresponding probability value to obtain a plurality of context semantic centrifugal state feature vectors; and S360, cascading the plurality of context semantic centrifugal state feature vectors to obtain the centrifugal state time-sequence associated feature vector.
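Steps S310-S360 can be sketched in plain NumPy as below. The application does not state how each normalized association matrix is reduced to a single probability value before the softmax, so the mean-then-softmax reduction here is an assumption, as are all names.

```python
import numpy as np

def timing_associated_vector(frames: list) -> np.ndarray:
    # S310: concatenate the per-frame state vectors into one global vector
    g = np.concatenate(frames)
    # S320: one self-attention association matrix per frame (outer product)
    mats = [np.outer(g, f) for f in frames]
    # S330: z-score normalisation of each association matrix
    norm = [(m - m.mean()) / (m.std() + 1e-8) for m in mats]
    # S340: reduce each matrix to a score (assumed: mean), softmax across frames
    scores = np.array([m.mean() for m in norm])
    p = np.exp(scores) / np.exp(scores).sum()
    # S350: weight each frame vector by its probability value
    weighted = [pi * f for pi, f in zip(p, frames)]
    # S360: cascade the context vectors into the timing-associated vector
    return np.concatenate(weighted)

frames = [np.ones(4), 2 * np.ones(4), 3 * np.ones(4)]
v = timing_associated_vector(frames)
print(v.shape)   # (12,)
```

A production transformer would of course use learned query/key/value projections; this sketch only mirrors the steps as enumerated.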
More specifically, in step S150, Euclidean distance values between every two adjacent centrifugal state feature matrices among the plurality of centrifugal state feature matrices are calculated to obtain a centrifugal state neighborhood time-sequence associated feature vector composed of the plurality of Euclidean distance values. The Euclidean distance between two adjacent centrifugal state feature matrices reflects the degree of spatial change of the centrifugal state, so the vector composed of these distance values captures the spatial variation law of the centrifugal state over time.
In a specific example of the present application, the plurality of centrifugal state feature matrices are first expanded into a plurality of centrifugal state feature vectors by matrix reconstruction, and then the Euclidean distance value between every two adjacent centrifugal state feature vectors in the plurality of centrifugal state feature vectors is calculated with the following distance formula: d(Va, Vb) = sqrt(sum_{i=1}^{n} (v_{a,i} - v_{b,i})^2), wherein v_{a,i} and v_{b,i} respectively denote the feature values at corresponding positions of any two adjacent centrifugal state feature vectors, n denotes the dimension of the feature vectors, and d(Va, Vb) denotes the Euclidean distance value between the two adjacent centrifugal state feature vectors. After the plurality of Euclidean distance values are obtained, they are one-dimensionally arranged to obtain the centrifugal state neighborhood time-sequence associated feature vector.
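A minimal numerical sketch of the distance formula above, flattening each state matrix and taking the Euclidean distance between every two adjacent ones:

```python
import numpy as np

def neighborhood_vector(mats: list) -> np.ndarray:
    """Flatten each centrifugal state matrix and take the Euclidean
    distance between every two adjacent ones, yielding the
    (n - 1)-dimensional neighborhood time-sequence associated vector."""
    flat = [m.ravel() for m in mats]
    return np.array([np.linalg.norm(a - b) for a, b in zip(flat, flat[1:])])

mats = [np.zeros((2, 2)), np.ones((2, 2)), 3 * np.ones((2, 2))]
d = neighborhood_vector(mats)
print(d)   # distances 2.0 and 4.0 between the adjacent flattened matrices
```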
More specifically, in step S160, the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector are fused to obtain a classification feature vector. In the technical scheme of the application, fusing the two vectors describes the feature information of the centrifugation process more comprehensively: the centrifugal state time-sequence associated feature vector carries the contextual association of the depth-fused image feature semantics of the centrifugal state monitoring key frames, whereas the centrifugal state neighborhood time-sequence associated feature vector is obtained by calculating the Euclidean distance value between every two adjacent centrifugal state feature matrices, so its overall feature distribution exhibits a spatial migration in the high-dimensional feature space relative to that of the centrifugal state time-sequence associated feature vector. Therefore, if the fusion effect of the two vectors under this spatial migration can be improved, the feature expression effect of the classification feature vector can be improved as well.
Accordingly, the applicant of the present application adopts class-transformer spatial migration permutation fusion to fuse the centrifugal state neighborhood time-sequence associated feature vector, denoted V1, and the centrifugal state time-sequence associated feature vector, denoted V2, into the classification feature vector Vc; wherein v1,i and v2,j are the i-th local feature expansion value of V1 and the j-th context local feature expansion value of V2, D is the distance matrix between the vectors whose (i, j)-th element is the Euclidean distance d(v1,i, v2,j), τ is a mask threshold hyperparameter, all vectors are row vectors, ⊕, ⊖ and ⊗ respectively denote position-wise addition, subtraction and multiplication of feature vectors, × denotes matrix multiplication, and softmax(·) denotes the softmax function. Here, the class-transformer spatial migration permutation fusion performs, in the manner of a transformer mechanism, mask prediction on the spatial distances of the feature-value pairs of V1 and V2 through the differential characterization of those pairs. This realizes an edge-affine encoding of the classification feature vector Vc in the high-dimensional feature space that expresses the spatial topology of the association between V1 and V2 while remaining invariant to global rotation and translation under the transformer mechanism, thereby obtaining spatial migration permutability of the feature distributions of V1 and V2. This improves the effect of fusing the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector into the classification feature vector, and hence the feature expression effect of the classification feature vector.
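Since the exact fusion expression appears only in image form in the original, the sketch below is an illustrative masked-softmax fusion assembled from the stated ingredients (a pairwise distance matrix D, a mask threshold τ, a softmax, and position-wise addition); it is not the patent's exact formula, and all names are assumptions.

```python
import numpy as np

def masked_fusion(v1: np.ndarray, v2: np.ndarray, tau: float) -> np.ndarray:
    # Pairwise distance matrix D, with D[i, j] = |v1_i - v2_j|
    D = np.abs(v1[:, None] - v2[None, :])
    # Mask: suppress feature-value pairs farther apart than the threshold tau
    masked = np.where(D <= tau, -D, -np.inf)
    # Row-wise softmax turns the masked (negated) distances into weights
    w = np.exp(masked - masked.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Fuse: each v1 entry plus its attention-weighted view of v2
    return v1 + w @ v2

v1 = np.array([0.0, 1.0])        # stand-in for the neighborhood vector
v2 = np.array([0.0, 1.0, 2.0])   # stand-in for the timing vector
out = masked_fusion(v1, v2, tau=10.0)
print(out.shape)
```

The threshold τ plays the role of the mask hyperparameter in the text: pairs beyond it contribute zero weight after the softmax.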
More specifically, in step S170, the classification feature vector is passed through a classifier to obtain a classification result indicating whether to stop the operation of the centrifuge. That is, after the classification feature vector is obtained, it is further passed through a classifier to obtain a classification result indicating whether or not to stop the operation of the centrifuge. Specifically, the classifier includes a plurality of fully connected layers and a Softmax layer cascaded with a last fully connected layer of the plurality of fully connected layers. In the classification processing of the classifier, multiple full-connection encoding is carried out on the classification feature vectors by using multiple full-connection layers of the classifier to obtain encoded classification feature vectors; further, the encoded classification feature vector is input to a Softmax layer of the classifier, i.e., the encoded classification feature vector is classified using the Softmax classification function to obtain a classification label. In particular, the classifier is capable of mapping different input features onto different output labels, thereby enabling classification and identification of samples. Specifically, the classification feature vector is used as the input of the classifier, and the trained classifier can judge the state of the current hydrolysate based on historical data and give corresponding feedback guidance for preparation and production. More specifically, the classifier has two classification tags, namely, stop the operation of the centrifuge and not stop the operation of the centrifuge, to represent the current operation adjustment strategy of the centrifuge. Thus, automatic control of the centrifugal machine can be realized according to the classification result in the preparation and production processes.
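The classifier described above (stacked fully connected layers followed by a Softmax layer) can be sketched with random illustrative weights as follows; in practice the weights would come from training on historical centrifugation data.

```python
import numpy as np

rng = np.random.default_rng(0)

def classifier_head(x, W1, b1, W2, b2):
    h = np.maximum(0.0, x @ W1 + b1)   # fully connected layer + ReLU
    logits = h @ W2 + b2               # final fully connected layer
    e = np.exp(logits - logits.max())
    p = e / e.sum()                    # Softmax over the two labels
    return p                           # p[0]: stop the centrifuge, p[1]: keep running

x = rng.standard_normal(8)             # stand-in classification feature vector
W1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 2)), np.zeros(2)
p = classifier_head(x, W1, b1, W2, b2)
print(p.sum())   # probabilities sum to 1
```

The argmax over p would then drive the run/stop control decision for the centrifuge.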
In summary, the intelligent preparation method of the animal embryo extract according to the embodiment of the application has been elucidated. By adopting deep learning and artificial intelligence technology and basing the control on the centrifugal monitoring video, automatic control of the centrifuge operation is realized, which avoids the subjectivity and inaccuracy of manual judgment in the traditional control process, improves the purity and yield of the hydrolysate, and at the same time saves energy and reduces cost.
Exemplary System: Fig. 6 is a block diagram of an intelligent preparation system of animal embryo extract according to an embodiment of the present application. As shown in fig. 6, the intelligent preparation system 300 of animal embryo extract according to the embodiment of the present application comprises: a monitoring video acquisition module 310; a sampling module 320; a centrifugal state feature extraction module 330; a centrifugal state associated feature extraction module 340; a Euclidean distance calculation module 350; a fusion module 360; and a classification result generation module 370.
The monitoring video acquisition module 310 is configured to acquire a centrifugal monitoring video of a predetermined period acquired by a camera; the sampling module 320 is configured to extract a plurality of centrifugal status monitoring key frames from the centrifugal monitoring video; the centrifugal state feature extraction module 330 is configured to obtain a plurality of centrifugal state feature matrices by passing the plurality of centrifugal state monitoring key frames through a convolutional neural network model including a depth feature fusion module, respectively; the centrifugal state associated feature extraction module 340 is configured to obtain centrifugal state time sequence associated feature vectors by using a centrifugal state associated feature extractor based on a converter after the centrifugal state feature matrices are respectively expanded into centrifugal state feature vectors; the euclidean distance calculating module 350 is configured to calculate euclidean distance values between every two neighboring centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector composed of a plurality of euclidean distance values; the fusion module 360 is configured to fuse the centrifugal state neighborhood time-sequence-associated feature vector and the centrifugal state time-sequence-associated feature vector to obtain a classification feature vector; and the classification result generating module 370 is configured to pass the classification feature vector through a classifier to obtain a classification result, where the classification result is used to indicate whether to stop the operation of the centrifuge.
In one example, in the above-described intelligent preparation system 300 of animal embryo elements, the sampling module 320 is configured to: and sampling the centrifugal monitoring video at a preset sampling frequency to obtain the plurality of centrifugal state monitoring key frames.
In one example, in the above-described intelligent preparation system 300 of animal embryo elements, the centrifugal status feature extraction module 330 is configured to: extracting a shallow feature matrix from a shallow layer of the convolutional neural network model; extracting a deep feature matrix from the deep layer of the convolutional neural network model; and fusing the shallow feature matrix and the deep feature matrix to obtain the centrifugal state feature matrix; wherein the ratio between the deep layer and the shallow layer is more than or equal to 5 and less than or equal to 10.
In one example, in the above-described intelligent preparation system 300 of animal embryo extract, the centrifugal state associated feature extraction module 340 is configured to: one-dimensionally arrange the plurality of centrifugal state feature vectors to obtain a global centrifugal state feature vector; calculate the product between the global centrifugal state feature vector and the transpose vector of each centrifugal state feature vector in the plurality of centrifugal state feature vectors to obtain a plurality of self-attention association matrices; respectively normalize each self-attention association matrix in the plurality of self-attention association matrices to obtain a plurality of normalized self-attention association matrices; pass each normalized self-attention association matrix through a Softmax classification function to obtain a plurality of probability values; weight each centrifugal state feature vector by the corresponding probability value to obtain a plurality of context semantic centrifugal state feature vectors; and cascade the plurality of context semantic centrifugal state feature vectors to obtain the centrifugal state time-sequence associated feature vector.
In one example, in the above-described intelligent preparation system 300 of animal embryo extract, the Euclidean distance calculation module 350 is configured to: perform matrix expansion on the plurality of centrifugal state feature matrices to obtain a plurality of centrifugal state feature vectors; calculate the Euclidean distance value between every two adjacent centrifugal state feature vectors in the plurality of centrifugal state feature vectors with the following distance formula: d(Va, Vb) = sqrt(sum_{i=1}^{n} (v_{a,i} - v_{b,i})^2), wherein v_{a,i} and v_{b,i} respectively denote the feature values at corresponding positions of any two adjacent centrifugal state feature vectors, n denotes the dimension of the feature vectors, and d(Va, Vb) denotes the Euclidean distance value between the two adjacent centrifugal state feature vectors; and one-dimensionally arrange the plurality of Euclidean distance values to obtain the centrifugal state neighborhood time-sequence associated feature vector.
In one example, in the above-described intelligent preparation system 300 of animal embryo extract, the fusion module 360 is configured to fuse the centrifugal state neighborhood time-sequence associated feature vector, denoted V1, and the centrifugal state time-sequence associated feature vector, denoted V2, into the classification feature vector Vc by means of class-transformer spatial migration permutation fusion; wherein v1,i and v2,j are the i-th local feature expansion value of V1 and the j-th context local feature expansion value of V2, D is the distance matrix between the vectors whose (i, j)-th element is the Euclidean distance d(v1,i, v2,j), τ is a mask threshold hyperparameter, all vectors are row vectors, ⊕, ⊖ and ⊗ respectively denote position-wise addition, subtraction and multiplication of feature vectors, × denotes matrix multiplication, and softmax(·) denotes the softmax function.
In one example, in the above-described intelligent preparation system 300 of animal embryo elements, the classification result generation module 370 is configured to: performing full-connection coding on the classification feature vectors by using a plurality of full-connection layers of the classifier to obtain coded classification feature vectors; and passing the coding classification feature vector through a Softmax classification function of the classifier to obtain the classification result.
In summary, the intelligent preparation system 300 of animal embryo extract according to the embodiment of the application is illustrated, which adopts deep learning and artificial intelligence technology and realizes automatic control of centrifuge operation based on a centrifuge monitoring video, thereby avoiding subjectivity and inaccuracy of manual judgment in the traditional control process, improving purity and yield of hydrolysate, saving energy and reducing cost.
As described above, the intelligent preparation system of animal embryo extract according to the embodiment of the present application can be implemented in various terminal devices. In one example, the intelligent preparation system 300 of animal embryo elements according to embodiments of the present application may be integrated into the terminal device as a software module and/or hardware module. For example, the intelligent preparation system 300 of animal embryo extract may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the intelligent preparation system 300 of animal embryo elements can also be one of the hardware modules of the terminal device.
Alternatively, in another example, the intelligent preparation system 300 of animal embryo extract may be a device separate from the terminal device, connected to the terminal device through a wired and/or wireless network and exchanging interactive information in an agreed data format.
Exemplary embodiments: the present application will be described in further detail with reference to the following examples in order to make the objects, technical solutions and advantages of the present application more clear and clarified. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The invention provides a method for hydrolyzing placenta, which comprises the following steps:
step 1, weighing fresh or frozen placenta, removing fascia and large blood vessels in a clean sterile environment, and washing in sterile ice water at the temperature of 4-10 ℃ to obtain a washed placenta;
step 2, cutting the cleaned placenta obtained in the step 1 into small blocks, and mincing in a triturator to obtain smashed placenta;
step 3, adding, to the crushed placenta obtained in step 2, small-molecular-group water with a weight equal to that of the fresh or frozen placenta and a pH greater than 9.6, placing the mixture in a sterile stainless steel vessel, sealing, quick-freezing overnight at minus 20 ℃ or below, then placing it in an environment of 3-5 ℃ until completely melted, and repeating the quick-freezing and melting operations several times to obtain a hydrolysate;
step 4, centrifuging the hydrolysate obtained in step 3 in a centrifuge for 30-60 minutes to obtain a supernatant and a precipitate, the centrifugal speed being at least 3500 RPM; wherein the free active elements mainly remain in the supernatant, while free hormones are substantially removed with the precipitate.
Step 5, filtering the supernatant obtained in step 4 through qualitative filter paper to obtain a filtrate; passing the filtrate through a 10 KD ultrafiltration column to obtain a permeate with a molecular weight of less than 10 KD; and treating the permeate with a 1 KD ultrafiltration column to obtain an ultrafiltrate with a molecular weight of 1 KD to 10 KD;
Step 6, vacuum drying the precipitate obtained in step 4 at 56-65 ℃ and superfine grinding it to obtain a dried placenta product. The frozen placenta in step 1 must be completely thawed in advance at 4-10 ℃. The particle size of the crushed placenta obtained in step 2 is 60-120 mesh. Several examples of the placenta hydrolysis method of the present invention are given below.
Example 1: step 1, weighing a fresh human placenta, removing fascia and large blood vessels in a clean sterile environment, and washing in sterile ice water at 4 ℃ to obtain a cleaned placenta;
step 2, cutting the cleaned placenta obtained in step 1 into small pieces and mincing in a triturator to obtain a crushed placenta with a particle size of 60 mesh;
step 3, adding small-molecular-cluster water of the same weight as the fresh placenta in step 1 and with a pH of 12.5 to the crushed placenta obtained in step 2, placing the mixture in a sterile stainless steel vessel, sealing, quick-freezing overnight at minus 20 ℃, then allowing it to melt completely in a 3 ℃ environment, and repeating the quick-freeze and melt operation 3 times to obtain a hydrolysate. The small-molecular-cluster water consists of clusters of 3-6 water molecules and has high molecular mobility and extremely strong penetration, diffusion and dissolving power; the higher its pH, the higher the hydrolysis efficiency.
Step 4, centrifuging the hydrolysate obtained in the step 3 in a centrifuge for 30 minutes to obtain supernatant and precipitate; wherein the centrifugal speed of the centrifugal machine is 3500RPM;
step 5, filtering the supernatant obtained in step 4 through qualitative filter paper to obtain a filtrate; passing the filtrate through a 10 KD ultrafiltration column to obtain a permeate with a molecular weight of less than 10 KD; and treating the permeate with a 1 KD ultrafiltration column to obtain an ultrafiltrate with a molecular weight of 1 KD to 10 KD;
step 6, vacuum drying the precipitate obtained in step 4 at 56 ℃ and superfine grinding it to obtain a dried placenta product.
Example 2: step 1, weighing a frozen animal placenta, thawing it completely in advance at 4 ℃, removing fascia and large blood vessels in a clean sterile environment, and washing in sterile ice water at 10 ℃ to obtain a cleaned placenta;
step 2, cutting the cleaned placenta obtained in step 1 into small pieces and mincing in a triturator to obtain a crushed placenta with a particle size of 120 mesh;
step 3, adding small-molecular-cluster water of the same weight as the frozen placenta in step 1 and with a pH of 10 to the crushed placenta obtained in step 2. The invention also provides a dried placenta product, obtained by the placenta hydrolysis method of the above specific embodiments.
The ultrafiltrate of the invention was subjected to skin irritation, acute eye irritation and skin allergy tests to verify its safety, as follows:
(1) Skin irritation detection
(1) Four healthy adult white rabbits weighing 2.4-2.8 kg were selected; the animal room temperature was controlled at 22-25 ℃ and the relative humidity at 55-70%;
(2) On the day before the experiment, the hair on both sides of the spine on the rabbit's back was clipped, over an area of about 3 cm × 3 cm on each side. 0.5 ml of the sample was applied to the left depilated skin over a 2.5 cm × 2.5 cm area, once daily, for 14 consecutive days, with the right skin serving as a control. From the second day onward, the site was depilated and washed with clean water before each daily application, and the skin reaction was observed one hour later. The irritation intensity of the test substance on the skin was judged according to the skin irritation intensity grading standard.
(2) Acute eye irritation detection
(1) Three healthy adult white rabbits weighing 2.2-2.5 kg, male and female, were selected; the animal room temperature was controlled at 22-25 ℃ and the relative humidity at 55-70%. (2) 0.1 ml of the sample was dripped into the conjunctival sac of one eye of each rabbit and the eye was immediately held closed for 1 s; after 30 s, the eye was rinsed with ample physiological saline at a rapid flow rate for 30 s. The other eye was left untreated as a self-control. The rabbits' eyes were examined at 1 h, 24 h, 48 h and 72 h (sodium fluorescein was used after 24 h). The irritation intensity of the test substance on the eyes was judged according to the eye irritation response grading standard. One hour after instillation, the conjunctiva of the treated eyes showed slight hyperemia, which returned to normal after 24 h.
(3) Skin allergy detection
(1) Thirty-six healthy adult white guinea pigs weighing 200-300 g, half male and half female, were selected; the animal room temperature was controlled at 22-25 ℃ and the relative humidity at 55-70%. (2) The guinea pigs were divided into a test group (20 animals) and a negative control group (16 animals), each half male and half female, and the local closed-patch method was used.
(3) Induction contact: 24 hours before the test, the hair on the left side of the guinea pig's back was shaved over an area of about 3 cm × 3 cm. 0.2 ml of the sample was applied to the depilated skin on the left back of each test guinea pig over a 2 cm × 2 cm area, covered with two layers of gauze and one layer of cellophane, and sealed and fixed with non-irritating adhesive tape for 6 hours, after which the test substance was removed. The same procedure was repeated once on each of the 7th and 14th days. The negative control guinea pigs received no induction contact.
(4) Challenge contact: 14 days after the final sensitization, the sample was applied to the depilated area on the right back of the test-group guinea pigs, and the negative-control guinea pigs received challenge contact with the sample in the same way. Skin reactions in the test area on the right back were observed 24 and 48 hours after removal of the sample. The experimental results are shown in table 3.
Exemplary electronic device: An electronic device according to an embodiment of the present application is described below with reference to fig. 7.
Fig. 7 illustrates a block diagram of an electronic device according to an embodiment of the application.
As shown in fig. 7, the electronic device 10 includes one or more processors 11 and a memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory. The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to perform the functions of the intelligent preparation method of animal embryo extract according to the various embodiments of the present application described above and/or other desired functions. Various contents such as classification feature vectors may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown).
The input means 13 may comprise, for example, a keyboard, a mouse, etc.
The output device 14 may output various information including the classification result and the like to the outside. The output means 14 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, for simplicity, only some of the components of the electronic device 10 that are relevant to the present application are shown in fig. 7; components such as buses and input/output interfaces are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer readable storage Medium: in addition to the methods and apparatus described above, embodiments of the application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform steps in the functions of the method for intelligent preparation of animal embryo extract according to the various embodiments of the application described in the "exemplary methods" section of this specification.
The computer program product may write program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the steps of the intelligent preparation method of animal embryo extract according to the various embodiments of the present application described in the "exemplary methods" section of the specification above.
The computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), optical fiber, portable Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present application are merely examples and not intended to be limiting, and these advantages, benefits, effects, etc. are not to be considered as essential to the various embodiments of the present application. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, as the application is not necessarily limited to practice with the above described specific details.
The block diagrams of the devices, apparatuses and systems referred to in the present application are only illustrative examples and are not intended to require or imply that the connections, arrangements and configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses and systems may be connected, arranged and configured in any manner. Words such as "including", "comprising" and "having" are open-ended, mean "including but not limited to", and may be used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or", unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to".
It is also noted that in the apparatus, devices and methods of the present application, the components or steps may be decomposed and/or recombined. Such decomposition and/or recombination should be regarded as equivalent aspects of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (10)

1. An intelligent preparation method of animal embryo extract is characterized by comprising the following steps: acquiring a centrifugal monitoring video of a preset time period acquired by a camera; extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; the plurality of centrifugal state monitoring key frames are respectively passed through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrixes; respectively expanding the plurality of centrifugal state feature matrixes into centrifugal state feature vectors, and then obtaining centrifugal state time sequence associated feature vectors through a centrifugal state associated feature extractor based on a converter; calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time sequence associated feature vector consisting of a plurality of Euclidean distance values; fusing the centrifugal state neighborhood time sequence associated feature vector and the centrifugal state time sequence associated feature vector to obtain a classification feature vector; and passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the centrifuge stops running or not.
2. The method of claim 1, wherein extracting a plurality of centrifugation state monitoring key frames from the centrifugation monitoring video comprises: and sampling the centrifugal monitoring video at a preset sampling frequency to obtain the plurality of centrifugal state monitoring key frames.
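The fixed-frequency sampling of claim 2 can be sketched as index selection over the video's frames. The function below is illustrative only; the function name, parameters and rounding rule are assumptions, not part of the patent.

```python
# Illustrative sketch: selecting key-frame indices from a centrifugation
# monitoring video by sampling at a preset frequency. All names and the
# rounding convention are assumptions for illustration.

def keyframe_indices(total_frames: int, fps: float, sample_hz: float) -> list[int]:
    """Return frame indices sampled at `sample_hz` from a video running at `fps`."""
    if sample_hz <= 0 or fps <= 0:
        raise ValueError("fps and sample_hz must be positive")
    step = fps / sample_hz          # frames between consecutive key frames
    indices = []
    t = 0.0
    while round(t) < total_frames:
        indices.append(int(round(t)))
        t += step
    return indices

# A 10-second clip at 30 fps sampled at 2 Hz yields 20 key frames.
idx = keyframe_indices(total_frames=300, fps=30.0, sample_hz=2.0)
```

In practice the selected indices would be used to read the corresponding frames from the centrifugal monitoring video before feature extraction.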
3. The method for intelligently preparing animal embryo extract according to claim 2, wherein the step of passing the plurality of centrifugal state monitoring key frames through the convolutional neural network model including the depth feature fusion module to obtain the plurality of centrifugal state feature matrices comprises: extracting a shallow feature matrix from a shallow layer of the convolutional neural network model; extracting a deep feature matrix from a deep layer of the convolutional neural network model; and fusing the shallow feature matrix and the deep feature matrix to obtain the centrifugal state feature matrix; wherein the ratio between the deep layer and the shallow layer is greater than or equal to 5 and less than or equal to 10.
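The depth feature fusion of claim 3 can be sketched as a position-wise combination of a shallow feature matrix (texture detail) and a deep feature matrix (semantics). The weighted-sum rule and the weight `alpha` below are assumptions; the claim only fixes the deep/shallow layer-index ratio to [5, 10] and does not specify the fusion operator.

```python
# Minimal sketch of depth feature fusion: a position-wise weighted sum of
# a shallow and a deep feature matrix of equal size. `alpha` is an assumed
# hyperparameter, not taken from the patent.

def fuse_features(shallow, deep, alpha=0.5):
    """Fuse two equally sized feature matrices position by position."""
    assert len(shallow) == len(deep) and len(shallow[0]) == len(deep[0])
    return [
        [alpha * s + (1.0 - alpha) * d for s, d in zip(srow, drow)]
        for srow, drow in zip(shallow, deep)
    ]

shallow = [[1.0, 2.0], [3.0, 4.0]]   # stand-in for a shallow feature matrix
deep = [[5.0, 6.0], [7.0, 8.0]]      # stand-in for a deep feature matrix
fused = fuse_features(shallow, deep)
```

Position-wise fusion preserves the spatial layout of both matrices, which is why the claim requires them to come from layers of the same network.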
4. The method for the intelligent preparation of animal embryo extract according to claim 3, wherein the step of expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining the centrifugal state time-sequence associated feature vector through the converter-based centrifugal state associated feature extractor comprises: arranging the plurality of centrifugal state feature vectors one-dimensionally to obtain a global centrifugal state feature vector; calculating the product between the global centrifugal state feature vector and the transpose vector of each of the plurality of centrifugal state feature vectors to obtain a plurality of self-attention correlation matrices; normalizing each of the plurality of self-attention correlation matrices to obtain a plurality of normalized self-attention correlation matrices; passing each of the normalized self-attention correlation matrices through a Softmax classification function to obtain a plurality of probability values; weighting each of the plurality of centrifugal state feature vectors with the corresponding probability value to obtain a plurality of context semantic centrifugal state feature vectors; and cascading the plurality of context semantic centrifugal state feature vectors to obtain the centrifugal state time-sequence associated feature vector.
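The converter (transformer)-style association of claim 4 can be sketched as scoring each centrifugal-state feature vector against a global descriptor, converting the scores to softmax weights, weighting each vector, and concatenating the results. Using the mean as the global descriptor is an assumption made here to keep dimensions aligned; the claim builds the global vector by one-dimensional arrangement.

```python
import math

# Hedged sketch of the self-attention weighting step: every name and the
# mean-based global descriptor are illustrative assumptions.

def associate(vectors):
    n = len(vectors[0])
    # global descriptor over all centrifugal-state feature vectors
    global_v = [sum(v[k] for v in vectors) / len(vectors) for k in range(n)]
    # dot-product correlation of each vector with the global descriptor
    scores = [sum(g * x for g, x in zip(global_v, v)) for v in vectors]
    m = max(scores)                       # numerically stabilised softmax
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # weight each vector by its probability value, then cascade (concatenate)
    context = [[w * x for x in v] for w, v in zip(weights, vectors)]
    return [x for v in context for x in v]

seq = associate([[1.0, 0.0], [0.0, 1.0]])
```

With two symmetric inputs the weights are equal (0.5 each), so the cascaded output is simply both vectors halved and concatenated.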
5. The method of claim 4, wherein calculating Euclidean distance values between every two adjacent ones of the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time-sequence associated feature vector composed of the plurality of Euclidean distance values comprises: performing matrix expansion on the plurality of centrifugal state feature matrices to obtain a plurality of centrifugal state feature vectors; calculating the Euclidean distance value between every two adjacent ones of the plurality of centrifugal state feature vectors with the following distance formula:

\[ d(V_i, V_{i+1}) = \sqrt{\sum_{k=1}^{n} \left( v_i^{(k)} - v_{i+1}^{(k)} \right)^2 } \]

wherein \(v_i^{(k)}\) and \(v_{i+1}^{(k)}\) respectively represent the feature values at corresponding positions of any two adjacent centrifugal state feature vectors, \(n\) represents the dimension of the feature vectors, and \(d(V_i, V_{i+1})\) represents the Euclidean distance value between the two adjacent centrifugal state feature vectors; and arranging the plurality of Euclidean distance values one-dimensionally to obtain the centrifugal state neighborhood time-sequence associated feature vector.
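The neighborhood vector of claim 5 is simply the sequence of Euclidean distances between each pair of adjacent (flattened) centrifugal-state feature vectors, which can be sketched directly:

```python
import math

# Sketch of claim 5: the neighborhood time-sequence associated feature
# vector is the list of Euclidean distances between adjacent centrifugal
# state feature vectors.

def neighborhood_vector(vectors):
    return [
        math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))
        for v1, v2 in zip(vectors, vectors[1:])
    ]

d = neighborhood_vector([[0.0, 0.0], [3.0, 4.0], [3.0, 4.0]])
```

A large distance indicates a change between consecutive key frames, while a zero distance indicates a static centrifugal state, which is why this vector captures neighborhood time-sequence variation.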
6. The method of claim 5, wherein fusing the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector to obtain the classification feature vector comprises: fusing the centrifugal state neighborhood time-sequence associated feature vector \(V_1\) and the centrifugal state time-sequence associated feature vector \(V_2\) in a converter-like spatial migration displacement fusion manner according to a fusion formula (reproduced as an image in the published document) to obtain the classification feature vector \(V_c\); wherein \(v_1^i\) and \(v_2^i\) respectively denote the \(i\)-th local feature expansion feature vector of the centrifugal state neighborhood time-sequence associated feature vector and the \(i\)-th context local feature expansion feature vector, \(D\) is a distance matrix between the vectors, \(d(\cdot,\cdot)\) represents the Euclidean distance between vectors, \(\delta\) is a mask threshold hyperparameter, the vectors are all row vectors, \(\oplus\), \(\ominus\) and \(\otimes\) respectively denote position-wise addition, subtraction and multiplication of feature vectors, \(\cdot\) denotes matrix multiplication, \(\exp(\cdot)\) denotes the exponential function, and \(V_c\) is the classification feature vector.
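The exact fusion formula of claim 6 is given only as an image in the published document and cannot be recovered from the text. The sketch below is therefore purely illustrative: a distance-masked position-wise fusion consistent with the symbols the claim names (Euclidean distance, mask threshold, position-wise operations, exponential). Every detail here is an assumption, not the patented formula.

```python
import math

# ASSUMED illustrative fusion: gate the position-wise sum of the two
# associated feature vectors by an exponential of their Euclidean distance
# when it exceeds a mask threshold `delta`. This is NOT the patent's
# formula, only a plausible shape using the symbols the claim names.

def masked_fusion(v1, v2, delta=1.0):
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))
    gate = math.exp(-dist) if dist > delta else 1.0   # assumed mask rule
    return [gate * (a + b) for a, b in zip(v1, v2)]

vc = masked_fusion([1.0, 2.0], [1.0, 2.0])  # identical inputs: distance 0, gate 1
```

The intent of such masking is to damp the fused features when the two time-sequence representations disagree strongly, which matches the claim's use of a distance matrix and a mask threshold hyperparameter.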
7. The method of claim 6, wherein the step of passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used to indicate whether to stop the centrifuge, comprises: performing full-connection coding on the classification feature vectors by using a plurality of full-connection layers of the classifier to obtain coded classification feature vectors; and passing the coding classification feature vector through a Softmax classification function of the classifier to obtain the classification result.
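The classification step of claim 7 (full-connection coding followed by Softmax) can be sketched with a single dense layer; the weights, biases and two-class labels below are arbitrary illustrative values, not trained parameters from the patent.

```python
import math

# Sketch of claim 7: fully connected coding of the classification feature
# vector, then a Softmax over the two outcomes (keep running vs. stop).
# All weights and labels are illustrative assumptions.

def dense(x, w, b):
    """One fully connected layer: y = W @ x + b."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(w, b)]

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

x = [0.2, 0.8]                       # stand-in classification feature vector
logits = dense(x, w=[[1.0, -1.0], [-1.0, 1.0]], b=[0.0, 0.0])
probs = softmax(logits)              # [P(keep running), P(stop centrifuge)]
label = "stop" if probs[1] > probs[0] else "run"
```

The Softmax output sums to one, so the classifier's decision reduces to comparing the two probabilities, which is what the classification result in claim 1 encodes.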
8. An intelligent preparation system for animal embryo extract, characterized by comprising: the monitoring video acquisition module is used for acquiring centrifugal monitoring video of a preset time period acquired by the camera; the sampling module is used for extracting a plurality of centrifugal state monitoring key frames from the centrifugal monitoring video; the centrifugal state feature extraction module is used for enabling the plurality of centrifugal state monitoring key frames to respectively pass through a convolutional neural network model comprising a depth feature fusion module to obtain a plurality of centrifugal state feature matrices; the centrifugal state associated feature extraction module is used for respectively expanding the plurality of centrifugal state feature matrices into centrifugal state feature vectors and then obtaining centrifugal state time-sequence associated feature vectors through a converter-based centrifugal state associated feature extractor; the Euclidean distance calculation module is used for calculating Euclidean distance values between every two adjacent centrifugal state feature matrices in the plurality of centrifugal state feature matrices to obtain a centrifugal state neighborhood time-sequence associated feature vector composed of the plurality of Euclidean distance values; the fusion module is used for fusing the centrifugal state neighborhood time-sequence associated feature vector and the centrifugal state time-sequence associated feature vector to obtain a classification feature vector; and the classification result generation module is used for passing the classification feature vector through a classifier to obtain a classification result, the classification result being used for indicating whether to stop the operation of the centrifuge.
9. The intelligent preparation system of animal embryo extract of claim 8 wherein the sampling module is configured to: and sampling the centrifugal monitoring video at a preset sampling frequency to obtain the plurality of centrifugal state monitoring key frames.
10. The intelligent preparation system of animal embryo extract of claim 9, wherein the fusion module is configured to: fuse the centrifugal state neighborhood time-sequence associated feature vector \(V_1\) and the centrifugal state time-sequence associated feature vector \(V_2\) in a converter-like spatial migration displacement fusion manner according to a fusion formula (reproduced as an image in the published document) to obtain the classification feature vector \(V_c\); wherein \(v_1^i\) and \(v_2^i\) respectively denote the \(i\)-th local feature expansion feature vector of the centrifugal state neighborhood time-sequence associated feature vector and the \(i\)-th context local feature expansion feature vector, \(D\) is a distance matrix between the vectors, \(d(\cdot,\cdot)\) represents the Euclidean distance between vectors, \(\delta\) is a mask threshold hyperparameter, the vectors are all row vectors, \(\oplus\), \(\ominus\) and \(\otimes\) respectively denote position-wise addition, subtraction and multiplication of feature vectors, \(\cdot\) denotes matrix multiplication, \(\exp(\cdot)\) denotes the exponential function, and \(V_c\) is the classification feature vector.
CN202310662939.3A 2023-06-06 2023-06-06 Intelligent preparation method and system of animal embryo extract Pending CN116630862A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310662939.3A CN116630862A (en) 2023-06-06 2023-06-06 Intelligent preparation method and system of animal embryo extract

Publications (1)

Publication Number Publication Date
CN116630862A true CN116630862A (en) 2023-08-22

Family

ID=87613285

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117065876A (en) * 2023-09-05 2023-11-17 浙江艾领创矿业科技有限公司 Intelligent Sand Mill System and Method
CN117065876B (en) * 2023-09-05 2024-03-22 浙江艾领创矿业科技有限公司 Control method and control system of intelligent sand mill


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination